Sample records for instrumental variable methods

  1. Tutorial in Biostatistics: Instrumental Variable Methods for Causal Inference

    PubMed Central

    Baiocchi, Michael; Cheng, Jing; Small, Dylan S.

    2014-01-01

    A goal of many health studies is to determine the causal effect of a treatment or intervention on health outcomes. Often, it is not ethically or practically possible to conduct a perfectly randomized experiment and instead an observational study must be used. A major challenge to the validity of observational studies is the possibility of unmeasured confounding (i.e., unmeasured ways in which the treatment and control groups differ before treatment administration which also affect the outcome). Instrumental variables analysis is a method for controlling for unmeasured confounding. This type of analysis requires the measurement of a valid instrumental variable, which is a variable that (i) is independent of the unmeasured confounding; (ii) affects the treatment; and (iii) affects the outcome only indirectly through its effect on the treatment. This tutorial discusses the types of causal effects that can be estimated by instrumental variables analysis; the assumptions needed for instrumental variables analysis to provide valid estimates of causal effects and sensitivity analysis for those assumptions; methods of estimation of causal effects using instrumental variables; and sources of instrumental variables in health studies. PMID:24599889
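
The three instrument conditions (i)-(iii) above can be demonstrated with simulated data: an instrument independent of the unmeasured confounder recovers the causal effect where ordinary regression does not. This is a minimal sketch under assumed linear effects and a single binary instrument, not code from the tutorial:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

u = rng.normal(size=n)                       # unmeasured confounder
z = rng.binomial(1, 0.5, size=n)             # instrument: independent of u (condition i),
x = 0.6 * z + 0.8 * u + rng.normal(size=n)   # ...affects the treatment (condition ii),
y = 1.5 * x + 1.0 * u + rng.normal(size=n)   # ...affects y only through x (condition iii)

# Naive regression of y on x is biased by the unmeasured confounder.
ols = np.cov(x, y)[0, 1] / np.var(x)

# Wald / two-stage least squares estimator recovers the true effect (1.5).
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]
```

With 50 000 simulated observations, `iv` lands near the true effect of 1.5 while `ols` is pulled upward by the confounder.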

  2. Falsification Testing of Instrumental Variables Methods for Comparative Effectiveness Research.

    PubMed

    Pizer, Steven D

    2016-04-01

    To demonstrate how falsification tests can be used to evaluate instrumental variables methods applicable to a wide variety of comparative effectiveness research questions. Brief conceptual review of instrumental variables and falsification testing principles and techniques accompanied by an empirical application. Sample STATA code related to the empirical application is provided in the Appendix. Comparative long-term risks of sulfonylureas and thiazolidinediones for management of type 2 diabetes. Outcomes include mortality and hospitalization for an ambulatory care-sensitive condition. Prescribing pattern variations are used as instrumental variables. Falsification testing is an easily computed and powerful way to evaluate the validity of the key assumption underlying instrumental variables analysis. If falsification tests are used, instrumental variables techniques can help answer a multitude of important clinical questions. © Health Research and Educational Trust.
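
The falsification idea can be illustrated in miniature: check whether the candidate instrument is associated with an outcome the treatment cannot plausibly affect. A hypothetical sketch with simulated data (the paper's empirical application and its Stata code differ):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 20_000
z = rng.binomial(1, 0.5, size=n)         # candidate instrument, e.g. a local prescribing pattern

# A falsification outcome is one the studied treatment cannot plausibly affect;
# a valid instrument should show no association with it.
fo_clean = rng.normal(size=n)            # truly unrelated to z: instrument passes
fo_bad = 0.3 * z + rng.normal(size=n)    # related to z: instrument is falsified

_, p_valid = stats.ttest_ind(fo_clean[z == 1], fo_clean[z == 0])
_, p_invalid = stats.ttest_ind(fo_bad[z == 1], fo_bad[z == 0])
```

A tiny `p_invalid` flags the second instrument as failing the falsification test, while `p_valid` stays unremarkable.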

  3. Instrumental Variable Analysis with a Nonlinear Exposure–Outcome Relationship

    PubMed Central

    Davies, Neil M.; Thompson, Simon G.

    2014-01-01

    Background: Instrumental variable methods can estimate the causal effect of an exposure on an outcome using observational data. Many instrumental variable methods assume that the exposure–outcome relation is linear, but in practice this assumption is often in doubt, or perhaps the shape of the relation is a target for investigation. We investigate this issue in the context of Mendelian randomization, the use of genetic variants as instrumental variables. Methods: Using simulations, we demonstrate the performance of a simple linear instrumental variable method when the true shape of the exposure–outcome relation is not linear. We also present a novel method for estimating the effect of the exposure on the outcome within strata of the exposure distribution. This enables the estimation of localized average causal effects within quantile groups of the exposure or as a continuous function of the exposure using a sliding window approach. Results: Our simulations suggest that linear instrumental variable estimates approximate a population-averaged causal effect. This is the average difference in the outcome if the exposure for every individual in the population is increased by a fixed amount. Estimates of localized average causal effects reveal the shape of the exposure–outcome relation for a variety of models. These methods are used to investigate the relations between body mass index and a range of cardiovascular risk factors. Conclusions: Nonlinear exposure–outcome relations should not be a barrier to instrumental variable analyses. When the exposure–outcome relation is not linear, either a population-averaged causal effect or the shape of the exposure–outcome relation can be estimated. PMID:25166881
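
The stratification idea, estimating localized causal effects within quantile groups of the exposure, can be sketched as follows. This is a simplified illustration with a quadratic true effect; the residual-based ("IV-free") stratification and variable names are assumptions, not the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

z = rng.normal(size=n)                     # continuous instrument (e.g. an allele score)
u = rng.normal(size=n)                     # unmeasured confounder
x = 0.5 * z + u + rng.normal(size=n)       # exposure
y = 0.2 * x**2 + u + rng.normal(size=n)    # quadratic (nonlinear) causal effect

def wald_ratio(z, x, y):
    return np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

# Stratify on the "IV-free" exposure (exposure minus the instrument's
# contribution) so strata do not distort the instrument-exposure relation.
beta_zx = np.cov(z, x)[0, 1] / np.var(z)
x0 = x - beta_zx * z

quartile = np.digitize(x0, np.quantile(x0, [0.25, 0.5, 0.75]))
local = [wald_ratio(z[quartile == q], x[quartile == q], y[quartile == q])
         for q in range(4)]
# For a quadratic effect 0.2*x^2, the local slope 0.4*x rises across strata.
```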

  4. A selective review of the first 20 years of instrumental variables models in health-services research and medicine.

    PubMed

    Cawley, John

    2015-01-01

    The method of instrumental variables (IV) is useful for estimating causal effects. Intuitively, it exploits exogenous variation in the treatment, sometimes called natural experiments or instruments. This study reviews the literature in health-services research and medical research that applies the method of instrumental variables, documents trends in its use, and offers examples of various types of instruments. A literature search of the PubMed and EconLit research databases for English-language journal articles published after 1990 yielded a total of 522 original research articles. Citations counts for each article were derived from the Web of Science. A selective review was conducted, with articles prioritized based on number of citations, validity and power of the instrument, and type of instrument. The average annual number of papers in health services research and medical research that apply the method of instrumental variables rose from 1.2 in 1991-1995 to 41.8 in 2006-2010. Commonly-used instruments (natural experiments) in health and medicine are relative distance to a medical care provider offering the treatment and the medical care provider's historic tendency to administer the treatment. Less common but still noteworthy instruments include randomization of treatment for reasons other than research, randomized encouragement to undertake the treatment, day of week of admission as an instrument for waiting time for surgery, and genes as an instrument for whether the respondent has a heritable condition. The use of the method of IV has increased dramatically in the past 20 years, and a wide range of instruments have been used. Applications of the method of IV have in several cases upended conventional wisdom that was based on correlations and led to important insights about health and healthcare. Future research should pursue new applications of existing instruments and search for new instruments that are powerful and valid.

  5. [Instruments for quantitative methods of nursing research].

    PubMed

    Vellone, E

    2000-01-01

    Instruments for quantitative nursing research are a means of objectifying and measuring a variable or phenomenon in scientific research. There are direct instruments to measure concrete variables and indirect instruments to measure abstract concepts (Burns, Grove, 1997). Indirect instruments measure the attributes of which a concept is composed. There are also instruments for physiological variables (e.g., weight), observational instruments (checklists and rating scales), interviews, questionnaires, diaries, and scales (checklists, rating scales, Likert scales, semantic differential scales and visual analogue scales). The choice of one instrument over another depends on the research question and design. Research instruments are very useful both for describing variables and for detecting statistically significant relationships. Their use in clinical practice for diagnostic assessment should be approached with great caution.

  6. Power calculator for instrumental variable analysis in pharmacoepidemiology

    PubMed Central

    Walker, Venexia M; Davies, Neil M; Windmeijer, Frank; Burgess, Stephen; Martin, Richard M

    2017-01-01

    Background: Instrumental variable analysis, for example with physicians’ prescribing preferences as an instrument for medications issued in primary care, is an increasingly popular method in the field of pharmacoepidemiology. Existing power calculators for studies using instrumental variable analysis, such as Mendelian randomization power calculators, do not allow for the structure of research questions in this field. This is because the analysis in pharmacoepidemiology will typically have stronger instruments and detect larger causal effects than in other fields. Consequently, there is a need for dedicated power calculators for pharmacoepidemiological research. Methods and Results: The formula for calculating the power of a study using instrumental variable analysis in the context of pharmacoepidemiology is derived before being validated by a simulation study. The formula is applicable for studies using a single binary instrument to analyse the causal effect of a binary exposure on a continuous outcome. An online calculator, as well as packages in both R and Stata, are provided for the implementation of the formula by others. Conclusions: The statistical power of instrumental variable analysis in pharmacoepidemiological studies to detect a clinically meaningful treatment effect is an important consideration. Research questions in this field have distinct structures that must be accounted for when calculating power. The formula presented differs from existing instrumental variable power formulae due to its parametrization, which is designed specifically for ease of use by pharmacoepidemiologists. PMID:28575313
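
The paper's own formula and parametrization are not reproduced in the abstract. As a hedged stand-in, a generic asymptotic power approximation for a single binary instrument, binary exposure, and continuous outcome looks like this (the published calculator may differ):

```python
from math import sqrt
from scipy.stats import norm

def iv_power(n, beta, gamma, sigma, p_z=0.5, alpha=0.05):
    """Approximate power of a single-binary-instrument IV (Wald) analysis.

    n     : sample size
    beta  : causal effect of a binary exposure on a continuous outcome
    gamma : first-stage strength (difference in exposure probability
            between instrument groups)
    sigma : residual standard deviation of the outcome
    p_z   : prevalence of the instrument
    """
    se = sigma / (gamma * sqrt(n * p_z * (1 - p_z)))  # asymptotic SE of the Wald estimator
    return norm.cdf(abs(beta) / se - norm.ppf(1 - alpha / 2))

# e.g. 10 000 patients, effect of 0.2 SD, first-stage difference of 0.3
power = iv_power(10_000, beta=0.2, gamma=0.3, sigma=1.0)
```

Power rises with sample size and instrument strength, and collapses to the type-I error rate when the true effect is zero.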

  7. Influence of Strategy of Learning and Achievement Motivation of Learning Achievement Class VIII Students of State Junior High School in District Blitar

    ERIC Educational Resources Information Center

    Ayundawati, Dyah; Setyosari, Punaji; Susilo, Herawati; Sihkabuden

    2016-01-01

    This study aims to determine the influence of problem-based learning strategies and achievement motivation on learning achievement. The method used in this research is quantitative. Two instruments are used in this study: one to measure the moderator variable (achievement motivation) and one to measure the dependent variable (the…

  8. Network Mendelian randomization: using genetic variants as instrumental variables to investigate mediation in causal pathways

    PubMed Central

    Burgess, Stephen; Daniel, Rhian M; Butterworth, Adam S; Thompson, Simon G

    2015-01-01

    Background: Mendelian randomization uses genetic variants, assumed to be instrumental variables for a particular exposure, to estimate the causal effect of that exposure on an outcome. If the instrumental variable criteria are satisfied, the resulting estimator is consistent even in the presence of unmeasured confounding and reverse causation. Methods: We extend the Mendelian randomization paradigm to investigate more complex networks of relationships between variables, in particular where some of the effect of an exposure on the outcome may operate through an intermediate variable (a mediator). If instrumental variables for the exposure and mediator are available, direct and indirect effects of the exposure on the outcome can be estimated, for example using either a regression-based method or structural equation models. The direction of effect between the exposure and a possible mediator can also be assessed. Methods are illustrated in an applied example considering causal relationships between body mass index, C-reactive protein and uric acid. Results: These estimators are consistent in the presence of unmeasured confounding if, in addition to the instrumental variable assumptions, the effects of both the exposure on the mediator and the mediator on the outcome are homogeneous across individuals and linear without interactions. Nevertheless, a simulation study demonstrates that even considerable heterogeneity in these effects does not lead to bias in the estimates. Conclusions: These methods can be used to estimate direct and indirect causal effects in a mediation setting, and have potential for the investigation of more complex networks between multiple interrelated exposures and disease outcomes. PMID:25150977
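
The regression-based estimation of direct and indirect effects can be sketched with simulated data: Wald ratios give the exposure-mediator, mediator-outcome, and total effects, and the indirect effect is their product. A toy illustration under the linearity and homogeneity assumptions the abstract names, not the authors' code; the variants and coefficients are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

g1 = rng.binomial(2, 0.3, size=n)   # hypothetical variant instrumenting the exposure
g2 = rng.binomial(2, 0.3, size=n)   # hypothetical variant instrumenting the mediator
u = rng.normal(size=n)              # unmeasured confounder

x = 0.5 * g1 + u + rng.normal(size=n)             # exposure
m = 0.4 * x + 0.5 * g2 + u + rng.normal(size=n)   # mediator
y = 0.3 * x + 0.6 * m + u + rng.normal(size=n)    # outcome

def wald(g, a, b):
    return np.cov(g, b)[0, 1] / np.cov(g, a)[0, 1]

beta_xm = wald(g1, x, m)     # exposure -> mediator effect      (~0.4)
beta_my = wald(g2, m, y)     # mediator -> outcome effect       (~0.6)
beta_total = wald(g1, x, y)  # total exposure -> outcome effect (~0.54)

indirect = beta_xm * beta_my    # mediated portion, ~0.24
direct = beta_total - indirect  # direct effect of x on y, ~0.30
```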

  9. Dynamic modal estimation using instrumental variables

    NASA Technical Reports Server (NTRS)

    Salzwedel, H.

    1980-01-01

    A method to determine the modes of dynamical systems is described. The inputs and outputs of a system are Fourier transformed and averaged to reduce the error level. An instrumental variable method that estimates modal parameters from multiple correlations between responses of single input, multiple output systems is applied to estimate aircraft, spacecraft, and off-shore platform modal parameters.

  10. Cesarean delivery rates among family physicians versus obstetricians: a population-based cohort study using instrumental variable methods

    PubMed Central

    Dawe, Russell Eric; Bishop, Jessica; Pendergast, Amanda; Avery, Susan; Monaghan, Kelly; Duggan, Norah; Aubrey-Bassler, Kris

    2017-01-01

    Background: Previous research suggests that family physicians have rates of cesarean delivery that are lower than or equivalent to those for obstetricians, but adjustments for risk differences in these analyses may have been inadequate. We used an econometric method to adjust for observed and unobserved factors affecting the risk of cesarean delivery among women attended by family physicians versus obstetricians. Methods: This retrospective population-based cohort study included all Canadian (except Quebec) hospital deliveries by family physicians and obstetricians between Apr. 1, 2006, and Mar. 31, 2009. We excluded women with multiple gestations, and newborns with a birth weight less than 500 g or gestational age less than 20 weeks. We estimated the relative risk of cesarean delivery using instrumental-variable-adjusted and logistic regression. Results: The final cohort included 776 299 women who gave birth in 390 hospitals. The risk of cesarean delivery was 27.3%, and the mean proportion of deliveries by family physicians was 26.9% (standard deviation 23.8%). The relative risk of cesarean delivery for family physicians versus obstetricians was 0.48 (95% confidence interval [CI] 0.41-0.56) with logistic regression and 1.27 (95% CI 1.02-1.57) with instrumental-variable-adjusted regression. Interpretation: Our conventional analyses suggest that family physicians have a lower rate of cesarean delivery than obstetricians, but instrumental variable analyses suggest the opposite. Because instrumental variable methods adjust for unmeasured factors and traditional methods do not, the large discrepancy between these estimates of risk suggests that clinical and/or sociocultural factors affecting the decision to perform cesarean delivery may not be accounted for in our database. PMID:29233843

  11. Instrumental variables estimation of exposure effects on a time-to-event endpoint using structural cumulative survival models.

    PubMed

    Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J; Zucker, David M

    2017-12-01

    The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elements of randomization, either by design or by nature (e.g., random inheritance of genes). Instrumental variables estimation of exposure effects is well established for continuous outcomes and to some extent for binary outcomes. It is, however, largely lacking for time-to-event outcomes because of complications due to censoring and survivorship bias. In this article, we make a novel proposal under a class of structural cumulative survival models which parameterize time-varying effects of a point exposure directly on the scale of the survival function; these models are essentially equivalent with a semi-parametric variant of the instrumental variables additive hazards model. We propose a class of recursive instrumental variable estimators for these exposure effects, and derive their large sample properties along with inferential tools. We examine the performance of the proposed method in simulation studies and illustrate it in a Mendelian randomization study to evaluate the effect of diabetes on mortality using data from the Health and Retirement Study. We further use the proposed method to investigate potential benefit from breast cancer screening on subsequent breast cancer mortality based on the HIP-study. © 2017, The International Biometric Society.

  12. Data Combination and Instrumental Variables in Linear Models

    ERIC Educational Resources Information Center

    Khawand, Christopher

    2012-01-01

    Instrumental variables (IV) methods allow for consistent estimation of causal effects, but suffer from poor finite-sample properties and data availability constraints. IV estimates also tend to have relatively large standard errors, often inhibiting the interpretability of differences between IV and non-IV point estimates. Lastly, instrumental…

  13. Statistical methods for biodosimetry in the presence of both Berkson and classical measurement error

    NASA Astrophysics Data System (ADS)

    Miller, Austin

    In radiation epidemiology, the true dose received by those exposed cannot be assessed directly. Physical dosimetry uses a deterministic function of the source term, distance and shielding to estimate dose. For the atomic bomb survivors, the physical dosimetry system is well established. The classical measurement errors plaguing the location and shielding inputs to the physical dosimetry system are well known. Adjusting for the associated biases requires an estimate for the classical measurement error variance, for which no data-driven estimate exists. In this case, an instrumental variable solution is the most viable option to overcome the classical measurement error indeterminacy. Biological indicators of dose may serve as instrumental variables. Specification of the biodosimeter dose-response model requires identification of the radiosensitivity variables, for which we develop statistical definitions and variables. More recently, researchers have recognized Berkson error in the dose estimates, introduced by averaging assumptions for many components in the physical dosimetry system. We show that Berkson error induces a bias in the instrumental variable estimate of the dose-response coefficient, and then address the estimation problem. This model is specified by developing an instrumental variable mixed measurement error likelihood function, which is then maximized using a Monte Carlo EM Algorithm. These methods produce dose estimates that incorporate information from both physical and biological indicators of dose, as well as the first instrumental variable based data-driven estimate for the classical measurement error variance.

  14. Critical evaluation of connectivity-based point of care testing systems of glucose in a hospital environment.

    PubMed

    Floré, Katelijne M J; Fiers, Tom; Delanghe, Joris R

    2008-01-01

    In recent years a number of point of care testing (POCT) glucometers were introduced on the market. We investigated the analytical variability (lot-to-lot variation, calibration error, inter-instrument and inter-operator variability) of glucose POCT systems in a university hospital environment and compared these results with the analytical needs required for tight glucose monitoring. The reference hexokinase method was compared to different POCT systems based on glucose oxidase (blood gas instruments) or glucose dehydrogenase (handheld glucometers). Based upon daily internal quality control data, total errors were calculated for the various glucose methods and the analytical variability of the glucometers was estimated. The total error of the glucometers far exceeded the desirable analytical specifications (based on a biological variability model). Lot-to-lot variation, inter-instrument variation and inter-operator variability contributed approximately equally to total variance. Because the distribution of hematocrit values in a hospital environment is broad, converting blood glucose into plasma values using a fixed factor further increases variance. The percentage of outliers exceeded the ISO 15197 criteria in a broad glucose concentration range. Total analytical variation of handheld glucometers is larger than expected. Clinicians should be aware that the variability of glucose measurements obtained by blood gas instruments is lower than results obtained with handheld glucometers on capillary blood.
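
For orientation, one common total-error convention from internal QC data is TE = |bias| + 1.96 × SD; the study's exact specification may differ. A toy calculation with hypothetical QC values:

```python
import numpy as np

# Hypothetical QC data; TE = |bias| + 1.96 * SD is one common convention,
# not necessarily the specification used in this study.
target = 5.5                                              # assigned QC value, mmol/L
qc = np.array([5.8, 5.6, 5.9, 5.4, 5.7, 6.0, 5.5, 5.8])  # repeated measurements

bias = qc.mean() - target                      # systematic error
total_error = abs(bias) + 1.96 * qc.std(ddof=1)
```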

  15. Treating pre-instrumental data as "missing" data: using a tree-ring-based paleoclimate record and imputations to reconstruct streamflow in the Missouri River Basin

    NASA Astrophysics Data System (ADS)

    Ho, M. W.; Lall, U.; Cook, E. R.

    2015-12-01

    Advances in paleoclimatology in the past few decades have provided opportunities to expand the temporal perspective of the hydrological and climatological variability across the world. The North American region is particularly fortunate in this respect where a relatively dense network of high resolution paleoclimate proxy records have been assembled. One such network is the annually-resolved Living Blended Drought Atlas (LBDA): a paleoclimate reconstruction of the Palmer Drought Severity Index (PDSI) that covers North America on a 0.5° × 0.5° grid based on tree-ring chronologies. However, the use of the LBDA to assess North American streamflow variability requires a model by which streamflow may be reconstructed. Paleoclimate reconstructions have typically used models that first seek to quantify the relationship between the paleoclimate variable and the environmental variable of interest before extrapolating the relationship back in time. In contrast, the pre-instrumental streamflow is here considered as "missing" data. A method of imputing the "missing" streamflow data, prior to the instrumental record, is applied through multiple imputation using chained equations for streamflow in the Missouri River Basin. In this method, the distribution of the instrumental streamflow and LBDA is used to estimate sets of plausible values for the "missing" streamflow data resulting in a ~600 year-long streamflow reconstruction. Past research into external climate forcings, oceanic-atmospheric variability and its teleconnections, and assessments of rare multi-centennial instrumental records demonstrate that large temporal oscillations in hydrological conditions are unlikely to be captured in most instrumental records. The reconstruction of multi-centennial records of streamflow will enable comprehensive assessments of current and future water resource infrastructure and operations under the existing scope of natural climate variability.
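
The "streamflow as missing data" idea can be sketched with chained-equation-style imputation; here scikit-learn's IterativeImputer stands in for the multiple-imputation procedure, on toy data in which the proxy-flow relation and all values are illustrative assumptions:

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(4)
n_years = 600
pdsi = rng.normal(size=n_years)                             # proxy (e.g. reconstructed PDSI), full record
flow = 10 + 3 * pdsi + rng.normal(scale=0.5, size=n_years)  # streamflow, correlated with the proxy

# Only the last 100 "years" of streamflow are instrumental; the rest is "missing".
flow_obs = flow.copy()
flow_obs[:500] = np.nan

data = np.column_stack([pdsi, flow_obs])
imputed = IterativeImputer(random_state=0).fit_transform(data)
flow_filled = imputed[:, 1]    # ~600-year reconstructed flow series
```

The imputed pre-instrumental values track the true (simulated) flow closely because the proxy carries most of the signal; real reconstructions would draw multiple imputations to propagate uncertainty.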

  16. Regularization Methods for High-Dimensional Instrumental Variables Regression With an Application to Genetical Genomics

    PubMed Central

    Lin, Wei; Feng, Rui; Li, Hongzhe

    2014-01-01

    In genetical genomics studies, it is important to jointly analyze gene expression data and genetic variants in exploring their associations with complex traits, where the dimensionality of gene expressions and genetic variants can both be much larger than the sample size. Motivated by such modern applications, we consider the problem of variable selection and estimation in high-dimensional sparse instrumental variables models. To overcome the difficulty of high dimensionality and unknown optimal instruments, we propose a two-stage regularization framework for identifying and estimating important covariate effects while selecting and estimating optimal instruments. The methodology extends the classical two-stage least squares estimator to high dimensions by exploiting sparsity using sparsity-inducing penalty functions in both stages. The resulting procedure is efficiently implemented by coordinate descent optimization. For the representative L1 regularization and a class of concave regularization methods, we establish estimation, prediction, and model selection properties of the two-stage regularized estimators in the high-dimensional setting where the dimensionalities of covariates and instruments are both allowed to grow exponentially with the sample size. The practical performance of the proposed method is evaluated by simulation studies and its usefulness is illustrated by an analysis of mouse obesity data. Supplementary materials for this article are available online. PMID:26392642
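
The two-stage idea, sparsity-inducing penalties grafted onto 2SLS, can be sketched in stripped-down form: a Lasso first stage selects the few relevant instruments, and the fitted exposure then serves as the instrument in the outcome equation. This omits the paper's high-dimensional covariates and concave penalties; all data and tuning values are illustrative:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(5)
n, p = 500, 200                                   # many candidate instruments

Z = rng.normal(size=(n, p))
u = rng.normal(size=n)                            # unmeasured confounder
x = Z[:, 0] + Z[:, 1] + u + rng.normal(size=n)    # only 2 instruments are relevant
y = 2.0 * x + u + rng.normal(size=n)              # true causal effect = 2.0

# Stage 1: sparse first stage selects the relevant instruments.
stage1 = Lasso(alpha=0.1).fit(Z, x)
x_hat = stage1.predict(Z)

# Stage 2: use the fitted exposure as the instrument for the outcome equation.
beta = np.cov(x_hat, y)[0, 1] / np.cov(x_hat, x)[0, 1]
```

The ratio form in stage 2 cancels the Lasso shrinkage in the fitted values, so `beta` stays near the true effect despite p being of the same order as n.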

  17. Instrumental variable methods in comparative safety and effectiveness research.

    PubMed

    Brookhart, M Alan; Rassen, Jeremy A; Schneeweiss, Sebastian

    2010-06-01

    Instrumental variable (IV) methods have been proposed as a potential approach to the common problem of uncontrolled confounding in comparative studies of medical interventions, but IV methods are unfamiliar to many researchers. The goal of this article is to provide a non-technical, practical introduction to IV methods for comparative safety and effectiveness research. We outline the principles and basic assumptions necessary for valid IV estimation, discuss how to interpret the results of an IV study, provide a review of instruments that have been used in comparative effectiveness research, and suggest some minimal reporting standards for an IV analysis. Finally, we offer our perspective of the role of IV estimation vis-à-vis more traditional approaches based on statistical modeling of the exposure or outcome. We anticipate that IV methods will be often underpowered for drug safety studies of very rare outcomes, but may be potentially useful in studies of intended effects where uncontrolled confounding may be substantial.

  18. Instrumental variables and Mendelian randomization with invalid instruments

    NASA Astrophysics Data System (ADS)

    Kang, Hyunseung

    Instrumental variables (IV) methods have been widely used to determine the causal effect of a treatment, exposure, policy, or an intervention on an outcome of interest. The IV method relies on having a valid instrument, a variable that is (A1) associated with the exposure, (A2) has no direct effect on the outcome, and (A3) is unrelated to the unmeasured confounders associated with the exposure and the outcome. However, in practice, finding a valid instrument, especially those that satisfy (A2) and (A3), can be challenging. For example, in Mendelian randomization studies where genetic markers are used as instruments, complete knowledge about instruments' validity is equivalent to complete knowledge about the involved genes' functions. The dissertation explores the theory, methods, and application of IV methods when invalid instruments are present. First, when we have multiple candidate instruments, we establish a theoretical bound whereby causal effects are only identified as long as less than 50% of instruments are invalid, without knowing which of the instruments are invalid. We also propose a fast penalized method, called sisVIVE, to estimate the causal effect. We find that sisVIVE outperforms traditional IV methods when invalid instruments are present both in simulation studies as well as in real data analysis. Second, we propose a robust confidence interval under the multiple invalid IV setting. This work is an extension of our work on sisVIVE. However, unlike sisVIVE which is robust to violations of (A2) and (A3), our confidence interval procedure provides honest coverage even if all three assumptions, (A1)-(A3), are violated. Third, we study the single IV setting where the one IV we have may actually be invalid. We propose a nonparametric IV estimation method based on full matching, a technique popular in causal inference for observational data, that leverages observed covariates to make the instrument more valid. 
We propose an estimator along with inferential results that are robust to mis-specifications of the covariate-outcome model. We also provide a sensitivity analysis should the instrument turn out to be invalid, specifically violate (A3). Fourth, in application work, we study the causal effect of malaria on stunting among children in Ghana. Previous studies on the effect of malaria and stunting were observational and contained various unobserved confounders, most notably nutritional deficiencies. To infer causality, we use the sickle cell genotype, a trait that confers some protection against malaria and was randomly assigned at birth, as an IV and apply our nonparametric IV method. We find that the risk of stunting increases by 0.22 (95% CI: 0.044,1) for every malaria episode and is sensitive to unmeasured confounders.
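
sisVIVE itself is a penalized regression and is not reproduced here; a simpler way to see the under-50%-invalid identification result is that the median of per-instrument ratio estimates is consistent whenever a majority of instruments is valid. A toy sketch with hypothetical ratio estimates:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical per-instrument ratio (Wald) estimates: 7 valid instruments
# centred on the true effect (0.8), 4 invalid ones biased upward (< 50%).
ratios = np.concatenate([
    rng.normal(0.8, 0.03, size=7),
    rng.normal(2.0, 0.03, size=4),
])

median_est = np.median(ratios)  # robust while a majority is valid
naive_mean = np.mean(ratios)    # pulled toward the invalid cluster
```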

  19. Do Two or More Multicomponent Instruments Measure the Same Construct? Testing Construct Congruence Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.; Tong, Bing

    2016-01-01

    A latent variable modeling procedure is discussed that can be used to test if two or more homogeneous multicomponent instruments with distinct components are measuring the same underlying construct. The method is widely applicable in scale construction and development research and can also be of special interest in construct validation studies.…

  20. Robust best linear estimator for Cox regression with instrumental variables in whole cohort and surrogates with additive measurement error in calibration sample

    PubMed Central

    Wang, Ching-Yun; Song, Xiao

    2017-01-01

    Biomedical researchers are often interested in estimating the effect of an environmental exposure in relation to a chronic disease endpoint. However, the exposure variable of interest may be measured with errors. In a subset of the whole cohort, a surrogate variable is available for the true unobserved exposure variable. The surrogate variable satisfies an additive measurement error model, but it may not have repeated measurements. The subset in which the surrogate variables are available is called a calibration sample. In addition to the surrogate variables that are available among the subjects in the calibration sample, we consider the situation when there is an instrumental variable available for all study subjects. An instrumental variable is correlated with the unobserved true exposure variable, and hence can be useful in the estimation of the regression coefficients. In this paper, we propose a nonparametric method for Cox regression using the observed data from the whole cohort. The nonparametric estimator is the best linear combination of a nonparametric correction estimator from the calibration sample and the difference of the naive estimators from the calibration sample and the whole cohort. The asymptotic distribution is derived, and the finite sample performance of the proposed estimator is examined via intensive simulation studies. The methods are applied to the Nutritional Biomarkers Study of the Women’s Health Initiative. PMID:27546625

  2. Robust best linear estimator for Cox regression with instrumental variables in whole cohort and surrogates with additive measurement error in calibration sample.

    PubMed

    Wang, Ching-Yun; Song, Xiao

    2016-11-01

    Biomedical researchers are often interested in estimating the effect of an environmental exposure in relation to a chronic disease endpoint. However, the exposure variable of interest may be measured with errors. In a subset of the whole cohort, a surrogate variable is available for the true unobserved exposure variable. The surrogate variable satisfies an additive measurement error model, but it may not have repeated measurements. The subset in which the surrogate variables are available is called a calibration sample. In addition to the surrogate variables that are available among the subjects in the calibration sample, we consider the situation when there is an instrumental variable available for all study subjects. An instrumental variable is correlated with the unobserved true exposure variable, and hence can be useful in the estimation of the regression coefficients. In this paper, we propose a nonparametric method for Cox regression using the observed data from the whole cohort. The nonparametric estimator is the best linear combination of a nonparametric correction estimator from the calibration sample and the difference of the naive estimators from the calibration sample and the whole cohort. The asymptotic distribution is derived, and the finite sample performance of the proposed estimator is examined via intensive simulation studies. The methods are applied to the Nutritional Biomarkers Study of the Women's Health Initiative. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Robust inference in summary data Mendelian randomization via the zero modal pleiotropy assumption.

    PubMed

    Hartwig, Fernando Pires; Davey Smith, George; Bowden, Jack

    2017-12-01

    Mendelian randomization (MR) is being increasingly used to strengthen causal inference in observational studies. Availability of summary data of genetic associations for a variety of phenotypes from large genome-wide association studies (GWAS) allows straightforward application of MR using summary data methods, typically in a two-sample design. In addition to the conventional inverse variance weighting (IVW) method, recently developed summary data MR methods, such as the MR-Egger and weighted median approaches, allow a relaxation of the instrumental variable assumptions. Here, a new method - the mode-based estimate (MBE) - is proposed to obtain a single causal effect estimate from multiple genetic instruments. The MBE is consistent when the largest number of similar (identical in infinite samples) individual-instrument causal effect estimates comes from valid instruments, even if the majority of instruments are invalid. We evaluate the performance of the method in simulations designed to mimic the two-sample summary data setting, and demonstrate its use by investigating the causal effect of plasma lipid fractions and urate levels on coronary heart disease risk. The MBE presented less bias and lower type-I error rates than other methods under the null in many situations. Its power to detect a causal effect was smaller compared with the IVW and weighted median methods, but was larger than that of MR-Egger regression, with sample size requirements typically smaller than those available from GWAS consortia. The MBE relaxes the instrumental variable assumptions, and should be used in combination with other approaches in sensitivity analyses. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association
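A toy sketch of the mode-based idea follows. It is not the authors' implementation (the published MBE uses inverse-variance weights and a data-driven bandwidth); the per-instrument ratio estimates below are made-up numbers in which a minority of invalid instruments pulls the mean away from the truth while the mode stays near the valid-instrument value:

```python
import numpy as np

# Hypothetical per-instrument causal estimates (Wald ratios): six valid
# instruments cluster near 0.5; three invalid (pleiotropic) ones do not.
ratios = np.array([0.48, 0.52, 0.50, 0.47, 0.53, 0.51, 1.4, 1.6, 0.9])

def mode_based_estimate(est, bandwidth=0.1, grid_size=2001):
    """Mode of an (unweighted) Gaussian kernel density over the estimates."""
    grid = np.linspace(est.min() - 1.0, est.max() + 1.0, grid_size)
    dens = np.exp(-0.5 * ((grid[:, None] - est[None, :]) / bandwidth) ** 2).sum(axis=1)
    return grid[np.argmax(dens)]

mbe = mode_based_estimate(ratios)
mean_est = ratios.mean()   # pulled upward by the invalid instruments
```

The mode is consistent here because the largest group of similar estimates comes from valid instruments, even though a third of the instruments are invalid.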

  4. Analysis of Observational Studies in the Presence of Treatment Selection Bias: Effects of Invasive Cardiac Management on AMI Survival Using Propensity Score and Instrumental Variable Methods

    PubMed Central

    Stukel, Thérèse A.; Fisher, Elliott S; Wennberg, David E.; Alter, David A.; Gottlieb, Daniel J.; Vermeulen, Marian J.

    2007-01-01

    Context Comparisons of outcomes between patients treated and untreated in observational studies may be biased due to differences in patient prognosis between groups, often because of unobserved treatment selection biases. Objective To compare 4 analytic methods for removing the effects of selection bias in observational studies: multivariable model risk adjustment, propensity score risk adjustment, propensity-based matching, and instrumental variable analysis. Design, Setting, and Patients A national cohort of 122 124 patients who were elderly (aged 65–84 years), receiving Medicare, and hospitalized with acute myocardial infarction (AMI) in 1994–1995, and who were eligible for cardiac catheterization. Baseline chart reviews were taken from the Cooperative Cardiovascular Project and linked to Medicare health administrative data to provide a rich set of prognostic variables. Patients were followed up for 7 years through December 31, 2001, to assess the association between long-term survival and cardiac catheterization within 30 days of hospital admission. Main Outcome Measure Risk-adjusted relative mortality rate using each of the analytic methods. Results Patients who received cardiac catheterization (n=73 238) were younger and had lower AMI severity than those who did not. After adjustment for prognostic factors by using standard statistical risk-adjustment methods, cardiac catheterization was associated with a 50% relative decrease in mortality (for multivariable model risk adjustment: adjusted relative risk [RR], 0.51; 95% confidence interval [CI], 0.50–0.52; for propensity score risk adjustment: adjusted RR, 0.54; 95% CI, 0.53–0.55; and for propensity-based matching: adjusted RR, 0.54; 95% CI, 0.52–0.56). Using regional catheterization rate as an instrument, instrumental variable analysis showed a 16% relative decrease in mortality (adjusted RR, 0.84; 95% CI, 0.79–0.90). The survival benefits of routine invasive care from randomized clinical trials are between 8% and 21%. Conclusions Estimates of the observational association of cardiac catheterization with long-term AMI mortality are highly sensitive to analytic method. All standard risk-adjustment methods have the same limitations regarding removal of unmeasured treatment selection biases. Compared with standard modeling, instrumental variable analysis may produce less biased estimates of treatment effects, but is more suited to answering policy questions than specific clinical questions. PMID:17227979

  5. Regression calibration for models with two predictor variables measured with error and their interaction, using instrumental variables and longitudinal data.

    PubMed

    Strand, Matthew; Sillau, Stefan; Grunwald, Gary K; Rabinovitch, Nathan

    2014-02-10

    Regression calibration provides a way to obtain unbiased estimators of fixed effects in regression models when one or more predictors are measured with error. Recent development of measurement error methods has focused on models that include interaction terms between measured-with-error predictors, and separately, methods for estimation in models that account for correlated data. In this work, we derive explicit and novel forms of regression calibration estimators and associated asymptotic variances for longitudinal models that include interaction terms, when data from instrumental and unbiased surrogate variables are available but not the actual predictors of interest. The longitudinal data are fit using linear mixed models that contain random intercepts and account for serial correlation and unequally spaced observations. The motivating application involves a longitudinal study of exposure to two pollutants (predictors) - outdoor fine particulate matter and cigarette smoke - and their association in interactive form with levels of a biomarker of inflammation, leukotriene E4 (LTE4, outcome) in asthmatic children. Because the exposure concentrations could not be directly observed, we used measurements from a fixed outdoor monitor and urinary cotinine concentrations as instrumental variables, and we used concentrations of fine ambient particulate matter and cigarette smoke measured with error by personal monitors as unbiased surrogate variables. We applied the derived regression calibration methods to estimate coefficients of the unobserved predictors and their interaction, allowing for direct comparison of toxicity of the different pollutants. We used simulations to verify accuracy of inferential methods based on asymptotic theory. Copyright © 2013 John Wiley & Sons, Ltd.
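The regression-calibration idea in its simplest form (one error-prone predictor, cross-sectional linear model, no interaction) can be sketched as below. The paper's actual setting of longitudinal mixed models with an interaction term is considerably more involved, and all numbers here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

x = rng.normal(size=n)                        # true exposure (unobserved)
w = x + rng.normal(scale=0.8, size=n)         # unbiased surrogate, additive error
v = 0.9 * x + rng.normal(scale=0.5, size=n)   # instrumental variable
y = 1.0 * x + rng.normal(size=n)              # outcome; true coefficient = 1.0

# Naive: regress y on the error-prone surrogate w -> attenuated estimate.
naive = np.linalg.lstsq(np.c_[np.ones(n), w], y, rcond=None)[0][1]

# Regression calibration: predict x from the instrument v (with the surrogate
# as calibration response), then regress y on the calibrated predictor.
coef = np.linalg.lstsq(np.c_[np.ones(n), v], w, rcond=None)[0]
x_hat = np.c_[np.ones(n), v] @ coef
calibrated = np.linalg.lstsq(np.c_[np.ones(n), x_hat], y, rcond=None)[0][1]
```

The naive slope is attenuated toward zero by the measurement error, while substituting the calibrated predictor removes the attenuation.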

  6. Comparison of variance estimators for meta-analysis of instrumental variable estimates

    PubMed Central

    Schmidt, AF; Hingorani, AD; Jefferis, BJ; White, J; Groenwold, RHH; Dudbridge, F

    2016-01-01

    Abstract Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two versions of the delta method (IV before or after pooling), four bootstrap estimators, a jack-knife estimator and a heteroscedasticity-consistent (HC) variance estimator were compared using simulation. Two types of meta-analyses were compared, a two-stage meta-analysis pooling results, and a one-stage meta-analysis pooling datasets. Results: Using a two-stage meta-analysis, coverage of the point estimate using bootstrapped estimators deviated from nominal levels at weak instrument settings and/or outcome probabilities ≤ 0.10. The jack-knife estimator was the least biased resampling method, the HC estimator often failed at outcome probabilities ≤ 0.50 and overall the delta method estimators were the least biased. In the presence of between-study heterogeneity, the delta method before meta-analysis performed best. Using a one-stage meta-analysis all methods performed equally well and better than two-stage meta-analysis of greater or equal size. Conclusions: In the presence of between-study heterogeneity, two-stage meta-analyses should preferentially use the delta method before meta-analysis. Weak instrument bias can be reduced by performing a one-stage meta-analysis. PMID:27591262
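For a single-instrument Wald ratio, the delta-method variance compared in this study is the familiar first-order expansion for a ratio of two estimates. A minimal sketch with hypothetical summary statistics, assuming the numerator and denominator estimates are independent:

```python
import numpy as np

# Summary statistics for one genetic instrument (hypothetical numbers):
bx, se_bx = 0.20, 0.02   # instrument-exposure association and its SE
by, se_by = 0.05, 0.01   # instrument-outcome association and its SE

# Wald ratio estimate of the causal effect.
beta_iv = by / bx

# First-order delta-method variance of the ratio by / bx.
var_iv = (se_by / bx) ** 2 + (by ** 2 / bx ** 4) * se_bx ** 2
se_iv = np.sqrt(var_iv)
```

With a weak instrument (small bx), the second term blows up, which is one reason the bootstrap and jack-knife alternatives compared in the study behave differently in weak-instrument settings.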

  7. A Direct Latent Variable Modeling Based Method for Point and Interval Estimation of Coefficient Alpha

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2015-01-01

    A direct approach to point and interval estimation of Cronbach's coefficient alpha for multiple component measuring instruments is outlined. The procedure is based on a latent variable modeling application with widely circulated software. As a by-product, using sample data the method permits ascertaining whether the population discrepancy…

  8. Introduction to meteorological measurements and data handling for solar energy applications. Task IV. Development of an insolation handbook and instrument package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The following are covered: the Sun and its radiation, solar radiation and atmospheric interaction, solar radiation measurement methods, spectral irradiance measurements of natural sources, the measurement of infrared radiation, the measurement of circumsolar radiation, some empirical properties of solar radiation and related parameters, duration of sunshine, and meteorological variables related to solar energy. Included in appendices are manufacturers and distributors of solar radiation measuring instruments and an approximate method for quality control of solar radiation instruments. (MHR)

  9. Invited Commentary: Using Financial Credits as Instrumental Variables for Estimating the Causal Relationship Between Income and Health.

    PubMed

    Pega, Frank

    2016-05-01

    Social epidemiologists are interested in determining the causal relationship between income and health. Natural experiments in which individuals or groups receive income randomly or quasi-randomly from financial credits (e.g., tax credits or cash transfers) are increasingly being analyzed using instrumental variable analysis. For example, in this issue of the Journal, Hamad and Rehkopf (Am J Epidemiol. 2016;183(9):775-784) used an in-work tax credit called the Earned Income Tax Credit as an instrument to estimate the association between income and child development. However, under certain conditions, the use of financial credits as instruments could violate 2 key instrumental variable analytic assumptions. First, some financial credits may directly influence health, for example, through increasing a psychological sense of welfare security. Second, financial credits and health may have several unmeasured common causes, such as politics, other social policies, and the motivation to maximize the credit. If epidemiologists pursue such instrumental variable analyses, using the amount of an unconditional, universal credit that an individual or group has received as the instrument may produce the most conceptually convincing and generalizable evidence. However, other natural income experiments (e.g., lottery winnings) and other methods that allow better adjustment for confounding might be more promising approaches for estimating the causal relationship between income and health. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. Latent class instrumental variables: A clinical and biostatistical perspective

    PubMed Central

    Baker, Stuart G.; Kramer, Barnett S.; Lindeman, Karen S.

    2015-01-01

    In some two-arm randomized trials, some participants receive the treatment assigned to the other arm as a result of technical problems, refusal of a treatment invitation, or a choice of treatment in an encouragement design. In some before-and-after studies, the availability of a new treatment changes from one time period to the next. Under assumptions that are often reasonable, the latent class instrumental variable (IV) method estimates the effect of treatment received in the aforementioned scenarios involving all-or-none compliance and all-or-none availability. Key aspects are four initial latent classes (sometimes called principal strata) based on treatment received if in each randomization group or time period, the exclusion restriction assumption (in which randomization group or time period is an instrumental variable), the monotonicity assumption (which drops an implausible latent class from the analysis), and the estimated effect of receiving treatment in one latent class (sometimes called efficacy, the local average treatment effect, or the complier average causal effect). Since its independent formulations in the biostatistics and econometrics literatures, the latent class IV method (which has no well-established name) has gained increasing popularity. We review the latent class IV method from a clinical and biostatistical perspective, focusing on underlying assumptions, methodological extensions, and applications in our fields of obstetrics and cancer research. PMID:26239275

  11. A systematic review of statistical methods used to test for reliability of medical instruments measuring continuous variables.

    PubMed

    Zaki, Rafdzah; Bulgiba, Awang; Nordin, Noorhaire; Azina Ismail, Noor

    2013-06-01

    Reliability measures precision, or the extent to which test results can be replicated. This is the first ever systematic review to identify statistical methods used to measure reliability of equipment measuring continuous variables. This study also aims to highlight inappropriate statistical methods used in reliability analysis and their implications for medical practice. In 2010, five electronic databases were searched between 2007 and 2009 to look for reliability studies. A total of 5,795 titles were initially identified. Only 282 titles were potentially related, and finally 42 fitted the inclusion criteria. The Intra-class Correlation Coefficient (ICC) is the most popular method, used in 25 (60%) studies, followed by comparison of means (8 studies, or 19%). Of the 25 studies using the ICC, only 7 (28%) reported the confidence intervals and the type of ICC used. Most studies (71%) also tested the agreement of instruments. This study finds that the Intra-class Correlation Coefficient is the most popular method used to assess the reliability of medical instruments measuring continuous outcomes. There are also inappropriate applications and interpretations of statistical methods in some studies. It is important for medical researchers to be aware of this issue and to be able to correctly perform analysis in reliability studies.
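The intra-class correlation coefficient that the review finds most common can be computed directly from ANOVA mean squares. A sketch of the one-way random-effects form, ICC(1,1), on hypothetical test-retest data (the review notes the type of ICC should always be reported, since the two-way forms give different values):

```python
import numpy as np

# Two repeated measurements on 6 subjects by one instrument
# (hypothetical data, wide format: rows = subjects, columns = trials).
m = np.array([
    [10.1, 10.3],
    [12.0, 11.8],
    [ 9.5,  9.6],
    [11.2, 11.5],
    [10.8, 10.7],
    [12.5, 12.4],
])

def icc_oneway(x):
    """One-way random-effects ICC(1,1) from a subjects-by-trials matrix."""
    n, k = x.shape
    grand = x.mean()
    ms_between = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_within = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

icc = icc_oneway(m)
```

Here between-subject variability dwarfs trial-to-trial noise, so the ICC is close to 1, indicating high reliability.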

  12. Under What Assumptions Do Site-by-Treatment Instruments Identify Average Causal Effects?

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Raudenbush, Stephen W.

    2013-01-01

    The increasing availability of data from multi-site randomized trials provides a potential opportunity to use instrumental variables methods to study the effects of multiple hypothesized mediators of the effect of a treatment. We derive nine assumptions needed to identify the effects of multiple mediators when using site-by-treatment interactions…

  13. Combining fixed effects and instrumental variable approaches for estimating the effect of psychosocial job quality on mental health: evidence from 13 waves of a nationally representative cohort study.

    PubMed

    Milner, Allison; Aitken, Zoe; Kavanagh, Anne; LaMontagne, Anthony D; Pega, Frank; Petrie, Dennis

    2017-06-23

    Previous studies suggest that poor psychosocial job quality is a risk factor for mental health problems, but they use conventional regression analytic methods that cannot rule out reverse causation, unmeasured time-invariant confounding and reporting bias. This study combines two quasi-experimental approaches to improve causal inference by better accounting for these biases: (i) linear fixed effects regression analysis and (ii) linear instrumental variable analysis. We extract 13 annual waves of national cohort data including 13 260 working-age (18-64 years) employees. The exposure variable is self-reported level of psychosocial job quality. The instruments used are two common workplace entitlements. The outcome variable is the Mental Health Inventory (MHI-5). We adjust for measured time-varying confounders. In the fixed effects regression analysis adjusted for time-varying confounders, a 1-point increase in psychosocial job quality is associated with a 1.28-point improvement in mental health on the MHI-5 scale (95% CI: 1.17, 1.40; P < 0.001). When the fixed effects approach was combined with the instrumental variable analysis, a 1-point increase in psychosocial job quality is related to a 1.62-point improvement on the MHI-5 scale (95% CI: -0.24, 3.48; P = 0.088). Our quasi-experimental results provide evidence to confirm job stressors as risk factors for mental ill health using methods that improve causal inference. © The Author 2017. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  14. Here Be Dragons: Effective (X-ray) Timing with the Cospectrum

    NASA Astrophysics Data System (ADS)

    Huppenkothen, Daniela; Bachetti, Matteo

    2018-01-01

    In recent years, the cross spectrum has received considerable attention as a means of characterizing the variability of astronomical sources as a function of wavelength. While much has been written about the statistics of time and phase lags, the cospectrum—the real part of the cross spectrum—has only recently been understood as a means of mitigating instrumental effects dependent on temporal frequency in astronomical detectors, as well as a method of characterizing the coherent variability in two wavelength ranges on different time scales. In this talk, I will present recent advances made in understanding the statistical properties of cospectra, leading to much improved inferences for periodic and quasi-periodic signals. I will also present a new method to reliably mitigate instrumental effects such as dead time in X-ray detectors, and show how we can use the cospectrum to model highly variable sources such as X-ray binaries or Active Galactic Nuclei.
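The cospectrum itself is straightforward to compute: it is the real part of the product of one channel's Fourier transform with the complex conjugate of the other's. A minimal sketch on simulated data with a shared periodic signal (hypothetical parameters; the dead-time mitigation that is the hard part of the talk is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4096
t = np.arange(n)

# Two simultaneous time series sharing a coherent periodic signal plus
# independent noise (a stand-in for two detector channels).
signal = np.sin(2 * np.pi * t / 64.0)
a = signal + rng.normal(size=n)
b = signal + rng.normal(size=n)

# Cross spectrum of the two channels; the cospectrum is its real part.
fa, fb = np.fft.rfft(a), np.fft.rfft(b)
cospectrum = (fa * np.conj(fb)).real

freqs = np.fft.rfftfreq(n, d=1.0)
peak_freq = freqs[np.argmax(cospectrum[1:]) + 1]  # skip the zero-frequency bin
```

Because the noise in the two channels is independent, its contribution to the cospectrum averages toward zero, while the coherent signal survives at its frequency (1/64 here).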

  15. Evaluation of Validity and Reliability for Hierarchical Scales Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2012-01-01

    A latent variable modeling method is outlined, which accomplishes estimation of criterion validity and reliability for a multicomponent measuring instrument with hierarchical structure. The approach provides point and interval estimates for the scale criterion validity and reliability coefficients, and can also be used for testing composite or…

  16. Specifying and Refining a Complex Measurement Model.

    ERIC Educational Resources Information Center

    Levy, Roy; Mislevy, Robert J.

    This paper aims to describe a Bayesian approach to modeling and estimating cognitive models both in terms of statistical machinery and actual instrument development. Such a method taps the knowledge of experts to provide initial estimates for the probabilistic relationships among the variables in a multivariate latent variable model and refines…

  17. Using instrumental variables to estimate a Cox's proportional hazards regression subject to additive confounding

    PubMed Central

    Tosteson, Tor D.; Morden, Nancy E.; Stukel, Therese A.; O'Malley, A. James

    2014-01-01

    The estimation of treatment effects is one of the primary goals of statistics in medicine. Estimation based on observational studies is subject to confounding. Statistical methods for controlling bias due to confounding include regression adjustment, propensity scores and inverse probability weighted estimators. These methods require that all confounders are recorded in the data. The method of instrumental variables (IVs) can eliminate bias in observational studies even in the absence of information on confounders. We propose a method for integrating IVs within the framework of Cox's proportional hazards model and demonstrate the conditions under which it recovers the causal effect of treatment. The methodology is based on the approximate orthogonality of an instrument with unobserved confounders among those at risk. We derive an estimator as the solution to an estimating equation that resembles the score equation of the partial likelihood in much the same way as the traditional IV estimator resembles the normal equations. To justify this IV estimator for a Cox model we perform simulations to evaluate its operating characteristics. Finally, we apply the estimator to an observational study of the effect of coronary catheterization on survival. PMID:25506259

  18. Using instrumental variables to estimate a Cox's proportional hazards regression subject to additive confounding.

    PubMed

    MacKenzie, Todd A; Tosteson, Tor D; Morden, Nancy E; Stukel, Therese A; O'Malley, A James

    2014-06-01

    The estimation of treatment effects is one of the primary goals of statistics in medicine. Estimation based on observational studies is subject to confounding. Statistical methods for controlling bias due to confounding include regression adjustment, propensity scores and inverse probability weighted estimators. These methods require that all confounders are recorded in the data. The method of instrumental variables (IVs) can eliminate bias in observational studies even in the absence of information on confounders. We propose a method for integrating IVs within the framework of Cox's proportional hazards model and demonstrate the conditions under which it recovers the causal effect of treatment. The methodology is based on the approximate orthogonality of an instrument with unobserved confounders among those at risk. We derive an estimator as the solution to an estimating equation that resembles the score equation of the partial likelihood in much the same way as the traditional IV estimator resembles the normal equations. To justify this IV estimator for a Cox model we perform simulations to evaluate its operating characteristics. Finally, we apply the estimator to an observational study of the effect of coronary catheterization on survival.

  19. Construction of the descriptive system for the assessment of quality of life AQoL-6D utility instrument

    PubMed Central

    2012-01-01

    Background Multi attribute utility (MAU) instruments are used to include the health related quality of life (HRQoL) in economic evaluations of health programs. Comparative studies suggest different MAU instruments measure related but different constructs. The objective of this paper is to describe the methods employed to achieve content validity in the descriptive system of the Assessment of Quality of Life (AQoL)-6D, MAU instrument. Methods The AQoL program introduced the use of psychometric methods in the construction of health related MAU instruments. To develop the AQoL-6D we selected 112 items from previous research, focus groups and expert judgment and administered them to 316 members of the public and 302 hospital patients. The search for content validity across a broad spectrum of health states required both formative and reflective modelling. We employed Exploratory Factor Analysis and Structural Equation Modelling (SEM) to meet these dual requirements. Results and Discussion The resulting instrument employs 20 items in a multi-tier descriptive system. Latent dimension variables achieve sensitive descriptions of 6 dimensions which, in turn, combine to form a single latent QoL variable. Diagnostic statistics from the SEM analysis are exceptionally good and confirm the hypothesised structure of the model. Conclusions The AQoL-6D descriptive system has good psychometric properties. They imply that the instrument has achieved construct validity and provides a sensitive description of HRQoL. This means that it may be used with confidence for measuring health related quality of life and that it is a suitable basis for modelling utilities for inclusion in the economic evaluation of health programs. PMID:22507254

  20. Interpreting findings from Mendelian randomization using the MR-Egger method.

    PubMed

    Burgess, Stephen; Thompson, Simon G

    2017-05-01

    Mendelian randomization-Egger (MR-Egger) is an analysis method for Mendelian randomization using summarized genetic data. MR-Egger consists of three parts: (1) a test for directional pleiotropy, (2) a test for a causal effect, and (3) an estimate of the causal effect. While conventional analysis methods for Mendelian randomization assume that all genetic variants satisfy the instrumental variable assumptions, the MR-Egger method is able to assess whether genetic variants have pleiotropic effects on the outcome that differ on average from zero (directional pleiotropy), as well as to provide a consistent estimate of the causal effect, under a weaker assumption-the InSIDE (INstrument Strength Independent of Direct Effect) assumption. In this paper, we provide a critical assessment of the MR-Egger method with regard to its implementation and interpretation. While the MR-Egger method is a worthwhile sensitivity analysis for detecting violations of the instrumental variable assumptions, there are several reasons why causal estimates from the MR-Egger method may be biased and have inflated Type 1 error rates in practice, including violations of the InSIDE assumption and the influence of outlying variants. The issues raised in this paper have potentially serious consequences for causal inferences from the MR-Egger approach. We give examples of scenarios in which the estimates from conventional Mendelian randomization methods and MR-Egger differ, and discuss how to interpret findings in such cases.
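The contrast between conventional IVW and MR-Egger can be seen in a small numerical sketch: both fit a weighted regression of variant-outcome associations on variant-exposure associations, but IVW forces the line through the origin while MR-Egger estimates an intercept that absorbs average directional pleiotropy. All summary statistics below are made up:

```python
import numpy as np

# Hypothetical summary data: six variants with a constant directional
# pleiotropic effect of 0.02 on the outcome; true causal effect = 0.5.
bx = np.array([0.10, 0.15, 0.20, 0.25, 0.30, 0.35])
by = 0.5 * bx + 0.02
se_by = np.full_like(by, 0.01)
w = 1.0 / se_by ** 2                  # inverse-variance weights

# IVW: weighted regression through the origin -> biased by pleiotropy.
ivw = (w * bx * by).sum() / (w * bx ** 2).sum()

# MR-Egger: same regression with an intercept; the intercept estimates the
# average directional pleiotropy, the slope the causal effect.
X = np.c_[np.ones_like(bx), bx]
intercept, slope = np.linalg.lstsq(
    X * np.sqrt(w)[:, None], by * np.sqrt(w), rcond=None
)[0]
```

In this constructed example the Egger slope recovers 0.5 exactly and the intercept recovers the pleiotropy of 0.02, while IVW is pulled upward; the paper's point is that in practice the Egger estimate is only consistent under the additional InSIDE assumption and can itself be badly biased.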

  1. Use of allele scores as instrumental variables for Mendelian randomization

    PubMed Central

    Burgess, Stephen; Thompson, Simon G

    2013-01-01

    Background An allele score is a single variable summarizing multiple genetic variants associated with a risk factor. It is calculated as the total number of risk factor-increasing alleles for an individual (unweighted score), or the sum of weights for each allele corresponding to estimated genetic effect sizes (weighted score). An allele score can be used in a Mendelian randomization analysis to estimate the causal effect of the risk factor on an outcome. Methods Data were simulated to investigate the use of allele scores in Mendelian randomization where conventional instrumental variable techniques using multiple genetic variants demonstrate ‘weak instrument’ bias. The robustness of estimates using the allele score to misspecification (for example non-linearity, effect modification) and to violations of the instrumental variable assumptions was assessed. Results Causal estimates using a correctly specified allele score were unbiased with appropriate coverage levels. The estimates were generally robust to misspecification of the allele score, but not to instrumental variable violations, even if the majority of variants in the allele score were valid instruments. Using a weighted rather than an unweighted allele score increased power, but the increase was small when genetic variants had similar effect sizes. Naive use of the data under analysis to choose which variants to include in an allele score, or for deriving weights, resulted in substantial biases. Conclusions Allele scores enable valid causal estimates with large numbers of genetic variants. The stringency of criteria for genetic variants in Mendelian randomization should be maintained for all variants in an allele score. PMID:24062299
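Constructing the two score types described above is a one-liner each; the sketch below then uses the weighted score as a single instrument in a Wald ratio. Genotypes, weights, and effect sizes are all simulated and hypothetical; note the abstract's warning that weights must come from external data, not from the analysis sample:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 2_000, 10

g = rng.binomial(2, 0.3, size=(n, p))      # genotype matrix: 0/1/2 risk alleles
effects = rng.uniform(0.2, 0.5, size=p)    # weights (externally derived in practice)

unweighted_score = g.sum(axis=1)           # total count of risk alleles
weighted_score = g @ effects               # effect-size-weighted sum

# Using the weighted score as a single instrument in a Wald ratio estimate:
x = g @ effects + rng.normal(size=n)       # risk factor
y = 0.3 * x + rng.normal(size=n)           # outcome; true causal effect = 0.3
beta = np.cov(weighted_score, y)[0, 1] / np.cov(weighted_score, x)[0, 1]
```

Collapsing many variants into one score avoids the weak-instrument bias that accumulates when each variant is entered as a separate instrument.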

  2. Latent class instrumental variables: a clinical and biostatistical perspective.

    PubMed

    Baker, Stuart G; Kramer, Barnett S; Lindeman, Karen S

    2016-01-15

    In some two-arm randomized trials, some participants receive the treatment assigned to the other arm as a result of technical problems, refusal of a treatment invitation, or a choice of treatment in an encouragement design. In some before-and-after studies, the availability of a new treatment changes from one time period to the next. Under assumptions that are often reasonable, the latent class instrumental variable (IV) method estimates the effect of treatment received in the aforementioned scenarios involving all-or-none compliance and all-or-none availability. Key aspects are four initial latent classes (sometimes called principal strata) based on treatment received if in each randomization group or time period, the exclusion restriction assumption (in which randomization group or time period is an instrumental variable), the monotonicity assumption (which drops an implausible latent class from the analysis), and the estimated effect of receiving treatment in one latent class (sometimes called efficacy, the local average treatment effect, or the complier average causal effect). Since its independent formulations in the biostatistics and econometrics literatures, the latent class IV method (which has no well-established name) has gained increasing popularity. We review the latent class IV method from a clinical and biostatistical perspective, focusing on underlying assumptions, methodological extensions, and applications in our fields of obstetrics and cancer research. Copyright © 2015 John Wiley & Sons, Ltd.

  3. Toward a clearer portrayal of confounding bias in instrumental variable applications.

    PubMed

    Jackson, John W; Swanson, Sonja A

    2015-07-01

    Recommendations for reporting instrumental variable analyses often include presenting the balance of covariates across levels of the proposed instrument and levels of the treatment. However, such presentation can be misleading as relatively small imbalances among covariates across levels of the instrument can result in greater bias because of bias amplification. We introduce bias plots and bias component plots as alternative tools for understanding biases in instrumental variable analyses. Using previously published data on proposed preference-based, geography-based, and distance-based instruments, we demonstrate why presenting covariate balance alone can be problematic, and how bias component plots can provide more accurate context for bias from omitting a covariate from an instrumental variable versus non-instrumental variable analysis. These plots can also provide relevant comparisons of different proposed instruments considered in the same data. Adaptable code is provided for creating the plots.

  4. Evaluation of Scale Reliability with Binary Measures Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Dimitrov, Dimiter M.; Asparouhov, Tihomir

    2010-01-01

    A method for interval estimation of scale reliability with discrete data is outlined. The approach is applicable with multi-item instruments consisting of binary measures, and is developed within the latent variable modeling methodology. The procedure is useful for evaluation of consistency of single measures and of sum scores from item sets…

  5. Reconstructing pre-instrumental streamflow in Eastern Australia using a water balance approach

    NASA Astrophysics Data System (ADS)

    Tozer, C. R.; Kiem, A. S.; Vance, T. R.; Roberts, J. L.; Curran, M. A. J.; Moy, A. D.

    2018-03-01

    Streamflow reconstructions based on paleoclimate proxies provide much longer records than the short instrumental period records on which water resource management plans are currently based. In Australia there is a lack of in-situ high resolution paleoclimate proxy records, but remote proxies with teleconnections to Australian climate have utility in producing streamflow reconstructions. Here we investigate, via a case study for a catchment in eastern Australia, the novel use of an Antarctic ice-core based rainfall reconstruction within a Budyko-framework to reconstruct ∼1000 years of annual streamflow. The resulting streamflow reconstruction captures interannual to decadal variability in the instrumental streamflow, validating both the use of the ice core rainfall proxy record and the Budyko-framework method. In the preinstrumental era the streamflow reconstruction shows longer wet and dry epochs and periods of streamflow variability that are higher than observed in the instrumental era. Importantly, for both the instrumental record and preinstrumental reconstructions, the wet (dry) epochs in the rainfall record are shorter (longer) in the streamflow record and this non-linearity must be considered when inferring hydroclimatic risk or historical water availability directly from rainfall proxy records alone. These insights provide a better understanding of present infrastructure vulnerability in the context of past climate variability for eastern Australia. The streamflow reconstruction presented here also provides a better understanding of the range of hydroclimatic variability possible, and therefore represents a more realistic baseline on which to quantify the potential impacts of anthropogenic climate change on water security.

  6. Density dependence and climate effects in Rocky Mountain elk: an application of regression with instrumental variables for population time series with sampling error.

    PubMed

    Creel, Scott; Creel, Michael

    2009-11-01

    1. Sampling error in annual estimates of population size creates two widely recognized problems for the analysis of population growth. First, if sampling error is mistakenly treated as process error, one obtains inflated estimates of the variation in true population trajectories (Staples, Taper & Dennis 2004). Second, treating sampling error as process error is thought to overestimate the importance of density dependence in population growth (Viljugrein et al. 2005; Dennis et al. 2006). 2. In ecology, state-space models are used to account for sampling error when estimating the effects of density and other variables on population growth (Staples et al. 2004; Dennis et al. 2006). In econometrics, regression with instrumental variables is a well-established method that addresses the problem of correlation between regressors and the error term, but requires fewer assumptions than state-space models (Davidson & MacKinnon 1993; Cameron & Trivedi 2005). 3. We used instrumental variables to account for sampling error and fit a generalized linear model to 472 annual observations of population size for 35 Elk Management Units in Montana, from 1928 to 2004. We compared this model with state-space models fit with the likelihood function of Dennis et al. (2006). We discuss the general advantages and disadvantages of each method. Briefly, regression with instrumental variables is valid with fewer distributional assumptions, but state-space models are more efficient when their distributional assumptions are met. 4. Both methods found that population growth was negatively related to population density and winter snow accumulation. Summer rainfall and wolf (Canis lupus) presence had much weaker effects on elk (Cervus elaphus) dynamics [though limitation by wolves is strong in some elk populations with well-established wolf populations (Creel et al. 2007; Creel & Christianson 2008)]. 5. 
Coupled with predictions for Montana from global and regional climate models, our results predict a substantial reduction in the limiting effect of snow accumulation on Montana elk populations in the coming decades. If other limiting factors do not operate with greater force, population growth rates would increase substantially.
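
    The logic of point 3 — instrumenting the current observed count with a lagged count, because sampling error in the current count contaminates the observed growth rate — can be sketched as follows. The dynamics and noise levels here are hypothetical, not the Montana elk data:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 2000                     # hypothetical length (the elk study had 472 observations)

# Gompertz-type dynamics with density dependence b < 0 (illustrative parameters)
a, b = 0.5, -0.3
logN = np.empty(T)
logN[0] = a / -b             # start at equilibrium
for t in range(T - 1):
    logN[t + 1] = logN[t] + a + b * logN[t] + 0.2 * rng.normal()

# Observed counts carry sampling error e_t. The observed growth rate
# r_t = log n_{t+1} - log n_t contains -e_t, so it is negatively correlated
# with the regressor log n_t = log N_t + e_t, inflating density dependence.
logn = logN + 0.2 * rng.normal(size=T)
r = logn[1:] - logn[:-1]

def slope(x, y):
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

naive = slope(logn[:-1], r)  # OLS: biased toward stronger density dependence

# Instrument: the previous observed count, correlated with true abundance
# but not with the current sampling error.
z, x, y = logn[:-2], logn[1:-1], r[1:]
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

print(naive, iv)   # naive is more negative than the true b = -0.3; iv is close to b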

  7. The use of a combination of instrumental methods to assess change in sensory crispness during storage of a "Honeycrisp" apple breeding family.

    PubMed

    Chang, Hsueh-Yuan; Vickers, Zata M; Tong, Cindy B S

    2018-04-01

    Loss of crispness in apple fruit during storage reduces the fruit's fresh sensation and consumer acceptance. Apple varieties that maintain crispness thus have higher potential for longer-term consumer appeal. To efficiently phenotype crispness, several instrumental methods have been tested, but variable results were obtained when different apple varieties were assayed. To extend these studies, we assessed the extent to which instrumental measurements correlate to and predict sensory crispness, with a focus on crispness maintenance. We used an apple breeding family derived from a cross between "Honeycrisp" and "MN1764," which segregates for crispness maintenance. Three types of instrumental measurements (puncture, snapping, and mechanical-acoustic tests) and sensory evaluation were performed on fruit at harvest and after 8 weeks of cold storage. Overall, 20 genotypes from the family and the 2 parents were characterized by 19 force and acoustic measures. In general, crispness was more related to force than to acoustic measures. Force linear distance and maximum force as measured by the mechanical-acoustic test were best correlated with sensory crispness and change in crispness, respectively. The correlations varied by apple genotype. The best multiple linear regression model to predict change in sensory crispness between harvest and storage of fruit of this breeding family incorporated both force and acoustic measures. This work compared the abilities of instrumental tests to predict sensory crispness maintenance of apple fruit. The use of an instrumental method that is highly correlated to sensory crispness evaluation can enhance the efficiency and reduce the cost of measuring crispness for breeding purposes. This study showed that sensory crispness and change in crispness after storage of an apple breeding family were reliably predicted with a combination of instrumental measurements and multiple variable analyses. 
The strategy potentially can be applied to other apple varieties for more accurate interpretation of crispness maintenance measured instrumentally. © 2018 Wiley Periodicals, Inc.

  8. High resolution geochemical proxy record of the last 600yr in a speleothem from the northwest Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Iglesias González, Miguel; Pisonero, Jorge; Cheng, Hai; Edwards, R. Lawrence; Stoll, Heather

    2017-04-01

    In meteorology and climatology, the instrumental period is the interval over which meteorological data have been measured directly by instruments, allowing the evolution of climate over roughly the last 150 years to be determined. Early station coverage was sparse, and a reasonably dense network over most of the globe exists only for the last 75-100 years. This period is short for analyzing the relationship between geochemical and instrumental variability in a speleothem, so very high resolution data are needed to establish the connection between the two during the instrumental period and thereby determine the evolution of climate over the last 600 years. Here we present a high-resolution speleothem record from a cave located in the middle of the Cantabrian Mountains, with no anthropogenic influence and no seasonal CO2 variability. This 600-yr stalagmite, dated by the U/Th method, with a growth rate of 100 to 200 micrometers/yr calculated with the Bchron model, provides accurate information on climate conditions near the cave. Trace elements were analyzed at 8-micrometer intervals by Laser Ablation ICP-MS, which resolves even monthly variability over the last 600 years, with particular attention to Sr, Mg, Al and Si. Given the absence of seasonal variability and the presence of a river inside the cave, these data provide valuable information on extreme flood events inside the cave over the whole period, which are related to precipitation and snow-melt events outside the cave. We identify more extreme flood events during the Little Ice Age than in the last 100 yr. Trace element data obtained by ICP-AES at a spatial resolution of 0.2 mm allow us to compare the geochemical variability measured with the two techniques. We also analyzed the stable isotopes d13C and d18O at a spatial resolution of 0.2 mm, enabling us to identify variations and possible correlations among the isotopes, the trace elements, and instrumental records from weather stations located near the cave. We use the statistical correlation between our proxy and the instrumental data to calibrate the record and analyze its variability over the 600 years, which provides substantial information on climate variability. Despite the significant global warming of the last 25 years, variability during this period is lower than during the transition between the Medieval Warm Period and the Little Ice Age. We also analyze this variability over the 600 years with wavelet analysis, with particular attention to the instrumental period. This method identifies several cycles in both trace elements and stable isotopes at timescales compatible with decadal and multidecadal variability, similar to important climate indices such as the AMO.

  9. The added value of time-variable microgravimetry to the understanding of how volcanoes work

    USGS Publications Warehouse

    Carbone, Daniele; Poland, Michael; Greco, Filippo; Diament, Michel

    2017-01-01

    During the past few decades, time-variable volcano gravimetry has shown great potential for imaging subsurface processes at active volcanoes (including some processes that might otherwise remain “hidden”), especially when combined with other methods (e.g., ground deformation, seismicity, and gas emissions). By supplying information on changes in the distribution of bulk mass over time, gravimetry can provide information regarding processes such as magma accumulation in void space, gas segregation at shallow depths, and mechanisms driving volcanic uplift and subsidence. Despite its potential, time-variable volcano gravimetry is an underexploited method, not widely adopted by volcano researchers or observatories. The cost of instrumentation and the difficulty in using it under harsh environmental conditions is a significant impediment to the exploitation of gravimetry at many volcanoes. In addition, retrieving useful information from gravity changes in noisy volcanic environments is a major challenge. While these difficulties are not trivial, neither are they insurmountable; indeed, creative efforts in a variety of volcanic settings highlight the value of time-variable gravimetry for understanding hazards as well as revealing fundamental insights into how volcanoes work. Building on previous work, we provide a comprehensive review of time-variable volcano gravimetry, including discussions of instrumentation, modeling and analysis techniques, and case studies that emphasize what can be learned from campaign, continuous, and hybrid gravity observations. We are hopeful that this exploration of time-variable volcano gravimetry will excite more scientists about the potential of the method, spurring further application, development, and innovation.

  10. Additional information for “TREMOR: A Wireless, MEMS Accelerograph for Dense Arrays” (Evans et al., 2003)

    USGS Publications Warehouse

    Evans, John R.; Hamstra, Robert H.; Spudich, Paul; Kundig, Christoph; Camina, Patrick; Rogers, John A.

    2003-01-01

    The length of Evans et al. (2003) necessitated transfer of several less germane sections to this alternate forum to meet that venue’s needs. These sections include a description of the development of Figure 1, the plot of spatial variability so critical to the argument for dense arrays of strong-motion instruments; the description of the rapid, integer, computational method for PGV used in the TREMOR instrument (the Oakland instrument, the commercial prototype, and the commercial instrument); siting methods and strategies used for Class B TREMOR instruments and those that can be used for Class C instruments to preserve the cost advantages of such systems; and some general discussion of MEMS accelerometers, including a comparative Table with representative examples of Class A, B and C MEMS devices. (“MEMS” means “Micro-ElectroMechanical” Systems—“micromachined” sensors, generally of silicon. Classes A, B, and C are defined in Table 1.)

  11. Provider payments and patient charges as policy tools for cost-containment: How successful are they in high-income countries?

    PubMed Central

    Carrin, Guy; Hanvoravongchai, Piya

    2003-01-01

    In this paper, we focus on those policy instruments with monetary incentives that are used to contain public health expenditure in high-income countries. First, a schematic view of the main cost-containment methods and the variables in the health system they intend to influence is presented. Two types of instruments to control the level and growth of public health expenditure are considered: (i) provider payment methods that influence the price and quantity of health care, and (ii) cost-containment measures that influence the behaviour of patients. The first type includes fee-for-service, per diem payment, case payment, capitation, salaries and budgets. The second type consists of patient charges and reference price systems for pharmaceuticals. Second, we provide an overview of experience in high-income countries that use or have used these particular instruments. Finally, the paper assesses the overall potential of these instruments in cost-containment policies. PMID:12914661

  12. Application of Influence Diagrams in Identifying Soviet Satellite Missions

    DTIC Science & Technology

    1990-12-01

    Influence diagramming is a method which allows the simple construction of a model to illustrate the interrelationships which exist among variables by capturing an... The module also contained an array of instruments for geophysical and astrophysical experimentation.

  13. An introduction to instrumental variables analysis: part 1.

    PubMed

    Bennett, Derrick A

    2010-01-01

    There are several examples in the medical literature where treatment effects suggested by observational studies have been refuted by evidence from subsequent large-scale randomised trials. This is because non-experimental studies are subject to confounding, and confounding cannot be entirely eliminated even if all known confounders have been measured in the study, as there may be unknown confounders. The aim of this 2-part methodological primer is to introduce an emerging methodology, known as the method of instrumental variables, for estimating treatment effects from observational data in the absence of good randomised evidence. Copyright © 2010 S. Karger AG, Basel.

  14. Instrumental Variable Methods for Continuous Outcomes That Accommodate Nonignorable Missing Baseline Values.

    PubMed

    Ertefaie, Ashkan; Flory, James H; Hennessy, Sean; Small, Dylan S

    2017-06-15

    Instrumental variable (IV) methods provide unbiased treatment effect estimation in the presence of unmeasured confounders under certain assumptions. To provide valid estimates of treatment effect, treatment effect confounders that are associated with the IV (IV-confounders) must be included in the analysis, and not including observations with missing values may lead to bias. Missing covariate data are particularly problematic when the probability that a value is missing is related to the value itself, which is known as nonignorable missingness. In such cases, imputation-based methods are biased. Using health-care provider preference as an IV method, we propose a 2-step procedure with which to estimate a valid treatment effect in the presence of baseline variables with nonignorable missing values. First, the provider preference IV value is estimated by performing a complete-case analysis using a random-effects model that includes IV-confounders. Second, the treatment effect is estimated using a 2-stage least squares IV approach that excludes IV-confounders with missing values. Simulation results are presented, and the method is applied to an analysis comparing the effects of sulfonylureas versus metformin on body mass index, where the variables baseline body mass index and glycosylated hemoglobin have missing values. Our result supports the association of sulfonylureas with weight gain. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
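
    A simplified sketch of the two-step idea follows, with a leave-one-out prescribing proportion standing in for the random-effects estimate of provider preference (an assumption of this sketch, not the authors' exact model), and no missing-data step included:

```python
import numpy as np

rng = np.random.default_rng(2)
n_providers, per = 200, 50
n = n_providers * per
provider = np.repeat(np.arange(n_providers), per)

# Hypothetical simulation: each provider has a latent prescribing preference.
pref = rng.normal(size=n_providers)[provider]
u = rng.normal(size=n)                                  # unmeasured confounder
treat = ((pref + u + rng.normal(size=n)) > 0).astype(float)
y = 1.0 * treat + 2.0 * u + rng.normal(size=n)          # true treatment effect = 1.0

# Step 1 (stand-in for the random-effects model): estimate each provider's
# preference as the leave-one-out proportion of their other patients treated.
counts = np.bincount(provider, weights=treat)
z = (counts[provider] - treat) / (per - 1)

# Step 2: two-stage least squares with the estimated preference as instrument.
x_hat = np.polyval(np.polyfit(z, treat, 1), z)          # first stage: fitted treatment
effect_2sls = np.polyfit(x_hat, y, 1)[0]                # second stage

naive = np.polyfit(treat, y, 1)[0]                      # confounded OLS comparison
print(naive, effect_2sls)   # naive is badly biased; 2SLS is near 1.0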

  15. The prediction of nonlinear dynamic loads on helicopters from flight variables using artificial neural networks

    NASA Technical Reports Server (NTRS)

    Cook, A. B.; Fuller, C. R.; O'Brien, W. F.; Cabell, R. H.

    1992-01-01

    A method of indirectly monitoring component loads through common flight variables is proposed which requires an accurate model of the underlying nonlinear relationships. An artificial neural network (ANN) model learns relationships through exposure to a database of flight variable records and corresponding load histories from an instrumented military helicopter undergoing standard maneuvers. The ANN model, utilizing eight standard flight variables as inputs, is trained to predict normalized time-varying mean and oscillatory loads on two critical components over a range of seven maneuvers. Both interpolative and extrapolative capabilities are demonstrated with agreement between predicted and measured loads on the order of 90 percent to 95 percent. This work justifies pursuing the ANN method of predicting loads from flight variables.

  16. Eliminating Survivor Bias in Two-stage Instrumental Variable Estimators.

    PubMed

    Vansteelandt, Stijn; Walter, Stefan; Tchetgen Tchetgen, Eric

    2018-07-01

    Mendelian randomization studies commonly focus on elderly populations. This makes the instrumental variables analysis of such studies sensitive to survivor bias, a type of selection bias. A particular concern is that the instrumental variable conditions, even when valid for the source population, may be violated for the selective population of individuals who survive the onset of the study. This is potentially very damaging because Mendelian randomization studies are known to be sensitive to bias due to even minor violations of the instrumental variable conditions. Interestingly, the instrumental variable conditions continue to hold within certain risk sets of individuals who are still alive at a given age when the instrument and unmeasured confounders exert additive effects on the exposure, and moreover, the exposure and unmeasured confounders exert additive effects on the hazard of death. In this article, we will exploit this property to derive a two-stage instrumental variable estimator for the effect of exposure on mortality, which is insulated against the above described selection bias under these additivity assumptions.

  17. Sources of Biased Inference in Alcohol and Drug Services Research: An Instrumental Variable Approach

    PubMed Central

    Schmidt, Laura A.; Tam, Tammy W.; Larson, Mary Jo

    2012-01-01

    Objective: This study examined the potential for biased inference due to endogeneity when using standard approaches for modeling the utilization of alcohol and drug treatment. Method: Results from standard regression analysis were compared with those that controlled for endogeneity using instrumental variables estimation. Comparable models predicted the likelihood of receiving alcohol treatment based on the widely used Aday and Andersen medical care–seeking model. Data were from the National Epidemiologic Survey on Alcohol and Related Conditions and included a representative sample of adults in households and group quarters throughout the contiguous United States. Results: Findings suggested that standard approaches for modeling treatment utilization are prone to bias because of uncontrolled reverse causation and omitted variables. Compared with instrumental variables estimation, standard regression analyses produced downwardly biased estimates of the impact of alcohol problem severity on the likelihood of receiving care. Conclusions: Standard approaches for modeling service utilization are prone to underestimating the true effects of problem severity on service use. Biased inference could lead to inaccurate policy recommendations, for example, by suggesting that people with milder forms of substance use disorder are more likely to receive care than is actually the case. PMID:22152672

  18. EVALUATION OF METHODS FOR THE DETERMINATION OF DIESEL-GENERATED FINE PARTICULATE MATTER: PHYSICAL CHARACTERIZATION OF RESULTS

    EPA Science Inventory

    A multi-phase instrument comparison study was conducted on two different diesel engines on a dynamometer to compare commonly used particulate matter (PM) measurement techniques while sampling the same diesel exhaust aerosol and to evaluate inter- and intra-method variability. In...

  19. College Quality and Early Adult Outcomes

    ERIC Educational Resources Information Center

    Long, Mark C.

    2008-01-01

    This paper estimates the effects of various college qualities on several early adult outcomes, using panel data from the National Education Longitudinal Study. I compare the results using ordinary least squares with three alternative methods of estimation, including instrumental variables, and the methods used by Dale and Krueger [(2002).…

  20. Evaluation of Criterion Validity for Scales with Congeneric Measures

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2007-01-01

    A method for estimating criterion validity of scales with homogeneous components is outlined. It accomplishes point and interval estimation of interrelationship indices between composite scores and criterion variables and is useful for testing hypotheses about criterion validity of measurement instruments. The method can also be used with missing…

  1. A novel technique for fetal heart rate estimation from Doppler ultrasound signal

    PubMed Central

    2011-01-01

    Background The currently used fetal monitoring instrumentation based on the Doppler ultrasound technique provides the fetal heart rate (FHR) signal with limited accuracy. This is particularly noticeable as a significant decrease of a clinically important feature - the variability of the FHR signal. The aim of our work was to develop a novel, efficient technique for processing the ultrasound signal that could estimate the cardiac cycle duration with accuracy comparable to direct electrocardiography. Methods We propose a new technique which provides the true beat-to-beat values of the FHR signal through multiple measurement of a given cardiac cycle in the ultrasound signal. The method consists of three steps: dynamic adjustment of the autocorrelation window, adaptive autocorrelation peak detection, and determination of beat-to-beat intervals. The estimated fetal heart rate values and calculated indices describing FHR variability were compared to reference data obtained from the direct fetal electrocardiogram, as well as to another method for FHR estimation. Results The results revealed that our method increases the accuracy in comparison to currently used fetal monitoring instrumentation, and thus enables the calculation of reliable parameters describing FHR variability. Relating these results to the other method for FHR estimation, we showed that our approach rejected a much lower number of measured cardiac cycles as invalid. Conclusions The proposed method for fetal heart rate determination on a beat-to-beat basis offers high accuracy of the heart interval measurement, enabling reliable quantitative assessment of FHR variability while reducing the number of invalid cardiac cycle measurements. PMID:21999764
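
    The autocorrelation step at the heart of the method can be illustrated on a synthetic periodic envelope (a stand-in for the real Doppler signal; the sampling rate, cycle duration, and search bounds are hypothetical):

```python
import numpy as np

fs = 1000                    # sampling rate in Hz (hypothetical)
true_interval = 0.42         # cardiac cycle duration in s (~143 bpm)

# Stand-in for the Doppler envelope: one pulse per cardiac cycle plus noise.
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(3)
envelope = np.exp(-((t % true_interval) / 0.03) ** 2) + 0.1 * rng.normal(size=t.size)

def interval_from_autocorr(x, fs, lo=0.25, hi=0.75):
    """Cycle duration = lag of the autocorrelation peak within plausible bounds."""
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[x.size - 1:]
    lags = np.arange(ac.size) / fs
    band = (lags >= lo) & (lags <= hi)
    return lags[band][np.argmax(ac[band])]

est = interval_from_autocorr(envelope, fs)
print(est, 60.0 / est)       # estimated interval (s) and heart rate (bpm)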

  2. High-Frequency X-ray Variability Detection in A Black Hole Transient with USA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shabad, Gayane

    2000-10-16

    Studies of high-frequency variability (above ~100 Hz) in X-ray binaries provide a unique opportunity to explore the fundamental physics of spacetime and matter, since the orbital timescale on the order of several milliseconds is the timescale of the motion of matter through the region located in close proximity to a compact stellar object. The detection of weak high-frequency signals in X-ray binaries depends on how well we understand the level of Poisson noise due to the photon counting statistics, i.e. how well we can understand and model the detector deadtime and other instrumental systematic effects. We describe the preflight timing calibration work performed on the Unconventional Stellar Aspect (USA) X-ray detector to study deadtime and timing issues. We developed a Monte Carlo deadtime model and deadtime correction methods for the USA experiment. The instrumental noise power spectrum can be estimated to within ~0.1% accuracy in the case when no energy-dependent instrumental effect is present. We also developed correction techniques to account for an energy-dependent instrumental effect. The developed methods were successfully tested on USA Cas A and Cygnus X-1 data. This work allowed us to detect a weak signal in a black hole candidate (BHC) transient.

  3. Do radio frequencies of medical instruments common in the operating room interfere with near-infrared spectroscopy signals?

    NASA Astrophysics Data System (ADS)

    Shadgan, Babak; Molavi, Behnam; Reid, W. D.; Dumont, Guy; Macnab, Andrew J.

    2010-02-01

    Background: Medical and diagnostic applications of near infrared spectroscopy (NIRS) are increasing, especially in operating rooms (OR). Since NIRS is an optical technique, radio frequency (RF) interference from other instruments is unlikely to affect the raw optical data, however, NIRS data processing and signal output could be affected. Methods: We investigated the potential for three common OR instruments: an electrical cautery, an orthopaedic drill and an imaging system, to generate electromagnetic interference (EMI) that could potentially influence NIRS signals. The time of onset and duration of every operation of each device was recorded during surgery. To remove the effects of slow changing physiological variables, we first used a lowpass filter and then selected 2 windows with variable lengths around the moment of device onset. For each instant, variances (energy) and means of the signals in the 2 windows were compared. Results: Twenty patients were studied during ankle surgery. Analysis shows no statistically significant difference in the means and variance of the NIRS signals (p < 0.01) during operation of any of the three devices for all surgeries. Conclusion: This method confirms the instruments evaluated caused no significant interference. NIRS can potentially be used without EMI in clinical environments such as the OR.

  4. Clinical evaluation of a chemomechanical method for caries removal in children and adolescents.

    PubMed

    Peric, Tamara; Markovic, Dejan; Petrovic, Bojan

    2009-01-01

    The purpose of this study was to make a clinical comparison of the chemomechanical method for caries removal and the conventional rotary instruments technique when used in children and adolescents. The study comprised 120 patients aged 3-17 years randomized into two groups: caries were removed chemomechanically in 60 patients and 60 patients received conventional treatment with rotary instruments. The outcome variables were: clinically complete caries removal, pain during caries removal, need for local anesthesia, treatment time, preferences of patients, and clinical success of the restorations during the 12-month evaluation period. Complete caries removal was achieved in 92% of chemomechanically treated teeth and in all teeth treated with rotary instruments (p>0.05). The chemomechanical method significantly reduced the need for local anesthesia (p<0.001). Eighty-five percent of patients treated with Carisolv and 47% treated with rotary instruments were satisfied with the treatment (p<0.05). The mean time for chemomechanical caries removal was 11.2 ± 3.3 min and 5.2 ± 2.8 min for caries removal with rotary instruments (p<0.001). At the end of the 12-month evaluation period, there was no observed influence of the caries removal method on the survival of the restorations. The chemomechanical caries removal technique is an adequate alternative to the conventional rotary instruments method and is advantageous in pediatric dentistry.

  5. Implementation of Instrumental Variable Bounds for Data Missing Not at Random.

    PubMed

    Marden, Jessica R; Wang, Linbo; Tchetgen, Eric J Tchetgen; Walter, Stefan; Glymour, M Maria; Wirth, Kathleen E

    2018-05-01

    Instrumental variables are routinely used to recover a consistent estimator of an exposure causal effect in the presence of unmeasured confounding. Instrumental variable approaches to account for nonignorable missing data also exist but are less familiar to epidemiologists. Like instrumental variables for exposure causal effects, instrumental variables for missing data rely on exclusion restriction and instrumental variable relevance assumptions. Yet these two conditions alone are insufficient for point identification. For estimation, researchers have invoked a third assumption, typically involving fairly restrictive parametric constraints. Inferences can be sensitive to these parametric assumptions, which are typically not empirically testable. The purpose of our article is to discuss another approach for leveraging a valid instrumental variable. Although the approach is insufficient for nonparametric identification, it can nonetheless provide informative inferences about the presence, direction, and magnitude of selection bias, without invoking a third untestable parametric assumption. An important contribution of this article is an Excel spreadsheet tool that can be used to obtain empirical evidence of selection bias and calculate bounds and corresponding Bayesian 95% credible intervals for a nonidentifiable population proportion. For illustrative purposes, we used the spreadsheet tool to analyze HIV prevalence data collected by the 2007 Zambia Demographic and Health Survey (DHS).
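
    One simple version of such bounds — worst-case (Manski-type) bounds on a proportion within each level of the instrument, narrowed by intersecting them under the exclusion restriction — can be computed directly. The counts below are hypothetical, not the Zambia DHS data; an empty intersection would itself be empirical evidence against the assumptions:

```python
def manski_bounds(n_pos, n_obs, n_total):
    """Worst-case bounds on a proportion when n_total - n_obs outcomes are missing:
    the missing could all be negative (lower) or all positive (upper)."""
    lower = n_pos / n_total
    upper = (n_pos + (n_total - n_obs)) / n_total
    return lower, upper

# Hypothetical counts per instrument level: (positives among observed,
# number observed, number approached).
levels = [(60, 420, 500), (45, 460, 500), (35, 380, 500)]
per_level = [manski_bounds(*lvl) for lvl in levels]

# Exclusion restriction: the true proportion is the same at every instrument
# level, so the identified set is the intersection of the per-level bounds.
lower = max(b[0] for b in per_level)
upper = min(b[1] for b in per_level)
print(per_level)
print(lower, upper)          # intersected bounds, narrower than any single level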

  6. Construction of the descriptive system for the Assessment of Quality of Life AQoL-6D utility instrument.

    PubMed

    Richardson, Jeffrey R J; Peacock, Stuart J; Hawthorne, Graeme; Iezzi, Angelo; Elsworth, Gerald; Day, Neil A

    2012-04-17

    Multi-attribute utility (MAU) instruments are used to include health-related quality of life (HRQoL) in economic evaluations of health programs. Comparative studies suggest different MAU instruments measure related but different constructs. The objective of this paper is to describe the methods employed to achieve content validity in the descriptive system of the Assessment of Quality of Life (AQoL-6D) MAU instrument. The AQoL program introduced the use of psychometric methods in the construction of health-related MAU instruments. To develop the AQoL-6D we selected 112 items from previous research, focus groups, and expert judgment and administered them to 316 members of the public and 302 hospital patients. The search for content validity across a broad spectrum of health states required both formative and reflective modelling. We employed Exploratory Factor Analysis and Structural Equation Modelling (SEM) to meet these dual requirements. The resulting instrument employs 20 items in a multi-tier descriptive system. Latent dimension variables achieve sensitive descriptions of six dimensions which, in turn, combine to form a single latent QoL variable. Diagnostic statistics from the SEM analysis are exceptionally good and confirm the hypothesised structure of the model. The AQoL-6D descriptive system has good psychometric properties, implying that the instrument has achieved construct validity and provides a sensitive description of HRQoL. This means it may be used with confidence for measuring health-related quality of life and is a suitable basis for modelling utilities for inclusion in the economic evaluation of health programs.

  7. Too much ado about instrumental variable approach: is the cure worse than the disease?

    PubMed

    Baser, Onur

    2009-01-01

    To review the efficacy of instrumental variable (IV) models in addressing a variety of assumption violations to ensure standard ordinary least squares (OLS) estimates are consistent. IV models gained popularity in outcomes research because of their ability to estimate average causal effects consistently even in the presence of unmeasured confounding. However, for this consistent estimation to be achieved, several conditions must hold. In this article, we provide an overview of the IV approach, examine possible tests to check the prerequisite conditions, and illustrate how weak instruments may produce inconsistent and inefficient results. We use two IVs and apply Shea's partial R-square method, the Anderson canonical correlation test, and the Cragg-Donald test to check for weak instruments. Hall-Peixe tests are applied to see if any of these instruments are redundant in the analysis. A total of 14,952 asthma patients from the MarketScan Commercial Claims and Encounters Database were examined in this study. Patient health care was provided under a variety of fee-for-service, fully capitated, and partially capitated health plans, including preferred provider organizations, point of service plans, indemnity plans, and health maintenance organizations. We used the controller-reliever copay ratio and physician practice/prescribing patterns as instruments. We demonstrated that the former was a weak and redundant instrument, producing inconsistent and inefficient estimates of the effect of treatment. The results were worse than those from standard regression analysis. Despite the obvious benefits of IV models, the method should not be used blindly. Several strong conditions are required for these models to work, and each of them should be tested. Otherwise, the bias and precision of the results will be statistically worse than those achieved by simply using standard OLS.
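
    The formal tests named above are available in Stata and R packages; a simpler, closely related diagnostic is the first-stage F statistic on the excluded instruments (values above roughly 10 are a common rule of thumb against weak instruments). A from-scratch sketch, assuming the hypothetical `first_stage_F` helper below rather than any package routine:

```python
import numpy as np

def first_stage_F(treatment, instruments, covariates=None):
    """F statistic for the excluded instruments in the first-stage
    regression: compare residual sums of squares with and without them."""
    t = np.asarray(treatment, float)
    n = len(t)
    X0 = np.ones((n, 1)) if covariates is None else \
        np.column_stack([np.ones(n), covariates])
    X1 = np.column_stack([X0, instruments])

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, t, rcond=None)
        resid = t - X @ beta
        return resid @ resid

    q = X1.shape[1] - X0.shape[1]          # number of excluded instruments
    return ((rss(X0) - rss(X1)) / q) / (rss(X1) / (n - X1.shape[1]))
```

    A strong instrument yields a large F; an instrument uncorrelated with the treatment yields an F near its null distribution, flagging the weak-instrument problem the abstract warns about.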

  8. Genetic instrumental variable regression: Explaining socioeconomic and health outcomes in nonexperimental data

    PubMed Central

    DiPrete, Thomas A.; Burik, Casper A. P.; Koellinger, Philipp D.

    2018-01-01

    Identifying causal effects in nonexperimental data is an enduring challenge. One proposed solution that recently gained popularity is the idea to use genes as instrumental variables [i.e., Mendelian randomization (MR)]. However, this approach is problematic because many variables of interest are genetically correlated, which implies the possibility that many genes could affect both the exposure and the outcome directly or via unobserved confounding factors. Thus, pleiotropic effects of genes are themselves a source of bias in nonexperimental data that would also undermine the ability of MR to correct for endogeneity bias from nongenetic sources. Here, we propose an alternative approach, genetic instrumental variable (GIV) regression, that provides estimates for the effect of an exposure on an outcome in the presence of pleiotropy. As a valuable byproduct, GIV regression also provides accurate estimates of the chip heritability of the outcome variable. GIV regression uses polygenic scores (PGSs) for the outcome of interest which can be constructed from genome-wide association study (GWAS) results. By splitting the GWAS sample for the outcome into nonoverlapping subsamples, we obtain multiple indicators of the outcome PGSs that can be used as instruments for each other and, in combination with other methods such as sibling fixed effects, can address endogeneity bias from both pleiotropy and the environment. In two empirical applications, we demonstrate that our approach produces reasonable estimates of the chip heritability of educational attainment (EA) and show that standard regression and MR provide upwardly biased estimates of the effect of body height on EA. PMID:29686100

  9. Genetic instrumental variable regression: Explaining socioeconomic and health outcomes in nonexperimental data.

    PubMed

    DiPrete, Thomas A; Burik, Casper A P; Koellinger, Philipp D

    2018-05-29

    Identifying causal effects in nonexperimental data is an enduring challenge. One proposed solution that recently gained popularity is the idea to use genes as instrumental variables [i.e., Mendelian randomization (MR)]. However, this approach is problematic because many variables of interest are genetically correlated, which implies the possibility that many genes could affect both the exposure and the outcome directly or via unobserved confounding factors. Thus, pleiotropic effects of genes are themselves a source of bias in nonexperimental data that would also undermine the ability of MR to correct for endogeneity bias from nongenetic sources. Here, we propose an alternative approach, genetic instrumental variable (GIV) regression, that provides estimates for the effect of an exposure on an outcome in the presence of pleiotropy. As a valuable byproduct, GIV regression also provides accurate estimates of the chip heritability of the outcome variable. GIV regression uses polygenic scores (PGSs) for the outcome of interest which can be constructed from genome-wide association study (GWAS) results. By splitting the GWAS sample for the outcome into nonoverlapping subsamples, we obtain multiple indicators of the outcome PGSs that can be used as instruments for each other and, in combination with other methods such as sibling fixed effects, can address endogeneity bias from both pleiotropy and the environment. In two empirical applications, we demonstrate that our approach produces reasonable estimates of the chip heritability of educational attainment (EA) and show that standard regression and MR provide upwardly biased estimates of the effect of body height on EA. Copyright © 2018 the Author(s). Published by PNAS.

  10. Versailles Project on Advanced Materials and Standards Interlaboratory Study on Measuring the Thickness and Chemistry of Nanoparticle Coatings Using XPS and LEIS.

    PubMed

    Belsey, Natalie A; Cant, David J H; Minelli, Caterina; Araujo, Joyce R; Bock, Bernd; Brüner, Philipp; Castner, David G; Ceccone, Giacomo; Counsell, Jonathan D P; Dietrich, Paul M; Engelhard, Mark H; Fearn, Sarah; Galhardo, Carlos E; Kalbe, Henryk; Won Kim, Jeong; Lartundo-Rojas, Luis; Luftman, Henry S; Nunney, Tim S; Pseiner, Johannes; Smith, Emily F; Spampinato, Valentina; Sturm, Jacobus M; Thomas, Andrew G; Treacy, Jon P W; Veith, Lothar; Wagstaffe, Michael; Wang, Hai; Wang, Meiling; Wang, Yung-Chen; Werner, Wolfgang; Yang, Li; Shard, Alexander G

    2016-10-27

    We report the results of a VAMAS (Versailles Project on Advanced Materials and Standards) inter-laboratory study on the measurement of the shell thickness and chemistry of nanoparticle coatings. Peptide-coated gold particles were supplied to laboratories in two forms: a colloidal suspension in pure water, and particles dried onto a silicon wafer. Participants prepared and analyzed these samples using either X-ray photoelectron spectroscopy (XPS) or low energy ion scattering (LEIS). Careful data analysis revealed some significant sources of discrepancy, particularly for XPS. Degradation during transportation, storage or sample preparation resulted in a variability in thickness of 53%. The calculation method chosen by XPS participants contributed a variability of 67%. However, a variability of 12% was achieved for the samples deposited using a single method and by choosing photoelectron peaks that were not adversely affected by instrumental transmission effects. The study identified a need for more consistency in instrumental transmission functions and relative sensitivity factors, since this contributed a variability of 33%. The results from the LEIS participants were more consistent, with a variability of less than 10% in thickness, mostly due to a common method of data analysis. The calculation was performed using a model developed for uniform, flat films, and some participants employed a correction factor to account for the sample geometry, which appears warranted based upon a simulation of LEIS data from one of the participants and comparison to the XPS results.

  11. Measuring organizational and individual factors thought to influence the success of quality improvement in primary care: a systematic review of instruments

    PubMed Central

    2012-01-01

    Background Continuous quality improvement (CQI) methods are widely used in healthcare; however, the effectiveness of the methods is variable, and evidence about the extent to which contextual and other factors modify effects is limited. Investigating the relationship between these factors and CQI outcomes poses challenges for those evaluating CQI, among the most complex of which relate to the measurement of modifying factors. We aimed to provide guidance to support the selection of measurement instruments by systematically collating, categorising, and reviewing quantitative self-report instruments. Methods Data sources: We searched MEDLINE, PsycINFO, and Health and Psychosocial Instruments, reference lists of systematic reviews, and citations and references of the main report of instruments. Study selection: The scope of the review was determined by a conceptual framework developed to capture factors relevant to evaluating CQI in primary care (the InQuIRe framework). Papers reporting development or use of an instrument measuring a construct encompassed by the framework were included. Data extracted included instrument purpose; theoretical basis, constructs measured and definitions; development methods and assessment of measurement properties. Analysis and synthesis: We used qualitative analysis of instrument content and our initial framework to develop a taxonomy for summarising and comparing instruments. Instrument content was categorised using the taxonomy, illustrating coverage of the InQuIRe framework. Methods of development and evidence of measurement properties were reviewed for instruments with potential for use in primary care. Results We identified 186 potentially relevant instruments, 152 of which were analysed to develop the taxonomy. Eighty-four instruments measured constructs relevant to primary care, with content measuring CQI implementation and use (19 instruments), organizational context (51 instruments), and individual factors (21 instruments). 
Forty-one instruments were included for full review. Development methods were often pragmatic, rather than systematic and theory-based, and evidence supporting measurement properties was limited. Conclusions Many instruments are available for evaluating CQI, but most require further use and testing to establish their measurement properties. Further development and use of these measures in evaluations should increase the contribution made by individual studies to our understanding of CQI and enhance our ability to synthesise evidence for informing policy and practice. PMID:23241168

  12. FFT Deconvolution of Be Star Hα Line Profiles

    NASA Astrophysics Data System (ADS)

    Austin, S. J.

    2005-12-01

    We have been monitoring the spectroscopic variability of Be stars using the UCA Fiber Fed Spectrograph. The spectra cover the Hα line at a resolution of 0.8 Angstrom/pixel. The observed line profiles are a convolution of the actual profile and the instrumental profile. A Fast Fourier Transform (FFT) method has been used to deconvolve the observed profiles, given the instrument profile obtained by observing the narrow lines from the HgNe wavelength calibration lamp. The long-term monitoring of the spectroscopic variability of Be stars is crucial for testing the various Be star models. Deconvolved Hα line profiles, velocities, and variability are shown for gamma Cas, delta Sco, chi Oph, eta PsA, 48 Lib, and upsilon Sgr (HD181615). Funding has been provided by the UCA University Research Council and the Arkansas Space Grant Consortium.
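
    The deconvolution step amounts to division in the Fourier domain. The authors do not publish their code; the hypothetical `fft_deconvolve` below is a minimal sketch using a Wiener-style damping term `eps` so that frequencies where the instrument profile's spectrum is tiny do not amplify noise:

```python
import numpy as np

def fft_deconvolve(observed, instrument_profile, eps=1e-3):
    """Recover a line profile by regularized Fourier division.

    observed           : measured profile (true profile * instrument profile)
    instrument_profile : e.g. a narrow calibration-lamp line, same sampling
    eps                : Wiener-style regularizer; larger values suppress
                         more noise at the cost of residual smoothing.
    """
    O = np.fft.fft(observed)
    P = np.fft.fft(instrument_profile, n=len(observed))
    if P[0] != 0:
        P = P / P[0]                      # normalize profile to unit area
    D = O * np.conj(P) / (np.abs(P) ** 2 + eps)
    return np.real(np.fft.ifft(D))
```

    With noiseless data and a well-sampled kernel, small `eps` recovers the true profile almost exactly; real spectra need `eps` tuned against the noise level.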

  13. Literature review of some selected types of results and statistical analyses of total-ozone data. [for the ozonosphere

    NASA Technical Reports Server (NTRS)

    Myers, R. H.

    1976-01-01

    The depletion of ozone in the stratosphere is examined, and causes for the depletion are cited. Ground station and satellite measurements of ozone, which are taken on a worldwide basis, are discussed. Instruments used in ozone measurement are discussed, such as the Dobson spectrophotometer, which is credited with providing the longest and most extensive series of ground-based observations of stratospheric ozone. Other ground-based instruments used to measure ozone are also discussed. The statistical differences of ground-based measurements of ozone from these different instruments are compared to each other, and to satellite measurements. Mathematical methods (e.g., trend analysis or linear regression analysis) of analyzing the variability of ozone concentration with respect to time and latitude are described. Various time series models which can be employed in accounting for ozone concentration variability are examined.

  14. Bias and Bias Correction in Multi-Site Instrumental Variables Analysis of Heterogeneous Mediator Effects

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Unlu, Faith; Zhu, Pei; Bloom, Howard

    2013-01-01

    We explore the use of instrumental variables (IV) analysis with a multi-site randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the instrumental variables literature as the…

  15. Turbidity threshold sampling: Methods and instrumentation

    Treesearch

    Rand Eads; Jack Lewis

    2001-01-01

    Traditional methods for determining the frequency of suspended sediment sample collection often rely on measurements, such as water discharge, that are not well correlated to sediment concentration. Stream power is generally not a good predictor of sediment concentration for rivers that transport the bulk of their load as fines, due to the highly variable routing of...

  16. An instrument for rapid, accurate, determination of fuel moisture content

    Treesearch

    Stephen S. Sackett

    1980-01-01

    Moisture contents of dead and living fuels are key variables in fire behavior. Accurate, real-time fuel moisture data are required for prescribed burning and wildfire behavior predictions. The convection oven method has become the standard for direct fuel moisture content determination. Efforts to quantify fuel moisture through indirect methods have not been...

  17. Intrajudge and Interjudge Reliability of the Stuttering Severity Instrument-Fourth Edition.

    PubMed

    Davidow, Jason H; Scott, Kathleen A

    2017-11-08

    The Stuttering Severity Instrument (SSI) is a tool used to measure the severity of stuttering. Previous versions of the instrument have known limitations (e.g., Lewis, 1995). The present study examined the intra- and interjudge reliability of the newest version, the Stuttering Severity Instrument-Fourth Edition (SSI-4) (Riley, 2009). Twelve judges who were trained on the SSI-4 protocol participated. Judges collected SSI-4 data while viewing 4 videos of adults who stutter at Time 1 and 4 weeks later at Time 2. Data were analyzed for intra- and interjudge reliability of the SSI-4 subscores (for Frequency, Duration, and Physical Concomitants), total score, and final severity rating. Intra- and interjudge reliability across the subscores and total score concurred with the manual's reported reliability when reliability was calculated using the methods described in the manual. New calculations of judge agreement produced different values from those in the manual-for the 3 subscores, total score, and final severity rating-and provided data absent from the manual. Clinicians and researchers who use the SSI-4 should carefully consider the limitations of the instrument. Investigation into the multitasking demands of the instrument may provide information on whether separating the collection of data for specific variables will improve intra- and interjudge reliability of those variables.
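
    Agreement statistics of the kind the study recalculates can be computed directly from paired score sets. The `intrajudge_agreement` helper and its tolerance parameter below are illustrative sketches, not the SSI-4 manual's procedure:

```python
import numpy as np

def intrajudge_agreement(time1, time2, tolerance=0):
    """Share of a judge's ratings that agree across two sessions
    (within an absolute tolerance), plus the Pearson correlation
    of the two score sets."""
    t1 = np.asarray(time1, float)
    t2 = np.asarray(time2, float)
    agreement = np.mean(np.abs(t1 - t2) <= tolerance)
    r = np.corrcoef(t1, t2)[0, 1]
    return agreement, r
```

    The same function applied across two judges' scores for the same videos gives a crude interjudge figure; note that a high correlation can coexist with low exact agreement, which is one reason different calculation methods yield different reliability values.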

  18. Compliance-Effect Correlation Bias in Instrumental Variables Estimators

    ERIC Educational Resources Information Center

    Reardon, Sean F.

    2010-01-01

    Instrumental variable estimators hold the promise of enabling researchers to estimate the effects of educational treatments that are not (or cannot be) randomly assigned but that may be affected by randomly assigned interventions. Examples of the use of instrumental variables in such cases are increasingly common in educational and social science…

  19. Evidence for a possible modern and mid-Holocene solar influence on climate from Lake Titicaca, South America

    NASA Astrophysics Data System (ADS)

    Theissen, K. M.; Dunbar, R. B.

    2005-12-01

    In tropical regions, there are few paleoclimate archives with the necessary resolution to investigate climate variability at interannual-to-decadal timescales prior to the onset of the instrumental record. Interannual variability associated with the El Niño Southern Oscillation (ENSO) is well documented in the instrumental record and the importance of the precessional forcing of millennial variability has been established in studies of tropical paleoclimate records. In contrast, decade-to-century variability is still poorly understood. Here, we examine interannual to decadal variability in the northern Altiplano of South America using digital image analysis of a floating interval of varved sediments of middle Holocene age (~6160-6310 yr BP) from Lake Titicaca. Multi-taper method (MTM) and wavelet frequency-domain analyses were performed on a time series generated from a gray-scaled digital image of the mm-thick laminations. Our results indicate significant power at a decadal periodicity (10-12 years) associated with the Schwabe cycle of solar activity. Frequency-domain analysis also indicates power at 2-2.5 year periodicities associated with ENSO. Similarly, spectral analysis of a 75 year instrumental record of Titicaca lake level shows significant power at both solar and ENSO periodicities. Although both of the examined records are short, our results imply that during both the mid-Holocene and modern times, solar and ENSO variability may have contributed to high frequency climate fluctuations over the northern Altiplano. We suspect that solar influence on large-scale atmospheric circulation features may account for the decadal variability in the mid-Holocene and present-day water balance of the Altiplano.
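
    Locating a dominant periodicity like the 10-12 year Schwabe signal can be illustrated with a plain FFT periodogram; the study itself used the more robust multi-taper and wavelet methods. The `dominant_period` helper is a hypothetical sketch assuming an evenly sampled series:

```python
import numpy as np

def dominant_period(series, dt=1.0):
    """Return the period (in units of dt) of the strongest spectral peak
    in an evenly sampled, detrended-by-mean series, via an FFT periodogram."""
    x = np.asarray(series, float) - np.mean(series)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)
    k = np.argmax(power[1:]) + 1          # skip the zero-frequency bin
    return 1.0 / freqs[k]
```

    Varve-thickness series are rarely this clean; in practice one also needs a significance test of the peak against a red-noise background, which is what the multi-taper method provides.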

  20. Flexible Method for Developing Tactics, Techniques, and Procedures for Future Capabilities

    DTIC Science & Technology

    2009-02-01

    levels of ability, military experience, and motivation, (b) number and type of significant events, and (c) other sources of natural variability...research has developed a number of specific instruments designed to aid in this process. Second, the iterative, feed-forward nature of the method allows...FLEX method), but still lack the structured KE approach and iterative, feed-forward nature of the FLEX method. To facilitate decision making

  1. Variables selection methods in near-infrared spectroscopy.

    PubMed

    Xiaobo, Zou; Jiewen, Zhao; Povey, Malcolm J W; Holmes, Mel; Hanpin, Mao

    2010-05-14

    Near-infrared (NIR) spectroscopy has increasingly been adopted as an analytical tool in various fields, such as the petrochemical, pharmaceutical, environmental, clinical, agricultural, food and biomedical sectors, during the past 15 years. A NIR spectrum of a sample is typically measured by modern scanning instruments at hundreds of equally spaced wavelengths. The large number of spectral variables in most data sets encountered in NIR spectral chemometrics often renders the prediction of a dependent variable unreliable. Recently, considerable effort has been directed towards developing and evaluating different procedures that objectively identify variables which contribute useful information and/or eliminate variables containing mostly noise. This review focuses on variable selection methods in NIR spectroscopy. Selection methods include classical approaches such as the manual approach (knowledge-based selection) and "univariate" and "sequential" selection methods; sophisticated methods such as the successive projections algorithm (SPA) and uninformative variable elimination (UVE); elaborate search-based strategies such as simulated annealing (SA), artificial neural networks (ANN) and genetic algorithms (GAs); and interval-based algorithms such as interval partial least squares (iPLS), windows PLS and iterative PLS. Wavelength selection with B-splines, Kalman filtering, Fisher's weights and Bayesian methods is also mentioned. Finally, the websites of some variable selection software and toolboxes for non-commercial use are given. Copyright 2010 Elsevier B.V. All rights reserved.
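
    As a concrete instance of the "univariate" family of filters the review describes, one can rank wavelengths by absolute correlation with the response and keep only the strongest. This is a minimal sketch; the `univariate_select` name and `n_keep` parameter are illustrative, and real chemometric practice would cross-validate the selection:

```python
import numpy as np

def univariate_select(X, y, n_keep):
    """Rank spectral variables (columns of X) by |Pearson correlation|
    with the response y and return the indices of the top n_keep."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    r = (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum()) + 1e-12)
    return np.argsort(-np.abs(r))[:n_keep]
```

    Methods such as UVE and SPA go further by accounting for collinearity between wavelengths, which a purely univariate ranking ignores.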

  2. A Polychoric Instrumental Variable (PIV) Estimator for Structural Equation Models with Categorical Variables

    ERIC Educational Resources Information Center

    Bollen, Kenneth A.; Maydeu-Olivares, Albert

    2007-01-01

    This paper presents a new polychoric instrumental variable (PIV) estimator to use in structural equation models (SEMs) with categorical observed variables. The PIV estimator is a generalization of Bollen's (Psychometrika 61:109-121, 1996) 2SLS/IV estimator for continuous variables to categorical endogenous variables. We derive the PIV estimator…

  3. Spectrophotometer-Based Color Measurements

    DTIC Science & Technology

    2017-10-24

    U.S. Army Armament Research, Development and Engineering Center, Weapons and Software Engineering Center. Approved for public release; distribution is unlimited. The report's contents include a summary; an introduction; methods, assumptions, and procedures; values for Federal color standards; and tables covering instrument precision and method precision and operator variability.

  4. Alternative Approaches to Evaluation in Empirical Microeconomics

    ERIC Educational Resources Information Center

    Blundell, Richard; Dias, Monica Costa

    2009-01-01

    This paper reviews some of the most popular policy evaluation methods in empirical microeconomics: social experiments, natural experiments, matching, instrumental variables, discontinuity design, and control functions. It discusses identification of traditionally used average parameters and more complex distributional parameters. The adequacy,…

  5. Variable Structure Control of a Hand-Launched Glider

    NASA Technical Reports Server (NTRS)

    Anderson, Mark R.; Waszak, Martin R.

    2005-01-01

    Variable structure control system design methods are applied to the problem of aircraft spin recovery. A variable structure control law typically has two phases of operation. The reaching mode phase uses a nonlinear relay control strategy to drive the system trajectory to a pre-defined switching surface within the motion state space. The sliding mode phase involves motion along the surface as the system moves toward an equilibrium or critical point. Analysis results presented in this paper reveal that the conventional method for spin recovery can be interpreted as a variable structure controller with a switching surface defined at zero yaw rate. Application of Lyapunov stability methods show that deflecting the ailerons in the direction of the spin helps to insure that this switching surface is stable. Flight test results, obtained using an instrumented hand-launched glider, are used to verify stability of the reaching mode dynamics.
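
    The reaching-mode idea, a relay control law driving the state onto a switching surface, can be illustrated with a toy first-order yaw-rate model. This hypothetical `relay_spin_recovery` sketch is not the paper's aircraft dynamics; the switching surface is simply s = r = 0 (zero yaw rate):

```python
import numpy as np

def relay_spin_recovery(r0, gain=1.0, u_max=0.5, dt=0.01, steps=2000):
    """Simulate relay (bang-bang) control u = -u_max * sign(r) on a toy
    first-order yaw-rate model r' = -0.1*r + gain*u, starting from r0."""
    r = float(r0)
    history = []
    for _ in range(steps):
        u = -u_max * np.sign(r)          # relay control law (reaching mode)
        r += dt * (-0.1 * r + gain * u)  # Euler step of the toy dynamics
        history.append(r)
    return np.array(history)
```

    Once the trajectory reaches the surface it chatters around r = 0 with amplitude set by the step size; continuous-time sliding-mode analysis replaces this chatter with ideal motion along the surface.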

  6. An extended Kalman-Bucy filter for atmospheric temperature profile retrieval with a passive microwave sounder

    NASA Technical Reports Server (NTRS)

    Ledsham, W. H.; Staelin, D. H.

    1978-01-01

    An extended Kalman-Bucy filter has been implemented for atmospheric temperature profile retrievals from observations made using the Scanned Microwave Spectrometer (SCAMS) instrument carried on the Nimbus 6 satellite. This filter has the advantage that it requires neither stationary statistics in the underlying processes nor linear production of the observed variables from the variables to be estimated. This extended Kalman-Bucy filter has yielded significant performance improvement relative to multiple regression retrieval methods. A multi-spot extended Kalman-Bucy filter has also been developed in which the temperature profiles at a number of scan angles in a scanning instrument are retrieved simultaneously. These multi-spot retrievals are shown to outperform the single-spot Kalman retrievals.
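
    The core predict/update cycle underlying such a retrieval can be sketched for the linear, discrete-time case; the extended filter in the paper additionally linearizes the nonlinear observation operator about the current estimate. A minimal `kalman_step` (the name and interface are illustrative) in NumPy:

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a discrete linear Kalman filter.

    x, P : prior state estimate and covariance
    z    : new observation
    F, Q : state transition matrix and process noise covariance
    H, R : observation matrix and observation noise covariance
    """
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

    In the multi-spot variant described above, the state vector simply stacks the temperature profiles at several scan angles so that one update exploits their correlations.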

  7. Semiparametric methods for estimation of a nonlinear exposure-outcome relationship using instrumental variables with application to Mendelian randomization.

    PubMed

    Staley, James R; Burgess, Stephen

    2017-05-01

    Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure-outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure-outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of population. The fractional polynomial method performs metaregression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure-outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of relationship of body mass index with systolic blood pressure and diastolic blood pressure. © 2017 The Authors Genetic Epidemiology Published by Wiley Periodicals, Inc.
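
    The stratified estimation can be sketched with Wald-type ratio estimates per stratum. Note an important simplification: the paper forms strata from the IV-free part of the exposure to avoid collider bias, whereas this hypothetical `stratum_lace` stratifies on the raw exposure for brevity:

```python
import numpy as np

def stratum_lace(g, x, y, n_strata=5):
    """Localized average causal effect (LACE) per exposure stratum:
    the ratio of the instrument-outcome slope to the instrument-exposure
    slope, computed within equal-size strata of the exposure."""
    order = np.argsort(x)
    estimates = []
    for chunk in np.array_split(order, n_strata):
        gz, xz, yz = g[chunk], x[chunk], y[chunk]
        beta_gx = np.polyfit(gz, xz, 1)[0]   # instrument -> exposure slope
        beta_gy = np.polyfit(gz, yz, 1)[0]   # instrument -> outcome slope
        estimates.append(beta_gy / beta_gx)  # LACE in this stratum
    return np.array(estimates)
```

    The fractional polynomial method then meta-regresses these per-stratum estimates on the stratum exposure means, while the piecewise linear method uses them directly as segment gradients.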

  8. Semiparametric methods for estimation of a nonlinear exposure‐outcome relationship using instrumental variables with application to Mendelian randomization

    PubMed Central

    Staley, James R.

    2017-01-01

    ABSTRACT Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure‐outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure‐outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of population. The fractional polynomial method performs metaregression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure‐outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of relationship of body mass index with systolic blood pressure and diastolic blood pressure. PMID:28317167

  9. Experimental variability and data pre-processing as factors affecting the discrimination power of some chemometric approaches (PCA, CA and a new algorithm based on linear regression) applied to (+/-)ESI/MS and RPLC/UV data: Application on green tea extracts.

    PubMed

    Iorgulescu, E; Voicu, V A; Sârbu, C; Tache, F; Albu, F; Medvedovici, A

    2016-08-01

    The influence of the experimental variability (instrumental repeatability, instrumental intermediate precision and sample preparation variability) and data pre-processing (normalization, peak alignment, background subtraction) on the discrimination power of multivariate data analysis methods (Principal Component Analysis [PCA] and Cluster Analysis [CA]) as well as a new algorithm based on linear regression was studied. Data used in the study were obtained through positive or negative ion monitoring electrospray mass spectrometry (+/-ESI/MS) and reversed phase liquid chromatography/UV spectrometric detection (RPLC/UV) applied to green tea extracts. Extractions in ethanol and heated water infusion were used as sample preparation procedures. The multivariate methods were directly applied to mass spectra and chromatograms, involving strictly a holistic comparison of shapes, without assignment of any structural identity to compounds. An alternative data interpretation based on linear regression analysis mutually applied to data series is also discussed. Slopes, intercepts and correlation coefficients produced by the linear regression analysis applied to pairs of very large experimental data series successfully retain information resulting from high-frequency instrumental acquisition rates, better defining the profiles being compared. Consequently, each type of sample or comparison between samples produces in the Cartesian space an ellipsoidal volume defined by the normal variation intervals of the slope, intercept and correlation coefficient. Distances between volumes graphically illustrate (dis)similarities between compared data. The instrumental intermediate precision had the major effect on the discrimination power of the multivariate data analysis methods.
Mass spectra produced through ionization from liquid state in atmospheric pressure conditions of bulk complex mixtures resulting from extracted materials of natural origins provided an excellent data basis for multivariate analysis methods, equivalent to data resulting from chromatographic separations. The alternative evaluation of very large data series based on linear regression analysis produced information equivalent to results obtained through application of PCA an CA. Copyright © 2016 Elsevier B.V. All rights reserved.
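The linear-regression comparison described in this record reduces each pair of acquired profiles to a slope, an intercept, and a correlation coefficient. A minimal sketch of that idea on synthetic two-peak profiles (the data, peak positions, and noise level are invented for illustration, not taken from the study):

```python
import math
import random
import statistics

def linreg(x, y):
    """Ordinary least-squares fit of y on x: returns (slope, intercept, r)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / math.sqrt(sxx * syy)

random.seed(0)
# Synthetic "chromatogram": two Gaussian peaks sampled at a high acquisition rate
profile = [math.exp(-((i - 300) / 40) ** 2) + math.exp(-((i - 700) / 60) ** 2)
           for i in range(1000)]
# Same sample measured again: identical shape plus small instrumental noise
replicate = [v + random.gauss(0, 0.01) for v in profile]
# A sample of different composition: peaks at other retention positions
different = [math.exp(-((i - 450) / 40) ** 2) + math.exp(-((i - 850) / 60) ** 2)
             for i in range(1000)]

s_rep, i_rep, r_rep = linreg(profile, replicate)    # near (1, 0, 1)
s_dif, i_dif, r_dif = linreg(profile, different)    # r far from 1
```

A replicate of the same profile yields (slope, intercept, r) near (1, 0, 1), while a profile with different peak positions does not; repeated measurements would trace out the ellipsoidal variation volumes the record describes.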

  10. Health insurance for the poor: impact on catastrophic and out-of-pocket health expenditures in Mexico

    PubMed Central

    Galárraga, Omar; Salinas-Rodríguez, Aarón; Sesma-Vázquez, Sergio

    2009-01-01

The goal of Seguro Popular (SP) in Mexico was to improve the financial protection of the uninsured population against excessive health expenditures. This paper estimates the impact of SP on catastrophic health expenditures (CHE), as well as out-of-pocket (OOP) health expenditures, from two different sources. First, we use the SP Impact Evaluation Survey (2005–2006), and compare the instrumental variables (IV) results with the experimental benchmark. Then, we use the same IV methods with the National Health and Nutrition Survey (ENSANUT 2006). We estimate naïve models, assuming exogeneity, and contrast them with IV models that take advantage of the specific SP implementation mechanisms for identification. The IV models estimated included two-stage least squares (2SLS), bivariate probit, and two-stage residual inclusion (2SRI) models. The instrumental variables estimates were comparable to the experimental “gold standard” and indicate a reduction of 54% in catastrophic expenditures at the national level. SP beneficiaries also had lower outpatient and medicine expenditures. The selection-corrected protective effect is found not only in the limited experimental dataset, but also at the national level. PMID:19756796

  11. Health insurance for the poor: impact on catastrophic and out-of-pocket health expenditures in Mexico.

    PubMed

    Galárraga, Omar; Sosa-Rubí, Sandra G; Salinas-Rodríguez, Aarón; Sesma-Vázquez, Sergio

    2010-10-01

The goal of Seguro Popular (SP) in Mexico was to improve the financial protection of the uninsured population against excessive health expenditures. This paper estimates the impact of SP on catastrophic health expenditures (CHE), as well as out-of-pocket (OOP) health expenditures, from two different sources. First, we use the SP Impact Evaluation Survey (2005-2006), and compare the instrumental variables (IV) results with the experimental benchmark. Then, we use the same IV methods with the National Health and Nutrition Survey (ENSANUT 2006). We estimate naïve models, assuming exogeneity, and contrast them with IV models that take advantage of the specific SP implementation mechanisms for identification. The IV models estimated included two-stage least squares (2SLS), bivariate probit, and two-stage residual inclusion (2SRI) models. The instrumental variables estimates were comparable to the experimental "gold standard" and indicate a reduction of 54% in catastrophic expenditures at the national level. SP beneficiaries also had lower outpatient and medicine expenditures. The selection-corrected protective effect is found not only in the limited experimental dataset, but also at the national level.
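Two-stage least squares, the first of the IV estimators listed in this record, can be sketched in a few lines on simulated data (the variable names and coefficients below are illustrative stand-ins, not the paper's model):

```python
import random
import statistics

def ols(x, y):
    """Ordinary least-squares slope and intercept of y on x."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

random.seed(1)
n = 20000
u = [random.gauss(0, 1) for _ in range(n)]        # unmeasured confounder
z = [random.randint(0, 1) for _ in range(n)]      # instrument, e.g. a rollout indicator
# Endogenous treatment depends on both the instrument and the confounder
x = [0.8 * zi + 0.5 * ui + random.gauss(0, 1) for zi, ui in zip(z, u)]
# Outcome; the true causal effect of x is 1.0, but u also affects y
y = [1.0 * xi + 1.5 * ui + random.gauss(0, 1) for xi, ui in zip(x, u)]

b_naive, _ = ols(x, y)            # biased upward: picks up confounding through u

# 2SLS: stage 1 predicts x from z; stage 2 regresses y on the fitted values
s1, c1 = ols(z, x)
x_hat = [c1 + s1 * zi for zi in z]
b_2sls, _ = ols(x_hat, y)         # close to the true effect of 1.0
```

The naïve regression overstates the effect because the confounder drives both treatment and outcome, while the 2SLS estimate uses only the variation in treatment induced by the instrument.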

  12. Caloric Beverage Intake Among Adult Supplemental Nutrition Assistance Program Participants

    PubMed Central

    2014-01-01

    Objectives. We compared sugar-sweetened beverage (SSB), alcohol, and other caloric beverage (juice and milk) consumption of Supplemental Nutrition Assistance Program (SNAP) participants with that of low-income nonparticipants. Methods. We used 1 day of dietary intake data from the 2005–2008 National Health and Nutrition Examination Survey for 4594 adults aged 20 years and older with household income at or below 250% of the federal poverty line. We used bivariate and multivariate methods to compare the probability of consuming and the amount of calories consumed for each beverage type across 3 groups: current SNAP participants, former participants, and nonparticipants. We used instrumental variable methods to control for unobservable differences in participant groups. Results. After controlling for observable characteristics, SNAP participants were no more likely to consume SSBs than were nonparticipants. Instrumental variable estimates showed that current participants consumed fewer calories from SSBs than did similar nonparticipants. We found no differences in alcoholic beverage consumption, which cannot be purchased with SNAP benefits. Conclusions. SNAP participants are not unique in their consumption of SSBs or alcoholic beverages. Purchase restrictions may have little effect on SSB consumption. PMID:25033141

  13. Calibration of the COBE FIRAS instrument

    NASA Technical Reports Server (NTRS)

    Fixsen, D. J.; Cheng, E. S.; Cottingham, D. A.; Eplee, R. E., Jr.; Hewagama, T.; Isaacman, R. B.; Jensen, K. A.; Mather, J. C.; Massa, D. L.; Meyer, S. S.

    1994-01-01

The Far-Infrared Absolute Spectrophotometer (FIRAS) instrument on the Cosmic Background Explorer (COBE) satellite was designed to accurately measure the spectrum of the cosmic microwave background radiation (CMBR) in the frequency range 1-95/cm with an angular resolution of 7 deg. We describe the calibration of this instrument, including the method of obtaining calibration data, reduction of data, the instrument model, fitting the model to the calibration data, and application of the resulting model solution to sky observations. The instrument model fits well for calibration data that resemble sky conditions. The method of propagating detector noise through the calibration process to yield a covariance matrix of the calibrated sky data is described. The final uncertainties are variable both in frequency and position, but for a typical calibrated sky 2.6 deg square pixel and 0.7/cm spectral element the random detector noise limit is on the order of a few times 10(exp -7) ergs/sq cm/s/sr cm for 2-20/cm, and the difference between the sky and the best-fit cosmic blackbody can be measured with a gain uncertainty of less than 3%.

  14. Censored Quantile Instrumental Variable Estimates of the Price Elasticity of Expenditure on Medical Care.

    PubMed

    Kowalski, Amanda

    2016-01-02

    Efforts to control medical care costs depend critically on how individuals respond to prices. I estimate the price elasticity of expenditure on medical care using a censored quantile instrumental variable (CQIV) estimator. CQIV allows estimates to vary across the conditional expenditure distribution, relaxes traditional censored model assumptions, and addresses endogeneity with an instrumental variable. My instrumental variable strategy uses a family member's injury to induce variation in an individual's own price. Across the conditional deciles of the expenditure distribution, I find elasticities that vary from -0.76 to -1.49, which are an order of magnitude larger than previous estimates.

  15. Optical wavelength selection for portable hemoglobin determination by near-infrared spectroscopy method

    NASA Astrophysics Data System (ADS)

    Tian, Han; Li, Ming; Wang, Yue; Sheng, Dinggao; Liu, Jun; Zhang, Linna

    2017-11-01

Hemoglobin concentration is commonly used in clinical medicine to diagnose anemia, identify bleeding, and manage red blood cell transfusions. The gold standard method for determining hemoglobin concentration in blood requires a reagent. Spectral methods offer the advantages of fast, reagent-free measurement. However, model calibration with the full spectrum is time-consuming. Moreover, it is necessary to use only a few variables given the size and cost constraints of instrumentation, especially for a portable biomedical instrument. This study presents different methods for selecting optical wavelengths for total hemoglobin concentration determination in whole blood. The results showed that a model using only a two-wavelength combination (1143 nm, 1298 nm) retains the predictive performance of the full spectrum. It appears that proper selection of optical wavelengths can be more effective than using the whole spectrum for determining hemoglobin in whole blood. We also discuss the influence of water absorptivity on wavelength selection. This research provides valuable references for designing portable NIR instruments for determining hemoglobin concentration, and may offer some guidance for noninvasive hemoglobin measurement by NIR methods.

  16. Validation of ocean color sensors using a profiling hyperspectral radiometer

    NASA Astrophysics Data System (ADS)

    Ondrusek, M. E.; Stengel, E.; Rella, M. A.; Goode, W.; Ladner, S.; Feinholz, M.

    2014-05-01

Validation measurements of satellite ocean color sensors require in situ measurements that are accurate, repeatable, and traceable enough to distinguish variability between in situ measurements and variability in the signal being observed on orbit. The utility of a Satlantic Profiler II equipped with HyperOCR radiometers (Hyperpro) for validating ocean color sensors is tested by assessing the stability of the calibration coefficients and by comparing Hyperpro in situ measurements to other instruments, and between different Hyperpros, in a variety of water types. Calibration and characterization of the NOAA Satlantic Hyperpro instrument are described, and concurrent measurements of water-leaving radiances conducted during cruises are presented, comparing this profiling instrument with other profiling, above-water, and moored instruments. The moored optical instruments are the US-operated Marine Optical BuoY (MOBY) and the French-operated Boussole buoy. In addition, Satlantic processing versions are described in terms of accuracy and consistency. A new multi-cast approach is compared to the most commonly used single-cast method. Analysis comparisons are conducted in turbid and blue water conditions. Examples of validation matchups with VIIRS ocean color data are presented. With careful data collection and analysis, the Satlantic Hyperpro profiling radiometer has proven to be a reliable and consistent tool for satellite ocean color validation.

  17. Treatment Effect Estimation Using Nonlinear Two-Stage Instrumental Variable Estimators: Another Cautionary Note.

    PubMed

    Chapman, Cole G; Brooks, John M

    2016-12-01

To examine the settings of simulation evidence supporting use of nonlinear two-stage residual inclusion (2SRI) instrumental variable (IV) methods for estimating average treatment effects (ATE) using observational data, and to investigate potential bias of 2SRI across alternative scenarios of essential heterogeneity and uniqueness of marginal patients. Potential bias of linear and nonlinear IV methods for ATE and local average treatment effects (LATE) is assessed using simulation models with a binary outcome and binary endogenous treatment across settings varying by the relationship between treatment effectiveness and treatment choice. Results show that nonlinear 2SRI models produce estimates of ATE and LATE that are substantially biased when the relationships between treatment and outcome for marginal patients differ from those for the full population. Bias of linear IV estimates for LATE was low across all scenarios. Researchers are increasingly opting for nonlinear 2SRI to estimate treatment effects in models with binary and otherwise inherently nonlinear dependent variables, believing that it produces generally unbiased and consistent estimates. This research shows that the positive properties of nonlinear 2SRI rely on assumptions about the relationships between treatment effect heterogeneity and choice. © Health Research and Educational Trust.

  18. Factors Associated with Physical Activity Literacy among Foster Parents

    ERIC Educational Resources Information Center

Dominick, Gregory M.; Friedman, Daniela B.; Saunders, Ruth P.; Hussey, Jim R.; Watkins, Ken W.

    2012-01-01

    Objectives: To explore associations between physical activity (PA) literacy and psychosocial constructs for providing instrumental social support for youth PA. Methods: Ninety-one foster parents completed surveys assessing PA literacy (overall and specific), perceptions of child PA, coordination, PA enjoyment, psychosocial variables:…

  19. Hopes and Cautions for Instrument-Based Evaluation of Consent Capacity: Results of a Construct Validity Study of Three Instruments

    PubMed Central

    Moye, Jennifer; Azar, Annin R.; Karel, Michele J.; Gurrera, Ronald J.

    2016-01-01

Does instrument-based evaluation of consent capacity increase the precision and validity of competency assessment, or does ostensible precision provide a false sense of confidence without in fact improving validity? In this paper we critically examine the evidence for construct validity of three instruments for measuring four functional abilities important in consent capacity: understanding, appreciation, reasoning, and expressing a choice. Instrument-based assessment of these abilities is compared through investigation of a multitrait-multimethod matrix in 88 older adults with mild to moderate dementia. The results show variable support for validity. There appears to be strong evidence for good hetero-method validity for the measurement of understanding, mixed evidence for validity in the measurement of reasoning, and strong evidence for poor hetero-method validity for the concepts of appreciation and expressing a choice, although the latter is likely due to extreme range restrictions. The development of empirically based tools for use in capacity evaluation should ultimately enhance the reliability and validity of assessment, yet clearly more research is needed to define and measure the constructs of decisional capacity. We would also emphasize that instrument-based assessment of capacity is only one part of a comprehensive evaluation of competency, which includes consideration of diagnosis, psychiatric and/or cognitive symptomatology, risk involved in the situation, and individual and cultural differences. PMID:27330455

  20. Censored Quantile Instrumental Variable Estimates of the Price Elasticity of Expenditure on Medical Care

    PubMed Central

    Kowalski, Amanda

    2015-01-01

    Efforts to control medical care costs depend critically on how individuals respond to prices. I estimate the price elasticity of expenditure on medical care using a censored quantile instrumental variable (CQIV) estimator. CQIV allows estimates to vary across the conditional expenditure distribution, relaxes traditional censored model assumptions, and addresses endogeneity with an instrumental variable. My instrumental variable strategy uses a family member’s injury to induce variation in an individual’s own price. Across the conditional deciles of the expenditure distribution, I find elasticities that vary from −0.76 to −1.49, which are an order of magnitude larger than previous estimates. PMID:26977117

  1. Physicians' prescribing preferences were a potential instrument for patients' actual prescriptions of antidepressants☆

    PubMed Central

    Davies, Neil M.; Gunnell, David; Thomas, Kyla H.; Metcalfe, Chris; Windmeijer, Frank; Martin, Richard M.

    2013-01-01

    Objectives To investigate whether physicians' prescribing preferences were valid instrumental variables for the antidepressant prescriptions they issued to their patients. Study Design and Setting We investigated whether physicians' previous prescriptions of (1) tricyclic antidepressants (TCAs) vs. selective serotonin reuptake inhibitors (SSRIs) and (2) paroxetine vs. other SSRIs were valid instruments. We investigated whether the instrumental variable assumptions are likely to hold and whether TCAs (vs. SSRIs) were associated with hospital admission for self-harm or death by suicide using both conventional and instrumental variable regressions. The setting for the study was general practices in the United Kingdom. Results Prior prescriptions were strongly associated with actual prescriptions: physicians who previously prescribed TCAs were 14.9 percentage points (95% confidence interval [CI], 14.4, 15.4) more likely to prescribe TCAs, and those who previously prescribed paroxetine were 27.7 percentage points (95% CI, 26.7, 28.8) more likely to prescribe paroxetine, to their next patient. Physicians' previous prescriptions were less strongly associated with patients' baseline characteristics than actual prescriptions. We found no evidence that the estimated association of TCAs with self-harm/suicide using instrumental variable regression differed from conventional regression estimates (P-value = 0.45). Conclusion The main instrumental variable assumptions held, suggesting that physicians' prescribing preferences are valid instruments for evaluating the short-term effects of antidepressants. PMID:24075596

  2. Developing an instrument to measure heart failure disease management program intensity and complexity.

    PubMed

    Riegel, Barbara; Lee, Christopher S; Sochalski, Julie

    2010-05-01

    Comparing disease management programs and their effects is difficult because of wide variability in program intensity and complexity. The purpose of this effort was to develop an instrument that can be used to describe the intensity and complexity of heart failure (HF) disease management programs. Specific composition criteria were taken from the American Heart Association (AHA) taxonomy of disease management and hierarchically scored to allow users to describe the intensity and complexity of the domains and subdomains of HF disease management programs. The HF Disease Management Scoring Instrument (HF-DMSI) incorporates 6 of the 8 domains from the taxonomy: recipient, intervention content, delivery personnel, method of communication, intensity/complexity, and environment. The 3 intervention content subdomains (education/counseling, medication management, and peer support) are described separately. In this first test of the HF-DMSI, overall intensity (measured as duration) and complexity were rated using an ordinal scoring system. Possible scores reflect a clinical rationale and differ by category, with zero given only if the element could potentially be missing (eg, surveillance by remote monitoring). Content validity was evident as the instrument matches the existing AHA taxonomy. After revision and refinement, 2 authors obtained an inter-rater reliability intraclass correlation coefficient score of 0.918 (confidence interval, 0.880 to 0.944, P<0.001) in their rating of 12 studies. The areas with most variability among programs were delivery personnel and method of communication. The HF-DMSI is useful for describing the intensity and complexity of HF disease management programs.

  3. Quantitative Detection of Trace Explosive Vapors by Programmed Temperature Desorption Gas Chromatography-Electron Capture Detector

    PubMed Central

    Field, Christopher R.; Lubrano, Adam; Woytowitz, Morgan; Giordano, Braden C.; Rose-Pehrsson, Susan L.

    2014-01-01

    The direct liquid deposition of solution standards onto sorbent-filled thermal desorption tubes is used for the quantitative analysis of trace explosive vapor samples. The direct liquid deposition method yields a higher fidelity between the analysis of vapor samples and the analysis of solution standards than using separate injection methods for vapors and solutions, i.e., samples collected on vapor collection tubes and standards prepared in solution vials. Additionally, the method can account for instrumentation losses, which makes it ideal for minimizing variability and quantitative trace chemical detection. Gas chromatography with an electron capture detector is an instrumentation configuration sensitive to nitro-energetics, such as TNT and RDX, due to their relatively high electron affinity. However, vapor quantitation of these compounds is difficult without viable vapor standards. Thus, we eliminate the requirement for vapor standards by combining the sensitivity of the instrumentation with a direct liquid deposition protocol to analyze trace explosive vapor samples. PMID:25145416

  4. Quantitative detection of trace explosive vapors by programmed temperature desorption gas chromatography-electron capture detector.

    PubMed

    Field, Christopher R; Lubrano, Adam; Woytowitz, Morgan; Giordano, Braden C; Rose-Pehrsson, Susan L

    2014-07-25

    The direct liquid deposition of solution standards onto sorbent-filled thermal desorption tubes is used for the quantitative analysis of trace explosive vapor samples. The direct liquid deposition method yields a higher fidelity between the analysis of vapor samples and the analysis of solution standards than using separate injection methods for vapors and solutions, i.e., samples collected on vapor collection tubes and standards prepared in solution vials. Additionally, the method can account for instrumentation losses, which makes it ideal for minimizing variability and quantitative trace chemical detection. Gas chromatography with an electron capture detector is an instrumentation configuration sensitive to nitro-energetics, such as TNT and RDX, due to their relatively high electron affinity. However, vapor quantitation of these compounds is difficult without viable vapor standards. Thus, we eliminate the requirement for vapor standards by combining the sensitivity of the instrumentation with a direct liquid deposition protocol to analyze trace explosive vapor samples.

  5. Statistical methods used to test for agreement of medical instruments measuring continuous variables in method comparison studies: a systematic review.

    PubMed

    Zaki, Rafdzah; Bulgiba, Awang; Ismail, Roshidi; Ismail, Noor Azina

    2012-01-01

    Accurate values are a must in medicine. An important parameter in determining the quality of a medical instrument is agreement with a gold standard. Various statistical methods have been used to test for agreement. Some of these methods have been shown to be inappropriate. This can result in misleading conclusions about the validity of an instrument. The Bland-Altman method is the most popular method judging by the many citations of the article proposing this method. However, the number of citations does not necessarily mean that this method has been applied in agreement research. No previous study has been conducted to look into this. This is the first systematic review to identify statistical methods used to test for agreement of medical instruments. The proportion of various statistical methods found in this review will also reflect the proportion of medical instruments that have been validated using those particular methods in current clinical practice. Five electronic databases were searched between 2007 and 2009 to look for agreement studies. A total of 3,260 titles were initially identified. Only 412 titles were potentially related, and finally 210 fitted the inclusion criteria. The Bland-Altman method is the most popular method with 178 (85%) studies having used this method, followed by the correlation coefficient (27%) and means comparison (18%). Some of the inappropriate methods highlighted by Altman and Bland since the 1980s are still in use. This study finds that the Bland-Altman method is the most popular method used in agreement research. There are still inappropriate applications of statistical methods in some studies. It is important for a clinician or medical researcher to be aware of this issue because misleading conclusions from inappropriate analyses will jeopardize the quality of the evidence, which in turn will influence quality of care given to patients in the future.
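For reference, the core of the Bland-Altman method that dominates this review is a bias and 95% limits of agreement computed from paired differences; a minimal sketch with made-up paired readings from a hypothetical device and gold standard:

```python
import statistics

def bland_altman(a, b):
    """Bias and 95% limits of agreement for two paired measurement series."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired readings: new device vs. gold standard (same units)
device = [12.1, 11.8, 13.0, 12.6, 11.5, 12.9, 13.3, 12.2]
gold   = [12.0, 12.0, 12.8, 12.5, 11.7, 12.7, 13.1, 12.4]
bias, (lo, hi) = bland_altman(device, gold)
```

In a full Bland-Altman analysis the differences are also plotted against the pairwise means, to check whether agreement varies with the magnitude of the measurement; a correlation coefficient alone, as the review notes, does not establish agreement.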

  6. Statistical Methods Used to Test for Agreement of Medical Instruments Measuring Continuous Variables in Method Comparison Studies: A Systematic Review

    PubMed Central

    Zaki, Rafdzah; Bulgiba, Awang; Ismail, Roshidi; Ismail, Noor Azina

    2012-01-01

    Background Accurate values are a must in medicine. An important parameter in determining the quality of a medical instrument is agreement with a gold standard. Various statistical methods have been used to test for agreement. Some of these methods have been shown to be inappropriate. This can result in misleading conclusions about the validity of an instrument. The Bland-Altman method is the most popular method judging by the many citations of the article proposing this method. However, the number of citations does not necessarily mean that this method has been applied in agreement research. No previous study has been conducted to look into this. This is the first systematic review to identify statistical methods used to test for agreement of medical instruments. The proportion of various statistical methods found in this review will also reflect the proportion of medical instruments that have been validated using those particular methods in current clinical practice. Methodology/Findings Five electronic databases were searched between 2007 and 2009 to look for agreement studies. A total of 3,260 titles were initially identified. Only 412 titles were potentially related, and finally 210 fitted the inclusion criteria. The Bland-Altman method is the most popular method with 178 (85%) studies having used this method, followed by the correlation coefficient (27%) and means comparison (18%). Some of the inappropriate methods highlighted by Altman and Bland since the 1980s are still in use. Conclusions This study finds that the Bland-Altman method is the most popular method used in agreement research. There are still inappropriate applications of statistical methods in some studies. It is important for a clinician or medical researcher to be aware of this issue because misleading conclusions from inappropriate analyses will jeopardize the quality of the evidence, which in turn will influence quality of care given to patients in the future. PMID:22662248

  7. The Use of Linear Instrumental Variables Methods in Health Services Research and Health Economics: A Cautionary Note

    PubMed Central

    Terza, Joseph V; Bradford, W David; Dismuke, Clara E

    2008-01-01

Objective To investigate potential bias in the use of the conventional linear instrumental variables (IV) method for the estimation of causal effects in inherently nonlinear regression settings. Data Sources Smoking Supplement to the 1979 National Health Interview Survey, National Longitudinal Alcohol Epidemiologic Survey, and simulated data. Study Design Potential bias from the use of the linear IV method in nonlinear models is assessed via simulation studies and real-world data analyses in two commonly encountered regression settings: (1) models with a nonnegative outcome (e.g., a count) and a continuous endogenous regressor; and (2) models with a binary outcome and a binary endogenous regressor. Principal Findings The simulation analyses show that substantial bias in the estimation of causal effects can result from applying the conventional IV method in inherently nonlinear regression settings. Moreover, the bias is not attenuated as the sample size increases. This point is further illustrated in the survey data analyses, in which IV-based estimates of the relevant causal effects diverge substantially from those obtained with appropriate nonlinear estimation methods. Conclusions We offer this research as a cautionary note to those who would opt for the use of linear specifications in inherently nonlinear settings involving endogeneity. PMID:18546544
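The second setting above (binary outcome, binary endogenous treatment) can be explored in a small simulation with a threshold-crossing data-generating process and made-up coefficients. The linear IV (Wald) ratio targets a complier-specific effect, while the ATE is computed here directly from counterfactual draws; the two coincide only when the compliers resemble the full population:

```python
import random
import statistics

random.seed(2)
n = 50000

def y_given(x, u):
    """Nonlinear (threshold-crossing) outcome model: binary y."""
    return 1 if x + u + random.gauss(0, 1) > 1.0 else 0

u = [random.gauss(0, 1) for _ in range(n)]       # unmeasured confounder
z = [random.randint(0, 1) for _ in range(n)]     # binary instrument
x = [1 if 0.9 * zi + 0.8 * ui + random.gauss(0, 1) > 0.5 else 0
     for zi, ui in zip(z, u)]                    # endogenous binary treatment
y = [y_given(xi, ui) for xi, ui in zip(x, u)]

# True ATE, available here only because we control the simulation:
ate = statistics.fmean(y_given(1, ui) - y_given(0, ui) for ui in u)

# Linear IV (Wald) estimate: reduced form divided by first stage
def mean_by(v, grp, g):
    vals = [vi for vi, gi in zip(v, grp) if gi == g]
    return sum(vals) / len(vals)

first_stage = mean_by(x, z, 1) - mean_by(x, z, 0)
wald = (mean_by(y, z, 1) - mean_by(y, z, 0)) / first_stage
```

Varying how strongly the confounder enters the treatment equation changes who the compliers are, and hence how far the Wald estimate sits from the ATE, which is the kind of scenario variation the paper's simulations exploit.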

  8. Scale Reliability Evaluation with Heterogeneous Populations

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2015-01-01

    A latent variable modeling approach for scale reliability evaluation in heterogeneous populations is discussed. The method can be used for point and interval estimation of reliability of multicomponent measuring instruments in populations representing mixtures of an unknown number of latent classes or subpopulations. The procedure is helpful also…

  9. A Discourse on Human Systems Integration

    DTIC Science & Technology

    2010-09-01

criterion for one of the study tasks. Research Design: This was a mixed-methods study. The first-phase quantitative portion of the study used a quasi-experimental, posttest-only design with nonequivalent groups. The independent variables were defined as follows: the categorical... Instruments...

  10. Predictors of Life Satisfaction in Individuals with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Miller, S. M.; Chan, F.

    2008-01-01

    Background: The purpose of this study was to examine factors that predict life satisfaction in individuals with intellectual disabilities (ID). Two groups of variables were studied: life skills (interpersonal, instrumental and leisure) and higher-order predictors (social support, self-determination and productivity). Method: Fifty-six participants…

  11. Association of Body Mass Index with Depression, Anxiety and Suicide—An Instrumental Variable Analysis of the HUNT Study

    PubMed Central

    Bjørngaard, Johan Håkon; Carslake, David; Lund Nilsen, Tom Ivar; Linthorst, Astrid C. E.; Davey Smith, George; Gunnell, David; Romundstad, Pål Richard

    2015-01-01

Objective While high body mass index is associated with an increased risk of depression and anxiety, cumulative evidence indicates that it is a protective factor for suicide. The associations from conventional observational studies of body mass index with mental health outcomes are likely to be influenced by reverse causality or confounding by ill-health. In the present study, we investigated the associations between offspring body mass index and parental anxiety, depression and suicide in order to avoid problems with reverse causality and confounding by ill-health. Methods We used data from 32,457 mother-offspring and 27,753 father-offspring pairs from the Norwegian HUNT-study. Anxiety and depression were assessed using the Hospital Anxiety and Depression Scale and suicide death from national registers. Associations between offspring and own body mass index and symptoms of anxiety and depression and suicide mortality were estimated using logistic and Cox regression. Causal effects were estimated with a two-sample instrumental variable approach using offspring body mass index as an instrument for parental body mass index. Results Both own and offspring body mass index were positively associated with depression, while the results did not indicate any substantial association between body mass index and anxiety. Although precision was low, suicide mortality was inversely associated with own body mass index and the results from the analysis using offspring body mass index supported these results. Adjusted odds ratios per standard deviation body mass index from the instrumental variable analysis were 1.22 (95% CI: 1.05, 1.43) for depression and 1.10 (95% CI: 0.95, 1.27) for anxiety, and the instrumental variable estimated hazard ratio for suicide was 0.69 (95% CI: 0.30, 1.63). Conclusion The present study’s results indicate that suicide mortality is inversely associated with body mass index. 
We also found support for a positive association between body mass index and depression, but not for anxiety. PMID:26167892
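The two-sample instrumental variable estimate used in this study is, at bottom, a ratio of two regression slopes estimated in separate samples: the instrument-outcome (reduced-form) slope divided by the instrument-exposure (first-stage) slope. A minimal sketch with simulated data (the sample sizes and coefficients are invented for illustration):

```python
import random
import statistics

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)

random.seed(3)
n = 30000
# Sample 1 provides instrument and exposure (e.g., a proxy for parental BMI)
g1 = [random.gauss(0, 1) for _ in range(n)]
exposure = [0.5 * g + random.gauss(0, 1) for g in g1]       # first stage: 0.5
# Sample 2 provides instrument and outcome; true causal effect of exposure is 0.3,
# so the reduced-form slope is 0.3 * 0.5 = 0.15
g2 = [random.gauss(0, 1) for _ in range(n)]
outcome = [0.15 * g + random.gauss(0, 1) for g in g2]

first_stage = slope(g1, exposure)       # ~0.5
reduced_form = slope(g2, outcome)       # ~0.15
beta = reduced_form / first_stage       # ~0.3, the causal effect estimate
```

The appeal of the two-sample design, as in this study, is that the exposure and the outcome never need to be measured on the same individuals, only the instrument in both samples.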

  12. The Trojan Lifetime Champions Health Survey: development, validity, and reliability.

    PubMed

    Sorenson, Shawn C; Romano, Russell; Scholefield, Robin M; Schroeder, E Todd; Azen, Stanley P; Salem, George J

    2015-04-01

    Self-report questionnaires are an important method of evaluating lifespan health, exercise, and health-related quality of life (HRQL) outcomes among elite, competitive athletes. Few instruments, however, have undergone formal characterization of their psychometric properties within this population. To evaluate the validity and reliability of a novel health and exercise questionnaire, the Trojan Lifetime Champions (TLC) Health Survey. Descriptive laboratory study. A large National Collegiate Athletic Association Division I university. A total of 63 university alumni (age range, 24 to 84 years), including former varsity collegiate athletes and a control group of nonathletes. Participants completed the TLC Health Survey twice at a mean interval of 23 days with randomization to the paper or electronic version of the instrument. Content validity, feasibility of administration, test-retest reliability, parallel-form reliability between paper and electronic forms, and estimates of systematic and typical error versus differences of clinical interest were assessed across a broad range of health, exercise, and HRQL measures. Correlation coefficients, including intraclass correlation coefficients (ICCs) for continuous variables and κ agreement statistics for ordinal variables, for test-retest reliability averaged 0.86, 0.90, 0.80, and 0.74 for HRQL, lifetime health, recent health, and exercise variables, respectively. Correlation coefficients, again ICCs and κ, for parallel-form reliability (ie, equivalence) between paper and electronic versions averaged 0.90, 0.85, 0.85, and 0.81 for HRQL, lifetime health, recent health, and exercise variables, respectively. Typical measurement error was less than the a priori thresholds of clinical interest, and we found minimal evidence of systematic test-retest error. 
We found strong evidence of content validity, convergent construct validity with the Short-Form 12 Version 2 HRQL instrument, and feasibility of administration in an elite, competitive athletic population. These data suggest that the TLC Health Survey is a valid and reliable instrument for assessing lifetime and recent health, exercise, and HRQL, among elite competitive athletes. Generalizability of the instrument may be enhanced by additional, larger-scale studies in diverse populations.
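
    As background for the test-retest statistics reported above, a one-way random-effects ICC(1,1) can be computed from a subjects-by-administrations table of scores; this is a generic sketch, not the survey's analysis code:

```python
def icc_one_way(scores):
    """One-way random-effects ICC(1,1) for test-retest data.

    scores: list of per-subject rows, one score per administration.
    """
    n = len(scores)       # subjects
    k = len(scores[0])    # administrations per subject
    grand = sum(sum(row) for row in scores) / (n * k)
    means = [sum(row) / k for row in scores]
    # Between-subject and within-subject mean squares (one-way ANOVA)
    ms_between = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    ms_within = sum((x - m) ** 2 for row, m in zip(scores, means) for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

    Perfectly repeatable measurements give an ICC of 1; retest noise pulls the coefficient below 1.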

  13. An overview of the laser ranging method of space laser altimeter

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Chen, Yuwei; Hyyppä, Juha; Li, Song

    2017-11-01

    A space laser altimeter is an active remote sensing instrument used to measure the topography of the Earth, the Moon, and other planetary bodies. It determines the range between the instrument and the laser footprint by measuring the round-trip time of a laser pulse. The return pulse reflected from the ground surface is gathered by the altimeter's receiver, and its pulse width and amplitude vary with the ground relief. Meanwhile, several kinds of noise superimposed on the return pulse signal degrade its signal-to-noise ratio. To eliminate the influence of these factors, which cause range walk and range uncertainty, reliable laser ranging methods need to be implemented to obtain high-precision range results. Based on typical space laser altimeters of the past few decades, various ranging methods are expounded in detail according to the operational principle of the instruments and their timing methods. By illustrating the concrete procedure of determining the time of flight of the laser pulse, this overview compares the technologies employed in previous and ongoing research programs and discusses prospective innovative technologies for future space laser altimeters.
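
    The core ranging relation the overview builds on is simple: the one-way range is half the round-trip time multiplied by the speed of light. A minimal sketch (the corrections for range walk and noise discussed above are deliberately omitted):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s, in vacuum

def range_from_round_trip(t_seconds):
    """One-way range from the round-trip time of the laser pulse."""
    return SPEED_OF_LIGHT * t_seconds / 2.0
```

    A 2 ms round trip corresponds to roughly 300 km, a typical low-orbit altimeter range; sub-metre precision therefore requires timing the pulse to a few nanoseconds.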

  14. A rapid leaf-disc sampler for psychrometric water potential measurements.

    PubMed

    Wullschleger, S D; Oosterhuis, D M

    1986-06-01

    An instrument was designed which facilitates faster and more accurate sampling of leaf discs for psychrometric water potential measurements. The instrument consists of an aluminum housing, a spring-loaded plunger, and a modified brass-plated cork borer. The leaf-disc sampler was compared with the conventional method of sampling discs for measurement of leaf water potential with thermocouple psychrometers on a range of plant material including Gossypium hirsutum L., Zea mays L., and Begonia rex-cultorum L. The new sampler permitted a leaf disc to be excised and inserted into the psychrometer sample chamber in less than 7 seconds, which was more than twice as fast as the conventional method. This resulted in more accurate determinations of leaf water potential due to reduced evaporative water losses. The leaf-disc sampler also significantly reduced sample variability between individual measurements. This instrument can be used for many other laboratory and field measurements that necessitate leaf disc sampling.

  15. A dynamic response and eye scanning data base useful in the development of theories and methods for the description of control/display relationships

    NASA Technical Reports Server (NTRS)

    Klein, R.

    1972-01-01

    A set of specially prepared digital tapes is reported which contain synchronized measurements of pilot scanning behavior, control response, and vehicle response obtained during instrument landing system approaches made in a fixed-base DC-8 transport simulator. The objective of the master tape is to provide a common data base which can be used by the research community to test theories, models, and methods for describing and analyzing control/display relations and interactions. The experimental conditions and tasks used to obtain the data and the detailed format of the tapes are described. Conventional instrument panel and controls were used, with simulated vertical gust and glide slope beam bend forcing functions. Continuous pilot eye fixations and scan traffic on the panel were measured. Both flight director and standard localizer/glide slope types of approaches were made, with both fixed and variable instrument range sensitivities.

  16. A method and instruments to identify the torque, the power and the efficiency of an internal combustion engine of a wheeled vehicle

    NASA Astrophysics Data System (ADS)

    Egorov, A. V.; Kozlov, K. E.; Belogusev, V. N.

    2018-01-01

    In this paper, we propose a new method and instruments to identify the torque, the power, and the efficiency of internal combustion engines in transient conditions. In contrast to the commonly used non-demounting methods based on inertia and strain-gauge dynamometers, this method allows the main performance parameters of internal combustion engines to be monitored in transient conditions without the inaccuracy caused by torque losses in the transfer to the driving wheels, where the torque is measured with existing methods. In addition, the proposed method is easy to implement and does not rely on strain measurement instruments, which cannot capture rapidly varying values of the measured parameters and therefore prevent the actual parameters from being taken into account when engineering wheeled vehicles. The use of this method can thus greatly improve measurement accuracy and reduce the cost and laboriousness of testing internal combustion engines. The results of experiments showed the applicability of the proposed method for identifying the performance parameters of internal combustion engines. The most suitable transmission ratio for use with the proposed method was also determined.
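
    The inertia-based principle underlying such non-demounting measurements can be sketched with the rotational form of Newton's second law; this illustrates the physics only, not the authors' instrumentation:

```python
def engine_torque(moment_of_inertia, angular_acceleration):
    """Torque inferred from accelerating a known rotating inertia: T = I * alpha."""
    return moment_of_inertia * angular_acceleration

def engine_power(torque, angular_velocity):
    """Instantaneous power at the measured shaft speed: P = T * omega."""
    return torque * angular_velocity
```

    Spinning up an inertia of 0.25 kg·m² at 100 rad/s² implies 25 N·m of torque; at 300 rad/s that is 7.5 kW.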

  17. [Development and validation of a questionnaire about the main variables affecting the individual investor's behavior in the Stock Exchange].

    PubMed

    Pascual-Ezama, David; San Martín Castellanos, Rafael; Gil-Gómez de Liaño, Beatriz; Scandroglio, Bárbara

    2010-11-01

    There is a considerable lack of information about the methodology usually used in studies of individual investors' behavior. The studies reviewed do not report the method used to select the items or the psychometric properties of their questionnaires. Given the importance of investment in the Stock Exchange today, it seems relevant to obtain a reliable instrument for understanding individual investors' behavior in the Stock Exchange. Therefore, the goal of the present work is to validate a questionnaire about the main variables involved in individual investors' behavior in the Stock Exchange. Based on previous studies, we developed a questionnaire using the Delphi methodology with a group of experts. The internal consistency (Cronbach alpha = .934) and validity evidence of the questionnaire show that it may be an effective instrument and can be applied with some assurance.
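
    The internal-consistency figure reported above (Cronbach alpha = .934) comes from the standard alpha formula over item variances; the sketch below is a generic implementation, not the authors' code:

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha from a respondents-by-items table of scores."""
    k = len(item_scores[0])  # number of items
    item_vars = [variance(col) for col in zip(*item_scores)]
    total_var = variance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

    Perfectly correlated items give alpha = 1; items that disagree across respondents drive alpha down.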

  18. Application of six sigma and AHP in analysis of variable lead time calibration process instrumentation

    NASA Astrophysics Data System (ADS)

    Rimantho, Dino; Rahman, Tomy Abdul; Cahyadi, Bambang; Tina Hernawati, S.

    2017-02-01

    Calibration of instrumentation equipment in the pharmaceutical industry is an important activity for determining the true value of a measurement. Preliminary studies indicated that calibration lead times disrupted production and laboratory activities. This study aimed to analyze the causes of the calibration lead time. Several methods were used in this study: Six Sigma, to determine the capability of the equipment calibration process; brainstorming, Pareto diagrams, and fishbone diagrams, to identify and analyze the problems; and the Analytic Hierarchy Process (AHP), to create a hierarchical structure and prioritize the problems. The results showed a DPMO value of around 40769.23, equivalent to a sigma level in equipment calibration of approximately 3.24σ, indicating the need for improvements in the calibration process. Problem-solving strategies for the calibration lead time were then determined, such as shortening the preventive maintenance schedule, increasing the number of calibration instruments, and training personnel. Consistency tests on the full pairwise comparison matrices of the hierarchy showed CR values below 0.1.
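
    The reported conversion from DPMO to a sigma level follows the conventional Six Sigma formula with a 1.5σ long-term shift; a sketch (not the study's code):

```python
from statistics import NormalDist

def sigma_level_from_dpmo(dpmo, shift=1.5):
    """Sigma level for a given defects-per-million-opportunities rate,
    including the conventional 1.5-sigma long-term shift."""
    return NormalDist().inv_cdf(1.0 - dpmo / 1_000_000) + shift
```

    With the study's DPMO of 40769.23, this gives approximately 3.24σ, matching the reported level.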

  19. Multi-Epoch Mid-Infrared Interferometric Observations of the Oxygen-rich Mira Variable Star RR Aql with the VLTI/MIDI Instrument

    DTIC Science & Technology

    2011-01-01

    Karovicova, I.; Wittkowski, M.; Boboltz, D. A.; Fossat, E.; Ohnaka, K.; Scholz, M.

    ...the oxygen-rich Mira variable RR Aql at 13 epochs covering 4 pulsation cycles with the MIDI instrument at the VLTI. We modeled the observed data...

  20. Quantifying the process and outcomes of person-centered planning.

    PubMed

    Holburn, S; Jacobson, J W; Vietze, P M; Schwartz, A A; Sersen, E

    2000-09-01

    Although person-centered planning is a popular approach in the field of developmental disabilities, there has been little systematic assessment of its process and outcomes. To measure person-centered planning, we developed three instruments designed to assess its various aspects. We then constructed variables comprising both a Process and an Outcome Index using a combined rational-empirical method. Test-retest reliability and measures of internal consistency appeared adequate. Variable correlations and factor analysis were generally consistent with our conceptualization and resulting item and variable classifications. Practical implications for intervention integrity, program evaluation, and organizational performance are discussed.

  1. Using the Nobel Laureates in Economics to Teach Quantitative Methods

    ERIC Educational Resources Information Center

    Becker, William E.; Greene, William H.

    2005-01-01

    The authors show how the work of Nobel Laureates in economics can enhance student understanding and bring them up to date on topics such as probability, uncertainty and decision theory, hypothesis testing, regression to the mean, instrumental variable techniques, discrete choice modeling, and time-series analysis. (Contains 2 notes.)

  2. The effect of the earth's and stray magnetic fields on mobile mass spectrometer systems.

    PubMed

    Bell, Ryan J; Davey, Nicholas G; Martinsen, Morten; Short, R Timothy; Gill, Chris G; Krogh, Erik T

    2015-02-01

    Development of small, field-portable mass spectrometers has enabled a rapid growth of in-field measurements on mobile platforms. In such in-field measurements, unexpected signal variability has been observed by the authors in portable ion traps with internal electron ionization. The orientation of magnetic fields (such as the Earth's) relative to the ionization electron beam trajectory can significantly alter the electron flux into a quadrupole ion trap, resulting in significant changes in the instrumental sensitivity. Instrument simulations and experiments were performed relative to the earth's magnetic field to assess the importance of (1) nonpoint-source electron sources, (2) vertical versus horizontal electron beam orientation, and (3) secondary magnetic fields created by the instrument itself. Electron lens focus effects were explored by additional simulations, and were paralleled by experiments performed with a mass spectrometer mounted on a rotating platform. Additionally, magnetically permeable metals were used to shield (1) the entire instrument from the Earth's magnetic field, and (2) the electron beam from both the Earth's and instrument's magnetic fields. Both simulation and experimental results suggest the predominant influence on directionally dependent signal variability is the result of the summation of two magnetic vectors. As such, the most effective method for reducing this effect is the shielding of the electron beam from both magnetic vectors, thus improving electron beam alignment and removing any directional dependency. The improved ionizing electron beam alignment also allows for significant improvements in overall instrument sensitivity.

  3. An instrumental variable random-coefficients model for binary outcomes

    PubMed Central

    Chesher, Andrew; Rosen, Adam M

    2014-01-01

    In this paper, we study a random-coefficients model for a binary outcome. We allow for the possibility that some or even all of the explanatory variables are arbitrarily correlated with the random coefficients, thus permitting endogeneity. We assume the existence of observed instrumental variables Z that are jointly independent of the random coefficients, although we place no structure on the joint determination of the endogenous variable X and instruments Z, as would be required for a control function approach. The model fits within the spectrum of generalized instrumental variable models, and we thus apply identification results from our previous studies of such models to the present context, demonstrating their use. Specifically, we characterize the identified set for the distribution of random coefficients in the binary response model with endogeneity via a collection of conditional moment inequalities, and we investigate the structure of these sets by way of numerical illustration. PMID:25798048

  4. Discriminant Validity Assessment: Use of Fornell & Larcker criterion versus HTMT Criterion

    NASA Astrophysics Data System (ADS)

    Hamid, M. R. Ab; Sami, W.; Mohmad Sidek, M. H.

    2017-09-01

    Assessment of discriminant validity is a must in any research that involves latent variables, for the prevention of multicollinearity issues. The Fornell and Larcker criterion is the most widely used method for this purpose. However, a new method for establishing discriminant validity has emerged: the heterotrait-monotrait (HTMT) ratio of correlations. This article therefore presents the results of discriminant validity assessment using both methods. Data from a previous study were used, involving 429 respondents, for empirical validation of a value-based excellence model in higher education institutions (HEI) in Malaysia. From the analysis, the convergent, divergent and discriminant validity were established and admissible using the Fornell and Larcker criterion. However, discriminant validity was an issue when employing the HTMT criterion. This shows that the latent variables under study faced the issue of multicollinearity and should be examined in further detail. It also implies that the HTMT criterion is a stringent measure that can detect a possible lack of discrimination among the latent variables. In conclusion, the instrument, which consisted of six latent variables, was still lacking in terms of discriminant validity and should be explored further.
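
    The Fornell and Larcker criterion discussed above can be checked mechanically: the square root of each construct's AVE must exceed its correlations with every other construct. A minimal sketch with hypothetical inputs (not the study's data):

```python
from math import sqrt

def fornell_larcker_ok(ave, latent_corr):
    """True if sqrt(AVE) of every construct exceeds its correlations
    with all other constructs (Fornell-Larcker criterion)."""
    p = len(ave)
    for i in range(p):
        max_other = max(abs(latent_corr[i][j]) for j in range(p) if j != i)
        if sqrt(ave[i]) <= max_other:
            return False
    return True
```

    For instance, two constructs with AVEs of 0.6 and 0.7 pass the check when correlated at 0.5 but fail at 0.9, which is the kind of borderline case where the stricter HTMT criterion also flags problems.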

  5. Instrumental variable specifications and assumptions for longitudinal analysis of mental health cost offsets.

    PubMed

    O'Malley, A James

    2012-12-01

    Instrumental variables (IVs) enable causal estimates in observational studies to be obtained in the presence of unmeasured confounders. In practice, a diverse range of models and IV specifications can be brought to bear on a problem, particularly with longitudinal data where treatment effects can be estimated for various functions of current and past treatment. However, in practice the empirical consequences of different assumptions are seldom examined, despite the fact that IV analyses make strong assumptions that cannot be conclusively tested by the data. In this paper, we consider several longitudinal models and specifications of IVs. Methods are applied to data from a 7-year study of mental health costs of atypical and conventional antipsychotics whose purpose was to evaluate whether the newer and more expensive atypical antipsychotic medications lead to a reduction in overall mental health costs.

  6. ESTIMATING PERSON-CENTERED TREATMENT (PeT) EFFECTS USING INSTRUMENTAL VARIABLES: AN APPLICATION TO EVALUATING PROSTATE CANCER TREATMENTS

    PubMed Central

    BASU, ANIRBAN

    2014-01-01

    SUMMARY This paper builds on the methods of local instrumental variables developed by Heckman and Vytlacil (1999, 2001, 2005) to estimate person-centered treatment (PeT) effects that are conditioned on the person’s observed characteristics and averaged over the potential conditional distribution of unobserved characteristics that lead them to their observed treatment choices. PeT effects are more individualized than conditional treatment effects from a randomized setting with the same observed characteristics. PeT effects can be easily aggregated to construct any of the mean treatment effect parameters and, more importantly, are well suited to comprehend individual-level treatment effect heterogeneity. The paper presents the theory behind PeT effects, and applies it to study the variation in individual-level comparative effects of prostate cancer treatments on overall survival and costs. PMID:25620844

  7. Individual differences in toddlers' social understanding and prosocial behavior: disposition or socialization?

    PubMed

    Gross, Rebekkah L; Drummond, Jesse; Satlof-Bedrick, Emma; Waugh, Whitney E; Svetlova, Margarita; Brownell, Celia A

    2015-01-01

    We examined how individual differences in social understanding contribute to variability in early-appearing prosocial behavior. Moreover, potential sources of variability in social understanding were explored and examined as additional possible predictors of prosocial behavior. Using a multi-method approach with both observed and parent-report measures, 325 children aged 18-30 months were administered measures of social understanding (e.g., use of emotion words; self-understanding), prosocial behavior (in separate tasks measuring instrumental helping, empathic helping, and sharing, as well as parent-reported prosociality at home), temperament (fearfulness, shyness, and social fear), and parental socialization of prosocial behavior in the family. Individual differences in social understanding predicted variability in empathic helping and parent-reported prosociality, but not instrumental helping or sharing. Parental socialization of prosocial behavior was positively associated with toddlers' social understanding, prosocial behavior at home, and instrumental helping in the lab, and negatively associated with sharing (possibly reflecting parents' increased efforts to encourage children who were less likely to share). Further, socialization moderated the association between social understanding and prosocial behavior, such that social understanding was less predictive of prosocial behavior among children whose parents took a more active role in socializing their prosociality. None of the dimensions of temperament was associated with either social understanding or prosocial behavior. Parental socialization of prosocial behavior is thus an important source of variability in children's early prosociality, acting in concert with early differences in social understanding, with different patterns of influence for different subtypes of prosocial behavior.

  8. A Culture-Specific Nutrient Intake Assessment Instrument in Patients with Pulmonary Tuberculosis

    PubMed Central

    Frediani, Jennifer K.; Tukvadze, Nestani; Sanikidze, Ekaterina; Kipiani, Maia; Hebbar, Gautam; Easley, Kirk A.; Shenvi, Neeta; Ramakrishnan, Usha; Tangpricha, Vin; Blumberg, Henry M.; Ziegler, Thomas R.

    2013-01-01

    Background and Aim To develop and evaluate a culture-specific nutrient intake assessment tool for use in adults with pulmonary tuberculosis (TB) in Tbilisi, Georgia. Methods We developed an instrument to measure food intake over 3 consecutive days using a questionnaire format. The tool was then compared to 24 hour food recalls. Food intake data from 31 subjects with TB were analyzed using the Nutrient Database System for Research (NDS-R) dietary analysis program. Paired t-tests, Pearson correlations and intraclass correlation coefficients (ICC) were used to assess the agreement between the two methods of dietary intake for calculated nutrient intakes. Results The Pearson correlation coefficient for mean daily caloric intake between the 2 methods was 0.37 (P = 0.04) with a mean difference of 171 kcals/day (p = 0.34). The ICC was 0.38 (95% CI: 0.03 to 0.64) suggesting the within-patient variability may be larger than between-patient variability. Results for mean daily intake of total fat, total carbohydrate, total protein, retinol, vitamins D and E, thiamine, calcium, sodium, iron, selenium, copper, and zinc between the two assessment methods were also similar. Conclusions This novel nutrient intake assessment tool provided quantitative nutrient intake data from TB patients. These pilot data can inform larger studies in similar populations. PMID:23541173

  9. Versailles Project on Advanced Materials and Standards Interlaboratory Study on Measuring the Thickness and Chemistry of Nanoparticle Coatings Using XPS and LEIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belsey, Natalie A.; Cant, David J. H.; Minelli, Caterina

    We report the results of a VAMAS (Versailles Project on Advanced Materials and Standards) inter-laboratory study on the measurement of the shell thickness and chemistry of nanoparticle coatings. Peptide-coated gold particles were supplied to laboratories in two forms: a colloidal suspension in pure water, and particles dried onto a silicon wafer. Participants prepared and analyzed these samples using either X-ray photoelectron spectroscopy (XPS) or low energy ion scattering (LEIS). Careful data analysis revealed some significant sources of discrepancy, particularly for XPS. Degradation during transportation, storage or sample preparation resulted in a variability in thickness of 53 %. The calculation method chosen by XPS participants contributed a variability of 67 %. However, variability of 12 % was achieved for the samples deposited using a single method and by choosing photoelectron peaks that were not adversely affected by instrumental transmission effects. The study identified a need for more consistency in instrumental transmission functions and relative sensitivity factors, since this contributed a variability of 33 %. The results from the LEIS participants were more consistent, with variability of less than 10 % in thickness, mostly due to a common method of data analysis. The calculation was performed using a model developed for uniform, flat films, and some participants employed a correction factor to account for the sample geometry, which appears warranted based upon a simulation of LEIS data from one of the participants and comparison to the XPS results.

  10. Close-range laser scanning in forests: towards physically based semantics across scales.

    PubMed

    Morsdorf, F; Kükenbrink, D; Schneider, F D; Abegg, M; Schaepman, M E

    2018-04-06

    Laser scanning with its unique measurement concept holds the potential to revolutionize the way we assess and quantify three-dimensional vegetation structure. Modern laser systems used at close range, be it on terrestrial, mobile or unmanned aerial platforms, provide dense and accurate three-dimensional data whose information just waits to be harvested. However, the transformation of such data to information is not as straightforward as for airborne and space-borne approaches, where typically empirical models are built using ground truth of target variables. Simpler variables, such as diameter at breast height, can be readily derived and validated. More complex variables, e.g. leaf area index, need a thorough understanding and consideration of the physical particularities of the measurement process and semantic labelling of the point cloud. Quantified structural models provide a framework for such labelling by deriving stem and branch architecture, a basis for many of the more complex structural variables. The physical information of the laser scanning process is still underused and we show how it could play a vital role in conjunction with three-dimensional radiative transfer models to shape the information retrieval methods of the future. Using such a combined forward and physically based approach will make methods robust and transferable. In addition, it avoids replacing observer bias from field inventories with instrument bias from different laser instruments. Still, an intensive dialogue with the users of the derived information is mandatory to potentially re-design structural concepts and variables so that they profit most of the rich data that close-range laser scanning provides.

  11. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    PubMed Central

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Built on structural mean models, considerable work has recently been developed for consistent estimation of the causal relative risk and causal odds ratio. Such models can sometimes suffer from identification issues with weak instruments, which has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When multiple genetic variants are available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provide valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158
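
    With multiple genetic variants as instruments, a simpler and widely used estimator related to the concordance idea above is the fixed-effect inverse-variance-weighted (IVW) combination of per-variant ratio estimates; the sketch below is that standard estimator, not the paper's GLS method:

```python
from math import sqrt

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Fixed-effect IVW causal estimate across genetic instruments,
    using first-order weights 1/se_outcome**2."""
    weights = [1.0 / s ** 2 for s in se_outcome]
    denom = sum(w * bx * bx for w, bx in zip(weights, beta_exposure))
    num = sum(w * bx * by for w, bx, by in zip(weights, beta_exposure, beta_outcome))
    return num / denom, sqrt(1.0 / denom)
```

    If every variant's outcome association is exactly half its exposure association, the pooled causal estimate is 0.5; variant-to-variant disagreement in these ratios is what concordance-style tests probe.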

  12. Instrumental variable applications using nursing home prescribing preferences in comparative effectiveness research.

    PubMed

    Huybrechts, Krista F; Gerhard, Tobias; Franklin, Jessica M; Levin, Raisa; Crystal, Stephen; Schneeweiss, Sebastian

    2014-08-01

    Nursing home residents are of particular interest for comparative effectiveness research given their susceptibility to adverse treatment effects and systematic exclusion from trials. However, the risk of residual confounding because of unmeasured markers of declining health using conventional analytic methods is high. We evaluated the validity of instrumental variable (IV) methods based on nursing home prescribing preference to mitigate such confounding, using psychotropic medications to manage behavioral problems in dementia as a case study. A cohort using linked data from Medicaid, Medicare, Minimum Data Set, and Online Survey, Certification and Reporting for 2001-2004 was established. Dual-eligible patients ≥65 years who initiated psychotropic medication use after admission were selected. Nursing home prescribing preference was characterized using mixed-effects logistic regression models. The plausibility of IV assumptions was explored, and the association between psychotropic medication class and 180-day mortality was estimated. High-prescribing and low-prescribing nursing homes differed by a factor of 2. Each preference-based IV measure described a substantial proportion of variation in psychotropic medication choice (β(IV → treatment): 0.22-0.36). Measured patient characteristics were well balanced across patient groups based on instrument status (52% average reduction in Mahalanobis distance). There was no evidence that instrument status was associated with markers of nursing home quality of care. Findings indicate that IV analyses using nursing home prescribing preference may be a useful approach in comparative effectiveness studies, and should extend naturally to analyses including untreated comparison groups, which are of great scientific interest but subject to even stronger confounding. Copyright © 2014 John Wiley & Sons, Ltd.

  13. Wage Determinants among Medical Doctors and Nurses in Spain

    ERIC Educational Resources Information Center

    Salas-Velasco, Manuel

    2010-01-01

    This paper examines the determination of wage rates for health professionals using three well known, and commonly used, econometric techniques: ordinary least squares, instrumental variables, and Heckman's method. The data come from a graduate survey and the analysis focuses on a regional labor market, due to nationwide information on salaries is…

  14. Mixing Methods in Randomized Controlled Trials (RCTs): Validation, Contextualization, Triangulation, and Control

    ERIC Educational Resources Information Center

    Spillane, James P.; Pareja, Amber Stitziel; Dorner, Lisa; Barnes, Carol; May, Henry; Huff, Jason; Camburn, Eric

    2010-01-01

    In this paper we described how we mixed research approaches in a Randomized Control Trial (RCT) of a school principal professional development program. Using examples from our study we illustrate how combining qualitative and quantitative data can address some key challenges from validating instruments and measures of mediator variables to…

  15. Do Students in Secondary Education Manifest Sexist Attitudes?

    ERIC Educational Resources Information Center

    Pozo, Carmen; Martos, Maria J.; Morillejo, Enrique Alonso

    2010-01-01

    Introduction: Sexism and sexist attitudes can give rise to gender violence. It is therefore important to analyze these variables at an early age (in secondary school classrooms); from this analysis we will have a basis for intervention. Method: The study sample consists of 962 secondary school students. Measuring instruments were used to assess…

  16. Rural Idaho Family Physicians' Scope of Practice

    ERIC Educational Resources Information Center

    Baker, Ed; Schmitz, David; Epperly, Ted; Nukui, Ayaka; Miller, Carissa Moffat

    2010-01-01

    Context: Scope of practice is an important factor in both training and recruiting rural family physicians. Purpose: To assess rural Idaho family physicians' scope of practice and to examine variations in scope of practice across variables such as gender, age and employment status. Methods: A survey instrument was developed based on a literature…

  17. Determining the Attitudes of Undergraduate Students Having Vocational Music Education towards Individual Instrument Course According to Different Variables

    ERIC Educational Resources Information Center

    Uluçay, Taner

    2017-01-01

    This study was carried out in order to determine attitudes of undergraduate students who studied music vocationally towards the individual instrument course according to the variables of grade, gender, individual instrument and graduated high school type. The research data were obtained from 102 undergraduate students studying in Erzincan…

  18. Thin layer activation techniques at the U-120 cyclotron of Bucharest

    NASA Astrophysics Data System (ADS)

    Constantinescu, B.; Ivanov, E. A.; Pascovici, G.; Popa-Simil, L.; Racolta, P. M.

    1994-05-01

    The Thin Layer Activation (TLA) technique is a nuclear method used especially for different types of wear (or corrosion) investigations. Experimental results for selection criteria of nuclear reactions for various tribological studies, using the IPNE U-120 classical variable-energy cyclotron, are presented. Measuring methods for the main types of wear phenomena and home-made instrumentation dedicated to TLA industrial applications are also reported. Some typical TLA tribological applications, including a nuclear scanning method to obtain the wear profile of piston rings, are presented as well.

  19. Instrumental variable approaches to identifying the causal effect of educational attainment on dementia risk

    PubMed Central

    Nguyen, Thu T.; Tchetgen Tchetgen, Eric J.; Kawachi, Ichiro; Gilman, Stephen E.; Walter, Stefan; Liu, Sze Y.; Manly, Jennifer; Glymour, M. Maria

    2015-01-01

    Purpose Education is an established correlate of cognitive status in older adulthood, but whether expanding educational opportunities would improve cognitive functioning remains unclear given limitations of prior studies for causal inference. Therefore, we conducted instrumental variable (IV) analyses of the association between education and dementia risk, using, for the first time in this area, genetic variants as instruments as well as state-level school policies. Methods IV analyses in the Health and Retirement Study cohort (1998–2010) used two sets of instruments: 1) a genetic risk score constructed from three single nucleotide polymorphisms (SNPs) (n=8,054); and 2) compulsory schooling laws (CSLs) and state school characteristics (term length, student teacher ratios, and expenditures) (n=13,167). Results Employing the genetic risk score as an IV, there was a 1.1% reduction in dementia risk per year of schooling (95% CI: −2.4, 0.02). Leveraging compulsory schooling laws and state school characteristics as IVs, there was a substantially larger protective effect (−9.5%; 95% CI: −14.8, −4.2). Analyses evaluating the plausibility of the IV assumptions indicated estimates derived from analyses relying on CSLs provide the best estimates of the causal effect of education. Conclusion IV analyses suggest education is protective against risk of dementia in older adulthood. PMID:26633592

  20. Two-Stage Bayesian Model Averaging in Endogenous Variable Models*

    PubMed Central

    Lenkoski, Alex; Eicher, Theo S.; Raftery, Adrian E.

    2013-01-01

    Economic modeling in the presence of endogeneity is subject to model uncertainty at both the instrument and covariate level. We propose a Two-Stage Bayesian Model Averaging (2SBMA) methodology that extends the Two-Stage Least Squares (2SLS) estimator. By constructing a Two-Stage Unit Information Prior in the endogenous variable model, we are able to efficiently combine established methods for addressing model uncertainty in regression models with the classic technique of 2SLS. To assess the validity of instruments in the 2SBMA context, we develop Bayesian tests of the identification restriction that are based on model averaged posterior predictive p-values. A simulation study showed that 2SBMA has the ability to recover structure in both the instrument and covariate set, and substantially improves the sharpness of resulting coefficient estimates in comparison to 2SLS using the full specification in an automatic fashion. Due to the increased parsimony of the 2SBMA estimate, the Bayesian Sargan test had a power of 50 percent in detecting a violation of the exogeneity assumption, while the method based on 2SLS using the full specification had negligible power. We apply our approach to the problem of development accounting, and find support not only for institutions, but also for geography and integration as development determinants, once both model uncertainty and endogeneity have been jointly addressed. PMID:24223471
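The classic 2SLS estimator that 2SBMA extends can be sketched in a few lines of numpy for the simplest case of one endogenous regressor and one instrument. This is an illustrative toy implementation on simulated data, with no standard errors, model averaging, or overidentification tests.

```python
import numpy as np

def two_stage_least_squares(y, x, z):
    """Minimal 2SLS sketch: one endogenous regressor x, one instrument z
    (plus intercepts). Returns the estimated coefficient of x on y."""
    y, x, z = (np.asarray(a, float) for a in (y, x, z))
    Z = np.column_stack([np.ones_like(z), z])
    # first stage: project the endogenous regressor onto the instrument
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    Xh = np.column_stack([np.ones_like(x_hat), x_hat])
    # second stage: regress the outcome on the fitted values
    return float(np.linalg.lstsq(Xh, y, rcond=None)[0][1])
```

On data where an unmeasured confounder drives both x and y, ordinary least squares is biased while this estimator recovers the structural coefficient, which is the basic motivation for the 2SLS machinery the abstract builds on.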

  1. Evaluation of Brazilian Sugarcane Bagasse Characterization: An Interlaboratory Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sluiter, Justin B.; Chum, Helena; Gomes, Absai C.

    2016-05-01

    This paper describes a study of the variability of measured composition for a single bulk sugarcane bagasse conducted across eight laboratories using similar analytical methods, with the purpose of determining the expected variation for compositional analysis performed by different laboratories. The results show good agreement of measured composition within a single laboratory, but greater variability when results are compared among laboratories. These interlaboratory variabilities do not seem to be associated with a specific method or technique or any single piece of instrumentation. The summary censored statistics provide mean values and pooled standard deviations as follows: total extractives 6.7% (0.6%), whole ash 1.5% (0.2%), glucan 42.3% (1.2%), xylan 22.3% (0.5%), total lignin 21.3% (0.4%), and total mass closure 99.4% (2.9%).
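The pooled standard deviations summarized above weight each laboratory's replicate variance by its degrees of freedom. A minimal sketch of that computation, on toy replicate data rather than the study's measurements:

```python
import numpy as np

def pooled_sd(groups):
    """Pooled standard deviation across several labs' replicate
    measurements: each lab's sample variance is weighted by its
    degrees of freedom (n_i - 1)."""
    groups = [np.asarray(g, float) for g in groups]
    num = sum((len(g) - 1) * g.var(ddof=1) for g in groups)
    dof = sum(len(g) - 1 for g in groups)
    return float(np.sqrt(num / dof))
```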

  2. The quality of evidence of psychometric properties of three-dimensional spinal posture-measuring instruments

    PubMed Central

    2011-01-01

    Background Psychometric properties include validity, reliability and sensitivity to change. Establishing the psychometric properties of an instrument which measures three-dimensional human posture is essential prior to applying it in clinical practice or research. Methods This paper reports the findings of a systematic literature review which aimed to 1) identify non-invasive three-dimensional (3D) human posture-measuring instruments; and 2) assess the quality of reporting of the methodological procedures undertaken to establish their psychometric properties, using a purpose-built critical appraisal tool. Results Seventeen instruments were identified, of which nine were supported by research into psychometric properties. Eleven and six papers, respectively, reported on validity and reliability testing. Rater qualification and reference standards were generally poorly addressed, and there was variable quality of reporting of rater blinding and statistical analysis. Conclusions There is a lack of current research to establish the psychometric properties of non-invasive 3D human posture-measuring instruments. PMID:21569486

  3. The Perceptions of Fair Interpersonal Treatment Scale: development and validation of a measure of interpersonal treatment in the workplace.

    PubMed

    Donovan, M A; Drasgow, F; Munson, L J

    1998-10-01

    The Perceptions of Fair Interpersonal Treatment (PFIT) scale was designed to assess employees' perceptions of the interpersonal treatment in their work environment. Analyses of the factor structure and reliability of this new instrument indicate that the PFIT scale is a reliable instrument composed of 2 factors: supervisor treatment and coworker treatment. It was hypothesized that the PFIT scale would be positively correlated with job satisfaction variables and negatively correlated with work withdrawal, job withdrawal, experiences of sexual harassment, and an organization's tolerance of sexual harassment. Results based on 509 employees in a private-sector organization and 217 female faculty and staff members at a large midwestern university supported these hypotheses. Arguments that common method variance and employees' dispositions are responsible for the significant correlations between the PFIT scale and other job-related variables were eliminated. The implications of these results are discussed.

  4. Beyond annual streamflow reconstructions for the Upper Colorado River Basin: a paleo-water-balance approach

    USGS Publications Warehouse

    Gangopadhyay, Subhrendu; McCabe, Gregory J.; Woodhouse, Connie A.

    2015-01-01

    In this paper, we present a methodology to use annual tree-ring chronologies and a monthly water balance model to generate annual reconstructions of water balance variables (e.g., potential evapotranspiration (PET), actual evapotranspiration (AET), snow water equivalent (SWE), soil moisture storage (SMS), and runoff (R)). The method involves resampling monthly temperature and precipitation from the instrumental record directed by variability indicated by the paleoclimate record. The generated time series of monthly temperature and precipitation are subsequently used as inputs to a monthly water balance model. The methodology is applied to the Upper Colorado River Basin, and results indicate that the methodology reliably simulates water-year runoff, maximum snow water equivalent, and seasonal soil moisture storage for the instrumental period. As a final application, the methodology is used to produce time series of PET, AET, SWE, SMS, and R for the 1404–1905 period for the Upper Colorado River Basin.

  5. Parameter de-correlation and model-identification in hybrid-style terrestrial laser scanner self-calibration

    NASA Astrophysics Data System (ADS)

    Lichti, Derek D.; Chow, Jacky; Lahamy, Hervé

    One of the important systematic error parameters identified in terrestrial laser scanners is the collimation axis error, which models the non-orthogonality between two instrumental axes. The quality of this parameter determined by self-calibration, as measured by its estimated precision and its correlation with the tertiary rotation angle κ of the scanner exterior orientation, is strongly dependent on instrument architecture. While the quality is generally very high for panoramic-type scanners, it is comparably poor for hybrid-style instruments. Two methods for improving the quality of the collimation axis error in hybrid instrument self-calibration are proposed herein: (1) the inclusion of independent observations of the tertiary rotation angle κ; and (2) the use of a new collimation axis error model. Five real datasets were captured with two different hybrid-style scanners to test each method's efficacy. While the first method achieves the desired outcome of complete decoupling of the collimation axis error from κ, it is shown that the high correlation is simply transferred to other model variables. The second method achieves partial parameter de-correlation to acceptable levels. Importantly, it does so without any adverse, secondary correlations and is therefore the method recommended for future use. Finally, systematic error model identification has been greatly aided in previous studies by graphical analyses of self-calibration residuals. This paper presents results showing the architecture dependence of this technique, revealing its limitations for hybrid scanners.

  6. Econometrics in outcomes research: the use of instrumental variables.

    PubMed

    Newhouse, J P; McClellan, M

    1998-01-01

    We describe an econometric technique, instrumental variables, that can be useful in estimating the effectiveness of clinical treatments in situations when a controlled trial has not or cannot be done. This technique relies upon the existence of one or more variables that induce substantial variation in the treatment variable but have no direct effect on the outcome variable of interest. We illustrate the use of the technique with an application to aggressive treatment of acute myocardial infarction in the elderly.

  7. Erythropoietin-Stimulating Agents and Survival in End-Stage Renal Disease: Comparison of Payment Policy Analysis, Instrumental Variables, and Multiple Imputation of Potential Outcomes

    PubMed Central

    Dore, David D.; Swaminathan, Shailender; Gutman, Roee; Trivedi, Amal N.; Mor, Vincent

    2013-01-01

    Objective To compare the assumptions and estimands across three approaches to estimating the effect of erythropoietin-stimulating agents (ESAs) on mortality. Study Design and Setting Using data from the Renal Management Information System, we conducted two analyses utilizing a change to bundled payment that we hypothesized mimicked random assignment to ESA (pre-post, difference-in-difference, and instrumental variable analyses). A third analysis was based on multiply imputing potential outcomes using propensity scores. Results There were 311,087 recipients of ESAs and 13,095 non-recipients. In the pre-post comparison, we identified no clear relationship between bundled payment (measured by calendar time) and the incidence of death within six months (risk difference -1.5%; 95% CI - 7.0% to 4.0%). In the instrumental variable analysis, the risk of mortality was similar among ESA recipients (risk difference -0.9%; 95% CI -2.1 to 0.3). In the multiple imputation analysis, we observed a 4.2% (95% CI 3.4% to 4.9%) absolute reduction in mortality risk with use of ESAs, but closer to the null for patients with baseline hematocrit >36%. Conclusion Methods emanating from different disciplines often rely on different assumptions, but can be informative about a similar causal contrast. The implications of these distinct approaches are discussed. PMID:23849152

  8. Effect of Highly Active Antiretroviral Therapy on Incident AIDS Using Calendar Period as an Instrumental Variable

    PubMed Central

    Cole, Stephen R.; Greenland, Sander; Brown, Todd T.; Chmiel, Joan S.; Kingsley, Lawrence; Detels, Roger

    2009-01-01

    Human immunodeficiency virus (HIV) researchers often use calendar periods as an imperfect proxy for highly active antiretroviral therapy (HAART) when estimating the effect of HAART on HIV disease progression. The authors report on 614 HIV-positive homosexual men followed from 1984 to 2007 in 4 US cities. During 5,321 person-years, 268 of 614 men incurred acquired immunodeficiency syndrome, 49 died, and 90 were lost to follow-up. Comparing the pre-HAART calendar period (<1996) with the HAART calendar period (≥1996) resulted in a naive rate ratio of 3.62 (95% confidence limits: 2.67, 4.92). However, this estimate is likely biased because of misclassification of HAART use by calendar period. Simple calendar period approaches may circumvent confounding by indication at the cost of inducing exposure misclassification. To correct this misclassification, the authors propose an instrumental-variable estimator analogous to ones previously used for noncompliance corrections in randomized clinical trials. When the pre-HAART calendar period was compared with the HAART calendar period, the instrumental-variable rate ratio was 5.02 (95% confidence limits: 3.45, 7.31), 39% higher than the naive result. Weighting by the inverse probability of calendar period given age at seroconversion, race/ethnicity, and time since seroconversion did not appreciably alter the results. These methods may help resolve discrepancies between observational and randomized evidence. PMID:19318615
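The noncompliance-correction logic the authors adapt is easiest to see in the standard Wald IV estimator for a binary instrument: the intention-to-treat contrast in the outcome is rescaled by how much the instrument shifts actual treatment. The sketch below works on risk differences with toy data; it is not the authors' rate-ratio estimator.

```python
import numpy as np

def wald_iv_estimate(z, x, y):
    """Wald IV estimator for a binary instrument z (e.g., calendar
    period), actual treatment x (e.g., HAART use), and outcome y.
    Returns the ITT effect divided by the first-stage uptake difference."""
    z, x, y = (np.asarray(a, float) for a in (z, x, y))
    itt = y[z == 1].mean() - y[z == 0].mean()      # effect of the proxy
    uptake = x[z == 1].mean() - x[z == 0].mean()   # how much z shifts treatment
    return itt / uptake
```

Because the uptake difference is below 1 when the calendar period misclassifies treatment, the corrected estimate is larger in magnitude than the naive one, matching the direction of the 39% inflation reported above.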

  9. Inter-Hospital Transfer is Associated with Increased Mortality and Costs in Severe Sepsis and Septic Shock: An Instrumental Variables Approach

    PubMed Central

    Mohr, Nicholas M.; Harland, Karisa K.; Shane, Dan M.; Ahmed, Azeemuddin; Fuller, Brian M.; Torner, James C.

    2016-01-01

    Purpose The objective of this study was to evaluate the impact of regionalization on sepsis survival, to describe the role of inter-hospital transfer in rural sepsis care, and to measure the cost of inter-hospital transfer in a predominantly rural state. Materials and Methods Observational case-control study using statewide administrative claims data from 2005-2014 in a predominantly rural Midwestern state. Mortality and marginal costs were estimated with multivariable generalized estimating equations (GEE) models and with instrumental variables models. Results A total of 18,246 patients were included, of which 59% were transferred between hospitals. Transferred patients had higher mortality and longer hospital length-of-stay than non-transferred patients. Using a multivariable GEE model to adjust for potentially confounding factors, inter-hospital transfer was associated with increased mortality (aOR 1.7, 95%CI 1.5 – 1.9). Using an instrumental variables model, transfer was associated with a 9.2% increased risk of death. Transfer was associated with additional costs of $6,897 (95%CI $5,769-8,024). Even when limiting to only those patients who received care in the largest hospitals, transfer was still associated with $5,167 (95%CI $3,696-6,638) in additional cost. Conclusions The majority of rural sepsis patients are transferred, and these transferred patients have higher mortality and significantly increased cost of care. PMID:27546770

  10. Rapid optimization of MRM-MS instrument parameters by subtle alteration of precursor and product m/z targets.

    PubMed

    Sherwood, Carly A; Eastham, Ashley; Lee, Lik Wee; Risler, Jenni; Mirzaei, Hamid; Falkner, Jayson A; Martin, Daniel B

    2009-07-01

    Multiple reaction monitoring (MRM) is a highly sensitive method of targeted mass spectrometry (MS) that can be used to selectively detect and quantify peptides based on the screening of specified precursor peptide-to-fragment ion transitions. MRM-MS sensitivity depends critically on the tuning of instrument parameters, such as collision energy and cone voltage, for the generation of maximal product ion signal. Although generalized equations and values exist for such instrument parameters, there is no clear indication that optimal signal can be reliably produced for all types of MRM transitions using such an algorithmic approach. To address this issue, we have devised a workflow functional on both Waters Quattro Premier and ABI 4000 QTRAP triple quadrupole instruments that allows rapid determination of the optimal value of any programmable instrument parameter for each MRM transition. Here, we demonstrate the strategy for the optimizations of collision energy and cone voltage, but the method could be applied to other instrument parameters, such as declustering potential, as well. The workflow makes use of the incremental adjustment of the precursor and product m/z values at the hundredth decimal place to create a series of MRM targets at different collision energies that can be cycled through in rapid succession within a single run, avoiding any run-to-run variability in execution or comparison. Results are easily visualized and quantified using the MRM software package Mr. M to determine the optimal instrument parameters for each transition.
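The m/z-encoding trick described above can be sketched as a small helper that emits one MRM target per candidate collision energy, nudging the precursor m/z at the hundredth decimal place so every setting can be cycled through in a single run. This is a hypothetical illustration, not the authors' software.

```python
def mrm_optimization_targets(precursor_mz, product_mz, energies):
    """Build one MRM target per candidate collision energy. The i-th
    target's precursor m/z is shifted by i * 0.01, so the hundredths
    digit tags which collision energy produced each observed signal."""
    return [
        {"precursor_mz": round(precursor_mz + i * 0.01, 2),
         "product_mz": product_mz,
         "collision_energy": ce}
        for i, ce in enumerate(energies)
    ]
```

The same pattern would apply to cone voltage or any other per-transition parameter: each candidate value simply gets its own subtly shifted m/z target.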

  11. Rapid Optimization of MRM-MS Instrument Parameters by Subtle Alteration of Precursor and Product m/z Targets

    PubMed Central

    Sherwood, Carly A.; Eastham, Ashley; Lee, Lik Wee; Risler, Jenni; Mirzaei, Hamid; Falkner, Jayson A.; Martin, Daniel B.

    2009-01-01

    Multiple reaction monitoring (MRM) is a highly sensitive method of targeted mass spectrometry (MS) that can be used to selectively detect and quantify peptides based on the screening of specified precursor peptide-to-fragment ion transitions. MRM-MS sensitivity depends critically on the tuning of instrument parameters, such as collision energy and cone voltage, for the generation of maximal product ion signal. Although generalized equations and values exist for such instrument parameters, there is no clear indication that optimal signal can be reliably produced for all types of MRM transitions using such an algorithmic approach. To address this issue, we have devised a workflow functional on both Waters Quattro Premier and ABI 4000 QTRAP triple quadrupole instruments that allows rapid determination of the optimal value of any programmable instrument parameter for each MRM transition. Here, we demonstrate the strategy for the optimizations of collision energy and cone voltage, but the method could be applied to other instrument parameters, such as declustering potential, as well. The workflow makes use of the incremental adjustment of the precursor and product m/z values at the hundredth decimal place to create a series of MRM targets at different collision energies that can be cycled through in rapid succession within a single run, avoiding any run-to-run variability in execution or comparison. Results are easily visualized and quantified using the MRM software package Mr. M to determine the optimal instrument parameters for each transition. PMID:19405522

  12. A Rapid Leaf-Disc Sampler for Psychrometric Water Potential Measurements 1

    PubMed Central

    Wullschleger, Stan D.; Oosterhuis, Derrick M.

    1986-01-01

    An instrument was designed which facilitates faster and more accurate sampling of leaf discs for psychrometric water potential measurements. The instrument consists of an aluminum housing, a spring-loaded plunger, and a modified brass-plated cork borer. The leaf-disc sampler was compared with the conventional method of sampling discs for measurement of leaf water potential with thermocouple psychrometers on a range of plant material including Gossypium hirsutum L., Zea mays L., and Begonia rex-cultorum L. The new sampler permitted a leaf disc to be excised and inserted into the psychrometer sample chamber in less than 7 seconds, which was more than twice as fast as the conventional method. This resulted in more accurate determinations of leaf water potential due to reduced evaporative water losses. The leaf-disc sampler also significantly reduced sample variability between individual measurements. This instrument can be used for many other laboratory and field measurements that necessitate leaf disc sampling. PMID:16664879

  13. Subjective evaluation of treatment outcomes of instrumentation with pedicle screws or hybrid constructs in Lenke Type 1 and 2 adolescent idiopathic scoliosis: what happens when judges are blinded to the instrumentation?

    PubMed Central

    Ouellet, Jean Albert; Shilt, Jeffrey; Shen, Francis H.; Wood, Kirkham; Chan, Donald; Hicks, John; Bersusky, Ernesto; Reddi, Vasantha

    2009-01-01

    Superiority of pedicle screws over hybrid/hook instrumentation or vice versa in the treatment of Lenke Type 1 and 2 adolescent idiopathic scoliosis (AIS) remains unresolved for moderate curves. Our objective was therefore to compare the assessment of pedicle screw and hybrid/hooks instrumentation with special attention to cosmesis and uninstrumented spine using novel assessment methods. We carried out a retrospective study of radiographs and clinical photos of 40 cases of thoracic AIS between 40° and 70° of Cobb angle Lenke Type 1 and 2, treated with either pedicle screws or hybrid/hooks. The cases were subjectively assessed by four spine surgeons (SRS Travelling Fellows) for radiographic and operative cosmetic result, shoulder balance, trunk shift, rib hump, and waist asymmetry. Instrumentation in the radiographs was obscured with only the non-instrumented part visible, and the surgeons were asked to guess the instrumentation being used. Eighty photographs of patients before and after surgery were assessed for cosmesis by ten non-medical judges for overall cosmetic score, shoulder balance, waist asymmetry, and shoulder blade prominence. Objective assessment of radiographs and clinical photos was performed for Cobb angle of instrumented and non-instrumented spine, global coronal and sagittal balance, number of unfused vertebrae, disc angulation, tilt of last instrumented vertebra, shoulder balance, waist asymmetry, rib prominence, and percent correction. SRS-24 questionnaire was used to measure health-related quality of life in patients. Subjective assessments by surgeons and non-medical judges showed no significant difference by instrumentation (P ≥ 0.05) for all variables. Out of the 160 guesses by surgeons of the cases with instrumentation blocked in the radiographs, they were unable to guess the instrumentation in 92% of the cases. 
Objective assessment of all variables and SRS-24 scores of all five domains showed no significant difference by instrumentation (P ≥ 0.05). In this first-ever conducted study in a blinded-fashion, we conclude that there is no significant difference between the pedicle screw and hybrid/hooks instrumentations used to treat AIS for Lenke Type 1 and 2 curves for moderate curves between 40° and 70°. PMID:19672635

  14. Development process and initial validation of the Ethical Conflict in Nursing Questionnaire-Critical Care Version

    PubMed Central

    2013-01-01

    Background Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables ‘frequency’ and ‘degree of conflict’. In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable ‘exposure to conflict’, as well as considering six ‘types of ethical conflict’. An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). Methods The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validity, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach’s alpha was used to evaluate the instrument’s reliability. All analyses were performed using the statistical software PASW v19. Results Cronbach’s alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. Conclusions The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units. 
Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV. PMID:23725477
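Cronbach's alpha, the reliability statistic reported above for the ECNQ-CCV, is straightforward to compute from a respondents-by-items score matrix. A minimal numpy sketch on toy data (not the study's):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)
```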

  15. The Trojan Lifetime Champions Health Survey: Development, Validity, and Reliability

    PubMed Central

    Sorenson, Shawn C.; Romano, Russell; Scholefield, Robin M.; Schroeder, E. Todd; Azen, Stanley P.; Salem, George J.

    2015-01-01

    Context Self-report questionnaires are an important method of evaluating lifespan health, exercise, and health-related quality of life (HRQL) outcomes among elite, competitive athletes. Few instruments, however, have undergone formal characterization of their psychometric properties within this population. Objective To evaluate the validity and reliability of a novel health and exercise questionnaire, the Trojan Lifetime Champions (TLC) Health Survey. Design Descriptive laboratory study. Setting A large National Collegiate Athletic Association Division I university. Patients or Other Participants A total of 63 university alumni (age range, 24 to 84 years), including former varsity collegiate athletes and a control group of nonathletes. Intervention(s) Participants completed the TLC Health Survey twice at a mean interval of 23 days with randomization to the paper or electronic version of the instrument. Main Outcome Measure(s) Content validity, feasibility of administration, test-retest reliability, parallel-form reliability between paper and electronic forms, and estimates of systematic and typical error versus differences of clinical interest were assessed across a broad range of health, exercise, and HRQL measures. Results Correlation coefficients, including intraclass correlation coefficients (ICCs) for continuous variables and κ agreement statistics for ordinal variables, for test-retest reliability averaged 0.86, 0.90, 0.80, and 0.74 for HRQL, lifetime health, recent health, and exercise variables, respectively. Correlation coefficients, again ICCs and κ, for parallel-form reliability (ie, equivalence) between paper and electronic versions averaged 0.90, 0.85, 0.85, and 0.81 for HRQL, lifetime health, recent health, and exercise variables, respectively. Typical measurement error was less than the a priori thresholds of clinical interest, and we found minimal evidence of systematic test-retest error. 
We found strong evidence of content validity, convergent construct validity with the Short-Form 12 Version 2 HRQL instrument, and feasibility of administration in an elite, competitive athletic population. Conclusions These data suggest that the TLC Health Survey is a valid and reliable instrument for assessing lifetime and recent health, exercise, and HRQL, among elite competitive athletes. Generalizability of the instrument may be enhanced by additional, larger-scale studies in diverse populations. PMID:25611315
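One common member of the intraclass-correlation family used for test-retest reliability, the one-way random-effects ICC(1,1), can be sketched as follows. The survey's analysis may well have used a different ICC form; this is a toy illustration of the statistic's logic.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for an (n_subjects, k_ratings)
    matrix, from the between- and within-subject mean squares:
    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW)."""
    r = np.asarray(ratings, float)
    n, k = r.shape
    grand = r.mean()
    msb = k * ((r.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    msw = ((r - r.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```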

  16. Individual differences in toddlers’ social understanding and prosocial behavior: disposition or socialization?

    PubMed Central

    Gross, Rebekkah L.; Drummond, Jesse; Satlof-Bedrick, Emma; Waugh, Whitney E.; Svetlova, Margarita; Brownell, Celia A.

    2015-01-01

    We examined how individual differences in social understanding contribute to variability in early-appearing prosocial behavior. Moreover, potential sources of variability in social understanding were explored and examined as additional possible predictors of prosocial behavior. Using a multi-method approach with both observed and parent-report measures, 325 children aged 18–30 months were administered measures of social understanding (e.g., use of emotion words; self-understanding), prosocial behavior (in separate tasks measuring instrumental helping, empathic helping, and sharing, as well as parent-reported prosociality at home), temperament (fearfulness, shyness, and social fear), and parental socialization of prosocial behavior in the family. Individual differences in social understanding predicted variability in empathic helping and parent-reported prosociality, but not instrumental helping or sharing. Parental socialization of prosocial behavior was positively associated with toddlers’ social understanding, prosocial behavior at home, and instrumental helping in the lab, and negatively associated with sharing (possibly reflecting parents’ increased efforts to encourage children who were less likely to share). Further, socialization moderated the association between social understanding and prosocial behavior, such that social understanding was less predictive of prosocial behavior among children whose parents took a more active role in socializing their prosociality. None of the dimensions of temperament was associated with either social understanding or prosocial behavior. Parental socialization of prosocial behavior is thus an important source of variability in children’s early prosociality, acting in concert with early differences in social understanding, with different patterns of influence for different subtypes of prosocial behavior. PMID:26029139

  17. Automated Gait and Balance Parameters Diagnose and Correlate with Severity in Parkinson Disease

    PubMed Central

    Dewey, Daniel C.; Miocinovic, Svjetlana; Bernstein, Ira; Khemani, Pravin; Dewey, Richard B.; Querry, Ross; Chitnis, Shilpa

    2014-01-01

    Objective: To assess the suitability of instrumented gait and balance measures for diagnosis and estimation of disease severity in PD. Methods: Each subject performed iTUG (instrumented Timed-Up-and-Go) and iSway (instrumented Sway) using the APDM® Mobility Lab. MDS-UPDRS parts II and III, a postural instability and gait disorder (PIGD) score, the mobility subscale of the PDQ-39, and Hoehn & Yahr stage were measured in the PD cohort. Two sets of gait and balance variables were defined by high correlation with diagnosis or disease severity and were evaluated using multiple linear and logistic regressions, ROC analyses, and t-tests. Results: 135 PD subjects and 66 age-matched controls were evaluated in this prospective cohort study. We found that both iTUG and iSway variables differentiated PD subjects from controls (area under the ROC curve was 0.82 and 0.75, respectively) and correlated with all PD severity measures (R2 ranging from 0.18 to 0.61). Objective exam-based scores correlated more strongly with iTUG than iSway. The chosen set of iTUG variables was abnormal in very mild disease. Age and gender influenced gait and balance parameters and were therefore controlled for in all analyses. Interpretation: Our study identified sets of iTUG and iSway variables which correlate with PD severity measures and differentiate PD subjects from controls. These gait and balance measures could potentially serve as markers of PD progression and are under evaluation for this purpose in the ongoing NIH Parkinson Disease Biomarker Program. PMID:25082782

  18. Modeling Outcomes with Floor or Ceiling Effects: An Introduction to the Tobit Model

    ERIC Educational Resources Information Center

    McBee, Matthew

    2010-01-01

    In gifted education research, it is common for outcome variables to exhibit strong floor or ceiling effects due to insufficient range of measurement of many instruments when used with gifted populations. Common statistical methods (e.g., analysis of variance, linear regression) produce biased estimates when such effects are present. In practice,…
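
    Although the abstract is truncated here, the Tobit idea it introduces can be sketched directly: model a censored outcome with a likelihood that treats ceiling scores as "at least this high" rather than as exact values. The following is a minimal, hypothetical simulation (variable names, the ceiling value, and all parameters are invented for illustration):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical Type I Tobit fit by maximum likelihood: a latent score
# y* = b*x + e is only observed up to a ceiling, as with a test that is
# too easy for the population being measured.
rng = np.random.default_rng(1)
n, b_true, sigma, ceiling = 2000, 1.0, 1.0, 1.5
x = rng.normal(size=n)
y_star = b_true * x + rng.normal(0.0, sigma, n)
y = np.minimum(y_star, ceiling)            # ceiling effect censors high scores

def neg_loglik(theta):
    a, b, log_s = theta
    s = np.exp(log_s)
    mu = a + b * x
    cens = y >= ceiling
    # Uncensored scores contribute a normal density; censored scores
    # contribute P(y* >= ceiling).
    ll = norm.logpdf(y[~cens], mu[~cens], s).sum()
    ll += norm.logsf(ceiling, mu[cens], s).sum()
    return -ll

fit = minimize(neg_loglik, x0=[0.0, 0.5, 0.0], method="Nelder-Mead")
b_hat = fit.x[1]

# Naive OLS on the censored outcome is biased toward zero, as the
# abstract warns for ANOVA and linear regression.
b_ols = np.polyfit(x, y, 1)[0]
print(b_hat, b_ols)
```

    In this sketch the Tobit estimate recovers the latent slope while OLS on the censored scores is attenuated.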

  19. Generalized Path Analysis and Generalized Simultaneous Equations Model for Recursive Systems with Responses of Mixed Types

    ERIC Educational Resources Information Center

    Tsai, Tien-Lung; Shau, Wen-Yi; Hu, Fu-Chang

    2006-01-01

    This article generalizes linear path analysis (PA) and simultaneous equations models (SiEM) to deal with mixed responses of different types in a recursive or triangular system. An efficient instrumental variable (IV) method for estimating the structural coefficients of a 2-equation partially recursive generalized path analysis (GPA) model and…

  20. The Infant Motor Profile: A Standardized and Qualitative Method to Assess Motor Behaviour in Infancy

    ERIC Educational Resources Information Center

    Heineman, Kirsten R.; Bos, Arend F.; Hadders-Algra, Mijna

    2008-01-01

    A reliable and valid instrument to assess neuromotor condition in infancy is a prerequisite for early detection of developmental motor disorders. We developed a video-based assessment of motor behaviour, the Infant Motor Profile (IMP), to evaluate motor abilities, movement variability, ability to select motor strategies, movement symmetry, and…

  1. An Examination of the Practices of Teachers Regarding the Arrangement of Education According to Individual Differences

    ERIC Educational Resources Information Center

    Avci, Süleyman; Akinoglu, Orhan

    2014-01-01

    The present study tries to find out how often teachers in Istanbul employ the methods, techniques, materials, contents and assessment instruments that are preferred within the scope of differentiated instruction, as well as the variables that influence their choices. The results of the research indicate that teachers more frequently use specific…

  2. Instrumental variables vs. grouping approach for reducing bias due to measurement error.

    PubMed

    Batistatou, Evridiki; McNamee, Roseanne

    2008-01-01

    Attenuation of the exposure-response relationship due to exposure measurement error is often encountered in epidemiology. Given that error cannot be totally eliminated, bias correction methods of analysis are needed. Many methods require more than one exposure measurement per person to be made, but the 'group mean OLS' method, in which subjects are grouped into several a priori defined groups followed by ordinary least squares (OLS) regression on the group means, can be applied with one measurement. An alternative approach is an instrumental variable (IV) method in which the single error-prone measure is combined with an IV in IV analysis. In this paper we show that the 'group mean OLS' estimator is equal to an IV estimator with the group mean used as the IV, but that the variance estimators for the two methods are different. We derive a simple expression for the bias in the common estimator, which is a simple function of group size, reliability, and contrast of exposure between groups, and show that the bias can be very small when group size is large. We compare this method with a new proposal (the group mean ranking method), also applicable with a single exposure measurement, in which the IV is the rank of the group means. When there are two independent exposure measurements per subject, we propose a new IV method (EVROS IV) and compare it with Carroll and Stefanski's (CS IV) proposal in which the second measure is used as an IV; the new IV estimator combines aspects of the 'group mean' and 'CS' strategies. All methods are evaluated in terms of bias, precision, and root mean square error via simulations and a dataset from occupational epidemiology. The 'group mean ranking' method does not offer much improvement over the 'group mean' method. Compared with the 'CS' method, the 'EVROS' method is less affected by low reliability of exposure. We conclude that the group IV methods we propose may provide a useful way to handle mismeasured exposures in epidemiology, with or without replicate measurements. Our finding may also have implications for the use of aggregate variables in epidemiology to control for unmeasured confounding.
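
    As a rough illustration of the equivalence noted above (with equal group sizes, OLS on the group means coincides with an IV estimator that uses the subject's group mean as the instrument), the following simulation may help; all names and parameters are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: true exposure x, a single error-prone measurement w,
# outcome y = 2x + noise, and ten a priori groups with different mean exposures.
n_groups, group_size, beta_true = 10, 200, 2.0
group = np.repeat(np.arange(n_groups), group_size)
x = rng.normal(0.5 * group, 1.0)           # true exposure varies by group
w = x + rng.normal(0.0, 1.5, x.size)       # error-prone measurement
y = beta_true * x + rng.normal(0.0, 1.0, x.size)

# Naive OLS of y on w is attenuated toward zero by measurement error.
beta_naive = np.polyfit(w, y, 1)[0]

# 'Group mean OLS': ordinary least squares on the group means.
wbar = np.array([w[group == g].mean() for g in range(n_groups)])
ybar = np.array([y[group == g].mean() for g in range(n_groups)])
beta_group = np.polyfit(wbar, ybar, 1)[0]

# IV estimator with the subject's group mean of w as the instrument z:
# with equal group sizes this coincides numerically with group mean OLS.
z = wbar[group]
beta_iv = np.cov(z, y)[0, 1] / np.cov(z, w)[0, 1]

print(beta_naive, beta_group, beta_iv)
```

    With equal group sizes the two group-based estimates agree to machine precision, while the naive estimate is visibly attenuated.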

  3. CIE L*a*b* color space predictive models for colorimetry devices--analysis of perfume quality.

    PubMed

    Korifi, Rabia; Le Dréau, Yveline; Antinelli, Jean-François; Valls, Robert; Dupuy, Nathalie

    2013-01-30

    Color perception plays a major role in the consumer evaluation of perfume quality. Consumers need first to be entirely satisfied with the sensory properties of products before other quality dimensions become relevant. Evaluating the color of complex mixtures presents a challenge even for modern analytical techniques. A variety of instruments are available for color measurement; they can be classified as tristimulus colorimeters and spectrophotometers. Obsolescence of the electronics of old tristimulus colorimeters arises from the difficulty of finding repair parts and leads to their replacement by more modern instruments. High quality levels in color measurement, i.e., accuracy and reliability in color control, are the major advantages of the new generation of color instrumentation, the integrating sphere spectrophotometer. Two models of spectrophotometer were tested in transmittance mode, employing the d/0° geometry. The CIE L*a*b* color space parameters were measured with each instrument for 380 samples of raw materials and bases used in perfume compositions. The results were graphically compared between the colorimeter and the spectrophotometer devices. All color space parameters obtained with the colorimeter were used as dependent variables to generate regression equations with values obtained from the spectrophotometers. The data were statistically analyzed to create predictive models between the reference and the target instruments using two methods: the first uses linear regression analysis, and the second consists of partial least squares (PLS) regression on each component. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. Laboratory test for ice adhesion strength using commercial instrumentation.

    PubMed

    Wang, Chenyu; Zhang, Wei; Siva, Adarsh; Tiea, Daniel; Wynne, Kenneth J

    2014-01-21

    A laboratory test method for evaluating ice adhesion has been developed employing a commercially available instrument normally used for dynamic mechanical analysis (TA RSA-III). This is the first laboratory ice adhesion test that does not require a custom-built apparatus. The upper grip range of ∼10 mm is an enabling feature that is essential for the test. The method involves removal of an ice cylinder from a polymer coating with a probe and the determination of peak removal force (Ps). To validate the test method, the strength of ice adhesion was determined for a prototypical glassy polymer, poly(methyl methacrylate). The distance of the probe from the PMMA surface has been identified as a critical variable for Ps. The new test provides a readily available platform for investigating fundamental surface characteristics affecting ice adhesion. In addition to the ice release test, PMMA coatings were characterized using DSC, DCA, and TM-AFM.

  5. The EPIC-MOS Particle-Induced Background Spectrum

    NASA Technical Reports Server (NTRS)

    Kuntz, K. D.; Snowden, S. L.

    2006-01-01

    We have developed a method for constructing a spectrum of the particle-induced instrumental background of the XMM-Newton EPIC MOS detectors that can be used for observations of the diffuse background and extended sources that fill a significant fraction of the instrument field of view. The strength and spectrum of the particle-induced background, that is, the background due to the interaction of particles with the detector and the detector surroundings, is temporally variable as well as spatially variable over individual chips. Our method uses a combination of the filter-wheel-closed data and a database of unexposed-region data to construct a spectrum of the "quiescent" background. We show that, using this method of background subtraction, the differences between independent observations of the same region of "blank sky" are consistent with the statistical uncertainties except when there is clear evidence of solar wind charge exchange emission. We use the blank sky observations to show that contamination by SWCX emission is a strong function of the solar wind proton flux, and that observations through the flanks of the magnetosheath appear to be contaminated only at much higher solar wind fluxes. We have also developed a spectral model of the residual soft proton flares, which allows their effects to be removed to a substantial degree during spectral fitting.

  6. Use of instrumental variables in the analysis of generalized linear models in the presence of unmeasured confounding with applications to epidemiological research.

    PubMed

    Johnston, K M; Gustafson, P; Levy, A R; Grootendorst, P

    2008-04-30

    A major, often unstated, concern of researchers carrying out epidemiological studies of medical therapy is the potential impact on validity if estimates of treatment effects are biased due to unmeasured confounders. One technique for obtaining consistent estimates of treatment effects in the presence of unmeasured confounders is instrumental variables analysis (IVA). This technique has been well developed in the econometrics literature and is being increasingly used in epidemiological studies. However, the approach to IVA that is most commonly used in such studies is based on linear models, while many epidemiological applications make use of non-linear models, specifically generalized linear models (GLMs) such as logistic or Poisson regression. Here we present a simple method for applying IVA within the class of GLMs using the generalized method of moments approach. We explore some of the theoretical properties of the method and illustrate its use within both a simulation example and an epidemiological study where unmeasured confounding is suspected to be present. We estimate the effects of beta-blocker therapy on one-year all-cause mortality after an incident hospitalization for heart failure, in the absence of data describing disease severity, which is believed to be a confounder. Copyright © 2008 John Wiley & Sons, Ltd.
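
    A minimal sketch of this general strategy, not the authors' exact estimator: for a Poisson-type GLM one can solve multiplicative-error moment conditions, under the assumption that the instrument is independent of the (exponentiated) unmeasured confounder, whose mean is absorbed by the intercept. All data and parameters below are simulated assumptions:

```python
import numpy as np
from scipy.optimize import root

# Hypothetical setup: endogenous treatment t, unmeasured confounder u,
# instrument z, Poisson outcome with true treatment coefficient 0.5.
rng = np.random.default_rng(2)
n, b_true = 5000, 0.5
z = rng.normal(size=n)                       # instrument
u = rng.normal(size=n)                       # unmeasured confounder
t = 0.8 * z + u + rng.normal(0.0, 0.5, n)    # treatment driven by z and u
y = rng.poisson(np.exp(0.2 + b_true * t + 0.5 * u))

def iv_moments(theta):
    a, b = theta
    # Multiplicative-error moment conditions: E[z * (y e^{-(a+bt)} - 1)] = 0.
    # The confounder's mean is absorbed by the intercept, leaving b consistent.
    resid = y * np.exp(-(a + b * t)) - 1.0
    return [resid.mean(), (z * resid).mean()]

b_iv = root(iv_moments, x0=[0.2, 0.3]).x[1]

# Naive moment fit using t as its own "instrument" absorbs the
# confounding through u and overstates the treatment effect.
def naive_moments(theta):
    a, b = theta
    resid = y * np.exp(-(a + b * t)) - 1.0
    return [resid.mean(), (t * resid).mean()]

b_naive = root(naive_moments, x0=[0.2, 0.3]).x[1]
print(b_iv, b_naive)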

  7. Performing both propensity score and instrumental variable analyses in observational studies often leads to discrepant results: a systematic review.

    PubMed

    Laborde-Castérot, Hervé; Agrinier, Nelly; Thilly, Nathalie

    2015-10-01

    Propensity score (PS) and instrumental variable (IV) analyses are techniques used to adjust for confounding in observational research, and they are increasingly used simultaneously in studies evaluating health interventions. The present review aimed to analyze the agreement between PS and IV results in the medical research published to date. We reviewed all published observational studies that evaluated a clinical intervention using both PS and IV analyses, as identified in MEDLINE and Web of Science. Thirty-seven studies, most of them published during the previous 5 years, reported 55 comparisons between results from PS and IV analyses. There was slight-to-fair agreement between the methods [Cohen's kappa coefficient = 0.21 (95% confidence interval: 0.00, 0.41)]. In 23 cases (42%), results were nonsignificant for one method and significant for the other, and the IV analysis results were nonsignificant in most of these situations (87%). Discrepancies between PS and IV analyses are therefore frequent and can be interpreted in various ways. This suggests that researchers should carefully consider their analytical choices, and readers should be cautious when interpreting results, until further studies clarify the respective roles of the two methods in observational comparative effectiveness research. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Self-Consistency of Rain Event Definitions

    NASA Astrophysics Data System (ADS)

    Teves, J. B.; Larsen, M.

    2014-12-01

    A dense optical rain disdrometer array was constructed to study rain variability on spatial scales of less than 100 meters with a temporal resolution of 1 minute. Approximately two months of data were classified into rain events using methods common in the literature. These methods were unable to produce an array-wide consensus on the total number of rain events; instruments as little as 2 meters apart with similar data records sometimes identified different rain event totals. Physical considerations suggest that these differing event totals are likely due to instrument sampling fluctuations that are typically not accounted for in rain event studies. Detection of varying numbers of rain events impacts many commonly used storm statistics, including storm duration distributions and mean rain rate. A summary of these results and their implications is presented.
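
    One common event definition in the literature, used here purely as an illustrative assumption, joins wet periods separated by less than a minimum inter-event time (MIT). Even a single marginal detection can then change an instrument's event count:

```python
import numpy as np

def count_events(rain, mit=30):
    """Count rain events in a 1-min rain-rate series: wet minutes separated
    by a dry gap of at least `mit` minutes start a new event."""
    wet = np.flatnonzero(rain > 0)
    if wet.size == 0:
        return 0
    gaps = np.diff(wet)
    return 1 + np.count_nonzero(gaps >= mit)

# Two nearby instruments with nearly identical records can disagree:
rain_a = np.zeros(200)
rain_a[[10, 50, 120]] = 1.0
rain_b = rain_a.copy()
rain_b[35] = 0.2                 # one extra marginal detection
print(count_events(rain_a), count_events(rain_b))
```

    Here the extra marginal detection bridges the dry gap between two events, so the second instrument reports fewer events, the kind of sampling-driven disagreement the abstract describes.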

  9. DIY Weights and Measures: Developing Your Own Calibration Methods Using Available Materials

    NASA Astrophysics Data System (ADS)

    Marchetto, P.

    2017-12-01

    In most cases, when a custom device is built, when an instrument arrives from the manufacturer, or when a piece of equipment is picked up from the store, it is assumed that it will just work, and that data from it will be completely reliable. This fundamental expectation by many scientists is completely unfounded. Between shipping, variable environmental conditions, and many other factors, sensing systems and instruments will be out of tolerance when used. In order to get them back in tolerance, or to at least find out what their uncertainties are, characterization is necessary. If adjustment to the device is possible, this is carried out and called a calibration. In this work, several ways of generating characterization and adjustment methods are discussed, with acoustic, hydraulic, and meteorological devices as examples.

  10. Rotatable Small Permanent Magnet Array for Ultra-Low Field Nuclear Magnetic Resonance Instrumentation: A Concept Study

    PubMed Central

    Vegh, Viktor; Reutens, David C.

    2016-01-01

    Object: We studied the feasibility of generating the variable magnetic fields required for ultra-low field nuclear magnetic resonance relaxometry with dynamically adjustable permanent magnets. Our motivation was to substitute traditional electromagnets with distributed permanent magnets, increasing system portability. Materials and Methods: The finite element method (COMSOL®) was employed for the numerical study of a small permanent magnet array to calculate achievable magnetic field strength, homogeneity, switching time and magnetic forces. A manually operated prototype was simulated and constructed to validate the numerical approach and to verify the generated magnetic field. Results: A concentric small permanent magnet array can be used to generate strong sample pre-polarisation and variable measurement fields for ultra-low field relaxometry via simple prescribed magnet rotations. Using the array, it is possible to achieve a pre-polarisation field strength above 100 mT and variable measurement fields ranging from 20–50 μT with 200 ppm absolute field homogeneity within a field-of-view of 5 × 5 × 5 cm³. Conclusions: A dynamic small permanent magnet array can generate the multiple highly homogeneous magnetic fields required in ultra-low field nuclear magnetic resonance (NMR) and magnetic resonance imaging (MRI) instruments. This design can significantly reduce the volume and energy requirements of traditional systems based on electromagnets, improving portability considerably. PMID:27271886

  11. Bias and Bias Correction in Multisite Instrumental Variables Analysis of Heterogeneous Mediator Effects

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Unlu, Fatih; Zhu, Pei; Bloom, Howard S.

    2014-01-01

    We explore the use of instrumental variables (IV) analysis with a multisite randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the IV literature as the exclusion restriction.…

  12. Factors explaining children's responses to intravenous needle insertions.

    PubMed

    McCarthy, Ann Marie; Kleiber, Charmaine; Hanrahan, Kirsten; Zimmerman, M Bridget; Westhus, Nina; Allen, Susan

    2010-01-01

    Previous research shows that numerous child, parent, and procedural variables affect children's distress responses to procedures. Cognitive-behavioral interventions such as distraction are effective in reducing pain and distress for many children undergoing these procedures. The purpose of this report was to examine child, parent, and procedural variables that explain child distress during a scheduled intravenous insertion when parents are distraction coaches for their children. A total of 542 children, between 4 and 10 years of age, and their parents participated. Child age, gender, diagnosis, and ethnicity were measured by questions developed for this study. Standardized instruments were used to measure child experience with procedures, temperament, ability to attend, anxiety, coping style, and pain sensitivity. Questions were developed to measure parent variables, including ethnicity, gender, previous experiences, and expectations, and procedural variables, including use of topical anesthetics and difficulty of procedure. Standardized instruments were used to measure parenting style and parent anxiety, whereas a new instrument was developed to measure parent performance of distraction. Children's distress responses were measured with the Observation Scale of Behavioral Distress-Revised (behavioral), salivary cortisol (biological), Oucher Pain Scale (self-report), and parent report of child distress (parent report). Regression methods were used for data analyses. Variables explaining behavioral, child-report, and parent-report measures include child age, typical coping response, and parent expectation of distress (p < .01). Level of parents' distraction coaching explained a significant portion of behavioral, biological, and parent-report distress measures (p < .05). Child impulsivity and special assistance at school also significantly explained child self-report of pain (p < .05). Additional variables explaining cortisol response were child's distress in the morning before clinic, diagnoses of attention deficit hyperactivity disorder or anxiety disorder, and timing of preparation for the clinic visit. The findings can be used to identify children at risk for high distress during procedures. This is the first study to find a relationship between child behavioral distress and level of parent distraction coaching.

  13. Bayesian Normalization Model for Label-Free Quantitative Analysis by LC-MS

    PubMed Central

    Nezami Ranjbar, Mohammad R.; Tadesse, Mahlet G.; Wang, Yue; Ressom, Habtom W.

    2016-01-01

    We introduce a new method for normalization of data acquired by liquid chromatography coupled with mass spectrometry (LC-MS) in label-free differential expression analysis. Normalization of LC-MS data is desired prior to subsequent statistical analysis to adjust for variabilities in ion intensities that are caused not by biological differences but by experimental bias. There are different sources of bias, including variabilities during sample collection and sample storage, poor experimental design, noise, etc. In addition, instrument variability in experiments involving a large number of LC-MS runs leads to a significant drift in intensity measurements. Although various methods have been proposed for normalization of LC-MS data, there is no universally applicable approach. In this paper, we propose a Bayesian normalization model (BNM) that utilizes scan-level information from LC-MS data. Specifically, the proposed method uses peak shapes to model the scan-level data acquired from extracted ion chromatograms (EICs), with the parameters treated as a linear mixed effects model. We extended the model into BNM with drift (BNMD) to compensate for the variability in intensity measurements due to long LC-MS runs. We evaluated the performance of our method using synthetic and experimental data. In comparison with several existing methods, the proposed BNM and BNMD yielded significant improvement. PMID:26357332

  14. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    NASA Astrophysics Data System (ADS)

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind-profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors, such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.

  15. That Instrument Is Lousy! In Search of Agreement When Using Instrumental Variables Estimation in Substance Use Research

    PubMed Central

    Popovici, Ioana

    2009-01-01

    The primary statistical challenge that must be addressed when using cross-sectional data to estimate the consequences of consuming addictive substances is the likely endogeneity of substance use. While economists are in agreement on the need to consider potential endogeneity bias and the value of instrumental variables estimation, the selection of credible instruments is a topic of heated debate in the field. Rather than attempt to resolve this debate, our paper highlights the diversity of judgments about what constitutes appropriate instruments for substance use based on a comprehensive review of the economics literature since 1990. We then offer recommendations related to the selection of reliable instruments in future studies. PMID:20029936

  16. Posthospitalization home health care use and changes in functional status in a Medicare population.

    PubMed

    Hadley, J; Rabin, D; Epstein, A; Stein, S; Rimes, C

    2000-05-01

    The objective of this work was to estimate the effect of Medicare beneficiaries' use of home health care (HHC) for 6 months after hospital discharge on the change in functional status over a 1-year period beginning before hospitalization. Data came from the Medicare Current Beneficiary Survey, which is a nationally representative sample of Medicare beneficiaries, in-person interview data, and Medicare claims for 1991 through 1994 for 2,127 nondisabled, community-dwelling, elderly Medicare beneficiaries who were hospitalized within 6 months of their annual in-person interviews. Econometric estimation with the instrumental variable method was used to correct for observational data bias, i.e., the nonrandom allocation of discharged beneficiaries to the use of posthospitalization HHC. The analysis estimates a first-stage model of HHC use, from which an instrumental variable estimate is constructed to estimate the effect on change in functional status. The instrumental variable estimates suggest that HHC users experienced greater improvements in functional status than nonusers, as measured by the change in a continuous scale based on the number and mix of activities of daily living and instrumental activities of daily living before and after hospitalization. The estimated improvement in functional status could be as large as 13% for a 10% increase in HHC use. In contrast, estimation with the observational data on HHC use implies that HHC users had poorer health outcomes. Adjusting for potential observational data bias is critical to obtaining estimates of the relationship between the use of posthospitalization HHC and the change in health before and after hospitalization. After adjustment, the results suggest that efforts to constrain Medicare's spending for HHC, as required by the Balanced Budget Act of 1997, may lead to poorer health outcomes for some beneficiaries.

  17. Estimating the efficacy of Alcoholics Anonymous without self-selection bias: An instrumental variables re-analysis of randomized clinical trials

    PubMed Central

    Humphreys, Keith; Blodgett, Janet C.; Wagner, Todd H.

    2014-01-01

    Background: Observational studies of Alcoholics Anonymous’ (AA) effectiveness are vulnerable to self-selection bias because individuals choose whether or not to attend AA. The present study therefore employed an innovative statistical technique to derive a selection bias-free estimate of AA’s impact. Methods: Six datasets from 5 National Institutes of Health-funded randomized trials (one with two independent parallel arms) of AA facilitation interventions were analyzed using instrumental variables models. Alcohol dependent individuals in one of the datasets (n = 774) were analyzed separately from the rest of sample (n = 1582 individuals pooled from 5 datasets) because of heterogeneity in sample parameters. Randomization itself was used as the instrumental variable. Results: Randomization was a good instrument in both samples, effectively predicting increased AA attendance that could not be attributed to self-selection. In five of the six data sets, which were pooled for analysis, increased AA attendance that was attributable to randomization (i.e., free of self-selection bias) was effective at increasing days of abstinence at 3-month (B = .38, p = .001) and 15-month (B = 0.42, p = .04) follow-up. However, in the remaining dataset, in which pre-existing AA attendance was much higher, further increases in AA involvement caused by the randomly assigned facilitation intervention did not affect drinking outcome. Conclusions: For most individuals seeking help for alcohol problems, increasing AA attendance leads to short and long term decreases in alcohol consumption that cannot be attributed to self-selection. However, for populations with high pre-existing AA involvement, further increases in AA attendance may have little impact. PMID:25421504
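
    The estimator behind this design can be sketched with the classical Wald ratio, in which randomization itself is the instrument for attendance. The simulation below is hypothetical (invented parameters, not the study's data):

```python
import numpy as np

# Randomization to a facilitation arm raises attendance; unmeasured
# motivation drives both attendance and the outcome (self-selection).
rng = np.random.default_rng(3)
n, tau = 20000, 0.4                        # tau: true per-meeting effect
assign = rng.integers(0, 2, n)             # randomized facilitation arm
motivation = rng.normal(size=n)            # self-selection confounder
attend = np.clip(2.0 + 1.5 * assign + motivation
                 + rng.normal(0.0, 1.0, n), 0.0, None)
abstinence = tau * attend + motivation + rng.normal(0.0, 2.0, n)

# Wald / IV estimate: ratio of the two intent-to-treat contrasts.
itt_y = abstinence[assign == 1].mean() - abstinence[assign == 0].mean()
itt_a = attend[assign == 1].mean() - attend[assign == 0].mean()
beta_iv = itt_y / itt_a

# Naive regression of outcome on attendance absorbs the motivation effect.
beta_naive = np.polyfit(attend, abstinence, 1)[0]
print(beta_iv, beta_naive)
```

    Because assignment is independent of motivation, the Wald ratio recovers the true effect, while the naive regression overstates it, which is the self-selection bias the study set out to remove.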

  18. Mendelian randomization with fine-mapped genetic data: Choosing from large numbers of correlated instrumental variables.

    PubMed

    Burgess, Stephen; Zuber, Verena; Valdes-Marquez, Elsa; Sun, Benjamin B; Hopewell, Jemma C

    2017-12-01

    Mendelian randomization uses genetic variants to make causal inferences about the effect of a risk factor on an outcome. With fine-mapped genetic data, there may be hundreds of genetic variants in a single gene region any of which could be used to assess this causal relationship. However, using too many genetic variants in the analysis can lead to spurious estimates and inflated Type 1 error rates. But if only a few genetic variants are used, then the majority of the data is ignored and estimates are highly sensitive to the particular choice of variants. We propose an approach based on summarized data only (genetic association and correlation estimates) that uses principal components analysis to form instruments. This approach has desirable theoretical properties: it takes the totality of data into account and does not suffer from numerical instabilities. It also has good properties in simulation studies: it is not particularly sensitive to varying the genetic variants included in the analysis or the genetic correlation matrix, and it does not have greatly inflated Type 1 error rates. Overall, the method gives estimates that are less precise than those from variable selection approaches (such as using a conditional analysis or pruning approach to select variants), but are more robust to seemingly arbitrary choices in the variable selection step. Methods are illustrated by an example using genetic associations with testosterone for 320 genetic variants to assess the effect of sex hormone related pathways on coronary artery disease risk, in which variable selection approaches give inconsistent inferences. © 2017 The Authors Genetic Epidemiology Published by Wiley Periodicals, Inc.
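
    The paper's approach operates on summarized data (genetic association and correlation estimates). As a simplified individual-level analogue of the same principle, one can form instruments from the top principal components of a correlated variant matrix and apply two-stage least squares; everything below is invented for illustration:

```python
import numpy as np

# Hypothetical data: 50 correlated variants driven by 5 latent factors,
# a risk factor x, a confounder u, and an outcome y with true effect 0.3.
rng = np.random.default_rng(4)
n, m, k, theta_true = 5000, 50, 5, 0.3
factors = rng.normal(size=(n, 5))
G = factors @ rng.normal(size=(5, m)) + 0.3 * rng.normal(size=(n, m))
alpha = rng.normal(0.0, 0.2, m)             # variant effects on risk factor
u = rng.normal(size=n)                      # confounder
x = G @ alpha + u + rng.normal(size=n)      # risk factor
y = theta_true * x + u + rng.normal(size=n) # outcome

# Top-k principal component scores of the centered variant matrix.
Gc = G - G.mean(axis=0)
_, _, vt = np.linalg.svd(Gc, full_matrices=False)
pc = Gc @ vt[:k].T

# Two-stage least squares with the component scores as instruments.
coef, *_ = np.linalg.lstsq(pc, x - x.mean(), rcond=None)
xhat = pc @ coef
theta_hat = (xhat @ (y - y.mean())) / (xhat @ xhat)

theta_naive = np.polyfit(x, y, 1)[0]        # confounded OLS estimate
print(theta_hat, theta_naive)
```

    Using a handful of components takes the totality of the correlated variants into account without the numerical instabilities of inverting a near-singular correlation matrix, which is the motivation the abstract gives.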

  19. CD4 Lymphocyte Enumeration and Hemoglobin Assessment Aid for Priority Decisions: A Multisite Evaluation of the BD FACSPresto™ System

    PubMed Central

    Thakar, Madhuri; Angira, Francis; Pattanapanyasat, Kovit; Wu, Alan H.B.; O’Gorman, Maurice; Zeng, Hui; Qu, Chenxue; Mahajan, Bharati; Sukapirom, Kasama; Chen, Danying; Hao, Yu; Gong, Yan; Indig, Monika De Arruda; Graminske, Sharon; Orta, Diana; d’Empaire, Nicole; Lu, Beverly; Omana-Zapata, Imelda; Zeh, Clement

    2017-01-01

    Background: The BD FACSPresto™ system uses capillary and venous blood to measure CD4 absolute counts (CD4), %CD4 in lymphocytes, and hemoglobin (Hb) in approximately 25 minutes. CD4 cell count is used with portable CD4 counters in resource-limited settings to manage HIV/AIDS patients. A method comparison was performed using capillary and venous samples from seven clinical laboratories in five countries. The BD FACSPresto system was assessed for variability between laboratories, instruments/operators, and cartridge lots, and for within-run variability at four sites. Methods: Samples were collected under approved voluntary consent. EDTA-anticoagulated venous samples were tested for CD4 and %CD4 T cells using the gold-standard BD FACSCalibur™ system, and for Hb, using the Sysmex® KX-21N™ analyzer. Venous and capillary samples were tested on the BD FACSPresto system. Matched data were analyzed for bias (Deming linear regression and Bland-Altman methods), and for concordance around the clinical decision point. The coefficient of variation was estimated per site, instrument/operator, cartridge lot, and between runs. Results: For method comparison, 93% of the 720 samples were from HIV-positive and 7% from HIV-negative or normal subjects. Venous and capillary CD4 and %CD4 T-cell results gave slopes within 0.96–1.05 and R2 ≥0.96; Hb slopes were ≥1.00 and R2 ≥0.89. Variability across sites/operators gave %CV <5.8% for CD4 counts, <1.9% for %CD4 and <3.2% for Hb. The total %CV was <7.7% across instrument/cartridge lot. Conclusion: The BD FACSPresto system provides accurate, reliable, precise CD4/%CD4/Hb results compared to gold-standard methods, irrespective of venous or capillary blood sampling. The data showed good agreement between the BD FACSPresto, BD FACSCalibur and Sysmex systems. PMID:29290885
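
    The bias statistics named in the abstract can be sketched in a few lines. This is an illustrative implementation with made-up numbers, not BD's analysis code: Bland-Altman summarizes the mean difference between two methods and its 95% limits of agreement, while Deming regression fits a slope allowing for measurement error in both methods (the error-variance ratio `lam` is an assumed input).

```python
import numpy as np

def bland_altman(ref, test):
    """Bland-Altman bias and 95% limits of agreement between two methods."""
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    diff = test - ref
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

def deming_slope(x, y, lam=1.0):
    """Deming regression slope, assuming an error-variance ratio lam.

    Requires a nonzero sample covariance between x and y.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    d = syy - lam * sxx
    return (d + np.sqrt(d * d + 4.0 * lam * sxy ** 2)) / (2.0 * sxy)
```

    A slope near 1 and a bias near 0 with tight limits of agreement are what the abstract's "slopes within 0.96–1.05" results correspond to.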

  20. Spatial and temporal variability of rainfall and their effects on hydrological response in urban areas - a review

    NASA Astrophysics Data System (ADS)

    Cristiano, Elena; ten Veldhuis, Marie-claire; van de Giesen, Nick

    2017-07-01

    In urban areas, hydrological processes are characterized by high variability in space and time, making them sensitive to small-scale temporal and spatial rainfall variability. In the last decades new instruments, techniques, and methods have been developed to capture rainfall and hydrological processes at high resolution. Weather radars have been introduced to estimate high spatial and temporal rainfall variability. At the same time, new models have been proposed to reproduce hydrological response, based on small-scale representation of urban catchment spatial variability. Despite these efforts, interactions between rainfall variability, catchment heterogeneity, and hydrological response remain poorly understood. This paper presents a review of our current understanding of hydrological processes in urban environments as reported in the literature, focusing on their spatial and temporal variability aspects. We review recent findings on the effects of rainfall variability on hydrological response and identify gaps where knowledge needs to be further developed to improve our understanding of and capability to predict urban hydrological response.

  1. Emerging Methods and Systems for Observing Life in the Sea

    NASA Astrophysics Data System (ADS)

    Chavez, F.; Pearlman, J.; Simmons, S. E.

    2016-12-01

    There is a growing need for observations of life in the sea at time and space scales consistent with those made for physical and chemical parameters. International programs such as the Global Ocean Observing System (GOOS) and Marine Biodiversity Observation Networks (MBON) are making the case for expanded biological observations and working diligently to prioritize essential variables. Here we review past, present and emerging systems and methods for observing life in the sea from the perspective of maintaining continuous observations over long time periods. Methods that rely on ships with instrumentation and over-the-side sample collections will need to be supplemented and eventually replaced with those based on autonomous platforms. Ship-based optical and acoustic instruments are being reduced in size and power for deployment on moorings and autonomous vehicles. In parallel, a new generation of low-power, improved-resolution sensors is being developed. Animal bio-logging is evolving, with new, smaller and more sophisticated tags being developed. New genomic methods, capable of assessing multiple trophic levels from a single water sample, are emerging. Autonomous devices for genomic sample collection are being miniaturized and adapted to autonomous vehicles. The required processing schemes and methods for these emerging data collections are being developed in parallel with the instrumentation. An evolving challenge will be the integration of information from these disparate methods, given that each provides its own unique view of life in the sea.

  2. The Meal Pattern Questionnaire: A psychometric evaluation using the Eating Disorder Examination.

    PubMed

    Alfonsson, S; Sewall, A; Lidholm, H; Hursti, T

    2016-04-01

    Meal pattern is an important variable in both obesity treatment and treatment for eating disorders. Momentary assessment and eating diaries are highly valid measurement methods but often cumbersome and not always feasible to use in clinical practice. The aim of this study was to design and evaluate a self-report instrument for measuring meal patterns. The Pattern of eating item from the Eating Disorder Examination (EDE) interview was adapted to self-report format to follow the same overall structure as the Eating Disorder Examination Questionnaire. The new instrument was named the Meal Patterns Questionnaire (MPQ) and was compared with the EDE in a student sample (n=105) and an obese sample (n=111). The individual items of the MPQ and the EDE showed moderate to high correlations (rho = .63–.89) in the two samples. Significant differences between the MPQ and EDE were found for only two items in the obese sample. The total scores correlated to a high degree (rho = .87/.74) in both samples and no significant differences were found in this variable. The MPQ can provide an overall picture of a person's eating patterns and is a valid way to collect data regarding meal patterns. The MPQ may be a usable tool in clinical practice and research studies when more extensive instruments cannot be used. Future studies should evaluate the MPQ in diverse cultural populations and with more ecological assessment methods. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. 99Tc atom counting by quadrupole ICP-MS. Optimisation of the instrumental response

    NASA Astrophysics Data System (ADS)

    Más, José L.; Garcia-León, Manuel; Bolívar, Juan P.

    2002-05-01

    In this paper, extensive work was done on the specific tuning of a conventional ICP-MS for 99Tc atom counting. Two methods were used and compared: the partial variable control method and the 5D Simplex method. Instrumental limits of detection of 0.2 and 0.8 ppt, respectively, were obtained. They are noticeably lower than the 47 ppt found with the automatic tune method of the spectrometer, which shows the need for a specific tune when very low levels of 99Tc have to be determined. A study is presented on the mass interferences for 99Tc. Our experiments show that the formation of polyatomic ions or refractory oxides, as well as 98Mo hydrides, seems to be irrelevant for 99Tc atom counting. The opposite occurs with the presence of isobaric interferences, i.e. 99Ru, and the effect of abundance sensitivity, or low-mass resolution, which can modify the response at m/z = 99 to a non-negligible extent.

  4. Measuring spatial variability in soil characteristics

    DOEpatents

    Hoskinson, Reed L.; Svoboda, John M.; Sawyer, J. Wayne; Hess, John R.; Hess, J. Richard

    2002-01-01

    The present invention provides systems and methods for measuring a load force associated with pulling a farm implement through soil that is used to generate a spatially variable map that represents the spatial variability of the physical characteristics of the soil. An instrumented hitch pin configured to measure a load force is provided that measures the load force generated by a farm implement when the farm implement is connected with a tractor and pulled through or across soil. Each time a load force is measured, a global positioning system identifies the location of the measurement. This data is stored and analyzed to generate a spatially variable map of the soil. This map is representative of the physical characteristics of the soil, which are inferred from the magnitude of the load force.

  5. Diagnosing Students' Understanding of the Nature of Models

    NASA Astrophysics Data System (ADS)

    Gogolin, Sarah; Krüger, Dirk

    2017-10-01

    Students' understanding of models in science has been subject to a number of investigations. The instruments the researchers used are suitable for educational research but, due to their complexity, cannot be employed directly by teachers. This article presents forced choice (FC) tasks, which, assembled as a diagnostic instrument, are supposed to measure students' understanding of the nature of models efficiently, while being sensitive enough to detect differences between individuals. In order to evaluate if the diagnostic instrument is suitable for its intended use, we propose an approach that complies with the demand to integrate students' responses to the tasks into the validation process. Evidence for validity was gathered based on relations to other variables and on students' response processes. Students' understanding of the nature of models was assessed using three methods: FC tasks, open-ended tasks and interviews (N = 448). Furthermore, concurrent think-aloud protocols (N = 30) were performed. The results suggest that the method and the age of the students have an effect on their understanding of the nature of models. A good understanding of the FC tasks as well as a convergence in the findings across the three methods was documented for grades eleven and twelve. This indicates that teachers can use the diagnostic instrument for an efficient and, at the same time, valid diagnosis for this group. Finally, the findings of this article may provide a possible explanation for alternative findings from previous studies as a result of specific methods that were used.

  6. Nanopore sequencing in microgravity

    PubMed Central

    McIntyre, Alexa B R; Rizzardi, Lindsay; Yu, Angela M; Alexander, Noah; Rosen, Gail L; Botkin, Douglas J; Stahl, Sarah E; John, Kristen K; Castro-Wallace, Sarah L; McGrath, Ken; Burton, Aaron S; Feinberg, Andrew P; Mason, Christopher E

    2016-01-01

    Rapid DNA sequencing and analysis has been a long-sought goal in remote research and point-of-care medicine. In microgravity, DNA sequencing can facilitate novel astrobiological research and close monitoring of crew health, but spaceflight places stringent restrictions on the mass and volume of instruments, crew operation time, and instrument functionality. The recent emergence of portable, nanopore-based tools with streamlined sample preparation protocols finally enables DNA sequencing on missions in microgravity. As a first step toward sequencing in space and aboard the International Space Station (ISS), we tested the Oxford Nanopore Technologies MinION during a parabolic flight to understand the effects of variable gravity on the instrument and data. In a successful proof-of-principle experiment, we found that the instrument generated DNA reads over the course of the flight, including the first ever sequenced in microgravity, and additional reads measured after the flight concluded its parabolas. Here we detail modifications to the sample-loading procedures to facilitate nanopore sequencing aboard the ISS and in other microgravity environments. We also evaluate existing analysis methods and outline two new approaches, the first based on a wave-fingerprint method and the second on entropy signal mapping. Computationally light analysis methods offer the potential for in situ species identification, but are limited by the error profiles (stays, skips, and mismatches) of older nanopore data. Higher accuracies attainable with modified sample processing methods and the latest version of flow cells will further enable the use of nanopore sequencers for diagnostics and research in space. PMID:28725742

  7. Hypothesis driven assessment of an NMR curriculum

    NASA Astrophysics Data System (ADS)

    Cossey, Kimberly

    The goal of this project was to develop a battery of assessments to evaluate an undergraduate NMR curriculum at Penn State University. As a chemical education project, we sought to approach the problem of curriculum assessment from a scientific perspective, while remaining grounded in the education research literature and practices. We chose the phrase hypothesis driven assessment to convey this process of relating the scientific method to the study of educational methods, modules, and curricula. We began from a hypothesis, that deeper understanding of one particular analytical technique (NMR) will increase undergraduate students' abilities to solve chemical problems. We designed an experiment to investigate this hypothesis, and data collected were analyzed and interpreted in light of the hypothesis and several related research questions. The expansion of the NMR curriculum at Penn State was funded through the NSF's Course, Curriculum, and Laboratory Improvement (CCLI) program, and assessment was required. The goal of this project, as stated in the grant proposal, was to provide NMR content in greater depth by integrating NMR modules throughout the curriculum in physical chemistry, instrumental, and organic chemistry laboratory courses. Hands-on contact with the NMR spectrometer and NMR data and repeated exposure of the analytical technique within different contexts (courses) were unique factors of this curriculum. Therefore, we maintained a focus on these aspects throughout the evaluation process. The most challenging and time-consuming aspect of any assessment is the development of testing instruments and methods to provide useful data. After key variables were defined, testing instruments were designed to measure these variables based on educational literature (Chapter 2). 
The primary variables measured in this assessment were: depth of understanding of NMR, basic NMR knowledge, problem solving skills (HETCOR problem), confidence for skills used in class (within the hands-on NMR modules), confidence for NMR tasks (not practiced), and confidence for general science tasks. Detailed discussion of the instruments, testing methods and experimental design used in this assessment are provided (Chapter 3). All data were analyzed quantitatively using methods adapted from the educational literature (Chapter 4). Data were analyzed and the descriptive statistics, independent t-tests between the experimental and control groups, and correlation statistics were calculated for each variable. In addition, for those variables included on the pretest, dependent t-tests between pretest and posttest scores were also calculated. The results of study 1 and study 2 were used to draw conclusions based on the hypothesis and research questions proposed in this work (Chapter 4). Data collected in this assessment were used to answer the following research questions: (1) Primary research question: Is depth of understanding of NMR linked to problem solving skills? (2) Are the NMR modules working as intended? Do they promote depth of understanding of NMR? (a) Will students who complete NMR modules have a greater depth of understanding of NMR than students who do not complete the modules? (b) Is depth of understanding increasing over the course of the experiment? (3) Is confidence an intermediary between depth of understanding and problem solving skills? Is it linked to both variables? (4) What levels of confidence are affected by the NMR modules? (a) Will confidence for the NMR class skills used in the modules themselves be greater for those who have completed the modules? (b) Will confidence for NMR tasks not practiced in the course be affected? (c) Will confidence for general science tasks be affected? 
(d) Are different levels of confidence (class skills, NMR tasks, general science tasks) linked to each other? Results from this NMR curriculum assessment could also have implications outside of the courses studied, and so there is potential to impact the chemical education community (section 5.2.1). In addition to providing reliable testing instruments/measures that could be used outside the university, the results of this research contribute to the study of problem solving in chemistry, learner characteristics within the context of chemical education studies, and NMR specific educational evaluations. Valuable information was gathered through the current method of evaluation for the NMR curriculum. However, improvements could be made to the existing assessment, and an alternate assessment that could supplement the information found in this study has been proposed (Chapter 5).

  8. The Difference That One Year of Schooling Makes for Russian Schoolchildren. Based on PISA 2009: Reading

    ERIC Educational Resources Information Center

    Tiumeneva, Yu. A.; Kuzmina, Ju. V.

    2015-01-01

    Using PISA 2009 reading data, we investigated the effectiveness of one year of schooling in seven countries: Russia, the Czech Republic, Hungary, Slovakia, Germany, Canada, and Brazil. We used an instrumental variable, which allowed us to estimate the effect of one year of schooling through the fuzzy regression discontinuity method. The analysis was…

  9. The Fixed-Effects Model in Returns to Schooling and Its Application to Community Colleges: A Methodological Note

    ERIC Educational Resources Information Center

    Dynarski, Susan; Jacob, Brian; Kreisman, Daniel

    2016-01-01

    The purpose of this note is to develop insight into the performance of the individual fixed-effects model when used to estimate wage returns to postsecondary schooling. We focus our attention on the returns to attending and completing community college. While other methods (instrumental variables, regression discontinuity) have been used to…

  10. Greek Physicians' Perceptions on Generic Drugs in the Era of Austerity

    PubMed Central

    Labiris, Georgios; Fanariotis, Michael; Kastanioti, Catherine; Alexias, Georgios; Protopapas, Adonis; Karampitsakos, Theodoros; Niakas, Dimitris

    2015-01-01

    Purpose. To assess the beliefs and preferences of Greek physicians regarding generic drugs in the years of financial crisis. Setting. Multicentered, nationwide survey. Material and Methods. A custom questionnaire based on former similar studies was developed and administered to Greek physicians. The variable “perception on generics” was constructed after an exploratory study, and the instrument was validated by conventional and Rasch analysis methods. 22 items formed 5 subscales that composed the variable in question. Results. 908 physicians successfully participated in the study (response rate: 80%). Mean total scores on the instrument were 60.63 ± 12.12 for men and significantly lower (58.24 ± 11.73) for women (p = 0.04). Greek physicians were not persuaded of the potential economic gain (45.79 ± 10.53); moreover, they believed that Greek authorities cannot meet the increased pharmacovigilance mandates. Physicians working in Athens and those working in surgical units demonstrated significantly worse scores than their colleagues from the rest of Greece and those working in Internal Medicine wards (p = 0.03). Conclusion. Our results suggest an overall poor acceptance of the national initiative on generic drugs by Greek physicians. This trial is registered with Clinicaltrials.gov identifier: NCT01855802. PMID:26457225

  11. Review of the tactical evaluation tools for youth players, assessing the tactics in team sports: football.

    PubMed

    González-Víllora, Sixto; Serra-Olivares, Jaime; Pastor-Vicedo, Juan Carlos; da Costa, Israel Teoldo

    2015-01-01

    For sports assessment to be comprehensive, it must address all variables of sports development: psychological, social-emotional, physical and physiological, technical and tactical. Tactical assessment was a neglected variable until the 1980s and 1990s. In the last two decades (1995-2015), tactical assessment has evolved considerably, given its importance in game performance. The aim of this paper is to compile and analyze different tactical measuring tools in team sports, particularly in soccer, through a bibliographical review. Six tools were selected on five criteria: (1) the instrument assesses tactics; (2) the studies take a developmental approach related to tactical principles; (3) the method is valid and reliable; (4) publications exist that mention the tool in their methods; and (5) the tool is applicable in different sports contexts. All six tools are structured around seven headings: introduction, objective(s), tactical principles, materials, procedures, instructions/rules of the game, and published studies. In conclusion, tactically oriented teaching-learning processes have useful tactical assessment instruments available in the literature. The choice among them depends on contextual information such as the age and expertise level of the players.

  12. Shape optimization techniques for musical instrument design

    NASA Astrophysics Data System (ADS)

    Henrique, Luis; Antunes, Jose; Carvalho, Joao S.

    2002-11-01

    The design of musical instruments is still mostly based on empirical knowledge and costly experimentation. One interesting improvement is the shape optimization of resonating components, given a number of constraints (allowed parameter ranges, shape smoothness, etc.), so that vibrations occur at specified modal frequencies. Each admissible geometrical configuration generates an error between computed eigenfrequencies and the target set. Typically, error surfaces present many local minima, corresponding to suboptimal designs. This difficulty can be overcome using global optimization techniques, such as simulated annealing. However, these methods are expensive in terms of the number of function evaluations required. Thus, the computational effort can be unacceptable if complex problems, such as bell optimization, are tackled. Those issues are addressed in this paper, and a method for improving optimization procedures is proposed. Instead of using the local geometric parameters as searched variables, the system geometry is modeled in terms of truncated series of orthogonal space functions, and optimization is performed on their amplitude coefficients. Fourier series and orthogonal polynomials are typical such functions. This technique considerably reduces the number of searched variables and has the potential for significant computational savings in complex problems. It is illustrated by optimizing the shapes of both current and uncommon marimba bars.
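
    To make the idea concrete, here is a toy version of the approach, entirely hypothetical and much simpler than the authors' bar models: a 1-D density profile is parameterized by a few cosine-series coefficients, modal frequencies come from a small finite-difference eigenproblem, and simulated annealing searches the low-dimensional coefficient space to match target frequencies.

```python
import numpy as np

def profile(coeffs, n):
    """Density profile on n nodes as a truncated cosine series (kept positive)."""
    x = np.linspace(0.0, 1.0, n)
    h = np.ones(n)
    for k, a in enumerate(coeffs, start=1):
        h += a * np.cos(np.pi * k * x)
    return np.clip(h, 0.1, None)

def eigenfrequencies(coeffs, n=60, n_modes=3):
    """Modal frequencies of a toy fixed-end string with variable mass density."""
    h = profile(coeffs, n)
    # Uniform-stiffness finite-difference Laplacian.
    K = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) * (n + 1) ** 2
    m = 1.0 / np.sqrt(h)
    A = K * np.outer(m, m)            # symmetric M^(-1/2) K M^(-1/2)
    w2 = np.linalg.eigvalsh(A)[:n_modes]
    return np.sqrt(np.abs(w2))

def anneal(target, n_coeffs=4, steps=300, t0=1.0, seed=0):
    """Basic simulated annealing over the series coefficients."""
    rng = np.random.default_rng(seed)
    err = lambda c: float(np.sum((eigenfrequencies(c) - target) ** 2))
    x = np.zeros(n_coeffs)
    e = err(x)
    best_x, best_e = x.copy(), e
    for i in range(steps):
        t = t0 * (1.0 - i / steps) + 1e-6           # cooling schedule
        cand = x + rng.normal(0.0, 0.05, n_coeffs)  # perturb coefficients
        ec = err(cand)
        # Accept improvements always; accept worse moves with Boltzmann prob.
        if ec < e or rng.random() < np.exp((e - ec) / t):
            x, e = cand, ec
            if e < best_e:
                best_x, best_e = x.copy(), e
    return best_x, best_e
```

    The point of the reparameterization is visible in `anneal`: only four coefficients are searched, however finely the profile itself is discretized.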

  13. FHR Process Instruments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holcomb, David Eugene

    2015-01-01

    Fluoride salt-cooled High temperature Reactors (FHRs) are entering early-phase engineering development. Initial candidate technologies have been identified to measure all of the required process variables. The purpose of this paper is to describe the proposed measurement techniques in sufficient detail to enable assessment of the proposed instrumentation suite and to support development of the component technologies. This paper builds upon the instrumentation chapter of the recently published FHR technology development roadmap. Locating instruments outside of the intense core radiation and high-temperature fluoride salt environment significantly decreases their environmental tolerance requirements. Under operating conditions, FHR primary coolant salt is a transparent, low-vapor-pressure liquid. Consequently, FHRs can employ standoff optical measurements from above the salt pool to assess in-vessel conditions. For example, the core outlet temperature can be measured by observing the fuel's blackbody emission. Similarly, the intensity of the core's Cerenkov glow indicates the fission power level. Short-lived activation of the primary coolant provides another means for standoff measurements of process variables. The primary coolant flow and neutron flux can be measured using gamma spectroscopy along the primary coolant piping. FHR operation entails a number of process measurements. Reactor thermal power and core reactivity are the most significant variables for process control. Thermal power can be determined by measuring the primary coolant mass flow rate and temperature rise across the core. The leading candidate technologies for primary coolant temperature measurement are Au-Pt thermocouples and Johnson noise thermometry. Clamp-on ultrasonic flow measurement, which includes high-temperature tolerant standoffs, is a potential coolant flow measurement technique. Also, the salt redox condition will be monitored as an indicator of its corrosiveness. 
Both electrochemical techniques and optical spectroscopy are candidate fluoride salt redox measurement methods. Coolant level measurement can be performed using radar-level gauges located in standpipes above the reactor vessel. While substantial technical development remains for most of the instruments, industrially compatible instruments based upon proven technology can be reasonably extrapolated from the current state of the art.
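
    The thermal-power relation described above (power = mass flow rate × heat capacity × core temperature rise) is simple enough to sketch. The default heat capacity below (~2416 J/(kg·K)) is a commonly cited textbook value for FLiBe used here as an assumption, not a design parameter, and the function name and example numbers are hypothetical.

```python
def thermal_power_mw(mass_flow_kg_s, t_in_c, t_out_c, cp_j_kg_k=2416.0):
    """Core thermal power (MW) from coolant mass flow and temperature rise.

    cp defaults to an assumed FLiBe specific heat of ~2416 J/(kg*K).
    """
    return mass_flow_kg_s * cp_j_kg_k * (t_out_c - t_in_c) / 1e6
```

    For example, 1000 kg/s of coolant heated from 600 °C to 700 °C corresponds to about 242 MW of core power under this assumed heat capacity.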

  14. Instrumental variables I: instrumental variables exploit natural variation in nonexperimental data to estimate causal relationships.

    PubMed

    Rassen, Jeremy A; Brookhart, M Alan; Glynn, Robert J; Mittleman, Murray A; Schneeweiss, Sebastian

    2009-12-01

    The gold standard of study design for treatment evaluation is widely acknowledged to be the randomized controlled trial (RCT). Trials allow for the estimation of causal effect by randomly assigning participants either to an intervention or comparison group; through the assumption of "exchangeability" between groups, comparing the outcomes will yield an estimate of causal effect. In the many cases where RCTs are impractical or unethical, instrumental variable (IV) analysis offers a nonexperimental alternative based on many of the same principles. IV analysis relies on finding a naturally varying phenomenon, related to treatment but not to outcome except through the effect of treatment itself, and then using this phenomenon as a proxy for the confounded treatment variable. This article demonstrates how IV analysis arises from an analogous but potentially impossible RCT design, and outlines the assumptions necessary for valid estimation. It gives examples of instruments used in clinical epidemiology and concludes with an outline on estimation of effects.
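
    The logic of this record's IV argument can be demonstrated on simulated data. This is a hedged illustration; the variable names, effect sizes, and data-generating process are all invented. A naive regression of outcome on treatment is biased by an unmeasured confounder, while the instrument, which shifts treatment but touches the outcome only through treatment, recovers an estimate near the true effect.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
u = rng.normal(size=n)                      # unmeasured confounder
z = rng.binomial(1, 0.5, n).astype(float)   # instrument (e.g. encouragement)
# Treatment depends on both the instrument and the confounder.
d = (0.8 * z + u + rng.normal(size=n) > 0.5).astype(float)
y = 2.0 * d + 1.5 * u + rng.normal(size=n)  # true causal effect of d is 2.0

# Naive OLS of y on d absorbs the confounding through u (biased upward here).
X = np.column_stack([np.ones(n), d])
naive = np.linalg.lstsq(X, y, rcond=None)[0][1]

# Wald (IV) estimator: outcome difference over treatment difference, by z.
iv = (y[z == 1].mean() - y[z == 0].mean()) / (d[z == 1].mean() - d[z == 0].mean())
```

    The Wald ratio used here is the simplest IV estimator; with covariates, two-stage least squares generalizes the same idea.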

  15. Instrumental variables I: instrumental variables exploit natural variation in nonexperimental data to estimate causal relationships

    PubMed Central

    Rassen, Jeremy A.; Brookhart, M. Alan; Glynn, Robert J.; Mittleman, Murray A.; Schneeweiss, Sebastian

    2010-01-01

    The gold standard of study design for treatment evaluation is widely acknowledged to be the randomized controlled trial (RCT). Trials allow for the estimation of causal effect by randomly assigning participants either to an intervention or comparison group; through the assumption of “exchangeability” between groups, comparing the outcomes will yield an estimate of causal effect. In the many cases where RCTs are impractical or unethical, instrumental variable (IV) analysis offers a nonexperimental alternative based on many of the same principles. IV analysis relies on finding a naturally varying phenomenon, related to treatment but not to outcome except through the effect of treatment itself, and then using this phenomenon as a proxy for the confounded treatment variable. This article demonstrates how IV analysis arises from an analogous but potentially impossible RCT design, and outlines the assumptions necessary for valid estimation. It gives examples of instruments used in clinical epidemiology and concludes with an outline on estimation of effects. PMID:19356901

  16. Reactions of Fe+ and FeO+ with C2H2, C2H4, and C2H6: Temperature-Dependent Kinetics

    DTIC Science & Technology

    2013-09-12

    studies to lead to the development of efficient quantum chemical calculation methods by offering benchmarks for testing and refinement. Due to the...EXPERIMENTAL METHODS All measurements were performed on the Air Force Research Laboratory’s variable temperature selected ion flow tube (VT- SIFT) instrument...correct within error, indicating that they are in the low-pressure limit,52,53 and the termolecular rate constant is obtained from the slope. In contrast

  17. Outcomes of deliveries by family physicians or obstetricians: a population-based cohort study using an instrumental variable

    PubMed Central

    Aubrey-Bassler, Kris; Cullen, Richard M.; Simms, Alvin; Asghari, Shabnam; Crane, Joan; Wang, Peizhong Peter; Godwin, Marshall

    2015-01-01

    Background: Previous research has suggested that obstetric outcomes are similar for deliveries by family physicians and obstetricians, but many of these studies were small, and none of them adjusted for unmeasured selection bias. We compared obstetric outcomes between these provider types using an econometric method designed to adjust for unobserved confounding. Methods: We performed a retrospective population-based cohort study of all Canadian (except Quebec) hospital births with delivery by family physicians and obstetricians at more than 20 weeks gestational age, with birth weight greater than 500 g, between Apr. 1, 2006, and Mar. 31, 2009. The primary outcomes were the relative risks of in-hospital perinatal death and a composite of maternal mortality and major morbidity assessed with multivariable logistic regression and instrumental variable–adjusted multivariable regression. Results: After exclusions, there were 3600 perinatal deaths and 14 394 cases of maternal morbidity among 799 823 infants and 793 053 mothers at 390 hospitals. For deliveries by family physicians v. obstetricians, the relative risk of perinatal mortality was 0.98 (95% confidence interval [CI] 0.85–1.14) and of maternal morbidity was 0.81 (95% CI 0.70–0.94) according to logistic regression. The respective relative risks were 0.97 (95% CI 0.58–1.64) and 1.13 (95% CI 0.65–1.95) according to instrumental variable methods. Interpretation: After adjusting for both observed and unobserved confounders, we found a similar risk of perinatal mortality and adverse maternal outcome for obstetric deliveries by family physicians and obstetricians. Whether there are differences between these groups for other outcomes remains to be seen. PMID:26303244

  18. On Aethalometer measurement uncertainties and an instrument correction factor for the Arctic

    NASA Astrophysics Data System (ADS)

    Backman, John; Schmeisser, Lauren; Virkkula, Aki; Ogren, John A.; Asmi, Eija; Starkweather, Sandra; Sharma, Sangeeta; Eleftheriadis, Konstantinos; Uttal, Taneil; Jefferson, Anne; Bergin, Michael; Makshtas, Alexander; Tunved, Peter; Fiebig, Markus

    2017-12-01

    Several types of filter-based instruments are used to estimate aerosol light absorption coefficients. Two significant results are presented based on Aethalometer measurements at six Arctic stations from 2012 to 2014. First, an alternative method of post-processing the Aethalometer data is presented, which reduces measurement noise and lowers the detection limit of the instrument more effectively than boxcar averaging. The biggest benefit of this approach is achieved if instrument drift is minimised. Moreover, by using an attenuation threshold criterion for data post-processing, the relative uncertainty from the electronic noise of the instrument is kept constant. This approach results in a time series with a variable collection time (Δt) but with a constant relative uncertainty with regard to electronic noise in the instrument. An additional advantage of this method is that the detection limit of the instrument is lowered at small aerosol concentrations at the expense of temporal resolution, whereas there is little to no loss in temporal resolution at high aerosol concentrations (> 2.1–6.7 Mm-1, as measured by the Aethalometers). At high aerosol concentrations, minimising the detection limit of the instrument is less critical. Additionally, utilising co-located filter-based absorption photometers, a correction factor is presented for the Arctic that can be used with Aethalometer corrections available in the literature. A correction factor of 3.45 was calculated for low-elevation Arctic stations. This correction factor harmonises Aethalometer attenuation coefficients with light absorption coefficients as measured by the co-located light absorption photometers. Using one correction factor for Arctic Aethalometers has the advantage that measurements between stations become more inter-comparable.
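
    The attenuation-threshold idea can be sketched as a simple grouping rule (an illustration of the principle, not the authors' processing code; the function name and threshold value are hypothetical): samples are accumulated until the attenuation change since the window opened exceeds a chosen threshold, yielding a variable Δt per window but a roughly constant relative electronic-noise contribution.

```python
def threshold_windows(atn, d_atn_min):
    """Group consecutive samples until attenuation change exceeds d_atn_min.

    atn : monotonically increasing attenuation readings, one per sample
    Returns (start, end) index pairs; each window's delta-ATN >= d_atn_min,
    so the relative contribution of electronic noise stays roughly constant.
    """
    windows, start = [], 0
    for i in range(1, len(atn)):
        if atn[i] - atn[start] >= d_atn_min:
            windows.append((start, i))
            start = i
    return windows
```

    At high aerosol loading the attenuation rises quickly and windows stay short (little temporal-resolution loss); at low loading the windows stretch, trading time resolution for a lower detection limit, exactly as the abstract describes.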

  19. Determination of arsenic, cadmium, cobalt, chromium, lead, molybdenum, nickel, and selenium in fertilizers by microwave digestion and inductively coupled plasma-optical emission spectrometry detection: collaborative study.

    PubMed

    Kane, Peter F; Hall, William L

    2006-01-01

    There is increasing regulatory interest in the non-nutritive metals content of fertilizer materials, but at present there is no consensus analytical method for acid digestion and instrument detection of those elements in fertilizer matrixes. This lack of method standardization has resulted in unacceptable variability of results between fertilizer laboratories performing metals analysis. A method has been developed using microwave digestion with nitric acid at 200 degrees C, followed by inductively coupled plasma-optical emission spectrometry instrument detection, for the elements arsenic, cadmium, cobalt, chromium, molybdenum, nickel, lead, and selenium. The method has been collaboratively studied, and statistical results are here reported. Fourteen collaborators were sent 62 sample materials in a blind duplicate design. Materials represented a broad cross section of fertilizer types, including phosphate ore, manufactured phosphate products, N-P-K blends, organic fertilizers, and micro-nutrient materials. As much as possible within the limit of the number of samples, materials were selected from different regions of the United States and the world. Limit of detection (LOD) was determined using synthetic fertilizers consisting of reagent grade chemicals with near zero levels of the non-nutritive elements, analyzed blindly. Samples with high iron content caused the most variability between laboratories. Most samples reasonably above LOD gave HorRat values within the range 0.5 to 2.0, indicating acceptable method performance according to AOAC guidelines for analyses in the mg/kg range. The method is recommended for AOAC Official First Action status.
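    The HorRat acceptance range cited above comes from the standard Horwitz relationship between concentration and expected between-laboratory precision. A small illustration (the numbers are hypothetical, not from the study):

```python
def horrat(rsd_observed_pct, conc_mg_per_kg):
    """HorRat = observed between-lab RSD divided by the Horwitz-predicted RSD."""
    c = conc_mg_per_kg * 1e-6              # mg/kg expressed as a mass fraction
    predicted_rsd = 2.0 * c ** (-0.1505)   # Horwitz equation, in percent
    return rsd_observed_pct / predicted_rsd

# A 12% between-lab RSD at 10 mg/kg sits near HorRat = 1 (typical performance);
# values outside 0.5-2.0 would flag method problems under the AOAC guideline.
acceptable = 0.5 <= horrat(12.0, 10.0) <= 2.0
```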

  20. Simulation of HLNC and NCC measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ming-Shih; Teichmann, T.; De Ridder, P.

    1994-03-01

    This report discusses an automatic method of simulating the results of High Level Neutron Coincidence Counting (HLNC) and Neutron Collar Coincidence Counting (NCC) measurements to facilitate the safeguards inspectors' understanding and use of these instruments under realistic conditions. This would otherwise be expensive and time-consuming, except at sites designed to handle radioactive materials and having the necessary variety of fuel elements and other samples. The simulation must thus include the behavior of the instruments for variably constituted and composed fuel elements (including poison rods and Gd loading), and must display the changes in the count rates as a function of these characteristics, as well as of various instrumental parameters. Such a simulation is an efficient way of accomplishing the required familiarization and training of the inspectors by providing a realistic reproduction of the results of such measurements.

  1. Variability of the human heart rate as a diagnostic instrument obtained by means of a wireless monitor

    NASA Astrophysics Data System (ADS)

    Sánchez Barajas, Mauricio; Hernández González, Martha Alicia; Figueroa Vega, Nicte; Malacara Hernández, Juan Manuel; Córdova Fraga, Teodoro

    2014-11-01

    Introduction: Heart rate variability (HRV) is the cyclic variation of the RR intervals between normal beats. Aim: To determine HRV via a wireless Polar monitor. Material and methods: 100 symptomatic menopausal women were studied; HRV was measured with a Polar RS400 watch worn for four hours. Results: Frequency-domain HRV was obtained through the fast Fourier transform: low frequency (LF, 0.04-0.15 Hz), high frequency (HF, 0.15-0.4 Hz), and the LF/HF ratio. Conclusion: Obtaining HRV is important for cardiovascular autonomic assessment in menopausal women.
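    A minimal sketch of the frequency-domain HRV computation described above, using a plain DFT periodogram on a synthetic, evenly resampled RR series (all signal parameters are hypothetical; real pipelines resample irregular RR intervals and typically use Welch's method):

```python
import cmath
import math

def band_power(x, fs, f_lo, f_hi):
    """Sum one-sided DFT periodogram power over the band [f_lo, f_hi)."""
    n = len(x)
    mean = sum(x) / n
    xc = [v - mean for v in x]                 # remove the DC component
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f < f_hi:
            X = sum(xc[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
            power += abs(X) ** 2 / n
    return power

fs, n = 4.0, 160                               # 4 Hz evenly resampled RR series
# Hypothetical series: 0.1 Hz (LF) and 0.25 Hz (HF) oscillations around 0.8 s.
rr = [0.8 + 0.05 * math.sin(2 * math.pi * 0.1 * j / fs)
          + 0.02 * math.sin(2 * math.pi * 0.25 * j / fs) for j in range(n)]
lf = band_power(rr, fs, 0.04, 0.15)
hf = band_power(rr, fs, 0.15, 0.40)
lf_hf_ratio = lf / hf
```

    The LF/HF ratio then summarizes sympathovagal balance, which is the quantity of clinical interest in the abstract.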

  2. Selection of key ambient particulate variables for epidemiological studies - applying cluster and heatmap analyses as tools for data reduction.

    PubMed

    Gu, Jianwei; Pitz, Mike; Breitner, Susanne; Birmili, Wolfram; von Klot, Stephanie; Schneider, Alexandra; Soentgen, Jens; Reller, Armin; Peters, Annette; Cyrys, Josef

    2012-10-01

    The success of epidemiological studies depends on the use of appropriate exposure variables. The purpose of this study is to extract a relatively small selection of variables characterizing ambient particulate matter from a large measurement data set. The original data set comprised a total of 96 particulate matter variables that have been continuously measured since 2004 at an urban background aerosol monitoring site in the city of Augsburg, Germany. Many of the original variables were derived from measured particle size distribution (PSD) across the particle diameter range 3 nm to 10 μm, including size-segregated particle number concentration, particle length concentration, particle surface concentration and particle mass concentration. The data set was complemented by integral aerosol variables. These variables were measured by independent instruments, including black carbon, sulfate, particle active surface concentration and particle length concentration. It is obvious that such a large number of measured variables cannot be used in health effect analyses simultaneously. The aim of this study is to pre-screen and select the key variables that will be used as input in forthcoming epidemiological studies. In this study, we present two methods of parameter selection and apply them to data from a two-year period from 2007 to 2008. We used the agglomerative hierarchical clustering method to find groups of similar variables. In total, we selected 15 key variables from 9 clusters, which we recommend for epidemiological analyses. We also applied a two-dimensional visualization technique called "heatmap" analysis to the Spearman correlation matrix. Twelve key variables were selected using this method. Moreover, the positive matrix factorization (PMF) method was applied to the PSD data to characterize the possible particle sources. Correlations between the variables and PMF factors were used to interpret the meaning of the cluster and the heatmap analyses.
Copyright © 2012 Elsevier B.V. All rights reserved.
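    The correlation-based variable grouping described above can be mimicked in a few lines. The variable names and data below are hypothetical, and a simple greedy single-linkage grouping on |Spearman rho| (ties in ranks ignored) stands in for the full agglomerative hierarchical method:

```python
def rank(v):
    """Ranks 1..n (ties broken by order; fine for this all-distinct sketch)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for pos, i in enumerate(order):
        r[i] = pos + 1.0
    return r

def spearman(a, b):
    ra, rb = rank(a), rank(b)
    n = len(a)
    ma, mb = sum(ra) / n, sum(rb) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    den = (sum((x - ma) ** 2 for x in ra) * sum((y - mb) ** 2 for y in rb)) ** 0.5
    return num / den

def cluster(variables, threshold=0.8):
    """Greedy single-linkage grouping: a variable joins a cluster when its
    |rho| with any existing member exceeds the threshold."""
    clusters = []
    for name, series in variables.items():
        for c in clusters:
            if any(abs(spearman(series, variables[m])) > threshold for m in c):
                c.append(name)
                break
        else:
            clusters.append([name])
    return clusters

# Hypothetical variables: two particle-number size bins that move together,
# and black carbon following its own pattern.
data = {
    "N_10_30nm": [10, 12, 15, 11, 18, 20, 17, 13],
    "N_30_50nm": [21, 25, 29, 23, 35, 41, 33, 26],
    "BC": [5, 3, 4, 8, 2, 1, 9, 7],
}
groups = cluster(data)
representatives = [c[0] for c in groups]   # one key variable per cluster
```

    Keeping one representative per cluster is the data-reduction step the study applies before the epidemiological analyses.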

  3. Comparison of apical debris extrusion during root canal preparation using instrumentation techniques with two operating principles: An in vitro study

    PubMed Central

    Verma, Mudita; Meena, N.; Kumari, R. Anitha; Mallandur, Sudhanva; Vikram, R.; Gowda, Vishwas

    2017-01-01

    Aims: The aim of this study was to quantify the debris extruded apically from teeth using rotary and reciprocation instrumentation systems. Subjects and Methods: Eighty extracted human mandibular premolars with single canals and similar lengths were instrumented using ProTaper Universal (40, 06; Dentsply Maillefer, Ballaigues, Switzerland), ProTaper Next (40, 06; Dentsply Maillefer, Ballaigues, Switzerland), WaveOne (40, 06; Dentsply Maillefer, Ballaigues, Switzerland), and Reciproc (R40; VDW GmbH, Munich, Germany). Debris extruded during instrumentation was collected into preweighed Eppendorf tubes, which were then stored in an incubator at 70°C for 5 days. The final weight of the Eppendorf tubes with the extruded debris was calculated as the mean of three consecutive weighings of each tube. Statistical Analysis Used: Statistical analysis was performed using SPSS version 16.0 software. The groups were compared using the Kruskal–Wallis test for all variables. Results: There was no statistically significant difference between the groups (P = 0.1114). However, the ProTaper Universal group produced the most and the ProTaper Next group the least debris extrusion among the instrument groups (P > 0.05). Conclusions: All instrumentation techniques were associated with extruded debris. PMID:28855755
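    For reference, the Kruskal–Wallis H statistic used above can be computed directly. This sketch uses hypothetical debris weights and omits the tie correction:

```python
def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic (no tie correction; values assumed distinct)."""
    pooled = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(pooled)
    rank_sums = [0.0] * len(groups)
    for r, (_, gi) in enumerate(pooled, start=1):
        rank_sums[gi] += r
    return 12.0 / (n * (n + 1)) * sum(
        rs ** 2 / len(g) for rs, g in zip(rank_sums, groups)) - 3 * (n + 1)

# Hypothetical extruded-debris weights (mg) for four instrumentation systems.
h = kruskal_wallis_h([
    [1.21, 1.35, 1.42],
    [0.91, 0.98, 1.05],
    [1.02, 1.11, 1.18],
    [0.95, 1.08, 1.25],
])
```

    H is then referred to a chi-squared distribution with (number of groups - 1) degrees of freedom to obtain the P value reported in the abstract.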

  4. The Impact of Childhood Obesity on Health and Health Service Use.

    PubMed

    Kinge, Jonas Minet; Morris, Stephen

    2018-06-01

    To test the impact of obesity on health and health care use in children, by the use of various methods to account for reverse causality and omitted variables. Fifteen rounds of the Health Survey for England (1998-2013), which is representative of children and adolescents in England. We use three methods to account for reverse causality and omitted variables in the relationship between BMI and health/health service use: regression with individual, parent, and household control variables; sibling fixed effects; and instrumental variables based on genetic variation in weight. We include all children and adolescents aged 4-18 years old. We find that obesity has a statistically significant and negative impact on self-rated health and a positive impact on health service use in girls, boys, younger children (aged 4-12), and adolescents (aged 13-18). The findings are comparable in each model in both boys and girls. Using econometric methods, we have mitigated several confounding factors affecting the impact of obesity in childhood on health and health service use. Our findings suggest that obesity has severe consequences for health and health service use even among children. © Health Research and Educational Trust.

  5. Extreme Ultraviolet Variability Experiment (EVE) Multiple EUV Grating Spectrographs (MEGS): Radiometric Calibrations and Results

    NASA Technical Reports Server (NTRS)

    Hock, R. A.; Woods, T. N.; Crotser, D.; Eparvier, F. G.; Woodraska, D. L.; Chamberlin, P. C.; Woods, E. C.

    2010-01-01

    The NASA Solar Dynamics Observatory (SDO), scheduled for launch in early 2010, incorporates a suite of instruments including the Extreme Ultraviolet Variability Experiment (EVE). EVE has multiple instruments including the Multiple Extreme ultraviolet Grating Spectrographs (MEGS) A, B, and P instruments, the Solar Aspect Monitor (SAM), and the Extreme ultraviolet SpectroPhotometer (ESP). The radiometric calibration of EVE, necessary to convert the instrument counts to physical units, was performed at the National Institute of Standards and Technology (NIST) Synchrotron Ultraviolet Radiation Facility (SURF III) located in Gaithersburg, Maryland. This paper presents the results and derived accuracy of this radiometric calibration for the MEGS A, B, P, and SAM instruments, while the calibration of the ESP instrument is addressed by Didkovsky et al. In addition, solar measurements that were taken on 14 April 2008, during the NASA 36.240 sounding-rocket flight, are shown for the prototype EVE instruments.

  6. Sources of Variability in Chlorophyll Analysis by Fluorometry and by High Performance Liquid Chromatography. Chapter 22

    NASA Technical Reports Server (NTRS)

    VanHeukelem, Laurie; Thomas, Crystal S.; Glibert, Patricia M.

    2001-01-01

    The need for accurate determination of chlorophyll a (chl a) is of interest for numerous reasons. From the need for ground-truth data for remote sensing to pigment detection for laboratory experimentation, it is essential to know the accuracy of the analyses and the factors potentially contributing to variability and error. Numerous methods and instrument techniques are currently employed in the analyses of chl a. These methods range from spectrophotometric quantification, to fluorometric analysis and determination by high performance liquid chromatography. Even within the application of HPLC techniques, methods vary. Here we provide the results of a comparison among methods and provide some guidance for improving the accuracy of these analyses. These results are based on a round-robin conducted among numerous investigators, including several in the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) and HyCODE Programs. Our purpose here is not to present the full results of the laboratory intercalibration; those results will be presented elsewhere. Rather, here we highlight some of the major factors that may contribute to the variability observed. Specifically, we aim to assess the comparability of chl a analyses performed by fluorometry and HPLC, and we identify several factors in the analyses which may contribute disproportionately to this variability.

  7. College quality and hourly wages: evidence from the self-revelation model, sibling models and instrumental variables.

    PubMed

    Borgen, Nicolai T

    2014-11-01

    This paper addresses the recent discussion on confounding in the returns to college quality literature using the Norwegian case. The main advantage of studying Norway is the quality of the data. Norwegian administrative data provide information on college applications, family relations and a rich set of control variables for all Norwegian citizens applying to college between 1997 and 2004 (N = 141,319) and their subsequent wages between 2003 and 2010 (676,079 person-year observations). With these data, this paper uses a subset of the models that have rendered mixed findings in the literature in order to investigate to what extent confounding biases the returns to college quality. I compare estimates obtained using standard regression models to estimates obtained using the self-revelation model of Dale and Krueger (2002), a sibling fixed effects model and the instrumental variable model used by Long (2008). Using these methods, I consistently find increasing returns to college quality over the course of students' work careers, with positive returns emerging only in the later stages. I conclude that the standard regression estimate provides a reasonable estimate of the returns to college quality. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. From nanoparticles to large aerosols: Ultrafast measurement methods for size and concentration

    NASA Astrophysics Data System (ADS)

    Keck, Lothar; Spielvogel, Jürgen; Grimm, Hans

    2009-05-01

    A major challenge in aerosol technology is the fast measurement of number size distributions with good accuracy and size resolution. The dedicated instruments are frequently based on particle charging and electric detection. Established fast systems, however, still feature a number of shortcomings. We have developed a new instrument that consists of a high flow Differential Mobility Analyser (high flow DMA) and a high sensitivity Faraday Cup Electrometer (FCE). The system enables variable flow rates of up to 150 lpm, and the scan time for a size distribution can be shortened considerably due to the short residence time of the particles in the DMA. Three different electrodes can be employed in order to cover a large size range. First test results demonstrate that the scan time can be reduced to less than 1 s for small particles, and that the results from the fast scans show no significant difference from the results of the established slow method. The fields of application for the new instrument comprise the precise monitoring of fast processes with nanoparticles, including monitoring of engine exhaust in automotive research.

  9. The impact of fashion competence and achievement motivation toward college student’s working readiness on “Cipta Karya” subject

    NASA Astrophysics Data System (ADS)

    Marniati; Wibawa, S. C.

    2018-01-01

    This study aimed to determine the working readiness of college students in the fashion study program for the 'Cipta Karya' subject, covering cognitive, attitudinal and skill readiness, as a function of the variables fashion competence and achievement motivation. The subjects were 43 college students who took the Cipta Karya subject. Data were collected with a questionnaire offering five alternative answers on a Likert-type scale, and analysed with path analysis (multiple regression). Instrument validity was tested with the product-moment correlation, and instrument reliability with Cronbach's alpha. The results showed that (1) fashion competence had a significant effect on working readiness for Cipta Karya, (2) achievement motivation had a significant effect on working readiness for Cipta Karya, and (3) both effects were positive. This means that fashion competence and achievement motivation positively affect working readiness for Cipta Karya performance.
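    Cronbach's alpha, the reliability coefficient used for the instrument above, can be computed as follows (the responses here are hypothetical):

```python
def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)   # sample variance
    totals = [sum(col) for col in zip(*items)]   # per-respondent total score
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Five hypothetical Likert items answered by six respondents.
responses = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 5],
    [3, 5, 2, 4, 1, 4],
    [5, 5, 3, 4, 2, 4],
    [4, 4, 2, 5, 1, 5],
]
alpha = cronbach_alpha(responses)
```

    Values near 1 indicate that the items vary together, i.e. high internal consistency of the scale.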

  10. Adjusted Analyses in Studies Addressing Therapy and Harm: Users' Guides to the Medical Literature.

    PubMed

    Agoritsas, Thomas; Merglen, Arnaud; Shah, Nilay D; O'Donnell, Martin; Guyatt, Gordon H

    2017-02-21

    Observational studies almost always have bias because prognostic factors are unequally distributed between patients exposed or not exposed to an intervention. The standard approach to dealing with this problem is adjusted or stratified analysis. Its principle is to use measurement of risk factors to create prognostically homogeneous groups and to combine effect estimates across groups. The purpose of this Users' Guide is to introduce readers to fundamental concepts underlying adjustment as a way of dealing with prognostic imbalance and to the basic principles and relative trustworthiness of various adjustment strategies. One alternative to the standard approach is propensity analysis, in which groups are matched according to the likelihood of membership in exposed or unexposed groups. Propensity methods can deal with multiple prognostic factors, even if there are relatively few patients having outcome events. However, propensity methods do not address other limitations of traditional adjustment: investigators may not have measured all relevant prognostic factors (or not accurately), and unknown factors may bias the results. A second approach, instrumental variable analysis, relies on identifying a variable associated with the likelihood of receiving the intervention but not associated with any prognostic factor or with the outcome (other than through the intervention); this could mimic randomization. However, as with assumptions of other adjustment approaches, it is never certain if an instrumental variable analysis eliminates bias. Although all these approaches can reduce the risk of bias in observational studies, none replace the balance of both known and unknown prognostic factors offered by randomization.
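    The stratified-adjustment principle described above (form prognostically homogeneous groups, then combine effect estimates across them) can be illustrated with inverse-variance pooling of per-stratum risk differences. All counts below are hypothetical:

```python
def risk_diff(events_exp, n_exp, events_ctl, n_ctl):
    """Risk difference and its (binomial) variance for one stratum."""
    p1, p0 = events_exp / n_exp, events_ctl / n_ctl
    var = p1 * (1 - p1) / n_exp + p0 * (1 - p0) / n_ctl
    return p1 - p0, var

def pooled(strata):
    """Inverse-variance weighted combination of per-stratum risk differences."""
    ests = [risk_diff(*s) for s in strata]
    weights = [1.0 / v for _, v in ests]
    return sum(w * rd for w, (rd, _) in zip(weights, ests)) / sum(weights)

# Hypothetical counts per stratum: (events_exposed, n_exposed, events_control, n_control).
low_risk = (30, 100, 20, 100)
high_risk = (40, 100, 35, 100)
crude = risk_diff(30 + 40, 200, 20 + 35, 200)[0]   # ignores the strata
adjusted = pooled([low_risk, high_risk])
```

    The pooled estimate weights each stratum by the precision of its estimate; as the text notes, this only adjusts for the prognostic factors that were actually measured.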

  11. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    NASA Astrophysics Data System (ADS)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose for this study was critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006--2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction of 64 articles measuring a variety of constructs that have been published throughout the peer-reviewed medical education literature indicate significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. 
Finally, editors and reviewers are encouraged to recognize this gap in best practices and subsequently to promote instrument development research that is more consistent through the peer-review process.

  12. CD4 Lymphocyte Enumeration and Hemoglobin Assessment Aid for Priority Decisions: A Multisite Evaluation of the BD FACSPresto™ System.

    PubMed

    Thakar, Madhuri; Angira, Francis; Pattanapanyasat, Kovit; Wu, Alan H B; O'Gorman, Maurice; Zeng, Hui; Qu, Chenxue; Mahajan, Bharati; Sukapirom, Kasama; Chen, Danying; Hao, Yu; Gong, Yan; Indig, Monika De Arruda; Graminske, Sharon; Orta, Diana; d'Empaire, Nicole; Lu, Beverly; Omana-Zapata, Imelda; Zeh, Clement

    2017-01-01

    The BD FACSPresto™ system uses capillary and venous blood to measure CD4 absolute counts (CD4), %CD4 in lymphocytes, and hemoglobin (Hb) in approximately 25 minutes. CD4 cell count is used with portable CD4 counters in resource-limited settings to manage HIV/AIDS patients. A method comparison was performed using capillary and venous samples from seven clinical laboratories in five countries. The BD FACSPresto system was assessed for variability between laboratory, instrument/operators, cartridge lots and within-run at four sites. Samples were collected under approved voluntary consent. EDTA-anticoagulated venous samples were tested for CD4 and %CD4 T cells using the gold-standard BD FACSCalibur™ system, and for Hb, using the Sysmex® KX-21N™ analyzer. Venous and capillary samples were tested on the BD FACSPresto system. Matched data were analyzed for bias (Deming linear regression and Bland-Altman methods) and for concordance around the clinical decision point. The coefficient of variation was estimated per site, instrument/operator, cartridge lot and between-runs. For the method comparison, 93% of the 720 samples were from HIV-positive and 7% from HIV-negative or normal subjects. Venous and capillary CD4 and %CD4 T-cell results gave slopes within 0.96-1.05 and R² ≥ 0.96; Hb slopes were ≥ 1.00 and R² ≥ 0.89. Variability across sites/operators gave %CV < 5.8% for CD4 counts, < 1.9% for %CD4 and < 3.2% for Hb. The total %CV was < 7.7% across instrument/cartridge lot. The BD FACSPresto system provides accurate, reliable, precise CD4/%CD4/Hb results compared to gold-standard methods, irrespective of venous or capillary blood sampling. The data showed good agreement between the BD FACSPresto, BD FACSCalibur and Sysmex systems.
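    A Bland-Altman comparison, one of the methods named above, reduces to a mean difference (bias) and 95% limits of agreement. A sketch with hypothetical paired CD4 counts:

```python
def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement for paired readings."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired CD4 counts (cells/uL) from the two instruments.
presto = [350, 512, 488, 610, 295, 801, 430, 560]
calibur = [340, 520, 480, 600, 300, 790, 445, 555]
bias, loa_low, loa_high = bland_altman(presto, calibur)
```

    Agreement is judged by whether the bias is near zero and the limits of agreement are narrow enough to be clinically unimportant, e.g. around the CD4 treatment-decision threshold.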

  13. Genetic markers as instrumental variables.

    PubMed

    von Hinke, Stephanie; Davey Smith, George; Lawlor, Debbie A; Propper, Carol; Windmeijer, Frank

    2016-01-01

    The use of genetic markers as instrumental variables (IV) is receiving increasing attention from economists, statisticians, epidemiologists and social scientists. Although IV is commonly used in economics, the appropriate conditions for the use of genetic variants as instruments have not been well defined. The increasing availability of biomedical data, however, makes understanding of these conditions crucial to the successful use of genotypes as instruments. We combine the econometric IV literature with that from genetic epidemiology, and discuss the biological conditions and IV assumptions within the statistical potential outcomes framework. We review this in the context of two illustrative applications. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
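    The IV conditions discussed above can be illustrated with the Wald estimator on synthetic data, where a binary "genotype" shifts the exposure but reaches the outcome only through it (all parameters below are invented for the illustration):

```python
import random

# Synthetic data: instrument z, unobserved confounder u, exposure x, outcome y.
random.seed(0)
n, beta_true = 5000, 2.0
z = [random.randint(0, 1) for _ in range(n)]
u = [random.gauss(0, 1) for _ in range(n)]
x = [zi + ui + random.gauss(0, 1) for zi, ui in zip(z, u)]
y = [beta_true * xi + ui + random.gauss(0, 1) for xi, ui in zip(x, u)]

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)

beta_ols = cov(x, y) / cov(x, x)   # biased upward: u moves both x and y
beta_iv = cov(z, y) / cov(z, x)    # Wald estimator: consistent if z is a valid IV
```

    The ordinary regression slope absorbs the confounding through u, while the ratio of the instrument-outcome to instrument-exposure covariances recovers the causal coefficient, which is exactly why conditions (i)-(iii) in the tutorial above must hold for the genotype.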

  14. A compact LWIR imaging spectrometer with a variable gap Fabry-Perot interferometer

    NASA Astrophysics Data System (ADS)

    Zhang, Fang; Gao, Jiaobo; Wang, Nan; Zhao, Yujie; Zhang, Lei; Gao, Shan

    2017-02-01

    Fourier transform spectroscopy is a widely employed method for obtaining spectra, with applications ranging from the desktop to remote sensing. Long wave infrared (LWIR) interferometric spectral imaging systems, however, tend to be bulky and heavy. To miniaturize and lighten the instrument, a new LWIR spectral imaging method based on a variable gap Fabry-Perot (FP) interferometer was investigated. After analyzing the working principle of the system, it is shown theoretically how to determine the primary parameters, such as the reflectivity of the two interferometric cavity surfaces, the field of view (FOV) and the f-number of the imaging lens. A prototype was developed and good experimental results were obtained with a CO2 laser. The research shows that, besides high throughput and high spectral resolution, this method simultaneously achieves the advantage of miniaturization.

  15. The Paired Availability Design and Related Instrumental Variable Meta-analyses | Division of Cancer Prevention

    Cancer.gov

    Stuart G. Baker, 2017. This software computes meta-analysis and extrapolation estimates for an instrumental variable meta-analysis of randomized trials or before-and-after studies (the latter also known as the paired availability design). The software also checks the assumptions if sufficient data are available.

  16. Statistical Analysis for Multisite Trials Using Instrumental Variables with Random Coefficients

    ERIC Educational Resources Information Center

    Raudenbush, Stephen W.; Reardon, Sean F.; Nomi, Takako

    2012-01-01

    Multisite trials can clarify the average impact of a new program and the heterogeneity of impacts across sites. Unfortunately, in many applications, compliance with treatment assignment is imperfect. For these applications, we propose an instrumental variable (IV) model with person-specific and site-specific random coefficients. Site-specific IV…

  17. Job demands and job strain as risk factors for employee wellbeing in elderly care: an instrumental-variables analysis.

    PubMed

    Elovainio, Marko; Heponiemi, Tarja; Kuusio, Hannamaria; Jokela, Markus; Aalto, Anna-Mari; Pekkarinen, Laura; Noro, Anja; Finne-Soveri, Harriet; Kivimäki, Mika; Sinervo, Timo

    2015-02-01

    The association between psychosocial work environment and employee wellbeing has repeatedly been shown. However, as environmental evaluations have typically been self-reported, the observed associations may be attributable to reporting bias. Applying instrumental-variable regression, we used staffing level (the ratio of staff to residents) as an unconfounded instrument for self-reported job demands and job strain to predict various indicators of wellbeing (perceived stress, psychological distress and sleeping problems) among 1525 registered nurses, practical nurses and nursing assistants working in elderly care wards. In ordinary regression, higher self-reported job demands and job strain were associated with increased risk of perceived stress, psychological distress and sleeping problems. The effect estimates for the associations of these psychosocial factors with perceived stress and psychological distress were greater, but less precisely estimated, in an instrumental-variables analysis which took into account only the variation in self-reported job demands and job strain that was explained by staffing level. No association between psychosocial factors and sleeping problems was observed with the instrumental-variable analysis. These results support a causal interpretation of high self-reported job demands and job strain being risk factors for employee wellbeing. © The Author 2014. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  18. Method and apparatus for measuring low currents in capacitance devices

    DOEpatents

    Kopp, M.K.; Manning, F.W.; Guerrant, G.C.

    1986-06-04

    A method and apparatus for measuring subnanoampere currents in capacitance devices is reported. The method is based on a comparison of the voltages developed across the capacitance device with that of a reference capacitor in which the current is adjusted by means of a variable current source to produce a stable voltage difference. The current varying means of the variable current source is calibrated to provide a read out of the measured current. Current gain may be provided by using a reference capacitor which is larger than the device capacitance with a corresponding increase in current supplied through the reference capacitor. The gain is then the ratio of the reference capacitance to the device capacitance. In one illustrated embodiment, the invention makes possible a new type of ionizing radiation dose-rate monitor where dose-rate is measured by discharging a reference capacitor with a variable current source at the same rate that radiation is discharging an ionization chamber. The invention eliminates high-megohm resistors and low current ammeters used in low-current measuring instruments.
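    The current gain described above is simply the capacitance ratio; worked numbers with hypothetical component values:

```python
# Balancing the discharge rates of the two capacitors means
# i_ref / c_ref == i_device / c_device, so the adjustable reference
# current exceeds the measured current by the capacitance ratio.
c_device = 100e-12   # 100 pF ionization chamber capacitance (hypothetical)
c_ref = 10e-9        # 10 nF reference capacitor (hypothetical)
i_device = 5e-10     # 0.5 nA current to be measured (hypothetical)
gain = c_ref / c_device   # current gain = capacitance ratio
i_ref = i_device * gain   # the larger, easily adjusted reference current
```

    A subnanoampere signal is thus read out as a far more tractable tens-of-nanoamperes adjustment, with no high-megohm resistor in the signal path.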

  19. Effectively Transforming IMC Flight into VMC Flight: An SVS Case Study

    NASA Technical Reports Server (NTRS)

    Glaab, Louis J.; Hughes, Monic F.; Parrish, Russell V.; Takallu, Mohammad A.

    2006-01-01

    A flight-test experiment was conducted using the NASA LaRC Cessna 206 aircraft. Four primary flight and navigation display concepts, including baseline and Synthetic Vision System (SVS) concepts, were evaluated in the local area of Roanoke Virginia Airport, flying visual and instrument approach procedures. A total of 19 pilots, from 3 pilot groups reflecting the diverse piloting skills of the GA population, served as evaluation pilots. Multi-variable Discriminant Analysis was applied to three carefully selected and markedly different operating conditions with conventional instrumentation to provide an extension of traditional analysis methods as well as provide an assessment of the effectiveness of SVS displays to effectively transform IMC flight into VMC flight.

  20. Translating the short version of the Perinatal Grief Scale: process and challenges.

    PubMed

    Capitulo, K L; Cornelio, M A; Lenz, E R

    2001-08-01

    Non-English-speaking populations may be excluded from rigorous clinical research because of the lack of reliable and valid instrumentation to measure psychosocial variables. The purpose of this article is to describe the process and challenges when translating a research instrument. The process will be illustrated in the project of translating into Spanish the Short Version of the Perinatal Grief Scale, extensively studied in English-speaking, primarily Caucasian populations. Translation methods, errors, and tips are included. Tools cannot be used in transcultural research and practice without careful and accurate translation and subsequent psychometric evaluation, which are essential to generate credible and valid findings. Copyright 2001 by W.B. Saunders Company

  1. Command and data handling of science signals on Spacelab

    NASA Technical Reports Server (NTRS)

    Mccain, H. G.

    1975-01-01

    The Orbiter Avionics and the Spacelab Command and Data Management System (CDMS) combine to provide a relatively complete command, control, and data handling service to the instrument complement during a Shuttle Sortie Mission. The Spacelab CDMS services the instruments and the Orbiter in turn services the Spacelab. The CDMS computer system includes three computers, two I/O units, a mass memory, and a variable number of remote acquisition units. Attention is given to the CDMS high rate multiplexer, CDMS tape recorders, closed circuit television for the visual monitoring of payload bay and cabin area activities, methods of science data acquisition, questions of transmission and recording, CDMS experiment computer usage, and experiment electronics.

  2. Probing dimensionality using a simplified 4-probe method.

    PubMed

    Kjeldby, Snorre B; Evenstad, Otto M; Cooil, Simon P; Wells, Justin W

    2017-10-04

    4-probe electrical measurements have been in existence for many decades. One of the most useful aspects of the 4-probe method is that it is not only possible to find the resistivity of a sample (independently of the contact resistances), but that it is also possible to probe the dimensionality of the sample. In theory, this is straightforward to achieve by measuring the 4-probe resistance as a function of probe separation. In practice, it is challenging to move all four probes with sufficient precision over the necessary range. Here, we present an alternative approach. We demonstrate that the dimensionality of the conductive path within a sample can be directly probed using a modified 4-probe method in which an unconventional geometry is exploited; three of the probes are rigidly fixed, and the position of only one probe is changed. This allows 2D and 3D (and other) contributions to the resistivity to be readily disentangled. The required experimental instrumentation can be vastly simplified relative to traditional variable spacing 4-probe instruments.
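    The spacing argument at the heart of this record can be sketched numerically. The toy model below (an illustration, not the authors' instrument) computes the ideal collinear 4-probe resistance for a semi-infinite 3D sample and an infinite 2D sheet; scaling all probe spacings changes the 3D reading but not the 2D one, which is exactly the signature that spacing-dependent measurements exploit:

```python
import math

def r4p_3d(x, rho=1.0):
    """Collinear 4-probe resistance on a semi-infinite 3D sample.

    Current enters at x[0] and leaves at x[3]; voltage is read between
    x[1] and x[2]. Each current probe contributes a potential
    rho*I / (2*pi*r), so the reading falls off as 1/spacing.
    """
    x1, x2, x3, x4 = x
    inv = lambda a, b: 1.0 / abs(a - b)
    return rho / (2 * math.pi) * (inv(x2, x1) - inv(x2, x4) - inv(x3, x1) + inv(x3, x4))

def r4p_2d(x, r_sheet=1.0):
    """Same geometry on an infinite 2D sheet: potentials are logarithmic,
    so the reading depends only on distance ratios and is invariant under
    a uniform scaling of all probe spacings."""
    x1, x2, x3, x4 = x
    d = lambda a, b: abs(a - b)
    return r_sheet / (2 * math.pi) * math.log(d(x2, x4) * d(x3, x1) / (d(x2, x1) * d(x3, x4)))

# Doubling every spacing halves the 3D reading but leaves the 2D one unchanged:
print(r4p_3d((0, 1, 2, 3)), r4p_3d((0, 2, 4, 6)))   # ratio 2:1
print(r4p_2d((0, 1, 2, 3)), r4p_2d((0, 2, 4, 6)))   # identical
```

    In a mixed 2D/3D conduction path the measured resistance is a sum of both behaviors, which is why varying even a single probe position lets the two contributions be disentangled.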

  3. A New Principle of Sound Frequency Analysis

    NASA Technical Reports Server (NTRS)

    Theodorsen, Theodore

    1932-01-01

    In connection with the study of aircraft and propeller noises, the National Advisory Committee for Aeronautics has developed an instrument for sound-frequency analysis which differs fundamentally from previous types, and which, owing to its simplicity of principle, construction, and operation, has proved to be of value in this investigation. The method is based on the well-known fact that the Ohmic loss in an electrical resistance is equal to the sum of the losses of the harmonic components of a complex wave, except for the case in which any two components approach or attain vectorial identity, in which case the Ohmic loss is increased by a definite amount. The principle of frequency analysis has been presented mathematically and a number of distinct advantages relative to previous methods have been pointed out. An automatic recording instrument embodying this principle is described in detail. It employs a beat-frequency oscillator as a source of variable frequency. A large number of experiments have verified the predicted superiority of the method. A number of representative records are presented.

  4. Analysis of temperature rise and the use of coolants in the dissipation of ultrasonic heat buildup during post removal.

    PubMed

    Davis, Stephen; Gluskin, Alan H; Livingood, Philip M; Chambers, David W

    2010-11-01

    This study was designed to calculate probabilities for tissue injury and to measure the effectiveness of various coolant strategies in countering heat buildup produced by dry ultrasonic vibration during post removal. A simulated biological model was used to evaluate the cooling efficacy of a common refrigerant spray, water spray, and air spray in the recovery of post temperatures deep within the root canal space. The data set consisted of cervical and apical measures of temperature increase at 1-second intervals from baseline during continuous ultrasonic instrumentation until a 10 °C increase in temperature at the cervical site was registered, at which point instrumentation ceased and the teeth were allowed to cool under ambient conditions or with the assistance of 4 coolant methods. Data were analyzed with analysis of variance by using the independent variables of time of ultrasonic application (10, 15, 20 seconds) and cooling method. In addition to the customary means, standard deviations, and analysis of variance tests, analyses were conducted to determine probabilities that procedures would reach or exceed the 10 °C threshold. Both instrumentation time and cooling agent effects were significant at P <.0001. Under the conditions of this study, it was shown that injurious heat transfer occurs in less than 1 minute during dry ultrasonic instrumentation of metallic posts. Cycles of short instrumentation times with active coolants were effective in reducing the probability of tissue damage when teeth were instrumented dry. With as little as 20 seconds of continuous dry ultrasonic instrumentation, the consequences of thermal buildup to an individual tooth might contribute to an injurious clinical outcome. Copyright © 2010 American Association of Endodontists. All rights reserved.

  5. Survey of equine castration techniques, preferences and outcomes among Australian veterinarians.

    PubMed

    Owens, C D; Hughes, K J; Hilbert, B J; Heller, J; Nielsen, S; Trope, G D

    2018-01-01

    (1) To collect the perceptions of veterinarians performing equine castrations in Australia on techniques, preferences and outcomes, (2) to investigate veterinarian use and experience with the Henderson castrating instrument and (3) to investigate potential associations between demographics, castration methods and techniques, and complications. Online survey of members of the Australian Veterinary Association's Special Interest Group, Equine Veterinarians Australia (EVA). A link to the survey was included in the EVA e-newsletter and practices on the EVA website were contacted by telephone and follow-up email. Fisher's exact test was used to determine associations between ligation and complications. A generalised linear model with a negative binomial family was used to determine associations between count response variables and categorical independent variables. Responses were obtained from 138 veterinarians (response rate, 13.1%) who performed 5330 castrations over 12 months. Castrations were most commonly performed in the field, on anaesthetised horses, using emasculators, via an open approach and without ligation of the spermatic cord. Estimated complications after use of emasculators were swelling (25%), haemorrhage (5%) and infection (5%). The Henderson instrument was used by approximately 10% of respondents and its use for castration was associated with fewer reports of postoperative swelling compared with emasculators (P = 0.002). Rates of evisceration with the Henderson and emasculator methods were comparable (0.43% and 0.9%, respectively). Castration preferences varied widely among survey participants. Reported complication types and rates were comparable to those reported previously in other countries. Perceptions that the Henderson instrument was associated with less swelling should be investigated further via a prospective controlled investigation. © 2017 Australian Veterinary Association.

  6. A new method for measuring lung deposition efficiency of airborne nanoparticles in a single breath

    NASA Astrophysics Data System (ADS)

    Jakobsson, Jonas K. F.; Hedlund, Johan; Kumlin, John; Wollmer, Per; Löndahl, Jakob

    2016-11-01

    Assessment of respiratory tract deposition of nanoparticles is a key link to understanding their health impacts. An instrument was developed to measure respiratory tract deposition of nanoparticles in a single breath. Monodisperse nanoparticles are generated, inhaled and sampled from a determined volumetric lung depth after a controlled residence time in the lung. The instrument was characterized for sensitivity to inter-subject variability, particle size (22, 50, 75 and 100 nm) and breath-holding time (3-20 s) in a group of seven healthy subjects. The measured particle recovery had an inter-subject variability 26-50 times larger than the measurement uncertainty and the results for various particle sizes and breath-holding times were in accordance with the theory for Brownian diffusion and values calculated from the Multiple-Path Particle Dosimetry model. The recovery was found to be determined by residence time and particle size, while respiratory flow-rate had minor importance in the studied range 1-10 L/s. The instrument will be used to investigate deposition of nanoparticles in patients with respiratory disease. The fast and precise measurement allows for both diagnostic applications, where the disease may be identified based on particle recovery, and for studies with controlled delivery of aerosol-based nanomedicine to specific regions of the lungs.

  7. Results from the Rothney Astrophysical Observatory Variable Star Search Program: Background, Procedure, and Results from RAO Field 1

    NASA Astrophysics Data System (ADS)

    Williams, Michael D.; Milone, E. F.

    2013-12-01

    We describe a variable star search program and present the fully reduced results of a search in a 19 square degree (4.4° × 4.4°) field centered on J2000 RA = 22:03:24, DEC = +18:54:32. The search was carried out with the Baker-Nunn Patrol Camera located at the Rothney Astrophysical Observatory in the foothills of the Canadian Rockies. A total of 26,271 stars were detected in the field, over a range of about 11-15 (instrumental) magnitudes. Our image processing made use of the IRAF version of the DAOPHOT aperture photometry routine, and we used the ANOVA method to search for periodic variations in the light curves. We formally detected periodic variability in 35 stars, which we tentatively classify according to light curve characteristics: 6 EA (Algol), 5 EB (β Lyrae), 19 EW (W UMa), and 5 RR (RR Lyrae) stars. Eleven of the detected variable stars have been reported previously in the literature. The eclipsing binary light curves have been analyzed with a package of light curve modeling programs, and 25 have yielded converged solutions. Ten of these systems are detached, 3 are semi-detached, 10 are overcontact, and 2 appear to be in marginal contact. We discuss these results as well as the advantages and disadvantages of the instrument and of the program.
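    The ANOVA period search mentioned above folds the light curve at each trial period and compares the between-bin to within-bin variance of the phase-binned magnitudes; a true period aligns the phases and makes the ratio peak. A minimal sketch on synthetic data (illustrative only; the survey used its own pipeline):

```python
import numpy as np

def aov_statistic(t, y, period, n_bins=10):
    """Fold at the trial period, bin by phase, and return the ANOVA-style
    ratio of between-bin to within-bin variance."""
    phase = (t / period) % 1.0
    bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    grand = y.mean()
    between, within, used = 0.0, 0.0, 0
    for b in range(n_bins):
        yb = y[bins == b]
        if len(yb) < 2:
            continue
        used += 1
        between += len(yb) * (yb.mean() - grand) ** 2
        within += ((yb - yb.mean()) ** 2).sum()
    return (between / (used - 1)) / (within / (len(y) - used))

rng = np.random.default_rng(0)
t = rng.uniform(0, 30, 400)                          # irregular sampling
y = np.sin(2 * np.pi * t / 1.3) + 0.1 * rng.normal(size=400)
trials = np.arange(0.5, 2.0, 0.005)
best = trials[np.argmax([aov_statistic(t, y, p) for p in trials])]
print(best)   # close to the injected period of 1.3
```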

  8. Cocaine Dependence Treatment Data: Methods for Measurement Error Problems With Predictors Derived From Stationary Stochastic Processes

    PubMed Central

    Guan, Yongtao; Li, Yehua; Sinha, Rajita

    2011-01-01

    In a cocaine dependence treatment study, we use linear and nonlinear regression models to model posttreatment cocaine craving scores and first cocaine relapse time. A subset of the covariates are summary statistics derived from baseline daily cocaine use trajectories, such as baseline cocaine use frequency and average daily use amount. These summary statistics are subject to estimation error and can therefore cause biased estimators for the regression coefficients. Unlike classical measurement error problems, the error we encounter here is heteroscedastic with an unknown distribution, and there are no replicates for the error-prone variables or instrumental variables. We propose two robust methods to correct for the bias: a computationally efficient method-of-moments-based method for linear regression models and a subsampling extrapolation method that is generally applicable to both linear and nonlinear regression models. Simulations and an application to the cocaine dependence treatment data are used to illustrate the efficacy of the proposed methods. Asymptotic theory and variance estimation for the proposed subsampling extrapolation method and some additional simulation results are described in the online supplementary material. PMID:21984854
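    As a rough illustration of the method-of-moments idea for linear models (a generic sketch under simplifying assumptions, not the authors' estimator: here the heteroscedastic error variances are treated as known per observation, whereas in the study they must themselves be estimated):

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta = 5000, 2.0
x = rng.normal(size=n)                      # true (unobserved) covariate
sig2 = rng.uniform(0.2, 0.8, n)             # known, heteroscedastic error variances
w = x + rng.normal(size=n) * np.sqrt(sig2)  # error-prone observed covariate
y = beta * x + rng.normal(size=n)

# Naive OLS of y on w is attenuated toward zero by the measurement error:
naive = (w @ y) / (w @ w)

# Method-of-moments correction: E[w'w] = x'x + sum(sig2) while E[w'y] = x'y,
# so subtracting the known error variance de-biases the denominator.
corrected = (w @ y) / (w @ w - sig2.sum())

print(naive, corrected)   # naive ~ 1.3, corrected ~ 2.0
```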

  9. An examination of the hexokinase method for serum glucose assay using external quality assessment data.

    PubMed

    Westwood, A; Bullock, D G; Whitehead, T P

    1986-01-01

    Hexokinase methods for serum glucose assay appeared to give slightly but consistently higher inter-laboratory coefficients of variation than all methods combined in the UK External Quality Assessment Scheme; their performance over a two-year period was therefore compared with that for three groups of glucose oxidase methods. This assessment showed no intrinsic inferiority in the hexokinase method. The greater variation may be due to the more heterogeneous group of instruments, particularly discrete analysers, on which the method is used. The Beckman Glucose Analyzer and Astra group (using a glucose oxidase method) showed the least inter-laboratory variability but also the lowest mean value. No comment is offered on the absolute accuracy of any of the methods.

  10. Intercomparison of two comparative reactivity method instruments in the Mediterranean basin during summer 2013

    NASA Astrophysics Data System (ADS)

    Zannoni, N.; Dusanter, S.; Gros, V.; Sarda Esteve, R.; Michoud, V.; Sinha, V.; Locoge, N.; Bonsang, B.

    2015-09-01

    The hydroxyl radical (OH) plays a key role in the atmosphere, as it initiates most of the oxidation processes of volatile organic compounds (VOCs) and can ultimately lead to the formation of ozone and secondary organic aerosols (SOAs). There are still uncertainties associated with the OH budget assessed using current models of atmospheric chemistry, and direct measurements of OH sources and sinks have proved to be valuable tools for improving our understanding of OH chemistry. The total first-order loss rate of OH, or total OH reactivity, can be directly measured using three different methods: total OH loss rate measurement, the laser-induced pump-and-probe technique, and the comparative reactivity method. Observations of total OH reactivity are usually coupled with individual measurements of reactive compounds in the gas phase, which are used to calculate the OH reactivity. Studies using the three methods have highlighted that a significant fraction of OH reactivity is often not explained by individually measured reactive compounds and could be associated with unmeasured or unknown chemical species. Accurate and reproducible measurements of OH reactivity are therefore required. The comparative reactivity method (CRM) has proved to be an advantageous technique with an extensive range of applications, and for this reason it has been adopted by several research groups since its development. However, this method also requires careful corrections to derive ambient OH reactivity. Herein we present an intercomparison exercise of two CRM instruments, CRM-LSCE (Laboratoire des Sciences du Climat et de l'Environnement) and CRM-MD (Mines Douai), conducted during July 2013 at the Mediterranean site of Ersa, Cape Corsica, France. The intercomparison exercise included tests to assess the corrections needed by the two instruments to process the raw data sets, as well as OH reactivity observations. 
The observations were divided into three parts: 2 days of plant emissions (8-9 July), 2 days of ambient measurements (10-11 July) and 2 further days of plant emissions (12-13 July). We discuss in detail the experimental approach adopted and how the data sets were processed for both instruments. The corrections required for the two instruments lead to higher values of reactivity in ambient air: overall a 20 % increase for CRM-MD and a 49 % increase for CRM-LSCE compared with the raw data. We show that ambient OH reactivity measured by the two instruments agrees very well (correlation described by a linear least squares fit with a slope of 1 and R2 of 0.75). This study highlights that ambient measurements of OH reactivity with differently configured CRM instruments yield consistent results in a low-NOx (NO + NO2), terpene-rich environment, despite the corrections specific to each instrument. Conducting more intercomparison exercises, involving more CRM instruments operated under different ambient and instrumental settings, will help further assess the variability induced by instrument-specific corrections.

  11. Measuring Ventilatory Activity with Structured Light Plethysmography (SLP) Reduces Instrumental Observer Effect and Preserves Tidal Breathing Variability in Healthy and COPD

    PubMed Central

    Niérat, Marie-Cécile; Dubé, Bruno-Pierre; Llontop, Claudia; Bellocq, Agnès; Layachi Ben Mohamed, Lila; Rivals, Isabelle; Straus, Christian; Similowski, Thomas; Laveneziana, Pierantonio

    2017-01-01

    The use of a mouthpiece to measure ventilatory flow with a pneumotachograph (PNT) introduces a major perturbation to breathing (“instrumental/observer effect”) and suffices to modify respiratory behavior. Structured light plethysmography (SLP) is a non-contact method for assessing the breathing pattern during tidal breathing. Firstly, we validated the SLP measurements by comparing timing components of the ventilatory pattern obtained by SLP vs. PNT under the same conditions; secondly, we compared SLP to SLP+PNT measurements of breathing pattern to evaluate the disruption of breathing pattern and breathing variability in healthy and COPD subjects. Measurements were taken during tidal breathing with SLP alone and with SLP+PNT recording in 30 COPD and healthy subjects. Measurements included respiratory frequency (Rf) and inspiratory, expiratory, and total breath times (Ti, Te, and Tt). Passing-Bablok regression analysis was used to evaluate the interchangeability of the timing components of the ventilatory pattern (Rf, Ti, Te, and Tt) between measurements performed under the following experimental conditions: SLP vs. PNT, SLP+PNT vs. SLP, and SLP+PNT vs. PNT. The variability of the ventilatory variables was assessed through their coefficients of variation (CVs). In healthy subjects, according to Passing-Bablok regression, Rf, Ti, Te and Tt were interchangeable between measurements obtained under all three experimental conditions (SLP vs. PNT, SLP+PNT vs. SLP, and SLP+PNT vs. PNT). All the CVs describing “traditional” ventilatory variables (Rf, Ti, Te, Ti/Te, and Ti/Tt) were significantly smaller in the SLP+PNT condition. This was not the case for the more “specific” SLP-derived variables. In COPD patients, Rf, Ti, Te, and Tt were interchangeable between measurements obtained under SLP vs. PNT and SLP+PNT vs. PNT, whereas only Rf, Te, and Tt were interchangeable between measurements obtained under SLP+PNT vs. SLP. 
However, most discrete variables differed significantly between the SLP and SLP+PNT conditions, and CVs were significantly lower when COPD patients were assessed in the SLP+PNT condition. Measuring ventilatory activity with SLP preserves resting tidal breathing variability, reduces the instrumental observer effect and avoids the disruptions in breathing pattern induced by the PNT-mouthpiece-nose-clip combination. PMID:28572773
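    The Passing-Bablok regression used in this study fits a line whose slope is a shifted median of all pairwise slopes, which makes the fit robust to outliers and symmetric in the two methods being compared, unlike ordinary least squares. A minimal point-estimate sketch (confidence intervals and tie handling are omitted for brevity):

```python
import itertools
from statistics import median

def passing_bablok(x, y):
    """Minimal Passing-Bablok fit (point estimates only).

    Slope: shifted median of all pairwise slopes; slopes of exactly -1 are
    discarded and the median index is shifted by K, the count of slopes
    below -1. Intercept: median of y - b*x.
    """
    slopes = []
    for (xi, yi), (xj, yj) in itertools.combinations(zip(x, y), 2):
        if xi != xj:
            s = (yj - yi) / (xj - xi)
            if s != -1.0:               # slopes of exactly -1 are discarded
                slopes.append(s)
    slopes.sort()
    n = len(slopes)
    k = sum(s < -1.0 for s in slopes)   # shift compensating the sign convention
    if n % 2:
        b = slopes[(n - 1) // 2 + k]
    else:
        b = 0.5 * (slopes[n // 2 - 1 + k] + slopes[n // 2 + k])
    a = median(yi - b * xi for xi, yi in zip(x, y))
    return b, a

# Two "methods" that agree up to a proportional (x2) and constant (+1) bias:
b, a = passing_bablok([1, 2, 3, 4, 5], [3, 5, 7, 9, 11])
print(b, a)   # 2.0 1.0
```

    In a method-comparison setting, a slope near 1 and an intercept near 0 indicate interchangeability of the two measurement conditions.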

  13. Automated measurement and monitoring of bioprocesses: key elements of the M(3)C strategy.

    PubMed

    Sonnleitner, Bernhard

    2013-01-01

    The state-of-routine monitoring items established in the bioprocess industry as well as some important state-of-the-art methods are briefly described and the potential pitfalls discussed. Among those are physical and chemical variables such as temperature, pressure, weight, volume, mass and volumetric flow rates, pH, redox potential, gas partial pressures in the liquid and molar fractions in the gas phase, infrared spectral analysis of the liquid phase, and calorimetry over an entire reactor. Classical as well as new optical versions are addressed. Biomass and bio-activity monitoring (as opposed to "measurement") via turbidity, permittivity, in situ microscopy, and fluorescence are critically analyzed. Some new(er) instrumental analytical tools, interfaced to bioprocesses, are explained. Among those are chromatographic methods, mass spectrometry, flow and sequential injection analyses, field flow fractionation, capillary electrophoresis, and flow cytometry. This chapter surveys the principles of monitoring rather than compiling instruments.

  14. Prediction of future falls in a community dwelling older adult population using instrumented balance and gait analysis.

    PubMed

    Bauer, C M; Gröger, I; Rupprecht, R; Marcar, V L; Gaßmann, K G

    2016-04-01

    The role of instrumented balance and gait assessment when screening for prospective fallers is currently a topic of controversial discussion. This study analyzed the association between variables derived from static posturography, instrumented gait analysis and clinical assessments and the occurrence of prospective falls in a sample of community-dwelling older people. In this study 84 older people were analyzed. Based on the prospective occurrence of falls, participants were categorized into fallers and non-fallers. Variables derived from clinical assessments, static posturography and instrumented gait analysis were evaluated with respect to their association with the occurrence of prospective falls using a forward stepwise, binary, logistic regression procedure. Fallers displayed a significantly shorter single support time during walking while counting backwards, an increased mediolateral to anteroposterior sway amplitude ratio, increased fast mediolateral oscillations and a larger coefficient (Coeff) of sway direction during various static posturography tests. Previous falls were not significantly associated with the occurrence of prospective falls. Variables derived from posturography and instrumented gait analysis showed significant associations with the occurrence of prospective falls in a sample of community-dwelling older adults.

  15. A 507-year rainfall and runoff reconstruction for the Monsoonal North West, Australia derived from remote paleoclimate archives

    NASA Astrophysics Data System (ADS)

    Verdon-Kidd, Danielle C.; Hancock, Gregory R.; Lowry, John B.

    2017-11-01

    The Monsoonal North West (MNW) region of Australia faces a number of challenges adapting to anthropogenic climate change. These have the potential to impact on a range of industries, including agricultural, pastoral, mining and tourism. However future changes to rainfall regimes remain uncertain due to the inability of Global Climate Models to adequately capture the tropical weather/climate processes that are known to be important for this region. Compounding this is the brevity of the instrumental rainfall record for the MNW, which is unlikely to represent the full range of climatic variability. One avenue for addressing this issue (the focus of this paper) is to identify sources of paleoclimate information that can be used to reconstruct a plausible pre-instrumental rainfall history for the MNW. Adopting this approach we find that, even in the absence of local sources of paleoclimate data at a suitable temporal resolution, remote paleoclimate records can resolve 25% of the annual variability observed in the instrumental rainfall record. Importantly, the 507-year rainfall reconstruction developed using the remote proxies displays longer and more intense wet and dry periods than observed during the most recent 100 years. For example, the maximum number of consecutive years of below (above) average rainfall is 90% (40%) higher in the rainfall reconstruction than during the instrumental period. Further, implications for flood and drought risk are studied via a simple GR1A rainfall runoff model, which again highlights the likelihood of extremes greater than that observed in the limited instrumental record, consistent with previous paleoclimate studies elsewhere in Australia. Importantly, this research can assist in informing climate related risks to infrastructure, agriculture and mining, and the method can readily be applied to other regions in the MNW and beyond.

  16. Methods to control for unmeasured confounding in pharmacoepidemiology: an overview.

    PubMed

    Uddin, Md Jamal; Groenwold, Rolf H H; Ali, Mohammed Sanni; de Boer, Anthonius; Roes, Kit C B; Chowdhury, Muhammad A B; Klungel, Olaf H

    2016-06-01

    Background: Unmeasured confounding is one of the principal problems in pharmacoepidemiologic studies. Several methods have been proposed to detect or control for unmeasured confounding, either at the study design phase or the data analysis phase. Aim of the review: To provide an overview of commonly used methods to detect or control for unmeasured confounding and to provide recommendations for their proper application in pharmacoepidemiology. Methods/Results: Methods to control for unmeasured confounding in the design phase of a study are case-only designs (e.g., case-crossover, case-time-control, self-controlled case series) and the prior event rate ratio adjustment method. Methods that can be applied in the data analysis phase include the negative control method, the perturbation variable method, instrumental variable methods, sensitivity analysis, and ecological analysis. A separate group of methods are those in which additional information on confounders is collected from a substudy. The latter group includes external adjustment, propensity score calibration, two-stage sampling, and multiple imputation. Conclusion: As the performance and application of the methods to handle unmeasured confounding may differ across studies and across databases, we stress the importance of using both statistical evidence and substantial clinical knowledge for the interpretation of study results.
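    Of the analysis-phase options listed, instrumental variable estimation is the easiest to sketch. The simulation below (illustrative, with a hypothetical data-generating process) shows a single-instrument two-stage least squares estimate, in its Wald-ratio form, recovering a treatment effect that naive regression misses because of an unmeasured confounder:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000
u = rng.normal(size=n)                      # unmeasured confounder
z = rng.normal(size=n)                      # instrument: affects treatment only
x = 0.8 * z + u + rng.normal(size=n)        # treatment, confounded by u
y = 1.5 * x + 2.0 * u + rng.normal(size=n)  # outcome; true causal effect = 1.5

# Naive regression of y on x is biased because u drives both x and y:
ols = np.cov(x, y)[0, 1] / np.var(x)

# Single-instrument two-stage least squares reduces to the Wald ratio
# cov(z, y) / cov(z, x), since z is independent of u:
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

print(ols, iv)   # OLS is biased upward; the IV estimate is close to 1.5
```

    The validity of the result rests entirely on the instrument assumptions in the head of this collection: z must be independent of u and affect y only through x.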

  17. A novel variable baseline visibility detection system and its measurement method

    NASA Astrophysics Data System (ADS)

    Li, Meng; Jiang, Li-hui; Xiong, Xing-long; Zhang, Guizhong; Yao, JianQuan

    2017-10-01

    As an important meteorological observation instrument, the visibility meter helps ensure the safety of traffic operations. However, owing to contamination of the optical system and sampling error, the accuracy and stability of such equipment are difficult to maintain in low-visibility environments. To address this, a novel measurement instrument was designed based on multiple baselines; it essentially acts as an atmospheric transmission meter with a movable optical receiver, and applies a weighted least squares method to process the signal. Theoretical analysis and experiments in a real atmospheric environment support this technique.
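    One possible reading of the weighted least squares step (a sketch under assumed Beer-Lambert attenuation; the paper's exact weighting scheme is not specified here) is to fit ln T = -sigma*L across the movable-receiver baselines and convert the extinction coefficient to a meteorological optical range:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma_true = 0.03                                   # extinction coefficient, 1/m
baselines = np.array([20., 40., 60., 80., 100.])    # movable-receiver positions, m
t_meas = np.exp(-sigma_true * baselines) * (1 + 0.01 * rng.normal(size=5))

# Beer-Lambert: ln T = -sigma * L, fitted through the origin.
# For additive noise on T, Var(ln T) ~ Var(T)/T^2, so weight each point by T^2;
# this down-weights the long, low-transmittance baselines.
w = t_meas ** 2
sigma_hat = -(w * baselines * np.log(t_meas)).sum() / (w * baselines ** 2).sum()

mor = np.log(20) / sigma_hat   # meteorological optical range at the 5% contrast threshold
print(sigma_hat, mor)          # ~0.03 per m and ~100 m
```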

  18. Nonparametric instrumental regression with non-convex constraints

    NASA Astrophysics Data System (ADS)

    Grasmair, M.; Scherzer, O.; Vanhems, A.

    2013-03-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition.

  19. Evaluation of variability and quality control procedures for a receptor-binding assay for paralytic shellfish poisoning toxins.

    PubMed

    Ruberu, S R; Langlois, G W; Masuda, M; Perera, S Kusum

    2012-01-01

    The receptor-binding assay (RBA) method for determining saxitoxin (STX) and its numerous analogues, which cause paralytic shellfish poisoning (PSP) in humans, was evaluated in a single-laboratory study. Each step of the assay preparation procedure, including the performance of the multi-detector TopCount® instrument, was evaluated for its contribution to method variability. The overall inherent RBA variability was determined to be 17%. Variability among the 12 detectors was observed; however, there was no reproducible pattern in detector performance, and the observed variability could be attributed to other factors, such as pipetting errors. In an attempt to reduce the number of plates rejected due to excessive variability in the method's quality control parameters, a statistical approach was evaluated using either Grubbs' test or Student's t-test for rejecting outliers in the measurement of triplicate wells. This approach improved the ratio of accepted to rejected plates, saving the cost and time of rerunning the assay. However, the potential reduction in accuracy and the lack of improvement in precision suggest caution when using this approach. The current study recommends an alternate quality control procedure for accepting or rejecting plates in place of the criteria in the published assay, or the alternative of outlier testing. The recommended procedure involves the development of control charts to monitor the critical parameters identified in the published method (QC sample, EC₅₀, slope of the calibration curve), with the addition of a fourth critical parameter, the top value (100% binding) of the calibration curve.
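    Grubbs' test for a triplicate, as discussed above, compares the largest standardized deviation from the mean with a tabulated critical value. A minimal sketch (the n = 3 critical value quoted in the comment is taken from standard Grubbs tables, stated here as an assumption, and should be checked against the chosen alpha):

```python
from statistics import mean, stdev

def grubbs_g(values):
    """Grubbs' statistic: the largest absolute deviation from the mean in
    units of the sample standard deviation. The flagged value is declared
    an outlier only if G exceeds the tabulated critical value for n and alpha."""
    m, s = mean(values), stdev(values)
    g, idx = max((abs(v - m) / s, i) for i, v in enumerate(values))
    return g, idx

# Triplicate wells with one suspect count; for n = 3 the commonly tabulated
# two-sided critical value at alpha = 0.05 is about 1.154, very close to the
# theoretical maximum of G for n = 3 (about 1.155).
g, idx = grubbs_g([10.0, 10.2, 14.0])
print(round(g, 3), idx)   # 1.154 2
```

    The near-degenerate critical value at n = 3 illustrates why outlier rejection on triplicates has limited power, consistent with the caution this study recommends.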

  20. Teachers' Pedagogical Management and Instrumental Performance in Students of an Artistic Higher Education School

    ERIC Educational Resources Information Center

    De La Cruz Bautista, Edwin

    2017-01-01

This research aims to determine the relationship between the variables of teachers' pedagogical management and instrumental performance in students from an Artistic Higher Education School. It is a descriptive, correlational study that seeks to establish the relationship between both variables. The sample of the study consisted of 30 students of the…

  1. Polychronometry: The Study of Time Variables in Behavior.

    ERIC Educational Resources Information Center

    Mackey, William Francis

    There is a growing need for instrumentation which can enable us to observe and compute phenomena that take place in time. Although problems of observation, computation, interpretation and categorization vary from field to field and from problem to problem, it is possible to design an instrument for use in any situation where time-variables have to…

  2. Predicting Preservice Music Teachers' Performance Success in Instrumental Courses Using Self-Regulated Study Strategies and Predictor Variables

    ERIC Educational Resources Information Center

    Ersozlu, Zehra N.; Nietfeld, John L.; Huseynova, Lale

    2017-01-01

    The purpose of this study was to examine the extent to which self-regulated study strategies and predictor variables predict performance success in instrumental performance college courses. Preservice music teachers (N = 123) from a music education department in two state universities in Turkey completed the Music Self-Regulated Studying…

  3. Can Two Psychotherapy Process Measures Be Dependably Rated Simultaneously? A Generalizability Study

    ERIC Educational Resources Information Center

    Ulvenes, Pal G.; Berggraf, Lene; Hoffart, Asle; Levy, Raymon A.; Ablon, J. Stuart; McCullough, Leigh; Wampold, Bruce E.

    2012-01-01

    Observer ratings in psychotherapy are a common way of collecting information in psychotherapy research. However, human observers are imperfect instruments, and their ratings may be subject to variability from several sources. One source of variability can be raters' assessing more than 1 instrument at a time. The purpose of this research is to…

  4. Bias estimation for the Landsat 8 operational land imager

    USGS Publications Warehouse

    Morfitt, Ron; Vanderwerff, Kelly

    2011-01-01

The Operational Land Imager (OLI) is a pushbroom sensor that will be part of the Landsat Data Continuity Mission (LDCM). This instrument is the latest in the line of Landsat imagers and will continue to expand the archive of calibrated earth imagery. An important step in producing a calibrated image from instrument data is accurately accounting for the bias of the imaging detectors. Bias variability is one factor that contributes to error in bias estimation for OLI. Typically, the bias is estimated simply by averaging dark data on a per-detector basis. However, data acquired during OLI pre-launch testing exhibited bias variation that correlated well with the variation in concurrently collected data from a special set of detectors on the focal plane. These detectors are sensitive to certain electronic effects but not directly to incoming electromagnetic radiation. A method of using data from these special detectors to estimate the bias of the imaging detectors was developed, but it was found not to be beneficial at typical radiance levels because the detectors respond slightly when the focal plane is illuminated. In addition to bias variability, a systematic bias error is introduced by the spacecraft's truncation of the 14-bit instrument data to 12-bit integers. This systematic error can be estimated and removed on average, but the per-pixel quantization error remains. This paper describes the variability of the bias, the effectiveness of a new approach to estimate and compensate for it, and the errors due to truncation and how they are reduced.
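The truncation effect described above can be illustrated with a toy calculation (not OLI's actual ground processing): dropping the two least-significant bits loses 0–3 DN per sample, so adding back the mean loss of 1.5 DN removes the systematic bias on average, while a per-pixel quantization error of up to ±1.5 DN remains.

```python
def truncate_14_to_12(dn14):
    """Model the spacecraft dropping the 2 least-significant bits."""
    return dn14 >> 2

def reconstruct(dn12):
    """Shift back and add the mean truncation loss (1.5 DN), so the
    systematic bias is removed on average; a per-pixel error in
    {-1.5, -0.5, +0.5, +1.5} DN remains."""
    return (dn12 << 2) + 1.5
```

Averaged over a uniform distribution of input codes, the residual error is zero, which is exactly the "removed on average, but the per-pixel quantization error remains" behaviour the abstract describes.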

  5. Poverty, Pregnancy, and Birth Outcomes: A Study of the Earned Income Tax Credit

    PubMed Central

    Rehkopf, David H.

    2015-01-01

    Background Economic interventions are increasingly recognized as a mechanism to address perinatal health outcomes among disadvantaged groups. In the United States, the earned income tax credit (EITC) is the largest poverty alleviation program. Little is known about its effects on perinatal health among recipients and their children. We exploit quasi-random variation in the size of EITC payments over time to examine the effects of income on perinatal health. Methods The study sample includes women surveyed in the 1979 National Longitudinal Survey of Youth (N=2,985) and their children born during 1986–2000 (N=4,683). Outcome variables include utilization of prenatal and postnatal care, use of alcohol and tobacco during pregnancy, term birth, birthweight, and breast-feeding status. We examine the health effects of both household income and EITC payment size using multivariable linear regressions. We employ instrumental variables analysis to estimate the causal effect of income on perinatal health, using EITC payment size as an instrument for household income. Results We find that household income and EITC payment size are associated with improvements in several indicators of perinatal health. Instrumental variables analysis, however, does not reveal a causal association between household income and these health measures. Conclusions Our findings suggest that associations between income and perinatal health may be confounded by unobserved characteristics, but that EITC income improves perinatal health. Future studies should continue to explore the impacts of economic interventions on perinatal health outcomes, and investigate how different forms of income transfers may have different impacts. PMID:26212041
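The instrumental variables logic applied in the study above can be sketched on simulated data. This is an assumed toy model, not the authors' analysis: an unmeasured confounder biases the naive regression of health on income, while the Wald/IV ratio using the instrument recovers the true effect.

```python
import numpy as np

# Toy model: u is an unmeasured confounder of income and health;
# z is a valid instrument (independent of u, affects income only).
rng = np.random.default_rng(0)
n = 5000
u = rng.normal(size=n)
z = rng.normal(size=n)                            # e.g., payment size
income = z + u + rng.normal(size=n)
health = 0.5 * income + u + rng.normal(size=n)    # true causal effect: 0.5

# Naive OLS slope is biased upward by the confounder u
c = np.cov(income, health)
ols = c[0, 1] / c[0, 0]

# Wald/IV estimator: cov(z, health) / cov(z, income)
iv = np.cov(z, health)[0, 1] / np.cov(z, income)[0, 1]
```

Here `ols` lands near 0.83 while `iv` recovers a value near the true 0.5 — the same pattern the abstract reports, where IV analysis weakens an association seen in the multivariable regressions.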

  6. Site-specific 13C content by quantitative isotopic 13C nuclear magnetic resonance spectrometry: a pilot inter-laboratory study.

    PubMed

    Chaintreau, Alain; Fieber, Wolfgang; Sommer, Horst; Gilbert, Alexis; Yamada, Keita; Yoshida, Naohiro; Pagelot, Alain; Moskau, Detlef; Moreno, Aitor; Schleucher, Jürgen; Reniero, Fabiano; Holland, Margaret; Guillou, Claude; Silvestre, Virginie; Akoka, Serge; Remaud, Gérald S

    2013-07-25

Isotopic (13)C NMR spectrometry, which can measure intra-molecular (13)C composition, is in growing demand because of the new information provided by the (13)C site-specific content of a given molecule. A systematic evaluation of instrumental behaviour is important if isotopic (13)C NMR is to become a routine tool. This paper describes the first collaborative study of intra-molecular (13)C composition by NMR. The main goal of the ring test was to establish the intra- and inter-instrument variability of the spectrometer response. Eight instruments with different configurations were retained for the exercise on the basis of a qualification test. Reproducibility at natural abundance of isotopic (13)C NMR was then assessed on vanillin from three different origins associated with specific δ(13)Ci profiles. The standard deviation was, on average, between 0.9 and 1.2‰ for intra-instrument variability. The highest standard deviation for inter-instrument variability was 2.1‰. This is significantly higher than the internal precision but can be considered good for a first ring test of a new analytical method. The standard deviation of δ(13)Ci in vanillin was not homogeneous over the eight carbons, with no trend for either the carbon position or the configuration of the spectrometer. However, since the repeatability for each instrument was satisfactory, correction factors for each carbon in vanillin could be calculated to harmonize the results. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Information theoretic comparisons of original and transformed data from Landsat MSS and TM

    NASA Technical Reports Server (NTRS)

    Malila, W. A.

    1985-01-01

The dispersion and concentration of signal values in transformed data from the Landsat-4 MSS and TM instruments are analyzed using a communications-theory approach. Shannon's definition of entropy was used to quantify information, and the concept of mutual information was employed to develop a measure of the information contained in several subsets of variables. Several comparisons of information content are made on the basis of this measure, including system design capacities, the data volume occupied by agricultural data, and the information content of the original bands versus Tasseled Cap variables. A method for analyzing noise effects in MSS and TM data is proposed.
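The two measures named above, Shannon entropy and mutual information, can be computed directly from a discrete joint distribution. A minimal sketch (not the paper's code):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 * log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint probability table."""
    pxy = np.asarray(pxy, dtype=float)
    h_x = entropy(pxy.sum(axis=1))     # marginal of X (rows)
    h_y = entropy(pxy.sum(axis=0))     # marginal of Y (columns)
    return h_x + h_y - entropy(pxy.ravel())
```

For band-comparison purposes, a joint histogram of quantized signal values in two bands plays the role of `pxy`; identical bands give I(X;Y) = H(X), and independent bands give zero.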

  8. On Variable Geometric Factor Systems for Top-Hat Electrostatic Space Plasma Analyzers

    NASA Technical Reports Server (NTRS)

    Kataria, Dhiren O.; Collinson, Glyn A.

    2010-01-01

    Even in the relatively small region of space that is the Earth's magnetosphere, ion and electron fluxes can vary by several orders of magnitude. Top-hat electrostatic analyzers currently do not possess the dynamic range required to sample plasma under all conditions. The purpose of this study was to compare, through computer simulation, three new electrostatic methods that would allow the sensitivity of a sensor to be varied through control of its geometric factor (GF) (much like an aperture on a camera). The methods studied were inner filter plates, split hemispherical analyzer (SHA) and top-cap electrode. This is the first discussion of the filter plate concept and also the first study where all three systems are studied within a common analyzer design, so that their relative merits could be fairly compared. Filter plates were found to have the important advantage that they facilitate the reduction in instrument sensitivity whilst keeping all other instrument parameters constant. However, it was discovered that filter plates have numerous disadvantages that make such a system impracticable for a top-hat electrostatic analyzer. It was found that both the top-cap electrode and SHA are promising variable geometric factor system (VGFS) concepts for implementation into a top-hat electrostatic analyzer, each with distinct advantages over the other.

  9. A framework for assessing Health Economic Evaluation (HEE) quality appraisal instruments

    PubMed Central

    2012-01-01

    Background Health economic evaluations support the health care decision-making process by providing information on costs and consequences of health interventions. The quality of such studies is assessed by health economic evaluation (HEE) quality appraisal instruments. At present, there is no instrument for measuring and improving the quality of such HEE quality appraisal instruments. Therefore, the objectives of this study are to establish a framework for assessing the quality of HEE quality appraisal instruments to support and improve their quality, and to apply this framework to those HEE quality appraisal instruments which have been subject to more scrutiny than others, in order to test the framework and to demonstrate the shortcomings of existing HEE quality appraisal instruments. Methods To develop the quality assessment framework for HEE quality appraisal instruments, the experiences of using appraisal tools for clinical guidelines are used. Based on a deductive iterative process, clinical guideline appraisal instruments identified through literature search are reviewed, consolidated, and adapted to produce the final quality assessment framework for HEE quality appraisal instruments. Results The final quality assessment framework for HEE quality appraisal instruments consists of 36 items organized within 7 dimensions, each of which captures a specific domain of quality. Applying the quality assessment framework to four existing HEE quality appraisal instruments, it is found that these four quality appraisal instruments are of variable quality. Conclusions The framework described in this study should be regarded as a starting point for appraising the quality of HEE quality appraisal instruments. This framework can be used by HEE quality appraisal instrument producers to support and improve the quality and acceptance of existing and future HEE quality appraisal instruments. 
By applying this framework, users of HEE quality appraisal instruments can become aware of methodological deficiencies inherent in existing HEE quality appraisal instruments. These shortcomings of existing HEE quality appraisal instruments are illustrated by the pilot test. PMID:22894708

  10. A variable acceleration calibration system

    NASA Astrophysics Data System (ADS)

    Johnson, Thomas H.

    2011-12-01

A variable acceleration calibration system that applies loads using gravitational and centripetal acceleration serves as an alternative, efficient, and cost-effective method for calibrating internal wind tunnel force balances. Two proof-of-concept variable acceleration calibration systems are designed, fabricated, and tested. The NASA UT-36 force balance served as the test balance for the calibration experiments. The variable acceleration calibration systems are shown to be capable of performing three-component calibration experiments with an approximate applied load error on the order of 1% of the full-scale calibration loads. Sources of error are identified using experimental design methods and a propagation-of-uncertainty analysis. Three types of uncertainty are identified for the systems and are attributed to prediction error, calibration error, and pure error. Angular velocity uncertainty is shown to be the largest identified source of prediction error. The calibration uncertainties using a production variable-acceleration-based system are shown to be potentially equivalent to those of current methods. The production-quality system can be realized using lighter materials and more precise instrumentation. Further research is needed to account for balance deflection, forcing effects due to vibration, and large tare loads. A gyroscope measurement technique is shown to be capable of resolving the balance deflection angle calculation. Long-term research objectives include a demonstration of a six-degree-of-freedom calibration and a large-capacity balance calibration.

  11. REFERQUAL: a pilot study of a new service quality assessment instrument in the GP exercise referral scheme setting

    PubMed Central

    Cock, Don; Adams, Iain C; Ibbetson, Adrian B; Baugh, Phil

    2006-01-01

Background The development of an instrument accurately assessing service quality in the GP Exercise Referral Scheme (ERS) industry could potentially inform scheme organisers of the factors that affect adherence rates, leading to the implementation of strategic interventions aimed at reducing client drop-out. Methods A modified version of the SERVQUAL instrument was designed for use in the ERS setting and subsequently piloted amongst 27 ERS clients. Results Test-retest correlations were calculated via Pearson's r or Spearman's rho, depending on whether the variables were normally distributed, and showed a significant (mean r = 0.957, SD = 0.02, p < 0.05; mean rho = 0.934, SD = 0.03, p < 0.05) relationship between all items within the questionnaire. In addition, satisfactory internal consistency was demonstrated via Cronbach's α. Furthermore, clients responded favourably towards the usability, wording and applicability of the instrument's items. Conclusion REFERQUAL shows promise as a suitable tool for future evaluation of service quality within the ERS community. Future research should further assess the validity and reliability of this instrument through the use of a confirmatory factor analysis to scrutinise the proposed dimensional structure. PMID:16725021
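The internal-consistency statistic reported above, Cronbach's α, has a simple closed form: α = k/(k−1) · (1 − Σ item variances / variance of total score). A generic sketch, not the study's analysis code:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a response matrix of shape
    (n_respondents, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Perfectly parallel items give α = 1; items whose errors cancel in the total score pull α down, which is why α is read as a lower bound on reliability.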

  12. Analysis instrument test on mathematical power the material geometry of space flat side for grade 8

    NASA Astrophysics Data System (ADS)

    Kusmaryono, Imam; Suyitno, Hardi; Dwijanto, Karomah, Nur

    2017-08-01

    The main problem of research to determine the quality of test items on the material side of flat geometry to assess students' mathematical power. The method used is quantitative descriptive. The subjects were students of class 8 as many as 20 students. The object of research is the quality of test items in terms of the power of mathematics: validity, reliability, level of difficulty and power differentiator. Instrument mathematical power ratings are tested include: written tests and questionnaires about the disposition of mathematical power. Data were obtained from the field, in the form of test data on the material geometry of space flat side and questionnaires. The results of the test instrument to the reliability of the test item is influenced by many factors. Factors affecting the reliability of the instrument is the number of items, homogeneity test questions, the time required, the uniformity of conditions of the test taker, the homogeneity of the group, the variability problem, and motivation of the individual (person taking the test). Overall, the evaluation results of this study stated that the test instrument can be used as a tool to measure students' mathematical power.

  13. An Assessment of the Need for Standard Variable Names for Airborne Field Campaigns

    NASA Astrophysics Data System (ADS)

    Beach, A. L., III; Chen, G.; Northup, E. A.; Kusterer, J.; Quam, B. M.

    2017-12-01

The NASA Earth Venture Program has led to a dramatic increase in airborne observations, requiring updated data management practices with clearly defined data standards and protocols for metadata. An airborne field campaign can involve multiple aircraft and a variety of instruments. It is quite common for different instruments/techniques to measure the same parameter on one or more aircraft platforms. This creates a need to allow instrument Principal Investigators (PIs) to name their variables in a way that would distinguish them across various data sets. A lack of standardization of variable names presents a challenge for data search tools in enabling discovery of similar data across airborne studies, aircraft platforms, and instruments; data users have also identified this as one of the top issues in data use. One effective approach to mitigating this problem is to enforce variable name standardization, which can map the unique PI variable names to fixed standard names. To ensure consistency amongst the standard names, it is necessary to choose them from a controlled list. However, no such list currently exists, despite a number of previous efforts to establish a sufficient list of atmospheric variable names. The Atmospheric Composition Variable Standard Name Working Group was established under the auspices of NASA's Earth Science Data Systems Working Group (ESDSWG) to solicit research community feedback and create a list of standard names acceptable to data providers and data users. This presentation will discuss the challenges and recommendations of standard variable names in an effort to demonstrate how airborne metadata curation/management can be improved to streamline data ingest and improve interoperability and discoverability for a broader user community.

  14. Reducing random measurement error in assessing postural load on the back in epidemiologic surveys.

    PubMed

    Burdorf, A

    1995-02-01

    The goal of this study was to design strategies to assess postural load on the back in occupational epidemiology by taking into account the reliability of measurement methods and the variability of exposure among the workers under study. Intermethod reliability studies were evaluated to estimate the systematic bias (accuracy) and random measurement error (precision) of various methods to assess postural load on the back. Intramethod reliability studies were reviewed to estimate random variability of back load over time. Intermethod surveys have shown that questionnaires have a moderate reliability for gross activities such as sitting, whereas duration of trunk flexion and rotation should be assessed by observation methods or inclinometers. Intramethod surveys indicate that exposure variability can markedly affect the reliability of estimates of back load if the estimates are based upon a single measurement over a certain time period. Equations have been presented to evaluate various study designs according to the reliability of the measurement method, the optimum allocation of the number of repeated measurements per subject, and the number of subjects in the study. Prior to a large epidemiologic study, an exposure-oriented survey should be conducted to evaluate the performance of measurement instruments and to estimate sources of variability for back load. The strategy for assessing back load can be optimized by balancing the number of workers under study and the number of repeated measurements per worker.
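The trade-off described above — the reliability of a back-load estimate versus the number of repeated measurements per worker — follows the standard variance-components form R_k = σ²_between / (σ²_between + σ²_within/k). A generic sketch of that relationship (the paper's actual equations are not reproduced here):

```python
def reliability(var_between, var_within, k):
    """Reliability of the mean of k repeated measurements per worker,
    given between-worker and within-worker variance components."""
    return var_between / (var_between + var_within / k)

def repeats_needed(var_between, var_within, target):
    """Smallest number of repeats per worker reaching a target reliability."""
    k = 1
    while reliability(var_between, var_within, k) < target and k < 1000:
        k += 1
    return k
```

For example, with equal between- and within-worker variance, a single measurement yields a reliability of only 0.5, and four repeats are needed to reach 0.8 — the kind of design calculation the abstract recommends before a large epidemiologic study.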

  15. LIMS for Lasers 2015 for achieving long-term accuracy and precision of δ2H, δ17O, and δ18O of waters using laser absorption spectrometry

    USGS Publications Warehouse

    Coplen, Tyler B.; Wassenaar, Leonard I

    2015-01-01

Although laser absorption spectrometry (LAS) instrumentation is easy to use, its incorporation into laboratory operations is not easy, owing to extensive offline manipulation of comma-separated-values files for outlier detection, between-sample memory correction, nonlinearity (δ-variation with water amount) correction, drift correction, normalization to VSMOW-SLAP scales, and difficulty in performing long-term QA/QC audits. METHODS: A Microsoft Access relational-database application, LIMS (Laboratory Information Management System) for Lasers 2015, was developed. It automates LAS data corrections and manages clients, projects, samples, instrument-sample lists, and triple-isotope (δ17O, δ18O, and δ2H values) instrumental data for liquid-water samples. It enables users to (1) graphically evaluate sample injections for variable water yields and high isotope-delta variance; (2) correct for between-sample carryover, instrumental drift, and δ nonlinearity; and (3) normalize final results to VSMOW-SLAP scales. RESULTS: Cost-free LIMS for Lasers 2015 enables users to obtain improved δ17O, δ18O, and δ2H values with liquid-water LAS instruments, even those with under-performing syringes. For example, LAS δ2H (VSMOW) measurements of USGS50 Lake Kyoga (Uganda) water using an under-performing syringe having ±10% variation in water concentration gave +31.7 ± 1.6 ‰ (2-σ standard deviation), compared with the reference value of +32.8 ± 0.4 ‰, after correction for variation in δ value with water concentration, between-sample memory, and normalization to the VSMOW-SLAP scale. CONCLUSIONS: LIMS for Lasers 2015 enables users to create systematic, well-founded instrument templates; import δ2H, δ17O, and δ18O results; evaluate performance with automatic graphical plots; correct for δ nonlinearity due to variable water concentration; correct for between-sample memory; adjust for drift; perform VSMOW-SLAP normalization; and perform long-term QA/QC audits easily.

  16. Implant positioning in TKA: comparison between conventional and patient-specific instrumentation.

    PubMed

    Ferrara, Ferdinando; Cipriani, Antonio; Magarelli, Nicola; Rapisarda, Santi; De Santis, Vincenzo; Burrofato, Aaron; Leone, Antonio; Bonomo, Lorenzo

    2015-04-01

    The number of total knee arthroplasty (TKA) procedures continuously increases, with good to excellent results. In the last few years, new surgical techniques have been developed to improve prosthesis positioning. In this context, patient-specific instrumentation is included. The goal of this study was to compare the perioperative parameters and the spatial positioning of prosthetic components in TKA procedures performed with patient-specific instrumentation vs traditional TKA. In this prospective comparative randomized study, 15 patients underwent TKA with 3-dimensional magnetic resonance imaging (MRI) preoperative planning (patient-specific instrumentation group) and 15 patients underwent traditional TKA (non-patient-specific instrumentation group). All patients underwent postoperative computed tomography (CT) examination. In the patient-specific instrumentation group, preoperative data planning regarding femoral and tibial bone resection was correlated with intraoperative measurements. Surgical time, length of hospitalization, and intraoperative and postoperative bleeding were compared between the 2 groups. Positioning of implants on postoperative CT was assessed for both groups. Data planned with 3-dimensional MRI regarding the depth of bone cuts showed good to excellent correlation with intraoperative measurements. The patient-specific instrumentation group showed better perioperative outcomes and good correlation between the spatial positioning of prosthetic components planned preoperatively and that seen on postoperative CT. Less variability was found in the patient-specific instrumentation group than in the non-patient-specific instrumentation group in spatial orientation of prosthetic components. Preoperative planning with 3-dimensional MRI in TKA has a better perioperative outcome compared with the traditional method. Use of patient-specific instrumentation can also improve the spatial positioning of both prosthetic components. 
Copyright 2015, SLACK Incorporated.

  17. Comparing surgical trays with redundant instruments with trays with reduced instruments: a cost analysis

    PubMed Central

    John-Baptiste, A.; Sowerby, L.J.; Chin, C.J.; Martin, J.; Rotenberg, B.W.

    2016-01-01

    Background: When prearranged standard surgical trays contain instruments that are repeatedly unused, the redundancy can result in unnecessary health care costs. Our objective was to estimate potential savings by performing an economic evaluation comparing the cost of surgical trays with redundant instruments with surgical trays with reduced instruments ("reduced trays"). Methods: We performed a cost-analysis from the hospital perspective over a 1-year period. Using a mathematical model, we compared the direct costs of trays containing redundant instruments to reduced trays for 5 otolaryngology procedures. We incorporated data from several sources including local hospital data on surgical volume, the number of instruments on redundant and reduced trays, wages of personnel and time required to pack instruments. From the literature, we incorporated instrument depreciation costs and the time required to decontaminate an instrument. We performed 1-way sensitivity analyses on all variables, including surgical volume. Costs were estimated in 2013 Canadian dollars. Results: The cost of redundant trays was $21 806 and the cost of reduced trays was $8803, for a 1-year cost saving of $13 003. In sensitivity analyses, cost savings ranged from $3262 to $21 395, based on the surgical volume at the institution. Variation in surgical volume resulted in a wider range of estimates, with a minimum of $3253 for low-volume to a maximum of $52 012 for high-volume institutions. Interpretation: Our study suggests moderate savings may be achieved by reducing surgical tray redundancy and, if applied to other surgical specialties, may result in savings to Canadian health care systems. PMID:27975045
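The structure of such a direct-cost comparison can be sketched generically. All parameter names and numbers below are hypothetical placeholders, not the study's data or model: per-case cost is taken as instruments × (handling time × wage + per-use depreciation), scaled by annual surgical volume.

```python
def annual_tray_cost(volume, n_instruments, pack_min_per_instr,
                     decon_min_per_instr, wage_per_min, depreciation_per_use):
    """Direct cost of one tray configuration over a year.

    Illustrative parameter names and structure only, not the
    published model: each instrument incurs packing and
    decontamination labour plus depreciation on every case.
    """
    per_case = n_instruments * (
        (pack_min_per_instr + decon_min_per_instr) * wage_per_min
        + depreciation_per_use)
    return volume * per_case

# Hypothetical inputs: 300 cases/year, 40-instrument redundant tray
# vs a 15-instrument reduced tray, same per-instrument costs.
redundant = annual_tray_cost(300, 40, 0.5, 1.0, 0.5, 0.10)
reduced = annual_tray_cost(300, 15, 0.5, 1.0, 0.5, 0.10)
savings = redundant - reduced
```

Because cost scales linearly with volume in this form, savings grow proportionally at high-volume institutions — consistent with the wide low-volume/high-volume range reported in the sensitivity analysis above.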

  18. Suwannee River flow variability 1550-2005 CE reconstructed from a multispecies tree-ring network

    NASA Astrophysics Data System (ADS)

    Harley, Grant L.; Maxwell, Justin T.; Larson, Evan; Grissino-Mayer, Henri D.; Henderson, Joseph; Huffman, Jean

    2017-01-01

    Understanding the long-term natural flow regime of rivers enables resource managers to more accurately model water level variability. Models for managing water resources are important in Florida where population increase is escalating demand on water resources and infrastructure. The Suwannee River is the second largest river system in Florida and the least impacted by anthropogenic disturbance. We used new and existing tree-ring chronologies from multiple species to reconstruct mean March-October discharge for the Suwannee River during the period 1550-2005 CE and place the short period of instrumental flows (since 1927 CE) into historical context. We used a nested principal components regression method to maximize the use of chronologies with varying time coverage in the network. Modeled streamflow estimates indicated that instrumental period flow conditions do not adequately capture the full range of Suwannee River flow variability beyond the observational period. Although extreme dry and wet events occurred in the gage record, pluvials and droughts that eclipse the intensity and duration of instrumental events occurred during the 16-19th centuries. The most prolonged and severe dry conditions during the past 450 years occurred during the 1560s CE. In this prolonged drought period mean flow was estimated at 17% of the mean instrumental period flow. Significant peaks in spectral density at 2-7, 10, 45, and 85-year periodicities indicated the important influence of coupled oceanic-atmospheric processes on Suwannee River streamflow over the past four centuries, though the strength of these periodicities varied over time. Future water planning based on current flow expectations could prove devastating to natural and human systems if a prolonged and severe drought mirroring the 16th and 18th century events occurred. 
Future work in the region will focus on updating existing tree-ring chronologies and developing new collections from moisture-sensitive sites to improve understandings of past hydroclimate in the region.

  19. Validation of a Spectral Method for Quantitative Measurement of Color in Protein Drug Solutions.

    PubMed

    Yin, Jian; Swartz, Trevor E; Zhang, Jian; Patapoff, Thomas W; Chen, Bartolo; Marhoul, Joseph; Shih, Norman; Kabakoff, Bruce; Rahimi, Kimia

    2016-01-01

A quantitative spectral method has been developed to precisely measure the color of protein solutions. In this method, a spectrophotometer is utilized to capture the visible absorption spectrum of a protein solution, which can then be converted to color values (L*a*b*) that represent human perception of color in a quantitative three-dimensional space. These quantitative values (L*a*b*) allow for calculating the best match of a sample's color to a European Pharmacopoeia reference color solution. In order to qualify this instrument and assay for use in clinical quality control, a technical assessment was conducted to evaluate the assay suitability and precision. Setting acceptance criteria for this study required development and implementation of a unique statistical method for assessing precision in three-dimensional space. Different instruments, cuvettes, protein solutions, and analysts were compared in this study. The instrument accuracy, repeatability, and assay precision were determined. The instrument and assay are found suitable for use in assessing the color of drug substances and drug products and are comparable to the current European Pharmacopoeia visual assessment method. In the biotechnology industry, a visual assessment is the most commonly used method for color characterization, batch release, and stability testing of liquid protein drug solutions. Using this method, an analyst visually determines the color of the sample by choosing the closest match to a standard color series. This visual method can be subjective because it requires an analyst to make a judgment of the best match of color of the sample to the standard color series, and it does not capture data on hue and chroma that would allow for improved product characterization and the ability to detect subtle differences between samples. 
To overcome these challenges, we developed a quantitative spectral method for color determination that greatly reduces the variability in measuring color and allows for a more precise understanding of color differences. In this study, we established a statistical method for assessing precision in 3-dimensional space and demonstrated that the quantitative spectral method is comparable with respect to precision and accuracy to the current European Pharmacopoeia visual assessment method. © PDA, Inc. 2016.
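Matching a sample's L*a*b* values to the nearest reference solution reduces to a nearest-neighbour search under a color-difference metric. The sketch below uses the simple CIE76 ΔE (Euclidean distance in L*a*b*); the paper's exact matching procedure is not reproduced, and the reference triples in the example are placeholders, not actual European Pharmacopoeia values.

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between L*a*b* triples."""
    return math.dist(lab1, lab2)

def closest_reference(sample_lab, references):
    """Return the name of the reference color solution nearest the sample.

    references: dict mapping reference name -> (L*, a*, b*) triple.
    """
    return min(references, key=lambda name: delta_e76(sample_lab, references[name]))
```

Later CIE formulas (ΔE94, ΔE2000) weight lightness and chroma differences non-uniformly and would slot into `closest_reference` unchanged, since only the distance function differs.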

  20. Comparison and covalidation of ozone anomalies and variability observed in SBUV(/2) and Umkehr northern midlatitude ozone profile estimates

    NASA Astrophysics Data System (ADS)

    Petropavlovskikh, I.; Ahn, Changwoo; Bhartia, P. K.; Flynn, L. E.

    2005-03-01

    This analysis presents comparisons of upper-stratosphere ozone information observed by two independent systems: the Solar Backscatter UltraViolet (SBUV and SBUV/2) satellite instruments, and ground-based Dobson spectrophotometers. Both the new SBUV Version 8 and the new UMK04 profile retrieval algorithms are optimized for studying long-term variability and trends in ozone. Trend analyses of the ozone time series from the SBUV(/2) data set are complex because of the multiple instruments involved, changes in the instruments' geo-location, and short periods of overlap for inter-calibrations among different instruments. Three northern midlatitude Dobson ground stations (Arosa, Boulder, and Tateno) are used in this analysis to validate the trend quality of the combined 25-year SBUV/2 time series, 1979 to 2003. Generally, differences between the satellite and ground-based data do not suggest any significant time-dependent shifts or trends. The shared features confirm the value of these data sets for studies of ozone variability.

  1. Rotatable Small Permanent Magnet Array for Ultra-Low Field Nuclear Magnetic Resonance Instrumentation: A Concept Study.

    PubMed

    Vogel, Michael W; Giorni, Andrea; Vegh, Viktor; Pellicer-Guridi, Ruben; Reutens, David C

    2016-01-01

    We studied the feasibility of generating the variable magnetic fields required for ultra-low field nuclear magnetic resonance relaxometry with dynamically adjustable permanent magnets. Our motivation was to substitute traditional electromagnets by distributed permanent magnets, increasing system portability. The finite element method (COMSOL®) was employed for the numerical study of a small permanent magnet array to calculate achievable magnetic field strength, homogeneity, switching time and magnetic forces. A manually operated prototype was simulated and constructed to validate the numerical approach and to verify the generated magnetic field. A concentric small permanent magnet array can be used to generate strong sample pre-polarisation and variable measurement fields for ultra-low field relaxometry via simple prescribed magnet rotations. Using the array, it is possible to achieve a pre-polarisation field strength above 100 mT and variable measurement fields ranging from 20 to 50 μT with 200 ppm absolute field homogeneity within a field-of-view of 5 × 5 × 5 cm. A dynamic small permanent magnet array can generate multiple highly homogeneous magnetic fields required in ultra-low field nuclear magnetic resonance (NMR) and magnetic resonance imaging (MRI) instruments. This design can significantly reduce the volume and energy requirements of traditional systems based on electromagnets, improving portability considerably.

  2. Validation of the organizational culture assessment instrument.

    PubMed

    Heritage, Brody; Pollock, Clare; Roberts, Lynne

    2014-01-01

    Organizational culture is a commonly studied area in industrial/organizational psychology due to its important role in workplace behaviour, cognitions, and outcomes. Jung et al.'s [1] review of the psychometric properties of organizational culture measurement instruments noted many instruments have limited validation data despite frequent use in both theoretical and applied situations. The Organizational Culture Assessment Instrument (OCAI) has had conflicting data regarding its psychometric properties, particularly regarding its factor structure. Our study examined the factor structure and criterion validity of the OCAI using robust analysis methods on data gathered from 328 (females = 226, males = 102) Australian employees. Confirmatory factor analysis supported a four factor structure of the OCAI for both ideal and current organizational culture perspectives. Current organizational culture data demonstrated expected reciprocally-opposed relationships between three of the four OCAI factors and the outcome variable of job satisfaction but ideal culture data did not, thus indicating possible weak criterion validity when the OCAI is used to assess ideal culture. Based on the mixed evidence regarding the measure's properties, further examination of the factor structure and broad validity of the measure is encouraged.

  3. Domestic violence on children: development and validation of an instrument to evaluate knowledge of health professionals 1

    PubMed Central

    Oliveira, Lanuza Borges; Soares, Fernanda Amaral; Silveira, Marise Fagundes; de Pinho, Lucinéia; Caldeira, Antônio Prates; Leite, Maísa Tavares de Souza

    2016-01-01

    ABSTRACT Objective: to develop and validate an instrument to evaluate the knowledge of health professionals about domestic violence on children. Method: this was a study conducted with 194 physicians, nurses and dentists. A literature review was performed for preparation of the items and identification of the dimensions. Apparent and content validation was performed using analysis of three experts and 27 professors of the pediatric health discipline. For construct validation, Cronbach's alpha was used, and the Kappa test was applied to verify reproducibility. The criterion validation was conducted using the Student's t-test. Results: the final instrument included 56 items; the Cronbach alpha was 0.734, the Kappa test showed a correlation greater than 0.6 for most items, and the Student t-test showed statistical significance at the 5% level for the two selected variables: years of education and working in the Family Health Strategy. Conclusion: the instrument is valid and can be used as a promising tool to develop or direct actions in public health and evaluate knowledge about domestic violence on children. PMID:27556878

  4. Validation of the Organizational Culture Assessment Instrument

    PubMed Central

    Heritage, Brody; Pollock, Clare; Roberts, Lynne

    2014-01-01

    Organizational culture is a commonly studied area in industrial/organizational psychology due to its important role in workplace behaviour, cognitions, and outcomes. Jung et al.'s [1] review of the psychometric properties of organizational culture measurement instruments noted many instruments have limited validation data despite frequent use in both theoretical and applied situations. The Organizational Culture Assessment Instrument (OCAI) has had conflicting data regarding its psychometric properties, particularly regarding its factor structure. Our study examined the factor structure and criterion validity of the OCAI using robust analysis methods on data gathered from 328 (females = 226, males = 102) Australian employees. Confirmatory factor analysis supported a four factor structure of the OCAI for both ideal and current organizational culture perspectives. Current organizational culture data demonstrated expected reciprocally-opposed relationships between three of the four OCAI factors and the outcome variable of job satisfaction but ideal culture data did not, thus indicating possible weak criterion validity when the OCAI is used to assess ideal culture. Based on the mixed evidence regarding the measure's properties, further examination of the factor structure and broad validity of the measure is encouraged. PMID:24667839

  5. System and method for correcting attitude estimation

    NASA Technical Reports Server (NTRS)

    Josselson, Robert H. (Inventor)

    2010-01-01

    A system includes an angular rate sensor disposed in a vehicle for providing angular rates of the vehicle, and an instrument disposed in the vehicle for providing line-of-sight control with respect to a line-of-sight reference. The instrument includes an integrator which is configured to integrate the angular rates of the vehicle to form non-compensated attitudes. Also included is a compensator coupled across the integrator, in a feed-forward loop, for receiving the angular rates of the vehicle and outputting compensated angular rates of the vehicle. A summer combines the non-compensated attitudes and the compensated angular rates of the vehicle to form estimated vehicle attitudes for controlling the instrument with respect to the line-of-sight reference. The compensator is configured to provide error compensation to the instrument free of any feedback loop that uses an error signal. The compensator may include a transfer function providing a fixed gain to the received angular rates of the vehicle. The compensator may, alternatively, include a transfer function providing a variable gain as a function of frequency to operate on the received angular rates of the vehicle.

  6. Analysis of intraosseous samples using point of care technology--an experimental study in the anaesthetised pig.

    PubMed

    Strandberg, Gunnar; Eriksson, Mats; Gustafsson, Mats G; Lipcsey, Miklós; Larsson, Anders

    2012-11-01

    Intraosseous access is an essential method in emergency medicine when other forms of vascular access are unavailable and there is an urgent need for fluid or drug therapy. A number of publications have discussed the suitability of using intraosseous access for laboratory testing. We aimed to further evaluate this issue and to study the accuracy and precision of intraosseous measurements. Five healthy, anaesthetised pigs were instrumented with bilateral tibial intraosseous cannulae and an arterial catheter. Samples were collected hourly for 6 h and analysed for blood gases, acid base status, haemoglobin and electrolytes using an I-Stat point of care analyser. There was no clinically relevant difference between results from left and right intraosseous sites. The variability of the intraosseous sample values, measured as the coefficient of variation (CV), was maximally 11%, and smaller than for the arterial sample values for all variables except SO2. For most variables, there seems to be some degree of systematic difference between intraosseous and arterial results. However, the direction of this difference seems to be predictable. Based on our findings in this animal model, cartridge based point of care instruments appear suitable for the analysis of intraosseous samples. The agreement between intraosseous and arterial analysis seems to be good enough for the method to be clinically useful. The precision, quantified in terms of CV, is at least as good for intraosseous as for arterial analysis. There is no clinically important difference between samples from left and right tibia, indicating a good reproducibility. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
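The precision comparison in the record above rests on the coefficient of variation (CV), the standard deviation expressed as a percentage of the mean. A minimal sketch of that statistic; the repeated readings below are hypothetical and not from the study:

```python
import statistics

def coefficient_of_variation(values):
    """CV in percent: sample standard deviation relative to the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical repeated potassium readings (mmol/L) from one
# intraosseous site, for illustration only.
io_samples = [4.1, 4.3, 4.0, 4.2, 4.4, 4.2]
print(round(coefficient_of_variation(io_samples), 1))  # → 3.4
```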

  7. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campos, E; Sisterson, Douglas

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. Note that some of the instrument observations require mathematical algorithms (retrievals) to convert a measured engineering variable into a useful geophysical measurement. While those types of retrieval measurements are identified, this study does not address particular methods for retrieval uncertainty. The ARM Facility also provides engineered data products, or value-added products (VAPs), based on multiple instrument measurements. This study does not include uncertainty estimates for those data products. We propose here that a total measurement uncertainty should be calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). The study will not expand on methods for computing these uncertainties. Instead, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available in the ARM community through the ARM instrument mentors and their ARM instrument handbooks. 
As a result, this study will address the first steps towards reporting ARM measurement uncertainty: 1) identifying how the uncertainty of individual ARM measurements is currently expressed, 2) identifying a consistent approach to measurement uncertainty, and then 3) reclassifying ARM instrument measurement uncertainties in a common framework.

  8. Percentage exposure of root dentin collagen after application of two irrigation protocols with manual or rotary instrumentation and two methacrylate resin-based sealers.

    PubMed

    González-López, Santiago; Martín-Altuve, Ernesto; Bolaños-Carmona, Victoria; Sánchez-Sánchez, Purificación; Rodríguez-Navarro, Alejandro

    2013-10-01

    To compare the percentage of collagen exposed in dentin root thirds after two irrigation protocols with manual or rotary instrumentation using two methacrylate resin-based sealers. Forty-eight single-root human teeth were prepared with manual (n = 24) or nickel-titanium ProFile rotary (n = 24) instrumentation, using 5% NaOCl between instruments and 5 ml 17% EDTA as final irrigant, or 20% citric acid + 2% chlorhexidine (CHX) between instruments and as the final irrigant. RealSeal or EndoREZ were used as filling materials. One 1-mm slice per third was abraded and stained with Masson's trichrome method. Mean exposed collagen values were obtained in four areas from each section (at 60X magnification) and a complete factorial ANOVA was used to analyze the influence of the study variables. The non-parametric Mann-Whitney test was used to compare groups. Differences with p < 0.05 were considered significant. A significantly higher percentage of collagen was exposed in all thirds with the use of the 20% citric acid + 2% CHX protocol with rotary vs manual instrumentation, but percent collagen exposed did not differ as a function of the filling material. After the 5% NaOCl + 17% EDTA protocol, the percentage of collagen exposed did not differ between rotary and manual instrumentation but was higher with the use of RealSeal. The highest percentage exposure of collagen was with 20% citric acid + 2% CHX using rotary instrumentation, regardless of the filling material.

  9. Investigation of Music Student Efficacy as Influenced by Age, Experience, Gender, Ethnicity, and Type of Instrument Played in South Carolina

    ERIC Educational Resources Information Center

    White, Norman

    2010-01-01

    The purpose of this research study was to quantitatively examine South Carolina high school instrumental music students' self-efficacy as measured by the Generalized Self-Efficacy (GSE) instrument (Schwarzer & Jerusalem, 1993). The independent variables of age, experience, gender, ethnicity, and type of instrument played were correlated with…

  10. Flux-gate magnetometer spin axis offset calibration using the electron drift instrument

    NASA Astrophysics Data System (ADS)

    Plaschke, Ferdinand; Nakamura, Rumi; Leinweber, Hannes K.; Chutter, Mark; Vaith, Hans; Baumjohann, Wolfgang; Steller, Manfred; Magnes, Werner

    2014-10-01

    Spin-stabilization of spacecraft immensely supports the in-flight calibration of on-board flux-gate magnetometers (FGMs). From 12 calibration parameters in total, 8 can be easily obtained by spectral analysis. From the remaining 4, the spin axis offset is known to be particularly variable. It is usually determined by analysis of Alfvénic fluctuations that are embedded in the solar wind. In the absence of solar wind observations, the spin axis offset may be obtained by comparison of FGM and electron drift instrument (EDI) measurements. The aim of our study is to develop methods that are readily usable for routine FGM spin axis offset calibration with EDI. This paper represents a major step forward in this direction. We improve an existing method to determine FGM spin axis offsets from EDI time-of-flight measurements by providing it with a comprehensive error analysis. In addition, we introduce a new, complementary method that uses EDI beam direction data instead of time-of-flight data. Using Cluster data, we show that both methods yield similarly accurate results, which are comparable yet more stable than those from a commonly used solar wind-based method.

  11. Cosmological Distance Scale to Gamma-Ray Bursts

    NASA Astrophysics Data System (ADS)

    Azzam, W. J.; Linder, E. V.; Petrosian, V.

    1993-05-01

    The source counts or the so-called log N -- log S relations are the primary data that constrain the spatial distribution of sources with unknown distances, such as gamma-ray bursts. In order to test galactic, halo, and cosmological models for gamma-ray bursts we compare theoretical characteristics of the log N -- log S relations to those obtained from data gathered by the BATSE instrument on board the Compton Observatory (GRO) and other instruments. We use a new and statistically correct method that takes proper account of the variable nature of the triggering threshold to analyze the data. Constraints on models obtained by this comparison will be presented. This work is supported by NASA grants NAGW 2290, NAG5 2036, and NAG5 1578.

  12. State variations in Medicaid enrollment and utilization of substance use services: Results from a National Longitudinal Study.

    PubMed

    Mojtabai, Ramin; Feder, Kenneth A; Kealhofer, Marc; Krawczyk, Noa; Storr, Carla; Tormohlen, Kayla N; Young, Andrea S; Olfson, Mark; Crum, Rosa M

    2018-06-01

    Medicaid enrollment varies considerably among states. This study examined the association of Medicaid enrollment with the use of substance use services in the longitudinal National Epidemiologic Survey on Alcohol and Related Conditions of 2001-2005. Instrumental variable methods were used to assess endogeneity of individual-level Medicaid enrollment using state-level data as instruments. Compared to the uninsured, Medicaid-covered adults were more likely to use substance use disorder treatment services over the next three years. States that have opted to expand Medicaid enrollment under the Affordable Care Act will likely experience further increases in the use of these services in the coming years. Copyright © 2018 Elsevier Inc. All rights reserved.
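The instrumental-variable analysis described above can be sketched with two-stage least squares (2SLS) on simulated data: the instrument shifts treatment but affects the outcome only through treatment, so the second-stage coefficient recovers the causal effect despite an unmeasured confounder. The data-generating process, effect sizes, and instrument below are invented for illustration and are not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulated data: U is an unmeasured confounder, Z an instrument
# (e.g. a state-level enrollment rule) that affects treatment D but
# has no direct path to the outcome Y.  True causal effect of D is 2.0.
U = rng.normal(size=n)
Z = rng.binomial(1, 0.5, size=n).astype(float)
D = (0.8 * Z + 0.5 * U + rng.normal(size=n) > 0.5).astype(float)
Y = 2.0 * D + 1.0 * U + rng.normal(size=n)

def two_stage_least_squares(Y, D, Z):
    """Minimal 2SLS with a single instrument and an intercept."""
    X1 = np.column_stack([np.ones_like(Z), Z])
    d_hat = X1 @ np.linalg.lstsq(X1, D, rcond=None)[0]   # first stage
    X2 = np.column_stack([np.ones_like(d_hat), d_hat])
    return np.linalg.lstsq(X2, Y, rcond=None)[0][1]      # second stage

naive = np.polyfit(D, Y, 1)[0]        # OLS slope, biased upward by U
iv = two_stage_least_squares(Y, D, Z)  # close to the true effect 2.0
print(round(naive, 2), round(iv, 2))
```

The naive regression absorbs the confounder and overstates the effect, while the 2SLS estimate stays near the true value of 2.0 (up to sampling noise).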

  13. Timing Analysis with INTEGRAL: Comparing Different Reconstruction Algorithms

    NASA Technical Reports Server (NTRS)

    Grinberg, V.; Kreykenboehm, I.; Fuerst, F.; Wilms, J.; Pottschmidt, K.; Bel, M. Cadolle; Rodriquez, J.; Marcu, D. M.; Suchy, S.; Markowitz, A.; hide

    2010-01-01

    INTEGRAL is one of the few instruments capable of detecting X-rays above 20 keV. It is therefore in principle well suited for studying X-ray variability in this regime. Because INTEGRAL uses coded mask instruments for imaging, the reconstruction of light curves of X-ray sources is highly non-trivial. We present results from the comparison of two commonly employed algorithms, which primarily measure flux from mask deconvolution (ii-lc-extract) and from calculating the pixel illuminated fraction (ii-light). Both methods agree well for timescales above about 10 s, the highest time resolution for which image reconstruction is possible. For higher time resolution, ii-light produces meaningful results, although the overall variance of the light curves is not preserved.

  14. Hemoglobin A1c Point-of-Care Assays; a New World with a Lot of Consequences!

    PubMed Central

    Lenters-Westra, Erna; Slingerland, Robbert J.

    2009-01-01

    Background Point-of-care instruments for the measurement of hemoglobin A1c (HbA1c) may improve the glycemic control of people with diabetes by providing a rapid result if the performance of the instruments used is acceptable. A 0.5% HbA1c difference between successive results is considered a clinically relevant change. With this in mind, the In2it from Bio-Rad and the DCA Vantage from Siemens were evaluated according to Clinical and Laboratory Standards Institute (CLSI) protocols. Methods The CLSI protocols EP-5 and EP-9 were applied to investigate precision, accuracy, and bias. The bias was compared with three certified secondary reference measurement procedures. Differences between capillary and venous blood were investigated by an end-user group consisting of nurse practitioners at a diabetes care center. Results At HbA1c levels of 5.1 and 11.2%, total coefficients of variation (CV) for the In2it were 4.9 and 3.3%, respectively, and for the DCA Vantage were 1.7 to 1.8% and 3.7 to 5.5% depending on the lot number of the cartridges. Method comparisons showed significant lot number-dependent results for the In2it and the DCA Vantage compared with the three reference methods. No overall difference was observed between capillary and venous blood for both methods. Conclusion The In2it and the DCA Vantage showed variable, lot number-dependent performance. To maintain the interlaboratory CV of 5% for HbA1c, the Clinical Laboratory Improvement Amendments rules for waived point-of-care instruments should be revised. An obligation to participate in external quality schemes and take adequate action should be considered for POC instruments that perform poorly. PMID:20144277

  15. Interpretation of tropospheric ozone variability in data with different vertical and temporal resolution

    NASA Astrophysics Data System (ADS)

    Petropavlovskikh, I. V.; Disterhoft, P.; Johnson, B. J.; Rieder, H. E.; Manney, G. L.; Daffer, W.

    2012-12-01

    This work attributes tropospheric ozone variability derived from the ground-based Dobson and Brewer Umkehr measurements and from ozonesonde data to local sources and transport. It assesses capabilities and limitations in both types of measurements that are often used to analyze long- and short-term variability in tropospheric ozone time series. We will address the natural and instrument-related contributions to the variability found in both Umkehr and sonde data. Validation of Umkehr methods is often done by intercomparisons against independent ozone measuring techniques such as ozone sounding. We will use ozone soundings, in their original and averaging-kernel (AK)-smoothed vertical profiles, for assessment of ozone inter-annual variability over Boulder, CO. We will discuss possible reasons for differences between ozone measuring techniques and their effects on the derived ozone trends. In addition to standard evaluation techniques, we utilize an STL-decomposition method to address temporal variability and trends in the Boulder Umkehr data. Further, we apply a statistical modeling approach to the ozone data set to attribute ozone variability to individual driving forces associated with natural and anthropogenic causes. To this aim we follow earlier work applying a backward selection method (i.e., a stepwise elimination procedure out of a set of 44 explanatory variables in total) to determine those explanatory variables which contribute most significantly to the observed variability. We will also present some results associated with the completeness (sampling rate) of the existing data sets. We will also use MERRA (Modern-Era Retrospective analysis for Research and Applications) re-analysis results selected for the Boulder location as a transfer function in understanding the effects that temporal sampling and vertical resolution bring into trend and ozone variability analysis. 
Analyzing intra-annual variability in ozone measurements over Boulder, CO, in relation to the upper tropospheric subtropical and polar jets, we will address the stratospheric and tropospheric intrusions in the middle latitude troposphere ozone field.

  16. Outcomes of Posterolateral Fusion with and without Instrumentation and of Interbody Fusion for Isthmic Spondylolisthesis: A Prospective Study.

    PubMed

    Endler, Peter; Ekman, Per; Möller, Hans; Gerdhem, Paul

    2017-05-03

    Various methods for the treatment of isthmic spondylolisthesis are available. The aim of this study was to compare outcomes after posterolateral fusion without instrumentation, posterolateral fusion with instrumentation, and interbody fusion. The Swedish Spine Register was used to identify 765 patients who had been operated on for isthmic spondylolisthesis and had at least preoperative and 2-year outcome data; 586 of them had longer follow-up (a mean of 6.9 years). The outcome measures were a global assessment of leg and back pain, the Oswestry Disability Index (ODI), the EuroQol-5 Dimensions (EQ-5D) Questionnaire, the Short Form-36 (SF-36), a visual analog scale (VAS) for back and leg pain, and satisfaction with treatment. Data on additional lumbar spine surgery were searched for in the register, with the mean duration of follow-up for this variable being 10.6 years after the index procedure. Statistical analyses were performed with analysis of covariance or competing-risks proportional hazards regression, adjusted for baseline differences in the studied variables, smoking, employment status, and level of fusion. Posterolateral fusion without instrumentation was performed in 102 patients; posterolateral fusion with instrumentation, in 452; and interbody fusion, in 211. At 1 year, improvement was reported in the global assessment for back pain by 54% of the patients who had posterolateral fusion without instrumentation, 68% of those treated with posterolateral fusion with instrumentation, and 70% of those treated with interbody fusion (p = 0.009). The VAS for back pain and reported satisfaction with treatment showed similar patterns (p = 0.003 and p = 0.017, respectively), whereas other outcomes did not differ among the treatment groups at 1 year. 
At 2 years, the global assessment for back pain indicated improvement in 57% of the patients who had undergone posterolateral fusion without instrumentation, 70% of those who had posterolateral fusion with instrumentation, and 71% of those treated with interbody fusion (p = 0.022). There were no significant outcome differences at the mean 6.9-year follow-up interval. There was an increased hazard ratio for additional lumbar spine surgery after interbody fusion (4.34; 95% confidence interval [CI] = 1.71 to 11.03) and posterolateral fusion with instrumentation (2.56; 95% CI = 1.02 to 6.42) compared with after posterolateral fusion without instrumentation (1.00; reference). Fusion with instrumentation, with or without interbody fusion, was associated with more improvement in back pain scores and higher satisfaction with treatment compared with fusion without instrumentation at 1 year, but the difference was attenuated with longer follow-up. Fusion with instrumentation was associated with a significantly higher risk of additional spine surgery. Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.

  17. Mapping Variables.

    ERIC Educational Resources Information Center

    Stone, Mark H.; Wright, Benjamin D.; Stenner, A. Jackson

    1999-01-01

    Describes mapping variables, the principal technique for planning and constructing a test or rating instrument. A variable map is also useful for interpreting results. Provides several maps to show the importance and value of mapping a variable by person and item data. (Author/SLD)

  18. Variably Transmittive, Electronically-Controlled Eyewear

    NASA Technical Reports Server (NTRS)

    Chapman, John J. (Inventor); Glaab, Louis J. (Inventor); Schott, Timothy D. (Inventor); Howell, Charles T. (Inventor); Fleck, Vincent J. (Inventor)

    2013-01-01

    A system and method for flight training and evaluation of pilots comprises electronically activated vision restriction glasses that detect the pilot's head position and automatically darken and restrict the pilot's ability to see through the front and side windscreens when the pilot-in-training attempts to see out the windscreen. Thus, the pilot-in-training sees only within the aircraft cockpit, forcing him or her to fly by instruments in the most restricted operational mode.

  19. A rapid and simple method for the determination of psychoactive alkaloids by CE-UV: application to Peganum harmala seed infusions.

    PubMed

    Tascón, Marcos; Benavente, Fernando; Vizioli, Nora M; Gagliardi, Leonardo G

    2017-04-01

    The β-carboline alkaloids of the harmala group (HAlks) are compounds widely spread across many natural sources, but found at relatively high levels in some specific plants such as Peganum harmala (Syrian rue) and Banisteriopsis caapi. HAlks are reversible monoamine oxidase type A inhibitors (MAOIs) and, as a consequence, these plants or their extracts can be used to produce psychotropic effects when combined with psychotropic drugs based on amino groups. Since the occurrence and levels of HAlks in natural sources are subject to significant variability, their more widespread use is not clinical but recreational or ritual; for example, B. caapi is a known part of the Ayahuasca ritual mixture. The lack of simple methods to control the variable levels of these compounds in natural sources restricts the possibility of dosing in strict quantities and, as a consequence, limits their use for pharmacological or clinical purposes. In this work, we present a fast, simple, and robust method for simultaneously quantifying the six HAlks most frequently found in plants, i.e., harmine, harmaline, harmol, harmalol, harmane, and norharmane, by capillary electrophoresis instruments equipped with the common UV detector. The method is applied to analyze these HAlks in P. harmala seed infusions, a frequent intake form for these HAlks. The method was validated on three different instruments in order to evaluate transferability and to compare performance between them. In this case, harmaline, harmine, and harmol were found in the infusion samples. Copyright © 2016 John Wiley & Sons, Ltd.

  20. No Time for Dead Time: Use the Fourier Amplitude Differences to Normalize Dead-time-affected Periodograms

    NASA Astrophysics Data System (ADS)

    Bachetti, Matteo; Huppenkothen, Daniela

    2018-02-01

    Dead time affects many of the instruments used in X-ray astronomy, by producing a strong distortion in power density spectra. This can make it difficult to model the aperiodic variability of the source or look for quasi-periodic oscillations. Whereas in some instruments a simple a priori correction for dead-time-affected power spectra is possible, this is not the case for others such as NuSTAR, where the dead time is non-constant and long (∼2.5 ms). Bachetti et al. (2015) suggested the cospectrum obtained from light curves of independent detectors within the same instrument as a possible way out, but this solution has always only been a partial one: the measured rms was still affected by dead time because the width of the power distribution of the cospectrum was modulated by dead time in a frequency-dependent way. In this Letter, we suggest a new, powerful method to normalize dead-time-affected cospectra and power density spectra. Our approach uses the difference of the Fourier amplitudes from two independent detectors to characterize and filter out the effect of dead time. This method is crucially important for the accurate modeling of periodograms derived from instruments affected by dead time on board current missions like NuSTAR and Astrosat, but also future missions such as IXPE.
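The cospectrum idea referenced above (Bachetti et al. 2015) — combining Fourier amplitudes from two independent detectors so that uncorrelated noise, such as dead-time distortion, averages toward zero while the shared source signal survives — can be sketched as follows. The simulated light curves are illustrative only, and the FAD normalization proposed in the record is more involved than this basic cospectrum:

```python
import numpy as np

def cospectrum(lc1, lc2):
    """Real part of the cross spectrum of two simultaneous light curves,
    e.g. from independent detectors within the same instrument."""
    f1 = np.fft.rfft(lc1 - lc1.mean())
    f2 = np.fft.rfft(lc2 - lc2.mean())
    return (f1 * np.conj(f2)).real

rng = np.random.default_rng(1)
n = 4096
# Shared source signal: a sinusoid at 0.1 cycles/bin on a constant level.
signal = 50.0 + 5.0 * np.sin(2 * np.pi * 0.1 * np.arange(n))
# Each detector sees the same signal plus its own independent counting noise.
lc1 = rng.poisson(signal).astype(float)
lc2 = rng.poisson(signal).astype(float)

cs = cospectrum(lc1, lc2)
# The correlated sinusoid dominates the cospectrum; detector-specific
# noise contributes terms of random sign that average out.
peak_freq = np.fft.rfftfreq(n)[np.argmax(cs[1:]) + 1]
print(round(peak_freq, 3))  # → 0.1
```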

  1. Assessing the impact of natural policy experiments on socioeconomic inequalities in health: how to apply commonly used quantitative analytical methods?

    PubMed

    Hu, Yannan; van Lenthe, Frank J; Hoffmann, Rasmus; van Hedel, Karen; Mackenbach, Johan P

    2017-04-20

    The scientific evidence-base for policies to tackle health inequalities is limited. Natural policy experiments (NPE) have drawn increasing attention as a means to evaluating the effects of policies on health. Several analytical methods can be used to evaluate the outcomes of NPEs in terms of average population health, but it is unclear whether they can also be used to assess the outcomes of NPEs in terms of health inequalities. The aim of this study therefore was to assess whether, and to demonstrate how, a number of commonly used analytical methods for the evaluation of NPEs can be applied to quantify the effect of policies on health inequalities. We identified seven quantitative analytical methods for the evaluation of NPEs: regression adjustment, propensity score matching, difference-in-differences analysis, fixed effects analysis, instrumental variable analysis, regression discontinuity and interrupted time-series. We assessed whether these methods can be used to quantify the effect of policies on the magnitude of health inequalities either by conducting a stratified analysis or by including an interaction term, and illustrated both approaches in a fictitious numerical example. All seven methods can be used to quantify the equity impact of policies on absolute and relative inequalities in health by conducting an analysis stratified by socioeconomic position, and all but one (propensity score matching) can be used to quantify equity impacts by inclusion of an interaction term between socioeconomic position and policy exposure. Methods commonly used in economics and econometrics for the evaluation of NPEs can also be applied to assess the equity impact of policies, and our illustrations provide guidance on how to do this appropriately. The low external validity of results from instrumental variable analysis and regression discontinuity makes these methods less desirable for assessing policy effects on population-level health inequalities. 
Increased use of the methods in social epidemiology will help to build an evidence base to support policy making in the area of health inequalities.
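    The interaction-term approach described above can be sketched on fabricated data (all effect sizes and variable names are invented): the coefficient on the SEP-by-policy interaction estimates the differential, i.e. equity, impact of the policy.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000
low_sep = rng.integers(0, 2, n)            # 1 = low socioeconomic position
policy = rng.integers(0, 2, n)             # 1 = exposed to the policy
# fabricated outcome: the policy improves health by 1 unit overall and by an
# extra 0.5 units in the low-SEP group (the "equity impact")
health = (10 - 2 * low_sep + 1.0 * policy + 0.5 * policy * low_sep
          + rng.normal(0, 1, n))

# design matrix: intercept, SEP, policy, SEP x policy interaction
X = np.column_stack([np.ones(n), low_sep, policy, low_sep * policy])
beta, *_ = np.linalg.lstsq(X, health, rcond=None)
equity_impact = beta[3]   # differential policy effect for the low-SEP group
```

    The same design matrix carries over to difference-in-differences or fixed-effects settings by adding the appropriate time and group terms.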

  2. An instrumental variable approach finds no associated harm or benefit from early dialysis initiation in the United States

    PubMed Central

    Scialla, Julia J.; Liu, Jiannong; Crews, Deidra C.; Guo, Haifeng; Bandeen-Roche, Karen; Ephraim, Patti L.; Tangri, Navdeep; Sozio, Stephen M.; Shafi, Tariq; Miskulin, Dana C.; Michels, Wieneke M.; Jaar, Bernard G.; Wu, Albert W.; Powe, Neil R.; Boulware, L. Ebony

    2014-01-01

    The estimated glomerular filtration rate (eGFR) at dialysis initiation has been rising. Observational studies suggest harm but may be confounded by unmeasured factors. As instrumental variable methods may be less biased, we performed a retrospective cohort study of 310,932 patients starting dialysis between 2006 and 2008 and registered in the United States Renal Data System in order to describe geographic variation in eGFR at dialysis initiation and determine its association with mortality. Patients were grouped into 804 health service areas by zip code. Individual eGFR at dialysis initiation averaged 10.8 ml/min/1.73m2 but varied geographically. Only 11% of the variation in mean health-service-area-level eGFR at dialysis initiation was accounted for by patient characteristics. We calculated demographic-adjusted mean eGFR at dialysis initiation in the health service areas using the 2006 and 2007 incident cohorts as our instrument and estimated the association between individual eGFR at dialysis initiation and mortality in the 2008 incident cohort using the two-stage residual inclusion method. Among 89,547 patients starting dialysis in 2008 with eGFR 5 to 20 ml/min/1.73m2, eGFR at initiation was not associated with mortality over a median of 15.5 months [hazard ratio 1.025 per 1 ml/min/1.73m2 for eGFR 5 to 14 ml/min/1.73m2; and 0.973 per 1 ml/min/1.73m2 for eGFR 14 to 20 ml/min/1.73m2]. Thus, there was no associated harm or benefit from early dialysis initiation in the United States. PMID:24786707
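    The two-stage residual inclusion estimator can be sketched in a simplified linear setting (fabricated data; the actual study instrumented with area-level mean eGFR and modeled a survival outcome). Stage one regresses the treatment on the instrument; stage two adds the stage-one residual to the outcome model so that it absorbs the unmeasured confounding.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
z = rng.normal(0, 1, n)                    # instrument: area-level practice style
u = rng.normal(0, 1, n)                    # unmeasured confounder
egfr = 10 + 1.0 * z + 1.0 * u + rng.normal(0, 1, n)
# true causal effect of eGFR at initiation is 0 here, but u confounds the
# naive regression of outcome on eGFR
outcome = 0.0 * egfr + 2.0 * u + rng.normal(0, 1, n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# stage 1: treatment on instrument; keep the residual
b1 = ols(np.column_stack([np.ones(n), z]), egfr)
resid = egfr - (b1[0] + b1[1] * z)
# stage 2: outcome on treatment plus the first-stage residual
b2 = ols(np.column_stack([np.ones(n), egfr, resid]), outcome)
naive = ols(np.column_stack([np.ones(n), egfr]), outcome)

causal_estimate = b2[1]    # close to the true 0
naive_estimate = naive[1]  # pulled away from 0 by confounding
```

    In this linear case 2SRI coincides with two-stage least squares; its advantage shows up in nonlinear outcome models such as the hazard model used in the study.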

  3. Predicting College Women's Career Plans: Instrumentality, Work, and Family

    ERIC Educational Resources Information Center

    Savela, Alexandra E.; O'Brien, Karen M.

    2016-01-01

    This study examined how college women's instrumentality and expectations about combining work and family predicted early career development variables. Specifically, 177 undergraduate women completed measures of instrumentality (i.e., traits such as ambition, assertiveness, and risk taking), willingness to compromise career for family, anticipated…

  4. A critical appraisal of instruments to measure outcomes of interprofessional education.

    PubMed

    Oates, Matthew; Davidson, Megan

    2015-04-01

    Interprofessional education (IPE) is believed to prepare health professional graduates for successful collaborative practice. A range of instruments have been developed to measure the outcomes of IPE. An understanding of the psychometric properties of these instruments is important if they are to be used to measure the effectiveness of IPE. This review set out to identify instruments available to measure outcomes of IPE and collaborative practice in pre-qualification health professional students and to critically appraise the psychometric properties of validity, responsiveness and reliability against contemporary standards for instrument design. Instruments were selected from a pool of extant instruments and subjected to critical appraisal to determine whether they satisfied inclusion criteria. The qualitative and psychometric attributes of the included instruments were appraised using a checklist developed for this review. Nine instruments were critically appraised, including the widely adopted Readiness for Interprofessional Learning Scale (RIPLS) and the Interdisciplinary Education Perception Scale (IEPS). Validity evidence for instruments was predominantly based on test content and internal structure. Ceiling effects and lack of scale width contribute to the inability of some instruments to detect change in variables of interest. Limited reliability data were reported for two instruments. Scale development and scoring protocols were generally reported by instrument developers, but the inconsistent application of scoring protocols for some instruments was apparent. A number of instruments have been developed to measure outcomes of IPE in pre-qualification health professional students. Based on reported validity evidence and reliability data, the psychometric integrity of these instruments is limited. 
The theoretical test construction paradigm on which instruments have been developed may be contributing to the failure of some instruments to detect change in variables of interest following an IPE intervention. These limitations should be considered in any future research on instrument design. © 2015 John Wiley & Sons Ltd.

  5. Towards scar-free surgery: An analysis of the increasing complexity from laparoscopic surgery to NOTES

    PubMed Central

    Chellali, Amine; Schwaitzberg, Steven D.; Jones, Daniel B.; Romanelli, John; Miller, Amie; Rattner, David; Roberts, Kurt E.; Cao, Caroline G.L.

    2014-01-01

    Background NOTES is an emerging technique for performing surgical procedures, such as cholecystectomy. Debate about its real benefit over the traditional laparoscopic technique is ongoing. There have been several clinical studies comparing NOTES to conventional laparoscopic surgery. However, no work has been done to compare these techniques from a Human Factors perspective. This study presents a systematic analysis describing and comparing different existing NOTES methods to laparoscopic cholecystectomy. Methods Videos of endoscopic/laparoscopic views from fifteen live cholecystectomies were analyzed to conduct a detailed task analysis of the NOTES technique. A hierarchical task analysis of laparoscopic cholecystectomy and several hybrid transvaginal NOTES cholecystectomies was performed and validated by expert surgeons. To identify similarities and differences between these techniques, their hierarchical decomposition trees were compared. Finally, a timeline analysis was conducted to compare the steps and substeps. Results At least three variations of the NOTES technique were used for cholecystectomy. Differences between the observed techniques at the substep level of hierarchy and in the instruments being used were found. The timeline analysis showed an increase in time to perform some surgical steps and substeps in NOTES compared to laparoscopic cholecystectomy. Conclusion As pure NOTES is extremely difficult given the current state of development in instrumentation design, most surgeons utilize different hybrid methods – a combination of endoscopic and laparoscopic instruments/optics. Our hierarchical task analysis identified three different hybrid methods for performing cholecystectomy, with significant variability amongst them. The varying degrees to which laparoscopic instruments are utilized to assist in NOTES methods appear to introduce different technical issues and additional tasks, leading to an increase in surgical time. 
The NOTES continuum of invasiveness is proposed here as a classification scheme for these methods, which was used to construct a clear roadmap for training and technology development. PMID:24902811

  6. Instrumentation Failure after Partial Corpectomy with Instrumentation of a Metastatic Spine

    PubMed Central

    Park, Sung Bae; Kim, Ki Jeong; Han, Sanghyun; Oh, Sohee; Kim, Chi Heon; Chung, Chun Kee

    2018-01-01

    Objective To identify the perioperative factors associated with instrumentation failure in patients undergoing a partial corpectomy with instrumentation (PCI) for spinal metastasis. Methods We assessed one hundred twenty-four patients who underwent PCI for a metastatic spine from 1987 to 2011. The outcome measure was instrumentation failure and its related risk factors. The preoperative factors analyzed were age, sex, ambulation, American Spinal Injury Association grade, bone mineral density, use of steroid, primary tumor site, number of vertebrae with metastasis, extra-bone metastasis, preoperative adjuvant chemotherapy, and preoperative spinal radiotherapy. The intraoperative factors were the number of fixed vertebrae, fixation in osteolytic vertebrae, bone grafting, and type of surgical approach. The postoperative factors included postoperative adjuvant chemotherapy and spinal radiotherapy. This study was supported by a National Research Foundation grant funded by the government. There were no study-specific biases related to conflicts of interest. Results There were 15 instrumentation failures (15/124, 12.1%). Preoperative ambulatory status and primary tumor site were not significantly related to the development of implant failure. There were no significant associations between insertion of a bone graft into the partial corpectomy site and instrumentation failure. The preoperative and operative factors analyzed were not significantly related to instrumentation failure. In univariable and multivariable analyses, postoperative spinal radiotherapy was the only significant variable related to instrumentation failure (p=0.049 and 0.050, respectively). Conclusion When performing PCI in patients with spinal metastasis followed by postoperative spinal radiotherapy, the surgeon should bear in mind the possibility of instrumentation failure and consider augmentation strategies other than the use of a bone graft for fusion. PMID:29631384

  7. Estimation of Staphylococcus aureus growth parameters from turbidity data: characterization of strain variation and comparison of methods.

    PubMed

    Lindqvist, R

    2006-07-01

    Turbidity methods offer possibilities for generating the data required to address microorganism variability in risk modeling, given that the results of these methods correspond to those of viable count methods. The objectives of this study were to identify the best approach for determining growth parameters based on turbidity data from a Bioscreen instrument and to characterize variability in the growth parameters of 34 Staphylococcus aureus strains of different biotypes isolated from broiler carcasses. Growth parameters were estimated by fitting primary growth models to turbidity growth curves or to detection times of serially diluted cultures, either directly or by using an analysis of variance (ANOVA) approach. The maximum specific growth rates in chicken broth at 17 degrees C estimated by time to detection methods were in good agreement with viable count estimates, whereas growth models (exponential and Richards) underestimated growth rates. Time to detection methods were therefore selected for strain characterization. The variation of growth parameters among strains was best described by either the logistic or the lognormal distribution, but definitive conclusions require a larger data set. The distribution of the physiological state parameter ranged from 0.01 to 0.92 and was not significantly different from a normal distribution. Strain variability was important, and the coefficient of variation of growth parameters was up to six times larger among strains than within strains. We suggest applying a time to detection (ANOVA) approach using turbidity measurements for convenient and accurate estimation of growth parameters. The results emphasize the need to consider the implications of strain variability for predictive modeling and risk assessment.
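    The time-to-detection logic can be sketched as follows (all numbers fabricated). Under exponential growth, each ten-fold dilution delays the time to reach the turbidity detection threshold by ln(10)/μmax, which is the relation the method exploits.

```python
import numpy as np

# fabricated example: a culture diluted 10-fold per step is grown until it
# reaches the turbidity detection threshold; each step adds ln(10)/mu hours
mu_true = 0.6                              # h^-1, assumed max specific growth rate
dilution_steps = np.arange(6)              # 10^0 ... 10^-5
rng = np.random.default_rng(7)
ttd = 5.0 + dilution_steps * np.log(10) / mu_true + rng.normal(0, 0.05, 6)

# linear fit: detection time vs dilution step; the slope is ln(10)/mu
slope, intercept = np.polyfit(dilution_steps, ttd, 1)
mu_est = np.log(10) / slope                # h^-1
```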

  8. Mendelian randomization with Egger pleiotropy correction and weakly informative Bayesian priors.

    PubMed

    Schmidt, A F; Dudbridge, F

    2017-12-15

    The MR-Egger (MRE) estimator has been proposed to correct for directional pleiotropic effects of genetic instruments in an instrumental variable (IV) analysis. The power of this method is considerably lower than that of conventional estimators, limiting its applicability. Here we propose a novel Bayesian implementation of the MR-Egger estimator (BMRE) and explore the utility of applying weakly informative priors on the intercept term (the pleiotropy estimate) to increase power of the IV (slope) estimate. This was a simulation study to compare the performance of different IV estimators. Scenarios differed in the presence of a causal effect, the presence of pleiotropy, the proportion of pleiotropic instruments and degree of 'Instrument Strength Independent of Direct Effect' (InSIDE) assumption violation. Based on empirical plasma urate data, we present an approach to elucidate a prior distribution for the amount of pleiotropy. A weakly informative prior on the intercept term increased power of the slope estimate while maintaining type 1 error rates close to the nominal value of 0.05. Under the InSIDE assumption, performance was unaffected by the presence or absence of pleiotropy. Violation of the InSIDE assumption biased all estimators, affecting the BMRE more than the MRE method. Depending on the prior distribution, the BMRE estimator has more power at the cost of an increased susceptibility to InSIDE assumption violations. As such the BMRE method is a compromise between the MRE and conventional IV estimators, and may be an especially useful approach to account for observed pleiotropy. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
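    The frequentist MR-Egger regression that the Bayesian implementation builds on can be sketched with simulated summary statistics (all numbers invented; the BMRE variant would additionally place a weakly informative prior on the intercept). The slope of a weighted regression of SNP-outcome on SNP-exposure associations, with a free intercept, is the causal estimate; the intercept estimates the average directional pleiotropy.

```python
import numpy as np

rng = np.random.default_rng(3)
m = 50                                     # number of genetic instruments
beta_x = rng.uniform(0.1, 0.5, m)          # SNP-exposure associations
alpha = 0.05                               # directional pleiotropy (assumed)
theta = 0.3                                # true causal effect (assumed)
se_y = np.full(m, 0.02)                    # standard errors of SNP-outcome betas
beta_y = alpha + theta * beta_x + rng.normal(0, se_y)

# MR-Egger: inverse-variance-weighted regression with a free intercept
w = 1.0 / se_y**2
X = np.column_stack([np.ones(m), beta_x])
WX = X * w[:, None]
coef = np.linalg.solve(X.T @ WX, WX.T @ beta_y)   # normal equations
intercept_est, slope_est = coef
```

    The intercept estimate is what a prior would shrink in the Bayesian version, trading a little robustness to InSIDE violations for power on the slope.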

  9. Rationale and Design of the Registry for Stones of the Kidney and Ureter (ReSKU): A Prospective Observational Registry to Study the Natural History of Urolithiasis Patients

    PubMed Central

    Chang, Helena C.; Tzou, David T.; Usawachintachit, Manint; Duty, Brian D.; Hsi, Ryan S.; Harper, Jonathan D.; Sorensen, Mathew D.; Stoller, Marshall L.; Sur, Roger L.

    2016-01-01

    Abstract Objectives: Registry-based clinical research in nephrolithiasis is critical to advancing quality in urinary stone disease management and ultimately reducing stone recurrence. A need exists to develop Health Insurance Portability and Accountability Act (HIPAA)-compliant registries that comprise integrated electronic health record (EHR) data using prospectively defined variables. An EHR-based standardized patient database—the Registry for Stones of the Kidney and Ureter (ReSKU™)—was developed, and herein we describe our implementation outcomes. Materials and Methods: Interviews with academic and community endourologists in the United States, Canada, China, and Japan identified demographic, intraoperative, and perioperative variables to populate our registry. Variables were incorporated into a HIPAA-compliant Research Electronic Data Capture database linked to text prompts and registration data within the Epic EHR platform. Specific data collection instruments supporting New patient, Surgery, Postoperative, and Follow-up clinical encounters were created within Epic to facilitate automated data extraction into ReSKU. Results: The number of variables within each instrument is as follows: New patient—60, Surgery—80, Postoperative—64, and Follow-up—64. With manual data entry, the mean times to complete each of the clinic-based instruments were as follows (in minutes): New patient—12.06 ± 2.30, Postoperative—7.18 ± 1.02, and Follow-up—8.10 ± 0.58. These times were significantly reduced with the use of ReSKU structured clinic note templates to the following: New patient—4.09 ± 1.73, Postoperative—1.41 ± 0.41, and Follow-up—0.79 ± 0.38. With automated data extraction from Epic, manual entry is obviated. Conclusions: ReSKU is a longitudinal prospective nephrolithiasis registry that integrates EHR data, lowering the barriers to performing high quality clinical research and quality outcome assessments in urinary stone disease. PMID:27758162

  10. Development of a new linearly variable edge filter (LVEF)-based compact slit-less mini-spectrometer

    NASA Astrophysics Data System (ADS)

    Mahmoud, Khaled; Park, Seongchong; Lee, Dong-Hoon

    2018-02-01

    This paper presents the development of a compact charge-coupled device (CCD) spectrometer. We describe the design, concept, and characterization of a VNIR linear variable edge filter (LVEF)-based mini-spectrometer. The new instrument has been realized for operation in the 300 nm to 850 nm wavelength range and consists of a linear variable edge filter in front of a CCD array. Small size, light weight, and low cost are achieved because the linearly variable filter requires no moving parts for wavelength selection, unlike commercial spectrometers available on the market. We discuss the characteristics of the main components and the main concept, together with its principal advantages and limitations. Experimental characteristics of the LVEFs are described. The mathematical approach for obtaining the position-dependent slit function of the presented prototype spectrometer, and its numerical de-convolution for spectrum reconstruction, is described. The performance of our prototype instrument is demonstrated by measuring the spectrum of a reference light source.
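    A toy version of such a reconstruction (everything invented: a Gaussian position-dependent slit function and Tikhonov-regularized least squares stand in for the instrument's actual slit-function model and deconvolution scheme):

```python
import numpy as np

rng = np.random.default_rng(2)
n_pix, n_wl = 120, 120
wl = np.linspace(300, 850, n_wl)           # wavelength grid, nm

# position-dependent slit function: a Gaussian whose width grows along the
# LVEF, so each pixel integrates a different slice of the spectrum
centers = np.linspace(300, 850, n_pix)
widths = np.linspace(5, 20, n_pix)
A = np.exp(-0.5 * ((wl[None, :] - centers[:, None]) / widths[:, None]) ** 2)
A /= A.sum(axis=1, keepdims=True)          # normalize each pixel's response

true_spectrum = np.exp(-0.5 * ((wl - 550) / 40) ** 2)   # a broad source line
measured = A @ true_spectrum + rng.normal(0, 1e-3, n_pix)

# regularized deconvolution: minimize |A s - m|^2 + lam^2 |s|^2
lam = 1e-2
A_aug = np.vstack([A, lam * np.eye(n_wl)])
m_aug = np.concatenate([measured, np.zeros(n_wl)])
spectrum_hat, *_ = np.linalg.lstsq(A_aug, m_aug, rcond=None)
```

    The regularization weight trades noise amplification against resolution, the usual dilemma when the slit function varies along the detector.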

  11. Exploration of an oculometer-based model of pilot workload

    NASA Technical Reports Server (NTRS)

    Krebs, M. J.; Wingert, J. W.; Cunningham, T.

    1977-01-01

    Potential relationships between eye behavior and pilot workload are discussed. A Honeywell Mark IIA oculometer was used to obtain the eye data in a fixed base transport aircraft simulation facility. The data were analyzed to determine those parameters of eye behavior which were related to changes in the level of task difficulty of the simulated manual approach and landing on instruments. A number of trends and relationships between eye variables and pilot ratings were found. A preliminary equation was written based on the results of a stepwise linear regression. High variability in time spent on various instruments was related to differences in scanning strategy among pilots. A more detailed analysis of individual runs by individual pilots was performed to investigate the source of this variability more closely. Results indicated a high degree of intra-pilot variability in instrument scanning. No consistent workload-related trends were found. Pupil diameter, which had demonstrated a strong relationship to task difficulty, was extensively re-examined.

  12. A new method for measuring lung deposition efficiency of airborne nanoparticles in a single breath

    PubMed Central

    Jakobsson, Jonas K. F.; Hedlund, Johan; Kumlin, John; Wollmer, Per; Löndahl, Jakob

    2016-01-01

    Assessment of respiratory tract deposition of nanoparticles is a key link to understanding their health impacts. An instrument was developed to measure respiratory tract deposition of nanoparticles in a single breath. Monodisperse nanoparticles are generated, inhaled and sampled from a determined volumetric lung depth after a controlled residence time in the lung. The instrument was characterized for sensitivity to inter-subject variability, particle size (22, 50, 75 and 100 nm) and breath-holding time (3–20 s) in a group of seven healthy subjects. The measured particle recovery had an inter-subject variability 26–50 times larger than the measurement uncertainty and the results for various particle sizes and breath-holding times were in accordance with the theory for Brownian diffusion and values calculated from the Multiple-Path Particle Dosimetry model. The recovery was found to be determined by residence time and particle size, while respiratory flow-rate had minor importance in the studied range 1–10 L/s. The instrument will be used to investigate deposition of nanoparticles in patients with respiratory disease. The fast and precise measurement allows for both diagnostic applications, where the disease may be identified based on particle recovery, and for studies with controlled delivery of aerosol-based nanomedicine to specific regions of the lungs. PMID:27819335

  13. Chemometric optimization of the robustness of the near infrared spectroscopic method in wheat quality control.

    PubMed

    Pojić, Milica; Rakić, Dušan; Lazić, Zivorad

    2015-01-01

    A chemometric approach was applied to optimize the robustness of the NIRS method for wheat quality control. Due to the high number of experimental variables (n=6) and response variables (n=7) to be studied, the optimization experiment was divided into two stages: a screening stage to evaluate which of the considered variables were significant, and an optimization stage to optimize the identified factors within the previously selected experimental domain. The significant variables were identified using a fractional factorial experimental design, whilst a Box-Wilson rotatable central composite design (CCRD) was run to obtain the optimal values for the significant variables. The measured responses included moisture, protein and wet gluten content, Zeleny sedimentation value and deformation energy. In order to achieve minimal variation in the responses, the optimal factor settings were found by minimizing the propagation of error (POE). The simultaneous optimization of factors was conducted using a desirability function. The highest desirability of 87.63% was accomplished by setting the experimental conditions as follows: 19.9°C for sample temperature, 19.3°C for ambient temperature and 240V for instrument voltage. Copyright © 2014 Elsevier B.V. All rights reserved.
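    A schematic of the desirability-function step (response models and coefficients entirely invented, in coded units rather than the paper's actual temperatures and voltage): each response gets an individual desirability in [0, 1], and the factor settings maximizing their geometric mean are selected.

```python
import numpy as np

# hypothetical response-surface biases for two responses as functions of two
# coded factors (e.g. sample temperature x1, instrument voltage x2)
def protein_bias(x1, x2):    # want close to 0
    return 0.2 * x1 + 0.1 * x2 + 0.15 * x1 * x2

def moisture_bias(x1, x2):   # want close to 0
    return -0.1 * x1 + 0.25 * x2

def desirability_target(y, target=0.0, tol=0.5):
    """Linear 'target is best' desirability: 1 at the target, 0 beyond tol."""
    return np.clip(1 - np.abs(y - target) / tol, 0, 1)

# grid search over the coded experimental domain [-1, 1]^2
g = np.linspace(-1, 1, 81)
x1, x2 = np.meshgrid(g, g)
d = np.sqrt(desirability_target(protein_bias(x1, x2)) *
            desirability_target(moisture_bias(x1, x2)))   # geometric mean
best = np.unravel_index(d.argmax(), d.shape)
best_x1, best_x2 = x1[best], x2[best]
```

    In practice the response surfaces come from the CCRD fit and the POE terms are folded into the desirabilities; the selection step is the same.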

  14. Sequential Measurement of Intermodal Variability in Public Transportation PM2.5 and CO Exposure Concentrations.

    PubMed

    Che, W W; Frey, H Christopher; Lau, Alexis K H

    2016-08-16

    A sequential measurement method is demonstrated for quantifying the variability in exposure concentration during public transportation. This method was applied in Hong Kong by measuring PM2.5 and CO concentrations along a route connecting 13 transportation-related microenvironments within 3-4 h. The study design takes into account ventilation, proximity to local sources, area-wide air quality, and meteorological conditions. Portable instruments were compacted into a backpack to facilitate measurement under crowded transportation conditions and to quantify personal exposure by sampling at nose level. The route included stops next to three roadside monitors to enable comparison of fixed site and exposure concentrations. PM2.5 exposure concentrations were correlated with the roadside monitors, despite differences in averaging time, detection method, and sampling location. Although highly correlated in temporal trend, PM2.5 concentrations varied significantly among microenvironments, with mean concentration ratios versus roadside monitor ranging from 0.5 for MTR train to 1.3 for bus terminal. Measured inter-run variability provides insight regarding the sample size needed to discriminate between microenvironments with increased statistical significance. The study results illustrate the utility of sequential measurement of microenvironments and policy-relevant insights for exposure mitigation and management.

  15. Tropical tropospheric ozone and biomass burning.

    PubMed

    Thompson, A M; Witte, J C; Hudson, R D; Guo, H; Herman, J R; Fujiwara, M

    2001-03-16

    New methods for retrieving tropospheric ozone column depth and absorbing aerosol (smoke and dust) from the Earth Probe-Total Ozone Mapping Spectrometer (EP/TOMS) are used to follow pollution and to determine interannual variability and trends. During intense fires over Indonesia (August to November 1997), ozone plumes, decoupled from the smoke below, extended as far as India. This ozone overlay a regional ozone increase triggered by atmospheric responses to the El Niño and Indian Ocean Dipole. Tropospheric ozone and smoke aerosol measurements from the Nimbus 7 TOMS instrument show El Niño signals but no tropospheric ozone trend in the 1980s. Offsets between smoke and ozone seasonal maxima point to multiple factors determining tropical tropospheric ozone variability.

  16. Propensity-score matching in economic analyses: comparison with regression models, instrumental variables, residual inclusion, differences-in-differences, and decomposition methods.

    PubMed

    Crown, William H

    2014-02-01

    This paper examines the use of propensity score matching in economic analyses of observational data. Several excellent papers have previously reviewed practical aspects of propensity score estimation and other aspects of the propensity score literature. The purpose of this paper is to compare the conceptual foundation of propensity score models with alternative estimators of treatment effects. References are provided to empirical comparisons among methods that have appeared in the literature. These comparisons are available for a subset of the methods considered in this paper. However, in some cases, no pairwise comparisons of particular methods are yet available, and there are no examples of comparisons across all of the methods surveyed here. Irrespective of the availability of empirical comparisons, the goal of this paper is to provide some intuition about the relative merits of alternative estimators in health economic evaluations where nonlinearity, sample size, availability of pre/post data, heterogeneity, and missing variables can have important implications for choice of methodology. Also considered is the potential combination of propensity score matching with alternative methods such as differences-in-differences and decomposition methods that have not yet appeared in the empirical literature.
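    A compact numpy sketch of the core propensity-score-matching estimator being compared (fabricated data; a real analysis would add balance diagnostics, caliper choices, and proper variance estimation):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 2000
x = rng.normal(0, 1, (n, 2))                     # observed confounders
p_treat = 1 / (1 + np.exp(-(x[:, 0] + 0.5 * x[:, 1])))
t = rng.binomial(1, p_treat)
y = 2.0 * t + x[:, 0] + x[:, 1] + rng.normal(0, 1, n)   # true effect = 2

# logistic propensity model fitted by Newton-Raphson
X = np.column_stack([np.ones(n), x])
b = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ b))
    grad = X.T @ (t - p)
    hess = (X * (p * (1 - p))[:, None]).T @ X
    b += np.linalg.solve(hess, grad)
ps = 1 / (1 + np.exp(-X @ b))

# 1-nearest-neighbour matching on the propensity score (with replacement)
treated = np.where(t == 1)[0]
control = np.where(t == 0)[0]
matches = control[np.abs(ps[control][None, :] -
                         ps[treated][:, None]).argmin(axis=1)]
att = (y[treated] - y[matches]).mean()           # effect on the treated
```

    The alternatives surveyed in the paper (regression adjustment, instrumental variables, differences-in-differences, and so on) replace the matching step with different identification strategies on the same kind of data.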

  17. Incorporating Measurement Error from Modeled Air Pollution Exposures into Epidemiological Analyses.

    PubMed

    Samoli, Evangelia; Butland, Barbara K

    2017-12-01

    Outdoor air pollution exposures used in epidemiological studies are commonly predicted from spatiotemporal models incorporating limited measurements, temporal factors, geographic information system variables, and/or satellite data. Measurement error in these exposure estimates leads to imprecise estimation of health effects and their standard errors. We reviewed methods for measurement error correction that have been applied in epidemiological studies that use model-derived air pollution data. We identified seven cohort studies and one panel study that have employed measurement error correction methods. These methods included regression calibration, risk set regression calibration, regression calibration with instrumental variables, the simulation extrapolation approach (SIMEX), and methods under the non-parametric or parameter bootstrap. Corrections resulted in small increases in the absolute magnitude of the health effect estimate and its standard error under most scenarios. Limited application of measurement error correction methods in air pollution studies may be attributed to the absence of exposure validation data and the methodological complexity of the proposed methods. Future epidemiological studies should consider in their design phase the requirements for the measurement error correction method to be later applied, while methodological advances are needed under the multi-pollutants setting.
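    Regression calibration, the most common of the corrections listed above, can be sketched as follows (a fabricated exposure-validation scenario; real air-pollution applications would calibrate modeled concentrations against personal or fixed-site measurements):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 5000
x = rng.normal(10, 2, n)                   # true long-term exposure
w = x + rng.normal(0, 2, n)                # error-prone modeled exposure
y = 1.0 + 0.5 * x + rng.normal(0, 1, n)    # true effect of exposure = 0.5

# validation subset where the true exposure is known
val = rng.choice(n, 500, replace=False)
g1, g0 = np.polyfit(w[val], x[val], 1)     # calibration model E[X | W]
x_hat = g0 + g1 * w                        # calibrated exposure for everyone

def slope(u, v):
    return np.polyfit(u, v, 1)[0]

naive = slope(w, y)          # attenuated toward zero by measurement error
corrected = slope(x_hat, y)  # regression calibration recovers ~0.5
```

    As the review notes, the main practical obstacle is obtaining the validation data needed to fit the calibration model in the first place.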

  18. Variable threshold method for ECG R-peak detection.

    PubMed

    Kew, Hsein-Ping; Jeong, Do-Un

    2011-10-01

    In this paper, a wearable belt-type ECG electrode worn around the chest to measure real-time ECG is produced in order to minimize the inconvenience of wearing. The ECG signal is detected using a potential instrument system and transmitted to a personal computer via an ultra-low-power wireless data communication unit using a Zigbee-compatible wireless sensor node. ECG signals carry a great deal of clinical information for a cardiologist, and R-peak detection is especially important. R-peak detection generally uses a fixed threshold value, which produces detection errors when the baseline changes due to motion artifacts or when the signal amplitude changes. Preprocessing consisting of differentiation and the Hilbert transform is used as the signal preprocessing algorithm. Thereafter, a variable threshold method is used to detect the R-peak, which is more accurate and efficient than the fixed threshold method. R-peak detection using the MIT-BIH databases and long-term real-time ECG recordings is performed in this study in order to evaluate performance.
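    A self-contained sketch of the variable-threshold idea on a synthetic spike-train "ECG" (invented parameters, not the authors' implementation or the MIT-BIH data): differentiate, take the Hilbert-transform envelope, then threshold each window at a fraction of its own maximum so that baseline wander and amplitude changes do not break detection.

```python
import numpy as np

def analytic_envelope(x):
    """Envelope via the Hilbert transform, computed with the FFT (even n)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    h[1:n // 2] = 2
    h[n // 2] = 1
    return np.abs(np.fft.ifft(X * h))

def detect_r_peaks(ecg, fs, win_s=2.0, frac=0.5):
    """Variable-threshold R-peak detection sketch."""
    env = analytic_envelope(np.gradient(ecg))
    win = int(win_s * fs)
    peaks = []
    for start in range(0, len(env), win):
        seg = env[start:start + win]
        thr = frac * seg.max()            # threshold adapts per window
        for i in range(1, len(seg) - 1):  # local maxima above the threshold
            if seg[i] > thr and seg[i] >= seg[i - 1] and seg[i] > seg[i + 1]:
                peaks.append(start + i)
    merged = []                           # 200 ms refractory period
    for p in peaks:
        if not merged or p - merged[-1] > 0.2 * fs:
            merged.append(p)
    return merged

# synthetic ECG: 1 Hz spike train with drifting baseline and varying gain
fs = 250
tt = np.arange(0, 10, 1 / fs)
ecg = np.zeros_like(tt)
for beat in np.arange(0.5, 10, 1.0):
    ecg += np.exp(-((tt - beat) ** 2) / (2 * 0.01 ** 2))
ecg *= 1 + 0.5 * np.sin(2 * np.pi * 0.1 * tt)    # amplitude change
ecg += 0.3 * np.sin(2 * np.pi * 0.3 * tt)        # baseline wander
r_peaks = detect_r_peaks(ecg, fs)                # 10 beats expected
```

    A fixed global threshold would start missing beats once the gain drops, which is exactly the failure mode the variable threshold avoids.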

  19. Training to acquire psychomotor skills for endoscopic endonasal surgery using a personal webcam trainer.

    PubMed

    Hirayama, Ryuichi; Fujimoto, Yasunori; Umegaki, Masao; Kagawa, Naoki; Kinoshita, Manabu; Hashimoto, Naoya; Yoshimine, Toshiki

    2013-05-01

    Existing training methods for neuroendoscopic surgery have mainly emphasized the acquisition of anatomical knowledge and procedures for operating an endoscope and instruments. For laparoscopic surgery, various training systems have been developed to teach handling of an endoscope as well as the manipulation of instruments for speedy and precise endoscopic performance using both hands. In endoscopic endonasal surgery (EES), especially using a binostril approach to the skull base and intradural lesions, the learning of more meticulous manipulation of instruments is mandatory, and it may be necessary to develop another type of training method for acquiring psychomotor skills for EES. Authors of the present study developed an inexpensive, portable personal trainer using a webcam and objectively evaluated its utility. Twenty-five neurosurgeons volunteered for this study and were divided into 2 groups, a novice group (19 neurosurgeons) and an experienced group (6 neurosurgeons). Before and after the exercises of set tasks with a webcam box trainer, the basic endoscopic skills of each participant were objectively assessed using the virtual reality simulator (LapSim) while executing 2 virtual tasks: grasping and instrument navigation. Scores for the following 11 performance variables were recorded: instrument time, instrument misses, instrument path length, and instrument angular path (all of which were measured in both hands), as well as tissue damage, max damage, and finally overall score. Instrument time was indicated as movement speed; instrument path length and instrument angular path as movement efficiency; and instrument misses, tissue damage, and max damage as movement precision. In the novice group, movement speed and efficiency were significantly improved after the training. In the experienced group, significant improvement was not shown in the majority of virtual tasks. 
Before the training, significantly greater movement speed and efficiency were demonstrated in the experienced group, but no difference in movement precision was shown between the 2 groups. After the training, no significant differences were shown between the 2 groups in the majority of the virtual tasks. Analysis revealed that the webcam trainer improved the basic skills of the novices, increasing movement speed and efficiency without sacrificing movement precision. Novices using this unique webcam trainer showed improvement in psychomotor skills for EES. The authors believe that training in terms of basic endoscopic skills is meaningful and that the webcam training system can play a role in daily off-the-job training for EES.

  20. Internal consistency of the self-reporting questionnaire-20 in occupational groups

    PubMed Central

    Santos, Kionna Oliveira Bernardes; Carvalho, Fernando Martins; de Araújo, Tânia Maria

    2016-01-01

    ABSTRACT OBJECTIVE To assess the internal consistency of the measurements of the Self-Reporting Questionnaire (SRQ-20) in different occupational groups. METHODS A validation study was conducted with data from four surveys of groups of workers, using similar methods. A total of 9,959 workers were studied. In all surveys, common mental disorders were assessed with the SRQ-20. The internal consistency analysis considered the items belonging to dimensions extracted by tetrachoric factor analysis for each study. The item homogeneity assessment compared estimates of Cronbach’s alpha (KR-20), the alpha applied to a tetrachoric correlation matrix, and stratified Cronbach’s alpha. RESULTS The SRQ-20 dimensions showed adequate values, considering the reference parameters. The internal consistency of the instrument items, assessed by stratified Cronbach’s alpha, was high (> 0.80) in all four studies. CONCLUSIONS The SRQ-20 showed good internal consistency in the professional categories evaluated. However, there is still a need for studies using alternative methods and additional information to refine the accuracy of latent variable measurement instruments, as in the case of common mental disorders. PMID:27007682
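    The item-homogeneity statistic central to this record, Cronbach's alpha (which reduces to KR-20 for dichotomous yes/no items like the SRQ-20's), can be sketched in a few lines. The response matrix below is simulated for illustration, not data from the study:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, k_items) score matrix.
        For dichotomous 0/1 items this coincides with KR-20."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)          # per-item variances
        total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical 0/1 responses: a shared latent trait plus item noise
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))
    responses = ((latent + rng.normal(scale=0.8, size=(200, 8))) > 0).astype(int)
    print(round(cronbach_alpha(responses), 3))
    ```

    With perfectly parallel items the statistic equals 1; uncorrelated items drive it toward 0.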

  1. Measuring the Sensitivity and Construct Validity of 6 Utility Instruments in 7 Disease Areas.

    PubMed

    Richardson, Jeff; Iezzi, Angelo; Khan, Munir A; Chen, Gang; Maxwell, Aimee

    2016-02-01

    Health services that affect quality of life (QoL) are increasingly evaluated using cost utility analyses (CUA). These commonly employ one of a small number of multiattribute utility instruments (MAUI) to assess the effects of the health service on utility. However, the MAUI differ significantly, and the choice of instrument may alter the outcome of an evaluation. The present article has 2 objectives: 1) to compare the results of 3 measures of the sensitivity of 6 MAUI and the results of 6 tests of construct validity in 7 disease areas and 2) to rank the MAUI by each of the test results in each disease area and by an overall composite index constructed from the tests. Patients and the general public were administered a battery of instruments, which included the 6 MAUI, disease-specific QoL instruments (DSI), and 6 other comparator instruments. In each disease area, instrument sensitivity was measured 3 ways: by the unadjusted mean difference in utility between public and patient groups, by the value of the effect size, and by the correlation between MAUI and DSI scores. Content and convergent validity were tested by comparison of MAUI utilities and scores from the 6 comparator instruments. These included 2 measures of health state preferences, measures of subjective well-being and capabilities, and generic measures of physical and mental QoL derived from the SF-36. The apparent sensitivity of instruments varied significantly with the measurement method and by disease area. Validation test results varied with the comparator instruments. Notwithstanding this variability, the 15D, AQoL-8D, and the SF-6D generally achieved better test results than the QWB and EQ-5D-5L. © The Author(s) 2015.

  2. 8 years of Solar Spectral Irradiance Variability Observed from the ISS with the SOLAR/SOLSPEC Instrument

    NASA Astrophysics Data System (ADS)

    Damé, Luc; Bolsée, David; Meftah, Mustapha; Irbah, Abdenour; Hauchecorne, Alain; Bekki, Slimane; Pereira, Nuno; Cessateur, Gaël; Marchand, Marion; et al.

    2016-10-01

    Accurate measurements of Solar Spectral Irradiance (SSI) are of primary importance for a better understanding of solar physics and of the impact of solar variability on climate (via Earth's atmospheric photochemistry). The acquisition of a top-of-atmosphere reference solar spectrum and of its temporal and spectral variability during the unusual solar cycle 24 is of prime interest for these studies. These measurements have been performed since April 2008 with the SOLSPEC spectro-radiometer from the far ultraviolet to the infrared (166 nm to 3088 nm). This instrument, developed under a fruitful LATMOS/BIRA-IASB collaboration, is part of the Solar Monitoring Observatory (SOLAR) payload, externally mounted on the Columbus module of the International Space Station (ISS). The SOLAR mission, now 8 years in duration, will cover almost the entire solar cycle 24. We present here the in-flight operations and performance of the SOLSPEC instrument, including the engineering corrections, calibrations, and refined procedures for aging corrections. Accordingly, an SSI reference spectrum from the UV to the NIR will be presented, together with its variability in the UV, as measured by SOLAR/SOLSPEC over 8 years. Uncertainties on these measurements and comparisons with other instruments will be briefly discussed.

  3. Towards Student Instrumentation of Computer-Based Algebra Systems in University Courses

    ERIC Educational Resources Information Center

    Stewart, Sepideh; Thomas, Michael O. J.; Hannah, John

    2005-01-01

    There are many perceived benefits of using technology, such as computer algebra systems, in undergraduate mathematics courses. However, attaining these benefits sometimes proves elusive. Some of the key variables are the teaching approach and the student instrumentation of the technology. This paper considers the instrumentation of computer-based…

  4. Instructional Interactions of Kindergarten Mathematics Classrooms: Validating a Direct Observation Instrument

    ERIC Educational Resources Information Center

    Doabler, Christian; Smolkowski, Keith; Fien, Hank; Kosty, Derek B.; Cary, Mari Strand

    2010-01-01

    In this paper, the authors report research focused directly on the validation of the Coding of Academic Teacher-Student interactions (CATS) direct observation instrument. They use classroom information gathered by the CATS instrument to better understand the potential mediating variables hypothesized to influence student achievement. Their study's…

  5. 26 CFR 1.1275-5 - Variable rate debt instruments.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... nonpublicly traded property. A debt instrument (other than a tax-exempt obligation) that would otherwise... variations in the cost of newly borrowed funds in the currency in which the debt instrument is denominated... on the yield of actively traded personal property (within the meaning of section 1092(d)(1)). (ii...

  6. Variable aperture-based ptychographical iterative engine method.

    PubMed

    Sun, Aihui; Kong, Yan; Meng, Xin; He, Xiaoliang; Du, Ruijun; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-02-01

    A variable aperture-based ptychographical iterative engine (vaPIE) is demonstrated both numerically and experimentally to reconstruct the sample phase and amplitude rapidly. By adjusting the size of a tiny aperture under the illumination of a parallel light beam to change the illumination on the sample step by step and recording the corresponding diffraction patterns sequentially, both the sample phase and amplitude can be faithfully reconstructed with a modified ptychographical iterative engine (PIE) algorithm. Since many fewer diffraction patterns are required than in common PIE, and since the shape, size, and position of the aperture need not be known exactly, the proposed vaPIE method remarkably reduces the data acquisition time and makes PIE less dependent on the mechanical accuracy of the translation stage; therefore, the proposed technique can potentially be applied in various fields of scientific research. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
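    The PIE-style object update at the heart of this record can be illustrated with a toy simulation in which the illumination changes only through the aperture size, echoing the vaPIE idea. The image size, aperture radii, and update weighting below are illustrative assumptions, not the authors' parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 32
    # Hypothetical complex-valued sample (amplitude and phase to recover)
    obj = (0.5 + 0.5 * rng.random((N, N))) * np.exp(1j * rng.random((N, N)))

    # Variable circular apertures at a fixed position (illumination changes
    # by resizing the aperture rather than by translating a probe)
    yy, xx = np.mgrid[:N, :N] - N // 2
    probes = [(xx**2 + yy**2 <= r**2).astype(complex) for r in (6, 8, 10, 12, 14)]

    # Recorded far-field diffraction amplitudes for each illumination
    patterns = [np.abs(np.fft.fft2(p * obj)) for p in probes]

    def pie_pass(guess, probes, patterns, alpha=1.0):
        """One sweep of the PIE object update with the Fourier modulus constraint."""
        for P, meas in zip(probes, patterns):
            psi = P * guess
            F = np.fft.fft2(psi)
            F = meas * np.exp(1j * np.angle(F))      # replace modulus, keep phase
            psi_new = np.fft.ifft2(F)
            w = np.conj(P) / (np.abs(P)**2).max()    # PIE weighting
            guess = guess + alpha * w * (psi_new - psi)
        return guess

    def err(guess):
        return sum(np.linalg.norm(np.abs(np.fft.fft2(P * guess)) - m)
                   for P, m in zip(probes, patterns))

    guess = np.ones((N, N), complex)
    e0 = err(guess)
    for _ in range(30):
        guess = pie_pass(guess, probes, patterns)
    print(err(guess) < e0)   # the data misfit should shrink
    ```

    The reconstruction is only defined inside the largest aperture's support, which is exactly the region the recorded patterns constrain.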

  7. Serious Game and Virtual World Training: Instrumentation and Assessment

    DTIC Science & Technology

    2012-12-10

    Effectiveness of EEG Neurofeedback Training for ADHD in a Clinical Setting as Measured by Changes in T.O.V.A. Scores, Behavioral Ratings, and WISC-R...Human Physiological Data Collection Methods 24 4.3.1 Electroencephalography ( EEG ) 24 4.3.2 Galvanic Skin Response (GSR) and Heart Rate Variability...Collecting Human Data 24 8 Participant Wearing a 32-Channel EEG Cap 25 9 Future Force Warrior Example Combat Armor 27 10 Screenshot of the Organic

  8. A Grazing Incidence Spectrograph as Applied to Vacuum Ultraviolet, Soft X-Ray, Pulsed Plasma Sources.

    DTIC Science & Technology

    A 2.2-meter variable angle of incidence grazing incidence spectrograph is described for photographic recording of spectra down to 10 Å. Also described is a method for determining the absolute total fluence from a pulsed plasma source, given the absolute sensitivity of the instrument. Spectra are presented from a low-inductance sliding spark gap and a 20-kJ dense plasma focus. A program for spectrum analysis is included. (Modified author abstract)

  9. Damage Effects Identified By Scatter Evaluation Of Supersmooth Surfaces

    NASA Astrophysics Data System (ADS)

    Stowell, W. K.; Orazio, Fred D.

    1983-12-01

    The surface quality of optics used in an extremely sensitive laser instrument, such as a Ring Laser Gyro (RLG), is critical. This has led to the development of a Variable Angle Scatterometer at the Air Force Wright Aeronautical Laboratories at Wright-Patterson Air Force Base, which can detect low level light scatter from the high quality optics used in RLGs, without first overcoating with metals. With this instrument we have been able to identify damage effects that occur during the typical processing and handling of optics which cause wide variation in subsequent measurements depending on when, in the process, one takes data. These measurements indicate that techniques such as a Total Integrated Scatter (TIS) may be inadequate for standards on extremely low scatter optics because of the lack of sensitivity of the method on such surfaces. The general term for optical surfaces better than the lowest level of the scratch-dig standards has become "supersmooth", and is seen in technical literature as well as in advertising. A performance number, such as the Bidirectional Reflectance Distribution Function (BRDF), which can be measured from the uncoated optical surface by equipment such as the Variable Angle Scatterometer (VAS), is proposed as a method of generating better optical surface specifications. Data show that surfaces of average BRDF values near 10 parts per billion per steradian (0.010 PPM/Sr) for 0-(301 = 0.5, are now possible and measurable.

  10. Damage Effects Identified By Scatter Evaluation Of Supersmooth Surfaces

    NASA Astrophysics Data System (ADS)

    Stowell, W. K.

    1984-10-01

    The surface quality of optics used in an extremely sensitive laser instrument, such as a Ring Laser Gyro (RLG), is critical. This has led to the development of a Variable Angle Scatterometer at the Air Force Wright Aeronautical Laboratories at Wright-Patterson Air Force Base, which can detect low level light scatter from the high quality optics used in RLGs, without first overcoating with metals. With this instrument we have been able to identify damage effects that occur during the typical processing and handling of optics which cause wide variation in subsequent measurements depending on when, in the process, one takes data. These measurements indicate that techniques such as a Total Integrated Scatter (TIS) may be inadequate for standards on extremely low scatter optics because of the lack of sensitivity of the method on such surfaces. The general term for optical surfaces better than the lowest level of the scratch-dig standards has become "supersmooth", and is seen in technical literature as well as in advertising. A performance number, such as the Bidirectional Reflectance Distribution Function (BRDF), which can be measured from the uncoated optical surface by equipment such as the Variable Angle Scatterometer (VAS), is proposed as a method of generating better optical surface specifications. Data show that surfaces of average BRDF values near 10 parts per billion per steradian (0.010 PPM/Sr) for 0-(301 = 0.5, are now possible and measurable.

  11. SeaWiFS technical report series. Volume 5: Ocean optics protocols for SeaWiFS validation

    NASA Technical Reports Server (NTRS)

    Mueller, James L.; Austin, Roswell W.; Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor)

    1992-01-01

    Protocols are presented for measuring optical properties, and other environmental variables, to validate the radiometric performance of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), and to develop and validate bio-optical algorithms for use with SeaWiFS data. The protocols are intended to establish the foundations for a measurement strategy to verify the challenging SeaWiFS accuracy goals of 5 percent in water-leaving radiances and 35 percent in chlorophyll a concentration. The protocols first specify the variables which must be measured, and briefly review the rationale. Subsequent chapters cover detailed protocols for instrument performance specifications, characterizing and calibrating instruments, methods of making measurements in the field, and methods of data analysis. These protocols were developed at a workshop sponsored by the SeaWiFS Project Office (SPO) and held at the Naval Postgraduate School in Monterey, California (9-12 April, 1991). This report is the proceedings of that workshop, as interpreted and expanded by the authors and reviewed by workshop participants and other members of the bio-optical research community. The protocols are a first prescription for approaching the unprecedented measurement accuracies implied by the SeaWiFS goals, and research and development are needed to improve the state of the art in specific areas. The protocols should be periodically revised to reflect technical advances during the SeaWiFS Project cycle.

  12. Multimode laser beam analyzer instrument using electrically programmable optics.

    PubMed

    Marraccini, Philip J; Riza, Nabeel A

    2011-12-01

    Presented is a novel design of a multimode laser beam analyzer using a digital micromirror device (DMD) and an electronically controlled variable focus lens (ECVFL) that serve as the digital and analog agile optics, respectively. The proposed analyzer is a broadband laser characterization instrument that uses the agile optics to smartly direct light to the required point photodetectors to enable beam measurements of minimum beam waist size, minimum waist location, divergence, and the beam propagation parameter M(2). Experimental results successfully demonstrate these measurements for a 500 mW multimode test laser beam with a wavelength of 532 nm. The minimum beam waist, divergence, and M(2) experimental results for the test laser are found to be 257.61 μm, 2.103 mrad, 1.600 and 326.67 μm, 2.682 mrad, 2.587 for the vertical and horizontal directions, respectively. These measurements are compared to a traditional scan method and the results of the beam waist are found to be within error tolerance of the demonstrated instrument.
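    The beam propagation parameter M² reported in this record is conventionally obtained by measuring the beam radius at several axial positions and fitting a quadratic to w²(z), in the spirit of ISO 11146. The sketch below runs that evaluation on synthetic data with assumed beam parameters, not the paper's measurements:

    ```python
    import numpy as np

    lam = 532e-9                                    # test-laser wavelength [m]
    w0_true, z0_true, M2_true = 250e-6, 0.10, 1.6   # hypothetical beam

    # Synthetic beam-radius measurements w(z) along the propagation axis
    z = np.linspace(0.0, 0.5, 25)
    zR = np.pi * w0_true**2 / (M2_true * lam)       # embedded-Gaussian Rayleigh range
    w = w0_true * np.sqrt(1 + ((z - z0_true) / zR)**2)

    # Fit w^2(z) = A + B z + C z^2, then recover
    #   w0^2 = A - B^2/(4C),  z0 = -B/(2C),  M^2 = (pi / (2 lam)) sqrt(4AC - B^2)
    C, B, A = np.polyfit(z, w**2, 2)                # highest power first
    w0 = np.sqrt(A - B**2 / (4 * C))
    z0 = -B / (2 * C)
    M2 = (np.pi / (2 * lam)) * np.sqrt(4 * A * C - B**2)
    print(round(w0 * 1e6, 1), round(z0, 3), round(M2, 2))
    ```

    On noise-free data the fit returns the waist, waist location, and M² exactly; with real scan data the same three formulas apply to the fitted coefficients.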

  13. Investigation of potential factors affecting the measurement of dew point temperature in oil-soaked transformers

    NASA Astrophysics Data System (ADS)

    Kraus, Adam H.

    Moisture within a transformer's insulation system has been proven to degrade its dielectric strength. When installing a transformer in situ, one method used to calculate the moisture content of the transformer insulation is to measure the dew point temperature of the internal gas volume of the transformer tank. There are two instruments commercially available that are designed for dew point temperature measurement: the Alnor Model 7000 Dewpointer and the Vaisala DRYCAP® Hand-Held Dewpoint Meter DM70. Although these instruments perform an identical task, the design technology behind each instrument is vastly different. When the Alnor Dewpointer and Vaisala DM70 instruments are used to measure the dew point of the internal gas volume simultaneously from a pressurized transformer, their differences in dew point measurement have been observed to vary as much as 30 °F. There is minimal scientific research available that focuses on the process of measuring dew point of a gas inside a pressurized transformer, let alone this observed phenomenon. The primary objective of this work was to determine what effect certain factors potentially have on dew point measurements of a transformer's internal gas volume, in hopes of understanding the root cause of this phenomenon. Three factors that were studied include (1) human error, (2) the use of calibrated and out-of-calibration instruments, and (3) the presence of oil vapor gases in the dry air sample, and their subsequent effects on the Q-value of the sampled gas. After completing this portion of testing, none of the selected variables proved to be a direct cause of the observed discrepancies between the two instruments. The secondary objective was to validate the accuracy of each instrument as compared to its respective published range by testing against a known dew point temperature produced by a humidity generator. 
In a select operating range of -22 °F to -4 °F, both instruments were found to be accurate and within their specified tolerances. This temperature range is frequently encountered in oil-soaked transformers, and demonstrates that both instruments can measure accurately over a limited, yet common, range despite their different design methodologies. It is clear that there is another unknown factor present in oil-soaked transformers that is causing the observed discrepancy between these instruments. Future work will include testing on newly manufactured or rewound transformers in order to investigate other variables that could be causing this discrepancy.

  14. Geophysics From Terrestrial Time-Variable Gravity Measurements

    NASA Astrophysics Data System (ADS)

    Van Camp, Michel; de Viron, Olivier; Watlet, Arnaud; Meurers, Bruno; Francis, Olivier; Caudron, Corentin

    2017-12-01

    In a context of global change and increasing anthropic pressure on the environment, monitoring the Earth system and its evolution has become one of the key missions of geosciences. Geodesy is the geoscience that measures the geometric shape of the Earth, its orientation in space, and gravity field. Time-variable gravity, because of its high accuracy, can be used to build an enhanced picture and understanding of the changing Earth. Ground-based gravimetry can determine the change in gravity related to the Earth rotation fluctuation, to celestial body and Earth attractions, to the mass in the direct vicinity of the instruments, and to vertical displacement of the instrument itself on the ground. In this paper, we review the geophysical questions that can be addressed by ground gravimeters used to monitor time-variable gravity. This is done in relation to the instrumental characteristics, noise sources, and good practices. We also discuss the next challenges to be met by ground gravimetry, the place that terrestrial gravimetry should hold in the Earth observation system, and perspectives and recommendations about the future of ground gravity instrumentation.

  15. Effect of corruption on healthcare satisfaction in post-soviet nations: A cross-country instrumental variable analysis of twelve countries.

    PubMed

    Habibov, Nazim

    2016-03-01

    There is a lack of consensus about the effect of corruption on healthcare satisfaction in transitional countries. Interpreting the burgeoning literature on this topic has proven difficult due to reverse causality and omitted variable bias. In this study, the effect of corruption on healthcare satisfaction is investigated in a set of 12 post-socialist countries using instrumental variable regression on the sample of the 2010 Life in Transition survey (N = 8655). The results indicate that experiencing corruption significantly reduces healthcare satisfaction. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.
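    Instrumental variable regression of the kind used in this record is typically implemented as two-stage least squares (2SLS). The simulation below, with a made-up confounder and instrument (all numbers illustrative, not from the survey), shows how 2SLS recovers a causal effect that naive OLS gets wrong under omitted variable bias:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 20000
    u = rng.normal(size=n)                    # unobserved confounder
    z = rng.binomial(1, 0.5, size=n).astype(float)        # instrument
    x = (z + u + rng.normal(size=n) > 0.8).astype(float)  # endogenous treatment
    beta_true = -0.5
    y = beta_true * x - 0.7 * u + rng.normal(size=n)      # outcome

    X = np.column_stack([np.ones(n), x])
    Z = np.column_stack([np.ones(n), z])

    # Naive OLS: biased, because u drives both x and y
    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0][1]

    # Two-stage least squares: project X onto Z, then regress y on the projection
    Xhat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
    beta_iv = np.linalg.lstsq(Xhat, y, rcond=None)[0][1]
    print(round(beta_ols, 2), round(beta_iv, 2))
    ```

    The IV estimate lands near the true effect of -0.5 while OLS is pushed well away from it, which is the whole point of the design: the instrument shifts the treatment but is independent of the confounder.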

  16. Validity and reliability of the Behavior Problems Inventory, the Aberrant Behavior Checklist, and the Repetitive Behavior Scale-Revised among infants and toddlers at risk for intellectual or developmental disabilities: a multi-method assessment approach.

    PubMed

    Rojahn, Johannes; Schroeder, Stephen R; Mayo-Ortega, Liliana; Oyama-Ganiko, Rosao; LeBlanc, Judith; Marquis, Janet; Berke, Elizabeth

    2013-05-01

    Reliable and valid assessment of aberrant behaviors is essential in empirically verifying prevention and intervention for individuals with intellectual or developmental disabilities (IDD). Few instruments exist which assess behavior problems in infants. The current longitudinal study examined the performance of three behavior-rating scales for individuals with IDD that have proven psychometrically sound in older populations: the Aberrant Behavior Checklist (ABC), the Behavior Problems Inventory (BPI-01), and the Repetitive Behavior Scale-Revised (RBS-R). Data were analyzed for 180 children between 6 and 36 months of age at risk for IDD. Internal consistency (Cronbach's α) across the subscales of the three instruments was variable. Test-retest reliability of the three BPI-01 subscales ranged from .68 to .77 for frequency ratings and from .65 to .80 for severity ratings (intraclass correlation coefficients). Using a multitrait-multimethod matrix approach, high levels of convergent and discriminant validity across the three instruments were found. As anticipated, there was considerable overlap in the information produced by the three instruments; however, each behavior-rating instrument also contributed unique information. Our findings support using all three scales in conjunction if possible. Copyright © 2013 Elsevier Ltd. All rights reserved.
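    The intraclass correlation coefficients used above for test-retest reliability come from a simple variance decomposition. This is a one-way random-effects ICC(1,1) sketch on simulated ratings; the record does not specify which ICC variant the authors computed, so treat the formula choice as an assumption:

    ```python
    import numpy as np

    def icc_oneway(scores):
        """One-way random-effects ICC(1,1) for an (n_subjects, k_ratings) matrix."""
        scores = np.asarray(scores, dtype=float)
        n, k = scores.shape
        grand = scores.mean()
        row_means = scores.mean(axis=1)
        msb = k * ((row_means - grand) ** 2).sum() / (n - 1)              # between subjects
        msw = ((scores - row_means[:, None]) ** 2).sum() / (n * (k - 1))  # within subjects
        return (msb - msw) / (msb + (k - 1) * msw)

    # Hypothetical test-retest data: a stable trait plus occasion noise
    rng = np.random.default_rng(3)
    trait = rng.normal(size=(300, 1))
    ratings = trait + rng.normal(scale=0.5, size=(300, 2))
    print(round(icc_oneway(ratings), 2))
    ```

    With trait variance 1 and noise variance 0.25, the population ICC is 1/(1 + 0.25) = 0.8, and the estimate lands near that value.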

  17. The clinical learning environment and supervision by staff nurses: developing the instrument.

    PubMed

    Saarikoski, Mikko; Leino-Kilpi, Helena

    2002-03-01

    The aims of this study were (1) to describe students' perceptions of the clinical learning environment and clinical supervision and (2) to develop an evaluation scale by using the empirical results of this study. The data were collected using the Clinical Learning Environment and Supervision instrument (CLES). The instrument was based on the literature review of earlier studies. The derived instrument was tested empirically in a study involving nurse students (N=416) from four nursing colleges in Finland. The results demonstrated that the method of supervision, the number of separate supervision sessions and the psychological content of supervisory contact within a positive ward atmosphere are the most important variables in the students' clinical learning. The results also suggest that ward managers can create the conditions of a positive ward culture and a positive attitude towards students and their learning needs. The construct validity of the instrument was analysed by using exploratory factor analysis. The analysis indicated that the most important factor in the students' clinical learning is the supervisory relationship. The two most important factors constituting a 'good' clinical learning environment are the management style of the ward manager and the premises of nursing on the ward. The results of the factor analysis support the theoretical construction of the clinical learning environment modelled by earlier empirical studies.

  18. TREMOR: A wireless MEMS accelerograph for dense arrays

    USGS Publications Warehouse

    Evans, J.R.; Hamstra, R.H.; Kundig, C.; Camina, P.; Rogers, J.A.

    2005-01-01

    The ability of a strong-motion network to resolve wavefields can be described on three axes: frequency, amplitude, and space. While the need for spatial resolution is apparent, for practical reasons that axis is often neglected. TREMOR is a MEMS-based accelerograph using wireless Internet to minimize lifecycle cost. TREMOR instruments can economically augment traditional ones, residing between them to improve spatial resolution. The TREMOR instrument described here has a dynamic range of 96 dB between ±2 g, or 102 dB between ±4 g. It is linear to ±1% of full scale (FS), with a response function effectively shaped electronically. We developed an economical, very low noise, accurate (±1% FS) temperature compensation method. Displacement is easily recovered to 10-cm accuracy at full bandwidth, and better with care. We deployed prototype instruments in Oakland, California, beginning in 1998, with 13 now at a mean spacing of ≈3 km - one of the most densely instrumented urban centers in the United States. This array is among the quickest in returning (PGA, PGV, Sa) vectors to ShakeMap, ≈75 to 100 s. Some 13 events have been recorded. A ShakeMap and an example of spatial variability are shown. Extensive tests of the prototypes for a commercial instrument are described here and in a companion paper. © 2005, Earthquake Engineering Research Institute.

  19. Ultrasonic correlator versus signal averager as a signal to noise enhancement instrument

    NASA Technical Reports Server (NTRS)

    Kishoni, Doron; Pietsch, Benjamin E.

    1989-01-01

    Ultrasonic inspection of thick and attenuating materials is hampered by the reduced amplitudes of the propagated waves, to a degree that the noise is too high to enable meaningful interpretation of the data. In order to overcome the low Signal to Noise (S/N) ratio, a correlation technique has been developed. In this method, a continuous pseudo-random pattern generated digitally is transmitted and detected by piezoelectric transducers. A correlation is performed in the instrument between the received signal and a variably delayed image of the transmitted one. The result is shown to be proportional to the impulse response of the investigated material, analogous to a signal received from a pulsed system, with an improved S/N ratio. The degree of S/N enhancement depends on the sweep rate. This paper describes the correlator, and compares it to the method of enhancing the S/N ratio by averaging the signals. The similarities and differences between the two are highlighted and the potential advantage of the correlator system is explained.
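    The correlation principle described here — transmit a pseudo-random sequence and correlate the received signal against delayed copies of it — can be sketched numerically. The echo delays, noise level, and sequence length below are invented for illustration; for a white ±1 sequence the correlation at lag d estimates the impulse response h[d]:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 4096
    x = rng.choice([-1.0, 1.0], size=n)     # pseudo-random transmitted pattern

    # Hypothetical impulse response: two echoes at delays of 40 and 90 samples
    h = np.zeros(128)
    h[40], h[90] = 1.0, 0.6

    # Received signal: convolution with the material response, buried in noise
    y = np.convolve(x, h)[:n] + rng.normal(scale=2.0, size=n)

    # Correlate against delayed copies: (1/n) * sum_t y[t+d] x[t] estimates h[d]
    h_est = np.array([np.dot(y[d:], x[:n - d]) / (n - d) for d in range(128)])
    print(int(np.argmax(h_est)))            # delay of the strongest echo
    ```

    Although the raw echoes are well below the noise floor (noise sigma of 2 versus echo amplitudes of 1.0 and 0.6), the correlation gain of roughly the square root of the sequence length pulls both echoes out cleanly, which is the S/N enhancement the abstract describes.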

  20. Comparison of a multispectral vision system and a colorimeter for the assessment of meat color.

    PubMed

    Trinderup, Camilla H; Dahl, Anders; Jensen, Kirsten; Carstensen, Jens Michael; Conradsen, Knut

    2015-04-01

    The color assessment ability of a multispectral vision system is investigated by a comparison study with color measurements from a traditional colorimeter. The experiment involves fresh and processed meat samples. Meat is a complex material; it is heterogeneous, with varying scattering and reflectance properties, so several factors can influence the instrumental assessment of meat color. In order to assess whether the two methods are equivalent, the variation due to these factors must be taken into account. A statistical analysis was conducted and showed that on a calibration sheet the two instruments are equally capable of measuring color. Moreover, for fresh meat samples with a glossier surface, the vision system provides a richer color assessment than the colorimeter. Careful study of the different sources of variation enables an assessment of the order of magnitude of the between-method variability while accounting for the other sources of variation, leading to the conclusion that color assessment using a multispectral vision system is superior to traditional colorimeter assessments. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Palladium-based Mass-Tag Cell Barcoding with a Doublet-Filtering Scheme and Single Cell Deconvolution Algorithm

    PubMed Central

    Zunder, Eli R.; Finck, Rachel; Behbehani, Gregory K.; Amir, El-ad D.; Krishnaswamy, Smita; Gonzalez, Veronica D.; Lorang, Cynthia G.; Bjornson, Zach; Spitzer, Matthew H.; Bodenmiller, Bernd; Fantl, Wendy J.; Pe’er, Dana; Nolan, Garry P.

    2015-01-01

    SUMMARY Mass-tag cell barcoding (MCB) labels individual cell samples with unique combinatorial barcodes, after which they are pooled for processing and measurement as a single multiplexed sample. The MCB method eliminates variability between samples in antibody staining and instrument sensitivity, reduces antibody consumption, and shortens instrument measurement time. Here, we present an optimized MCB protocol with several improvements over previously described methods. The use of palladium-based labeling reagents expands the number of measurement channels available for mass cytometry and reduces interference with lanthanide-based antibody measurement. An error-detecting combinatorial barcoding scheme allows cell doublets to be identified and removed from the analysis. A debarcoding algorithm that is single cell-based rather than population-based improves the accuracy and efficiency of sample deconvolution. This debarcoding algorithm has been packaged into software that allows rapid and unbiased sample deconvolution. The MCB procedure takes 3–4 h, not including sample acquisition time of ~1 h per million cells. PMID:25612231
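    The doublet-filtering scheme summarized above — each sample gets a k-of-n combinatorial barcode, and any cell whose measured pattern is not a clean k-of-n word, or whose separation between "on" and "off" channels is poor, is discarded — can be sketched as follows. The channel count, cutoff, and intensities are illustrative, not the published palladium scheme's values:

    ```python
    from itertools import combinations
    import numpy as np

    n_channels, k = 7, 3
    # Codebook of k-of-n barcode words; patterns outside it are rejected
    codebook = {frozenset(c): i
                for i, c in enumerate(combinations(range(n_channels), k))}

    def debarcode(intensity, sep_cutoff=0.3):
        """Assign one cell to a barcode, or return None (doublet / low confidence)."""
        order = np.argsort(intensity)[::-1]
        top = frozenset(order[:k].tolist())
        # separation between the weakest "on" and strongest "off" channel
        sep = intensity[order[k - 1]] - intensity[order[k]]
        if sep < sep_cutoff or top not in codebook:
            return None
        return codebook[top]

    singlet = np.array([1.0, 0.9, 1.1, 0.05, 0.1, 0.0, 0.1])   # channels {0,1,2}
    doublet = np.array([1.0, 0.9, 1.1, 0.05, 1.0, 0.9, 1.2])   # {0,1,2} + {4,5,6}
    print(debarcode(singlet), debarcode(doublet))
    ```

    A doublet carries the union of two k-channel codes, so more than k channels light up and the on/off separation collapses — exactly the error-detecting property the barcoding scheme exploits.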

  2. Ultrasonic correlator versus signal averager as a signal to noise enhancement instrument

    NASA Technical Reports Server (NTRS)

    Kishoni, Doron; Pietsch, Benjamin E.

    1990-01-01

    Ultrasonic inspection of thick and attenuating materials is hampered by the reduced amplitudes of the propagated waves, to a degree that the noise is too high to enable meaningful interpretation of the data. In order to overcome the low signal to noise ratio (S/N), a correlation technique has been developed. In this method, a continuous pseudo-random pattern generated digitally is transmitted and detected by piezoelectric transducers. A correlation is performed in the instrument between the received signal and a variably delayed image of the transmitted one. The result is shown to be proportional to the impulse response of the investigated material, analogous to a signal received from a pulsed system, with an improved S/N ratio. The degree of S/N enhancement depends on the sweep rate. The correlator is described, and it is compared to the method of enhancing the S/N ratio by averaging the signals. The similarities and differences between the two are highlighted and the potential advantage of the correlator system is explained.

  3. Chromotomography for a rotating-prism instrument using backprojection, then filtering.

    PubMed

    Deming, Ross W

    2006-08-01

    A simple closed-form solution is derived for reconstructing a 3D spatial-chromatic image cube from a set of chromatically dispersed 2D image frames. The algorithm is tailored for a particular instrument in which the dispersion element is a matching set of mechanically rotated direct vision prisms positioned between a lens and a focal plane array. By using a linear operator formalism to derive the Tikhonov-regularized pseudoinverse operator, it is found that the unique minimum-norm solution is obtained by applying the adjoint operator, followed by 1D filtering with respect to the chromatic variable. Thus the filtering and backprojection (adjoint) steps are applied in reverse order relative to an existing method. Computational efficiency is provided by use of the fast Fourier transform in the filtering step.
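    The reordering claim in this record — that the regularized pseudoinverse can be applied as the adjoint (backprojection) followed by filtering — rests, in matrix form, on the "push-through" identity (AᵀA + λI)⁻¹Aᵀ = Aᵀ(AAᵀ + λI)⁻¹, checked numerically below with a random dense operator standing in for the prism system (in the actual instrument the operator's structure further reduces the filter to a 1D chromatic filter, which this generic sketch does not capture):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    A = rng.normal(size=(40, 60))    # stand-in for the dispersion/projection operator
    lam = 0.1                        # Tikhonov regularization parameter
    y = rng.normal(size=40)          # stand-in for the stacked 2D image frames

    # Filter-then-backproject ordering: normal equations solved in image space
    x1 = np.linalg.solve(A.T @ A + lam * np.eye(60), A.T @ y)

    # Backproject-then-filter ordering: solve in data space, then apply the adjoint
    x2 = A.T @ np.linalg.solve(A @ A.T + lam * np.eye(40), y)

    print(np.allclose(x1, x2))
    ```

    Both orderings yield the same Tikhonov-regularized minimum-norm solution; the second is cheaper whenever the data-space system is smaller or, as in the paper, the filtering step collapses to 1D.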

  4. Communication of mechanically ventilated patients in intensive care units

    PubMed Central

    Martinho, Carina Isabel Ferreira; Rodrigues, Inês Tello Rato Milheiras

    2016-01-01

    Objective The aim of this study was to translate and culturally and linguistically adapt the Ease of Communication Scale and to assess the level of communication difficulties for patients undergoing mechanical ventilation with orotracheal intubation, relating these difficulties to clinical and sociodemographic variables. Methods This study had three stages: (1) cultural and linguistic adaptation of the Ease of Communication Scale; (2) preliminary assessment of its psychometric properties; and (3) observational, descriptive-correlational and cross-sectional study, conducted from March to August 2015, based on the Ease of Communication Scale - after extubation answers and clinical and sociodemographic variables of 31 adult patients who were extubated, clinically stable and admitted to five Portuguese intensive care units. Results Expert analysis showed high agreement on content (100%) and relevance (75%). The pretest scores showed a high acceptability regarding the completion of the instrument and its usefulness. The Ease of Communication Scale showed excellent internal consistency (0.951 Cronbach's alpha). The factor analysis explained approximately 81% of the total variance with two scale components. On average, the patients considered the communication experiences during intubation to be "quite hard" (2.99). No significant correlation was observed between the communication difficulties reported and the studied sociodemographic and clinical variables, except for the clinical variable "number of hours after extubation" (p < 0.05). Conclusion This study translated and adapted the first assessment instrument of communication difficulties for mechanically ventilated patients in intensive care units into European Portuguese. The preliminary scale validation suggested high reliability. 
Patients undergoing mechanical ventilation reported that communication during intubation was "quite hard", and these communication difficulties apparently existed regardless of the presence of other clinical and/or sociodemographic variables. PMID:27410408
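    An internal-consistency figure of the kind reported above (Cronbach's alpha of 0.951) comes from the standard variance-based formula; a minimal sketch with made-up item scores, not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the total score
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Illustrative data only: 6 respondents answering 3 strongly related items.
scores = np.array([
    [4, 4, 5],
    [2, 2, 2],
    [5, 4, 5],
    [1, 2, 1],
    [3, 3, 4],
    [5, 5, 5],
], dtype=float)
print(round(cronbach_alpha(scores), 3))
```

    Values approaching 1 indicate that the items covary strongly, i.e. they appear to measure the same underlying construct.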

  5. SeaWiFS technical report series. Volume 25: Ocean optics protocols for SeaWiFS validation, revision 1

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Acker, James G. (Editor); Mueller, James L.; Austin, Roswell W.

    1995-01-01

    This report presents protocols for measuring optical properties, and other environmental variables, to validate the radiometric performance of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), and to develop and validate bio-optical algorithms for use with SeaWiFS data. The protocols are intended to establish foundations for a measurement strategy to verify the challenging SeaWiFS uncertainty goals of 5 percent in water-leaving radiances and 35 percent in chlorophyll a concentration. The protocols first specify the variables which must be measured, and briefly review the rationale for measuring each variable. Subsequent chapters cover detailed protocols for instrument performance specifications, characterizing and calibrating instruments, methods of making measurements in the field, and methods of data analysis. These protocols were developed at a workshop sponsored by the SeaWiFS Project Office (SPO) and held at the Naval Postgraduate School in Monterey, California (9-12 April 1991). This report began as the proceedings of the workshop, as interpreted and expanded by the authors and reviewed by workshop participants and other members of the bio-optical research community. The protocols are an evolving prescription to allow the research community to approach the unprecedented measurement uncertainties implied by the SeaWiFS goals; research and development are needed to improve the state-of-the-art in specific areas. These protocols should be periodically revised to reflect technical advances during the SeaWiFS Project cycle. The present edition (Revision 1) incorporates new protocols in several areas, including expanded protocol descriptions for Case-2 waters and other improvements, as contributed by several members of the SeaWiFS Science Team.

  6. Early meteorological records from Latin-America and the Caribbean during the 18th and 19th centuries

    NASA Astrophysics Data System (ADS)

    Domínguez-Castro, Fernando; Vaquero, José Manuel; Gallego, María Cruz; Farrona, Ana María Marín; Antuña-Marrero, Juan Carlos; Cevallos, Erika Elizabeth; Herrera, Ricardo García; de La Guía, Cristina; Mejía, Raúl David; Naranjo, José Manuel; Del Rosario Prieto, María; Ramos Guadalupe, Luis Enrique; Seiner, Lizardo; Trigo, Ricardo Machado; Villacís, Marcos

    2017-11-01

    This paper provides early instrumental data recovered for 20 countries of Latin-America and the Caribbean (Argentina, Bahamas, Belize, Brazil, British Guiana, Chile, Colombia, Costa Rica, Cuba, Ecuador, France (Martinique and Guadalupe), Guatemala, Jamaica, Mexico, Nicaragua, Panama, Peru, Puerto Rico, El Salvador and Suriname) during the 18th and 19th centuries. The main meteorological variables retrieved were air temperature, atmospheric pressure, and precipitation, but other variables, such as humidity, wind direction, and state of the sky, were retrieved when possible. In total, more than 300,000 early instrumental data were rescued (96% with daily resolution). Special effort was made to document all the available metadata in order to allow further post-processing. The compilation is far from being exhaustive, but the dataset will contribute to a better understanding of climate variability in the region, and to enlarging the period of overlap between instrumental data and natural/documentary proxies.

  7. Early meteorological records from Latin-America and the Caribbean during the 18th and 19th centuries.

    PubMed

    Domínguez-Castro, Fernando; Vaquero, José Manuel; Gallego, María Cruz; Farrona, Ana María Marín; Antuña-Marrero, Juan Carlos; Cevallos, Erika Elizabeth; Herrera, Ricardo García; de la Guía, Cristina; Mejía, Raúl David; Naranjo, José Manuel; Del Rosario Prieto, María; Ramos Guadalupe, Luis Enrique; Seiner, Lizardo; Trigo, Ricardo Machado; Villacís, Marcos

    2017-11-14

    This paper provides early instrumental data recovered for 20 countries of Latin-America and the Caribbean (Argentina, Bahamas, Belize, Brazil, British Guiana, Chile, Colombia, Costa Rica, Cuba, Ecuador, France (Martinique and Guadalupe), Guatemala, Jamaica, Mexico, Nicaragua, Panama, Peru, Puerto Rico, El Salvador and Suriname) during the 18th and 19th centuries. The main meteorological variables retrieved were air temperature, atmospheric pressure, and precipitation, but other variables, such as humidity, wind direction, and state of the sky, were retrieved when possible. In total, more than 300,000 early instrumental data were rescued (96% with daily resolution). Special effort was made to document all the available metadata in order to allow further post-processing. The compilation is far from being exhaustive, but the dataset will contribute to a better understanding of climate variability in the region, and to enlarging the period of overlap between instrumental data and natural/documentary proxies.

  8. The impact of extracurricular activities participation on youth delinquent behaviors: An instrumental variables approach.

    PubMed

    Han, Sehee; Lee, Jonathan; Park, Kyung-Gook

    2017-07-01

    The purpose of this study was to examine the association between extracurricular activities (EA) participation and youth delinquency while tackling an endogeneity problem of EA participation. Using survey data of 12th graders in South Korea (n = 1943), this study employed an instrumental variables approach to address the self-selection problem of EA participation as the data for this study was based on an observational study design. We found a positive association between EA participation and youth delinquency based on conventional regression analysis. By contrast, we found a negative association between EA participation and youth delinquency based on an instrumental variables approach. These results indicate that caution should be exercised when we interpret the effect of EA participation on youth delinquency based on observational study designs. Copyright © 2017 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
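    The sign reversal the authors describe — a positive association under conventional regression but a negative one under IV — can be reproduced on simulated data with self-selection. The data-generating process, instrument, and effect sizes below are invented for illustration, and a simple Wald estimator stands in for the study's exact specification:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

u = rng.normal(size=n)                             # unmeasured confounder
z = rng.binomial(1, 0.5, size=n).astype(float)     # instrument: shifts participation only
d = (0.8 * z + u + rng.normal(size=n) > 0).astype(float)  # self-selected participation
y = -0.5 * d + u + rng.normal(size=n)              # outcome; true causal effect is -0.5

# Naive OLS slope of y on d is biased upward because u drives both d and y.
ols = np.polyfit(d, y, 1)[0]

# Wald/IV estimator for a binary instrument: reduced-form effect over first stage.
iv = (y[z == 1].mean() - y[z == 0].mean()) / (d[z == 1].mean() - d[z == 0].mean())

print(round(ols, 2), round(iv, 2))  # OLS positive, IV near the true -0.5
```

    Because the instrument z is independent of the confounder u and affects y only through d, the Wald ratio recovers the causal effect that the naive regression gets wrong in sign.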

  9. The use of digital PCR to improve the application of quantitative molecular diagnostic methods for tuberculosis.

    PubMed

    Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F

    2016-08-03

    Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration; however, there was a marked difference in the measured magnitude. TB is a disease where the quantification of the pathogen could lead to better patient management, and qPCR methods offer the potential to rapidly perform such analysis.
However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification of Mycobacterium tuberculosis would provide a clinically useful readout. The methods described in this study provide a means by which the technical performance of quantitative molecular methods can be evaluated independently of clinical variability to improve accuracy of measurement results. These will assist in ultimately increasing the likelihood that such approaches could be used to improve patient management of TB.
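    dPCR quantification rests on Poisson statistics of template partitioning: the mean number of copies per partition follows from the fraction of positive partitions alone, independent of amplification efficiency. A minimal sketch of this standard occupancy correction, with made-up partition counts:

```python
import math

def dpcr_copies_per_partition(n_positive: int, n_total: int) -> float:
    """Mean target copies per partition, assuming template molecules are
    Poisson-distributed across partitions (a partition is negative only
    when it received zero copies)."""
    p_negative = 1.0 - n_positive / n_total
    return -math.log(p_negative)

# Illustrative counts: 5,000 of 20,000 partitions fluoresce.
lam = dpcr_copies_per_partition(5_000, 20_000)
print(round(lam, 4))  # mean copies per partition
```

    Multiplying by the number of partitions and dividing by the analysed volume converts this to a concentration, which is why dPCR can serve as a calibration-free reference measurement.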

  10. Importance of Intrinsic and Instrumental Value of Education in Pakistan

    ERIC Educational Resources Information Center

    Kumar, Mahendar

    2017-01-01

    Normally, effectiveness of any object or thing is judged by two values; intrinsic and instrumental. To compare intrinsic value of education with instrumental value, this study has used the following variables: getting knowledge for its own sake, getting knowledge for social status, getting knowledge for job or business endeavor and getting…

  11. Do genetic risk scores for body mass index predict risk of phobic anxiety? Evidence for a shared genetic risk factor

    PubMed Central

    Walter, Stefan; Glymour, M. Maria; Koenen, Karestan; Liang, Liming; Tchetgen Tchetgen, Eric J; Cornelis, Marilyn; Chang, Shun-Chiao; Rewak, Marissa; Rimm, Eric; Kawachi, Ichiro; Kubzansky, Laura D.

    2015-01-01

    Background Obesity and anxiety are often linked, but the direction of effects is not clear. Methods Using genetic instrumental variable (IV) analyses in a sample of 5911 female participants from the Nurses' Health Study (NHS, initiated in 1976) and 3697 male participants from the Health Professionals Follow-up Study (HPFS, initiated in 1986), we aim to determine whether obesity increases symptoms of phobic anxiety. FTO, MC4R, and a genetic risk score (GRS) based on 32 single nucleotide polymorphisms that significantly predict body mass index (BMI) were used as instrumental variables. "Functional" GRSs corresponding to specific biological pathways that shape BMI (adipogenesis, appetite, and cardio-pulmonary) were also considered. Phobic anxiety, as measured by the Crown-Crisp Experiential Index (CCI) in 2004 in NHS and 2000 in HPFS, was the main outcome. Results In observational analysis, a one-unit higher BMI was associated with higher phobic anxiety symptoms (women, NHS: beta = 0.05; 95% confidence interval (CI): 0.030-0.068; men, HPFS: beta = 0.04; 95% CI: 0.016-0.071). IV analyses showed that BMI instrumented by FTO was associated with higher phobic anxiety symptoms (p = 0.005), but BMI instrumented by the GRS was not (p = 0.256). Functional GRS scores showed heterogeneous, non-significant effects of BMI on phobic anxiety symptoms. Conclusions Our findings do not provide conclusive evidence in favor of the hypothesis that higher BMI leads to higher levels of phobic anxiety, but rather suggest that genes that influence obesity, in particular FTO, may have direct effects on phobic anxiety, i.e., that obesity and phobic anxiety may share common genetic determinants. PMID:25065638

  12. Cross-cultural adaptation and reproducibility of the Brazilian-Portuguese version of the modified FRESNO Test to evaluate the competence in evidence based practice by physical therapists

    PubMed Central

    Silva, Anderson M.; Costa, Lucíola C. M.; Comper, Maria L.; Padula, Rosimeire S.

    2016-01-01

    BACKGROUND: The Modified Fresno Test was developed to assess the knowledge and skills of both physical therapy (PT) professionals and students in using evidence-based practice (EBP). OBJECTIVES: To translate the Modified Fresno Test into Brazilian-Portuguese and to evaluate the test's reproducibility. METHOD: The first step consisted of adapting the instrument to the Brazilian-Portuguese language. Then, a total of 57 participants, including PT students, PT professors and PT practitioners, completed the translated instrument. The responses from the participants were used to evaluate the reproducibility of the translated instrument. Internal consistency was calculated using Cronbach's alpha. Reliability was calculated using the intraclass correlation coefficient (ICC) for continuous variables and the Kappa coefficient (K) for categorical variables. Agreement was assessed using the standard error of measurement (SEM). RESULTS: The cross-cultural adaptation process was appropriate, providing an adequate Brazilian-Portuguese version of the instrument. The internal consistency was good (α=0.769). The inter-rater reliability was ICC=0.89 (95% CI 0.82 to 0.93); the intra-rater reliability was ICC=0.85 (95% CI 0.57 to 0.93) for evaluator 1 and ICC=0.98 (95% CI 0.97 to 0.99) for evaluator 2. The SEM was 13.04 points for the inter-rater assessment, 12.57 points for rater 1 and 4.59 points for rater 2. CONCLUSION: The Brazilian-Portuguese version of the Modified Fresno Test showed satisfactory results in terms of reproducibility. The Modified Fresno Test will allow physical therapy professionals and students to be evaluated on their understanding and use of EBP. PMID:26786079
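    SEM values of the kind reported in this record relate reliability to score spread through the standard formula SEM = SD * sqrt(1 - ICC); a sketch using the inter-rater ICC of 0.89 reported above and a hypothetical between-subject SD of 40 points (the study's raw SD is not given here):

```python
import math

def sem_from_icc(sd: float, icc: float) -> float:
    """Standard error of measurement from a reliability coefficient
    and the between-subject standard deviation of the scores."""
    return sd * math.sqrt(1.0 - icc)

# Hypothetical SD of 40 points combined with the reported inter-rater ICC of 0.89.
print(round(sem_from_icc(40.0, 0.89), 2))
```

    The SEM expresses, in test-score points, how much a single administration may deviate from a subject's true score, which is why it is reported alongside the dimensionless ICC.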

  13. Observing variable stars at the University of Athens Observatory

    NASA Astrophysics Data System (ADS)

    Gazeas, K.; Manimanis, V. N.; Niarchos, P. G.

    In 1999 the University of Athens installed a 0.4-m Cassegrain telescope (CCT-16, by DFM Engineering) on the roof of the Department of Astrophysics, Astronomy and Mechanics, equipped with an ST-8 CCD camera and Bessel UBVRI filters. Although the telescope was built for educational purposes, we found that it can also be an excellent research instrument, as we can obtain fine-quality light curves of bright variable stars even from a site close to the city center. Light curves of the δ Scuti star V1162 Ori and of the sdB star PG 1336-018 are presented, showing the ability of a 40-cm telescope to detect small luminosity fluctuations of relatively bright variable stars. To date, we have succeeded in performing photometry of stars down to 15th magnitude with satisfactory results. We expect to achieve even better results in the future, as our methods continue to improve and as the large number of relatively bright stars gives us the chance to study various fields of CCD photometry of variables.

  14. Applying Propensity Score Methods in Medical Research: Pitfalls and Prospects

    PubMed Central

    Luo, Zhehui; Gardiner, Joseph C.; Bradley, Cathy J.

    2012-01-01

    The authors review experimental and nonexperimental causal inference methods, focusing on assumptions for the validity of instrumental variables and propensity score (PS) methods. They provide guidance in four areas for the analysis and reporting of PS methods in medical research and selectively evaluate mainstream medical journal articles from 2000 to 2005 in the four areas, namely, examination of balance, overlapping support description, use of estimated PS for evaluation of treatment effect, and sensitivity analyses. In spite of the many pitfalls, when appropriately evaluated and applied, PS methods can be powerful tools in assessing average treatment effects in observational studies. Appropriate PS applications can create experimental conditions using observational data when randomized controlled trials are not feasible and, thus, lead researchers to an efficient estimator of the average treatment effect. PMID:20442340

  15. Impact of Financial Incentives for Prenatal Care on Birth Outcomes and Spending

    PubMed Central

    Rosenthal, Meredith B; Li, Zhonghe; Robertson, Audra D; Milstein, Arnold

    2009-01-01

    Objective To evaluate the impact of offering US$100 each to patients and their obstetricians or midwives for timely and comprehensive prenatal care on low birth weight, neonatal intensive care admissions, and total pediatric health care spending in the first year of life. Data Sources/Study Setting Claims and enrollment profiles of the predominantly low-income and Hispanic participants of a union-sponsored health insurance plan from 1998 to 2001. Study Design Panel data analysis of outcomes and spending for participants and nonparticipants using instrumental variables to account for selection bias. Data Collection/Abstraction Methods Data provided were analyzed using t-tests and chi-squared tests to compare maternal characteristics and birth outcomes for incentive program participants and nonparticipants, with and without instrumental variables to address selection bias. Adjusted variables were analyzed using logistic regression models. Principal Findings Participation in the incentive program was significantly associated with lower odds of neonatal intensive care unit admission (0.45; 95 percent CI, 0.23–0.88) and spending in the first year of life (estimated elasticity of −0.07; 95 percent CI, −0.12 to −0.01), but not low birth weight (0.53; 95 percent CI, 0.23–1.18). Conclusion The use of patient and physician incentives may be an effective mechanism for improving use of recommended prenatal care and associated outcomes, particularly among low-income women. PMID:19619248

  16. The effect of aircraft control forces on pilot performance during instrument landings in a flight simulator.

    PubMed

    Hewson, D J; McNair, P J; Marshall, R N

    2001-07-01

    Pilots may have difficulty controlling aircraft at both high and low force levels due to larger variability in force production at these force levels. The aim of this study was to measure the force variability and landing performance of pilots during an instrument landing in a flight simulator. Twelve pilots were tested while performing 5 instrument landings in a flight simulator, each of which required different control force inputs. Pilots can produce the least force when pushing the control column to the right; therefore, the force levels for the landings were set relative to each pilot's maximum aileron-right force. The force levels for the landings were 90%, 60%, and 30% of maximal aileron-right force, normal force, and 25% of normal force. Variables recorded included electromyographic activity (EMG), aircraft control forces, aircraft attitude, perceived exertion and deviation from glide slope and heading. Multivariate analysis of variance was used to test for differences between landings. Pilots were least accurate in landing performance during the landing at 90% of maximal force (p < 0.05). There was also a trend toward decreased landing performance during the landing at 25% of normal force. Pilots were more variable in force production during the landings at 60% and 90% of maximal force (p < 0.05). Pilots are less accurate at performing instrument landings when control forces are high due to the increased variability of force production. The increase in variability at high force levels is most likely associated with motor unit recruitment, rather than rate coding. Aircraft designers need to consider the reduction in pilot performance at high force levels, as well as pilot strength limits, when specifying new standards.

  17. Sources of Differences in On-Orbit Total Solar Irradiance Measurements and Description of Proposed Laboratory Intercomparison

    NASA Technical Reports Server (NTRS)

    Butler, J.J.; Johnson, B. C.; Rice, J. P.; Shirley, E. L.; Barnes, R.A.

    2008-01-01

    There is a 5 W/sq m (about 0.35 %) difference between current on-orbit Total Solar Irradiance (TSI) measurements. On 18-20 July 2005, a workshop was held at the National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland that focused on understanding possible reasons for this difference, through an examination of the instrument designs, calibration approaches, and appropriate measurement equations. The instruments studied in that workshop included the Active Cavity Radiometer Irradiance Monitor III (ACRIM III) on the Active Cavity Radiometer Irradiance Monitor SATellite (ACRIMSAT), the Total Irradiance Monitor (TIM) on the Solar Radiation and Climate Experiment (SORCE), the Variability of solar IRradiance and Gravity Oscillations (VIRGO) on the Solar and Heliospheric Observatory (SOHO), and the Earth Radiation Budget Experiment (ERBE) on the Earth Radiation Budget Satellite (ERBS). Presentations for each instrument included descriptions of its design, its measurement equation and uncertainty budget, and the methods used to assess on-orbit degradation. The workshop also included a session on satellite- and ground-based instrument comparisons and a session on laboratory-based comparisons and the application of new laboratory comparison techniques. The workshop has led to investigations of the effects of diffraction and of aperture area measurements on the differences between instruments. In addition, a laboratory-based instrument comparison is proposed that uses optical power measurements (with lasers that underfill the apertures of the TSI instruments), irradiance measurements (with lasers that overfill the apertures of the TSI instrument), and a cryogenic electrical substitution radiometer as a standard for comparing the instruments. A summary of the workshop and an overview of the proposed research efforts are presented here.

  18. Sources of Differences in On-Orbital Total Solar Irradiance Measurements and Description of a Proposed Laboratory Intercomparison

    PubMed Central

    Butler, J. J; Johnson, B. C; Rice, J. P; Shirley, E. L; Barnes, R. A

    2008-01-01

    There is a 5 W/m2 (about 0.35 %) difference between current on-orbit Total Solar Irradiance (TSI) measurements. On 18–20 July 2005, a workshop was held at the National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland that focused on understanding possible reasons for this difference, through an examination of the instrument designs, calibration approaches, and appropriate measurement equations. The instruments studied in that workshop included the Active Cavity Radiometer Irradiance Monitor III (ACRIM III) on the Active Cavity Radiometer Irradiance Monitor SATellite (ACRIMSAT), the Total Irradiance Monitor (TIM) on the Solar Radiation and Climate Experiment (SORCE), the Variability of solar IRradiance and Gravity Oscillations (VIRGO) on the Solar and Heliospheric Observatory (SOHO), and the Earth Radiation Budget Experiment (ERBE) on the Earth Radiation Budget Satellite (ERBS). Presentations for each instrument included descriptions of its design, its measurement equation and uncertainty budget, and the methods used to assess on-orbit degradation. The workshop also included a session on satellite- and ground-based instrument comparisons and a session on laboratory-based comparisons and the application of new laboratory comparison techniques. The workshop has led to investigations of the effects of diffraction and of aperture area measurements on the differences between instruments. In addition, a laboratory-based instrument comparison is proposed that uses optical power measurements (with lasers that underfill the apertures of the TSI instruments), irradiance measurements (with lasers that overfill the apertures of the TSI instrument), and a cryogenic electrical substitution radiometer as a standard for comparing the instruments. A summary of the workshop and an overview of the proposed research efforts are presented here. PMID:27096120

  19. Functional leg length discrepancy between theories and reliable instrumental assessment: a study about newly invented NPoS system

    PubMed Central

    Mahmoud, Asmaa; Abundo, Paolo; Basile, Luisanna; Albensi, Caterina; Marasco, Morena; Bellizzi, Letizia; Galasso, Franco; Foti, Calogero

    2017-01-01

    Summary Background In spite of the distinct social and financial impact of Leg Length Discrepancy (LLD), controversial and conflicting results still exist regarding a reliable assessment/correction method. For proper management, it is essential to discriminate between anatomical and functional Leg Length Discrepancy (FLLD). With the newly invented NPoS (New Postural Solution), developed through the collaboration of the PRM Department, Tor Vergata University, with Baro Postural Instruments srl, positive results were observed in both measuring and compensating the hemi-pelvic antero-medial rotation in FLLD through a personalized bilateral heel raise using two NPoS components: the Foot Image System (FIS) and the Postural Optimizer System (POS). This led our research interest to test the validity of NPoS as a preliminary step before evaluating its implementation in postural disorders. Methods After clinical evaluation, 4 subjects with FLLD were assessed with NPoS. Over a period of 2 months, every subject was evaluated 12 times by two different operators, 48 measurements in total; the results were verified against BTS GaitLab results. Results Intra-operator and inter-operator variability analyses showed statistically insignificant differences, while inter-method variability between NPoS and BTS parameters expressed a linear correlation. Conclusion The results suggest a significant validity of NPoS in the assessment and correction of FLLD, with a high degree of reproducibility and minimal operator dependency. This can be considered a base for promising clinical applications of NPoS as a reliable, cost-effective postural assessment/corrective tool. Level of evidence V. PMID:29264341

  20. Comparison of the Cytobrush®, dermatological curette and oral CDx® brush test as methods for obtaining samples of RNA for molecular analysis of oral cytology.

    PubMed

    Reboiras-López, M D; Pérez-Sayáns, M; Somoza-Martín, J M; Gayoso-Diz, P; Barros-Angueira, F; Gándara-Rey, J M; García-García, A

    2012-06-01

    Interest in oral exfoliative cytology has increased with the availability of molecular markers that may lead to the earlier diagnosis of oral squamous cell carcinoma. This research aims to compare the efficacy of three different instruments (Cytobrush, curette and Oral CDx brush) in providing adequate material for molecular analysis. One hundred and four cytological samples obtained from volunteer healthy subjects were analysed using all three instruments. The clinical and demographic variables under study were age, sex and smoking habits. The three instruments were compared for their ability to obtain adequate samples and for the amount of RNA obtained using quantitative real-time polymerase chain reaction (PCR-qRT) analysis of the Abelson (ABL) housekeeping gene. RNA of the ABL gene was quantified as the number of copies. Adequate samples were more likely to be obtained with a curette (90.6%) or Oral CDx (80.0%) than a Cytobrush (48.6%); P < 0.001. Similarly, the RNA quantification was 17.64 ± 21.10 with a curette, 16.04 ± 15.81 with Oral CDx and 6.82 ± 6.71 with a Cytobrush. There were statistically significant differences between the Cytobrush and curette (P = 0.008) and between the Cytobrush and OralCDx (P = 0.034). There was no difference according to the demographic variables. Oral exfoliative cytology is a simple, non-invasive technique that provides sufficient RNA to perform studies on gene expression. Although material was obtained with all three instruments, adequate samples were more likely to be obtained with the curette or Oral CDx than with a Cytobrush. The Oral CDx is a less aggressive instrument than the curette, so it could be a useful tool in a clinical setting. © 2011 Blackwell Publishing Ltd.

  1. Spectrophotometers for plutonium monitoring in HB-line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lascola, R. J.; O'Rourke, P. E.; Kyser, E. A.

    2016-02-12

    This report describes the equipment, control software, calibrations for total plutonium and plutonium oxidation state, and qualification studies for the instrument. It also provides a detailed description of the uncertainty analysis, which includes source terms associated with plutonium calibration standards, instrument drift, and inter-instrument variability. Also included are work instructions for instrument, flow cell, and optical fiber setup, work instructions for routine maintenance, and drawings and schematic diagrams.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garimella, Sarvesh; Rothenberg, Daniel A.; Wolf, Martin J.

    This study investigates the measurement of ice nucleating particle (INP) concentrations and sizing of crystals using continuous flow diffusion chambers (CFDCs). CFDCs have been deployed for decades to measure the formation of INPs under controlled humidity and temperature conditions in laboratory studies and by ambient aerosol populations. These measurements have, in turn, been used to construct parameterizations for use in models by relating the formation of ice crystals to state variables such as temperature and humidity as well as aerosol particle properties such as composition and number. We show here that assumptions of ideal instrument behavior are not supported by measurements made with a commercially available CFDC, the SPectrometer for Ice Nucleation (SPIN), and the instrument on which it is based, the Zurich Ice Nucleation Chamber (ZINC). Non-ideal instrument behavior, which is likely inherent to varying degrees in all CFDCs, is caused by exposure of particles to different humidities and/or temperatures than predicted from instrument theory of operation. This can result in a systematic, and variable, underestimation of reported INP concentrations. We find here variable correction factors from 1.5 to 9.5, consistent with previous literature values. We use a machine learning approach to show that non-ideality is most likely due to small-scale flow features where the aerosols are combined with sheath flows. Machine learning is also used to minimize the uncertainty in measured INP concentrations. Finally, we suggest that detailed measurement, on an instrument-by-instrument basis, be performed to characterize this uncertainty.

  3. Uncertainty in counting ice nucleating particles with continuous flow diffusion chambers

    NASA Astrophysics Data System (ADS)

    Garimella, Sarvesh; Rothenberg, Daniel A.; Wolf, Martin J.; David, Robert O.; Kanji, Zamin A.; Wang, Chien; Rösch, Michael; Cziczo, Daniel J.

    2017-09-01

    This study investigates the measurement of ice nucleating particle (INP) concentrations and sizing of crystals using continuous flow diffusion chambers (CFDCs). CFDCs have been deployed for decades to measure the formation of INPs under controlled humidity and temperature conditions in laboratory studies and by ambient aerosol populations. These measurements have, in turn, been used to construct parameterizations for use in models by relating the formation of ice crystals to state variables such as temperature and humidity as well as aerosol particle properties such as composition and number. We show here that assumptions of ideal instrument behavior are not supported by measurements made with a commercially available CFDC, the SPectrometer for Ice Nucleation (SPIN), and the instrument on which it is based, the Zurich Ice Nucleation Chamber (ZINC). Non-ideal instrument behavior, which is likely inherent to varying degrees in all CFDCs, is caused by exposure of particles to different humidities and/or temperatures than predicted from instrument theory of operation. This can result in a systematic, and variable, underestimation of reported INP concentrations. We find variable correction factors from 1.5 to 9.5, consistent with previous literature values. We use a machine learning approach to show that non-ideality is most likely due to small-scale flow features where the aerosols are combined with sheath flows. Machine learning is also used to minimize the uncertainty in measured INP concentrations. We suggest that detailed measurement, on an instrument-by-instrument basis, be performed to characterize this uncertainty.
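
    The underestimation described above is corrected multiplicatively. A minimal sketch of applying such a correction factor to reported INP concentrations follows; the concentration values are illustrative, and only the 1.5-9.5 factor range comes from the study.

```python
# Hedged sketch: applying a multiplicative correction factor to CFDC-reported
# INP concentrations. The study reports factors from 1.5 to 9.5; the
# concentrations below are illustrative, not measured data.

def correct_inp_concentration(reported_per_liter, correction_factor):
    """Scale a reported INP concentration by an instrument correction factor."""
    if correction_factor < 1.0:
        raise ValueError("non-ideality is reported to cause underestimation")
    return reported_per_liter * correction_factor

reported = [2.0, 5.5, 10.0]                               # INP per liter (illustrative)
corrected = [correct_inp_concentration(c, 1.5) for c in reported]
print(corrected)                                          # lower-bound correction
```

    Because the factor is instrument- and condition-dependent, the study argues it must be characterized per instrument rather than assumed.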

  4. Long-term infrared monitoring of stellar sources from earth orbit

    NASA Technical Reports Server (NTRS)

    Maran, S. P.; Heinsheimer, T. F.; Stocker, T. L.; Anand, S. P. S.; Chapman, R. D.; Hobbs, R. W.; Michalitsanos, A. G.; Wright, F. H.; Kipp, S. L.

    1976-01-01

    A program is discussed which involved monitoring the photometric activity of 18 bright variable IR stars at 2.7 microns with satellite- and rocket-borne instrumentation in the period from 1971 to 1975. The stellar sample includes 3 Lb variables, 8 semiregular variables, 5 Mira-type variables, and 2 previously unknown and unclassified IR variables. Detailed light curves of many of these stars were determined for intervals of 3 yr or more; spectra from 2.7 to 20 microns were constructed for nine of them using data obtained entirely with instruments above the atmosphere. Photometric IR light curves and other data are presented for SW Virginis, R Aquilae, S Scuti, IRC 00265, RT Hydrae, S Orionis, S Canis Minoris, Omicron Ceti, and R Leonis. Several hypotheses concerning the interpretation of the IR data are examined.

  5. Effects of thermal deformation on optical instruments for space application

    NASA Astrophysics Data System (ADS)

    Segato, E.; Da Deppo, V.; Debei, S.; Cremonese, G.

    2017-11-01

    Optical instruments for space missions work in a hostile environment, so it is necessary to study accurately the effects of ambient parameter variations on the equipment. Optical instruments are particularly sensitive to ambient conditions, especially temperature: temperature changes can cause dilatations and misalignments of the optical elements and can give rise to dangerous stresses in the optics. These displacements and deformations degrade the quality of the sampled images. In this work a method for studying the effects of temperature variations on the performance of an imaging instrument is presented. The optics and their mountings are modeled and processed by a thermo-mechanical Finite Element Model (FEM) analysis; the output data, which describe the deformations of the optical element surfaces, are then elaborated by an ad hoc MATLAB routine: a non-linear least-squares optimization algorithm is adopted to determine the surface equations (plane, spherical, nth-order polynomial) which best fit the data. The obtained mathematical surface representations are then directly imported into ZEMAX for sequential raytracing analysis. The results are the variations of the Spot Diagrams, of the MTF curves and of the Diffraction Ensquared Energy due to simulated thermal loads. This method has been successfully applied to the Stereo Camera for the BepiColombo mission, reproducing expected operative conditions. The results help to design and compare different optical housing systems for a feasible solution and show that it is preferable to use kinematic constraints on prisms and lenses to minimize the variation of the optical performance of the Stereo Camera.
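
    The surface-fitting step in the pipeline above (FEM output, then least-squares fit, then raytracing) can be sketched for the spherical case. The example below fits a sphere to synthetic "deformed mirror" node coordinates using a linear least-squares formulation; the node data are invented, and only the fitting idea follows the paper's description.

```python
# Hedged sketch of the surface-fitting step: fit a sphere to FEM-style surface
# nodes by linear least squares, using x^2+y^2+z^2 = 2ax + 2by + 2cz + d,
# which gives center (a, b, c) and radius sqrt(d + a^2 + b^2 + c^2).
# The node coordinates are synthetic, not FEM output.
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit; returns (center, radius)."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([2 * pts, np.ones(len(pts))])
    b = (pts ** 2).sum(axis=1)
    (cx, cy, cz, d), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(d + cx**2 + cy**2 + cz**2)
    return np.array([cx, cy, cz]), radius

# Synthetic mirror cap sampled from a sphere of radius 100 mm about the origin
rng = np.random.default_rng(0)
theta = rng.uniform(0, 0.3, 200)
phi = rng.uniform(0, 2 * np.pi, 200)
nodes = np.column_stack([100 * np.sin(theta) * np.cos(phi),
                         100 * np.sin(theta) * np.sin(phi),
                         100 * np.cos(theta)])
center, radius = fit_sphere(nodes)
print(round(radius, 3))   # recovers the 100 mm radius for noise-free nodes
```

    The fitted center and radius would then parameterize the deformed surface handed to the raytracer; higher-order polynomial surfaces need a genuinely non-linear fit as the paper describes.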

  6. Cyclic fatigue testing of nickel-titanium endodontic instruments.

    PubMed

    Pruett, J P; Clement, D J; Carnes, D L

    1997-02-01

    Cyclic fatigue of nickel-titanium, engine-driven instruments was studied by determining the effect of canal curvature and operating speed on the breakage of Lightspeed instruments. A new method of canal curvature evaluation that addressed both the angle and the abruptness of curvature was introduced. Canal curvature was simulated by constructing six curved stainless-steel guide tubes with angles of curvature of 30, 45, or 60 degrees, and radii of curvature of 2 or 5 mm. Size #30 and #40 Lightspeed instruments were placed through the guide tubes and the heads secured in the collet of a Magtrol dynamometer. A simulated operating load of 10 g-cm was applied. Instruments were able to rotate freely in the test apparatus at speeds of 750, 1300, or 2000 rpm until separation occurred. Cycles to failure were determined and were not affected by rpm. Instruments did not separate at the head, but rather at the point of maximum flexure of the shaft, corresponding to the midpoint of curvature within the guide tube. The instruments with larger diameter shafts, #40, failed after significantly fewer cycles than did #30 instruments under identical test conditions. Multivariable analysis of variance indicated that cycles to failure significantly decreased as the radius of curvature decreased from 5 mm to 2 mm and as the angle of curvature increased beyond 30 degrees (p < 0.05, power = 0.9). Scanning electron microscopic evaluation revealed ductile fracture as the fatigue failure mode. These results indicate that, for nickel-titanium, engine-driven rotary instruments, the radius of curvature, angle of curvature, and instrument size are more important than operating speed for predicting separation. This study supports engineering concepts of cyclic fatigue failure and suggests that standardized fatigue tests of nickel-titanium rotary instruments should include dynamic operation in a flexed state. The results also suggest that the effect of the radius of curvature as an independent variable should be considered when evaluating studies of root canal instrumentation.
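
    The finding that a tighter radius of curvature and a larger instrument diameter shorten fatigue life is consistent with the bending-strain picture from engineering fatigue theory. The sketch below uses a common thin-beam approximation for surface strain amplitude; the formula and the diameter value are illustrative assumptions, not quantities stated in the abstract.

```python
# Hedged sketch: surface strain amplitude of a rotary file flexed in a curved
# canal, using the thin-beam approximation strain ~ r / (R + r), where r is
# the instrument radius and R the radius of canal curvature. This is a
# standard engineering approximation assumed here, not a formula from the study.

def surface_strain(instrument_diameter_mm, radius_of_curvature_mm):
    r = instrument_diameter_mm / 2.0
    return r / (radius_of_curvature_mm + r)

# A 2 mm radius of curvature produces far higher strain than a 5 mm radius,
# consistent with the shorter fatigue life observed in tighter curves.
for radius in (5.0, 2.0):
    print(radius, round(surface_strain(0.40, radius), 4))
```

    Each rotation puts the surface through one full tension-compression strain cycle, which is why higher strain amplitude (tighter curve, larger diameter) means fewer cycles to failure regardless of rpm.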

  7. Comparison of mid-infrared transmission spectroscopy with biochemical methods for the determination of macronutrients in human milk.

    PubMed

    Silvestre, Dolores; Fraga, Miriam; Gormaz, María; Torres, Ester; Vento, Máximo

    2014-07-01

    The variability of human milk (HM) composition renders analysis of its components essential for the optimal nutrition of preterm infants fed either donor's or their own mother's milk. To fulfil this requirement, various analytical instruments have been subjected to scientific and clinical evaluation. The objective of this study was to evaluate the suitability of a rapid method for the analysis of macronutrients in HM compared with the analytical methods applied by the cow's milk industry. Mature milk from 39 donors was analysed using an infrared human milk analyser (HMA) and compared with biochemical reference laboratory methods. The statistical analysis was based on the use of paired data tests. The use of an infrared HMA for the analysis of lipids, proteins and lactose in HM proved satisfactory as regards rapidity, simplicity and the required sample volume. The instrument afforded good linearity and precision for all three nutrients. However, accuracy was not acceptable when compared with the reference methods, with overestimation of lipid content and underestimation of protein and lactose content. The use of mid-infrared HMA might become the standard for rapid analysis of HM once standardisation and rigorous, systematic calibration are provided. © 2012 John Wiley & Sons Ltd.
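
    Agreement between a rapid analyzer and a reference method is commonly summarized by the mean paired difference (bias) and its limits of agreement. The sketch below illustrates this with synthetic lipid values chosen to mimic the reported overestimation; it is not the study's data or its exact statistical procedure.

```python
# Hedged sketch: a Bland-Altman-style agreement check between an infrared milk
# analyzer (HMA) and a reference method. The paired lipid values are synthetic,
# constructed to mimic the reported overestimation of lipid content.
import numpy as np

reference = np.array([3.2, 3.8, 4.1, 2.9, 3.5, 4.4])   # g/dL, synthetic
hma       = np.array([3.6, 4.1, 4.6, 3.3, 3.9, 4.9])   # g/dL, synthetic

diff = hma - reference
bias = diff.mean()                            # mean systematic difference
loa = (bias - 1.96 * diff.std(ddof=1),        # 95% limits of agreement
       bias + 1.96 * diff.std(ddof=1))
print(round(bias, 3), [round(x, 3) for x in loa])
```

    A positive bias with narrow limits of agreement matches the paper's pattern: good precision but unacceptable accuracy, which calibration against reference methods is meant to remove.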

  8. Generic patient-reported outcomes in child health research: a review of conceptual content using World Health Organization definitions.

    PubMed

    Fayed, Nora; de Camargo, Olaf Kraus; Kerr, Elizabeth; Rosenbaum, Peter; Dubey, Ankita; Bostan, Cristina; Faulhaber, Markus; Raina, Parminder; Cieza, Alarcos

    2012-12-01

    Our aims were to (1) describe the conceptual basis of popular generic instruments according to World Health Organization (WHO) definitions of functioning, disability, and health (FDH), and quality of life (QOL) with health-related quality of life (HRQOL) as a subcomponent of QOL; (2) map the instruments to the International Classification of Functioning, Disability and Health (ICF); and (3) provide information on how the analyzed instruments were used in the literature. This should enable users to make valid choices about which instruments have the desired content for a specific context or purpose. Child health-based literature over a 5-year period was reviewed to find research employing health status and QOL/HRQOL instruments. WHO definitions of FDH and QOL were applied to each item of the 15 most used instruments to differentiate measures of FDH and QOL/HRQOL. The ICF was used to describe the health and health-related content (if any) in those instruments. Additional aspects of instrument use were extracted from these articles. Many instruments that were used to measure QOL/HRQOL did not reflect WHO definitions of QOL. The ICF domains within instruments were highly variable with respect to whether body functions, activities and participation, or environment were emphasized. There is inconsistency among researchers about how to measure HRQOL and QOL. Moreover, when an ICF content analysis is applied, there is variability among instruments in the health components included and emphasized. Reviewing content is important for matching instruments to their intended purpose. © The Authors. Developmental Medicine & Child Neurology © 2012 Mac Keith Press.

  9. SeaWiFS Postlaunch Technical Report Series. Volume 3; The SeaBOARR-98 Field Campaign

    NASA Technical Reports Server (NTRS)

    Zibordi, Giuseppe; Lazin, Gordana; McLean, Scott; Firestone, Elaine R. (Editor); Hooker, Stanford B. (Editor)

    1999-01-01

    This report documents the scientific activities during the first Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Bio-Optical Algorithm Round-Robin (SeaBOARR-98) experiment, which took place from 5-17 July 1998, at the Acqua Alta Oceanographic Tower (AAOT) in the northern Adriatic Sea off the coast of Italy. The ultimate objective of the SeaBOARR activity is to evaluate the effect of different measurement protocols on bio-optical algorithms using data from a variety of field campaigns. The SeaBOARR-98 field campaign was concerned with collecting a high-quality data set of simultaneous in-water and above-water radiometric measurements. The deployment goals documented in this report were to: a) use four different surface glint correction methods to compute water-leaving radiances, L_W(λ), from above-water data; b) use two different in-water profiling systems and three different methods to compute L_W(λ) from in-water data (one making measurements at a fixed distance from the tower, 7.5 m, and the other at variable distances up to 29 m away); c) use instruments with a common calibration history to minimize intercalibration uncertainties; d) monitor the calibration drift of the instruments in the field with a second generation SeaWiFS Quality Monitor (SQM-II), to separate differences in methods from changes in instrument performance; and e) compare the L_W(λ) values estimated from the above-water and in-water measurements. In addition to describing the instruments deployed and the data collected, a preliminary analysis of the data is presented, and the kind of follow-on work that is needed to completely assess the estimation of L_W(λ) from above-water and in-water measurements is discussed.

  10. Age-Related Changes in Bimanual Instrument Playing with Rhythmic Cueing

    PubMed Central

    Kim, Soo Ji; Cho, Sung-Rae; Yoo, Ga Eul

    2017-01-01

    Deficits in bimanual coordination of older adults have been demonstrated to significantly limit their functioning in daily life. As a bimanual sensorimotor task, instrument playing has great potential for motor and cognitive training in advanced age. While the process of matching a person’s repetitive movements to auditory rhythmic cueing during instrument playing was documented to involve motor and attentional control, investigation into whether the level of cognitive functioning influences the ability to rhythmically coordinate movement to an external beat in older populations is relatively limited. Therefore, the current study aimed to examine how timing accuracy during bimanual instrument playing with rhythmic cueing differed depending on the degree of participants’ cognitive aging. Twenty-one young adults, 20 healthy older adults, and 17 older adults with mild dementia participated in this study. Each participant tapped an electronic drum in time to the rhythmic cueing provided, using both hands simultaneously and in alternation. During bimanual instrument playing with rhythmic cueing, the mean and variability of synchronization errors were measured and compared across the groups and across the tempi of cueing for each type of tapping task. Correlations of these timing parameters with cognitive measures were also analyzed. The results showed significant group differences in the synchronization-error parameters. During bimanual tapping tasks, cognitive decline resulted in differences in synchronization errors between younger adults and older adults with mild dementia. In terms of variability of synchronization errors, younger adults differed significantly from older adults with and without mild dementia in maintaining timing performance, which may be attributed to slower processing for bimanual coordination due to aging. Significant correlations were observed between variability of synchronization errors and performance on cognitive tasks involving executive control and cognitive flexibility when bimanual coordination was required in response to external timing cues at adjusted tempi. Moreover, significant correlations with cognitive measures were more prevalent in the variability of synchronization errors during alternating tapping than during simultaneous tapping. The current study supports the idea that bimanual tapping may be predictive of cognitive processing in older adults. Tempo and the type of movement required for instrument playing both involve cognitive and motor loads at different levels, and such variables could be important factors for determining the complexity of the task and the involved task requirements for interventions using instrument playing. PMID:29085309
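
    The two timing parameters compared across groups above, mean synchronization error and its variability, are computed from tap times against cue times. The tap and cue times below are illustrative, not study data.

```python
# Hedged sketch: mean synchronization error (signed asynchrony) and its
# variability, computed from tap onsets against metronome cue onsets.
# The tap/cue times are illustrative, not data from the study.
import statistics

cue_times = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]          # seconds, 120 bpm cue
tap_times = [0.02, 0.54, 0.97, 1.55, 2.04, 2.47]    # illustrative taps

errors = [t - c for t, c in zip(tap_times, cue_times)]
mean_error = statistics.mean(errors)                 # positive = lagging the cue
variability = statistics.stdev(errors)               # timing consistency
print(round(mean_error, 4), round(variability, 4))
```

    Higher variability with unchanged mean error is the pattern the study links to cognitive aging; the mean captures systematic lead/lag, while the standard deviation captures consistency.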

  11. The productivity of mental health care: an instrumental variable approach.

    PubMed

    Lu, Mingshan

    1999-06-01

    BACKGROUND: Like many other medical technologies and treatments, there is a lack of reliable evidence on the treatment effectiveness of mental health care. Increasingly, data from non-experimental settings are being used to study the effect of treatment. However, as in a number of studies using non-experimental data, a simple regression of outcome on treatment shows a puzzling negative and significant impact of mental health care on the improvement of mental health status, even after including a large number of potential control variables. The central problem in interpreting evidence from real-world or non-experimental settings is, therefore, the potential "selection bias" problem in observational data sets. In other words, the choice/quantity of mental health care may be correlated with other variables, particularly unobserved variables, that influence outcome, and this may lead to a bias in the estimate of the effect of care in conventional models. AIMS OF THE STUDY: This paper addresses the issue of estimating treatment effects using an observational data set. The information in a mental health data set obtained from two waves of data in Puerto Rico is explored. The results from conventional models - in which the potential selection bias is not controlled - are compared with those from instrumental variable (IV) models, which are proposed in this study to correct the contaminated estimates of the conventional models. METHODS: Treatment effectiveness is estimated in a production function framework. Effectiveness is measured as the improvement in mental health status. To control for the potential selection bias problem, IV approaches are employed. The essence of the IV method is to use one or more instruments, which are observable factors that influence treatment but do not directly affect patient outcomes, to isolate the effect of treatment variation that is independent of unobserved patient characteristics.
    The data used in this study are the first (1992-1993) and second (1993-1994) waves of the ongoing longitudinal study Mental Health Care Utilization Among Puerto Ricans, which includes information for an island-wide probability sample of over 3000 adults living in poor areas of Puerto Rico. The instrumental variables employed in this study are travel distance and health insurance sources. RESULTS: Notably, treatment effects were found to be negative in all conventional models (in some cases, highly significant). However, after the IV method was applied, the estimated marginal effects of treatment became positive. Sensitivity analysis partly supports this conclusion. According to the IV estimation results, treatment is productive for the group in most need of mental health care. However, the estimations do not find strong enough evidence to demonstrate treatment effects on other groups with less or no need. The results in this paper also suggest an important impact of the following factors on the probability of improvement in mental health status: baseline mental health status, previous treatment, sex, marital status and education. DISCUSSION: The IV approach provides a practical way to reduce the selection bias due to the confounding of treatment with unmeasured variables. The limitation of this study is that the instruments explored did not perform well enough in some IV equations; the predictive power therefore remains questionable. The most challenging part of applying the IV approach is finding "good" instruments which influence the choice/quantity of treatment yet do not introduce further bias by being directly correlated with treatment outcome. CONCLUSIONS: The results in this paper support concerns about the credibility of evaluation results based on observational data sets when the endogeneity of the treatment variable is not controlled. Unobserved factors contribute to the downward bias in the conventional models. The IV approach is shown to be an appropriate method to reduce the selection bias for the group in most need of mental health care, which is also the group of most policy and treatment concern. IMPLICATIONS FOR HEALTH CARE PROVISION AND USE: The results of this work have implications for resource allocation in mental health care. Evidence is found that mental health care provided in Puerto Rico is productive, and is most helpful for persons in most need of mental health care. According to the estimates from the IV models, on the margin, receiving formal mental health care significantly increases the probability of obtaining a better mental health outcome by 19.2%, and a one-unit increase in formal treatment increases the probability of becoming healthier by 6.2% to 8.4%. Consistent with other mental health literature, an individual's baseline mental health status is found to be significantly related to the probability of improvement in mental health status: individuals with a previous treatment history are less likely to improve. Among the demographic factors included in the production function, being female, being married, and having higher education were found to contribute to a higher probability of improvement. IMPLICATIONS FOR FURTHER RESEARCH: In order to provide accurate evidence of the treatment effectiveness of medical technologies to support decision making, it is important that selection bias be controlled as rigorously as possible when using information from a non-experimental setting. More data and a longer panel are also needed to provide more valid evidence.
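
    The mechanics of the IV correction described above can be sketched as a manual two-stage least squares (2SLS) on synthetic data, where an unobserved confounder biases the naive regression downward and an instrument (here "distance", standing in for something like travel distance) recovers the true effect. All data-generating values below are invented for illustration; this is not the study's model or data.

```python
# Hedged sketch: manual two-stage least squares on synthetic confounded data.
# "distance" plays the role of an instrument (affects treatment, not outcome
# directly); "confounder" is unobserved severity. True treatment effect = 2.0.
import numpy as np

rng = np.random.default_rng(42)
n = 5000
confounder = rng.normal(size=n)                 # unobserved severity
distance = rng.normal(size=n)                   # instrument
treatment = 0.8 * distance - 1.0 * confounder + rng.normal(size=n)
outcome = 2.0 * treatment + 3.0 * confounder + rng.normal(size=n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Naive OLS of outcome on treatment: biased by the confounder
X_naive = np.column_stack([np.ones(n), treatment])
naive_effect = ols(X_naive, outcome)[1]

# Stage 1: regress treatment on the instrument, keep fitted values
Z = np.column_stack([np.ones(n), distance])
treatment_hat = Z @ ols(Z, treatment)

# Stage 2: regress outcome on the fitted treatment
X_iv = np.column_stack([np.ones(n), treatment_hat])
iv_effect = ols(X_iv, outcome)[1]

print(round(naive_effect, 2), round(iv_effect, 2))  # naive biased, IV near 2.0
```

    This mirrors the qualitative result in the abstract: the naive estimate is badly biased (here even wrong in magnitude) while the IV estimate lands near the true effect, provided the instrument is strong and truly excluded from the outcome equation. Note that standard errors from the naive second stage are not valid; dedicated 2SLS routines adjust them.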

  12. High Precision Seawater Sr/Ca Measurements in the Florida Keys by Inductively Coupled Plasma Atomic Emission Spectrometry: Analytical Method and Implications for Coral Paleothermometry

    NASA Astrophysics Data System (ADS)

    Khare, A.; Kilbourne, K. H.; Schijf, J.

    2017-12-01

    Standard methods of reconstructing past sea surface temperatures (SSTs) with coral skeletal Sr/Ca ratios assume the seawater Sr/Ca ratio is constant. However, there is little data to support this assumption, in part because analytical techniques capable of determining seawater Sr/Ca with sufficient accuracy and precision are expensive and time-consuming. We demonstrate a method to measure seawater Sr/Ca using inductively coupled plasma atomic emission spectrometry where we employ an intensity ratio calibration routine that reduces the self-matrix effects of calcium and cancels out the matrix effects that are common to both calcium and strontium. A seawater standard solution cross-calibrated with multiple instruments is used to correct for long-term instrument drift and any remnant matrix effects. The resulting method produces accurate seawater Sr/Ca determinations rapidly, inexpensively, and with a precision better than 0.2%. This method will make it easier for coral paleoclimatologists to quantify potentially problematic fluctuations in seawater Sr/Ca at their study locations. We apply our method to test for variability in surface seawater Sr/Ca along the Florida Keys Reef Tract. We are collecting winter and summer samples for two years in a grid with eleven nearshore to offshore transects across the reef, as well as continuous samples collected by osmotic pumps at four locations adjacent to our grid. Our initial analysis of the grid samples indicates a trend of decreasing Sr/Ca values offshore potentially due to a decreasing groundwater influence. The values differ by as much as 0.05 mmol/mol which could lead to an error of 1°C in mean SST reconstructions. Future work involves continued sampling in the Florida Keys to test for seasonal and interannual variability in seawater Sr/Ca, as well as collecting data from small reefs in the Virgin Islands to test the stability of seawater Sr/Ca under different geologic, hydrologic and hydrographic environments.
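
    The abstract's claim that a 0.05 mmol/mol seawater Sr/Ca offset can translate to about 1°C of SST error follows from dividing the offset by the calibration slope. The slope assumed below (0.05 mmol/mol per °C in magnitude) is a typical coral calibration value chosen for illustration, not a number from this abstract.

```python
# Hedged back-of-envelope: temperature error implied by a seawater Sr/Ca
# offset, given a coral Sr/Ca-SST calibration slope. The slope magnitude of
# 0.05 mmol/mol per degree C is an assumed typical value, not from the study.

def sst_error(delta_sr_ca_mmol_mol, calibration_slope=-0.05):
    """SST error (deg C) from a seawater Sr/Ca offset (mmol/mol)."""
    return delta_sr_ca_mmol_mol / abs(calibration_slope)

print(sst_error(0.05))   # roughly 1 deg C for the assumed slope
```

    Because coral calibration slopes are shallow, even small seawater Sr/Ca fluctuations propagate into temperature errors comparable to the climate signals being reconstructed, which is the motivation for monitoring seawater Sr/Ca directly.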

  13. Sonification of Kepler Field SU UMa Cataclysmic Variable Stars V344 Lyr and V1504 Cyg

    NASA Technical Reports Server (NTRS)

    Tutchton, Roxanne M.; Wood, Matt A.; Still, Martin D.; Howell, Steve B.; Cannizzo, John K.; Smale, Alan P.

    2012-01-01

    Sonification is the conversion of quantitative data into sound. In this work we explain the methods used in the sonification of light curves provided by the Kepler instrument from Q2 through Q6 for the cataclysmic variable systems V344 Lyr and V1504 Cyg. Both systems are SU UMa stars showing dwarf nova outbursts and superoutbursts as well as positive and negative superhumps. Focused sonifications were done from average pulse shapes of each superhump, and separate sonifications of the full, residual light curves were done for both stars. The audio of these data reflected distinct patterns within the evolutions of supercycles and superhumps that matched previous observations and proved to be an effective aid in data analysis.
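
    At its simplest, sonifying a light curve means mapping flux onto an audible parameter such as pitch. The sketch below shows one such linear flux-to-frequency mapping; the flux values and the 220-880 Hz range are illustrative assumptions, as the abstract does not specify the actual conversion used.

```python
# Hedged sketch: linearly map normalized light-curve flux onto an audible
# frequency range. The flux values and the 220-880 Hz mapping are illustrative;
# the paper's actual sonification scheme is not described in the abstract.

def flux_to_frequency(flux, fmin=220.0, fmax=880.0):
    """Map flux samples (any scale) linearly onto [fmin, fmax] in Hz."""
    lo, hi = min(flux), max(flux)
    span = (hi - lo) or 1.0
    return [fmin + (f - lo) / span * (fmax - fmin) for f in flux]

lightcurve = [1.00, 1.02, 1.40, 1.10, 1.01]   # outburst-like bump, illustrative
print([round(f, 1) for f in flux_to_frequency(lightcurve)])
```

    An outburst then sounds as a rising-and-falling pitch sweep, which is why periodic structures like superhumps become audible as repeating patterns.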

  14. LIMS for Lasers 2015 for achieving long-term accuracy and precision of δ2H, δ17O, and δ18O of waters using laser absorption spectrometry

    USGS Publications Warehouse

    Coplen, Tyler B.; Wassenaar, Leonard I

    2015-01-01

    Rationale: Although laser absorption spectrometry (LAS) instrumentation is easy to use, its incorporation into laboratory operations is not easy, owing to extensive offline manipulation of comma-separated-values files for outlier detection, between-sample memory correction, nonlinearity (δ-variation with water amount) correction, drift correction, normalization to VSMOW-SLAP scales, and difficulty in performing long-term QA/QC audits. Methods: A Microsoft Access relational-database application, LIMS (Laboratory Information Management System) for Lasers 2015, was developed. It automates LAS data corrections and manages clients, projects, samples, instrument-sample lists, and triple-isotope (δ17O, δ18O, and δ2H values) instrumental data for liquid-water samples. It enables users to (1) graphically evaluate sample injections for variable water yields and high isotope-delta variance; (2) correct for between-sample carryover, instrumental drift, and δ nonlinearity; and (3) normalize final results to VSMOW-SLAP scales. Results: Cost-free LIMS for Lasers 2015 enables users to obtain improved δ17O, δ18O, and δ2H values with liquid-water LAS instruments, even those with under-performing syringes. For example, LAS δ2HVSMOW measurements of USGS50 Lake Kyoga (Uganda) water using an under-performing syringe having ±10 % variation in water concentration gave +31.7 ± 1.6 ‰ (2-σ standard deviation), compared with the reference value of +32.8 ± 0.4 ‰, after correction for variation in δ value with water concentration, between-sample memory, and normalization to the VSMOW-SLAP scale. Conclusions: LIMS for Lasers 2015 enables users to create systematic, well-founded instrument templates, import δ2H, δ17O, and δ18O results, evaluate performance with automatic graphical plots, correct for δ nonlinearity due to variable water concentration, correct for between-sample memory, adjust for drift, perform VSMOW-SLAP normalization, and perform long-term QA/QC audits easily. Published in 2015. This article is a U.S. Government work and is in the public domain in the USA.
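
    The VSMOW-SLAP normalization that LIMS automates is a two-point linear stretch anchored at the defined δ2H values of the VSMOW (0 ‰) and SLAP (-427.5 ‰) reference waters. The sketch below shows the arithmetic; the "measured" standard readings are invented to mimic a small instrument scale compression, and this is not LIMS code.

```python
# Hedged sketch: two-point VSMOW-SLAP normalization of measured delta values.
# VSMOW and SLAP have defined delta-2H values of 0 and -427.5 permil; the
# "measured" values of the two standards below are synthetic.

def normalize_vsmow_slap(delta_measured, vsmow_measured, slap_measured,
                         vsmow_true=0.0, slap_true=-427.5):
    """Linearly rescale a measured delta onto the VSMOW-SLAP scale."""
    slope = (slap_true - vsmow_true) / (slap_measured - vsmow_measured)
    return vsmow_true + slope * (delta_measured - vsmow_measured)

# Instrument reads VSMOW at +1.0 permil and SLAP at -420.0 permil (synthetic)
print(round(normalize_vsmow_slap(-200.0, 1.0, -420.0), 2))
```

    Drift and memory corrections are applied before this step; normalization then removes any residual scale compression or offset common to all samples in the run.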

  15. Surface Meteorological Station - Astoria, OR (AST) - Raw Data

    DOE Data Explorer

    Gottas, Daniel

    2017-10-23

    A variety of instruments are used to measure various quantities related to meteorology, precipitation, and radiation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  16. Surface Meteorological Station - ESRL Short Tower, Bonneville - Raw Data

    DOE Data Explorer

    McCaffrey, Katherine

    2017-10-23

    A diversity of instruments are used to measure various quantities related to meteorology and precipitation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  17. Surface Meteorological Station - ESRL Short Tower, Condon - Reviewed Data

    DOE Data Explorer

    McCaffrey, Katherine

    2017-10-23

    A diversity of instruments are used to measure various quantities related to meteorology, precipitation, and radiation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  18. Surface Meteorological Station - ESRL Short Tower, Troutdale - Reviewed Data

    DOE Data Explorer

    Gottas, Daniel

    2017-12-11

    A diversity of instruments are used to measure various quantities related to meteorology, precipitation, and radiation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  19. Surface Meteorological Station - ESRL Short Tower, Prineville - Raw Data

    DOE Data Explorer

    McCaffrey, Katherine

    2017-10-23

    A diversity of instruments are used to measure various quantities related to meteorology, precipitation, and radiation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  20. Surface Meteorological Station - ESRL Short Tower, Troutdale - Raw Data

    DOE Data Explorer

    Gottas, Daniel

    2017-12-11

    A diversity of instruments are used to measure various quantities related to meteorology, precipitation, and radiation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  1. Surface Meteorological Station - ESRL Short Tower, Prineville - Reviewed Data

    DOE Data Explorer

    McCaffrey, Katherine

    2017-10-23

    A diversity of instruments are used to measure various quantities related to meteorology, precipitation, and radiation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  2. Surface Meteorological Station - ESRL Short Tower, Bonneville - Reviewed Data

    DOE Data Explorer

    McCaffrey, Katherine

    2017-10-23

    A diversity of instruments are used to measure various quantities related to meteorology and precipitation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  3. Surface Meteorological Station - North Bend, OR (OTH) - Raw Data

    DOE Data Explorer

    Gottas, Daniel

    2017-10-23

    A variety of instruments are used to measure various quantities related to meteorology, precipitation, and radiation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  4. Surface Meteorological Station - ESRL Short Tower, Condon - Raw Data

    DOE Data Explorer

    McCaffrey, Katherine

    2017-10-23

    A diversity of instruments are used to measure various quantities related to meteorology, precipitation, and radiation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  5. Percentage Energy from Fat Screener: Overview

    Cancer.gov

    A short assessment instrument to estimate an individual's usual intake of percentage energy from fat. The foods asked about on the instrument were selected because they were the most important predictors of variability in percentage energy.

  6. Surface Meteorological Station - Forks, WA (FKS) - Raw Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gottas, Daniel

    A variety of instruments are used to measure various quantities related to meteorology, precipitation, and radiation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  7. Surface Meteorological Station - Forks, WA (FKS) - Reviewed Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gottas, Daniel

    A variety of instruments are used to measure various quantities related to meteorology, precipitation, and radiation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  8. An initial examination of Singaporean seventh and eighth graders' views of nature of science

    NASA Astrophysics Data System (ADS)

    Lin, Tzung-Jin; Goh, Amos Yoong Shin; Chai, Ching Sing; Tsai, Chin-Chung

    2013-07-01

    Background and purpose. Research on the nature of science (NOS) in Asia-Pacific countries such as Singapore is arguably scarce. This study aimed to survey Singaporean secondary school students' views of NOS with a newly developed instrument named Students' Views of Nature of Science (SVNOS), which included various key aspects of NOS that are generally agreed upon by the science education community. Moreover, the relations between some demographic factors, including gender and grade, and students' views of NOS were explored. Sample, design and method. In total, 359 Singaporean seventh and eighth graders were invited to participate in this survey. The reliability, validity and structure of the SVNOS instrument were ensured by confirmatory factor analysis. A two-way multivariate analysis of variance was then conducted to determine the interaction effects between the gender and grade-level variables. Results and conclusion. The results indicated that the SVNOS instrument is reliable and valid for assessing students' views of NOS across seven distinct NOS dimensions. The male students were more prone to constructivist-oriented views of NOS in most of the SVNOS dimensions, while the female students conveyed more non-objective views of NOS. In addition, the eighth graders revealed more empiricist-oriented views of NOS than the seventh graders in several SVNOS dimensions. This result seems to contradict previous findings that students' views of NOS reflect a developmental trend with increasing educational experience.

  9. Cross-Cultural adaptation of the General Functioning Scale of the Family

    PubMed Central

    Pires, Thiago; de Assis, Simone Gonçalves; Avanci, Joviana Quintes; Pesce, Renata Pires

    2016-01-01

    ABSTRACT OBJECTIVE To describe the process of cross-cultural adaptation of the General Functioning Scale of the Family, a subscale of the McMaster Family Assessment Device, for the Brazilian population. METHODS The General Functioning Scale of the Family was translated into Portuguese and administered to 500 guardians of children in the second grade of elementary school in public schools of Sao Gonçalo, Rio de Janeiro, Southeastern Brazil. The types of equivalences investigated were: conceptual and of items, semantic, operational, and measurement. The study involved discussions with experts, translations and back-translations of the instrument, and psychometric assessment. Reliability and validity studies were carried out by internal consistency testing (Cronbach’s alpha), Guttman split-half correlation model, Pearson correlation coefficient, and confirmatory factor analysis. Associations between General Functioning of the Family and variables theoretically associated with the theme (father’s or mother’s drunkenness and violence between parents) were estimated by odds ratio. RESULTS Semantic equivalence was between 90.0% and 100%. Cronbach’s alpha ranged from 0.79 to 0.81, indicating good internal consistency of the instrument. Pearson correlation coefficient ranged between 0.303 and 0.549. Statistical association was found between the general functioning of the family score and the theoretically related variables, as well as good fit quality of the confirmatory analysis model. CONCLUSIONS The results indicate the feasibility of administering the instrument to the Brazilian population, as it is easy to understand and a good measurement of the construct of interest. PMID:27355464
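The odds-ratio estimation mentioned in this record can be illustrated with a minimal sketch. The counts below are hypothetical, not the study's data; the confidence interval uses the standard normal approximation on the log odds ratio.

```python
import math

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Odds ratio for a 2x2 table, with a 95% CI via the log-OR normal approximation."""
    a, b, c, d = exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, (lo, hi)

# Hypothetical counts: poor family functioning (exposed) vs. parental drunkenness (case)
or_, ci = odds_ratio(40, 60, 20, 80)
print(or_, ci)  # OR = (40*80)/(60*20) ≈ 2.67
```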

  10. The Role of Initial Attack and Performer Expertise on Instrument Identification

    ERIC Educational Resources Information Center

    Cassidy, Jane W.; Schlegel, Amanda L.

    2016-01-01

    The purpose of this study was to examine the role initial attack and expertise play in the identification of instrumental tones. A stimulus CD was made of 32 excerpts of instrumental tones. Sixteen possible combinations of the variables of initial attack (present or absent), expertise (beginner versus professional), and timbre (flute, clarinet,…

  11. Is it feasible to estimate radiosonde biases from interlaced measurements?

    NASA Astrophysics Data System (ADS)

    Kremser, Stefanie; Tradowsky, Jordis S.; Rust, Henning W.; Bodeker, Greg E.

    2018-05-01

    Upper-air measurements of essential climate variables (ECVs), such as temperature, are crucial for climate monitoring and climate change detection. Because of the internal variability of the climate system, many decades of measurements are typically required to robustly detect any trend in the climate data record. It is imperative for the records to be temporally homogeneous over many decades to confidently estimate any trend. Historically, records of upper-air measurements were primarily made for short-term weather forecasts and as such are seldom suitable for studying long-term climate change as they lack the required continuity and homogeneity. Recognizing this, the Global Climate Observing System (GCOS) Reference Upper-Air Network (GRUAN) has been established to provide reference-quality measurements of climate variables, such as temperature, pressure, and humidity, together with well-characterized and traceable estimates of the measurement uncertainty. To ensure that GRUAN data products are suitable to detect climate change, a scientifically robust instrument replacement strategy must always be adopted whenever there is a change in instrumentation. By fully characterizing any systematic differences between the old and new measurement system a temporally homogeneous data series can be created. One strategy is to operate both the old and new instruments in tandem for some overlap period to characterize any inter-instrument biases. However, this strategy can be prohibitively expensive at measurement sites operated by national weather services or research institutes. An alternative strategy that has been proposed is to alternate between the old and new instruments, so-called interlacing, and then statistically derive the systematic biases between the two instruments. Here we investigate the feasibility of such an approach specifically for radiosondes, i.e. flying the old and new instruments on alternating days. Synthetic data sets are used to explore the applicability of this statistical approach to radiosonde change management.
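The interlacing idea can be sketched on synthetic data (all numbers here are illustrative assumptions, not values from the study): because the atmosphere varies smoothly from day to day, differencing each old-sonde flight against the average of its two neighbouring new-sonde flights cancels the atmospheric signal on average and leaves the inter-instrument bias.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days = 2000
true_bias = 0.5  # K; assumed systematic warm bias of the old sonde

# Synthetic "truth": seasonal cycle plus day-to-day weather variability
t = np.arange(n_days)
truth = 220 + 5 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1.0, n_days)

old = truth + true_bias + rng.normal(0, 0.3, n_days)  # old sonde readings
new = truth + rng.normal(0, 0.3, n_days)              # new sonde readings

# Interlacing: the old sonde flies on even days, the new one on odd days.
# Compare each even-day flight with the mean of its two odd-day neighbours.
even = t[2:-1:2]  # interior even days
diffs = old[even] - 0.5 * (new[even - 1] + new[even + 1])
bias_hat = diffs.mean()
print(round(bias_hat, 2))  # close to the injected 0.5 K
```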

  12. Decoupling Solar Variability and Instrument Trends Using the Multiple Same-Irradiance-Level (MuSIL) Analysis Technique

    NASA Astrophysics Data System (ADS)

    Woods, Thomas N.; Eparvier, Francis G.; Harder, Jerald; Snow, Martin

    2018-05-01

    The solar spectral irradiance (SSI) dataset is a key record for studying and understanding the energetics and radiation balance in Earth's environment. Understanding the long-term variations of the SSI over timescales of the 11-year solar activity cycle and longer is critical for many Sun-Earth research topics. Satellite measurements of the SSI have been made since the 1970s, most of them in the ultraviolet, but recently also in the visible and near-infrared. A limiting factor for the accuracy of previous solar variability results is the uncertainty of the instrument degradation corrections, which at some wavelengths are fairly large relative to the amount of solar cycle variability. The primary objective of this investigation has been to separate solar cycle variability from any residual uncorrected instrumental trends in the SSI measurements from the Solar Radiation and Climate Experiment (SORCE) mission and the Thermosphere Ionosphere Mesosphere Energetics and Dynamics (TIMED) mission. A new technique called the Multiple Same-Irradiance-Level (MuSIL) analysis has been developed, which examines an SSI time series at different levels of solar activity to extract long-term trends in the record; the most common result is a downward trend that most likely stems from uncorrected instrument degradation. This technique has been applied to each wavelength in the SSI records from SORCE (2003 - present) and TIMED (2002 - present) to provide new solar cycle variability results between 27 nm and 1600 nm with a resolution of about 1 nm at most wavelengths. The technique, validated against the highly accurate total solar irradiance (TSI) record, has an estimated relative uncertainty of about 5% of the measured solar cycle variability. The MuSIL results are further validated by comparing the new solar cycle variability results across different solar cycles.
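The core of a same-irradiance-level analysis can be sketched on synthetic data (the proxy, activity levels, and drift below are illustrative assumptions, not SORCE/TIMED values): at a fixed solar activity level the solar contribution is constant, so any residual trend among those epochs is instrumental.

```python
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(500, 5500)

# Synthetic solar activity proxy and an irradiance that follows it
proxy = 100 + 50 * np.sin(2 * np.pi * days / 4000)
true_ssi = 1.0 + 0.002 * (proxy - 100)

drift = -2e-5 * days / 365.0  # assumed uncorrected degradation, per year
measured = true_ssi + drift + rng.normal(0, 2e-5, days.size)

# Same-irradiance-level selection: keep only epochs at one fixed activity level;
# any remaining trend in the measurement is then instrumental, not solar
band = (proxy > 99) & (proxy < 101)
trend_per_year = np.polyfit(days[band], measured[band], 1)[0] * 365.0
print(trend_per_year)  # close to the injected -2e-5 per year
```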

  13. Advances in satellite remote sensing of environmental variables for epidemiological applications.

    PubMed

    Goetz, S J; Prince, S D; Small, J

    2000-01-01

    Earth-observing satellites have provided an unprecedented view of the land surface but have been exploited relatively little for the measurement of environmental variables of particular relevance to epidemiology. Recent advances in techniques to recover continuous fields of air temperature, humidity, and vapour pressure deficit from remotely sensed observations have significant potential for disease vector monitoring and related epidemiological applications. We report on the development of techniques to map environmental variables with relevance to the prediction of the relative abundance of disease vectors and intermediate hosts. Improvements to current methods of obtaining information on vegetation properties, canopy and surface temperature and soil moisture over large areas are also discussed. Algorithms used to measure these variables incorporate visible, near-infrared and thermal infrared radiation observations derived from time series of satellite-based sensors, focused here primarily but not exclusively on the Advanced Very High Resolution Radiometer (AVHRR) instruments. The variables compare favourably with surface measurements over a broad array of conditions at several study sites, and maps of retrieved variables captured patterns of spatial variability comparable to, and locally more accurate than, spatially interpolated meteorological observations. Applications of multi-temporal maps of these variables are discussed in relation to current epidemiological research on the distribution and abundance of some common disease vectors.
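As an illustration of the kind of vegetation variable retrievable from visible and near-infrared bands such as AVHRR's: NDVI is not named in this abstract, but it is the canonical index built from exactly those two channels. The reflectance values below are hypothetical.

```python
import numpy as np

def ndvi(red, nir):
    """Normalised Difference Vegetation Index from red and near-infrared reflectance."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red)

# Dense canopy reflects strongly in the NIR and absorbs red light,
# so vegetated surfaces yield high NDVI values
print(ndvi(0.08, 0.40))
```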

  14. To Evacuate or Shelter in Place: Implications of Universal Hurricane Evacuation Policies on Nursing Home Residents

    PubMed Central

    Dosa, David; Hyer, Kathryn; Thomas, Kali; Swaminathan, Shailender; Feng, Zhanlian; Brown, Lisa; Mor, Vincent

    2011-01-01

    Objective To examine the differential morbidity/mortality associated with evacuation versus sheltering in place for nursing home (NH) residents exposed to the 4 most recent Gulf hurricanes. Methods Observational study using Medicare claims and NH data sources. We compared the differential mortality/morbidity for long-stay residents exposed to 4 recent hurricanes (Katrina, Rita, Gustav, and Ike) relative to those residing at the same NHs over the same time periods during the prior 2 non-hurricane years as a control. Using an instrumental variable analysis, we then evaluated the independent effect of evacuation on outcomes at 90 days. Results Among 36,389 NH residents exposed to a storm, the 30- and 90-day mortality/hospitalization rates increased compared to non-hurricane control years. There were a cumulative total of 277 extra deaths and 872 extra hospitalizations at 30 days. At 90 days, 579 extra deaths and 544 extra hospitalizations were observed. Using the instrumental variable analysis, evacuation increased the probability of death at 90 days from 2.7-5.3% and hospitalization by 1.8-8.3%, independent of other factors. Conclusion Among residents exposed to hurricanes, evacuation significantly exacerbated subsequent morbidity/mortality. PMID:21885350
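The instrumental variable logic used in this record can be sketched with simulated data (all coefficients are hypothetical, not the study's estimates): an instrument that shifts treatment but is independent of the unmeasured confounder recovers the causal effect that naive regression misses.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000

u = rng.normal(size=n)                       # unmeasured confounder (e.g. frailty)
z = rng.binomial(1, 0.5, n)                  # instrument, independent of u
treat = (0.8 * z + 0.5 * u + rng.normal(size=n) > 0.6).astype(float)
beta = 0.4                                   # true causal effect of treatment
y = beta * treat + 0.7 * u + rng.normal(size=n)

# Naive OLS slope is confounded upward, because u raises both treatment and outcome
cov = np.cov(treat, y)
ols = cov[0, 1] / cov[0, 0]

# Wald estimator (two-stage least squares with a single binary instrument)
iv = (y[z == 1].mean() - y[z == 0].mean()) / (treat[z == 1].mean() - treat[z == 0].mean())
print(round(ols, 2), round(iv, 2))  # OLS overestimates beta; IV is close to 0.4
```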

  15. Effects of trunk stability on isometric knee extension muscle strength measurement while sitting.

    PubMed

    Hirano, Masahiro; Gomi, Masahiro; Katoh, Munenori

    2016-09-01

    [Purpose] This study aimed to investigate the effect of trunk stability on isometric knee extension muscle strength measurement while sitting by performing simultaneous measurements with a handheld dynamometer (HHD) and an isokinetic dynamometer (IKD) in the same seated condition. [Subjects and Methods] The subjects were 30 healthy volunteers. Isometric knee extension muscle strength was simultaneously measured with a HHD and an IKD by using an IKD-specific chair. The measurement was performed twice. Measurement instrument variables and the number of measurements were examined by using the analysis of variance and correlation tests. [Results] The measurement instrument variables and the number of measurements were not significantly different. The correlation coefficients between the HHD and IKD measurements were ≥0.96. [Conclusion] Isometric knee extension muscle strength measurement using the HHD in the sitting position resulted in a lower value than that using the IKD, presumably because of the effect of trunk stability on the measurement. In the same seated posture with trunk stability, no significant difference in measurement values was observed between the HHD and IKD. The present findings suggest that trunk stability while seated during isometric knee extension muscle strength measurement influenced the HHD measurement.

  16. Manufacturing challenge: An employee perception of the impact of BEM variables on motivation

    NASA Astrophysics Data System (ADS)

    Nyaude, Alaster

    The study examines the impact of Thomas F. Gilbert's Behavior Engineering Model (BEM) variables on employee perception of motivation at an aerospace equipment manufacturing plant in Georgia. The research process involved a literature review and the selection of an appropriate survey instrument for the study. The Hersey-Chevalier modified PROBE instrument (Appendix C) was used with Dr. Roger Chevalier's validation. The participants' responses were further examined to determine the influence of the demographic control variables of age, gender, length of service with the company, and education on employee perception of motivation. The results indicated that the three most motivating variables were knowledge and skills, capacity, and resources: knowledge and skills was perceived as the most motivating, capacity as the second, and resources as the third. Interestingly, the fourth most motivating variable was information, the fifth was motives, and the sixth was incentives. The results also showed that the demographic control variables had no influence on employee perception of motivation. Further research may be required to understand to what extent these BEM variables impact employee perceptions of motivation.

  17. 8 years of Solar Spectral Irradiance Observations from the ISS with the SOLAR/SOLSPEC Instrument

    NASA Astrophysics Data System (ADS)

    Damé, L.; Bolsée, D.; Meftah, M.; Irbah, A.; Hauchecorne, A.; Bekki, S.; Pereira, N.; Cessateur, G.; Marchand, M.; Thiéblemont, R.; Foujols, T.

    2016-12-01

    Accurate measurements of Solar Spectral Irradiance (SSI) are of primary importance for a better understanding of solar physics and of the impact of solar variability on climate (via Earth's atmospheric photochemistry). The acquisition of a top-of-atmosphere reference solar spectrum and of its temporal and spectral variability during the unusual solar cycle 24 is of prime interest for these studies. These measurements have been performed since April 2008 with the SOLSPEC spectro-radiometer, from the far ultraviolet to the infrared (166 nm to 3088 nm). This instrument, developed under a fruitful LATMOS/BIRA-IASB collaboration, is part of the Solar Monitoring Observatory (SOLAR) payload, externally mounted on the Columbus module of the International Space Station (ISS). The SOLAR mission, now 8 years in duration, will cover almost the entire solar cycle 24. We present here the in-flight operations and performance of the SOLSPEC instrument, including the engineering corrections, calibrations, and improved procedures for aging corrections. Accordingly, an SSI reference spectrum from the UV to the NIR will be presented, together with its UV variability, as measured by SOLAR/SOLSPEC. Uncertainties on these measurements and comparisons with other instruments will be briefly discussed.

  18. Instrumental colour of Iberian ham subcutaneous fat and lean (biceps femoris): Influence of crossbreeding and rearing system.

    PubMed

    Carrapiso, Ana I; García, Carmen

    2005-10-01

    The influence of crossbreeding (Iberian vs Iberian×Duroc 50% pigs) and rearing system (Montanera vs Pienso) on the instrumental colour of Iberian ham (subcutaneous fat and biceps femoris muscle), and the relationships to sensory appearance and chemical composition, were investigated using a factorial design. In subcutaneous fat, a significant effect (p<0.05) of crossbreeding and rearing system was found: b* and chroma were larger in hams from Iberian pigs than from Iberian×Duroc (50%) pigs, and L*, a* and chroma were larger in Pienso hams than in Montanera hams. CIEL*a*b* variables of subcutaneous fat were closely related to subcutaneous fatty acid composition, the largest correlations involving L* (L* and 18:0, 0.652, p<0.001; L* and 18:1, -0.616, p<0.001). Instrumental colour variables and sensory appearance were also correlated (L* and fat pinkness, -0.539, p<0.001). In lean (biceps femoris), instrumental colour data were not affected by crossbreeding or rearing system. CIEL*a*b* variables were not related to chemical composition (moisture, NaCl, intramuscular fat and pigment content), although they were correlated to sensory appearance (L* and marbling, 0.419, p=0.014).

  19. Multivariate evaluation of the cutting performance of rotary instruments with electric and air-turbine handpieces.

    PubMed

    Funkenbusch, Paul D; Rotella, Mario; Chochlidakis, Konstantinos; Ercoli, Carlo

    2016-10-01

    Laboratory studies of tooth preparation often involve single values for all variables other than the one being tested. In contrast, in clinical settings, not all variables can be adequately controlled. For example, a new dental rotary cutting instrument may be tested in the laboratory by making a specific cut with a fixed force, but, in clinical practice, the instrument must make different cuts with individual dentists applying different forces. Therefore, the broad applicability of laboratory results to diverse clinical conditions is uncertain and the comparison of effects across studies difficult. The purpose of this in vitro study was to examine the effects of 9 process variables on the dental cutting of rotary cutting instruments used with an electric handpiece and compare them with those of a previous study that used an air-turbine handpiece. The effects of 9 key process variables on the efficiency of a simulated dental cutting operation were measured. A fractional factorial experiment was conducted by using an electric handpiece in a computer-controlled, dedicated testing apparatus to simulate dental cutting procedures with Macor blocks as the cutting substrate. Analysis of variance (ANOVA) was used to assess the statistical significance (α=.05). Four variables (targeted applied load, cut length, diamond grit size, and cut type) consistently produced large, statistically significant effects, whereas 5 variables (rotations per minute, number of cooling ports, rotary cutting instrument diameter, disposability, and water flow rate) produced relatively small, statistically insignificant effects. These results are generally similar to those previously found for an air-turbine handpiece. Regardless of whether an electric or air-turbine handpiece was used, the control exerted by the dentist, simulated in this study by targeting a specific level of applied force, was the single most important factor affecting cutting efficiency. Cutting efficiency was also significantly affected by factors simulating patient/clinical circumstances and hardware choices. These results highlight the greater importance of local clinical conditions (procedure, dentist) in understanding dental cutting as opposed to other hardware-related factors. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
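The fractional factorial approach can be sketched with a toy half-fraction (the factors and effect sizes below are illustrative, not the study's nine variables): a 2^(4-1) design runs 8 of the 16 possible factor combinations by generating the fourth factor from the other three, and main effects are estimated as mean(high) minus mean(low).

```python
import numpy as np
from itertools import product

# 2^(4-1) half-fraction: full factorial in A, B, C with generator D = ABC
runs = np.array([(a, b, c, a * b * c) for a, b, c in product((-1, 1), repeat=3)])

# Hypothetical response: applied load (A) and grit size (C) drive cutting rate
rng = np.random.default_rng(7)
y = 10 + 3 * runs[:, 0] + 1.5 * runs[:, 2] + rng.normal(0, 0.2, len(runs))

# Main effect of a factor = mean response at +1 minus mean response at -1
effects = {name: y[runs[:, i] == 1].mean() - y[runs[:, i] == -1].mean()
           for i, name in enumerate("ABCD")}
print({k: round(v, 1) for k, v in effects.items()})  # A ≈ 6, C ≈ 3, B and D near 0
```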

  20. Performance evaluation of the microINR® point-of-care INR-testing system.

    PubMed

    Joubert, J; van Zyl, M C; Raubenheimer, J

    2018-04-01

    Point-of-care International Normalised Ratio (INR) testing is used frequently. We evaluated the microINR® POC system for accuracy, precision and measurement repeatability, and investigated instrument and test chip variability and error rates. Venous blood INRs of 210 patients on warfarin were obtained with Thromborel® S on the Sysmex CS-2100i® analyser and compared with capillary blood microINR® values. Precision was assessed using control materials. Measurement repeatability was calculated on 51 duplicate finger-prick INRs. Triplicate finger-prick INRs using three different instruments (30 patients) and three different test chip lots (29 patients) were used to evaluate instrument and test chip variability. Linear regression analysis of microINR® and Sysmex CS-2100i® values showed a correlation coefficient of 0.96 (P < .0001) and a positive proportional bias of 4.4%. Dosage concordance was 93.8% and clinical agreement 95.7%. All acceptance criteria based on ISO standard 17593:2007 system accuracy requirements were met. Control material coefficients of variation (CV) varied from 6.2% to 16.7%. The capillary blood measurement repeatability CV was 7.5%. No significant instrument (P = .93) or test chip (P = .81) variability was found, and the error rate was low (2.8%). The microINR® instrument is accurate and precise for monitoring warfarin therapy. © 2017 John Wiley & Sons Ltd.
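A duplicate-measurement repeatability CV of the kind reported here can be computed as follows. The data are simulated (assuming proportional, constant-CV measurement error), with the within-subject CV taken as the root-mean-square of each pair's SD-over-mean ratio.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 51
true_inr = rng.uniform(1.5, 4.0, n)  # hypothetical patient INR levels
cv_true = 0.075                       # 7.5% repeatability, as in the record

# Duplicate finger-prick measurements with proportional (constant-CV) error
dup = true_inr[:, None] * (1 + rng.normal(0, cv_true, (n, 2)))

# Within-subject CV: root-mean-square of per-pair SD over per-pair mean
pair_cv = dup.std(axis=1, ddof=1) / dup.mean(axis=1)
cv_hat = np.sqrt(np.mean(pair_cv ** 2))
print(f"{cv_hat:.1%}")  # close to the simulated 7.5%
```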

  1. Hand-held instrument should relieve hematoma pressure

    NASA Technical Reports Server (NTRS)

    Raggio, L. J.; Robertson, T. L.

    1967-01-01

    Portable instrument relieves hematomas beneath fingernails and toenails without surgery. This device simplifies the operative procedure with an instant variable heating tip, adjustable depth settings and interchangeable tip sizes for cauterizing small areas and relieving pressurized clots.

  2. Surface Meteorological Station - ESRL Short Tower, Wasco Airport - Raw Data

    DOE Data Explorer

    Gottas, Daniel

    2017-12-11

    A variety of instruments are used to measure various quantities related to meteorology, precipitation, and radiation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  3. Surface Meteorological Station - ESRL Short Tower, Wasco Airport - Reviewed Data

    DOE Data Explorer

    Gottas, Daniel

    2017-12-11

    A variety of instruments are used to measure various quantities related to meteorology, precipitation, and radiation near the Earth’s surface. Typically, a standard suite of instruments is deployed to monitor meteorological state variables.

  4. The contextual effects of social capital on health: a cross-national instrumental variable analysis.

    PubMed

    Kim, Daniel; Baum, Christopher F; Ganz, Michael L; Subramanian, S V; Kawachi, Ichiro

    2011-12-01

    Past research on the associations between area-level/contextual social capital and health has produced conflicting evidence. However, interpreting this rapidly growing literature is difficult because estimates using conventional regression are prone to major sources of bias including residual confounding and reverse causation. Instrumental variable (IV) analysis can reduce such bias. Using data on up to 167,344 adults in 64 nations in the European and World Values Surveys and applying IV and ordinary least squares (OLS) regression, we estimated the contextual effects of country-level social trust on individual self-rated health. We further explored whether these associations varied by gender and individual levels of trust. Using OLS regression, we found higher average country-level trust to be associated with better self-rated health in both women and men. Instrumental variable analysis yielded qualitatively similar results, although the estimates were more than double in size in both sexes when country population density and corruption were used as instruments. The estimated health effects of raising the percentage of a country's population that trusts others by 10 percentage points were at least as large as the estimated health effects of an individual developing trust in others. These findings were robust to alternative model specifications and instruments. Conventional regression and to a lesser extent IV analysis suggested that these associations are more salient in women and in women reporting social trust. In a large cross-national study, our findings, including those using instrumental variables, support the presence of beneficial effects of higher country-level trust on self-rated health. Previous findings for contextual social capital using traditional regression may have underestimated the true associations. Given the close linkages between self-rated health and all-cause mortality, the public health gains from raising social capital within and across countries may be large. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. The contextual effects of social capital on health: a cross-national instrumental variable analysis

    PubMed Central

    Kim, Daniel; Baum, Christopher F; Ganz, Michael; Subramanian, S V; Kawachi, Ichiro

    2011-01-01

    Past observational studies of the associations of area-level/contextual social capital with health have revealed conflicting findings. However, interpreting this rapidly growing literature is difficult because estimates using conventional regression are prone to major sources of bias including residual confounding and reverse causation. Instrumental variable (IV) analysis can reduce such bias. Using data on up to 167 344 adults in 64 nations in the European and World Values Surveys and applying IV and ordinary least squares (OLS) regression, we estimated the contextual effects of country-level social trust on individual self-rated health. We further explored whether these associations varied by gender and individual levels of trust. Using OLS regression, we found higher average country-level trust to be associated with better self-rated health in both women and men. Instrumental variable analysis yielded qualitatively similar results, although the estimates were more than double in size in women and men using country population density and corruption as instruments. The estimated health effects of raising the percentage of a country's population that trusts others by 10 percentage points were at least as large as the estimated health effects of an individual developing trust in others. These findings were robust to alternative model specifications and instruments. Conventional regression and to a lesser extent IV analysis suggested that these associations are more salient in women and in women reporting social trust. In a large cross-national study, our findings, including those using instrumental variables, support the presence of beneficial effects of higher country-level trust on self-rated health. Past findings for contextual social capital using traditional regression may have underestimated the true associations. Given the close linkages between self-rated health and all-cause mortality, the public health gains from raising social capital within countries may be large. PMID:22078106

  6. Instrument Suite for Vertical Characterization of the Ionosphere-Thermosphere System

    NASA Technical Reports Server (NTRS)

    Herrero, Federico; Jones, Hollis; Finne, Theodore; Nicholas, Andrew

    2012-01-01

    A document describes a suite that provides four simultaneous ion and neutral-atom measurements as a function of altitude, with variable sensitivity for neutral atmospheric species. The variable sensitivity makes it possible to extend the measurements over the altitude range of 100 to more than 700 km. The four instruments in the suite are (1) a neutral wind-temperature spectrometer (WTS), (2) an ion-drift ion-temperature spectrometer (IDTS), (3) a neutral mass spectrometer (NMS), and (4) an ion mass spectrometer (IMS).

  7. Liquid Medication Dosing Errors in Children: Role of Provider Counseling Strategies

    PubMed Central

    Yin, H. Shonna; Dreyer, Benard P.; Moreira, Hannah A.; van Schaick, Linda; Rodriguez, Luis; Boettger, Susanne; Mendelsohn, Alan L.

    2014-01-01

    Objective To examine the degree to which recommended provider counseling strategies, including advanced communication techniques and dosing instrument provision, are associated with reductions in parent liquid medication dosing errors. Methods Cross-sectional analysis of baseline data on provider communication and dosing instrument provision from a study of a health literacy intervention to reduce medication errors. Parents whose children (<9 years) were seen in two urban public hospital pediatric emergency departments (EDs) and were prescribed daily dose liquid medications self-reported whether they received counseling about their child’s medication, including advanced strategies (teachback, drawings/pictures, demonstration, showback) and receipt of a dosing instrument. Primary dependent variable: observed dosing error (>20% deviation from prescribed). Multivariate logistic regression analyses performed, controlling for: parent age, language, country, ethnicity, socioeconomic status, education, health literacy (Short Test of Functional Health Literacy in Adults); child age, chronic disease status; site. Results Of 287 parents, 41.1% made dosing errors. Advanced counseling and instrument provision in the ED were reported by 33.1% and 19.2%, respectively; 15.0% reported both. Advanced counseling and instrument provision in the ED were associated with decreased errors (30.5 vs. 46.4%, p=0.01; 21.8 vs. 45.7%, p=0.001). In adjusted analyses, ED advanced counseling in combination with instrument provision was associated with a decreased odds of error compared to receiving neither (AOR 0.3; 95% CI 0.1–0.7); advanced counseling alone and instrument alone were not significantly associated with odds of error. Conclusion Provider use of advanced counseling strategies and dosing instrument provision may be especially effective in reducing errors when used together. PMID:24767779

  8. Review and discussion of homogenisation methods for climate data

    NASA Astrophysics Data System (ADS)

    Ribeiro, S.; Caineta, J.; Costa, A. C.

    2016-08-01

    The quality of climate data is of extreme relevance, since these data are used in many different contexts. However, few climate time series are free from non-natural irregularities. These inhomogeneities are related to the process of collecting, digitising, processing, transferring, storing and transmitting climate data series. For instance, they can be caused by changes of measuring instrumentation, observing practices or relocation of weather stations. In order to avoid errors and bias in analyses that use those data, it is particularly important to detect and remove those non-natural irregularities prior to their use. Moreover, given the increase in storage capacity, the recent gathering of massive amounts of weather data also entails a considerable effort to guarantee their quality. The process of detecting and correcting irregularities is named homogenisation. A comprehensive summary and description of the available homogenisation methods is critical for climatologists and other experts seeking the method best suited to their data, since the effectiveness of homogenisation methods depends on the type, temporal resolution and spatial variability of the climatic variable. Several comparison studies have been published so far. However, due to the absence of time series in which the irregularities are known, only a few of those comparisons indicate the level of success of the homogenisation methods. This article reviews the characteristics of the most important procedures used in the homogenisation of climatic variables, based on a thorough literature search. It also summarises many applications of these methods in order to illustrate their applicability, which may help climatologists and other experts to identify adequate method(s) for their particular needs. This review also describes comparison studies that evaluated the efficiency of homogenisation methods, and provides a summary of conclusions and lessons learned regarding good practices for their use.

  9. [A survey instrument for evaluating psychological variables and risky sexual behavior among young adults at two university centers in Mexico].

    PubMed

    Piña López, Julio A; Robles Montijo, Susana; Rivera Icedo, Blanca M

    2007-11-01

    To measure the psychometric attributes of a survey instrument designed to evaluate historical and context variables that lead to high-risk sexual behaviors among a sample of university students in Mexico. Cross-sectional study of a sample of 1,346 university students in Mexico: 784 from the Sonora State Center for Higher Education in Hermosillo, Sonora, or 33.2% of its total enrollment; and 562 from the National Autonomous University of Mexico, at Tlalnepantla campus in Mexico State, or 23.5% of its total enrollment. The study took place in Hermosillo during the month of October 2006 and in Tlalnepantla from January to March 2006. The survey had 11 questions on sociodemographics, 7 on risky sexual behaviors, 22 on related motives, 8 on social context, and 6 on physical status prior to sexual relations. The survey was evaluated in terms of how well the questions were understood, its conceptual validity, and reliability. The final version of the survey instrument was composed of 44 questions. The reliability analysis produced an overall Cronbach alpha value of 0.821, taking into account all the variables combined and grouped by factor. Three factors were found that together accounted for 38.36% of the total variance: reasons for not using a condom in the first sexual relationship or throughout life, reasons for inconsistent use of a condom with a casual sex partner, and willingness to become sexually active and to engage in casual sex. The psychometric attributes of this survey instrument were found to be satisfactory. Those interested in using this instrument should become familiar with the theoretical model on which it is based, since understanding the results depends on properly defining the historical and context variables, and their interaction.
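    The reliability statistic reported above, Cronbach's alpha, can be computed from a respondent-by-item score matrix. A minimal sketch (not the authors' code; the standard formula with sample variances is assumed):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)),
    using sample variances (ddof=1).
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var_sum = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Perfectly consistent items yield alpha = 1.0.
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))
```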

  10. Uncertainty in counting ice nucleating particles with continuous flow diffusion chambers

    DOE PAGES

    Garimella, Sarvesh; Rothenberg, Daniel A.; Wolf, Martin J.; ...

    2017-09-14

    This study investigates the measurement of ice nucleating particle (INP) concentrations and sizing of crystals using continuous flow diffusion chambers (CFDCs). CFDCs have been deployed for decades to measure the formation of INPs under controlled humidity and temperature conditions in laboratory studies and by ambient aerosol populations. These measurements have, in turn, been used to construct parameterizations for use in models by relating the formation of ice crystals to state variables such as temperature and humidity as well as aerosol particle properties such as composition and number. We show here that assumptions of ideal instrument behavior are not supported by measurements made with a commercially available CFDC, the SPectrometer for Ice Nucleation (SPIN), and the instrument on which it is based, the Zurich Ice Nucleation Chamber (ZINC). Non-ideal instrument behavior, which is likely inherent to varying degrees in all CFDCs, is caused by exposure of particles to different humidities and/or temperatures than predicted from the instrument theory of operation. This can result in a systematic, and variable, underestimation of reported INP concentrations. Here we find variable correction factors from 1.5 to 9.5, consistent with previous literature values. We use a machine learning approach to show that non-ideality is most likely due to small-scale flow features where the aerosols are combined with sheath flows. Machine learning is also used to minimize the uncertainty in measured INP concentrations. Finally, we suggest that detailed measurement, on an instrument-by-instrument basis, be performed to characterize this uncertainty.

  11. Variable Stars in the Field of V729 Aql

    NASA Astrophysics Data System (ADS)

    Cagaš, P.

    2017-04-01

    Wide field instruments can be used to acquire light curves of tens or even hundreds of variable stars per night, which increases the probability of new discoveries of interesting variable stars and generally increases the efficiency of observations. At the same time, wide field instruments produce a large amount of data, which must be processed using advanced software. The traditional approach, typically used by amateur astronomers, requires an unacceptable amount of time to process each data set. New functionality, built into the SIPS software package, can shorten the time needed to obtain light curves by several orders of magnitude. Also, the newly introduced SILICUPS software is intended for post-processing of stored light curves. It can be used to visualize observations from many nights, to find variable star periods, evaluate types of variability, etc. This work provides an overview of tools used to process data from the large field of view around the variable star V729 Aql and demonstrates the results.

  12. Evaluating nursing administration instruments.

    PubMed

    Huber, D L; Maas, M; McCloskey, J; Scherb, C A; Goode, C J; Watson, C

    2000-05-01

    To identify and evaluate available measures that can be used to examine the effects of management innovations in five important areas: autonomy, conflict, job satisfaction, leadership, and organizational climate. Management interventions target the context in which care is delivered and through which evidence for practice diffuses. These innovations need to be evaluated for their effects on desired outcomes. However, busy nurses may not have the time to locate, evaluate, and select instruments to measure expected nursing administration outcomes without research-based guidance. Multiple and complex important contextual variables need psychometrically sound and easy-to-use measurement instruments identified for use in both practice and research. An expert focus group consensus methodology was used in this evaluation research to review available instruments in the five areas and evaluate which of these instruments are psychometrically sound and easy to use in the practice setting. The result is a portfolio of measures, clustered by concept and displayed on a spreadsheet. Retrieval information is provided. The portfolio includes the expert consensus judgment as well as useful descriptive information. The research reported here identifies psychometrically sound and easy-to-use instruments for measuring five key variables to be included in a portfolio. The results of this study can be used as a beginning for saving time in instrument selection and as an aid for determining the best instrument for measuring outcomes from a clinical or management intervention.

  13. Aquarius Instrument Science Calibration During the Risk Reduction Phase

    NASA Technical Reports Server (NTRS)

    Ruf, Christopher S.

    2004-01-01

    This final report presents the results of work performed under NASA Grant NAG512726 during the period 15 January 2003 through 30 June 2004. An analysis was performed of a possible vicarious calibration method for use by Aquarius to monitor and stabilize the absolute and relative calibration of its microwave radiometer. Stationary statistical properties of the brightness temperature (T(sub B)) measured by a low Earth orbiting radiometer operating at 1.4135 GHz are considered as a means of validating its absolute calibration. The global minimum, maximum, and average T(sub B) are considered, together with a vicarious cold reference method that detects the presence of a sharp lower bound on naturally occurring values for T(sub B). Of particular interest is the reliability with which these statistics can be extracted from a realistic distribution of T(sub B) measurements that would be observed by a typical sensor. Simulations of measurements are performed that include the effects of instrument noise and variable environmental factors such as the global water vapor and ocean surface temperature, salinity and wind distributions. Global minima can vary widely due to instrument noise and are not a reliable calibration reference. Global maxima are strongly influenced by several environmental factors as well as instrument noise and are even less stationary. Global averages are largely insensitive to instrument noise and, in most cases, to environmental conditions as well. The global average T(sub B) varies at only the 0.1 K RMS level except in cases of anomalously high winds, when it can increase considerably more. The vicarious cold reference is similarly insensitive to instrument effects and most environmental factors. It is not significantly affected by high wind conditions. The stability of the vicarious reference is, however, found to be somewhat sensitive (at the several tenths of Kelvins level) to variations in the background cold space brightness, T(sub c). 
The global average is much less sensitive to this parameter and so using two approaches together can be mutually beneficial.

  14. Electron-Beam Diagnostic Methods for Hypersonic Flow Diagnostics

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The purpose of this work was the evaluation of the use of electron-beam fluorescence for flow measurements during hypersonic flight. Both analytical and numerical models were developed in this investigation to quantitatively evaluate flow field imaging concepts based upon the electron beam fluorescence technique for use in flight research and wind tunnel applications. Specific models were developed for: (1) fluorescence excitation/emission for nitrogen, (2) rotational fluorescence spectrum for nitrogen, (3) single and multiple scattering of electrons in a variable density medium, (4) spatial and spectral distribution of fluorescence, (5) measurement of rotational temperature and density, (6) optical filter design for fluorescence imaging, and (7) temperature accuracy and signal acquisition time requirements. Application of these models to a typical hypersonic wind tunnel flow is presented. In particular, the capability of simulating the fluorescence resulting from electron impact ionization in a variable density nitrogen or air flow provides the capability to evaluate the design of imaging instruments for flow field mapping. The result of this analysis is a recommendation that quantitative measurement of hypersonic flow fields using electron-beam fluorescence is a tractable method with electron beam energies of 100 keV. With lower electron energies, electron scattering increases with significant beam divergence, which makes quantitative imaging difficult. The potential application of the analytical and numerical models developed in this work is in the design of a flow field imaging instrument for use in hypersonic wind tunnels or onboard a flight research vehicle.

  15. Functional leg length discrepancy between theories and reliable instrumental assessment: a study about newly invented NPoS system.

    PubMed

    Mahmoud, Asmaa; Abundo, Paolo; Basile, Luisanna; Albensi, Caterina; Marasco, Morena; Bellizzi, Letizia; Galasso, Franco; Foti, Calogero

    2017-01-01

    Despite the distinct social and financial impact of Leg Length Discrepancy (LLD), controversial and conflicting results still exist regarding a reliable assessment/correction method. For proper management it is essential to discriminate between anatomical and functional Leg Length Discrepancy (FLLD). With the newly invented NPoS (New Postural Solution), developed through the collaboration of the PRM Department, Tor Vergata University, with Baro Postural Instruments srl, positive results were observed in both measuring and compensating the hemi-pelvic antero-medial rotation in FLLD through a personalised bilateral heel raise using two NPoS components: the Foot Image System (FIS) and the Postural Optimizer System (POS). This led our research interest to test the validity of NPoS as a preliminary step before evaluating its implementation in postural disorders. After clinical evaluation, 4 subjects with FLLD were assessed with NPoS. Over a period of 2 months, every subject was evaluated 12 times by two different operators, 48 measurements in total; results were verified against BTS GaitLab results. Intra-operator and inter-operator variability analyses showed statistically insignificant differences, while inter-method variability between NPoS and BTS parameters expressed a linear correlation. The results suggest a significant validity of NPoS in the assessment and correction of FLLD, with a high degree of reproducibility and minimal operator dependency. This can be considered a basis for promising clinical implications of NPoS as a reliable, cost-effective postural assessment/correction tool. V.

  16. Feedback control of acoustic musical instruments: collocated control using physical analogs.

    PubMed

    Berdahl, Edgar; Smith, Julius O; Niemeyer, Günter

    2012-01-01

    Traditionally, the average professional musician has owned numerous acoustic musical instruments, many of them having distinctive acoustic qualities. However, a modern musician could prefer to have a single musical instrument whose acoustics are programmable by feedback control, where acoustic variables are estimated from sensor measurements in real time and then fed back in order to influence the controlled variables. In this paper, theory is presented that describes stable feedback control of an acoustic musical instrument. The presentation should be accessible to members of the musical acoustics community who may have limited or no experience with feedback control. First, the only control strategy guaranteed to be stable subject to any musical instrument mobility is described: the sensors and actuators must be collocated, and the controller must emulate a physical analog system. Next, the most fundamental feedback controllers and the corresponding physical analog systems are presented. The effects that these controllers have on acoustic musical instruments are described. Finally, practical design challenges are discussed. A proof explains why changing the resonance frequency of a musical resonance requires much more control power than changing the decay time of the resonance. © 2012 Acoustical Society of America.

  17. An identification method for damping ratio in rotor systems

    NASA Astrophysics Data System (ADS)

    Wang, Weimin; Li, Qihang; Gao, Jinji; Yao, Jianfei; Allaire, Paul

    2016-02-01

    Centrifugal compressor testing with magnetic bearing excitations is the last step to assure the compressor rotordynamic stability in the designed operating conditions. To meet the challenges of stability evaluation, a new method combining the rational polynomials method (RPM) with the weighted instrumental variables (WIV) estimator to fit the directional frequency response function (dFRF) is presented. Numerical simulation results show that the method suggested in this paper can identify the damping ratio of the first forward and backward modes with high accuracy, even in a severe noise environment. Experimental tests were conducted to study the effect of different bearing configurations on the stability of rotor. Furthermore, two example centrifugal compressors (a nine-stage straight-through and a six-stage back-to-back) were employed to verify the feasibility of identification method in industrial configurations as well.

  18. Model predictive controller design for boost DC-DC converter using T-S fuzzy cost function

    NASA Astrophysics Data System (ADS)

    Seo, Sang-Wha; Kim, Yong; Choi, Han Ho

    2017-11-01

    This paper proposes a Takagi-Sugeno (T-S) fuzzy method to select cost function weights of finite control set model predictive DC-DC converter control algorithms. The proposed method updates the cost function weights at every sample time by using T-S type fuzzy rules derived from the common optimal control engineering knowledge that a state or input variable with an excessively large magnitude can be penalised by increasing the weight corresponding to the variable. The best control input is determined via the online optimisation of the T-S fuzzy cost function for all the possible control input sequences. This paper implements the proposed model predictive control algorithm in real time on a Texas Instruments TMS320F28335 floating-point Digital Signal Processor (DSP). Some experimental results are given to illuminate the practicality and effectiveness of the proposed control system under several operating conditions. The results verify that our method can yield not only good transient and steady-state responses (fast recovery time, small overshoot, zero steady-state error, etc.) but also insensitiveness to abrupt load or input voltage parameter variations.

  19. Vibration Testing of Electrical Cables to Quantify Loads at Tie-Down Locations

    NASA Technical Reports Server (NTRS)

    Dutson, Joseph D.

    2013-01-01

    The standard method for defining static equivalent structural load factors for components is based on Miles' equation. Unless test data is available, 5% critical damping is assumed for all components when calculating loads. Application of this method to electrical cable tie-down hardware often results in high loads, which often exceed the capability of typical tie-down options such as cable ties and P-clamps. Random vibration testing of electrical cables was used to better understand the factors that influence component loads: natural frequency, damping, and mass participation. An initial round of vibration testing successfully identified variables of interest, checked out the test fixture and instrumentation, and provided justification for removing some conservatism in the standard method. Additional testing is planned that will include a larger range of cable sizes for the most significant contributors to load as variables to further refine loads at cable tie-down points. Completed testing has provided justification to reduce loads at cable tie-downs by 45% with additional refinement based on measured cable natural frequencies.
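    The standard method referenced above rests on Miles' equation, which gives the RMS acceleration response of a single-degree-of-freedom system to a flat random vibration input. A sketch under stated assumptions (flat acceleration spectral density at the natural frequency; the 5% critical damping default from the abstract; names are illustrative, and a 3-sigma factor is commonly applied on top of the RMS value for design loads):

```python
import math

def miles_grms(fn_hz, asd_g2_per_hz, damping_ratio=0.05):
    """Miles' equation: G_rms = sqrt((pi/2) * fn * Q * W(fn)).

    fn_hz          : component natural frequency (Hz)
    asd_g2_per_hz  : input acceleration spectral density at fn (g^2/Hz)
    damping_ratio  : fraction of critical damping; Q = 1 / (2 * zeta)
    """
    q = 1.0 / (2.0 * damping_ratio)  # amplification factor, Q = 10 at 5% damping
    return math.sqrt((math.pi / 2.0) * fn_hz * q * asd_g2_per_hz)

# Example: 100 Hz component on a 0.04 g^2/Hz flat input.
print(round(miles_grms(100.0, 0.04), 2))  # ~7.93 g RMS
```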

  20. Designed experiment evaluation of key variables affecting the cutting performance of rotary instruments.

    PubMed

    Funkenbusch, Paul D; Rotella, Mario; Ercoli, Carlo

    2015-04-01

    Laboratory studies of tooth preparation are often performed under a limited range of conditions involving single values for all variables other than the 1 being tested. In contrast, in clinical settings not all variables can be tightly controlled. For example, a new dental rotary cutting instrument may be tested in the laboratory by making a specific cut with a fixed force, but in clinical practice, the instrument must make different cuts with individual dentists applying a range of different forces. Therefore, the broad applicability of laboratory results to diverse clinical conditions is uncertain and the comparison of effects across studies is difficult. The purpose of this study was to examine the effect of 9 process variables on dental cutting in a single experiment, allowing each variable to be robustly tested over a range of values for the other 8 and permitting a direct comparison of the relative importance of each on the cutting process. The effects of 9 key process variables on the efficiency of a simulated dental cutting operation were measured. A fractional factorial experiment was conducted by using a computer-controlled, dedicated testing apparatus to simulate dental cutting procedures and Macor blocks as the cutting substrate. Analysis of Variance (ANOVA) was used to judge the statistical significance (α=.05). Five variables consistently produced large, statistically significant effects (target applied load, cut length, starting rpm, diamond grit size, and cut type), while 4 variables produced relatively small, statistically insignificant effects (number of cooling ports, rotary cutting instrument diameter, disposability, and water flow rate). The control exerted by the dentist, simulated in this study by targeting a specific level of applied force, was the single most important factor affecting cutting efficiency. Cutting efficiency was also significantly affected by factors simulating patient/clinical circumstances as well as hardware choices. 
These results highlight the importance of local clinical conditions (procedure, dentist) in understanding dental cutting procedures and in designing adequate experimental methodologies for future studies. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  1. Early meteorological records from Latin-America and the Caribbean during the 18th and 19th centuries

    PubMed Central

    Domínguez-Castro, Fernando; Vaquero, José Manuel; Gallego, María Cruz; Farrona, Ana María Marín; Antuña-Marrero, Juan Carlos; Cevallos, Erika Elizabeth; Herrera, Ricardo García; de la Guía, Cristina; Mejía, Raúl David; Naranjo, José Manuel; del Rosario Prieto, María; Ramos Guadalupe, Luis Enrique; Seiner, Lizardo; Trigo, Ricardo Machado; Villacís, Marcos

    2017-01-01

    This paper provides early instrumental data recovered for 20 countries of Latin-America and the Caribbean (Argentina, Bahamas, Belize, Brazil, British Guiana, Chile, Colombia, Costa Rica, Cuba, Ecuador, France (Martinique and Guadalupe), Guatemala, Jamaica, Mexico, Nicaragua, Panama, Peru, Puerto Rico, El Salvador and Suriname) during the 18th and 19th centuries. The main meteorological variables retrieved were air temperature, atmospheric pressure, and precipitation, but other variables, such as humidity, wind direction, and state of the sky, were retrieved when possible. In total, more than 300,000 early instrumental data were rescued (96% with daily resolution). Special effort was made to document all the available metadata in order to allow further post-processing. The compilation is far from being exhaustive, but the dataset will contribute to a better understanding of climate variability in the region, and to enlarging the period of overlap between instrumental data and natural/documentary proxies. PMID:29135974

  2. Investigation on Motorcyclist Riding Behaviour at Curve Entry Using Instrumented Motorcycle

    PubMed Central

    Yuen, Choon Wah; Karim, Mohamed Rehan; Saifizul, Ahmad

    2014-01-01

    This paper details the study on the changes in riding behaviour, such as changes in speed as well as the brake force and throttle force applied, when motorcyclists ride over a curve section road using an instrumented motorcycle. In this study, an instrumented motorcycle equipped with various types of sensors, on-board cameras, and data loggers, was developed in order to collect the riding data on the study site. Results from the statistical analysis showed that riding characteristics, such as changes in speed, brake force, and throttle force applied, are influenced by the distance from the curve entry, riding experience, and travel mileage of the riders. A structural equation modeling was used to study the impact of these variables on the change of riding behaviour in curve entry section. Four regression equations are formed to study the relationship between four dependent variables, which are speed, throttle force, front brake force, and rear brake force applied with the independent variables. PMID:24523660

  3. Developing a historical climatology of Wales from Welsh and English language sources

    NASA Astrophysics Data System (ADS)

    MacDonald, N.; Davies, S. J.; Jones, C. A.; Charnell-White, C.

    2009-04-01

    Historical documentary records are recognised as valuable in understanding long term climate variability. In the UK, the Central England Temperature Series (1772- ) and the Lamb weather catalogue (1861- ) provide a detailed climate record for England, but the value of these archives in Wales and Scotland is more limited, though some long term instrumental series exist, particularly for cities such as Cardiff. The spatial distance from the central England area and a lower density of instrumental stations in Wales has limited understanding of climate variability during the instrumental period (~1750- ). This paper illustrates that historical documentary records represent a considerable resource, that to date have been underutilised in developing a more complete understanding of past weather and climate within many parts of Western Europe.

  4. Quantifying inter-laboratory variability in stable isotope analysis of ancient skeletal remains.

    PubMed

    Pestle, William J; Crowley, Brooke E; Weirauch, Matthew T

    2014-01-01

    Over the past forty years, stable isotope analysis of bone (and tooth) collagen and hydroxyapatite has become a mainstay of archaeological and paleoanthropological reconstructions of paleodiet and paleoenvironment. Despite this method's frequent use across anthropological subdisciplines (and beyond), the present work represents the first attempt at gauging the effects of inter-laboratory variability engendered by differences in a) sample preparation, and b) analysis (instrumentation, working standards, and data calibration). Replicate analyses of a 14C-dated ancient human bone by twenty-one archaeological and paleoecological stable isotope laboratories revealed significant inter-laboratory isotopic variation for both collagen and carbonate. For bone collagen, we found a sizeable range of 1.8‰ for δ13Ccol and 1.9‰ for δ15Ncol among laboratories, but an interpretatively insignificant average pairwise difference of 0.2‰ and 0.4‰ for δ13Ccol and δ15Ncol respectively. For bone hydroxyapatite the observed range increased to a troublingly large 3.5‰ for δ13Cap and 6.7‰ for δ18Oap, with average pairwise differences of 0.6‰ for δ13Cap and a disquieting 2.0‰ for δ18Oap. In order to assess the effects of preparation versus analysis on isotopic variability among laboratories, a subset of the samples prepared by the participating laboratories were analyzed a second time on the same instrument. Based on this duplicate analysis, it was determined that roughly half of the isotopic variability among laboratories could be attributed to differences in sample preparation, with the other half resulting from differences in analysis (instrumentation, working standards, and data calibration). These findings have serious implications for choices made in the preparation and extraction of target biomolecules, the comparison of results obtained from different laboratories, and the interpretation of small differences in bone collagen and hydroxyapatite isotope values. 
To address the issues arising from inter-laboratory comparisons, we devise a novel measure we term the Minimum Meaningful Difference (MMD), and demonstrate its application.
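    The average pairwise differences reported above are straightforward to compute from one value per laboratory. A sketch (the authors' Minimum Meaningful Difference is a separate, more elaborate measure not reproduced here; the function name is illustrative):

```python
from itertools import combinations

def mean_pairwise_difference(lab_values):
    """Mean absolute difference over all laboratory pairs (same units as input,
    e.g. per mil for delta values)."""
    pairs = list(combinations(lab_values, 2))
    return sum(abs(a - b) for a, b in pairs) / len(pairs)

# Three hypothetical lab results for the same sample (per mil):
print(round(mean_pairwise_difference([1.0, 2.0, 3.0]), 3))  # 1.333
```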

  5. Quantifying Inter-Laboratory Variability in Stable Isotope Analysis of Ancient Skeletal Remains

    PubMed Central

    Pestle, William J.; Crowley, Brooke E.; Weirauch, Matthew T.

    2014-01-01

    Over the past forty years, stable isotope analysis of bone (and tooth) collagen and hydroxyapatite has become a mainstay of archaeological and paleoanthropological reconstructions of paleodiet and paleoenvironment. Despite this method's frequent use across anthropological subdisciplines (and beyond), the present work represents the first attempt at gauging the effects of inter-laboratory variability engendered by differences in a) sample preparation, and b) analysis (instrumentation, working standards, and data calibration). Replicate analyses of a 14C-dated ancient human bone by twenty-one archaeological and paleoecological stable isotope laboratories revealed significant inter-laboratory isotopic variation for both collagen and carbonate. For bone collagen, we found a sizeable range of 1.8‰ for δ13Ccol and 1.9‰ for δ15Ncol among laboratories, but an interpretatively insignificant average pairwise difference of 0.2‰ and 0.4‰ for δ13Ccol and δ15Ncol respectively. For bone hydroxyapatite the observed range increased to a troublingly large 3.5‰ for δ13Cap and 6.7‰ for δ18Oap, with average pairwise differences of 0.6‰ for δ13Cap and a disquieting 2.0‰ for δ18Oap. In order to assess the effects of preparation versus analysis on isotopic variability among laboratories, a subset of the samples prepared by the participating laboratories were analyzed a second time on the same instrument. Based on this duplicate analysis, it was determined that roughly half of the isotopic variability among laboratories could be attributed to differences in sample preparation, with the other half resulting from differences in analysis (instrumentation, working standards, and data calibration). These findings have serious implications for choices made in the preparation and extraction of target biomolecules, the comparison of results obtained from different laboratories, and the interpretation of small differences in bone collagen and hydroxyapatite isotope values. 
To address the issues arising from inter-laboratory comparisons, we devise a novel measure we term the Minimum Meaningful Difference (MMD), and demonstrate its application. PMID:25061843

  6. Extreme Response Style and the Measurement of Intra-Individual Variability

    ERIC Educational Resources Information Center

    Deng, Sien

    2017-01-01

    Psychologists have become increasingly interested in the intra-individual variability of psychological measures as a meaningful distinguishing characteristic of persons. Assessments of intra-individual variability are frequently based on the repeated administration of self-report rating scale instruments, and extreme response style (ERS) has the…

  7. The individualistic fallacy, ecological studies and instrumental variables: a causal interpretation.

    PubMed

    Loney, Tom; Nagelkerke, Nico J

    2014-01-01

    The validity of ecological studies in epidemiology for inferring causal relationships has been widely challenged as observed associations could be biased by the Ecological Fallacy. We reconsider the important design components of ecological studies, and discuss the conditions that may lead to spurious associations. Ecological associations are useful and valid when the ecological exposures can be interpreted as Instrumental Variables. A suitable example may be a time series analysis of environmental pollution (e.g. particulate matter with an aerodynamic diameter of <10 micrometres; PM10) and health outcomes (e.g. hospital admissions for acute myocardial infarction) as environmental pollution levels are a cause of individual exposure levels and not just an aggregate measurement. Ecological exposures may also be employed in situations (perhaps rare) where individual exposures are known but their associations with health outcomes are confounded by unknown or unquantifiable factors. Ecological associations have a notorious reputation in epidemiology and individualistic associations are considered superior to ecological associations because of the "ecological fallacy". We have argued that this is incorrect in situations in which ecological or aggregate exposures can serve as an instrumental variable and associations between individual exposure and outcome are likely to be confounded by unmeasured variables.

  8. Development and validation of the Hogan Grief Reaction Checklist.

    PubMed

    Hogan, N S; Greenfield, D B; Schmidt, L A

    2001-01-01

    The purpose of this article is to provide data on a recently developed instrument to measure the multidimensional nature of the bereavement process. In contrast to widely used grief instruments that have been developed using rational methods of instrument construction, the Hogan Grief Reaction Checklist (HGRC) was developed empirically from data collected from bereaved adults who had experienced the death of a loved one. Factor analysis of the HGRC revealed 6 factors in the normal trajectory of the grieving process: Despair, Panic Behavior, Blame and Anger, Detachment, Disorganization, and Personal Growth. Additional data are provided that support reliability and validity of the HGRC as well as its ability to discriminate variability in the grieving process as a function of cause of death and time lapsed since death. Empirical support is also provided for Personal Growth as an integral component of the bereavement process. The article concludes by considering the substantive as well as psychometric findings of this research for such issues as traumatic grief, anticipatory grief, change in the bereaved person's self-schema, and spiritual and existential growth.

  9. Validation of spontaneous assessment of baroreceptor reflex sensitivity and its relation to heart rate variability in the ovine fetus pre- and near-term.

    PubMed

    Frasch, Martin G; Müller, Thomas; Szynkaruk, Mark; Schwab, Matthias

    2009-09-01

    Assessment of baroreceptor reflex sensitivity (BRS) in the ovine fetus provides insight into autonomic cardiovascular regulation. Currently, assessment of BRS relies on vasoactive drugs, but this approach is limited by feasibility issues and by the nonphysiologic nature of the stimulus. Thus we aimed to validate the method of spontaneous BRS assessment against the reference method of using vasoactive drugs in preterm (0.76 gestation, n = 16) and near-term (0.86 gestation, n = 16) chronically instrumented ovine fetuses. The BRS measures derived from the spontaneous and reference methods correlated at both gestational ages (R = 0.67 +/- 0.03). The sequence method of spontaneous BRS measures also correlated both to the root mean square of successive differences (RMSSD), which is a measure of fetal heart rate variability reflecting vagal modulation (R = 0.69 +/- 0.03), and to fetal body weight (R = 0.65 +/- 0.03), which is a surrogate for the growth trajectory of each fetus. The methodology presented may aid in developing new models to study BRS and cardiovascular control in the ovine fetus in the last trimester of pregnancy.

  10. Current applications of robotics in spine surgery: a systematic review of the literature.

    PubMed

    Joseph, Jacob R; Smith, Brandon W; Liu, Xilin; Park, Paul

    2017-05-01

    OBJECTIVE Surgical robotics has demonstrated utility across the spectrum of surgery. Robotics in spine surgery, however, remains in its infancy. Here, the authors systematically review the evidence behind robotic applications in spinal instrumentation. METHODS This systematic review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) guidelines. Relevant studies (through October 2016) that reported the use of robotics in spinal instrumentation were identified from a search of the PubMed database. Data regarding the accuracy of screw placement, surgeon learning curve, radiation exposure, and reasons for robotic failure were extracted. RESULTS Twenty-five studies describing 2 unique robots met inclusion criteria. Of these, 22 studies evaluated accuracy of spinal instrumentation. Although grading of pedicle screw accuracy was variable, the most commonly used method was the Gertzbein and Robbins system of classification. In the studies using the Gertzbein and Robbins system, accuracy (Grades A and B) ranged from 85% to 100%. Ten studies evaluated radiation exposure during the procedure. In studies that detailed fluoroscopy usage, overall fluoroscopy times ranged from 1.3 to 34 seconds per screw. Nine studies examined the learning curve for the surgeon, and 12 studies described causes of robotic failure, which included registration failure, soft-tissue hindrance, and lateral skiving of the drill guide. CONCLUSIONS Robotics in spine surgery is an emerging technology that holds promise for future applications. Surgical accuracy in instrumentation implanted using robotics appears to be high. However, the impact of robotics on radiation exposure is not clear and seems to be dependent on technique and robot type.

  11. Informed Source Separation of Atmospheric and Surface Signal Contributions in Shortwave Hyperspectral Imagery using Non-negative Matrix Factorization

    NASA Astrophysics Data System (ADS)

    Wright, L.; Coddington, O.; Pilewskie, P.

    2015-12-01

    Current challenges in Earth remote sensing require improved instrument spectral resolution, spectral coverage, and radiometric accuracy. Hyperspectral instruments, deployed on both aircraft and spacecraft, are a growing class of Earth observing sensors designed to meet these challenges. They collect large amounts of spectral data, allowing thorough characterization of both atmospheric and surface properties. The higher accuracy and increased spectral and spatial resolutions of new imagers require new numerical approaches for processing imagery and separating surface and atmospheric signals. One potential approach is source separation, which allows us to determine the underlying physical causes of observed changes. Improved signal separation will allow hyperspectral instruments to better address key science questions relevant to climate change, including land-use changes, trends in clouds and atmospheric water vapor, and aerosol characteristics. In this work, we investigate a Non-negative Matrix Factorization (NMF) method for the separation of atmospheric and land surface signal sources. NMF offers marked benefits over other commonly employed techniques, including non-negativity, which avoids physically impossible results, and adaptability, which allows the method to be tailored to hyperspectral source separation. We adapt our NMF algorithm to distinguish between contributions from different physically distinct sources by introducing constraints on spectral and spatial variability and by using library spectra to inform separation. We evaluate our NMF algorithm with simulated hyperspectral images as well as hyperspectral imagery from several instruments, including the NASA Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), the NASA Hyperspectral Imager for the Coastal Ocean (HICO), and the National Ecological Observatory Network (NEON) Imaging Spectrometer.
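
    The non-negativity property highlighted in this abstract comes from NMF's multiplicative update rules. The sketch below uses the classic Lee-Seung updates on a tiny hypothetical two-source mixture; it is a textbook illustration, not the constrained, library-spectra-informed algorithm the authors develop.

```python
import random

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def nmf(V, rank, iters=2000, eps=1e-9):
    """Factor a non-negative matrix V (m x n) into W (m x rank) and
    H (rank x n) with Lee-Seung multiplicative updates; the element-wise
    ratios keep every entry non-negative at every iteration."""
    random.seed(0)
    m, n = len(V), len(V[0])
    W = [[random.random() + 0.1 for _ in range(rank)] for _ in range(m)]
    H = [[random.random() + 0.1 for _ in range(n)] for _ in range(rank)]
    for _ in range(iters):
        WH = matmul(W, H)
        # H <- H * (W^T V) / (W^T W H)
        num, den = matmul(transpose(W), V), matmul(transpose(W), WH)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(n)]
             for i in range(rank)]
        WH = matmul(W, H)
        # W <- W * (V H^T) / (W H H^T)
        num, den = matmul(V, transpose(H)), matmul(WH, transpose(H))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(rank)]
             for i in range(m)]
    return W, H

# hypothetical two-source mixture: rows are pixels, columns are spectral bands
W0 = [[1, 0], [0, 1], [1, 1], [2, 1]]
H0 = [[1, 2, 0, 1, 3], [0, 1, 2, 1, 0]]
V = matmul(W0, H0)
W, H = nmf(V, rank=2)
recon = matmul(W, H)
err = max(abs(recon[i][j] - V[i][j]) for i in range(4) for j in range(5))
```

    Because the updates only ever multiply by non-negative ratios, negative "source contributions" cannot appear, which is the physical-plausibility advantage the abstract refers to.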

  12. Meta-analysis of the role of delivery mode in postpartum depression (Iran 1997-2011)

    PubMed Central

    Bahadoran, Parvin; Oreizi, Hamid Reza; Safari, Saeideh

    2014-01-01

    Background: The postpartum period is the riskiest time for mood disorders and psychosis. Postpartum depression is the most important mood disorder after delivery and can be accompanied by disorders of the mother-child and family relationships. This meta-analysis integrates research results to investigate the association between the mode of delivery and postpartum depression. Materials and Methods: This meta-analysis uses the Rosenthal and Rubin approach. For this purpose, 18 studies that were acceptable in terms of methodology were selected and meta-analysis was conducted on them. The research instrument was a meta-analysis checklist. After summarizing the results of the studies, effect sizes were calculated manually and combined based on the meta-analysis method. Results: The findings showed that the effect size (in terms of Cohen's d) of delivery mode on postpartum depression was 0.30 (P < 0.001). Conclusion: The effect of delivery mode on maternal mental health is assessed as moderate. The meta-analysis also indicates a role for moderator variables, and researchers must focus on these variables. PMID:25540791
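
    The manual combination of effect sizes described in the abstract can be illustrated with a generic fixed-effect, inverse-variance weighting of Cohen's d values. The study counts and effect sizes below are hypothetical, and this is a standard textbook computation rather than necessarily the exact procedure used in the paper.

```python
def var_d(d, n1, n2):
    # approximate sampling variance of Cohen's d for two independent groups
    return (n1 + n2) / (n1 * n2) + d * d / (2 * (n1 + n2))

def fixed_effect_combine(studies):
    # studies: list of (d, n1, n2); inverse-variance weighted mean of d
    weights = [1.0 / var_d(d, n1, n2) for d, n1, n2 in studies]
    return sum(w * d for w, (d, _, _) in zip(weights, studies)) / sum(weights)

# hypothetical studies: (Cohen's d, group sizes n1 and n2)
studies = [(0.25, 40, 40), (0.35, 60, 55), (0.30, 80, 75)]
pooled = fixed_effect_combine(studies)
```

    Larger studies get proportionally more weight, so the pooled d (about 0.30 here) is pulled toward the estimates with the smallest sampling variance.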

  13. Biospectral analysis of the bladder channel point in chronic low back pain patients

    NASA Astrophysics Data System (ADS)

    Vidal, Alberto Espinosa; Nava, Juan José Godina; Segura, Miguel Ángel Rodriguez; Bastida, Albino Villegas

    2012-10-01

    Chronic pain is the main cause of disability in people of working age and is a public health problem that affects both the patient and society. There is, however, no instrument to measure it objectively; it is only estimated using subjective variables. We propose biospectral analysis of a bladder channel point as a diagnostic method for chronic low back pain patients. Materials and methods: We employed a study group of patients with chronic low back pain and a control group of patients without low back pain. The visual analog scale (VAS) was applied to determine the level of pain. Bioelectric variables were measured for 10 seconds and the respective biostatistical analyses were made. Results: Biospectral analysis in the frequency domain shows a depression in the 60-300 Hz frequency range proportional to the chronicity of low back pain compared against healthy patients.

  15. Assessing the effects of employee assistance programs: a review of employee assistance program evaluations.

    PubMed Central

    Colantonio, A.

    1989-01-01

    Employee assistance programs have grown at a dramatic rate, yet the effectiveness of these programs has been called into question. The purpose of this paper was to assess the effectiveness of employee assistance programs (EAPs) by reviewing recently published EAP evaluations. All studies evaluating EAPs published since 1975 from peer-reviewed journals in the English language were included in this analysis. Each of the articles was assessed in the following areas: (a) program description (subjects, setting, type of intervention, format), (b) evaluation design (research design, variables measured, operational methods), and (c) program outcomes. Results indicate numerous methodological and conceptual weaknesses and issues. These weaknesses included lack of controlled research designs and short time lags between pre- and post-test measures. Other problems identified are missing information regarding subjects, type of intervention, how variables are measured (operational methods), and reliability and validity of evaluation instruments. Due to the aforementioned weaknesses, positive outcomes could not be supported. Recommendations are made for future EAP evaluations. PMID:2728498

  16. Do gender gaps in education and health affect economic growth? A cross-country study from 1975 to 2010.

    PubMed

    Mandal, Bidisha; Batina, Raymond G; Chen, Wen

    2018-05-01

    We use system-generalized method-of-moments to estimate the effect of gender-specific human capital on economic growth in a cross-country panel of 127 countries between 1975 and 2010. There are several benefits of using this methodology. First, a dynamic lagged dependent econometric model is suitable to address persistence in per capita output. Second, the generalized method-of-moments estimator uses dynamic properties of the data to generate appropriate instrumental variables to address joint endogeneity of the explanatory variables. Third, we allow the measurement error to include an unobserved country-specific effect and random noise. We include two gender-disaggregated measures of human capital: education and health. We find that the gender gap in health plays a critical role in explaining economic growth in developing countries. Our results provide aggregate evidence that returns to investments in health systematically differ across gender and between low-income and high-income countries. Copyright © 2018 John Wiley & Sons, Ltd.

  17. Single-breath diffusing capacity for carbon monoxide instrument accuracy across 3 health systems.

    PubMed

    Hegewald, Matthew J; Markewitz, Boaz A; Wilson, Emily L; Gallo, Heather M; Jensen, Robert L

    2015-03-01

    Measuring diffusing capacity of the lung for carbon monoxide (DLCO) is complex and associated with wide intra- and inter-laboratory variability. Increased DLCO variability may have important clinical consequences. The objective of the study was to assess instrument performance across hospital pulmonary function testing laboratories using a DLCO simulator that produces precise and repeatable DLCO values. DLCO instruments were tested with CO gas concentrations representing medium- and high-range DLCO values. The absolute difference between the observed and target DLCO value was used to determine measurement accuracy; accuracy was defined as an average deviation from the target value of < 2.0 mL/min/mm Hg. Accuracy of inspired volume measurement and gas sensors was also determined. Twenty-three instruments were tested across 3 healthcare systems. The mean absolute deviation from the target value was 1.80 mL/min/mm Hg (range 0.24-4.23), with 10 of 23 instruments (43%) being inaccurate. High-volume laboratories performed better than low-volume laboratories, although the difference was not significant. There was no significant difference among the instruments by manufacturer. Inspired volume was not accurate in 48% of devices; the mean absolute deviation from the target value was 3.7%. Instrument gas analyzers performed adequately in all instruments. DLCO instrument accuracy was unacceptable in 43% of devices. Instrument inaccuracy can be primarily attributed to errors in inspired volume measurement and not gas analyzer performance. DLCO instrument performance may be improved by regular testing with a simulator. Caution should be used when comparing DLCO results reported from different laboratories. Copyright © 2015 by Daedalus Enterprises.
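
    The study's accuracy criterion, an average deviation from the simulator target below 2.0 mL/min/mm Hg, is simple to encode. The sketch below uses hypothetical repeated measurements on a single instrument:

```python
def dlco_accuracy(observed, target, limit=2.0):
    """Mean absolute deviation of observed DLCO values from the simulator
    target; the instrument is flagged inaccurate when the average deviation
    exceeds `limit` (mL/min/mm Hg), mirroring the study's criterion."""
    deviations = [abs(o - target) for o in observed]
    mean_dev = sum(deviations) / len(deviations)
    return mean_dev, mean_dev < limit

# hypothetical repeated measurements on one instrument, target 20.0
mean_dev, accurate = dlco_accuracy([21.1, 19.2, 22.5, 18.4], 20.0)
```

    Here the mean deviation is 1.5 mL/min/mm Hg, so this hypothetical instrument would pass, even though one individual reading deviates by 2.5.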

  18. Regression Discontinuity for Causal Effect Estimation in Epidemiology.

    PubMed

    Oldenburg, Catherine E; Moscoe, Ellen; Bärnighausen, Till

    Regression discontinuity analyses can generate estimates of the causal effects of an exposure when a continuously measured variable is used to assign the exposure to individuals based on a threshold rule. Individuals just above the threshold are expected to be similar in their distribution of measured and unmeasured baseline covariates to individuals just below the threshold, resulting in exchangeability. At the threshold, exchangeability is guaranteed if there is random variation in the continuous assignment variable, e.g., due to random measurement error. Under exchangeability, causal effects can be identified at the threshold. The regression discontinuity intention-to-treat (RD-ITT) effect on an outcome can be estimated as the difference in the outcome between individuals just above (or below) versus just below (or above) the threshold. This effect is analogous to the ITT effect in a randomized controlled trial. Instrumental variable methods can be used to estimate the effect of the exposure itself, utilizing the threshold as the instrument. We review the recent epidemiologic literature reporting regression discontinuity studies and find that while regression discontinuity designs are beginning to be utilized in a variety of applications in epidemiology, they are still relatively rare, and analytic and reporting practices vary. Regression discontinuity has the potential to greatly contribute to the evidence base in epidemiology, in particular on the real-life and long-term effects and side effects of medical treatments that are provided based on threshold rules, such as treatments for low birth weight, hypertension, or diabetes.
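
    The RD-ITT estimate described in the abstract can be sketched as the difference between two local linear fits evaluated at the threshold. The simulation below is a hypothetical illustration with a known jump of 1.5, not an analysis from the reviewed literature:

```python
import random

def ols(xs, ys):
    # closed-form least-squares line y = a + b*x
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def rd_itt(assign, outcome, cutoff, bandwidth):
    # difference of the two local-linear fits evaluated at the cutoff
    left = [(a, y) for a, y in zip(assign, outcome)
            if cutoff - bandwidth <= a < cutoff]
    right = [(a, y) for a, y in zip(assign, outcome)
             if cutoff <= a <= cutoff + bandwidth]
    aL, bL = ols([a for a, _ in left], [y for _, y in left])
    aR, bR = ols([a for a, _ in right], [y for _, y in right])
    return (aR + bR * cutoff) - (aL + bL * cutoff)

random.seed(1)
assign = [random.uniform(-2, 2) for _ in range(5000)]  # assignment variable
jump = 1.5  # true effect of crossing the threshold at 0 (made-up value)
outcome = [0.5 * a + (jump if a >= 0 else 0.0) + random.gauss(0, 0.5)
           for a in assign]
effect = rd_itt(assign, outcome, cutoff=0.0, bandwidth=1.0)
```

    With a smooth underlying trend, the intercept difference at the cutoff isolates the discontinuity, which is the RD-ITT analogue of the trial ITT effect.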

  19. Variables Affecting Preservice Teacher Candidate Identification of Teacher Sexual Misconduct

    ERIC Educational Resources Information Center

    Haverland, Jeffrey A.

    2017-01-01

    Using a quantitative research model, this study explored variables affecting pre-service teacher candidate identification of teacher sexual misconduct through a scenario-based survey instrument. Independent variables in this study were respondent gender, student gender, teacher gender, student age-related ambiguity (students depicted were 17),…

  20. Some Variables in Relation to Students' Anxiety in Learning Statistics.

    ERIC Educational Resources Information Center

    Sutarso, Toto

    The purpose of this study was to investigate some variables that relate to students' anxiety in learning statistics. The variables included sex, class level, students' achievement, school, mathematical background, previous statistics courses, and race. The instrument used was the 24-item Students' Attitudes Toward Statistics (STATS), which was…

  1. Variable-Structure Control of a Model Glider Airplane

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.; Anderson, Mark R.

    2008-01-01

    A variable-structure control system designed to enable a fuselage-heavy airplane to recover from spin has been demonstrated in a hand-launched, instrumented model glider airplane. Variable-structure control is a high-speed switching feedback control technique that has been developed for control of nonlinear dynamic systems.

  2. The Impact of a School Loaner-Instrument Program on the Attitudes and Achievement of Low-Income Music Students

    ERIC Educational Resources Information Center

    Ester, Don; Turner, Kristin

    2009-01-01

    The purpose of this study was to investigate the impact of a public school loaner-instrument program on the attitudes and achievement of low-income students in an urban environment. Socioeconomic Status (SES) and Instrument Status served as independent variables. Participants (N = 245) completed surveys at the beginning and end of the school year,…

  3. Remote Sensing of Non-Aerosol (anomalous) Absorption in Cloud Free Atmosphere

    NASA Technical Reports Server (NTRS)

    Kaufman, Yoram J.; Dubovik, Oleg; Smirnov, Alexander; Holben, Brent N.; Lau, William K. M. (Technical Monitor)

    2001-01-01

    The interaction of sunlight with atmospheric gases, aerosols, and clouds is fundamental to the understanding of climate and its variation. Several studies have questioned our understanding of atmospheric absorption of sunlight in cloudy or cloud-free atmospheres. Uncertainty in instrument accuracy and in the analysis methods makes this problem difficult to resolve. Here we use several years of measurements of sky and sun spectral brightness by selected instruments of the Aerosol Robotic Network (AERONET), which have known and high measurement accuracy. The measurements, taken in several locations around the world, show that in the atmospheric windows at 0.44, 0.67, 0.86, and 1.02 microns the only significant absorbers in a cloud-free atmosphere are aerosol and ozone. This conclusion is reached using a method developed to distinguish between absorption associated with the presence of aerosol and absorption that is not related to the presence of aerosol. Non-aerosol absorption, defined as spectrally independent or smoothly variable, was found to have an optical thickness smaller than 0.002, corresponding to absorption of sunlight of less than 1 W/sq m, or essentially zero.

  4. [Post-academic dental specialties. 11. Discomfort during atraumatic restorative treatment (ART) versus conventional restorative treatment].

    PubMed

    van Gemert-Schriks, M C M

    2007-05-01

    Although Atraumatic Restorative Treatment (ART) claims to be a patient-friendly method of treatment, little scientific proof of this is available. The aim of this study, therefore, was to acquire a reliable measurement of the degree of discomfort which children experience during dental treatment performed according to the ART approach and during the conventional method. A total of 403 Indonesian schoolchildren were randomly divided into 2 groups. In each child, one class II restoration was carried out on a deciduous molar either by means of ART or the use of rotary instruments (750 rpm). Discomfort scores were determined both by physiological measurements (heart rate) and behavioral observations (Venham scale). Venham scores showed a marked difference between the 2 groups, whereas heart rate scores only differed significantly during deep excavation. A correlation was found between Venham scores and heart rate measurements. Sex, initial anxiety, and performing dentist were shown to be confounding variables. In conclusion, children treated according to the ART approach experience less discomfort than those treated with rotary instruments.

  5. An Eight-Century High-Resolution Paleoclimate Record From the Cariaco Basin: Baseline Variability and the 20th Century

    NASA Astrophysics Data System (ADS)

    Black, D. E.; Thunell, R. C.; Kaplan, A.; Tappa, E. J.; Peterson, L. C.

    2007-12-01

    The Cariaco Basin, Venezuela is well-positioned to record a detailed history of surface ocean changes along the southern margin of the Caribbean and the tropical Atlantic. Varved, high deposition rate sediments deposited under anoxic conditions and an abundance of well-preserved microfossils result in one of the few marine records capable of preserving evidence of interannual- to decadal-scale climate variability in the tropical Atlantic. Here we present Mg/Ca and stable oxygen isotope data with sub-decadal resolution derived from sediments deposited over the last 800 years. Mg/Ca measured on the planktic foraminifer Globigerina bulloides from a Cariaco Basin sediment core strongly correlates with spring (March-May) instrumental SSTs between AD 1870 and 1990. The long-term record displays a surprising amount of variability for a tropical location. The temperature swings are not necessarily related to local upwelling variability, but instead represent wider conditions in the Caribbean and western tropical Atlantic. The Mg/Ca-SST record also captures the decadal and multidecadal variability observed in global land and sea surface temperature anomalies, and correlates with Atlantic tropical storm and hurricane frequency over the late-19th and 20th centuries. On average, 20th century temperatures are not the warmest in the entire record, but they do show the largest increase in magnitude and fastest rate of SST change over the last eight hundred years. Stable oxygen isotope data also correlate well with instrumental SSTs, but not over the full instrumental record. Poor correlations with early instrumental SST data suggest a salinity overprint. However, reconstructing the δ18O of seawater (δ18Ow) variability using combined Mg/Ca and δ18O data is not straightforward, as the δ18Ow/salinity relationship varies seasonally in the Cariaco Basin. Comparisons with percent titanium data suggest intervals of both local and regional surface salinity changes over the length of the record.

  6. Satellite oceanography - The instruments

    NASA Technical Reports Server (NTRS)

    Stewart, R. H.

    1981-01-01

    It is pointed out that no instrument is sensitive to only one oceanographic variable; rather, each responds to a combination of atmospheric and oceanic phenomena. This complicates data interpretation and usually requires that a number of observations, each sensitive to somewhat different phenomena, be combined to provide unambiguous information. The distinction between active and passive instruments is described. A block diagram illustrating the steps necessary to convert data from satellite instruments into oceanographic information is included, as is a diagram illustrating the operation of a radio-frequency radiometer. Attention is also given to the satellites that carry the various oceanographic instruments.

  7. Energy conserving schemes for the simulation of musical instrument contact dynamics

    NASA Astrophysics Data System (ADS)

    Chatziioannou, Vasileios; van Walstijn, Maarten

    2015-03-01

    Collisions are an innate part of the function of many musical instruments. Due to the nonlinear nature of contact forces, special care has to be taken in the construction of numerical schemes for simulation and sound synthesis. Finite difference schemes and other time-stepping algorithms used for musical instrument modelling purposes are normally arrived at by discretising a Newtonian description of the system. However because impact forces are non-analytic functions of the phase space variables, algorithm stability can rarely be established this way. This paper presents a systematic approach to deriving energy conserving schemes for frictionless impact modelling. The proposed numerical formulations follow from discretising Hamilton's equations of motion, generally leading to an implicit system of nonlinear equations that can be solved with Newton's method. The approach is first outlined for point mass collisions and then extended to distributed settings, such as vibrating strings and beams colliding with rigid obstacles. Stability and other relevant properties of the proposed approach are discussed and further demonstrated with simulation examples. The methodology is exemplified through a case study on tanpura string vibration, with the results confirming the main findings of previous studies on the role of the bridge in sound generation with this type of string instrument.
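
    The construction described in this abstract, discretising Hamilton's equations so that the contact potential enters through a discrete gradient, can be sketched for the point-mass case. The code below is a simplified, hypothetical illustration with a one-sided quadratic penalty barrier (the paper's schemes are more general), and the implicit update is solved by fixed-point iteration rather than Newton's method for brevity. The discrete-gradient force term makes the scheme conserve the discrete energy up to solver tolerance.

```python
def V(x, k=1000.0):
    # one-sided quadratic contact penalty: force only when x < 0 (barrier at 0)
    return 0.5 * k * min(x, 0.0) ** 2

def dV(x0, x1, k=1000.0):
    # discrete gradient of V: gives an exact discrete energy balance
    if abs(x1 - x0) < 1e-12:
        xm = 0.5 * (x0 + x1)
        return k * min(xm, 0.0)  # ordinary derivative in the coincident limit
    return (V(x1, k) - V(x0, k)) / (x1 - x0)

def step(x0, v0, dt, m=1.0):
    # implicit midpoint / discrete-gradient update:
    #   m (v1 - v0)/dt = -dV(x0, x1),  (x1 - x0)/dt = (v0 + v1)/2
    x1 = x0 + dt * v0
    for _ in range(200):
        v1 = v0 - dt / m * dV(x0, x1)
        x1_new = x0 + 0.5 * dt * (v0 + v1)
        if abs(x1_new - x1) < 1e-14:
            x1 = x1_new
            break
        x1 = x1_new
    return x1, v1

def energy(x, v, m=1.0):
    return 0.5 * m * v * v + V(x)

x, v = 0.5, -2.0            # mass launched toward the barrier
E0 = energy(x, v)
for _ in range(2000):       # simulate 2 s with dt = 1 ms: impact and rebound
    x, v = step(x, v, dt=1e-3)
drift = abs(energy(x, v) - E0) / E0
```

    Multiplying the velocity update by (x1 - x0)/dt = (v0 + v1)/2 shows algebraically that the discrete energy is unchanged across a step, which is why the relative drift stays at solver tolerance even through the stiff contact phase.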

  8. Factors associated with the health status of internally displaced persons in northern Uganda

    PubMed Central

    Roberts, B; Ocaka, K Felix; Browne, J; Oyok, T; Sondorp, E

    2009-01-01

    Background: Globally, there are over 24 million internally displaced persons (IDPs) who have fled their homes due to violence and insecurity but who remain within their own country. There have been up to 2 million IDPs in northern Uganda alone. The objective of this study was to investigate factors associated with the mental and physical health status of IDPs in northern Uganda. Methods: A cross-sectional survey was conducted in November 2006 in IDP camps in the Gulu and Amuru districts of northern Uganda. The study outcome of physical and mental health was measured using the SF-8 instrument, which produces physical (PCS) and mental (MCS) component summary measures. Independent demographic, socio-economic, and trauma exposure (using the Harvard Trauma Questionnaire) variables were also measured. Multivariate linear regression analysis was conducted to investigate associations of the independent variables with the PCS and MCS outcomes. Results: 1206 interviews were completed. The respective mean PCS and MCS scores were 42.2 (95% CI 41.32 to 43.10) and 39.3 (95% CI 38.42 to 40.13), well below the instrument norm of 50, indicating poor health. Variables with negative associations with physical or mental health included gender, age, marital status, income, distance of camp from home areas, food security, soap availability, and sense of safety in the camp. A number of individual trauma variables and the frequency of trauma exposure also had negative associations with physical and mental health. Conclusions: This study provides evidence on the impact on health of deprivation of basic goods and services, traumatic events, and fear and uncertainty amongst displaced and crisis affected populations. PMID:19028730

  9. Reconstruction of total and spectral solar irradiance from 1974 to 2013 based on KPVT, SoHO/MDI, and SDO/HMI observations

    NASA Astrophysics Data System (ADS)

    Yeo, K. L.; Krivova, N. A.; Solanki, S. K.; Glassmeier, K. H.

    2014-10-01

    Context. Total and spectral solar irradiance are key parameters in the assessment of solar influence on changes in the Earth's climate. Aims: We present a reconstruction of daily solar irradiance obtained using the SATIRE-S model spanning 1974 to 2013 based on full-disc observations from the KPVT, SoHO/MDI, and SDO/HMI. Methods: SATIRE-S ascribes variation in solar irradiance on timescales greater than a day to photospheric magnetism. The solar spectrum is reconstructed from the apparent surface coverage of bright magnetic features and sunspots in the daily data using the modelled intensity spectra of these magnetic structures. We cross-calibrated the various data sets, harmonizing the model input so as to yield a single consistent time series as the output. Results: The model replicates 92% (R2 = 0.916) of the variability in the PMOD TSI composite including the secular decline between the 1996 and 2008 solar cycle minima. The model also reproduces most of the variability in observed Lyman-α irradiance and the Mg II index. The ultraviolet solar irradiance measurements from the UARS and SORCE missions are mutually consistent up to about 180 nm before they start to exhibit discrepant rotational and cyclical variability, indicative of unresolved instrumental effects. As a result, the agreement between model and measurement, while relatively good below 180 nm, starts to deteriorate above this wavelength. As with earlier similar investigations, the reconstruction cannot reproduce the overall trends in SORCE/SIM SSI. We argue, from the lack of clear solar cycle modulation in the SIM record and the inconsistency between the total flux recorded by the instrument and TSI, that unaccounted instrumental trends are present. Conclusions: The daily solar irradiance time series is consistent with observations from multiple sources, demonstrating its validity and utility for climate models. 
It also provides further evidence that photospheric magnetism is the prime driver of variation in solar irradiance on timescales greater than a day.

  10. Using a simple apparatus to measure direct and diffuse photosynthetically active radiation at remote locations.

    PubMed

    Cruse, Michael J; Kucharik, Christopher J; Norman, John M

    2015-01-01

    Plant canopy interception of photosynthetically active radiation (PAR) drives carbon dioxide (CO2), water and energy cycling in the soil-plant-atmosphere system. Quantifying intercepted PAR requires accurate measurements of total incident PAR above canopies and direct beam and diffuse PAR components. While some regional data sets include these data, e.g. from Atmospheric Radiation Measurement (ARM) Program sites, they are not often applicable to local research sites because of the variable nature (spatial and temporal) of environmental variables that influence incoming PAR. Currently available instrumentation that measures diffuse and direct beam radiation separately can be cost prohibitive and require frequent adjustments. Alternatively, generalized empirical relationships that relate atmospheric variables and radiation components can be used but require assumptions that increase the potential for error. Our goal here was to construct and test a cheaper, highly portable instrument alternative that could be used at remote field sites to measure total, diffuse and direct beam PAR for extended time periods without supervision. The apparatus tested here uses a fabricated, solar powered rotating shadowband and other commercially available parts to collect continuous hourly PAR data. Measurements of total incident PAR had nearly a one-to-one relationship with total incident radiation measurements taken at the same research site by an unobstructed point quantum sensor. Additionally, measurements of diffuse PAR compared favorably with modeled estimates from previously published data, but displayed significant differences that were attributed to the important influence of rapidly changing local environmental conditions. The cost of the system is about 50% less than comparable commercially available systems that require periodic, but not continual adjustments. 
Overall, the data produced using this apparatus indicates that this instrumentation has the potential to support ecological research via a relatively inexpensive method to collect continuous measurements of total, direct beam and diffuse PAR in remote locations.

  11. Using a Simple Apparatus to Measure Direct and Diffuse Photosynthetically Active Radiation at Remote Locations

    PubMed Central

    Cruse, Michael J.; Kucharik, Christopher J.; Norman, John M.

    2015-01-01

    Plant canopy interception of photosynthetically active radiation (PAR) drives carbon dioxide (CO2), water and energy cycling in the soil-plant-atmosphere system. Quantifying intercepted PAR requires accurate measurements of total incident PAR above canopies and direct beam and diffuse PAR components. While some regional data sets include these data, e.g. from Atmospheric Radiation Measurement (ARM) Program sites, they are not often applicable to local research sites because of the variable nature (spatial and temporal) of environmental variables that influence incoming PAR. Currently available instrumentation that measures diffuse and direct beam radiation separately can be cost prohibitive and require frequent adjustments. Alternatively, generalized empirical relationships that relate atmospheric variables and radiation components can be used but require assumptions that increase the potential for error. Our goal here was to construct and test a cheaper, highly portable instrument alternative that could be used at remote field sites to measure total, diffuse and direct beam PAR for extended time periods without supervision. The apparatus tested here uses a fabricated, solar powered rotating shadowband and other commercially available parts to collect continuous hourly PAR data. Measurements of total incident PAR had nearly a one-to-one relationship with total incident radiation measurements taken at the same research site by an unobstructed point quantum sensor. Additionally, measurements of diffuse PAR compared favorably with modeled estimates from previously published data, but displayed significant differences that were attributed to the important influence of rapidly changing local environmental conditions. The cost of the system is about 50% less than comparable commercially available systems that require periodic, but not continual adjustments. 
Overall, the data produced using this apparatus indicates that this instrumentation has the potential to support ecological research via a relatively inexpensive method to collect continuous measurements of total, direct beam and diffuse PAR in remote locations. PMID:25668208
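
The shadowband principle described above reduces to simple arithmetic: the sensor reads total PAR when unshaded and diffuse PAR while the rotating band occludes the solar disc, and the direct-beam component is their difference. A minimal sketch (function and parameter names are illustrative, not from the paper):

```python
def par_components(total_par, diffuse_par):
    """Split measured PAR (e.g. in umol m^-2 s^-1) into its components.

    total_par:   reading with the sensor unshaded (direct + diffuse)
    diffuse_par: reading while the shadowband blocks the solar disc
    """
    direct_beam = total_par - diffuse_par
    diffuse_fraction = diffuse_par / total_par
    return direct_beam, diffuse_fraction
```

For example, a total reading of 1500 with a shaded reading of 300 implies a direct-beam flux of 1200 and a diffuse fraction of 0.2.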

  12. Continuously-Variable Positive-Mesh Power Transmission

    NASA Technical Reports Server (NTRS)

    Johnson, J. L.

    1982-01-01

Proposed transmission with continuously-variable speed ratio couples two mechanical trigonometric-function generators. Transmission is expected to handle higher loads than conventional variable-pulley drives and, unlike a variable pulley, maintains positive traction through the entire drive train, with no reliance on friction to transmit power. Able to vary speed continuously through zero and into reverse. Possible applications in instrumentation where drive-train slippage cannot be tolerated.

  13. VizieR Online Data Catalog: Variable Stars in the Galactic Center (Dong+, 2017)

    NASA Astrophysics Data System (ADS)

    Dong, H.; Schodel, R.; William, B. F.; Nogueras-Lara, F.; Gallego-Cano, E.; Gallego-Calvente, T.; Wang, Q. D.; Morris, R. M.; Do, T.; Ghez, A.

    2017-06-01

We use 'DOLPHOT' to detect sources and extract photometry from the HST WFC3/IR observations of the Galactic Centre in the F127M and F135M bands from 2010 to 2014. The F153M observations, which are used to identify variable stars, include 290 dithered exposures from six HST programs. A detailed description of the HST dataset is given in Table 1 of the paper. We identified 33070 sources; their F127M and F153M magnitudes, as well as their uncertainties, are given in Table 3. For each star, we used the least chi-square method to determine whether it is variable. The outputs of the least chi-square method are chi2y and chi2d, which are calculated from all 290 dithered exposures and from the exposures in March and April 2014, respectively, to examine whether the star varies among years and/or days. In order to reduce the potential variation among dithered exposures, which could be introduced by instrumental effects, we also bin the dithered exposures and use the least chi-square method to calculate chi2{y,b} and chi2{d,b}. We classify stars with chi2y>3 and chi2{y,b}>2 as variables among years, and stars with chi2d>3 and chi2{d,b}>2 as variables among days. A detailed description of the data analysis is given in the paper. In Table 4, we give the magnitudes of sources in individual dithered exposures, as well as the photometric uncertainties and the quality-control parameters provided by 'DOLPHOT', such as signal-to-noise ratio, sharpness^2, crowd and flag. We also cross-correlated our variables with previous variability studies by ground-based telescopes in Table 8 and with spectroscopic observations in Table 9. (4 data files).
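
The variability criterion sketched above — a chi-square of the light curve against a constant-brightness model, with separate cuts for unbinned (>3) and binned (>2) statistics — can be illustrated as follows. The function names and the exact statistic are a sketch, not the paper's actual pipeline:

```python
import numpy as np

def reduced_chi_square(mags, errs):
    """Reduced chi-square of magnitudes against their weighted mean.

    A non-variable star scatters about its mean within the photometric
    errors, giving a value near 1; large values suggest variability.
    """
    mags = np.asarray(mags, dtype=float)
    errs = np.asarray(errs, dtype=float)
    w = 1.0 / errs**2
    mean = np.sum(w * mags) / np.sum(w)
    chi2 = np.sum(((mags - mean) / errs) ** 2)
    return chi2 / (len(mags) - 1)

def is_variable(chi2_unbinned, chi2_binned, cut_unbinned=3.0, cut_binned=2.0):
    # Both the per-exposure and binned statistics must exceed their cuts,
    # which suppresses spurious variability from instrumental effects.
    return chi2_unbinned > cut_unbinned and chi2_binned > cut_binned
```

A star that jumps by half a magnitude against 0.01 mag errors yields a huge reduced chi-square and passes both cuts; a constant star yields a value near zero.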

  14. Short-term Variability of Extinction by Broadband Stellar Photometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musat, I.C.; Ellingson, R.G.

    2005-03-18

Aerosol optical depth variation over short time intervals is determined from broadband observations of stars with a whole-sky imager. The main difficulty in such measurements is accurately separating the star flux from the non-stellar diffuse skylight. Using a correction method to overcome this difficulty, the monochromatic extinction at the ground due to aerosols is extracted from heterochromatic measurements. A form of closure is achieved by comparison with simultaneous or temporally close measurements from other instruments, and the total error of the method, as a combination of the random error of measurement and the systematic error of calibration and model, is assessed as being between 2.6 and 3% rms.
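
The extinction relation underlying this kind of measurement is the Beer-Lambert law: the stellar flux is attenuated as F = F0·exp(-tau·m), where m is the airmass, so the total optical depth follows from the flux ratio and the aerosol part is what remains after molecular contributions are removed. A hedged sketch (the paper's actual skylight correction and heterochromatic-to-monochromatic conversion are not reproduced here):

```python
import numpy as np

def total_optical_depth(flux, flux_top_of_atmosphere, airmass):
    # Beer-Lambert law: F = F0 * exp(-tau * m)  =>  tau = -ln(F/F0) / m
    return -np.log(flux / flux_top_of_atmosphere) / airmass

def aerosol_optical_depth(tau_total, tau_rayleigh, tau_ozone):
    # Aerosol extinction is the residual after removing Rayleigh
    # scattering and ozone absorption from the total optical depth.
    return tau_total - tau_rayleigh - tau_ozone
```

For instance, a star dimmed to exp(-0.6) of its top-of-atmosphere flux at airmass 2 implies a total optical depth of 0.3.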

  15. Sterilization of endoscopic instruments.

    PubMed

    Sabnis, Ravindra B; Bhattu, Amit; Vijaykumar, Mohankumar

    2014-03-01

Sterilization of endoscopic instruments is an important but often ignored topic. The purpose of this article is to review the current literature on the sterilization of endoscopic instruments and elaborate on appropriate sterilization practices. Autoclaving is an economical and excellent method of sterilizing instruments that are not heat sensitive. Heat-sensitive instruments may be damaged by hot sterilization methods. Several new endoscopic instruments, such as flexible ureteroscopes and chip-on-tip endoscopes, have been added to the urologist's armamentarium. Many of these instruments are heat sensitive, and hence alternative efficacious methods of sterilization are necessary. Although ethylene oxide and hydrogen peroxide are excellent methods of sterilization, they have some drawbacks. Gamma irradiation is mainly for disposable items. Various chemical agents are widely used even though they achieve high-level disinfection rather than sterilization. This article reviews various methods of endoscopic instrument sterilization with their advantages and drawbacks. Adopting appropriate sterilization methods will not only protect patients from procedure-related infections and hypersensitive allergic reactions, but will also protect instruments from damage and increase their longevity.

  16. Information content of MOPITT CO profile retrievals: Temporal and geographical variability

    NASA Astrophysics Data System (ADS)

    Deeter, M. N.; Edwards, D. P.; Gille, J. C.; Worden, H. M.

    2015-12-01

    Satellite measurements of tropospheric carbon monoxide (CO) enable a wide array of applications including studies of air quality and pollution transport. The MOPITT (Measurements of Pollution in the Troposphere) instrument on the Earth Observing System Terra platform has been measuring CO concentrations globally since March 2000. As indicated by the Degrees of Freedom for Signal (DFS), the standard metric for trace-gas retrieval information content, MOPITT retrieval performance varies over a wide range. We show that both instrumental and geophysical effects yield significant geographical and temporal variability in MOPITT DFS values. Instrumental radiance uncertainties, which describe random errors (or "noise") in the calibrated radiances, vary over long time scales (e.g., months to years) and vary between the four detector elements of MOPITT's linear detector array. MOPITT retrieval performance depends on several factors including thermal contrast, fine-scale variability of surface properties, and CO loading. The relative importance of these various effects is highly variable, as demonstrated by analyses of monthly mean DFS values for the United States and the Amazon Basin. An understanding of the geographical and temporal variability of MOPITT retrieval performance is potentially valuable to data users seeking to limit the influence of the a priori through data filtering. To illustrate, it is demonstrated that calculated regional-average CO mixing ratios may be improved by excluding observations from a subset of pixels in MOPITT's linear detector array.
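
As background to the DFS metric used above: in the standard retrieval formalism, DFS is the trace of the averaging-kernel matrix A, whose diagonal elements indicate how much of each retrieved level comes from the measurement rather than the a priori. A minimal sketch:

```python
import numpy as np

def degrees_of_freedom_for_signal(averaging_kernel):
    """DFS = trace(A): the number of independent pieces of information
    the measurement contributes to the retrieved trace-gas profile."""
    return float(np.trace(np.asarray(averaging_kernel, dtype=float)))
```

A perfect retrieval (A equal to the identity) gives DFS equal to the number of retrieval levels; real profile retrievals such as MOPITT's typically yield far fewer, and the abstract's point is that this number varies strongly with instrument noise, thermal contrast and CO loading.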

  17. Horizontal Temperature Variability in the Stratosphere: Global Variations Inferred from CRISTA Data

    NASA Technical Reports Server (NTRS)

    Eidmann, G.; Offermann, D.; Jarisch, M.; Preusse, P.; Eckermann, S. D.; Schmidlin, F. J.

    2001-01-01

In two separate orbital campaigns (November 1994 and August 1997), the Cryogenic Infrared Spectrometers and Telescopes for the Atmosphere (CRISTA) instrument acquired global stratospheric data of high accuracy and high spatial resolution. The standard limb-scanned CRISTA measurements resolved atmospheric spatial structures with vertical dimensions greater than or equal to 1.5 - 2 km and horizontal dimensions greater than or equal to 100 - 200 km. A fluctuation analysis of horizontal temperature distributions derived from these data is presented. This method is somewhat complementary to conventional power-spectral analysis techniques.

  18. Design of Instrument Dials for Maximum Legibility. Part 5. Origin Location, Scale Break, Number Location, and Contrast Direction

    DTIC Science & Technology

    1951-05-01

procedures to be of high accuracy. Ambiguity of subject responses due to overlap of entries on the record sheets was negligible. Handwriting ... experimental variables on reading errors was carried out by analysis of variance methods. For this purpose it was convenient to consider different classes ... on any scale - an error of one numbered division. For this reason, the results of the analysis of variance of the errors by dial types may

  19. Important variables for parents' postnatal sense of security: evaluating a new Swedish instrument (the PPSS instrument).

    PubMed

    Persson, Eva K; Dykes, Anna-Karin

    2009-08-01

to evaluate dimensions of both parents' postnatal sense of security during the first week after childbirth, and to determine associations between the PPSS instrument and different sociodemographic and situational background variables. evaluative, cross-sectional design. 113 mothers and 99 fathers with children live-born at term, from five hospitals in southern Sweden. mothers and fathers had similar feelings concerning postnatal sense of security. Of the dimensions in the PPSS instrument, a sense of midwives'/nurses' empowering behaviour, a sense of one's own general well-being and a sense of the mother's well-being as experienced by the father were the most important dimensions for parents' experienced security. A sense of affinity within the family (for both parents) and a sense of manageable breastfeeding (for mothers) were not significantly associated with their experienced security. A sense of participation during pregnancy and general anxiety were significantly associated background variables for postnatal sense of security for both parents. For the mothers, parity and a sense that the father was participating during pregnancy were also significantly associated. more focus on parents' participation during pregnancy as well as midwives'/nurses' empowering behaviour during the postnatal period will be beneficial for both parents' postnatal sense of security.

  20. Episiotomy and its relationship to various clinical variables that influence its performance

    PubMed Central

    Ballesteros-Meseguer, Carmen; Carrillo-García, César; Meseguer-de-Pedro, Mariano; Canteras-Jordana, Manuel; Martínez-Roche, Mª Emilia

    2016-01-01

Objective: to understand the episiotomy rate and its relationship with various clinical variables. Method: a descriptive, cross-sectional, analytic study of 12,093 births in a tertiary hospital. Variables: parity, gestational age, start of labor, use of epidural analgesia, oxytocin usage, position during fetal expulsion, weight of neonate, and completion of birth. The analysis was performed with SPSS 19.0. Results: the global percentage of episiotomies was 50%. The clinical variables that presented a significant association were primiparity (RR=2.98), gestational age >41 weeks (RR=1.2), augmented or induced labor (RR=1.33), epidural analgesia use (RR=1.95), oxytocin use (RR=1.58), lithotomy position during fetal expulsion (RR=6.4), and instrumentation (RR=1.84). Furthermore, maternal age ≥35 years (RR=0.85) and neonatal weight <2500 g (RR=0.8) were associated with a lower incidence of episiotomy. Conclusions: episiotomy is dependent on obstetric interventions performed during labor. If we wish to reduce the episiotomy rate, it will be necessary to bear in mind these risk factors when establishing policies for reducing this procedure. PMID:27224064
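
The relative risks (RR) quoted above are ratios of incidence proportions between exposed and unexposed groups. A minimal sketch; the counts in the example are made up for illustration, not taken from the study:

```python
def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """RR = (risk in exposed group) / (risk in unexposed group).

    RR > 1 means the factor is associated with more episiotomies,
    RR < 1 with fewer, RR = 1 with no association.
    """
    risk_exposed = events_exposed / n_exposed
    risk_unexposed = events_unexposed / n_unexposed
    return risk_exposed / risk_unexposed
```

For example, 60 episiotomies in 100 births with a given exposure against 20 in 100 births without it gives RR = 3.0 (hypothetical numbers).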

  1. Handwriting Movement Kinematics for Quantifying EPS in Patients Treated with Atypical Antipsychotics

    PubMed Central

    Caligiuri, Michael P.; Teulings, Hans-Leo; Dean, Charles E.; Niculescu, Alexander B.; Lohr, James B.

    2009-01-01

    Ongoing monitoring of neuroleptic-induced extrapyramidal side effects (EPS) is important to maximize treatment outcome, improve medication adherence and reduce re-hospitalization. Traditional approaches for assessing EPS such as parkinsonism, tardive akathisia, or dyskinesia rely upon clinical ratings. However, these observer-based EPS severity ratings can be unreliable and are subject to examiner bias. In contrast, quantitative instrumental methods are less subject to bias. Most instrumental methods have only limited clinical utility because of their complexity and costs. This paper describes an easy-to-use instrumental approach based on handwriting movements for quantifying EPS. Here, we present findings from psychiatric patients treated with atypical (second generation) antipsychotics. The handwriting task consisted of a sentence written several times within a 2 cm vertical boundary at a comfortable speed using an inkless pen and digitizing tablet. Kinematic variables including movement duration, peak vertical velocity and the number of acceleration peaks, and average normalized jerk (a measure of smoothness) for each up or down stroke and their submovements were analyzed. Results from 59 psychosis patients and 46 healthy comparison subjects revealed significant slowing and dysfluency in patients compared to controls. We observed differences across medications and daily dose. These findings support the ecological validity of handwriting movement analysis as an objective behavioral biomarker for quantifying the effects of antipsychotic medication and dose on the motor system. PMID:20381875
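
The smoothness measure mentioned above, normalized jerk, is made dimensionless so that strokes of different sizes and durations can be compared; one common normalization integrates squared jerk and scales by stroke duration and length. The sketch below uses numerical differentiation and that common form, which may differ in detail from the authors' implementation:

```python
import numpy as np

def normalized_jerk(position, dt):
    """Dimensionless jerk for one pen stroke sampled at interval dt.

    Larger values indicate less smooth (more dysfluent) movement.
    """
    position = np.asarray(position, dtype=float)
    velocity = np.gradient(position, dt)
    accel = np.gradient(velocity, dt)
    jerk = np.gradient(accel, dt)
    duration = dt * (len(position) - 1)
    length = np.abs(position[-1] - position[0])
    # integral of squared jerk, scaled by duration^5 / length^2 to be unit-free
    return np.sqrt(0.5 * np.sum(jerk**2) * dt * duration**5 / length**2)
```

A perfectly smooth ramp between the same endpoints scores near zero, while a tremulous trajectory scores much higher, which is the contrast the study exploits between patients and controls.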

  2. Interannual, seasonal and diurnal Mars surface environmental cycles observed from Viking to Curiosity

    NASA Astrophysics Data System (ADS)

    Martinez, German; Vicente-Retortillo, Álvaro; Kemppinen, Osku; Fischer, Erik; Fairen, Alberto G.; Guzewich, Scott David; Haberle, Robert; Lemmon, Mark T.; Newman, Claire E.; Renno, Nilton O.; Richardson, Mark I.; Smith, Michael D.; De la Torre, Manuel; Vasavada, Ashwin R.

    2016-10-01

We analyze in-situ environmental data from the Viking landers to the Curiosity rover to estimate atmospheric pressure, near-surface air and ground temperature, relative humidity, wind speed and dust opacity with the highest confidence possible. We study the interannual, seasonal and diurnal variability of these quantities at the various landing sites over a span of more than twenty Martian years to characterize the climate on Mars and its variability. Additionally, we characterize the radiative environment at the various landing sites by estimating the daily UV irradiation (also called insolation, and defined as the total amount of solar UV energy received on a flat surface during one sol) and by analyzing its interannual and seasonal variability. In this study we use measurements conducted by the Viking Meteorology Instrument System (VMIS) and Viking lander camera onboard the Viking landers (VL); the Atmospheric Structure Instrument/Meteorology (ASI/MET) package and the Imager for Mars Pathfinder (IMP) onboard the Mars Pathfinder (MPF) lander; the Miniature Thermal Emission Spectrometer (Mini-TES) and Pancam instruments onboard the Mars Exploration Rovers (MER); the Meteorological Station (MET), Thermal Electrical Conductivity Probe (TECP) and Surface Stereo Imager (SSI) onboard the Phoenix (PHX) lander; and the Rover Environmental Monitoring Station (REMS) and Mastcam instrument onboard the Mars Science Laboratory (MSL) rover. A thorough analysis of in-situ environmental data from past and present missions is important to aid in the selection of the Mars 2020 landing site. We plan to extend our analysis of Mars surface environmental cycles by using upcoming data from the Temperature and Wind sensors (TWINS) instrument onboard the InSight mission and the Mars Environmental Dynamics Analyzer (MEDA) instrument onboard the Mars 2020 mission.

  3. Use of Objective Metrics in Dynamic Facial Reanimation: A Systematic Review.

    PubMed

    Revenaugh, Peter C; Smith, Ryan M; Plitt, Max A; Ishii, Lisa; Boahene, Kofi; Byrne, Patrick J

    2018-06-21

    Facial nerve deficits cause significant functional and social consequences for those affected. Existing techniques for dynamic restoration of facial nerve function are imperfect and result in a wide variety of outcomes. Currently, there is no standard objective instrument for facial movement as it relates to restorative techniques. To determine what objective instruments of midface movement are used in outcome measurements for patients treated with dynamic methods for facial paralysis. Database searches from January 1970 to June 2017 were performed in PubMed, Embase, Cochrane Library, Web of Science, and Scopus. Only English-language articles on studies performed in humans were considered. The search terms used were ("Surgical Flaps"[Mesh] OR "Nerve Transfer"[Mesh] OR "nerve graft" OR "nerve grafts") AND (face [mh] OR facial paralysis [mh]) AND (innervation [sh]) OR ("Face"[Mesh] OR facial paralysis [mh]) AND (reanimation [tiab]). Two independent reviewers evaluated the titles and abstracts of all articles and included those that reported objective outcomes of a surgical technique in at least 2 patients. The presence or absence of an objective instrument for evaluating outcomes of midface reanimation. Additional outcome measures were reproducibility of the test, reporting of symmetry, measurement of multiple variables, and test validity. Of 241 articles describing dynamic facial reanimation techniques, 49 (20.3%) reported objective outcome measures for 1898 patients. Of those articles reporting objective measures, there were 29 different instruments, only 3 of which reported all outcome measures. Although instruments are available to objectively measure facial movement after reanimation techniques, most studies do not report objective outcomes. Of objective facial reanimation instruments, few are reproducible and able to measure symmetry and multiple data points. 
To accurately compare objective outcomes in facial reanimation, a reproducible, objective, and universally applied instrument is needed.

  4. Nurse Competence Scale: a systematic and psychometric review.

    PubMed

    Flinkman, Mervi; Leino-Kilpi, Helena; Numminen, Olivia; Jeon, Yunsuk; Kuokkanen, Liisa; Meretoja, Riitta

    2017-05-01

The aim of this study was to report a systematic and psychometric review. The Nurse Competence Scale is currently the most widely used generic instrument to measure Registered Nurses' competence in different phases of their careers. Based on a decade of research, this review provides a summary of the existing evidence. A systematic literature review of research evidence and psychometric properties. Nine databases were searched from 2004 to October 2015. We retrieved scientific publications in English and Finnish. Two researchers performed data selection and appraised the methodological quality using the COnsensus-based Standards for the selection of health status Measurement INstruments checklist. A total of 30 studies reported in 43 publications were included. These consisted of over 11,000 competence assessments. Twenty studies were from Europe and 10 from outside Europe. In addition to experienced nurses, the Nurse Competence Scale has been used for the competence assessment of newly graduated nurses and nursing students, mainly in hospital settings. Length of work experience, age, higher education, permanent employment and participation in educational programmes correlated positively with competence. Variables including empowerment, commitment, practice environment, quality of care and critical thinking were also associated with higher competence. The Nurse Competence Scale has demonstrated good content validity and appropriate internal consistency. The value of the Nurse Competence Scale has been confirmed in determining relationships between background variables and competence. The instrument has been widely used with experienced and newly graduated nurses and their managers. Cross-cultural validation must be continued using rigorous methods. © 2016 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.

  5. Approaches to vegetation mapping and ecophysiological hypothesis testing using combined information from TIMS, AVIRIS, and AIRSAR

    NASA Technical Reports Server (NTRS)

    Oren, R.; Vane, G.; Zimmermann, R.; Carrere, V.; Realmuto, V.; Zebker, Howard A.; Schoeneberger, P.; Schoeneberger, M.

    1991-01-01

    The Tropical Rainforest Ecology Experiment (TREE) had two primary objectives: (1) to design a method for mapping vegetation in tropical regions using remote sensing and determine whether the result improves on available vegetation maps; and (2) to test a specific hypothesis on plant/water relations. Both objectives were thought achievable with the combined information from the Thermal Infrared Multispectral Scanner (TIMS), Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), and Airborne Synthetic Aperture Radar (AIRSAR). Implicitly, two additional objectives were: (1) to ascertain that the range within each variable potentially measurable with the three instruments is large enough in the site, relative to the sensitivity of the instruments, so that differences between ecological groups may be detectable; and (2) to determine the ability of the three systems to quantify different variables and sensitivities. We found that the ranges in values of foliar nitrogen concentration, water availability, stand structure and species composition, and plant/water relations were large, even within the upland broadleaf vegetation type. The range was larger when other vegetation types were considered. Unfortunately, cloud cover and navigation errors compromised the utility of the TIMS and AVIRIS data. Nevertheless, the AIRSAR data alone appear to have improved on the available vegetation map for the study area. An example from an area converted to a farm is given to demonstrate how the combined information from AIRSAR, TIMS, and AVIRIS can uniquely identify distinct classes of land use. The example alludes to the potential utility of the three instruments for identifying vegetation at an ecological scale finer than vegetation types.

  6. Right adolescent idiopathic thoracic curve (Lenke 1 A and B): does cost of instrumentation and implant density improve radiographic and cosmetic parameters?

    PubMed

    Yang, Scott; Jones-Quaidoo, Sean M; Eager, Matthew; Griffin, Justin W; Reddi, Vasantha; Novicoff, Wendy; Shilt, Jeffrey; Bersusky, Ernesto; Defino, Helton; Ouellet, Jean; Arlet, Vincent

    2011-07-01

In adolescent idiopathic scoliosis (AIS) there has been a shift towards increasing the number of implants and pedicle screws, which has not been proven to improve cosmetic correction. To evaluate if increasing cost of instrumentation correlates with cosmetic correction using clinical photographs. 58 Lenke 1A and B cases from a multicenter AIS database with at least 3 months follow-up of clinical photographs were used for analysis. Cosmetic parameters on PA and forward bending photographs included angular measurements of trunk shift, shoulder balance, rib hump, and ratio measurements of waist line asymmetry. Pre-op and follow-up X-rays were measured for coronal and sagittal deformity parameters. Cost density was calculated by dividing the total cost of instrumentation by the number of vertebrae being fused. Linear regression and Spearman's correlation were used to correlate cost density to X-ray and photo outcomes. Three independent observers verified radiographic and cosmetic parameters for inter-/intraobserver variability analysis. Average pre-op Cobb angle and instrumented correction were 54° (SD 12.5) and 59% (SD 25) respectively. The average number of vertebrae fused was 10 (SD 1.9). The total cost of spinal instrumentation ranged from $6,769 to $21,274 (mean $12,662, SD $3,858). There was a weak positive and statistically significant correlation between Cobb angle correction and cost density (r = 0.33, p = 0.01), and no correlation between Cobb angle correction of the uninstrumented lumbar spine and cost density (r = 0.15, p = 0.26). There was no significant correlation between any of the sagittal X-ray measurements or photo parameters and cost density. There was good to excellent inter-/intraobserver variability for all photographic parameters based on the intraclass correlation coefficient (ICC 0.74-0.98). 
Our method used to measure cosmesis had good to excellent inter-/intraobserver variability, and may be an effective tool to objectively assess cosmesis from photographs. Since increasing cost density only mildly improves the Cobb angle correction of the main thoracic curve, and not the correction of the uninstrumented spine or any of the cosmetic parameters, one should consider the cost of increasing implant density in Lenke 1A and B curves. In an era of rationalization of health care expenses, this study demonstrates that increasing the number of implants does not improve any relevant cosmetic or radiographic outcomes.
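
The cost-density metric and the rank correlation used above are straightforward to compute. This sketch implements a tie-free Spearman correlation (Pearson correlation of ranks) directly rather than relying on any particular statistics package; the function names are illustrative:

```python
import numpy as np

def cost_density(total_instrumentation_cost, n_vertebrae_fused):
    # As defined in the study: instrumentation cost per fused level.
    return total_instrumentation_cost / n_vertebrae_fused

def spearman_r(x, y):
    """Spearman's correlation = Pearson correlation of the ranks
    (simple version without tie correction)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]
```

With the study's mean cost of $12,662 over 10 fused vertebrae, the cost density is about $1,266 per level; spearman_r would then be evaluated against the Cobb correction across all 58 cases.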

  7. STEREO TRansiting Exoplanet and Stellar Survey (STRESS) - I. Introduction and data pipeline

    NASA Astrophysics Data System (ADS)

    Sangaralingam, Vinothini; Stevens, Ian R.

    2011-12-01

    The Solar TErrestrial RElations Observatory (STEREO) is a system of two identical spacecraft in heliocentric Earth orbit. We use the two heliospheric imagers (HI), which are wide-angle imagers with multibaffle systems, to perform high-precision stellar photometry in order to search for exoplanetary transits and understand stellar variables. The large cadence (40 min for HI-1 and 2 h for HI-2), high precision, wide magnitude range (R mag: 4-12) and broad sky coverage (nearly 20 per cent for HI-1A alone and 60 per cent of the sky in the zodiacal region for all instruments combined) of this instrument place it in a region left largely devoid by other current projects. In this paper, we describe the semi-automated pipeline devised for reduction of the data, some of the interesting characteristics of the data obtained and data-analysis methods used, along with some early results.

  8. Signal-to-noise optimization and evaluation of a home-made visible diode-array spectrophotometer

    PubMed Central

    Raimundo, Jr., Ivo M.; Pasquini, Celio

    1993-01-01

This paper describes a simple low-cost multichannel visible spectrophotometer built with an RL512G EGG-Reticon photodiode array. A symmetric Czerny-Turner optical design was employed; instrument control was via a single-board microcomputer based on the Intel 8085 microprocessor. Spectral intensity data are stored in the single board's RAM and then transferred to an IBM-AT 386SX compatible microcomputer through an RS-232C interface. This external microcomputer processes the data to recover the transmittance, absorbance or relative intensity of the spectra. The signal-to-noise ratio and dynamic range were improved by using variable integration times, which increase during the same scan, and by using either a weighted or unweighted sliding average of consecutive diodes. The instrument is suitable for automatic methods requiring quasi-simultaneous multiwavelength detection, such as multivariate calibration and flow-injection gradient-scan techniques. PMID:18924979
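
The sliding average of consecutive diodes described above is a short convolution along the diode axis; supplying a weight vector (e.g. triangular) gives the weighted variant. A sketch with illustrative parameter names:

```python
import numpy as np

def sliding_average(spectrum, window=5, weights=None):
    """Smooth a diode-array spectrum by averaging consecutive diodes.

    weights=None gives the unweighted moving average; a weight vector
    of length `window` gives the weighted variant. Weights are
    normalized so flat regions of the spectrum are preserved.
    """
    if weights is None:
        weights = np.ones(window)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    return np.convolve(spectrum, weights, mode="same")
```

Averaging n adjacent diodes reduces uncorrelated readout noise by roughly the square root of n, at the cost of some spectral resolution, which is the trade-off the instrument exploits.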

  9. Automated telescope for variability studies

    NASA Astrophysics Data System (ADS)

    Ganesh, S.; Baliyan, K. S.; Chandra, S.; Joshi, U. C.; Kalyaan, A.; Mathur, S. N.

    PRL has installed a 50 cm telescope at Mt Abu, Gurushikhar. The backend instrument consists of a 1K × 1K EMCCD camera with standard UBVRI filters and also has polarization measurement capability using a second filter wheel with polaroid sheets oriented at different position angles. This 50 cm telescope observatory is operated in a robotic mode with different methods of scheduling of the objects being observed. This includes batch mode, fully manual as well as fully autonomous mode of operation. Linux based command line as well as GUI software are used entirely in this observatory. This talk will present the details of the telescope and associated instruments and auxiliary facilities for weather monitoring that were developed in house to ensure the safe and reliable operation of the telescope. The facility has been in use for a couple of years now and various objects have been observed. Some of the interesting results will also be presented.

  10. Developing and testing an instrument to measure the presence of conditions for successful implementation of quality improvement collaboratives.

    PubMed

    Dückers, Michel L A; Wagner, Cordula; Groenewegen, Peter P

    2008-08-11

In quality improvement collaboratives (QICs), teams of practitioners from different health care organizations are brought together to systematically improve an aspect of patient care. Teams take part in a series of meetings to learn about relevant best practices, quality methods and change ideas, and share experiences in making changes in their own local setting. The purpose of this study was to develop an instrument for measuring team organization, external change agent support and support from the team's home institution in a Dutch national improvement and dissemination programme for hospitals based on several QICs. The exploratory methodological design included two phases: a) content development and assessment, resulting in an instrument with 15 items, and b) field testing (N = 165). Internal consistency reliability was tested via Cronbach's alpha coefficient. Principal component analyses were used to identify underlying constructs. Tests of scaling assumptions according to the multitrait/multi-item matrix were used to confirm the component structure. Three components were revealed, explaining 65% of the variability. The components were labelled 'organizational support', 'team organization' and 'external change agent support'. One item not meeting item-scale criteria was removed. This resulted in a 14-item instrument. Scale reliability ranged from 0.77 to 0.91. Internal item consistency and divergent validity were satisfactory. On the whole, the instrument appears to be a promising tool for assessing team organization and internal and external support during QIC implementation. The psychometric properties were good and warrant application of the instrument for the evaluation of the national programme and similar improvement programmes.
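
Cronbach's alpha, the internal-consistency statistic used above, compares the sum of the item variances with the variance of the total score. A minimal sketch over an (n_respondents × n_items) score matrix:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a 2-D array of shape (n_respondents, n_items):

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Items that all move together across respondents push alpha toward 1; uncorrelated items push it toward 0, which is why values like the 0.77-0.91 scale reliabilities reported above indicate good internal consistency.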

  11. Microarray image analysis: background estimation using quantile and morphological filters.

    PubMed

    Bengtsson, Anders; Bengtsson, Henrik

    2006-02-28

    In a microarray experiment the difference in expression between genes on the same slide can be 1,000-fold or more. At low expression, even a small error in the estimate will have great influence on the final test and reference ratios. In addition to the true spot intensity, the scanned signal contains different kinds of noise referred to as background. In order to assess the true spot intensity, background must be subtracted. The standard approach to estimating background intensities is to assume they are equal to the intensity levels between spots. In the literature, morphological opening is suggested to be one of the best methods for estimating background this way. This paper examines fundamental properties of rank and quantile filters, which include morphological filters at the extremes, with a focus on their ability to estimate between-spot intensity levels. The bias and variance of these filter estimates are driven by the number of background pixels used and their distributions. A new rank-filter algorithm is implemented and compared to methods available in Spot by CSIRO and GenePix Pro by Axon Instruments. Spot's morphological opening has a mean bias between -47 and -248, compared to a bias between 2 and -2 for the rank filter, and the variability of the morphological opening estimate is 3 times higher than that of the rank filter. The mean bias of Spot's second method, morph.close.open, is between -5 and -16, and its variability is approximately the same as for morphological opening. The variability of GenePix Pro's region-based estimate is more than ten times higher than the variability of the rank-filter estimate, with slightly more bias. The large variability arises because the size of the background window changes with spot size. To overcome this, a non-adaptive region-based method is implemented; its bias and variability are comparable to those of the rank filter. The performance of more advanced rank filters is equal to the best region-based methods. However, in order to obtain unbiased estimates these filters have to be implemented with great care. The performance of morphological opening is in general poor, with a substantial spatially dependent bias.
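
    The filter comparison can be made concrete with a small numpy sketch of a sliding-window quantile filter and morphological opening on a synthetic 1-D intensity profile (spot positions and noise level invented). It reproduces the qualitative finding that opening is biased low while a median-type rank filter is nearly unbiased:

```python
import numpy as np

def quantile_filter(signal, window, q):
    """Sliding-window quantile filter; q=0 is erosion (local min),
    q=1 is dilation (local max), q=0.5 is a median rank filter."""
    half = window // 2
    padded = np.pad(np.asarray(signal, float), half, mode="edge")
    return np.array([np.quantile(padded[i:i + window], q)
                     for i in range(len(signal))])

def morphological_opening(signal, window):
    """Opening = erosion followed by dilation with the same window."""
    eroded = quantile_filter(signal, window, 0.0)
    return quantile_filter(eroded, window, 1.0)

rng = np.random.default_rng(0)
background = 100 + rng.normal(0, 5, 200)   # noisy background around level 100
signal = background.copy()
signal[50:55] += 500                       # two bright "spots"
signal[120:125] += 300

bg_open = morphological_opening(signal, 31)
bg_median = quantile_filter(signal, 31, 0.5)

# The median filter tracks the true background level (~100) with little bias;
# plain opening sits below it because it is built from local minima.
print(round(bg_median.mean(), 1), round(bg_open.mean(), 1))
```

    The window must be wider than a spot so that both filters see true background pixels at every position.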

  12. Conditional screening for ultra-high dimensional covariates with survival outcomes

    PubMed Central

    Hong, Hyokyoung G.; Li, Yi

    2017-01-01

    Identifying important biomarkers that are predictive for cancer patients’ prognosis is key to gaining better insights into the biological influences on the disease and has become a critical component of precision medicine. The emergence of large-scale biomedical survival studies, which typically involve an excessive number of biomarkers, has created high demand for efficient screening tools for selecting predictive biomarkers. The vast number of biomarkers defies any existing variable selection method via regularization. The recently developed variable screening methods, though powerful in many practical settings, fail to incorporate prior information on the importance of each biomarker and are less powerful in detecting marginally weak but jointly important signals. We propose a new conditional screening method for survival outcome data that computes the marginal contribution of each biomarker given a priori known biological information. This is based on the premise that some biomarkers are known to be associated with disease outcomes a priori. Our method possesses sure screening properties and a vanishing false selection rate. The utility of the proposal is further confirmed with extensive simulation studies and analysis of a diffuse large B-cell lymphoma dataset. We are pleased to dedicate this work to Jack Kalbfleisch, who has made instrumental contributions to the development of modern methods of analyzing survival data. PMID:27933468
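
    The flavour of conditional screening can be sketched with a toy residual-correlation utility: each candidate biomarker is scored by its association with (log) survival time after removing what the a-priori biomarkers already explain. This is an illustration of the idea only, not the paper's exact marginal-utility statistic, and the dimensions are far below "ultra-high" for brevity:

```python
import numpy as np

def conditional_screen(X_prior, X_cand, log_time, top_k):
    """Rank candidates by |correlation| with log survival time, with both sides
    residualized on the a-priori (conditioning) biomarkers."""
    def residualize(y, Z):
        Z1 = np.column_stack([np.ones(len(y)), Z])
        beta, *_ = np.linalg.lstsq(Z1, y, rcond=None)
        return y - Z1 @ beta
    r_time = residualize(log_time, X_prior)
    utilities = np.array([abs(np.corrcoef(residualize(x, X_prior), r_time)[0, 1])
                          for x in X_cand.T])
    return np.argsort(utilities)[::-1][:top_k]

rng = np.random.default_rng(1)
n, p = 200, 50
X_prior = rng.normal(size=(n, 2))   # biomarkers known to matter a priori
X_cand = rng.normal(size=(n, p))    # candidate biomarkers to screen
# True model: prior markers plus candidates 3 and 7 drive survival time
log_time = X_prior @ [1.0, -1.0] + 2.0 * X_cand[:, 3] - 2.0 * X_cand[:, 7] \
           + rng.normal(scale=0.5, size=n)

selected = conditional_screen(X_prior, X_cand, log_time, top_k=2)
print(sorted(selected.tolist()))
```

    A survival-specific utility (e.g. one based on Cox partial likelihood, with censoring) would replace the plain correlation in a real analysis.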

  13. Applications of neural networks in training science.

    PubMed

    Pfeiffer, Mark; Hohmann, Andreas

    2012-04-01

    Training science views itself as an integrated and applied science, developing practical measures founded on scientific method. Therefore, it demands consideration of a wide spectrum of approaches and methods. Especially in the field of competitive sports, research questions are usually located in complex environments, so that mainly field studies are drawn upon to obtain broad external validity. Here, the interrelations between different variables or variable sets are mostly of a nonlinear character. In these cases, neural-network methods, e.g., the pattern-recognizing Self-Organizing Kohonen Feature Maps or similar instruments for identifying interactions, might be successfully applied to analyze data. Following on from a classification of data analysis methods in training-science research, the aim of the contribution is to give examples from varied sports in which network approaches can be effectively used in training science. First, two examples are given in which neural networks are employed for pattern recognition. While one investigation deals with the detection of sporting talent in swimming, the other is located in game sports research, identifying tactical patterns in team handball. The third and last example shows how an artificial neural network can be used to predict competitive performance in swimming. Copyright © 2011 Elsevier B.V. All rights reserved.
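
    A minimal numpy sketch of the Kohonen Self-Organizing Map mentioned above (the two-cluster "performance profile" data are invented; real talent-detection studies use many more features and a dedicated SOM library):

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=30, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Kohonen SOM: each sample pulls its best-matching unit (BMU)
    and, more weakly, that unit's grid neighbours; learning rate and
    neighbourhood radius decay over training."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.normal(size=(rows * cols, data.shape[1]))
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = step / n_steps
            lr = lr0 * (1.0 - frac)              # learning rate decays to 0
            sigma = sigma0 * (1.0 - frac) + 0.5  # neighbourhood radius shrinks
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
            dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-dist2 / (2.0 * sigma ** 2))
            weights += lr * h[:, None] * (x - weights)
            step += 1
    return weights

# Toy setting: two well-separated groups of 2-D performance profiles.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(-3, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
w = train_som(data)
# Quantization error: mean distance from each sample to its best-matching unit.
qe = np.mean([np.min(np.linalg.norm(w - x, axis=1)) for x in data])
print(qe < 1.0)
```

    After training, samples from the two groups map to different regions of the grid, which is what makes the map useful for discovering patterns such as tactical types.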

  14. Mapping Spatial Variability of Soil Salinity in a Coastal Paddy Field Based on Electromagnetic Sensors

    PubMed Central

    Guo, Yan; Huang, Jingyi; Shi, Zhou; Li, Hongyi

    2015-01-01

    In coastal China, there is an urgent need to increase land area for agricultural production and urban development, where the population is growing rapidly. One solution is land reclamation from coastal tidelands, but soil salinization is problematic. As such, it is very important to characterize and map the within-field variability of soil salinity in space and time. Conventional methods are often time-consuming, expensive, labor-intensive, and impractical. Fortunately, proximal sensing has become an important technology for characterizing within-field spatial variability. In this study, we employed the EM38 to study the spatial variability of soil salinity in a coastal paddy field. A significant correlation between ECa and EC1:5 (r > 0.9) allowed us to use EM38 data to characterize the spatial variability of soil salinity. Geostatistical methods were used to determine the horizontal spatio-temporal variability of soil salinity over three consecutive years. The study found that the distribution of salinity was heterogeneous and that the leaching of salts was more significant at the edges of the study field. By inverting the EM38 data using a Quasi-3D inversion algorithm, the vertical spatio-temporal variability of soil salinity was determined and the leaching of salts over time was easily identified. The methodology of this study can serve as guidance for researchers interested in understanding soil salinity development as well as land managers aiming for effective soil salinity monitoring and management practices. In order to better characterize the variations in soil salinity in deeper soil profiles, the deeper mode of EM38 (i.e., EM38v) as well as other EMI instruments (e.g. DUALEM-421) can be incorporated to conduct Quasi-3D inversions for deeper soil profiles. PMID:26020969
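
    A basic building block of the geostatistical analysis mentioned above is the empirical semivariogram, which quantifies how dissimilarity between measurements grows with separation distance. A minimal numpy sketch on synthetic salinity-like data (coordinates and trend invented):

```python
import numpy as np

def empirical_semivariogram(coords, values, lag_edges):
    """Empirical semivariogram: gamma(h) = 0.5 * mean squared difference of
    values over point pairs whose separation falls into each lag bin."""
    n = len(values)
    i, j = np.triu_indices(n, k=1)                 # all unordered point pairs
    d = np.linalg.norm(coords[i] - coords[j], axis=1)
    sq = 0.5 * (values[i] - values[j]) ** 2
    gamma = np.full(len(lag_edges) - 1, np.nan)
    for b in range(len(lag_edges) - 1):
        mask = (d >= lag_edges[b]) & (d < lag_edges[b + 1])
        if mask.any():
            gamma[b] = sq[mask].mean()
    return gamma

# Synthetic field with a smooth spatial trend: semivariance should rise with
# lag, as expected for spatially structured salinity.
rng = np.random.default_rng(2)
coords = rng.uniform(0, 100, size=(300, 2))
values = 0.05 * coords[:, 0] + rng.normal(scale=0.5, size=300)
gamma = empirical_semivariogram(coords, values, np.array([0, 20, 40, 60, 80]))
print(gamma[0] < gamma[-1])
```

    A variogram model fitted to these bin estimates is what kriging then uses to interpolate salinity between EM38 transects.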

  15. Mapping spatial variability of soil salinity in a coastal paddy field based on electromagnetic sensors.

    PubMed

    Guo, Yan; Huang, Jingyi; Shi, Zhou; Li, Hongyi

    2015-01-01

    In coastal China, there is an urgent need to increase land area for agricultural production and urban development, where the population is growing rapidly. One solution is land reclamation from coastal tidelands, but soil salinization is problematic. As such, it is very important to characterize and map the within-field variability of soil salinity in space and time. Conventional methods are often time-consuming, expensive, labor-intensive, and impractical. Fortunately, proximal sensing has become an important technology for characterizing within-field spatial variability. In this study, we employed the EM38 to study the spatial variability of soil salinity in a coastal paddy field. A significant correlation between ECa and EC1:5 (r > 0.9) allowed us to use EM38 data to characterize the spatial variability of soil salinity. Geostatistical methods were used to determine the horizontal spatio-temporal variability of soil salinity over three consecutive years. The study found that the distribution of salinity was heterogeneous and that the leaching of salts was more significant at the edges of the study field. By inverting the EM38 data using a Quasi-3D inversion algorithm, the vertical spatio-temporal variability of soil salinity was determined and the leaching of salts over time was easily identified. The methodology of this study can serve as guidance for researchers interested in understanding soil salinity development as well as land managers aiming for effective soil salinity monitoring and management practices. In order to better characterize the variations in soil salinity in deeper soil profiles, the deeper mode of EM38 (i.e., EM38v) as well as other EMI instruments (e.g. DUALEM-421) can be incorporated to conduct Quasi-3D inversions for deeper soil profiles.

  16. Instrumental-Music Study and Student Achievement

    ERIC Educational Resources Information Center

    Heninger, Benjamin Thomas

    2017-01-01

    This quantitative study examined relationships among independent variables, including instrumental-music participation, and achievement on a standardized test by participants in the ninth grade at a western Wisconsin high school. The ACT "Aspire" (2017) test is a vertically articulated, standards-based system of assessments. Subtest…

  17. Application of a methane carbon isotope analyzer for the investigation of δ13C of methane emission measured by the automatic chamber method in an Arctic Tundra

    NASA Astrophysics Data System (ADS)

    Mastepanov, Mikhail; Christensen, Torben

    2014-05-01

    Methane emissions have been monitored by an automatic chamber method in the Zackenberg valley, NE Greenland, since 2006 as a part of the Greenland Ecosystem Monitoring (GEM) program. During most of the seasons the measurements were carried out from the time of snow melt (June-July) until freezing of the active layer (October-November). Several years of data, obtained with the same method and instrumentation at exactly the same site, provided a unique opportunity to analyse interannual methane flux patterns and the factors affecting their temporal variability. The start of the growing-season emissions was found to be closely related to the date of snow melt at the site. Despite large between-year variability in this date (sometimes more than a month), methane emission started within a few days thereafter and increased for roughly the next 30 days. After this peak, emission slowly decreased and remained more or less constant, or slightly decreasing, during the rest of the growing season (Mastepanov et al., Biogeosciences, 2013). During soil freezing, a second peak of methane emission was found (Mastepanov et al., Nature, 2008); its amplitude varied considerably between years, from almost undetectable to comparable with total growing-season emissions. Analysis of the multiyear emission patterns (Mastepanov et al., Biogeosciences, 2013) led to hypotheses of different sources for the spring, summer and autumn methane emissions, and multiyear cycles of accumulation and release of these components to the atmosphere. To investigate this further, the monitoring system was complemented with a methane carbon isotope analyzer (Los Gatos Research, USA). The instrument was installed during the 2013 field season and operated successfully until the end of the measurement campaign (27 October). Detecting both 12C-CH4 and 13C-CH4 concentrations in real time (0.5 Hz) during automatic chamber closure (15 min), the instrument provided data for determining the δ13C of the emitted methane (from the relation between 12C-CH4 and 13C-CH4 fluxes). Unfortunately, the beginning of the season was missed due to a delay in the instrument shipment; the summer fluxes were lower than in any of the 7 previous years due to an exceptional drought; and the autumn burst was not detected due to both exceptionally slow soil freezing and a low soil methane content. However, the data obtained from the most productive chambers confirm the feasibility of the chosen method and give good expectations for the 2014 field campaign.
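
    The "relation between 12C-CH4 and 13C-CH4 fluxes" amounts to taking the slope of 13CH4 against 12CH4 during chamber closure as the isotope ratio of the source. A minimal numpy sketch with an idealized synthetic closure (a Keeling/Miller-Tans-style illustration; real chamber data need quality control this toy skips):

```python
import numpy as np

R_VPDB = 0.0111802  # 13C/12C ratio of the VPDB reference standard

def delta13C_from_chamber(c12, c13):
    """delta13C (permil) of the emitted methane: the slope of 13CH4 vs 12CH4
    during closure gives the 13C/12C ratio of the source."""
    slope = np.polyfit(c12, c13, 1)[0]
    return (slope / R_VPDB - 1.0) * 1000.0

# Synthetic 15-min closure with a source at delta13C = -60 permil,
# a typical value for microbially produced wetland methane.
t = np.linspace(0, 900, 31)                  # seconds since closure
R_source = R_VPDB * (1 - 60.0 / 1000.0)
c12 = 1.9 + 0.0004 * t                       # 12CH4 rising under the chamber
c13 = 0.021 + 0.0004 * R_source * t          # 13CH4 rising in source ratio
print(round(delta13C_from_chamber(c12, c13), 1))  # → -60.0
```

    Distinct delta13C signatures for the spring, summer and autumn emission components are exactly what would support the multiple-source hypothesis described above.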

  18. Mediation and moderation of treatment effects in randomised controlled trials of complex interventions.

    PubMed

    Emsley, Richard; Dunn, Graham; White, Ian R

    2010-06-01

    Complex intervention trials should be able to answer both pragmatic and explanatory questions in order to test the theories motivating the intervention and to help understand the underlying nature of the clinical problem being treated. Key to this is the estimation of direct effects of treatment and of indirect effects acting through intermediate variables that are measured post-randomisation. Using psychological treatment trials as an example of complex interventions, we review statistical methods which crucially evaluate both direct and indirect effects in the presence of hidden confounding between mediator and outcome. We review the historical literature on mediation and moderation of treatment effects. We introduce two methods from the existing causal inference literature, principal stratification and structural mean models, and demonstrate how these can be applied in a mediation context before discussing approaches and assumptions necessary for attaining identifiability of key parameters of the basic causal model. Assuming that baseline covariates modify the effect of treatment (i.e. randomisation) on the mediator (covariate-by-treatment interactions), but that these interactions have no direct effect on the outcome, leads to the use of instrumental variable methods. We describe how moderation can occur through post-randomisation variables, and extend the principal stratification approach to multiple-group methods with explanatory models nested within the principal strata. We illustrate the new methodology with motivating examples of randomised trials from the mental health literature.
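
    The instrumental-variable logic invoked above can be illustrated with a generic two-stage least squares (2SLS) sketch on simulated data. The paper's interaction-based instruments are replaced here by a single toy instrument, and all effect sizes are invented:

```python
import numpy as np

def two_stage_least_squares(y, x, z):
    """2SLS with one endogenous regressor x and instrument z:
    stage 1 predicts x from z; stage 2 regresses y on that prediction."""
    Z = np.column_stack([np.ones(len(y)), z])
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    X_hat = np.column_stack([np.ones(len(y)), x_hat])
    return np.linalg.lstsq(X_hat, y, rcond=None)[0][1]  # coefficient on x

rng = np.random.default_rng(3)
n = 20000
u = rng.normal(size=n)    # hidden confounder of mediator and outcome
z = rng.normal(size=n)    # instrument: affects x, has no direct path to y
x = 0.8 * z + u + rng.normal(size=n)          # mediator
y = 2.0 * x + 3.0 * u + rng.normal(size=n)    # true causal effect of x is 2.0

naive = np.polyfit(x, y, 1)[0]                # confounded, biased upward
iv = two_stage_least_squares(y, x, z)         # recovers ~2.0
print(round(naive, 1), round(iv, 1))
```

    The naive regression absorbs the hidden confounding through u, while the 2SLS estimate uses only the variation in x that is driven by the instrument.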

  19. An overview of AmeriFlux data products and methods for data acquisition, processing, and publication

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Poindexter, C.; Agarwal, D.; Papale, D.; van Ingen, C.; Torn, M. S.

    2014-12-01

    The AmeriFlux network encompasses independently managed field sites measuring ecosystem carbon, water, and energy fluxes across the Americas. In close coordination with ICOS in Europe, a new set of flux data and metadata products is being produced and released at the FLUXNET level, including all AmeriFlux sites. This will enable continued releases of a globally standardized set of flux data products. In this release, new formats, structures, and ancillary information are being proposed and adopted. This presentation discusses these aspects, detailing current and future solutions. One of the major revisions was to the BADM (Biological, Ancillary, and Disturbance Metadata) protocols. The updates include structure and variable changes to address new developments in data collection related to flux towers and to facilitate two-way data sharing. In particular, a new organization of templates is now in place, including changes in templates for biomass, disturbances, instrumentation, soils, and others. New variables and an extensive addition to the vocabularies used to describe BADM templates allow for more flexible and comprehensive coverage of field sites and of the data collection methods and results. Another extensive revision is in the data formats, levels, and versions for flux and micrometeorological data. A new selection and revision of data variables and an integrated new definition of data processing levels allow for a more intuitive and flexible notation for the variety of data products. For instance, all variables now include positional information that is tied to BADM instrumentation descriptions. This allows for a better characterization of the spatial representativeness of data points, e.g., individual sensors or the tower footprint. Additionally, a new definition of data levels better characterizes the types of processing and transformations applied to the data across different dimensions (e.g., spatial representativeness of a data point, data quality checks applied, and differentiation between measured data and data from models that use process knowledge). We also present an expanded approach to versions of data and data processing software, with stable and immutable data releases, but also pre-release versions to allow evaluation and feedback prior to a stable release.

  20. Design and validation of a standards-based science teacher efficacy instrument

    NASA Astrophysics Data System (ADS)

    Kerr, Patricia Reda

    National standards for K--12 science education address all aspects of science education, with their main emphasis on curriculum---both science subject matter and the process involved in doing science. Standards for science teacher education programs have been developing along a parallel plane, as is self-efficacy research involving classroom teachers. Generally, studies about efficacy have been dichotomous---basing the theoretical underpinnings on the work of either Rotter's Locus of Control theory or Bandura's explanations of efficacy beliefs and outcome expectancy. This study brings all three threads together---K--12 science standards, teacher education standards, and efficacy beliefs---in an instrument designed to measure science teacher efficacy with items based on identified critical attributes of standards-based science teaching and learning. Based on Bandura's explanation of efficacy being task-specific and having outcome expectancy, a developmental, systematic progression from standards-based strategies and activities to tasks to critical attributes was used to craft items for a standards-based science teacher efficacy instrument. Demographic questions related to school characteristics, teacher characteristics, preservice background, science teaching experience, and post-certification professional development were included in the instrument. The instrument was completed by 102 middle level science teachers, with complete data for 87 teachers. A principal components analysis of the science teachers' responses to the instrument resulted in two components: Standards-Based Science Teacher Efficacy: Beliefs About Teaching (BAT, reliability = .92) and Standards-Based Science Teacher Efficacy: Beliefs About Student Achievement (BASA, reliability = .82). Variables that were characteristic of professional development activities, science content preparation, and school environment were identified as members of the sets of variables predicting the BAT and BASA subscales. Correlations were computed for BAT, BASA, and demographic variables to identify relationships between teacher efficacy, teacher characteristics, and school characteristics. Further research is recommended to refine the instrument and apply its use to a larger sample of science teachers. Its further development also has significance for the enhancement of science teacher education programs.

  1. Deriving Surface NO2 Mixing Ratios from DISCOVER-AQ ACAM Observations: A Method to Assess Surface NO2 Spatial Variability

    NASA Astrophysics Data System (ADS)

    Silverman, M. L.; Szykman, J.; Chen, G.; Crawford, J. H.; Janz, S. J.; Kowalewski, M. G.; Lamsal, L. N.; Long, R.

    2015-12-01

    Studies have shown that satellite NO2 columns are closely related to ground-level NO2 concentrations, particularly over polluted areas. This provides a means to assess surface-level NO2 spatial variability over a broader area than can be monitored from ground stations. The characterization of surface-level NO2 variability is important for understanding air quality in urban areas, emissions, health impacts, and photochemistry, and for evaluating the performance of chemical transport models. Using data from the NASA DISCOVER-AQ campaign in Baltimore/Washington, we calculate NO2 mixing ratios from the Airborne Compact Atmospheric Mapper (ACAM) using four different methods to derive surface concentrations from column measurements. High spectral resolution lidar (HSRL) mixed layer heights, vertical P3B profiles, and CMAQ vertical profiles are used to scale ACAM vertical column densities. The derived NO2 mixing ratios are compared to EPA ground measurements taken at Padonia and Edgewood. We find similar results from scaling with HSRL mixed layer heights and normalized P3B vertical profiles. The HSRL mixed layer heights are then used to scale ACAM vertical column densities across the DISCOVER-AQ flight pattern to assess the spatial variability of NO2 over the area. This work will help define the measurement requirements for future satellite instruments.
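
    The simplest of the column-to-surface scaling approaches distributes the vertical column density uniformly through the mixed layer. A sketch of that conversion (physical constants are standard; the pressure, temperature and example column value are illustrative assumptions, not campaign data):

```python
K_BOLTZMANN = 1.380649e-23  # J/K

def surface_mixing_ratio_ppb(vcd_molec_cm2, mlh_m, p_pa=101325.0, t_k=288.0):
    """Surface NO2 mixing ratio (ppb), assuming the column is well mixed
    within the mixed layer: X = VCD / (n_air * MLH)."""
    n_air = p_pa / (K_BOLTZMANN * t_k)   # air number density, molecules/m^3
    vcd_m2 = vcd_molec_cm2 * 1e4         # convert column to molecules/m^2
    return vcd_m2 / (n_air * mlh_m) * 1e9

# A polluted-scene column of 1e16 molec/cm^2 under a 1.5 km mixed layer
print(round(surface_mixing_ratio_ppb(1e16, 1500.0), 1))  # → ~2.6 ppb
```

    Profile-based scalings (P3B or CMAQ shapes) refine this by weighting the column with the observed or modelled vertical distribution instead of a uniform layer.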

  2. Monte Carlo Bayesian Inference on a Statistical Model of Sub-Gridcolumn Moisture Variability using High-Resolution Cloud Observations

    NASA Astrophysics Data System (ADS)

    Norris, P. M.; da Silva, A. M., Jr.

    2016-12-01

    Norris and da Silva recently published a method to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation (CDA). The gridcolumn model includes assumed-PDF intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used are MODIS cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast where the background state has a clear swath. The new approach not only significantly reduces mean and standard deviation biases with respect to the assimilated observables, but also improves the simulated rotational-Raman scattering cloud optical centroid pressure against independent (non-assimilated) retrievals from the OMI instrument. One obvious difficulty for the method, and other CDA methods, is the lack of information content in passive cloud observables on cloud vertical structure, beyond cloud top and thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification due to Riishojgaard is helpful, better honoring inversion structures in the background state.
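
    The key property exploited above, that Metropolis-style MCMC proposes finite jumps and so can leave a state from which no infinitesimal perturbation helps, is visible even in a one-dimensional toy. A minimal random-walk Metropolis sketch on a standard-normal "posterior" (nothing here is the paper's actual moisture model):

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_samples, step, seed=0):
    """Random-walk Metropolis: propose x + step*N(0,1), accept with
    probability min(1, posterior ratio). Finite jumps let the chain reach
    regions a gradient step from a poor starting state could not."""
    rng = np.random.default_rng(seed)
    x = x0
    lp = log_post(x)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

# Toy posterior: standard normal, started far from the mode at x0 = 5.
s = metropolis_hastings(lambda x: -0.5 * x * x, x0=5.0, n_samples=20000, step=1.0)
burned = s[2000:]  # discard burn-in while the chain travels to the mode
print(round(burned.mean(), 2), round(burned.std(), 2))
```

    In the CDA setting the state is a whole gridcolumn moisture profile and the log-posterior combines the background term with the cloud-observable likelihood, but the accept/reject mechanics are the same.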

  3. Personality Traits and Socio-Demographic Variables as Correlates of Counselling Effectiveness of Counsellors in Enugu State, Nigeria

    ERIC Educational Resources Information Center

    Onyekuru, Bruno U.; Ibegbunam, Josephat

    2015-01-01

    Quality personality traits and socio-demographic variables are essential elements of effective counselling. This correlational study investigated personality traits and socio-demographic variables as predictors of counselling effectiveness of counsellors in Enugu State. The instruments for data collection were Personality Traits Assessment Scale…

  4. Ascertainment-adjusted parameter estimation approach to improve robustness against misspecification of health monitoring methods

    NASA Astrophysics Data System (ADS)

    Juesas, P.; Ramasso, E.

    2016-12-01

    Condition monitoring aims at ensuring system safety, which is a fundamental requirement for industrial applications and has become an inescapable social demand. This objective is attained by instrumenting the system and developing data analytics methods, such as statistical models, able to turn data into relevant knowledge. One difficulty is to correctly estimate the parameters of those methods from time-series data. This paper suggests the use of Weighted Distribution Theory together with the Expectation-Maximization algorithm to improve parameter estimation in statistical models with latent variables, with an application to health monitoring under uncertainty. The improvement of estimates is made possible by incorporating uncertain and possibly noisy prior knowledge on latent variables in a sound manner. The latent variables are exploited to build a degradation model of a dynamical system represented as a sequence of discrete states. Examples on Gaussian Mixture Models and Hidden Markov Models (HMM) with discrete and continuous outputs are presented on both simulated data and benchmarks using the turbofan engine datasets. A focus on the application of a discrete HMM to health monitoring under uncertainty emphasizes the interest of the proposed approach in the presence of different operating conditions and fault modes. It is shown that the proposed model exhibits high robustness in the presence of noisy and uncertain priors.
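
    The general idea of weighting the latent-variable distribution with uncertain prior knowledge can be sketched on the simplest latent-variable model named above, a Gaussian mixture: the E-step responsibilities are tilted by noisy prior label weights before the M-step. This is an illustrative reading of the approach, not the paper's exact weighted-distribution formulation:

```python
import numpy as np

def em_gmm_with_prior(x, prior_resp, n_iter=100):
    """EM for a two-component 1-D Gaussian mixture where the E-step posterior
    is multiplied by uncertain prior weights on the latent labels."""
    mu = np.array([x.min(), x.max()])
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: likelihood-based responsibilities, tilted by prior weights
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        resp = dens * prior_resp
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: weighted means, spreads and mixing proportions
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-9
        pi = nk / nk.sum()
    return mu, sigma, pi

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(5, 1, 300)])
# Noisy prior knowledge: 80% confidence in the true label of each observation
truth = np.repeat([0, 1], 300)
prior = np.where(np.eye(2)[truth] == 1, 0.8, 0.2)
mu, sigma, pi = em_gmm_with_prior(x, prior)
print(np.round(np.sort(mu), 1))
```

    With an uninformative prior (all weights 0.5) the update reduces to standard EM; the noisy 0.8/0.2 weights steer the components toward the intended health states without being trusted absolutely.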

  5. A satellite simulator for TRMM PR applied to climate model simulations

    NASA Astrophysics Data System (ADS)

    Spangehl, T.; Schroeder, M.; Bodas-Salcedo, A.; Hollmann, R.; Riley Dellaripa, E. M.; Schumacher, C.

    2017-12-01

    Climate model simulations have to be compared against observation-based datasets in order to assess their skill in representing precipitation characteristics. Here we use a satellite simulator for TRMM PR to evaluate simulations with MPI-ESM (the Earth system model of the Max Planck Institute for Meteorology in Hamburg, Germany) carried out within the MiKlip project (https://www.fona-miklip.de/, funded by the Federal Ministry of Education and Research in Germany). While classical evaluation methods focus on geophysical parameters such as precipitation amounts, the application of the satellite simulator enables an evaluation in the instrument's parameter space, thereby reducing uncertainties on the reference side. The CFMIP Observation Simulator Package (COSP) provides a framework for the application of satellite simulators to climate model simulations. The approach requires the introduction of sub-grid cloud and precipitation variability. Radar reflectivities are obtained by applying Mie theory, with the microphysical assumptions chosen to match the atmosphere component of MPI-ESM (ECHAM6). The results are found to be sensitive to the methods used to distribute the convective precipitation over the sub-grid boxes. Simple parameterization methods are used to introduce sub-grid variability of convective clouds and precipitation. In order to constrain uncertainties, a comprehensive comparison with sub-grid-scale convective precipitation variability deduced from TRMM PR observations is carried out.

  6. How adolescents construct their future: the effect of loneliness on future orientation.

    PubMed

    Seginer, Rachel; Lilach, Efrat

    2004-12-01

    This study examined the effect of loneliness, gender, and two dimensions of prospective life domains on adolescent future orientation. Future orientation was studied in four prospective domains: social relations, marriage and family, higher education, and work and career. These domains are described in terms of two dimensions: theme (relational vs. instrumental) and distance (near vs. distant future). Data collected from Israeli Jewish adolescents (11th graders) were analysed by repeated measures ANOVAs and ANCOVAs (covariate: depressive experiences) for seven future orientation variables: value, expectance, control (motivational variables), hopes, fears (cognitive representation variables), exploration, commitment (behavioural variables). As predicted, lonely adolescents scored lower than socially embedded adolescents on future orientation variables applied to the relational and near-future domains, and lonely boys scored lower than lonely girls. However, effects were found only on the three future orientation motivational variables and not on the cognitive representation and behavioural variables. Contrary to prediction, controlling for the effect of depressive experiences did not reduce the effect of loneliness on the future orientation variables, but it reduced the tendency of adolescents to score higher on all future orientation variables in the instrumental than in the relational prospective domains. The contribution of these findings to the understanding of adolescent loneliness and future orientation is discussed and directions for future research are suggested.

  7. Waist circumference, body mass index, and employment outcomes.

    PubMed

    Kinge, Jonas Minet

    2017-07-01

    Body mass index (BMI) is an imperfect measure of body fat. Recent studies provide evidence in favor of replacing BMI with waist circumference (WC). Hence, I investigated whether or not the association between fat mass and employment status varies by anthropometric measure. I used 15 rounds of the Health Survey for England (1998-2013), which has measures of employment status in addition to measured height, weight, and WC. WC and BMI were entered as continuous variables, and obesity as binary variables defined using both WC and BMI. I used multivariate models controlling for a set of covariates. The association of WC with employment was of greater magnitude than the association between BMI and employment. I reran the analysis using conventional instrumental variables methods. The IV models showed significant impacts of obesity on employment; however, these impacts were not more pronounced when WC was used to measure obesity compared to BMI. This means that, in the IV models, the impact of fat mass on employment did not depend on the measure of fat mass.

  8. How does searching for health information on the Internet affect individuals' demand for health care services?

    PubMed

    Suziedelyte, Agne

    2012-11-01

    The emergence of the Internet made health information, which previously was almost exclusively available to health professionals, accessible to the general public. Access to health information on the Internet is likely to affect individuals' health care related decisions. The aim of this analysis is to determine how health information that people obtain from the Internet affects their demand for health care. I use a novel data set, the U.S. Health Information National Trends Survey (2003-07), to answer this question. The causal variable of interest is a binary variable that indicates whether or not an individual has recently searched for health information on the Internet. Health care utilization is measured by an individual's number of visits to a health professional in the past 12 months. An individual's decision to use the Internet to search for health information is likely to be correlated with other variables that can also affect his/her demand for health care. To separate the effect of Internet health information from other confounding variables, I control for a number of individual characteristics and use the instrumental variable estimation method. As an instrument for Internet health information, I use U.S. state telecommunication regulations that are shown to affect the supply of Internet services. I find that searching for health information on the Internet has a positive, relatively large, and statistically significant effect on an individual's demand for health care. This effect is larger for individuals who search for health information online more frequently and for people who have health care coverage. Among cancer patients, the effect of Internet health information seeking on health professional visits varies by how long ago they were diagnosed with cancer. Thus, the Internet is found to be a complement to formal health care rather than a substitute for health professional services. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. The Effect of Birth Weight on Academic Performance: Instrumental Variable Analysis.

    PubMed

    Lin, Shi Lin; Leung, Gabriel Matthew; Schooling, C Mary

    2017-05-01

    Observationally, lower birth weight is usually associated with poorer academic performance; whether this association is causal or the result of confounding is unknown. To investigate this question, we obtained an effect estimate, which can have a causal interpretation under specific assumptions, of birth weight on educational attainment using instrumental variable analysis based on single nucleotide polymorphisms determining birth weight combined with results from the Social Science Genetic Association Consortium study of 126,559 Caucasians. We similarly obtained an estimate of the effect of birth weight on academic performance in 4,067 adolescents from Hong Kong's (Chinese) Children of 1997 birth cohort (1997-2016), using twin status as an instrumental variable. Birth weight was not associated with years of schooling (per 100-g increase in birth weight, -0.006 years, 95% confidence interval (CI): -0.02, 0.01) or college completion (odds ratio = 1.00, 95% CI: 0.96, 1.03). Birth weight was also unrelated to academic performance in adolescents (per 100-g increase in birth weight, -0.004 grade, 95% CI: -0.04, 0.04) using instrumental variable analysis, although conventional regression gave a small positive association (0.02 higher grade, 95% CI: 0.01, 0.03). Observed associations of birth weight with academic performance may not be causal, suggesting that interventions should focus on the contextual factors generating this correlation. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. True Airspeed Measurement by Ionization-Tracer Technique

    NASA Technical Reports Server (NTRS)

    Boyd, B.; Dorsch, R. G.; Brodie, G. H.

    1952-01-01

    Ion bundles produced in a pulse-excited corona discharge are used as tracers with a radar-like pulse transit-time measuring instrument in order to provide a measurement of airspeed that is independent of all variables except time and distance. The resulting instrumentation need not project into the air stream and, therefore, will not cause any interference in supersonic flow. The instrument was tested at Mach numbers ranging from 0.3 to 3.8. Use of the proper instrumentation and technique results in accuracy of the order of 1 percent.

  11. Development process and initial validation of the Ethical Conflict in Nursing Questionnaire-Critical Care Version.

    PubMed

    Falcó-Pegueroles, Anna; Lluch-Canut, Teresa; Guàrdia-Olmos, Joan

    2013-06-01

Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables 'frequency' and 'degree of conflict'. In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable 'exposure to conflict', as well as considering six 'types of ethical conflict'. An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validation, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units.
Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV.
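
    Cronbach's alpha, used above to assess the ECNQ-CCV's reliability, is straightforward to compute from an item-score matrix. A minimal sketch with simulated data (not the ECNQ-CCV responses):

```python
import numpy as np

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the
    total score), for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Simulated respondents: 12 items all loading on one latent trait,
# so internal consistency should be high (illustrative data only).
rng = np.random.default_rng(5)
trait = rng.normal(0, 1, (300, 1))
scores = trait + rng.normal(0, 0.7, (300, 12))
print(round(cronbach_alpha(scores), 2))   # high alpha for correlated items
```

    Items that share a common latent trait yield a high alpha; purely independent items would push alpha toward zero.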

  12. Estimating the association between metabolic risk factors and marijuana use in U.S. adults using data from the continuous National Health and Nutrition Examination Survey.

    PubMed

    Thompson, Christin Ann; Hay, Joel W

    2015-07-01

More research is needed on the health effects of marijuana use. Results of previous studies indicate that marijuana could alleviate certain factors of metabolic syndrome, such as obesity. Data on 6281 persons from the National Health and Nutrition Examination Survey from 2005 to 2012 were used to estimate the effect of marijuana use on cardiometabolic risk factors. The reliability of ordinary least squares (OLS) regression models was tested by replacing marijuana use as the risk factor of interest with alcohol and carbohydrate consumption. Instrumental variable methods were used to account for the potential endogeneity of marijuana use. OLS models show lower fasting insulin, insulin resistance, body mass index, and waist circumference in users compared with nonusers. However, when alcohol and carbohydrate intake substitute for marijuana use in OLS models, similar metabolic benefits are estimated. The Durbin-Wu-Hausman tests provide evidence of endogeneity of marijuana use in OLS models, but instrumental variables models do not yield significant estimates for marijuana use. These findings challenge the robustness of OLS estimates of a positive relationship between marijuana use and fasting insulin, insulin resistance, body mass index, and waist circumference. Copyright © 2015 Elsevier Inc. All rights reserved.
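
    The Durbin-Wu-Hausman test mentioned above can be run in its control-function form: regress the suspect variable on the instruments, then add the first-stage residual to the outcome regression and test its coefficient. A minimal sketch on simulated data (the variables here are hypothetical, not the NHANES measures):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Hypothetical variables: x is endogenous (it shares the unobserved u
# with the outcome y), and z is an instrument for x.
z = rng.normal(0, 1, n)
u = rng.normal(0, 1, n)
x = 0.6 * z + u + rng.normal(0, 1, n)
y = 0.5 * x + u + rng.normal(0, 1, n)    # true effect of x is 0.5

# First stage: residual of x after projecting on the instrument.
Z = np.column_stack([np.ones(n), z])
v = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]

# Control-function form of the Durbin-Wu-Hausman test: add v to the
# outcome regression; a significant coefficient on v signals endogeneity.
X = np.column_stack([np.ones(n), x, v])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se_v = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[2, 2])
print(f"t on first-stage residual: {beta[2] / se_v:.1f}")
```

    A large t-statistic on the residual rejects exogeneity, which is the situation the abstract reports; as a side effect, the coefficient on `x` in this augmented regression is the IV-consistent estimate.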

  13. Estimating the effect of treatment rate changes when treatment benefits are heterogeneous: antibiotics and otitis media.

    PubMed

    Park, Tae-Ryong; Brooks, John M; Chrischilles, Elizabeth A; Bergus, George

    2008-01-01

    Contrast methods to assess the health effects of a treatment rate change when treatment benefits are heterogeneous across patients. Antibiotic prescribing for children with otitis media (OM) in Iowa Medicaid is the empirical example. Instrumental variable (IV) and linear probability model (LPM) are used to estimate the effect of antibiotic treatments on cure probabilities for children with OM in Iowa Medicaid. Local area physician supply per capita is the instrument in the IV models. Estimates are contrasted in terms of their ability to make inferences for patients whose treatment choices may be affected by a change in population treatment rates. The instrument was positively related to the probability of being prescribed an antibiotic. LPM estimates showed a positive effect of antibiotics on OM patient cure probability while IV estimates showed no relationship between antibiotics and patient cure probability. Linear probability model estimation yields the average effects of the treatment on patients that were treated. IV estimation yields the average effects for patients whose treatment choices were affected by the instrument. As antibiotic treatment effects are heterogeneous across OM patients, our estimates from these approaches are aligned with clinical evidence and theory. The average estimate for treated patients (higher severity) from the LPM model is greater than estimates for patients whose treatment choices are affected by the instrument (lower severity) from the IV models. Based on our IV estimates it appears that lowering antibiotic use in OM patients in Iowa Medicaid did not result in lost cures.
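
    The contrast the authors draw — a treated-vs-untreated comparison estimating an average effect for the treated, while IV estimates an effect for patients whose treatment responds to the instrument — can be illustrated with a toy simulation of heterogeneous benefits (hypothetical numbers, not the Iowa Medicaid data):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# Toy population: "always-takers" (severe cases, benefit 0.30) get the
# antibiotic regardless of the instrument; "compliers" (milder cases,
# benefit 0.05) get it only when local physician supply (z) is high.
z = rng.binomial(1, 0.5, n)
group = rng.choice(["always", "complier", "never"], size=n, p=[0.3, 0.3, 0.4])
d = ((group == "always") | ((group == "complier") & (z == 1))).astype(float)
benefit = np.where(group == "always", 0.30, 0.05)
y = rng.binomial(1, 0.4 + benefit * d)        # cure indicator

# Naive treated-vs-untreated gap: an average over the treated,
# dominated here by the high-benefit always-takers.
naive = y[d == 1].mean() - y[d == 0].mean()

# Wald/IV estimate: the effect for compliers only (the low-benefit group).
wald = ((y[z == 1].mean() - y[z == 0].mean())
        / (d[z == 1].mean() - d[z == 0].mean()))

print(round(naive, 3), round(wald, 3))
```

    The two estimands diverge exactly as in the abstract: the naive gap is pulled toward the severe, high-benefit patients, while the Wald ratio isolates the marginal patients whose treatment the instrument shifts.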

  14. The effects of amphetamine sensitization on conditioned inhibition during a Pavlovian-instrumental transfer task in rats

    PubMed Central

    Shiflett, Michael W.; Riccie, Meaghan; DiMatteo, RoseMarie

    2013-01-01

    Rationale Psychostimulant sensitization heightens behavioral and motivational responses to reward-associated stimuli; however, its effects on stimuli associated with reward absence are less understood. Objectives We examined whether amphetamine sensitization alters performance during Pavlovian-instrumental transfer (PIT) to conditioned excitors and inhibitors. We further sought to characterize the effects of amphetamine sensitization on learning versus performance by exposing rats to amphetamine prior to Pavlovian training or between training and test. Methods Adult male Long Evans rats were given conditioned inhibition (A+/AX−) and Pavlovian (B+) training, followed by variable-interval instrumental conditioning. Rats were sensitized to d-amphetamine (2 mg/kg daily injections for seven days), or served as non-exposed controls. Rats were given a PIT test, in which they were presented with stimulus B alone or in compound with the conditioned inhibitor (BX). Results During the PIT test, control rats significantly reduced instrumental responding on BX trials (to approximately 50% of responding to B). Amphetamine sensitization prior to Pavlovian conditioning increased lever-pressing on BX trials and reduced lever-pressing on B trials compared to controls. Amphetamine sensitization between training and test increased lever-pressing on B and BX trials compared to controls. No effects of sensitization were observed on conditioned food-cup approach. Conclusions Amphetamine sensitization increases instrumental responding during PIT to a conditioned inhibitor, by enhancing excitation of conditioned stimuli and reducing inhibition of conditioned inhibitors. PMID:23715640

  15. A study of the river velocity measurement techniques and analysis methods

    NASA Astrophysics Data System (ADS)

    Chung Yang, Han; Lun Chiang, Jie

    2013-04-01

Velocity measurement technology can be traced back to the pitot-tube method of the 18th century; today's techniques employ acoustic and radar technologies based on the Doppler principle. The aim of these developments is to find measurement methods better suited to each flow condition, to obtain more accurate data, and to derive the mean velocity using surface-velocity theory, maximum-velocity theory, and indicator theory. This article reviews the literature on river velocity measurement techniques and analysis methods, examines the applicability of the various measurement instruments, and describes the advantages and disadvantages of the different mean-velocity-profile analysis methods; the review is intended as a reference for follow-up studies of velocity measurement. The literature shows that different flow conditions call for different measurement methods, since each has its own strengths and weaknesses: traditional velocity instruments can be used at low flow, whereas RiverRAD microwave radar or imaging techniques may be applied at high flow. In tidal rivers, an ADCP can quickly measure the vertical velocity distribution; in urban rivers, CW radar can be mounted on bridges; and in wide rivers, RiverRAD microwave radar can be used to measure velocities. The literature also indicates that combining an Ultrasonic Doppler Current Profiler with Chiu's theory to automate velocity observation can save manpower and resources, improve measurement accuracy, and reduce measurement risk; however, the great variability of river characteristics in Taiwan and the large amount of floating debris at high flow mean that automated measurement still requires further study. If the safety of personnel and instruments is the priority, a non-contact velocity measurement method combined with theoretical analysis can achieve real-time monitoring.

  16. Initial Stability Assessment of S-NPP VIIRS Reflective Solar Band Calibration Using Invariant Desert and Deep Convective Cloud Targets

    NASA Technical Reports Server (NTRS)

    Bhatt, Rajendra; Doelling, David R.; Wu, Aisheng; Xiong, Xiaoxiong (Jack); Scarino, Benjamin R.; Haney, Conor O.; Gopalan, Arun

    2014-01-01

The latest CERES FM-5 instrument launched onboard the S-NPP spacecraft will use the VIIRS visible radiances from the NASA Land Product Evaluation and Analysis Tool Elements (PEATE) product for retrieving the cloud properties associated with its TOA flux measurement. In order for CERES to provide climate-quality TOA flux datasets, the retrieved cloud properties must be consistent throughout the record, which depends on the calibration stability of the VIIRS imager. This paper assesses the NASA calibration stability of the VIIRS reflective solar bands using the Libya-4 desert and deep convective clouds (DCC). The invariant targets are first evaluated for temporal natural variability. It is found for visible (VIS) bands that DCC targets have half of the variability of Libya-4. For the shortwave infrared (SWIR) bands, the desert has less variability. The brief VIIRS record and target variability inhibit high confidence in identifying any trends that are less than 0.6%/yr for most VIS bands, and 2.5%/yr for SWIR bands. None of the observed invariant target reflective solar band trends exceeded these trend thresholds. Initial assessment results show that the VIIRS data have been consistently calibrated and that the VIIRS instrument stability is similar to or better than that of the MODIS instrument.

  17. Social connections and happiness among the elder population of Taiwan.

    PubMed

    Hsu, H-C; Chang, W-C

    2015-01-01

The purpose of this study was to examine the association between social connections and happiness among members of the elder population of Taiwan. Longitudinal panel data collected in three waves from 1999 to 2007, drawn from national samples of Taiwanese older people, were used for the analysis (n = 4731 persons). Happiness was defined as a dichotomous variable. Social connection variables included living arrangements, contacts with children/grandchildren/parents/relatives/friends, telephone contacts, providing instrumental and informational support, receiving instrumental and emotional support, and social participation. We controlled for demographics, physical and mental health, economic satisfaction, and lifestyle. A generalized linear model (GLM) was applied in the analysis. Happiness remained stable over time. Receiving more emotional support and participating in social events were related to happiness at the beginning, although the effect of social participation was offset over time. Living arrangements, telephone contacts, providing social support, and receiving instrumental support were not significant. The quality of social relationships is possibly more important than the quantity of social interaction for older people, and having social relationships outside the informal social network may increase happiness.

  18. Ring Current He Ion Control by Bounce Resonant ULF Waves

    NASA Astrophysics Data System (ADS)

    Kim, Hyomin; Gerrard, Andrew J.; Lanzerotti, Louis J.; Soto-Chavez, Rualdo; Cohen, Ross J.; Manweiler, Jerry W.

    2017-12-01

Ring current energy He ion (˜65 keV to ˜520 keV) differential flux data from the Radiation Belt Storm Probe Ion Composition Experiment (RBSPICE) instrument aboard the Van Allen Probes spacecraft show considerable variability during quiet solar wind and geomagnetic time periods. Such variability is apparent from orbit to orbit (˜9 h) of the spacecraft and is observed to be ˜50-100% of the nominal flux. Using data from the Electric and Magnetic Field Instrument Suite and Integrated Science (EMFISIS) instrument, also aboard the Van Allen Probes spacecraft, we identify that a dominant source of this variability is ULF waveforms with periods of tens of seconds. These periods correspond to the bounce resonant timescales of the ring current He ions being measured by RBSPICE. A statistical survey using the particle and field data for one full spacecraft precession period (approximately 2 years) shows that the wave and He ion flux variations are generally anticorrelated, suggesting the bounce resonant pitch angle scattering process as a major component in the scattering of He ions.

  19. Alcohol consumption, beverage prices and measurement error.

    PubMed

    Young, Douglas J; Bielinska-Kwapisz, Agnieszka

    2003-03-01

Alcohol price data collected by the American Chamber of Commerce Researchers Association (ACCRA) have been widely used in studies of alcohol consumption and related behaviors. A number of problems with these data suggest that they contain substantial measurement error, which biases conventional statistical estimators toward a finding of little or no effect of prices on behavior. We test for measurement error, assess the magnitude of the bias and provide an alternative estimator that is likely to be superior. The study utilizes data on per capita alcohol consumption across U.S. states and the years 1982-1997. State and federal alcohol taxes are used as instrumental variables for prices. Formal tests strongly confirm the hypothesis of measurement error. Instrumental variable estimates of the price elasticity of demand range from -0.53 to -1.24. These estimates are substantially larger in absolute value than ordinary least squares estimates, which sometimes are not significantly different from zero or even positive. The ACCRA price data are substantially contaminated with measurement error, but using state and federal taxes as instrumental variables mitigates the problem.
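
    The mechanism described — measurement error in prices attenuating the OLS elasticity toward zero, with taxes as instruments restoring it — can be sketched in a few lines. The data and coefficients below are simulated and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical log-log setup: true log price p_star drives consumption
# with elasticity -0.8, but only a noisy price p_obs is observed.
# A tax shifter t moves p_star and is immune to the measurement error.
t = rng.normal(0, 1, n)                       # tax instrument
p_star = 0.7 * t + rng.normal(0, 1, n)        # true log price
p_obs = p_star + rng.normal(0, 1, n)          # mismeasured log price
y = -0.8 * p_star + rng.normal(0, 1, n)       # log consumption

# OLS on the mismeasured price: attenuated toward zero.
ols = np.cov(p_obs, y)[0, 1] / np.var(p_obs, ddof=1)

# Single-instrument IV (ratio form): reduced form over first stage.
iv = np.cov(t, y)[0, 1] / np.cov(t, p_obs)[0, 1]

print(round(ols, 2), round(iv, 2))
```

    Because the noise in `p_obs` inflates its variance without adding covariance with `y`, the OLS slope shrinks, while the IV ratio cancels the noise and recovers the simulated elasticity — the same pattern the study reports for ACCRA prices versus tax instruments.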

  20. Does job insecurity deteriorate health?

    PubMed

    Caroli, Eve; Godard, Mathilde

    2016-02-01

This paper estimates the causal effect of perceived job insecurity - that is, the fear of involuntary job loss - on health in a sample of men from 22 European countries. We rely on an original instrumental variable approach based on the idea that workers perceive greater job security in countries where employment is strongly protected by the law, and more so if employed in industries where employment protection legislation is more binding; that is, in industries with a higher natural rate of dismissals. Using cross-country data from the 2010 European Working Conditions Survey, we show that, when the potential endogeneity of job insecurity is not accounted for, the latter appears to deteriorate almost all health outcomes. When tackling the endogeneity issue by estimating an instrumental variable model and dealing with potential weak-instrument issues, the health-damaging effect of job insecurity is confirmed for a limited subgroup of health outcomes; namely, suffering from headaches or eyestrain and skin problems. As for other health variables, the impact of job insecurity appears to be insignificant at conventional levels. Copyright © 2014 John Wiley & Sons, Ltd.

  1. Cisplatin and Etoposide Versus Carboplatin and Paclitaxel With Concurrent Radiotherapy for Stage III Non–Small-Cell Lung Cancer: An Analysis of Veterans Health Administration Data

    PubMed Central

    Santana-Davila, Rafael; Devisetty, Kiran; Szabo, Aniko; Sparapani, Rodney; Arce-Lara, Carlos; Gore, Elizabeth M.; Moran, Amy; Williams, Christina D.; Kelley, Michael J.; Whittle, Jeffrey

    2015-01-01

Purpose The optimal chemotherapy regimen to use with radiotherapy in stage III non–small-cell lung cancer is unknown. Here, we compare the outcomes of patients treated within the Veterans Health Administration with either etoposide-cisplatin (EP) or carboplatin-paclitaxel (CP). Methods We identified patients treated with EP and CP with concurrent radiotherapy from 2001 to 2010. Survival rates were compared using Cox proportional hazards regression models with adjustments for confounding provided by propensity score methods and an instrumental variables analysis. Comorbidities and treatment complications were identified through administrative data. Results A total of 1,842 patients were included; EP was used in 27% (n = 499). Treatment with EP was not associated with a survival advantage in a Cox proportional hazards model (hazard ratio [HR], 0.97; 95% CI, 0.85 to 1.10), a propensity score matched cohort (HR, 1.07; 95% CI, 0.91 to 1.24), or a propensity score adjusted model (HR, 0.97; 95% CI, 0.85 to 1.10). In an instrumental variables analysis, there was no survival advantage for patients treated in centers where EP was used more than 50% of the time as compared with centers where EP was used in less than 10% of the patients (HR, 1.07; 95% CI, 0.90 to 1.26). Patients treated with EP, compared with patients treated with CP, had more hospitalizations (2.4 v 1.7 hospitalizations, respectively; P < .001), outpatient visits (17.6 v 12.6 visits, respectively; P < .001), infectious complications (47.3% v 39.4%, respectively; P = .0022), acute kidney disease/dehydration (30.5% v 21.2%, respectively; P < .001), and mucositis/esophagitis (18.6% v 14.4%, respectively; P = .0246). Conclusion After accounting for prognostic variables, patients treated with EP versus CP had similar overall survival, but EP was associated with increased morbidity. PMID:25422491

  2. The impact of sub-skills and item content on students' skills with regard to the control-of-variables strategy

    NASA Astrophysics Data System (ADS)

    Schwichow, Martin; Christoph, Simon; Boone, William J.; Härtig, Hendrik

    2016-01-01

The so-called control-of-variables strategy (CVS) incorporates the important scientific reasoning skills of designing controlled experiments and interpreting experimental outcomes. As CVS is a prominent component of science standards, appropriate assessment instruments are required to measure these scientific reasoning skills and to evaluate the impact of instruction on CVS development. A detailed review of existing CVS instruments suggests that their items draw on different subsets, and only a few, of the four critical CVS sub-skills. This study presents a new CVS assessment instrument (the CVS Inventory, CVSI) and investigates the validity of student measures derived from this instrument using Rasch analyses. The results indicate that the CVSI produces reliable and valid student measures with regard to CVS. Furthermore, the results show that item difficulty depends on the CVS sub-skills used in item development, but not on the item content. Accordingly, previous instruments that are restricted to a few CVS sub-skills tend to over- or underestimate students' CVS skills. In addition, these results indicate that students are able to use CVS as a domain-general strategy in multiple content areas. Consequences for science instruction and assessment are discussed.
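
    The Rasch analyses referred to above model the probability of a correct response as a logistic function of the gap between person ability and item difficulty, which is what lets item difficulty be compared across sub-skills on a common scale. A minimal sketch of the dichotomous Rasch model (illustrative values only):

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: probability that a person of ability
    theta succeeds on an item of difficulty b (same logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Items built on harder CVS sub-skills would carry a higher difficulty b,
# lowering the success probability at any fixed ability level.
print(round(rasch_p(0.0, -1.0), 3))   # easier item: ~0.731
print(round(rasch_p(0.0, 1.0), 3))    # harder item: ~0.269
```

    When ability equals difficulty the model gives exactly 0.5, which is the anchoring property used to place persons and items on one scale.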

  3. Genetic Instrumental Variable Studies of Effects of Prenatal Risk Factors

    PubMed Central

    von Hinke Kessler Scholder, Stephanie

    2013-01-01

    Identifying the effects of maternal risk factors during pregnancy on infant and child health is an area of tremendous research interest. However, of interest to policy makers is unraveling the causal effects of prenatal risk factors, not their associations with child health, which may be confounded by several unobserved factors. In this paper, we evaluate the utility of genetic variants in three genes that have unequivocal evidence of being related to three major risk factors – CHRNA3 for smoking, ADH1B for alcohol use, and FTO for obesity – as instrumental variables for identifying the causal effects of such factors during pregnancy. Using two independent datasets, we find that these variants are overall predictive of the risk factors and are not systematically related to observed confounders, suggesting that they may be useful instruments. We also find some suggestive evidence that genetic effects are stronger during than before pregnancy. We provide an empirical example illustrating the use of these genetic variants as instruments to evaluate the effects of risk factors on birth weight. Finally, we offer suggestions for researchers contemplating the use of these variants as instruments. PMID:23701534

  4. Association of Body Mass Index with Depression, Anxiety and Suicide-An Instrumental Variable Analysis of the HUNT Study.

    PubMed

    Bjørngaard, Johan Håkon; Carslake, David; Lund Nilsen, Tom Ivar; Linthorst, Astrid C E; Davey Smith, George; Gunnell, David; Romundstad, Pål Richard

    2015-01-01

While high body mass index is associated with an increased risk of depression and anxiety, cumulative evidence indicates that it is a protective factor for suicide. The associations from conventional observational studies of body mass index with mental health outcomes are likely to be influenced by reverse causality or confounding by ill-health. In the present study, we investigated the associations between offspring body mass index and parental anxiety, depression and suicide in order to avoid problems with reverse causality and confounding by ill-health. We used data from 32,457 mother-offspring and 27,753 father-offspring pairs from the Norwegian HUNT-study. Anxiety and depression were assessed using the Hospital Anxiety and Depression Scale and suicide death from national registers. Associations between offspring and own body mass index and symptoms of anxiety and depression and suicide mortality were estimated using logistic and Cox regression. Causal effect estimates were obtained with a two-sample instrumental variable approach using offspring body mass index as an instrument for parental body mass index. Both own and offspring body mass index were positively associated with depression, while the results did not indicate any substantial association between body mass index and anxiety. Although precision was low, suicide mortality was inversely associated with own body mass index and the results from the analysis using offspring body mass index supported these results. Adjusted odds ratios per standard deviation of body mass index from the instrumental variable analysis were 1.22 (95% CI: 1.05, 1.43) for depression and 1.10 (95% CI: 0.95, 1.27) for anxiety, and the instrumental variable hazard ratio for suicide was 0.69 (95% CI: 0.30, 1.63). The present study's results indicate that suicide mortality is inversely associated with body mass index.
We also found support for a positive association between body mass index and depression, but not for anxiety.
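
    The two-sample instrumental variable approach used here reduces, in its simplest form, to a Wald ratio: the instrument-outcome association from one sample divided by the instrument-exposure association from another. A schematic simulation (hypothetical coefficients, not the HUNT estimates):

```python
import numpy as np

def sample(n, seed):
    """One hypothetical sample: parental BMI x, offspring BMI g (the
    instrument, correlated with x through shared genes), outcome y."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0, 1, n)              # parental BMI (standardized)
    g = 0.5 * x + rng.normal(0, 1, n)    # offspring BMI
    y = 0.2 * x + rng.normal(0, 1, n)    # outcome; true effect of x is 0.2
    return g, x, y

def slope(a, b):                          # simple regression slope of b on a
    return np.cov(a, b)[0, 1] / np.var(a, ddof=1)

g1, x1, _ = sample(30_000, 1)             # sample 1: instrument-exposure leg
g2, _, y2 = sample(30_000, 2)             # sample 2: instrument-outcome leg
wald = slope(g2, y2) / slope(g1, x1)
print(round(wald, 2))
```

    The ratio recovers a value near the simulated effect of 0.2 even though no single sample contains both the exposure and the outcome, which is the practical appeal of the two-sample design.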

  5. NIST/ISAC standardization study: variability in assignment of intensity values to fluorescence standard beads and in cross calibration of standard beads to hard dyed beads.

    PubMed

    Hoffman, Robert A; Wang, Lili; Bigos, Martin; Nolan, John P

    2012-09-01

Results from a standardization study cosponsored by the International Society for Advancement of Cytometry (ISAC) and the US National Institute of Standards and Technology (NIST) are reported. The study evaluated the variability of assigning intensity values to fluorophore standard beads by bead manufacturers and the variability of cross calibrating the standard beads to stained polymer beads (hard-dyed beads) using different flow cytometers. Hard-dyed beads are generally not spectrally matched to the fluorophores used to stain cells, and spectral response varies among flow cytometers. Thus if hard-dyed beads are used as fluorescence calibrators, one expects calibration for specific fluorophores (e.g., FITC or PE) to vary among different instruments. Using standard beads surface-stained with specific fluorophores (FITC, PE, APC, and Pacific Blue™), the study compared the measured intensity of fluorophore standard beads to that of hard-dyed beads through cross calibration on 133 different flow cytometers. With robust CV as the measure of variability, the variation of cross-calibrated values was typically 20% or more for a particular hard-dyed bead in a specific detection channel. The variation across different instrument models was often greater than the variation within a particular instrument model. As a separate part of the study, NIST and four bead manufacturers used a NIST-supplied protocol and calibrated fluorophore solution standards to assign intensity values to the fluorophore beads. Values assigned to the reference beads by different groups varied by orders of magnitude in most cases, reflecting differences in instrumentation used to perform the calibration. The study concluded that the use of any spectrally unmatched hard-dyed bead as a general fluorescence calibrator must be verified and characterized for every particular instrument model.
Close interaction between bead manufacturers and NIST is recommended to have reliable and uniformly assigned fluorescence standard beads. Copyright © 2012 International Society for Advancement of Cytometry.
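
    The robust CV used above as the variability measure is commonly defined in flow cytometry software from the 15.87th and 84.13th intensity percentiles; the study does not give its exact formula, so treat this as one common convention rather than the authors' code:

```python
import numpy as np

def robust_cv(x):
    """Robust CV (%) in a common percentile form: half the
    15.87-84.13 percentile spread divided by the median."""
    p16, p84 = np.percentile(x, [15.87, 84.13])
    return 100.0 * 0.5 * (p84 - p16) / np.median(x)

rng = np.random.default_rng(4)
clean = rng.normal(1000, 50, 100_000)                 # 5% true CV
with_tail = np.concatenate([clean, np.full(500, 10_000.0)])

print(round(robust_cv(clean), 1))       # ~5.0, matches the ordinary CV
print(round(robust_cv(with_tail), 1))   # still ~5: ignores the bright tail
```

    For a Gaussian intensity distribution the chosen percentiles span ±1 standard deviation, so the robust CV matches the ordinary CV while remaining insensitive to outlier events such as aggregates.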

  6. A Millennial Record of Rainfall And ENSO Variability in Stalagmites From a Mid-Ocean Island in the South Pacific

    NASA Astrophysics Data System (ADS)

    Aharon, P.; Rasbury, M. S.; Lambert, W. J.; Ghaleb, B.; Lambert, L.

    2005-12-01

Improved understanding of the ocean-atmosphere interactions that control interdecadal ENSO variability has recently prompted renewed interest in the acquisition of highly resolved proxy ENSO records. Corals possessing annual growth increments have extended the ENSO archive several centuries beyond the existing instrumental data, but much longer records are needed to constrain the interdecadal periodicities and unravel their underlying causes. To this end, paleoclimate proxies archived in stalagmites from tropical Pacific settings have not yet been harnessed for ENSO paleo-reconstructions, although stalagmites elsewhere have offered valuable paleoclimate insights. Here we report the results of an investigation of stalagmites from a water-table cave on Niue Island in the South Pacific (19o 00' S; 169o 50' W), located at the epicenter of oceanic ENSO. Century-long instrumental records on Niue provide a frame of reference and indicate that the interannual and interdecadal air temperature variability is negligible but the rainfall is fully engaged in the wheels of ENSO, such that El-Niño and La-Niña events correspond with droughts and abundant rainfall, respectively. Seasonal monsoon and trade rainfalls exhibit a marked contrast in their oxygen isotope compositions. Rainfall amount governs microbial soil activities, resulting in convergent 18O and 13C depletions and enrichments in the drips that are transferred to the calcite stalagmites in the Niuean caves. A detailed study of four actively growing stalagmites whose chronology overlaps with the instrumental records confirms that interannual and decadal-scale ENSO variability is clearly expressed in the annual couplet widths and stable oxygen and carbon isotope time-series records of continuously layered stalagmites. Acquisition of a chronology for the USM1 stalagmite posed radiometric dating challenges. The U concentration, in the range of 44.2 to 97.5 ppb, is relatively low by comparison with typical stalagmite values.
Therefore dating by 230Th/234U method was impractical considering the youthfulness of the stalagmite and the amount of available material. Three dating techniques were used to derive a robust chronology: (i) 226Ra/234U by TIMS; (ii) radiocarbon by AMS, and (iii) couplets counting. In conjunction, the three dating methods indicate that the 160 mm stalagmite section spans a time interval from 1540 to 360 years AD. Interannual and interdecadal-scale variability are the largest components of variance in the millennial-long oxygen isotope time series. Processed in the frequency domain to quantify the variance, the data yield mean periodicities of 5.5 yrs and 30 yrs thus matching modern interannual ENSO and interdecadal IPO periodicity bands. Importantly, previously unidentified cycles of about 200 yrs and 500years duration are clearly discerned in both the oxygen and carbon isotope records. The low frequency cycles exhibit phase alternations between strong ENSO events manifested in severe droughts that are succeeded by rare ENSO events and abundant rainfall. Phase transitions occurred at about 1500, 1300, 1100, 900 and 500 yrs AD. The new millennial record of ENSO offers valuable data of interest to the climate dynamics community investigating the factors controlling ENSO variability through time.
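    The frequency-domain step described above, extracting dominant periodicities from an annual isotope series, can be sketched with a simple FFT periodogram. The data below are synthetic (the stalagmite series itself is not reproduced here), built to contain the 5.5-yr and 30-yr cycles the abstract reports:

```python
import numpy as np

def dominant_periods(series, dt=1.0, k=2):
    """Return the k strongest periodicities (in units of dt) of an evenly
    sampled, detrended time series, via the FFT periodogram."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)
    order = np.argsort(power[1:])[::-1] + 1  # skip the zero-frequency bin
    return [1.0 / freqs[i] for i in order[:k]]

# Synthetic annual series (~1.1 kyr of couplets) with 5.5-yr and 30-yr cycles
rng = np.random.default_rng(0)
t = np.arange(1100)
x = (np.sin(2 * np.pi * t / 5.5)
     + 0.8 * np.sin(2 * np.pi * t / 30)
     + 0.3 * rng.standard_normal(t.size))
periods = sorted(dominant_periods(x))
```

    A real analysis would use methods suited to age-model uncertainty (e.g. spectral analysis with confidence levels against a red-noise background), but the bin-picking above captures the basic operation.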

  7. Single pilot scanning behavior in simulated instrument flight

    NASA Technical Reports Server (NTRS)

    Pennington, J. E.

    1979-01-01

    A simulation of tasks associated with single-pilot general aviation flight under instrument flight rules was conducted as a baseline for future research on advanced flight controls and avionics. The tasks, ranging from simple climbs and turns to an instrument landing system approach, were flown on a fixed-base simulator. During the simulation, the control inputs, state variables, and the pilot's visual scan pattern, including point of regard, were measured and recorded.

  8. 500-year April-September droughts in the Czech Lands based on documentary data and instrumental records

    NASA Astrophysics Data System (ADS)

    Řezníčková, Ladislava; Brázdil, Rudolf; Trnka, Miroslav; Dobrovolný, Petr; Kotyza, Oldřich; Štěpánek, Petr; Zahradníček, Pavel; Valášek, Hubert

    2013-04-01

    This paper analyses the temporal and spatial variability of April-September (vegetation-period) droughts in the Czech Lands over the last 500 years. The study is based on different types of documentary data (e.g. chronicles, newspapers, economic sources, weather diaries) covering the pre-instrumental period AD 1501-1804 and on systematic instrumental meteorological measurements thereafter. The historical-climatological database of the Czech Lands is used to study the duration and intensity of drought episodes, based on series of precipitation indices created from documentary data on a seven-degree scale from -3 (extremely dry) to +3 (extremely wet). For the instrumental period 1805-2012, Palmer Z-index and PDSI series for the mean Czech temperature and precipitation series are used (calculated from homogeneous series of 10 and 14 stations, respectively). A 500-year chronology of drought episodes derived from documentary and instrumental data is then compiled, and the temporal (frequency, seasonality and intensity) and spatial variability of droughts in the Czech Lands from AD 1501 is analysed. The most outstanding drought events are selected and analysed in detail, also with respect to their human impacts. The results obtained for the Czech Lands are compared with drought episodes known in Central Europe from other studies and are evaluated with respect to climate variability in Central Europe during the last 500 years (this research is supported by projects InterDrought no. CZ.1.07/2.3.00/20.0248 and GA CR no. P209/11/0956).
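    Compiling drought episodes from an index series like the seven-degree documentary scale amounts to grouping consecutive dry years and summing their intensities. A minimal sketch (the function and its inputs are hypothetical, not the study's code):

```python
def drought_episodes(indices, years):
    """Group consecutive dry years (index < 0) into episodes.
    Returns (start_year, duration_in_years, cumulative_intensity) tuples,
    where intensity is the sum of the negative indices over the episode."""
    episodes, start, total = [], None, 0
    for year, idx in zip(years, indices):
        if idx < 0:
            if start is None:
                start, total = year, 0
            total += idx
        elif start is not None:
            episodes.append((start, year - start, total))
            start = None
    if start is not None:  # series ends inside a drought
        episodes.append((start, years[-1] - start + 1, total))
    return episodes

# Hypothetical index values for AD 1501-1507
episodes = drought_episodes([0, -1, -2, 1, -3, -3, 0], list(range(1501, 1508)))
```

    Here a two-year episode of cumulative intensity -6 would rank as more severe than one of -3, mirroring the duration-and-intensity characterization described in the abstract.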

  9. Development of a Computer-Based Survey Instrument for Organophosphate and N-Methyl-Carbamate Exposure Assessment among Agricultural Pesticide Handlers

    PubMed Central

    Hofmann, Jonathan N.; Checkoway, Harvey; Borges, Ofelio; Servin, Flor; Fenske, Richard A.; Keifer, Matthew C.

    2010-01-01

    Background: Assessment of occupational pesticide exposures based on self-reported information can be challenging, particularly with immigrant farm worker populations, for whom specialized methods are needed to address language and cultural barriers and account for limited literacy. An audio computer-assisted self-interview (A-CASI) survey instrument was developed to collect information about organophosphate (OP) and N-methyl-carbamate (CB) exposures and other personal characteristics among male agricultural pesticide handlers for an ongoing cholinesterase biomonitoring study in Washington State. Objectives: To assess the feasibility of collecting data with the A-CASI instrument and to evaluate reliability for a subset of survey items. Methods: The survey consisted of 64 items administered in Spanish or English on a touch-screen tablet computer. Participants listened to digitally recorded questions on headphones and selected responses on the screen, most of which were displayed as images or icons to facilitate participation by low-literacy respondents. From 2006 to 2008, a total of 195 participants completed the survey during the OP/CB application seasons on at least one occasion. Percent agreement and kappa coefficients were calculated to evaluate test–retest reliability for selected characteristics among 45 participants who completed the survey on two separate occasions within the same year. Results: Almost all participants self-identified as Hispanic or Latino (98%), and 97% completed the survey in Spanish. Most participants completed the survey in half an hour or less, with minimal assistance from on-site research staff. Analyses of test–retest reliability showed substantial agreement for most demographic, work history, and health characteristics and at least moderate agreement for most variables related to personal protective equipment use during pesticide applications.
Conclusions: This A-CASI survey instrument is a novel method that has been used successfully to collect information about OP/CB exposures and other personal characteristics among Spanish-speaking agricultural pesticide handlers. PMID:20413416
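    The test–retest statistics named above are standard. A minimal sketch of Cohen's kappa for a single categorical survey item, with made-up responses rather than the study's data:

```python
from collections import Counter

def cohens_kappa(ratings1, ratings2):
    """Chance-corrected test-retest agreement for one categorical item.
    Assumes expected chance agreement < 1."""
    n = len(ratings1)
    observed = sum(a == b for a, b in zip(ratings1, ratings2)) / n
    c1, c2 = Counter(ratings1), Counter(ratings2)
    # Expected agreement if the two administrations were independent
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical answers to one yes/no PPE item on two occasions
kappa = cohens_kappa(["yes", "yes", "no", "no"], ["yes", "no", "no", "no"])
```

    By the usual Landis–Koch benchmarks, kappa above 0.6 is "substantial" and 0.4–0.6 "moderate" agreement, the terms used in the abstract's Results.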

  10. IECON '87: Industrial applications of control and simulation; Proceedings of the 1987 International Conference on Industrial Electronics, Control, and Instrumentation, Cambridge, MA, Nov. 3, 4, 1987

    NASA Technical Reports Server (NTRS)

    Hartley, Tom T. (Editor)

    1987-01-01

    Recent advances in control-system design and simulation are discussed in reviews and reports. Among the topics considered are fast algorithms for generating near-optimal binary decision programs, trajectory control of robot manipulators with compensation of load effects via a six-axis force sensor, matrix integrators for real-time simulation, a high-level control language for an autonomous land vehicle, and a practical engineering design method for stable model-reference adaptive systems. Also addressed are the identification and control of flexible-limb robots with unknown loads, adaptive control and robust adaptive control for manipulators with feedforward compensation, adaptive pole-placement controllers with predictive action, variable-structure strategies for motion control, and digital signal-processor-based variable-structure controls.

  11. Models of Solar Irradiance Variability and the Instrumental Temperature Record

    NASA Technical Reports Server (NTRS)

    Marcus, S. L.; Ghil, M.; Ide, K.

    1998-01-01

    The effects of decade-to-century (Dec-Cen) variations in total solar irradiance (TSI) on global mean surface temperature Ts during the pre-Pinatubo instrumental era (1854-1991) are studied by using two different proxies for TSI and a simplified version of the IPCC climate model.

  12. Probe-evoked event-related potential techniques for evaluating aspects of attention and information processing

    NASA Technical Reports Server (NTRS)

    Stern, John A.

    1988-01-01

    The study of probe event-related potentials (probe ERPs) is reviewed. Several recent experiments are described that seem to leave in doubt the usefulness of applying ERPs to simulation and field conditions as well as laboratory situations: relatively minor changes in the experimental paradigm can produce major shifts in ERP findings, for reasons that are not clear. However, task-elicited ERPs might be used in a flight simulator if the experimenter takes the time of arrival of the eyes on a particular instrument as one variable of concern and dwell time on the instrument as a second. One can then look at ERPs triggered by saccade termination for fixation pauses of specified durations. It may well be that the ERP to a momentarily important display will differ from that elicited by a routine instrument check.

  13. Tree ring reconstructed rainfall over the southern Amazon Basin

    NASA Astrophysics Data System (ADS)

    Lopez, Lidio; Stahle, David; Villalba, Ricardo; Torbenson, Max; Feng, Song; Cook, Edward

    2017-07-01

    Moisture-sensitive tree-ring chronologies of Centrolobium microchaete have been developed from seasonally dry forests in the southern Amazon Basin and used to reconstruct wet-season rainfall totals from 1799 to 2012, adding over 150 years of rainfall estimates to the short instrumental record for the region. The reconstruction is correlated with the same atmospheric variables that influence the instrumental measurements of wet-season rainfall: anticyclonic circulation over midlatitude South America promotes equatorward surges of cold, relatively dry extratropical air that converge with warm moist air to form deep convection and heavy rainfall over this sector of the southern Amazon Basin. Notable droughts and pluvials are reconstructed during the preinstrumental nineteenth and early twentieth centuries, but the tree-ring reconstruction suggests that the strong multidecadal variability in instrumental and reconstructed wet-season rainfall after 1950 may have been unmatched since 1799.
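    Reconstructions of this kind rest on a transfer function calibrated over the period where tree rings and instrumental rainfall overlap. A least-squares sketch with illustrative numbers (not the study's data or method details):

```python
import numpy as np

def calibrate(chronology, rainfall):
    """Least-squares calibration of ring-width indices against instrumental
    wet-season rainfall over their common period; returns a transfer function."""
    slope, intercept = np.polyfit(chronology, rainfall, 1)
    return lambda rwi: slope * np.asarray(rwi, dtype=float) + intercept

# Hypothetical overlap period: ring-width index vs. observed rainfall (mm)
rwi_obs = np.array([0.8, 1.0, 1.2, 0.9, 1.1])
rain_obs = np.array([900.0, 1000.0, 1150.0, 950.0, 1080.0])
model = calibrate(rwi_obs, rain_obs)
# Apply the transfer function to preinstrumental ring widths
reconstructed = model([0.7, 1.3])
```

    Real dendroclimatic practice adds split-period verification (e.g. reduction-of-error statistics) before extending the model back in time; the sketch shows only the calibration step.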

  14. Effects of variables upon pyrotechnically induced shock response spectra

    NASA Technical Reports Server (NTRS)

    Smith, J. L.

    1986-01-01

    Throughout the aerospace industry, large variations of 50 percent (6 dB) or more are continually noted in shock response spectra (SRS) generated by linear shaped charges (LSC), both in flight data (from the same location on different flights) and in plate tests (side-by-side measurements on the same test). A research program was developed to investigate the causes of these large SRS variations. A series of ball-drop calibration tests to verify accelerometer calibration and a series of plate tests to investigate charge and assembly variables were performed. The resulting data were analyzed to determine whether, and to what degree, manufacturing and assembly variables, distance from the shock source, data acquisition instrumentation, and shock energy propagation affect the SRS. LSC variables consisted of core load, standoff, and apex angle; the assembly variable was the torque on the LSC holder; other variables were accelerometer distance from the source, accelerometer mounting methods, and joint effects. Results indicated that LSC variables did not affect the SRS as long as the plate was severed. Accelerometers mounted on mounting blocks showed significantly lower levels above 5000 Hz. Lap joints did not affect SRS levels. The test plate was mounted in an almost free-free state; therefore, distance from the source did not affect the SRS. Several varieties and brands of accelerometers were used, and all but one demonstrated very large variations in SRS.
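    An SRS is the peak response of a bank of damped single-degree-of-freedom oscillators driven by the measured acceleration. A minimal time-domain sketch (semi-implicit Euler integration, 5% damping assumed, with a hypothetical half-sine input rather than pyrotechnic data):

```python
import numpy as np

def srs(accel, dt, freqs, damping=0.05):
    """Shock response spectrum: for each natural frequency, the peak
    absolute acceleration of a damped SDOF oscillator whose base
    acceleration is the input history."""
    peaks = []
    for fn in freqs:
        wn = 2.0 * np.pi * fn
        z = v = 0.0          # relative displacement and velocity
        peak = 0.0
        for a in accel:
            # semi-implicit Euler on  z'' + 2*zeta*wn*z' + wn^2*z = -a
            v += dt * (-a - 2.0 * damping * wn * v - wn * wn * z)
            z += dt * v
            # absolute acceleration of the mass = -(2*zeta*wn*z' + wn^2*z)
            peak = max(peak, abs(2.0 * damping * wn * v + wn * wn * z))
        peaks.append(peak)
    return np.array(peaks)

# Hypothetical input: a 1 ms unit half-sine pulse, padded so the
# oscillator's residual ringing is captured
dt = 1e-5
pulse = np.concatenate([np.sin(np.pi * np.arange(0.0, 1e-3, dt) / 1e-3),
                        np.zeros(400)])
spectrum = srs(pulse, dt, [500.0, 1000.0])
```

    Production SRS codes use exact recursive (ramp-invariant) filters rather than explicit integration, but the definition being computed is the same.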

  15. Thai Elementary School Teachers' English Pronunciation and Effects of Teacher Variables: Professional Development

    ERIC Educational Resources Information Center

    Kanoksilapatham, Budsaba

    2014-01-01

    The objectives of this study are to describe 147 Thai elementary school teachers' English pronunciation competence and to identify a teacher variable that has an impact on their pronunciation. The instrument used to collect data consisted of two parts: a questionnaire to elicit Thai teachers' personal information (i.e., seven variables in all),…

  16. Self-Concept and Response Variability as Predictors of Leadership Effectiveness in Cooperative Extension.

    ERIC Educational Resources Information Center

    Dvorak, Charles F.

    The research aimed at determining the extent to which two variables, self-concept and response variability, are related to one of the principal components of Fiedler's Contingency Model of leadership, the Esteem for the Least Preferred Coworker (LPC) instrument. Sixty extension workers in the Expanded Food and Nutrition Education Program in New…

  17. An Analysis of Decision Making in a Military Population

    DTIC Science & Technology

    1985-03-01

    [Abstract not available; only fragments of the report's tables survived extraction: a Myers-Briggs type frequency table (ISTJ the most common type, 22 respondents, 51%; ESTJ next at 6, 14%) and contents entries for quadrant theory variables and predicted relationships between pairs of instruments.]

  18. Error Budget for a Calibration Demonstration System for the Reflected Solar Instrument for the Climate Absolute Radiance and Refractivity Observatory

    NASA Technical Reports Server (NTRS)

    Thome, Kurtis; McCorkel, Joel; McAndrew, Brendan

    2013-01-01

    A goal of the Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission is to observe high-accuracy, long-term climate change trends over decadal time scales. The key to such a goal is improving the accuracy of SI-traceable absolute calibration across infrared and reflected solar wavelengths, allowing climate change to be separated from the limit of natural variability. The advances required to reach the on-orbit absolute accuracy that allows climate change observations to survive data gaps exist at NIST in the laboratory, but it still needs to be demonstrated that these advances can move successfully from NIST to NASA and/or instrument vendor capabilities for spaceborne instruments. The current work describes the radiometric calibration error budget for the Solar, Lunar for Absolute Reflectance Imaging Spectroradiometer (SOLARIS), which is the calibration demonstration system (CDS) for the reflected solar portion of CLARREO. The goal of the CDS is to allow the testing and evaluation of calibration approaches, alternate design and/or implementation approaches, and components for the CLARREO mission. SOLARIS also provides a test-bed for detector technologies, non-linearity determination and uncertainties, and application of future technology developments and suggested spacecraft instrument design modifications. The resulting SI-traceable error budget for reflectance retrieval using solar irradiance as a reference, and methods for laboratory-based absolute calibration suitable for climate-quality data collections, are given. Key components in the error budget are geometry differences between the solar and earth views, knowledge of attenuator behavior when viewing the sun, and sensor behavior such as detector linearity and noise. Methods for demonstrating this error budget are also presented.

  19. Synthesizing US Colonial Climate: Available Data and a "Proxy Adjustment" Method

    NASA Astrophysics Data System (ADS)

    Zalzal, K. S.; Munoz-Hernandez, A.; Arrigo, J. S.

    2008-12-01

    Climate and its variability are primary drivers of hydrologic systems. A paucity of instrumental data makes reconstructing seventeenth- and eighteenth-century climatic conditions along the Northeast corridor difficult, yet this information is necessary if we are to understand the conditions, changes, and interactions society had with hydrosystems during this first period of permanent European settlement. For this period (approx. 1600-1800) there are instrumental records for some regions, such as annual temperature and precipitation data for Philadelphia beginning in 1738 and for Cambridge, Mass., from 1747 to 1776, and temperature for New Haven, Conn., from 1780 to 1800. There are also paleorecords, including tree-ring analyses and sediment-core examinations of pollen and overwash deposits, and historical accounts of extreme weather events. Our analyses show that correlating even the available data is less than straightforward. To produce a "best track" climate record, we introduce a new method of "paleoadjustment" as a means to characterize climate statistical properties, as opposed to a strict reconstruction. Combining the instrumental record with the paleorecord, we estimated two sets of climate forcings for use in colonial hydrology study. The first used a later instrumental record (1817-1917) from Baltimore, Md., statistically adjusted in 20-year windows to match trends in the paleorecords and anecdotal evidence from the Middle Colonies and Chesapeake Bay region. The second was a regression reconstruction for New England using climate indices developed from journal records and the Cambridge, Mass., instrumental record. The two climate reconstructions were used to compute the annual potential water yield over the 200-year period of interest. A comparison of these results allowed us to make preliminary conclusions regarding the effect of climate on hydrology during the colonial period.
We contend that an understanding of historical hydrology will improve our ability to predict and react to changes in global water resources.
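    The windowed "paleoadjustment" idea, shifting 20-year blocks of an instrumental series so their means track paleo-derived targets while preserving within-block variability, can be sketched as follows (function name and inputs are hypothetical):

```python
import numpy as np

def paleo_adjust(instrumental, target_window_means, window=20):
    """Shift each `window`-year block of an instrumental series so its mean
    matches a paleo-derived target, preserving within-block variability."""
    x = np.asarray(instrumental, dtype=float).copy()
    for i, target in enumerate(target_window_means):
        block = slice(i * window, (i + 1) * window)
        x[block] += target - x[block].mean()  # additive mean shift only
    return x

# Hypothetical series: 40 years of data, two 20-yr paleo target means
adjusted = paleo_adjust(np.arange(40.0), [0.0, 100.0])
```

    Only the block means are adjusted; interannual anomalies within each block pass through unchanged, which is the "characterize statistical properties rather than strictly reconstruct" intent described above.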

  20. [Study of near infrared spectral preprocessing and wavelength selection methods for endometrial cancer tissue].

    PubMed

    Zhao, Li-Ting; Xiang, Yu-Hong; Dai, Yin-Mei; Zhang, Zhuo-Yong

    2010-04-01

    Near-infrared spectroscopy was applied to tissue slices of endometrial samples to collect spectra. A total of 154 spectra were obtained from 154 samples; the numbers of normal, hyperplasia, and malignant samples were 36, 60, and 58, respectively. Raw near-infrared spectra comprise many variables and contain interference such as instrument error and physical effects including particle size and light scatter. To reduce these influences, the original spectra should be treated with spectral preprocessing methods to compress variables and extract useful information, so the choice of preprocessing and wavelength-selection methods plays an important role in near-infrared spectroscopy. In the present paper, the raw spectra were processed with various preprocessing methods, including first derivative, multiplicative scatter correction, the Savitzky-Golay first-derivative algorithm, standard normal variate, smoothing, and moving-window median. The standard deviation was used to select the optimal spectral region of 4 000-6 000 cm(-1). Principal component analysis was then used for classification; its results showed that the three types of samples could be discriminated completely, with accuracy of almost 100%. This study demonstrated that near-infrared spectroscopy combined with chemometric methods could be a fast, efficient, and novel means of diagnosing cancer. The proposed methods would be a promising and significant diagnostic technique for early-stage cancer.
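    Two of the steps named above, standard normal variate correction and principal component analysis, are standard chemometric operations. A minimal numpy sketch on synthetic spectra (not the paper's data):

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: center and scale each spectrum individually
    to suppress particle-size and light-scatter effects."""
    x = np.asarray(spectra, dtype=float)
    return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

def pca_scores(x, n_components=2):
    """Project mean-centered spectra onto their leading principal components
    (via SVD of the centered data matrix)."""
    xc = x - x.mean(axis=0)
    _, _, vt = np.linalg.svd(xc, full_matrices=False)
    return xc @ vt[:n_components].T

# Synthetic stand-in: 10 spectra of 50 wavelengths with a common baseline
rng = np.random.default_rng(1)
spectra = rng.random((10, 50)) + np.linspace(0.0, 1.0, 50)
scores = pca_scores(snv(spectra), 2)
```

    Plotting the first two score columns, colored by class, is the usual way such three-group separation is visualized.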
