Sample records for regression analyses applied

  1. Ecologic regression analysis and the study of the influence of air quality on mortality.

    PubMed Central

    Selvin, S; Merrill, D; Wong, L; Sacks, S T

    1984-01-01

    This presentation focuses entirely on the use and evaluation of regression analysis applied to ecologic data as a method to study the effects of ambient air pollution on mortality rates. Using extensive national data on mortality, air quality and socio-economic status, regression analyses are used to study the influence of air quality on mortality. The analytic methods and data are selected in such a way that direct comparisons can be made with other ecologic regression studies of mortality and air quality. Analyses are performed by use of two types of geographic areas, age-specific mortality of both males and females and three pollutants (total suspended particulates, sulfur dioxide and nitrogen dioxide). The overall results indicate that no persuasive evidence exists of a link between air quality and general mortality levels. Additionally, a lack of consistency between the present results and previously published work is noted. Overall, it is concluded that linear regression analysis applied to nationally collected ecologic data cannot be used to usefully infer a causal relationship between air quality and mortality, which is in direct contradiction to other major published studies. PMID:6734568

  2. A tutorial on the piecewise regression approach applied to bedload transport data

    Treesearch

    Sandra E. Ryan; Laurie S. Porth

    2007-01-01

    This tutorial demonstrates the application of piecewise regression to bedload data to define a shift in phase of transport so that the reader may perform similar analyses on available data. The use of piecewise regression analysis implicitly recognizes different functions fit to bedload data over varying ranges of flow. The transition from primarily low rates of sand...
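
    For readers who want to reproduce this on their own data, the short Python sketch below (not from the tutorial) fits a continuous two-segment linear model to synthetic flow/bedload data by grid-searching the breakpoint; all names and values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
flow = np.sort(rng.uniform(0.5, 5.0, 120))            # discharge (hypothetical units)
bedload = np.where(flow < 2.5, 0.2 * flow,
                   0.2 * 2.5 + 1.5 * (flow - 2.5))    # two transport phases
bedload += rng.normal(0, 0.15, flow.size)             # measurement noise

def fit_piecewise(x, y, bp):
    """Continuous two-segment fit with a breakpoint at bp; returns SSE and coefficients."""
    X = np.column_stack([np.ones_like(x), x, np.clip(x - bp, 0, None)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return float(resid @ resid), coef

candidates = np.linspace(1.0, 4.0, 61)                # grid of candidate breakpoints
sse, coefs = zip(*(fit_piecewise(flow, bedload, bp) for bp in candidates))
best = int(np.argmin(sse))
print(f"estimated breakpoint: {candidates[best]:.2f}")
print("intercept, lower slope, added upper slope:", np.round(coefs[best], 3))
```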

  3. Application of Partial Least Square (PLS) Regression to Determine Landscape-Scale Aquatic Resources Vulnerability in the Ozark Mountains

    EPA Science Inventory

    Partial least squares (PLS) analysis offers a number of advantages over the more traditionally used regression analyses applied in landscape ecology, particularly for determining the associations among multiple constituents of surface water and landscape configuration. Common dat...

  4. Use of principal-component, correlation, and stepwise multiple-regression analyses to investigate selected physical and hydraulic properties of carbonate-rock aquifers

    USGS Publications Warehouse

    Brown, C. Erwin

    1993-01-01

    Correlation analysis in conjunction with principal-component and multiple-regression analyses were applied to laboratory chemical and petrographic data to assess the usefulness of these techniques in evaluating selected physical and hydraulic properties of carbonate-rock aquifers in central Pennsylvania. Correlation and principal-component analyses were used to establish relations and associations among variables, to determine dimensions of property variation of samples, and to filter the variables containing similar information. Principal-component and correlation analyses showed that porosity is related to other measured variables and that permeability is most related to porosity and grain size. Four principal components are found to be significant in explaining the variance of data. Stepwise multiple-regression analysis was used to see how well the measured variables could predict porosity and (or) permeability for this suite of rocks. The variation in permeability and porosity is not totally predicted by the other variables, but the regression is significant at the 5% significance level. © 1993.

  5. Logistic regression applied to natural hazards: rare event logistic regression with replications

    NASA Astrophysics Data System (ADS)

    Guns, M.; Vanacker, V.

    2012-06-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that the ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations in rare event logistic regression. This technique, so-called rare event logistic regression with replications, combines the strength of probabilistic and statistical methods, and allows overcoming some of the limitations of previous developments through robust variable selection. This technique was here developed for the analyses of landslide controlling factors, but the concept is widely applicable for statistical analyses of natural hazards.
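
    As a rough, hypothetical illustration of the replication idea (not the authors' code), the sketch below repeatedly re-samples the abundant non-event class, refits a logistic regression each time, and inspects how stable the coefficients are across replications.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n, p = 5000, 6
X = rng.normal(size=(n, p))                      # hypothetical controlling factors
logit = -4.0 + 1.2 * X[:, 0] - 0.8 * X[:, 1]     # only two factors truly matter
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))    # rare events

events, non_events = np.flatnonzero(y == 1), np.flatnonzero(y == 0)
coefs = []
for _ in range(200):                             # replications
    sub = rng.choice(non_events, size=5 * events.size, replace=False)
    idx = np.concatenate([events, sub])
    model = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    coefs.append(model.coef_.ravel())

coefs = np.array(coefs)
print("mean coefficient per factor :", coefs.mean(axis=0).round(2))
print("std across replications     :", coefs.std(axis=0).round(2))
```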

  6. Application of Partial Least Squares (PLS) Regression to Determine Landscape-Scale Aquatic Resource Vulnerability in the Ozark Mountains

    EPA Science Inventory

    Partial least squares (PLS) analysis offers a number of advantages over the more traditionally used regression analyses applied in landscape ecology to study the associations among constituents of surface water and landscapes. Common data problems in ecological studies include: s...

  7. Partial Least Square Analyses of Landscape and Surface Water Biota Associations in the Savannah River Basin

    EPA Science Inventory

    Ecologists are often faced with the problems of small sample sizes, large numbers of correlated predictors, and high noise-to-signal ratios. This necessitates excluding important variables from the model when applying standard multiple or multivariate regression analyses. In ...

  8. A general framework for the use of logistic regression models in meta-analysis.

    PubMed

    Simmonds, Mark C; Higgins, Julian Pt

    2016-12-01

    Where individual participant data are available for every randomised trial in a meta-analysis of dichotomous event outcomes, "one-stage" random-effects logistic regression models have been proposed as a way to analyse these data. Such models can also be used even when individual participant data are not available and we have only summary contingency table data. One benefit of this one-stage regression model over conventional meta-analysis methods is that it maximises the correct binomial likelihood for the data and so does not require the common assumption that effect estimates are normally distributed. A second benefit of using this model is that it may be applied, with only minor modification, in a range of meta-analytic scenarios, including meta-regression, network meta-analyses and meta-analyses of diagnostic test accuracy. This single model can potentially replace the variety of often complex methods used in these areas. This paper considers, with a range of meta-analysis examples, how random-effects logistic regression models may be used in a number of different types of meta-analyses. This one-stage approach is compared with widely used meta-analysis methods including Bayesian network meta-analysis and the bivariate and hierarchical summary receiver operating characteristic (ROC) models for meta-analyses of diagnostic test accuracy. © The Author(s) 2014.

  9. Variance Estimation Using Replication Methods in Structural Equation Modeling with Complex Sample Data

    ERIC Educational Resources Information Center

    Stapleton, Laura M.

    2008-01-01

    This article discusses replication sampling variance estimation techniques that are often applied in analyses using data from complex sampling designs: jackknife repeated replication, balanced repeated replication, and bootstrapping. These techniques are used with traditional analyses such as regression, but are currently not used with structural…

  10. Development of Super-Ensemble techniques for ocean analyses: the Mediterranean Sea case

    NASA Astrophysics Data System (ADS)

    Pistoia, Jenny; Pinardi, Nadia; Oddo, Paolo; Collins, Matthew; Korres, Gerasimos; Drillet, Yann

    2017-04-01

    Short-term ocean analyses for Sea Surface Temperature (SST) in the Mediterranean Sea can be improved by a statistical post-processing technique, called super-ensemble. This technique consists of a multi-linear regression algorithm applied to a Multi-Physics Multi-Model Super-Ensemble (MMSE) dataset, a collection of different operational forecasting analyses together with ad-hoc simulations produced by modifying selected numerical model parameterizations. A new linear regression algorithm based on Empirical Orthogonal Function filtering techniques is capable of preventing overfitting problems, although the best performance is achieved when we add correlation to the super-ensemble structure using a simple spatial filter applied after the linear regression. Our results show that super-ensemble performance depends on the selection of an unbiased operator and the length of the learning period, but the quality of the generating MMSE dataset has the largest impact on the MMSE analysis Root Mean Square Error (RMSE) evaluated with respect to observed satellite SST. Lower RMSE analysis estimates result from the following choices: a 15-day training period, an overconfident MMSE dataset (a subset with the higher-quality ensemble members), and the least squares algorithm being filtered a posteriori.

  11. Recycling and Ambivalence: Quantitative and Qualitative Analyses of Household Recycling among Young Adults

    ERIC Educational Resources Information Center

    Ojala, Maria

    2008-01-01

    Theories about ambivalence, as well as quantitative and qualitative empirical approaches, are applied to obtain an understanding of recycling among young adults. A questionnaire was mailed to 422 Swedish young people. Regression analyses showed that a mix of negative emotions (worry) and positive emotions (hope and joy) about the environmental…

  12. Regressive Imagery in Creative Problem-Solving: Comparing Verbal Protocols of Expert and Novice Visual Artists and Computer Programmers

    ERIC Educational Resources Information Center

    Kozbelt, Aaron; Dexter, Scott; Dolese, Melissa; Meredith, Daniel; Ostrofsky, Justin

    2015-01-01

    We applied computer-based text analyses of regressive imagery to verbal protocols of individuals engaged in creative problem-solving in two domains: visual art (23 experts, 23 novices) and computer programming (14 experts, 14 novices). Percentages of words involving primary process and secondary process thought, plus emotion-related words, were…

  13. Implementing informative priors for heterogeneity in meta-analysis using meta-regression and pseudo data.

    PubMed

    Rhodes, Kirsty M; Turner, Rebecca M; White, Ian R; Jackson, Dan; Spiegelhalter, David J; Higgins, Julian P T

    2016-12-20

    Many meta-analyses combine results from only a small number of studies, a situation in which the between-study variance is imprecisely estimated when standard methods are applied. Bayesian meta-analysis allows incorporation of external evidence on heterogeneity, providing the potential for more robust inference on the effect size of interest. We present a method for performing Bayesian meta-analysis using data augmentation, in which we represent an informative conjugate prior for between-study variance by pseudo data and use meta-regression for estimation. To assist in this, we derive predictive inverse-gamma distributions for the between-study variance expected in future meta-analyses. These may serve as priors for heterogeneity in new meta-analyses. In a simulation study, we compare approximate Bayesian methods using meta-regression and pseudo data against fully Bayesian approaches based on importance sampling techniques and Markov chain Monte Carlo (MCMC). We compare the frequentist properties of these Bayesian methods with those of the commonly used frequentist DerSimonian and Laird procedure. The method is implemented in standard statistical software and provides a less complex alternative to standard MCMC approaches. An importance sampling approach produces almost identical results to standard MCMC approaches, and results obtained through meta-regression and pseudo data are very similar. On average, data augmentation provides closer results to MCMC, if implemented using restricted maximum likelihood estimation rather than DerSimonian and Laird or maximum likelihood estimation. The methods are applied to real datasets, and an extension to network meta-analysis is described. The proposed method facilitates Bayesian meta-analysis in a way that is accessible to applied researchers. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  14. How is the weather? Forecasting inpatient glycemic control

    PubMed Central

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B; Thompson, Bithika M

    2017-01-01

    Aim: Apply methods of damped trend analysis to forecast inpatient glycemic control. Method: Observed and calculated point-of-care blood glucose data trends were determined over 62 weeks. Mean absolute percent error was used to calculate differences between observed and forecasted values. Comparisons were drawn between model results and linear regression forecasting. Results: The forecasted mean glucose trends observed during the first 24 and 48 weeks of projections compared favorably to the results provided by linear regression forecasting. However, in some scenarios, the damped trend method changed inferences compared with linear regression. In all scenarios, mean absolute percent error values remained below the 10% accepted by demand industries. Conclusion: Results indicate that forecasting methods historically applied within demand industries can project future inpatient glycemic control. Additional study is needed to determine if forecasting is useful in the analyses of other glucometric parameters and, if so, how to apply the techniques to quality improvement. PMID:29134125
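
    A minimal, self-contained sketch of damped-trend (Holt) exponential smoothing with a mean-absolute-percent-error check; the weekly glucose means and smoothing parameters below are invented, not those used in the study.

```python
import numpy as np

def damped_trend_forecast(y, alpha=0.4, beta=0.2, phi=0.9, horizon=8):
    """Holt's linear method with a damping parameter phi (illustrative values)."""
    level, trend = y[0], y[1] - y[0]
    for obs in y[1:]:
        prev_level = level
        level = alpha * obs + (1 - alpha) * (prev_level + phi * trend)
        trend = beta * (level - prev_level) + (1 - beta) * phi * trend
    steps = np.arange(1, horizon + 1)
    return level + np.cumsum(phi ** steps) * trend   # h-step-ahead forecasts

def mape(actual, forecast):
    return 100 * np.mean(np.abs((actual - forecast) / actual))

rng = np.random.default_rng(2)
weekly_glucose = 180 - 0.3 * np.arange(62) + rng.normal(0, 4, 62)  # hypothetical weekly means
train, test = weekly_glucose[:54], weekly_glucose[54:]
print("damped-trend MAPE:", round(mape(test, damped_trend_forecast(train)), 2), "%")
```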

  15. BAYESIAN LARGE-SCALE MULTIPLE REGRESSION WITH SUMMARY STATISTICS FROM GENOME-WIDE ASSOCIATION STUDIES

    PubMed Central

    Zhu, Xiang; Stephens, Matthew

    2017-01-01

    Bayesian methods for large-scale multiple regression provide attractive approaches to the analysis of genome-wide association studies (GWAS). For example, they can estimate heritability of complex traits, allowing for both polygenic and sparse models; and by incorporating external genomic data into the priors, they can increase power and yield new biological insights. However, these methods require access to individual genotypes and phenotypes, which are often not easily available. Here we provide a framework for performing these analyses without individual-level data. Specifically, we introduce a “Regression with Summary Statistics” (RSS) likelihood, which relates the multiple regression coefficients to univariate regression results that are often easily available. The RSS likelihood requires estimates of correlations among covariates (SNPs), which also can be obtained from public databases. We perform Bayesian multiple regression analysis by combining the RSS likelihood with previously proposed prior distributions, sampling posteriors by Markov chain Monte Carlo. In a wide range of simulations RSS performs similarly to analyses using the individual data, both for estimating heritability and detecting associations. We apply RSS to a GWAS of human height that contains 253,288 individuals typed at 1.06 million SNPs, for which analyses of individual-level data are practically impossible. Estimates of heritability (52%) are consistent with, but more precise than, previous results using subsets of these data. We also identify many previously unreported loci that show evidence for association with height in our analyses. Software is available at https://github.com/stephenslab/rss. PMID:29399241

  16. A comparison of Cox and logistic regression for use in genome-wide association studies of cohort and case-cohort design.

    PubMed

    Staley, James R; Jones, Edmund; Kaptoge, Stephen; Butterworth, Adam S; Sweeting, Michael J; Wood, Angela M; Howson, Joanna M M

    2017-06-01

    Logistic regression is often used instead of Cox regression to analyse genome-wide association studies (GWAS) of single-nucleotide polymorphisms (SNPs) and disease outcomes with cohort and case-cohort designs, as it is less computationally expensive. Although Cox and logistic regression models have been compared previously in cohort studies, this work does not completely cover the GWAS setting nor extend to the case-cohort study design. Here, we evaluated Cox and logistic regression applied to cohort and case-cohort genetic association studies using simulated data and genetic data from the EPIC-CVD study. In the cohort setting, there was a modest improvement in power to detect SNP-disease associations using Cox regression compared with logistic regression, which increased as the disease incidence increased. In contrast, logistic regression had more power than (Prentice weighted) Cox regression in the case-cohort setting. Logistic regression yielded inflated effect estimates (assuming the hazard ratio is the underlying measure of association) for both study designs, especially for SNPs with greater effect on disease. Given logistic regression is substantially more computationally efficient than Cox regression in both settings, we propose a two-step approach to GWAS in cohort and case-cohort studies. First to analyse all SNPs with logistic regression to identify associated variants below a pre-defined P-value threshold, and second to fit Cox regression (appropriately weighted in case-cohort studies) to those identified SNPs to ensure accurate estimation of association with disease.
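
    A schematic of the proposed two-step screen, assuming simulated genotypes and the third-party lifelines package for the Cox fit; column names, effect sizes, and the P-value threshold are placeholders, not values from the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter   # assumed third-party dependency

rng = np.random.default_rng(3)
n, n_snps = 2000, 50
snps = rng.binomial(2, 0.3, size=(n, n_snps))              # additive genotype coding 0/1/2
hazard = np.exp(0.3 * snps[:, 0] - 0.25 * snps[:, 1])      # two causal SNPs (simulated)
time = rng.exponential(10 / hazard)
event = (time < 8).astype(int)
time = np.minimum(time, 8)                                 # administrative censoring

# Step 1: fast logistic screen of every SNP against the binary event indicator
pvals = []
for j in range(n_snps):
    X = sm.add_constant(snps[:, j].astype(float))
    pvals.append(sm.Logit(event, X).fit(disp=0).pvalues[1])
hits = [j for j, p in enumerate(pvals) if p < 1e-3]        # placeholder threshold

# Step 2: Cox regression only for the SNPs passing the screen
for j in hits:
    df = pd.DataFrame({"snp": snps[:, j], "T": time, "E": event})
    cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    print("SNP", j, "log hazard ratio:", round(cph.params_["snp"], 3))
```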

  17. Bayesian hierarchical models for cost-effectiveness analyses that use data from cluster randomized trials.

    PubMed

    Grieve, Richard; Nixon, Richard; Thompson, Simon G

    2010-01-01

    Cost-effectiveness analyses (CEA) may be undertaken alongside cluster randomized trials (CRTs) where randomization is at the level of the cluster (for example, the hospital or primary care provider) rather than the individual. Costs (and outcomes) within clusters may be correlated so that the assumption made by standard bivariate regression models, that observations are independent, is incorrect. This study develops a flexible modeling framework to acknowledge the clustering in CEA that use CRTs. The authors extend previous Bayesian bivariate models for CEA of multicenter trials to recognize the specific form of clustering in CRTs. They develop new Bayesian hierarchical models (BHMs) that allow mean costs and outcomes, and also variances, to differ across clusters. They illustrate how each model can be applied using data from a large (1732 cases, 70 primary care providers) CRT evaluating alternative interventions for reducing postnatal depression. The analyses compare cost-effectiveness estimates from BHMs with standard bivariate regression models that ignore the data hierarchy. The BHMs show high levels of cost heterogeneity across clusters (intracluster correlation coefficient, 0.17). Compared with standard regression models, the BHMs yield substantially increased uncertainty surrounding the cost-effectiveness estimates, and altered point estimates. The authors conclude that ignoring clustering can lead to incorrect inferences. The BHMs that they present offer a flexible modeling framework that can be applied more generally to CEA that use CRTs.

  18. An automated ranking platform for machine learning regression models for meat spoilage prediction using multi-spectral imaging and metabolic profiling.

    PubMed

    Estelles-Lopez, Lucia; Ropodi, Athina; Pavlidis, Dimitris; Fotopoulou, Jenny; Gkousari, Christina; Peyrodie, Audrey; Panagou, Efstathios; Nychas, George-John; Mohareb, Fady

    2017-09-01

    Over the past decade, analytical approaches based on vibrational spectroscopy, hyperspectral/multispectral imaging and biomimetic sensors started gaining popularity as rapid and efficient methods for assessing food quality, safety and authentication, and as a sensible alternative to the expensive and time-consuming conventional microbiological techniques. Due to the multi-dimensional nature of the data generated from such analyses, the output needs to be coupled with a suitable statistical approach or machine-learning algorithms before the results can be interpreted. Choosing the optimum pattern recognition or machine learning approach for a given analytical platform is often challenging and involves a comparative analysis between various algorithms in order to achieve the best possible prediction accuracy. In this work, "MeatReg", a web-based application, is presented that automates the procedure of identifying the best machine learning method for comparing data from several analytical techniques to predict the counts of microorganisms responsible for meat spoilage, regardless of the packaging system applied. In particular, up to 7 regression methods were applied: ordinary least squares regression, stepwise linear regression, partial least squares regression, principal component regression, support vector regression, random forest and k-nearest neighbours. "MeatReg" was tested with minced beef samples stored under aerobic and modified atmosphere packaging and analysed with electronic nose, HPLC, FT-IR, GC-MS and multispectral imaging instruments. Populations of total viable counts, lactic acid bacteria, pseudomonads, Enterobacteriaceae and B. thermosphacta were predicted. As a result, recommendations were obtained on which analytical platforms are suitable for predicting each type of bacteria and which machine learning methods to use in each case. The developed system is accessible via the link: www.sorfml.com. Copyright © 2017 Elsevier Ltd. All rights reserved.
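
    MeatReg itself is a web application, but the underlying ranking idea can be sketched with scikit-learn: cross-validate several regressors on the same predictor matrix and order them by error. The data, model list, and settings below are invented for illustration.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(4)
X = rng.normal(size=(120, 300))                               # e.g. FT-IR spectra (hypothetical)
y = X[:, :5] @ rng.normal(size=5) + rng.normal(0, 0.5, 120)   # log CFU counts (simulated)

models = {
    "OLS": LinearRegression(),
    "PLS": PLSRegression(n_components=10),
    "PCR": make_pipeline(PCA(n_components=10), LinearRegression()),
    "SVR": SVR(kernel="rbf", C=10.0),
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
    "kNN": KNeighborsRegressor(n_neighbors=5),
}
scores = {name: -cross_val_score(m, X, y, cv=5,
                                 scoring="neg_root_mean_squared_error").mean()
          for name, m in models.items()}
for name, rmse in sorted(scores.items(), key=lambda kv: kv[1]):   # rank by cross-validated RMSE
    print(f"{name:>4}: RMSE = {rmse:.3f}")
```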

  19. Statistical studies of selected trace elements with reference to geology and genesis of the Carlin gold deposit, Nevada

    USGS Publications Warehouse

    Harris, Michael; Radtke, Arthur S.

    1976-01-01

    Linear regression and discriminant analysis techniques were applied to gold, mercury, arsenic, antimony, barium, copper, molybdenum, lead, zinc, boron, tellurium, selenium, and tungsten analyses from drill holes into unoxidized gold ore at the Carlin gold mine near Carlin, Nev. The statistical treatments employed were used to judge proposed hypotheses on the origin and geochemical paragenesis of this disseminated gold deposit.

  20. Statistical Prediction in Proprietary Rehabilitation.

    ERIC Educational Resources Information Center

    Johnson, Kurt L.; And Others

    1987-01-01

    Applied statistical methods to predict case expenditures for low back pain rehabilitation cases in proprietary rehabilitation. Extracted predictor variables from case records of 175 workers compensation claimants with some degree of permanent disability due to back injury. Performed several multiple regression analyses resulting in a formula that…

  1. Geodesic least squares regression for scaling studies in magnetic confinement fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verdoolaege, Geert

    In regression analyses for deriving scaling laws that occur in various scientific disciplines, usually standard regression methods have been applied, of which ordinary least squares (OLS) is the most popular. However, concerns have been raised with respect to several assumptions underlying OLS in its application to scaling laws. We here discuss a new regression method that is robust in the presence of significant uncertainty on both the data and the regression model. The method, which we call geodesic least squares regression (GLS), is based on minimization of the Rao geodesic distance on a probabilistic manifold. We demonstrate the superiority of the method using synthetic data and we present an application to the scaling law for the power threshold for the transition to the high confinement regime in magnetic confinement fusion devices.

  2. Are your covariates under control? How normalization can re-introduce covariate effects.

    PubMed

    Pain, Oliver; Dudbridge, Frank; Ronald, Angelica

    2018-04-30

    Many statistical tests rely on the assumption that the residuals of a model are normally distributed. Rank-based inverse normal transformation (INT) of the dependent variable is one of the most popular approaches to satisfy the normality assumption. When covariates are included in the analysis, a common approach is to first adjust for the covariates and then normalize the residuals. This study investigated the effect of regressing covariates against the dependent variable and then applying rank-based INT to the residuals. The correlation between the dependent variable and covariates at each stage of processing was assessed. An alternative approach was tested in which rank-based INT was applied to the dependent variable before regressing covariates. Analyses based on both simulated and real data examples demonstrated that applying rank-based INT to the dependent variable residuals after regressing out covariates re-introduces a linear correlation between the dependent variable and covariates, increasing type-I errors and reducing power. On the other hand, when rank-based INT was applied prior to controlling for covariate effects, residuals were normally distributed and linearly uncorrelated with covariates. This latter approach is therefore recommended in situations where normality of the dependent variable is required.
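
    A small simulation in the spirit of the study, using a Blom-type rank-based INT; the skewed outcome and covariate are synthetic, and the expected behaviour is noted in the comments rather than guaranteed for any particular seed.

```python
import numpy as np
from scipy.stats import rankdata, norm

def rank_int(x, c=3.0 / 8):
    """Rank-based inverse normal transformation (Blom offset)."""
    r = rankdata(x)
    return norm.ppf((r - c) / (len(x) - 2 * c + 1))

rng = np.random.default_rng(5)
covariate = rng.normal(size=5000)
y = np.exp(0.5 * covariate + rng.normal(size=5000))   # skewed outcome affected by the covariate

# Order 1: regress out the covariate, then INT the residuals
resid = y - np.polyval(np.polyfit(covariate, y, 1), covariate)
order1 = rank_int(resid)

# Order 2 (recommended by the study): INT the outcome first, then adjust for the covariate
inty = rank_int(y)
order2 = inty - np.polyval(np.polyfit(covariate, inty, 1), covariate)

# Per the study, order 1 can re-introduce a covariate correlation for skewed outcomes; order 2 should not.
print("corr(covariate, order 1):", round(np.corrcoef(covariate, order1)[0, 1], 3))
print("corr(covariate, order 2):", round(np.corrcoef(covariate, order2)[0, 1], 3))
```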

  3. Classification and regression tree (CART) analyses of genomic signatures reveal sets of tetramers that discriminate temperature optima of archaea and bacteria

    PubMed Central

    Dyer, Betsey D.; Kahn, Michael J.; LeBlanc, Mark D.

    2008-01-01

    Classification and regression tree (CART) analysis was applied to genome-wide tetranucleotide frequencies (genomic signatures) of 195 archaea and bacteria. Although genomic signatures have typically been used to classify evolutionary divergence, in this study, convergent evolution was the focus. Temperature optima for most of the organisms examined could be distinguished by CART analyses of tetranucleotide frequencies. This suggests that pervasive (nonlinear) qualities of genomes may reflect certain environmental conditions (such as temperature) in which those genomes evolved. The predominant use of GAGA and AGGA as the discriminating tetramers in CART models suggests that purine-loading and codon biases of thermophiles may explain some of the results. PMID:19054742
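
    A toy sketch of the analysis pattern (tetramer-frequency matrix in, decision tree out), using scikit-learn's CART-style DecisionTreeClassifier in place of the original software; the frequency profiles are random placeholders with a planted GAGA/AGGA signal.

```python
import numpy as np
from itertools import product
from sklearn.tree import DecisionTreeClassifier, export_text

tetramers = ["".join(t) for t in product("ACGT", repeat=4)]    # 256 genomic-signature features

rng = np.random.default_rng(6)
n_genomes = 195
X = rng.dirichlet(np.ones(len(tetramers)), size=n_genomes)     # fake tetramer frequency profiles
is_thermophile = rng.integers(0, 2, n_genomes)
X[is_thermophile == 1, tetramers.index("GAGA")] += 0.02        # plant a discriminating signal
X[is_thermophile == 1, tetramers.index("AGGA")] += 0.02

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X, is_thermophile)
print(export_text(tree, feature_names=tetramers, max_depth=2))  # which tetramers discriminate?
```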

  4. The relationship between biomechanical variables and driving performance during the golf swing.

    PubMed

    Chu, Yungchien; Sell, Timothy C; Lephart, Scott M

    2010-09-01

    Swing kinematic and ground reaction force data from 308 golfers were analysed to identify the variables important to driving ball velocity. Regression models were applied at four selected events in the swing. The models accounted for 44-74% of variance in ball velocity. Based on the regression analyses, upper torso-pelvis separation (the X-Factor), delayed release (i.e. the initiation of movement) of the arms and wrists, trunk forward and lateral tilting, and weight-shifting during the swing were significantly related to ball velocity. Our results also verify several general coaching ideas that were considered important to increased ball velocity. The results of this study may serve as both skill and strength training guidelines for golfers.

  5. Breeding value accuracy estimates for growth traits using random regression and multi-trait models in Nelore cattle.

    PubMed

    Boligon, A A; Baldi, F; Mercadante, M E Z; Lobo, R B; Pereira, R J; Albuquerque, L G

    2011-06-28

    We quantified the potential increase in accuracy of expected breeding value for weights of Nelore cattle, from birth to mature age, using multi-trait and random regression models on Legendre polynomials and B-spline functions. A total of 87,712 weight records from 8144 females were used, recorded every three months from birth to mature age from the Nelore Brazil Program. For random regression analyses, all female weight records from birth to eight years of age (data set I) were considered. From this general data set, a subset was created (data set II), which included only nine weight records: at birth, weaning, 365 and 550 days of age, and 2, 3, 4, 5, and 6 years of age. Data set II was analyzed using random regression and multi-trait models. The model of analysis included the contemporary group as fixed effects and age of dam as a linear and quadratic covariable. In the random regression analyses, average growth trends were modeled using a cubic regression on orthogonal polynomials of age. Residual variances were modeled by a step function with five classes. Legendre polynomials of fourth and sixth order were utilized to model the direct genetic and animal permanent environmental effects, respectively, while third-order Legendre polynomials were considered for maternal genetic and maternal permanent environmental effects. Quadratic polynomials were applied to model all random effects in random regression models on B-spline functions. Direct genetic and animal permanent environmental effects were modeled using three segments or five coefficients, and genetic maternal and maternal permanent environmental effects were modeled with one segment or three coefficients in the random regression models on B-spline functions. For both data sets (I and II), animals ranked differently according to expected breeding value obtained by random regression or multi-trait models. With random regression models, the highest gains in accuracy were obtained at ages with a low number of weight records. The results indicate that random regression models provide more accurate expected breeding values than the traditionally finite multi-trait models. Thus, higher genetic responses are expected for beef cattle growth traits by replacing a multi-trait model with random regression models for genetic evaluation. B-spline functions could be applied as an alternative to Legendre polynomials to model covariance functions for weights from birth to mature age.

  6. An empirical study using permutation-based resampling in meta-regression

    PubMed Central

    2012-01-01

    Background In meta-regression, as the number of trials in the analyses decreases, the risk of false positives or false negatives increases. This is partly due to the assumption of normality that may not hold in small samples. Creation of a distribution from the observed trials using permutation methods to calculate P values may allow for less spurious findings. Permutation has not been empirically tested in meta-regression. The objective of this study was to perform an empirical investigation to explore the differences in results for meta-analyses on a small number of trials using standard large sample approaches versus permutation-based methods for meta-regression. Methods We isolated a sample of randomized controlled clinical trials (RCTs) for interventions that have a small number of trials (herbal medicine trials). Trials were then grouped by herbal species and condition and assessed for methodological quality using the Jadad scale, and data were extracted for each outcome. Finally, we performed meta-analyses on the primary outcome of each group of trials and meta-regression for methodological quality subgroups within each meta-analysis. We used large sample methods and permutation methods in our meta-regression modeling. We then compared final models and final P values between methods. Results We collected 110 trials across 5 intervention/outcome pairings and 5 to 10 trials per covariate. When applying large sample methods and permutation-based methods in our backwards stepwise regression, the covariates in the final models were identical in all cases. The P values for the covariates in the final model were larger in 78% (7/9) of the cases for permutation and identical for 22% (2/9) of the cases. Conclusions We present empirical evidence that permutation-based resampling may not change final models when using backwards stepwise regression, but may increase P values in meta-regression of multiple covariates for a relatively small number of trials. PMID:22587815
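
    An illustrative, numpy-only sketch of a permutation P value for a single meta-regression covariate (a quality score), using inverse-variance weighted least squares; all data below are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
k = 8                                            # small number of trials, as in the study
effect = rng.normal(0.3, 0.2, k)                 # trial effect estimates (e.g. log odds ratios)
se = rng.uniform(0.1, 0.3, k)                    # their standard errors
quality = rng.integers(1, 6, k)                  # Jadad-type quality score (covariate)

def wls_slope(x, y, w):
    """Slope of an inverse-variance weighted least squares meta-regression."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[1]

w = 1 / se**2
observed = wls_slope(quality.astype(float), effect, w)

n_perm = 5000
perm = np.array([wls_slope(rng.permutation(quality).astype(float), effect, w)
                 for _ in range(n_perm)])
p_perm = (np.sum(np.abs(perm) >= abs(observed)) + 1) / (n_perm + 1)
print(f"slope = {observed:.3f}, permutation p = {p_perm:.3f}")
```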

  7. Relocation and Alienation: Support for Stokols' Social Psychological Theory.

    ERIC Educational Resources Information Center

    Hwalek, Melanie; Firestone, Ira

    The extent to which Stokols' model of alienation could be applied to the relocation of elderly into age-homogeneous communal residences was investigated by interviewing 50 residents of two homes for the aged and one senior citizen apartment complex. Three types of regression analyses were performed to test the hypothesis that simultaneously…

  8. General Strain Theory as a Basis for the Design of School Interventions

    ERIC Educational Resources Information Center

    Moon, Byongook; Morash, Merry

    2013-01-01

    The research described in this article applies general strain theory to identify possible points of intervention for reducing delinquency of students in two middle schools. Data were collected from 296 youths, and separate negative binomial regression analyses were used to identify predictors of violent, property, and status delinquency. Emotional…

  9. Integrative eQTL analysis of tumor and host omics data in individuals with bladder cancer.

    PubMed

    Pineda, Silvia; Van Steen, Kristel; Malats, Núria

    2017-09-01

    Integrative analyses of several omics data are emerging. The data are usually generated from the same source material (i.e., tumor sample) representing one level of regulation. However, integrating different regulatory levels (i.e., blood) with those from tumor may also reveal important knowledge about the human genetic architecture. To model this multilevel structure, an integrative-expression quantitative trait loci (eQTL) analysis applying two-stage regression (2SR) was proposed. This approach first regressed tumor gene expression levels with tumor markers and the adjusted residuals from the previous model were then regressed with the germline genotypes measured in blood. Previously, we demonstrated that penalized regression methods in combination with a permutation-based MaxT method (Global-LASSO) is a promising tool to fix some of the challenges that high-throughput omics data analysis imposes. Here, we assessed whether Global-LASSO can also be applied when tumor and blood omics data are integrated. We further compared our strategy with two 2SR-approaches, one using multiple linear regression (2SR-MLR) and the other using LASSO (2SR-LASSO). We applied the three models to integrate genomic, epigenomic, and transcriptomic data from tumor tissue with blood germline genotypes from 181 individuals with bladder cancer included in the TCGA Consortium. Global-LASSO provided a larger list of eQTLs than the 2SR methods, identified a previously reported eQTL in prostate stem cell antigen (PSCA), and provided further clues on the complexity of the APOBEC3B locus, with a minimal false-positive rate not achieved by 2SR-MLR. It also represents an important contribution to omics integrative analysis because it is easy to apply and adaptable to any type of data. © 2017 WILEY PERIODICALS, INC.
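
    A simplified sketch of the two-stage regression (2SR) residual idea followed by a penalized stage-2 fit; the sample sizes, variable names, and the use of a plain cross-validated LASSO (rather than the permutation-based Global-LASSO) are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LassoCV

rng = np.random.default_rng(8)
n, n_snps = 181, 400
germline = rng.binomial(2, 0.3, size=(n, n_snps)).astype(float)   # blood genotypes
tumor_markers = rng.normal(size=(n, 5))                           # e.g. copy number, methylation
expression = (tumor_markers[:, 0] * 0.8                           # tumor-driven component
              + germline[:, 10] * 0.5                             # one simulated true eQTL
              + rng.normal(0, 1, n))

# Stage 1 of 2SR: remove the tumor-level signal from expression
stage1 = LinearRegression().fit(tumor_markers, expression)
adjusted = expression - stage1.predict(tumor_markers)

# Stage 2: a single penalized model across all SNPs (LASSO-style alternative to per-SNP OLS)
lasso = LassoCV(cv=5).fit(germline, adjusted)
print("SNPs selected by the penalized stage-2 model:", np.flatnonzero(lasso.coef_))
```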

  10. Improving Your Data Transformations: Applying the Box-Cox Transformation

    ERIC Educational Resources Information Center

    Osborne, Jason W.

    2010-01-01

    Many of us in the social sciences deal with data that do not conform to assumptions of normality and/or homoscedasticity/homogeneity of variance. Some research has shown that parametric tests (e.g., multiple regression, ANOVA) can be robust to modest violations of these assumptions. Yet the reality is that almost all analyses (even nonparametric…
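
    For readers who want to try the transformation, a minimal SciPy example (not from the article) on made-up skewed data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
skewed = rng.lognormal(mean=2.0, sigma=0.6, size=500)       # positive, positively skewed data

transformed, lam = stats.boxcox(skewed)                     # lambda chosen by maximum likelihood
print(f"estimated lambda = {lam:.2f}")
print(f"skewness before = {stats.skew(skewed):.2f}, after = {stats.skew(transformed):.2f}")
```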

  11. About the Pace of Climate Change: Write a Report to the President

    ERIC Educational Resources Information Center

    Khadjavi, Lily

    2013-01-01

    This project allows students to better understand the scope and pace of climate change by conducting their own analyses. Using data readily available from NASA and NOAA, students can apply their knowledge of regression models (or of the modeling of rates of change). The results lend themselves to a writing assignment in which students demonstrate…

  12. EMD-regression for modelling multi-scale relationships, and application to weather-related cardiovascular mortality

    NASA Astrophysics Data System (ADS)

    Masselot, Pierre; Chebana, Fateh; Bélanger, Diane; St-Hilaire, André; Abdous, Belkacem; Gosselin, Pierre; Ouarda, Taha B. M. J.

    2018-01-01

    In a number of environmental studies, relationships between natural processes are often assessed through regression analyses, using time series data. Such data are often multi-scale and non-stationary, leading to a poor accuracy of the resulting regression models and therefore to results with moderate reliability. To deal with this issue, the present paper introduces the EMD-regression methodology, which consists of applying the empirical mode decomposition (EMD) algorithm to data series and then using the resulting components in regression models. The proposed methodology presents a number of advantages. First, it accounts for the issues of non-stationarity associated with the data series. Second, this approach acts as a scan for the relationship between a response variable and the predictors at different time scales, providing new insights about this relationship. To illustrate the proposed methodology it is applied to study the relationship between weather and cardiovascular mortality in Montreal, Canada. The results shed new light on the studied relationship. For instance, they show that humidity can cause excess mortality at the monthly time scale, which is a scale not visible in classical models. A comparison is also conducted with state-of-the-art methods, namely generalized additive models and distributed lag models, both widely used in weather-related health studies. The comparison shows that EMD-regression achieves better prediction performance and provides more details than classical models concerning the relationship.
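
    A conceptual sketch of EMD-regression, assuming the third-party PyEMD package for the decomposition step; the weather and mortality series are simulated stand-ins, not the Montreal data.

```python
import numpy as np
from PyEMD import EMD          # assumed third-party package for empirical mode decomposition

rng = np.random.default_rng(10)
t = np.arange(1500)
temperature = (10 * np.sin(2 * np.pi * t / 365)            # seasonal cycle
               + 2 * np.sin(2 * np.pi * t / 30)            # monthly-scale variation
               + rng.normal(0, 1, t.size))
mortality = 50 + 0.4 * np.sin(2 * np.pi * t / 30) + rng.normal(0, 1, t.size)

# Step 1: decompose the predictor into intrinsic mode functions (IMFs)
imfs = EMD().emd(temperature)                              # rows are modes, from fast to slow

# Step 2: regress the response on the IMFs; each coefficient speaks to one time scale
X = np.column_stack([np.ones(t.size), imfs.T])
coef, *_ = np.linalg.lstsq(X, mortality, rcond=None)
for i, b in enumerate(coef[1:], start=1):
    print(f"IMF {i}: coefficient {b:+.3f}")
```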

  13. Genetic analyses of protein yield in dairy cows applying random regression models with time-dependent and temperature x humidity-dependent covariates.

    PubMed

    Brügemann, K; Gernand, E; von Borstel, U U; König, S

    2011-08-01

    Data used in the present study included 1,095,980 first-lactation test-day records for protein yield of 154,880 Holstein cows housed on 196 large-scale dairy farms in Germany. Data were recorded between 2002 and 2009 and merged with meteorological data from public weather stations. The maximum distance between each farm and its corresponding weather station was 50 km. Hourly temperature-humidity indexes (THI) were calculated using the mean of hourly measurements of dry bulb temperature and relative humidity. On the phenotypic scale, an increase in THI was generally associated with a decrease in daily protein yield. For genetic analyses, a random regression model was applied using time-dependent (d in milk, DIM) and THI-dependent covariates. Additive genetic and permanent environmental effects were fitted with this random regression model and Legendre polynomials of order 3 for DIM and THI. In addition, the fixed curve was modeled with Legendre polynomials of order 3. Heterogeneous residuals were fitted by dividing DIM into 5 classes, and by dividing THI into 4 classes, resulting in 20 different classes. Additive genetic variances for daily protein yield decreased with increasing degrees of heat stress and were lowest at the beginning of lactation and at extreme THI. Due to higher additive genetic variances, slightly higher permanent environment variances, and similar residual variances, heritabilities were highest for low THI in combination with DIM at the end of lactation. Genetic correlations among individual values for THI were generally >0.90. These trends from the complex random regression model were verified by applying relatively simple bivariate animal models for protein yield measured in 2 THI environments; that is, defining a THI value of 60 as a threshold. These high correlations indicate the absence of any substantial genotype × environment interaction for protein yield. However, heritabilities and additive genetic variances from the random regression model tended to be slightly higher in the THI range corresponding to cows' comfort zone. Selecting such superior environments for progeny testing can contribute to an accurate genetic differentiation among selection candidates. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  14. Quantile regression for the statistical analysis of immunological data with many non-detects.

    PubMed

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an implementation to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
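
    The key trick, substituting any value below the detection limit and modelling an upper quantile, can be sketched with statsmodels; the marker, detection limit, and quantile below are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n = 300
dose = rng.uniform(0, 10, n)
marker = np.exp(0.8 + 0.15 * dose + rng.normal(0, 0.8, n))  # hypothetical immunological marker

lod = 4.0                                                   # detection limit
observed = np.where(marker < lod, lod / 2, marker)          # non-detects set to any value < LOD
print(f"share of non-detects: {np.mean(marker < lod):.0%}")

df = pd.DataFrame({"y": observed, "dose": dose})
fit = smf.quantreg("y ~ dose", df).fit(q=0.75)              # 75th-percentile regression
print(fit.params)                                           # unaffected by the substituted values
```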

  15. The Use of Linear Instrumental Variables Methods in Health Services Research and Health Economics: A Cautionary Note

    PubMed Central

    Terza, Joseph V; Bradford, W David; Dismuke, Clara E

    2008-01-01

    Objective To investigate potential bias in the use of the conventional linear instrumental variables (IV) method for the estimation of causal effects in inherently nonlinear regression settings. Data Sources Smoking Supplement to the 1979 National Health Interview Survey, National Longitudinal Alcohol Epidemiologic Survey, and simulated data. Study Design Potential bias from the use of the linear IV method in nonlinear models is assessed via simulation studies and real world data analyses in two commonly encountered regression settings: (1) models with a nonnegative outcome (e.g., a count) and a continuous endogenous regressor; and (2) models with a binary outcome and a binary endogenous regressor. Principal Findings The simulation analyses show that substantial bias in the estimation of causal effects can result from applying the conventional IV method in inherently nonlinear regression settings. Moreover, the bias is not attenuated as the sample size increases. This point is further illustrated in the survey data analyses in which IV-based estimates of the relevant causal effects diverge substantially from those obtained with appropriate nonlinear estimation methods. Conclusions We offer this research as a cautionary note to those who would opt for the use of linear specifications in inherently nonlinear settings involving endogeneity. PMID:18546544
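
    A small synthetic simulation in the spirit of the cautionary note: a count outcome with an endogenous regressor estimated by a hand-rolled linear 2SLS, whose estimand is not the multiplicative causal parameter of the data-generating process. Entirely illustrative, not the authors' data or code.

```python
import numpy as np

rng = np.random.default_rng(12)
n = 100_000
z = rng.normal(size=n)                                     # instrument
u = rng.normal(size=n)                                     # unobserved confounder
x = 0.7 * z + 0.8 * u + rng.normal(size=n)                 # endogenous regressor
y = rng.poisson(np.exp(0.3 * x + 0.8 * u))                 # count outcome, true log-scale effect 0.3

# Conventional linear IV (2SLS): first stage x ~ z, second stage y ~ fitted x
x_hat = np.polyval(np.polyfit(z, x, 1), z)
beta_2sls = np.polyfit(x_hat, y, 1)[0]

print(f"linear 2SLS estimate: {beta_2sls:.3f}  (additive scale)")
print("true multiplicative effect on the log mean: 0.3")
# The linear estimate targets an additive effect for an outcome generated multiplicatively,
# so it does not converge to the causal parameter of interest as n grows.
```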

  16. An application of the Health Action Process Approach model to oral hygiene behaviour and dental plaque in adolescents with fixed orthodontic appliances.

    PubMed

    Scheerman, Janneke F M; van Empelen, Pepijn; van Loveren, Cor; Pakpour, Amir H; van Meijel, Berno; Gholami, Maryam; Mierzaie, Zaher; van den Braak, Matheus C T; Verrips, Gijsbert H W

    2017-11-01

    The Health Action Process Approach (HAPA) model addresses health behaviours, but it has never been applied to model adolescents' oral hygiene behaviour during fixed orthodontic treatment. This study aimed to apply the HAPA model to explain adolescents' oral hygiene behaviour and dental plaque during orthodontic treatment with fixed appliances. In this cross-sectional study, 116 adolescents with fixed appliances from an orthodontic clinic situated in Almere (the Netherlands) completed a questionnaire assessing oral health behaviours and the psychosocial factors of the HAPA model. Linear regression analyses were performed to examine the factors associated with dental plaque, toothbrushing, and the use of a proxy brush. Stepwise regression analysis showed that lower amounts of plaque were significantly associated with higher frequency of the use of a proxy brush (R² = 45%), higher intention of the use of a proxy brush (R² = 5%), female gender (R² = 2%), and older age (R² = 2%). The multiple regression analyses revealed that higher action self-efficacy, intention, maintenance self-efficacy, and a higher education were significantly associated with the use of a proxy brush (R² = 45%). Decreased levels of dental plaque are mainly associated with increased use of a proxy brush that is subsequently associated with a higher intention and self-efficacy to use the proxy brush. © 2017 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. [New method of mixed gas infrared spectrum analysis based on SVM].

    PubMed

    Bai, Peng; Xie, Wen-Jun; Liu, Jun-Hua

    2007-07-01

    A new method of infrared spectrum analysis based on support vector machines (SVM) was proposed for gas mixtures. The kernel function in SVM was used to map the seriously overlapping absorption spectra into a high-dimensional space and, after the transformation, the high-dimensional data could be processed in the original space, so a regression calibration model was established and then applied to analyze the concentration of each component gas. It was also shown that the SVM regression calibration model could be used for component recognition in the gas mixture. The method was applied to the analysis of different data samples. Factors that affect the model, such as the scan interval, wavelength range, kernel function and penalty coefficient C, were discussed. Experimental results show that the maximal mean absolute error of the component concentrations is 0.132% and that the component recognition accuracy is higher than 94%. The problems of overlapping absorption spectra, of using the same method for both qualitative and quantitative analysis, and of a limited number of training samples were addressed. The method could be used in other infrared spectrum analyses of gas mixtures and has both theoretical and practical value.
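
    A generic scikit-learn stand-in for an SVM regression calibration of this kind (not the authors' implementation); the "spectra" below are simulated Gaussian absorption bands.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(13)
n_samples, n_wavenumbers = 200, 150
concentration = rng.uniform(0, 5, n_samples)                        # component gas concentration (%)
band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 70) / 8) ** 2)    # overlapping absorption band
spectra = concentration[:, None] * band + rng.normal(0, 0.05, (n_samples, n_wavenumbers))

X_train, X_test, y_train, y_test = train_test_split(spectra, concentration, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
model.fit(X_train, y_train)
print(f"mean absolute error: {mean_absolute_error(y_test, model.predict(X_test)):.3f} %")
```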

  18. Utility-Based Instruments for People with Dementia: A Systematic Review and Meta-Regression Analysis.

    PubMed

    Li, Li; Nguyen, Kim-Huong; Comans, Tracy; Scuffham, Paul

    2018-04-01

    Several utility-based instruments have been applied in cost-utility analysis to assess health state values for people with dementia. Nevertheless, concerns and uncertainty regarding their performance for people with dementia have been raised. To assess the performance of available utility-based instruments for people with dementia by comparing their psychometric properties and to explore factors that cause variations in the reported health state values generated from those instruments by conducting meta-regression analyses. A literature search was conducted and psychometric properties were synthesized to demonstrate the overall performance of each instrument. When available, health state values and variables such as the type of instrument and cognitive impairment levels were extracted from each article. A meta-regression analysis was undertaken and available covariates were included in the models. A total of 64 studies providing preference-based values were identified and included. The EuroQol five-dimension questionnaire demonstrated the best combination of feasibility, reliability, and validity. Meta-regression analyses suggested that significant differences exist between instruments, types of respondents, and modes of administration, and that the variations in estimated utility values had influences on incremental quality-adjusted life-year calculation. This review finds that the EuroQol five-dimension questionnaire is the most valid utility-based instrument for people with dementia, but should be replaced by others under certain circumstances. Although no utility estimates were reported in the article, the meta-regression analyses that examined variations in utility estimates produced by different instruments have an impact on cost-utility analysis, potentially altering the decision-making process in some circumstances. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  19. Quantification of the impact of a confounding variable on functional connectivity confirms anti-correlated networks in the resting-state.

    PubMed

    Carbonell, F; Bellec, P; Shmuel, A

    2014-02-01

    The effect of regressing out the global average signal (GAS) in resting state fMRI data has become a concern for interpreting functional connectivity analyses. It is not clear whether the reported anti-correlations between the Default Mode and the Dorsal Attention Networks are intrinsic to the brain, or are artificially created by regressing out the GAS. Here we introduce a concept, Impact of the Global Average on Functional Connectivity (IGAFC), for quantifying the sensitivity of seed-based correlation analyses to the regression of the GAS. This voxel-wise IGAFC index is defined as the product of two correlation coefficients: the correlation between the GAS and the fMRI time course of a voxel, times the correlation between the GAS and the seed time course. This definition enables the calculation of a threshold at which the impact of regressing-out the GAS would be large enough to introduce spurious negative correlations. It also yields a post-hoc impact correction procedure via thresholding, which eliminates spurious correlations introduced by regressing out the GAS. In addition, we introduce an Artificial Negative Correlation Index (ANCI), defined as the absolute difference between the IGAFC index and the impact threshold. The ANCI allows a graded confidence scale for ranking voxels according to their likelihood of showing artificial correlations. By applying this method, we observed regions in the Default Mode and Dorsal Attention Networks that were anti-correlated. These findings confirm that the previously reported negative correlations between the Dorsal Attention and Default Mode Networks are intrinsic to the brain and not the result of statistical manipulations. Our proposed quantification of the impact that a confound may have on functional connectivity can be generalized to global effect estimators other than the GAS. It can be readily applied to other confounds, such as systemic physiological or head movement interferences, in order to quantify their impact on functional connectivity in the resting state. © 2013.
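
    The IGAFC index as defined in the abstract (the product of two correlations with the global average signal) can be computed voxel-wise in a few lines; the arrays below are random placeholders for fMRI time courses.

```python
import numpy as np

rng = np.random.default_rng(14)
n_timepoints, n_voxels = 240, 1000
voxels = rng.normal(size=(n_timepoints, n_voxels))     # stand-in voxel time courses
seed = voxels[:, 0]                                    # seed-region time course
gas = voxels.mean(axis=1)                              # global average signal (GAS)

def corr_with(vec, mat):
    """Pearson correlation of one time course against every column of a matrix."""
    vec_c = vec - vec.mean()
    mat_c = mat - mat.mean(axis=0)
    return (vec_c @ mat_c) / (np.linalg.norm(vec_c) * np.linalg.norm(mat_c, axis=0))

# IGAFC(v) = corr(GAS, voxel v) * corr(GAS, seed)
igafc = corr_with(gas, voxels) * np.corrcoef(gas, seed)[0, 1]
print("IGAFC range:", igafc.min().round(3), "to", igafc.max().round(3))
```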

  20. New strategy for determination of anthocyanins, polyphenols and antioxidant capacity of Brassica oleracea liquid extract using infrared spectroscopies and multivariate regression

    NASA Astrophysics Data System (ADS)

    de Oliveira, Isadora R. N.; Roque, Jussara V.; Maia, Mariza P.; Stringheta, Paulo C.; Teófilo, Reinaldo F.

    2018-04-01

    A new method was developed to determine the antioxidant properties of red cabbage extract (Brassica oleracea) by mid (MID) and near (NIR) infrared spectroscopies and partial least squares (PLS) regression. A 70% (v/v) ethanolic extract of red cabbage was concentrated to 9° Brix and further diluted (12 to 100%) in water. The dilutions were used as external standards for the building of PLS models. For the first time, this strategy was applied for building multivariate regression models. Reference analyses and spectral data were obtained from the diluted extracts. The properties determined were total and monomeric anthocyanins, total polyphenols and antioxidant capacity by ABTS (2,2-azino-bis(3-ethyl-benzothiazoline-6-sulfonate)) and DPPH (2,2-diphenyl-1-picrylhydrazyl) methods. Ordered predictors selection (OPS) and genetic algorithm (GA) were used for feature selection before PLS regression (PLS-1). In addition, a PLS-2 regression was applied to all properties simultaneously. PLS-1 models provided more predictive models than did PLS-2 regression. PLS-OPS and PLS-GA models presented excellent prediction results with a correlation coefficient higher than 0.98. However, the best models were obtained using PLS and variable selection with the OPS algorithm, and the models based on NIR spectra were considered more predictive for all properties. These models thus provide a simple, rapid and accurate method for determining the antioxidant properties of red cabbage extract and its suitability for use in the food industry.
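
    An outline of the PLS-1 calibration step with scikit-learn (the OPS/GA variable selection is beyond this sketch); the spectra and reference values are simulated placeholders.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(15)
n_dilutions, n_wavelengths = 60, 700
anthocyanin = rng.uniform(5, 100, n_dilutions)                     # reference analysis (e.g. mg/L)
peak = np.exp(-0.5 * ((np.arange(n_wavelengths) - 350) / 40) ** 2) # single NIR-like band
nir = anthocyanin[:, None] * peak * 0.01 + rng.normal(0, 0.02, (n_dilutions, n_wavelengths))

pls = PLSRegression(n_components=5)
pred = cross_val_predict(pls, nir, anthocyanin, cv=10).ravel()     # cross-validated predictions
r = np.corrcoef(anthocyanin, pred)[0, 1]
print(f"cross-validated correlation coefficient: {r:.3f}")
```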

  1. An approach to checking case-crossover analyses based on equivalence with time-series methods.

    PubMed

    Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L

    2008-03-01

    The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.

  2. Optimizing methods for linking cinematic features to fMRI data.

    PubMed

    Kauttonen, Janne; Hlushchuk, Yevhen; Tikka, Pia

    2015-04-15

    One of the challenges of naturalistic neurosciences using movie-viewing experiments is how to interpret observed brain activations in relation to the multiplicity of time-locked stimulus features. As previous studies have shown less inter-subject synchronization across viewers of random video footage than story-driven films, new methods need to be developed for analysis of less story-driven contents. To optimize the linkage between our fMRI data collected during viewing of a deliberately non-narrative silent film 'At Land' by Maya Deren (1944) and its annotated content, we combined the method of elastic-net regularization with the model-driven linear regression and the well-established data-driven independent component analysis (ICA) and inter-subject correlation (ISC) methods. In the linear regression analysis, both IC and region-of-interest (ROI) time-series were fitted with time-series of a total of 36 binary-valued and one real-valued tactile annotation of film features. The elastic-net regularization and cross-validation were applied in the ordinary least-squares linear regression in order to avoid over-fitting due to the multicollinearity of regressors; the results were compared against both the partial least-squares (PLS) regression and the un-regularized full-model regression. A non-parametric permutation testing scheme was applied to evaluate the statistical significance of regression. We found statistically significant correlation between the annotation model and 9 ICs out of 40 ICs. Regression analysis was also repeated for a large set of cubic ROIs covering the grey matter. Both IC- and ROI-based regression analyses revealed activations in parietal and occipital regions, with additional smaller clusters in the frontal lobe. Furthermore, we found elastic-net based regression more sensitive than PLS and un-regularized regression since it detected a larger number of significant ICs and ROIs. Along with the ISC ranking methods, our regression analysis proved a feasible method for ordering the ICs based on their functional relevance to the annotated cinematic features. The novelty of our method, in comparison to the hypothesis-driven manual pre-selection and observation of some individual regressors biased by choice, lies in applying a data-driven approach to all content features simultaneously. We found especially the combination of regularized regression and ICA useful when analyzing fMRI data obtained using a non-narrative movie stimulus with a large set of complex and correlated features. Copyright © 2015. Published by Elsevier Inc.
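
    A compressed sketch of the regularized-regression step: one IC time course regressed on many correlated annotation regressors with a cross-validated elastic net; the annotation matrix here is random and the dimensions are only loosely taken from the abstract.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV

rng = np.random.default_rng(16)
n_volumes, n_features = 600, 37                      # 36 binary + 1 continuous annotation
annotations = rng.binomial(1, 0.2, (n_volumes, n_features)).astype(float)
annotations[:, -1] = rng.normal(size=n_volumes)      # the single real-valued (tactile) regressor
ic_timecourse = (annotations[:, [2, 5, 36]] @ np.array([1.0, -0.7, 0.5])
                 + rng.normal(0, 1, n_volumes))      # simulated IC driven by three features

enet = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5).fit(annotations, ic_timecourse)
print("non-zero annotation weights:", np.flatnonzero(enet.coef_))
print("cross-validated alpha:", round(enet.alpha_, 4))
```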

  3. A regularization corrected score method for nonlinear regression models with covariate error.

    PubMed

    Zucker, David M; Gorfine, Malka; Li, Yi; Tadesse, Mahlet G; Spiegelman, Donna

    2013-03-01

    Many regression analyses involve explanatory variables that are measured with error, and failing to account for this error is well known to lead to biased point and interval estimates of the regression coefficients. We present here a new general method for adjusting for covariate error. Our method consists of an approximate version of the Stefanski-Nakamura corrected score approach, using the method of regularization to obtain an approximate solution of the relevant integral equation. We develop the theory in the setting of classical likelihood models; this setting covers, for example, linear regression, nonlinear regression, logistic regression, and Poisson regression. The method is extremely general in terms of the types of measurement error models covered, and is a functional method in the sense of not involving assumptions on the distribution of the true covariate. We discuss the theoretical properties of the method and present simulation results in the logistic regression setting (univariate and multivariate). For illustration, we apply the method to data from the Harvard Nurses' Health Study concerning the relationship between physical activity and breast cancer mortality in the period following a diagnosis of breast cancer. Copyright © 2013, The International Biometric Society.

  4. A Primer for Analyzing Nested Data: Multilevel Modeling in SPSS Using an Example from a REL Study. REL 2015-046

    ERIC Educational Resources Information Center

    O'Dwyer, Laura M.; Parker, Caroline E.

    2014-01-01

    Analyzing data that possess some form of nesting is often challenging for applied researchers or district staff who are involved in or in charge of conducting data analyses. This report provides a description of the challenges for analyzing nested data and provides a primer of how multilevel regression modeling may be used to resolve these…
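
    A minimal sketch of the kind of multilevel regression the report describes, under stated assumptions (simulated students nested in schools, hypothetical variable names, statsmodels in Python rather than the SPSS workflow used in the primer):

```python
# Sketch only: two-level random-intercept model for students nested in schools.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_schools, n_per = 30, 40
school = np.repeat(np.arange(n_schools), n_per)
u = rng.normal(scale=2.0, size=n_schools)             # school-level random intercepts
ses = rng.normal(size=n_schools * n_per)              # student-level predictor
score = 50 + 3 * ses + u[school] + rng.normal(scale=5, size=n_schools * n_per)

df = pd.DataFrame({"score": score, "ses": ses, "school": school})

# Random intercept for school; fixed slope for the student-level predictor.
mlm = smf.mixedlm("score ~ ses", data=df, groups=df["school"]).fit()
print(mlm.summary())
```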

  5. Cardiovascular risk from water arsenic exposure in Vietnam: Application of systematic review and meta-regression analysis in chemical health risk assessment.

    PubMed

    Phung, Dung; Connell, Des; Rutherford, Shannon; Chu, Cordia

    2017-06-01

    A systematic review (SR) and meta-analysis cannot provide the endpoint answer for a chemical risk assessment (CRA). The objective of this study was to apply SR and meta-regression (MR) analysis to address this limitation, using a case study of cardiovascular risk from arsenic exposure in Vietnam. Published studies were searched from PubMed using the keywords of arsenic exposure and cardiovascular diseases (CVD). Random-effects meta-regression was applied to model the linear relationship between arsenic concentration in water and risk of CVD, and the no-observable-adverse-effect level (NOAEL) was then identified from the regression function. The probabilistic risk assessment (PRA) technique was applied to characterize the risk of CVD due to arsenic exposure by estimating the overlapping coefficient between the dose-response and exposure distribution curves. The risks were evaluated for groundwater, treated and drinking water. A total of 8 high quality studies for dose-response and 12 studies for exposure data were included in the final analyses. The results of MR suggested a NOAEL of 50 μg/L and a guideline value of 5 μg/L for arsenic in water, each half of the NOAEL and guideline values recommended by previous studies and authorities. The results of PRA indicated that the proportion of observed exposures exceeding the CVD risk level was 52% for groundwater, 24% for treated water, and 10% for drinking water in Vietnam. The study found that systematic review combined with meta-regression can be considered a suitable method for chemical risk assessment because of its ability to answer the endpoint question of a CRA. Copyright © 2017 Elsevier Ltd. All rights reserved.
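
    A simplified sketch of the meta-regression step only, under stated assumptions (illustrative made-up study-level data, a fixed-effect inverse-variance weighting for brevity; a full random-effects analysis would add a between-study variance component to the weights):

```python
# Sketch only: inverse-variance weighted meta-regression of log relative risk
# of CVD on water arsenic concentration. Study-level values are hypothetical.
import numpy as np
import statsmodels.api as sm

arsenic = np.array([10.0, 50.0, 100.0, 250.0, 500.0, 700.0, 860.0, 900.0])  # ug/L
log_rr = np.array([0.02, 0.05, 0.12, 0.25, 0.45, 0.60, 0.70, 0.74])
var_log_rr = np.array([0.01, 0.02, 0.02, 0.03, 0.04, 0.05, 0.05, 0.06])

X = sm.add_constant(arsenic)
fit = sm.WLS(log_rr, X, weights=1.0 / var_log_rr).fit()
print(fit.params)                # intercept and slope of the dose-response line

# The concentration at which the predicted effect stays below a chosen
# negligible-effect threshold can then be read off the fitted line.
threshold = 0.05
concentration_at_threshold = (threshold - fit.params[0]) / fit.params[1]
print(concentration_at_threshold)
```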

  6. Estimating Interaction Effects With Incomplete Predictor Variables

    PubMed Central

    Enders, Craig K.; Baraldi, Amanda N.; Cham, Heining

    2014-01-01

    The existing missing data literature does not provide a clear prescription for estimating interaction effects with missing data, particularly when the interaction involves a pair of continuous variables. In this article, we describe maximum likelihood and multiple imputation procedures for this common analysis problem. We outline 3 latent variable model specifications for interaction analyses with missing data. These models apply procedures from the latent variable interaction literature to analyses with a single indicator per construct (e.g., a regression analysis with scale scores). We also discuss multiple imputation for interaction effects, emphasizing an approach that applies standard imputation procedures to the product of 2 raw score predictors. We thoroughly describe the process of probing interaction effects with maximum likelihood and multiple imputation. For both missing data handling techniques, we outline centering and transformation strategies that researchers can implement in popular software packages, and we use a series of real data analyses to illustrate these methods. Finally, we use computer simulations to evaluate the performance of the proposed techniques. PMID:24707955
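
    A minimal sketch of the "impute the raw-score product as just another variable" strategy mentioned above, assuming simulated data, hypothetical variable names, and a single stochastic imputation with scikit-learn for brevity (a full multiple-imputation analysis would repeat the imputation and pool estimates):

```python
# Sketch only: impute the product term directly, then fit the interaction model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(3)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1 + 0.5 * x1 + 0.4 * x2 + 0.3 * x1 * x2 + rng.normal(size=n)

df = pd.DataFrame({"y": y, "x1": x1, "x2": x2})
df.loc[rng.random(n) < 0.2, "x1"] = np.nan      # impose missingness on one predictor
df["x1x2"] = df["x1"] * df["x2"]                # raw-score product, missing where x1 is

# Impute y, x1, x2 and the product jointly; use the imputed product as-is
# rather than recomputing it after imputation.
imputed = pd.DataFrame(IterativeImputer(random_state=0).fit_transform(df),
                       columns=df.columns)
fit = smf.ols("y ~ x1 + x2 + x1x2", data=imputed).fit()
print(fit.params)
```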

  7. Ratio manipulating spectrophotometry versus chemometry as stability indicating methods for cefquinome sulfate determination

    NASA Astrophysics Data System (ADS)

    Yehia, Ali M.; Arafa, Reham M.; Abbas, Samah S.; Amer, Sawsan M.

    2016-01-01

    Spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate and rapid spectrophotometric methods were performed for the determination of CFQ in the presence of either its hydrolytic, oxidative or photo-degradation products. The proposed ratio difference, derivative ratio and mean centering are ratio manipulating spectrophotometric methods that were satisfactorily applied for selective determination of CFQ within a linear range of 5.0-40.0 μg mL(-1). Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. Twenty-five experimentally designed synthetic mixtures of three factors at five levels were used to calibrate and validate the multivariate models. Advanced chemometrics succeeded in quantitative and qualitative analyses of CFQ along with its hydrolytic, oxidative and photo-degradation products. The proposed methods were applied successfully for the analyses of different pharmaceutical formulations. These developed methods were simple and cost-effective compared with the manufacturer's RP-HPLC method.
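
    A minimal sketch of the benchmark multivariate calibration step, under stated assumptions (simulated spectra and concentrations, scikit-learn rather than the authors' chemometric software): PLS regression mapping mixture spectra to analyte concentrations with cross-validated prediction error.

```python
# Sketch only: PLS calibration on simulated spectra of designed mixtures.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(4)
n_mix, n_wavelengths = 25, 200
conc = rng.uniform(5, 40, size=(n_mix, 3))        # drug + two degradants (hypothetical)
pure = rng.random((3, n_wavelengths))             # pure-component spectra
spectra = conc @ pure + rng.normal(scale=0.01, size=(n_mix, n_wavelengths))

pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, spectra, conc, cv=5)   # cross-validated predictions
rmsecv = np.sqrt(np.mean((pred - conc) ** 2, axis=0))
print(rmsecv)                                        # error per analyte
```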

  8. Bayesian models for comparative analysis integrating phylogenetic uncertainty.

    PubMed

    de Villemereuil, Pierre; Wells, Jessie A; Edwards, Robert D; Blomberg, Simon P

    2012-06-28

    Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language.

  9. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    PubMed Central

    2012-01-01

    Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language. PMID:22741602

  10. Recognising out-of-hospital cardiac arrest during emergency calls increases bystander cardiopulmonary resuscitation and survival.

    PubMed

    Viereck, Søren; Møller, Thea Palsgaard; Ersbøll, Annette Kjær; Bækgaard, Josefine Stokholm; Claesson, Andreas; Hollenberg, Jacob; Folke, Fredrik; Lippert, Freddy K

    2017-06-01

    Initiation of early bystander cardiopulmonary resuscitation (CPR) depends on bystanders' or medical dispatchers' recognition of out-of-hospital cardiac arrest (OHCA). The primary aim of our study was to investigate if OHCA recognition during the emergency call was associated with bystander CPR, return of spontaneous circulation (ROSC), and 30-day survival. Our secondary aim was to identify patient-, setting-, and dispatcher-related predictors of OHCA recognition. We performed an observational study of all OHCA patients' emergency calls in the Capital Region of Denmark from 01/01/2013-31/12/2013. OHCAs were collected from the Danish Cardiac Arrest Registry and the Mobile Critical Care Unit database. Emergency call recordings were identified and evaluated. Multivariable logistic regression analyses were applied to all OHCAs and witnessed OHCAs only to analyse the association between OHCA recognition and bystander CPR, ROSC, and 30-day survival. Univariable logistic regression analyses were applied to identify predictors of OHCA recognition. We included 779 emergency calls in the analyses. During the emergency calls, 70.1% (n=534) of OHCAs were recognised; OHCA recognition was positively associated with bystander CPR (odds ratio [OR]=7.84, 95% confidence interval [CI]: 5.10-12.05) in all OHCAs; and ROSC (OR=1.86, 95% CI: 1.13-3.06) and 30-day survival (OR=2.80, 95% CI: 1.58-4.96) in witnessed OHCA. Predictors of OHCA recognition were addressing breathing (OR=1.76, 95% CI: 1.17-2.66) and callers located by the patient's side (OR=2.16, 95% CI: 1.46-3.19). Recognition of OHCA during emergency calls was positively associated with the provision of bystander CPR, ROSC, and 30-day survival in witnessed OHCA. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
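
    A minimal sketch of the type of logistic regression reported above, assuming simulated data and hypothetical variable names (statsmodels): odds ratios with 95% confidence intervals for an outcome given dispatcher recognition, adjusted for a covariate.

```python
# Sketch only: adjusted odds ratios and 95% CIs from a logistic regression.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 779
recognised = rng.binomial(1, 0.7, n)            # OHCA recognised during the call
witnessed = rng.binomial(1, 0.5, n)             # covariate
logit_p = -1.0 + 2.0 * recognised + 0.3 * witnessed
cpr = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))   # bystander CPR outcome

df = pd.DataFrame({"cpr": cpr, "recognised": recognised, "witnessed": witnessed})
fit = smf.logit("cpr ~ recognised + witnessed", data=df).fit(disp=False)

odds_ratios = np.exp(fit.params)
ci = np.exp(fit.conf_int())                     # 95% CIs on the odds-ratio scale
print(pd.concat([odds_ratios, ci], axis=1))
```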

  11. Collapse susceptibility mapping in karstified gypsum terrain (Sivas basin - Turkey) by conditional probability, logistic regression, artificial neural network models

    NASA Astrophysics Data System (ADS)

    Yilmaz, Isik; Keskin, Inan; Marschalko, Marian; Bednarik, Martin

    2010-05-01

    This study compares GIS-based collapse susceptibility mapping methods, namely conditional probability (CP), logistic regression (LR) and artificial neural networks (ANN), applied to gypsum rock masses in the Sivas basin (Turkey). A Digital Elevation Model (DEM) was first constructed using GIS software. Collapse-related factors, directly or indirectly related to the causes of collapse occurrence, such as distance from faults, slope angle and aspect, topographical elevation, distance from drainage, topographic wetness index (TWI), stream power index (SPI), Normalized Difference Vegetation Index (NDVI) as a measure of vegetation cover, and distance from roads and settlements, were used in the collapse susceptibility analyses. In the last stage of the analyses, collapse susceptibility maps were produced from the CP, LR and ANN models, and they were then compared by means of their validations. Area Under Curve (AUC) values obtained from all three methodologies showed that the map obtained from the ANN model appears more accurate than the other models, and the results also showed that artificial neural networks are a useful tool in the preparation of collapse susceptibility maps and are highly compatible with GIS operating features. Key words: Collapse; doline; susceptibility map; gypsum; GIS; conditional probability; logistic regression; artificial neural networks.
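
    A minimal sketch of the LR-versus-ANN comparison by AUC, under stated assumptions (random synthetic data, hypothetical conditioning factors, scikit-learn rather than the GIS workflow used in the study):

```python
# Sketch only: compare logistic regression and a small neural network by AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n = 2000
X = rng.normal(size=(n, 6))   # e.g. slope, aspect, elevation, distances to faults/drainage/roads
y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.6 * X[:, 3]))))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
ann = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

for name, model in [("LR", lr), ("ANN", ann)]:
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(name, round(auc, 3))
```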

  12. Application of neural networks and sensitivity analysis to improved prediction of trauma survival.

    PubMed

    Hunter, A; Kennedy, L; Henry, J; Ferguson, I

    2000-05-01

    The performance of trauma departments is widely audited by applying predictive models that assess probability of survival, and examining the rate of unexpected survivals and deaths. Although the TRISS methodology, a logistic regression modelling technique, is still the de facto standard, it is known that neural network models perform better. A key issue when applying neural network models is the selection of input variables. This paper proposes a novel form of sensitivity analysis, which is simpler to apply than existing techniques, and can be used for both numeric and nominal input variables. The technique is applied to the audit survival problem, and used to analyse the TRISS variables. The conclusions discuss the implications for the design of further improved scoring schemes and predictive models.

  13. Differential item functioning analysis with ordinal logistic regression techniques. DIFdetect and difwithpar.

    PubMed

    Crane, Paul K; Gibbons, Laura E; Jolley, Lance; van Belle, Gerald

    2006-11-01

    We present an ordinal logistic regression model for identification of items with differential item functioning (DIF) and apply this model to a Mini-Mental State Examination (MMSE) dataset. We employ item response theory ability estimation in our models. Three nested ordinal logistic regression models are applied to each item. Model testing begins with examination of the statistical significance of the interaction term between ability and the group indicator, consistent with nonuniform DIF. Then we turn our attention to the coefficient of the ability term in models with and without the group term. If including the group term has a marked effect on that coefficient, we declare that it has uniform DIF. We examined DIF related to language of test administration in addition to self-reported race, Hispanic ethnicity, age, years of education, and sex. We used PARSCALE for IRT analyses and STATA for ordinal logistic regression approaches. We used an iterative technique for adjusting IRT ability estimates on the basis of DIF findings. Five items were found to have DIF related to language. These same items also had DIF related to other covariates. The ordinal logistic regression approach to DIF detection, when combined with IRT ability estimates, provides a reasonable alternative for DIF detection. There appear to be several items with significant DIF related to language of test administration in the MMSE. More attention needs to be paid to the specific criteria used to determine whether an item has DIF, not just the technique used to identify DIF.
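
    A minimal sketch of the nested ordinal logistic regression comparisons described above, under stated assumptions (simulated item responses, hypothetical names, statsmodels' OrderedModel in place of STATA, and likelihood-ratio comparisons used here to illustrate the nesting):

```python
# Sketch only: nested ordinal logistic models for uniform and nonuniform DIF.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(7)
n = 1000
ability = rng.normal(size=n)                    # IRT ability estimate
group = rng.binomial(1, 0.5, n)                 # e.g. language of administration
latent = 1.2 * ability + 0.5 * group + rng.logistic(size=n)   # uniform DIF built in
item = pd.cut(latent, bins=[-np.inf, -1, 1, np.inf], labels=False)  # ordinal item score

df = pd.DataFrame({"item": item, "ability": ability, "group": group})
df["abil_x_grp"] = df["ability"] * df["group"]

def loglike(cols):
    return OrderedModel(df["item"], df[cols], distr="logit").fit(
        method="bfgs", disp=False).llf

ll1 = loglike(["ability"])                          # ability only
ll2 = loglike(["ability", "group"])                 # + group term: uniform DIF
ll3 = loglike(["ability", "group", "abil_x_grp"])   # + interaction: nonuniform DIF
print("LR chi2, uniform DIF:", 2 * (ll2 - ll1))
print("LR chi2, nonuniform DIF:", 2 * (ll3 - ll2))
```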

  14. Independent contrasts and PGLS regression estimators are equivalent.

    PubMed

    Blomberg, Simon P; Lefevre, James G; Wells, Jessie A; Waterhouse, Mary

    2012-05-01

    We prove that the slope parameter of the ordinary least squares regression of phylogenetically independent contrasts (PICs) conducted through the origin is identical to the slope parameter of the method of generalized least squares (GLSs) regression under a Brownian motion model of evolution. This equivalence has several implications: 1. Understanding the structure of the linear model for GLS regression provides insight into when and why phylogeny is important in comparative studies. 2. The limitations of the PIC regression analysis are the same as the limitations of the GLS model. In particular, phylogenetic covariance applies only to the response variable in the regression and the explanatory variable should be regarded as fixed. Calculation of PICs for explanatory variables should be treated as a mathematical idiosyncrasy of the PIC regression algorithm. 3. Since the GLS estimator is the best linear unbiased estimator (BLUE), the slope parameter estimated using PICs is also BLUE. 4. If the slope is estimated using different branch lengths for the explanatory and response variables in the PIC algorithm, the estimator is no longer the BLUE, so this is not recommended. Finally, we discuss whether or not and how to accommodate phylogenetic covariance in regression analyses, particularly in relation to the problem of phylogenetic uncertainty. This discussion is from both frequentist and Bayesian perspectives.
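
    A minimal worked example of the GLS estimator referred to above, assuming a toy three-taxon phylogenetic covariance matrix and made-up trait values: the slope is obtained from the usual GLS formula with the Brownian-motion covariance matrix in place of an identity matrix.

```python
# Sketch only: GLS regression under a Brownian-motion covariance matrix C,
# beta_hat = (X' C^-1 X)^-1 X' C^-1 y, to which the through-the-origin PIC
# slope is equivalent.
import numpy as np

# Shared-path-length (phylogenetic) covariance matrix for 3 taxa (hypothetical).
C = np.array([[1.0, 0.6, 0.2],
              [0.6, 1.0, 0.2],
              [0.2, 0.2, 1.0]])
x = np.array([0.5, 1.0, 2.0])          # explanatory trait (treated as fixed)
y = np.array([1.1, 1.9, 3.8])          # response trait

X = np.column_stack([np.ones(3), x])   # intercept + slope design matrix
Cinv = np.linalg.inv(C)
beta = np.linalg.solve(X.T @ Cinv @ X, X.T @ Cinv @ y)
print(beta)                            # [intercept, GLS slope]
```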

  15. Impact of covariate models on the assessment of the air pollution-mortality association in a single- and multipollutant context.

    PubMed

    Sacks, Jason D; Ito, Kazuhiko; Wilson, William E; Neas, Lucas M

    2012-10-01

    With the advent of multicity studies, uniform statistical approaches have been developed to examine air pollution-mortality associations across cities. To assess the sensitivity of the air pollution-mortality association to different model specifications in a single and multipollutant context, the authors applied various regression models developed in previous multicity time-series studies of air pollution and mortality to data from Philadelphia, Pennsylvania (May 1992-September 1995). Single-pollutant analyses used daily cardiovascular mortality, fine particulate matter (particles with an aerodynamic diameter ≤2.5 µm; PM(2.5)), speciated PM(2.5), and gaseous pollutant data, while multipollutant analyses used source factors identified through principal component analysis. In single-pollutant analyses, risk estimates were relatively consistent across models for most PM(2.5) components and gaseous pollutants. However, risk estimates were inconsistent for ozone in all-year and warm-season analyses. Principal component analysis yielded factors with species associated with traffic, crustal material, residual oil, and coal. Risk estimates for these factors exhibited less sensitivity to alternative regression models compared with single-pollutant models. Factors associated with traffic and crustal material showed consistently positive associations in the warm season, while the coal combustion factor showed consistently positive associations in the cold season. Overall, mortality risk estimates examined using a source-oriented approach yielded more stable and precise risk estimates, compared with single-pollutant analyses.

  16. Personal discrimination and satisfaction with life: Exploring perceived functional effects of Asian American race/ethnicity as a moderator.

    PubMed

    Tran, Alisia G T T; Sangalang, Cindy C

    2016-01-01

    This study aims to understand the relations between experiences of racial/ethnic discrimination, perceptions of the harmful or helpful effects of one's Asian American race/ethnicity within educational and occupational contexts (perceived functional effects), and well-being (i.e., satisfaction with life). A primary focus was to evaluate whether the association between racial/ethnic discrimination and satisfaction with life varied based on the degree to which Asian Americans believe that their race or ethnicity is helpful or harmful to educational and occupational functioning. This study draws on nationally representative data from ethnically diverse Asian American adults (N = 3,335) and utilizes weighted descriptive, correlational, and logistic regression moderation analyses. Ethnic variations emerged across analyses. Logistic regression analyses revealed a significant moderation effect for Chinese and Filipino Americans. Follow-up analyses revealed a protective effect of perceiving more positive or helpful functional effects in nullifying the link between discrimination and dissatisfaction with life for Chinese Americans. By contrast, viewing more harmful functional effects had a buffering effect for Filipino Americans. Results have implications for conceptualizing the potential impact of perspectives that imply Asian American advantage or disadvantage. Opportunities to apply and extend these initial findings are discussed. (c) 2016 APA, all rights reserved).

  17. Incorporating Measurement Error from Modeled Air Pollution Exposures into Epidemiological Analyses.

    PubMed

    Samoli, Evangelia; Butland, Barbara K

    2017-12-01

    Outdoor air pollution exposures used in epidemiological studies are commonly predicted from spatiotemporal models incorporating limited measurements, temporal factors, geographic information system variables, and/or satellite data. Measurement error in these exposure estimates leads to imprecise estimation of health effects and their standard errors. We reviewed methods for measurement error correction that have been applied in epidemiological studies that use model-derived air pollution data. We identified seven cohort studies and one panel study that have employed measurement error correction methods. These methods included regression calibration, risk set regression calibration, regression calibration with instrumental variables, the simulation extrapolation approach (SIMEX), and methods under the non-parametric or parameter bootstrap. Corrections resulted in small increases in the absolute magnitude of the health effect estimate and its standard error under most scenarios. Limited application of measurement error correction methods in air pollution studies may be attributed to the absence of exposure validation data and the methodological complexity of the proposed methods. Future epidemiological studies should consider in their design phase the requirements for the measurement error correction method to be later applied, while methodological advances are needed under the multi-pollutants setting.
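
    A minimal sketch of one of the reviewed corrections, regression calibration, under stated assumptions (simulated data, hypothetical variable names, a naive single-stage correction without the bootstrap standard errors a real analysis would need):

```python
# Sketch only: regression calibration for an error-prone modelled exposure.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 2000
true_exp = rng.normal(10, 2, n)
modelled_exp = true_exp + rng.normal(0, 1.5, n)          # exposure with error
y = rng.poisson(np.exp(0.5 + 0.05 * true_exp))           # health outcome counts

val = rng.random(n) < 0.1                                 # 10% validation subset
calib = sm.OLS(true_exp[val], sm.add_constant(modelled_exp[val])).fit()
exp_hat = calib.predict(sm.add_constant(modelled_exp))    # calibrated exposure

naive = sm.GLM(y, sm.add_constant(modelled_exp), family=sm.families.Poisson()).fit()
corrected = sm.GLM(y, sm.add_constant(exp_hat), family=sm.families.Poisson()).fit()
print(naive.params[1], corrected.params[1])   # attenuated vs corrected slope
# Note: standard errors for the corrected estimate should be obtained by
# bootstrap or an analytic correction, omitted here for brevity.
```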

  18. "Assessing the methodological quality of systematic reviews in radiation oncology: A systematic review".

    PubMed

    Hasan, Haroon; Muhammed, Taaha; Yu, Jennifer; Taguchi, Kelsi; Samargandi, Osama A; Howard, A Fuchsia; Lo, Andrea C; Olson, Robert; Goddard, Karen

    2017-10-01

    The objective of our study was to evaluate the methodological quality of systematic reviews and meta-analyses in Radiation Oncology. A systematic literature search was conducted for all eligible systematic reviews and meta-analyses in Radiation Oncology from 1966 to 2015. Methodological characteristics were abstracted from all works that satisfied the inclusion criteria and quality was assessed using the critical appraisal tool, AMSTAR. Regression analyses were performed to determine factors associated with a higher score of quality. Following exclusion based on a priori criteria, 410 studies (157 systematic reviews and 253 meta-analyses) satisfied the inclusion criteria. Meta-analyses were found to be of fair to good quality while systematic reviews were found to be of less than fair quality. Factors associated with higher scores of quality in the multivariable analysis were including primary studies consisting of randomized control trials, performing a meta-analysis, and applying a recommended guideline related to establishing a systematic review protocol and/or reporting. Systematic reviews and meta-analyses may introduce a high risk of bias if applied to inform decision-making based on AMSTAR. We recommend that decision-makers in Radiation Oncology scrutinize the methodological quality of systematic reviews and meta-analyses prior to assessing their utility to inform evidence-based medicine and researchers adhere to methodological standards outlined in validated guidelines when embarking on a systematic review. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Prediction models for Arabica coffee beverage quality based on aroma analyses and chemometrics.

    PubMed

    Ribeiro, J S; Augusto, F; Salva, T J G; Ferreira, M M C

    2012-11-15

    In this work, soft modeling based on chemometric analyses of coffee beverage sensory data and the chromatographic profiles of volatile roasted coffee compounds is proposed to predict the scores of acidity, bitterness, flavor, cleanliness, body, and overall quality of the coffee beverage. A partial least squares (PLS) regression method was used to construct the models. The ordered predictor selection (OPS) algorithm was applied to select the compounds for the regression model of each sensory attribute in order to take only significant chromatographic peaks into account. The prediction errors of these models, using 4 or 5 latent variables, were equal to 0.28, 0.33, 0.35, 0.33, 0.34 and 0.41, for each of the attributes and compatible with the errors of the mean scores of the experts. Thus, the results proved the feasibility of using a similar methodology in on-line or routine applications to predict the sensory quality of Brazilian Arabica coffee. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. Advanced spectrophotometric chemometric methods for resolving the binary mixture of doxylamine succinate and pyridoxine hydrochloride.

    PubMed

    Katsarov, Plamen; Gergov, Georgi; Alin, Aylin; Pilicheva, Bissera; Al-Degs, Yahya; Simeonov, Vasil; Kassarova, Margarita

    2018-03-01

    The prediction power of partial least squares (PLS) and multivariate curve resolution-alternating least squares (MCR-ALS) methods have been studied for simultaneous quantitative analysis of the binary drug combination - doxylamine succinate and pyridoxine hydrochloride. Analysis of first-order UV overlapped spectra was performed using different PLS models - classical PLS1 and PLS2 as well as partial robust M-regression (PRM). These linear models were compared to MCR-ALS with equality and correlation constraints (MCR-ALS-CC). All techniques operated within the full spectral region and extracted maximum information for the drugs analysed. The developed chemometric methods were validated on external sample sets and were applied to the analyses of pharmaceutical formulations. The obtained statistical parameters were satisfactory for calibration and validation sets. All developed methods can be successfully applied for simultaneous spectrophotometric determination of doxylamine and pyridoxine both in laboratory-prepared mixtures and commercial dosage forms.

  1. Robust inference under the beta regression model with application to health care studies.

    PubMed

    Ghosh, Abhik

    2017-01-01

    Data on rates, percentages, or proportions arise frequently in many different applied disciplines like medical biology, health care, psychology, and several others. In this paper, we develop a robust inference procedure for the beta regression model, which is used to describe such response variables taking values in (0, 1) through some related explanatory variables. In relation to the beta regression model, the issue of robustness has been largely ignored in the literature so far. Existing maximum likelihood-based inference seriously lacks robustness against outliers in the data and generates drastically different (erroneous) inferences in the presence of data contamination. Here, we develop the robust minimum density power divergence estimator and a class of robust Wald-type tests for the beta regression model along with several applications. We derive their asymptotic properties and describe their robustness theoretically through influence function analyses. Finite sample performances of the proposed estimators and tests are examined through suitable simulation studies and real data applications in the context of health care and psychology. Although we primarily focus on the beta regression models with a fixed dispersion parameter, some indications are also provided for extension to the variable dispersion beta regression models with an application.

  2. Using heart rate to predict energy expenditure in large domestic dogs.

    PubMed

    Gerth, N; Ruoß, C; Dobenecker, B; Reese, S; Starck, J M

    2016-06-01

    The aim of this study was to establish heart rate as a measure of energy expenditure in large active kennel dogs (28 ± 3 kg bw). Therefore, the heart rate (HR)-oxygen consumption (V˙O2) relationship was analysed in Foxhound-Boxer-Ingelheim-Labrador cross-breds (FBI dogs) at rest and graded levels of exercise on a treadmill up to 60-65% of maximal aerobic capacity. To test for effects of training, HR and V˙O2 were measured in female dogs, before and after a training period, and after an adjacent training pause to test for reversibility of potential effects. Least squares regression was applied to describe the relationship between HR and V˙O2. The applied training had no statistically significant effect on the HR-V˙O2 regression. A general regression line from all data collected was prepared to establish a general predictive equation for energy expenditure from HR in FBI dogs. The regression equation established in this study enables fast estimation of energy requirement for running activity. The equation is valid for large dogs weighing around 30 kg that run at ground level up to 15 km/h with a heart rate maximum of 190 bpm irrespective of the training level. Journal of Animal Physiology and Animal Nutrition © 2015 Blackwell Verlag GmbH.

  3. PSHREG: A SAS macro for proportional and nonproportional subdistribution hazards regression

    PubMed Central

    Kohl, Maria; Plischke, Max; Leffondré, Karen; Heinze, Georg

    2015-01-01

    We present a new SAS macro %pshreg that can be used to fit a proportional subdistribution hazards model for survival data subject to competing risks. Our macro first modifies the input data set appropriately and then applies SAS's standard Cox regression procedure, PROC PHREG, using weights and counting-process style of specifying survival times to the modified data set. The modified data set can also be used to estimate cumulative incidence curves for the event of interest. The application of PROC PHREG has several advantages, e.g., it directly enables the user to apply the Firth correction, which has been proposed as a solution to the problem of undefined (infinite) maximum likelihood estimates in Cox regression, frequently encountered in small sample analyses. Deviation from proportional subdistribution hazards can be detected by both inspecting Schoenfeld-type residuals and testing correlation of these residuals with time, or by including interactions of covariates with functions of time. We illustrate application of these extended methods for competing risk regression using our macro, which is freely available at: http://cemsiis.meduniwien.ac.at/en/kb/science-research/software/statistical-software/pshreg, by means of analysis of a real chronic kidney disease study. We discuss differences in features and capabilities of %pshreg and the recent (January 2014) SAS PROC PHREG implementation of proportional subdistribution hazards modelling. PMID:25572709

  4. Ratio manipulating spectrophotometry versus chemometry as stability indicating methods for cefquinome sulfate determination.

    PubMed

    Yehia, Ali M; Arafa, Reham M; Abbas, Samah S; Amer, Sawsan M

    2016-01-15

    Spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate and rapid spectrophotometric methods were performed for the determination of CFQ in the presence of either its hydrolytic, oxidative or photo-degradation products. The proposed ratio difference, derivative ratio and mean centering are ratio manipulating spectrophotometric methods that were satisfactorily applied for selective determination of CFQ within a linear range of 5.0-40.0 μg mL(-1). Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. Twenty-five experimentally designed synthetic mixtures of three factors at five levels were used to calibrate and validate the multivariate models. Advanced chemometrics succeeded in quantitative and qualitative analyses of CFQ along with its hydrolytic, oxidative and photo-degradation products. The proposed methods were applied successfully for the analyses of different pharmaceutical formulations. These developed methods were simple and cost-effective compared with the manufacturer's RP-HPLC method. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Adjusting for publication biases across similar interventions performed well when compared with gold standard data.

    PubMed

    Moreno, Santiago G; Sutton, Alex J; Ades, A E; Cooper, Nicola J; Abrams, Keith R

    2011-11-01

    To extend, apply, and evaluate a regression-based approach to adjusting meta-analysis for publication and related biases. The approach uses related meta-analyses to improve estimation by borrowing strength on the degree of bias. The proposed adjustment approach is described. Adjustments are applied both independently and by borrowing strength across journal-extracted data on the effectiveness of 12 antidepressant drugs from placebo-controlled trials. The methods are also applied to Food and Drug Administration (FDA) data obtained on the same 12 drugs. Results are compared, viewing the FDA observed data as gold standard. Estimates adjusted for publication biases made independently for each drug were very uncertain using both the journal and FDA data. Adjusted estimates were much more precise when borrowing strength across meta-analyses. Reassuringly, adjustments in this way made to the journal data agreed closely with the observed estimates from the FDA data, while the adjusted FDA results changed only minimally from those observed from the FDA data. The method worked well in the case study considered and therefore further evaluation is encouraged. It is suggested that this approach may be especially useful when adjusting several meta-analyses on similar interventions and outcomes, particularly when there are small numbers of studies. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    PubMed

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
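
    A minimal sketch of the power computation behind a multiple-regression F test (not G*Power itself), assuming the common convention that the noncentrality parameter is f² times the sample size when all predictors are tested:

```python
# Sketch only: power of the overall F test in multiple linear regression.
from scipy.stats import f as f_dist, ncf

def regression_power(n, p, f2, alpha=0.05):
    """Power for testing p predictors with N subjects and effect size f^2."""
    df1, df2 = p, n - p - 1
    lam = f2 * n                                   # noncentrality parameter
    f_crit = f_dist.ppf(1 - alpha, df1, df2)
    return 1 - ncf.cdf(f_crit, df1, df2, lam)

# Power for N = 100, 5 predictors, medium effect size f^2 = 0.15.
print(round(regression_power(100, 5, 0.15), 3))
```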

  7. Measurement of tidal volume using respiratory ultrasonic plethysmography in anaesthetized, mechanically ventilated horses.

    PubMed

    Russold, Elena; Ambrisko, Tamas D; Schramel, Johannes P; Auer, Ulrike; Van Den Hoven, Rene; Moens, Yves P

    2013-01-01

    To compare tidal volume estimations obtained from Respiratory Ultrasonic Plethysmography (RUP) with simultaneous spirometric measurements in anaesthetized, mechanically ventilated horses. Prospective randomized experimental study. Five experimental horses. Five horses were anaesthetized twice (1 week apart) in random order in lateral and in dorsal recumbency. Nine ventilation modes (treatments) were scheduled in random order (each lasting 4 minutes) applying combinations of different tidal volumes (8, 10, 12 mL kg(-1)) and positive end-expiratory pressures (PEEP) (0, 10, 20 cm H(2)O). Baseline ventilation mode (tidal volume=15 mL kg(-1), PEEP=0 cm H(2)O) was applied for 4 minutes between all treatments. Spirometry and RUP data were downloaded to personal computers. Linear regression analyses (RUP versus spirometric tidal volume) were performed using different subsets of data. Additionally, RUP was calibrated against spirometry using a regression equation for all RUP signal values (thoracic, abdominal and combined) with all data collectively, and also by an individually determined best regression equation (highest R(2)) for each experiment (horse versus recumbency) separately. Agreement between methods was assessed with Bland-Altman analyses. The highest correlation of RUP and spirometric tidal volume (R(2)=0.81) was found with the combined RUP signal in horses in lateral recumbency and ventilated without PEEP. The bias ±2 SD was 0±2.66 L when RUP was calibrated for collective data, but decreased to 0±0.87 L when RUP was calibrated with individual data. A possible use of RUP for tidal volume measurement during IPPV needs individual calibration to obtain limits of agreement within ±20%. © 2012 The Authors. Veterinary Anaesthesia and Analgesia. © 2012 Association of Veterinary Anaesthetists and the American College of Veterinary Anesthesiologists.
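
    A minimal sketch of a Bland-Altman agreement computation like the one used above, assuming simulated paired tidal-volume measurements in litres (the paper reports bias ± 2 SD; 1.96 SD is the conventional choice shown here):

```python
# Sketch only: bias and limits of agreement between two paired measurements.
import numpy as np

rng = np.random.default_rng(9)
spiro = rng.normal(6.0, 1.0, 60)               # spirometric tidal volume (L)
rup = spiro + rng.normal(0.0, 0.4, 60)         # RUP estimate with random error

diff = rup - spiro
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)           # limits-of-agreement half-width
print(f"bias = {bias:.2f} L, "
      f"limits of agreement = {bias - half_width:.2f} to {bias + half_width:.2f} L")
```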

  8. Interannual drought index variations in Central Europe related to large-scale atmospheric circulation

    NASA Astrophysics Data System (ADS)

    Beck, Christoph; Philipp, Andreas; Jacobeit, Jucundus

    2014-05-01

    This contribution investigates the relationship between large-scale atmospheric circulation and interannual variations of the standardized precipitation index (SPI) in central Europe. To this end, occurrence frequencies of circulation types (CT), derived from a variety of circulation type classifications (CTC) applied to daily sea level pressure (SLP) data, and mean circulation indices of vorticity (V), zonality (Z) and meridionality (M) have been utilized as predictors within multiple regression models (MRM) for the estimation of gridded 3-month SPI values over central Europe for the period 1950 to 2010. CTC based MRMs used in the analyses comprise variants concerning the basic method for CT classification, the number of CTs, the size and location of the spatial domain used for CTCs and the exclusive use of CT frequencies or the combined use of CT frequencies and mean circulation indices as predictors. Adequate MRM predictor combinations have been identified by applying stepwise multiple regression analyses within a resampling framework. The performance (robustness) of the resulting MRMs has been quantified based on a leave-one-out cross-validation procedure applying several skill scores. Furthermore, the relative importance of individual predictors has been estimated for each MRM. From these analyses it can be stated that (i) the consideration of vorticity characteristics within CTCs, (ii) a relatively small size of the spatial domain to which CTCs are applied and (iii) the inclusion of mean circulation indices appear to improve model skill. However, model skill exhibits distinct variations between seasons and regions. Whereas promising skill can be stated for the western and northwestern parts of the central European domain, only unsatisfactory skill is reached in the more continental regions and particularly during summer. Thus, it can be concluded that the approaches presented here feature the potential for the downscaling of central European drought index variations from the large-scale circulation, at least for some regions. Further improvements of CTC based approaches may be expected from the optimization of CTCs for explaining the SPI, e.g. via the inclusion of additional variables into the classification procedure.

  9. Interannual drought index variations in Central Europe related to the large-scale atmospheric circulation—application and evaluation of statistical downscaling approaches based on circulation type classifications

    NASA Astrophysics Data System (ADS)

    Beck, Christoph; Philipp, Andreas; Jacobeit, Jucundus

    2015-08-01

    This contribution investigates the relationship between the large-scale atmospheric circulation and interannual variations of the standardized precipitation index (SPI) in Central Europe. To this end, circulation types (CT) have been derived from a variety of circulation type classifications (CTC) applied to daily sea level pressure (SLP) data and mean circulation indices of vorticity (V), zonality (Z) and meridionality (M) have been calculated. Occurrence frequencies of CTs and circulation indices have been utilized as predictors within multiple regression models (MRM) for the estimation of gridded 3-month SPI values over Central Europe, for the period 1950 to 2010. CTC-based MRMs used in the analyses comprise variants concerning the basic method for CT classification, the number of CTs, the size and location of the spatial domain used for CTCs and the exclusive use of CT frequencies or the combined use of CT frequencies and mean circulation indices as predictors. Adequate MRM predictor combinations have been identified by applying stepwise multiple regression analyses within a resampling framework. The performance (robustness) of the resulting MRMs has been quantified based on a leave-one-out cross-validation procedure applying several skill scores. Furthermore, the relative importance of individual predictors has been estimated for each MRM. From these analyses, it can be stated that model skill is improved by (i) the consideration of vorticity characteristics within CTCs, (ii) a relatively small size of the spatial domain to which CTCs are applied and (iii) the inclusion of mean circulation indices. However, model skill exhibits distinct variations between seasons and regions. Whereas promising skill can be stated for the western and northwestern parts of the Central European domain, only unsatisfactory skill is reached in the more continental regions and particularly during summer. Thus, it can be concluded that the presented approaches feature the potential for the downscaling of Central European drought index variations from the large-scale circulation, at least for some regions. Further improvements of CTC-based approaches may be expected from the optimization of CTCs for explaining the SPI, e.g. via the inclusion of additional variables in the classification procedure.

  10. Non-proportional odds multivariate logistic regression of ordinal family data.

    PubMed

    Zaloumis, Sophie G; Scurrah, Katrina J; Harrap, Stephen B; Ellis, Justine A; Gurrin, Lyle C

    2015-03-01

    Methods to examine whether genetic and/or environmental sources can account for the residual variation in ordinal family data usually assume proportional odds. However, standard software to fit the non-proportional odds model to ordinal family data is limited because the correlation structure of family data is more complex than for other types of clustered data. To perform these analyses we propose the non-proportional odds multivariate logistic regression model and take a simulation-based approach to model fitting using Markov chain Monte Carlo methods, such as partially collapsed Gibbs sampling and the Metropolis algorithm. We applied the proposed methodology to male pattern baldness data from the Victorian Family Heart Study. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Analysis of Low Bidding and Change Order Rates for Navy Facilities Construction Contracts.

    DTIC Science & Technology

    1984-06-01

    examine his motives and strategies prior to bidding. Several measures of "level of competitiveness" are introduced from the bidding theory literature that ... bidders of fixed-price Government construction contracts have on contract prices when the level ... conventional measures of the level of competition intensity are applied in regression and variance analyses.

  12. Template based rotation: A method for functional connectivity analysis with a priori templates

    PubMed Central

    Schultz, Aaron P.; Chhatwal, Jasmeer P.; Huijbers, Willem; Hedden, Trey; van Dijk, Koene R.A.; McLaren, Donald G.; Ward, Andrew M.; Wigman, Sarah; Sperling, Reisa A.

    2014-01-01

    Functional connectivity magnetic resonance imaging (fcMRI) is a powerful tool for understanding the network level organization of the brain in research settings and is increasingly being used to study large-scale neuronal network degeneration in clinical trial settings. Presently, a variety of techniques, including seed-based correlation analysis and group independent components analysis (with either dual regression or back projection) are commonly employed to compute functional connectivity metrics. In the present report, we introduce template based rotation, a novel analytic approach optimized for use with a priori network parcellations, which may be particularly useful in clinical trial settings. Template based rotation was designed to leverage the stable spatial patterns of intrinsic connectivity derived from out-of-sample datasets by mapping data from novel sessions onto the previously defined a priori templates. We first demonstrate the feasibility of using previously defined a priori templates in connectivity analyses, and then compare the performance of template based rotation to seed based and dual regression methods by applying these analytic approaches to an fMRI dataset of normal young and elderly subjects. We observed that template based rotation and dual regression are approximately equivalent in detecting fcMRI differences between young and old subjects, demonstrating similar effect sizes for group differences and similar reliability metrics across 12 cortical networks. Both template based rotation and dual regression demonstrated larger effect sizes and comparable reliabilities as compared to seed based correlation analysis, though all three methods yielded similar patterns of network differences. When performing inter-network and sub-network connectivity analyses, we observed that template based rotation offered greater flexibility, larger group differences, and more stable connectivity estimates as compared to dual regression and seed based analyses. This flexibility owes to the reduced spatial and temporal orthogonality constraints of template based rotation as compared to dual regression. These results suggest that template based rotation can provide a useful alternative to existing fcMRI analytic methods, particularly in clinical trial settings where predefined outcome measures and conserved network descriptions across groups are at a premium. PMID:25150630

  13. The relationship between perceived health and physical activity indoors, outdoors in built environments, and outdoors in nature.

    PubMed

    Pasanen, Tytti P; Tyrväinen, Liisa; Korpela, Kalevi M

    2014-11-01

    A body of evidence shows that both physical activity and exposure to nature are connected to improved general and mental health. Experimental studies have consistently found short term positive effects of physical activity in nature compared with built environments. This study explores whether these benefits are also evident in everyday life, perceived over repeated contact with nature. The topic is important from the perspectives of city planning, individual well-being, and public health. National survey data (n = 2,070) from Finland was analysed using structural regression analyses. Perceived general health, emotional well-being, and sleep quality were regressed on the weekly frequency of physical activity indoors, outdoors in built environments, and in nature. Socioeconomic factors and other plausible confounders were controlled for. Emotional well-being showed the most consistent positive connection to physical activity in nature, whereas general health was positively associated with physical activity in both built and natural outdoor settings. Better sleep quality was weakly connected to frequent physical activity in nature, but the connection was outweighed by other factors. The results indicate that nature provides an added value to the known benefits of physical activity. Repeated exercise in nature is, in particular, connected to better emotional well-being. © 2014 The Authors. Applied Psychology: Health and Well-Being published by John Wiley & Sons Ltd on behalf of The International Association of Applied Psychology.

  14. Linear regression metamodeling as a tool to summarize and present simulation model results.

    PubMed

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
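
    A minimal sketch of the metamodeling idea, under stated assumptions (simulated PSA draws, hypothetical parameter names, statsmodels and scikit-learn): regressing the simulated outcome on standardized inputs so that the intercept approximates the base-case outcome and the slopes rank parameter influence.

```python
# Sketch only: linear regression metamodel of probabilistic sensitivity
# analysis (PSA) output on standardized input parameters.
import numpy as np
import statsmodels.api as sm
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(10)
n_draws = 10000
params = rng.normal(size=(n_draws, 3))      # e.g. cure rate, cost, utility draws
net_benefit = (5000 * params[:, 0] - 2000 * params[:, 1]
               + 500 * params[:, 2] + rng.normal(scale=300, size=n_draws))

Z = StandardScaler().fit_transform(params)  # standardized inputs
meta = sm.OLS(net_benefit, sm.add_constant(Z)).fit()
print(meta.params)   # intercept ~ expected (base-case) outcome; slopes ~ sensitivity
```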

  15. Application of classification tree and logistic regression for the management and health intervention plans in a community-based study.

    PubMed

    Teng, Ju-Hsi; Lin, Kuan-Chia; Ho, Bin-Shenq

    2007-10-01

    A community-based aboriginal study was conducted and analysed to explore the application of classification tree and logistic regression. A total of 1066 aboriginal residents in Yilan County were screened during 2003-2004. The independent variables include demographic characteristics, physical examinations, geographic location, health behaviours, dietary habits and family hereditary diseases history. Risk factors of cardiovascular diseases were selected as the dependent variables in further analysis. The completion rate for the health interview was 88.9%. The classification tree results show that if body mass index is higher than 25.72 kg m(-2) and the age is above 51 years, the predicted probability for number of cardiovascular risk factors > or =3 is 73.6% and the population is 322. If body mass index is higher than 26.35 kg m(-2) and the geographical latitude of the village is lower than 24 degrees 22.8', the predicted probability for number of cardiovascular risk factors > or =4 is 60.8% and the population is 74. The logistic regression results indicate that body mass index, drinking habit and menopause are the top three significant independent variables. The classification tree model specifically shows the discrimination paths and interactions between the risk groups. The logistic regression model presents and analyses the statistical independent factors of cardiovascular risks. Applying both models to specific situations will provide a different angle for the design and management of future health intervention plans after a community-based study.

  16. Comparisons of Office and 24-Hour Ambulatory Blood Pressure Monitoring in Children with Obstructive Sleep Apnea.

    PubMed

    Kang, Kun-Tai; Chiu, Shuenn-Nan; Weng, Wen-Chin; Lee, Pei-Lin; Hsu, Wei-Chung

    2017-03-01

    To compare office blood pressure (BP) and 24-hour ambulatory BP (ABP) monitoring to facilitate the diagnosis and management of hypertension in children with obstructive sleep apnea (OSA). Children aged 4-16 years with OSA-related symptoms were recruited from a tertiary referral medical center. All children underwent overnight polysomnography, office BP, and 24-hour ABP studies. Multiple linear regression analyses were applied to elucidate the association between the apnea-hypopnea index and BP. Correlation and consistency between office BP and 24-hour ABP were measured by Pearson correlation, intraclass correlation, and Bland-Altman analyses. A total of 163 children were enrolled (mean age, 8.2 ± 3.3 years; 67% male). The prevalence of systolic hypertension at night was significantly higher in children with moderate-to-severe OSA than in those with primary snoring (44.9% vs 16.1%, P = .006). Pearson correlation and intraclass correlation analyses revealed associations between office BP and 24-hour BP, and Bland-Altman analysis indicated an agreement between office and 24-hour BP measurements. However, multiple linear regression analyses demonstrated that 24-hour BP (nighttime systolic BP and mean arterial pressure), unlike office BP, was independently associated with the apnea-hypopnea index, after adjustment for adiposity variables. Twenty-four-hour ABP is more strongly correlated with OSA in children, compared with office BP. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Advanced Computational Framework for Environmental Management ZEM, Version 1.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin

    2016-11-04

    Typically environmental management problems require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigree. These big data sets require on-the-fly integration into a series of models with different complexity for various types of model analyses where the data are applied as soft and hard model constraints. This is needed to provide fast iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and model are associated with uncertainties. The uncertainties are probabilistic (e.g. measurement errors) and non-probabilistic (unknowns, e.g. alternative conceptual models characterizing site conditions). To address all of these issues, we have developed an integrated framework for real-time data and model analyses for environmental decision-making called ZEM. The framework allows for seamless and on-the-fly integration of data and modeling results for robust and scientifically-defensible decision-making applying advanced decision analyses tools such as Bayesian-Information-Gap Decision Theory (BIG-DT). The framework also includes advanced methods for optimization that are capable of dealing with a large number of unknown model parameters, and surrogate (reduced order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). The ZEM framework is open-source, released under the GPL V3 license, and can be applied to any environmental management site.

  18. Spatial Autocorrelation Approaches to Testing Residuals from Least Squares Regression.

    PubMed

    Chen, Yanguang

    2016-01-01

    In geo-statistics, the Durbin-Watson test is frequently employed to detect the presence of residual serial correlation from least squares regression analyses. However, the Durbin-Watson statistic is only suitable for ordered time or spatial series. If the variables comprise cross-sectional data coming from spatial random sampling, the test will be ineffectual because the value of Durbin-Watson's statistic depends on the sequence of data points. This paper develops two new statistics for testing serial correlation of residuals from least squares regression based on spatial samples. By analogy with the new form of Moran's index, an autocorrelation coefficient is defined with a standardized residual vector and a normalized spatial weight matrix. Then by analogy with the Durbin-Watson statistic, two types of new serial correlation indices are constructed. As a case study, the two newly presented statistics are applied to a spatial sample of 29 of China's regions. These results show that the new spatial autocorrelation models can be used to test the serial correlation of residuals from regression analysis. In practice, the new statistics can make up for the deficiencies of the Durbin-Watson test.
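
    The residual autocorrelation coefficient sketched below follows the general recipe in the abstract (a standardized residual vector combined with a normalized spatial weight matrix); it is a rough numpy illustration on random data, not the exact statistics proposed in the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 29
      X = np.column_stack([np.ones(n), rng.normal(size=n)])   # design matrix
      y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)

      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      e = y - X @ beta                                         # OLS residuals
      z = (e - e.mean()) / e.std()                             # standardized residuals

      W = rng.random((n, n))                                   # toy spatial weights
      np.fill_diagonal(W, 0)
      W = W / W.sum()                                          # normalize to unit sum

      moran_like = float(z @ W @ z)                            # residual autocorrelation index
      print("residual spatial autocorrelation:", round(moran_like, 3))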

  19. Regression Models and Fuzzy Logic Prediction of TBM Penetration Rate

    NASA Astrophysics Data System (ADS)

    Minh, Vu Trieu; Katushin, Dmitri; Antonov, Maksim; Veinthal, Renno

    2017-03-01

    This paper presents statistical analyses of rock engineering properties and the measured penetration rate of tunnel boring machine (TBM) based on the data of an actual project. The aim of this study is to analyze the influence of rock engineering properties including uniaxial compressive strength (UCS), Brazilian tensile strength (BTS), rock brittleness index (BI), the distance between planes of weakness (DPW), and the alpha angle (Alpha) between the tunnel axis and the planes of weakness on the TBM rate of penetration (ROP). Four (4) statistical regression models (two linear and two nonlinear) are built to predict the ROP of TBM. Finally, a fuzzy logic model is developed as an alternative method and compared to the four statistical regression models. Results show that the fuzzy logic model provides better estimations and can be applied to predict the TBM performance. The R-squared value (R2) of the fuzzy logic model is the highest at 0.714, compared with 0.667 for the second-best model, the multiple-variable nonlinear regression.

  20. Multilevel covariance regression with correlated random effects in the mean and variance structure.

    PubMed

    Quintero, Adrian; Lesaffre, Emmanuel

    2017-09-01

    Multivariate regression methods generally assume a constant covariance matrix for the observations. When a heteroscedastic model is needed, the parametric and nonparametric covariance regression approaches available in the literature can be restrictive. We propose a multilevel regression model for the mean and covariance structure, including random intercepts in both components and allowing for correlation between them. The implied conditional covariance function can be different across clusters as a result of the random effect in the variance structure. In addition, allowing for correlation between the random intercepts in the mean and covariance makes the model convenient for skewed response distributions. Furthermore, it permits us to analyse directly the relation between the mean response level and the variability in each cluster. Parameter estimation is carried out via Gibbs sampling. We compare the performance of our model to other covariance modelling approaches in a simulation study. Finally, the proposed model is applied to the RN4CAST dataset to identify the variables that impact burnout of nurses in Belgium. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. New analysis methods to push the boundaries of diagnostic techniques in the environmental sciences

    NASA Astrophysics Data System (ADS)

    Lungaroni, M.; Murari, A.; Peluso, E.; Gelfusa, M.; Malizia, A.; Vega, J.; Talebzadeh, S.; Gaudio, P.

    2016-04-01

    In recent years, new and more sophisticated measurements have underpinned major progress in various disciplines related to the environment, such as remote sensing and thermonuclear fusion. To maximize the effectiveness of the measurements, new data analysis techniques are required. Initial data processing tasks, such as filtering and fitting, are of primary importance, since they can have a strong influence on the rest of the analysis. Although Support Vector Regression was devised and refined at the end of the 1990s, a systematic comparison with more traditional non-parametric regression methods has never been reported. In this paper, a series of systematic tests is described, which indicates that SVR is a very competitive method of non-parametric regression that can usefully complement and often outperform more consolidated approaches. The performance of Support Vector Regression as a method of filtering is investigated first, comparing it with the most popular alternative techniques. Then Support Vector Regression is applied to the problem of non-parametric regression to analyse Lidar surveys for the environmental measurement of particulate matter due to wildfires. The proposed approach has given very positive results and provides new perspectives to the interpretation of the data.
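
    As a rough illustration of SVR used for filtering/non-parametric regression of a noisy one-dimensional signal (a stand-in for a Lidar profile), the following scikit-learn sketch fits an RBF-kernel SVR; the kernel settings and the synthetic signal are assumptions, not values from the study.

      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(2)
      x = np.linspace(0, 10, 200).reshape(-1, 1)
      y = np.exp(-0.3 * x.ravel()) * np.sin(2 * x.ravel()) + rng.normal(0, 0.05, 200)

      # RBF-kernel SVR acting as a non-parametric smoother of the noisy signal.
      svr = SVR(kernel="rbf", C=10.0, epsilon=0.01, gamma=1.0)
      y_smooth = svr.fit(x, y).predict(x)

      print("residual std after filtering:", round(float(np.std(y - y_smooth)), 4))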

  2. Enhanced fertility prediction of cryopreserved boar spermatozoa using novel sperm function assessment.

    PubMed

    Daigneault, B W; McNamara, K A; Purdy, P H; Krisher, R L; Knox, R V; Rodriguez-Zas, S L; Miller, D J

    2015-05-01

    Due to reduced fertility, cryopreserved semen is seldom used for commercial porcine artificial insemination (AI). Predicting the fertility of individual frozen ejaculates for selection of higher quality semen prior to AI would increase overall success. Our objective was to test novel and traditional laboratory analyses to identify characteristics of cryopreserved spermatozoa that are related to boar fertility. Traditional post-thaw analyses of motility, viability, and acrosome integrity were performed on each ejaculate. In vitro fertilization, cleavage, and blastocyst development were also determined. Finally, spermatozoa-oviduct binding and competitive zona-binding assays were applied to assess sperm adhesion to these two matrices. Fertility of the same ejaculates subjected to laboratory assays was determined for each boar by multi-sire AI and defined as (i) the mean percentage of the litter sired and (ii) the mean number of piglets sired in each litter. Means of each laboratory evaluation were calculated for each boar and those values were applied to multiple linear regression analyses to determine which sperm traits could collectively estimate fertility in the simplest model. The regression model to predict the percent of litter sired by each boar was highly effective (p < 0.001, r(2) = 0.87) and included five traits: acrosome-compromised spermatozoa, percent live spermatozoa (0 and 60 min post-thaw), percent total motility, and the number of zona-bound spermatozoa. A second model to predict the number of piglets sired by boar was also effective (p < 0.05, r(2) = 0.57). These models indicate that the fertility of cryopreserved boar spermatozoa can be predicted effectively by including traditional and novel laboratory assays that consider functions of spermatozoa. © 2015 American Society of Andrology and European Academy of Andrology.

  3. Predictors of the number of under-five malnourished children in Bangladesh: application of the generalized poisson regression model

    PubMed Central

    2013-01-01

    Background Malnutrition is one of the principal causes of child mortality in developing countries including Bangladesh. According to our knowledge, most of the available studies that addressed the issue of malnutrition among under-five children considered categorical (dichotomous/polychotomous) outcome variables and applied logistic regression (binary/multinomial) to find their predictors. In this study the malnutrition variable (i.e. outcome) is defined as the number of under-five malnourished children in a family, which is a non-negative count variable. The purposes of the study are (i) to demonstrate the applicability of the generalized Poisson regression (GPR) model as an alternative to other statistical methods and (ii) to find some predictors of this outcome variable. Methods The data are extracted from the Bangladesh Demographic and Health Survey (BDHS) 2007. Briefly, this survey employs a nationally representative sample which is based on a two-stage stratified sample of households. A total of 4,460 under-five children are analysed using various statistical techniques, namely the Chi-square test and the GPR model. Results The GPR model (as compared to the standard Poisson regression and negative binomial regression) is found to be justified to study the above-mentioned outcome variable because of its under-dispersion (variance < mean) property. Our study also identifies several significant predictors of the outcome variable, namely mother’s education, father’s education, wealth index, sanitation status, source of drinking water, and total number of children ever born to a woman. Conclusions Consistencies of our findings in light of many other studies suggest that the GPR model is an ideal alternative to other statistical models for analysing the number of under-five malnourished children in a family. Strategies based on significant predictors may improve the nutritional status of children in Bangladesh. PMID:23297699
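
    A hedged sketch of a generalized Poisson count regression is shown below using statsmodels (the GeneralizedPoisson class available in recent releases); the predictors and counts are simulated stand-ins for the BDHS variables, so the import path and options should be checked against the installed statsmodels version.

      import numpy as np
      import statsmodels.api as sm
      from statsmodels.discrete.discrete_model import GeneralizedPoisson

      rng = np.random.default_rng(3)
      n = 1_000
      mother_edu = rng.integers(0, 4, n)            # hypothetical predictor
      wealth = rng.normal(size=n)                   # hypothetical predictor
      mu = np.exp(0.2 - 0.15 * mother_edu - 0.1 * wealth)
      y = rng.poisson(mu)                           # toy count of malnourished children

      X = sm.add_constant(np.column_stack([mother_edu, wealth]))
      gp = GeneralizedPoisson(y, X).fit(disp=False)
      print(gp.summary())                           # a negative alpha would indicate under-dispersion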

  4. Conditional Poisson models: a flexible alternative to conditional logistic case cross-over analysis.

    PubMed

    Armstrong, Ben G; Gasparrini, Antonio; Tobias, Aurelio

    2014-11-24

    The time stratified case cross-over approach is a popular alternative to conventional time series regression for analysing associations between time series of environmental exposures (air pollution, weather) and counts of health outcomes. These are almost always analyzed using conditional logistic regression on data expanded to case-control (case crossover) format, but this has some limitations. In particular, adjusting for overdispersion and auto-correlation in the counts is not possible. It has been established that a Poisson model for counts with stratum indicators gives identical estimates to those from conditional logistic regression and does not have these limitations, but it is little used, probably because of the overheads in estimating many stratum parameters. The conditional Poisson model avoids estimating stratum parameters by conditioning on the total event count in each stratum, thus simplifying the computing and increasing the number of strata for which fitting is feasible compared with the standard unconditional Poisson model. Unlike the conditional logistic model, the conditional Poisson model does not require expanding the data, and can adjust for overdispersion and auto-correlation. It is available in Stata, R, and other packages. By applying the models to real data and using simulations, we demonstrate that conditional Poisson models are simpler to code and faster to run than conditional logistic analyses and can be fitted to larger data sets than is possible with standard Poisson models. Allowing for overdispersion or autocorrelation was possible with the conditional Poisson model but, when not required, this model gave identical estimates to those from conditional logistic regression. Conditional Poisson regression models provide an alternative to case crossover analysis of stratified time series data with some advantages. The conditional Poisson model can also be used in other contexts in which primary control for confounding is by fine stratification.
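
    The stratum-indicator Poisson model that the abstract describes as equivalent to conditional logistic regression can be sketched as follows; the exposure series, strata and effect size are simulated for illustration, and a true conditional Poisson fit would recover the same exposure estimate without fitting the stratum terms.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(4)
      n_days = 365
      df = pd.DataFrame({
          "pm10": rng.gamma(4, 10, n_days),                            # toy daily exposure
          "stratum": pd.date_range("2020-01-01", periods=n_days).strftime("%Y-%m"),
      })
      df["count"] = rng.poisson(np.exp(1.0 + 0.002 * df["pm10"]))      # daily event counts

      # Unconditional Poisson regression with month strata as indicator terms.
      fit = smf.glm("count ~ pm10 + C(stratum)", data=df,
                    family=sm.families.Poisson()).fit()
      print("exposure coefficient:", fit.params["pm10"], "SE:", fit.bse["pm10"])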

  5. Application of logistic regression for landslide susceptibility zoning of Cekmece Area, Istanbul, Turkey

    NASA Astrophysics Data System (ADS)

    Duman, T. Y.; Can, T.; Gokceoglu, C.; Nefeslioglu, H. A.; Sonmez, H.

    2006-11-01

    As a result of industrialization, throughout the world, cities have been growing rapidly for the last century. One typical example of these growing cities is Istanbul, the population of which is over 10 million. Due to rapid urbanization, new areas suitable for settlement and engineering structures are necessary. The Cekmece area located west of the Istanbul metropolitan area is studied, because the landslide activity is extensive in this area. The purpose of this study is to develop a model that can be used to characterize landslide susceptibility in map form using logistic regression analysis of an extensive landslide database. A database of landslide activity was constructed using both aerial-photography and field studies. About 19.2% of the selected study area is covered by deep-seated landslides. The landslides that occur in the area are primarily located in sandstones with interbedded permeable and impermeable layers such as claystone, siltstone and mudstone. About 31.95% of the total landslide area is located in this unit. To apply logistic regression analyses, a data matrix including 37 variables was constructed. The variables used in the forward stepwise analyses are different measures of slope, aspect, elevation, stream power index (SPI), plan curvature, profile curvature, geology, geomorphology and relative permeability of lithological units. A total of 25 variables were identified as exerting strong influence on landslide occurrence and were included in the logistic regression equation. Wald statistics values indicate that lithology, SPI and slope are more important than the other parameters in the equation. Beta coefficients of the 25 variables included in the logistic regression equation provide a model for landslide susceptibility in the Cekmece area. This model is used to generate a landslide susceptibility map that correctly classified 83.8% of the landslide-prone areas.

  6. Methods for estimating confidence intervals in interrupted time series analyses of health interventions.

    PubMed

    Zhang, Fang; Wagner, Anita K; Soumerai, Stephen B; Ross-Degnan, Dennis

    2009-02-01

    Interrupted time series (ITS) is a strong quasi-experimental research design, which is increasingly applied to estimate the effects of health services and policy interventions. We describe and illustrate two methods for estimating confidence intervals (CIs) around absolute and relative changes in outcomes calculated from segmented regression parameter estimates. We used multivariate delta and bootstrapping methods (BMs) to construct CIs around relative changes in level and trend, and around absolute changes in outcome based on segmented linear regression analyses of time series data corrected for autocorrelated errors. Using previously published time series data, we estimated CIs around the effect of prescription alerts for interacting medications with warfarin on the rate of prescriptions per 10,000 warfarin users per month. Both the multivariate delta method (MDM) and the BM produced similar results. BM is preferred for calculating CIs of relative changes in outcomes of time series studies, because it does not require large sample sizes when parameter estimates are obtained correctly from the model. Caution is needed when sample size is small.
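
    A minimal sketch of a segmented (interrupted time series) regression with a bootstrap percentile confidence interval for the relative level change is given below; the series and intervention point are invented, and simple case resampling is used for brevity where a residual or block bootstrap would better respect autocorrelation.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      t = np.arange(48)                           # 48 monthly observations
      post = (t >= 24).astype(float)              # intervention at month 24
      t_post = np.where(t >= 24, t - 24, 0)
      y = 100 + 0.5 * t - 15 * post - 0.3 * t_post + rng.normal(0, 3, t.size)

      X = sm.add_constant(np.column_stack([t, post, t_post]))
      fit = sm.OLS(y, X).fit()
      counterfactual = fit.params[0] + fit.params[1] * 30          # expected level at month 30 without intervention
      rel_change = (fit.params[2] + fit.params[3] * 6) / counterfactual

      # Nonparametric bootstrap (case resampling) of the relative level change.
      boot = []
      for _ in range(1000):
          idx = rng.integers(0, t.size, t.size)
          b = sm.OLS(y[idx], X[idx]).fit().params
          boot.append((b[2] + b[3] * 6) / (b[0] + b[1] * 30))
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"relative change at month 30: {rel_change:.2%} (95% CI {lo:.2%} to {hi:.2%})")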

  7. Disentangling WTP per QALY data: different analytical approaches, different answers.

    PubMed

    Gyrd-Hansen, Dorte; Kjaer, Trine

    2012-03-01

    A large random sample of the Danish general population was asked to value health improvements by way of both the time trade-off elicitation technique and willingness-to-pay (WTP) using contingent valuation methods. The data demonstrate a high degree of heterogeneity across respondents in their relative valuations on the two scales. This has implications for data analysis. We show that the estimates of WTP per QALY are highly sensitive to the analytical strategy. For both open-ended and dichotomous choice data we demonstrate that choice of aggregated approach (ratios of means) or disaggregated approach (means of ratios) affects estimates markedly as does the interpretation of the constant term (which allows for disproportionality across the two scales) in the regression analyses. We propose that future research should focus on why some respondents are unwilling to trade on the time trade-off scale, on how to interpret the constant value in the regression analyses, and on how best to capture the heterogeneity in preference structures when applying mixed multinomial logit. Copyright © 2011 John Wiley & Sons, Ltd.

  8. Is the Health of the Nation Outcome Scales appropriate for the assessment of symptom severity in patients with substance-related disorders?

    PubMed

    Andreas, Sylke; Harries-Hedder, Karin; Schwenk, Wolfgang; Hausberg, Maria; Koch, Uwe; Schulz, Holger

    2010-07-01

    The Health of the Nation Outcome Scales (HoNOS) is an internationally established clinician-rated instrument. The aim of the study was to assess its psychometric properties in inpatients with substance-related disorders. The HoNOS was applied in a multicenter, consecutive sample of 417 inpatients. Interrater reliability coefficients, confirmatory factor analysis, and regression tree analyses were calculated to assess the reliability and validity of the HoNOS. The factor validity of the HoNOS and its total score could not be confirmed. After training, all items of the HoNOS revealed sufficient interrater reliability. As the results of the regression tree analyses showed, the single items of the HoNOS were among the most important predictors of service utilization. At present, the HoNOS can be recommended as a clinical instrument for obtaining detailed ratings of the problems of inpatients with substance-related disorders in routine mental health care. Further studies should include comparisons of the HoNOS and the Addiction Severity Index. Copyright 2010 Elsevier Inc. All rights reserved.

  9. [Turnover of Non-medical Staff in Outpatient Oncology Practices: Is Building Social Capital a Solution?].

    PubMed

    Gloede, T D; Ernstmann, N; Baumann, W; Groß, S E; Ansmann, L; Nitzsche, A; Neumann, M; Wirtz, M; Schmitz, S; Schulz-Nieswandt, F; Pfaff, H

    2015-11-01

    While a lot is known about potential and actual turnover of non-medical hospital staff, only limited data exist for the outpatient setting. In addition, little is known about actual instruments which leaders can use to influence staff turnover in physician practices. In the literature, the social capital of an organisation, which means the amount of trust, common values and reciprocal behaviour in the organisation, has been discussed as a possible field of action. In the present study, staff turnover as perceived by outpatient haematologists and oncologists is presented and analysed as to whether social capital is associated with that staff turnover. In conclusion, measures to increase the social capital of a practice are presented. The present study is based on data gathered in a questionnaire-based survey with members of the Professional Organisation of Office-Based Haematologists and Oncologists (N=551). The social capital of the practice was captured from the haematologists and oncologists using an existing and validated scale. To analyse the impact of the practice's social capital on staff turnover, as perceived by the physicians, bivariate correlations and linear regression analyses were calculated. In total, 152 haematologists and oncologists participated in the study, which represents a response rate of 28%. In the regression analyses, social capital appears as a significant and strong predictor of staff turnover (beta=-0.34; p<0.001). Building social capital within the practice may be an important contribution to reducing staff turnover, although the underlying study design does not allow for drawing causal conclusions regarding this relationship. To create social capital in their practice, outpatient physicians may apply measures that facilitate social interaction among staff, foster trust and facilitate cooperation. Such measures may already be applied when hiring and training new staff, but also continuously when leading employees and when organising work tasks, e.g., by establishing regular team meetings. © Georg Thieme Verlag KG Stuttgart · New York.

  10. Prediction of pork quality parameters by applying fractals and data mining on MRI.

    PubMed

    Caballero, Daniel; Pérez-Palacios, Trinidad; Caro, Andrés; Amigo, José Manuel; Dahl, Anders B; ErsbØll, Bjarne K; Antequera, Teresa

    2017-09-01

    This work investigates, for the first time, the use of MRI, fractal algorithms and data mining techniques to determine pork quality parameters non-destructively. The main objective was to evaluate the capability of fractal algorithms (Classical Fractal algorithm, CFA; Fractal Texture Algorithm, FTA and One Point Fractal Texture Algorithm, OPFTA) to analyse MRI in order to predict quality parameters of loin. In addition, the effect of the acquisition sequence of MRI (Gradient echo, GE; Spin echo, SE and Turbo 3D, T3D) and the predictive technique of data mining (Isotonic regression, IR and Multiple linear regression, MLR) were analysed. Both fractal algorithms FTA and OPFTA are appropriate for analysing MRI of loins. The acquisition sequence, the fractal algorithm and the data mining technique all seem to influence the prediction results. For most physico-chemical parameters, prediction equations with moderate to excellent correlation coefficients were achieved by using the following combinations of acquisition sequences of MRI, fractal algorithms and data mining techniques: SE-FTA-MLR, SE-OPFTA-IR, GE-OPFTA-MLR, SE-OPFTA-MLR, with the last one offering the best prediction results. Thus, SE-OPFTA-MLR could be proposed as an alternative technique to determine physico-chemical traits of fresh and dry-cured loins in a non-destructive way with high accuracy. Copyright © 2017. Published by Elsevier Ltd.

  11. Attitude towards the incorporation of the selective collection of biowaste in a municipal solid waste management system. A case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernad-Beltrán, D.; Simó, A.; Bovea, M.D., E-mail: bovea@uji.es

    Highlights: • Attitude towards incorporating biowaste selective collection is analysed. • Willingness to participate and to pay in biowaste selective collection is obtained. • Socioeconomic aspects affecting WtParticipate and WtPay are identified. - Abstract: European waste legislation has for years been encouraging the incorporation of selective collection systems for the biowaste fraction. European countries are therefore incorporating it into their current municipal solid waste management (MSWM) systems. However, this incorporation involves changes in the current waste management habits of households. In this paper, the attitude of the public towards the incorporation of selective collection of biowaste into an existing MSWM system in a Spanish municipality is analysed. A semi-structured telephone interview was used to obtain information regarding aspects such as: level of participation in current waste collection systems, willingness to participate in selective collection of biowaste, reasons and barriers that affect participation, willingness to pay for the incorporation of the selective collection of biowaste and the socioeconomic characteristics of citizens who are willing to participate and pay for selective collection of biowaste. The results showed that approximately 81% of the respondents were willing to participate in selective collection of biowaste. This percentage would increase to 89% if the Town Council provided specific waste bins and bags, since the main barrier to participation in the new selective collection system is the need to use specific waste bins and bags for the separation of biowaste. A logit response model was applied to estimate the average willingness to pay, obtaining an estimated mean of 7.5% on top of the current waste management annual tax. The relationship of willingness to participate and willingness to pay for the implementation of this new selective collection with the socioeconomic variables (age, gender, size of the household, work, education and income) was analysed. Chi-square independence tests and binary logistic regression were used for willingness to participate, and no significant relationship was obtained. Chi-square independence tests, ordinal logistic regression and ordinary linear regression were applied for willingness to pay, obtaining statistically significant relationships for most of the socioeconomic variables.

  12. Using Classification and Regression Trees (CART) and random forests to analyze attrition: Results from two simulations.

    PubMed

    Hayes, Timothy; Usami, Satoshi; Jacobucci, Ross; McArdle, John J

    2015-12-01

    In this article, we describe a recent development in the analysis of attrition: using classification and regression trees (CART) and random forest methods to generate inverse sampling weights. These flexible machine learning techniques have the potential to capture complex nonlinear, interactive selection models, yet to our knowledge, their performance in the missing data analysis context has never been evaluated. To assess the potential benefits of these methods, we compare their performance with commonly employed multiple imputation and complete case techniques in 2 simulations. These initial results suggest that weights computed from pruned CART analyses performed well in terms of both bias and efficiency when compared with other methods. We discuss the implications of these findings for applied researchers. (c) 2015 APA, all rights reserved.
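
    A hedged sketch of the weighting idea: fit a shallow ("pruned") classification tree to a response indicator and invert the predicted response propensities to form weights. The covariates, selection model and tree settings below are invented for illustration, not the simulation designs used in the article.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(6)
      n = 2_000
      X = rng.normal(size=(n, 3))                          # baseline covariates
      p_respond = 1 / (1 + np.exp(-(0.5 + X[:, 0] - 0.8 * X[:, 1] * X[:, 2])))
      responded = rng.random(n) < p_respond                # follow-up response indicator

      # A shallow tree approximates the nonlinear, interactive selection model.
      tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50, random_state=0)
      tree.fit(X, responded)
      p_hat = tree.predict_proba(X)[:, 1].clip(0.05, 0.95) # estimated response propensity

      weights = np.where(responded, 1.0 / p_hat, 0.0)      # inverse sampling weights for respondents
      print("mean weight among respondents:", round(weights[responded].mean(), 2))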

  13. Using Classification and Regression Trees (CART) and Random Forests to Analyze Attrition: Results From Two Simulations

    PubMed Central

    Hayes, Timothy; Usami, Satoshi; Jacobucci, Ross; McArdle, John J.

    2016-01-01

    In this article, we describe a recent development in the analysis of attrition: using classification and regression trees (CART) and random forest methods to generate inverse sampling weights. These flexible machine learning techniques have the potential to capture complex nonlinear, interactive selection models, yet to our knowledge, their performance in the missing data analysis context has never been evaluated. To assess the potential benefits of these methods, we compare their performance with commonly employed multiple imputation and complete case techniques in 2 simulations. These initial results suggest that weights computed from pruned CART analyses performed well in terms of both bias and efficiency when compared with other methods. We discuss the implications of these findings for applied researchers. PMID:26389526

  14. Evaluation of the comprehensive palatability of Japanese sake paired with dishes by multiple regression analysis based on subdomains.

    PubMed

    Nakamura, Ryo; Nakano, Kumiko; Tamura, Hiroyasu; Mizunuma, Masaki; Fushiki, Tohru; Hirata, Dai

    2017-08-01

    Many factors contribute to palatability. In order to evaluate the palatability of Japanese alcohol sake paired with certain dishes by integrating multiple factors, here we applied an evaluation method previously reported for palatability of cheese by multiple regression analysis based on 3 subdomain factors (rewarding, cultural, and informational). We asked 94 Japanese participants/subjects to evaluate the palatability of sake (1st evaluation/E1 for the first cup, 2nd/E2 and 3rd/E3 for the palatability with aftertaste/afterglow of certain dishes) and to respond to a questionnaire related to 3 subdomains. In E1, 3 factors were extracted by a factor analysis, and the subsequent multiple regression analyses indicated that the palatability of sake was interpreted by mainly the rewarding. Further, the results of attribution-dissections in E1 indicated that 2 factors (rewarding and informational) contributed to the palatability. Finally, our results indicated that the palatability of sake was influenced by the dish eaten just before drinking.

  15. Comparative study of contrast-enhanced ultrasound qualitative and quantitative analysis for identifying benign and malignant breast tumor lumps.

    PubMed

    Liu, Jian; Gao, Yun-Hua; Li, Ding-Dong; Gao, Yan-Chun; Hou, Ling-Mi; Xie, Ting

    2014-01-01

    To compare the value of contrast-enhanced ultrasound (CEUS) qualitative and quantitative analysis in the identification of breast tumor lumps. Qualitative and quantitative indicators of CEUS for 73 cases of breast tumor lumps were retrospectively analyzed by univariate and multivariate approaches. Logistic regression was applied and ROC curves were drawn for evaluation and comparison. The CEUS qualitative indicator-generated regression equation contained three indicators, namely enhanced homogeneity, diameter line expansion and peak intensity grading, which demonstrated prediction accuracy for benign and malignant breast tumor lumps of 91.8%; the quantitative indicator-generated regression equation only contained one indicator, namely the relative peak intensity, and its prediction accuracy was 61.5%. The corresponding areas under the ROC curve for qualitative and quantitative analyses were 91.3% and 75.7%, respectively, which exhibited a statistically significant difference by the Z test (P<0.05). The ability of CEUS qualitative analysis to identify breast tumor lumps is better than that of quantitative analysis.

  16. Mining hidden data to predict patient prognosis: texture feature extraction and machine learning in mammography

    NASA Astrophysics Data System (ADS)

    Leighs, J. A.; Halling-Brown, M. D.; Patel, M. N.

    2018-03-01

    The UK currently has a national breast cancer-screening program and images are routinely collected from a number of screening sites, representing a wealth of invaluable data that is currently under-used. Radiologists evaluate screening images manually and recall suspicious cases for further analysis such as biopsy. Histological testing of biopsy samples confirms the malignancy of the tumour, along with other diagnostic and prognostic characteristics such as disease grade. Machine learning is becoming increasingly popular for clinical image classification problems, as it is capable of discovering patterns in data otherwise invisible. This is particularly true when applied to medical imaging features; however clinical datasets are often relatively small. A texture feature extraction toolkit has been developed to mine a wide range of features from medical images such as mammograms. This study analysed a dataset of 1,366 radiologist-marked, biopsy-proven malignant lesions obtained from the OPTIMAM Medical Image Database (OMI-DB). Exploratory data analysis methods were employed to better understand extracted features. Machine learning techniques including Classification and Regression Trees (CART), ensemble methods (e.g. random forests), and logistic regression were applied to the data to predict the disease grade of the analysed lesions. Prediction scores of up to 83% were achieved; sensitivity and specificity of the models trained have been discussed to put the results into a clinical context. The results show promise in the ability to predict prognostic indicators from the texture features extracted and thus enable prioritisation of care for patients at greatest risk.

  17. An epidemiologic study of index and family infectious mononucleosis and adult Hodgkin's disease (HD): evidence for a specific association with EBV+ve HD in young adults.

    PubMed

    Alexander, Freda E; Lawrence, Davia J; Freeland, June; Krajewski, Andrew S; Angus, Brian; Taylor, G Malcolm; Jarrett, Ruth F

    2003-11-01

    Infectious mononucleosis (IM) is an established risk factor for Hodgkin's disease (HD). A substantial minority (33%) of cases of HD have Epstein-Barr virus (EBV) DNA within the malignant cells (are EBV+ve). It is unclear whether risk after IM applies specifically to EBV+ve HD. We report the results of a population-based case-control study of HD in adults (n = 408 cases of classical HD, 513 controls) aged 16-74 years; the case series included 113 EBV+ve and 243 EBV-ve HD. Analyses compared total HD, EBV+ve HD and EBV-ve HD with the controls and EBV+ve HD with EBV-ve HD cases using, mainly, logistic regression. Regression analyses were adjusted for gender, age-group and socioeconomic status, and were performed for the whole age range and separately for young (< 35 years) and old adults (> or = 35 years); formal tests of effect modification by age were included. For the young adults, reported IM in index or relative was strongly and significantly associated with EBV+ve HD when compared to controls (odds ratio [OR] = 2.94, 95% confidence interval [CI]: 1.08-7.98 and OR = 5.22, 95% CI: 2.15-12.68, respectively). These results may be interpreted as indications that late first exposure to EBV increases risk of HD, especially in young adults; this applies primarily to EBV+ve HD. Copyright 2003 Wiley-Liss, Inc.

  18. Women, Physical Activity, and Quality of Life: Self-concept as a Mediator.

    PubMed

    Gonzalo Silvestre, Tamara; Ubillos Landa, Silvia

    2016-02-22

    The objectives of this research are: (a) to analyze the incremental validity of physical activity's (PA) influence on perceived quality of life (PQL); (b) to determine if PA's predictive power is mediated by self-concept; and (c) to study if results vary according to a unidimensional or multidimensional approach to self-concept measurement. The sample comprised 160 women from Burgos, Spain aged 18 to 45 years old. Non-probability sampling was used. Two three-step hierarchical regression analyses were applied to forecast PQL. The hedonic quality-of-life indicators, self-concept, self-esteem, and PA were included as independent variables. The first regression analysis included global self-concept as predictor variable, while the second included its five dimensions. Two mediation analyses were conducted to see if PA's ability to predict PQL was mediated by global and physical self-concept. Results from the first regression show that self-concept, satisfaction with life, and PA were significant predictors. PA slightly but significantly increased explained variance in PQL (2.1%). In the second regression, substituting global self-concept with its five constituent factors, only the physical dimension and satisfaction with life predicted PQL, while PA ceased to be a significant predictor. Mediation analysis revealed that only physical self-concept mediates the relationship between PA and PQL (z = 1.97, p < .050), and not global self-concept. Physical self-concept was the strongest predictor, and approximately 32.45% of PA's effect on PQL was mediated by it. This study's findings support a multidimensional view of self-concept, and represent a more accurate image of the relationship between PQL, PA, and self-concept.

  19. Using Dual Regression to Investigate Network Shape and Amplitude in Functional Connectivity Analyses

    PubMed Central

    Nickerson, Lisa D.; Smith, Stephen M.; Öngür, Döst; Beckmann, Christian F.

    2017-01-01

    Independent Component Analysis (ICA) is one of the most popular techniques for the analysis of resting state FMRI data because it has several advantageous properties when compared with other techniques. Most notably, in contrast to a conventional seed-based correlation analysis, it is model-free and multivariate, thus switching the focus from evaluating the functional connectivity of single brain regions identified a priori to evaluating brain connectivity in terms of all brain resting state networks (RSNs) that simultaneously engage in oscillatory activity. Furthermore, typical seed-based analysis characterizes RSNs in terms of spatially distributed patterns of correlation (typically by means of simple Pearson's coefficients) and thereby confounds together amplitude information of oscillatory activity and noise. ICA and other regression techniques, on the other hand, retain magnitude information and therefore can be sensitive to both changes in the spatially distributed nature of correlations (differences in the spatial pattern or “shape”) as well as the amplitude of the network activity. Furthermore, motion can mimic amplitude effects so it is crucial to use a technique that retains such information to ensure that connectivity differences are accurately localized. In this work, we investigate the dual regression approach that is frequently applied with group ICA to assess group differences in resting state functional connectivity of brain networks. We show how ignoring amplitude effects and how excessive motion corrupts connectivity maps and results in spurious connectivity differences. We also show how to implement the dual regression to retain amplitude information and how to use dual regression outputs to identify potential motion effects. Two key findings are that using a technique that retains magnitude information, e.g., dual regression, and using strict motion criteria are crucial for controlling both network amplitude and motion-related amplitude effects, respectively, in resting state connectivity analyses. We illustrate these concepts using realistic simulated resting state FMRI data and in vivo data acquired in healthy subjects and patients with bipolar disorder and schizophrenia. PMID:28348512
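
    The two dual-regression stages can be written compactly with ordinary least squares; the numpy sketch below uses random matrices in place of group ICA maps and subject data, and is only meant to show the flow from group maps to subject time courses to subject-specific spatial maps.

      import numpy as np

      rng = np.random.default_rng(7)
      n_time, n_vox, n_comp = 200, 5_000, 10
      group_maps = rng.normal(size=(n_comp, n_vox))            # group ICA maps (components x voxels)
      subject_data = rng.normal(size=(n_time, n_vox))          # one subject's time x voxels matrix

      # Stage 1: spatial regression of the group maps -> subject time courses (time x components).
      timecourses, *_ = np.linalg.lstsq(group_maps.T, subject_data.T, rcond=None)
      timecourses = timecourses.T

      # Stage 2: temporal regression of the time courses -> subject spatial maps (components x voxels).
      subject_maps, *_ = np.linalg.lstsq(timecourses, subject_data, rcond=None)

      # Variance-normalizing the stage-1 time courses would isolate "shape" differences;
      # leaving them unnormalized keeps amplitude information in the subject maps.
      print(timecourses.shape, subject_maps.shape)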

  20. Robustness of meta-analyses in finding gene × environment interactions

    PubMed Central

    Shi, Gang; Nehorai, Arye

    2017-01-01

    Meta-analyses that synthesize statistical evidence across studies have become important analytical tools for genetic studies. Inspired by the success of genome-wide association studies of the genetic main effect, researchers are searching for gene × environment interactions. Confounders are routinely included in the genome-wide gene × environment interaction analysis as covariates; however, this does not control for any confounding effects on the results if covariate × environment interactions are present. We carried out simulation studies to evaluate the robustness to the covariate × environment confounder for meta-regression and joint meta-analysis, which are two commonly used meta-analysis methods for testing the gene × environment interaction or the genetic main effect and interaction jointly. Here we show that meta-regression is robust to the covariate × environment confounder while joint meta-analysis is subject to the confounding effect with inflated type I error rates. Given vast sample sizes employed in genome-wide gene × environment interaction studies, non-significant covariate × environment interactions at the study level could substantially elevate the type I error rate at the consortium level. When covariate × environment confounders are present, type I errors can be controlled in joint meta-analysis by including the covariate × environment terms in the analysis at the study level. Alternatively, meta-regression can be applied, which is robust to potential covariate × environment confounders. PMID:28362796

  1. Regression Trees Identify Relevant Interactions: Can This Improve the Predictive Performance of Risk Adjustment?

    PubMed

    Buchner, Florian; Wasem, Jürgen; Schillo, Sonja

    2017-01-01

    Risk equalization formulas have been refined since their introduction about two decades ago. Because of the complexity and the abundance of possible interactions between the variables used, hardly any interactions are considered. A regression tree is used to systematically search for interactions, a methodologically new approach in risk equalization. Analyses are based on a data set of nearly 2.9 million individuals from a major German social health insurer. A two-step approach is applied: In the first step a regression tree is built on the basis of the learning data set. Terminal nodes characterized by more than one morbidity-group split represent interaction effects of different morbidity groups. In the second step the 'traditional' weighted least squares regression equation is expanded by adding interaction terms for all interactions detected by the tree, and regression coefficients are recalculated. The resulting risk adjustment formula shows an improvement in the adjusted R2 from 25.43% to 25.81% on the evaluation data set. Predictive ratios are calculated for subgroups affected by the interactions. The R2 improvement detected is only marginal. According to the sample-level performance measures used, omitting a considerable number of morbidity interactions causes no relevant loss in accuracy. Copyright © 2015 John Wiley & Sons, Ltd.
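
    A hedged two-step sketch of this approach: grow a shallow regression tree to surface a candidate morbidity-group interaction, then refit the linear risk-adjustment formula with that interaction term added. The variables, costs and tree depth are synthetic assumptions, not the insurer data.

      import numpy as np
      import statsmodels.api as sm
      from sklearn.tree import DecisionTreeRegressor, export_text

      rng = np.random.default_rng(8)
      n = 5_000
      morb_a = rng.integers(0, 2, n)                      # morbidity group A indicator
      morb_b = rng.integers(0, 2, n)                      # morbidity group B indicator
      cost = (1_000 + 2_000 * morb_a + 1_500 * morb_b
              + 4_000 * morb_a * morb_b + rng.normal(0, 500, n))

      # Step 1: nested splits on both groups in the tree flag an A x B interaction.
      tree = DecisionTreeRegressor(max_depth=2).fit(np.column_stack([morb_a, morb_b]), cost)
      print(export_text(tree, feature_names=["morb_a", "morb_b"]))

      # Step 2: expand the linear risk-adjustment formula with the detected interaction term.
      X = sm.add_constant(np.column_stack([morb_a, morb_b, morb_a * morb_b]))
      print(sm.OLS(cost, X).fit().params.round(0))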

  2. Patterns of medicinal plant use: an examination of the Ecuadorian Shuar medicinal flora using contingency table and binomial analyses.

    PubMed

    Bennett, Bradley C; Husby, Chad E

    2008-03-28

    Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany, Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly-employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. Modified regression models (which accommodated assumptions of linear regression) reduced R(2) from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
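
    The per-family binomial test can be reproduced in a couple of lines; the counts below are invented for illustration, and scipy.stats.binomtest requires a reasonably recent SciPy (older versions expose binom_test instead).

      from scipy.stats import binomtest

      total_species, total_medicinal = 3000, 600        # hypothetical flora totals
      family_species, family_medicinal = 80, 30         # hypothetical single family

      p_flora = total_medicinal / total_species         # null proportion of medicinal plants (0.20)
      result = binomtest(family_medicinal, n=family_species, p=p_flora)
      print(f"observed {family_medicinal}/{family_species}, p-value = {result.pvalue:.4f}")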

  3. Relationship between body composition and postural control in prepubertal overweight/obese children: A cross-sectional study.

    PubMed

    Villarrasa-Sapiña, Israel; Álvarez-Pitti, Julio; Cabeza-Ruiz, Ruth; Redón, Pau; Lurbe, Empar; García-Massó, Xavier

    2018-02-01

    Excess body weight during childhood causes reduced motor functionality and problems in postural control, a negative influence which has been reported in the literature. Nevertheless, no information regarding the effect of body composition on the postural control of overweight and obese children is available. The objective of this study was therefore to establish these relationships. A cross-sectional design was used to establish relationships between body composition and postural control variables obtained in bipedal eyes-open and eyes-closed conditions in twenty-two children. Centre of pressure signals were analysed in the temporal and frequency domains. Pearson correlations were applied to establish relationships between variables. Principal component analysis was applied to the body composition variables to avoid potential multicollinearity in the regression models. These principal components were used to perform a multiple linear regression analysis, from which regression models were obtained to predict postural control. Height and leg mass were the body composition variables that showed the highest correlation with postural control. Multiple regression models were also obtained and several of these models showed a higher correlation coefficient in predicting postural control than simple correlations. These models revealed that leg and trunk mass were good predictors of postural control. More equations were found in the eyes-open than eyes-closed condition. Body weight and height are negatively correlated with postural control. However, leg and trunk mass are better postural control predictors than arm or body mass. Finally, body composition variables are more useful in predicting postural control when the eyes are open. Copyright © 2017 Elsevier Ltd. All rights reserved.
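
    The principal-component-then-regression workflow used above to sidestep multicollinearity can be sketched as follows; the body-composition matrix and the centre-of-pressure outcome are simulated placeholders, not the study data.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(9)
      n = 22
      body_comp = rng.normal(size=(n, 5))               # e.g. height, leg/trunk/arm mass, weight
      postural = body_comp[:, 1] * 0.8 + rng.normal(0, 0.5, n)   # toy centre-of-pressure outcome

      Z = StandardScaler().fit_transform(body_comp)
      pcs = PCA(n_components=2).fit_transform(Z)        # correlated predictors -> orthogonal components

      model = LinearRegression().fit(pcs, postural)
      print("R^2 on principal components:", round(model.score(pcs, postural), 2))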

  4. [PROGNOSTIC MODELS IN MODERN MANAGEMENT OF VULVAR CANCER].

    PubMed

    Tsvetkov, Ch; Gorchev, G; Tomov, S; Nikolova, M; Genchev, G

    2016-01-01

    The aim of the research was to evaluate and analyse prognosis and prognostic factors in patients with squamous cell vulvar carcinoma after primary surgery with an individual approach applied during the course of treatment. In the period between January 2000 and July 2010, 113 patients with squamous cell carcinoma of the vulva were diagnosed and operated on at the Gynecologic Oncology Clinic of Medical University, Pleven. All the patients were monitored at the same clinic. An individual approach was applied to each patient and, whenever it was possible, more conservative operative techniques were applied. The probable clinicopathological characteristics influencing the overall survival and recurrence-free survival were analyzed. Univariate statistical analysis and Cox regression analysis were made in order to evaluate the characteristics which were statistically significant for overall survival and survival without recurrence. A multivariate logistic regression analysis (Forward Wald procedure) was applied to evaluate the combined influence of the significant factors. While performing the multivariate analysis, the synergic effect of the independent prognostic factors of both kinds of survival was also evaluated. Approaching each patient individually, we applied the following operative techniques: 1. Deep total radical vulvectomy with separate incisions for lymph dissection (LD) or without dissection--68 (60.18%) patients. 2. En-bloc vulvectomy with bilateral LD without vulva reconstruction--10 (8.85%). 3. Modified radical vulvectomy (hemivulvectomy, partial vulvectomy)--25 (22.02%). 4. Wide local excision--3 (2.65%). 5. Simple (total/partial) vulvectomy--5 (4.43%) patients. 6. En-bloc resection with reconstruction--2 (1.77%). After a thorough analysis of the overall survival and recurrence-free survival, we concluded that relapse occurrence and FIGO clinical stage were independent prognostic factors for overall survival, and the independent prognostic factors for recurrence-free survival were: metastatic inguinal nodes (unilateral or bilateral), tumor size (above or below 3 cm) and lymphovascular space invasion. On the basis of these results we created two prognostic models: 1. A prognostic model for overall survival. 2. A prognostic model for survival without recurrence. Following the surgical staging of the disease, we were able to gather and analyse important clinicopathological indexes, which gave us the opportunity to form prognostic groups for overall survival and recurrence-free survival.

  5. The effect of machine learning regression algorithms and sample size on individualized behavioral prediction with functional connectivity features.

    PubMed

    Cui, Zaixu; Gong, Gaolang

    2018-06-02

    Individualized behavioral/cognitive prediction using machine learning (ML) regression approaches is becoming increasingly applied. The specific ML regression algorithm and sample size are two key factors that non-trivially influence prediction accuracies. However, the effects of the ML regression algorithm and sample size on individualized behavioral/cognitive prediction performance have not been comprehensively assessed. To address this issue, the present study included six commonly used ML regression algorithms: ordinary least squares (OLS) regression, least absolute shrinkage and selection operator (LASSO) regression, ridge regression, elastic-net regression, linear support vector regression (LSVR), and relevance vector regression (RVR), to perform specific behavioral/cognitive predictions based on different sample sizes. Specifically, the publicly available resting-state functional MRI (rs-fMRI) dataset from the Human Connectome Project (HCP) was used, and whole-brain resting-state functional connectivity (rsFC) or rsFC strength (rsFCS) were extracted as prediction features. Twenty-five sample sizes (ranged from 20 to 700) were studied by sub-sampling from the entire HCP cohort. The analyses showed that rsFC-based LASSO regression performed remarkably worse than the other algorithms, and rsFCS-based OLS regression performed markedly worse than the other algorithms. Regardless of the algorithm and feature type, both the prediction accuracy and its stability exponentially increased with increasing sample size. The specific patterns of the observed algorithm and sample size effects were well replicated in the prediction using re-testing fMRI data, data processed by different imaging preprocessing schemes, and different behavioral/cognitive scores, thus indicating excellent robustness/generalization of the effects. The current findings provide critical insight into how the selected ML regression algorithm and sample size influence individualized predictions of behavior/cognition and offer important guidance for choosing the ML regression algorithm or sample size in relevant investigations. Copyright © 2018 Elsevier Inc. All rights reserved.
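
    An illustrative sketch of the algorithm-by-sample-size comparison is given below with scikit-learn; synthetic features stand in for the resting-state connectivity matrices, relevance vector regression is omitted because it is not part of scikit-learn, and the correlation between cross-validated predictions and observed scores is used as the accuracy metric.

      import numpy as np
      from sklearn.linear_model import Ridge, Lasso, ElasticNet, LinearRegression
      from sklearn.svm import LinearSVR
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(10)
      n_total, n_feat = 700, 300
      X_all = rng.normal(size=(n_total, n_feat))
      beta = np.zeros(n_feat)
      beta[:20] = rng.normal(size=20)
      y_all = X_all @ beta + rng.normal(0, 3, n_total)           # toy cognitive score

      models = {"OLS": LinearRegression(), "ridge": Ridge(alpha=1.0),
                "LASSO": Lasso(alpha=0.1), "elastic-net": ElasticNet(alpha=0.1),
                "LSVR": LinearSVR(C=1.0, max_iter=10_000)}

      for n in (50, 200, 700):                                    # sub-sampled "sample sizes"
          idx = rng.choice(n_total, n, replace=False)
          X, y = X_all[idx], y_all[idx]
          accs = {name: np.corrcoef(cross_val_predict(m, X, y, cv=5), y)[0, 1]
                  for name, m in models.items()}
          print(n, {k: round(v, 2) for k, v in accs.items()})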

  6. Spatial Autocorrelation Approaches to Testing Residuals from Least Squares Regression

    PubMed Central

    Chen, Yanguang

    2016-01-01

    In geo-statistics, the Durbin-Watson test is frequently employed to detect the presence of residual serial correlation from least squares regression analyses. However, the Durbin-Watson statistic is only suitable for ordered time or spatial series. If the variables comprise cross-sectional data coming from spatial random sampling, the test will be ineffectual because the value of Durbin-Watson’s statistic depends on the sequence of data points. This paper develops two new statistics for testing serial correlation of residuals from least squares regression based on spatial samples. By analogy with the new form of Moran’s index, an autocorrelation coefficient is defined with a standardized residual vector and a normalized spatial weight matrix. Then by analogy with the Durbin-Watson statistic, two types of new serial correlation indices are constructed. As a case study, the two newly presented statistics are applied to a spatial sample of 29 of China’s regions. These results show that the new spatial autocorrelation models can be used to test the serial correlation of residuals from regression analysis. In practice, the new statistics can make up for the deficiencies of the Durbin-Watson test. PMID:26800271

  7. Membrane Introduction Mass Spectrometry Combined with an Orthogonal Partial-Least Squares Calibration Model for Mixture Analysis.

    PubMed

    Li, Min; Zhang, Lu; Yao, Xiaolong; Jiang, Xingyu

    2017-01-01

    The emerging membrane introduction mass spectrometry technique has been successfully used to detect benzene, toluene, ethyl benzene and xylene (BTEX), while overlapped spectra have unfortunately hindered its further application to the analysis of mixtures. Multivariate calibration, an efficient method to analyze mixtures, has been widely applied. In this paper, we compared univariate and multivariate analyses for quantification of the individual components of mixture samples. The results showed that the univariate analysis creates poor models with regression coefficients of 0.912, 0.867, 0.440 and 0.351 for BTEX, respectively. For multivariate analysis, a comparison to the partial-least squares (PLS) model shows that the orthogonal partial-least squares (OPLS) regression exhibits an optimal performance with regression coefficients of 0.995, 0.999, 0.980 and 0.976, favorable calibration parameters (RMSEC and RMSECV) and a favorable validation parameter (RMSEP). Furthermore, the OPLS exhibits a good recovery of 73.86 - 122.20% and relative standard deviation (RSD) of the repeatability of 1.14 - 4.87%. Thus, MIMS coupled with the OPLS regression provides an optimal approach for a quantitative BTEX mixture analysis in monitoring and predicting water pollution.
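
    A minimal multivariate-calibration sketch with PLS regression on simulated overlapped spectra of a four-component (BTEX-like) mixture is shown below; note that OPLS itself is not available in scikit-learn, so plain PLS is used as a stand-in, and all spectra and concentrations are invented.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(11)
      n_samples, n_channels, n_analytes = 60, 120, 4
      pure_spectra = np.abs(rng.normal(size=(n_analytes, n_channels)))   # toy pure-component spectra
      C = rng.uniform(0, 1, size=(n_samples, n_analytes))                # true concentrations
      spectra = C @ pure_spectra + rng.normal(0, 0.05, (n_samples, n_channels))  # overlapped mixture spectra

      pls = PLSRegression(n_components=6).fit(spectra, C)
      C_hat = pls.predict(spectra)
      for i in range(n_analytes):
          r = np.corrcoef(C[:, i], C_hat[:, i])[0, 1]
          print(f"analyte {i}: calibration correlation r = {r:.3f}")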

  8. Key factors contributing to accident severity rate in construction industry in Iran: a regression modelling approach.

    PubMed

    Soltanzadeh, Ahmad; Mohammadfam, Iraj; Moghimbeigi, Abbas; Ghiasvand, Reza

    2016-03-01

    Construction industry involves the highest risk of occupational accidents and bodily injuries, which range from mild to very severe. The aim of this cross-sectional study was to identify the factors associated with accident severity rate (ASR) in the largest Iranian construction companies based on data about 500 occupational accidents recorded from 2009 to 2013. We also gathered data on safety and health risk management and training systems. Data were analysed using Pearson's chi-squared coefficient and multiple regression analysis. Median ASR (and the interquartile range) was 107.50 (57.24- 381.25). Fourteen of the 24 studied factors stood out as most affecting construction accident severity (p<0.05). These findings can be applied in the design and implementation of a comprehensive safety and health risk management system to reduce ASR.

  9. Applying additive modeling and gradient boosting to assess the effects of watershed and reach characteristics on riverine assemblages

    USGS Publications Warehouse

    Maloney, Kelly O.; Schmid, Matthias; Weller, Donald E.

    2012-01-01

    Issues with ecological data (e.g. non-normality of errors, nonlinear relationships and autocorrelation of variables) and modelling (e.g. overfitting, variable selection and prediction) complicate regression analyses in ecology. Flexible models, such as generalized additive models (GAMs), can address data issues, and machine learning techniques (e.g. gradient boosting) can help resolve modelling issues. Gradient boosted GAMs do both. Here, we illustrate the advantages of this technique using data on benthic macroinvertebrates and fish from 1573 small streams in Maryland, USA.

  10. The progressivity of health-care financing in Kenya.

    PubMed

    Munge, Kenneth; Briggs, Andrew Harvey

    2014-10-01

    Health-care financing should be equitable. In many developing countries such as Kenya, changes to health-care financing systems are being implemented as a means of providing equitable access to health care with the aim of attaining universal coverage. Vertical equity means that people of dissimilar ability to pay make dissimilar levels of contribution to the health-care financing system. Vertical equity can be analysed by measuring progressivity. The aim of this study was to analyse progressivity by measuring deviations from proportionality in the relationship between sources of health-care financing and ability to pay using Kakwani indices applied to data from the Kenya Household Health Utilisation and Expenditure Survey 2007. Concentration indices and Kakwani indices were obtained for the sources of health-care financing: direct and indirect taxes, out of pocket (OOP) payments, private insurance contributions and contributions to the National Hospital Insurance Fund. The bootstrap method was used to analyse the sensitivity of the Kakwani index to changes in the equivalence scale or the use of an alternative measure of ability to pay. The overall health-care financing system was regressive. Out of pocket payments were regressive with all other payments being proportional. Direct taxes, indirect taxes and private insurance premiums were sensitive to the use of income as an alternative measure of ability to pay. However, the overall finding of a regressive health-care system remained. Reforms to the Kenyan health-care financing system are required to reduce dependence on out of pocket payments. The bootstrap method can be used in determining the sensitivity of the Kakwani index to various assumptions made in the analysis. Further analyses are required to determine the equity of health-care utilization and the effect of proposed reforms on overall equity of the Kenyan health-care system. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2013; all rights reserved.

  11. The effect of marital status on stage and survival of prostate cancer patients treated with radical prostatectomy: a population-based study.

    PubMed

    Abdollah, Firas; Sun, Maxine; Thuret, Rodolphe; Abdo, Al'a; Morgan, Monica; Jeldres, Claudio; Shariat, Shahrokh F; Perrotte, Paul; Montorsi, Francesco; Karakiewicz, Pierre I

    2011-08-01

    The detrimental effect of unmarried marital status on stage and survival has been confirmed in several malignancies. We set out to test whether this applies to patients diagnosed with prostate cancer (PCa) treated with radical prostatectomy (RP). We identified 163,697 non-metastatic PCa patients treated with RP, within 17 Surveillance, Epidemiology, and End Results registries. Logistic regression analyses focused on the rate of locally advanced stage (pT3-4/pN1) at RP. Cox regression analyses tested the relationship between marital status and cancer-specific mortality (CSM), as well as all-cause mortality (ACM). Respectively, 9.1 and 7.8% of individuals were separated/divorced/widowed (SDW) and never married. SDW men had more advanced stage at surgery (odds ratio: 1.1; p < 0.001), higher CSM and ACM (both hazard ratio [HR]: 1.3; p < 0.001) than married men. Similarly, never-married status portended a higher ACM rate (HR: 1.2, p = 0.001). These findings were consistent when analyses were stratified according to organ confined vs. locally advanced stages. Being SDW significantly increased the risk of more advanced stage at RP. Following surgery, SDW men had higher CSM and ACM rates than married men. Consequently, these individuals may benefit from a more focused health care throughout the natural history of their disease.

  12. Space-Time Point Pattern Analysis of Flavescence Dorée Epidemic in a Grapevine Field: Disease Progression and Recovery

    PubMed Central

    Maggi, Federico; Bosco, Domenico; Galetto, Luciana; Palmano, Sabrina; Marzachì, Cristina

    2017-01-01

    Analyses of space-time statistical features of a flavescence dorée (FD) epidemic in Vitis vinifera plants are presented. FD spread was surveyed from 2011 to 2015 in a vineyard of 17,500 m2 surface area in the Piemonte region, Italy; count and position of symptomatic plants were used to test the hypothesis of epidemic Complete Spatial Randomness and isotropicity in the space-time static (year-by-year) point pattern measure. Space-time dynamic (year-to-year) point pattern analyses were applied to newly infected and recovered plants to highlight statistics of FD progression and regression over time. Results highlighted point patterns ranging from dispersed (at small scales) to aggregated (at large scales) over the years, suggesting that the FD epidemic is characterized by multiscale properties that may depend on infection incidence, vector population, and flight behavior. Dynamic analyses showed moderate preferential progression and regression along rows. Nearly uniform distributions of direction and negative exponential distributions of distance of newly symptomatic and recovered plants relative to existing symptomatic plants highlighted features of vector mobility similar to Brownian motion. This evidence indicates that space-time epidemic modeling should include environmental setting (e.g., vineyard geometry and topography) to capture anisotropicity as well as statistical features of vector flight behavior, plant recovery and susceptibility, and plant mortality. PMID:28111581

  13. Use of Multiple Regression and Use-Availability Analyses in Determining Habitat Selection by Gray Squirrels (Sciurus Carolinensis)

    Treesearch

    John W. Edwards; Susan C. Loeb; David C. Guynn

    1994-01-01

    Multiple regression and use-availability analyses are two methods for examining habitat selection. Use-availability analysis is commonly used to evaluate macrohabitat selection whereas multiple regression analysis can be used to determine microhabitat selection. We compared these techniques using behavioral observations (n = 5534) and telemetry locations (n = 2089) of...

  14. Application of conditional moment tests to model checking for generalized linear models.

    PubMed

    Pan, Wei

    2002-06-01

    Generalized linear models (GLMs) are increasingly being used in daily data analysis. However, model checking for GLMs with correlated discrete response data remains difficult. In this paper, through a case study on marginal logistic regression using a real data set, we illustrate the flexibility and effectiveness of using conditional moment tests (CMTs), along with other graphical methods, to do model checking for generalized estimating equation (GEE) analyses. Although CMTs provide an array of powerful diagnostic tests for model checking, they were originally proposed in the econometrics literature and, to our knowledge, have never been applied to GEE analyses. CMTs cover many existing tests, including the (generalized) score test for an omitted covariate, as special cases. In summary, we believe that CMTs provide a class of useful model checking tools.

  15. Quantifying and analysing food waste generated by Indonesian undergraduate students

    NASA Astrophysics Data System (ADS)

    Mandasari, P.

    2018-03-01

    Although the environmental consequences of food waste are widely known, studies on the amount of food waste and its influencing factors have received relatively little attention. Addressing this gap, this paper aimed to quantify monthly avoidable food waste generated by Indonesian undergraduate students and analyse factors influencing the occurrence of avoidable food waste. Based on data from 106 undergraduate students, descriptive statistics and logistic regression were applied in this study. The results indicated that 4,987.5 g of food waste was generated in a month (equal to 59,850 g yearly); or 47.05 g per person monthly (equal to 564.62 g per person per year). Meanwhile, eating out frequency and gender were found to be significant predictors of food waste occurrence.

  16. Transformation (normalization) of slope gradient and surface curvatures, automated for statistical analyses from DEMs

    NASA Astrophysics Data System (ADS)

    Csillik, O.; Evans, I. S.; Drăguţ, L.

    2015-03-01

    Automated procedures are developed to alleviate long tails in frequency distributions of morphometric variables. They minimize the skewness of slope gradient frequency distributions, and modify the kurtosis of profile and plan curvature distributions toward that of the Gaussian (normal) model. Box-Cox (for slope) and arctangent (for curvature) transformations are tested on nine digital elevation models (DEMs) of varying origin and resolution, and different landscapes, and shown to be effective. Resulting histograms are illustrated and show considerable improvements over those for previously recommended slope transformations (sine, square root of sine, and logarithm of tangent). Unlike previous approaches, the proposed method evaluates the frequency distribution of slope gradient values in a given area and applies the most appropriate transform if required. Sensitivity of the arctangent transformation is tested, showing that Gaussian-kurtosis transformations are acceptable also in terms of histogram shape. Cube root transformations of curvatures produced bimodal histograms. The transforms are applicable to morphometric variables and many others with skewed or long-tailed distributions. By avoiding long tails and outliers, they permit parametric statistics such as correlation, regression and principal component analyses to be applied, with greater confidence that requirements for linearity, additivity and even scatter of residuals (constancy of error variance) are likely to be met. It is suggested that such transformations should be routinely applied in all parametric analyses of long-tailed variables. Our Box-Cox and curvature automated transformations are based on a Python script, implemented as an easy-to-use script tool in ArcGIS.
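
    The two transformations named above can be sketched as follows; the simulated slope and curvature samples, and the grid search used to tune the arctangent scale factor, are illustrative assumptions rather than the authors' ArcGIS/Python implementation.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    slope_deg = rng.gamma(shape=2.0, scale=5.0, size=10_000)      # skewed slope gradients
    curvature = rng.standard_t(df=3, size=10_000)                 # long-tailed curvatures

    # Box-Cox: scipy estimates the lambda that maximizes the log-likelihood,
    # which also tends to minimize the skewness of the transformed values.
    slope_bc, lam = stats.boxcox(slope_deg + 1e-6)                # requires positive data
    print(f"lambda={lam:.3f}  skew before={stats.skew(slope_deg):.2f} "
          f"after={stats.skew(slope_bc):.2f}")

    # Arctangent transform for curvature: tune the scale factor a so that the
    # kurtosis of arctan(a * curvature) approaches the Gaussian value of 3.
    def kurtosis_gap(a, x):
        return abs(stats.kurtosis(np.arctan(a * x), fisher=False) - 3.0)

    scales = np.logspace(-2, 2, 200)
    best_a = scales[np.argmin([kurtosis_gap(a, curvature) for a in scales])]
    curv_t = np.arctan(best_a * curvature)
    print(f"best a={best_a:.3f}  kurtosis={stats.kurtosis(curv_t, fisher=False):.2f}")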

  17. An Expert System for the Evaluation of Cost Models

    DTIC Science & Technology

    1990-09-01

    contrast to the condition of equal error variance, called homoscedasticity. (Reference: Applied Linear Regression Models by John Neter, page 423) ... normal. (Reference: Applied Linear Regression Models by John Neter, page 125) ... over time. Error terms correlated over time are said to be autocorrelated or serially correlated. (Reference: Applied Linear Regression Models by John Neter) ...

  18. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    PubMed

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models.

  19. Meta-epidemiologic study showed frequent time trends in summary estimates from meta-analyses of diagnostic accuracy studies.

    PubMed

    Cohen, Jérémie F; Korevaar, Daniël A; Wang, Junfeng; Leeflang, Mariska M; Bossuyt, Patrick M

    2016-09-01

    To evaluate changes over time in summary estimates from meta-analyses of diagnostic accuracy studies. We included 48 meta-analyses from 35 MEDLINE-indexed systematic reviews published between September 2011 and January 2012 (743 diagnostic accuracy studies; 344,015 participants). Within each meta-analysis, we ranked studies by publication date. We applied random-effects cumulative meta-analysis to follow how summary estimates of sensitivity and specificity evolved over time. Time trends were assessed by fitting a weighted linear regression model of the summary accuracy estimate against rank of publication. The median of the 48 slopes was -0.02 (-0.08 to 0.03) for sensitivity and -0.01 (-0.03 to 0.03) for specificity. Twelve of 96 (12.5%) time trends in sensitivity or specificity were statistically significant. We found a significant time trend in at least one accuracy measure for 11 of the 48 (23%) meta-analyses. Time trends in summary estimates are relatively frequent in meta-analyses of diagnostic accuracy studies. Results from early meta-analyses of diagnostic accuracy studies should be considered with caution. Copyright © 2016 Elsevier Inc. All rights reserved.
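
    A minimal sketch of the trend test described above, using fixed-effect inverse-variance pooling as a simple stand-in for the random-effects cumulative meta-analysis in the paper; the sensitivity estimates, variances and regression weights are invented for illustration.

    import numpy as np
    import statsmodels.api as sm

    # Toy cumulative meta-analysis: sensitivity estimates ordered by publication date
    sens = np.array([0.92, 0.88, 0.90, 0.85, 0.83, 0.84, 0.80, 0.81])
    var = np.array([0.004, 0.003, 0.005, 0.002, 0.004, 0.003, 0.002, 0.003])
    w = 1.0 / var

    # Running weighted-average summary estimate after each additional study
    cum_est = np.cumsum(w * sens) / np.cumsum(w)

    # Weighted linear regression of the summary estimate against publication rank;
    # a negative slope indicates summary estimates drifting downward over time.
    rank = np.arange(1, len(sens) + 1)
    model = sm.WLS(cum_est, sm.add_constant(rank), weights=np.cumsum(w)).fit()
    print(model.params)       # [intercept, slope]
    print(model.pvalues[1])   # p-value for the time trend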

  20. Glomerular structural-functional relationship models of diabetic nephropathy are robust in type 1 diabetic patients.

    PubMed

    Mauer, Michael; Caramori, Maria Luiza; Fioretto, Paola; Najafian, Behzad

    2015-06-01

    Studies of structural-functional relationships have improved understanding of the natural history of diabetic nephropathy (DN). However, in order to consider structural end points for clinical trials, the robustness of the resultant models needs to be verified. This study examined whether structural-functional relationship models derived from a large cohort of type 1 diabetic (T1D) patients with a wide range of renal function are robust. The predictability of models derived from multiple regression analysis and piecewise linear regression analysis was also compared. T1D patients (n = 161) with research renal biopsies were divided into two equal groups matched for albumin excretion rate (AER). Models to explain AER and glomerular filtration rate (GFR) by classical DN lesions in one group (T1D-model, or T1D-M) were applied to the other group (T1D-test, or T1D-T) and regression analyses were performed. T1D-M-derived models explained 70 and 63% of AER variance and 32 and 21% of GFR variance in T1D-M and T1D-T, respectively, supporting the substantial robustness of the models. Piecewise linear regression analyses substantially improved predictability of the models with 83% of AER variance and 66% of GFR variance explained by classical DN glomerular lesions alone. These studies demonstrate that DN structural-functional relationship models are robust, and if appropriate models are used, glomerular lesions alone explain a major proportion of AER and GFR variance in T1D patients. © The Author 2014. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  1. Demographic responses of Pinguicula ionantha to prescribed fire: a regression-design LTRE approach.

    PubMed

    Kesler, Herbert C; Trusty, Jennifer L; Hermann, Sharon M; Guyer, Craig

    2008-06-01

    This study describes the use of periodic matrix analysis and regression-design life table response experiments (LTRE) to investigate the effects of prescribed fire on demographic responses of Pinguicula ionantha, a federally listed plant endemic to the herb bog/savanna community in north Florida. Multi-state mark-recapture models with dead recoveries were used to estimate survival and transition probabilities for over 2,300 individuals in 12 populations of P. ionantha. These estimates were applied to parameterize matrix models used in further analyses. P. ionantha demographics were found to be strongly dependent on prescribed fire events. Periodic matrix models were used to evaluate season of burn (either growing or dormant season) for fire return intervals ranging from 1 to 20 years. Annual growing and biannual dormant season fires maximized population growth rates for this species. A regression design LTRE was used to evaluate the effect of number of days since last fire on population growth. Maximum population growth rates calculated using standard asymptotic analysis were realized shortly following a burn event (<2 years), and a regression design LTRE showed that short-term fire-mediated changes in vital rates translated into observed increases in population growth. The LTRE identified fecundity and individual growth as contributing most to increases in post-fire population growth. Our analyses found that the current four-year prescribed fire return intervals used at the study sites can be significantly shortened to increase the population growth rates of this rare species. Understanding the role of fire frequency and season in creating and maintaining appropriate habitat for this species may aid in the conservation of this and other rare herb bog/savanna inhabitants.

  2. A critical review of the protracted domestication model for Near-Eastern founder crops: linear regression, long-distance gene flow, archaeological, and archaeobotanical evidence.

    PubMed

    Heun, Manfred; Abbo, Shahal; Lev-Yadun, Simcha; Gopher, Avi

    2012-07-01

    The recent review by Fuller et al. (2012a) in this journal is part of a series of papers maintaining that plant domestication in the Near East was a slow process lasting circa 4000 years and occurring independently in different locations across the Fertile Crescent. Their protracted domestication scenario is based entirely on linear regression derived from the percentage of domesticated plant remains at specific archaeological sites and the age of these sites themselves. This paper discusses why estimates like haldanes and darwins cannot be applied to the seven founder crops in the Near East (einkorn and emmer wheat, barley, peas, chickpeas, lentils, and bitter vetch). All of these crops are self-fertilizing plants and for this reason they do not fulfil the requirements for performing calculations of this kind. In addition, the percentage of domesticates at any site may be the result of factors other than those that affect the selection for domesticates growing in the surrounding area. These factors are unlikely to have been similar across prehistoric sites of habitation, societies, and millennia. The conclusion here is that single crop analyses are necessary rather than general reviews drawing on regression analyses based on erroneous assumptions. The fact that all seven of these founder crops are self-fertilizers should be incorporated into a comprehensive domestication scenario for the Near East, as self-fertilization naturally isolates domesticates from their wild progenitors.

  3. Spatial Autocorrelation of Cancer Incidence in Saudi Arabia

    PubMed Central

    Al-Ahmadi, Khalid; Al-Zahrani, Ali

    2013-01-01

    Little is known about the geographic distribution of common cancers in Saudi Arabia. We explored the spatial incidence patterns of common cancers in Saudi Arabia using spatial autocorrelation analyses, employing the global Moran’s I and Anselin’s local Moran’s I statistics to detect nonrandom incidence patterns. Global ordinary least squares (OLS) regression and local geographically-weighted regression (GWR) were applied to examine the spatial correlation of cancer incidences at the city level. Population-based records of cancers diagnosed between 1998 and 2004 were used. Male lung cancer and female breast cancer exhibited positive statistically significant global Moran’s I index values, indicating a tendency toward clustering. The Anselin’s local Moran’s I analyses revealed small significant clusters of lung cancer, prostate cancer and Hodgkin’s disease among males in the Eastern region and significant clusters of thyroid cancers in females in the Eastern and Riyadh regions. Additionally, both regression methods found significant associations among various cancers. For example, OLS and GWR revealed significant spatial associations among NHL, leukemia and Hodgkin’s disease (r² = 0.49–0.67 using OLS and r² = 0.52–0.68 using GWR) and between breast and prostate cancer (r² = 0.53 OLS and 0.57 GWR) in Saudi Arabian cities. These findings may help to generate etiologic hypotheses of cancer causation and identify spatial anomalies in cancer incidence in Saudi Arabia. Our findings should stimulate further research on the possible causes underlying these clusters and associations. PMID:24351742

  4. Sports spectator behavior: a test of the theory of planned behavior.

    PubMed

    Lu, Wan-Chen; Lin, Shin-Huei; Cheng, Chih-Fu

    2011-12-01

    The theory of planned behavior has been applied to sports and exercise behaviors. According to this theory, human intention to take action in a specific context is guided by three antecedents: attitude, subjective norm, and perceived behavioral control. Behavioral intention mediates the relationships between these three considerations and its ultimate performance. However, this theory has seldom been applied to the behaviors of spectators of sporting events. A sample of 269 volleyball spectators in Taiwan was studied to examine whether people's intention mediated their attitudes, subjective norms, and perceived behavioral control toward a given behavior, watching the 2010 Fédération Internationale de Volleyball World Grand Prix in Taipei. Regression analyses did not support behavioral intention as a mediator. This result is discussed in the context of planned behavior.

  5. Time Series Analysis and Forecasting of Wastewater Inflow into Bandar Tun Razak Sewage Treatment Plant in Selangor, Malaysia

    NASA Astrophysics Data System (ADS)

    Abunama, Taher; Othman, Faridah

    2017-06-01

    Analysing the fluctuations of wastewater inflow rates in sewage treatment plants (STPs) is essential to guarantee a sufficient treatment of wastewater before discharging it to the environment. The main objectives of this study are to statistically analyze and forecast the wastewater inflow rates into the Bandar Tun Razak STP in Kuala Lumpur, Malaysia. A time series analysis of three years' weekly influent data (156 weeks) has been conducted using the Auto-Regressive Integrated Moving Average (ARIMA) model. Various combinations of ARIMA orders (p, d, q) were tried to select the best-fitting model, which was then used to forecast the wastewater inflow rates. Linear regression analysis was applied to test the correlation between the observed and predicted inflows. The ARIMA (3, 1, 3) model was selected for its highest R-squared and lowest normalized Bayesian Information Criterion (BIC) value, and the wastewater inflow rates were accordingly forecast for an additional 52 weeks. The linear regression analysis between the observed and predicted values of the wastewater inflow rates showed a positive linear correlation with a coefficient of 0.831.
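
    A sketch of the same workflow with statsmodels, assuming a synthetic weekly inflow series in place of the 156 observed values; the ARIMA(3, 1, 3) order is taken from the abstract, everything else is illustrative.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # Illustrative weekly inflow series (m3/day); replace with the observed data.
    rng = np.random.default_rng(3)
    weeks = pd.date_range("2014-01-05", periods=156, freq="W")
    inflow = pd.Series(30_000 + 2_000 * np.sin(np.arange(156) * 2 * np.pi / 52)
                       + rng.normal(scale=800, size=156), index=weeks)

    # Fit ARIMA(3, 1, 3), the order selected in the study, and forecast 52 weeks ahead.
    fit = ARIMA(inflow, order=(3, 1, 3)).fit()
    forecast = fit.forecast(steps=52)
    print(fit.aic, fit.bic)
    print(forecast.head())

    # Linear correlation between fitted and observed values, as reported in the study.
    pred = fit.fittedvalues                       # in-sample one-step-ahead predictions
    print(np.corrcoef(inflow.iloc[1:], pred.iloc[1:])[0, 1])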

  6. Effects of Sleep Quality on the Association between Problematic Mobile Phone Use and Mental Health Symptoms in Chinese College Students.

    PubMed

    Tao, Shuman; Wu, Xiaoyan; Zhang, Yukun; Zhang, Shichen; Tong, Shilu; Tao, Fangbiao

    2017-02-14

    Problematic mobile phone use (PMPU) is a risk factor for both adolescents' sleep quality and mental health. It is important to examine the potential negative health effects of PMPU exposure. This study aims to evaluate PMPU and its association with mental health in Chinese college students. Furthermore, we investigated how sleep quality influences this association. In 2013, we collected data regarding participants' PMPU, sleep quality, and mental health (psychopathological symptoms, anxiety, and depressive symptoms) by standardized questionnaires in 4747 college students. Multivariate logistic regression analysis was applied to assess independent effects and interactions of PMPU and sleep quality with mental health. PMPU and poor sleep quality were observed in 28.2% and 9.8% of participants, respectively. Adjusted logistic regression models suggested independent associations of PMPU and sleep quality with mental health (p < 0.001). Further regression analyses suggested a significant interaction between these measures (p < 0.001). The study highlights that poor sleep quality may play a more significant role in increasing the risk of mental health problems in students with PMPU than in those without PMPU.
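
    A hedged sketch of the kind of interaction model described above, using simulated survey data; the variable names, effect sizes and sample are invented and do not reproduce the study's adjusted models.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated stand-in for the survey data
    rng = np.random.default_rng(4)
    n = 500
    df = pd.DataFrame({
        "pmpu": rng.integers(0, 2, n),
        "poor_sleep": rng.integers(0, 2, n),
        "gender": rng.integers(0, 2, n),
    })
    logit_p = -1.0 + 0.6 * df.pmpu + 0.9 * df.poor_sleep + 0.5 * df.pmpu * df.poor_sleep
    df["mental_symptoms"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    # Main effects plus the PMPU x sleep-quality interaction examined in the study
    model = smf.logit("mental_symptoms ~ pmpu * poor_sleep + gender", data=df).fit(disp=0)
    print(model.summary())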

  7. Control Variate Selection for Multiresponse Simulation.

    DTIC Science & Technology

    1987-05-01

    M. H. Kutner, Applied Linear Regression Models, Richard D. Irwin, Inc., Homewood, Illinois, 1983. Neuts, Marcel F., Probability, Allyn and Bacon...1982. Neter, J., W. Wasserman, and M. H. Kutner, Applied Linear Regression Models, Richard D. Irwin, Inc., Homewood, Illinois, 1983. Neuts, Marcel F...Aspects of Multivariate Statistical Theory, John Wiley and Sons, New York, New York, 1982. Neter, J., W. Wasserman, and M. H. Kutner, Applied Linear Regression Models

  8. Linear regression in astronomy. II

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.; Babu, Gutti J.

    1992-01-01

    A wide variety of least-squares linear regression procedures used in observational astronomy, particularly investigations of the cosmic distance scale, are presented and discussed. The classes of linear models considered are (1) unweighted regression lines, with bootstrap and jackknife resampling; (2) regression solutions when measurement error, in one or both variables, dominates the scatter; (3) methods to apply a calibration line to new data; (4) truncated regression models, which apply to flux-limited data sets; and (5) censored regression models, which apply when nondetections are present. For the calibration problem we develop two new procedures: a formula for the intercept offset between two parallel data sets, which propagates slope errors from one regression to the other; and a generalization of the Working-Hotelling confidence bands to nonstandard least-squares lines. They can provide improved error analysis for Faber-Jackson, Tully-Fisher, and similar cosmic distance scale relations.

  9. Regionalization of winter low-flow characteristics of Tennessee streams

    USGS Publications Warehouse

    Bingham, R.H.

    1986-01-01

    Procedures were developed for estimating winter (December-April) low flows at ungaged stream sites in Tennessee based on surface geology and drainage area size. One set of equations applies to West Tennessee streams, and another set applies to Middle and East Tennessee streams. The equations do not apply to streams where flow is significantly altered by the activities of man. Standard errors of estimate of equations for West Tennessee are 22%-35% and for Middle and East Tennessee 31%-36%. Statistical analyses indicate that summer low-flow characteristics are the same as annual low-flow characteristics, and that winter low flows are larger than annual low flows. Streamflow-recession indexes, in days per log cycle of decrease in discharge, were used to account for effects of geology on low flow of streams. The indexes in Tennessee range from 32 days/log cycle for clay and shale to 350 days/log cycle for gravel and sand, indicating different aquifer characteristics of the geologic units that contribute to streamflows during periods of no surface runoff. Streamflow-recession rate depends primarily on transmissivity and storage characteristics of the aquifers, and the average distance from stream channels to basin divides. Geology and drainage basin size are the most significant variables affecting low flow in Tennessee streams according to regression analyses. (Author's abstract)

  10. Poisson regression models outperform the geometrical model in estimating the peak-to-trough ratio of seasonal variation: a simulation study.

    PubMed

    Christensen, A L; Lundbye-Christensen, S; Dethlefsen, C

    2011-12-01

    Several statistical methods of assessing seasonal variation are available. Brookhart and Rothman [3] proposed a second-order moment-based estimator based on the geometrical model derived by Edwards [1], and reported that this estimator is superior in estimating the peak-to-trough ratio of seasonal variation compared with Edwards' estimator with respect to bias and mean squared error. Alternatively, seasonal variation may be modelled using a Poisson regression model, which provides flexibility in modelling the pattern of seasonal variation and adjustments for covariates. In a Monte Carlo simulation study, three estimators (one based on the geometrical model and two based on log-linear Poisson regression models) were evaluated with regard to bias and standard deviation (SD). We evaluated the estimators on data simulated according to schemes varying in seasonal variation and presence of a secular trend. All methods and analyses in this paper are available in the R package Peak2Trough [13]. Applying a Poisson regression model resulted in lower absolute bias and SD for data simulated according to the corresponding model assumptions. Poisson regression models had lower bias and SD for data simulated to deviate from the corresponding model assumptions than the geometrical model. This simulation study encourages the use of Poisson regression models in estimating the peak-to-trough ratio of seasonal variation as opposed to the geometrical model. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
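
    A minimal sketch of a log-linear Poisson regression with a single harmonic, from which the peak-to-trough ratio is recovered as exp(2 * amplitude); the simulated counts and the single-harmonic parameterization are assumptions, not the Peak2Trough implementation referenced above.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated monthly event counts with a seasonal pattern
    rng = np.random.default_rng(5)
    months = np.arange(120)                       # 10 years of monthly data
    true_rate = np.exp(3.0 + 0.3 * np.cos(2 * np.pi * months / 12))
    df = pd.DataFrame({
        "count": rng.poisson(true_rate),
        "cos_m": np.cos(2 * np.pi * months / 12),
        "sin_m": np.sin(2 * np.pi * months / 12),
        "trend": months,                          # optional secular trend
    })

    # Log-linear Poisson regression with a single harmonic plus a secular trend
    fit = smf.poisson("count ~ cos_m + sin_m + trend", data=df).fit(disp=0)
    b, c = fit.params["cos_m"], fit.params["sin_m"]

    # Peak-to-trough ratio implied by the fitted harmonic
    amplitude = np.hypot(b, c)
    print("peak-to-trough ratio:", np.exp(2 * amplitude))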

  11. Using air/water/sediment temperature contrasts to identify groundwater seepage locations in small streams

    NASA Astrophysics Data System (ADS)

    Karan, S.; Sebok, E.; Engesgaard, P. K.

    2016-12-01

    To identify groundwater seepage locations in small streams within a headwater catchment, we present a method that expands on the linear regression of air and stream temperatures. Temperatures are measured at two depths, in the stream column and at the streambed-water interface (SWI), and metrics are derived from linear regression of air/stream and air/SWI temperatures (regression slope, intercept and coefficient of determination) and from the daily mean temperatures (temperature variance and the average difference between the minimum and maximum daily temperatures). Our study shows that metrics from single-depth stream temperature measurements alone are not sufficient to identify substantial groundwater seepage locations within a headwater stream. In contrast, the metrics from dual-depth temperatures differ markedly: at groundwater seepage locations, air temperatures explain only 43-75% of the variation in SWI temperatures, as opposed to ≥91% of the variation in the corresponding stream column temperatures. Box plots of daily mean temperature show that at several locations there is a large difference between the upper and lower loggers due to groundwater seepage. At these locations, the regression slopes at the SWI are substantially lower (<0.25), the intercepts substantially higher (>6.5 °C), and the mean diel amplitudes smaller (<0.98 °C) than at the remaining locations. The dual-depth approach was applied in a post-glacial fluvial setting, where the metric analyses corresponded overall to field measurements of groundwater fluxes deduced from vertical streambed temperatures and streamflow accretions. We therefore propose this as a method for reliably identifying groundwater seepage locations along the streambed in such settings.
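
    A sketch of the dual-depth regression metrics, assuming simulated daily mean temperatures; in a seepage-influenced reach the air/SWI regression is expected to show a low slope, high intercept and low r2 relative to the air/stream-column regression.

    import numpy as np
    from scipy import stats

    # Illustrative daily mean temperatures (deg C); real data would be measured
    # simultaneously in air, in the stream column, and at the streambed-water interface.
    rng = np.random.default_rng(9)
    air = 10 + 8 * np.sin(np.linspace(0, 2 * np.pi, 365)) + rng.normal(0, 2, 365)
    stream = 0.9 * air + 1.0 + rng.normal(0, 0.5, 365)    # tracks air temperature closely
    swi = 0.2 * air + 7.0 + rng.normal(0, 0.5, 365)       # damped by groundwater seepage

    for name, t in [("air/stream", stream), ("air/SWI", swi)]:
        res = stats.linregress(air, t)
        print(f"{name}: slope={res.slope:.2f} intercept={res.intercept:.1f} "
              f"r2={res.rvalue**2:.2f}")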

  12. New 1,6-heptadienes with pyrimidine bases attached: Syntheses and spectroscopic analyses

    NASA Astrophysics Data System (ADS)

    Hammud, Hassan H.; Ghannoum, Amer M.; Fares, Fares A.; Abramian, Lara K.; Bouhadir, Kamal H.

    2008-06-01

    A simple, high yielding synthesis leading to the functionalization of some pyrimidine bases with a 1,6-heptadienyl moiety spaced from the N-1 position by a methylene group is described. A key step in this synthesis involves a Mitsunobu reaction by coupling 3N-benzoyluracil and 3N-benzoylthymine to 2-allyl-pent-4-en-1-ol followed by alkaline hydrolysis of the 3N-benzoyl protecting groups. This protocol should eventually lend itself to the synthesis of a host of N-alkylated nucleoside analogs. The absorption and emission properties of these pyrimidine derivatives (3-6) were studied in solvents of different physical properties. Computerized analysis and multiple regression techniques were applied to calculate the regression and correlation coefficients based on the equation that relates peak position λmax to the solvent parameters that depend on the H-bonding ability, refractive index, and dielectric constant of solvents.

  13. Computational Visual Stress Level Analysis of Calcareous Algae Exposed to Sedimentation

    PubMed Central

    Nilssen, Ingunn; Eide, Ingvar; de Oliveira Figueiredo, Marcia Abreu; de Souza Tâmega, Frederico Tapajós; Nattkemper, Tim W.

    2016-01-01

    This paper presents a machine learning based approach for analyses of photos collected from laboratory experiments conducted to assess the potential impact of water-based drill cuttings on deep-water rhodolith-forming calcareous algae. This pilot study uses imaging technology to quantify and monitor the stress levels of the calcareous algae Mesophyllum engelhartii (Foslie) Adey caused by various degrees of light exposure, flow intensity and amount of sediment. A machine learning based algorithm was applied to assess the temporal variation of the calcareous algae size (∼ mass) and color automatically. Measured size and color were correlated to the photosynthetic efficiency (maximum quantum yield of charge separation in photosystem II, ΦPSIImax) and degree of sediment coverage using multivariate regression. The multivariate regression showed correlations between time and calcareous algae sizes, as well as correlations between fluorescence and calcareous algae colors. PMID:27285611

  14. Quantitative analysis of aircraft multispectral-scanner data and mapping of water-quality parameters in the James River in Virginia

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.; Bahn, G. S.

    1977-01-01

    Statistical analysis techniques were applied to develop quantitative relationships between in situ river measurements and the remotely sensed data that were obtained over the James River in Virginia on 28 May 1974. The remotely sensed data were collected with a multispectral scanner and with photographs taken from an aircraft platform. Concentration differences among water quality parameters such as suspended sediment, chlorophyll a, and nutrients indicated significant spectral variations. Calibrated equations from the multiple regression analysis were used to develop maps that indicated the quantitative distributions of water quality parameters and the dispersion characteristics of a pollutant plume entering the turbid river system. Results from further analyses that use only three preselected multispectral scanner bands of data indicated that regression coefficients and standard errors of estimate were not appreciably degraded compared with results from the 10-band analysis.

  15. Prediction of elemental creep. [steady state and cyclic data from regression analysis

    NASA Technical Reports Server (NTRS)

    Davis, J. W.; Rummler, D. R.

    1975-01-01

    Cyclic and steady-state creep tests were performed to provide data which were used to develop predictive equations. These equations, describing creep as a function of stress, temperature, and time, were developed through the use of a least-squares regression analysis computer program for both the steady-state and cyclic data sets. Comparison of the data from the two types of tests revealed that there was no significant difference between the cyclic and steady-state creep strains for the L-605 sheet under the experimental conditions investigated (for the same total time at load). Attempts to develop a single linear equation describing the combined steady-state and cyclic creep data resulted in standard errors of estimate higher than those obtained for the individual data sets. A proposed approach to predict elemental creep in metals uses the cyclic creep equation and a computer program which applies strain and time hardening theories of creep accumulation.

  16. The use of segmented regression in analysing interrupted time series studies: an example in pre-hospital ambulance care.

    PubMed

    Taljaard, Monica; McKenzie, Joanne E; Ramsay, Craig R; Grimshaw, Jeremy M

    2014-06-19

    An interrupted time series design is a powerful quasi-experimental approach for evaluating effects of interventions introduced at a specific point in time. To utilize the strength of this design, a modification to standard regression analysis, such as segmented regression, is required. In segmented regression analysis, the change in intercept and/or slope from pre- to post-intervention is estimated and used to test causal hypotheses about the intervention. We illustrate segmented regression using data from a previously published study that evaluated the effectiveness of a collaborative intervention to improve quality in pre-hospital ambulance care for acute myocardial infarction (AMI) and stroke. In the original analysis, a standard regression model was used with time as a continuous variable. We contrast the results from this standard regression analysis with those from segmented regression analysis. We discuss the limitations of the former and advantages of the latter, as well as the challenges of using segmented regression in analysing complex quality improvement interventions. Based on the estimated change in intercept and slope from pre- to post-intervention using segmented regression, we found insufficient evidence of a statistically significant effect on quality of care for stroke, although potential clinically important effects for AMI cannot be ruled out. Segmented regression analysis is the recommended approach for analysing data from an interrupted time series study. Several modifications to the basic segmented regression analysis approach are available to deal with challenges arising in the evaluation of complex quality improvement interventions.
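
    A minimal sketch of the segmented regression model described above, with a simulated monthly series and an intervention at month 24; the variable names and effect sizes are illustrative.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Illustrative monthly quality-of-care series with an intervention at month 24
    rng = np.random.default_rng(6)
    t = np.arange(48)
    intervention = (t >= 24).astype(int)
    time_after = np.where(t >= 24, t - 24, 0)
    y = 60 + 0.2 * t + 5 * intervention + 0.4 * time_after + rng.normal(0, 2, 48)

    df = pd.DataFrame({"y": y, "time": t,
                       "intervention": intervention, "time_after": time_after})

    # Segmented regression: the intervention coefficient is the change in level and
    # the time_after coefficient the change in slope from pre- to post-intervention.
    # For autocorrelated errors one could instead call
    # .fit(cov_type="HAC", cov_kwds={"maxlags": 3}).
    fit = smf.ols("y ~ time + intervention + time_after", data=df).fit()
    print(fit.params)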

  17. Reduction of interferences in graphite furnace atomic absorption spectrometry by multiple linear regression modelling

    NASA Astrophysics Data System (ADS)

    Grotti, Marco; Abelmoschi, Maria Luisa; Soggia, Francesco; Tiberiade, Christian; Frache, Roberto

    2000-12-01

    The multivariate effects of Na, K, Mg and Ca as nitrates on the electrothermal atomisation of manganese, cadmium and iron were studied by multiple linear regression modelling. Since the models proved to efficiently predict the effects of the considered matrix elements in a wide range of concentrations, they were applied to correct the interferences occurring in the determination of trace elements in seawater after pre-concentration of the analytes. In order to obtain a statistically significant number of samples, a large volume of the certified seawater reference materials CASS-3 and NASS-3 was treated with Chelex-100 resin; then, the chelating resin was separated from the solution, divided into several sub-samples, each of them was eluted with nitric acid and analysed by electrothermal atomic absorption spectrometry (for trace element determinations) and inductively coupled plasma optical emission spectrometry (for matrix element determinations). To minimise any other systematic error besides that due to matrix effects, accuracy of the pre-concentration step and contamination levels of the procedure were checked by inductively coupled plasma mass spectrometric measurements. Analytical results obtained by applying the multiple linear regression models were compared with those obtained with other calibration methods, such as external calibration using acid-based standards, external calibration using matrix-matched standards and the analyte addition technique. Empirical models proved to efficiently reduce interferences occurring in the analysis of real samples, allowing an improvement of accuracy better than for other calibration methods.

  18. Evaluation of the efficiency of continuous wavelet transform as processing and preprocessing algorithm for resolution of overlapped signals in univariate and multivariate regression analyses; an application to ternary and quaternary mixtures

    NASA Astrophysics Data System (ADS)

    Hegazy, Maha A.; Lotfy, Hayam M.; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-07-01

    Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study was conducted on the efficiency of the continuous wavelet transform (CWT) as a signal-processing tool in univariate regression and as a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS). Both approaches were applied to complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). The univariate CWT, in contrast, failed to determine the quaternary mixture components simultaneously; it was able to determine only PAR and PAP in that mixture, together with the ternary mixtures of DRO, CAF and PAR and of CAF, PAR and PAP. During the CWT calculations, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design and its absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficient and concentration matrices, and validation was performed with both cross-validation and external validation sets. Both methods were successfully applied to the determination of the studied drugs in pharmaceutical formulations.

  19. CyTOF workflow: differential discovery in high-throughput high-dimensional cytometry datasets

    PubMed Central

    Nowicka, Malgorzata; Krieg, Carsten; Weber, Lukas M.; Hartmann, Felix J.; Guglietta, Silvia; Becher, Burkhard; Levesque, Mitchell P.; Robinson, Mark D.

    2017-01-01

    High dimensional mass and flow cytometry (HDCyto) experiments have become a method of choice for high throughput interrogation and characterization of cell populations. Here, we present an R-based pipeline for differential analyses of HDCyto data, largely based on Bioconductor packages. We computationally define cell populations using FlowSOM clustering, and facilitate an optional but reproducible strategy for manual merging of algorithm-generated clusters. Our workflow offers different analysis paths, including association of cell type abundance with a phenotype or changes in signaling markers within specific subpopulations, or differential analyses of aggregated signals. Importantly, the differential analyses we show are based on regression frameworks where the HDCyto data is the response; thus, we are able to model arbitrary experimental designs, such as those with batch effects, paired designs and so on. In particular, we apply generalized linear mixed models to analyses of cell population abundance or cell-population-specific analyses of signaling markers, allowing overdispersion in cell count or aggregated signals across samples to be appropriately modeled. To support the formal statistical analyses, we encourage exploratory data analysis at every step, including quality control (e.g. multi-dimensional scaling plots), reporting of clustering results (dimensionality reduction, heatmaps with dendrograms) and differential analyses (e.g. plots of aggregated signals). PMID:28663787

  20. Gender Gaps in Mathematics, Science and Reading Achievements in Muslim Countries: Evidence from Quantile Regression Analyses

    ERIC Educational Resources Information Center

    Shafiq, M. Najeeb

    2011-01-01

    Using quantile regression analyses, this study examines gender gaps in mathematics, science, and reading in Azerbaijan, Indonesia, Jordan, the Kyrgyz Republic, Qatar, Tunisia, and Turkey among 15 year-old students. The analyses show that girls in Azerbaijan achieve as well as boys in mathematics and science and overachieve in reading. In Jordan,…

  1. Biomedical research competencies for osteopathic medical students

    PubMed Central

    Cruser, des Anges; Dubin, Bruce; Brown, Sarah K; Bakken, Lori L; Licciardone, John C; Podawiltz, Alan L; Bulik, Robert J

    2009-01-01

    Background Without systematic exposure to biomedical research concepts or applications, osteopathic medical students may be generally under-prepared to efficiently consume and effectively apply research and evidence-based medicine information in patient care. The academic literature suggests that although medical residents are increasingly expected to conduct research in their post graduate training specialties, they generally have limited understanding of research concepts. With grant support from the National Center for Complementary and Alternative Medicine, and a grant from the Osteopathic Heritage Foundation, the University of North Texas Health Science Center (UNTHSC) is incorporating research education in the osteopathic medical school curriculum. The first phase of this research education project involved a baseline assessment of students' understanding of targeted research concepts. This paper reports the results of that assessment and discusses implications for research education during medical school. Methods Using a novel set of research competencies supported by the literature as needed for understanding research information, we created a questionnaire to measure students' confidence and understanding of selected research concepts. Three matriculating medical school classes completed the on-line questionnaire. Data were analyzed for differences between groups using analysis of variance and t-tests. Correlation coefficients were computed for the confidence and applied understanding measures. We performed a principal component factor analysis of the confidence items, and used multiple regression analyses to explore how confidence might be related to the applied understanding. Results Of 496 total incoming, first, and second year medical students, 354 (71.4%) completed the questionnaire. Incoming students expressed significantly more confidence than first or second year students (F = 7.198, df = 2, 351, P = 0.001) in their ability to understand the research concepts. Factor analyses of the confidence items yielded conceptually coherent groupings. Regression analysis confirmed a relationship between confidence and applied understanding referred to as knowledge. Confidence scores were important in explaining variability in knowledge scores of the respondents. Conclusion Medical students with limited understanding of research concepts may struggle to understand the medical literature. Assessing medical students' confidence to understand and objectively measured ability to interpret basic research concepts can be used to incorporate competency based research material into the existing curriculum. PMID:19825171

  2. A tutorial on the use of Excel 2010 and Excel for Mac 2011 for conducting delay-discounting analyses.

    PubMed

    Reed, Derek D; Kaplan, Brent A; Brewer, Adam T

    2012-01-01

    In recent years, researchers and practitioners in the behavioral sciences have profited from a growing literature on delay discounting. The purpose of this article is to provide readers with a brief tutorial on how to use Microsoft Office Excel 2010 and Excel for Mac 2011 to analyze discounting data to yield parameters for both the hyperbolic discounting model and area under the curve. This tutorial is intended to encourage the quantitative analysis of behavior in both research and applied settings by readers with relatively little formal training in nonlinear regression.

  3. A TUTORIAL ON THE USE OF EXCEL 2010 AND EXCEL FOR MAC 2011 FOR CONDUCTING DELAY-DISCOUNTING ANALYSES

    PubMed Central

    Reed, Derek D; Kaplan, Brent A; Brewer, Adam T

    2012-01-01

    In recent years, researchers and practitioners in the behavioral sciences have profited from a growing literature on delay discounting. The purpose of this article is to provide readers with a brief tutorial on how to use Microsoft Office Excel 2010 and Excel for Mac 2011 to analyze discounting data to yield parameters for both the hyperbolic discounting model and area under the curve. This tutorial is intended to encourage the quantitative analysis of behavior in both research and applied settings by readers with relatively little formal training in nonlinear regression. PMID:22844143
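
    The two quantities produced by the Excel tutorial described in the records above, the hyperbolic discounting parameter k and the area under the curve, can also be computed in a few lines of Python; the delays and indifference points below are invented, and the normalized form V = 1/(1 + kD) assumes values expressed as a proportion of the delayed amount.

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.integrate import trapezoid

    # Illustrative indifference points (proportion of the delayed amount) at each delay
    delays = np.array([1, 7, 30, 90, 180, 365], dtype=float)    # days
    values = np.array([0.95, 0.85, 0.65, 0.45, 0.30, 0.20])     # normalized value

    # Hyperbolic discounting model: V = 1 / (1 + k * D)
    def hyperbolic(D, k):
        return 1.0 / (1.0 + k * D)

    (k_hat,), _ = curve_fit(hyperbolic, delays, values, p0=[0.01])
    print("fitted k:", k_hat)

    # Area under the curve on normalized axes (trapezoid rule)
    x = delays / delays.max()
    auc = trapezoid(values, x)
    print("AUC:", auc)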

  4. Applying ethnic equivalence and cultural values models to African-American teens' perceptions of parents.

    PubMed

    Lamborn, Susie D; Felbab, Amanda J

    2003-10-01

    This study evaluated both the parenting styles and family ecologies models with interview responses from 93 14- and 15-year-old African-American adolescents. The parenting styles model was more strongly represented in both open-ended and structured interview responses. Using variables from the structured interview as independent variables, regression analyses contrasted each model with a joint model for predicting self-esteem, self-reliance, work orientation, and ethnic identity. Overall, the findings suggest that a joint model that combines elements from both models provides a richer understanding of African-American families.

  5. What We Have Learned from the Recent Meta-analyses on Diagnostic Methods for Atherosclerotic Plaque Regression.

    PubMed

    Biondi-Zoccai, Giuseppe; Mastrangeli, Simona; Romagnoli, Enrico; Peruzzi, Mariangela; Frati, Giacomo; Roever, Leonardo; Giordano, Arturo

    2018-01-17

    Atherosclerosis has major morbidity and mortality implications globally. While it has often been considered an irreversible degenerative process, recent evidence provides compelling proof that atherosclerosis can be reversed. Plaque regression is however difficult to appraise and quantify, with competing diagnostic methods available. Given the potential of evidence synthesis to provide clinical guidance, we aimed to review recent meta-analyses on diagnostic methods for atherosclerotic plaque regression. We identified 8 meta-analyses published between 2015 and 2017, including 79 studies and 14,442 patients, followed for a median of 12 months. They reported on atherosclerotic plaque regression appraised with carotid duplex ultrasound, coronary computed tomography, carotid magnetic resonance, coronary intravascular ultrasound, and coronary optical coherence tomography. Overall, all meta-analyses showed significant atherosclerotic plaque regression with lipid-lowering therapy, with the most notable effects on echogenicity, lipid-rich necrotic core volume, wall/plaque volume, dense calcium volume, and fibrous cap thickness. Significant interactions were found with concomitant changes in low density lipoprotein cholesterol, high density lipoprotein cholesterol, and C-reactive protein levels, and with ethnicity. Atherosclerotic plaque regression and conversion to a stable phenotype is possible with intensive medical therapy and can be demonstrated in patients using a variety of non-invasive and invasive imaging modalities.

  6. Geo-Chip analysis reveals reduced functional diversity of the bacterial community at a dumping site for dredged Elbe sediment.

    PubMed

    Störmer, Rebecca; Wichels, Antje; Gerdts, Gunnar

    2013-12-15

    The dumping of dredged sediments represents a major stressor for coastal ecosystems. The impact on ecosystem function is difficult to assess because of the ecosystem's complexity. In the present study, we evaluated the potential of bacterial community analyses to act as ecological indicators in environmental monitoring programmes. We investigated the functional structure of bacterial communities, applying functional gene arrays (GeoChip4.2). The relationship between functional genes and environmental factors was analysed using distance-based multivariate multiple regression. Apparently, both the function and structure of the bacterial communities are impacted by dumping activities. The bacterial community at the dumping centre displayed a significant reduction of its entire functional diversity compared with that found at a reference site. DDX compounds separated bacterial communities of the dumping site from those of un-impacted sites. Thus, bacterial community analyses show great potential as ecological indicators in environmental monitoring. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Modelling nitrate pollution pressure using a multivariate statistical approach: the case of Kinshasa groundwater body, Democratic Republic of Congo

    NASA Astrophysics Data System (ADS)

    Mfumu Kihumba, Antoine; Ndembo Longo, Jean; Vanclooster, Marnik

    2016-03-01

    A multivariate statistical modelling approach was applied to explain the anthropogenic pressure of nitrate pollution on the Kinshasa groundwater body (Democratic Republic of Congo). Multiple regression and regression tree models were compared and used to identify major environmental factors that control the groundwater nitrate concentration in this region. The analyses were made in terms of physical attributes related to the topography, land use, geology and hydrogeology in the capture zone of different groundwater sampling stations. For the nitrate data, groundwater datasets from two different surveys were used. The statistical models identified the topography, the residential area, the service land (cemetery), and the surface-water land-use classes as major factors explaining nitrate occurrence in the groundwater. Also, groundwater nitrate pollution depends not on one single factor but on the combined influence of factors representing nitrogen loading sources and aquifer susceptibility characteristics. The groundwater nitrate pressure was better predicted with the regression tree model than with the multiple regression model. Furthermore, the results elucidated the sensitivity of the model performance towards the method of delineation of the capture zones. For pollution modelling at the monitoring points, therefore, it is better to identify capture-zone shapes based on a conceptual hydrogeological model rather than to adopt arbitrary circular capture zones.
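
    A hedged sketch contrasting the two model classes compared in the study; the capture-zone attributes, their names and the simulated nitrate response are placeholders, not the Kinshasa data.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LinearRegression
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.model_selection import cross_val_score

    # Hypothetical capture-zone attributes standing in for the topography, land-use,
    # geology and hydrogeology variables used in the study
    rng = np.random.default_rng(7)
    n = 200
    X = pd.DataFrame({
        "slope": rng.uniform(0, 15, n),
        "residential_frac": rng.uniform(0, 1, n),
        "cemetery_frac": rng.uniform(0, 0.2, n),
        "surface_water_frac": rng.uniform(0, 0.5, n),
    })
    nitrate = (20 * X.residential_frac + 80 * X.cemetery_frac
               - 0.8 * X.slope + rng.normal(0, 5, n)).clip(lower=0)

    # Compare multiple linear regression with a regression tree by cross-validated R2
    for name, model in [("multiple regression", LinearRegression()),
                        ("regression tree", DecisionTreeRegressor(max_depth=4, random_state=0))]:
        r2 = cross_val_score(model, X, nitrate, cv=5, scoring="r2").mean()
        print(f"{name}: mean CV R2 = {r2:.2f}")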

  8. Modelling of capital asset pricing by considering the lagged effects

    NASA Astrophysics Data System (ADS)

    Sukono; Hidayat, Y.; Bon, A. Talib bin; Supian, S.

    2017-01-01

    In this paper the problem of modelling the Capital Asset Pricing Model (CAPM) with lagged effects is discussed. It is assumed that asset returns are influenced by the market return and the return of risk-free assets. The relationship between asset returns, the market return, and the return of risk-free assets is analysed using a CAPM regression equation and a distributed-lag CAPM regression equation. Building on the distributed-lag CAPM regression, this paper also develops a Koyck-transformed CAPM regression equation. The results show that the Koyck-transformed CAPM regression has the advantage of simplicity, requiring only three parameters, compared with the distributed-lag CAPM regression.
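
    A minimal sketch of the Koyck-transformed CAPM regression, whose three parameters are the intercept, the market beta and the lag coefficient lambda; the simulated return series and parameter values are illustrative assumptions.

    import numpy as np
    import statsmodels.api as sm

    # In the Koyck-transformed CAPM the excess asset return is regressed on the current
    # excess market return and on its own first lag, so only three coefficients remain.
    rng = np.random.default_rng(8)
    T = 300
    x = rng.normal(0, 0.04, T)                   # excess market return
    alpha, beta, lam = 0.001, 1.1, 0.4
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = alpha * (1 - lam) + beta * x[t] + lam * y[t - 1] + rng.normal(0, 0.01)

    X = sm.add_constant(np.column_stack([x[1:], y[:-1]]))   # [1, x_t, y_{t-1}]
    fit = sm.OLS(y[1:], X).fit()
    print(fit.params)    # estimates of alpha*(1-lambda), beta, lambda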

  9. An Analysis of COLA (Cost of Living Adjustment) Allocation within the United States Coast Guard.

    DTIC Science & Technology

    1983-09-01

    books Applied Linear Regression [Ref. 39], and Statistical Methods in Research and Production [Ref. 40], or any other book on regression. In the event...Indexes, Master's Thesis, Air Force Institute of Technology, Wright-Patterson AFB, 1976. 39. Weisberg, Sanford, Applied Linear Regression, Wiley, 1980. 40

  10. A novel method linking neural connectivity to behavioral fluctuations: Behavior-regressed connectivity.

    PubMed

    Passaro, Antony D; Vettel, Jean M; McDaniel, Jonathan; Lawhern, Vernon; Franaszczuk, Piotr J; Gordon, Stephen M

    2017-03-01

    During an experimental session, behavioral performance fluctuates, yet most neuroimaging analyses of functional connectivity derive a single connectivity pattern. These conventional connectivity approaches assume that since the underlying behavior of the task remains constant, the connectivity pattern is also constant. We introduce a novel method, behavior-regressed connectivity (BRC), to directly examine behavioral fluctuations within an experimental session and capture their relationship to changes in functional connectivity. This method employs the weighted phase lag index (WPLI) applied to a window of trials with a weighting function. Using two datasets, the BRC results are compared to conventional connectivity results during two time windows: the one second before stimulus onset to identify predictive relationships, and the one second after onset to capture task-dependent relationships. In both tasks, we replicate the expected results for the conventional connectivity analysis, and extend our understanding of the brain-behavior relationship using the BRC analysis, demonstrating subject-specific BRC maps that correspond to both positive and negative relationships with behavior. Comparison with Existing Method(s): Conventional connectivity analyses assume a consistent relationship between behaviors and functional connectivity, but the BRC method examines performance variability within an experimental session to understand dynamic connectivity and transient behavior. The BRC approach examines connectivity as it covaries with behavior to complement the knowledge of underlying neural activity derived from conventional connectivity analyses. Within this framework, BRC may be implemented for the purpose of understanding performance variability both within and between participants. Published by Elsevier B.V.
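
    The abstract does not give the weighting function or the exact estimator, so the sketch below is only an illustration of the idea: the standard weighted phase lag index computed over a window of trials, with an assumed per-trial weight vector standing in for the behavioral weighting described in the paper.

      import numpy as np

      def behavior_weighted_wpli(cross_spectra, weights):
          """Illustrative behavior-weighted WPLI over a window of trials.

          cross_spectra : complex array (n_trials,), cross-spectrum between two
                          channels at one frequency, one value per trial.
          weights       : nonnegative array (n_trials,), a stand-in for the
                          behavioral weighting function, which the abstract does
                          not specify.
          """
          w = np.asarray(weights, dtype=float)
          imag = np.imag(np.asarray(cross_spectra))
          num = np.abs(np.sum(w * imag))
          den = np.sum(w * np.abs(imag))
          return num / den if den > 0 else np.nan

      # toy usage: 20 trials of random cross-spectra, weights favouring later trials
      rng = np.random.default_rng(0)
      sxy = rng.standard_normal(20) + 1j * rng.standard_normal(20)
      print(behavior_weighted_wpli(sxy, np.linspace(0.1, 1.0, 20)))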

  11. Who Adopts Improved Fuels and Cookstoves? A Systematic Review

    PubMed Central

    Lewis, Jessica J.

    2012-01-01

    Background: The global focus on improved cookstoves (ICSs) and clean fuels has increased because of their potential for delivering triple dividends: household health, local environmental quality, and regional climate benefits. However, ICS and clean fuel dissemination programs have met with low rates of adoption. Objectives: We reviewed empirical studies on ICSs and fuel choice to describe the literature, examine determinants of fuel and stove choice, and identify knowledge gaps. Methods: We conducted a systematic review of the literature on the adoption of ICSs or cleaner fuels by households in developing countries. Results are synthesized through a simple vote-counting meta-analysis. Results: We identified 32 research studies that reported 146 separate regression analyses of ICS adoption (11 analyses) or fuel choice (135 analyses) from Asia (60%), Africa (27%), and Latin America (19%). Most studies apply multivariate regression methods to consider 7–13 determinants of choice. Income, education, and urban location were positively associated with adoption in most but not all studies. However, the influence of fuel availability and prices, household size and composition, and sex is unclear. Potentially important drivers such as credit, supply-chain strengthening, and social marketing have been ignored. Conclusions: Adoption studies of ICSs or clean energy are scarce, scattered, and of differential quality, even though global distribution programs are quickly expanding. Future research should examine an expanded set of contextual variables to improve implementation of stove programs that can realize the “win-win-win” of health, local environmental quality, and climate associated with these technologies. PMID:22296719

  12. Monitoring Building Deformation with InSAR: Experiments and Validation.

    PubMed

    Yang, Kui; Yan, Li; Huang, Guoman; Chen, Chu; Wu, Zhengpeng

    2016-12-20

    Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR for building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments on two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples for comparing InSAR and leveling approaches to building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. The extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values in both Ordinary Least Squares (OLS) regression and measurement-of-error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between InSAR results and the leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that a millimeter level of accuracy can be achieved with the InSAR technique when measuring building deformation. We discuss the differences in accuracy between OLS regression and measurement-of-error analyses, and compare them with the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated.
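
    A minimal sketch of the OLS-regression and error comparison described above, assuming paired deformation estimates from InSAR and leveling at common points; the numbers are placeholders, not the paper's measurements.

      import numpy as np
      from scipy import stats

      leveling = np.array([-2.1, -1.4, -0.8, 0.3, 1.1, 2.4])   # mm, assumed reference values
      insar    = np.array([-2.3, -1.2, -0.9, 0.5, 1.0, 2.2])   # mm, assumed PS-InSAR estimates

      # OLS regression of InSAR results on leveling values, plus the RMSE index
      slope, intercept, r, p, se = stats.linregress(leveling, insar)
      rmse = np.sqrt(np.mean((insar - leveling) ** 2))
      print(f"OLS: slope = {slope:.2f}, R^2 = {r**2:.2f}; RMSE = {rmse:.2f} mm")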

  13. Declining trends in alcohol consumption among Swedish youth-does the theory of collectivity of drinking cultures apply?

    PubMed

    Raninen, Jonas; Livingston, Michael; Leifman, Håkan

    2014-11-01

    To analyse trends in alcohol consumption among young people in Sweden between 2004 and 2012, to test whether the theory of collectivity of drinking cultures is valid for a population of young people and to investigate the impact of an increasing proportion of abstainers on the overall per capita trends. Data were drawn from an annual survey of a nationally representative sample of students in year 11 (17-18 years old). The data covered 9 years and the total sample comprised 36,141 students. Changes in the overall per capita consumption were tested using linear regression on log-transformed data, and changes in abstention rates were tested using logistic regression. The analyses were then continued by calculating average consumption in deciles. Alcohol consumption among year 11 students declined significantly among both boys and girls between 2004 and 2012. These changes were reflected at all levels of consumption, and the same results were found when abstainers were excluded from the analyses. The increasing proportion of abstainers had a minimal effect on the overall decline in consumption; rather, this was driven by a decline in consumption among the heaviest drinkers. The theory of collectivity of drinking cultures seems valid for understanding changes in alcohol consumption among Swedish year 11 students. No support was found for a polarization of alcohol consumption in this nationally representative sample. © The Author 2014. Medical Council on Alcohol and Oxford University Press. All rights reserved.
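
    The survey data are not available here; the sketch below, assuming a respondent-level file with hypothetical year, consumption and abstainer columns, shows the two tests the abstract describes: a linear trend fitted to log-transformed consumption among drinkers and a logistic regression for abstention.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("youth_survey.csv")                     # assumed input file
      drinkers = df[df["consumption"] > 0].copy()
      drinkers["log_cons"] = np.log(drinkers["consumption"])

      trend = smf.ols("log_cons ~ year", data=drinkers).fit()  # trend on the log scale
      abst = smf.logit("abstainer ~ year", data=df).fit()      # change in abstention odds
      print(trend.params["year"], abst.params["year"])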

  14. Regression modeling of ground-water flow

    USGS Publications Warehouse

    Cooley, R.L.; Naff, R.L.

    1985-01-01

    Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to exercise the student on nearly all the methods that are presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)

  15. Application of Response Surface Methodology for characterization of ozone production from Multi-Cylinder Reactor in non-thermal plasma device

    NASA Astrophysics Data System (ADS)

    Lian See, Tan; Zulazlan Shah Zulkifli, Ahmad; Mook Tzeng, Lim

    2018-04-01

    Ozone is a reactant which can be applied in various environmental treatment processes. It can be generated from atmospheric air with non-thermal plasmas when sufficient voltages are applied through a combination of electrodes and dielectric materials. In this study, the concentration of ozone generated by two different configurations of multi-cylinder dielectric barrier discharge (DBD) reactor (3 x 40 mm and 10 x 10 mm) was investigated. The influence of the voltage and the duty cycle on the concentration of ozone generated by each configuration was analysed using response surface methodology. Voltage was identified as a significant factor in the ozone production process. However, the regressed model was biased towards one of the configurations, leaving the predictions for the other configuration out of range.
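
    A second-order response-surface fit of the kind referred to above can be expressed as an ordinary least squares model with quadratic and interaction terms; the file and column names below (voltage_kV, duty_cycle, ozone_ppm) are assumptions, not the paper's.

      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("dbd_ozone_runs.csv")                   # assumed input file
      model = smf.ols(
          "ozone_ppm ~ voltage_kV + duty_cycle"
          " + I(voltage_kV**2) + I(duty_cycle**2) + voltage_kV:duty_cycle",
          data=df,
      ).fit()
      print(model.summary())    # p-values indicate which factors are significant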

  16. Impact of Contextual Factors on Prostate Cancer Risk and Outcomes

    DTIC Science & Technology

    2013-07-01

    ...contextual-level SES on prostate cancer risk within racial/ethnic groups. The survival analyses (Aims 1-3) will utilize a proportional hazards regression framework with random effects ("frailty models") while the case-control analyses (Aim 4) will use multilevel unconditional logistic regression models...

  17. Efficiency of primary care in rural Burkina Faso. A two-stage DEA analysis

    PubMed Central

    2011-01-01

    Background Providing health care services in Africa is hampered by severe scarcity of personnel, medical supplies and financial funds. Consequently, managers of health care institutions are called to measure and improve the efficiency of their facilities in order to provide the best possible services with their resources. However, very little is known about the efficiency of health care facilities in Africa, and instruments of performance measurement are hardly applied in this context. Objective This study determines the relative efficiency of primary care facilities in Nouna, a rural health district in Burkina Faso. Furthermore, it analyses the factors influencing the efficiency of these institutions. Methodology We apply a two-stage Data Envelopment Analysis (DEA) based on data from a comprehensive provider and household information system. In the first stage, the relative efficiency of each institution is calculated by a traditional DEA model. In the second stage, we identify the reasons for being inefficient using regression techniques. Results The DEA projections suggest that inefficiency is mainly a result of poor utilization of health care facilities, as they were either too big or the demand was too low. Regression results showed that distance is an important factor influencing the efficiency of a health care institution. Conclusions Compared to the findings of existing one-stage DEA analyses of health facilities in Africa, the share of relatively efficient units is slightly higher. The difference might be explained by a rather homogenous structure of the primary care facilities in the Burkina Faso sample. The study also indicates that improving the accessibility of primary care facilities will have a major impact on the efficiency of these institutions. Thus, health decision-makers are called to overcome the demand-side barriers in accessing health care. PMID:22828358

  18. Active ageing and quality of life: factors associated with participation in leisure activities among institutionalized older adults, with and without dementia.

    PubMed

    Fernández-Mayoralas, Gloria; Rojo-Pérez, Fermina; Martínez-Martín, Pablo; Prieto-Flores, Maria-Eugenia; Rodríguez-Blázquez, Carmen; Martín-García, Salomé; Rojo-Abuín, José-Manuel; Forjaz, Maria-Joao

    2015-01-01

    Active ageing, considered from the perspective of participation in leisure activities, promotes life satisfaction and personal well-being. The aims of this work are to define and explain leisure activity profiles among institutionalized older adults, considering their sociodemographic characteristics and objective and subjective conditions in relation to their quality of life. Two samples of institutionalized people aged 60 and over were analysed together: 234 older adults without dementia and 525 with dementia. Sociodemographic, economic, family and social network, and health and functioning variables were selected. Cluster analysis was applied to obtain activity profiles according to the leisure activities, and ordinal regression models were fitted to analyse factors associated with activity level. The sample was clustered into three groups: active (27%), moderately active (35%) and inactive (38%). In the final regression model (Nagelkerke pseudo R² = 0.500), a higher level of activity was associated with better cognitive function (Pfeiffer scale), self-perceived health status and functional ability, as well as with a higher frequency of gathering with family and friends, and a higher educational level. The decline in physical and mental health, the loss of functional capabilities and the weakening of family and social ties represent a significant barrier to active ageing in a context of institutionalization.

  19. Dynamic modelling of n-of-1 data: powerful and flexible data analytics applied to individualised studies.

    PubMed

    Vieira, Rute; McDonald, Suzanne; Araújo-Soares, Vera; Sniehotta, Falko F; Henderson, Robin

    2017-09-01

    N-of-1 studies are based on repeated observations within an individual or unit over time and are acknowledged as an important research method for generating scientific evidence about the health or behaviour of an individual. Statistical analyses of n-of-1 data require accurate modelling of the outcome while accounting for its distribution, time-related trend and error structures (e.g., autocorrelation) as well as reporting readily usable contextualised effect sizes for decision-making. A number of statistical approaches have been documented but no consensus exists on which method is most appropriate for which type of n-of-1 design. We discuss the statistical considerations for analysing n-of-1 studies and briefly review some currently used methodologies. We describe dynamic regression modelling as a flexible and powerful approach, adaptable to different types of outcomes and capable of dealing with the different challenges inherent to n-of-1 statistical modelling. Dynamic modelling borrows ideas from longitudinal and event history methodologies which explicitly incorporate the role of time and the influence of past on future. We also present an illustrative example of the use of dynamic regression on monitoring physical activity during the retirement transition. Dynamic modelling has the potential to expand researchers' access to robust and user-friendly statistical methods for individualised studies.
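
    As one concrete, deliberately simplified illustration of a dynamic regression for n-of-1 data, the sketch below regresses a hypothetical daily outcome on an intervention indicator while modelling first-order autocorrelated errors; the file and variable names are assumptions, not the data from the retirement-transition example.

      import pandas as pd
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      df = pd.read_csv("daily_steps.csv", parse_dates=["date"])   # assumed input file
      # regression of daily steps on a retirement indicator with AR(1) errors,
      # one simple way to account for the autocorrelation mentioned above
      model = SARIMAX(df["steps"], exog=df[["retired"]], order=(1, 0, 0), trend="c")
      res = model.fit(disp=False)
      print(res.summary())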

  20. Metabolic Profiling of Adiponectin Levels in Adults: Mendelian Randomization Analysis.

    PubMed

    Borges, Maria Carolina; Barros, Aluísio J D; Ferreira, Diana L Santos; Casas, Juan Pablo; Horta, Bernardo Lessa; Kivimaki, Mika; Kumari, Meena; Menon, Usha; Gaunt, Tom R; Ben-Shlomo, Yoav; Freitas, Deise F; Oliveira, Isabel O; Gentry-Maharaj, Aleksandra; Fourkala, Evangelia; Lawlor, Debbie A; Hingorani, Aroon D

    2017-12-01

    Adiponectin, a circulating adipocyte-derived protein, has insulin-sensitizing, anti-inflammatory, antiatherogenic, and cardiomyocyte-protective properties in animal models. However, the systemic effects of adiponectin in humans are unknown. Our aims were to define the metabolic profile associated with higher blood adiponectin concentration and investigate whether variation in adiponectin concentration affects the systemic metabolic profile. We applied multivariable regression in ≤5909 adults and Mendelian randomization (using cis-acting genetic variants in the vicinity of the adiponectin gene as instrumental variables) for analyzing the causal effect of adiponectin on the metabolic profile of ≤37,545 adults. Participants were largely European, from 6 longitudinal studies and 1 genome-wide association consortium. In the multivariable regression analyses, higher circulating adiponectin was associated with higher high-density lipoprotein lipids and lower very-low-density lipoprotein lipids, glucose levels, branched-chain amino acids, and inflammatory markers. However, these findings were not supported by Mendelian randomization analyses for most metabolites. Findings were consistent between sexes and after excluding high-risk groups (defined by age and occurrence of previous cardiovascular event) and 1 study with an admixed population. Our findings indicate that blood adiponectin concentration is more likely to be an epiphenomenon in the context of metabolic disease than a key determinant. © 2017 The Authors.

  1. Comparing the index-flood and multiple-regression methods using L-moments

    NASA Astrophysics Data System (ADS)

    Malekinezhad, H.; Nachtnebel, H. P.; Klik, A.

    In arid and semi-arid regions, the length of records is usually too short to ensure reliable quantile estimates. Comparing index-flood and multiple-regression analyses based on L-moments was the main objective of this study. Factor analysis was applied to determine the main variables influencing flood magnitude. Ward’s clustering and L-moments approaches were applied to several sites in the Namak-Lake basin in central Iran to delineate homogeneous regions based on site characteristics. The homogeneity test was carried out using L-moments-based measures. Several distributions were fitted to the regional flood data, and the index-flood and multiple-regression methods were compared as two regional flood frequency approaches. The results of the factor analysis showed that length of main waterway, compactness coefficient, mean annual precipitation, and mean annual temperature were the main variables affecting flood magnitude. The study area was divided into three regions based on Ward’s clustering approach, and the L-moments-based homogeneity test showed that all three regions were acceptably homogeneous. Five distributions were fitted to the annual peak flood data of the three homogeneous regions. Using the L-moment ratios and the Z-statistic criteria, the generalised extreme value (GEV) distribution was identified as the most robust of the five candidate distributions for all the proposed sub-regions, and was concluded to be the best-fit distribution for all three regions. The relative root mean square error (RRMSE) measure was applied to evaluate the performance of the index-flood and multiple-regression methods in comparison with the curve fitting (plotting position) method. In general, the index-flood method gives more reliable estimates for flood magnitudes of different recurrence intervals. Therefore, this method should be adopted as the regional flood frequency method for the study area and the Namak-Lake basin in central Iran. To estimate floods of various return periods for gauged catchments in the study area, the mean annual peak flood of a catchment may be multiplied by the corresponding growth factors computed from the GEV distribution.
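
    The L-moment quantities underlying the homogeneity test and distribution selection can be computed directly from probability-weighted moments; a minimal sketch with toy annual peak flows follows (the index-flood growth-curve step itself is omitted).

      # Sample L-moments from probability-weighted moments (PWMs); these are the
      # building blocks of the L-moments-based measures mentioned in the abstract.
      import numpy as np

      def sample_l_moments(x):
          x = np.sort(np.asarray(x, dtype=float))
          n = len(x)
          j = np.arange(1, n + 1)
          b0 = x.mean()
          b1 = np.sum((j - 1) / (n - 1) * x) / n
          b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
          b3 = np.sum((j - 1) * (j - 2) * (j - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
          l1 = b0
          l2 = 2 * b1 - b0
          l3 = 6 * b2 - 6 * b1 + b0
          l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
          return l1, l2 / l1, l3 / l2, l4 / l2   # mean, L-CV, L-skewness, L-kurtosis

      peaks = np.array([120.0, 95.0, 210.0, 160.0, 80.0, 300.0, 140.0, 175.0])  # toy data
      print(sample_l_moments(peaks))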

  2. Interrelations between orthostatic postural deviations and subjects' age, sex, malocclusion, and specific signs and symptoms of functional pathologies of the temporomandibular system: a preliminary correlation and regression study.

    PubMed

    Munhoz, Wagner Cesar; Hsing, Wu Tu

    2014-07-01

    Studies on the relationships between postural deviations and the functional health of the temporomandibular system (TS) are controversial and inconclusive. This study stems from the hypothesis that such inconclusiveness is due to authors considering functional pathologies of the TS (FPTS) as a whole, without taking into account subjects' specific FPTS signs and symptoms. Based on the author and collaborators' previous studies, the present study analyzed data on body posture from a sample of 50 subjects with (30) and without (20) FPTS. Correlation analyses were applied, taking as independent variables age, sex, the Helkimo anamnestic, occlusal, and dysfunction indices, and specific FPTS signs and symptoms. Postural assessments of the head, cervical spine, shoulders, lumbar spine, and hips were the dependent variables. Linear regression equations were built that partially predicted the presence and magnitude of body posture deviations from subjects' characteristics and specific FPTS symptoms. Determination coefficients for these equations ranged from 0.082 to 0.199 in the univariate, and from 0.121 to 0.502 in the multivariate regression analyses. The results show that factors intrinsic to the subjects or the TS may interfere with the results of studies that analyze relationships between FPTS and body posture. Furthermore, a trend towards specificity was found; for example, the degree of cervical lordosis correlated with age and FPTS degree of severity, suggesting that some TS pathological features, or malocclusion, age or sex, may be more strongly correlated than others with specific posture patterns.

  3. Assessing the suitability of summary data for two-sample Mendelian randomization analyses using MR-Egger regression: the role of the I2 statistic.

    PubMed

    Bowden, Jack; Del Greco M, Fabiola; Minelli, Cosetta; Davey Smith, George; Sheehan, Nuala A; Thompson, John R

    2016-12-01

    MR-Egger regression has recently been proposed as a method for Mendelian randomization (MR) analyses incorporating summary data estimates of causal effect from multiple individual variants, which is robust to invalid instruments. It can be used to test for directional pleiotropy and provides an estimate of the causal effect adjusted for its presence. MR-Egger regression provides a useful additional sensitivity analysis to the standard inverse variance weighted (IVW) approach that assumes all variants are valid instruments. Both methods use weights that consider the single nucleotide polymorphism (SNP)-exposure associations to be known, rather than estimated. We call this the 'NO Measurement Error' (NOME) assumption. Causal effect estimates from the IVW approach exhibit weak instrument bias whenever the genetic variants utilized violate the NOME assumption, which can be reliably measured using the F-statistic. The effect of NOME violation on MR-Egger regression has yet to be studied. An adaptation of the I² statistic from the field of meta-analysis is proposed to quantify the strength of NOME violation for MR-Egger. It lies between 0 and 1, and indicates the expected relative bias (or dilution) of the MR-Egger causal estimate in the two-sample MR context. We call it I²GX. The method of simulation extrapolation is also explored to counteract the dilution. Their joint utility is evaluated using simulated data and applied to a real MR example. In simulated two-sample MR analyses we show that, when a causal effect exists, the MR-Egger estimate of causal effect is biased towards the null when NOME is violated, and the stronger the violation (as indicated by lower values of I²GX), the stronger the dilution. When additionally all genetic variants are valid instruments, the type I error rate of the MR-Egger test for pleiotropy is inflated and the causal effect underestimated. Simulation extrapolation is shown to substantially mitigate these adverse effects. We demonstrate our proposed approach for a two-sample summary data MR analysis to estimate the causal effect of low-density lipoprotein on heart disease risk. A high value of I²GX close to 1 indicates that dilution does not materially affect the standard MR-Egger analyses for these data. Care must be taken to assess the NOME assumption via the I²GX statistic before implementing standard MR-Egger regression in the two-sample summary data context. If I²GX is sufficiently low (less than 90%), inferences from the method should be interpreted with caution and adjustment methods considered. © The Author 2016. Published by Oxford University Press on behalf of the International Epidemiological Association.
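
    A minimal two-sample summary-data sketch of the IVW and MR-Egger estimators discussed above, with an I²-style check on the SNP-exposure estimates. The per-SNP numbers are placeholders, and the inverse-variance weighting used for the I² calculation is an assumption that may differ from the paper's exact definition of I²GX.

      import numpy as np
      import statsmodels.api as sm

      # placeholder SNP-exposure (beta_x, se_x) and SNP-outcome (beta_y, se_y) estimates
      beta_x = np.array([0.12, 0.08, 0.15, 0.10, 0.07])
      se_x   = np.array([0.01, 0.01, 0.02, 0.01, 0.01])
      beta_y = np.array([0.06, 0.03, 0.08, 0.05, 0.02])
      se_y   = np.array([0.02, 0.02, 0.03, 0.02, 0.02])
      w = 1.0 / se_y**2

      ivw   = sm.WLS(beta_y, beta_x, weights=w).fit()                   # no intercept
      egger = sm.WLS(beta_y, sm.add_constant(beta_x), weights=w).fit()  # intercept = pleiotropy
      print("IVW causal estimate:      ", ivw.params[0])
      print("MR-Egger slope / intercept:", egger.params[1], egger.params[0])

      # an I^2-style measure of variability in the SNP-exposure estimates
      # (inverse-variance weights assumed; the paper's weighting may differ)
      wx = 1.0 / se_x**2
      gbar = np.sum(wx * beta_x) / np.sum(wx)
      Q = np.sum(wx * (beta_x - gbar)**2)
      i2_gx = max(0.0, (Q - (len(beta_x) - 1)) / Q)
      print("I2_GX =", round(i2_gx, 3))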

  4. High-flow oxygen therapy: pressure analysis in a pediatric airway model.

    PubMed

    Urbano, Javier; del Castillo, Jimena; López-Herce, Jesús; Gallardo, José A; Solana, María J; Carrillo, Ángel

    2012-05-01

    The mechanism of high-flow oxygen therapy and the pressures reached in the airway have not been defined. We hypothesized that the flow would generate a low continuous positive pressure, and that elevated flow rates in this model could produce moderate pressures. The objective of this study was to analyze the pressure generated by a high-flow oxygen therapy system in an experimental model of the pediatric airway. An experimental in vitro study was performed. A high-flow oxygen therapy system was connected to 3 types of interface (nasal cannulae, nasal mask, and oronasal mask) and applied to 2 types of pediatric manikin (infant and neonatal). The pressures generated in the circuit, in the airway, and in the pharynx were measured at different flow rates (5, 10, 15, and 20 L/min). The experiment was conducted with and without a leak (mouth sealed and unsealed). Linear regression analyses were performed for each set of measurements. The pressures generated with the different interfaces were very similar. The maximum pressure recorded was 4 cm H(2)O with a flow of 20 L/min via nasal cannulae or nasal mask. When the mouth of the manikin was held open, the pressures reached in the airway and pharynxes were undetectable. Linear regression analyses showed a similar linear relationship between flow and pressures measured in the pharynx (pressure = -0.375 + 0.138 × flow) and in the airway (pressure = -0.375 + 0.158 × flow) with the closed mouth condition. According to our hypothesis, high-flow oxygen therapy systems produced a low-level CPAP in an experimental pediatric model, even with the use of very high flow rates. Linear regression analyses showed similar linear relationships between flow and pressures measured in the pharynx and in the airway. This finding suggests that, at least in part, the effects may be due to other mechanisms.

  5. Bisphenol-A exposures and behavioural aberrations: median and linear spline and meta-regression analyses of 12 toxicity studies in rodents.

    PubMed

    Peluso, Marco E M; Munnia, Armelle; Ceppi, Marcello

    2014-11-05

    Exposures to bisphenol-A, a weak estrogenic chemical widely used for the production of plastic containers, can affect rodent behaviour. We therefore examined the relationships between bisphenol-A and anxiety-like behaviour, spatial skills, and aggressiveness in 12 toxicity studies of rodent offspring from females orally exposed to bisphenol-A while pregnant and/or lactating, by median and linear spline analyses. Subsequently, meta-regression analysis was applied to quantify the behavioural changes. U-shaped, inverted U-shaped and J-shaped dose-response curves were found to describe the relationships between bisphenol-A and the behavioural outcomes. The occurrence of anxiogenic-like effects and spatial skill changes displayed U-shaped and inverted U-shaped curves, respectively, providing examples of effects that are observed at low doses. Conversely, a J-shaped dose-response relationship was observed for aggressiveness. When the proportion of rodents expressing certain traits, or the time they took to manifest an attitude, was analysed, the meta-regression indicated a borderline significant increment of anxiogenic-like effects at low doses regardless of sex (β = -0.8%, 95% C.I. -1.7/0.1, P = 0.076, at ≤120 μg bisphenol-A), whereas only bisphenol-A-exposed males exhibited a significant inhibition of spatial skills (β = 0.7%, 95% C.I. 0.2/1.2, P = 0.004, at ≤100 μg/day). A significant increment of aggressiveness was observed in both sexes (β = 67.9, 95% C.I. 3.4/172.5, P = 0.038, at >4.0 μg). Bisphenol-A treatments also significantly abrogated spatial learning and ability in males (P<0.001 vs. females). Overall, our study showed that developmental exposures to low doses of bisphenol-A, e.g. ≤120 μg/day, were associated with behavioural aberrations in offspring. Copyright © 2014. Published by Elsevier Ireland Ltd.

  6. Creep-Rupture Data Analysis - Engineering Application of Regression Techniques. Ph.D. Thesis - North Carolina State Univ.

    NASA Technical Reports Server (NTRS)

    Rummler, D. R.

    1976-01-01

    The results are presented of investigations to apply regression techniques to the development of methodology for creep-rupture data analysis. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.

  7. Multi-scaling allometric analysis for urban and regional development

    NASA Astrophysics Data System (ADS)

    Chen, Yanguang

    2017-01-01

    The concept of allometric growth is based on scaling relations, and it has been applied to urban and regional analysis for a long time. However, most allometric analyses have been devoted to the single proportional relation between two elements of a geographical system; few studies focus on the allometric scaling of multiple elements. In this paper, a process of multiscaling allometric analysis is developed for studies of the spatio-temporal evolution of complex systems. By means of linear algebra and general system theory, and by analogy with the analytical hierarchy process, the concepts of allometric growth can be integrated with ideas from fractal dimension. Thus a new methodology of geo-spatial analysis and the related theoretical models emerge. Based on least squares regression and matrix operations, a simple algorithm is proposed to solve the multiscaling allometric equation. Applying the analytical method of multielement allometry to Chinese cities and regions yields satisfactory results. A conclusion is reached that multiscaling allometric analysis can be employed to make a comprehensive evaluation of the relative levels of urban and regional development and to explain spatial heterogeneity. The notion of multiscaling allometry may enrich the current theory and methodology of spatial analyses of urban and regional evolution.

  8. A Feasibility Study on Monitoring Residual Sugar and Alcohol Strength in Kiwi Wine Fermentation Using a Fiber-Optic FT-NIR Spectrometry and PLS Regression.

    PubMed

    Wang, Bingqian; Peng, Bangzhu

    2017-02-01

    This work investigates the potential of fiber-optic Fourier transform near-infrared (FT-NIR) spectrometry associated with chemometric analysis for monitoring time-related changes in residual sugar and alcohol strength during kiwi wine fermentation. NIR calibration models for residual sugar and alcohol strength were established from the FT-NIR spectra of 98 samples scanned with a fiber-optic FT-NIR spectrometer, using the partial least squares regression method. The results showed that R² and the root mean square error of cross-validation reached 0.982 and 3.81 g/L for residual sugar, and 0.984 and 0.34% for alcohol strength, respectively. Furthermore, crucial process information on kiwi must and wine fermentations provided by fiber-optic FT-NIR spectrometry was found to agree with that obtained from traditional chemical methods, so fiber-optic FT-NIR spectrometry can be applied as an effective and suitable alternative for analysing and monitoring those processes. The overall results suggest that fiber-optic FT-NIR spectrometry is a promising tool for monitoring and controlling the kiwi wine fermentation process. © 2017 Institute of Food Technologists®.
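
    A PLS calibration of the kind described above can be sketched as follows, assuming a spectra file with absorbance columns and a hypothetical reference column for residual sugar; the cross-validated R² and RMSECV mirror the figures of merit reported in the abstract.

      import numpy as np
      import pandas as pd
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict
      from sklearn.metrics import r2_score, mean_squared_error

      spectra = pd.read_csv("ftnir_spectra.csv")       # assumed input file
      X = spectra.filter(like="wn_").values            # assumed absorbance columns
      y = spectra["residual_sugar_g_l"].values         # assumed reference values

      pls = PLSRegression(n_components=8)
      y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
      print("R2     =", r2_score(y, y_cv))
      print("RMSECV =", np.sqrt(mean_squared_error(y, y_cv)))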

  9. Has there been a change in the knowledge of GP registrars between 2011 and 2016 as measured by performance on common items in the Applied Knowledge Test?

    PubMed

    Neden, Catherine A; Parkin, Claire; Blow, Carol; Siriwardena, Aloysius Niroshan

    2018-05-08

    The aim of this study was to assess whether the absolute standard of candidates sitting the MRCGP Applied Knowledge Test (AKT) between 2011 and 2016 had changed. It is a descriptive study comparing the performance on marker questions of a reference group of UK graduates taking the AKT for the first time between 2011 and 2016. Using aggregated examination data, the performance of individual 'marker' questions was compared using Pearson's chi-squared tests and trend-line analysis. Binary logistic regression was used to analyse changes in performance over the study period. Changes in performance of individual marker questions using Pearson's chi-squared test showed statistically significant differences in 32 of the 49 questions included in the study. Trend line analysis showed a positive trend in 29 questions and a negative trend in the remaining 23. The magnitude of change was small. Logistic regression did not demonstrate any evidence for a change in the performance of the question set over the study period. However, candidates were more likely to get items on administration wrong compared with clinical medicine or research. There was no evidence of a change in performance of the question set as a whole.

  10. Shrinkage regression-based methods for microarray missing value imputation.

    PubMed

    Wang, Hsiuying; Chiu, Chia-Chun; Wu, Yi-Ching; Wu, Wei-Sheng

    2013-01-01

    Missing values commonly occur in microarray data, which usually contain more than 5% missing values, with up to 90% of genes affected. Inaccurate missing value estimation reduces the power of downstream microarray data analyses. Many types of methods have been developed to estimate missing values. Among them, the regression-based methods are very popular and have been shown to perform better than other types of methods on many testing microarray datasets. To further improve the performance of regression-based methods, we propose shrinkage regression-based methods. Our methods take advantage of the correlation structure in the microarray data and select similar genes for the target gene by Pearson correlation coefficients. In addition, our methods incorporate the least squares principle, use a shrinkage estimation approach to adjust the coefficients of the regression model, and then use the new coefficients to estimate missing values. Simulation results show that the proposed methods provide more accurate missing value estimation in six testing microarray datasets than the existing regression-based methods do. Imputation of missing values is a very important aspect of microarray data analysis because most downstream analyses require a complete dataset. Therefore, exploring accurate and efficient methods for estimating missing values is an essential issue. Since our proposed shrinkage regression-based methods provide accurate missing value estimation, they are competitive alternatives to the existing regression-based methods.
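
    The sketch below illustrates the general idea described above: select the genes most correlated with the target gene, fit least squares on the observed samples, shrink the coefficients, and predict the missing entry. The ridge-type shrinkage rule shown is an illustrative stand-in, not the authors' estimator.

      import numpy as np

      def impute_missing(expr, target_gene, missing_sample, k=10, shrink=1.0):
          """expr: genes x samples matrix; expr[target_gene, missing_sample] is treated as missing."""
          obs = np.ones(expr.shape[1], dtype=bool)
          obs[missing_sample] = False
          y = expr[target_gene, obs]
          others = np.delete(np.arange(expr.shape[0]), target_gene)
          # select the k genes most correlated with the target over observed samples
          cors = np.array([abs(np.corrcoef(expr[g, obs], y)[0, 1]) for g in others])
          top = others[np.argsort(cors)[-k:]]
          X = expr[top][:, obs].T
          # least squares with a ridge-type shrinkage of the coefficients
          beta = np.linalg.solve(X.T @ X + shrink * np.eye(k), X.T @ y)
          return expr[top][:, missing_sample] @ beta

      rng = np.random.default_rng(1)
      expr = rng.standard_normal((50, 12))             # toy expression matrix
      print(impute_missing(expr, target_gene=0, missing_sample=3))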

  11. Visual abilities distinguish pitchers from hitters in professional baseball.

    PubMed

    Klemish, David; Ramger, Benjamin; Vittetoe, Kelly; Reiter, Jerome P; Tokdar, Surya T; Appelbaum, Lawrence Gregory

    2018-01-01

    This study aimed to evaluate the possibility that differences in sensorimotor abilities exist between hitters and pitchers in a large cohort of baseball players of varying levels of experience. Secondary data analysis was performed on 9 sensorimotor tasks comprising the Nike Sensory Station assessment battery. Bayesian hierarchical regression modelling was applied to test for differences between pitchers and hitters in data from 566 baseball players (112 high school, 85 college, 369 professional) collected at 20 testing centres. Explanatory variables including height, handedness, eye dominance, concussion history, and player position were modelled along with age curves using basis regression splines. Regression analyses revealed better performance for hitters relative to pitchers at the professional level in the visual clarity and depth perception tasks, but these differences did not exist at the high school or college levels. No significant differences were observed in the other 7 measures of sensorimotor capabilities included in the test battery, and no systematic biases were found between the testing centres. These findings, indicating that professional-level hitters have better visual acuity and depth perception than professional-level pitchers, affirm the notion that highly experienced athletes have differing perceptual skills. Findings are discussed in relation to deliberate practice theory.

  12. Mothers' education and childhood mortality in Ghana.

    PubMed

    Buor, Daniel

    2003-06-01

    The extent to which maternal education affects child health has been advanced in much of the sociodemographic and medical literature, but little has been done to analyse the spatial dimension of the problem or to represent it with graphic and linear regression models. In Ghana, very little has been done to relate the two variables and offer pragmatic explanations. Correlating the two using a regression model, which has rarely been applied in previous studies, is therefore a methodological necessity. The paper examines the impact of mothers' education on childhood mortality in Ghana using, primarily, Ghana Demographic and Health Survey data of 1998 and World Bank data of 2000. The survey emphatically establishes an inverse relationship between mothers' education and childhood mortality. The use of basic health facilities that relate to childhood survival shows a direct relationship with mothers' education. Recommendations are made for policy initiatives that simultaneously emphasise the education of the girl-child and ensure adequate access to maternal and child health services. An experimental project integrating maternal education and child health services is also recommended. A linear regression model that illustrates the relationship between maternal education and childhood survival has emerged.

  13. Effects of Sleep Quality on the Association between Problematic Mobile Phone Use and Mental Health Symptoms in Chinese College Students

    PubMed Central

    Tao, Shuman; Wu, Xiaoyan; Zhang, Yukun; Zhang, Shichen; Tong, Shilu; Tao, Fangbiao

    2017-01-01

    Problematic mobile phone use (PMPU) is a risk factor for both adolescents’ sleep quality and mental health. It is important to examine the potential negative health effects of PMPU exposure. This study aims to evaluate PMPU and its association with mental health in Chinese college students. Furthermore, we investigated how sleep quality influences this association. In 2013, we collected data regarding participants’ PMPU, sleep quality, and mental health (psychopathological symptoms, anxiety, and depressive symptoms) by standardized questionnaires in 4747 college students. Multivariate logistic regression analysis was applied to assess independent effects and interactions of PMPU and sleep quality with mental health. PMPU and poor sleep quality were observed in 28.2% and 9.8% of participants, respectively. Adjusted logistic regression models suggested independent associations of PMPU and sleep quality with mental health (p < 0.001). Further regression analyses suggested a significant interaction between these measures (p < 0.001). The study highlights that poor sleep quality may play a more significant role in increasing the risk of mental health problems in students with PMPU than in those without PMPU. PMID:28216583
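
    Logistic models of the form described above, including the PMPU-by-sleep-quality interaction, can be sketched as follows; the file and column names are assumptions about the survey data, not the study's variables.

      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("college_survey.csv")           # assumed input file
      # main-effects model, then a model adding the PMPU x poor-sleep interaction
      main  = smf.logit("depressive ~ pmpu + poor_sleep + C(sex) + age", data=df).fit()
      inter = smf.logit("depressive ~ pmpu * poor_sleep + C(sex) + age", data=df).fit()
      print(main.summary())
      print(inter.params["pmpu:poor_sleep"])           # interaction term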

  14. Evaluation of the efficiency of continuous wavelet transform as processing and preprocessing algorithm for resolution of overlapped signals in univariate and multivariate regression analyses; an application to ternary and quaternary mixtures.

    PubMed

    Hegazy, Maha A; Lotfy, Hayam M; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-07-05

    Wavelets have been adapted for a vast number of signal-processing applications because of the amount of information that can be extracted from a signal. In this work, a comparative study was conducted on the efficiency of the continuous wavelet transform (CWT) as a signal-processing tool in univariate regression and as a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS). These were applied to complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). In contrast, the univariate CWT failed to determine the quaternary mixture simultaneously; it was able to determine only PAR and PAP, and the ternary mixtures of DRO, CAF, and PAR and of CAF, PAR, and PAP. During the CWT calculations, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design, and the absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and the concentration matrices, and validation was performed with both cross-validation and external validation sets. Both methods were successfully applied to the determination of the studied drugs in pharmaceutical formulations. Copyright © 2016 Elsevier B.V. All rights reserved.
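
    A sketch of the CWT-as-preprocessing idea: each absorption spectrum is transformed with a continuous wavelet transform and the coefficients are fed to PLS. The wavelet family, scales, and file and column names below are assumptions, not the conditions optimized in the paper.

      import numpy as np
      import pandas as pd
      import pywt
      from sklearn.cross_decomposition import PLSRegression

      spectra = pd.read_csv("mixture_spectra.csv")     # assumed input file
      X_raw = spectra.filter(like="abs_").values       # assumed absorbance columns
      Y = spectra[["DRO", "CAF", "PAR", "PAP"]].values # assumed known concentrations

      scales = np.arange(1, 31)
      def cwt_features(spectrum):
          coeffs, _ = pywt.cwt(spectrum, scales, "mexh")   # Mexican-hat wavelet, assumed
          return coeffs.ravel()

      X = np.array([cwt_features(s) for s in X_raw])
      pls = PLSRegression(n_components=6).fit(X, Y)
      print(pls.score(X, Y))                           # calibration R^2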

  15. Reproducibility of geometrical acquisition of intra-thoracic organs of children on CT scans.

    PubMed

    Coulongeat, François; Jarrar, Mohamed-Salah; Serre, Thierry; Thollon, Lionel

    2011-08-01

    This paper analyses the geometry of intra-thoracic organs from computed tomography (CT) scans performed on 20 children aged from 4 months to 16 years. Two sets of measurements of the lungs and heart were performed by the same observer, and a third set was performed by a second observer. The intra- and inter-observer relative deviation of the measurements was then analysed. Multiple regressions were used to study the relationship between the CT properties (scanner, voltage, dose, pixel size, slice increment) and the relative deviation of the measurements. There is a very low systematic intra- and inter-observer bias in the measurements, except for the volume of the heart. None of the CT data properties has a significant influence on the relative deviation of measurement. The measurement and 3D reconstruction protocol described in the present paper can be applied to characterise the growth of the intra-thoracic organs.

  16. Meta-analysis for genome-wide association studies using case-control design: application and practice

    PubMed Central

    2016-01-01

    This review aimed to lay out the process of a systematic review of genome-wide association studies in order to practise and apply genome-wide meta-analysis (GWMA). The process comprises a series of five steps: searching and selection, extraction of related information, evaluation of validity, meta-analysis by type of genetic model, and evaluation of heterogeneity. In contrast to intervention meta-analyses, a GWMA has to evaluate the Hardy–Weinberg equilibrium (HWE) in the third step and conduct meta-analyses under five potential genetic models (dominant, recessive, homozygote contrast, heterozygote contrast, and allelic contrast) in the fourth step. The 'genhwcci' and 'metan' commands of the STATA software evaluate the HWE and calculate a summary effect size, respectively. A meta-regression using the 'metareg' command of STATA should be conducted to evaluate factors related to heterogeneity. PMID:28092928

  17. A chemometric approach to the characterisation of historical mortars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rampazzi, L.; Pozzi, A.; Sansonetti, A.

    2006-06-15

    Knowledge of the composition of historical mortars is of great concern in provenance and dating investigations and in conservation work, since the nature of the raw materials suggests the most compatible conservation products. The classic characterisation usually goes through various analytical determinations, while conservation laboratories call for simple and quick analyses able to elucidate the nature of mortars, usually in terms of the binder fraction. A chemometric approach to the matter is undertaken here. Specimens of mortars were prepared with calcitic and dolomitic binders and analysed by atomic spectroscopy. Principal Components Analysis (PCA) was used to investigate the features of the specimens and samples. A Partial Least Squares (PLS1) regression was performed in order to predict the binder/aggregate ratio. The model was applied to historical mortars from the churches of St. Lorenzo (Milan) and St. Abbondio (Como). The agreement between the predictive model and the real samples is discussed.

  18. Discovery of potent NEK2 inhibitors as potential anticancer agents using structure-based exploration of NEK2 pharmacophoric space coupled with QSAR analyses.

    PubMed

    Khanfar, Mohammad A; Banat, Fahmy; Alabed, Shada; Alqtaishat, Saja

    2017-02-01

    High expression of Nek2 has been detected in several types of cancer, and it represents a novel target in human cancer. In the current study, structure-based pharmacophore modeling combined with multiple linear regression (MLR)-based QSAR analyses was applied to disclose the structural requirements for NEK2 inhibition. The generated pharmacophoric models were initially validated with a receiver operating characteristic (ROC) curve, and the optimum models were subsequently implemented in QSAR modeling together with other physicochemical descriptors. The QSAR-selected models were employed as 3D search filters to mine the National Cancer Institute (NCI) database for novel NEK2 inhibitors, whereas the associated QSAR model prioritized the bioactivities of captured hits for in vitro evaluation. Experimental validation identified several potent NEK2 inhibitors of novel structural scaffolds. The most potent captured hit exhibited an [Formula: see text] value of 237 nM.

  19. Regression: The Apple Does Not Fall Far From the Tree.

    PubMed

    Vetter, Thomas R; Schober, Patrick

    2018-05-15

    Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause and effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.

  20. Covariate Imbalance and Adjustment for Logistic Regression Analysis of Clinical Trial Data

    PubMed Central

    Ciolino, Jody D.; Martin, Reneé H.; Zhao, Wenle; Jauch, Edward C.; Hill, Michael D.; Palesch, Yuko Y.

    2014-01-01

    In logistic regression analysis for binary clinical trial data, adjusted treatment effect estimates are often not equivalent to unadjusted estimates in the presence of influential covariates. This paper uses simulation to quantify the benefit of covariate adjustment in logistic regression. However, International Conference on Harmonization guidelines suggest that covariate adjustment be pre-specified; unplanned adjusted analyses should be considered secondary. Results suggest that if adjustment is not possible or not planned in a logistic setting, balance in continuous covariates can alleviate some (but never all) of the shortcomings of unadjusted analyses. The case of log binomial regression is also explored. PMID:24138438

  1. Metatraits and assessment of attributional style.

    PubMed

    Chamberlain, John M; Haaga, David A F; Thorndike, Frances P; Ahrens, Anthony H

    2004-11-01

    Some personality trait dimensions may not be equally applicable to all people. The degree of applicability of a given trait, or traitedness, is conceptually distinct from trait level. In this study, 3 ways of assessing traitedness--interitem variance (R. F. Baumeister & D. M. Tice, 1988), scalability (K. Lanning, 1988), and construct similarity (W. F. Chaplin, 1991)--were applied to attributional style. A nonclinical sample (N = 123) completed measures of attributional style and depressive symptoms. In a series of multiple regression analyses, none of the traitedness indicators significantly moderated the relation of attributional style with depressive symptoms. The authors discuss several methodological and conceptual explanations for these null results.

  2. An Empirical Study of Eight Nonparametric Tests in Hierarchical Regression.

    ERIC Educational Resources Information Center

    Harwell, Michael; Serlin, Ronald C.

    When normality does not hold, nonparametric tests represent an important data-analytic alternative to parametric tests. However, the use of nonparametric tests in educational research has been limited by the absence of easily performed tests for complex experimental designs and analyses, such as factorial designs and multiple regression analyses,…

  3. How Many Subjects Does It Take to Do a Regression Analysis?

    ERIC Educational Resources Information Center

    Green, Samuel B.

    1991-01-01

    An evaluation of the rules-of-thumb used to determine the minimum number of subjects required to conduct multiple regression analyses suggests that researchers who use a rule of thumb rather than power analyses trade simplicity of use for accuracy and specificity of response. Insufficient power is likely to result. (SLD)

  4. Hydrology and trout populations of cold-water rivers of Michigan and Wisconsin

    USGS Publications Warehouse

    Hendrickson, G.E.; Knutilla, R.L.

    1974-01-01

    Statistical multiple-regression analyses showed significant relationships between trout populations and hydrologic parameters. Parameters showing the highest levels of significance were temperature, hardness of water, percentage of gravel bottom, percentage of bottom vegetation, variability of streamflow, and discharge per unit drainage area. Trout populations increase with lower annual maximum water temperatures, with increasing water hardness, and with increasing percentages of gravel and bottom vegetation. Trout populations also increase with decreasing variability of streamflow and with increasing discharge per unit drainage area. Most hydrologic parameters were significant when evaluated collectively, but no parameter, by itself, showed a high degree of correlation with trout populations in regression analyses that included all the streams sampled. Regression analyses of stream segments restricted to certain limits of hardness, temperature, or percentage of gravel bottom showed improved correlation. Analyses of trout populations (in pounds per acre and in pounds per mile) and hydrologic parameters resulted in regression equations from which trout populations could be estimated with standard errors of 89 and 84 percent, respectively.

  5. Traffic-related air pollution and spectacles use in schoolchildren

    PubMed Central

    Nieuwenhuijsen, Mark J.; Basagaña, Xavier; Alvarez-Pedrerol, Mar; Dalmau-Bueno, Albert; Cirach, Marta; Rivas, Ioar; Brunekreef, Bert; Querol, Xavier; Morgan, Ian G.; Sunyer, Jordi

    2017-01-01

    Purpose To investigate the association between exposure to traffic-related air pollution and use of spectacles (as a surrogate measure for myopia) in schoolchildren. Methods We analyzed the impact of exposure to NO2 and PM2.5 light absorbance at home (predicted by land-use regression models) and exposure to NO2 and black carbon (BC) at school (measured by monitoring campaigns) on the use of spectacles in a cohort of 2727 schoolchildren (7–10 years old) in Barcelona (2012–2015). We conducted cross-sectional analyses based on lifelong exposure to air pollution and prevalent cases of spectacles at baseline data collection campaign as well as longitudinal analyses based on incident cases of spectacles use and exposure to air pollution during the three-year period between the baseline and last data collection campaigns. Logistic regression models were developed to quantify the association between spectacles use and each of air pollutants adjusted for relevant covariates. Results An interquartile range increase in exposure to NO2 and PM2.5 absorbance at home was respectively associated with odds ratios (95% confidence intervals (CIs)) for spectacles use of 1.16 (1.03, 1.29) and 1.13 (0.99, 1.28) in cross-sectional analyses and 1.15 (1.00, 1.33) and 1.23 (1.03, 1.46) in longitudinal analyses. Similarly, odds ratio (95% CIs) of spectacles use associated with an interquartile range increase in exposures to NO2 and black carbon at school was respectively 1.32 (1.09, 1.59) and 1.13 (0.97, 1.32) in cross-sectional analyses and 1.12 (0.84, 1.50) and 1.27 (1.03, 1.56) in longitudinal analyses. These findings were robust to a range of sensitivity analyses that we conducted. Conclusion We observed increased risk of spectacles use associated with exposure to traffic-related air pollution. These findings require further confirmation by future studies applying more refined outcome measures such as quantified visual acuity and separating different types of refractive errors. PMID:28369072

  6. Further Insight and Additional Inference Methods for Polynomial Regression Applied to the Analysis of Congruence

    ERIC Educational Resources Information Center

    Cohen, Ayala; Nahum-Shani, Inbal; Doveh, Etti

    2010-01-01

    In their seminal paper, Edwards and Parry (1993) presented the polynomial regression as a better alternative to applying difference score in the study of congruence. Although this method is increasingly applied in congruence research, its complexity relative to other methods for assessing congruence (e.g., difference score methods) was one of the…

  7. Applications of machine learning and data mining methods to detect associations of rare and common variants with complex traits.

    PubMed

    Lu, Ake Tzu-Hui; Austin, Erin; Bonner, Ashley; Huang, Hsin-Hsiung; Cantor, Rita M

    2014-09-01

    Machine learning methods (MLMs), designed to develop models using high-dimensional predictors, have been used to analyze genome-wide genetic and genomic data to predict risks for complex traits. We summarize the results from six contributions to our Genetic Analysis Workshop 18 working group; these investigators applied MLMs and data mining to analyses of rare and common genetic variants measured in pedigrees. To develop risk profiles, group members analyzed blood pressure traits along with single-nucleotide polymorphisms and rare variant genotypes derived from sequence and imputation analyses in large Mexican American pedigrees. Supervised MLMs included penalized regression with varying penalties, support vector machines, and permanental classification. Unsupervised MLMs included sparse principal components analysis and sparse graphical models. Entropy-based components analyses were also used to mine these data. None of the investigators fully capitalized on the genetic information provided by the complete pedigrees. Their approaches either corrected for the nonindependence of the individuals within the pedigrees or analyzed only those who were independent. Some methods allowed for covariate adjustment, whereas others did not. We evaluated these methods using a variety of metrics. Four contributors conducted primary analyses on the real data, and the other two research groups used the simulated data with and without knowledge of the underlying simulation model. One group used the answers to the simulated data to assess power and type I errors. Although the MLMs applied were substantially different, each research group concluded that MLMs have advantages over standard statistical approaches with these high-dimensional data. © 2014 WILEY PERIODICALS, INC.

  8. SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit

    PubMed Central

    Chu, Annie; Cui, Jenny; Dinov, Ivo D.

    2011-01-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), with the aim of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models. PMID:21546994

  9. Studying Gene and Gene-Environment Effects of Uncommon and Common Variants on Continuous Traits: A Marker-Set Approach Using Gene-Trait Similarity Regression

    PubMed Central

    Tzeng, Jung-Ying; Zhang, Daowen; Pongpanich, Monnat; Smith, Chris; McCarthy, Mark I.; Sale, Michèle M.; Worrall, Bradford B.; Hsu, Fang-Chi; Thomas, Duncan C.; Sullivan, Patrick F.

    2011-01-01

    Genomic association analyses of complex traits demand statistical tools that are capable of detecting small effects of common and rare variants and modeling complex interaction effects and yet are computationally feasible. In this work, we introduce a similarity-based regression method for assessing the main genetic and interaction effects of a group of markers on quantitative traits. The method uses genetic similarity to aggregate information from multiple polymorphic sites and integrates adaptive weights that depend on allele frequencies to accommodate common and uncommon variants. Collapsing information at the similarity level instead of the genotype level avoids canceling signals that have the opposite etiological effects and is applicable to any class of genetic variants without the need for dichotomizing the allele types. To assess gene-trait associations, we regress trait similarities for pairs of unrelated individuals on their genetic similarities and assess association by using a score test whose limiting distribution is derived in this work. The proposed regression framework allows for covariates, has the capacity to model both main and interaction effects, can be applied to a mixture of different polymorphism types, and is computationally efficient. These features make it an ideal tool for evaluating associations between phenotype and marker sets defined by linkage disequilibrium (LD) blocks, genes, or pathways in whole-genome analysis. PMID:21835306
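
    The following toy sketch conveys the similarity-regression idea in its simplest form: pairwise trait similarity is regressed on pairwise genetic similarity across unrelated individuals. It omits the adaptive allele-frequency weights and the score test developed in the paper, and all data are synthetic.

```python
# Highly simplified sketch of similarity regression: regress pairwise trait
# similarity on pairwise genetic similarity. Plain OLS on synthetic toy data;
# the real method uses adaptive allele-frequency weights and a score test.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n, m = 200, 50                           # individuals, markers
geno = rng.integers(0, 3, size=(n, m))   # 0/1/2 genotype counts (toy data)
trait = geno[:, :5].sum(axis=1) * 0.2 + rng.normal(size=n)

g_std = (geno - geno.mean(0)) / geno.std(0)
t_std = (trait - trait.mean()) / trait.std()

pairs = list(combinations(range(n), 2))
gen_sim = np.array([g_std[i] @ g_std[j] / m for i, j in pairs])
trait_sim = np.array([t_std[i] * t_std[j] for i, j in pairs])

X = np.column_stack([np.ones_like(gen_sim), gen_sim])
beta, *_ = np.linalg.lstsq(X, trait_sim, rcond=None)
print("slope of trait similarity on genetic similarity:", beta[1])
```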

  10. Impact of External Price Referencing on Medicine Prices – A Price Comparison Among 14 European Countries

    PubMed Central

    Leopold, Christine; Mantel-Teeuwisse, Aukje Katja; Seyfang, Leonhard; Vogler, Sabine; de Joncheere, Kees; Laing, Richard Ogilvie; Leufkens, Hubert

    2012-01-01

    Objectives: This study aims to examine the impact of external price referencing (EPR) on on-patent medicine prices, adjusting for other factors that may affect price levels such as sales volume, exchange rates, gross domestic product (GDP) per capita, total pharmaceutical expenditure (TPE), and size of the pharmaceutical industry. Methods: Price data for 14 on-patent products in 14 European countries in 2007 and 2008 were obtained from the Pharmaceutical Price Information Service of the Austrian Health Institute. Based on the unit ex-factory prices in euro, scaled ranks per country and per product were calculated. For the regression analysis, the scaled ranks per country and product were weighted; each country had the same sum of weights, but within a country the weights were proportional to its sales volume in the year (data obtained from IMS Health). Taking the scaled ranks, several statistical analyses were performed using the program "R", including a multiple regression analysis (including variables such as GDP per capita and national industry size). Results: This study showed that on average EPR as a pricing policy leads to lower prices. However, the large variation in price levels among countries using EPR confirmed that the price level is not only driven by EPR. The unadjusted linear regression model confirms that applying EPR in a country is associated with a lower scaled weighted rank (p=0.002). This association persisted after inclusion of total pharmaceutical expenditure per capita and GDP per capita in the final model. Conclusions: The study showed that for patented products, prices are in general lower when the country applied EPR. Nevertheless, substantial price differences among countries that apply EPR could be identified. Possible explanations could be found in the correlation between pharmaceutical industry size and the scaled price ranks. In conclusion, we found that implementing external reference pricing could lead to lower prices. PMID:23532710

  11. Spatially Explicit Estimates of Suspended Sediment and Bedload Transport Rates for Western Oregon and Northwestern California

    NASA Astrophysics Data System (ADS)

    O'Connor, J. E.; Wise, D. R.; Mangano, J.; Jones, K.

    2015-12-01

    Empirical analyses of suspended sediment and bedload transport give estimates of sediment flux for western Oregon and northwestern California. The estimates of both bedload and suspended load are from regression models relating measured annual sediment yield to geologic, physiographic, and climatic properties of contributing basins. The best models include generalized geology and either slope or precipitation. The best-fit suspended-sediment model is based on basin geology, precipitation, and area of recent wildfire. It explains 65% of the variance for 68 suspended sediment measurement sites within the model area. Predicted suspended sediment yields range from no yield from the High Cascades geologic province to 200 tonnes/km2-yr in the northern Oregon Coast Range and 1000 tonnes/km2-yr in recently burned areas of the northern Klamath terrane. Bed-material yield is similarly estimated from a regression model based on 22 sites of measured bed-material transport, mostly from reservoir accumulation analyses but also from several bedload measurement programs. The resulting best-fit regression is based on basin slope and the presence/absence of the Klamath geologic terrane. For the Klamath terrane, bed-material yield is twice that of the other geologic provinces. This model explains more than 80% of the variance of the better-quality measurements. Predicted bed-material yields range up to 350 tonnes/km2-yr in steep areas of the Klamath terrane. Applying these regressions to small individual watersheds (mean size: 66 km2 for bed-material, 3 km2 for suspended sediment) and cumulating totals down the hydrologic network (but also decreasing the bed-material flux by experimentally determined attrition rates) gives spatially explicit estimates of both bed-material and suspended sediment flux. This enables assessment of several management issues, including the effects of dams on bedload transport, instream gravel mining, habitat-formation processes, and water quality. The combined fluxes can also be compared to long-term rock uplift and cosmogenically determined landscape erosion rates.

  12. Spanish Multicenter Normative Studies (NEURONORMA Project): norms for Boston naming test and token test.

    PubMed

    Peña-Casanova, Jordi; Quiñones-Ubeda, Sonia; Gramunt-Fombuena, Nina; Aguilar, Miquel; Casas, Laura; Molinuevo, José Luis; Robles, Alfredo; Rodríguez, Dolores; Barquero, María Sagrario; Antúnez, Carmen; Martínez-Parra, Carlos; Frank-García, Anna; Fernández, Manuel; Molano, Ana; Alfonso, Verónica; Sol, Josep M; Blesa, Rafael

    2009-06-01

    As part of the Spanish Multicenter Normative Studies (NEURONORMA project), we provide age- and education-adjusted norms for the Boston naming test and Token test. The sample consists of 340 and 348 participants, respectively, who are cognitively normal, community-dwelling, and ranging in age from 50 to 94 years. Tables are provided to convert raw scores to age-adjusted scaled scores. These were further converted into education-adjusted scaled scores by applying regression-based adjustments. Age and education affected the scores of both tests, but sex was found to be unrelated to naming and verbal comprehension efficiency. Our norms should provide clinically useful data for evaluating elderly Spaniards. The normative data presented here were obtained from the same study sample as all the other NEURONORMA norms and the same statistical procedures for data analyses were applied. These co-normed data allow clinicians to compare scores from one test with all tests.

  13. Behavioral adjustment of siblings of children with autism engaged in applied behavior analysis early intervention programs: the moderating role of social support.

    PubMed

    Hastings, Richard P

    2003-04-01

    There have been few studies of the impact of intensive home-based early applied behavior analysis (ABA) intervention for children with autism on family functioning. In the present study, behavioral adjustment was explored in 78 siblings of children with autism on ABA programs. First, mothers' ratings of sibling adjustment were compared to a normative sample. There were no reported increases in behavioral adjustment problems in the present sample. Second, regression analyses revealed that social support functioned as a moderator of the impact of autism severity on sibling adjustment rather than a mediator or compensatory variable. In particular, siblings in families with a less severely autistic child had fewer adjustment problems when more formal social support was also available to the family. The implications of these data for future research and for practice are discussed.

  14. Spatial modeling in ecology: the flexibility of eigenfunction spatial analyses.

    PubMed

    Griffith, Daniel A; Peres-Neto, Pedro R

    2006-10-01

    Recently, analytical approaches based on the eigenfunctions of spatial configuration matrices have been proposed in order to consider explicitly spatial predictors. The present study demonstrates the usefulness of eigenfunctions in spatial modeling applied to ecological problems and shows equivalencies of and differences between the two current implementations of this methodology. The two approaches in this category are the distance-based (DB) eigenvector maps proposed by P. Legendre and his colleagues, and spatial filtering based upon geographic connectivity matrices (i.e., topology-based; CB) developed by D. A. Griffith and his colleagues. In both cases, the goal is to create spatial predictors that can be easily incorporated into conventional regression models. One important advantage of these two approaches over any other spatial approach is that they provide a flexible tool that allows the full range of general and generalized linear modeling theory to be applied to ecological and geographical problems in the presence of nonzero spatial autocorrelation.
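
    A rough sketch of the topology-based (CB) variant is given below: eigenvectors of a doubly centred connectivity matrix for a small lattice are used as spatial predictors in an ordinary regression. The lattice, the synthetic response and the number of retained eigenvectors are illustrative choices only, not part of either original implementation.

```python
# Toy sketch of eigenfunction spatial filtering: eigenvectors of a centred
# connectivity matrix serve as spatial predictors in a standard regression.
import numpy as np
import statsmodels.api as sm

# 10 x 10 lattice with rook (shared-edge) connectivity
side = 10
n = side * side
C = np.zeros((n, n))
for r in range(side):
    for c in range(side):
        i = r * side + c
        if c + 1 < side:
            C[i, i + 1] = C[i + 1, i] = 1
        if r + 1 < side:
            C[i, i + side] = C[i + side, i] = 1

# Doubly centred connectivity matrix, as used for Moran eigenvector maps
J = np.eye(n) - np.ones((n, n)) / n
eigval, eigvec = np.linalg.eigh(J @ C @ J)
spatial_filters = eigvec[:, np.argsort(eigval)[::-1][:5]]  # strongest positive-autocorrelation patterns

rng = np.random.default_rng(1)
x = rng.normal(size=n)
y = 2 * x + 3 * spatial_filters[:, 0] + rng.normal(size=n)  # spatially structured response

X = sm.add_constant(np.column_stack([x, spatial_filters]))
print(sm.OLS(y, X).fit().summary())
```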

  15. Determination of butter adulteration with margarine using Raman spectroscopy.

    PubMed

    Uysal, Reyhan Selin; Boyaci, Ismail Hakki; Genis, Hüseyin Efe; Tamer, Ugur

    2013-12-15

    In this study, adulteration of butter with margarine was analysed using Raman spectroscopy combined with chemometric methods (principal component analysis (PCA), principal component regression (PCR), partial least squares (PLS)) and artificial neural networks (ANNs). Different butter and margarine samples were mixed at various concentrations ranging from 0% to 100% w/w. PCA was applied for the classification of butters, margarines and mixtures. PCR, PLS and ANN were used for the detection of adulteration ratios of butter. Models were created using a calibration data set and the developed models were evaluated using a validation data set. The coefficient of determination (R²) values between actual and predicted values obtained for PCR, PLS and ANN for the validation data set were 0.968, 0.987 and 0.978, respectively. In conclusion, a combination of Raman spectroscopy with chemometrics and ANN methods can be applied for testing butter adulteration. Copyright © 2013 Elsevier Ltd. All rights reserved.
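
    A hedged sketch of the PLS calibration step is shown below: the adulteration ratio (% w/w margarine) is predicted from Raman spectra split into calibration and validation sets. The spectra files, the train/validation split and the number of latent variables are hypothetical choices, not the study's settings.

```python
# Sketch of a PLS calibration predicting adulteration ratio from Raman spectra.
# File names, split ratio and number of components are hypothetical.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

X = np.load("raman_spectra.npy")       # shape (n_samples, n_wavenumbers)
y = np.load("margarine_fraction.npy")  # adulteration ratio, 0-100 % w/w

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=8).fit(X_cal, y_cal)
y_pred = pls.predict(X_val).ravel()
print("validation R2:", r2_score(y_val, y_pred))
```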

  16. Bayesian Unimodal Density Regression for Causal Inference

    ERIC Educational Resources Information Center

    Karabatsos, George; Walker, Stephen G.

    2011-01-01

    Karabatsos and Walker (2011) introduced a new Bayesian nonparametric (BNP) regression model. Through analyses of real and simulated data, they showed that the BNP regression model outperforms other parametric and nonparametric regression models of common use, in terms of predictive accuracy of the outcome (dependent) variable. The other,…

  17. Impact of an equality constraint on the class-specific residual variances in regression mixtures: A Monte Carlo simulation study

    PubMed Central

    Kim, Minjung; Lamont, Andrea E.; Jaki, Thomas; Feaster, Daniel; Howe, George; Van Horn, M. Lee

    2015-01-01

    Regression mixture models are a novel approach for modeling heterogeneous effects of predictors on an outcome. In the model-building process, residual variances are often disregarded and simplifying assumptions are made without thorough examination of the consequences. This simulation study investigated the impact of an equality constraint on the residual variances across latent classes. We examine the consequence of constraining the residual variances on class enumeration (finding the true number of latent classes) and on parameter estimates under a number of different simulation conditions meant to reflect the type of heterogeneity likely to exist in applied analyses. Results showed that bias in class enumeration increased as the difference in residual variances between the classes increased. Also, an inappropriate equality constraint on the residual variances greatly impacted estimated class sizes and showed the potential to greatly impact parameter estimates in each class. Results suggest that it is important to make assumptions about residual variances with care and to carefully report what assumptions were made. PMID:26139512

  18. Transmission clustering among newly diagnosed HIV patients in Chicago, 2008 to 2011: using phylogenetics to expand knowledge of regional HIV transmission patterns

    PubMed Central

    Lubelchek, Ronald J.; Hoehnen, Sarah C.; Hotton, Anna L.; Kincaid, Stacey L.; Barker, David E.; French, Audrey L.

    2014-01-01

    Introduction HIV transmission cluster analyses can inform HIV prevention efforts. We describe the first such assessment for transmission clustering among HIV patients in Chicago. Methods We performed transmission cluster analyses using HIV pol sequences from newly diagnosed patients presenting to Chicago’s largest HIV clinic between 2008 and 2011. We compared sequences via progressive pairwise alignment, using neighbor joining to construct an un-rooted phylogenetic tree. We defined clusters as >2 sequences among which each sequence had at least one partner within a genetic distance of ≤ 1.5%. We used multivariable regression to examine factors associated with clustering and used geospatial analysis to assess geographic proximity of phylogenetically clustered patients. Results We compared sequences from 920 patients; median age 35 years; 75% male; 67% Black, 23% Hispanic; 8% had a Rapid Plasma Reagin (RPR) titer ≥ 1:16 concurrent with their HIV diagnosis. We had HIV transmission risk data for 54%; 43% identified as men who have sex with men (MSM). Phylogenetic analysis demonstrated that 123 patients (13%) grouped into 26 clusters, the largest having 20 members. In multivariable regression, age < 25, Black race, MSM status, male gender, higher HIV viral load, and RPR ≥ 1:16 were associated with clustering. We did not observe geographic grouping of genetically clustered patients. Discussion Our results demonstrate high rates of HIV transmission clustering, without local geographic foci, among young Black MSM in Chicago. Applied prospectively, phylogenetic analyses could guide prevention efforts and help break the cycle of transmission. PMID:25321182

  19. Monitoring Building Deformation with InSAR: Experiments and Validation

    PubMed Central

    Yang, Kui; Yan, Li; Huang, Guoman; Chen, Chu; Wu, Zhengpeng

    2016-01-01

    Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR on building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments using two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples to compare InSAR and leveling approaches for building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. These extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values in both Ordinary Least Squares (OLS) regression and measurement of error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between InSAR results and the leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that a millimeter level of accuracy can be achieved by means of the InSAR technique when measuring building deformation. We discuss the differences in accuracy between OLS regression and measurement of error analyses, and compare them with the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated. PMID:27999403

  20. Placebo and nocebo reactions in randomized trials of pharmacological treatments for persistent depressive disorder. A meta-regression analysis.

    PubMed

    Meister, Ramona; Jansen, Alessa; Härter, Martin; Nestoriuc, Yvonne; Kriston, Levente

    2017-06-01

    We aimed to investigate placebo and nocebo reactions in randomized controlled trials (RCT) of pharmacological treatments for persistent depressive disorder (PDD). We conducted a systematic electronic search and included RCTs investigating antidepressants for the treatment of PDD. Outcomes were the number of patients experiencing response and remission in placebo arms (=placebo reaction). Additional outcomes were the incidence of patients experiencing adverse events and related discontinuations in placebo arms (=nocebo reaction). A priori defined effect modifiers were analyzed using a series of meta-regression analyses. Twenty-three trials were included in the analyses. We found a pooled placebo response rate of 31% and a placebo remission rate of 22%. The pooled adverse event rate and related discontinuations were 57% and 4%, respectively. All placebo arm outcomes were positively associated with the corresponding medication arm outcomes. Placebo response rate was associated with a greater proportion of patients with early onset depression, a smaller chance to receive placebo and a larger sample size. The adverse event rate in placebo arms was associated with a greater proportion of patients with early onset depression, a smaller proportion of females and a more recent publication. Pooled placebo and nocebo reaction rates in PDD were comparable to those in episodic depression. The identified effect modifiers should be considered to assess unbiased effects in RCTs and to influence placebo and nocebo reactions in practice. Limitations result from the methodology applied, the fact that we conducted only univariate analyses, and the number and quality of included trials. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Partial Least Squares Regression Can Aid in Detecting Differential Abundance of Multiple Features in Sets of Metagenomic Samples

    PubMed Central

    Libiger, Ondrej; Schork, Nicholas J.

    2015-01-01

    It is now feasible to examine the composition and diversity of microbial communities (i.e., “microbiomes”) that populate different human organs and orifices using DNA sequencing and related technologies. To explore the potential links between changes in microbial communities and various diseases in the human body, it is essential to test associations involving different species within and across microbiomes, environmental settings and disease states. Although a number of statistical techniques exist for carrying out relevant analyses, it is unclear which of these techniques exhibit the greatest statistical power to detect associations given the complexity of most microbiome datasets. We compared the statistical power of principal component regression, partial least squares regression, regularized regression, distance-based regression, Hill's diversity measures, and a modified test implemented in the popular and widely used microbiome analysis methodology “Metastats” across a wide range of simulated scenarios involving changes in feature abundance between two sets of metagenomic samples. For this purpose, simulation studies were used to change the abundance of microbial species in a real dataset from a published study examining human hands. Each technique was applied to the same data, and its ability to detect the simulated change in abundance was assessed. We hypothesized that a small subset of methods would outperform the rest in terms of the statistical power. Indeed, we found that the Metastats technique modified to accommodate multivariate analysis and partial least squares regression yielded high power under the models and data sets we studied. The statistical power of diversity measure-based tests, distance-based regression and regularized regression was significantly lower. Our results provide insight into powerful analysis strategies that utilize information on species counts from large microbiome data sets exhibiting skewed frequency distributions obtained on a small to moderate number of samples. PMID:26734061

  2. Mixed oxidizer hybrid propulsion system optimization under uncertainty using applied response surface methodology and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Whitehead, James Joshua

    The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
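
    A stripped-down sketch of the response-surface-plus-Monte-Carlo workflow follows: a quadratic surface with a two-factor interaction is fitted to regression-rate data, and input uncertainty is then propagated by random sampling. The data file, column names, nominal values and uncertainty magnitudes are hypothetical, not values from the study.

```python
# Toy sketch: quadratic response surface for regression rate, then Monte Carlo
# propagation of input uncertainty. All names and numbers are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

doe = pd.read_csv("mixture_doe.csv")  # columns: ox_frac, flux, r_dot (hypothetical)

# Quadratic response surface with single and two-factor interaction terms
surface = smf.ols(
    "r_dot ~ ox_frac + flux + I(ox_frac**2) + I(flux**2) + ox_frac:flux",
    data=doe,
).fit()

rng = np.random.default_rng(42)
n_mc = 10_000
samples = pd.DataFrame({
    "ox_frac": rng.normal(0.60, 0.02, n_mc),  # nominal value +/- assumed uncertainty
    "flux": rng.normal(150.0, 10.0, n_mc),
})
r_dot_mc = surface.predict(samples)
print("mean regression rate:", r_dot_mc.mean(), "std:", r_dot_mc.std())
```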

  3. Valuing the visual impact of wind farms: A calculus method for synthesizing choice experiments studies.

    PubMed

    Wen, Cheng; Dallimer, Martin; Carver, Steve; Ziv, Guy

    2018-05-06

    Despite their great potential for mitigating carbon emissions, wind farms are often opposed by local communities due to their visual impact on the landscape. A growing number of studies have applied nonmarket valuation methods like Choice Experiments (CE) to value the visual impact by eliciting respondents' willingness to pay (WTP) or willingness to accept (WTA) for hypothetical wind farms through survey questions. Several meta-analyses in the literature have synthesized results from different valuation studies, but they have various limitations related to the use of the prevailing multivariate meta-regression analysis. In this paper, we propose a new meta-analysis method to establish general functions for the relationships between the estimated WTP or WTA and three wind farm attributes, namely the distance to residential/coastal areas, the number of turbines and turbine height. This method involves establishing WTA or WTP functions for individual studies, fitting the average derivative functions and deriving the general integral functions of WTP or WTA against wind farm attributes. Results indicate that respondents in different studies consistently showed increasing WTP for moving wind farms to greater distances, which can be fitted by non-linear (natural logarithm) functions. However, divergent preferences for the number of turbines and turbine height were found in different studies. We argue that the new analysis method proposed in this paper is an alternative to the mainstream multivariate meta-regression analysis for synthesizing CE studies, and that the general integral functions of WTP or WTA against wind farm attributes are useful for future spatial modelling and benefit transfer studies. We also suggest that future multivariate meta-analyses should include non-linear components in the regression functions. Copyright © 2018. Published by Elsevier B.V.

  4. Genetic analysis of body weights of individually fed beef bulls in South Africa using random regression models.

    PubMed

    Selapa, N W; Nephawe, K A; Maiwashe, A; Norris, D

    2012-02-08

    The aim of this study was to estimate genetic parameters for body weights of individually fed beef bulls measured at centralized testing stations in South Africa using random regression models. Weekly body weights of Bonsmara bulls (N = 2919) tested between 1999 and 2003 were available for the analyses. The model included a fixed regression of the body weights on fourth-order orthogonal Legendre polynomials of the actual days on test (7, 14, 21, 28, 35, 42, 49, 56, 63, 70, 77, and 84) for starting age and contemporary group effects. Random regressions on fourth-order orthogonal Legendre polynomials of the actual days on test were included for additive genetic effects and additional uncorrelated random effects of the weaning-herd-year and the permanent environment of the animal. Residual effects were assumed to be independently distributed with heterogeneous variance for each test day. Variance ratios for additive genetic, permanent environment and weaning-herd-year for weekly body weights at different test days ranged from 0.26 to 0.29, 0.37 to 0.44 and 0.26 to 0.34, respectively. The weaning-herd-year was found to have a significant effect on the variation of body weights of bulls despite a 28-day adjustment period. Genetic correlations amongst body weights at different test days were high, ranging from 0.89 to 1.00. Heritability estimates were comparable to literature using multivariate models. Therefore, random regression model could be applied in the genetic evaluation of body weight of individually fed beef bulls in South Africa.
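
    As a sketch of the fixed-regression component, the snippet below constructs the fourth-order orthogonal Legendre polynomial covariates for the listed test days, after rescaling days on test to [-1, 1]. It is illustrative only and is not the authors' evaluation code.

```python
# Build fourth-order orthogonal Legendre polynomial covariates for the test days,
# the standard basis used in random regression models. Illustrative sketch only.
import numpy as np
from numpy.polynomial import legendre

test_days = np.array([7, 14, 21, 28, 35, 42, 49, 56, 63, 70, 77, 84])

# Rescale days on test to the Legendre domain [-1, 1]
t = 2 * (test_days - test_days.min()) / (test_days.max() - test_days.min()) - 1

# Columns P0(t) ... P4(t): these serve as fixed (and random) regression covariates
basis = legendre.legvander(t, 4)
print(basis.round(3))
```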

  5. The association between short interpregnancy interval and preterm birth in Louisiana: a comparison of methods.

    PubMed

    Howard, Elizabeth J; Harville, Emily; Kissinger, Patricia; Xiong, Xu

    2013-07-01

    There is growing interest in the application of propensity scores (PS) in epidemiologic studies, especially within the field of reproductive epidemiology. This retrospective cohort study assesses the impact of a short interpregnancy interval (IPI) on preterm birth and compares the results of the conventional logistic regression analysis with analyses utilizing a PS. The study included 96,378 singleton infants from Louisiana birth certificate data (1995-2007). Five regression models designed for methods comparison are presented. Ten percent (10.17%) of all births were preterm; 26.83% of births were from a short IPI. The PS-adjusted model produced a more conservative estimate of the exposure variable compared to the conventional logistic regression method (β-coefficient: 0.21 vs. 0.43), as well as a smaller standard error (0.024 vs. 0.028) and a smaller odds ratio with 95% confidence interval [1.15 (1.09, 1.20) vs. 1.23 (1.17, 1.30)]. The inclusion of more covariate and interaction terms in the PS did not change the estimates of the exposure variable. This analysis indicates that PS-adjusted regression may be appropriate for validation of conventional methods in a large dataset with a fairly common outcome. PSs may be beneficial in producing more precise estimates, especially for models with many confounders and effect modifiers and where conventional adjustment with logistic regression is unsatisfactory. Short intervals between pregnancies are associated with preterm birth in this population, according to either technique. Birth spacing is an issue that women have some control over. Educational interventions, including birth control, should be applied during prenatal visits and following delivery.
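
    A minimal sketch of the two approaches being compared is given below: a conventional covariate-adjusted logistic regression versus an outcome model adjusted for a propensity score estimated from the same covariates. The data file and variable names (preterm, short_ipi, age, parity, smoker) are hypothetical.

```python
# Conventional logistic regression vs. propensity-score (PS) adjustment.
# Data file and variable names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

births = pd.read_csv("births.csv")

# 1) Conventional adjustment: exposure plus covariates in one logistic model
conventional = smf.logit("preterm ~ short_ipi + age + parity + smoker", data=births).fit()

# 2) PS adjustment: model the exposure first, then adjust the outcome model for the PS
ps_model = smf.logit("short_ipi ~ age + parity + smoker", data=births).fit()
births["ps"] = ps_model.predict(births)
ps_adjusted = smf.logit("preterm ~ short_ipi + ps", data=births).fit()

print("conventional beta:", conventional.params["short_ipi"])
print("PS-adjusted beta: ", ps_adjusted.params["short_ipi"])
```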

  6. Landslide susceptibility mapping using frequency ratio, logistic regression, artificial neural networks and their comparison: A case study from Kat landslides (Tokat—Turkey)

    NASA Astrophysics Data System (ADS)

    Yilmaz, Işık

    2009-06-01

    The purpose of this study is to compare the landslide susceptibility mapping methods of frequency ratio (FR), logistic regression and artificial neural networks (ANN) applied in the Kat County (Tokat—Turkey). A digital elevation model (DEM) was first constructed using GIS software. Landslide-related factors such as geology, faults, drainage system, topographical elevation, slope angle, slope aspect, topographic wetness index (TWI) and stream power index (SPI) were used in the landslide susceptibility analyses. Landslide susceptibility maps were produced from the frequency ratio, logistic regression and neural networks models, and they were then compared by means of their validations. The accuracies of the susceptibility maps for all three models were assessed by comparing the maps with the known landslide locations. Respective area under curve (AUC) values of 0.826, 0.842 and 0.852 for frequency ratio, logistic regression and artificial neural networks showed that the map obtained from the ANN model is more accurate than those of the other models, although the accuracies of all models can be considered relatively similar. The results obtained in this study also showed that the frequency ratio model can be used as a simple tool in assessment of landslide susceptibility when a sufficient amount of data has been obtained. The input process, calculations and output process are very simple and readily understood in the frequency ratio model, whereas logistic regression and neural networks require the conversion of data to ASCII or other formats. Moreover, it is also very hard to process large amounts of data in the statistical package.
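
    The model comparison above rests on the area under the ROC curve (AUC); a minimal sketch of that validation step is shown below, with hypothetical arrays of known landslide labels and model-predicted susceptibility scores standing in for the real maps.

```python
# Compare susceptibility models by area under the ROC curve (AUC).
# The label and score arrays are hypothetical placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score

labels = np.load("landslide_labels.npy")  # 1 = known landslide cell, 0 = stable cell
scores = {
    "frequency ratio": np.load("fr_scores.npy"),
    "logistic regression": np.load("lr_scores.npy"),
    "neural network": np.load("ann_scores.npy"),
}
for name, s in scores.items():
    print(f"{name}: AUC = {roc_auc_score(labels, s):.3f}")
```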

  7. The Energy Content and Composition of Meals Consumed after an Overnight Fast and Their Effects on Diet Induced Thermogenesis: A Systematic Review, Meta-Analyses and Meta-Regressions

    PubMed Central

    Quatela, Angelica; Callister, Robin; Patterson, Amanda; MacDonald-Wicks, Lesley

    2016-01-01

    This systematic review investigated the effects of differing energy intakes, macronutrient compositions, and eating patterns of meals consumed after an overnight fast on Diet Induced Thermogenesis (DIT). The initial search identified 2482 records; 26 papers remained once duplicates were removed and inclusion criteria were applied. Studies (n = 27) in the analyses were randomized crossover designs comparing the effects of two or more eating events on DIT. Higher energy intake increased DIT; in a mixed model meta-regression, for every 100 kJ increase in energy intake, DIT increased by 1.1 kJ/h (p < 0.001). Meals with a high protein or carbohydrate content had a higher DIT than high fat, although this effect was not always significant. Meals with medium chain triglycerides had a significantly higher DIT than long chain triglycerides (meta-analysis, p = 0.002). Consuming the same meal as a single bolus eating event compared to multiple small meals or snacks was associated with a significantly higher DIT (meta-analysis, p = 0.02). Unclear or inconsistent findings were found by comparing the consumption of meals quickly or slowly, and palatability was not significantly associated with DIT. These findings indicate that the magnitude of the increase in DIT is influenced by the energy intake, macronutrient composition, and eating pattern of the meal. PMID:27792142

  8. The use of single-date MODIS imagery for estimating large-scale urban impervious surface fraction with spectral mixture analysis and machine learning techniques

    NASA Astrophysics Data System (ADS)

    Deng, Chengbin; Wu, Changshan

    2013-12-01

    Urban impervious surface information is essential for urban and environmental applications at the regional/national scales. As a popular image processing technique, spectral mixture analysis (SMA) has rarely been applied to coarse-resolution imagery due to the difficulty of deriving endmember spectra using traditional endmember selection methods, particularly within heterogeneous urban environments. To address this problem, we derived endmember signatures through a least squares solution (LSS) technique with known abundances of sample pixels, and integrated these endmember signatures into SMA for mapping large-scale impervious surface fraction. In addition, with the same sample set, we carried out objective comparative analyses among SMA (i.e. fully constrained and unconstrained SMA) and machine learning (i.e. Cubist regression tree and Random Forests) techniques. Analysis of results suggests three major conclusions. First, with the extrapolated endmember spectra from stratified random training samples, the SMA approaches performed relatively well, as indicated by small MAE values. Second, Random Forests yields more reliable results than Cubist regression tree, and its accuracy is improved with increased sample sizes. Finally, comparative analyses suggest a tentative guide for selecting an optimal approach for large-scale fractional imperviousness estimation: unconstrained SMA might be a favorable option with a small number of samples, while Random Forests might be preferred if a large number of samples are available.
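
    A rough sketch of the least-squares endmember step follows: with known abundance fractions for a set of sample pixels, endmember spectra are estimated by solving a linear system, and a new pixel is then unmixed against those estimates. The arrays are synthetic placeholders, not MODIS data.

```python
# Estimate endmember spectra from sample pixels with known abundances (least
# squares), then unmix a new pixel. Synthetic toy data, illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_bands, n_endmembers = 500, 7, 3

true_endmembers = rng.uniform(0.05, 0.6, size=(n_endmembers, n_bands))
abundances = rng.dirichlet(np.ones(n_endmembers), size=n_samples)  # known fractions
pixels = abundances @ true_endmembers + rng.normal(0, 0.01, (n_samples, n_bands))

# Least squares solution for endmember signatures: pixels ~ abundances @ E
E_hat, *_ = np.linalg.lstsq(abundances, pixels, rcond=None)

# Unconstrained unmixing of a new pixel against the estimated endmembers
new_pixel = 0.5 * true_endmembers[0] + 0.3 * true_endmembers[1] + 0.2 * true_endmembers[2]
frac, *_ = np.linalg.lstsq(E_hat.T, new_pixel, rcond=None)
print("estimated fractions:", frac.round(2))
```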

  9. Predictors of HIV-protection behaviour in HIV-positive men who have sex with casual male partners: a test of the explanatory power of an extended Information-Motivation-Behavioural Skills model.

    PubMed

    Nideröst, Sibylle; Gredig, Daniel; Roulin, Christophe; Rickenbach, Martin

    2011-07-01

    This prospective study applies an extended Information-Motivation-Behavioural Skills (IMB) model to establish predictors of HIV-protection behaviour among HIV-positive men who have sex with men (MSM) during sex with casual partners. Data have been collected from anonymous, self-administered questionnaires and analysed by using descriptive and backward elimination regression analyses. In a sample of 165 HIV-positive MSM, 82 participants between the ages of 23 and 78 (M=46.4, SD=9.0) had sex with casual partners during the three-month period under investigation. About 62% (n=51) have always used a condom when having sex with casual partners. From the original IMB model, only subjective norm predicted condom use. More important predictors that increased condom use were low consumption of psychotropics, high satisfaction with sexuality, numerous changes in sexual behaviour after diagnosis, low social support from friends, alcohol use before sex and habitualised condom use with casual partner(s). The explanatory power of the calculated regression model was 49% (p<0.001). The study reveals the importance of personal and social resources and of routines for condom use, and provides information for the research-based conceptualisation of prevention offers addressing especially people living with HIV ("positive prevention").

  10. Gender Gaps in Mathematics, Science and Reading Achievements in Muslim Countries: A Quantile Regression Approach

    ERIC Educational Resources Information Center

    Shafiq, M. Najeeb

    2013-01-01

    Using quantile regression analyses, this study examines gender gaps in mathematics, science, and reading in Azerbaijan, Indonesia, Jordan, the Kyrgyz Republic, Qatar, Tunisia, and Turkey among 15-year-old students. The analyses show that girls in Azerbaijan achieve as well as boys in mathematics and science and overachieve in reading. In Jordan,…

  11. Connecting clinical and actuarial prediction with rule-based methods.

    PubMed

    Fokkema, Marjolein; Smits, Niels; Kelderman, Henk; Penninx, Brenda W J H

    2015-06-01

    Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction methods for clinical practice. We argue that rule-based methods may be more useful than the linear main effect models usually employed in prediction studies, from a data and decision analytic as well as a practical perspective. In addition, decision rules derived with rule-based methods can be represented as fast and frugal trees, which, unlike main effects models, can be used in a sequential fashion, reducing the number of cues that have to be evaluated before making a prediction. We illustrate the usability of rule-based methods by applying RuleFit, an algorithm for deriving decision rules for classification and regression problems, to a dataset on prediction of the course of depressive and anxiety disorders from Penninx et al. (2011). The RuleFit algorithm provided a model consisting of 2 simple decision rules, requiring evaluation of only 2 to 4 cues. Predictive accuracy of the 2-rule model was very similar to that of a logistic regression model incorporating 20 predictor variables, originally applied to the dataset. In addition, the 2-rule model required, on average, evaluation of only 3 cues. Therefore, the RuleFit algorithm appears to be a promising method for creating decision tools that are less time consuming and easier to apply in psychological practice, and with accuracy comparable to traditional actuarial methods. (c) 2015 APA, all rights reserved).

  12. The impact of global signal regression on resting state correlations: Are anti-correlated networks introduced?

    PubMed Central

    Murphy, Kevin; Birn, Rasmus M.; Handwerker, Daniel A.; Jones, Tyler B.; Bandettini, Peter A.

    2009-01-01

    Low-frequency fluctuations in fMRI signal have been used to map several consistent resting state networks in the brain. Using the posterior cingulate cortex as a seed region, functional connectivity analyses have found not only positive correlations in the default mode network but negative correlations in another resting state network related to attentional processes. The interpretation is that the human brain is intrinsically organized into dynamic, anti-correlated functional networks. Global variations of the BOLD signal are often considered nuisance effects and are commonly removed using a general linear model (GLM) technique. This global signal regression method has been shown to introduce negative activation measures in standard fMRI analyses. The topic of this paper is whether such a correction technique could be the cause of anti-correlated resting state networks in functional connectivity analyses. Here we show that, after global signal regression, correlation values to a seed voxel must sum to a negative value. Simulations also show that small phase differences between regions can lead to spurious negative correlation values. A combination breath holding and visual task demonstrates that the relative phase of global and local signals can affect connectivity measures and that, experimentally, global signal regression leads to bell-shaped correlation value distributions, centred on zero. Finally, analyses of negatively correlated networks in resting state data show that global signal regression is most likely the cause of anti-correlations. These results call into question the interpretation of negatively correlated regions in the brain when using global signal regression as an initial processing step. PMID:18976716
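
    The central claim, that after global signal regression the correlations to any seed voxel must sum to a negative value, can be checked numerically. The sketch below regresses the global mean out of synthetic voxel time series and sums the seed correlations; it is a toy demonstration, not the authors' analysis code.

```python
# Numerical check: after global signal regression (GSR), correlations to a seed
# voxel sum to a negative value. Synthetic data; toy demonstration only.
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_voxels = 200, 500
data = rng.normal(size=(n_timepoints, n_voxels)) + rng.normal(size=(n_timepoints, 1))
data -= data.mean(axis=0)  # demean each voxel time series

# Regress the global mean time course out of every voxel
g = data.mean(axis=1, keepdims=True)
beta = np.linalg.lstsq(g, data, rcond=None)[0]
residuals = data - g @ beta

seed = residuals[:, 0]
corrs = [np.corrcoef(seed, residuals[:, v])[0, 1] for v in range(1, n_voxels)]
print("sum of correlations to the seed after GSR:", sum(corrs))  # negative
```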

  14. Analysis of Palm Oil Production, Export, and Government Consumption to Gross Domestic Product of Five Districts in West Kalimantan by Panel Regression

    NASA Astrophysics Data System (ADS)

    Sulistianingsih, E.; Kiftiah, M.; Rosadi, D.; Wahyuni, H.

    2017-04-01

    Gross Domestic Product (GDP) is an indicator of economic growth in a region. GDP is panel data, consisting of cross-section and time-series components. Panel regression is a tool that can be utilised to analyse such panel data. There are three models in panel regression, namely the Common Effect Model (CEM), Fixed Effect Model (FEM) and Random Effect Model (REM); the appropriate model is chosen based on the results of the Chow test, Hausman test and Lagrange multiplier test. This research uses panel regression to analyse the effects of palm oil production, export, and government consumption on the GDP of five districts in West Kalimantan, namely Sanggau, Sintang, Sambas, Ketapang and Bengkayang. Based on the results of the analyses, it is concluded that the REM, whose adjusted coefficient of determination is 0.823, is the best model in this case. According to the results, only export and government consumption influence the GDP of the districts.
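
    A minimal sketch of one of the model-selection steps above, the Chow (poolability) test between a common-effect and a fixed-effect specification, can be run with entity dummies in ordinary least squares; the data file and column names below are hypothetical, and the Hausman and Lagrange multiplier steps are omitted.

```python
# Common Effect Model (pooled OLS) vs. Fixed Effect Model (district dummies) with
# a Chow-type F-test for poolability. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("west_kalimantan_panel.csv")  # district, year, gdp, production, export, gov_cons

cem = smf.ols("gdp ~ production + export + gov_cons", data=panel).fit()
fem = smf.ols("gdp ~ production + export + gov_cons + C(district)", data=panel).fit()

# Chow-type test: do district-specific intercepts improve on the pooled model?
f_stat, p_value, df_diff = fem.compare_f_test(cem)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}  ->  prefer FEM over CEM if p is small")
print("FEM adjusted R2:", fem.rsquared_adj)
```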

  15. Epidemiologic programs for computers and calculators. A microcomputer program for multiple logistic regression by unconditional and conditional maximum likelihood methods.

    PubMed

    Campos-Filho, N; Franco, E L

    1989-02-01

    A frequent procedure in matched case-control studies is to report results from the multivariate unmatched analyses if they do not differ substantially from the ones obtained after conditioning on the matching variables. Although conceptually simple, this rule requires that an extensive series of logistic regression models be evaluated by both the conditional and unconditional maximum likelihood methods. Most computer programs for logistic regression employ only one maximum likelihood method, which requires that the analyses be performed in separate steps. This paper describes a Pascal microcomputer (IBM PC) program that performs multiple logistic regression by both maximum likelihood estimation methods, which obviates the need for switching between programs to obtain relative risk estimates from both matched and unmatched analyses. The program calculates most standard statistics and allows factoring of categorical or continuous variables by two distinct methods of contrast. A built-in, descriptive statistics option allows the user to inspect the distribution of cases and controls across categories of any given variable.
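
    A present-day equivalent of running both analyses on the same data can be sketched with statsmodels: unconditional maximum likelihood via Logit and conditional maximum likelihood via ConditionalLogit stratified on the matched sets. The data file and variable names are hypothetical, and this is only an illustration of the two estimation routes, not the original Pascal program.

```python
# Unconditional vs. conditional maximum likelihood logistic regression for a
# matched case-control analysis. Data file and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.conditional_models import ConditionalLogit

cc = pd.read_csv("matched_case_control.csv")  # columns: case, exposure, age, match_set
X = sm.add_constant(cc[["exposure", "age"]])

unconditional = sm.Logit(cc["case"], X).fit()
conditional = ConditionalLogit(cc["case"], cc[["exposure", "age"]],
                               groups=cc["match_set"]).fit()

print("unconditional OR:", np.exp(unconditional.params["exposure"]))
print("conditional OR:  ", np.exp(conditional.params["exposure"]))
```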

  16. Regression Discontinuity in Prospective Evaluations: The Case of the FFVP Evaluation

    ERIC Educational Resources Information Center

    Klerman, Jacob Alex; Olsho, Lauren E. W.; Bartlett, Susan

    2015-01-01

    While regression discontinuity has usually been applied retrospectively to secondary data, it is even more attractive when applied prospectively. In a prospective design, data collection can be focused on cases near the discontinuity, thereby improving internal validity and substantially increasing precision. Furthermore, such prospective…

  17. Standardized Regression Coefficients as Indices of Effect Sizes in Meta-Analysis

    ERIC Educational Resources Information Center

    Kim, Rae Seon

    2011-01-01

    When conducting a meta-analysis, it is common to find many collected studies that report regression analyses, because multiple regression analysis is widely used in many fields. Meta-analysis uses effect sizes drawn from individual studies as a means of synthesizing a collection of results. However, indices of effect size from regression analyses…

  18. Alternative configurations of Quantile Regression for estimating predictive uncertainty in water level forecasts for the Upper Severn River: a comparison

    NASA Astrophysics Data System (ADS)

    Lopez, Patricia; Verkade, Jan; Weerts, Albrecht; Solomatine, Dimitri

    2014-05-01

    Hydrological forecasting is subject to many sources of uncertainty, including those originating in initial state, boundary conditions, model structure and model parameters. Although uncertainty can be reduced, it can never be fully eliminated. Statistical post-processing techniques constitute an often used approach to estimate the hydrological predictive uncertainty, where a model of forecast error is built using a historical record of past forecasts and observations. The present study focuses on the use of the Quantile Regression (QR) technique as a hydrological post-processor. It estimates the predictive distribution of water levels using deterministic water level forecasts as predictors. This work aims to thoroughly verify uncertainty estimates using the implementation of QR that was applied in an operational setting in the UK National Flood Forecasting System, and to inter-compare forecast quality and skill in various, differing configurations of QR. These configurations are (i) 'classical' QR, (ii) QR constrained by a requirement that quantiles do not cross, (iii) QR derived on time series that have been transformed into the Normal domain (Normal Quantile Transformation - NQT), and (iv) a piecewise linear derivation of QR models. The QR configurations are applied to fourteen hydrological stations on the Upper Severn River with different catchments characteristics. Results of each QR configuration are conditionally verified for progressively higher flood levels, in terms of commonly used verification metrics and skill scores. These include Brier's probability score (BS), the continuous ranked probability score (CRPS) and corresponding skill scores as well as the Relative Operating Characteristic score (ROCS). Reliability diagrams are also presented and analysed. The results indicate that none of the four Quantile Regression configurations clearly outperforms the others.
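
    A bare-bones sketch of the 'classical' QR configuration follows: observed water levels are regressed on the deterministic forecast at several quantiles to form a predictive distribution. The data file and variable names are hypothetical; the non-crossing, NQT and piecewise-linear variants add constraints or transformations on top of this basic form.

```python
# 'Classical' quantile regression post-processor: observed level regressed on the
# deterministic forecast at several quantiles. Names and data are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

hist = pd.read_csv("forecast_archive.csv")  # columns: forecast_level, observed_level

quantiles = [0.05, 0.25, 0.5, 0.75, 0.95]
models = {
    q: smf.quantreg("observed_level ~ forecast_level", data=hist).fit(q=q)
    for q in quantiles
}

new_forecast = pd.DataFrame({"forecast_level": [3.2]})  # hypothetical deterministic forecast (m)
for q, m in models.items():
    print(f"q={q:.2f}: {float(m.predict(new_forecast).iloc[0]):.2f} m")
```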

  19. Analysing Twitter and web queries for flu trend prediction.

    PubMed

    Santos, José Carlos; Matos, Sérgio

    2014-05-07

    Social media platforms encourage people to share diverse aspects of their daily life. Among these, shared health related information might be used to infer health status and incidence rates for specific conditions or symptoms. In this work, we present an infodemiology study that evaluates the use of Twitter messages and search engine query logs to estimate and predict the incidence rate of influenza-like illness in Portugal. Based on a manually classified dataset of 2704 tweets from Portugal, we selected a set of 650 textual features to train a Naïve Bayes classifier to identify tweets mentioning flu or flu-like illness or symptoms. We obtained a precision of 0.78 and an F-measure of 0.83, based on cross validation over the complete annotated set. Furthermore, we trained a multiple linear regression model to estimate the health-monitoring data from the Influenzanet project, using as predictors the relative frequencies obtained from the tweet classification results and from query logs, and achieved a correlation ratio of 0.89 (p<0.001). These classification and regression models were also applied to estimate the flu incidence in the following flu season, achieving a correlation of 0.72. Previous studies addressing the estimation of disease incidence based on user-generated content have mostly focused on the English language. Our results further validate those studies and show that by changing the initial steps of data preprocessing and feature extraction and selection, the proposed approaches can be adapted to other languages. Additionally, we investigated whether the predictive model created can be applied to data from the subsequent flu season. In this case, although the prediction result was good, an initial phase to adapt the regression model could be necessary to achieve more robust results.
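
    A compressed sketch of the two-stage pipeline described above: classify tweets as flu-related with a Naïve Bayes text classifier, then regress the surveillance incidence rate on the weekly relative frequency of flu-related tweets. All file and column names are hypothetical, and the 650-feature cap merely echoes the feature-set size mentioned above rather than reproducing the original feature selection.

```python
# Two-stage sketch: Naive Bayes tweet classifier, then linear regression of the
# surveillance incidence rate on weekly flu-tweet frequencies. Names are hypothetical.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LinearRegression
from sklearn.naive_bayes import MultinomialNB

labeled = pd.read_csv("labeled_tweets.csv")  # columns: text, is_flu (0/1)
vec = CountVectorizer(max_features=650)
clf = MultinomialNB().fit(vec.fit_transform(labeled["text"]), labeled["is_flu"])

stream = pd.read_csv("tweet_stream.csv")     # columns: week, text
stream["is_flu"] = clf.predict(vec.transform(stream["text"]))
weekly = stream.groupby("week")["is_flu"].mean().rename("flu_tweet_freq")

surveillance = pd.read_csv("influenzanet_weekly.csv", index_col="week")  # column: incidence
joined = surveillance.join(weekly).dropna()
reg = LinearRegression().fit(joined[["flu_tweet_freq"]], joined["incidence"])
print("R^2 on training weeks:", reg.score(joined[["flu_tweet_freq"]], joined["incidence"]))
```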

  20. Psychosocial risk and protective factors for depression in the dialysis population: a systematic review and meta-regression analysis.

    PubMed

    Chan, Ramony; Steel, Zachary; Brooks, Robert; Heung, Tracy; Erlich, Jonathan; Chow, Josephine; Suranyi, Michael

    2011-11-01

    Research into the association between psychosocial factors and depression in End-Stage Renal Disease (ESRD) has expanded considerably in recent years, identifying a range of factors that may act as important risk and protective factors of depression for this population. The present study provides the first systematic review and meta-analysis of this body of research. Published studies reporting associations between any psychosocial factor and depression were identified and retrieved from Medline, Embase, and PsycINFO, by applying optimised search strategies. Mean effect sizes were calculated for the associations across five psychosocial constructs (social support, personality attributes, cognitive appraisal, coping process, stress/stressor). Multiple hierarchical meta-regression analysis was applied to examine the moderating effects of methodological and substantive factors on the strength of the observed associations. Fifty-seven studies covering 58 independent samples with 5956 participants were identified, resulting in 246 effect sizes of the association between a range of psychosocial factors and depression. The overall mean effect size (Pearson's correlation coefficient) of the association between psychosocial factors and depression was 0.36. The effect sizes between the five psychosocial constructs and depression ranged from medium (0.27) to large (0.46), with personality attributes (0.46) and cognitive appraisal (0.46) having the largest effect sizes. In the meta-regression analyses, identified demographic (gender, age, location of study) and treatment (type of dialysis) characteristics moderated the strength of the associations with depression. The current analysis documents a moderate to large association between the presence of psychosocial risk factors and depression in ESRD. 2011. Published by Elsevier Inc. All rights reserved.
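
    Illustrative note: a rough sketch of a meta-regression of this kind, on simulated effect sizes. The sample sizes, moderator, and assumed between-study variance (tau^2) below are invented for illustration only.

        # Fisher-z transformed correlations regressed on a study-level moderator, inverse-variance weighted.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(15)
        k = 58                                            # independent samples
        n_i = rng.integers(40, 400, k)                    # per-study sample sizes
        dialysis_type = rng.integers(0, 2, k)             # hypothetical moderator (e.g. HD vs PD)
        v_i = 1 / (n_i - 3)                               # sampling variance of Fisher z
        z_i = 0.35 + 0.10 * dialysis_type + rng.normal(0, np.sqrt(v_i + 0.02))

        X = sm.add_constant(dialysis_type.astype(float))
        res = sm.WLS(z_i, X, weights=1 / (v_i + 0.02)).fit()   # 0.02 = assumed tau^2
        print(res.params, res.pvalues)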

  1. Principal component analysis-based pattern analysis of dose-volume histograms and influence on rectal toxicity.

    PubMed

    Söhn, Matthias; Alber, Markus; Yan, Di

    2007-09-01

    The variability of dose-volume histogram (DVH) shapes in a patient population can be quantified using principal component analysis (PCA). We applied this to rectal DVHs of prostate cancer patients and investigated the correlation of the PCA parameters with late bleeding. PCA was applied to the rectal wall DVHs of 262 patients, who had been treated with a four-field box, conformal adaptive radiotherapy technique. The correlated changes in the DVH pattern were revealed as "eigenmodes," which were ordered by their importance to represent data set variability. Each DVH is uniquely characterized by its principal components (PCs). The correlation of the first three PCs and chronic rectal bleeding of Grade 2 or greater was investigated with uni- and multivariate logistic regression analyses. Rectal wall DVHs in four-field conformal RT can primarily be represented by the first two or three PCs, which describe approximately 94% or 96% of the DVH shape variability, respectively. The first eigenmode models the total irradiated rectal volume; thus, PC1 correlates to the mean dose. Mode 2 describes the interpatient differences of the relative rectal volume in the two- or four-field overlap region. Mode 3 reveals correlations of volumes with intermediate doses (approximately 40-45 Gy) and volumes with doses >70 Gy; thus, PC3 is associated with the maximal dose. According to univariate logistic regression analysis, only PC2 correlated significantly with toxicity. However, multivariate logistic regression analysis with the first two or three PCs revealed an increased probability of bleeding for DVHs with more than one large PC. PCA can reveal the correlation structure of DVHs for a patient population as imposed by the treatment technique and provide information about its relationship to toxicity. It proves useful for augmenting normal tissue complication probability modeling approaches.
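
    Illustrative note: the PCA-plus-logistic-regression pipeline described above can be sketched with scikit-learn. The DVH matrix and toxicity flags below are synthetic placeholders, not patient data.

        # PCA of rectal-wall DVHs, then logistic regression of the leading PCs on a toxicity flag.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n_pat, n_bins = 262, 50
        # synthetic cumulative DVHs: per-patient volume fractions decreasing with dose bin
        dvh = np.sort(rng.uniform(0, 100, (n_pat, n_bins)), axis=1)[:, ::-1]
        toxicity = rng.integers(0, 2, n_pat)       # placeholder Grade >= 2 bleeding flag

        pca = PCA(n_components=3).fit(dvh)
        pcs = pca.transform(dvh)                   # PC1..PC3 for each patient
        print("cumulative explained variance:", pca.explained_variance_ratio_.cumsum())

        logit = LogisticRegression().fit(pcs, toxicity)
        print("PC coefficients:", logit.coef_)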

  2. Stature in archeological samples from central Italy: methodological issues and diachronic changes.

    PubMed

    Giannecchini, Monica; Moggi-Cecchi, Jacopo

    2008-03-01

    Stature reconstructions from skeletal remains are usually obtained through regression equations based on the relationship between height and limb bone length. Different equations have been employed to reconstruct stature in skeletal samples, but this is the first study to provide a systematic analysis of the reliability of the different methods for Italian historical samples. Aims of this article are: 1) to analyze the reliability of different regression methods to estimate stature for populations living in Central Italy from the Iron Age to Medieval times; 2) to search for trends in stature over this time period by applying the most reliable regression method. Long bone measurements were collected from 1,021 individuals (560 males, 461 females), from 66 archeological sites for males and 54 for females. Three time periods were identified: Iron Age, Roman period, and Medieval period. To determine the most appropriate equation to reconstruct stature the Delta parameter of Gini (Memorie di metodologia statistica. Milano: Giuffre A. 1939), in which stature estimates derived from different limb bones are compared, was employed. The equations proposed by Pearson (Philos Trans R Soc London 192 (1899) 169-244) and Trotter and Gleser for Afro-Americans (Am J Phys Anthropol 10 (1952) 463-514; Am J Phys Anthropol 47 (1977) 355-356) provided the most consistent estimates when applied to our sample. We then used the equation by Pearson for further analyses. Results indicate a reduction in stature in the transition from the Iron Age to the Roman period, and a subsequent increase in the transition from the Roman period to the Medieval period. Changes of limb lengths over time were more pronounced in the distal than in the proximal elements in both limbs. 2007 Wiley-Liss, Inc.

  3. Quality by design (QbD), Process Analytical Technology (PAT), and design of experiment applied to the development of multifunctional sunscreens.

    PubMed

    Peres, Daniela D'Almeida; Ariede, Maira Bueno; Candido, Thalita Marcilio; de Almeida, Tania Santos; Lourenço, Felipe Rebello; Consiglieri, Vladi Olga; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Baby, André Rolim

    2017-02-01

    Multifunctional formulations are of great importance to ensure better skin protection from harm caused by ultraviolet radiation (UV). Despite the advantages of Quality by Design and Process Analytical Technology approaches to the development and optimization of new products, we found in the literature only a few studies concerning their applications in the cosmetic product industry. Thus, in this research work, we applied the QbD and PAT approaches to the development of multifunctional sunscreens containing bemotrizinol, ethylhexyl triazone, and ferulic acid. In addition, a UV transmittance method was applied to assess qualitative and quantitative critical quality attributes of sunscreens using chemometric analyses. Linear discriminant analysis allowed the classification of unknown formulations, which is useful for the investigation of counterfeiting and adulteration. Simultaneous quantification of ethylhexyl triazone, bemotrizinol, and ferulic acid present in the formulations was performed using PLS regression. This design allowed us to evaluate the compounds in isolation and in combination, and to demonstrate the antioxidant action of ferulic acid alongside the sunscreen performance, since the presence of this component increased the in vitro antioxidant activity by 90%.
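
    Illustrative note: a hedged sketch of multivariate calibration by PLS regression, in the spirit of the simultaneous quantification described above. The spectra and concentrations are simulated; the analyte names simply mirror the abstract.

        # PLS regression relating UV spectra to three analyte concentrations (synthetic data).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(2)
        n_samples, n_wavelengths = 30, 120
        conc = rng.uniform(0, 5, (n_samples, 3))        # bemotrizinol, EHT, ferulic acid (% w/w)
        loadings = rng.normal(size=(3, n_wavelengths))  # made-up spectral signatures
        spectra = conc @ loadings + rng.normal(0, 0.05, (n_samples, n_wavelengths))

        pls = PLSRegression(n_components=5).fit(spectra, conc)
        print("calibration R2:", round(pls.score(spectra, conc), 3))
        print("predicted concentrations:", pls.predict(spectra[:2]).round(2))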

  4. Applying intersectionality to explore the relations between gendered racism and health among Black women.

    PubMed

    Lewis, Jioni A; Williams, Marlene G; Peppers, Erica J; Gadson, Cecile A

    2017-10-01

    The purpose of this study was to apply an intersectionality framework to explore the influence of gendered racism (i.e., intersection of racism and sexism) on health outcomes. Specifically, we applied intersectionality to extend a biopsychosocial model of racism to highlight the psychosocial variables that mediate and moderate the influence of gendered racial microaggressions (i.e., subtle gendered racism) on health outcomes. In addition, we tested aspects of this conceptual model by exploring the influence of gendered racial microaggressions on the mental and physical health of Black women. We also explored the mediating role of coping strategies and the moderating role of gendered racial identity centrality. Participants were 231 Black women who completed an online survey. Results from regression analyses indicated that gendered racial microaggressions significantly predicted both self-reported mental and physical health outcomes. Results from mediation analyses further indicated that disengagement coping significantly mediated the link between gendered racial microaggressions and negative mental and physical health. Furthermore, a moderated mediation effect was found, such that individuals who reported a greater frequency of gendered racial microaggressions and reported lower levels of gendered racial identity centrality tended to use greater disengagement coping, which, in turn, was negatively associated with mental and physical health outcomes. Findings of this study suggest that gendered racial identity centrality can serve as a buffer against the negative mental and physical health effects of gendered racism for Black women. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Random regression analyses using B-splines to model growth of Australian Angus cattle

    PubMed Central

    Meyer, Karin

    2005-01-01

    Regression on B-spline basis functions has been advocated as an alternative to orthogonal polynomials in random regression analyses. Basic theory of splines in mixed model analyses is reviewed, and estimates from analyses of weights of Australian Angus cattle from birth to 820 days of age are presented. Data comprised 84 533 records on 20 731 animals in 43 herds, with a high proportion of animals with 4 or more weights recorded. Changes in weights with age were modelled through B-splines of age at recording. A total of thirteen analyses, considering different combinations of linear, quadratic and cubic B-splines and up to six knots, were carried out. Results showed good agreement for all ages with many records, but fluctuated where data were sparse. On the whole, analyses using B-splines appeared more robust against "end-of-range" problems and yielded more consistent and accurate estimates of the first eigenfunctions than previous, polynomial analyses. A model fitting quadratic B-splines, with knots at 0, 200, 400, 600 and 821 days and a total of 91 covariance components, appeared to be a good compromise between the level of detail of the model, number of parameters to be estimated, plausibility of results, and fit, measured as residual mean square error.
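
    Illustrative note: only the fixed-effects part of such a model is sketched below (the actual analysis is a mixed model with random regression coefficients per animal). The sketch assumes scikit-learn's SplineTransformer (version 1.0 or later) and uses simulated weights; the knot placement mirrors the abstract.

        # Quadratic B-spline basis of age with knots at 0, 200, 400, 600 and 821 days.
        import numpy as np
        from sklearn.preprocessing import SplineTransformer
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(3)
        age = rng.uniform(1, 820, 2000)
        weight = 40 + 0.8 * age - 0.0004 * age**2 + rng.normal(0, 25, age.size)  # synthetic

        knots = np.array([[0.0], [200.0], [400.0], [600.0], [821.0]])
        basis = SplineTransformer(degree=2, knots=knots, include_bias=True)
        X = basis.fit_transform(age.reshape(-1, 1))
        fit = LinearRegression().fit(X, weight)
        print("n basis functions:", X.shape[1], " R2:", round(fit.score(X, weight), 3))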

  6. Refining cost-effectiveness analyses using the net benefit approach and econometric methods: an example from a trial of anti-depressant treatment.

    PubMed

    Sabes-Figuera, Ramon; McCrone, Paul; Kendricks, Antony

    2013-04-01

    Economic evaluation analyses can be enhanced by employing regression methods, which allow for the identification of important sub-groups, adjustment for imperfect randomisation in clinical trials, and the analysis of non-randomised data. The aim was to explore the benefits of combining regression techniques and the standard Bayesian approach to refine cost-effectiveness analyses using data from randomised clinical trials. Data from a randomised trial of anti-depressant treatment were analysed and a regression model was used to explore the factors that have an impact on the net benefit (NB) statistic, with the aim of using these findings to adjust the cost-effectiveness acceptability curves. Exploratory sub-sample analyses were carried out to explore possible differences in cost-effectiveness. The analysis found that having suffered a previous similar depression is strongly correlated with a lower NB, independent of the outcome measure or follow-up point. In patients with previous similar depression, adding a selective serotonin reuptake inhibitor (SSRI) to supportive care for mild-to-moderate depression is probably cost-effective at the level used by the English National Institute for Health and Clinical Excellence to make recommendations. This analysis highlights the need for incorporation of econometric methods into cost-effectiveness analyses using the NB approach.
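
    Illustrative note: the net-benefit regression idea can be sketched as follows, where NB = lambda x effect - cost is computed per patient and regressed on treatment and a prior-depression indicator. All numbers and the willingness-to-pay threshold below are invented for illustration.

        # Net-benefit regression on synthetic trial data.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(4)
        n = 300
        treat = rng.integers(0, 2, n)
        prior_dep = rng.integers(0, 2, n)
        qaly = 0.60 + 0.03 * treat - 0.05 * prior_dep + rng.normal(0, 0.05, n)
        cost = 400 + 150 * treat + 200 * prior_dep + rng.normal(0, 80, n)

        wtp = 20000                                    # assumed willingness to pay per QALY
        df = pd.DataFrame({"nb": wtp * qaly - cost, "treat": treat, "prior": prior_dep})
        res = smf.ols("nb ~ treat + prior + treat:prior", df).fit()
        print(res.params)                              # treat coefficient = incremental net benefit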

  7. Regression and multivariate models for predicting particulate matter concentration level.

    PubMed

    Nazif, Amina; Mohammed, Nurul Izma; Malakahmad, Amirhossein; Abualqumboz, Motasem S

    2018-01-01

    The devastating health effects of particulate matter (PM10) exposure among susceptible populations have made it necessary to evaluate PM10 pollution. Meteorological parameters and seasonal variation increase PM10 concentration levels, especially in areas that have multiple anthropogenic activities. Hence, stepwise regression (SR), multiple linear regression (MLR) and principal component regression (PCR) analyses were used to analyse daily average PM10 concentration levels. The analyses were carried out using daily average PM10 concentration, temperature, humidity, wind speed and wind direction data from 2006 to 2010. The data were from an industrial air quality monitoring station in Malaysia. The SR analysis established that meteorological parameters had less influence on PM10 concentration levels, with coefficient of determination (R2) values from 23 to 29% based on seasoned and unseasoned analyses. The prediction analysis showed that the PCR models had better R2 values than the MLR models. The results for the analyses based on both seasoned and unseasoned data established that MLR models had R2 values from 0.50 to 0.60, while PCR models had R2 values from 0.66 to 0.89. In addition, the validation analysis using 2016 data also recognised that the PCR model outperformed the MLR model, with the PCR model for the seasoned analysis having the best result. These analyses will aid in achieving sustainable air quality management strategies.
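
    Illustrative note: a compact sketch comparing MLR and principal component regression (PCR) with scikit-learn. The meteorological predictors and PM10 values are simulated; only the variable names follow the abstract.

        # MLR versus PCR (scaling + PCA + linear regression) under cross-validation.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(5)
        n = 365
        temp, hum = rng.normal(30, 3, n), rng.normal(75, 10, n)
        wind_speed, wind_dir = rng.gamma(2, 1.5, n), rng.uniform(0, 360, n)
        X = np.column_stack([temp, hum, wind_speed, wind_dir])
        pm10 = 60 - 0.8 * wind_speed + 0.3 * (temp - 30) + rng.normal(0, 8, n)

        mlr = LinearRegression()
        pcr = make_pipeline(StandardScaler(), PCA(n_components=3), LinearRegression())
        for name, model in [("MLR", mlr), ("PCR", pcr)]:
            r2 = cross_val_score(model, X, pm10, cv=5, scoring="r2").mean()
            print(name, round(r2, 2))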

  8. Morse Code, Scrabble, and the Alphabet

    ERIC Educational Resources Information Center

    Richardson, Mary; Gabrosek, John; Reischman, Diann; Curtiss, Phyliss

    2004-01-01

    In this paper we describe an interactive activity that illustrates simple linear regression. Students collect data and analyze it using simple linear regression techniques taught in an introductory applied statistics course. The activity is extended to illustrate checks for regression assumptions and regression diagnostics taught in an…

  9. The Influencing Factor Analysis on the Performance Evaluation of Assembly Line Balancing Problem Level 1 (SALBP-1) Based on ANOVA Method

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Hu, Jiangnan

    2017-06-01

    Industry 4.0 and lean production have become the focus of manufacturing. A current issue is to analyse the performance of assembly line balancing. This study focuses on identifying the factors influencing assembly line balancing. The one-way ANOVA method is applied to assess the significance of the identified factors, and a regression model is built to find the key ones. The maximal task time (tmax), the number of tasks (n), and the degree of convergence of the precedence graph (conv) are critical for the performance of assembly line balancing. These conclusions can support lean production in manufacturing.

  10. Users manual for flight control design programs

    NASA Technical Reports Server (NTRS)

    Nalbandian, J. Y.

    1975-01-01

    Computer programs for the design of analog and digital flight control systems are documented. The program DIGADAPT uses linear-quadratic-gaussian synthesis algorithms in the design of command response controllers and state estimators, and it applies covariance propagation analysis to the selection of sampling intervals for digital systems. Program SCHED executes correlation and regression analyses for the development of gain and trim schedules to be used in open-loop explicit-adaptive control laws. A linear-time-varying simulation of aircraft motions is provided by the program TVHIS, which includes guidance and control logic, as well as models for control actuator dynamics. The programs are coded in FORTRAN and are compiled and executed on both IBM and CDC computers.

  11. Specialization Agreements in the Council for Mutual Economic Assistance

    DTIC Science & Technology

    1988-02-01

    proportions to stabilize variance (S. Weisberg, Applied Linear Regression, 2nd ed., John Wiley & Sons, New York, 1985, p. 134). If the dependent...27, 1986, p. 3. Weisberg, S., Applied Linear Regression, 2nd ed., John Wiley & Sons, New York, 1985, p. 134. Wiles, P. J., Communist International

  12. Radio Propagation Prediction Software for Complex Mixed Path Physical Channels

    DTIC Science & Technology

    2006-08-14

    63 4.4.6. Applied Linear Regression Analysis in the Frequency Range 1-50 MHz 69 4.4.7. Projected Scaling to...4.4.6. Applied Linear Regression Analysis in the Frequency Range 1-50 MHz In order to construct a comprehensive numerical algorithm capable of

  13. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM]

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  14. Anticoagulant vs. antiplatelet therapy in patients with cryptogenic stroke and patent foramen ovale: an individual participant data meta-analysis.

    PubMed

    Kent, David M; Dahabreh, Issa J; Ruthazer, Robin; Furlan, Anthony J; Weimar, Christian; Serena, Joaquín; Meier, Bernhard; Mattle, Heinrich P; Di Angelantonio, Emanuele; Paciaroni, Maurizio; Schuchlenz, Herwig; Homma, Shunichi; Lutz, Jennifer S; Thaler, David E

    2015-09-14

    The preferred antithrombotic strategy for secondary prevention in patients with cryptogenic stroke (CS) and patent foramen ovale (PFO) is unknown. We pooled multiple observational studies and used propensity score-based methods to estimate the comparative effectiveness of oral anticoagulation (OAC) compared with antiplatelet therapy (APT). Individual participant data from 12 databases of medically treated patients with CS and PFO were analysed with Cox regression models, to estimate database-specific hazard ratios (HRs) comparing OAC with APT, for both the primary composite outcome [recurrent stroke, transient ischaemic attack (TIA), or death] and stroke alone. Propensity scores were applied via inverse probability of treatment weighting to control for confounding. We synthesized database-specific HRs using random-effects meta-analysis models. This analysis included 2385 (OAC = 804 and APT = 1581) patients with 227 composite endpoints (stroke/TIA/death). The difference between OAC and APT was not statistically significant for the primary composite outcome [adjusted HR = 0.76, 95% confidence interval (CI) 0.52-1.12] or for the secondary outcome of stroke alone (adjusted HR = 0.75, 95% CI 0.44-1.27). Results were consistent in analyses applying alternative weighting schemes, with the exception that OAC had a statistically significant beneficial effect on the composite outcome in analyses standardized to the patient population who actually received APT (adjusted HR = 0.64, 95% CI 0.42-0.99). Subgroup analyses did not detect statistically significant heterogeneity of treatment effects across clinically important patient groups. We did not find a statistically significant difference comparing OAC with APT; our results justify randomized trials comparing different antithrombotic approaches in these patients. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.
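
    Illustrative note: a hedged sketch of inverse-probability-of-treatment weighting followed by a weighted Cox model, in the spirit of the pooled analysis above. It assumes the lifelines package for the Cox fit; the data, confounder and effect sizes are simulated placeholders.

        # Propensity model, IPT weights, and a weighted (marginal) Cox regression.
        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(6)
        n = 500
        age = rng.normal(50, 10, n)
        oac = rng.binomial(1, 1 / (1 + np.exp(-(age - 50) / 10)))   # confounded treatment choice
        t_event = rng.exponential(6 * np.exp(-0.02 * (age - 50) - 0.2 * oac))
        t_cens = rng.uniform(0, 10, n)

        ps = LogisticRegression().fit(age.reshape(-1, 1), oac).predict_proba(age.reshape(-1, 1))[:, 1]
        w = np.where(oac == 1, 1 / ps, 1 / (1 - ps))                # IPT weights

        df = pd.DataFrame({"T": np.minimum(t_event, t_cens),
                           "E": (t_event <= t_cens).astype(int),
                           "oac": oac, "w": w})
        cph = CoxPHFitter().fit(df, duration_col="T", event_col="E",
                                weights_col="w", robust=True)
        print(cph.summary[["coef", "exp(coef)"]])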

  15. Evaluating rehabilitation goals of visually impaired children in multidisciplinary care according to ICF-CY guidelines.

    PubMed

    Rainey, Linda; van Nispen, Ruth; van Rens, Ger

    2014-11-01

    To gain qualitative insight into the rehabilitation goals of visually impaired children and how these goals relate to the structure of the International Classification of Functioning, Disability and Health (ICF) and patient characteristics. A patient record study was conducted, analysing rehabilitation goals and characteristics of children with a suspected visual impairment in the Netherlands (<18 years) who applied for multidisciplinary services in 2012 (N = 289). Chi-square analyses for trend in rehabilitation content across age bands and additional analyses were performed. The three most common diagnoses were nystagmus (21.2%), cerebral visual impairment (16.2%) and albinism (6.1%). Rehabilitation goals for children aged <7 years were mostly aimed at 'physical (visual) functioning' (36.7%) and 'environmental factors' (36.7%). For children ≥7 years, significantly more goals were identified on activity and participation (A&P) domains (52.2%). Three A and P domains presented a significant linear trend on the number of rehabilitation goals across age bands: (1) 'Learning and applying knowledge' (13.042, p < 0.001), (4) 'Mobility' (31.340, p < 0.001) and (8) 'Major life areas' (5.925, p = 0.015). Regression analysis showed that both age and visual acuity significantly contributed to the number of A and P goals. Although analyses were based on a selection of patient records, the number and nature of rehabilitation goals differ significantly with age. Many A and P goals seem underrepresented at the intake procedure, for example: communication, peer interaction and participating in leisure activities. A systematic, standardized procedure is required to catalogue all existing goals and to be able to evaluate progress and potential new or other important goals. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  16. PARAMETRIC AND NON-PARAMETRIC (MARS: MULTIVARIATE ADAPTIVE REGRESSION SPLINES) LOGISTIC REGRESSIONS FOR PREDICTION OF A DICHOTOMOUS RESPONSE VARIABLE WITH AN EXAMPLE FOR PRESENCE/ABSENCE OF AMPHIBIANS

    EPA Science Inventory

    The purpose of this report is to provide a reference manual that could be used by investigators for making informed use of logistic regression using two methods (standard logistic regression and MARS). The details for analyses of relationships between a dependent binary response ...

  17. Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.

    PubMed

    Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S

    2016-04-07

    Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were covered in this work to evaluate whether heteroskedasticity had a significant quantitative impact: naked-eye, distance-based detection using paper-based analytical devices (PADs), cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices, and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance are discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even moderate uncertainty (30%) in the variance function still results in weighted regression outperforming unweighted regressions. We recommend utilizing the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.
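
    Illustrative note: one way to apply the recommended power model of variance is sketched below with statsmodels, on simulated calibration data. The replicate design, the model s^2 = a*x^b and the weights 1/variance follow the general recommendation; the numbers are invented.

        # Estimate a power variance model from replicate standards, then weighted least squares.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        conc = np.repeat([1, 2, 5, 10, 20, 50], 5).astype(float)
        signal = 3.0 * conc + rng.normal(0, 0.05 * (3.0 * conc))   # heteroskedastic response

        levels = np.unique(conc)
        sd = np.array([signal[conc == c].std(ddof=1) for c in levels])
        b, log_a = np.polyfit(np.log(levels), np.log(sd**2), 1)    # log s^2 = log a + b log x
        weights = 1.0 / (np.exp(log_a) * conc**b)

        X = sm.add_constant(conc)
        wls = sm.WLS(signal, X, weights=weights).fit()
        ols = sm.OLS(signal, X).fit()
        print("WLS slope:", round(float(wls.params[1]), 3),
              " OLS slope:", round(float(ols.params[1]), 3))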

  18. A Simulation Investigation of Principal Component Regression.

    ERIC Educational Resources Information Center

    Allen, David E.

    Regression analysis is one of the more common analytic tools used by researchers. However, multicollinearity between the predictor variables can cause problems in using the results of regression analyses. Problems associated with multicollinearity include entanglement of relative influences of variables due to reduced precision of estimation,…

  19. Practical application of cure mixture model for long-term censored survivor data from a withdrawal clinical trial of patients with major depressive disorder.

    PubMed

    Arano, Ichiro; Sugimoto, Tomoyuki; Hamasaki, Toshimitsu; Ohno, Yuko

    2010-04-23

    Survival analysis methods such as the Kaplan-Meier method, log-rank test, and Cox proportional hazards regression (Cox regression) are commonly used to analyze data from randomized withdrawal studies in patients with major depressive disorder. Unfortunately, such common methods may be inappropriate when a long-term censored relapse-free time appears in the data, as the methods assume that if complete follow-up were possible for all individuals, each would eventually experience the event of interest. In this paper, to analyse data including such a long-term censored relapse-free time, we discuss a semi-parametric cure regression (Cox cure regression), which combines a logistic formulation for the probability of occurrence of an event with a Cox proportional hazards specification for the time of occurrence of the event. In specifying the treatment's effect on disease-free survival, we consider the fraction of long-term survivors and the risks associated with a relapse of the disease. In addition, we develop a tree-based method for time-to-event data to identify groups of patients with differing prognoses (cure survival CART). Although analysis methods typically adapt the log-rank statistic for recursive partitioning procedures, the method applied here used a likelihood ratio (LR) test statistic from a fitting of cure survival regression assuming exponential and Weibull distributions for the latency time of relapse. The method is illustrated using data from a sertraline randomized withdrawal study in patients with major depressive disorder. We concluded that Cox cure regression provides insight into who may be cured and into how the treatment and other factors affect the cure rate and the relapse time of uncured patients, and that the cure survival CART output provides easily understandable and interpretable information, useful both in identifying groups of patients with differing prognoses and in applying Cox cure regression models that lead to meaningful interpretations.
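
    Illustrative note: a toy parametric mixture cure model is sketched below, with a logistic incidence part and an exponential latency part fitted by maximum likelihood in scipy. The paper's Cox (semi-parametric) latency component is more involved; the data and parameter values here are simulated.

        # Mixture cure likelihood: event -> pi*f(t); censored -> 1 - pi + pi*S(t).
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(8)
        n = 400
        trt = rng.integers(0, 2, n)
        cured = rng.binomial(1, 1 / (1 + np.exp(-(-0.5 + 1.0 * trt))))   # true cure status
        t_event = rng.exponential(1 / 0.3, n)
        t_cens = rng.uniform(0, 10, n)
        time = np.where(cured == 1, t_cens, np.minimum(t_event, t_cens))
        event = np.where(cured == 1, 0, (t_event <= t_cens).astype(int))

        def negloglik(par):
            b0, b1, log_lam = par
            pi_unc = 1 / (1 + np.exp(-(b0 + b1 * trt)))    # probability of eventual relapse
            lam = np.exp(log_lam)
            f = lam * np.exp(-lam * time)                  # latency density
            s = np.exp(-lam * time)                        # latency survival
            ll = event * np.log(np.maximum(pi_unc * f, 1e-300)) \
                 + (1 - event) * np.log(np.maximum(1 - pi_unc + pi_unc * s, 1e-300))
            return -ll.sum()

        fit = minimize(negloglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
        print(fit.x)   # incidence intercept, treatment effect, log relapse rate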

  20. LD Hub: a centralized database and web interface to perform LD score regression that maximizes the potential of summary level GWAS data for SNP heritability and genetic correlation analysis.

    PubMed

    Zheng, Jie; Erzurumluoglu, A Mesut; Elsworth, Benjamin L; Kemp, John P; Howe, Laurence; Haycock, Philip C; Hemani, Gibran; Tansey, Katherine; Laurin, Charles; Pourcain, Beate St; Warrington, Nicole M; Finucane, Hilary K; Price, Alkes L; Bulik-Sullivan, Brendan K; Anttila, Verneri; Paternoster, Lavinia; Gaunt, Tom R; Evans, David M; Neale, Benjamin M

    2017-01-15

    LD score regression is a reliable and efficient method of using genome-wide association study (GWAS) summary-level results data to estimate the SNP heritability of complex traits and diseases, partition this heritability into functional categories, and estimate the genetic correlation between different phenotypes. Because the method relies on summary level results data, LD score regression is computationally tractable even for very large sample sizes. However, publicly available GWAS summary-level data are typically stored in different databases and have different formats, making it difficult to apply LD score regression to estimate genetic correlations across many different traits simultaneously. In this manuscript, we describe LD Hub - a centralized database of summary-level GWAS results for 173 diseases/traits from different publicly available resources/consortia and a web interface that automates the LD score regression analysis pipeline. To demonstrate functionality and validate our software, we replicated previously reported LD score regression analyses of 49 traits/diseases using LD Hub; and estimated SNP heritability and the genetic correlation across the different phenotypes. We also present new results obtained by uploading a recent atopic dermatitis GWAS meta-analysis to examine the genetic correlation between the condition and other potentially related traits. In response to the growing availability of publicly accessible GWAS summary-level results data, our database and the accompanying web interface will ensure maximal uptake of the LD score regression methodology, provide a useful database for the public dissemination of GWAS results, and provide a method for easily screening hundreds of traits for overlapping genetic aetiologies. The web interface and instructions for using LD Hub are available at http://ldsc.broadinstitute.org/. Contact: jie.zheng@bristol.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
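
    Illustrative note: the core relation behind LD score regression, E[chi2_j] ~ N*h2*l_j/M + N*a + 1, can be illustrated with a small simulation; a simple regression of chi-square statistics on LD scores then recovers the simulated heritability. This is a conceptual sketch with a crude noise model, not the ldsc software.

        # Recover h2 from simulated GWAS chi-square statistics via regression on LD scores.
        import numpy as np

        rng = np.random.default_rng(9)
        M, N, h2 = 100_000, 50_000, 0.4                       # SNPs, sample size, heritability
        ld_scores = rng.gamma(shape=4.0, scale=25.0, size=M)  # per-SNP LD scores l_j
        expected = N * h2 * ld_scores / M + 1.0               # no confounding: intercept = 1
        chi2 = expected * rng.chisquare(1, M)                 # crude noise model, mean preserved

        slope, intercept = np.polyfit(ld_scores, chi2, 1)
        print("estimated h2:", round(slope * M / N, 3), " intercept:", round(intercept, 2))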

  1. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    PubMed

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected functionals and values of covariates. The software is illustrated through the BNP regression analysis of real data.

  2. Statistical relations among earthquake magnitude, surface rupture length, and surface fault displacement

    USGS Publications Warehouse

    Bonilla, M.G.; Mark, R.K.; Lienkaemper, J.J.

    1984-01-01

    In order to refine correlations of surface-wave magnitude, fault rupture length at the ground surface, and fault displacement at the surface by including the uncertainties in these variables, the existing data were critically reviewed and a new database was compiled. Earthquake magnitudes were redetermined as necessary to make them as consistent as possible with the Gutenberg methods and results, which necessarily make up much of the data base. Measurement errors were estimated for the three variables for 58 moderate to large shallow-focus earthquakes. Regression analyses were then made utilizing the estimated measurement errors. The regression analysis demonstrates that the relations among the variables magnitude, length, and displacement are stochastic in nature. The stochastic variance, introduced in part by incomplete surface expression of seismogenic faulting, variation in shear modulus, and regional factors, dominates the estimated measurement errors. Thus, it is appropriate to use ordinary least squares for the regression models, rather than regression models based upon an underlying deterministic relation with the variance resulting from measurement errors. Significant differences exist in correlations of certain combinations of length, displacement, and magnitude when events are grouped by fault type or by region, including attenuation regions delineated by Evernden and others. Subdivision of the data results in too few data for some fault types and regions, and for these only regressions using all of the data as a group are reported. Estimates of the magnitude and the standard deviation of the magnitude of a prehistoric or future earthquake associated with a fault can be made by correlating M with the logarithms of rupture length, fault displacement, or the product of length and displacement. Fault rupture area could be reliably estimated for about 20 of the events in the data set. Regression of MS on rupture area did not result in a marked improvement over regressions that did not involve rupture area. Because no subduction-zone earthquakes are included in this study, the reported results do not apply to such zones.
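
    Illustrative note: a minimal sketch of the kind of correlation reported here, ordinary least squares of surface-wave magnitude on the logarithm of rupture length. The values and coefficients below are synthetic and are not the study's regression results.

        # OLS of Ms on log10(surface rupture length), synthetic data.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(16)
        length_km = rng.uniform(5, 400, 58)                        # surface rupture length
        ms = 5.1 + 1.2 * np.log10(length_km) + rng.normal(0, 0.3, 58)

        res = sm.OLS(ms, sm.add_constant(np.log10(length_km))).fit()
        print(res.params, res.bse)    # intercept/slope and their standard errors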

  3. Artificial Neural Network for the Prediction of Chromosomal Abnormalities in Azoospermic Males.

    PubMed

    Akinsal, Emre Can; Haznedar, Bulent; Baydilli, Numan; Kalinli, Adem; Ozturk, Ahmet; Ekmekçioğlu, Oğuz

    2018-02-04

    To evaluate whether an artificial neural network helps to diagnose any chromosomal abnormalities in azoospermic males. The data of azoospermic males attending a tertiary academic referral center were evaluated retrospectively. Height, total testicular volume, follicle stimulating hormone, luteinising hormone, total testosterone and ejaculate volume of the patients were used for the analyses. For the artificial neural network, the data of 310 azoospermic males were used as the training set and 115 as the test set. Logistic regression and discriminant analyses were performed as the statistical analyses. The tests were then re-analysed with a neural network. Both the logistic regression analyses and the artificial neural network predicted the presence or absence of chromosomal abnormalities with more than 95% accuracy. The use of the artificial neural network model yielded satisfactory results in terms of distinguishing whether or not patients have any chromosomal abnormality.
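
    Illustrative note: a hedged sketch comparing a logistic regression with a small neural network classifier on simulated endocrine and volume features. The feature names mirror the abstract; the data, label rule and split sizes are placeholders.

        # Logistic regression vs. a small MLP on synthetic features (310 training / 115 test).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(10)
        n = 425
        X = np.column_stack([
            rng.normal(175, 7, n),               # height (cm)
            rng.normal(20, 8, n),                # total testicular volume (mL)
            rng.gamma(4, 4, n),                  # FSH
            rng.gamma(3, 2, n),                  # LH
            rng.normal(4, 1.5, n),               # total testosterone
            rng.normal(3, 1, n),                 # ejaculate volume (mL)
        ])
        y = (X[:, 2] > 20).astype(int)           # placeholder "abnormality" label

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=310, random_state=0)
        for model in [LogisticRegression(max_iter=1000),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)]:
            pipe = make_pipeline(StandardScaler(), model)
            print(type(model).__name__, round(pipe.fit(X_tr, y_tr).score(X_te, y_te), 3))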

  4. Association between response rates and survival outcomes in patients with newly diagnosed multiple myeloma. A systematic review and meta-regression analysis.

    PubMed

    Mainou, Maria; Madenidou, Anastasia-Vasiliki; Liakos, Aris; Paschos, Paschalis; Karagiannis, Thomas; Bekiari, Eleni; Vlachaki, Efthymia; Wang, Zhen; Murad, Mohammad Hassan; Kumar, Shaji; Tsapas, Apostolos

    2017-06-01

    We performed a systematic review and meta-regression analysis of randomized control trials to investigate the association between response to initial treatment and survival outcomes in patients with newly diagnosed multiple myeloma (MM). Response outcomes included complete response (CR) and the combined outcome of CR or very good partial response (VGPR), while survival outcomes were overall survival (OS) and progression-free survival (PFS). We used random-effect meta-regression models and conducted sensitivity analyses based on definition of CR and study quality. Seventy-two trials were included in the systematic review, 63 of which contributed data in meta-regression analyses. There was no association between OS and CR in patients without autologous stem cell transplant (ASCT) (regression coefficient: .02, 95% confidence interval [CI] -0.06, 0.10), in patients undergoing ASCT (-.11, 95% CI -0.44, 0.22) and in trials comparing ASCT with non-ASCT patients (.04, 95% CI -0.29, 0.38). Similarly, OS did not correlate with the combined metric of CR or VGPR, and no association was evident between response outcomes and PFS. Sensitivity analyses yielded similar results. This meta-regression analysis suggests that there is no association between conventional response outcomes and survival in patients with newly diagnosed MM. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Self-transcendence, nurse-patient interaction and the outcome of multidimensional well-being in cognitively intact nursing home patients.

    PubMed

    Haugan, Gørill; Hanssen, Brith; Moksnes, Unni K

    2013-12-01

    The aim of this study was to investigate the associations between age, gender, self-transcendence, nurse-patient interaction and multidimensional well-being as the outcome among cognitively intact nursing home patients. Self-transcendence is considered to be a vital resource of well-being in vulnerable populations and at the end of life. Moreover, the quality of care and the nurse-patient interaction is found to influence self-transcendence and well-being in nursing home patients. A cross-sectional design employing the Self-Transcendence Scale, the Nurse-Patient Interaction Scale, the FACT-G Quality of Life and the FACIT-Sp Spiritual Well-Being questionnaires was adopted. A sample of 202 cognitively intact nursing home patients from 44 nursing homes in central Norway was selected. A previously documented two-factor construct of self-transcendence was applied. The statistical analyses were carried out by means of independent sample t-test, correlation and regression analyses. Multiple linear regression analyses revealed significant relationships between interpersonal self-transcendence and social, functional and spiritual well-being, whereas intrapersonal self-transcendence significantly related to emotional, social, functional and spiritual well-being. Nurse-patient interaction was related to physical, emotional and functional well-being. Age and gender were not significant predictors for well-being, except for functional and spiritual well-being where women scored higher than men. Nurse-patient interaction and self-transcendence are vital resources for promoting well-being physically, emotionally, functionally, socially and spiritually among cognitively intact nursing home patients. Nurse-patient interaction reflects essential nursing qualities that promote self-transcendence and multidimensional well-being. These findings are important for clinical nursing intending to increase patients' well-being. © 2012 The Authors Scandinavian Journal of Caring Sciences © 2012 Nordic College of Caring Science.

  6. Genome-Wide Associations Related to Hepatic Histology in Nonalcoholic Fatty Liver Disease in Hispanic Boys.

    PubMed

    Wattacheril, Julia; Lavine, Joel E; Chalasani, Naga P; Guo, Xiuqing; Kwon, Soonil; Schwimmer, Jeffrey; Molleston, Jean P; Loomba, Rohit; Brunt, Elizabeth M; Chen, Yii-Der Ida; Goodarzi, Mark O; Taylor, Kent D; Yates, Katherine P; Tonascia, James; Rotter, Jerome I

    2017-11-01

    To identify genetic loci associated with features of histologic severity of nonalcoholic fatty liver disease in a cohort of Hispanic boys. There were 234 eligible Hispanic boys age 2-17 years with clinical, laboratory, and histologic data enrolled in the Nonalcoholic Steatohepatitis Clinical Research Network included in the analysis of 624 297 single nucleotide polymorphisms (SNPs). After the elimination of 4 outliers and 22 boys with cryptic relatedness, association analyses were performed on 208 DNA samples with corresponding liver histology. Logistic regression analyses were carried out for qualitative traits and linear regression analyses were applied for quantitative traits. The median age and body mass index z-score were 12.0 years (IQR, 11.0-14.0) and 2.4 (IQR, 2.1-2.6), respectively. The nonalcoholic fatty liver disease activity score (scores 1-4 vs 5-8) was associated with SNP rs11166927 on chromosome 8 in the TRAPPC9 region (P = 8.7 × 10^-7). Fibrosis stage was associated with SNP rs6128907 on chromosome 20, near actin related protein 5 homolog (p = 9.9 × 10^-7). In comparing our results in Hispanic boys with those of previously reported SNPs in adult nonalcoholic steatohepatitis, 2 of 26 susceptibility loci were associated with nonalcoholic fatty liver disease activity score and 2 were associated with fibrosis stage. In this discovery genome-wide association study, we found significant novel gene effects on histologic traits associated with nonalcoholic fatty liver disease activity score and fibrosis that are distinct from those previously recognized by adult nonalcoholic fatty liver disease genome-wide association studies. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. The osmotic tolerance of boar spermatozoa and its usefulness as sperm quality parameter.

    PubMed

    Yeste, Marc; Briz, Mailo; Pinart, Elisabeth; Sancho, Sílvia; Bussalleu, Eva; Bonet, Sergi

    2010-06-01

    Predicting the fertility outcome of ejaculates is very important in the field of porcine reproduction. The aims of this study were to determine the effects of different osmotic treatments on boar spermatozoa and to correlate them with fertility and prolificacy, assessed as non-return rates within 60 days (NRR(60d)) of the first inseminations, and litter size (LS), respectively. Sperm samples (n=100) from one hundred healthy Piétrain boars were used to assess 48 treatments combining different osmolalities (ranging between 100 and 4000 mOsm kg(-1)), different compounds used to prepare anisotonic solutions, and two different modalities: return and non-return to isotonic conditions. Sperm quality was evaluated before and after applying the treatments on the basis of analyses of sperm viability, motility, morphology and percentages of acrosome-intact spermatozoa. Statistical analyses were performed using a one-way ANOVA and post hoc Tukey's test, linear regression analyses (Pearson correlation and multiple regression) and Jackknife cross-validation. Although three conventional parameters: sperm viability, sperm morphology and the percentages of acrosome-intact spermatozoa were significantly correlated with NRR(60d) and with LS, their respective osmotic tolerance parameters (defined for each parameter and treatment relative to the negative control) presented higher Pearson coefficients with both fertility and prolificacy in three treatments (150 mOsm kg(-1) with non-return to isotonic conditions, 200 mOsm kg(-1) with return and 500 mOsm kg(-1) using sodium citrate and non-return to isotonic conditions). We conclude that osmotic resistance in sperm viability, sperm morphology and acrosome integrity in the treatments mentioned above could be assessed along with classical parameters to better predict the fertilising ability of a given ejaculate. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  8. The effects of normal aging on multiple aspects of financial decision-making.

    PubMed

    Bangma, Dorien F; Fuermaier, Anselm B M; Tucha, Lara; Tucha, Oliver; Koerts, Janneke

    2017-01-01

    Financial decision-making (FDM) is crucial for independent living. Due to cognitive decline that accompanies normal aging, older adults might have difficulties in some aspects of FDM. However, improved knowledge, personal experience and affective decision-making, which are also related to normal aging, may lead to stable or even improved performance in some other aspects of FDM. Therefore, the present explorative study examines the effects of normal aging on multiple aspects of FDM. One hundred and eighty participants (range 18-87 years) were assessed with eight FDM tests and several standard neuropsychological tests. Age effects were evaluated using hierarchical multiple regression analyses. The validity of the prediction models was examined by internal validation (i.e. bootstrap resampling procedure) as well as external validation on another, independent, sample of participants (n = 124). Multiple regression and correlation analyses were applied to investigate the mediation effect of standard measures of cognition on the observed effects of age on FDM. On a relatively basic level of FDM (e.g., paying bills or using FDM styles) no significant effects of aging were found. However, more complex FDM, such as making decisions in accordance with specific rules, becomes more difficult with advancing age. Furthermore, an older age was found to be related to a decreased sensitivity for impulsive buying. These results were confirmed by the internal and external validation analyses. Mediation effects of numeracy and planning were found to explain parts of the association between one aspect of FDM (i.e. Competence in decision rules) and age; however, these cognitive domains were not able to completely explain the relation between age and FDM. Normal aging has a negative influence on a complex aspect of FDM; however, other aspects appear to be unaffected by normal aging, or even improve.
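
    Illustrative note: the bootstrap internal-validation step mentioned above can be sketched as an optimism-corrected R-squared (Harrell-style bootstrap). The age effect, sample and outcome below are simulated, not the study's data.

        # Optimism bootstrap for a simple age model of an FDM score.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(11)
        n = 180
        age = rng.uniform(18, 87, n)
        fdm_score = 30 - 0.12 * age + rng.normal(0, 4, n)      # synthetic complex-FDM score
        X = age.reshape(-1, 1)

        full = LinearRegression().fit(X, fdm_score)
        apparent_r2 = full.score(X, fdm_score)

        optimism = []
        for _ in range(500):
            idx = rng.integers(0, n, n)                        # bootstrap resample
            boot = LinearRegression().fit(X[idx], fdm_score[idx])
            optimism.append(boot.score(X[idx], fdm_score[idx]) - boot.score(X, fdm_score))
        print("apparent R2:", round(apparent_r2, 3),
              " optimism-corrected R2:", round(apparent_r2 - np.mean(optimism), 3))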

  9. Health-promoting lifestyle behaviour for cancer prevention: a survey of Turkish university students.

    PubMed

    Ay, Semra; Yanikkerem, Emre; Çalim, Selda Ildan; Yazici, Mete

    2012-01-01

    Health risks associated with unhealthy behaviours in adolescent and university students contribute to the development of health problems in later life. During the past twenty years, there has been a dramatic increase in public, private, and professional interest in preventing disability and death through changes in lifestyle and participation in screening programs. The aim of the study was to evaluate university students' health-promoting lifestyle behaviour for cancer prevention. This study was carried out on university students studying in sports, health and social fields at Celal Bayar University, Manisa, Turkey. The health-promoting lifestyles of university students were measured with the "health-promoting lifestyle profile (HPLP)". The survey was conducted from March 2011 to July 2011 and the study sample consisted of 1007 university students. T-test, ANOVA and multiple regression analyses were used for statistical analyses. In the univariate analyses, the overall HPLP score was significantly related to students' school, sex, age, school grades, whether they had received health education lessons, place of birth, longest place of residence, current place of residence, health insurance, family income, alcohol use, participation in sports, and self-perceived health status. Healthier behaviour was found in those students whose parents had higher secondary degrees, and in students who had no siblings. In the multiple regression model, healthier behaviour was observed in Physical Education and Sports students, fourth-year students, those who exercised regularly, had a good self-perceived health status, who lived with their family, and who had received health education lessons. In general, in order to ensure cancer prevention and a healthy life style, social, cultural and sportive activities should be encouraged and educational programmes supporting these goals should be designed and applied in all stages of life from childhood through adulthood.

  10. Does more education mean less disability in people with dementia? A large cross-sectional study in Taiwan

    PubMed Central

    Huang, Shih-Wei; Chi, Wen-Chou; Yen, Chia-Feng; Chang, Kwang-Hwa; Liao, Hua-Fang; Escorpizo, Reuben; Chang, Feng-Hang; Liou, Tsan-Hon

    2017-01-01

    Background: The WHO Disability Assessment Schedule 2.0 (WHODAS 2.0) is a feasible tool for assessing functional disability and analysing the risk of institutionalisation among elderly patients with dementia. However, data on the effect of education on disability status in patients with dementia are lacking. The aim of this large-scale, population-based study was to analyse the effect of education on the disability status of elderly Taiwanese patients with dementia by using WHODAS 2.0. Methods: From the Taiwan Data Bank of Persons with Disability, we enrolled 7698 disabled elderly (older than 65 years) patients diagnosed with dementia between July 2012 and January 2014. According to their education status, we categorised these patients into those with and those without formal education (3849 patients each). We controlled for the demographic variables through propensity score matching. The standardised scores of these patients in the six domains of WHODAS 2.0 were evaluated by certified interviewers. Student's t-test was used for comparing the WHODAS 2.0 scores of patients with dementia in the two aforementioned groups. Poisson regression was applied for analysing the association among all the investigated variables. Results: Patients with formal education had lower disability in the domains of getting along and social participation than did patients without formal education. Poisson regression revealed that standardised scores in all domains of WHODAS 2.0, except self-care, were associated with education status. Conclusions: This study revealed lower disability in the WHODAS 2.0 domains of getting along and social participation for patients with dementia with formal education compared with those without formal education. For patients with disability and dementia without formal education, community interventions promoting social participation should be implemented to maintain better social interaction ability. PMID:28473510
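
    Illustrative note: the Poisson regression step can be sketched as below with statsmodels; the matching step is omitted here and the domain scores, education effect and covariate are simulated placeholders.

        # Poisson regression of a WHODAS-style domain score on formal-education status.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(12)
        n = 7698
        education = rng.integers(0, 2, n)                 # 1 = any formal education
        age = rng.normal(78, 7, n)
        score = rng.poisson(np.exp(3.2 - 0.10 * education + 0.01 * (age - 78)))

        df = pd.DataFrame({"score": score, "education": education, "age": age})
        res = smf.glm("score ~ education + age", df, family=sm.families.Poisson()).fit()
        print(np.exp(res.params["education"]))            # rate ratio for formal education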

  11. [German practice of involuntary commitment at both federal and state level after introduction of the Guardianship law (1992-2009)].

    PubMed

    Valdes-Stauber, J; Deinert, H; Kilian, R

    2012-05-01

    Given the steady rise of psychiatric coercive measures in Germany, the question arises whether this development is significantly influenced by the corresponding legal basis or by epidemiological, socio-economic or socio-structural factors. Based on full surveys of the Federal Ministry of Justice, we examined the development and associations of 10 indicators of coercive psychiatric measures over a period of 18 years. Time trends of all indicators were descriptively analysed. Statistical associations between time trends, and between involuntary admissions and economic indicators, were analysed by regression models. All annual involuntary commitment rates have increased, judicially ordered physical restraint measures particularly strongly (848%). The rate of judicial rejections of applied involuntary measures showed the lowest increase. On the other hand, quotas of involuntary admissions remained stable. In former East Germany, the involuntary admission rates are only a third of those in the former West Germany. Results of regression analyses indicate an excess increase of physical coercive measures in psychiatric hospitals in relation to the increase of psychiatric admissions. In former East Germany the rate of involuntary admissions at the federal state level is negatively correlated with the average gross income. The continuous increase of coercive psychiatric measures following the change in the Guardianship law suggests that this change has influenced practice. The differences at federal and state levels, and the sharper rise in the former East Germany despite lower rates in comparison to the former West Germany, need an explanation, as does the fact that the rate of involuntary admissions is associated, at least in the former East Germany, with economic conditions. © Georg Thieme Verlag KG Stuttgart · New York.

  12. Semi-supervised Machine Learning for Analysis of Hydrogeochemical Data and Models

    NASA Astrophysics Data System (ADS)

    Vesselinov, Velimir; O'Malley, Daniel; Alexandrov, Boian; Moore, Bryan

    2017-04-01

    Data- and model-based analyses such as uncertainty quantification, sensitivity analysis, and decision support using complex physics models with numerous model parameters typically require a huge number of model evaluations (on the order of 10^6). Furthermore, model simulations of complex physics may require substantial computational time. For example, accounting for simultaneously occurring physical processes such as fluid flow and biogeochemical reactions in a heterogeneous porous medium may require several hours of wall-clock computational time. To address these issues, we have developed a novel methodology for semi-supervised machine learning based on Non-negative Matrix Factorization (NMF) coupled with customized k-means clustering. The algorithm allows for automated, robust Blind Source Separation (BSS) of groundwater types (contamination sources) based on model-free analyses of observed hydrogeochemical data. We have also developed reduced order modeling tools, which couple support vector regression (SVR), genetic algorithms (GA), and artificial and convolutional neural networks (ANN/CNN). SVR is applied to predict the model behavior within prior uncertainty ranges associated with the model parameters. ANN and CNN procedures are applied to upscale heterogeneity of the porous medium. In the upscaling process, fine-scale high-resolution models of heterogeneity are applied to inform coarse-resolution models, which have improved computational efficiency while capturing the impact of fine-scale effects at the coarse scale of interest. These techniques are tested independently on a series of synthetic problems. We also present a decision analysis related to contaminant remediation where the developed reduced order models are applied to reproduce groundwater flow and contaminant transport in a synthetic heterogeneous aquifer. The tools are coded in Julia and are part of the MADS high-performance computational framework (https://github.com/madsjulia/Mads.jl).
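
    Illustrative note: the blind-source-separation idea (NMF plus k-means clustering) can be sketched in Python as below; the actual MADS tools are written in Julia and are more elaborate. The well chemistry, number of sources and mixing ratios here are simulated placeholders.

        # NMF of a non-negative concentration matrix, then k-means on the mixing weights.
        import numpy as np
        from sklearn.decomposition import NMF
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(13)
        n_wells, n_species, n_sources = 40, 12, 3
        sources = rng.uniform(0, 1, (n_sources, n_species))        # end-member chemistry
        mixing = rng.dirichlet(np.ones(n_sources), size=n_wells)   # per-well mixing ratios
        data = mixing @ sources + rng.uniform(0, 0.01, (n_wells, n_species))

        nmf = NMF(n_components=3, init="nndsvda", max_iter=2000, random_state=0)
        W = nmf.fit_transform(data)                                # estimated mixing weights
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(W)
        print("reconstruction error:", round(nmf.reconstruction_err_, 4))
        print("cluster sizes:", np.bincount(labels))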

  13. Using Parametric Cost Models to Estimate Engineering and Installation Costs of Selected Electronic Communications Systems

    DTIC Science & Technology

    1994-09-01

    Institute of Technology, Wright-Patterson AFB OH, January 1994. 4. Neter, John and others. Applied Linear Regression Models. Boston: Irwin, 1989. 5...Technology, Wright-Patterson AFB OH 5 April 1994. 29. Neter, John and others. Applied Linear Regression Models. Boston: Irwin, 1989. 30. Office of

  14. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System

    DTIC Science & Technology

    1989-09-01

    residual and it is described as the residual divided by its standard deviation (13:App A,17). Neter, Wasserman, and Kutner, in Applied Linear Regression Models...others. Applied Linear Regression Models. Homewood IL: Irwin, 1983. 19. Raduchel, William J. "A Professional’s Perspective on User-Friendliness," Byte

  15. iHWG-μNIR: a miniaturised near-infrared gas sensor based on substrate-integrated hollow waveguides coupled to a micro-NIR-spectrophotometer.

    PubMed

    Rohwedder, J J R; Pasquini, C; Fortes, P R; Raimundo, I M; Wilk, A; Mizaikoff, B

    2014-07-21

    A miniaturised gas analyser is described and evaluated based on the use of a substrate-integrated hollow waveguide (iHWG) coupled to a microsized near-infrared spectrophotometer comprising a linear variable filter and an array of InGaAs detectors. This gas sensing system was applied to analyse surrogate samples of natural fuel gas containing methane, ethane, propane and butane, quantified by using multivariate regression models based on partial least squares (PLS) algorithms and Savitzky-Golay first-derivative data preprocessing. The external validation of the obtained models reveals root mean square errors of prediction of 0.37, 0.36, 0.67 and 0.37% (v/v), for methane, ethane, propane and butane, respectively. The developed sensing system provides particularly rapid response times upon composition changes of the gaseous sample (approximately 2 s) due to the minute volume of the iHWG-based measurement cell. The sensing system developed in this study is fully portable with a hand-held sized analyser footprint, and thus ideally suited for field analysis. Last but not least, the obtained results corroborate the potential of NIR-iHWG analysers for monitoring the quality of natural gas and petrochemical gaseous products.
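
    As a minimal illustration of this calibration approach (not the authors' code), the sketch below applies Savitzky-Golay first-derivative preprocessing and a PLS regression model to synthetic placeholder spectra; the wavelength count, number of latent variables, and concentration values are assumptions.

        import numpy as np
        from scipy.signal import savgol_filter
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(1)
        n_samples, n_wavelengths = 60, 200
        spectra = rng.random((n_samples, n_wavelengths))   # placeholder NIR spectra
        methane = rng.uniform(70, 95, n_samples)           # placeholder concentrations, % (v/v)

        # First-derivative Savitzky-Golay preprocessing along the wavelength axis.
        X = savgol_filter(spectra, window_length=11, polyorder=2, deriv=1, axis=1)

        pls = PLSRegression(n_components=5)
        pred = cross_val_predict(pls, X, methane, cv=10).ravel()
        rmsep = np.sqrt(np.mean((pred - methane) ** 2))
        print(f"cross-validated RMSE of prediction: {rmsep:.2f} % (v/v)")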

  16. DTI measures identify mild and moderate TBI cases among patients with complex health problems: A receiver operating characteristic analysis of U.S. veterans.

    PubMed

    Main, Keith L; Soman, Salil; Pestilli, Franco; Furst, Ansgar; Noda, Art; Hernandez, Beatriz; Kong, Jennifer; Cheng, Jauhtai; Fairchild, Jennifer K; Taylor, Joy; Yesavage, Jerome; Wesson Ashford, J; Kraemer, Helena; Adamson, Maheen M

    2017-01-01

    Standard MRI methods are often inadequate for identifying mild traumatic brain injury (TBI). Advances in diffusion tensor imaging now provide potential biomarkers of TBI among white matter fascicles (tracts). However, it is still unclear which tracts are most pertinent to TBI diagnosis. This study ranked fiber tracts on their ability to discriminate patients with and without TBI. We acquired diffusion tensor imaging data from military veterans admitted to a polytrauma clinic (Overall n = 109; Age: M = 47.2, SD = 11.3; Male: 88%; TBI: 67%). TBI diagnosis was based on self-report and neurological examination. Fiber tractography analysis produced 20 fiber tracts per patient. Each tract yielded four clinically relevant measures (fractional anisotropy, mean diffusivity, radial diffusivity, and axial diffusivity). We applied receiver operating characteristic (ROC) analyses to identify the most diagnostic tract for each measure. The analyses produced an optimal cutpoint for each tract. We then used kappa coefficients to rate the agreement of each cutpoint with the neurologist's diagnosis. The tract with the highest kappa was most diagnostic. As a check on the ROC results, we performed a stepwise logistic regression on each measure using all 20 tracts as predictors. We also bootstrapped the ROC analyses to compute the 95% confidence intervals for sensitivity, specificity, and the highest kappa coefficients. The ROC analyses identified two fiber tracts as most diagnostic of TBI: the left cingulum (LCG) and the left inferior fronto-occipital fasciculus (LIF). Like ROC, logistic regression identified LCG as most predictive for the FA measure but identified the right anterior thalamic tract (RAT) for the MD, RD, and AD measures. These findings are potentially relevant to the development of TBI biomarkers. Our methods also demonstrate how ROC analysis may be used to identify clinically relevant variables in the TBI population.
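
    A minimal sketch of the cutpoint-plus-kappa step is given below, assuming simulated data and a Youden-index cutpoint (the abstract does not specify the cutpoint rule used); scikit-learn supplies the ROC curve and Cohen's kappa.

        import numpy as np
        from sklearn.metrics import roc_curve, cohen_kappa_score

        rng = np.random.default_rng(2)
        tbi = rng.integers(0, 2, 109)                   # 1 = clinician-diagnosed TBI (simulated)
        fa = rng.normal(0.45, 0.05, 109) - 0.05 * tbi   # e.g., fractional anisotropy of one tract

        # Lower FA is assumed to indicate injury, so the ROC score is -fa.
        fpr, tpr, thresholds = roc_curve(tbi, -fa)
        youden = tpr - fpr
        cut = -thresholds[np.argmax(youden)]            # back-transform to the FA scale

        predicted = (fa <= cut).astype(int)
        kappa = cohen_kappa_score(tbi, predicted)
        print(f"optimal FA cutpoint: {cut:.3f}, kappa vs. diagnosis: {kappa:.2f}")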

  17. Predicting Word Reading Ability: A Quantile Regression Study

    ERIC Educational Resources Information Center

    McIlraith, Autumn L.

    2018-01-01

    Predictors of early word reading are well established. However, it is unclear if these predictors hold for readers across a range of word reading abilities. This study used quantile regression to investigate predictive relationships at different points in the distribution of word reading. Quantile regression analyses used preschool and…
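
    For readers unfamiliar with the technique, the brief sketch below fits quantile regressions at several points of the outcome distribution with statsmodels; the predictor name and data are hypothetical stand-ins, not the study's preschool measures.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n = 300
        df = pd.DataFrame({"phon_awareness": rng.normal(0, 1, n)})
        df["word_reading"] = 50 + 8 * df["phon_awareness"] + rng.normal(0, 10, n)

        # Estimate the predictor's slope at the lower tail, median, and upper tail.
        for q in (0.1, 0.5, 0.9):
            fit = smf.quantreg("word_reading ~ phon_awareness", df).fit(q=q)
            print(f"quantile {q}: slope = {fit.params['phon_awareness']:.2f}")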

  18. Applied Multiple Linear Regression: A General Research Strategy

    ERIC Educational Resources Information Center

    Smith, Brandon B.

    1969-01-01

    Illustrates some of the basic concepts and procedures for using regression analysis in experimental design, analysis of variance, analysis of covariance, and curvilinear regression. Applications to evaluation of instruction and vocational education programs are illustrated. (GR)

  19. Linear regression crash prediction models : issues and proposed solutions.

    DOT National Transportation Integrated Search

    2010-05-01

    The paper develops a linear regression model approach that can be applied to crash data to predict vehicle crashes. The proposed approach involves novel data aggregation to satisfy linear regression assumptions; namely error structure normality ...

  20. Changes in Soil Carbon Storage After Cultivation

    DOE Data Explorer

    Mann, L. K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2004-01-01

    Previously published data from 625 paired soil samples were used to predict carbon in cultivated soil as a function of initial carbon content. A 30-cm sampling depth provided a less variable estimate (r2 = 0.9) of changes in carbon than a 15-cm sampling depth (r2 = 0.6). Regression analyses of changes in carbon storage in relation to years of cultivation confirmed that the greatest rates of change occurred in the first 20 y. An initial carbon effect was present in all analyses: soils very low in carbon tended to gain slight amounts of carbon after cultivation, but soils high in carbon lost at least 20% during cultivation. Carbon losses from most agricultural soils are estimated to average less than 20% of initial values or less than 1.5 kg/m2 within the top 30 cm. These estimates should not be applied to depths greater than 30 cm and would be improved with more bulk density information and equivalent sample volumes.

  1. Why go the extra mile? A longitudinal study on sojourn goals and their impact on sojourners' adaptation.

    PubMed

    Zimmermann, Julia; Schubert, Kristina; Bruder, Martin; Hagemeyer, Birk

    2017-12-01

    Although international student mobility has become a ubiquitous phenomenon in many parts of the world, the goals that student sojourners pursue when moving abroad have received little systematic attention in psychological research. Likewise, their effects on psychological outcomes such as sojourners' psychological and sociocultural adaptation abroad have not yet been examined. Hence, the purpose of the present research was twofold: First, we established the parsimonious Sojourn Goals Scale and confirmed its psychometric quality and construct validity. Second, we used a longitudinal sample of student sojourners to investigate the role of sojourn goals for sojourners' sociocultural (i.e., sojourners' social relationships) and psychological (i.e., sojourn satisfaction) adaptation abroad at 3 months into the sojourn. Regression analyses revealed substantial effects of sojourn goals on measures of sociocultural adaptation. Response surface analyses served to examine the interplay of sojourn goals and respective sojourn experiences on sojourn satisfaction. We discuss implications for both psychological and applied research and identify future research needs. © 2016 International Union of Psychological Science.

  2. Estimating population diversity with CatchAll

    PubMed Central

    Bunge, John; Woodard, Linda; Böhning, Dankmar; Foster, James A.; Connolly, Sean; Allen, Heather K.

    2012-01-01

    Motivation: The massive data produced by next-generation sequencing require advanced statistical tools. We address estimating the total diversity or species richness in a population. To date, only relatively simple methods have been implemented in available software. There is a need for software employing modern, computationally intensive statistical analyses including error, goodness-of-fit and robustness assessments. Results: We present CatchAll, a fast, easy-to-use, platform-independent program that computes maximum likelihood estimates for finite-mixture models, weighted linear regression-based analyses and coverage-based non-parametric methods, along with outlier diagnostics. Given sample ‘frequency count’ data, CatchAll computes 12 different diversity estimates and applies a model-selection algorithm. CatchAll also derives discounted diversity estimates to adjust for possibly uncertain low-frequency counts. It is accompanied by an Excel-based graphics program. Availability: Free executable downloads for Linux, Windows and Mac OS, with manual and source code, at www.northeastern.edu/catchall. Contact: jab18@cornell.edu PMID:22333246

  3. Structural characterization of humic-like substances with conventional and surface-enhanced spectroscopic techniques

    NASA Astrophysics Data System (ADS)

    Carletti, Paolo; Roldán, Maria Lorena; Francioso, Ornella; Nardi, Serenella; Sanchez-Cortes, Santiago

    2010-10-01

    Emission-excitation, synchronous fluorescence spectroscopy and surface-enhanced Raman scattering (SERS) combined with surface-enhanced fluorescence (SEF) were applied to aqueous solutions of a humic-like substance (HLS) extracted from earthworm faeces. All measurements were acquired over a wide range of pH (4-12) and analysed by linear regression analysis. Diffuse Reflectance Infrared Fourier Transform (DRIFT) spectra were also acquired to assist in the structural characterization of this HLS. The emission and excitation spectra allowed the identification of two main fluorophores in the analysed sample. Moreover, a close correlation between the fluorescence intensities of each fluorophore and pH variation was observed. SERS and SEF, in agreement with the fluorescence spectroscopy, showed that at low pH values the HLS exists in an aggregated and coiled molecular structure while it is dispersed and uncoiled under alkaline conditions. The obtained spectra also evidenced that different conditions modify the functional groups exposed to the surrounding aqueous environment.

  4. Psychological quality of life and its association with academic employability skills among newly-registered students from three European faculties

    PubMed Central

    2011-01-01

    Background: In accord with new European university reforms initiated by the Bologna Process, our objectives were to assess psychological quality of life (QoL) and to analyse its associations with academic employability skills (AES) among students from the Faculty of Language, Literature, Humanities, Arts and Education, Walferdange, Luxembourg (F1, mostly vocational/applied courses); the Faculty of Social and Human Sciences, Liege, Belgium (F2, mainly general courses); and the Faculty of Social Work, Iasi, Romania (F3, mainly vocational/professional courses). Method: Students who had repeated a year (redoubled) or who had studied at other universities were excluded. 355 newly-registered first-year students (145 from F1, 125 from F2, and 85 from F3) were invited to complete an online questionnaire (in French, German, English or Romanian) covering socioeconomic data, the AES scale and the QoL-psychological, QoL-social relationships and QoL-environment subscales as measured with the World Health Organisation Quality of Life short-form (WHOQoL-BREF) questionnaire. Analyses included multiple regressions with interactions. Results: QoL-psychological, QoL-social relationships and QoL-environment scores were highest in F1 (Luxembourg), and the QoL-psychological score was lowest in F2 (Belgium). AES score was higher in F1 than in F3 (Romania). A positive link was found between QoL-psychological and AES for F1 (correlation coefficient 0.29, p < 0.01) and F3 (correlation coefficient 0.30, p < 0.05), but the association was negative for F2 (correlation coefficient -0.25, p < 0.01). QoL-psychological correlated positively with QoL-social relationships (regression coefficient 0.31, p < 0.001) and QoL-environment (regression coefficient 0.35, p < 0.001). Conclusions: Psychological quality of life is associated with acquisition of skills that increase employability at the faculties offering vocational/applied/professional courses in Luxembourg and Romania, but not at their academically orientated Belgian counterpart. In the context of developing a European Higher Educational Area, these measurements are major indicators that can be used as a guide to promoting programs geared towards counseling, improvement of the social environment, and services to assist with university work and facilitate achievement of future professional projects. PMID:21501507

  5. Psychological quality of life and its association with academic employability skills among newly-registered students from three European faculties.

    PubMed

    Baumann, Michèle; Ionescu, Ion; Chau, Nearkasen

    2011-04-18

    In accord with new European university reforms initiated by the Bologna Process, our objectives were to assess psychological quality of life (QoL) and to analyse its associations with academic employability skills (AES) among students from the Faculty of Language, Literature, Humanities, Arts and Education, Walferdange, Luxembourg (F1, mostly vocational/applied courses); the Faculty of Social and Human Sciences, Liege, Belgium (F2, mainly general courses); and the Faculty of Social Work, Iasi, Romania (F3, mainly vocational/professional courses). Students who had repeated a year (redoubled) or who had studied at other universities were excluded. 355 newly-registered first-year students (145 from F1, 125 from F2, and 85 from F3) were invited to complete an online questionnaire (in French, German, English or Romanian) covering socioeconomic data, the AES scale and the QoL-psychological, QoL-social relationships and QoL-environment subscales as measured with the World Health Organisation Quality of Life short-form (WHOQoL-BREF) questionnaire. Analyses included multiple regressions with interactions. QoL-psychological, QoL-social relationships and QoL-environment scores were highest in F1 (Luxembourg), and the QoL-psychological score was lowest in F2 (Belgium). AES score was higher in F1 than in F3 (Romania). A positive link was found between QoL-psychological and AES for F1 (correlation coefficient 0.29, p<0.01) and F3 (correlation coefficient 0.30, p<0.05), but the association was negative for F2 (correlation coefficient -0.25, p<0.01). QoL-psychological correlated positively with QoL-social relationships (regression coefficient 0.31, p<0.001) and QoL-environment (regression coefficient 0.35, p<0.001). Psychological quality of life is associated with acquisition of skills that increase employability at the faculties offering vocational/applied/professional courses in Luxembourg and Romania, but not at their academically orientated Belgian counterpart. In the context of developing a European Higher Educational Area, these measurements are major indicators that can be used as a guide to promoting programs geared towards counseling, improvement of the social environment, and services to assist with university work and facilitate achievement of future professional projects.

  6. Real estate value prediction using multivariate regression models

    NASA Astrophysics Data System (ADS)

    Manjula, R.; Jain, Shubham; Srivastava, Sharad; Rajiv Kher, Pranav

    2017-11-01

    The real estate market is one of the most competitive in terms of pricing, and prices tend to vary significantly based on many factors; hence it is a prime field in which to apply the concepts of machine learning to optimize and predict prices with high accuracy. Therefore, in this paper we present various important features to use while predicting housing prices with good accuracy. We describe regression models that use various features to achieve a lower residual sum of squares error. When using features in a regression model, some feature engineering is required for better prediction. Often a set of features (multiple regression) or polynomial regression (applying various powers to the features) is used to obtain a better model fit. Because these models are prone to overfitting, ridge regression is used to reduce it. This paper thus points towards the best application of regression models, in addition to other techniques, to optimize the result.
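
    A compact sketch of this strategy (polynomial feature expansion with ridge regularisation, evaluated by residual sum of squares on held-out data) is shown below using scikit-learn; the housing features, coefficients, and penalty value are invented for illustration.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures, StandardScaler
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(4)
        n = 500
        X = np.column_stack([rng.uniform(40, 300, n),    # living area (m^2)
                             rng.integers(1, 6, n),      # number of bedrooms
                             rng.uniform(0, 30, n)])     # distance to city centre (km)
        price = 2000 * X[:, 0] + 15000 * X[:, 1] - 3000 * X[:, 2] + rng.normal(0, 20000, n)

        X_tr, X_te, y_tr, y_te = train_test_split(X, price, random_state=0)
        model = make_pipeline(PolynomialFeatures(degree=2), StandardScaler(), Ridge(alpha=1.0))
        model.fit(X_tr, y_tr)
        rss = ((model.predict(X_te) - y_te) ** 2).sum()  # residual sum of squares on held-out data
        print(f"held-out RSS: {rss:.3e}")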

  7. Missing heritability in the tails of quantitative traits? A simulation study on the impact of slightly altered true genetic models.

    PubMed

    Pütter, Carolin; Pechlivanis, Sonali; Nöthen, Markus M; Jöckel, Karl-Heinz; Wichmann, Heinz-Erich; Scherag, André

    2011-01-01

    Genome-wide association studies have identified robust associations between single nucleotide polymorphisms and complex traits. As the proportion of phenotypic variance explained is still limited for most of the traits, larger and larger meta-analyses are being conducted to detect additional associations. Here we investigate the impact of the study design and the underlying assumption about the true genetic effect in a bimodal mixture situation on the power to detect associations. We performed simulations of quantitative phenotypes analysed by standard linear regression and dichotomized case-control data sets from the extremes of the quantitative trait analysed by standard logistic regression. Using linear regression, markers with an effect in the extremes of the traits were almost undetectable, whereas analysing extremes by case-control design had superior power even for much smaller sample sizes. Two real data examples are provided to support our theoretical findings and to explore our mixture and parameter assumption. Our findings support the idea to re-analyse the available meta-analysis data sets to detect new loci in the extremes. Moreover, our investigation offers an explanation for discrepant findings when analysing quantitative traits in the general population and in the extremes. Copyright © 2011 S. Karger AG, Basel.

  8. A method for fitting regression splines with varying polynomial order in the linear mixed model.

    PubMed

    Edwards, Lloyd J; Stewart, Paul W; MacDougall, James E; Helms, Ronald W

    2006-02-15

    The linear mixed model has become a widely used tool for longitudinal analysis of continuous variables. The use of regression splines in these models offers the analyst additional flexibility in the formulation of descriptive analyses, exploratory analyses and hypothesis-driven confirmatory analyses. We propose a method for fitting piecewise polynomial regression splines with varying polynomial order in the fixed effects and/or random effects of the linear mixed model. The polynomial segments are explicitly constrained by side conditions for continuity and some smoothness at the points where they join. By using a reparameterization of this explicitly constrained linear mixed model, an implicitly constrained linear mixed model is constructed that simplifies implementation of fixed-knot regression splines. The proposed approach is relatively simple, handles splines in one variable or multiple variables, and can be easily programmed using existing commercial software such as SAS or S-plus. The method is illustrated using two examples: an analysis of longitudinal viral load data from a study of subjects with acute HIV-1 infection and an analysis of 24-hour ambulatory blood pressure profiles.
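
    The sketch below illustrates the general idea rather than the authors' reparameterization: a fixed-knot truncated-power spline in the fixed effects of a linear mixed model with a random intercept, fitted with statsmodels on simulated longitudinal data (knot location and variable names are assumptions).

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(5)
        subjects, per_subject, knot = 40, 8, 4.0
        rows = []
        for sid in range(subjects):
            u = rng.normal(0, 2)                          # subject-specific random intercept
            t = np.sort(rng.uniform(0, 10, per_subject))
            y = 10 + 1.5 * t - 2.0 * np.clip(t - knot, 0, None) + u + rng.normal(0, 1, per_subject)
            rows += [{"subject": sid, "time": ti, "y": yi} for ti, yi in zip(t, y)]
        df = pd.DataFrame(rows)

        # Truncated-power spline basis: one slope below the knot, a change in slope above it.
        df["time_after_knot"] = np.clip(df["time"] - knot, 0, None)

        fit = smf.mixedlm("y ~ time + time_after_knot", df, groups=df["subject"]).fit()
        print(fit.params[["time", "time_after_knot"]])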

  9. Classification and regression tree analysis vs. multivariable linear and logistic regression methods as statistical tools for studying haemophilia.

    PubMed

    Henrard, S; Speybroeck, N; Hermans, C

    2015-11-01

    Haemophilia is a rare genetic haemorrhagic disease characterized by partial or complete deficiency of coagulation factor VIII, for haemophilia A, or IX, for haemophilia B. As in any other medical research domain, the field of haemophilia research is increasingly concerned with finding factors associated with binary or continuous outcomes through multivariable models. Traditional models include multiple logistic regressions, for binary outcomes, and multiple linear regressions for continuous outcomes. Yet these regression models are at times difficult to implement, especially for non-statisticians, and can be difficult to interpret. The present paper sought to didactically explain how, why, and when to use classification and regression tree (CART) analysis for haemophilia research. The CART method is non-parametric and non-linear, based on the repeated partitioning of a sample into subgroups based on a certain criterion. Breiman developed this method in 1984. Classification trees (CTs) are used to analyse categorical outcomes and regression trees (RTs) to analyse continuous ones. The CART methodology has become increasingly popular in the medical field, yet only a few examples of studies using this methodology specifically in haemophilia have to date been published. Two examples using CART analysis, previously published in this field, are didactically explained in detail. There is increasing interest in using CART analysis in the health domain, primarily due to its ease of implementation, use, and interpretation, thus facilitating medical decision-making. This method should be promoted for analysing continuous or categorical outcomes in haemophilia, when applicable. © 2015 John Wiley & Sons Ltd.
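
    As a generic illustration of CART (not one of the paper's haemophilia examples), the sketch below grows a small classification tree with scikit-learn and prints its partitioning rules; the variables, thresholds, and outcome rule are invented.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(6)
        n = 400
        age = rng.uniform(10, 70, n)
        factor_level = rng.uniform(0, 40, n)      # hypothetical residual clotting-factor activity (%)
        # Hypothetical binary outcome driven by simple threshold rules.
        bleed = ((factor_level < 1) | ((age > 40) & (factor_level < 5))).astype(int)

        tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20, random_state=0)
        tree.fit(np.column_stack([age, factor_level]), bleed)
        print(export_text(tree, feature_names=["age", "factor_level"]))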

  10. Empirical Likelihood in Nonignorable Covariate-Missing Data Problems.

    PubMed

    Xie, Yanmei; Zhang, Biao

    2017-04-20

    Missing covariate data occurs often in regression analysis, which frequently arises in the health and social sciences as well as in survey sampling. We study methods for the analysis of a nonignorable covariate-missing data problem in an assumed conditional mean function when some covariates are completely observed but other covariates are missing for some subjects. We adopt the semiparametric perspective of Bartlett et al. (Improving upon the efficiency of complete case analysis when covariates are MNAR. Biostatistics 2014;15:719-30) on regression analyses with nonignorable missing covariates, in which they have introduced the use of two working models, the working probability model of missingness and the working conditional score model. In this paper, we study an empirical likelihood approach to nonignorable covariate-missing data problems with the objective of effectively utilizing the two working models in the analysis of covariate-missing data. We propose a unified approach to constructing a system of unbiased estimating equations, where there are more equations than unknown parameters of interest. One useful feature of these unbiased estimating equations is that they naturally incorporate the incomplete data into the data analysis, making it possible to seek efficient estimation of the parameter of interest even when the working regression function is not specified to be the optimal regression function. We apply the general methodology of empirical likelihood to optimally combine these unbiased estimating equations. We propose three maximum empirical likelihood estimators of the underlying regression parameters and compare their efficiencies with other existing competitors. We present a simulation study to compare the finite-sample performance of various methods with respect to bias, efficiency, and robustness to model misspecification. The proposed empirical likelihood method is also illustrated by an analysis of a data set from the US National Health and Nutrition Examination Survey (NHANES).

  11. Applying Kaplan-Meier to Item Response Data

    ERIC Educational Resources Information Center

    McNeish, Daniel

    2018-01-01

    Some IRT models can be equivalently modeled in alternative frameworks such as logistic regression. Logistic regression can also model time-to-event data, which concerns the probability of an event occurring over time. Using the relation between time-to-event models and logistic regression and the relation between logistic regression and IRT, this…

  12. Height-income association in developing countries: Evidence from 14 countries.

    PubMed

    Patel, Pankaj C; Devaraj, Srikant

    2017-12-28

    The purpose of this study was to assess whether the height-income association is positive in developing countries, and whether income differences between shorter and taller individuals in developing countries are explained by differences in endowment (ie, taller individuals have a higher income than shorter individuals because of characteristics such as better social skills) or due to discrimination (ie, shorter individuals have a lower income despite having comparable characteristics). Instrumental variable regression, Oaxaca-Blinder decomposition, quantile regression, and quantile decomposition analyses were applied to a sample of 45 108 respondents from 14 developing countries represented in the Research on Early Life and Aging Trends and Effects (RELATE) study. For a one-centimeter increase in country- and sex-adjusted median height, real income adjusted for purchasing power parity increased by 1.37%. The income differential between shorter and taller individuals was explained by discrimination and not by differences in endowments; however, the effect of discrimination decreased at higher values of country- and sex-adjusted height. Taller individuals in developing countries may realize higher income despite having characteristics similar to those of shorter individuals. © 2017 Wiley Periodicals, Inc.
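
    The sketch below illustrates a two-fold Oaxaca-Blinder decomposition written out with two OLS fits in statsmodels, splitting the mean log-income gap into an endowment (explained) part and an unexplained part; the data, single covariate, and 'tall' indicator are simulated placeholders, not the RELATE sample.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 2000
        tall = rng.integers(0, 2, n)
        educ = rng.normal(8, 3, n) + 0.5 * tall           # built-in endowment difference
        log_income = 1.0 + 0.08 * educ + 0.05 * tall + rng.normal(0, 0.5, n)

        X = sm.add_constant(educ)
        fit_tall = sm.OLS(log_income[tall == 1], X[tall == 1]).fit()
        fit_short = sm.OLS(log_income[tall == 0], X[tall == 0]).fit()

        xbar_tall, xbar_short = X[tall == 1].mean(axis=0), X[tall == 0].mean(axis=0)
        gap = log_income[tall == 1].mean() - log_income[tall == 0].mean()
        explained = (xbar_tall - xbar_short) @ fit_short.params         # endowments
        unexplained = xbar_tall @ (fit_tall.params - fit_short.params)  # "discrimination" component
        print(f"gap {gap:.3f} = explained {explained:.3f} + unexplained {unexplained:.3f}")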

  13. Is economic inequality in infant mortality higher in urban than in rural India?

    PubMed

    Kumar, Abhishek; Singh, Abhishek

    2014-11-01

    This paper examines the trends in economic inequality in infant mortality across urban-rural residence in India over the last 14 years. We analysed data from the three successive rounds of the National Family Health Survey conducted in India during 1992-1993, 1998-1999, and 2005-2006. An asset-based household wealth index was used as the economic indicator for the study. The concentration index and pooled logistic regression analysis were applied to measure the extent of economic inequality in infant mortality in urban and rural India. The infant mortality rate differs considerably by urban-rural residence, with infant mortality in rural India being substantially higher than that in urban India. The findings suggest that economic inequalities are higher in urban than in rural India in each of the three survey rounds. Pooled logistic regression results suggest that, in urban areas, infant mortality has declined by 22% among the poorest and 43% among the richest. In comparison, the decline is 29% and 32%, respectively, in rural India. Economic inequality in infant mortality has widened more in urban than in rural India over the last two decades.
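
    For reference, the concentration index used in such analyses can be computed as twice the covariance between the health variable and the household's fractional wealth rank, divided by the variable's mean; the short sketch below does this on simulated placeholder data.

        import numpy as np

        rng = np.random.default_rng(8)
        n = 5000
        wealth = rng.lognormal(0, 1, n)                        # household wealth index (simulated)
        p_death = np.clip(0.08 - 0.015 * np.log(wealth), 0.005, None)
        infant_death = rng.binomial(1, p_death)                # 1 = infant died (simulated)

        # Fractional rank by wealth: poorest household close to 0, richest close to 1.
        rank = (np.argsort(np.argsort(wealth)) + 0.5) / n
        ci = 2 * np.cov(infant_death, rank, bias=True)[0, 1] / infant_death.mean()
        print(f"concentration index: {ci:.3f}  (negative = concentrated among the poor)")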

  14. Impact of an equality constraint on the class-specific residual variances in regression mixtures: A Monte Carlo simulation study.

    PubMed

    Kim, Minjung; Lamont, Andrea E; Jaki, Thomas; Feaster, Daniel; Howe, George; Van Horn, M Lee

    2016-06-01

    Regression mixture models are a novel approach to modeling the heterogeneous effects of predictors on an outcome. In the model-building process, often residual variances are disregarded and simplifying assumptions are made without thorough examination of the consequences. In this simulation study, we investigated the impact of an equality constraint on the residual variances across latent classes. We examined the consequences of constraining the residual variances on class enumeration (finding the true number of latent classes) and on the parameter estimates, under a number of different simulation conditions meant to reflect the types of heterogeneity likely to exist in applied analyses. The results showed that bias in class enumeration increased as the difference in residual variances between the classes increased. Also, an inappropriate equality constraint on the residual variances greatly impacted on the estimated class sizes and showed the potential to greatly affect the parameter estimates in each class. These results suggest that it is important to make assumptions about residual variances with care and to carefully report what assumptions are made.

  15. Undergraduates' intentions to take a second language proficiency test: a comparison of predictions from the theory of planned behavior and social cognitive theory.

    PubMed

    Lin, Bih-Jiau; Chiou, Wen-Bin

    2010-06-01

    English competency has become essential for obtaining a better job or succeeding in higher education in Taiwan. Thus, passing the General English Proficiency Test is important for college students in Taiwan. The current study applied Ajzen's theory of planned behavior and the notions of outcome expectancy and self-efficacy from Bandura's social cognitive theory to investigate college students' intentions to take the General English Proficiency Test. The formal sample consisted of 425 undergraduates (217 women, 208 men; M age = 19.5 yr., SD = 1.3). The theory of planned behavior showed greater predictive ability (R2 = 33%) of intention than the social cognitive theory (R2 = 7%) in regression analysis and made a unique contribution to prediction of actual test-taking behavior one year later in logistic regression. Within-model analyses indicated that subjective norm in theory of planned behavior and outcome expectancy in social cognitive theory are crucial factors in predicting intention. Implications for enhancing undergraduates' intentions to take the English proficiency test are discussed.

  16. Peak-flow characteristics of Wyoming streams

    USGS Publications Warehouse

    Miller, Kirk A.

    2003-01-01

    Peak-flow characteristics for unregulated streams in Wyoming are described in this report. Frequency relations for annual peak flows through water year 2000 at 364 streamflow-gaging stations in and near Wyoming were evaluated and revised or updated as needed. Analyses of historical floods, temporal trends, and generalized skew were included in the evaluation. Physical and climatic basin characteristics were determined for each gaging station using a geographic information system. Gaging stations with similar peak-flow and basin characteristics were grouped into six hydrologic regions. Regional statistical relations between peak-flow and basin characteristics were explored using multiple-regression techniques. Generalized least squares regression equations for estimating magnitudes of annual peak flows with selected recurrence intervals from 1.5 to 500 years were developed for each region. Average standard errors of estimate range from 34 to 131 percent. Average standard errors of prediction range from 35 to 135 percent. Several statistics for evaluating and comparing the errors in these estimates are described. Limitations of the equations are described. Methods for applying the regional equations for various circumstances are listed and examples are given.

  17. A procedure for scaling sensory attributes based on multidimensional measurements: application to sensory sharpness of kitchen knives

    NASA Astrophysics Data System (ADS)

    Takatsuji, Toshiyuki; Tanaka, Ken-ichi

    1996-06-01

    A procedure is derived by which sensory attributes can be scaled as a function of various physical and/or chemical properties of the object to be tested. This procedure consists of four successive steps: (i) design and experiment, (ii) fabrication of specimens according to the design parameters, (iii) assessment of a sensory attribute using sensory evaluation and (iv) derivation of the relationship between the parameters and the sensory attribute. In these steps an experimental design using orthogonal arrays, analysis of variance and regression analyses are used strategically. When a specimen with the design parameters cannot be physically fabricated, an alternative specimen having parameters closest to the design is selected from a group of specimens which can be physically made. The influence of the deviation of actual parameters from the desired ones is also discussed. A method of confirming the validity of the regression equation is also investigated. The procedure is applied to scale the sensory sharpness of kitchen knives as a function of the edge angle and the roughness of the cutting edge.

  18. Unit Cohesion and the Surface Navy: Does Cohesion Affect Performance

    DTIC Science & Technology

    1989-12-01

    Indexed excerpt from the report's reference list: Neter, J., Wasserman, W., and Kutner, M. H., Applied Linear Regression Models, 2d ed., Boston, MA: Irwin, 1989; Rand Corporation R-2607; SAS User's Guide: Basics, Version 5 ed.

  19. Comparison of Selection Procedures and Validation of Criterion Used in Selection of Significant Control Variates of a Simulation Model

    DTIC Science & Technology

    1990-03-01

    Indexed excerpt from the report's reference list: Neter, J., W. Wasserman, and M.H. Kutner, Applied Linear Regression Models, Homewood, IL: Richard D. Irwin Inc., 1983; Pritsker, A. Alan B., Introduction to Simulation and SLAM; "Control Variates in Simulation," European Journal of Operational Research, 42 (1989).

  20. Some Applied Research Concerns Using Multiple Linear Regression Analysis.

    ERIC Educational Resources Information Center

    Newman, Isadore; Fraas, John W.

    The intention of this paper is to provide an overall reference on how a researcher can apply multiple linear regression in order to utilize the advantages that it has to offer. The advantages and some concerns expressed about the technique are examined. A number of practical ways by which researchers can deal with such concerns as…

  1. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    PubMed Central

    Hoare, Alexander; Regan, David G; Wilson, David P

    2008-01-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
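
    The toolbox itself is Matlab-based; purely as an illustration of the kind of workflow it supports, the Python sketch below draws a Latin hypercube sample of parameter space (requires scipy >= 1.7 for scipy.stats.qmc) and computes standardized regression coefficients as a regression-based sensitivity measure for a toy epidemic quantity. The parameter names, ranges, and output formula are assumptions.

        import numpy as np
        from scipy.stats import qmc

        names = ["beta", "gamma", "contact_rate"]
        lower, upper = np.array([0.1, 0.05, 2.0]), np.array([0.5, 0.5, 20.0])

        # Latin hypercube sample of the three-dimensional parameter space.
        sampler = qmc.LatinHypercube(d=3, seed=0)
        params = qmc.scale(sampler.random(n=500), lower, upper)

        # Toy model output: basic reproduction number R0 = beta * contact_rate / gamma.
        r0 = params[:, 0] * params[:, 2] / params[:, 1]

        # Standardized regression coefficients (SRCs) as a sensitivity measure.
        Z = (params - params.mean(axis=0)) / params.std(axis=0)
        y = (r0 - r0.mean()) / r0.std()
        src, *_ = np.linalg.lstsq(Z, y, rcond=None)
        for name, coef in zip(names, src):
            print(f"{name}: SRC = {coef:+.2f}")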

  2. The influence of mid-latitude storm tracks on hot, cold, dry and wet extremes

    PubMed Central

    Lehmann, Jascha; Coumou, Dim

    2015-01-01

    Changes in mid-latitude circulation can strongly affect the number and intensity of extreme weather events. In particular, high-amplitude quasi-stationary planetary waves have been linked to prolonged weather extremes at the surface. In contrast, analyses of fast-traveling synoptic-scale waves and their direct influence on heat and cold extremes are scarce though changes in such waves have been detected and are projected for the 21st century. Here we apply regression analyses of synoptic activity with surface temperature and precipitation in monthly gridded observational data. We show that over large parts of mid-latitude continental regions, summer heat extremes are associated with low storm track activity. In winter, the occurrence of cold spells is related to low storm track activity over parts of eastern North America, Europe, and central- to eastern Asia. Storm tracks thus have a moderating effect on continental temperatures. Pronounced storm track activity favors monthly rainfall extremes throughout the year, whereas dry spells are associated with a lack thereof. Trend analyses reveal significant regional changes in recent decades favoring the occurrence of cold spells in the eastern US, droughts in California and heat extremes over Eurasia. PMID:26657163

  3. Clustering of dietary intake and sedentary behavior in 2-year-old children.

    PubMed

    Gubbels, Jessica S; Kremers, Stef P J; Stafleu, Annette; Dagnelie, Pieter C; de Vries, Sanne I; de Vries, Nanne K; Thijs, Carel

    2009-08-01

    To examine clustering of energy balance-related behaviors (EBRBs) in young children. This is crucial because lifestyle habits are formed at an early age and track in later life. This study is the first to examine EBRB clustering in children as young as 2 years. Cross-sectional data originated from the Child, Parent and Health: Lifestyle and Genetic Constitution (KOALA) Birth Cohort Study. Parents of 2578 2-year-old children completed a questionnaire. Correlation analyses, principal component analyses, and linear regression analyses were performed to examine clustering of EBRBs. We found modest but consistent correlations in EBRBs. Two clusters emerged: a "sedentary-snacking cluster" and a "fiber cluster." Television viewing clustered with computer use and unhealthy dietary behaviors. Children who frequently consumed vegetables also consumed fruit and brown bread more often and white bread less often. Lower maternal education and maternal obesity were associated with high scores on the sedentary-snacking cluster, whereas higher educational level was associated with high fiber cluster scores. Obesity-prone behavioral clusters are already visible in 2-year-old children and are related to maternal characteristics. The findings suggest that obesity prevention should apply an integrated approach to physical activity and dietary intake in early childhood.

  4. Using exceedance probabilities to detect anomalies in routinely recorded animal health data, with particular reference to foot-and-mouth disease in Viet Nam.

    PubMed

    Richards, K K; Hazelton, M L; Stevenson, M A; Lockhart, C Y; Pinto, J; Nguyen, L

    2014-10-01

    The widespread availability of computer hardware and software for recording and storing disease event information means that, in theory, we have the necessary information to carry out detailed analyses of factors influencing the spatial distribution of disease in animal populations. However, the reliability of such analyses depends on data quality, with anomalous records having the potential to introduce significant bias and lead to inappropriate decision making. In this paper we promote the use of exceedance probabilities as a tool for detecting anomalies when applying hierarchical spatio-temporal models to animal health data. We illustrate this methodology through a case study of data on outbreaks of foot-and-mouth disease (FMD) in Viet Nam for the period 2006-2008. A flexible binomial logistic regression was employed to model the number of FMD-infected communes within each province of the country. Standard analyses of the residuals from this model failed to identify problems, but exceedance probabilities identified provinces in which the number of reported FMD outbreaks was unexpectedly low. This finding is interesting given that these provinces are on major cattle movement pathways through Viet Nam. Copyright © 2014 Elsevier Ltd. All rights reserved.
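
    A minimal sketch of the exceedance-probability idea is given below, assuming posterior predictive draws from a fitted model are already available (here simulated): for each province the probability that a model-predicted count is at most the observed count is computed, and very small values flag unexpectedly low reporting. All counts and province labels are placeholders.

        import numpy as np

        rng = np.random.default_rng(10)
        n_draws, n_provinces = 4000, 5
        # Simulated posterior draws of the expected number of affected communes per province.
        expected = rng.gamma(shape=[4, 6, 2, 8, 5], scale=1.0, size=(n_draws, n_provinces))
        posterior_counts = rng.poisson(expected)          # posterior predictive draws
        observed = np.array([5, 6, 0, 1, 4])              # reported outbreak counts (placeholder)

        # Probability that a model-predicted count is at most the observed count;
        # values near 0 flag provinces reporting unexpectedly few outbreaks.
        exceedance = (posterior_counts <= observed).mean(axis=0)
        for i, p in enumerate(exceedance):
            flag = "  <-- anomalously low reporting?" if p < 0.05 else ""
            print(f"province {i}: Pr(predicted <= observed) = {p:.3f}{flag}")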

  5. Analyzing the influence of admissions criteria and cultural norms on success in an international dental studies program.

    PubMed

    Itaya, Lisa E; Chambers, David W; King, Patricia A

    2008-03-01

    This study determines the extent to which admissions criteria and cultural norms predict the success of a foreign-trained dentist in a United States dental educational program. Correlation and regression tests were applied to eleven years (1994 to 2004) of retrospective admissions data for 144 International Dental Studies Program students. Five cultural norms were derived from the collective cultural dimensions of a scholarly work of validated multinational surveys by Geert Hofstede. These five cultural norms are Power Distance (degree of inequality between "haves" and "have-nots" in a culture); Individualism (support for independent or group behavior); Long-Term View (deferred gratification versus quick results/rewards); Masculinity (emphasis on performance/outcomes versus socialization); and Uncertainty Avoidance (ability to cope with an uncertain future). Hofstede's calculated country scores on these cultural dimensions were applied to the students' countries of education, and their influence on students' academic performance was assessed by correlation and regression analyses. Results showed that the TOEFL and National Board Part I examinations and the cultural norm of Long-Term View were the most positive predictors of grade point averages. The other four cultural norms studied were not predictors of success. Those who applied to the program more than once before being accepted did less well in the program, yet "less well" might have meant that they graduated with a 3.0 instead of a 3.5 GPA. Generally speaking, the more recent the graduating class, the higher the ending GPA has been. Admissions committees should determine if they want to invest the resources required to implement a multitude of admissions predictors to find the best of the qualified applicants.

  6. The applicability of dental wear in age estimation for a modern American population.

    PubMed

    Faillace, Katie E; Bethard, Jonathan D; Marks, Murray K

    2017-12-01

    Though applied in bioarchaeology, dental wear is an underexplored age indicator in the biological anthropology of contemporary populations, although research has been conducted on dental attrition in forensic contexts (Kim et al., Journal of Forensic Sciences, 45, 303; Prince et al., Journal of Forensic Sciences, 53, 588; Yun et al., Journal of Forensic Sciences, 52, 678). The purpose of this study is to apply and adapt existing techniques for age estimation based on dental wear to a modern American population, with the aim of producing accurate age range estimates for individuals from an industrialized context. Methodologies following Yun and Prince were applied to a random sample from the University of New Mexico (n = 583) and Universidade de Coimbra (n = 50) cast and skeletal collections. Analysis of variance (ANOVA) and linear regression analyses were conducted to examine the relationship between tooth wear scores and age. Application of both the Yun et al. and Prince et al. methodologies resulted in inaccurate age estimates. Recalibrated sectioning points correctly classified individuals as over or under 50 years for 88% of the sample. Linear regression demonstrated that 60% of age estimates fell within ±10 years of the actual age, and accuracy improved for individuals under 45 years, with 74% of predictions within ±10 years. This study demonstrates that age estimation from dental wear is possible for modern populations, with age intervals comparable to other established methods. It provides a quantifiable method of seriation into "older" and "younger" adult categories, and provides more reliable age interval estimates than cranial sutures in instances where only the skull is available. © 2017 Wiley Periodicals, Inc.

  7. Co-Occurring Psychosocial Problems and HIV Risk Among Women Attending Drinking Venues in a South African Township: A Syndemic Approach

    PubMed Central

    Pitpitan, Eileen V.; Kalichman, Seth C.; Eaton, Lisa A.; Cain, Demetria; Sikkema, Kathleen J.; Watt, Melissa H.; Skinner, Donald; Pieterse, Desiree

    2012-01-01

    Background In South Africa, women comprise the majority of HIV infections. Syndemics, or co-occurring epidemics and risk factors, have been applied to understanding HIV risk among marginalized groups. Purpose To apply the syndemic framework to examine psychosocial problems that co-occur among women attending drinking venues in South Africa, and to test how the co-occurrence of these problems may exacerbate risk for HIV infection. Method 560 women from a Cape Town township provided data on multiple psychosocial problems, including food insufficiency, depression, abuse experiences, problem drinking, and sexual behaviors. Results Bivariate associations among the syndemic factors showed a high degree of co-occurrence and regression analyses showed an additive effect of psychosocial problems on HIV risk behaviors. Conclusions These results demonstrate the utility of a syndemic framework to understand co-occurring psychosocial problems among women in South Africa. HIV prevention interventions should consider the compounding effects of psychosocial problems among women. PMID:23054944

  8. Everything should be as simple as possible, but no simpler: towards a protocol for accumulating evidence regarding the active content of health behaviour change interventions.

    PubMed

    Peters, Gjalt-Jorn Ygram; de Bruin, Marijn; Crutzen, Rik

    2015-01-01

    There is a need to consolidate the evidence base underlying our toolbox of methods of behaviour change. Recent efforts to this effect have conducted meta-regressions on evaluations of behaviour change interventions, deriving each method's effectiveness from its association to intervention effect size. However, there are a range of issues that raise concern about whether this approach is actually furthering or instead obstructing the advancement of health psychology theories and the quality of health behaviour change interventions. Using examples from theory, the literature and data from previous meta-analyses, these concerns and their implications are explained and illustrated. An iterative protocol for evidence base accumulation is proposed that integrates evidence derived from both experimental and applied behaviour change research, and combines theory development in experimental settings with theory testing in applied real-life settings. As evidence gathered in this manner accumulates, a cumulative science of behaviour change can develop.

  9. Everything should be as simple as possible, but no simpler: towards a protocol for accumulating evidence regarding the active content of health behaviour change interventions

    PubMed Central

    Peters, Gjalt-Jorn Ygram; de Bruin, Marijn; Crutzen, Rik

    2015-01-01

    There is a need to consolidate the evidence base underlying our toolbox of methods of behaviour change. Recent efforts to this effect have conducted meta-regressions on evaluations of behaviour change interventions, deriving each method's effectiveness from its association to intervention effect size. However, there are a range of issues that raise concern about whether this approach is actually furthering or instead obstructing the advancement of health psychology theories and the quality of health behaviour change interventions. Using examples from theory, the literature and data from previous meta-analyses, these concerns and their implications are explained and illustrated. An iterative protocol for evidence base accumulation is proposed that integrates evidence derived from both experimental and applied behaviour change research, and combines theory development in experimental settings with theory testing in applied real-life settings. As evidence gathered in this manner accumulates, a cumulative science of behaviour change can develop. PMID:25793484

  10. Accessibility of fast food outlets is associated with fast food intake. A study in the Capital Region of Denmark.

    PubMed

    Bernsdorf, Kamille Almer; Lau, Cathrine Juel; Andreasen, Anne Helms; Toft, Ulla; Lykke, Maja; Glümer, Charlotte

    2017-11-01

    Literature suggests that people living in areas with a wealth of unhealthy fast food options may show higher levels of fast food intake. Multilevel logistic regression analyses were applied to examine the association between GIS-located fast food outlets (FFOs) and self-reported fast food intake among adults (+ 16 years) in the Capital Region of Denmark (N = 48,305). Accessibility of FFOs was measured both as proximity (distance to nearest FFO) and density (number of FFOs within a 1km network buffer around home). Odds of fast food intake ≥ 1/week increased significantly with increasing FFO density and decreased significantly with increasing distance to the nearest FFO for distances ≤ 4km. For long distances (>4km), odds increased with increasing distance, although this applied only for car owners. Results suggest that Danish health promotion strategies need to consider the contribution of the built environment to unhealthy eating. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. An application of Six Sigma methodology to turnover intentions in health care.

    PubMed

    Taner, Mehmet

    2009-01-01

    The purpose of this study is to show how the principles of Six Sigma can be applied to the high turnover problem of doctors in medical emergency services and paramedic backup. Six Sigma's define-measure-analyse-improve-control (DMAIC) cycle is applied to reducing the turnover rate of doctors in an organisation operating in emergency services. Variables of the model are determined. Exploratory factor analysis, multiple regression, analysis of variance (ANOVA) and Gage R&R are employed for the analysis. Personal burnout/stress and dissatisfaction with salary were found to be the "vital few" variables. The organisation took a new approach by improving doctors' working conditions. The sigma level of the process increased. New policy and process changes have been found to effectively decrease the incidence of turnover intentions. The improved process has been standardised and institutionalised. This study is one of the few papers in the literature that elaborates on the turnover problem of doctors working in emergency and paramedic backup services.

  12. Leadership styles of nurse managers in ethical dilemmas: Reasons and consequences.

    PubMed

    Zydziunaite, Vilma; Suominen, Tarja

    2014-01-01

    Background: Understanding the reasons and consequences of leadership styles in ethical dilemmas is fundamental to exploring nurse managers' abilities to influence outcomes for patients and nursing personnel. The aim was to explain the associations between different leadership styles, the reasons for their application, and their consequences when nurse managers make decisions in ethical dilemmas. The data were collected between 15 October 2011 and 30 April 2012 by a statistically validated questionnaire. The respondents (N = 278) were nurse managers. The data were analysed using SPSS 20.0, calculating Spearman's correlations, stepwise regression and ANOVA. The reasons for applying different leadership styles in ethical dilemmas include personal characteristics, years in the work position, institutional factors, and the professional authority of nurse managers. The leadership styles applied in ethical dilemmas are associated with consequences for the satisfaction of patients', relatives' and nurse managers' needs. Nurse managers exhibited leadership styles oriented to maintenance, focussing more on 'doing the job' than on managing decision-making in ethical dilemmas.

  13. Regression Analysis: Legal Applications in Institutional Research

    ERIC Educational Resources Information Center

    Frizell, Julie A.; Shippen, Benjamin S., Jr.; Luna, Andrew L.

    2008-01-01

    This article reviews multiple regression analysis, describes how its results should be interpreted, and instructs institutional researchers on how to conduct such analyses using an example focused on faculty pay equity between men and women. The use of multiple regression analysis will be presented as a method with which to compare salaries of…

  14. Regression Effects in Angoff Ratings: Examples from Credentialing Exams

    ERIC Educational Resources Information Center

    Wyse, Adam E.

    2018-01-01

    This article discusses regression effects that are commonly observed in Angoff ratings where panelists tend to think that hard items are easier than they are and easy items are more difficult than they are in comparison to estimated item difficulties. Analyses of data from two credentialing exams illustrate these regression effects and the…

  15. A health economic model to determine the long-term costs and clinical outcomes of raising low HDL-cholesterol in the prevention of coronary heart disease.

    PubMed

    Roze, S; Liens, D; Palmer, A; Berger, W; Tucker, D; Renaudin, C

    2006-12-01

    The aim of this study was to describe a health economic model developed to project lifetime clinical and cost outcomes of lipid-modifying interventions in patients not reaching target lipid levels and to assess the validity of the model. The internet-based, computer simulation model is made up of two decision analytic sub-models, the first utilizing Monte Carlo simulation, and the second applying Markov modeling techniques. Monte Carlo simulation generates a baseline cohort for long-term simulation by assigning an individual lipid profile to each patient, and applying the treatment effects of interventions under investigation. The Markov model then estimates the long-term clinical (coronary heart disease events, life expectancy, and quality-adjusted life expectancy) and cost outcomes up to a lifetime horizon, based on risk equations from the Framingham study. Internal and external validation analyses were performed. The results of the model validation analyses, plotted against corresponding real-life values from Framingham, 4S, AFCAPS/TexCAPS, and a meta-analysis by Gordon et al., showed that the majority of values were close to the y = x line, which indicates a perfect fit. The R2 value was 0.9575 and the gradient of the regression line was 0.9329, both very close to the perfect fit (= 1). Validation analyses of the computer simulation model suggest the model is able to recreate the outcomes from published clinical studies and would be a valuable tool for the evaluation of new and existing therapy options for patients with persistent dyslipidemia.

  16. International law's effects on health and its social determinants: protocol for a systematic review, meta-analysis, and meta-regression analysis.

    PubMed

    Hoffman, Steven J; Hughsam, Matthew; Randhawa, Harkanwal; Sritharan, Lathika; Guyatt, Gordon; Lavis, John N; Røttingen, John-Arne

    2016-04-16

    In recent years, there have been numerous calls for global institutions to develop and enforce new international laws. International laws are, however, often blunt instruments with many uncertain benefits, costs, risks of harm, and trade-offs. Thus, they are probably not always appropriate solutions to global health challenges. Given these uncertainties and international law's potential importance for improving global health, the paucity of synthesized evidence addressing whether international laws achieve their intended effects or whether they are superior in comparison to other approaches is problematic. Ten electronic bibliographic databases were searched using predefined search strategies, including MEDLINE, Global Health, CINAHL, Applied Social Sciences Index and Abstracts, Dissertations and Theses, International Bibliography of Social Sciences, International Political Science Abstracts, Social Sciences Abstracts, Social Sciences Citation Index, PAIS International, and Worldwide Political Science Abstracts. Two reviewers will independently screen titles and abstracts using predefined inclusion criteria. Pairs of reviewers will then independently screen the full-text of articles for inclusion using predefined inclusion criteria and then independently extract data and assess risk of bias for included studies. Where feasible, results will be pooled through subgroup analyses, meta-analyses, and meta-regression techniques. The findings of this review will contribute to a better understanding of the expected benefits and possible harms of using international law to address different kinds of problems, thereby providing important evidence-informed guidance on when and how it can be effectively introduced and implemented by countries and global institutions. PROSPERO CRD42015019830.

  17. Publication bias in obesity treatment trials?

    PubMed

    Allison, D B; Faith, M S; Gorman, B S

    1996-10-01

    The present investigation examined the extent of publication bias (namely, the tendency to publish significant findings and file away non-significant findings) within the obesity treatment literature. Quantitative literature synthesis of four published meta-analyses from the obesity treatment literature. Interventions in these studies included pharmacological, educational, child, and couples treatments. To assess publication bias, several regression procedures (for example, weighted least-squares, random-effects multi-level modeling, and robust regression methods) were used to regress effect sizes onto their standard errors, or proxies thereof, within each of the four meta-analyses. A significant positive beta weight in these analyses signified publication bias. There was evidence for publication bias within two of the four published meta-analyses, such that reviews of published studies were likely to overestimate clinical efficacy. The lack of evidence for publication bias within the two other meta-analyses might have been due to insufficient statistical power rather than the absence of selection bias. As in other disciplines, publication bias appears to exist in the obesity treatment literature. Suggestions are offered for managing publication bias once identified or reducing its likelihood in the first place.
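
    One of the regression procedures the authors list, weighted least-squares regression of effect sizes on their standard errors, can be sketched as follows. The data are synthetic, and the inverse-variance weighting shown here is an assumption for illustration rather than the exact specification used in the paper.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)

      # Synthetic meta-analytic data: per-study effect sizes and standard errors,
      # with a small-study effect deliberately built in.
      se = rng.uniform(0.05, 0.40, size=30)
      effect = 0.3 + 1.2 * se + rng.normal(0, 0.05, size=30)

      # Weighted least squares: regress effect size on standard error,
      # weighting each study by its inverse variance (1 / SE^2).
      X = sm.add_constant(se)
      fit = sm.WLS(effect, X, weights=1.0 / se**2).fit()

      # A significantly positive slope on SE is taken as evidence of publication bias.
      print(fit.params)      # [intercept, slope]
      print(fit.pvalues[1])  # p-value for the slope on standard error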

  18. Factors associated with participation frequency and satisfaction among people applying for a housing adaptation grant.

    PubMed

    Thordardottir, Björg; Ekstam, Lisa; Chiatti, Carlos; Fänge, Agneta Malmgren

    2016-09-01

    People applying for a housing adaptation (HA) grant are at great risk of participation restrictions due to declining capacity and environmental barriers. To investigate the association of person-, environment-, and activity-related factors with participation frequency and satisfaction among people applying for a housing adaptation grant. Baseline cross-sectional data were collected during home visits (n = 128). The association between person-, environment-, and activity-related factors and participation frequency and satisfaction was analysed using logistic regressions. The main result is that frequency of participation outside the home is strongly associated with dependence in activities of daily living (ADL) and cognitive impairments, while satisfaction with participation outside the home is strongly associated with self-reported health. Moreover, aspects of usability in the home were associated with frequency of participation outside the home and satisfaction with participation in the home and outside the home alone. Dependence in ADL, cognitive impairments, self-rated health, and aspects of usability are important factors contributing to participation frequency and satisfaction among people applying for a housing adaptation grant, particularly outside the home. Our findings indicate that more attention should be directed towards activity-related factors to facilitate participation among HA applicants, inside and outside the home.

  19. Water quality parameter measurement using spectral signatures

    NASA Technical Reports Server (NTRS)

    White, P. E.

    1973-01-01

    Regression analysis is applied to the problem of measuring water quality parameters from remote sensing spectral signature data. The equations necessary to perform regression analysis are presented and methods of testing the strength and reliability of a regression are described. An efficient algorithm for selecting an optimal subset of the independent variables available for a regression is also presented.
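
    The report's own subset-selection algorithm is not reproduced here; the sketch below is a brute-force stand-in that ranks every subset of candidate predictors by adjusted R^2, which conveys the idea of choosing an optimal subset of independent variables for a regression. The data and variable names are hypothetical.

      import itertools
      import numpy as np

      def adjusted_r2(X, y):
          """Ordinary least-squares fit with intercept; returns adjusted R^2."""
          n, k = X.shape
          Xc = np.column_stack([np.ones(n), X])
          beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
          resid = y - Xc @ beta
          r2 = 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
          return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

      def best_subset(X, y, names):
          """Exhaustive search over predictor subsets, ranked by adjusted R^2."""
          best = (-np.inf, ())
          for r in range(1, X.shape[1] + 1):
              for combo in itertools.combinations(range(X.shape[1]), r):
                  score = adjusted_r2(X[:, combo], y)
                  if score > best[0]:
                      best = (score, tuple(names[i] for i in combo))
          return best

      # Illustrative synthetic data: four candidate spectral-band predictors.
      rng = np.random.default_rng(1)
      X = rng.normal(size=(60, 4))
      y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=60)
      print(best_subset(X, y, names=["band1", "band2", "band3", "band4"]))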

  20. The Application of the Cumulative Logistic Regression Model to Automated Essay Scoring

    ERIC Educational Resources Information Center

    Haberman, Shelby J.; Sinharay, Sandip

    2010-01-01

    Most automated essay scoring programs use a linear regression model to predict an essay score from several essay features. This article applied a cumulative logit model instead of the linear regression model to automated essay scoring. Comparison of the performances of the linear regression model and the cumulative logit model was performed on a…
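
    A minimal sketch of the contrast the article draws, fitting a cumulative logit (proportional-odds) model rather than a linear regression to ordinal essay scores, might look like the following. It assumes the OrderedModel class available in recent statsmodels releases and uses invented features, not the scoring-engine features studied in the article.

      import numpy as np
      import pandas as pd
      from statsmodels.miscmodels.ordinal_model import OrderedModel

      rng = np.random.default_rng(2)
      n = 400

      # Synthetic essay features and ordinal scores on a 0-4 scale.
      features = pd.DataFrame({
          "word_count": rng.normal(300, 80, n),
          "grammar_errors": rng.poisson(5, n),
      })
      latent = (0.01 * features["word_count"]
                - 0.3 * features["grammar_errors"]
                + rng.logistic(size=n))
      score = pd.cut(latent, bins=5, labels=False).to_numpy()  # ordered categories 0..4

      # Cumulative logit (proportional-odds) model, in contrast to a linear
      # regression fitted directly to the numeric score.
      model = OrderedModel(score, features, distr="logit")
      result = model.fit(method="bfgs", disp=False)
      print(result.summary())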

  1. Methods for estimating selected low-flow frequency statistics for unregulated streams in Kentucky

    USGS Publications Warehouse

    Martin, Gary R.; Arihood, Leslie D.

    2010-01-01

    This report provides estimates of, and presents methods for estimating, selected low-flow frequency statistics for unregulated streams in Kentucky, including the 30-day mean low flows for recurrence intervals of 2 and 5 years (30Q2 and 30Q5) and the 7-day mean low flows for recurrence intervals of 2, 10, and 20 years (7Q2, 7Q10, and 7Q20). Estimates of these statistics are provided for 121 U.S. Geological Survey streamflow-gaging stations with data through the 2006 climate year, which is the 12-month period ending March 31 of each year. Data were screened to identify the periods of homogeneous, unregulated flows for use in the analyses. Logistic-regression equations are presented for estimating the annual probability of the selected low-flow frequency statistics being equal to zero. Weighted-least-squares regression equations were developed for estimating the magnitude of the nonzero 30Q2, 30Q5, 7Q2, 7Q10, and 7Q20 low flows. Three low-flow regions were defined for estimating the 7-day low-flow frequency statistics. The explicit explanatory variables in the regression equations include total drainage area and the mapped streamflow-variability index measured from a revised statewide coverage of this characteristic. The percentage of the station low-flow statistics correctly classified as zero or nonzero by use of the logistic-regression equations ranged from 87.5 to 93.8 percent. The average standard errors of prediction of the weighted-least-squares regression equations ranged from 108 to 226 percent. The 30Q2 regression equations have the smallest standard errors of prediction, and the 7Q20 regression equations have the largest standard errors of prediction. The regression equations are applicable only to stream sites with low flows unaffected by regulation from reservoirs and local diversions of flow and to drainage basins in specified ranges of basin characteristics. Caution is advised when applying the equations for basins with characteristics near the applicable limits and for basins with karst drainage features.
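
    The two-stage structure described here, a logistic regression for the probability that a low-flow statistic is zero followed by a weighted-least-squares regression for the nonzero magnitudes, can be sketched as below. The basin characteristics, coefficients, and weights are hypothetical stand-ins, not the published Kentucky equations.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 120

      # Hypothetical basin characteristics: log10 drainage area and a
      # streamflow-variability index.
      log_area = rng.normal(2.0, 0.6, n)
      svi = rng.uniform(0.3, 1.2, n)
      X = sm.add_constant(np.column_stack([log_area, svi]))

      # Stage 1: logistic regression for the probability that the 7Q10 low flow is zero.
      p_zero = 1.0 / (1.0 + np.exp(-(-1.0 - 2.0 * log_area + 3.0 * svi)))
      is_zero = (rng.random(n) < p_zero).astype(int)
      stage1 = sm.Logit(is_zero, X).fit(disp=False)

      # Stage 2: weighted least squares for log10(7Q10) at sites with nonzero low flow,
      # weighting stations by record length (a stand-in for the report's weights).
      nonzero = is_zero == 0
      record_years = rng.integers(10, 60, n)
      log_q = 0.8 * log_area[nonzero] - 1.5 * svi[nonzero] + rng.normal(0, 0.2, nonzero.sum())
      stage2 = sm.WLS(log_q, X[nonzero], weights=record_years[nonzero]).fit()

      print(stage1.params, stage2.params)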

  2. Techniques for estimating flood-peak discharges of rural, unregulated streams in Ohio

    USGS Publications Warehouse

    Koltun, G.F.; Roberts, J.W.

    1990-01-01

    Multiple-regression equations are presented for estimating flood-peak discharges having recurrence intervals of 2, 5, 10, 25, 50, and 100 years at ungaged sites on rural, unregulated streams in Ohio. The average standard errors of prediction for the equations range from 33.4% to 41.4%. Peak discharge estimates determined by log-Pearson Type III analysis using data collected through the 1987 water year are reported for 275 streamflow-gaging stations. Ordinary least-squares multiple-regression techniques were used to divide the State into three regions and to identify a set of basin characteristics that help explain station-to-station variation in the log-Pearson estimates. Contributing drainage area, main-channel slope, and storage area were identified as suitable explanatory variables. Generalized least-squares procedures, which include historical flow data and account for differences in the variance of flows at different gaging stations, spatial correlation among gaging station records, and variable lengths of station record, were used to estimate the regression parameters. Weighted peak-discharge estimates computed as a function of the log-Pearson Type III and regression estimates are reported for each station. A method is provided to adjust regression estimates for ungaged sites by use of weighted and regression estimates for a gaged site located on the same stream. Limitations and shortcomings cited in an earlier report on the magnitude and frequency of floods in Ohio are addressed in this study. Geographic bias is no longer evident for the Maumee River basin of northwestern Ohio. No bias is found to be associated with the forested-area characteristic for the range used in the regression analysis (0.0 to 99.0%), nor is this characteristic significant in explaining peak discharges. Surface-mined area likewise is not significant in explaining peak discharges, and the regression equations are not biased when applied to basins having approximately 30% or less surface-mined area. Analyses of residuals indicate that the equations tend to overestimate flood-peak discharges for basins having approximately 30% or more surface-mined area. (USGS)

  3. Effectiveness of a low-intensity telephone counselling intervention on an untreated metabolic syndrome detected by national population screening in Korea: a non-randomised study using regression discontinuity design.

    PubMed

    Yi, Sang-Wook; Shin, Soon-Ae; Lee, Youn-Jung

    2015-07-10

    Whether low-intensity telephone-counselling interventions can improve cardiometabolic risk factors in screen-detected people with metabolic syndrome (MetS) is unclear. The aim of this study was to evaluate the effectiveness of a low-intensity, telephone-counselling programme on MetS implemented by the National Health Insurance Service (NHIS) of Korea using regression discontinuity design. A nationwide non-randomised intervention study with a regression discontinuity design. A retrospective analysis using data from NHIS. NHIS, Korea from January 2011 to June 2013. 5,378,558 beneficiaries with one or more MetS components by NHIS criteria detected by population screening were enrolled in the NHIS MetS Management Programme in 2012. Of these, 1,147,695 underwent annual follow-up examinations until June 2013 ('control group' which received control intervention, n=855,870; 'eligible group' which was eligible for counselling, n=291,825; 'intervention group' which participated in telephone counselling among eligible groups, n=23,968). Absolute changes in MetS components, weight and body mass index (BMI) were analysed. Multiple regression analyses were applied using the analysis of covariance model (baseline measurements as covariates). Low-intensity telephone counselling was associated with decreased systolic BP (-0.85 mm Hg, 95% CI -1.02 to -0.68), decreased diastolic BP (-0.63 mm Hg, 95% CI -0.75 to -0.50), decreased triglyceride (-1.57 mg/dL, 95% CI -2.89 to -0.25), reduced waist circumference (-0.09 cm, 95% CI -0.16 to -0.02), reduced weight (-0.19 kg, 95% CI -0.24 to -0.15) and reduced BMI (-0.07 kg/m(2), 95% CI -0.09 to -0.05), when comparing the intervention and control groups. When individuals with low high-density lipoprotein cholesterol were analysed, the intervention was also associated with increased HDL cholesterol (0.90 mg/dL, 95% CI 0.51 to 1.29). Low-intensity telephone counselling programmes could yield improvements in the following year on blood pressure, lipid profiles, weight and body mass index in untreated patients detected at the population screening. However, the improvements may be very modest and the clinical relevance of these small improvements may be limited. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
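
    The ANCOVA-style adjustment described above (a regression of the follow-up value on treatment group with the baseline measurement as a covariate) can be sketched as follows. The data, the effect size built into the simulation, and the variable names are illustrative assumptions, not the NHIS data.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(4)
      n = 500

      # Hypothetical data: baseline and follow-up systolic BP; group 1 = counselling.
      df = pd.DataFrame({
          "group": rng.integers(0, 2, n),
          "baseline_sbp": rng.normal(135, 12, n),
      })
      df["followup_sbp"] = (df["baseline_sbp"] * 0.7 + 40
                            - 0.85 * df["group"]          # assumed effect, for illustration
                            + rng.normal(0, 8, n))

      # ANCOVA: the group comparison is adjusted for the baseline measurement.
      fit = smf.ols("followup_sbp ~ group + baseline_sbp", data=df).fit()
      print(fit.params["group"], fit.conf_int().loc["group"])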

  4. Effectiveness of a low-intensity telephone counselling intervention on an untreated metabolic syndrome detected by national population screening in Korea: a non-randomised study using regression discontinuity design

    PubMed Central

    Yi, Sang-Wook; Shin, Soon-Ae; Lee, Youn-Jung

    2015-01-01

    Objective Whether low-intensity telephone-counselling interventions can improve cardiometabolic risk factors in screen-detected people with metabolic syndrome (MetS) is unclear. The aim of this study was to evaluate the effectiveness of a low-intensity, telephone-counselling programme on MetS implemented by the National Health Insurance Service (NHIS) of Korea using regression discontinuity design. Design A nationwide non-randomised intervention study with a regression discontinuity design. A retrospective analysis using data from NHIS. Setting NHIS, Korea from January 2011 to June 2013. Participants 5 378 558 beneficiaries with one or more MetS components by NHIS criteria detected by population screening were enrolled in the NHIS MetS Management Programme in 2012. Of these, 1 147 695 underwent annual follow-up examinations until June 2013 (‘control group’ which received control intervention, n=855 870; ‘eligible group’ which was eligible for counselling, n=291 825; ‘intervention group’ which participated in telephone counselling among eligible groups, n=23 968). Main outcome measures Absolute changes in MetS components, weight and body mass index (BMI) were analysed. Multiple regression analyses were applied using the analysis of covariance model (baseline measurements as covariates). Results Low-intensity telephone counselling was associated with decreased systolic BP (−0.85 mm Hg, 95% CI −1.02 to −0.68), decreased diastolic BP (−0.63 mm Hg, −95% CI −0.75 to −0.50), decreased triglyceride (−1.57 mg/dL, 95% CI −2.89 to −0.25), reduced waist circumference (−0.09 cm, 95% CI −0.16 to −0.02), reduced weight (−0.19 kg, 95% CI −0.24 to −0.15) and reduced BMI (−0.07 kg/m2, 95% CI −0.09 to −0.05), when comparing the intervention and control groups. When individuals with low high-density lipoprotein cholesterol were analysed, the intervention was also associated with increased HDL cholesterol (0.90 mg/dL, 95% CI 0.51 to 1.29). Conclusions Low-intensity telephone counselling programmes could yield improvements in the following year on blood pressure, lipid profiles, weight and body mass index in untreated patients detected at the population screening. However, the improvements may be very modest and the clinical relevance of these small improvements may be limited. PMID:26163030

  5. CADDIS Volume 4. Data Analysis: Basic Analyses

    EPA Pesticide Factsheets

    Use of statistical tests to determine if an observation is outside the normal range of expected values. Details of CART, regression analysis, use of quantile regression analysis, CART in causal analysis, simplifying or pruning resulting trees.

  6. Fungicide application practices and personal protective equipment use among orchard farmers in the agricultural health study.

    PubMed

    Hines, C J; Deddens, J A; Coble, J; Alavanja, M C R

    2007-04-01

    Fungicides are routinely applied to deciduous tree fruits for disease management. Seventy-four private orchard applicators enrolled in the Agricultural Health Study participated in the Orchard Fungicide Exposure Study in 2002-2003. During 144 days of observation, information was obtained on chemicals applied and applicator mixing, application, personal protective, and hygiene practices. At least half of the applicators had orchards with <100 trees. Air blast was the most frequent application method used (55%), followed by hand spray (44%). Rubber gloves were the most frequently worn protective equipment (68% mix; 59% apply), followed by respirators (45% mix; 49% apply), protective outerwear (36% mix; 37% apply), and rubber boots (35% mix; 36% apply). Eye protection was worn while mixing and applying on only 35% and 41% of the days, respectively. Bivariate analyses were performed using repeated logistic or repeated linear regression. Mean duration of mixing, pounds of captan applied, total acres sprayed, and number of tank mixes sprayed were greater for air blast than for hand spray (p < 0.05). Spraying from a tractor/vehicle without an enclosed cab was associated with wearing some type of coverall (p < 0.05). Applicators often did not wash their hands after mixing (77%), a finding not explained by glove use. Glove use during mixing was associated with younger age, while wearing long-sleeve shirts was associated with older age (p < 0.05 each). Self-reported unusually high fungicide exposures were more likely on days applicators performed repairs (p < 0.05). These data will be useful for evaluating fungicide exposure determinants among orchard applicators.

  7. Fragile--Handle with Care: Regression Analyses That Include Categorical Data.

    ERIC Educational Resources Information Center

    Brown, Diane Peacock

    In education and the social sciences, problems of interest to researchers and users of research often involve variables that do not meet the assumptions of regression in the area of an equal interval scale relative to a zero point. Various coding schemes exist that allow the use of regression while still answering the researcher's questions of…
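
    One common coding scheme for papers of this kind is treatment (dummy) coding, which lets a categorical predictor enter an ordinary regression as 0/1 indicator columns. A minimal sketch with an invented three-level factor is shown below.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(5)
      n = 90

      # Hypothetical data: a three-level categorical predictor and a numeric outcome.
      df = pd.DataFrame({
          "program": rng.choice(["traditional", "blended", "online"], size=n),
          "score": rng.normal(70, 10, n),
      })

      # C(program) applies treatment (dummy) coding; one level (alphabetically first by
      # default) becomes the reference, so coefficients are mean differences from it.
      fit = smf.ols("score ~ C(program)", data=df).fit()
      print(fit.params)

      # An equivalent manual approach: explicit 0/1 indicator columns.
      dummies = pd.get_dummies(df["program"], drop_first=True, dtype=float)
      print(dummies.head())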

  8. Structural classification of marshes with Polarimetric SAR highlighting the temporal mapping of marshes exposed to oil

    USGS Publications Warehouse

    Ramsey, Elijah W.; Rangoonwala, Amina; Jones, Cathleen E.

    2015-01-01

    Empirical relationships between field-derived Leaf Area Index (LAI) and Leaf Angle Distribution (LAD) and polarimetric synthetic aperture radar (PolSAR) based biophysical indicators were created and applied to map S. alterniflora marsh canopy structure. PolSAR and field data were collected near-concurrently in the summers of 2010, 2011, and 2012 in coastal marshes, and PolSAR data alone were acquired in 2009. Regression analyses showed that LAI correspondence with the PolSAR biophysical indicator variables equaled or exceeded that of vegetation water content (VWC). In the final six-regressor model, the ratio HV/VV explained 49% of the total 77% explained LAI variance, and the HH-VV coherence and phase information accounted for the remainder. HV/HH dominated the two-regressor LAD relationship, and spatial heterogeneity and backscatter mechanism followed by coherence information dominated the final three-regressor model that explained 74% of the LAD variance. Regression results applied to 2009 through 2012 PolSAR images showed substantial changes in marsh LAI and LAD. Although the direct cause was not substantiated, following a release of freshwater in response to the 2010 Deepwater Horizon oil spill, the fairly uniform interior marsh structure of 2009 was more vertical and dense shortly after the oil spill cessation. After 2010, marsh structure generally progressed back toward the 2009 uniformity; however, the trend was more disjointed in oil-impacted marshes.

  9. Effects of a randomized controlled trial to assess the six-months effects of a school based smoking prevention program in Saudi Arabia.

    PubMed

    Mohammed, Mutaz; Eggers, Sander Matthijs; Alotaiby, Fahad F; de Vries, Nanne; de Vries, Hein

    2016-09-01

    To examine the efficacy of a smoking prevention program which aimed to address smoking-related cognitions and smoking behavior among Saudi adolescents aged 13 to 15. A randomized controlled trial was used. Respondents in the experimental group (N=698) received five in-school sessions, while those in the control group (N=683) received no smoking prevention information (usual curriculum). Post-intervention data were collected six months after baseline. Logistic regression analysis was applied to assess effects on smoking initiation, linear regression analysis was applied to assess changes in beliefs, and analysis of covariance (ANCOVA) was used to assess intervention effects. All analyses were adjusted for the nested structure of students within schools. At post-intervention, respondents in the experimental group reported a significantly more negative attitude towards smoking, stronger social norms against smoking, higher self-efficacy towards non-smoking, more action planning to remain a non-smoker, and lower intentions to smoke in the future than respondents in the control group. Smoking initiation was 3.2% in the experimental group and 8.8% in the control group (p<0.01). The prevention program reinforced non-smoking cognitions and non-smoking behavior. Therefore, it is recommended that the program be implemented at a national level in Saudi Arabia. Future studies are recommended to assess long-term program effects and the conditions favoring national implementation of the program. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Can Emotional and Behavioral Dysregulation in Youth Be Decoded from Functional Neuroimaging?

    PubMed

    Portugal, Liana C L; Rosa, Maria João; Rao, Anil; Bebko, Genna; Bertocci, Michele A; Hinze, Amanda K; Bonar, Lisa; Almeida, Jorge R C; Perlman, Susan B; Versace, Amelia; Schirda, Claudiu; Travis, Michael; Gill, Mary Kay; Demeter, Christine; Diwadkar, Vaibhav A; Ciuffetelli, Gary; Rodriguez, Eric; Forbes, Erika E; Sunshine, Jeffrey L; Holland, Scott K; Kowatch, Robert A; Birmaher, Boris; Axelson, David; Horwitz, Sarah M; Arnold, Eugene L; Fristad, Mary A; Youngstrom, Eric A; Findling, Robert L; Pereira, Mirtes; Oliveira, Leticia; Phillips, Mary L; Mourao-Miranda, Janaina

    2016-01-01

    High comorbidity among pediatric disorders characterized by behavioral and emotional dysregulation poses problems for diagnosis and treatment, and suggests that these disorders may be better conceptualized as dimensions of abnormal behaviors. Furthermore, identifying neuroimaging biomarkers related to dimensional measures of behavior may provide targets to guide individualized treatment. We aimed to use functional neuroimaging and pattern regression techniques to determine whether patterns of brain activity could accurately decode individual-level severity on a dimensional scale measuring behavioural and emotional dysregulation at two different time points. A sample of fifty-seven youth (mean age: 14.5 years; 32 males) was selected from a multi-site study of youth with parent-reported behavioral and emotional dysregulation. Participants performed a block-design reward paradigm during functional Magnetic Resonance Imaging (fMRI). Pattern regression analyses consisted of Relevance Vector Regression (RVR) and two cross-validation strategies implemented in the Pattern Recognition for Neuroimaging toolbox (PRoNTo). Medication was treated as a binary confounding variable. Decoded and actual clinical scores were compared using Pearson's correlation coefficient (r) and mean squared error (MSE) to evaluate the models. Permutation test was applied to estimate significance levels. Relevance Vector Regression identified patterns of neural activity associated with symptoms of behavioral and emotional dysregulation at the initial study screen and close to the fMRI scanning session. The correlation and the mean squared error between actual and decoded symptoms were significant at the initial study screen and close to the fMRI scanning session. However, after controlling for potential medication effects, results remained significant only for decoding symptoms at the initial study screen. Neural regions with the highest contribution to the pattern regression model included cerebellum, sensory-motor and fronto-limbic areas. The combination of pattern regression models and neuroimaging can help to determine the severity of behavioral and emotional dysregulation in youth at different time points.
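
    Relevance Vector Regression as implemented in PRoNTo is not part of scikit-learn, so the sketch below substitutes kernel ridge regression as a stand-in to illustrate the general workflow reported: cross-validated decoding of a continuous score from high-dimensional patterns, evaluated with Pearson's r and MSE, plus a permutation test for significance. All data are synthetic.

      import numpy as np
      from scipy.stats import pearsonr
      from sklearn.kernel_ridge import KernelRidge
      from sklearn.model_selection import KFold, cross_val_predict

      rng = np.random.default_rng(6)
      n_subjects, n_voxels = 57, 500

      # Synthetic "activation patterns" and a continuous dysregulation score.
      X = rng.normal(size=(n_subjects, n_voxels))
      w = rng.normal(size=n_voxels) * (rng.random(n_voxels) < 0.05)
      y = X @ w + rng.normal(scale=2.0, size=n_subjects)

      model = KernelRidge(kernel="linear", alpha=1.0)
      cv = KFold(n_splits=5, shuffle=True, random_state=0)
      decoded = cross_val_predict(model, X, y, cv=cv)
      r_obs, _ = pearsonr(y, decoded)
      mse_obs = np.mean((y - decoded) ** 2)

      # Permutation test: refit with shuffled scores to build a null distribution for r.
      null_r = []
      for _ in range(200):
          y_perm = rng.permutation(y)
          null_r.append(pearsonr(y_perm, cross_val_predict(model, X, y_perm, cv=cv))[0])
      p_value = (np.sum(np.array(null_r) >= r_obs) + 1) / (len(null_r) + 1)
      print(f"r = {r_obs:.2f}, MSE = {mse_obs:.2f}, permutation p = {p_value:.3f}")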

  11. Statistical relations among earthquake magnitude, surface rupture length, and surface fault displacement

    USGS Publications Warehouse

    Bonilla, Manuel G.; Mark, Robert K.; Lienkaemper, James J.

    1984-01-01

    In order to refine correlations of surface-wave magnitude, fault rupture length at the ground surface, and fault displacement at the surface by including the uncertainties in these variables, the existing data were critically reviewed and a new data base was compiled. Earthquake magnitudes were redetermined as necessary to make them as consistent as possible with the Gutenberg methods and results, which make up much of the data base. Measurement errors were estimated for the three variables for 58 moderate to large shallow-focus earthquakes. Regression analyses were then made utilizing the estimated measurement errors. The regression analysis demonstrates that the relations among the variables magnitude, length, and displacement are stochastic in nature. The stochastic variance, introduced in part by incomplete surface expression of seismogenic faulting, variation in shear modulus, and regional factors, dominates the estimated measurement errors. Thus, it is appropriate to use ordinary least squares for the regression models, rather than regression models based upon an underlying deterministic relation in which the variance results primarily from measurement errors. Significant differences exist in correlations of certain combinations of length, displacement, and magnitude when events are grouped by fault type or by region, including attenuation regions delineated by Evernden and others. Estimates of the magnitude and the standard deviation of the magnitude of a prehistoric or future earthquake associated with a fault can be made by correlating Ms with the logarithms of rupture length, fault displacement, or the product of length and displacement. Fault rupture area could be reliably estimated for about 20 of the events in the data set. Regression of Ms on rupture area did not result in a marked improvement over regressions that did not involve rupture area. Because no subduction-zone earthquakes are included in this study, the reported results do not apply to such zones.
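
    An ordinary least-squares correlation of Ms with the logarithm of rupture length, of the kind the abstract describes, can be sketched as follows. The synthetic coefficients and scatter are placeholders and should not be mistaken for the published regressions.

      import numpy as np

      rng = np.random.default_rng(7)

      # Synthetic rupture lengths (km) and surface-wave magnitudes; illustrative only.
      length_km = rng.uniform(5, 300, 58)
      ms = 5.0 + 1.2 * np.log10(length_km) + rng.normal(0, 0.3, 58)

      # Ordinary least squares of Ms on log10(rupture length).
      slope, intercept = np.polyfit(np.log10(length_km), ms, deg=1)
      resid = ms - (intercept + slope * np.log10(length_km))
      sigma = resid.std(ddof=2)   # standard deviation of the magnitude estimate

      # Point estimate (and one-sigma band) for a hypothetical 100 km rupture.
      ms_100 = intercept + slope * np.log10(100.0)
      print(f"Ms(100 km) = {ms_100:.2f} +/- {sigma:.2f}")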

  12. Geographic information systems and logistic regression for high-resolution malaria risk mapping in a rural settlement of the southern Brazilian Amazon.

    PubMed

    de Oliveira, Elaine Cristina; dos Santos, Emerson Soares; Zeilhofer, Peter; Souza-Santos, Reinaldo; Atanaka-Santos, Marina

    2013-11-15

    In Brazil, 99% of the cases of malaria are concentrated in the Amazon region, with high level of transmission. The objectives of the study were to use geographic information systems (GIS) analysis and logistic regression as a tool to identify and analyse the relative likelihood and its socio-environmental determinants of malaria infection in the Vale do Amanhecer rural settlement, Brazil. A GIS database of georeferenced malaria cases, recorded in 2005, and multiple explanatory data layers was built, based on a multispectral Landsat 5 TM image, digital map of the settlement blocks and a SRTM digital elevation model. Satellite imagery was used to map the spatial patterns of land use and cover (LUC) and to derive spectral indices of vegetation density (NDVI) and soil/vegetation humidity (VSHI). An Euclidian distance operator was applied to measure proximity of domiciles to potential mosquito breeding habitats and gold mining areas. The malaria risk model was generated by multiple logistic regression, in which environmental factors were considered as independent variables and the number of cases, binarized by a threshold value was the dependent variable. Out of a total of 336 cases of malaria, 133 positive slides were from inhabitants at Road 08, which corresponds to 37.60% of the notifications. The southern region of the settlement presented 276 cases and a greater number of domiciles in which more than ten cases/home were notified. From these, 102 (30.36%) cases were caused by Plasmodium falciparum and 174 (51.79%) cases by Plasmodium vivax. Malaria risk is the highest in the south of the settlement, associated with proximity to gold mining sites, intense land use, high levels of soil/vegetation humidity and low vegetation density. Mid-resolution, remote sensing data and GIS-derived distance measures can be successfully combined with digital maps of the housing location of (non-) infected inhabitants to predict relative likelihood of disease infection through the analysis by logistic regression. Obtained findings on the relation between malaria cases and environmental factors should be applied in the future for land use planning in rural settlements in the Southern Amazon to minimize risks of disease transmission.

  13. Predicting 30-day Hospital Readmission with Publicly Available Administrative Database. A Conditional Logistic Regression Modeling Approach.

    PubMed

    Zhu, K; Lou, Z; Zhou, J; Ballester, N; Kong, N; Parikh, P

    2015-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". Hospital readmissions raise healthcare costs and cause significant distress to providers and patients. It is, therefore, of great interest to healthcare organizations to predict which patients are at risk of being readmitted to their hospitals. However, current logistic regression-based risk prediction models have limited prediction power when applied to hospital administrative data. Meanwhile, although decision trees and random forests have been applied, they tend to be too complex for hospital practitioners to understand. Explore the use of conditional logistic regression to increase the prediction accuracy. We analyzed an HCUP statewide inpatient discharge record dataset, which includes patient demographics, clinical and care utilization data from California. We extracted records of heart failure Medicare beneficiaries who had inpatient experience during an 11-month period. We corrected the data imbalance issue with under-sampling. In our study, we first applied standard logistic regression and decision tree to obtain influential variables and derive practically meaningful decision rules. We then stratified the original data set accordingly and applied logistic regression on each data stratum. We further explored the effect of interacting variables in the logistic regression modeling. We conducted cross validation to assess the overall prediction performance of conditional logistic regression (CLR) and compared it with standard classification models. The developed CLR models outperformed several standard classification models (e.g., straightforward logistic regression, stepwise logistic regression, random forest, support vector machine). For example, the best CLR model improved the classification accuracy by nearly 20% over the straightforward logistic regression model. Furthermore, the developed CLR models tend to achieve better sensitivity of more than 10% over the standard classification models, which can be translated to the correct labeling of an additional 400-500 readmissions for heart failure patients in the state of California over a year. Lastly, several key predictors identified from the HCUP data include the disposition location from discharge, the number of chronic conditions, and the number of acute procedures. It would be beneficial to apply simple decision rules obtained from the decision tree in an ad-hoc manner to guide the cohort stratification. It could be potentially beneficial to explore the effect of pairwise interactions between influential predictors when building the logistic regression models for different data strata. Judicious use of the ad-hoc CLR models developed offers insights into future development of prediction models for hospital readmissions, which can lead to better intuition in identifying high-risk patients and developing effective post-discharge care strategies. Lastly, this paper is expected to raise the awareness of collecting data on additional markers and developing necessary database infrastructure for larger-scale exploratory studies on readmission risk prediction.
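
    The stratify-then-fit idea described above (split the cohort with a simple decision-tree-style rule, then fit a separate logistic regression within each stratum) can be sketched as follows. The variable names, threshold, and data are hypothetical and do not reflect the HCUP fields or the authors' actual rules.

      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(8)
      n = 2000

      # Synthetic discharge records (variable names are hypothetical, not HCUP fields).
      df = pd.DataFrame({
          "n_chronic": rng.integers(0, 12, n),
          "n_acute_proc": rng.integers(0, 6, n),
          "to_facility": rng.integers(0, 2, n),   # disposition: 1 = discharged to a facility
      })
      logit = -2.0 + 0.15 * df["n_chronic"] + 0.4 * df["to_facility"]
      df["readmit"] = rng.random(n) < 1 / (1 + np.exp(-logit))

      # Stratify on a simple decision-tree-style rule, then fit one model per stratum.
      strata = {
          "high_burden": df["n_chronic"] >= 6,
          "low_burden": df["n_chronic"] < 6,
      }
      predictors = ["n_chronic", "n_acute_proc", "to_facility"]
      for name, mask in strata.items():
          sub = df[mask]
          model = LogisticRegression(max_iter=1000).fit(sub[predictors], sub["readmit"])
          print(name, dict(zip(predictors, model.coef_[0].round(2))))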

  14. Reduction of State Victim Compensation Disparities in Disadvantaged Crime Victims Through Active Outreach and Assistance: A Randomized Trial

    PubMed Central

    Alvidrez, Jennifer; Shumway, Martha; Boccellari, Alicia; Green, Jon Dean; Kelly, Vanessa; Merrill, Gregory

    2008-01-01

    Objectives. We examined whether providing active outreach and assistance to crime victims as part of comprehensive psychosocial services reduced disparities in access to state compensation funds. Methods. We analyzed data from a randomized trial of injured crime victims (N = 541) and compared outcomes from comprehensive psychosocial services with usual community care. We examined the impact of outreach and assistance on disparities in applying for victim compensation by testing for interactions between victim characteristics and treatment condition in logistic regression analyses. Results. Victims receiving comprehensive services were much more likely to apply for victim compensation than were victims receiving usual care. Comprehensive services decreased disparities associated with younger age, lower levels of education, and homelessness. Conclusions. State-level victim compensation funds are available to help individuals recover physically, psychologically, and financially from crime victimization. However, few crime victims apply for victim compensation, and there are particularly low application rates among young, male, ethnic minority, and physical assault victims. Active outreach and assistance can address disparities in access to victim compensation funds for disadvantaged populations and should be offered more widely to victims of violent crime. PMID:18382004

  15. Regionalization of low-flow characteristics of Tennessee streams

    USGS Publications Warehouse

    Bingham, R.H.

    1986-01-01

    Procedures for estimating 3-day 2-year, 3-day 10-year, 3-day 20-year, and 7-day 10-year low flows at ungaged stream sites in Tennessee are based on surface geology and drainage area size. One set of equations applies to west Tennessee streams, and another set applies to central and east Tennessee streams. The equations do not apply to streams where flow is significantly altered by activities of man. Standard errors of estimate of equations for west Tennessee are 24 to 32% and for central and east Tennessee 31 to 35%. Streamflow recession indexes, in days/log cycle, are used to account for effects of geology of the drainage basin on low flow of streams. The indexes in Tennessee range from 32 days/log cycle for clay and shale to 350 days/log cycle for gravel and sand, indicating different aquifer characteristics of the geologic units that sustain streamflows during periods of no surface runoff. Streamflow recession rate depends primarily on transmissivity and storage characteristics of the aquifers, and the average distance from stream channels to basin divides. Geology and drainage basin size are the most significant variables affecting low flow in Tennessee streams according to regression analyses. (Author's abstract)

  16. Sequence analysis to assess labour market participation following vocational rehabilitation: an observational study among patients sick-listed with low back pain from a randomised clinical trial in Denmark

    PubMed Central

    Lindholdt, Louise; Labriola, Merete; Nielsen, Claus Vinther; Horsbøl, Trine Allerslev; Lund, Thomas

    2017-01-01

    Introduction The return-to-work (RTW) process after long-term sickness absence is often complex and long and implies multiple shifts between different labour market states for the absentee. Standard methods in RTW research typically rely on the analysis of one outcome measure at a time, which will not capture the many possible states and transitions the absentee can go through. The purpose of this study was to explore the potential added value of sequence analysis as a supplement to standard regression analysis of a multidisciplinary RTW intervention among patients with low back pain (LBP). Methods The study population consisted of 160 patients randomly allocated to either a hospital-based brief or a multidisciplinary intervention. Data on labour market participation following intervention were obtained from a national register and analysed in two ways: as a binary outcome expressed as active or passive relief at a 1-year follow-up and as four different categories for labour market participation. Logistic regression and sequence analysis were performed. Results The logistic regression analysis showed no difference in labour market participation for patients in the two groups after 1 year. Applying sequence analysis showed differences in subsequent labour market participation during the 2 years after baseline in favour of the brief intervention group versus the multidisciplinary intervention group. Conclusion The study indicated that sequence analysis could provide added analytical value as a supplement to traditional regression analysis in prospective studies of RTW among patients with LBP. PMID:28729315

  17. Vesicular stomatitis forecasting based on Google Trends

    PubMed Central

    Lu, Yi; Zhou, GuangYa; Chen, Qin

    2018-01-01

    Background Vesicular stomatitis (VS) is an important viral disease of livestock. The main feature of VS is irregular blisters that occur on the lips, tongue, oral mucosa, hoof crown and nipple. Humans can also be infected with vesicular stomatitis and develop meningitis. This study analyses the 2014 American VS outbreaks in order to accurately predict vesicular stomatitis outbreak trends. Methods American VS outbreak data were collected from the OIE. The data for VS keywords were obtained by inputting 24 disease-related keywords into Google Trends. After calculating the Pearson and Spearman correlation coefficients, it was found that there was a relationship between outbreaks and keywords derived from Google Trends. Finally, the predictive model was constructed based on qualitative classification and quantitative regression. Results For the regression model, the Pearson correlation coefficients between the predicted outbreaks and actual outbreaks are 0.953 and 0.948, respectively. For the qualitative classification model, we constructed five classification predictive models and chose the best classification predictive model as the result. The results showed that the SN (sensitivity), SP (specificity) and ACC (prediction accuracy) values of the best classification predictive model are 78.52%, 72.5% and 77.14%, respectively. Conclusion This study applied Google search data to construct a qualitative classification model and a quantitative regression model. The results show that the method is effective and that these two models obtain more accurate forecasts. PMID:29385198

  18. Linear regression analysis of Hospital Episode Statistics predicts a large increase in demand for elective hand surgery in England.

    PubMed

    Bebbington, Emily; Furniss, Dominic

    2015-02-01

    We integrated two factors, demographic population shifts and changes in prevalence of disease, to predict future trends in demand for hand surgery in England, to facilitate workforce planning. We analysed Hospital Episode Statistics data for Dupuytren's disease, carpal tunnel syndrome, cubital tunnel syndrome, and trigger finger from 1998 to 2011. Using linear regression, we estimated trends in both diagnosis and surgery until 2030. We integrated this regression with age specific population data from the Office for National Statistics in order to estimate how this will contribute to a change in workload over time. There has been a significant increase in both absolute numbers of diagnoses and surgery for all four conditions. Combined with future population data, we calculate that the total operative burden for these four conditions will increase from 87,582 in 2011 to 170,166 (95% confidence interval 144,517-195,353) in 2030. The prevalence of these diseases in the ageing population, and increasing prevalence of predisposing factors such as obesity and diabetes, may account for the predicted increase in workload. The most cost effective treatments must be sought, which requires high quality clinical trials. Our methodology can be applied to other sub-specialties to help anticipate the need for future service provision. Copyright © 2014 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  19. Applicability of the Tanaka-Johnston and Moyers mixed dentition analyses in Northeast Han Chinese.

    PubMed

    Sherpa, Jangbu; Sah, Gopal; Rong, Zeng; Wu, Lipeng

    2015-06-01

    To assess the applicability of the Tanaka-Johnston and Moyers prediction methods in a Han ethnic group from Northeast China and to develop prediction equations for this same population. Cross-sectional study. Department of Orthodontics, School of Stomatology, Jiamusi University, Heilongjiang, China. A total of 130 subjects (65 male and 65 female) aged 16-21 years from a Han ethnic group of Northeast China were recruited from dental students and patients seeking orthodontic treatment. Ethnicity was verified by questionnaire. Mesio-distal tooth width was measured using digital Vernier calipers. Predicted values obtained from the Tanaka-Johnston and Moyers methods in both arches were compared with the actual measured widths. Based on regression analysis, prediction equations were developed. The Tanaka-Johnston equations were not precise, except for the upper arch in males. However, the Moyers 85th percentile in the upper arch and 75th percentile in the lower arch predicted the sum precisely in males. For females, the Moyers 75th percentile predicted the sum precisely for the upper arch, but none of the Moyers percentiles predicted the sum precisely in the lower arch. Neither the Tanaka-Johnston nor the Moyers method can be applied universally without question. Hence, it may be safer to develop regression equations for specific populations. Validating studies must be conducted to confirm the precision of these newly developed regression equations.

  20. Monitoring the quality consistency of Weibizhi tablets by micellar electrokinetic chromatography fingerprints combined with multivariate statistical analyses, the simple quantified ratio fingerprint method, and the fingerprint-efficacy relationship.

    PubMed

    Liu, Yingchun; Sun, Guoxiang; Wang, Yan; Yang, Lanping; Yang, Fangliang

    2015-06-01

    Micellar electrokinetic chromatography fingerprinting combined with quantification was successfully developed and applied to monitor the quality consistency of Weibizhi tablets, which is a classical compound preparation used to treat gastric ulcers. A background electrolyte composed of 57 mmol/L sodium borate, 21 mmol/L sodium dodecylsulfate and 100 mmol/L sodium hydroxide was used to separate compounds. To optimize capillary electrophoresis conditions, multivariate statistical analyses were applied. First, the most important factors influencing sample electrophoretic behavior were identified as background electrolyte concentrations. Then, a Box-Behnken design response surface strategy using resolution index RF as an integrated response was set up to correlate factors with response. RF reflects the effective signal amount, resolution, and signal homogenization in an electropherogram; thus, it was regarded as an excellent indicator. In fingerprint assessments, the simple quantified ratio fingerprint method was established for comprehensive quality discrimination of traditional Chinese medicines/herbal medicines from qualitative and quantitative perspectives, by which the quality of 27 samples from the same manufacturer was well differentiated. In addition, the fingerprint-efficacy relationship between fingerprints and antioxidant activities was established using partial least squares regression, which provided important medicinal efficacy information for quality control. The present study offered an efficient means for monitoring Weibizhi tablet quality consistency. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Relational regulation theory and the role of social support and organisational fairness for nurses in a general acute context.

    PubMed

    Rodwell, John; Munro, Louise

    2013-11-01

    To present a novel approach to nurse stress by exploring the demand-control-support model with organisational justice through the lens of relational regulation theory. Nursing is often stressful due to high demands and dissatisfaction with pay, which impacts the mental well-being and productivity of nurses. A cross-sectional design. A validated questionnaire was sent to the work addresses of all nursing and midwifery staff in a medium-sized general acute hospital in Australia. A total of 190 nurses and midwives returned completed questionnaires for the analyses. The multiple regression analyses demonstrated that the model applies to the prototypical context of a general acute hospital and that job control, supervisor support and outside work support improve the job satisfaction and mental health of nurses. Most importantly, supervisor support was found to buffer the impact of excessive work demands. Fairness of procedures, distribution of resources and the quality and consistency of information are also beneficial. Relational regulation theory is applied to these findings as a novel way to conceptualise the mechanisms of support and fairness in nursing. The importance of nurses' well-being and job satisfaction is a priority for improving clinical outcomes. Practically, this means nurse managers should be encouraging nurses in the pursuit of diverse relational activities both at work and outside work. © 2013 John Wiley & Sons Ltd.

  2. Spatial quantile regression using INLA with applications to childhood overweight in Malawi.

    PubMed

    Mtambo, Owen P L; Masangwi, Salule J; Kazembe, Lawrence N M

    2015-04-01

    Analyses of childhood overweight have mainly used mean regression. However, using quantile regression is more appropriate as it provides flexibility to analyse the determinants of overweight corresponding to quantiles of interest. The main objective of this study was to fit a Bayesian additive quantile regression model with structured spatial effects for childhood overweight in Malawi using the 2010 Malawi DHS data. Inference was fully Bayesian using R-INLA package. The significant determinants of childhood overweight ranged from socio-demographic factors such as type of residence to child and maternal factors such as child age and maternal BMI. We observed significant positive structured spatial effects on childhood overweight in some districts of Malawi. We recommended that the childhood malnutrition policy makers should consider timely interventions based on risk factors as identified in this paper including spatial targets of interventions. Copyright © 2015 Elsevier Ltd. All rights reserved.
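
    The study's Bayesian additive spatial quantile regression is fitted with R-INLA; as a much-simplified, non-spatial illustration of the underlying idea, the sketch below fits a frequentist quantile regression for an upper quantile of a child anthropometric outcome with statsmodels, using invented data and variable names.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(9)
      n = 800

      # Hypothetical child-level data: weight-for-height z-score vs. maternal BMI,
      # with variance that grows with BMI so the upper tail behaves differently.
      df = pd.DataFrame({"maternal_bmi": rng.normal(24, 4, n)})
      df["whz"] = (-1.0 + 0.08 * df["maternal_bmi"]
                   + rng.normal(0, 1, n) * (0.5 + 0.02 * df["maternal_bmi"]))

      # Model the 90th percentile of the z-score (the overweight tail), not the mean.
      q90 = smf.quantreg("whz ~ maternal_bmi", df).fit(q=0.9)
      ols = smf.ols("whz ~ maternal_bmi", df).fit()
      print(q90.params["maternal_bmi"], ols.params["maternal_bmi"])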

  3. Variable selection and model choice in geoadditive regression models.

    PubMed

    Kneib, Thomas; Hothorn, Torsten; Tutz, Gerhard

    2009-06-01

    Model choice and variable selection are issues of major concern in practical regression analyses, arising in many biometric applications such as habitat suitability analyses, where the aim is to identify the influence of potentially many environmental conditions on certain species. We describe regression models for breeding bird communities that facilitate both model choice and variable selection, by a boosting algorithm that works within a class of geoadditive regression models comprising spatial effects, nonparametric effects of continuous covariates, interaction surfaces, and varying coefficients. The major modeling components are penalized splines and their bivariate tensor product extensions. All smooth model terms are represented as the sum of a parametric component and a smooth component with one degree of freedom to obtain a fair comparison between the model terms. A generic representation of the geoadditive model allows us to devise a general boosting algorithm that automatically performs model choice and variable selection.

  4. Multicollinearity in spatial genetics: separating the wheat from the chaff using commonality analyses.

    PubMed

    Prunier, J G; Colyn, M; Legendre, X; Nimon, K F; Flamand, M C

    2015-01-01

    Direct gradient analyses in spatial genetics provide unique opportunities to describe the inherent complexity of genetic variation in wildlife species and are the object of many methodological developments. However, multicollinearity among explanatory variables is a systemic issue in multivariate regression analyses and is likely to cause serious difficulties in properly interpreting results of direct gradient analyses, with the risk of erroneous conclusions, misdirected research and inefficient or counterproductive conservation measures. Using simulated data sets along with linear and logistic regressions on distance matrices, we illustrate how commonality analysis (CA), a detailed variance-partitioning procedure that was recently introduced in the field of ecology, can be used to deal with nonindependence among spatial predictors. By decomposing model fit indices into unique and common (or shared) variance components, CA allows identifying the location and magnitude of multicollinearity, revealing spurious correlations and thus thoroughly improving the interpretation of multivariate regressions. Despite a few inherent limitations, especially in the case of resistance model optimization, this review highlights the great potential of CA to account for complex multicollinearity patterns in spatial genetics and identifies future applications and lines of research. We strongly urge spatial geneticists to systematically investigate commonalities when performing direct gradient analyses. © 2014 John Wiley & Sons Ltd.
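
    For the simplest case of two collinear predictors, commonality analysis reduces to decomposing the full-model R^2 into two unique components and one common component. A minimal numeric sketch with simulated (not genetic) data is shown below.

      import numpy as np

      def r2(X, y):
          """R^2 from an ordinary least-squares fit with intercept."""
          Xc = np.column_stack([np.ones(len(y)), X])
          beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
          resid = y - Xc @ beta
          return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

      # Two correlated predictors (e.g. two flattened distance matrices).
      rng = np.random.default_rng(10)
      n = 200
      x1 = rng.normal(size=n)
      x2 = 0.7 * x1 + 0.3 * rng.normal(size=n)      # collinear with x1
      y = 1.0 * x1 + 0.5 * x2 + rng.normal(size=n)

      r2_full = r2(np.column_stack([x1, x2]), y)
      r2_x1, r2_x2 = r2(x1[:, None], y), r2(x2[:, None], y)

      unique_x1 = r2_full - r2_x2          # variance explained only by x1
      unique_x2 = r2_full - r2_x1          # variance explained only by x2
      common = r2_x1 + r2_x2 - r2_full     # variance shared by both predictors
      print(unique_x1, unique_x2, common)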

  5. 50 CFR 224.101 - Enumeration of endangered marine and anadromous species.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... institutions) and which are identified as fish belonging to the NYB DPS based on genetics analyses, previously... genetics analyses, previously applied tags, previously applied marks, or documentation to verify that the... Carolina DPS based on genetics analyses, previously applied tags, previously applied marks, or...

  6. 50 CFR 224.101 - Enumeration of endangered marine and anadromous species.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... institutions) and which are identified as fish belonging to the NYB DPS based on genetics analyses, previously... genetics analyses, previously applied tags, previously applied marks, or documentation to verify that the... Carolina DPS based on genetics analyses, previously applied tags, previously applied marks, or...

  7. Minimal improvement of nurses' motivational interviewing skills in routine diabetes care one year after training: a cluster randomized trial.

    PubMed

    Jansink, Renate; Braspenning, Jozé; Laurant, Miranda; Keizer, Ellen; Elwyn, Glyn; Weijden, Trudy van der; Grol, Richard

    2013-03-28

    The effectiveness of nurse-led motivational interviewing (MI) in routine diabetes care in general practice is inconclusive. Knowledge about the extent to which nurses apply MI skills and the factors that affect the usage can help to understand the black box of this intervention. The current study compared MI skills of trained versus non-trained general practice nurses in diabetes consultations. The nurses participated in a cluster randomized trial in which a comprehensive program (including MI training) was tested on improving clinical parameters, lifestyle, patients' readiness to change lifestyle, and quality of life. Fifty-eight general practices were randomly assigned to usual care (35 nurses) or the intervention (30 nurses). The ratings of applying 24 MI skills (primary outcome) were based on five consultation recordings per nurse at baseline and 14 months later. Two judges independently evaluated the MI skills and the consultation characteristics: time, amount of nurse communication, amount of lifestyle discussion, and patients' readiness to change. The effect of the training on the MI skills was analysed with a multilevel linear regression by comparing baseline and one-year follow-up measurements between the intervention and usual care groups. The overall effect of the consultation characteristics on the MI skills was studied in multilevel regression analyses. At the one-year follow-up, it was demonstrated that the nurses improved on 2 of the 24 MI skills, namely, "inviting the patient to talk about behaviour change" (mean difference=0.39, p=0.009), and "assessing patient's confidence in changing their lifestyle" (mean difference=0.28, p=0.037). Consultation time and the amount of lifestyle discussion, as well as the patients' readiness to change health behaviour, were positively associated with applying MI skills. The maintenance of the MI skills one year after the training program was minimal. The question is whether the success of MI to change unhealthy behaviour must be doubted, whether the technique is less suitable for patients with a complex chronic disease, such as diabetes mellitus, or whether nurses have problems with the acquisition and maintenance of MI skills in daily practice. Overall, nurses perform more MI skills during consultations when there is more time, more lifestyle discussion, and greater patient readiness to change. Current Controlled Trials ISRCTN68707773.

  8. Minimal improvement of nurses’ motivational interviewing skills in routine diabetes care one year after training: a cluster randomized trial

    PubMed Central

    2013-01-01

    Background The effectiveness of nurse-led motivational interviewing (MI) in routine diabetes care in general practice is inconclusive. Knowledge about the extent to which nurses apply MI skills and the factors that affect the usage can help to understand the black box of this intervention. The current study compared MI skills of trained versus non-trained general practice nurses in diabetes consultations. The nurses participated in a cluster randomized trial in which a comprehensive program (including MI training) was tested on improving clinical parameters, lifestyle, patients’ readiness to change lifestyle, and quality of life. Methods Fifty-eight general practices were randomly assigned to usual care (35 nurses) or the intervention (30 nurses). The ratings of applying 24 MI skills (primary outcome) were based on five consultation recordings per nurse at baseline and 14 months later. Two judges evaluated independently the MI skills and the consultation characteristics time, amount of nurse communication, amount of lifestyle discussion and patients’ readiness to change. The effect of the training on the MI skills was analysed with a multilevel linear regression by comparing baseline and the one-year follow-up between the interventions with usual care group. The overall effect of the consultation characteristics on the MI skills was studied in a multilevel regression analyses. Results At one year follow up, it was demonstrated that the nurses improved on 2 of the 24 MI skills, namely, “inviting the patient to talk about behaviour change” (mean difference=0.39, p=0.009), and “assessing patient’s confidence in changing their lifestyle” (mean difference=0.28, p=0.037). Consultation time and the amount of lifestyle discussion as well as the patients’ readiness to change health behaviour was associated positively with applying MI skills. Conclusions The maintenance of the MI skills one year after the training program was minimal. The question is whether the success of MI to change unhealthy behaviour must be doubted, whether the technique is less suitable for patients with a complex chronic disease, such as diabetes mellitus, or that nurses have problems with the acquisition and maintenance of MI skills in daily practice. Overall, performing MI skills during consultation increases, if there is more time, more lifestyle discussion, and the patients show more readiness to change. Trial registration Current Controlled Trials ISRCTN68707773 PMID:23537327

  9. Estimating the concrete compressive strength using hard clustering and fuzzy clustering based regression techniques.

    PubMed

    Nagwani, Naresh Kumar; Deo, Shirish V

    2014-01-01

    Understanding of the compressive strength of concrete is important for activities like construction arrangement, prestressing operations, and proportioning new mixtures and for the quality assurance. Regression techniques are most widely used for prediction tasks where relationship between the independent variables and dependent (prediction) variable is identified. The accuracy of the regression techniques for prediction can be improved if clustering can be used along with regression. Clustering along with regression will ensure the more accurate curve fitting between the dependent and independent variables. In this work cluster regression technique is applied for estimating the compressive strength of the concrete and a novel state of the art is proposed for predicting the concrete compressive strength. The objective of this work is to demonstrate that clustering along with regression ensures less prediction errors for estimating the concrete compressive strength. The proposed technique consists of two major stages: in the first stage, clustering is used to group the similar characteristics concrete data and then in the second stage regression techniques are applied over these clusters (groups) to predict the compressive strength from individual clusters. It is found from experiments that clustering along with regression techniques gives minimum errors for predicting compressive strength of concrete; also fuzzy clustering algorithm C-means performs better than K-means algorithm.

  10. Estimating the Concrete Compressive Strength Using Hard Clustering and Fuzzy Clustering Based Regression Techniques

    PubMed Central

    Nagwani, Naresh Kumar; Deo, Shirish V.

    2014-01-01

    Understanding the compressive strength of concrete is important for activities like construction arrangement, prestressing operations, proportioning new mixtures, and quality assurance. Regression techniques are the most widely used for prediction tasks, where the relationship between the independent variables and the dependent (prediction) variable is identified. The accuracy of regression techniques for prediction can be improved if clustering is used along with regression. Clustering combined with regression ensures more accurate curve fitting between the dependent and independent variables. In this work a cluster regression technique is applied for estimating the compressive strength of concrete and a novel approach is proposed for predicting the concrete compressive strength. The objective of this work is to demonstrate that clustering along with regression yields smaller prediction errors when estimating the concrete compressive strength. The proposed technique consists of two major stages: in the first stage, clustering is used to group concrete data with similar characteristics, and in the second stage regression techniques are applied over these clusters (groups) to predict the compressive strength from the individual clusters. It is found from experiments that clustering along with regression gives minimum errors for predicting the compressive strength of concrete; the fuzzy C-means clustering algorithm also performs better than the K-means algorithm. PMID:25374939

  11. Analysis of Sting Balance Calibration Data Using Optimized Regression Models

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Bader, Jon B.

    2010-01-01

    Calibration data of a wind tunnel sting balance was processed using a candidate math model search algorithm that recommends an optimized regression model for the data analysis. During the calibration the normal force and the moment at the balance moment center were selected as independent calibration variables. The sting balance itself had two moment gages. Therefore, after analyzing the connection between calibration loads and gage outputs, it was decided to choose the difference and the sum of the gage outputs as the two responses that best describe the behavior of the balance. The math model search algorithm was applied to these two responses. An optimized regression model was obtained for each response. Classical strain gage balance load transformations and the equations of the deflection of a cantilever beam under load are used to show that the search algorithm's two optimized regression models are supported by a theoretical analysis of the relationship between the applied calibration loads and the measured gage outputs. The analysis of the sting balance calibration data set is a rare example of a situation when terms of a regression model of a balance can directly be derived from first principles of physics. In addition, it is interesting to note that the search algorithm recommended the correct regression model term combinations using only a set of statistical quality metrics that were applied to the experimental data during the algorithm's term selection process.

  12. Numeric promoter description - A comparative view on concepts and general application.

    PubMed

    Beier, Rico; Labudde, Dirk

    2016-01-01

    Nucleic acid molecules play a key role in a variety of biological processes. Starting from storage and transfer tasks, this also comprises the triggering of biological processes, regulatory effects and the active influence gained by target binding. Based on the experimental output (in this case promoter sequences), further in silico analyses aid in gaining new insights into these processes and interactions. The numerical description of nucleic acids thereby constitutes a bridge between the concrete biological issues and the analytical methods. Hence, this study compares 26 descriptor sets obtained by applying well-known numerical description concepts to an established dataset of 38 DNA promoter sequences. The suitability of the descriptor sets was evaluated by computing partial least squares regression models and assessing the model accuracy. We conclude that the descriptive power derives mainly from positional information rather than from explicitly incorporated physico-chemical information, since a sufficient amount of implicit physico-chemical information is already encoded in the nucleobase classification. The regression models especially benefited from employing the information that is encoded in the sequential and structural neighborhood of the nucleobases. Thus, the analyses of n-grams (short fragments of length n) suggested that they are valuable descriptors for DNA target interactions. A mixed n-gram descriptor set thereby yielded the best description of the promoter sequences. The corresponding regression model was checked and found to be plausible as it was able to reproduce the characteristic binding motifs of promoter sequences to a reasonable degree. As most functional nucleic acids are based on the principle of molecular recognition, the findings are not restricted to promoter sequences, but can rather be transferred to other kinds of functional nucleic acids. Thus, the concepts presented in this study could provide advantages for future nucleic acid-based technologies, like biosensing, therapeutics and molecular imaging. Copyright © 2015 Elsevier Inc. All rights reserved.
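
    The n-gram descriptor idea can be reproduced in outline by counting short subsequences and regressing the target property on those counts with partial least squares. This is a minimal sketch under assumed toy data (the sequences and response values are placeholders, not from the study):

        # Sketch: describe DNA sequences by character n-gram counts and fit a PLS regression.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.cross_decomposition import PLSRegression

        sequences = ["TATAATGCGCGC", "GCGCTATAATAT", "ATATGCGCATAT", "CGCGATATCGCG"]
        y = [0.8, 0.3, 0.5, 0.1]  # placeholder response values

        # Character n-grams of length 2-3 stand in for the mixed n-gram descriptor set.
        vec = CountVectorizer(analyzer="char", ngram_range=(2, 3), lowercase=False)
        X = vec.fit_transform(sequences).toarray()

        pls = PLSRegression(n_components=2).fit(X, y)
        print(pls.predict(X))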

  13. Survival Regression Modeling Strategies in CVD Prediction.

    PubMed

    Barkhordari, Mahnaz; Padyab, Mojgan; Sardarinia, Mahsa; Hadaegh, Farzad; Azizi, Fereidoun; Bozorgmanesh, Mohammadreza

    2016-04-01

    A fundamental part of prevention is prediction. Potential predictors are the sine qua non of prediction models. However, whether incorporating novel predictors into prediction models can be directly translated into added predictive value remains an area of dispute. The difference between the predictive power of a predictive model with (enhanced model) and without (baseline model) a certain predictor is generally regarded as an indicator of the predictive value added by that predictor. Indices such as discrimination and calibration have long been used in this regard. Recently, the use of added predictive value has been suggested while comparing the predictive performances of the predictive models with and without novel biomarkers. User-friendly statistical software capable of implementing novel statistical procedures is conspicuously lacking. This shortcoming has restricted implementation of such novel model assessment methods. We aimed to construct Stata commands to help researchers obtain the aforementioned statistical indices. We have written Stata commands that are intended to help researchers obtain the following. 1, Nam-D'Agostino χ2 goodness-of-fit test; 2, Cut point-free and cut point-based net reclassification improvement index (NRI), relative and absolute integrated discriminatory improvement index (IDI), and survival-based regression analyses. We applied the commands to real data on women participating in the Tehran lipid and glucose study (TLGS) to examine whether information relating to a family history of premature cardiovascular disease (CVD), waist circumference, and fasting plasma glucose can improve the predictive performance of Framingham's general CVD risk algorithm. The command is adpredsurv for survival models. Herein we have described the Stata package "adpredsurv" for calculation of the Nam-D'Agostino χ2 goodness-of-fit test as well as cut point-free and cut point-based NRI, relative and absolute IDI, and survival-based regression analyses. We hope this work encourages the use of novel methods in examining the predictive capacity of the emerging plethora of novel biomarkers.
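
    The Stata commands themselves are not reproduced here; the sketch below only illustrates the underlying comparison of a baseline and an enhanced survival model in Python with the lifelines package (assumed available), using Harrell's concordance index for discrimination. NRI and IDI require dedicated routines and are omitted, and the column names are hypothetical.

        # Sketch: compare discrimination of a baseline and an enhanced Cox model.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.read_csv("cvd_cohort.csv")  # hypothetical columns: time, event, age, sbp,
                                            # smoker, family_history, waist, fpg

        base_cols = ["time", "event", "age", "sbp", "smoker"]
        enh_cols = base_cols + ["family_history", "waist", "fpg"]

        baseline = CoxPHFitter().fit(df[base_cols], duration_col="time", event_col="event")
        enhanced = CoxPHFitter().fit(df[enh_cols], duration_col="time", event_col="event")

        print("baseline C-index:", baseline.concordance_index_)
        print("enhanced C-index:", enhanced.concordance_index_)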

  14. History of falls, gait, balance, and fall risks in older cancer survivors living in the community.

    PubMed

    Huang, Min H; Shilling, Tracy; Miller, Kara A; Smith, Kristin; LaVictoire, Kayle

    2015-01-01

    Older cancer survivors may be predisposed to falls because cancer-related sequelae affect virtually all body systems. The use of a history of falls, gait speed, and balance tests to assess fall risks remains to be investigated in this population. This study examined the relationship of previous falls, gait, and balance with falls in community-dwelling older cancer survivors. At baseline, demographics, health information, and the history of falls in the past year were obtained through interviewing. Participants performed tests including gait speed, the Balance Evaluation Systems Test, and the short version of the Activities-specific Balance Confidence scale. Falls were tracked by mailing of monthly reports for 6 months. A "faller" was a person with ≥1 fall during follow-up. Univariate analyses, including independent sample t-tests and Fisher's exact tests, compared baseline demographics, gait speed, and balance between fallers and non-fallers. For univariate analyses, Bonferroni correction was applied for multiple comparisons. Baseline variables with P<0.15 were included in a forward logistic regression model to identify factors predictive of falls with age as a covariate. Sensitivity and specificity of each predictor of falls in the model were calculated. Significance level for the regression analysis was P<0.05. During follow-up, 59% of participants had one or more falls. Baseline demographics, health information, history of falls, gait speed, and balance tests did not differ significantly between fallers and non-fallers. Forward logistic regression revealed that a history of falls was a significant predictor of falls in the final model (odds ratio =6.81; 95% confidence interval =1.594-29.074) (P<0.05). Sensitivity and specificity for correctly identifying a faller using the positive history of falls were 74% and 69%, respectively. Current findings suggested that for community-dwelling older cancer survivors with mixed diagnoses, asking about the history of falls may help detect individuals at risk of falling.
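
    A forward logistic regression of the kind described (univariable screening at P<0.15, then forward entry with age retained as a covariate) can be sketched as follows; the variable names are hypothetical and the entry criterion here is a simple likelihood-ratio p-value rather than the exact procedure used in the study.

        # Sketch: forward selection into a logistic regression model, with age forced in.
        import pandas as pd
        import statsmodels.api as sm
        from scipy import stats

        def forward_logit(data, outcome, candidates, forced=("age",), enter_p=0.05):
            selected = list(forced)
            improved = True
            while improved:
                improved = False
                base = sm.Logit(data[outcome], sm.add_constant(data[selected])).fit(disp=0)
                best_p, best_var = enter_p, None
                for var in [c for c in candidates if c not in selected]:
                    full = sm.Logit(data[outcome],
                                    sm.add_constant(data[selected + [var]])).fit(disp=0)
                    lr = 2 * (full.llf - base.llf)          # likelihood-ratio statistic
                    p = stats.chi2.sf(lr, df=1)
                    if p < best_p:
                        best_p, best_var = p, var
                if best_var is not None:
                    selected.append(best_var)
                    improved = True
            return sm.Logit(data[outcome], sm.add_constant(data[selected])).fit(disp=0)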

  15. History of falls, gait, balance, and fall risks in older cancer survivors living in the community

    PubMed Central

    Huang, Min H; Shilling, Tracy; Miller, Kara A; Smith, Kristin; LaVictoire, Kayle

    2015-01-01

    Older cancer survivors may be predisposed to falls because cancer-related sequelae affect virtually all body systems. The use of a history of falls, gait speed, and balance tests to assess fall risks remains to be investigated in this population. This study examined the relationship of previous falls, gait, and balance with falls in community-dwelling older cancer survivors. At baseline, demographics, health information, and the history of falls in the past year were obtained through interviewing. Participants performed tests including gait speed, the Balance Evaluation Systems Test, and the short version of the Activities-specific Balance Confidence scale. Falls were tracked by mailing of monthly reports for 6 months. A “faller” was a person with ≥1 fall during follow-up. Univariate analyses, including independent sample t-tests and Fisher’s exact tests, compared baseline demographics, gait speed, and balance between fallers and non-fallers. For univariate analyses, Bonferroni correction was applied for multiple comparisons. Baseline variables with P<0.15 were included in a forward logistic regression model to identify factors predictive of falls with age as a covariate. Sensitivity and specificity of each predictor of falls in the model were calculated. Significance level for the regression analysis was P<0.05. During follow-up, 59% of participants had one or more falls. Baseline demographics, health information, history of falls, gait speed, and balance tests did not differ significantly between fallers and non-fallers. Forward logistic regression revealed that a history of falls was a significant predictor of falls in the final model (odds ratio =6.81; 95% confidence interval =1.594–29.074) (P<0.05). Sensitivity and specificity for correctly identifying a faller using the positive history of falls were 74% and 69%, respectively. Current findings suggested that for community-dwelling older cancer survivors with mixed diagnoses, asking about the history of falls may help detect individuals at risk of falling. PMID:26425079

  16. A Comparison between the Use of Beta Weights and Structure Coefficients in Interpreting Regression Results

    ERIC Educational Resources Information Center

    Tong, Fuhui

    2006-01-01

    Background: An extensive body of research has favored the use of regression over other parametric analyses that are based on OVA. In the case of noteworthy regression results, researchers tend to explore the magnitude of beta weights for the respective predictors. Purpose: The purpose of this paper is to examine both beta weights and structure…
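
    The distinction the paper draws can be made concrete: beta weights are the standardized regression coefficients, while a structure coefficient is the correlation between a predictor and the predicted scores. A minimal sketch with synthetic, correlated predictors (all numbers are illustrative):

        # Sketch: beta weights versus structure coefficients for the same regression.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 3))
        X[:, 1] += 0.8 * X[:, 0]                    # make two predictors correlated
        y = 1.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200)

        Xz = (X - X.mean(0)) / X.std(0)             # standardize predictors and outcome
        yz = (y - y.mean()) / y.std()

        model = LinearRegression().fit(Xz, yz)
        beta_weights = model.coef_                  # standardized betas
        y_hat = model.predict(Xz)
        structure = [np.corrcoef(Xz[:, j], y_hat)[0, 1] for j in range(Xz.shape[1])]

        print("beta weights:          ", np.round(beta_weights, 3))
        print("structure coefficients:", np.round(structure, 3))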

  17. Applying additive logistic regression to data derived from sensors monitoring behavioral and physiological characteristics of dairy cows to detect lameness.

    PubMed

    Kamphuis, C; Frank, E; Burke, J K; Verkerk, G A; Jago, J G

    2013-01-01

    The hypothesis was that sensors currently available on farm that monitor behavioral and physiological characteristics have potential for the detection of lameness in dairy cows. This was tested by applying additive logistic regression to variables derived from sensor data. Data were collected between November 2010 and June 2012 on 5 commercial pasture-based dairy farms. Sensor data from weigh scales (liveweight), pedometers (activity), and milk meters (milking order, unadjusted and adjusted milk yield in the first 2 min of milking, total milk yield, and milking duration) were collected at every milking from 4,904 cows. Lameness events were recorded by farmers who were trained in detecting lameness before the study commenced. A total of 318 lameness events affecting 292 cows were available for statistical analyses. For each lameness event, the lame cow's sensor data for a time period of 14 d before observation date were randomly matched by farm and date to 10 healthy cows (i.e., cows that were not lame and had no other health event recorded for the matched time period). Sensor data relating to the 14-d time periods were used for developing univariable (using one source of sensor data) and multivariable (using multiple sources of sensor data) models. Model development involved the use of additive logistic regression by applying the LogitBoost algorithm with a regression tree as base learner. The model's output was a probability estimate for lameness, given the sensor data collected during the 14-d time period. Models were validated using leave-one-farm-out cross-validation and, as a result of this validation, each cow in the data set (318 lame and 3,180 nonlame cows) received a probability estimate for lameness. Based on the area under the curve (AUC), results indicated that univariable models had low predictive potential, with the highest AUC values found for liveweight (AUC=0.66), activity (AUC=0.60), and milking order (AUC=0.65). Combining these 3 sensors improved AUC to 0.74. Detection performance of this combined model varied between farms but it consistently and significantly outperformed univariable models across farms at a fixed specificity of 80%. Still, detection performance was not high enough to be implemented in practice on large, pasture-based dairy farms. Future research may improve performance by developing variables based on sensor data of liveweight, activity, and milking order, but that better describe changes in sensor data patterns when cows go lame. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
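
    The combination of additive logistic regression (LogitBoost) with leave-one-farm-out cross-validation can be approximated with scikit-learn, using gradient boosting with a logistic loss as a stand-in for the LogitBoost algorithm; farm identifiers define the cross-validation groups. This is a sketch only, and the feature names are placeholders rather than the variables used in the study.

        # Sketch: boosted additive logistic model evaluated with leave-one-farm-out CV.
        import pandas as pd
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.model_selection import LeaveOneGroupOut, cross_val_predict
        from sklearn.metrics import roc_auc_score

        df = pd.read_csv("lameness.csv")             # hypothetical sensor-derived features
        X = df[["liveweight_trend", "activity_trend", "milking_order_trend"]]
        y = df["lame"]
        groups = df["farm"]

        clf = GradientBoostingClassifier()           # additive model with tree base learners
        proba = cross_val_predict(clf, X, y, groups=groups, cv=LeaveOneGroupOut(),
                                  method="predict_proba")[:, 1]
        print("cross-validated AUC:", roc_auc_score(y, proba))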

  18. Conditional Density Estimation with HMM Based Support Vector Machines

    NASA Astrophysics Data System (ADS)

    Hu, Fasheng; Liu, Zhenqiu; Jia, Chunxin; Chen, Dechang

    Conditional density estimation is very important in financial engineering, risk management, and other engineering computing problems. However, most regression models have a latent assumption that the probability density is a Gaussian distribution, which is not necessarily true in many real-life applications. In this paper, we give a framework to estimate or predict the conditional density mixture dynamically. By combining the Input-Output HMM with SVM regression and building an SVM model in each state of the HMM, we can estimate a conditional density mixture instead of a single Gaussian. With an SVM in each node, this model can be applied not only to regression but also to classification. We applied this model to denoise ECG data. The proposed method has the potential to be applied to other time series such as stock market return predictions.
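
    A rough sketch of the state-dependent SVM regression idea: a Gaussian HMM (hmmlearn, assumed available) segments the series into hidden states, and a separate SVR is fitted within each state. The full framework uses an Input-Output HMM to obtain a conditional density mixture rather than the point predictions shown here, and the data are synthetic.

        # Sketch: hidden states from an HMM, with one SVR fitted per state.
        import numpy as np
        from hmmlearn.hmm import GaussianHMM
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        series = np.cumsum(rng.normal(size=500))          # placeholder time series
        X = series[:-1].reshape(-1, 1)                    # lagged value as input
        y = series[1:]                                    # next value as target

        hmm = GaussianHMM(n_components=2, n_iter=100, random_state=0).fit(X)
        states = hmm.predict(X)

        models = {s: SVR(kernel="rbf").fit(X[states == s], y[states == s])
                  for s in np.unique(states)}

        # Predict each point with the SVR belonging to its inferred state.
        pred = np.array([models[s].predict(x.reshape(1, -1))[0]
                         for s, x in zip(states, X)])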

  19. Analysis of Learning Curve Fitting Techniques.

    DTIC Science & Technology

    1987-09-01

    1986. 15. Neter, John and others. Applied Linear Regression Models. Homewood IL: Irwin, 19-33. 16. SAS User’s Guide: Basics, Version 5 Edition. SAS... Linear Regression Techniques (15:23-52). Random errors are assumed to be normally distributed when using ordinary least-squares, according to Johnston...lot estimated by the improvement curve formula. For a more detailed explanation of the ordinary least-squares technique, see Neter, et al., Applied

  20. Accuracy assessment of linear spectral mixture model due to terrain undulation

    NASA Astrophysics Data System (ADS)

    Wang, Tianxing; Chen, Songlin; Ma, Ya

    2008-12-01

    Mixture spectra are common in remote sensing due to the limitations of spatial resolution and the heterogeneity of the land surface. During the past 30 years, many subpixel models have been developed to investigate the information within mixture pixels. The linear spectral mixture model (LSMM) is a simpler and more general subpixel model. LSMM, also known as spectral mixture analysis, is a widely used procedure to determine the proportion of endmembers (constituent materials) within a pixel based on the endmembers' spectral characteristics. The unmixing accuracy of LSMM is restricted by a variety of factors, but current research on LSMM focuses mostly on the appraisal of nonlinear effects within the model and on the techniques used to select endmembers; unfortunately, the environmental conditions of the study area that could affect the unmixing accuracy, such as atmospheric scattering and terrain undulation, are not studied. This paper focuses on the accuracy uncertainty of LSMM resulting from terrain undulation. An ASTER dataset was chosen and the C terrain correction algorithm was applied to it. Based on this, fractional abundances for different cover types were extracted from both pre- and post-C terrain illumination corrected ASTER using LSMM. Simultaneously, regression analyses and an IKONOS image were introduced to assess the unmixing accuracy. Results showed that terrain undulation could dramatically constrain the application of LSMM in mountainous areas. Specifically, for vegetation abundances, an improvement in unmixing accuracy (R2) of 17.6% (regression against NDVI) and 18.6% (regression against MVI), respectively, was achieved by removing terrain undulation. Overall, this study indicated in a quantitative way that effective removal or minimization of terrain illumination effects is essential for applying LSMM. This paper could also provide a new instance for LSMM applications in mountainous areas. In addition, the methods employed in this study could be effectively used to evaluate different algorithms of terrain undulation correction for further study.
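
    At its core the LSMM inverts a linear mixing equation, observed pixel spectrum ≈ endmember matrix × fractional abundances, typically with a non-negativity (and optionally sum-to-one) constraint. A minimal sketch with SciPy's non-negative least squares, using made-up endmember spectra:

        # Sketch: linear spectral mixture model solved per pixel by non-negative least squares.
        import numpy as np
        from scipy.optimize import nnls

        # Endmember spectra (rows = bands, columns = endmembers); values are illustrative only.
        E = np.array([[0.05, 0.30, 0.20],
                      [0.08, 0.35, 0.25],
                      [0.40, 0.45, 0.30],
                      [0.50, 0.25, 0.35]])

        pixel = np.array([0.20, 0.24, 0.39, 0.38])   # observed mixed spectrum

        abund, residual = nnls(E, pixel)
        abund = abund / abund.sum()                  # normalise to fractional abundances
        print("fractional abundances:", np.round(abund, 3))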

  1. The Spatial Distribution of Hepatitis C Virus Infections and Associated Determinants--An Application of a Geographically Weighted Poisson Regression for Evidence-Based Screening Interventions in Hotspots.

    PubMed

    Kauhl, Boris; Heil, Jeanne; Hoebe, Christian J P A; Schweikart, Jürgen; Krafft, Thomas; Dukers-Muijrers, Nicole H T M

    2015-01-01

    Hepatitis C Virus (HCV) infections are a major cause for liver diseases. A large proportion of these infections remain hidden to care due to its mostly asymptomatic nature. Population-based screening and screening targeted on behavioural risk groups had not proven to be effective in revealing these hidden infections. Therefore, more practically applicable approaches to target screenings are necessary. Geographic Information Systems (GIS) and spatial epidemiological methods may provide a more feasible basis for screening interventions through the identification of hotspots as well as demographic and socio-economic determinants. Analysed data included all HCV tests (n = 23,800) performed in the southern area of the Netherlands between 2002-2008. HCV positivity was defined as a positive immunoblot or polymerase chain reaction test. Population data were matched to the geocoded HCV test data. The spatial scan statistic was applied to detect areas with elevated HCV risk. We applied global regression models to determine associations between population-based determinants and HCV risk. Geographically weighted Poisson regression models were then constructed to determine local differences of the association between HCV risk and population-based determinants. HCV prevalence varied geographically and clustered in urban areas. The main population at risk were middle-aged males, non-western immigrants and divorced persons. Socio-economic determinants consisted of one-person households, persons with low income and mean property value. However, the association between HCV risk and demographic as well as socio-economic determinants displayed strong regional and intra-urban differences. The detection of local hotspots in our study may serve as a basis for prioritization of areas for future targeted interventions. Demographic and socio-economic determinants associated with HCV risk show regional differences underlining that a one-size-fits-all approach even within small geographic areas may not be appropriate. Future screening interventions need to consider the spatially varying association between HCV risk and associated demographic and socio-economic determinants.

  2. Partial F-tests with multiply imputed data in the linear regression framework via coefficient of determination.

    PubMed

    Chaurasia, Ashok; Harel, Ofer

    2015-02-10

    Tests for regression coefficients such as global, local, and partial F-tests are common in applied research. In the framework of multiple imputation, there are several papers addressing tests for regression coefficients. However, for simultaneous hypothesis testing, the existing methods are computationally intensive because they involve calculation with vectors and (inversion of) matrices. In this paper, we propose a simple method based on the scalar entity, coefficient of determination, to perform (global, local, and partial) F-tests with multiply imputed data. The proposed method is evaluated using simulated data and applied to suicide prevention data. Copyright © 2014 John Wiley & Sons, Ltd.
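
    The scalar route via the coefficient of determination rests on the standard partial-F identity, F = ((R²_full − R²_reduced)/q) / ((1 − R²_full)/(n − p − 1)), where q predictors are tested and p is the number of predictors in the full model; under multiple imputation the relevant quantities are pooled across the imputed data sets. A sketch of the complete-data calculation only (the numbers are illustrative):

        # Sketch: partial F-test for q added predictors, computed from R-squared values.
        from scipy import stats

        def partial_f_test(r2_full, r2_reduced, n, p_full, q):
            """n observations, p_full predictors in the full model, q of them tested."""
            f = ((r2_full - r2_reduced) / q) / ((1.0 - r2_full) / (n - p_full - 1))
            p_value = stats.f.sf(f, q, n - p_full - 1)
            return f, p_value

        print(partial_f_test(r2_full=0.42, r2_reduced=0.35, n=150, p_full=6, q=2))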

  3. The effect of a cryotherapy gel wrap on the microcirculation of skin affected by chronic venous disorders.

    PubMed

    Kelechi, Teresa J; Mueller, Martina; Zapka, Jane G; King, Dana E

    2011-11-01

    The aim of this randomized clinical trial was to investigate a cryotherapy (cooling) gel wrap applied to lower leg skin affected by chronic venous disorders to determine whether therapeutic cooling improves skin microcirculation. Chronic venous disorders are under-recognized vascular health problems that result in severe skin damage and ulcerations of the lower legs. Impaired skin microcirculation contributes to venous leg ulcer development, thus new prevention therapies should address the microcirculation to prevent venous leg ulcers. Sixty participants (n = 30 per group) were randomized to receive one of two daily 30-minute interventions for four weeks. The treatment group applied the cryotherapy gel wrap around the affected lower leg skin, or compression and elevated the legs on a special pillow each evening at bedtime. The standard care group wore compression and elevated the legs only. Laboratory pre- and post-measures included microcirculation measures of skin temperature with a thermistor, blood flow with a laser Doppler flowmeter, and venous refill time with a photoplethysmograph. Data were collected between 2008 and 2009 and analysed using descriptive statistics, paired t-tests or Wilcoxon signed ranks tests, logistic regression analyses, and mixed model analyses. Fifty-seven participants (treatment = 28; standard care = 29) completed the study. The mean age was 62 years, 70% female, 50% African American. In the final adjusted model, there was a statistically significant decrease in blood flow between the two groups (-6.2[-11.8; -0.6], P = 0.03). No statistically significant differences were noted in temperature or venous refill time. Study findings suggest that cryotherapy improves blood flow by slowing movement within the microcirculation and thus might potentially provide a therapeutic benefit to prevent leg ulcers. © 2011 Blackwell Publishing Ltd.

  4. The effect of a cryotherapy gel wrap on the microcirculation of skin affected by Chronic Venous Disorders

    PubMed Central

    Mueller, Martina; Zapka, Jane G.; King, Dana E.

    2011-01-01

    Aim This randomized clinical trial was conducted 2008 – 2009 to investigate a cryotherapy (cooling) gel wrap applied to lower leg skin affected by chronic venous disorders to determine whether therapeutic cooling improves skin microcirculation. Impaired skin microcirculation contributes to venous leg ulcer development, thus new prevention therapies should address the microcirculation to prevent venous leg ulcers. Data Sources Sixty participants (n = 30 per group) were randomized to receive one of two daily 30-minute interventions for four weeks. The treatment group applied the cryotherapy gel wrap around the affected lower leg skin, or compression and elevated the legs on a special pillow each evening at bedtime. The standard care group wore compression and elevated the legs only. Laboratory pre- and post-measures included microcirculation measures of skin temperature with a thermistor, blood flow with a laser Doppler flowmeter, and venous refill time with a photoplethysmograph. Review methods Data were analysed using descriptive statistics, paired t-tests or Wilcoxon signed ranks tests, logistic regression analyses, and mixed model analyses. Results Fifty-seven participants (treatment = 28; standard care = 29) completed the study. The mean age was 62 years, 70% female, 50% African American. In the final adjusted model, there was a statistically significant decrease in blood flow between the two groups (−6.2[−11.8; −0.6], P = 0.03). No statistically significant differences were noted in temperature or venous refill time. Conclusion Study findings suggest that cryotherapy improves blood flow by slowing movement within the microcirculation and thus might potentially provide a therapeutic benefit to prevent leg ulcers. PMID:21592186

  5. Which kind of psychometrics is adequate for patient satisfaction questionnaires?

    PubMed

    Konerding, Uwe

    2016-01-01

    The construction and psychometric analysis of patient satisfaction questionnaires are discussed. The discussion is based upon the classification of multi-item questionnaires into scales or indices. Scales consist of items that describe the effects of the latent psychological variable to be measured, and indices consist of items that describe the causes of this variable. Whether patient satisfaction questionnaires should be constructed and analyzed as scales or as indices depends upon the purpose for which these questionnaires are required. If the final aim is improving care with regard to patients' preferences, then these questionnaires should be constructed and analyzed as indices. This implies two requirements: 1) items for patient satisfaction questionnaires should be selected in such a way that the universe of possible causes of patient satisfaction is covered optimally and 2) Cronbach's alpha, principal component analysis, exploratory factor analysis, confirmatory factor analysis, and analyses with models from item response theory, such as the Rasch Model, should not be applied for psychometric analyses. Instead, multivariate regression analyses with a direct rating of patient satisfaction as the dependent variable and the individual questionnaire items as independent variables should be performed. The coefficients produced by such an analysis can be applied for selecting the best items and for weighting the selected items when a sum score is determined. The lower boundaries of the validity of the unweighted and the weighted sum scores can be estimated by their correlations with the direct satisfaction rating. While the first requirement is fulfilled in the majority of the previous patient satisfaction questionnaires, the second one deviates from previous practice. Hence, if patient satisfaction is actually measured with the final aim of improving care with regard to patients' preferences, then future practice should be changed so that the second requirement is also fulfilled.
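
    The index construction recommended here amounts to regressing a direct overall-satisfaction rating on the individual item scores and using the coefficients as item weights; the correlation of the resulting sum score with the direct rating then gives a lower boundary of its validity. A minimal sketch with hypothetical column names:

        # Sketch: weight satisfaction items by their multivariable regression coefficients.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        df = pd.read_csv("satisfaction.csv")   # hypothetical: item1..item10, overall
        items = [c for c in df.columns if c.startswith("item")]

        ols = sm.OLS(df["overall"], sm.add_constant(df[items])).fit()
        weights = ols.params[items]            # item weights from the regression

        weighted_score = df[items].mul(weights).sum(axis=1)
        unweighted_score = df[items].sum(axis=1)

        print("validity (weighted):  ", np.corrcoef(weighted_score, df["overall"])[0, 1])
        print("validity (unweighted):", np.corrcoef(unweighted_score, df["overall"])[0, 1])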

  6. ANNUAL PATIENT TIME COSTS ASSOCIATED WITH MEDICAL CARE AMONG CANCER SURVIVORS IN THE UNITED STATES

    PubMed Central

    Yabroff, K. Robin; Guy, Gery P.; Ekwueme, Donatus U.; McNeel, Timothy; Rozjabek, Heather M.; Dowling, Emily; Li, Chunyu; Virgo, Katherine S.

    2014-01-01

    Background Although patient time costs are recommended for inclusion in cost-effectiveness analyses, these data are not routinely collected. We used nationally representative data and a medical service-based approach to estimate annual patient time costs among cancer survivors. Methods We identified 6,699 cancer survivors and 86,412 individuals without a cancer history ≥ 18 years from the 2008–2011 Medical Expenditure Panel Survey (MEPS). Service use was categorized as hospitalizations, emergency room (ER) use, provider visits, ambulatory surgery, chemotherapy, and radiation therapy. Service time estimates were applied to frequencies for each service category and the U.S. median wage rate in 2011 was used to value time. We evaluated the association between cancer survivorship and service use frequencies and patient time costs with multivariable regression models, stratified by age group (18–64 and 65+ years). Sensitivity analyses evaluated different approaches for valuing time. Results Cancer survivors were more likely to have hospitalizations, ER visits, ambulatory surgeries, and provider visits in the past year than individuals without a cancer history in adjusted analyses (p<0.05). Annual patient time was higher for cancer survivors than individuals without a cancer history among those ages 18–64 (30.2 vs. 13.6 hours; p<0.001) and ages 65+ (55.1 vs. 36.6 hours; p<0.001), as were annual patient time costs (18–64 years: $500 vs. $226; p<0.001 and 65+ years: $913 vs. $607; p<0.001). Conclusions Cancer survivors had greater annual medical service use and patient time costs than individuals without a cancer history. This medical service-based approach for estimating annual time costs can also be applied to other conditions. PMID:24926706

  7. Annual patient time costs associated with medical care among cancer survivors in the United States.

    PubMed

    Yabroff, K Robin; Guy, Gery P; Ekwueme, Donatus U; McNeel, Timothy; Rozjabek, Heather M; Dowling, Emily; Li, Chunyu; Virgo, Katherine S

    2014-07-01

    Although patient time costs are recommended for inclusion in cost-effectiveness analyses, these data are not routinely collected. We used nationally representative data and a medical service-based approach to estimate the annual patient time costs among cancer survivors. We identified 6699 adult cancer survivors and 86,412 individuals without a cancer history, aged 18 years or older, from the 2008-2011 Medical Expenditure Panel Survey (MEPS). Service use was categorized as hospitalizations, emergency room use, provider visits, ambulatory surgery, chemotherapy, and radiation therapy. Service time estimates were applied to frequencies for each service category and the US median wage rate in 2011 was used to value time. We evaluated the association between cancer survivorship and service use frequencies and patient time costs with multivariable regression models, stratified by age group (18-64 and 65+ y). Sensitivity analyses evaluated different approaches for valuing time. Cancer survivors were more likely to have hospitalizations, emergency room visits, ambulatory surgeries, and provider visits in the past year than individuals without a cancer history in adjusted analyses (P<0.05). Annual patient time was higher for cancer survivors than individuals without a cancer history among those aged 18-64 years (30.2 vs. 13.6 h; P<0.001) and 65+ years (55.1 vs. 36.6 h; P<0.001), as were annual patient time costs (18-64 y: $500 vs. $226; P<0.001 and 65+ y: $913 vs. $607; P<0.001). Cancer survivors had greater annual medical service use and patient time costs than individuals without a cancer history. This medical service-based approach for estimating annual time costs can also be applied to other conditions.

  8. Automatic energy expenditure measurement for health science.

    PubMed

    Catal, Cagatay; Akbulut, Akhan

    2018-04-01

    It is crucial to predict the human energy expenditure in any sports activity and health science application accurately to investigate the impact of the activity. However, measurement of the real energy expenditure is not a trivial task and involves complex steps. The objective of this work is to improve the performance of existing estimation models of energy expenditure by using machine learning algorithms and data from several different sensors, and to provide this estimation service in a cloud-based platform. In this study, we used input data such as breathing rate and heart rate from three sensors. Inputs are received from a web form and sent to the web service, which applies a regression model on the Azure cloud platform. During the experiments, we assessed several machine learning models based on regression methods. Our experimental results showed that our novel model, which applies Boosted Decision Tree Regression in conjunction with the median aggregation technique, provides the best result among the other five regression algorithms. This cloud-based energy expenditure system, which uses a web service, showed that cloud computing technology is a great opportunity to develop estimation systems, and the new model which applies Boosted Decision Tree Regression with median aggregation provides remarkable results. Copyright © 2018 Elsevier B.V. All rights reserved.
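
    A rough sketch of the described pipeline, per-window median aggregation of raw sensor signals followed by boosted decision-tree regression, using scikit-learn's gradient boosting regressor as a stand-in for the Azure module; the file and column names are placeholders.

        # Sketch: median-aggregate sensor signals per window, then fit a boosted tree regressor.
        import pandas as pd
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_absolute_error

        raw = pd.read_csv("sensor_log.csv")   # hypothetical: subject, window, breathing_rate,
                                              # heart_rate, accel, energy_expenditure

        # Median aggregation of each sensor signal within a measurement window.
        agg = raw.groupby(["subject", "window"]).median(numeric_only=True).reset_index()

        X = agg[["breathing_rate", "heart_rate", "accel"]]
        y = agg["energy_expenditure"]

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
        print("MAE:", mean_absolute_error(y_te, model.predict(X_te)))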

  9. Determining factors influencing survival of breast cancer by fuzzy logistic regression model.

    PubMed

    Nikbakht, Roya; Bahrampour, Abbas

    2017-01-01

    A fuzzy logistic regression model can be used for determining influential factors of disease. This study explores the important predictive survival factors of breast cancer patients. We used breast cancer data collected by the cancer registry of Kerman University of Medical Sciences during the period 2000-2007. Variables such as morphology, grade, age, and treatments (surgery, radiotherapy, and chemotherapy) were applied in the fuzzy logistic regression model. Performance of the model was determined in terms of mean degree of membership (MDM). The study results showed that almost 41% of patients were in the neoplasm and malignant group and more than two-thirds of them were still alive after 5-year follow-up. Based on the fuzzy logistic model, the most important factors influencing survival were chemotherapy, morphology, and radiotherapy, respectively. Furthermore, the MDM criterion shows that the fuzzy logistic regression has a good fit to the data (MDM = 0.86). The fuzzy logistic regression model showed that chemotherapy is more important than radiotherapy for the survival of patients with breast cancer. In addition, another ability of this model is calculating possibilistic odds of survival in cancer patients. The results of this study can be applied in clinical research. Moreover, few studies have applied fuzzy logistic models, and we recommend using this model in various research areas.

  10. Piecewise exponential survival times and analysis of case-cohort data.

    PubMed

    Li, Yan; Gail, Mitchell H; Preston, Dale L; Graubard, Barry I; Lubin, Jay H

    2012-06-15

    Case-cohort designs select a random sample of a cohort to be used as control with cases arising from the follow-up of the cohort. Analyses of case-cohort studies with time-varying exposures that use Cox partial likelihood methods can be computer intensive. We propose a piecewise-exponential approach where Poisson regression model parameters are estimated from a pseudolikelihood and the corresponding variances are derived by applying Taylor linearization methods that are used in survey research. The proposed approach is evaluated using Monte Carlo simulations. An illustration is provided using data from the Alpha-Tocopherol, Beta-Carotene Cancer Prevention Study of male smokers in Finland, where a case-cohort study of serum glucose level and pancreatic cancer was analyzed. Copyright © 2012 John Wiley & Sons, Ltd.
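
    The piecewise-exponential device is to split each subject's follow-up into intervals, treat the event indicator in each interval as a Poisson count, and include log person-time as an offset; the case-cohort extension replaces the likelihood with a weighted pseudolikelihood and linearized variances, which this sketch does not reproduce. Column names are hypothetical.

        # Sketch: piecewise-exponential survival model as a Poisson regression with an offset.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        d = pd.read_csv("person_intervals.csv")   # one row per subject x interval:
                                                  # event, person_time, interval, glucose

        model = smf.glm("event ~ C(interval) + glucose", data=d,
                        family=sm.families.Poisson(),
                        offset=np.log(d["person_time"])).fit()
        print(model.summary())   # exp(coef) for glucose approximates the hazard ratio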

  11. The impact of outcome orientation and justice concerns on tax compliance: the role of taxpayers' identity.

    PubMed

    Wenzel, Michael

    2002-08-01

    Previous research has yielded inconsistent evidence for the impact of justice perceptions on tax compliance. This article suggests a more differentiated view on the basis of 2 congenial theories of procedural and distributive justice. The group-value model and a categorization approach argue that taxpayers are more concerned about justice and less about personal outcomes when they identify strongly with the inclusive category within which procedures and distributions apply. Regression analyses of survey data from 2,040 Australian citizens showed that 2 forms of tax compliance (pay-income reporting and tax minimization) were determined by self-interest variables. For 2 other forms (nonpay income and deductions), inclusive identification had an additional effect and moderated the effects of self-interest and justice variables as predicted.

  12. Optimisation of steam distillation extraction oil from onion by response surface methodology and its chemical composition.

    PubMed

    Wang, Zhao Dan; Li, Li Hua; Xia, Hui; Wang, Feng; Yang, Li Gang; Wang, Shao Kang; Sun, Gui Ju

    2018-01-01

    Oil extraction from onion was performed by steam distillation. Response surface methodology was applied to evaluate the effects of the ratio of water to raw material, extraction time, zymolysis temperature and distillation time on the yield of onion oil. The maximum extraction yield (1.779%) was obtained under the following conditions: the ratio of water to raw material was 1, extraction time was 2.5 h, zymolysis temperature was 36° and distillation time was 2.6 h. The experimental values agreed well with those predicted by the regression model. The chemical composition of the extracted onion oil under the optimum conditions was analysed by gas chromatography-mass spectrometry. The results showed that sulphur compounds, like alkanes, sulphide, alkenes, ester and alcohol, were the major components of onion oil.

  13. Effects of teacher autonomy support and students' autonomous motivation on learning in physical education.

    PubMed

    Shen, Bo; McCaughtry, Nate; Martin, Jeffrey; Fahlman, Mariane

    2009-03-01

    This study applied self-determination theory to investigate the effects of students' autonomous motivation and their perceptions of teacher autonomy support on need satisfaction adjustment, learning achievement, and cardiorespiratory fitness over a 4-month personal conditioning unit. Participants were 253 urban adolescents (121 girls and 132 boys, ages = 12-14 years). Based on a series of multiple regression analyses, perceived autonomy support by teachers significantly predicted students' need satisfaction adjustment and led to learning achievement, especially for students who were not autonomously motivated to learn in physical education. In turn, being more autonomous was directly associated with cardiorespiratory fitness enhancement. The findings suggest that shifts in teaching approaches toward providing more support for students' autonomy and active involvement hold promise for enhancing learning.

  14. Development, optimization and validation of gas chromatographic fingerprinting of Brazilian commercial diesel fuel for quality control.

    PubMed

    dos Santos, Bruno César Diniz Brito; Flumignan, Danilo Luiz; de Oliveira, José Eduardo

    2012-10-01

    A three-step development, optimization and validation strategy is described for gas chromatography (GC) fingerprints of Brazilian commercial diesel fuel. A suitable GC-flame ionization detection (FID) system was selected to assay a complex matrix such as diesel. The next step was to improve acceptable chromatographic resolution with reduced analysis time, which is recommended for routine applications. Full three-level factorial designs were performed to improve flow rate, oven ramps, injection volume and split ratio in the GC system. Finally, several validation parameters were performed. The GC fingerprinting can be coupled with pattern recognition and multivariate regressions analyses to determine fuel quality and fuel physicochemical parameters. This strategy can also be applied to develop fingerprints for quality control of other fuel types.

  15. Controlling Type I Error Rates in Assessing DIF for Logistic Regression Method Combined with SIBTEST Regression Correction Procedure and DIF-Free-Then-DIF Strategy

    ERIC Educational Resources Information Center

    Shih, Ching-Lin; Liu, Tien-Hsiang; Wang, Wen-Chung

    2014-01-01

    The simultaneous item bias test (SIBTEST) method regression procedure and the differential item functioning (DIF)-free-then-DIF strategy are applied to the logistic regression (LR) method simultaneously in this study. These procedures are used to adjust the effects of matching true score on observed score and to better control the Type I error…

  16. Spatial Assessment of Model Errors from Four Regression Techniques

    Treesearch

    Lianjun Zhang; Jeffrey H. Gove; Jeffrey H. Gove

    2005-01-01

    Forest modelers have attempted to account for the spatial autocorrelations among trees in growth and yield models by applying alternative regression techniques such as linear mixed models (LMM), generalized additive models (GAM), and geographically weighted regression (GWR). However, the model errors are commonly assessed using average errors across the entire study...

  17. Quantile Regression in the Study of Developmental Sciences

    ERIC Educational Resources Information Center

    Petscher, Yaacov; Logan, Jessica A. R.

    2014-01-01

    Linear regression analysis is one of the most common techniques applied in developmental research, but only allows for an estimate of the average relations between the predictor(s) and the outcome. This study describes quantile regression, which provides estimates of the relations between the predictor(s) and outcome, but across multiple points of…
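
    Quantile regression estimates conditional quantiles rather than the conditional mean; the minimal sketch below fits the 10th, 50th and 90th percentiles of a synthetic, heteroscedastic outcome with statsmodels (the data are illustrative only).

        # Sketch: estimate several conditional quantiles of an outcome.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        df = pd.DataFrame({"x": rng.uniform(0, 10, 300)})
        df["y"] = 2 + 0.5 * df["x"] + rng.normal(scale=1 + 0.3 * df["x"])  # heteroscedastic

        for q in (0.10, 0.50, 0.90):
            fit = smf.quantreg("y ~ x", df).fit(q=q)
            print(f"quantile {q:.2f}: slope = {fit.params['x']:.3f}")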

  18. Why bother with testing? The validity of immigrants' self-assessed language proficiency.

    PubMed

    Edele, Aileen; Seuring, Julian; Kristen, Cornelia; Stanat, Petra

    2015-07-01

    Due to its central role in social integration, immigrants' language proficiency is a matter of considerable societal concern and scientific interest. This study examines whether commonly applied self-assessments of linguistic skills yield results that are similar to those of competence tests and thus whether these self-assessments are valid measures of language proficiency. Analyses of data for immigrant youth reveal moderate correlations between language test scores and two types of self-assessments (general ability estimates and concrete performance estimates) for the participants' first and second languages. More importantly, multiple regression models using self-assessments and models using test scores yield different results. This finding holds true for a variety of analyses and for both types of self-assessments. Our findings further suggest that self-assessed language skills are systematically biased in certain groups. Subjective measures thus seem to be inadequate estimates of language skills, and future research should use them with caution when research questions pertain to actual language skills rather than self-perceptions. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. [Influence of humidex on incidence of bacillary dysentery in Hefei: a time-series study].

    PubMed

    Zhang, H; Zhao, K F; He, R X; Zhao, D S; Xie, M Y; Wang, S S; Bai, L J; Cheng, Q; Zhang, Y W; Su, H

    2017-11-10

    Objective: To investigate the effect of humidex, which combines mean temperature and relative humidity, on the incidence of bacillary dysentery in Hefei. Methods: Daily counts of bacillary dysentery cases and weather data in Hefei were collected from January 1, 2006 to December 31, 2013. Then, the humidex was calculated from temperature and relative humidity. A Poisson generalized linear regression combined with a distributed lag non-linear model was applied to analyze the relationship between humidex and the incidence of bacillary dysentery, after adjusting for long-term and seasonal trends, day of week and other weather confounders. Stratified analyses by gender, age and address were also conducted. Results: The risk of bacillary dysentery increased with the rise of humidex. The adverse effect of high humidex (90th percentile of humidex) appeared at a lag of 2 days and was largest at a lag of 4 days (RR = 1.063, 95% CI: 1.037-1.090). Subgroup analyses indicated that all groups were affected by high humidex at lags of 2-5 days. Conclusion: High humidex could significantly increase the risk of bacillary dysentery, and lagged effects were observed.
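
    A simplified version of the time-series model described above: daily case counts regressed on lagged humidex terms in a Poisson GLM, with day-of-week and a linear trend as crude controls. The study itself uses a distributed lag non-linear model, which this sketch does not reproduce, and the file and column names are placeholders.

        # Sketch: Poisson time-series regression of daily counts on lagged humidex.
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        ts = pd.read_csv("daily_dysentery.csv", parse_dates=["date"])  # date, cases, humidex

        for lag in range(1, 6):                          # humidex lagged 1-5 days
            ts[f"humidex_lag{lag}"] = ts["humidex"].shift(lag)
        ts["dow"] = ts["date"].dt.dayofweek
        ts["t"] = range(len(ts))                         # crude long-term trend term

        formula = ("cases ~ " + " + ".join(f"humidex_lag{l}" for l in range(1, 6))
                   + " + C(dow) + t")
        fit = smf.glm(formula, data=ts.dropna(), family=sm.families.Poisson()).fit()
        print(fit.summary())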

  20. Interventions to Improve Medication Adherence in Hypertensive Patients: Systematic Review and Meta-analysis.

    PubMed

    Conn, Vicki S; Ruppar, Todd M; Chase, Jo-Ana D; Enriquez, Maithe; Cooper, Pamela S

    2015-12-01

    This systematic review applied meta-analytic procedures to synthesize medication adherence interventions that focus on adults with hypertension. Comprehensive searching located trials with medication adherence behavior outcomes. Study sample, design, intervention characteristics, and outcomes were coded. Random-effects models were used in calculating standardized mean difference effect sizes. Moderator analyses were conducted using meta-analytic analogues of ANOVA and regression to explore associations between effect sizes and sample, design, and intervention characteristics. Effect sizes were calculated for 112 eligible treatment-vs.-control group outcome comparisons of 34,272 subjects. The overall standardized mean difference effect size between treatment and control subjects was 0.300. Exploratory moderator analyses revealed interventions were most effective among female, older, and moderate- or high-income participants. The most promising intervention components were those linking adherence behavior with habits, giving adherence feedback to patients, self-monitoring of blood pressure, using pill boxes and other special packaging, and motivational interviewing. The most effective interventions employed multiple components and were delivered over many days. Future research should strive for minimizing risks of bias common in this literature, especially avoiding self-report adherence measures.
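
    The pooled standardized mean difference reported above comes from a random-effects model; the following is a minimal sketch of the DerSimonian-Laird estimator applied to per-comparison effect sizes and variances (all numbers are illustrative, not the review's data).

        # Sketch: DerSimonian-Laird random-effects pooling of standardized mean differences.
        import numpy as np

        d = np.array([0.35, 0.12, 0.48, 0.26])   # illustrative effect sizes
        v = np.array([0.02, 0.03, 0.05, 0.01])   # illustrative within-study variances

        w = 1 / v                                 # fixed-effect weights
        d_fe = np.sum(w * d) / np.sum(w)
        Q = np.sum(w * (d - d_fe) ** 2)           # heterogeneity statistic
        k = len(d)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

        w_re = 1 / (v + tau2)                     # random-effects weights
        d_re = np.sum(w_re * d) / np.sum(w_re)
        se = np.sqrt(1 / np.sum(w_re))
        print(f"pooled SMD = {d_re:.3f} (95% CI {d_re - 1.96 * se:.3f} to {d_re + 1.96 * se:.3f})")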

  1. Can a toxin gene NAAT be used to predict toxin EIA and the severity of Clostridium difficile infection?

    PubMed

    Garvey, Mark I; Bradley, Craig W; Wilkinson, Martyn A C; Holden, Elisabeth

    2017-01-01

    Diagnosis of C. difficile infection (CDI) is controversial because of the many laboratory methods available and their lack of ability to distinguish between carriage, mild or severe disease. Here we examine whether a low C. difficile toxin B nucleic acid amplification test (NAAT) cycle threshold (CT) can predict toxin EIA, CDI severity and mortality. A three-stage algorithm was employed for CDI testing, comprising a screening test for glutamate dehydrogenase (GDH), followed by a NAAT, then a toxin enzyme immunoassay (EIA). All diarrhoeal samples positive for GDH and NAAT between 2012 and 2016 were analysed. The performance of the NAAT CT value as a classifier of toxin EIA outcome was analysed using a ROC curve; patient mortality was compared to CTs and toxin EIA via linear regression models. A CT value ≤26 was associated with ≥72% toxin EIA positivity; applying a logistic regression model, we demonstrated an association between low CT values and toxin EIA positivity. A CT value of ≤26 was significantly associated (p = 0.0262) with increased one-month mortality, severe cases of CDI, or failure of first-line treatment. The ROC curve probabilities demonstrated a CT cut-off value of 26.6. Here we demonstrate that a CT ≤26 indicates more severe CDI and is associated with higher mortality. Samples with a low CT value are often toxin EIA positive, questioning the need for this additional EIA test. A CT ≤26 could be used to assess the potential severity of CDI and guide patient treatment.
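
    The reported cut-off can be reproduced in outline by treating the (negated) CT value as a score for toxin-EIA positivity and choosing the CT that maximises Youden's J on the ROC curve; the sketch below uses hypothetical column names and is not the authors' analysis code.

        # Sketch: ROC analysis of the NAAT cycle threshold as a predictor of toxin EIA positivity.
        import numpy as np
        import pandas as pd
        from sklearn.metrics import roc_curve, roc_auc_score

        df = pd.read_csv("cdi_samples.csv")       # hypothetical: ct_value, toxin_eia (0/1)

        # Lower CT values indicate more target DNA, so negate the CT to use it as a score.
        score = -df["ct_value"]
        fpr, tpr, thresholds = roc_curve(df["toxin_eia"], score)

        youden = tpr - fpr
        best_ct = -thresholds[np.argmax(youden)]  # back-transform to a CT cut-off
        print("AUC:", roc_auc_score(df["toxin_eia"], score))
        print("CT cut-off maximising Youden's J:", round(best_ct, 1))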

  2. Prevalence and Trends in Domestic Violence in South Korea: Findings From National Surveys.

    PubMed

    Kim, Jae Yop; Oh, Sehun; Nam, Seok In

    2016-05-01

    To examine trends in the prevalence of domestic violence since 1997, 1 year prior to the introduction of legislative countermeasures and accompanying services in South Korea, and to analyze what socio-demographic characteristics of perpetrators contribute to spousal violence and whether there were any changes in risk factors over time. This study used two nationally representative household samples: 1,540 married or cohabiting couples from the 1999 national survey and 3,269 couples from the 2010 National Survey of Domestic Violence. Frequency analysis was used to measure the prevalence of intimate partner violence (IPV), and cross-tabulation, correlation, and logistic regression analyses were used to look for socio-demographic risk factors of spousal physical violence and patterns of change over time. The frequency analysis showed that the IPV prevalence dropped by approximately 50%, from 34.1% in 1999 to 16.5% in 2010, though it was still higher than in many other countries. The cross-tabulation and logistic regression analyses suggested that men with low socio-demographic characteristics were generally more violent, though this tendency did not apply to women. Instead, younger women seemed to be more violent than older women. Last, different levels of household income were associated with different levels of IPV in 2010, but no linear trend was detected. In this study, IPV prevalence trends and risk factors of two different time periods were discussed to provide implications for tackling the IPV problem. Future countermeasures must build on an understanding of men with low socio-demographic status and younger women, who were more violent in marital relationships. © The Author(s) 2015.

  3. Predicting Treatment Outcomes and Responder Subsets in Scleroderma-related Interstitial Lung Disease

    PubMed Central

    Roth, Michael D.; Tseng, Chi-Hong; Clements, Philip J.; Furst, Daniel E.; Tashkin, Donald P.; Goldin, Jonathan G.; Khanna, Dinesh; Kleerup, Eric C.; Li, Ning; Elashoff, David; Elashoff, Robert E.

    2014-01-01

    Objectives To identify baseline characteristics of patients with Scleroderma-Related Interstitial Lung Disease (SSc-ILD) which predict the most favorable response to a 12-month treatment with oral cyclophosphamide (CYC). Methods Regression analyses were retrospectively applied to the Scleroderma Lung Study data in order to identify baseline characteristics that correlated with the absolute change in %-predicted Forced Vital Capacity (FVC) and the placebo-adjusted change in %-predicted FVC over time (the CYC treatment effect). Results Completion of the CYC arm of the Scleroderma Lung Study was associated with a placebo-adjusted improvement in %-predicted FVC of 2.11% at 12 months which increased to 4.16% when patients were followed for another 6 months (p=0.014). Multivariate regression analyses identified the maximal severity of reticular infiltrates on baseline high-resolution computerized tomography (HRCT), the modified Rodnan Skin Score (mRSS), and Mahler's Baseline Dyspnea Index (BDI) as independent correlates of treatment response. When patients were stratified based on whether 50% or more of any lung zone was involved by reticular infiltrates on HRCT and/or the presence of a mRSS of at least 23, a subgroup emerged with an average CYC treatment effect of 4.73% at 12 months and 9.81% at 18 months (p<0.001). Conversely, there was no treatment effect (−0.58%) in patients with less severe HRCT findings and a lower mRSS. Conclusions A retrospective analysis of the Scleroderma Lung Study identified the severity of reticular infiltrates on baseline HRCT and the baseline mRSS as patient features that might predict responsiveness to CYC therapy. PMID:21547897

  4. Physical activity behaviour in men with inflammatory joint disease: a cross-sectional register-based study.

    PubMed

    Hammer, Nanna Maria; Midtgaard, Julie; Hetland, Merete Lund; Krogh, Niels Steen; Esbensen, Bente Appel

    2018-05-01

    Physical activity is recommended as an essential part of the non-pharmacological management of inflammatory joint disease, but previous research in this area has predominantly included women. The aim of this study was to examine physical activity behaviour in men with inflammatory joint disease. The study was conducted as a cross-sectional register-based study. Data on physical activity behaviour in men with RA, PsA and AS were matched with sociodemographic and clinical variables extracted from the DANBIO registry. Logistic regression analyses using multiple imputations were performed to investigate demographic and clinical variables associated with regular engagement in physical activity (moderate-vigorous ⩾2 h/week). Descriptive statistics were applied to explore motivation, barriers and preferences for physical activity. A total of 325 men were included of whom 129 (40%) engaged in regular physical activity. In univariate analyses, higher age, visual analogue scale (VAS) for pain, VAS fatigue, VAS patient's global, CRP level, disease activity, functional disability and current smoking were negatively associated with regular engagement in physical activity. In the final multivariable regression model only a high VAS fatigue score (⩾61 mm) (OR = 0.228; CI: 0.119, 0.436) remained significantly independently associated with regular physical activity. A majority of men with inflammatory joint disease do not meet the recommendations of regular physical activity. Both sociodemographic and clinical parameters were associated with engagement in physical activity, and fatigue especially seems to play a pivotal role in explaining suboptimal physical activity behaviour in this patient group.

  5. Ultrasound predictors of placental invasion: the Placenta Accreta Index.

    PubMed

    Rac, Martha W F; Dashe, Jodi S; Wells, C Edward; Moschos, Elysia; McIntire, Donald D; Twickler, Diane M

    2015-03-01

    We sought to apply a standardized evaluation of ultrasound parameters for the prediction of placental invasion in a high-risk population. This was a retrospective review of gravidas with ≥1 prior cesarean delivery who received an ultrasound diagnosis of placenta previa or low-lying placenta in the third trimester at our institution from 1997 through 2011. Sonographic images were reviewed by an investigator blinded to pregnancy outcome and sonography reports. Parameters assessed included loss of retroplacental clear zone, irregularity and width of uterine-bladder interface, smallest myometrial thickness, presence of lacunar spaces, and bridging vessels. Diagnosis of placental invasion was based on histologic confirmation. Statistical analyses were performed using linear logistic regression and multiparametric analyses to generate a predictive equation evaluated using a receiver operating characteristic curve. Of 184 gravidas who met inclusion criteria, 54 (29%) had invasion confirmed on hysterectomy specimen. All sonographic parameters were associated with placental invasion (P < .001). Constructing a receiver operating characteristic curve, the combination of smallest sagittal myometrial thickness, lacunae, and bridging vessels, in addition to number of cesarean deliveries and placental location, yielded an area under the curve of 0.87 (95% confidence interval, 0.80-0.95). Using logistic regression, a predictive equation was generated, termed the "Placenta Accreta Index." Each parameter was weighted to create a 9-point scale in which a score of 0-9 provided a probability of invasion that ranged from 2-96%, respectively. Assignment of the Placenta Accreta Index may be helpful in predicting individual patient risk for morbidly adherent placenta. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Do we act upon what we see? Direct effects of alcohol cues in movies on young adults' alcohol drinking.

    PubMed

    Koordeman, Renske; Kuntsche, Emmanuel; Anschutz, Doeschka J; van Baaren, Rick B; Engels, Rutger C M E

    2011-01-01

    Ample survey research has shown that alcohol portrayals in movies affect the development of alcohol consumption in youth. There is also preliminary evidence that alcohol portrayals in movies directly influence viewers' drinking of alcohol while watching movies. One process that might account for these direct effects is imitation. The present study therefore examined whether young people imitate actors sipping alcohol on screen. We observed the sipping behaviour of 79 young adults (ages 18-25) watching a 60-min movie clip, 'What Happens in Vegas', in a semi-naturalistic home setting. Each of the 79 participants was exposed to 25 alcohol cues. Two-level logistic regression analyses were used to analyse whether participants in general imitated actors' sipping during this clip. In addition, we applied proportional hazards models in a survival analysis framework (Cox regression) to test whether there was a difference in imitation of the cues between male and female participants, and to test whether the timing of the actors' sipping throughout the movie played a role. The findings showed that participants were more likely to sip in accordance with the actors' sipping than without such a cue. Further, we found that men were more likely to imitate actors' sipping than women and that participants tended to respond to actors' sipping at the beginning of the movie rather than at the end. Exposure to actors sipping alcohol in a movie seems to have an immediate impact on the drinking behaviour of viewers, via the mechanism of imitation.
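
    A hedged sketch of the survival-analysis step follows, using the lifelines package; the column names and the clustering of repeated cues within participants are assumptions about the data layout, not the authors' code.

        # Assumed layout: one row per alcohol cue per participant.
        import pandas as pd
        from lifelines import CoxPHFitter

        def cue_level_cox(df: pd.DataFrame) -> CoxPHFitter:
            """Columns: 'time_to_sip' (minutes from cue to next sip), 'sipped' (1 = sip observed),
            'male' (0/1), 'cue_minute' (when the cue occurred in the movie), 'pid' (participant id)."""
            cph = CoxPHFitter()
            # cluster_col gives robust standard errors for repeated cues within a participant
            cph.fit(df, duration_col="time_to_sip", event_col="sipped", cluster_col="pid")
            return cph  # cph.print_summary() reports hazard ratios for sex and cue timing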

  7. Usefulness of the Trabecular Bone Score for assessing the risk of osteoporotic fracture.

    PubMed

    Redondo, L; Puigoriol, E; Rodríguez, J R; Peris, P; Kanterewicz, E

    2018-04-01

    The trabecular bone score (TBS) is an imaging technique that assesses the condition of the trabecular microarchitecture. Preliminary results suggest that TBS, along with the bone mineral density assessment, could improve the calculation of the osteoporotic fracture risk. The aim of this study was to analyse TBS values and their relationship with the clinical characteristics, bone mineral density and history of fractures of a cohort of postmenopausal women. We analysed 2,257 postmenopausal women from the FRODOS cohort, which was created to determine the risk factors for osteoporotic fracture through a clinical survey and bone densitometry with vertebral morphometry. TBS was applied to the densitometry images. TBS values ≤1.230 were considered indicative of degraded microarchitecture. We performed simple and multiple linear regression analyses to determine the factors associated with this index. The mean TBS value in L1-L4 was 1.203±0.121. Some 55.3% of the women showed values indicating degraded microarchitecture. In the multiple linear regression analysis, the factors associated with low TBS values were age, weight, height, spinal T-score, glucocorticoid treatment, presence of type 2 diabetes and a history of fragility fractures. TBS showed microarchitecture degradation values in the participants of the FRODOS cohort and was associated with anthropometric factors, low bone mineral density values, the presence of fractures, a history of type 2 diabetes mellitus and the use of glucocorticoids. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.

  8. Socioeconomic Correlates of Contraceptive Use among the Ethnic Tribal Women of Bangladesh: Does Sex Preference Matter?

    PubMed

    Kamal, S M Mostafa; Hassan, Che Hashim

    2013-06-01

    To examine the socioeconomic factors affecting contraceptive use among tribal women of Bangladesh, with a focus on son preference over daughters. The study used data gathered through a cross-sectional survey of four tribal communities residing in the Rangamati Hill District of the Chittagong Hill Tracts, Bangladesh. A multistage random sampling procedure was applied to collect data from 865 currently married women, of whom 806 were non-pregnant and had at least one living child; these 806 women form the basis of this study. The information was recorded in a pre-structured questionnaire. Simple cross tabulation, chi-square tests and logistic regression analyses were performed to analyze the data. The contraceptive prevalence rate among the study tribal women was 73%. The multivariate analyses yielded quantitatively important and reliable estimates of the likelihood of contraceptive use. Findings revealed that, after controlling for other variables, the likelihood of contraceptive use was not significantly higher among women with at least one son than among those who had only daughters, indicating no preference for sons over daughters. Multivariate logistic regression analysis suggests that home visits by family planning workers, tribal identity, place of residence, husband's education, type of family, television ownership, electricity connection in the household and number of times married are important determinants of any contraceptive method use among the tribal women. The contraceptive use rate among these disadvantaged tribal women was higher than the national level. Doorstep delivery of modern methods should be made available, targeting poor and remote zones.

  9. [Being online without a purpose -- study of background variables of problematic internet use].

    PubMed

    Prievara, Dóra Katalin; Pikó, Bettina

    2016-01-01

    These days, use of the Internet is unavoidable for the younger generations. The online world is the primary source of information and quick communication, and these activities can take many hours per day. The main goal of the present study was to examine the correlations among problematic internet use, social factors, stress and life satisfaction. Data collection was conducted online during the first semester of 2014 (N = 386 girls). Beyond sociodemographics, the anonymous questionnaire contained items on perceived social support and the amount of online activity. After descriptive statistics, factor, correlation and multiple linear regression analyses were applied to detect interrelationships. According to our data, 78% of the participants spent at least 2 hours online daily, and 40% more than 4 hours. Using factor analysis, four factors of online activities were identified: Social networking-surfing, News-information, Risky and Lonely game factors. Only the News-information factor was not related to problematic internet use. Based on multiple regression analyses, we may conclude that shyness, stress, loneliness and two factors, Social networking-surfing and Risky, acted as background variables for problematic internet use. In summary, the internet plays an important role in the everyday life of the participants. When online activity had a direct aim, mostly searching for information and news, problematic use did not appear. In prevention, education about the correct use of the internet should begin as early as possible.

  10. Regression and Data Mining Methods for Analyses of Multiple Rare Variants in the Genetic Analysis Workshop 17 Mini-Exome Data

    PubMed Central

    Bailey-Wilson, Joan E.; Brennan, Jennifer S.; Bull, Shelley B; Culverhouse, Robert; Kim, Yoonhee; Jiang, Yuan; Jung, Jeesun; Li, Qing; Lamina, Claudia; Liu, Ying; Mägi, Reedik; Niu, Yue S.; Simpson, Claire L.; Wang, Libo; Yilmaz, Yildiz E.; Zhang, Heping; Zhang, Zhaogong

    2012-01-01

    Group 14 of Genetic Analysis Workshop 17 examined several issues related to analysis of complex traits using DNA sequence data. These issues included novel methods for analyzing rare genetic variants in an aggregated manner (often termed collapsing rare variants), evaluation of various study designs to increase power to detect effects of rare variants, and the use of machine learning approaches to model highly complex heterogeneous traits. Various published and novel methods for analyzing traits with extreme locus and allelic heterogeneity were applied to the simulated quantitative and disease phenotypes. Overall, we conclude that power is (as expected) dependent on locus-specific heritability or contribution to disease risk; large samples will be required to detect rare causal variants with small effect sizes; extreme phenotype sampling designs may increase power for smaller laboratory costs; methods that allow joint analysis of multiple variants per gene or pathway are more powerful in general than analyses of individual rare variants; population-specific analyses can be optimal when different subpopulations harbor private causal mutations; and machine learning methods may be useful for selecting subsets of predictors for follow-up in the presence of extreme locus heterogeneity and large numbers of potential predictors. PMID:22128066
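
    As one concrete illustration of the collapsing idea discussed above, a simple burden test sums rare alleles per gene and regresses the trait on that count; this minimal sketch is illustrative only and is not any specific Group 14 contribution.

        # Minimal burden (collapsing) test for one gene; threshold and model form are assumptions.
        import numpy as np
        import statsmodels.api as sm

        def burden_test(genotypes, phenotype, maf_threshold=0.01):
            """genotypes: subjects x variants allele counts (0/1/2) for one gene; phenotype: quantitative trait."""
            maf = genotypes.mean(axis=0) / 2.0
            burden = genotypes[:, maf < maf_threshold].sum(axis=1)   # rare alleles carried per subject
            fit = sm.OLS(phenotype, sm.add_constant(burden)).fit()
            return fit.params[1], fit.pvalues[1]                      # burden effect and its p-value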

  11. An Impact Evaluation of a Federal Mine Safety Training Regulation on Injury Rates Among US Stone, Sand, and Gravel Mine Workers: An Interrupted Time-Series Analysis

    PubMed Central

    Windsor, Richard

    2010-01-01

    Objectives. We evaluated the impact of a safety training regulation, implemented by the US Department of Labor's Mine Safety and Health Administration (MSHA) in 1999, on injury rates at stone, sand, and gravel mining operations. Methods. We applied a time-series design and analyses with quarterly counts of nonfatal injuries and employment hours from 7998 surface aggregate mines from 1995 through 2006. Covariates included standard industrial classification codes, ownership, and injury severity. Results. Overall crude rates of injuries declined over the 12-year period. Reductions in incident rates for medical treatment only, restricted duty, and lost-time injuries were consistent with temporal trends and provided no evidence of an intervention effect attributable to the MSHA regulation. Rates of permanently disabling injuries (PDIs) declined markedly. Regression analyses documented a statistically significant reduction in the risk rate in the postintervention time period (risk rate = 0.591; 95% confidence interval = 0.529, 0.661). Conclusions. Although a causal relationship between the regulatory intervention and the decline in the rate of PDIs is plausible, inconsistency in the results with the other injury-severity categories precludes attributing the observed outcome to the MSHA regulation. Further analyses of these data are needed. PMID:20466960

  12. An impact evaluation of a federal mine safety training regulation on injury rates among US stone, sand, and gravel mine workers: an interrupted time-series analysis.

    PubMed

    Monforton, Celeste; Windsor, Richard

    2010-07-01

    We evaluated the impact of a safety training regulation, implemented by the US Department of Labor's Mine Safety and Health Administration (MSHA) in 1999, on injury rates at stone, sand, and gravel mining operations. We applied a time-series design and analyses with quarterly counts of nonfatal injuries and employment hours from 7998 surface aggregate mines from 1995 through 2006. Covariates included standard industrial classification codes, ownership, and injury severity. Overall crude rates of injuries declined over the 12-year period. Reductions in incident rates for medical treatment only, restricted duty, and lost-time injuries were consistent with temporal trends and provided no evidence of an intervention effect attributable to the MSHA regulation. Rates of permanently disabling injuries (PDIs) declined markedly. Regression analyses documented a statistically significant reduction in the risk rate in the postintervention time period (risk rate = 0.591; 95% confidence interval = 0.529, 0.661). Although a causal relationship between the regulatory intervention and the decline in the rate of PDIs is plausible, inconsistency in the results with the other injury-severity categories precludes attributing the observed outcome to the MSHA regulation. Further analyses of these data are needed.
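
    A minimal sketch of this kind of segmented Poisson model, with employment hours as the exposure offset, is shown below; the formula and column names are assumptions, not the authors' exact specification.

        # Assumed sketch: interrupted time-series Poisson regression with an hours-worked offset.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        def its_poisson(df: pd.DataFrame):
            """df: one row per quarter with columns 'injuries' (count), 'hours' (employment hours),
            'quarter' (0, 1, 2, ...) and 'post' (1 for quarters after the 1999 training rule)."""
            model = smf.glm("injuries ~ quarter + post + post:quarter",
                            data=df,
                            family=sm.families.Poisson(),
                            offset=np.log(df["hours"])).fit()
            # exp(coefficient on 'post') is the step change in the injury rate at the intervention
            return model, np.exp(model.params["post"])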

  13. Decreased pain sensitivity due to trimethylbenzene exposure ...

    EPA Pesticide Factsheets

    Traditionally, human health risk assessments have relied on qualitative approaches for hazard identification, often using the Hill criteria and weight of evidence determinations to integrate data from multiple studies. Recently, the National Research Council has recommended the development of quantitative approaches for evidence integration, including the application of meta-analyses. The following hazard identification case study applies qualitative as well as meta-analytic approaches to exposure to trimethylbenzene (TMB) isomers and the potential neurotoxic effects on pain sensitivity. In the meta-analytic approach, a pooled effect size is calculated, after consideration of multiple confounding factors, in order to determine whether the entire database under consideration indicates that TMBs are likely to be a neurotoxic hazard. The pain sensitivity studies included in the present analyses initially seem discordant in their results: effects on pain sensitivity are seen immediately after termination of exposure, appear to resolve 24 hours after exposure, and then reappear 50 days later following foot-shock. Qualitative consideration of toxicological and toxicokinetic characteristics of the TMB isomers suggests that the observed differences between studies are due to testing time and can be explained through a complete consideration of the underlying biology of the effect and the nervous system as a whole. Meta-analyses and meta-regressions support this conclusion.
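
    The pooled effect size described above is, in its simplest fixed-effect form, an inverse-variance weighted mean; the sketch below shows only that generic calculation, not EPA's actual meta-analysis or meta-regression.

        # Fixed-effect inverse-variance pooling of per-study effect sizes (illustrative values only).
        import numpy as np

        def pooled_effect(effects, ses):
            """effects: per-study effect sizes; ses: their standard errors."""
            effects, ses = np.asarray(effects, float), np.asarray(ses, float)
            w = 1.0 / ses**2                              # inverse-variance weights
            pooled = np.sum(w * effects) / np.sum(w)
            pooled_se = np.sqrt(1.0 / np.sum(w))
            return pooled, pooled_se

        # e.g., hypothetical standardized differences in pain-response latency and their SEs
        est, se = pooled_effect([0.42, 0.31, 0.55], [0.15, 0.20, 0.12])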

  14. Modeling and Analysis of Process Parameters for Evaluating Shrinkage Problems During Plastic Injection Molding of a DVD-ROM Cover

    NASA Astrophysics Data System (ADS)

    Öktem, H.

    2012-01-01

    Plastic injection molding plays a key role in the production of high-quality plastic parts. Shrinkage is one of the most significant problems of a plastic part in terms of quality in the plastic injection molding. This article focuses on the study of the modeling and analysis of the effects of process parameters on the shrinkage by evaluating the quality of the plastic part of a DVD-ROM cover made with Acrylonitrile Butadiene Styrene (ABS) polymer material. An effective regression model was developed to determine the mathematical relationship between the process parameters (mold temperature, melt temperature, injection pressure, injection time, and cooling time) and the volumetric shrinkage by utilizing the analysis data. Finite element (FE) analyses designed by Taguchi (L27) orthogonal arrays were run in the Moldflow simulation program. Analysis of variance (ANOVA) was then performed to check the adequacy of the regression model and to determine the effect of the process parameters on the shrinkage. Experiments were conducted to control the accuracy of the regression model with the FE analyses obtained from Moldflow. The results show that the regression model agrees very well with the FE analyses and the experiments. From this, it can be concluded that this study succeeded in modeling the shrinkage problem in our application.

  15. Aging, not menopause, is associated with higher prevalence of hyperuricemia among older women.

    PubMed

    Krishnan, Eswar; Bennett, Mihoko; Chen, Linjun

    2014-11-01

    This work aims to study the associations, if any, of hyperuricemia, gout, and menopause status in the US population. Using multiyear data from the National Health and Nutrition Examination Survey, we performed unmatched comparisons and one to three age-matched comparisons of women aged 20 to 70 years with and without hyperuricemia (serum urate ≥6 mg/dL). Analyses were performed using survey-weighted multiple logistic regression and conditional logistic regression, respectively. Overall, there were 1,477 women with hyperuricemia. Age and serum urate were significantly correlated. In unmatched analyses (n = 9,573 controls), postmenopausal women were older, were heavier, and had higher prevalence of renal impairment, hypertension, diabetes, and hyperlipidemia. In multivariable regression, after accounting for age, body mass index, glomerular filtration rate, and diuretic use, menopause was associated with hyperuricemia (odds ratio, 1.36; 95% CI, 1.05-1.76; P = 0.002). In corresponding multivariable regression using age-matched data (n = 4,431 controls), the odds ratio for menopause was 0.94 (95% CI, 0.83-1.06). Current use of hormone therapy was not associated with prevalent hyperuricemia in both unmatched and matched analyses. Age is a better statistical explanation for the higher prevalence of hyperuricemia among older women than menopause status.

  16. Cognitive, emotive, and cognitive-behavioral correlates of suicidal ideation among Chinese adolescents in Hong Kong.

    PubMed

    Kwok, Sylvia Lai Yuk Ching; Shek, Daniel Tan Lei

    2010-03-05

    Utilizing Daniel Goleman's theory of emotional competence, Beck's cognitive theory, and Rudd's cognitive-behavioral theory of suicidality, the relationships between hopelessness (cognitive component), social problem solving (cognitive-behavioral component), emotional competence (emotive component), and adolescent suicidal ideation were examined. Based on the responses of 5,557 Secondary 1 to Secondary 4 students from 42 secondary schools in Hong Kong, results showed that suicidal ideation was positively related to adolescent hopelessness, but negatively related to emotional competence and social problem solving. While standard regression analyses showed that all the above variables were significant predictors of suicidal ideation, hierarchical regression analyses showed that hopelessness was the most important predictor of suicidal ideation, followed by social problem solving and emotional competence. Further regression analyses found that all four subscales of emotional competence, i.e., empathy, social skills, self-management of emotions, and utilization of emotions, were important predictors of male adolescent suicidal ideation. However, the subscale of social skills was not a significant predictor of female adolescent suicidal ideation. Standard regression analysis also revealed that all three subscales of social problem solving, i.e., negative problem orientation, rational problem solving, and impulsiveness/carelessness style, were important predictors of suicidal ideation. Theoretical and practice implications of the findings are discussed.

  17. Risk factors for autistic regression: results of an ambispective cohort study.

    PubMed

    Zhang, Ying; Xu, Qiong; Liu, Jing; Li, She-chang; Xu, Xiu

    2012-08-01

    A subgroup of children diagnosed with autism experience developmental regression featured by a loss of previously acquired abilities. The pathogeny of autistic regression is unknown, although many risk factors likely exist. To better characterize autistic regression and investigate the association between autistic regression and potential influencing factors in Chinese autistic children, we conducted an ambispective study with a cohort of 170 autistic subjects. Analyses by multiple logistic regression showed significant correlations between autistic regression and febrile seizures (OR = 3.53, 95% CI = 1.17-10.65, P = .025), as well as with a family history of neuropsychiatric disorders (OR = 3.62, 95% CI = 1.35-9.71, P = .011). This study suggests that febrile seizures and family history of neuropsychiatric disorders are correlated with autistic regression.

  18. Self-perception and malocclusion and their relation to oral appearance and function.

    PubMed

    Peres, Sílvia Helena de Carvalho Sales; Goya, Suzana; Cortellazzi, Karine Laura; Ambrosano, Gláucia Maria Bovi; Meneghim, Marcelo de Castro; Pereira, Antonio Carlos

    2011-10-01

    The aim of this study was to evaluate the relationship between malocclusion and self-perception of oral appearance/function in 12- and 15-year-old Brazilian adolescents. The cluster sample consisted of 717 teenagers attending 24 urban public (n=611) and 5 rural public (n=107) schools in Maringá/PR. Malocclusion was measured using the Dental Aesthetic Index (DAI), in accordance with WHO recommendations. A parental questionnaire was applied to collect information on esthetic perception level and oral variables related to oral health. Univariate and multiple logistic regression analyses were performed. Multiple logistic regression confirmed that, for 12-year-olds, missing teeth (OR=2.865) and presence of openbite (open occlusal relationship) (OR=2.865) were risk indicators for speech capability. With regard to 15-year-olds, presence of mandibular overjet (horizontal overlap) (OR=4.016) was a risk indicator for speech capability, and molar relationship (OR=1.661) was a risk indicator for chewing capability. The impact of malocclusion on adolescents' lives was confirmed in this study. Speech and chewing capability were associated with orthodontic deviations, which should be taken into consideration in oral health planning, to identify risk groups and improve community health services.

  19. Using existing case-mix methods to fund trauma cases.

    PubMed

    Monakova, Julia; Blais, Irene; Botz, Charles; Chechulin, Yuriy; Picciano, Gino; Basinski, Antoni

    2010-01-01

    Policymakers frequently face the need to increase funding in isolated and frequently heterogeneous (clinically and in terms of resource consumption) patient subpopulations. This article presents a methodologic solution for testing the appropriateness of using existing grouping and weighting methodologies for funding subsets of patients in the scenario where a case-mix approach is preferable to a flat-rate based payment system. Using as an example the subpopulation of trauma cases of Ontario lead trauma hospitals, the statistical techniques of linear and nonlinear regression models, regression trees, and spline models were applied to examine the fit of the existing case-mix groups and reference weights for the trauma cases. The analyses demonstrated that for funding Ontario trauma cases, the existing case-mix systems can form the basis for rational and equitable hospital funding, decreasing the need to develop a different grouper for this subset of patients. This study confirmed that Injury Severity Score is a poor predictor of costs for trauma patients. Although our analysis used the Canadian case-mix classification system and cost weights, the demonstrated concept of using existing case-mix systems to develop funding rates for specific subsets of patient populations may be applicable internationally.

  20. Dynamic connectivity regression: Determining state-related changes in brain connectivity

    PubMed Central

    Cribben, Ivor; Haraldsdottir, Ragnheidur; Atlas, Lauren Y.; Wager, Tor D.; Lindquist, Martin A.

    2014-01-01

    Most statistical analyses of fMRI data assume that the nature, timing and duration of the psychological processes being studied are known. However, often it is hard to specify this information a priori. In this work we introduce a data-driven technique for partitioning the experimental time course into distinct temporal intervals with different multivariate functional connectivity patterns between a set of regions of interest (ROIs). The technique, called Dynamic Connectivity Regression (DCR), detects temporal change points in functional connectivity and estimates a graph, or set of relationships between ROIs, for data in the temporal partition that falls between pairs of change points. Hence, DCR allows for estimation of both the time of change in connectivity and the connectivity graph for each partition, without requiring prior knowledge of the nature of the experimental design. Permutation and bootstrapping methods are used to perform inference on the change points. The method is applied to various simulated data sets as well as to an fMRI data set from a study (N=26) of a state anxiety induction using a socially evaluative threat challenge. The results illustrate the method’s ability to observe how the networks between different brain regions changed with subjects’ emotional state. PMID:22484408
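
    DCR itself combines binary segmentation, the graphical lasso, and block-bootstrap inference; the sketch below is only a single-change-point caricature that scans candidate split times and compares a BIC-style penalized Gaussian fit of sparse precision matrices on each side. Parameter values and the scoring details are assumptions.

        # Rough single-change-point scan over an (n_timepoints x n_rois) ROI time series.
        import numpy as np
        from sklearn.covariance import GraphicalLasso

        def single_change_point(ts, alpha=0.1, min_len=30):
            def seg_score(x):
                gl = GraphicalLasso(alpha=alpha).fit(x)           # sparse precision (connectivity graph)
                n = x.shape[0]
                ll = gl.score(x) * n                              # approximate total log-likelihood
                k = np.count_nonzero(np.triu(gl.precision_, 1))   # number of estimated edges
                return ll - 0.5 * k * np.log(n)                   # BIC-style penalized fit

            best_t, best_gain = None, 0.0
            full = seg_score(ts)
            for t in range(min_len, len(ts) - min_len):
                gain = seg_score(ts[:t]) + seg_score(ts[t:]) - full
                if gain > best_gain:
                    best_t, best_gain = t, gain
            return best_t, best_gain   # None if no split improves the penalized fit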

  1. A flexible count data regression model for risk analysis.

    PubMed

    Guikema, Seth D; Coffelt, Jeremy P; Goffelt, Jeremy P

    2008-02-01

    In many cases, risk and reliability analyses involve estimating the probabilities of discrete events such as hardware failures and occurrences of disease or death. There is often additional information in the form of explanatory variables that can be used to help estimate the likelihood of different numbers of events in the future through the use of an appropriate regression model, such as a generalized linear model. However, existing generalized linear models (GLM) are limited in their ability to handle the types of variance structures often encountered in using count data in risk and reliability analysis. In particular, standard models cannot handle both underdispersed data (variance less than the mean) and overdispersed data (variance greater than the mean) in a single coherent modeling framework. This article presents a new GLM based on a reformulation of the Conway-Maxwell Poisson (COM) distribution that is useful for both underdispersed and overdispersed count data and demonstrates this model by applying it to the assessment of electric power system reliability. The results show that the proposed COM GLM fits overdispersed data sets as well as the commonly used existing models while outperforming them for underdispersed data sets.
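
    The article's reformulated COM GLM is not reproduced here; the sketch below fits a plain Conway-Maxwell-Poisson regression by direct maximum likelihood with a truncated normalizing constant, only to show how a single distribution covers both under- and overdispersion. The truncation limit and optimizer settings are assumptions.

        # Basic COM-Poisson regression by maximum likelihood: log(lambda_i) = X_i @ beta, shared nu.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import gammaln

        def com_poisson_nll(params, X, y, max_count=100):
            beta, log_nu = params[:-1], params[-1]
            nu = np.exp(log_nu)                    # nu < 1: overdispersion, nu > 1: underdispersion
            log_lam = X @ beta
            j = np.arange(max_count + 1)
            # log terms of the truncated normalizer Z(lambda, nu) for each observation
            log_terms = np.outer(log_lam, j) - nu * gammaln(j + 1)
            log_Z = np.logaddexp.reduce(log_terms, axis=1)
            return -np.sum(y * log_lam - nu * gammaln(y + 1) - log_Z)

        def fit_com_glm(X, y):
            X = np.column_stack([np.ones(len(y)), X])          # add intercept
            start = np.zeros(X.shape[1] + 1)                   # starts at the ordinary Poisson (nu = 1)
            res = minimize(com_poisson_nll, start, args=(X, y), method="BFGS")
            return res.x[:-1], np.exp(res.x[-1])               # beta, nu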

  2. Obstructive sleep apnea severity estimation: Fusion of speech-based systems.

    PubMed

    Ben Or, D; Dafna, E; Tarasiuk, A; Zigel, Y

    2016-08-01

    Obstructive sleep apnea (OSA) is a common sleep-related breathing disorder. Previous studies associated OSA with anatomical abnormalities of the upper respiratory tract that may be reflected in the acoustic characteristics of speech. We tested the hypothesis that the speech signal carries essential information that can assist in early assessment of OSA severity by estimating apnea-hypopnea index (AHI). 198 men referred to routine polysomnography (PSG) were recorded shortly prior to sleep onset while reading a one-minute speech protocol. The different parts of the speech recordings, i.e., sustained vowels, short-time frames of fluent speech, and the speech recording as a whole, underwent separate analyses, using sustained vowels features, short-term features, and long-term features, respectively. Applying support vector regression and regression trees, these features were used in order to estimate AHI. The fusion of the outputs of the three subsystems resulted in a diagnostic agreement of 67.3% between the speech-estimated AHI and the PSG-determined AHI, and an absolute error rate of 10.8 events/hr. Speech signal analysis may assist in the estimation of AHI, thus allowing the development of a noninvasive tool for OSA screening.
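
    A hedged sketch of the regression-and-fusion step follows: one support vector regressor per feature set, with the per-subsystem AHI estimates averaged. The kernel choice, hyperparameters, and feature layout are assumptions, not the authors' configuration.

        # One SVR per feature set (sustained vowels, short-term, long-term), fused by averaging.
        import numpy as np
        from sklearn.svm import SVR
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        def fused_ahi(feature_sets_train, y_train, feature_sets_test):
            preds = []
            for X_tr, X_te in zip(feature_sets_train, feature_sets_test):
                model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
                model.fit(X_tr, y_train)                  # y_train: PSG-determined AHI values
                preds.append(model.predict(X_te))
            return np.mean(preds, axis=0)                 # simple fusion of subsystem estimates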

  3. Sparse kernel methods for high-dimensional survival data.

    PubMed

    Evers, Ludger; Messow, Claudia-Martina

    2008-07-15

    Sparse kernel methods like support vector machines (SVM) have been applied with great success to classification and (standard) regression settings. Existing support vector classification and regression techniques, however, are not suitable for partly censored survival data, which are typically analysed using Cox's proportional hazards model. As the partial likelihood of the proportional hazards model only depends on the covariates through inner products, it can be 'kernelized'. The kernelized proportional hazards model, however, yields a solution that is dense, i.e. the solution depends on all observations. One of the key features of an SVM is that it yields a sparse solution, depending only on a small fraction of the training data. We propose two methods. One is based on a geometric idea where, akin to support vector classification, the margin between the failed observation and the observations currently at risk is maximised. The other approach obtains a sparse model by adding observations one after another, akin to the Import Vector Machine (IVM). Data examples studied suggest that both methods can outperform competing approaches. Software is available under the GNU Public License as an R package and can be obtained from the first author's website http://www.maths.bris.ac.uk/~maxle/software.html.

  4. Space shuttle propulsion parameter estimation using optional estimation techniques

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A regression analysis on tabular aerodynamic data provided a representative aerodynamic model for coefficient estimation. It also reduced the storage requirements for the 'normal' model used to check out the estimation algorithms. The results of the regression analyses are presented. The computer routines for the filter portion of the estimation algorithm were developed, and the SRB predictive program was brought up on the computer. For the filter program, approximately 54 routines were developed. The routines were highly subsegmented to facilitate overlaying program segments within the partitioned storage space on the computer.

  5. Applying Regression Analysis to Problems in Institutional Research.

    ERIC Educational Resources Information Center

    Bohannon, Tom R.

    1988-01-01

    Regression analysis is one of the most frequently used statistical techniques in institutional research. Principles of least squares, model building, residual analysis, influence statistics, and multi-collinearity are described and illustrated. (Author/MSE)

  6. Local polynomial estimation of heteroscedasticity in a multivariate linear regression model and its applications in economics.

    PubMed

    Su, Liyun; Zhao, Yanyong; Yan, Tianshun; Li, Fenglan

    2012-01-01

    Multivariate local polynomial fitting is applied to the multivariate linear heteroscedastic regression model. Firstly, local polynomial fitting is applied to estimate the heteroscedastic function; then the coefficients of the regression model are obtained using the generalized least squares method. One noteworthy feature of our approach is that we avoid testing for heteroscedasticity by improving the traditional two-stage method. Owing to the non-parametric technique of local polynomial estimation, it is unnecessary to know the form of the heteroscedastic function. Therefore, we can improve the estimation precision when the heteroscedastic function is unknown. Furthermore, we verify that the regression coefficients are asymptotically normal based on numerical simulations and normal Q-Q plots of residuals. Finally, the simulation results and the local polynomial estimation of real data indicate that our approach is effective in finite-sample situations.
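
    A one-dimensional caricature of the two-stage idea is sketched below: a kernel (degree-zero local polynomial) smoother of squared OLS residuals estimates the variance function, and weighted least squares re-estimates the coefficients. The paper's multivariate local polynomial machinery is not reproduced, and the bandwidth is an arbitrary assumption.

        # Two-stage heteroscedastic regression sketch: smooth squared residuals, then refit by WLS.
        import numpy as np
        import statsmodels.api as sm

        def two_stage_wls(x, y, bandwidth=0.5):
            X = sm.add_constant(x)
            resid2 = sm.OLS(y, X).fit().resid ** 2            # stage 1: squared OLS residuals

            def sigma2_hat(x0):                               # Nadaraya-Watson (local degree-0) smoother
                w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)
                return np.sum(w * resid2) / np.sum(w)

            sigma2 = np.array([sigma2_hat(x0) for x0 in x])
            return sm.WLS(y, X, weights=1.0 / sigma2).fit()   # stage 2: GLS with estimated variances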

  7. Data Mining Methods Applied to Flight Operations Quality Assurance Data: A Comparison to Standard Statistical Methods

    NASA Technical Reports Server (NTRS)

    Stolzer, Alan J.; Halford, Carl

    2007-01-01

    In a previous study, multiple regression techniques were applied to Flight Operations Quality Assurance-derived data to develop parsimonious model(s) for fuel consumption on the Boeing 757 airplane. The present study examined several data mining algorithms, including neural networks, on the fuel consumption problem and compared them to the multiple regression results obtained earlier. Using regression methods, parsimonious models were obtained that explained approximately 85% of the variation in fuel flow. In general data mining methods were more effective in predicting fuel consumption. Classification and Regression Tree methods reported correlation coefficients of .91 to .92, and General Linear Models and Multilayer Perceptron neural networks reported correlation coefficients of about .99. These data mining models show great promise for use in further examining large FOQA databases for operational and safety improvements.

  8. Assessment of Weighted Quantile Sum Regression for Modeling Chemical Mixtures and Cancer Risk

    PubMed Central

    Czarnota, Jenna; Gennings, Chris; Wheeler, David C

    2015-01-01

    In evaluation of cancer risk related to environmental chemical exposures, the effect of many chemicals on disease is ultimately of interest. However, because of potentially strong correlations among chemicals that occur together, traditional regression methods suffer from collinearity effects, including regression coefficient sign reversal and variance inflation. In addition, penalized regression methods designed to remediate collinearity may have limitations in selecting the truly bad actors among many correlated components. The recently proposed method of weighted quantile sum (WQS) regression attempts to overcome these problems by estimating a body burden index, which identifies important chemicals in a mixture of correlated environmental chemicals. Our focus was on assessing through simulation studies the accuracy of WQS regression in detecting subsets of chemicals associated with health outcomes (binary and continuous) in site-specific analyses and in non-site-specific analyses. We also evaluated the performance of the penalized regression methods of lasso, adaptive lasso, and elastic net in correctly classifying chemicals as bad actors or unrelated to the outcome. We based the simulation study on data from the National Cancer Institute Surveillance Epidemiology and End Results Program (NCI-SEER) case–control study of non-Hodgkin lymphoma (NHL) to achieve realistic exposure situations. Our results showed that WQS regression had good sensitivity and specificity across a variety of conditions considered in this study. The shrinkage methods had a tendency to incorrectly identify a large number of components, especially in the case of strong association with the outcome. PMID:26005323

  9. Assessment of weighted quantile sum regression for modeling chemical mixtures and cancer risk.

    PubMed

    Czarnota, Jenna; Gennings, Chris; Wheeler, David C

    2015-01-01

    In evaluation of cancer risk related to environmental chemical exposures, the effect of many chemicals on disease is ultimately of interest. However, because of potentially strong correlations among chemicals that occur together, traditional regression methods suffer from collinearity effects, including regression coefficient sign reversal and variance inflation. In addition, penalized regression methods designed to remediate collinearity may have limitations in selecting the truly bad actors among many correlated components. The recently proposed method of weighted quantile sum (WQS) regression attempts to overcome these problems by estimating a body burden index, which identifies important chemicals in a mixture of correlated environmental chemicals. Our focus was on assessing through simulation studies the accuracy of WQS regression in detecting subsets of chemicals associated with health outcomes (binary and continuous) in site-specific analyses and in non-site-specific analyses. We also evaluated the performance of the penalized regression methods of lasso, adaptive lasso, and elastic net in correctly classifying chemicals as bad actors or unrelated to the outcome. We based the simulation study on data from the National Cancer Institute Surveillance Epidemiology and End Results Program (NCI-SEER) case-control study of non-Hodgkin lymphoma (NHL) to achieve realistic exposure situations. Our results showed that WQS regression had good sensitivity and specificity across a variety of conditions considered in this study. The shrinkage methods had a tendency to incorrectly identify a large number of components, especially in the case of strong association with the outcome.
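
    The sketch below shows only the core WQS computation: quartile-scoring each chemical and estimating nonnegative weights on a simplex by maximizing a logistic likelihood for the weighted index. Real WQS implementations add bootstrap ensembles and a held-out validation split, which are omitted here; all settings are assumptions.

        # Bare-bones weighted quantile sum (WQS) regression for a binary outcome.
        import numpy as np
        import pandas as pd
        from scipy.optimize import minimize

        def wqs_logistic(chems, y, n_quantiles=4):
            """chems: n x p DataFrame of chemical exposures; y: 0/1 outcome."""
            q = chems.apply(lambda c: pd.qcut(c, n_quantiles, labels=False, duplicates="drop")).values

            def nll(params):
                w, (b0, b1) = params[:-2], params[-2:]
                w = w / w.sum()                               # nonnegative weights summing to one
                eta = b0 + b1 * (q @ w)
                p = 1.0 / (1.0 + np.exp(-eta))
                eps = 1e-9
                return -np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

            p_chem = q.shape[1]
            start = np.concatenate([np.full(p_chem, 1.0 / p_chem), [0.0, 0.1]])
            bounds = [(1e-6, 1.0)] * p_chem + [(None, None), (None, None)]
            res = minimize(nll, start, method="L-BFGS-B", bounds=bounds)
            w = res.x[:p_chem] / res.x[:p_chem].sum()
            return w, res.x[-1]                               # chemical weights and index effect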

  10. Comparison of regression coefficient and GIS-based methodologies for regional estimates of forest soil carbon stocks.

    PubMed

    Campbell, J Elliott; Moen, Jeremie C; Ney, Richard A; Schnoor, Jerald L

    2008-03-01

    Estimates of forest soil organic carbon (SOC) have applications in carbon science, soil quality studies, carbon sequestration technologies, and carbon trading. Forest SOC has been modeled using a regression coefficient methodology that applies mean SOC densities (mass/area) to broad forest regions. A higher resolution model is based on an approach that employs a geographic information system (GIS) with soil databases and satellite-derived landcover images. Despite this advancement, the regression approach remains the basis of current state and federal level greenhouse gas inventories. Both approaches are analyzed in detail for Wisconsin forest soils from 1983 to 2001, applying rigorous error-fixing algorithms to soil databases. Resulting SOC stock estimates are 20% larger when determined using the GIS method rather than the regression approach. Average annual rates of increase in SOC stocks are 3.6 and 1.0 million metric tons of carbon per year for the GIS and regression approaches respectively.

  11. Quantile regression applied to spectral distance decay

    USGS Publications Warehouse

    Rocchini, D.; Cade, B.S.

    2008-01-01

    Remotely sensed imagery has long been recognized as a powerful support for characterizing and estimating biodiversity. Spectral distance among sites has proven to be a powerful approach for detecting species composition variability. Regression analysis of species similarity versus spectral distance allows us to quantitatively estimate the amount of turnover in species composition with respect to spectral and ecological variability. In classical regression analysis, the residual sum of squares is minimized for the mean of the dependent variable distribution. However, many ecological data sets are characterized by a high number of zeroes that add noise to the regression model. Quantile regressions can be used to evaluate trend in the upper quantiles rather than a mean trend across the whole distribution of the dependent variable. In this letter, we used ordinary least squares (OLS) and quantile regressions to estimate the decay of species similarity versus spectral distance. The achieved decay rates were statistically nonzero (p < 0.01), considering both OLS and quantile regressions. Nonetheless, the OLS regression estimate of the mean decay rate was only half the decay rate indicated by the upper quantiles. Moreover, the intercept value, representing the similarity reached when the spectral distance approaches zero, was very low compared with the intercepts of the upper quantiles, which detected high species similarity when habitats are more similar. In this letter, we demonstrated the power of using quantile regressions applied to spectral distance decay to reveal species diversity patterns otherwise lost or underestimated by OLS regression. © 2008 IEEE.
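
    The contrast between a mean fit and upper-quantile fits can be sketched with statsmodels, as below; the variable names are hypothetical and the model form (species similarity regressed on spectral distance) follows the letter's description only loosely.

        # OLS slope versus upper-quantile slopes of similarity on spectral distance.
        import statsmodels.api as sm

        def decay_rates(similarity, spectral_dist, quantiles=(0.75, 0.9, 0.99)):
            X = sm.add_constant(spectral_dist)
            ols_slope = sm.OLS(similarity, X).fit().params[1]                 # mean decay rate
            q_slopes = {q: sm.QuantReg(similarity, X).fit(q=q).params[1] for q in quantiles}
            return ols_slope, q_slopes   # upper quantiles typically show steeper decay than the mean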

  12. Sequence analysis to assess labour market participation following vocational rehabilitation: an observational study among patients sick-listed with low back pain from a randomised clinical trial in Denmark.

    PubMed

    Lindholdt, Louise; Labriola, Merete; Nielsen, Claus Vinther; Horsbøl, Trine Allerslev; Lund, Thomas

    2017-07-20

    The return-to-work (RTW) process after long-term sickness absence is often complex and long and implies multiple shifts between different labour market states for the absentee. Standard methods for examining RTW research typically rely on the analysis of one outcome measure at a time, which will not capture the many possible states and transitions the absentee can go through. The purpose of this study was to explore the potential added value of sequence analysis in supplement to standard regression analysis of a multidisciplinary RTW intervention among patients with low back pain (LBP). The study population consisted of 160 patients randomly allocated to either a hospital-based brief or a multidisciplinary intervention. Data on labour market participation following intervention were obtained from a national register and analysed in two ways: as a binary outcome expressed as active or passive relief at a 1-year follow-up and as four different categories for labour market participation. Logistic regression and sequence analysis were performed. The logistic regression analysis showed no difference in labour market participation for patients in the two groups after 1 year. Applying sequence analysis showed differences in subsequent labour market participation after 2 years after baseline in favour of the brief intervention group versus the multidisciplinary intervention group. The study indicated that sequence analysis could provide added analytical value as a supplement to traditional regression analysis in prospective studies of RTW among patients with LBP. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Temporal Synchronization Analysis for Improving Regression Modeling of Fecal Indicator Bacteria Levels

    EPA Science Inventory

    Multiple linear regression models are often used to predict levels of fecal indicator bacteria (FIB) in recreational swimming waters based on independent variables (IVs) such as meteorologic, hydrodynamic, and water-quality measures. The IVs used for these analyses are traditiona...

  14. Micro- and macro-geographic scale effect on the molecular imprint of selection and adaptation in Norway spruce.

    PubMed

    Scalfi, Marta; Mosca, Elena; Di Pierro, Erica Adele; Troggio, Michela; Vendramin, Giovanni Giuseppe; Sperisen, Christoph; La Porta, Nicola; Neale, David B

    2014-01-01

    Forest tree species of temperate and boreal regions have undergone a long history of demographic changes and evolutionary adaptations. The main objective of this study was to detect signals of selection in Norway spruce (Picea abies [L.] Karst), at different sampling-scales and to investigate, accounting for population structure, the effect of environment on species genetic diversity. A total of 384 single nucleotide polymorphisms (SNPs) representing 290 genes were genotyped at two geographic scales: across 12 populations distributed along two altitudinal-transects in the Alps (micro-geographic scale), and across 27 populations belonging to the range of Norway spruce in central and south-east Europe (macro-geographic scale). At the macrogeographic scale, principal component analysis combined with Bayesian clustering revealed three major clusters, corresponding to the main areas of southern spruce occurrence, i.e. the Alps, Carpathians, and Hercynia. The populations along the altitudinal transects were not differentiated. To assess the role of selection in structuring genetic variation, we applied a Bayesian and coalescent-based F(ST)-outlier method and tested for correlations between allele frequencies and climatic variables using regression analyses. At the macro-geographic scale, the F(ST)-outlier methods detected together 11 F(ST)-outliers. Six outliers were detected when the same analyses were carried out taking into account the genetic structure. Regression analyses with population structure correction resulted in the identification of two (micro-geographic scale) and 38 SNPs (macro-geographic scale) significantly correlated with temperature and/or precipitation. Six of these loci overlapped with F(ST)-outliers, among them two loci encoding an enzyme involved in riboflavin biosynthesis and a sucrose synthase. The results of this study indicate a strong relationship between genetic and environmental variation at both geographic scales. It also suggests that an integrative approach combining different outlier detection methods and population sampling at different geographic scales is useful to identify loci potentially involved in adaptation.

  15. Forest Type and Above Ground Biomass Estimation Based on Sentinel-2A and WorldView-2 Data: Evaluation of Predictor and Data Suitability

    NASA Astrophysics Data System (ADS)

    Fritz, Andreas; Enßle, Fabian; Zhang, Xiaoli; Koch, Barbara

    2016-08-01

    The present study analyses the two earth observation sensors regarding their capability of modelling forest above-ground biomass and forest density. Our research is carried out at two different demonstration sites. The first is located in south-western Germany (region Karlsruhe) and the second in southern China in Jiangle County (Province Fujian). A set of spectral and spatial predictors is computed from both Sentinel-2A and WorldView-2 data. Window sizes in the range of 3*3 pixels to 21*21 pixels are computed in order to cover the full range of canopy sizes of mature forest stands. Textural predictors of first and second order (grey-level co-occurrence matrix) are calculated and further used within a feature selection procedure. Additionally, common spectral predictors from WorldView-2 and Sentinel-2A data, such as all relevant spectral bands and NDVI, are integrated into the analyses. To identify the most important predictors, a predictor selection algorithm is applied to the entire set of more than 1000 predictors; only the most important predictors are then further analysed. Predictor selection is done with the Boruta package in R (Kursa and Rudnicki, 2010), whereas regression is computed with random forest. Prior to the classification and regression, parameters are tuned by repetitive model selection (100 runs) based on .632 bootstrapping. Both are implemented in the caret R package (Kuhn et al., 2016). To account for variability in the data set, 100 independent runs are performed. Within each run, 80 percent of the data are used for training and the remaining 20 percent for independent validation. With this subset of predictors, mapping of above-ground biomass is performed.
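
    The authors work in R with Boruta, random forest, and caret; the Python sketch below stands in for that workflow, using forest-importance screening in place of Boruta and 100 repeated 80/20 splits in place of the caret resampling. It is not their code, and all thresholds are assumptions.

        # Screen predictors by forest importance, then validate a random forest on repeated splits.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.feature_selection import SelectFromModel
        from sklearn.metrics import r2_score
        from sklearn.model_selection import train_test_split

        def repeated_rf(X, y, n_runs=100):
            selector = SelectFromModel(RandomForestRegressor(n_estimators=500, random_state=0),
                                       threshold="median").fit(X, y)
            X_sel = selector.transform(X)                      # keep the more important predictors

            scores = []
            for run in range(n_runs):
                X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.2, random_state=run)
                rf = RandomForestRegressor(n_estimators=500, random_state=run).fit(X_tr, y_tr)
                scores.append(r2_score(y_te, rf.predict(X_te)))
            return np.mean(scores), np.std(scores)             # validation R2 across the 100 runs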

  16. Micro- and Macro-Geographic Scale Effect on the Molecular Imprint of Selection and Adaptation in Norway Spruce

    PubMed Central

    Scalfi, Marta; Mosca, Elena; Di Pierro, Erica Adele; Troggio, Michela; Vendramin, Giovanni Giuseppe; Sperisen, Christoph; La Porta, Nicola; Neale, David B.

    2014-01-01

    Forest tree species of temperate and boreal regions have undergone a long history of demographic changes and evolutionary adaptations. The main objective of this study was to detect signals of selection in Norway spruce (Picea abies [L.] Karst), at different sampling-scales and to investigate, accounting for population structure, the effect of environment on species genetic diversity. A total of 384 single nucleotide polymorphisms (SNPs) representing 290 genes were genotyped at two geographic scales: across 12 populations distributed along two altitudinal-transects in the Alps (micro-geographic scale), and across 27 populations belonging to the range of Norway spruce in central and south-east Europe (macro-geographic scale). At the macrogeographic scale, principal component analysis combined with Bayesian clustering revealed three major clusters, corresponding to the main areas of southern spruce occurrence, i.e. the Alps, Carpathians, and Hercynia. The populations along the altitudinal transects were not differentiated. To assess the role of selection in structuring genetic variation, we applied a Bayesian and coalescent-based F ST-outlier method and tested for correlations between allele frequencies and climatic variables using regression analyses. At the macro-geographic scale, the F ST-outlier methods detected together 11 F ST-outliers. Six outliers were detected when the same analyses were carried out taking into account the genetic structure. Regression analyses with population structure correction resulted in the identification of two (micro-geographic scale) and 38 SNPs (macro-geographic scale) significantly correlated with temperature and/or precipitation. Six of these loci overlapped with F ST-outliers, among them two loci encoding an enzyme involved in riboflavin biosynthesis and a sucrose synthase. The results of this study indicate a strong relationship between genetic and environmental variation at both geographic scales. It also suggests that an integrative approach combining different outlier detection methods and population sampling at different geographic scales is useful to identify loci potentially involved in adaptation. PMID:25551624

  17. Correlation between the complex PSA/total PSA ratio and the free PSA/total PSA ratio, sensitivity and specificity of both markers for the diagnosis of prostate cancer.

    PubMed

    Pérez-Lanzac-Lorca, A; Barco-Sánchez, A; Romero, E; Martinez-Peinado, A; López-Elorza, F; Sanchez-Sanchez, E; Alvarez-Ossorio-Fernandez, J L; Castiñeiras-Fernández, J

    2013-09-01

    To compare the behaviour of the PSAcomplex/PSAtotal percentage (PSAc%) against the PSA free/PSA total (PSAl%) and analyse both markers for their usefulness in diagnosing prostate cancer. We measured total PSA (PSAt), free PSA (PSAl), complex PSA (PSAc), PSAl% and PSAc% levels in 158 patients. Of these, 98 (62%) were biopsied for presenting PSAt≥3 ng/dl and PSAl%<20, PSAt>10, suspicious rectal examination or suspicious ultrasound node. We performed linear regression and Passing-Bablok regression analyses. The ROC curves were calculated to study the sensitivity and specificity of PSAl% and PSAc% and were compared to each other. The prostate cancer diagnoses were analysed by PSAl% and PSAc% by applying the χ(2) test. The correlation coefficient (r) was good (0.7447, P<.0001), and the index of determination (r(2)) was 0.5. The result of the Passing-Bablok analysis was a slope of 1.658 (1.452 to 1.897) and an intercept of 2.044 (-0.936 to 5.393). The optimal cutoff for PSAl% (≤14.7854) showed a sensitivity of 89.29% [95% CI, 0.642-0.823] and a specificity of 54.29% (95% CI, 0.642-0.823). The optimal cutoff for PSAc% (>89.7796) had a sensitivity of 71.43% (95% CI, 0.616-0.802) and a specificity of 71.43% (95% CI, 0.616-0.802). There were no significant differences when comparing the areas under the curve of both markers (P=.59). The PPV of PSAl% was less than that of PSAc% (45.7% vs. 71%). There was a good correlation between PSAl% and PSAc%. PSAc% has demonstrated greater specificity and efficacy than PSAl% in the diagnosis of prostate cancer. Copyright © 2012 AEU. Published by Elsevier Espana. All rights reserved.

  18. Effect of compression load and temperature on thermomechanical tests for gutta-percha and Resilon®.

    PubMed

    Tanomaru-Filho, M; Silveira, G F; Reis, J M S N; Bonetti-Filho, I; Guerreiro-Tanomaru, J M

    2011-11-01

    To analyse a method used to evaluate the thermomechanical properties of gutta-percha and Resilon(®) at different temperatures and compression loads. Two hundred and seventy specimens measuring 10 mm in diameter and 1.5 mm in height were made from the following materials: conventional gutta-percha (GCO), thermoplastic gutta-percha (GTP) and Resilon(®) cones (RE). After 24 h, the specimens were placed in water at 50 °C, 60 °C or 70 °C for 60 s. After that, specimens were placed between two glass slabs, and loads weighing 1.0, 3.0 or 5.0 kg were applied. Images of the specimens were digitized before and after the test and analysed using imaging software to determine their initial and final areas. The thermomechanical property of each material was determined by the difference between the initial and final areas of the specimens. Data were subjected to anova and SNK tests at 5% significance. To verify a possible correlation between the results of the materials, linear regression coefficients (r) were calculated. Data showed higher flow area values for RE under all compression loads at 70 °C and under the 5.0 kg load at 60 °C (P < 0.05). Regarding gutta-percha, GTP showed higher flow under loads weighing 3.0 and 5.0 kg, at 60 and 70 °C (P < 0.05). GCO presented higher flow at 70 °C with a load of 5.0 kg. Regression analyses showed a poor linear correlation amongst the results of the materials under the different experimental conditions. Gutta-percha and Resilon(®) cones require different compression loads and temperatures for evaluation of their thermomechanical properties. For all materials, the greatest flow occurred at 70 °C under a load of 5.0 kg; therefore, these parameters may be adopted when evaluating endodontic filling materials. © 2011 International Endodontic Journal.

  19. Acute imaging does not improve ASTRAL score's accuracy despite having a prognostic value.

    PubMed

    Ntaios, George; Papavasileiou, Vasileios; Faouzi, Mohamed; Vanacker, Peter; Wintermark, Max; Michel, Patrik

    2014-10-01

    The ASTRAL score was recently shown to reliably predict three-month functional outcome in patients with acute ischemic stroke. The study aims to investigate whether information from multimodal imaging increases ASTRAL score's accuracy. All patients registered in the ASTRAL registry until March 2011 were included. In multivariate logistic-regression analyses, we added covariates derived from parenchymal, vascular, and perfusion imaging to the 6-parameter model of the ASTRAL score. If a specific imaging covariate remained an independent predictor of three-month modified Rankin score>2, the area-under-the-curve (AUC) of this new model was calculated and compared with ASTRAL score's AUC. We also performed similar logistic regression analyses in arbitrarily chosen patient subgroups. When added to the ASTRAL score, the following covariates on admission computed tomography/magnetic resonance imaging-based multimodal imaging were not significant predictors of outcome: any stroke-related acute lesion, any nonstroke-related lesions, chronic/subacute stroke, leukoaraiosis, significant arterial pathology in ischemic territory on computed tomography angiography/magnetic resonance angiography/Doppler, significant intracranial arterial pathology in ischemic territory, and focal hypoperfusion on perfusion-computed tomography. The Alberta Stroke Program Early CT score on plain imaging and any significant extracranial arterial pathology on computed tomography angiography/magnetic resonance angiography/Doppler were independent predictors of outcome (odds ratio: 0.93, 95% CI: 0.87-0.99 and odds ratio: 1.49, 95% CI: 1.08-2.05, respectively) but did not increase ASTRAL score's AUC (0.849 vs. 0.850, and 0.8563 vs. 0.8564, respectively). In exploratory analyses in subgroups of different prognosis, age or stroke severity, no covariate was found to increase ASTRAL score's AUC, either. The addition of information derived from multimodal imaging does not increase ASTRAL score's accuracy to predict functional outcome despite having an independent prognostic value. More selected radiological parameters applied in specific subgroups of stroke patients may add prognostic value of multimodal imaging. © 2014 World Stroke Organization.

  20. Prognosis Related to Metastatic Burden Measured by 18F-Fluorocholine PET/CT in Castrate Resistant Prostate Cancer

    PubMed Central

    Kwee, Sandi A.; Lim, John; Watanabe, Alex; Kromer-Baker, Kathleen; Coel, Marc N.

    2015-01-01

    This study investigates the prognostic significance of metabolically active tumor volume (MATV) measurements applied to fluorine-18 fluorocholine (FC) PET/CT in castrate-resistant prostate cancer (CRPC). Methods: FC PET/CT imaging was performed in 30 patients with CRPC. Metastatic disease was quantified on the basis of maximum standardized uptake value (SUVmax), MATV, and total lesion activity (TLA = MATV × mean SUV). Tumor burden indices derived from whole-body summation of PET tumor volume measurements (i.e., net MATV and net TLA) were evaluated as variables in Cox regression and Kaplan-Meier survival analyses. Results: Net MATV ranged from 0.12 cm3 to 1543.9 cm3 (median 52.6 cm3). Net TLA ranged from 0.40g to 6688.7g (median 225.1g). PSA level at the time of PET correlated significantly with net MATV (Pearson r = 0.65, p = 0.0001) and net TLA (r = 0.60, p = 0.0005) but not highest lesional SUVmax of each scan. Survivors were followed for a median 23 months (range, 6-38 months). On Cox regression analyses, overall survival was significantly associated with net MATV (p = 0.0068), net TLA (p = 0.0072), and highest lesion SUVmax (p = 0.0173), and borderline associated with PSA level (p = 0.0458). Only net MATV and net TLA remained significant in univariate-adjusted survival analyses. Kaplan-Meier analysis demonstrated significant differences in survival between groups stratified by median net MATV (log-rank P = 0.0371), net TLA (log-rank P = 0.0371), and highest lesion SUVmax (log-rank P = 0.0223). Conclusions: Metastatic prostate cancer detected by FC PET/CT can be quantified based on volumetric measurements of tumor metabolic activity. The prognostic value of FC PET/CT may stem from this capacity to assess whole-body tumor burden. With further clinical validation, FC PET-based indices of global disease activity and mortality risk could prove useful in patient-individualized treatment of CRPC. PMID:24676753

  1. Does more education mean less disability in people with dementia? A large cross-sectional study in Taiwan.

    PubMed

    Huang, Shih-Wei; Chi, Wen-Chou; Yen, Chia-Feng; Chang, Kwang-Hwa; Liao, Hua-Fang; Escorpizo, Reuben; Chang, Feng-Hang; Liou, Tsan-Hon

    2017-05-04

    WHO Disability Assessment Schedule 2.0 (WHODAS 2.0) is a feasible tool for assessing functional disability and analysing the risk of institutionalisation among elderly patients with dementia. However, data on the effect of education on disability status in patients with dementia are lacking. The aim of this large-scale, population-based study was to analyse the effect of education on the disability status of elderly Taiwanese patients with dementia by using WHODAS 2.0. From the Taiwan Data Bank of Persons with Disability, we enrolled 7698 disabled elderly (older than 65 years) patients diagnosed with dementia between July 2012 and January 2014. According to their education status, we categorised these patients into groups with and without formal education (3849 patients each). We controlled for the demographic variables through propensity score matching. The standardised scores of these patients in the six domains of WHODAS 2.0 were evaluated by certified interviewers. Student's t-test was used for comparing the WHODAS 2.0 scores of patients with dementia in the two aforementioned groups. Poisson regression was applied to analyse the association among all the investigated variables. Patients with formal education had lower disability status in the domains of getting along and social participation than did patients without formal education. Poisson regression revealed that standardised scores in all domains of WHODAS 2.0, except self-care, were associated with education status. This study revealed lower disability status in the WHODAS 2.0 domains of getting along and social participation for patients with dementia with formal education compared with those without formal education. For patients with disability and dementia without formal education, community interventions promoting social participation should be implemented to maintain social interaction ability. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
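
    A minimal sketch of the Poisson regression step the abstract describes, relating a standardised WHODAS 2.0 domain score to education status; all variable names, effect sizes, and data below are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
education = rng.binomial(1, 0.5, n)            # 1 = any formal education (hypothetical)
age = rng.integers(65, 95, n)
# Hypothetical count-like standardised score, lower with education
rate = np.exp(3.0 - 0.15 * education + 0.01 * (age - 65))
score = rng.poisson(rate)

df = pd.DataFrame({"score": score, "education": education, "age": age})
model = smf.glm("score ~ education + age", data=df,
                family=sm.families.Poisson()).fit()
print(model.summary())
print("Rate ratio for education:", np.exp(model.params["education"]))
```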

  2. A Combined Pathway and Regional Heritability Analysis Indicates NETRIN1 Pathway Is Associated With Major Depressive Disorder.

    PubMed

    Zeng, Yanni; Navarro, Pau; Fernandez-Pujals, Ana M; Hall, Lynsey S; Clarke, Toni-Kim; Thomson, Pippa A; Smith, Blair H; Hocking, Lynne J; Padmanabhan, Sandosh; Hayward, Caroline; MacIntyre, Donald J; Wray, Naomi R; Deary, Ian J; Porteous, David J; Haley, Chris S; McIntosh, Andrew M

    2017-02-15

    Genome-wide association studies (GWASs) of major depressive disorder (MDD) have identified few significant associations. Testing the aggregation of genetic variants, in particular biological pathways, may be more powerful. Regional heritability analysis can be used to detect genomic regions that contribute to disease risk. We integrated pathway analysis and multilevel regional heritability analyses in a pipeline designed to identify MDD-associated pathways. The pipeline was applied to two independent GWAS samples [Generation Scotland: The Scottish Family Health Study (GS:SFHS, N = 6455) and Psychiatric Genomics Consortium (PGC:MDD) (N = 18,759)]. A polygenic risk score (PRS) composed of single nucleotide polymorphisms from the pathway most consistently associated with MDD was created, and its accuracy to predict MDD, using area under the curve, logistic regression, and linear mixed model analyses, was tested. In GS:SFHS, four pathways were significantly associated with MDD, and two of these explained a significant amount of pathway-level regional heritability. In PGC:MDD, one pathway was significantly associated with MDD. Pathway-level regional heritability was significant in this pathway in one subset of PGC:MDD. For both samples the regional heritabilities were further localized to the gene and subregion levels. The NETRIN1 signaling pathway showed the most consistent association with MDD across the two samples. PRSs from this pathway showed competitive predictive accuracy compared with the whole-genome PRSs when using area under the curve statistics, logistic regression, and linear mixed model. These post-GWAS analyses highlight the value of combining multiple methods on multiple GWAS data for the identification of risk pathways for MDD. The NETRIN1 signaling pathway is identified as a candidate pathway for MDD and should be explored in further large population studies. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  3. The effects of normal aging on multiple aspects of financial decision-making

    PubMed Central

    Bangma, Dorien F.; Fuermaier, Anselm B. M.; Tucha, Lara; Tucha, Oliver; Koerts, Janneke

    2017-01-01

    Objectives: Financial decision-making (FDM) is crucial for independent living. Due to cognitive decline that accompanies normal aging, older adults might have difficulties in some aspects of FDM. However, improved knowledge, personal experience and affective decision-making, which are also related to normal aging, may lead to stable or even improved age-related performance in some other aspects of FDM. Therefore, the present explorative study examines the effects of normal aging on multiple aspects of FDM. Methods: One hundred and eighty participants (range 18–87 years) were assessed with eight FDM tests and several standard neuropsychological tests. Age effects were evaluated using hierarchical multiple regression analyses. The validity of the prediction models was examined by internal validation (i.e., a bootstrap resampling procedure) as well as external validation on another, independent sample of participants (n = 124). Multiple regression and correlation analyses were applied to investigate the mediation effect of standard measures of cognition on the observed effects of age on FDM. Results: At a relatively basic level of FDM (e.g., paying bills or using FDM styles), no significant effects of aging were found. However, more complex FDM, such as making decisions in accordance with specific rules, becomes more difficult with advancing age. Furthermore, an older age was found to be related to a decreased sensitivity to impulsive buying. These results were confirmed by the internal and external validation analyses. Mediation effects of numeracy and planning were found to explain parts of the association between one aspect of FDM (i.e., competence in decision rules) and age; however, these cognitive domains were not able to completely explain the relation between age and FDM. Conclusion: Normal aging has a negative influence on a complex aspect of FDM; however, other aspects appear to be unaffected by normal aging or even to improve. PMID:28792973
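
    A small sketch of a bootstrap internal validation (optimism correction) for a regression of a hypothetical FDM score on age; the data-generating model and the choice of R2 as the validated metric are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)
n = 180
age = rng.uniform(18, 87, n)
fdm = 50 - 0.12 * age + rng.normal(0, 5, n)    # hypothetical decline with age
X, y = age.reshape(-1, 1), fdm

apparent = r2_score(y, LinearRegression().fit(X, y).predict(X))
optimism = []
for _ in range(500):
    idx = rng.integers(0, n, n)                # bootstrap resample
    m = LinearRegression().fit(X[idx], y[idx])
    # optimism = performance on the resample minus performance on the original data
    optimism.append(r2_score(y[idx], m.predict(X[idx])) - r2_score(y, m.predict(X)))

print(f"apparent R2 = {apparent:.3f}, "
      f"optimism-corrected R2 = {apparent - np.mean(optimism):.3f}")
```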

  4. Numerically accurate computational techniques for optimal estimator analyses of multi-parameter models

    NASA Astrophysics Data System (ADS)

    Berger, Lukas; Kleinheinz, Konstantin; Attili, Antonio; Bisetti, Fabrizio; Pitsch, Heinz; Mueller, Michael E.

    2018-05-01

    Modelling unclosed terms in partial differential equations typically involves two steps: First, a set of known quantities needs to be specified as input parameters for a model, and second, a specific functional form needs to be defined to model the unclosed terms by the input parameters. Both steps involve a certain modelling error, with the former known as the irreducible error and the latter referred to as the functional error. Typically, only the total modelling error, which is the sum of functional and irreducible error, is assessed, but the concept of the optimal estimator enables the separate analysis of the total and the irreducible errors, yielding a systematic modelling error decomposition. In this work, attention is paid to the techniques themselves required for the practical computation of irreducible errors. Typically, histograms are used for optimal estimator analyses, but this technique is found to add a non-negligible spurious contribution to the irreducible error if models with multiple input parameters are assessed. Thus, the error decomposition of an optimal estimator analysis becomes inaccurate, and misleading conclusions concerning modelling errors may be drawn. In this work, numerically accurate techniques for optimal estimator analyses are identified and a suitable evaluation of irreducible errors is presented. Four different computational techniques are considered: a histogram technique, artificial neural networks, multivariate adaptive regression splines, and an additive model based on a kernel method. For multiple input parameter models, only artificial neural networks and multivariate adaptive regression splines are found to yield satisfactorily accurate results. Beyond a certain number of input parameters, the assessment of models in an optimal estimator analysis even becomes practically infeasible if histograms are used. The optimal estimator analysis in this paper is applied to modelling the filtered soot intermittency in large eddy simulations using a dataset of a direct numerical simulation of a non-premixed sooting turbulent flame.
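
    The following sketch illustrates the histogram (binned-means) technique for an optimal estimator analysis with two input parameters: the conditional mean of the target given the inputs is approximated by binned means, and the irreducible error is the mean squared deviation of the target from that conditional mean. The synthetic target and noise level are invented so that the true irreducible error is known and the bias introduced by the bin resolution is visible.

```python
import numpy as np
from scipy.stats import binned_statistic_2d

rng = np.random.default_rng(4)
n = 200_000
p1, p2 = rng.uniform(0, 1, n), rng.uniform(0, 1, n)   # two input parameters
noise_var = 0.05**2                                    # true irreducible error
q = np.sin(2 * np.pi * p1) * p2 + rng.normal(0, np.sqrt(noise_var), n)

for bins in (8, 32, 128):
    means, xe, ye, binnum = binned_statistic_2d(
        p1, p2, q, statistic="mean", bins=bins, expand_binnumbers=True)
    cond_mean = means[binnum[0] - 1, binnum[1] - 1]    # binned estimate of E[q | p1, p2]
    irr = np.nanmean((q - cond_mean) ** 2)             # estimated irreducible error
    print(f"{bins:4d} bins per axis: estimated irreducible error = {irr:.5f} "
          f"(true noise variance = {noise_var:.5f})")
```

    Coarse bins inflate the estimate because the binned mean cannot follow the underlying function, while very fine bins start fitting the noise; smoother regression-based estimators (such as neural networks or regression splines) avoid this trade-off, which is the point the paper makes for multi-parameter models.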

  5. Missing Data in Clinical Studies: Issues and Methods

    PubMed Central

    Ibrahim, Joseph G.; Chu, Haitao; Chen, Ming-Hui

    2012-01-01

    Missing data are a prevailing problem in any type of data analyses. A participant variable is considered missing if the value of the variable (outcome or covariate) for the participant is not observed. In this article, various issues in analyzing studies with missing data are discussed. Particularly, we focus on missing response and/or covariate data for studies with discrete, continuous, or time-to-event end points in which generalized linear models, models for longitudinal data such as generalized linear mixed effects models, or Cox regression models are used. We discuss various classifications of missing data that may arise in a study and demonstrate in several situations that the commonly used method of throwing out all participants with any missing data may lead to incorrect results and conclusions. The methods described are applied to data from an Eastern Cooperative Oncology Group phase II clinical trial of liver cancer and a phase III clinical trial of advanced non–small-cell lung cancer. Although the main area of application discussed here is cancer, the issues and methods we discuss apply to any type of study. PMID:22649133

  6. Enhancement of docosahexaenoic acid production by Schizochytrium SW1 using response surface methodology

    NASA Astrophysics Data System (ADS)

    Nazir, Mohd Yusuf Mohd; Al-Shorgani, Najeeb Kaid Nasser; Kalil, Mohd Sahaid; Hamid, Aidil Abdul

    2015-09-01

    In this study, three factors (fructose concentration, agitation speed and monosodium glutamate (MSG) concentration) were optimized to enhance DHA production by Schizochytrium SW1 using response surface methodology (RSM). A central composite design was applied as the experimental design, and analysis of variance (ANOVA) was used to analyze the data. The experiments were conducted in 500-mL flasks with a 100-mL working volume at 30°C for 96 hours. ANOVA revealed that the process was adequately represented by the quadratic model (p<0.0001) and that two of the factors, namely agitation speed and MSG concentration, significantly affected DHA production (p<0.005). The level of influence of each variable and a quadratic polynomial equation for DHA production were obtained by multiple regression analyses. The estimated optimum conditions for maximizing DHA production by SW1 were 70 g/L fructose, 250 rpm agitation speed and 12 g/L MSG. Consequently, the quadratic model was validated by applying the estimated optimum conditions, which confirmed the model validity, and 52.86% DHA was produced.
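
    A generic sketch of the regression step behind such an RSM study: fitting a full quadratic model for a response in three coded factors by ordinary least squares. The simulated response below is a placeholder, not the study's DHA measurements.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 20
# Coded factor levels as in a central composite design (here simply sampled)
df = pd.DataFrame(rng.uniform(-1, 1, size=(n, 3)),
                  columns=["fructose", "agitation", "msg"])
df["dha"] = (10 + 2 * df["agitation"] + 1.5 * df["msg"]
             - 3 * df["agitation"] ** 2 - 2 * df["msg"] ** 2
             + rng.normal(0, 0.3, n))                  # made-up response surface

model = smf.ols("dha ~ fructose + agitation + msg"
                " + I(fructose**2) + I(agitation**2) + I(msg**2)"
                " + fructose:agitation + fructose:msg + agitation:msg",
                data=df).fit()
print(model.summary())   # quadratic polynomial equation and significance of terms
```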

  7. Random regression analyses using B-splines functions to model growth from birth to adult age in Canchim cattle.

    PubMed

    Baldi, F; Alencar, M M; Albuquerque, L G

    2010-12-01

    The objective of this work was to estimate covariance functions using random regression models on B-spline functions of animal age, for weights from birth to adult age in Canchim cattle. Data comprised 49,011 records on 2435 females. The model of analysis included fixed effects of contemporary groups, age of dam as a quadratic covariate and the population mean trend taken into account by a cubic regression on orthogonal polynomials of animal age. Residual variances were modelled through a step function with four classes. The direct and maternal additive genetic effects, and animal and maternal permanent environmental effects, were included as random effects in the model. A total of seventeen analyses, considering linear, quadratic and cubic B-spline functions and up to seven knots, were carried out. B-spline functions of the same order were considered for all random effects. Random regression models on B-spline functions were compared with a random regression model on Legendre polynomials and with a multitrait model. Results from the different models of analysis were compared using the REML form of the Akaike information criterion and Schwarz's Bayesian information criterion. In addition, the variance components and genetic parameters estimated for each random regression model were also used as criteria to choose the most adequate model to describe the covariance structure of the data. A model fitting quadratic B-splines, with four knots or three segments for the direct additive genetic effect and animal permanent environmental effect and two knots for the maternal additive genetic effect and maternal permanent environmental effect, was the most adequate to describe the covariance structure of the data. Random regression models using B-spline functions as base functions fitted the data better than Legendre polynomials, especially at mature ages, but a larger number of parameters needs to be estimated with B-spline functions. © 2010 Blackwell Verlag GmbH.
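
    As a small illustration of the covariate-construction step, the sketch below builds a quadratic B-spline basis over a hypothetical age range; fitting the full animal model (direct and maternal genetic and permanent environmental effects under REML) requires specialised software and is not shown.

```python
import numpy as np
from patsy import dmatrix

age_days = np.linspace(1, 3000, 200)    # birth to adult age, hypothetical grid
# Quadratic (degree-2) B-splines; df controls the number of basis functions,
# and hence the number of knots/segments used for each random effect.
basis = dmatrix("bs(age, degree=2, df=4, include_intercept=True) - 1",
                {"age": age_days}, return_type="dataframe")
print(basis.shape)      # (200, 4): one column per B-spline basis function
print(basis.head())
```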

  8. Deriving the Regression Equation without Using Calculus

    ERIC Educational Resources Information Center

    Gordon, Sheldon P.; Gordon, Florence S.

    2004-01-01

    Probably the one "new" mathematical topic that is most responsible for modernizing courses in college algebra and precalculus over the last few years is the idea of fitting a function to a set of data in the sense of a least squares fit. Whether it be simple linear regression or nonlinear regression, this topic opens the door to applying the…
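
    For reference, the standard least-squares line can indeed be obtained without calculus by completing the square; this is the textbook result, not necessarily the derivation presented in the article.

```latex
% Least-squares line \hat{y} = a + bx obtained by completing the square.
\begin{align*}
S(a,b) &= \sum_{i=1}^{n}\bigl(y_i - a - b x_i\bigr)^2
        = n\bigl(\bar{y} - a - b\bar{x}\bigr)^2
        + S_{xx}\Bigl(b - \tfrac{S_{xy}}{S_{xx}}\Bigr)^2
        + \Bigl(S_{yy} - \tfrac{S_{xy}^2}{S_{xx}}\Bigr),\\
S_{xx} &= \sum_i (x_i-\bar{x})^2,\qquad
S_{xy} = \sum_i (x_i-\bar{x})(y_i-\bar{y}),\qquad
S_{yy} = \sum_i (y_i-\bar{y})^2.
\end{align*}
% Both squared terms are nonnegative, so S(a,b) is minimised by making them zero:
\[
b = \frac{S_{xy}}{S_{xx}}, \qquad a = \bar{y} - b\,\bar{x}.
\]
```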

  9. After accounting for competing causes of death and more advanced stage, do Aboriginal and Torres Strait Islander peoples with cancer still have worse survival? A population-based cohort study in New South Wales.

    PubMed

    Tervonen, Hanna E; Walton, Richard; You, Hui; Baker, Deborah; Roder, David; Currow, David; Aranda, Sanchia

    2017-06-02

    Aboriginal and Torres Strait Islander peoples in Australia have been found to have poorer cancer survival than non-Aboriginal people. However, use of conventional relative survival analyses is limited due to a lack of life tables. This cohort study examined whether poorer survival persists after accounting for competing risks of death from other causes and disparities in cancer stage at diagnosis, for all cancers collectively and by cancer site. People diagnosed in 2000-2008 were extracted from the population-based New South Wales Cancer Registry. Aboriginal status was multiply imputed for people with missing information (12.9%). Logistic regression models were used to compute odds ratios (ORs) with 95% confidence intervals (CIs) for 'advanced stage' at diagnosis (separately for distant and distant/regional stage). Survival was examined using competing risk regression to compute subhazard ratios (SHRs) with 95%CIs. Of the 301,356 cases, 2517 (0.84%) identified as Aboriginal (0.94% after imputation). After adjusting for age, sex, year of diagnosis, socio-economic status, remoteness, and cancer site, Aboriginal peoples were more likely to be diagnosed with distant (OR 1.30, 95%CI 1.17-1.44) or distant/regional stage (OR 1.29, 95%CI 1.18-1.40) for all cancers collectively. This applied to cancers of the female breast, uterus, prostate, kidney, others (those not included in other categories) and cervix (when analyses were restricted to cases with known stages/known Aboriginal status). Aboriginal peoples had a higher hazard of death than non-Aboriginal people after accounting for competing risks from other causes of death, socio-demographic factors, stage and cancer site (SHR 1.40, 95%CI 1.31-1.50 for all cancers collectively). Consistent results applied to colorectal, lung, breast, prostate and other cancers. Aboriginal peoples with cancer have an elevated hazard of cancer death compared with non-Aboriginal people, after accounting for more advanced stage and competing causes of death. Further research is needed to determine the reasons, including any contribution of co-morbidity, lifestyle factors and differentials in service access, to help explain these disparities.

  10. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single-trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction toward fully uncovering the genetic structure of multiple phenotypes.

  11. Stata Modules for Calculating Novel Predictive Performance Indices for Logistic Models.

    PubMed

    Barkhordari, Mahnaz; Padyab, Mojgan; Hadaegh, Farzad; Azizi, Fereidoun; Bozorgmanesh, Mohammadreza

    2016-01-01

    Prediction is a fundamental part of prevention of cardiovascular diseases (CVD). The development of prediction algorithms based on multivariate regression models began several decades ago. In parallel with the development of predictive models, biomarker research has emerged on an impressively large scale. The key question is how best to assess and quantify the improvement in risk prediction offered by new biomarkers or, more basically, how to assess the performance of a risk prediction model. Discrimination, calibration, and added predictive value have recently been suggested for comparing the predictive performance of models with and without novel biomarkers. Lack of user-friendly statistical software has restricted implementation of novel model assessment methods while examining novel biomarkers. We therefore intended to develop user-friendly software that could be used by researchers with few programming skills. We have written a Stata command that is intended to help researchers obtain the cut point-free and cut point-based net reclassification improvement index (NRI) and the relative and absolute integrated discrimination improvement index (IDI) for logistic-based regression analyses. We applied the command to real data on women participating in the Tehran Lipid and Glucose Study (TLGS) to examine whether information on a family history of premature CVD, waist circumference, and fasting plasma glucose can improve the predictive performance of the Framingham "general CVD risk" algorithm. The command is addpred for logistic regression models. The Stata package provided herein can encourage the use of novel methods in examining the predictive capacity of the ever-emerging plethora of novel biomarkers.
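
    A sketch of the usual textbook definitions of the cut point-free (continuous) NRI and the absolute IDI, written in Python rather than Stata; this is not the addpred code itself, and the predicted risks below are invented.

```python
import numpy as np

def continuous_nri(y, p_old, p_new):
    """Cut point-free NRI: net upward reclassification in events plus
    net downward reclassification in non-events."""
    up, down = p_new > p_old, p_new < p_old
    event, nonevent = y == 1, y == 0
    nri_events = up[event].mean() - down[event].mean()
    nri_nonevents = down[nonevent].mean() - up[nonevent].mean()
    return nri_events + nri_nonevents

def absolute_idi(y, p_old, p_new):
    """Absolute IDI: change in discrimination slope between models."""
    event, nonevent = y == 1, y == 0
    slope_new = p_new[event].mean() - p_new[nonevent].mean()
    slope_old = p_old[event].mean() - p_old[nonevent].mean()
    return slope_new - slope_old

# Hypothetical predicted risks for 6 subjects (1 = CVD event)
y = np.array([1, 1, 0, 0, 1, 0])
p_old = np.array([0.30, 0.60, 0.20, 0.40, 0.50, 0.10])
p_new = np.array([0.45, 0.55, 0.15, 0.35, 0.65, 0.12])
print("continuous NRI:", continuous_nri(y, p_old, p_new))
print("absolute IDI:", absolute_idi(y, p_old, p_new))
```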

  12. Obesity and changes in urine albumin/creatinine ratio in patients with type 2 diabetes: the DEMAND study.

    PubMed

    Rossi, M C E; Nicolucci, A; Pellegrini, F; Comaschi, M; Ceriello, A; Cucinotta, D; Giorda, C; Pomili, B; Valentini, U; Vespasiani, G; De Cosmo, S

    2010-02-01

    Obesity is a potential risk factor for renal disease in non-diabetic subjects. It remains unclear whether this also applies to diabetic patients. We investigated whether obesity predicted changes in albumin excretion rate in individuals with type 2 diabetes. Fifty Italian diabetes outpatient clinics enrolled a random sample of 1289 patients. A morning spot urine sample was collected to determine urinary albumin/creatinine ratio (ACR) at baseline and after 1 year from the study initiation. Progression of albumin excretion was defined as a doubling in ACR, while regression was defined as a 50% reduction. Multivariate logistic regression analyses were used to evaluate correlates of these outcomes. Data are expressed as odds ratios (OR) with 95% confidence intervals (CI). The risk of progression increased by 7% (OR=1.07; 95%CI 1.00-1.15) for every 5-cm increase in waist circumference measured at baseline, and by 17% (OR=1.17; 95%CI 1.03-1.33) for every one-unit increase in BMI during follow-up. The likelihood of regression was not independently associated with any of the variables investigated. The effect of obesity on progression of ACR was independent of metabolic control, blood pressure, treatment, and baseline level of albumin excretion. We found a tight link between obesity and changes in albumin excretion in diabetic subjects, suggesting potential benefits of interventions on body weight on end-organ renal damage. Copyright 2009 Elsevier B.V. All rights reserved.

  13. Estimates of Flow Duration, Mean Flow, and Peak-Discharge Frequency Values for Kansas Stream Locations

    USGS Publications Warehouse

    Perry, Charles A.; Wolock, David M.; Artman, Joshua C.

    2004-01-01

    Streamflow statistics of flow duration and peak-discharge frequency were estimated for 4,771 individual locations on streams listed on the 1999 Kansas Surface Water Register. These statistics included the flow-duration values of 90, 75, 50, 25, and 10 percent, as well as the mean flow value. Peak-discharge frequency values were estimated for the 2-, 5-, 10-, 25-, 50-, and 100-year floods. Least-squares multiple regression techniques were used, along with Tobit analyses, to develop equations for estimating flow-duration values of 90, 75, 50, 25, and 10 percent and the mean flow for uncontrolled flow stream locations. The contributing-drainage areas of 149 U.S. Geological Survey streamflow-gaging stations in Kansas and parts of surrounding States that had flow uncontrolled by Federal reservoirs and used in the regression analyses ranged from 2.06 to 12,004 square miles. Logarithmic transformations of climatic and basin data were performed to yield the best linear relation for developing equations to compute flow durations and mean flow. In the regression analyses, the significant climatic and basin characteristics, in order of importance, were contributing-drainage area, mean annual precipitation, mean basin permeability, and mean basin slope. The analyses yielded a model standard error of prediction range of 0.43 logarithmic units for the 90-percent duration analysis to 0.15 logarithmic units for the 10-percent duration analysis. The model standard error of prediction was 0.14 logarithmic units for the mean flow. Regression equations used to estimate peak-discharge frequency values were obtained from a previous report, and estimates for the 2-, 5-, 10-, 25-, 50-, and 100-year floods were determined for this report. The regression equations and an interpolation procedure were used to compute flow durations, mean flow, and estimates of peak-discharge frequency for locations along uncontrolled flow streams on the 1999 Kansas Surface Water Register. Flow durations, mean flow, and peak-discharge frequency values determined at available gaging stations were used to interpolate the regression-estimated flows for the stream locations where available. Streamflow statistics for locations that had uncontrolled flow were interpolated using data from gaging stations weighted according to the drainage area and the bias between the regression-estimated and gaged flow information. On controlled reaches of Kansas streams, the streamflow statistics were interpolated between gaging stations using only gaged data weighted by drainage area.

  14. Determining the response of sea level to atmospheric pressure forcing using TOPEX/POSEIDON data

    NASA Technical Reports Server (NTRS)

    Fu, Lee-Lueng; Pihos, Greg

    1994-01-01

    The static response of sea level to the forcing of atmospheric pressure, the so-called inverted barometer (IB) effect, is investigated using TOPEX/POSEIDON data. This response, characterized by the rise and fall of sea level to compensate for the change of atmospheric pressure at a rate of -1 cm/mbar, is not associated with any ocean currents and hence is normally treated as an error to be removed from sea level observation. Linear regression and spectral transfer function analyses are applied to sea level and pressure to examine the validity of the IB effect. In regions outside the tropics, the regression coefficient is found to be consistently close to the theoretical value except for the regions of western boundary currents, where the mesoscale variability interferes with the IB effect. The spectral transfer function shows a near-IB response at long periods; the regression coefficient is -0.84 +/- 0.29 cm/mbar (1 standard deviation). The deviation from -1 cm/mbar is shown to be caused primarily by the effect of wind forcing on sea level, based on a multivariate linear regression model involving both pressure and wind forcing. The regression coefficient for pressure resulting from the multivariate analysis is -0.96 +/- 0.32 cm/mbar. In the tropics the multivariate analysis fails because sea level in the tropics is primarily responding to remote wind forcing. However, after removing from the data the wind-forced sea level estimated by a dynamic model of the tropical Pacific, the pressure regression coefficient improves from -1.22 +/- 0.69 cm/mbar to -0.99 +/- 0.46 cm/mbar, clearly revealing an IB response. The result of the study suggests that, with a proper removal of the effect of wind forcing, the IB effect is valid in most of the open ocean at periods longer than 20 days and spatial scales larger than 500 km.

  15. Comparison of enzyme-linked immunosorbent assay and gas chromatography procedures for the detection of cyanazine and metolachlor in surface water samples

    USGS Publications Warehouse

    Schraer, S.M.; Shaw, D.R.; Boyette, M.; Coupe, R.H.; Thurman, E.M.

    2000-01-01

    Enzyme-linked immunosorbent assay (ELISA) data from surface water reconnaissance were compared to data from samples analyzed by gas chromatography for the pesticide residues cyanazine (2-[[4-chloro-6-(ethylamino)-l,3,5-triazin-2-yl]amino]-2-methylpropanenitrile ) and metolachlor (2-chloro-N-(2-ethyl-6-methylphenyl)-N-(2-methoxy-1-methylethyl)acetamide). When ELISA analyses were duplicated, cyanazine and metolachlor detection was found to have highly reproducible results; adjusted R2s were 0.97 and 0.94, respectively. When ELISA results for cyanazine were regressed against gas chromatography results, the models effectively predicted cyanazine concentrations from ELISA analyses (adjusted R2s ranging from 0.76 to 0.81). The intercepts and slopes for these models were not different from 0 and 1, respectively. This indicates that cyanazine analysis by ELISA is expected to give the same results as analysis by gas chromatography. However, regressing ELISA analyses for metolachlor against gas chromatography data provided more variable results (adjusted R2s ranged from 0.67 to 0.94). Regression models for metolachlor analyses had two of three intercepts that were not different from 0. Slopes for all metolachlor regression models were significantly different from 1. This indicates that as metolachlor concentrations increase, ELISA will over- or under-estimate metolachlor concentration, depending on the method of comparison. ELISA can be effectively used to detect cyanazine and metolachlor in surface water samples. However, when detections of metolachlor have significant consequences or implications it may be necessary to use other analytical methods.

  16. VBM-DTI correlates of verbal intelligence: a potential link to Broca's area.

    PubMed

    Konrad, Andreas; Vucurevic, Goran; Musso, Francesco; Winterer, Georg

    2012-04-01

    Human brain lesion studies first investigated the biological roots of cognitive functions including language in the late 1800s. Neuroimaging studies have reported correlation findings with general intelligence predominantly in fronto-parietal cortical areas. However, there is still little evidence about the relationship between verbal intelligence and structural properties of the brain. We predicted that verbal performance is related to the language regions of Broca's and Wernicke's areas. Verbal intelligence quotient (vIQ) was assessed in 30 healthy young subjects. T1-weighted MRI and diffusion tensor imaging data sets were acquired. Voxel-wise regression analyses were used to correlate fractional anisotropy (FA) and mean diffusivity values with vIQ. Moreover, regression analyses of regional brain volume with vIQ were performed using voxel-based morphometry (VBM) and ROI methodology. Our analyses revealed a significant negative correlation between vIQ and FA and a significant positive correlation between vIQ and mean diffusivity in the left-hemispheric Broca's area. VBM regression analyses did not show significant results, whereas a subsequent ROI analysis of Broca's area FA peak cluster demonstrated a positive correlation between gray matter volume and vIQ. These findings suggest that cortical thickness in Broca's area contributes to verbal intelligence. Diffusion parameters predicted the gray matter ratio in Broca's area more sensitively than the VBM methodology did.

  17. Adjusting data to body size: a comparison of methods as applied to quantitative trait loci analysis of musculoskeletal phenotypes.

    PubMed

    Lang, Dean H; Sharkey, Neil A; Lionikas, Arimantas; Mack, Holly A; Larsson, Lars; Vogler, George P; Vandenbergh, David J; Blizard, David A; Stout, Joseph T; Stitt, Joseph P; McClearn, Gerald E

    2005-05-01

    The aim of this study was to compare three methods of adjusting skeletal data for body size and examine their use in QTL analyses. It was found that dividing skeletal phenotypes by body mass index induced erroneous QTL results. The preferred method of body size adjustment was multiple regression. Many skeletal studies have reported strong correlations between phenotypes for muscle, bone, and body size, and these correlations add to the difficulty in identifying genetic influence on skeletal traits that are not mediated through overall body size. Quantitative trait loci (QTL) identified for skeletal phenotypes often map to the same chromosome regions as QTLs for body size. The actions of a QTL identified as influencing BMD could therefore be mediated through the generalized actions of growth on body size or muscle mass. Three methods of adjusting skeletal phenotypes to body size were performed on morphologic, structural, and compositional measurements of the femur and tibia in 200-day-old C57BL/6J x DBA/2 (BXD) second generation (F(2)) mice (n = 400). A common method of removing the size effect has been through the use of ratios. This technique and two alternative techniques using simple and multiple regression were performed on muscle and skeletal data before QTL analyses, and the differences in QTL results were examined. The use of ratios to remove the size effect was shown to increase the size effect by inducing spurious correlations, thereby leading to inaccurate QTL results. Adjustments for body size using multiple regression eliminated these problems. Multiple regression should be used to remove the variance of co-factors related to skeletal phenotypes to allow for the study of genetic influence independent of correlated phenotypes. However, to better understand the genetic influence, adjusted and unadjusted skeletal QTL results should be compared. Additional insight can be gained by observing the difference in LOD score between the adjusted and nonadjusted phenotypes. Identifying QTLs that exert their effects on skeletal phenotypes through body size-related pathways as well as those having a more direct and independent influence on bone are equally important in deciphering the complex physiologic pathways responsible for the maintenance of bone health.
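
    A toy simulation of the point made above: dividing a size-unrelated phenotype by a body size measure induces a spurious correlation with size, whereas regression-based adjustment (taking residuals) does not. The data are simulated, not the BXD phenotypes.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 400
body_mass = rng.normal(30, 5, n)                        # hypothetical mouse body mass
bone = 2.0 + rng.normal(0, 0.5, n)                      # phenotype truly unrelated to size

ratio_adjusted = bone / body_mass                       # ratio "adjustment"
beta = np.polyfit(body_mass, bone, 1)                   # simple regression bone ~ mass
regression_adjusted = bone - np.polyval(beta, body_mass)  # residual adjustment

print("corr(size, ratio-adjusted):     ", np.corrcoef(body_mass, ratio_adjusted)[0, 1])
print("corr(size, regression-adjusted):", np.corrcoef(body_mass, regression_adjusted)[0, 1])
# The ratio-adjusted trait correlates strongly (and spuriously) with body mass,
# while the regression residuals are essentially uncorrelated with it.
```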

  18. Deterioration of Speech Recognition Ability Over a Period of 5 Years in Adults Ages 18 to 70 Years: Results of the Dutch Online Speech-in-Noise Test.

    PubMed

    Stam, Mariska; Smits, Cas; Twisk, Jos W R; Lemke, Ulrike; Festen, Joost M; Kramer, Sophia E

    2015-01-01

    The first aim of the present study was to determine the change in speech recognition in noise over a period of 5 years in participants ages 18 to 70 years at baseline. The second aim was to investigate whether age, gender, educational level, the level of initial speech recognition in noise, and reported chronic conditions were associated with a change in speech recognition in noise. The baseline and 5-year follow-up data of 427 participants with and without hearing impairment participating in the National Longitudinal Study on Hearing (NL-SH) were analyzed. The ability to recognize speech in noise was measured twice with the online National Hearing Test, a digit-triplet speech-in-noise test. Speech-reception-threshold in noise (SRTn) scores were calculated, corresponding to 50% speech intelligibility. Unaided SRTn scores obtained with the same transducer (headphones or loudspeakers) at both test moments were included. Changes in SRTn were calculated as a raw shift (T1 - T0) and as a shift adjusted for regression towards the mean. Paired t tests and multivariable linear regression analyses were applied. The mean increase (i.e., deterioration) in SRTn was 0.38 dB signal-to-noise ratio (SNR) over 5 years (p < 0.001). Results of the multivariable regression analyses showed that the age group of 50 to 59 years had a significantly larger deterioration in SRTn compared with the age group of 18 to 39 years (raw shift: beta = 0.64 dB SNR, 95% confidence interval 0.07-1.22, p = 0.028; shift adjusted for initial speech recognition level: beta = 0.82 dB SNR, 95% confidence interval 0.27-1.34, p = 0.004). Gender, educational level, and the number of chronic conditions were not associated with a change in SRTn over time. No significant differences in the increase of SRTn were found between the initial levels of speech recognition (i.e., good, insufficient, or poor) when taking into account the phenomenon of regression towards the mean. The study results indicate that deterioration of speech recognition in noise over 5 years can also be detected in adults ages 18 to 70 years. This rather small numeric change might nonetheless represent a relevant impact on an individual's ability to understand speech in everyday life.

  19. [Application of marketing strategies for the management of public hospitals from the viewpoint of the staff members].

    PubMed

    Riveros S, Jorge; Berné M, Carmen

    2006-03-01

    The implementation of marketing strategies in public hospitals provides management advantages and improves the relationship between customers and staff. To analyze the application of marketing strategies in a public hospital from the perspective of the staff. A structured survey with 50 items on perceptions of communication between personnel and customers/users, customer satisfaction, participation in the development of new policies, and incentives for efficiency was applied to a stratified sample of the staff. Factorial and regression analyses were performed to define the impact of marketing strategies on the degree of preoccupation and orientation of the organization towards the satisfaction of customer needs. The survey was applied to 74 males and 122 females. It showed that the orientation of the hospital towards the satisfaction of its beneficiaries basically depends on the generation of an organizational culture oriented towards them and the implementation of adequate policies in staff management and quality of service. These basic aspects can be accompanied by practices associated with the new marketing approaches, such as market orientation, customer orientation and relational marketing. All these factors presented positive and significant relations. New marketing strategies should be applied to achieve an efficient and customer-oriented hospital management.

  20. Association between cardiovascular risk factors and carotid intima-media thickness in prepubertal Brazilian children.

    PubMed

    Gazolla, Fernanda Mussi; Neves Bordallo, Maria Alice; Madeira, Isabel Rey; de Miranda Carvalho, Cecilia Noronha; Vieira Monteiro, Alexandra Maria; Pinheiro Rodrigues, Nádia Cristina; Borges, Marcos Antonio; Collett-Solberg, Paulo Ferrez; Muniz, Bruna Moreira; de Oliveira, Cecilia Lacroix; Pinheiro, Suellen Martins; de Queiroz Ribeiro, Rebeca Mathias

    2015-05-01

    Early exposure to cardiovascular risk factors creates a chronic inflammatory state that could damage the endothelium, followed by thickening of the carotid intima-media. To investigate the association of cardiovascular risk factors with thickening of the carotid intima-media in prepubertal children. In this cross-sectional study, carotid intima-media thickness (cIMT) and cardiovascular risk factors were assessed in 129 prepubertal children aged 5 to 10 years. Association was assessed by simple and multivariate logistic regression analyses. In simple logistic regression analyses, body mass index (BMI) z-score, waist circumference, and systolic blood pressure (SBP) were positively associated with increased left, right, and average cIMT, whereas diastolic blood pressure was positively associated only with increased left and average cIMT (p<0.05). In multivariate logistic regression analyses, increased left cIMT was positively associated with BMI z-score and SBP, and increased average cIMT was positively associated only with SBP (p<0.05). BMI z-score and SBP were the strongest risk factors for increased cIMT.

  1. Exploration of walking behavior in Vermont using spatial regression.

    DOT National Transportation Integrated Search

    2015-06-01

    This report focuses on the relationship between walking and its contributing factors by applying spatial regression methods. Using the Vermont data from the New England Transportation Survey (NETS), walking variables as well as 170 independent va...

  2. Logistic regression for risk factor modelling in stuttering research.

    PubMed

    Reed, Phil; Wu, Yaqionq

    2013-06-01

    To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed is demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques, such as estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.
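
    A minimal sketch of risk-factor modelling with logistic regression in the spirit of the tutorial, using invented data and hypothetical predictors (family history, age at onset, sex) for persistence of stuttering; odds ratios are recovered by exponentiating the coefficients.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 300
df = pd.DataFrame({
    "family_history": rng.binomial(1, 0.3, n),
    "age_at_onset": rng.uniform(2, 6, n),
    "male": rng.binomial(1, 0.7, n),
})
# Hypothetical data-generating model for persistence vs. recovery
logit = (-1.0 + 0.9 * df["family_history"]
         + 0.3 * (df["age_at_onset"] - 4) + 0.4 * df["male"])
df["persistent"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

model = smf.logit("persistent ~ family_history + age_at_onset + male", data=df).fit()
print(model.summary())
print("Odds ratios:\n", np.exp(model.params))
```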

  3. Regression analysis for LED color detection of visual-MIMO system

    NASA Astrophysics Data System (ADS)

    Banik, Partha Pratim; Saha, Rappy; Kim, Ki-Doo

    2018-04-01

    Color detection from a light emitting diode (LED) array using a smartphone camera is very difficult in a visual multiple-input multiple-output (visual-MIMO) system. In this paper, we propose a method to determine the LED color using a smartphone camera by applying regression analysis. We employ a multivariate regression model to identify the LED color. After taking a picture of an LED array, we select the LED array region and detect the LEDs using an image processing algorithm. We then apply the k-means clustering algorithm to determine the number of potential colors for feature extraction of each LED. Finally, we apply the multivariate regression model to predict the color of the transmitted LEDs. In this paper, we show our results for three types of environmental light conditions: room environmental light, low environmental light (560 lux), and strong environmental light (2450 lux). We evaluate the results of our proposed algorithm in terms of training and test R-square (%) values and the percentage of closeness between transmitted and predicted colors, and we also report the number of distorted test data points based on a distortion bar graph in the CIE 1931 color space.
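
    A rough sketch of the kind of pipeline the abstract outlines, with simulated placeholders for the image patches and transmitted colours: a dominant colour is extracted from each LED patch with k-means, and a multi-output (multivariate) linear regression maps it to the transmitted LED colour.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(8)

def dominant_color(patch_pixels, k=3):
    """Return the centroid of the largest k-means cluster of RGB pixels."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(patch_pixels)
    counts = np.bincount(km.labels_)
    return km.cluster_centers_[np.argmax(counts)]

# Simulated training set: each "patch" is a cloud of pixels around one LED,
# and its label is the RGB value that the LED actually transmitted.
true_rgb = rng.uniform(0, 255, size=(60, 3))
patches = [rgb + rng.normal(0, 20, size=(200, 3)) for rgb in true_rgb]   # camera noise
features = np.array([dominant_color(p) for p in patches])

reg = LinearRegression().fit(features, true_rgb)     # multivariate regression
new_patch = true_rgb[0] + rng.normal(0, 20, (200, 3))
print("Predicted colour for one new patch:",
      reg.predict(dominant_color(new_patch).reshape(1, -1)))
```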

  4. Parental education predicts change in intelligence quotient after childhood epilepsy surgery.

    PubMed

    Meekes, Joost; van Schooneveld, Monique M J; Braams, Olga B; Jennekens-Schinkel, Aag; van Rijen, Peter C; Hendriks, Marc P H; Braun, Kees P J; van Nieuwenhuizen, Onno

    2015-04-01

    To determine whether change in the intelligence quotient (IQ) of children who undergo epilepsy surgery is associated with the educational level of their parents. Retrospective analysis of data obtained from a cohort of children who underwent epilepsy surgery between January 1996 and September 2010. We performed simple and multiple regression analyses to identify predictors associated with IQ change after surgery. In addition to parental education, six variables previously demonstrated to be associated with IQ change after surgery were included as predictors: age at surgery, duration of epilepsy, etiology, presurgical IQ, reduction of antiepileptic drugs, and seizure freedom. We used delta IQ (IQ 2 years after surgery minus IQ shortly before surgery) as the primary outcome variable, but also performed analyses with pre- and postsurgical IQ as outcome variables to support our findings. To validate the results, we performed simple regression analysis with parental education as the predictor in specific subgroups. The sample for regression analysis included 118 children (60 male; median age at surgery 9.73 years). Parental education was significantly associated with delta IQ in simple regression analysis (p = 0.004), and also contributed significantly to postsurgical IQ in multiple regression analysis (p = 0.008). Additional analyses demonstrated that parental education made a unique contribution to prediction of delta IQ, that is, it could not be replaced by the illness-related variables. Subgroup analyses confirmed the association of parental education with IQ change after surgery for most groups. Children whose parents had higher education demonstrated on average a greater increase in IQ after surgery and a higher postsurgical IQ (but not presurgical IQ) than children whose parents completed at most lower secondary education. Parental education, and perhaps other environmental variables, should be considered in the prognosis of cognitive function after childhood epilepsy surgery. Wiley Periodicals, Inc. © 2015 International League Against Epilepsy.

  5. Comparison of Xenon-Enhanced Area-Detector CT and Krypton Ventilation SPECT/CT for Assessment of Pulmonary Functional Loss and Disease Severity in Smokers.

    PubMed

    Ohno, Yoshiharu; Fujisawa, Yasuko; Takenaka, Daisuke; Kaminaga, Shigeo; Seki, Shinichiro; Sugihara, Naoki; Yoshikawa, Takeshi

    2018-02-01

    The objective of this study was to compare the capability of xenon-enhanced area-detector CT (ADCT) performed with a subtraction technique and coregistered 81mKr-ventilation SPECT/CT for the assessment of pulmonary functional loss and disease severity in smokers. Forty-six consecutive smokers (32 men and 14 women; mean age, 67.0 years) underwent prospective unenhanced and xenon-enhanced ADCT, 81mKr-ventilation SPECT/CT, and pulmonary function tests. Disease severity was evaluated according to the Global Initiative for Chronic Obstructive Lung Disease (GOLD) classification. CT-based functional lung volume (FLV), the percentage of wall area to total airway area (WA%), and ventilated FLV on xenon-enhanced ADCT and SPECT/CT were calculated for each smoker. All indexes were correlated with percentage of forced expiratory volume in 1 second (%FEV1) using stepwise regression analyses, and univariate and multivariate logistic regression analyses were performed. In addition, the diagnostic accuracy of the proposed model was compared with that of each radiologic index by means of McNemar analysis. Multivariate logistic regression showed that %FEV1 was significantly affected (r = 0.77, r2 = 0.59) by two factors: the first factor, ventilated FLV on xenon-enhanced ADCT (p < 0.0001); and the second factor, WA% (p = 0.004). Univariate logistic regression analyses indicated that all indexes significantly affected GOLD classification (p < 0.05). Multivariate logistic regression analyses revealed that ventilated FLV on xenon-enhanced ADCT and CT-based FLV significantly influenced GOLD classification (p < 0.0001). The diagnostic accuracy of the proposed model was significantly higher than that of ventilated FLV on SPECT/CT (p = 0.03) and WA% (p = 0.008). Xenon-enhanced ADCT is more effective than 81mKr-ventilation SPECT/CT for the assessment of pulmonary functional loss and disease severity.

  6. Psychometric properties of the Triarchic Psychopathy Measure: An item response theory approach.

    PubMed

    Shou, Yiyun; Sellbom, Martin; Xu, Jing

    2018-05-01

    There is cumulative evidence for the cross-cultural validity of the Triarchic Psychopathy Measure (TriPM; Patrick, 2010) among non-Western populations. Recent studies using correlational and regression analyses show promising construct validity of the TriPM in Chinese samples. However, little is known about the efficiency of the TriPM items in assessing the proposed latent traits. The current study evaluated the psychometric properties of the Chinese TriPM at the item level using item response theory analyses. It also examined the measurement invariance of the TriPM between the Chinese and the U.S. student samples by applying differential item functioning analyses under the item response theory framework. The results supported the unidimensional nature of the Disinhibition and Meanness scales. Both scales had a greater level of precision in the respective underlying constructs at the positive ends. The two scales, however, had several items that were weakly associated with their respective latent traits in the Chinese student sample. Boldness, on the other hand, was found to be multidimensional and reflected a more normally distributed range of variation. The examination of measurement bias via differential item functioning analyses revealed that a number of items of the TriPM were not equivalent across the Chinese and U.S. samples. Some modification and adaptation of items might be considered for improving the precision of the TriPM for Chinese participants. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  7. Spatial Double Generalized Beta Regression Models: Extensions and Application to Study Quality of Education in Colombia

    ERIC Educational Resources Information Center

    Cepeda-Cuervo, Edilberto; Núñez-Antón, Vicente

    2013-01-01

    In this article, a proposed Bayesian extension of the generalized beta spatial regression models is applied to the analysis of the quality of education in Colombia. We briefly revise the beta distribution and describe the joint modeling approach for the mean and dispersion parameters in the spatial regression models' setting. Finally, we motivate…

  8. Implementations of geographically weighted lasso in spatial data with multicollinearity (Case study: Poverty modeling of Java Island)

    NASA Astrophysics Data System (ADS)

    Setiyorini, Anis; Suprijadi, Jadi; Handoko, Budhi

    2017-03-01

    Geographically Weighted Regression (GWR) is a regression model that takes into account the spatial heterogeneity effect. In the application of the GWR, inference on regression coefficients is often of interest, as is estimation and prediction of the response variable. Empirical research has demonstrated that local correlation between explanatory variables can lead to estimated regression coefficients in GWR that are strongly correlated, a condition named multicollinearity. This results in large standard errors for the estimated regression coefficients and is, hence, problematic for inference on relationships between variables. Geographically Weighted Lasso (GWL) is a method capable of dealing with spatial heterogeneity and local multicollinearity in spatial data sets. GWL is a further development of the GWR method that adds a LASSO (Least Absolute Shrinkage and Selection Operator) constraint to the parameter estimation. In this study, GWL is applied using a fixed exponential kernel weight matrix to build a poverty model for Java Island, Indonesia. The results of applying the GWL to poverty datasets show that this method stabilizes regression coefficients in the presence of multicollinearity and produces lower prediction and estimation error of the response variable than GWR does.
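
    A rough sketch of geographically weighted lasso under simple assumptions: observations are weighted by a fixed exponential (Gaussian-type) kernel of distance to the target location, and a lasso is fitted with those weights. The bandwidth, penalty, coordinates, and data are all invented; real GWL implementations also calibrate the bandwidth and penalty, for example by cross-validation.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(9)
n, p = 200, 5
coords = rng.uniform(0, 100, size=(n, 2))           # spatial coordinates (hypothetical)
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + rng.normal(0, 0.1, n)            # locally correlated predictors
beta_local = 1.0 + coords[:, 0] / 100                # spatially varying effect of X0
y = beta_local * X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.5, n)

def gwl_fit(target_xy, bandwidth=20.0, alpha=0.05):
    d = np.linalg.norm(coords - target_xy, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)                 # fixed exponential kernel weights
    model = Lasso(alpha=alpha, max_iter=10_000)
    model.fit(X, y, sample_weight=w)                  # locally weighted lasso
    return model.coef_

print("local coefficients near (10, 50):", np.round(gwl_fit(np.array([10, 50])), 3))
print("local coefficients near (90, 50):", np.round(gwl_fit(np.array([90, 50])), 3))
```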

  9. Infantile hemangioma-like vascular lesion in a 26-year-old woman after abortion.

    PubMed

    Lu, Yang; Wang, Shu Jun; Li, Xin; Hu, Li; Zhang, Wen Jie; Li, Wei

    2014-01-01

    A 26-year-old woman (G2P1A1) presented with a 5-week history of multiple red marks on her body after a therapeutic abortion. A physical examination found 15 palpable red marks on her head, neck, chest, arms and legs. Proliferating endothelial cells, which expressed CD31, CD34, von Willebrand factor, but not Glut-1 and merosin, were observed in the lesional area by histopathological analyses. Histocompatibility antigen typing of 2 lesions was identical to a sample from peripheral blood. Accelerated regression was observed in 2 lesions treated by intralesional injection of betamethasone, while spontaneous regression was observed within 9 months in the remaining lesions without any treatment. Rapid growth, spontaneous regression and histological analyses in this case support the diagnosis of 'infantile hemangioma-like vascular lesion'.

  10. Network regularised Cox regression and multiplex network models to predict disease comorbidities and survival of cancer.

    PubMed

    Xu, Haoming; Moni, Mohammad Ali; Liò, Pietro

    2015-12-01

    In cancer genomics, gene expression levels provide important molecular signatures for all types of cancer, and this could be very useful for predicting the survival of cancer patients. However, the main challenge of gene expression data analysis is high dimensionality: microarray data are characterised by a small number of samples and a large number of genes. To overcome this problem, a variety of penalised Cox proportional hazard models have been proposed. We introduce a novel network regularised Cox proportional hazard model and a novel multiplex network model to measure the disease comorbidities and to predict the survival of cancer patients. Our methods are applied to analyse seven microarray cancer gene expression datasets: breast cancer, ovarian cancer, lung cancer, liver cancer, renal cancer and osteosarcoma. Firstly, we applied a principal component analysis to reduce the dimensionality of the original gene expression data. Secondly, we applied a network regularised Cox regression model on the reduced gene expression datasets. By using the normalised mutual information method and the multiplex network model, we predict the comorbidities for liver cancer based on the integration of a diverse set of omics and clinical data, and we find the diseasome associations (disease-gene associations) among different cancers based on the identified common significant genes. Finally, we evaluated the precision of the approach with respect to the accuracy of survival prediction using ROC curves. We report that colon cancer, liver cancer and renal cancer share the CXCL5 gene, and breast cancer, ovarian cancer and renal cancer share the CCND2 gene. Our methods are useful for predicting patient survival and disease comorbidities more accurately and are helpful for improving the care of patients with comorbidities. Software in Matlab and R is available on our GitHub page: https://github.com/ssnhcom/NetworkRegularisedCox.git. Copyright © 2015. Published by Elsevier Ltd.

  11. Quantitative computed tomography applied to interstitial lung diseases.

    PubMed

    Obert, Martin; Kampschulte, Marian; Limburg, Rebekka; Barańczuk, Stefan; Krombach, Gabriele A

    2018-03-01

    To evaluate a new image marker that retrieves information from computed tomography (CT) density histograms with respect to its classification properties between different lung parenchyma groups, and to conduct a comparison of the new image marker with conventional markers. Density histograms from 220 different subjects (normal = 71; emphysema = 73; fibrotic = 76) were used to compare the conventionally applied emphysema index (EI), 15th percentile value (PV), mean value (MV), variance (V), skewness (S), and kurtosis (K) with a new histogram functional shape (HFS) method. Multinomial logistic regression (MLR) analysis was performed to predict lung parenchyma group membership using the individual methods, as well as combinations thereof, as covariates. Overall correctly assigned subjects (OCA), sensitivity (sens), specificity (spec), and Nagelkerke's pseudo R2 (NR2) effect size were estimated. NR2 was used to set up a ranking list of the different methods. The MLR indicated the highest classification power (OCA 92%; sens 0.95; spec 0.89; NR2 0.95) when all histogram analysis methods were applied together. The highest classification power among individually applied methods was found using the HFS concept (OCA 86%; sens 0.93; spec 0.79; NR2 0.80). Conventional methods achieved lower classification potential on their own: EI (OCA 69%; sens 0.95; spec 0.26; NR2 0.52); PV (OCA 69%; sens 0.90; spec 0.37; NR2 0.57); MV (OCA 65%; sens 0.71; spec 0.58; NR2 0.61); V (OCA 66%; sens 0.72; spec 0.53; NR2 0.66); S (OCA 65%; sens 0.88; spec 0.26; NR2 0.55); and K (OCA 63%; sens 0.90; spec 0.16; NR2 0.48). The HFS method, which had so far been applied to CT bone density curve analysis, is also a remarkable information extraction tool for lung density histograms. Being a general mathematical approach, the HFS method can presumably extract valuable health-related information from histograms in completely different areas as well. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Traffic flow forecasting using approximate nearest neighbor nonparametric regression

    DOT National Transportation Integrated Search

    2000-12-01

    The purpose of this research is to enhance nonparametric regression (NPR) for use in real-time systems by first reducing execution time using advanced data structures and imprecise computations and then developing a methodology for applying NPR. Due ...
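
    Although the abstract above is truncated, the core idea of nearest-neighbour nonparametric regression for short-term traffic forecasting can be sketched as follows; the lag structure, neighbourhood size and simulated counts are illustrative assumptions, not details taken from the report, and no approximate search structure is used.

      # Sketch: k-nearest-neighbour nonparametric regression for short-term
      # traffic-flow forecasting. The state vector (three lagged counts) and
      # k=5 are assumptions for illustration only.
      import numpy as np
      from sklearn.neighbors import KNeighborsRegressor

      rng = np.random.default_rng(2)
      t = np.arange(2000)
      flow = 400 + 300 * np.sin(2 * np.pi * t / 96) + rng.normal(0, 25, t.size)  # 15-min counts

      # Build (state, response) pairs: predict the next interval from recent history
      lags = 3
      X = np.column_stack([flow[i:i - lags] for i in range(lags)])  # columns: t-3, t-2, t-1
      y = flow[lags:]

      knn = KNeighborsRegressor(n_neighbors=5, weights="distance").fit(X[:-96], y[:-96])
      print("one-step forecasts:", knn.predict(X[-96:])[:5].round(1))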

  13. Analysis of the Magnitude and Frequency of Peak Discharges for the Navajo Nation in Arizona, Utah, Colorado, and New Mexico

    USGS Publications Warehouse

    Waltemeyer, Scott D.

    2006-01-01

    Estimates of the magnitude and frequency of peak discharges are necessary for reliable flood-hazard mapping in the Navajo Nation in Arizona, Utah, Colorado, and New Mexico. The Bureau of Indian Affairs, U.S. Army Corps of Engineers, and Navajo Nation requested that the U.S. Geological Survey update estimates of peak-discharge magnitude for gaging stations in the region and update regional equations for estimation of peak discharge and frequency at ungaged sites. Equations were developed for estimating the magnitude of peak discharges for recurrence intervals of 2, 5, 10, 25, 50, 100, and 500 years at ungaged sites using data collected through 1999 at 146 gaging stations, which provided an additional 13 years of peak-discharge data beyond a 1997 investigation that used gaging-station data through 1986. The equations for estimation of peak discharges at ungaged sites were developed for flood regions 8, 11, high elevation, and 6, delineated on the basis of the hydrologic codes from the 1997 investigation. Peak discharges for selected recurrence intervals were determined at gaging stations by fitting observed data to a log-Pearson Type III distribution with adjustments for a low-discharge threshold and a zero skew coefficient. A low-discharge threshold was applied to the frequency analysis of 82 of the 146 gaging stations. This application provides an improved fit of the log-Pearson Type III frequency distribution. Use of the low-discharge threshold generally eliminated peak discharges having a recurrence interval of less than 1.4 years from the probability-density function. Within each region, logarithms of the peak discharges for selected recurrence intervals were related to logarithms of basin and climatic characteristics using stepwise ordinary least-squares regression techniques for exploratory data analysis. Generalized least-squares regression techniques, an improved regression procedure that accounts for temporal and spatial sampling errors, were then applied to the same data used in the ordinary least-squares regression analyses. The average standard error of prediction for the 100-year peak discharge in region 8 was 53 percent. The average standard error of prediction, which includes average sampling error and average standard error of regression, ranged from 45 to 83 percent for the 100-year flood. The estimated standard error of prediction for a hybrid method for region 11 was large in the 1997 investigation, and no distinction of floods produced from a high-elevation region was presented in that investigation. Overall, the equations based on generalized least-squares regression techniques are considered to be more reliable than those in the 1997 report because of the increased length of record and improved GIS methods. Flood-frequency estimates can be transferred to ungaged sites either by direct application of the regional regression equation of the respective region or, for an ungaged site on a stream that has a gaging station upstream or downstream, by using the drainage-area ratio and the drainage-area exponent from that regional regression equation.
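
    Two of the computational steps described above, fitting a log-Pearson Type III distribution to a station's annual peaks and regressing log peak discharge on log basin characteristics, can be sketched as follows. The data and the single explanatory variable (drainage area) are hypothetical, and the sketch omits the low-discharge threshold, skew adjustment and generalized least-squares refinements used in the report.

      # Sketch: (1) at-site flood frequency from a log-Pearson Type III fit,
      # (2) a log-log regional regression of the 100-year peak on drainage area.
      # Annual peaks and basin characteristics are simulated for illustration.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)

      # (1) At-site analysis: fit Pearson III to log10 of annual peak discharges
      peaks = rng.lognormal(mean=6.0, sigma=0.8, size=40)        # cfs, one station
      logq = np.log10(peaks)
      skew_, loc, scale = stats.pearson3.fit(logq)
      q100 = 10 ** stats.pearson3.ppf(1 - 1 / 100, skew_, loc, scale)
      print(f"100-year peak at this station: {q100:,.0f} cfs")

      # (2) Regional regression: log10(Q100) on log10(drainage area) across stations
      area = rng.uniform(2, 12000, size=146)                      # square miles
      q100_sites = 80 * area ** 0.6 * rng.lognormal(0, 0.3, 146)  # synthetic 100-yr peaks
      b, a = np.polyfit(np.log10(area), np.log10(q100_sites), 1)  # slope, intercept
      print(f"regional equation: Q100 = {10**a:.1f} * A^{b:.2f}")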

  14. Addressing the identification problem in age-period-cohort analysis: a tutorial on the use of partial least squares and principal components analysis.

    PubMed

    Tu, Yu-Kang; Krämer, Nicole; Lee, Wen-Chung

    2012-07-01

    In the analysis of trends in health outcomes, an ongoing issue is how to separate and estimate the effects of age, period, and cohort. As these 3 variables are perfectly collinear by definition, regression coefficients in a general linear model are not unique. In this tutorial, we review why identification is a problem, and how this problem may be tackled using partial least squares and principal components regression analyses. Both methods produce regression coefficients that fulfill the same collinearity constraint as the variables age, period, and cohort. We show that, because the constraint imposed by partial least squares and principal components regression is inherent in the mathematical relation among the 3 variables, this leads to more interpretable results. We use one dataset from a Taiwanese health-screening program to illustrate how to use partial least squares regression to analyze the trends in body heights with 3 continuous variables for age, period, and cohort. We then use another dataset of hepatocellular carcinoma mortality rates for Taiwanese men to illustrate how to use partial least squares regression to analyze tables with aggregated data. We use the second dataset to show the relation between the intrinsic estimator, a recently proposed method for the age-period-cohort analysis, and partial least squares regression. We also show that the inclusion of all indicator variables provides a more consistent approach. R code for our analyses is provided in the eAppendix.
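
    A minimal version of the partial least squares approach to the age-period-cohort identification problem can be sketched as follows. The outcome (body height) and the data are simulated, and the single-component PLS fit is an illustrative assumption rather than the authors' full analysis; the point is only that PLS returns coefficients despite the exact collinearity age + cohort = period.

      # Sketch: PLS regression with the three perfectly collinear predictors
      # age, period, and cohort (cohort = period - age). Simulated heights only.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(4)
      age = rng.integers(20, 60, size=500)
      period = rng.integers(1950, 2000, size=500)
      cohort = period - age                       # exact linear dependence
      height = 170 + 0.05 * (cohort - 1950) - 0.02 * (age - 40) + rng.normal(0, 4, 500)

      X = np.column_stack([age, period, cohort])  # OLS coefficients are not unique here
      pls = PLSRegression(n_components=1).fit(X, height)
      print("PLS coefficients (age, period, cohort):", pls.coef_.ravel().round(4))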

  15. Introduction to the use of regression models in epidemiology.

    PubMed

    Bender, Ralf

    2009-01-01

    Regression modeling is one of the most important statistical techniques used in analytical epidemiology. By means of regression models, the effect of one or several explanatory variables (e.g., exposures, subject characteristics, risk factors) on a response variable such as mortality or cancer can be investigated. From multiple regression models, adjusted effect estimates can be obtained that take the effect of potential confounders into account. Regression methods can be applied in all epidemiologic study designs, so that they represent a universal tool for data analysis in epidemiology. Different kinds of regression models have been developed depending on the measurement scale of the response variable and the study design. The most important methods are linear regression for continuous outcomes, logistic regression for binary outcomes, Cox regression for time-to-event data, and Poisson regression for frequencies and rates. This chapter provides a nontechnical introduction to these regression models with illustrative examples from cancer research.
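
    As a hedged illustration of how the measurement scale of the outcome maps to the model families named above, the sketch below fits each type on simulated epidemiological data (linear, logistic and Poisson via statsmodels, Cox via lifelines); the exposure, outcomes and effect sizes are invented.

      # Sketch: one simulated binary exposure, four outcome types, four models.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(5)
      n = 500
      exposure = rng.integers(0, 2, n)
      X = sm.add_constant(exposure.astype(float))

      # Continuous outcome -> linear regression
      bp = 120 + 5 * exposure + rng.normal(0, 10, n)
      print(sm.OLS(bp, X).fit().params)

      # Binary outcome -> logistic regression
      disease = rng.binomial(1, 0.1 + 0.1 * exposure)
      print(sm.Logit(disease, X).fit(disp=0).params)

      # Count outcome -> Poisson regression
      events = rng.poisson(1 + exposure)
      print(sm.GLM(events, X, family=sm.families.Poisson()).fit().params)

      # Time-to-event outcome -> Cox regression
      df = pd.DataFrame({"exposure": exposure,
                         "time": rng.exponential(10 / (1 + exposure)),
                         "event": 1})
      CoxPHFitter().fit(df, "time", "event").print_summary()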

  16. Time Series Expression Analyses Using RNA-seq: A Statistical Approach

    PubMed Central

    Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P.

    2013-01-01

    RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
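
    Of the approaches listed, the autoregressive time-lagged AR(1) regression is the simplest to sketch: each gene's expression at time t is regressed on its own expression at time t-1. The snippet below is an illustration on simulated log-scale expression values, not the authors' implementation, and it ignores count-specific modelling of RNA-seq data.

      # Sketch: per-gene AR(1) time-lagged regression on a simulated time course.
      # A slope far from zero suggests strong temporal dependence for that gene.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(6)
      n_genes, n_times = 200, 12
      expr = np.cumsum(rng.normal(0, 1, size=(n_genes, n_times)), axis=1)  # log expression

      ar1_slopes = []
      for g in range(n_genes):
          y_t = expr[g, 1:]                       # expression at time t
          y_lag = sm.add_constant(expr[g, :-1])   # expression at time t-1 (+ intercept)
          ar1_slopes.append(sm.OLS(y_t, y_lag).fit().params[1])

      print("mean AR(1) coefficient across genes:", np.round(np.mean(ar1_slopes), 3))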

  17. Color Trails Test: normative data and criterion validity for the Greek adult population.

    PubMed

    Messinis, Lambros; Malegiannaki, Amaryllis-Chryssi; Christodoulou, Tessa; Panagiotopoulos, Vassillis; Papathanasopoulos, Panagiotis

    2011-06-01

    The Color Trails Test (CTT) was developed as a culturally fair analog of the Trail Making Test. In the present study, normative data for the CTT were developed for the Greek adult population, and the criterion validity of the CTT was further examined in two clinical groups (29 Parkinson's disease [PD] patients and 25 acute stroke patients). The instrument was applied to 163 healthy participants, aged 19-75. Stepwise linear regression analyses revealed a significant influence of age and education level on completion time in both parts of the CTT (increased age and decreased educational level contributed to slower completion times for both parts), whereas gender did not influence time to completion of part B. Further, the CTT appears to discriminate adequately between the performance of PD and acute stroke patients and matched healthy controls.

  18. Domain General Mediators of the Relation between Kindergarten Number Sense and First-Grade Mathematics Achievement

    PubMed Central

    Hassinger-Das, Brenna; Jordan, Nancy C.; Glutting, Joseph; Irwin, Casey; Dyson, Nancy

    2013-01-01

    Domain general skills that mediate the relation between kindergarten number sense and first-grade mathematics skills were investigated. Participants were 107 children who displayed low number sense in the fall of kindergarten. Controlling for background variables, multiple regression analyses showed that attention problems and executive functioning both were unique predictors of mathematics outcomes. Attention problems were more important for predicting first-grade calculation performance while executive functioning was more important for predicting first-grade performance on applied problems. Moreover, both executive functioning and attention problems were unique partial mediators of the relationship between kindergarten and first-grade mathematics skills. The results provide empirical support for developing interventions that target executive functioning and attention problems in addition to instruction in number skills for kindergartners with initial low number sense. PMID:24237789

  19. Applicability of Newton's law of cooling in monetary economics

    NASA Astrophysics Data System (ADS)

    Todorović, Jadranka Đurović; Tomić, Zoran; Denić, Nebojša; Petković, Dalibor; Kojić, Nenad; Petrović, Jelena; Petković, Biljana

    2018-03-01

    Inflation is a phenomenon which attracts the attention of many researchers. Inflation is not a recent phenomenon; it has existed ever since money emerged in the world's first economies. With the development of economies and markets, inflation developed as well. Today, even though there is a considerable number of research papers on inflation, there is still not enough knowledge about all the factors which might cause inflation and influence its evolution and dynamics. Regression analysis is a powerful statistical tool which can help analyse a vast amount of data on inflation and provide answers about which factors drive inflation and how those factors influence it. In this article, Newton's law of cooling is applied to determine the long-term dynamics of monetary aggregates and inflation in Serbia and Croatia.
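
    Newton's law of cooling states that a quantity relaxes toward its equilibrium value at a rate proportional to the remaining gap, giving y(t) = y_eq + (y_0 - y_eq) exp(-k t). Fitting that curve to a monetary series can be sketched as below; the inflation figures are simulated, not the Serbian or Croatian data used in the article.

      # Sketch: fit the Newton's-cooling form y(t) = y_eq + (y0 - y_eq)*exp(-k*t)
      # to a simulated inflation series with scipy's nonlinear least squares.
      import numpy as np
      from scipy.optimize import curve_fit

      def cooling(t, y_eq, y0, k):
          return y_eq + (y0 - y_eq) * np.exp(-k * t)

      rng = np.random.default_rng(7)
      t = np.arange(0, 120)                                     # months
      inflation = cooling(t, 3.0, 12.0, 0.05) + rng.normal(0, 0.3, t.size)

      params, _ = curve_fit(cooling, t, inflation, p0=(2.0, 10.0, 0.1))
      y_eq, y0, k = params
      print(f"equilibrium {y_eq:.2f}%, initial {y0:.2f}%, decay rate {k:.3f} per month")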

  20. Domain-general mediators of the relation between kindergarten number sense and first-grade mathematics achievement.

    PubMed

    Hassinger-Das, Brenna; Jordan, Nancy C; Glutting, Joseph; Irwin, Casey; Dyson, Nancy

    2014-02-01

    Domain-general skills that mediate the relation between kindergarten number sense and first-grade mathematics skills were investigated. Participants were 107 children who displayed low number sense in the fall of kindergarten. Controlling for background variables, multiple regression analyses showed that both attention problems and executive functioning were unique predictors of mathematics outcomes. Attention problems were more important for predicting first-grade calculation performance, whereas executive functioning was more important for predicting first-grade performance on applied problems. Moreover, both executive functioning and attention problems were unique partial mediators of the relationship between kindergarten and first-grade mathematics skills. The results provide empirical support for developing interventions that target executive functioning and attention problems in addition to instruction in number skills for kindergartners with initial low number sense. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. Time series expression analyses using RNA-seq: a statistical approach.

    PubMed

    Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P

    2013-01-01

    RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis.

  2. QSRR modeling for diverse drugs using different feature selection methods coupled with linear and nonlinear regressions.

    PubMed

    Goodarzi, Mohammad; Jensen, Richard; Vander Heyden, Yvan

    2012-12-01

    A Quantitative Structure-Retention Relationship (QSRR) model is proposed to estimate the chromatographic retention of 83 diverse drugs on a Unisphere poly butadiene (PBD) column, using isocratic elutions at pH 11.7. Previous work has generated QSRR models for them using Classification And Regression Trees (CART). In this work, Ant Colony Optimization is used as a feature selection method to find the best molecular descriptors from a large pool. In addition, several other selection methods have been applied, such as Genetic Algorithms, Stepwise Regression and the Relief method, not only to evaluate Ant Colony Optimization as a feature selection method but also to investigate its ability to find the important descriptors in QSRR. Multiple Linear Regression (MLR) and Support Vector Machines (SVMs) were applied as linear and nonlinear regression methods, respectively, giving excellent correlation between the experimental logarithms of the retention factors of the drugs (log kw), i.e. values extrapolated to a mobile phase consisting of pure water, and the predicted values. The overall best model was the SVM one built using descriptors selected by ACO. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Logistic regression for circular data

    NASA Astrophysics Data System (ADS)

    Al-Daffaie, Kadhem; Khan, Shahjahan

    2017-05-01

    This paper considers the relationship between a binary response and a circular predictor. It develops the logistic regression model by employing the linear-circular regression approach. The maximum likelihood method is used to estimate the parameters. The Newton-Raphson numerical method is used to find the estimated values of the parameters. A data set from weather records of Toowoomba city is analysed by the proposed methods. Moreover, a simulation study is considered. The R software is used for all computations and simulations.
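
    The linear-circular approach embeds the circular predictor through its sine and cosine components, so the model is logit P(y = 1) = b0 + b1 cos(theta) + b2 sin(theta), and the parameters can be estimated by ordinary maximum likelihood. A minimal sketch on simulated data (not the Toowoomba weather records used in the paper) follows.

      # Sketch: logistic regression with a circular predictor (e.g. wind direction),
      # entered through its cosine and sine components. Data are simulated.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(8)
      theta = rng.uniform(0, 2 * np.pi, 400)                    # circular predictor (radians)
      p = 1 / (1 + np.exp(-(-0.5 + 1.2 * np.cos(theta) + 0.8 * np.sin(theta))))
      y = rng.binomial(1, p)                                    # binary response (e.g. rain)

      X = sm.add_constant(np.column_stack([np.cos(theta), np.sin(theta)]))
      fit = sm.Logit(y, X).fit(disp=0)                          # maximum likelihood estimation
      print(fit.params)   # estimates of b0, b1 (cosine), b2 (sine)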

  4. Estimates of Median Flows for Streams on the 1999 Kansas Surface Water Register

    USGS Publications Warehouse

    Perry, Charles A.; Wolock, David M.; Artman, Joshua C.

    2004-01-01

    The Kansas State Legislature, by enacting Kansas Statute KSA 82a-2001 et seq., mandated the criteria for determining which Kansas stream segments would be subject to classification by the State. One criterion for the selection as a classified stream segment is based on the statistic of median flow being equal to or greater than 1 cubic foot per second. As specified by KSA 82a-2001 et seq., median flows were determined from U.S. Geological Survey streamflow-gaging-station data by using the most-recent 10 years of gaged data (KSA) for each streamflow-gaging station. Median flows also were determined by using gaged data from the entire period of record (all-available hydrology, AAH). Least-squares multiple regression techniques were used, along with Tobit analyses, to develop equations for estimating median flows for uncontrolled stream segments. The drainage area of the gaging stations on uncontrolled stream segments used in the regression analyses ranged from 2.06 to 12,004 square miles. A logarithmic transformation of the data was needed to develop the best linear relation for computing median flows. In the regression analyses, the significant climatic and basin characteristics, in order of importance, were drainage area, mean annual precipitation, mean basin permeability, and mean basin slope. Tobit analyses of KSA data yielded a model standard error of prediction of 0.285 logarithmic units, and the best equations using Tobit analyses of AAH data had a model standard error of prediction of 0.250 logarithmic units. These regression equations and an interpolation procedure were used to compute median flows for the uncontrolled stream segments on the 1999 Kansas Surface Water Register. Measured median flows from gaging stations were incorporated into the regression-estimated median flows along the stream segments where available. The segments that were uncontrolled were interpolated using gaged data weighted according to the drainage area and the bias between the regression-estimated and gaged flow information. On controlled segments of Kansas streams, the median flow information was interpolated between gaging stations using only gaged data weighted by drainage area. Of the 2,232 total stream segments on the Kansas Surface Water Register, 34.5 percent of the segments had an estimated median streamflow of less than 1 cubic foot per second when the KSA analysis was used. When the AAH analysis was used, 36.2 percent of the segments had an estimated median streamflow of less than 1 cubic foot per second. This report supersedes U.S. Geological Survey Water-Resources Investigations Report 02-4292.
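
    The Tobit analyses mentioned above handle stream segments whose (log) median flow is censored at a lower limit. A compact sketch of a left-censored Tobit log-likelihood fitted by numerical optimisation is given below; the data, the single predictor (log drainage area) and the censoring limit are all hypothetical, not values from the report.

      # Sketch: left-censored (Tobit) regression of log10 median flow on
      # log10 drainage area, maximising the censored-normal log-likelihood.
      # Simulated data; censoring limit and predictor are illustrative only.
      import numpy as np
      from scipy import optimize, stats

      rng = np.random.default_rng(9)
      log_area = np.log10(rng.uniform(2, 12000, 300))
      log_flow = -1.0 + 0.8 * log_area + rng.normal(0, 0.35, 300)
      limit = 0.0                                       # censoring limit (log10 of 1 cfs)
      observed = np.maximum(log_flow, limit)            # flows below 1 cfs recorded as censored
      censored = log_flow <= limit

      def negloglik(params):
          b0, b1, log_sigma = params
          sigma = np.exp(log_sigma)
          mu = b0 + b1 * log_area
          ll_obs = stats.norm.logpdf(observed[~censored], mu[~censored], sigma)
          ll_cen = stats.norm.logcdf((limit - mu[censored]) / sigma)
          return -(ll_obs.sum() + ll_cen.sum())

      res = optimize.minimize(negloglik, x0=np.array([0.0, 1.0, 0.0]), method="Nelder-Mead")
      b0, b1, log_sigma = res.x
      print(f"intercept {b0:.2f}, slope {b1:.2f}, sigma {np.exp(log_sigma):.2f}")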

  5. CO2 flux determination by closed-chamber methods can be seriously biased by inappropriate application of linear regression

    NASA Astrophysics Data System (ADS)

    Kutzbach, L.; Schneider, J.; Sachs, T.; Giebels, M.; Nykänen, H.; Shurpali, N. J.; Martikainen, P. J.; Alm, J.; Wilmking, M.

    2007-07-01

    Closed (non-steady state) chambers are widely used for quantifying carbon dioxide (CO2) fluxes between soils or low-stature canopies and the atmosphere. It is well recognised that covering a soil or vegetation by a closed chamber inherently disturbs the natural CO2 fluxes by altering the concentration gradients between the soil, the vegetation and the overlying air. Thus, the driving factors of CO2 fluxes are not constant during the closed chamber experiment, and no linear increase or decrease of CO2 concentration over time within the chamber headspace can be expected. Nevertheless, linear regression has been applied for calculating CO2 fluxes in many recent, partly influential, studies. This approach was justified by keeping the closure time short and assuming the concentration change over time to be in the linear range. Here, we test if the application of linear regression is really appropriate for estimating CO2 fluxes using closed chambers over short closure times and if the application of nonlinear regression is necessary. We developed a nonlinear exponential regression model from diffusion and photosynthesis theory. This exponential model was tested with four different datasets of CO2 flux measurements (total number: 1764) conducted at three peatland sites in Finland and a tundra site in Siberia. The flux measurements were performed using transparent chambers on vegetated surfaces and opaque chambers on bare peat surfaces. Thorough analyses of residuals demonstrated that linear regression was frequently not appropriate for the determination of CO2 fluxes by closed-chamber methods, even if closure times were kept short. The developed exponential model was well suited for nonlinear regression of the concentration over time c(t) evolution in the chamber headspace and estimation of the initial CO2 fluxes at closure time for the majority of experiments. CO2 flux estimates by linear regression can be as low as 40% of the flux estimates of exponential regression for closure times of only two minutes and even lower for longer closure times. The degree of underestimation increased with increasing CO2 flux strength and is dependent on soil and vegetation conditions which can disturb not only the quantitative but also the qualitative evaluation of CO2 flux dynamics. The underestimation effect by linear regression was observed to be different for CO2 uptake and release situations which can lead to stronger bias in the daily, seasonal and annual CO2 balances than in the individual fluxes. To avoid serious bias of CO2 flux estimates based on closed chamber experiments, we suggest further tests using published datasets and recommend the use of nonlinear regression models for future closed chamber studies.
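
    The contrast the authors draw between linear and exponential flux estimation can be illustrated numerically: if the headspace concentration follows c(t) = c_s + (c_0 - c_s) exp(-k t), the initial flux is proportional to the initial slope k (c_s - c_0), whereas a straight-line fit over the whole closure period underestimates it. The sketch below uses invented chamber values, not the authors' field data or their full exponential model.

      # Sketch: simulate a closed-chamber CO2 concentration curve, then compare
      # the initial flux recovered by an exponential fit with a linear-slope fit.
      # Chamber parameters are illustrative, not values from the study.
      import numpy as np
      from scipy.optimize import curve_fit

      def chamber(t, c_s, c_0, k):
          """Exponential saturation of headspace concentration toward c_s."""
          return c_s + (c_0 - c_s) * np.exp(-k * t)

      rng = np.random.default_rng(10)
      t = np.linspace(0, 120, 25)                       # 2-minute closure, seconds
      c = chamber(t, 520.0, 390.0, 0.01) + rng.normal(0, 0.5, t.size)

      # Exponential (nonlinear) fit: initial slope dc/dt at t=0 is k*(c_s - c_0)
      (c_s, c_0, k), _ = curve_fit(chamber, t, c, p0=(500, 390, 0.02))
      flux_exp = k * (c_s - c_0)

      # Linear fit over the whole closure period
      flux_lin = np.polyfit(t, c, 1)[0]

      print(f"exponential initial slope: {flux_exp:.3f} ppm/s")
      print(f"linear slope:              {flux_lin:.3f} ppm/s (underestimate)")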

  6. Advanced Statistics for Exotic Animal Practitioners.

    PubMed

    Hodsoll, John; Hellier, Jennifer M; Ryan, Elizabeth G

    2017-09-01

    Correlation and regression assess the association between 2 or more variables. This article reviews the core knowledge needed to understand these analyses, moving from visual analysis in scatter plots through correlation, simple and multiple linear regression, and logistic regression. Correlation estimates the strength and direction of a relationship between 2 variables. Regression can be considered more general and quantifies the numerical relationships between an outcome and 1 or multiple variables in terms of a best-fit line, allowing predictions to be made. Each technique is discussed with examples and the statistical assumptions underlying their correct application. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Coffee intake, cardiovascular disease and all-cause mortality: observational and Mendelian randomization analyses in 95 000-223 000 individuals.

    PubMed

    Nordestgaard, Ask Tybjærg; Nordestgaard, Børge Grønne

    2016-12-01

    Coffee has been associated with modestly lower risk of cardiovascular disease and all-cause mortality in meta-analyses; however, it is unclear whether these are causal associations. We tested first whether coffee intake is associated with cardiovascular disease and all-cause mortality observationally; second, whether genetic variations previously associated with caffeine intake are associated with coffee intake; and third, whether the genetic variations are associated with cardiovascular disease and all-cause mortality. First, we used multivariable adjusted Cox proportional hazard regression models evaluated with restricted cubic splines to examine observational associations in 95 366 White Danes. Second, we estimated mean coffee intake according to five genetic variations near the AHR (rs4410790; rs6968865) and CYP1A1/2 genes (rs2470893; rs2472297; rs2472299). Third, we used sex- and age adjusted Cox proportional hazard regression models to examine genetic associations with cardiovascular disease and all-cause mortality in 112 509 Danes. Finally, we used sex and age-adjusted logistic regression models to examine genetic associations with ischaemic heart disease including the Cardiogram and C4D consortia in a total of up to 223 414 individuals. We applied similar analyses to ApoE genotypes associated with plasma cholesterol levels, as a positive control. In observational analyses, we observed U-shaped associations between coffee intake and cardiovascular disease and all-cause mortality; lowest risks were observed in individuals with medium coffee intake. Caffeine intake allele score (rs4410790 + rs2470893) was associated with a 42% higher coffee intake. Hazard ratios per caffeine intake allele were 1.02 (95% confidence interval: 1.00-1.03) for ischaemic heart disease, 1.02 (0.99-1.02) for ischaemic stroke, 1.02 (1.00-1.03) for ischaemic vascular disease, 1.02 (0.99-1.06) for cardiovascular mortality and 1.01 (0.99-1.03) for all-cause mortality. Including international consortia, odds ratios per caffeine intake allele for ischaemic heart disease were 1.00 (0.98-1.02) for rs4410790, 1.01 (0.99-1.03) for rs6968865, 1.02 (1.00-1.04) for rs2470893, 1.02 (1.00-1.04) for rs2472297 and 1.03 (0.99-1.06) for rs2472299. Conversely, 5% lower cholesterol level caused by ApoE genotype had a corresponding odds ratio for ischaemic heart disease of 0.93 (0.89-0.97). Observationally, coffee intake was associated with U-shaped lower risk of cardiovascular disease and all-cause mortality; however, genetically caffeine intake was not associated with risk of cardiovascular disease or all-cause mortality. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association

  8. The association of serum prolactin concentration with inflammatory biomarkers - cross-sectional findings from the population-based Study of Health in Pomerania.

    PubMed

    Friedrich, Nele; Schneider, Harald J; Spielhagen, Christin; Markus, Marcello Ricardo Paulista; Haring, Robin; Grabe, Hans J; Buchfelder, Michael; Wallaschofski, Henri; Nauck, Matthias

    2011-10-01

    Prolactin (PRL) is involved in immune regulation and may contribute to an atherogenic phenotype. Previous results on the association of PRL with inflammatory biomarkers have been conflicting and limited by small patient studies. Therefore, we used data from a large population-based sample to assess the cross-sectional associations between serum PRL concentration and high-sensitivity C-reactive protein (hsCRP), fibrinogen, interleukin-6 (IL-6), and white blood cell (WBC) count. From the population-based Study of Health in Pomerania (SHIP), a total of 3744 subjects were available for the present analyses. PRL and inflammatory biomarkers were measured. Linear and logistic regression models adjusted for age, sex, body-mass-index, total cholesterol and glucose were analysed. Multivariable linear regression models revealed a positive association of PRL with WBC. Multivariable logistic regression analyses showed a significant association of PRL with increased IL-6 in non-smokers [highest vs lowest quintile: odds ratio 1·69 (95% confidence interval 1·10-2·58), P = 0·02] and smokers [OR 2·06 (95%-CI 1·10-3·89), P = 0·02]. Similar results were found for WBC in non-smokers [highest vs lowest quintile: OR 2·09 (95%-CI 1·21-3·61), P = 0·01)] but not in smokers. Linear and logistic regression analyses revealed no significant associations of PRL with hsCRP or fibrinogen. Serum PRL concentrations are associated with inflammatory biomarkers including IL-6 and WBC, but not hsCRP or fibrinogen. The suggested role of PRL in inflammation needs further investigation in future prospective studies. © 2011 Blackwell Publishing Ltd.

  9. Using Marginal Structural Modeling to Estimate the Cumulative Impact of an Unconditional Tax Credit on Self-Rated Health.

    PubMed

    Pega, Frank; Blakely, Tony; Glymour, M Maria; Carter, Kristie N; Kawachi, Ichiro

    2016-02-15

    In previous studies, researchers estimated short-term relationships between financial credits and health outcomes using conventional regression analyses, but they did not account for time-varying confounders affected by prior treatment (CAPTs) or the credits' cumulative impacts over time. In this study, we examined the association between total number of years of receiving New Zealand's Family Tax Credit (FTC) and self-rated health (SRH) in 6,900 working-age parents using 7 waves of New Zealand longitudinal data (2002-2009). We conducted conventional linear regression analyses, both unadjusted and adjusted for time-invariant and time-varying confounders measured at baseline, and fitted marginal structural models (MSMs) that more fully adjusted for confounders, including CAPTs. Of all participants, 5.1%-6.8% received the FTC for 1-3 years and 1.8%-3.6% for 4-7 years. In unadjusted and adjusted conventional regression analyses, each additional year of receiving the FTC was associated with 0.033 (95% confidence interval (CI): -0.047, -0.019) and 0.026 (95% CI: -0.041, -0.010) units worse SRH (on a 5-unit scale). In the MSMs, the average causal treatment effect also reflected a small decrease in SRH (unstabilized weights: β = -0.039 unit, 95% CI: -0.058, -0.020; stabilized weights: β = -0.031 unit, 95% CI: -0.050, -0.007). Cumulatively receiving the FTC marginally reduced SRH. Conventional regression analyses and MSMs produced similar estimates, suggesting little bias from CAPTs. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
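
    The marginal structural model workflow referred to above (estimate treatment probabilities, form stabilized inverse-probability weights, fit a weighted outcome regression) can be sketched for a single time point as follows. The variables are simulated, and the single-wave sketch omits the cumulative, time-varying weighting across seven panel waves used in the study.

      # Sketch: single-time-point marginal structural model via stabilized
      # inverse-probability-of-treatment weights. Simulated data only.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(11)
      n = 5000
      confounder = rng.normal(size=n)                            # e.g. household income
      p_treat = 1 / (1 + np.exp(-(-1.0 + 0.8 * confounder)))
      treated = rng.binomial(1, p_treat)                         # e.g. received tax credit
      srh = 3.5 - 0.03 * treated + 0.3 * confounder + rng.normal(0, 0.8, n)

      # Denominator model: P(treatment | confounder); numerator: marginal P(treatment)
      denom = sm.Logit(treated, sm.add_constant(confounder)).fit(disp=0).predict()
      numer = treated.mean()
      p_a = np.where(treated == 1, denom, 1 - denom)
      p_marg = np.where(treated == 1, numer, 1 - numer)
      sw = p_marg / p_a                                          # stabilized weights

      # Weighted outcome regression gives the marginal (causal) treatment effect
      msm = sm.WLS(srh, sm.add_constant(treated.astype(float)), weights=sw).fit()
      print("MSM treatment effect on self-rated health:", round(msm.params[1], 3))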

  10. Brucellosis in pregnant women from Pakistan: an observational study.

    PubMed

    Ali, Shahzad; Akhter, Shamim; Neubauer, Heinrich; Scherag, André; Kesselmeier, Miriam; Melzer, Falk; Khan, Iahtasham; El-Adawy, Hosny; Azam, Asima; Qadeer, Saima; Ali, Qurban

    2016-09-02

    Brucella species occasionally cause spontaneous human abortion. Brucella can be transmitted commonly through the ingestion of raw milk or milk products. The objective of this study was to determine the sero-prevalence of and to identify potential risk factors for brucellosis in pregnant women from Rawalpindi, Pakistan. We conducted a cross-sectional study at the Gynecology Outdoor Patient department of the Benazir Bhutto Hospital, Rawalpindi, Pakistan from March to June 2013. Data related to potential risk factors and clinical history was collected by individual interviews on the blood sampling day. The 429 serum samples collected were initially screened by Rose Bengal Plate Agglutination test for the detection of Brucella antibodies. We applied standard descriptive statistics and logistic regression analyses. Twenty five (5.8 %; 95 % confidence interval (CI): 3.8 % -8.5 %) serum samples were found to be seropositive. Brucellosis-related clinical symptoms were recorded in various seropositive cases. Animal contact, raw milk consumption, having an abortion history and the experience of an intrauterine fetal death were associated with seropositivity for brucellosis in univariate analyses (all p <0.05). In multiple logistic regression models only the contact with animals remained as independent and robust risk factor (odds ratio 5.21; 95 % CI: 1.88-13.75; p = 0.001) for seropositivity. Brucellosis is a serious threat for pregnant women and their unborn children in Pakistan. Pregnant women having brucellosis-related symptoms or previous history of abortions, miscarriages, intrauterine fetal death and other brucellosis-related manifestations should be screened for brucellosis - especially those exposed to animals given the increased risk - and medication should be administered according to state of the art.

  11. [Impulsivity - aggression - depression: study of adolescents' problem behavior in light of their personality traits].

    PubMed

    Pikó, Bettina; Pinczés, Tamás

    2014-01-01

    Impulsivity is a personality trait that may shape our everyday living, quality of life, and decisions. Impulsivity has particular significance during adolescence as part of adolescent neuroanatomical and neuropsychiatric developmental processes. The main goal of the present study was to examine correlations among depressive symptomatology, types of aggressive behaviors (verbal, physical, psychic), impulsivity and other personality traits (risk-taking, empathy and self-efficacy) in adolescents. Data were collected in Debrecen during the first semester of 2012 from classes in three high schools (N = 413): 237 (57.4%) boys and 176 (42.6%) girls. The self-administered questionnaire contained items on mental health and personality traits beyond sociodemographics. After descriptive statistics, correlation and multiple linear regression analyses were applied to detect associations. Girls reported more depressive symptoms, and their level of empathy was also higher. In terms of aggression, a significant gender difference was detected only for physical aggression, with higher levels among boys. Based on the multiple regression analyses, impulsivity acted as a risk factor both for mean levels of depressive symptomatology and for the aggression scales. In addition, lack of empathy proved to be related to physical aggression. In both sexes, self-efficacy was positively associated with verbal and physical aggression. Among girls, self-efficacy was a negative predictor of psychic aggression, that is, it seems to act as a protective factor. In summary, we may conclude that there are strong correlations among depressive symptomatology, aggressive behaviors and impulsivity, and this association may be colored by further personality traits, such as risk-taking, empathy, and self-efficacy. There is a need for learning some basic, effective techniques for aggression management and self-control as early as adolescence.

  12. Estimating regression to the mean and true effects of an intervention in a four-wave panel study.

    PubMed

    Gmel, Gerhard; Wicki, Matthias; Rehm, Jürgen; Heeb, Jean-Luc

    2008-01-01

    First, to analyse whether a taxation-related decrease in spirit prices had a similar effect on spirit consumption for low-, medium- and high-level drinkers. Secondly, as the relationship between baseline values and post-intervention changes is confounded with regression to the mean (RTM) effects, to apply different approaches for estimating the RTM effect and true change. Consumption of spirits and total alcohol consumption were analysed in a four-wave panel study (one pre-intervention and three post-intervention measurements) of 889 alcohol consumers sampled from the general population of Switzerland. Two correlational methods, one method quantitatively estimating the RTM effect and one growth curve approach based on hierarchical linear models (HLM), were used to estimate RTM effects among low-, medium- and high-level drinkers. Adjusted for RTM effects, high-level drinkers increased consumption more than lighter drinkers in the short term, but this was not a persisting effect. Changes in taxation affected mainly light and moderate drinkers in the long term. All methods concurred that RTM effects were present to a considerable degree, and methods quantifying the RTM effect or adjusting for it yielded similar estimates. Intervention studies have to consider RTM effects both in the study design and in the evaluation methods. Observed changes can be adjusted for RTM effects and true change can be estimated. The recommended method, particularly if the aim is to estimate change not only for the sample as a whole, but for groups of drinkers with different baseline consumption levels, is growth curve modelling. If reliability of measurement instruments cannot be increased, the incorporation of more than one pre-intervention measurement point may be a valuable adjustment of the study design.
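
    The size of the regression-to-the-mean effect that the quantitative approach above estimates follows a standard formula: for a baseline value x0 drawn from a population with mean mu and test-retest correlation rho, the expected change at follow-up in the absence of any true effect is (1 - rho)(mu - x0). The sketch below demonstrates this on simulated consumption data; it is not the authors' growth-curve model, and all parameter values are invented.

      # Sketch: expected regression-to-the-mean (RTM) effect for baseline drinking
      # groups, using E[change | x0] = (1 - rho) * (mu - x0) under no true change.
      # Consumption values are simulated.
      import numpy as np

      rng = np.random.default_rng(12)
      n, rho, mu, sd = 889, 0.7, 10.0, 6.0
      true_level = rng.normal(mu, sd * np.sqrt(rho), n)           # stable component
      baseline = true_level + rng.normal(0, sd * np.sqrt(1 - rho), n)
      followup = true_level + rng.normal(0, sd * np.sqrt(1 - rho), n)

      for label, mask in [("low", baseline < 5),
                          ("medium", (baseline >= 5) & (baseline < 15)),
                          ("high", baseline >= 15)]:
          observed = (followup - baseline)[mask].mean()
          expected_rtm = ((1 - rho) * (mu - baseline[mask])).mean()
          print(f"{label:6s} drinkers: observed change {observed:+.2f}, "
                f"predicted RTM {expected_rtm:+.2f}")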

  13. Poverty dynamics, poverty thresholds and mortality: An age-stage Markovian model

    PubMed Central

    Rehkopf, David; Tuljapurkar, Shripad; Horvitz, Carol C.

    2018-01-01

    Recent studies have examined the risk of poverty throughout the life course, but few have considered how transitioning in and out of poverty shape the dynamic heterogeneity and mortality disparities of a cohort at each age. Here we use state-by-age modeling to capture individual heterogeneity in crossing one of three different poverty thresholds (defined as 1×, 2× or 3× the “official” poverty threshold) at each age. We examine age-specific state structure, the remaining life expectancy, its variance, and cohort simulations for those above and below each threshold. Survival and transitioning probabilities are statistically estimated by regression analyses of data from the Health and Retirement Survey RAND data-set, and the National Longitudinal Survey of Youth. Using the results of these regression analyses, we parameterize discrete state, discrete age matrix models. We found that individuals above all three thresholds have higher annual survival than those in poverty, especially for mid-ages to about age 80. The advantage is greatest when we classify individuals based on 1× the “official” poverty threshold. The greatest discrepancy in average remaining life expectancy and its variance between those above and in poverty occurs at mid-ages for all three thresholds. And fewer individuals are in poverty between ages 40-60 for all three thresholds. Our findings are consistent with results based on other data sets, but also suggest that dynamic heterogeneity in poverty and the transience of the poverty state is associated with income-related mortality disparities (less transience, especially of those above poverty, more disparities). This paper applies the approach of age-by-stage matrix models to human demography and individual poverty dynamics. In so doing we extend the literature on individual poverty dynamics across the life course. PMID:29768416
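
    A core computation in age-by-stage matrix models of this kind, remaining life expectancy from the survival-and-transition part of the matrix, can be sketched with a toy two-state (above/below a poverty threshold), three-age example. The transition and survival probabilities below are invented for illustration, not estimates from the HRS or NLSY data.

      # Sketch: remaining life expectancy from an age-by-stage Markov model with
      # two poverty states. U holds age- and state-specific survival/transition
      # probabilities (invented numbers); e = column sums of N = (I - U)^{-1}.
      import numpy as np

      ages, states = 3, 2                 # toy model: 3 age classes x {above, below} poverty
      dim = ages * states
      U = np.zeros((dim, dim))

      def idx(age, state):                # flatten (age, state) to a matrix index
          return age * states + state

      surv = {0: 0.98, 1: 0.95}           # survival: above vs below poverty (assumed)
      stay = {0: 0.90, 1: 0.80}           # probability of remaining in the same state
      for age in range(ages - 1):         # individuals advance one age class per step
          for s in (0, 1):
              U[idx(age + 1, s), idx(age, s)] = surv[s] * stay[s]
              U[idx(age + 1, 1 - s), idx(age, s)] = surv[s] * (1 - stay[s])

      N = np.linalg.inv(np.eye(dim) - U)  # fundamental matrix: expected time in each state
      life_expectancy = N.sum(axis=0)     # remaining life expectancy by starting (age, state)
      print("remaining life expectancy (above, below) at the first age class:",
            life_expectancy[:2].round(2))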

  14. Predictors of protein-energy wasting in haemodialysis patients: a cross-sectional study.

    PubMed

    Ruperto, M; Sánchez-Muniz, F J; Barril, G

    2016-02-01

    Protein-energy wasting (PEW) is a highly prevalent condition in haemodialysis (HD) patients. The potential usefulness of nutritional-inflammatory markers in the diagnosis of PEW in chronic kidney disease has not been established completely. We hypothesised that a combination of serum albumin, percentage of mid-arm muscle circumference and standard body weight comprises a better discriminator of nutritional status in HD patients than any single marker. A cross-sectional study was performed in 80 HD patients. Patients were categorised into two groups: well-nourished and PEW. Logistic regression analysis was applied to corroborate the reliability of the three markers of PEW with all the nutritional-inflammatory markers analysed. PEW was identified in 52.5% of HD patients. Compared with the well-nourished patients, PEW patients had lower body mass index, serum pre-albumin and body cell mass (all P < 0.001) and higher C-reactive protein (s-CRP) (P < 0.01). Logistic regression analyses showed that the combination of the three criteria was significantly related to s-CRP >1 mg dL(-1), phase angle <4°, and serum pre-albumin <30 mg dL(-1) (all P < 0.05). Other indicators, such as lymphocytes <20% and Charlson comorbidity index, were significantly involved (both P < 0.01). A receiver operating characteristic curve analysis yielded an area under the curve of 0.86 (P < 0.001). The combined utilisation of serum albumin, percentage of mid-arm muscle circumference and standard body weight as PEW markers appears to be useful for nutritional-inflammatory status assessment and adds predictive value to the traditional indicators. Larger studies are needed to confirm the reliability of these predictor combinations and their cut-off values in HD patients and other populations. © 2014 The British Dietetic Association Ltd.

  15. MDM2 and Ki-67 predict for distant metastasis and mortality in men treated with radiotherapy and androgen deprivation for prostate cancer: RTOG 92-02.

    PubMed

    Khor, Li-Yan; Bae, Kyounghwa; Paulus, Rebecca; Al-Saleem, Tahseen; Hammond, M Elizabeth; Grignon, David J; Che, Mingxin; Venkatesan, Varagur; Byhardt, Roger W; Rotman, Marvin; Hanks, Gerald E; Sandler, Howard M; Pollack, Alan

    2009-07-01

    PURPOSE MDM2 regulates p53, which controls cell cycle arrest and apoptosis. Both proteins, along with Ki-67, which is an established strong determinant of metastasis, have shown promise in predicting the outcome of men treated with radiation therapy (RT) with or without short-term androgen deprivation (STAD). This report compares the utility of abnormal expression of these biomarkers in estimating progression in a cohort of men treated on RTOG 92-02. PATIENTS AND METHODS Adequate tissue for immunohistochemistry was available for p53, Ki-67, and MDM2 analyses in 478 patient cases. The percentage of tumor nuclei staining positive (PSP) was quantified manually or by image analysis, and the per-sample mean intensity score (MIS) was quantified by image analysis. Cox regression models were used to estimate overall mortality (OM), and Fine and Gray's regressions were applied to the end points of distant metastasis (DM) and cause-specific mortality (CSM). Results In multivariate analyses that adjusted for all markers and treatment covariates, MDM2 overexpression was significantly related to DM (P = .02) and OM (P = .003), and Ki-67 overexpression was significantly related to DM (P < .0001), CSM (P = .0007), and OM (P = .01). P53 overexpression was significantly related to OM (P = .02). When considered in combination, the overexpression of both Ki-67 and MDM2 at high levels was associated with significantly increased failure rates for all end points (P < .001 for DM, CSM, and OM). CONCLUSION Combined MDM2 and Ki-67 expression levels were independently related to distant metastasis and mortality and, if validated, could be considered for risk stratification of patients with prostate cancer in clinical trials.

  16. Socioeconomic Differences in Parenting Strategies to Prevent Adolescent Smoking: A Case Study from the Netherlands.

    PubMed

    Kuipers, Mirte A G; Haal, Sylke; Kunst, Anton E

    2016-06-01

    This study aimed to identify possible socioeconomic differences in the use of anti-smoking parenting strategies. In 2012, survey data of adolescents (N = 225) aged 13 to 17 years and their mothers (N = 122) and fathers (N = 105) were collected in Haarlem, the Netherlands. Questions on smoking behaviour and eleven anti-smoking parenting strategies were answered by adolescents, mothers and fathers. School tracks of adolescents and educational level of parents were measured as indicators of socioeconomic position. Linear multilevel regression analyses were applied to study the association between socioeconomic position (SEP) and standardised scores of anti-smoking strategies. Analyses were controlled for age, sex and smoking by parents and adolescents. We found no consistent socioeconomic differences in the use of anti-smoking parenting strategies. There were no statistically significant differences in relation to parental educational level or when using adolescent reports on parenting practices. However, when using parental reports, a few strategies varied significantly according to adolescent educational track. Adolescents in higher educational tracks were more likely to have no-smoking rules in the home (standardised regression coefficient (β) = 0.20, 95 % confidence interval (CI): 0.03; 0.37, p = 0.022) and more likely to have a no-smoking agreement (β = 0.17, 95 % CI: 0.00; 0.34, p = 0.048). However, they were less likely to frequently communicate about smoking with their parents (β = -0.25, 95 % CI: -0.41; -0.08, p = 0.004). In this specific population, there was no consistent support for the hypothesis that anti-smoking parenting strategies contribute to socioeconomic inequalities in adolescent smoking. Parental factors that are more likely to contribute to these inequalities include parental smoking and parenting styles.

  17. Clinical Phenotype Classifications Based on Static Varus Alignment and Varus Thrust in Japanese Patients With Medial Knee Osteoarthritis

    PubMed Central

    Iijima, Hirotaka; Fukutani, Naoto; Fukumoto, Takahiko; Uritani, Daisuke; Kaneda, Eishi; Ota, Kazuo; Kuroki, Hiroshi; Matsuda, Shuichi

    2015-01-01

    Objective To investigate the association between knee pain during gait and 4 clinical phenotypes based on static varus alignment and varus thrust in patients with medial knee osteoarthritis (OA). Methods Patients in an orthopedic clinic (n = 266) diagnosed as having knee OA (Kellgren/Lawrence [K/L] grade ≥1) were divided into 4 phenotype groups according to the presence or absence of static varus alignment and varus thrust (dynamic varus): no varus (n = 173), dynamic varus (n = 17), static varus (n = 50), and static varus + dynamic varus (n = 26). The knee range of motion, spatiotemporal gait parameters, visual analog scale scores for knee pain, and scores on the Japanese Knee Osteoarthritis Measure were used to assess clinical outcomes. Multiple logistic regression analyses identified the relationship between knee pain during gait and the 4 phenotypes, adjusted for possible risk factors, including age, sex, body mass index, K/L grade, and gait velocity. Results Multiple logistic regression analysis showed that varus thrust without varus alignment was associated with knee pain during gait (odds ratio [OR] 3.30, 95% confidence interval [95% CI] 1.08–12.4), and that varus thrust combined with varus alignment was strongly associated with knee pain during gait (OR 17.1, 95% CI 3.19–320.0). Sensitivity analyses applying alternative cutoff values for defining static varus alignment showed comparable results. Conclusion Varus thrust with or without static varus alignment was associated with the occurrence of knee pain during gait. Tailored interventions based on individual malalignment phenotypes may improve clinical outcomes in patients with knee OA. PMID:26017348

  18. Predicting having condoms available among adolescents: the role of personal norm and enjoyment.

    PubMed

    Jellema, Ilke J; Abraham, Charles; Schaalma, Herman P; Gebhardt, Winifred A; van Empelen, Pepijn

    2013-05-01

    Having condoms available has been shown to be an important predictor of condom use. We examined whether or not personal norm and goal enjoyment contribute to predicting having condoms available in the context of cognition specified by the theory of planned behaviour (TPB). Prospective survey study, with a baseline and follow-up measurement (at 3 months). Data were gathered using an online survey. In total 282 adolescents (mean age = 15.6, 74% female adolescents) completed both questionnaires. At baseline, demographics, sexual experience, condom use, TPB variables, descriptive norm, personal norm, and enjoyment towards having condoms available were measured. At T2 (3 months later) having condoms available was measured. Direct and moderating effects of personal norm and goal enjoyment were examined by means of hierarchical linear regression analyses. Regression analyses yielded a direct effect of self-efficacy and personal norm on condom availability. In addition, moderation of the intention-behaviour relation by goal enjoyment added to the variance explained. The final model explained approximately 35% of the variance in condom availability. Personal norm and goal enjoyment add to the predictive utility of a TPB model of having condoms available and may be useful intervention targets. What is already known about this subject? Having condoms available is an important prerequisite for actual condom use. The theory of planned behaviour has successfully been applied to explain condom availability behaviour. The theory of planned behaviour has been criticized for not adequately taking into account affective motivation. What does this study add? Personal norm and goal enjoyment add to the predictive utility of the model. Personal norm explains condom availability directly, enjoyment increases intention enactment. Personal norm and goal enjoyment therefore are useful intervention targets. © 2012 The British Psychological Society.

  19. Performance and effects of land cover type on synthetic surface reflectance data and NDVI estimates for assessment and monitoring of semi-arid rangeland

    USGS Publications Warehouse

    Olexa, Edward M.; Lawrence, Rick L

    2014-01-01

    Federal land management agencies provide stewardship over much of the rangelands in the arid and semi-arid western United States, but they often lack data of the proper spatiotemporal resolution and extent needed to assess range conditions and monitor trends. Recent advances in the blending of complementary, remotely sensed data could provide public lands managers with the needed information. We applied the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) to five Landsat TM and concurrent Terra MODIS scenes, and used pixel-based regression and difference image analyses to evaluate the quality of synthetic reflectance and NDVI products associated with semi-arid rangeland. Predicted red reflectance data consistently demonstrated higher accuracy, less bias, and stronger correlation with observed data than did analogous near-infrared (NIR) data. The accuracy of both bands tended to decline as the lag between base and prediction dates increased; however, mean absolute errors (MAE) were typically ≤10%. The quality of area-wide NDVI estimates was less consistent than either spectral band, although the MAE of estimates predicted using early season base pairs were ≤10% throughout the growing season. Correlation between known and predicted NDVI values and agreement with the 1:1 regression line tended to decline as the prediction lag increased. Further analyses of NDVI predictions, based on a 22 June base pair and stratified by land cover/land use (LCLU), revealed accurate estimates through the growing season; however, inter-class performance varied. This work demonstrates the successful application of the STARFM algorithm to semi-arid rangeland; however, we encourage evaluation of STARFM's performance on a per-product basis, stratified by LCLU, with attention given to the influence of base pair selection and the impact of the time lag.

  20. Socioeconomic Correlates of Contraceptive Use among the Ethnic Tribal Women of Bangladesh: Does Sex Preference Matter?

    PubMed Central

    Hassan, Che Hashim

    2013-01-01

    Objective To examine the relationship between socioeconomic factors and contraceptive use among tribal women of Bangladesh, focusing on son preference over daughters. Materials and methods The study used data gathered through a cross-sectional survey of four tribal communities residing in the Rangamati Hill District of the Chittagong Hill Tracts, Bangladesh. A multistage random sampling procedure was applied to collect data from 865 currently married women, of whom 806 were non-pregnant and had at least one living child; these women form the basis of this study. The information was recorded in a pre-structured questionnaire. Simple cross-tabulation, chi-square tests and logistic regression analyses were performed to analyse the data. Results The contraceptive prevalence rate among the study tribal women was 73%. The multivariate analyses yielded quantitatively important and reliable estimates of the likelihood of contraceptive use. Findings revealed that, after controlling for other variables, the likelihood of contraceptive use did not differ significantly between women with at least one son and those who had only daughters, indicating no preference for sons over daughters. Multivariate logistic regression analysis suggests that home visits by family planning workers, tribal identity, place of residence, husband's education, type of family, television ownership, electricity connection in the household and number of times married are important determinants of any contraceptive method use among the tribal women. Conclusion The contraceptive use rate among these disadvantaged tribal women was higher than the national level. Doorstep delivery of modern methods should be made available, targeting poor and remote zones. PMID:24971107

  1. Are adolescents with high self-esteem protected from psychosomatic symptomatology?

    PubMed

    Piko, Bettina F; Varga, Szabolcs; Mellor, David

    2016-06-01

    This study investigated the role of self-esteem, social (need to belong, loneliness, competitiveness, and shyness), and health (smoking, drinking) behaviors in Hungarian adolescents' psychosomatic symptoms. Our sample of 490 students (ages 14-19 years) from Debrecen (Hungary) completed the questionnaires. Besides descriptive statistics, correlation and multiple regression analyses were applied to test interrelationships. Frequency analysis revealed that fatigue was the most commonly experienced psychosomatic symptom in this sample, followed by sleeping problems and (lower) back pain. Girls reported experiencing more symptoms. Multiple regression analyses suggested that (1) need to belong, shyness, and competitiveness may serve as social behavioral risk factors for adolescents' psychosomatic symptomatology, whereas (2) self-esteem may play a protective role. The role of social and health behaviors was modified when analyzed by gender: the psychosomatic index score was positively related to smoking and shyness among girls, and need to belong among boys. Self-esteem provided protection for both sexes. We conclude that problems with social relationships (namely, unmet need to belong, competitiveness, and shyness) may lead to psychosomatic health complaints, whereas self-esteem may serve as a protection. Findings suggest that social skills training and strengthening self-esteem should be an important part of children's health promotion programs in schools to improve their psychosomatic health and well-being. • Despite being free of serious physical illness, many adolescents often report subjective health complaints, such as psychosomatic symptoms • As children in this life stage develop independence and autonomy, new types of social relationships, and identity, their social needs and skills also change What is new: • Need to belong, shyness, and competitiveness may serve as social behavioral risk factors for adolescents' psychosomatic symptomatology, whereas self-esteem may play a protective role • The role of social and health behaviors may vary by gender.

  2. Hospital outpatient perceptions of the physical environment of waiting areas: the role of patient characteristics on atmospherics in one academic medical center.

    PubMed

    Tsai, Chun-Yen; Wang, Mu-Chia; Liao, Wei-Tsen; Lu, Jui-Heng; Sun, Pi-Hung; Lin, Blossom Yen-Ju; Breen, Gerald-Mark

    2007-12-05

    This study examines hospital outpatient perceptions of the physical environment of the outpatient waiting areas in one medical center. The relationship between patient characteristics and their perceptions of and needs for the outpatient waiting areas is also examined. The examined medical center consists of five main buildings which house seventeen primary waiting areas for the outpatient clinics of nine medical specialties: 1) Internal Medicine; 2) Surgery; 3) Ophthalmology; 4) Obstetrics-Gynecology and Pediatrics; 5) Chinese Medicine; 6) Otolaryngology; 7) Orthopedics; 8) Family Medicine; and 9) Dermatology. A 15-item structured questionnaire was developed to rate patient satisfaction covering the four dimensions of the physical environments of the outpatient waiting areas: 1) visual environment; 2) hearing environment; 3) body contact environment; and 4) cleanliness. The survey was conducted between November 28, 2005 and December 8, 2005. A total of 680 outpatients responded. Descriptive, univariate, and multiple regression analyses were applied in this study. All 15 items were rated relatively high, ranging from 3.362 to 4.010 against a neutral score of 3. Using summated scores, derived from a principal component analysis, for the four constructed dimensions of patient satisfaction with the physical environments (i.e. visual environment, hearing environment, body contact environment, and cleanliness), multiple regression analyses revealed that patient satisfaction with the physical environment of outpatient waiting areas was associated with gender, age, visiting frequency, and visiting time. Patients' socio-demographic and contextual backgrounds were shown to affect their satisfaction with the physical environment of outpatient waiting areas. In addition to noting the overall rankings of the less satisfactory items, further attention should be given to patients' personal characteristics when redesigning more comfortable and customized physical environments for waiting areas.

  3. Association of serum anti-rotavirus immunoglobulin A antibody seropositivity and protection against severe rotavirus gastroenteritis: analysis of clinical trials of human rotavirus vaccine.

    PubMed

    Cheuvart, Brigitte; Neuzil, Kathleen M; Steele, A Duncan; Cunliffe, Nigel; Madhi, Shabir A; Karkada, Naveen; Han, Htay Htay; Vinals, Carla

    2014-01-01

    Clinical trials of the human rotavirus vaccine Rotarix™ (RV1) have demonstrated significant reductions in severe rotavirus gastroenteritis (RVGE) in children worldwide. However, no correlate of vaccine efficacy (VE) has yet been established. This paper presents 2 analyses which aimed to investigate whether serum anti-RV IgA measured by ELISA 1 or 2 mo post-vaccination can serve as a correlate of efficacy against RVGE: (1) In a large Phase III efficacy trial (Rota-037), the Prentice criteria for surrogate endpoints were applied to anti-RV IgA seropositivity 1 mo post-vaccination. These criteria determine whether a significant vaccine group effect can be predicted from the surrogate, namely seropositivity (anti-RV IgA concentration >20 U/mL); (2) Among other GSK-sponsored RV1 VE studies, 8 studies which assessed immunogenicity at 1 or 2 mo post-vaccination in all or a sub-cohort of enrolled subjects and had at least 10 RVGE episodes were included in a meta-analysis to measure the regression between clinical VE and VE predicted from immunogenicity (VE1). In Rota-037, anti-RV IgA seropositivity post-vaccination was associated with a lower incidence of any or severe RVGE; however, the proportion of vaccine group effect explained by seropositivity was only 43.6% and 32.7%, respectively. This low proportion was due to the vaccine group effect observed in seronegative subjects. In the meta-analysis, the slope of the regression between clinical VE and VE1 was statistically significant. These two independent analyses support the hypothesis that post-vaccination anti-RV IgA seropositivity (antibody concentration ≥20 U/mL) may serve as a useful correlate of efficacy in clinical trials of RV1 vaccines.

  4. HAPRAP: a haplotype-based iterative method for statistical fine mapping using GWAS summary statistics.

    PubMed

    Zheng, Jie; Rodriguez, Santiago; Laurin, Charles; Baird, Denis; Trela-Larsen, Lea; Erzurumluoglu, Mesut A; Zheng, Yi; White, Jon; Giambartolomei, Claudia; Zabaneh, Delilah; Morris, Richard; Kumari, Meena; Casas, Juan P; Hingorani, Aroon D; Evans, David M; Gaunt, Tom R; Day, Ian N M

    2017-01-01

    Fine mapping is a widely used approach for identifying the causal variant(s) at disease-associated loci. Standard methods (e.g. multiple regression) require individual level genotypes. Recent fine mapping methods using summary-level data require the pairwise correlation coefficients (r²) of the variants. However, haplotypes rather than pairwise r² are the true biological representation of linkage disequilibrium (LD) among multiple loci. In this article, we present an empirical iterative method, HAPlotype Regional Association analysis Program (HAPRAP), that enables fine mapping using summary statistics and haplotype information from an individual-level reference panel. Simulations with individual-level genotypes show that the results of HAPRAP and multiple regression are highly consistent. In simulation with summary-level data, we demonstrate that HAPRAP is less sensitive to poor LD estimates. In a parametric simulation using Genetic Investigation of ANthropometric Traits height data, HAPRAP performs well with a small training sample size (N < 2000) while other methods become suboptimal. Moreover, HAPRAP's performance is not affected substantially by single nucleotide polymorphisms (SNPs) with low minor allele frequencies. We applied the method to existing quantitative trait and binary outcome meta-analyses (human height, QTc interval and gallbladder disease); all previously reported association signals were replicated and two additional variants were independently associated with human height. Due to the growing availability of summary level data, the value of HAPRAP is likely to increase markedly for future analyses (e.g. functional prediction and identification of instruments for Mendelian randomization). The HAPRAP package and documentation are available at http://apps.biocompute.org.uk/haprap/. Contact: jie.zheng@bristol.ac.uk or tom.gaunt@bristol.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  5. Preoperative determinant of early postoperative renal function following radical cystectomy and intestinal urinary diversion.

    PubMed

    Gondo, Tatsuo; Ohno, Yoshio; Nakashima, Jun; Hashimoto, Takeshi; Nakagami, Yoshihiro; Tachibana, Masaaki

    2017-02-01

    To identify preoperative factors correlated with postoperative early renal function in patients who had undergone radical cystectomy (RC) and intestinal urinary diversion. We retrospectively identified 201 consecutive bladder cancer patients without distant metastasis who had undergone RC at our institution between 2003 and 2012. The estimated glomerular filtration rate (eGFR) was calculated using the modified Chronic Kidney Disease Epidemiology equation before RC and 3 months following RC. Univariate and stepwise multiple linear regression analyses were applied to estimate postoperative renal function and to identify significant preoperative predictors of postoperative renal function. Patients who had undergone intestinal urinary diversion and were available for the collection of follow-up data (n = 164) were eligible for the present study. Median preoperative and postoperative eGFRs were 69.7 (interquartile range [IQR] 56.3-78.0) and 70.7 (IQR 57.3-78.1), respectively. In univariate analyses, age, preoperative proteinuria, thickness of abdominal subcutaneous fat tissue (TSF), preoperative serum creatinine level, preoperative eGFR, and urinary diversion type were significantly associated with postoperative eGFR. In a stepwise multiple linear regression analysis, preoperative eGFR, age, and TSF were significant factors for predicting postoperative eGFR (p < 0.001, p = 0.02, and p = 0.046, respectively). The estimated postoperative eGFRs correlated well with the actual postoperative eGFRs (r = 0.65, p < 0.001). Preoperative eGFR, age, and TSF were independent preoperative factors for determining postoperative renal function in patients who had undergone RC and intestinal urinary diversion. These results may be used for patient counseling before surgery, including the planning of perioperative chemotherapy administration.
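
    The stepwise procedure described above can be sketched in a few lines. The following is a minimal, illustrative forward-selection loop in Python (statsmodels), not the authors' exact implementation; the column names (post_egfr, pre_egfr, age, tsf, and so on) are hypothetical placeholders for a data frame like the one analysed here.

```python
# Illustrative forward stepwise selection for a linear model predicting
# postoperative eGFR from preoperative candidates (hypothetical column names).
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(df, outcome, candidates, enter_p=0.05):
    """Greedily add the candidate with the smallest p-value below enter_p."""
    selected = []
    remaining = list(candidates)
    while remaining:
        pvals = {}
        for var in remaining:
            X = sm.add_constant(df[selected + [var]])
            fit = sm.OLS(df[outcome], X).fit()
            pvals[var] = fit.pvalues[var]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= enter_p:
            break
        selected.append(best)
        remaining.remove(best)
    return selected

# Example usage with a hypothetical data frame `data`:
# chosen = forward_stepwise(data, "post_egfr",
#                           ["pre_egfr", "age", "tsf", "proteinuria", "diversion_type"])
# final = sm.OLS(data["post_egfr"], sm.add_constant(data[chosen])).fit()
# print(final.summary())
```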

  6. Measuring the statistical validity of summary meta‐analysis and meta‐regression results for use in clinical practice

    PubMed Central

    Riley, Richard D.

    2017-01-01

    An important question for clinicians appraising a meta‐analysis is: are the findings likely to be valid in their own practice—does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity—where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple (‘leave‐one‐out’) cross‐validation technique, we demonstrate how we may test meta‐analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta‐analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta‐analysis and a tailored meta‐regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within‐study variance, between‐study variance, study sample size, and the number of studies in the meta‐analysis. Finally, we apply Vn to two published meta‐analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta‐analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28620945
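
    The Vn statistic itself is derived in the paper; the sketch below only illustrates the 'leave-one-out' scheme it builds on. A standard DerSimonian-Laird random-effects estimate (an assumption, not necessarily the estimator used by the authors) is recomputed with each study held out and compared with that study's own estimate.

```python
# Sketch of the leave-one-out idea behind validating a meta-analysis summary:
# re-estimate the pooled effect without study i and compare it with study i's
# own estimate. This is NOT the Vn statistic itself, only the resampling scheme.
import numpy as np

def dersimonian_laird(y, v):
    """Random-effects pooled estimate with the DerSimonian-Laird tau^2."""
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)
    w_re = 1.0 / (v + tau2)
    pooled = np.sum(w_re * y) / np.sum(w_re)
    return pooled, 1.0 / np.sum(w_re), tau2

def leave_one_out(y, v):
    """Standardised difference between each held-out study and the remaining pooled effect."""
    z = []
    for i in range(len(y)):
        keep = np.arange(len(y)) != i
        pooled, var_pooled, tau2 = dersimonian_laird(y[keep], v[keep])
        # the held-out study varies around the pooled effect with its own variance,
        # the between-study variance, and the uncertainty of the pooled estimate
        z.append((y[i] - pooled) / np.sqrt(v[i] + tau2 + var_pooled))
    return np.array(z)

# y: study effect estimates (e.g. log odds ratios); v: their within-study variances
y = np.array([0.42, 0.18, 0.55, 0.10, 0.33])
v = np.array([0.04, 0.06, 0.05, 0.03, 0.07])
print(leave_one_out(y, v))
```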

  7. Measuring the statistical validity of summary meta-analysis and meta-regression results for use in clinical practice.

    PubMed

    Willis, Brian H; Riley, Richard D

    2017-09-20

    An important question for clinicians appraising a meta-analysis is: are the findings likely to be valid in their own practice-does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity-where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple ('leave-one-out') cross-validation technique, we demonstrate how we may test meta-analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta-analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta-analysis and a tailored meta-regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within-study variance, between-study variance, study sample size, and the number of studies in the meta-analysis. Finally, we apply Vn to two published meta-analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta-analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.

  8. Hemostasis and Lipoprotein Indices Signify Exacerbated Lung Injury in TB With Diabetes Comorbidity.

    PubMed

    Dong, Zhengwei; Shi, Jingyun; Dorhoi, Anca; Zhang, Jie; Soodeen-Lalloo, Adiilah K; Tan, WenLing; Yin, Hongyun; Sha, Wei; Li, Weitong; Zheng, Ruijuan; Liu, Zhonghua; Yang, Hua; Qin, Lianhua; Wang, Jie; Huang, Xiaochen; Wu, Chunyan; Kaufmann, Stefan H E; Feng, Yonghong

    2018-05-01

    Exacerbated immunopathology is a frequent consequence of TB that is complicated by diabetes mellitus (DM); however, the underlying mechanisms are still poorly defined. In two age- and sex-matched groups of patients, one with both TB and DM (DM-TB) and one with TB but without DM, we microscopically evaluated the areas of caseous necrosis and graded the extent of perinecrotic fibrosis in lung biopsies from the sputum smear-negative (SN) patients. We scored acid-fast bacilli in sputum smear-positive (SP) patients and compiled CT scan data from both the SN and SP patients. We compared inflammatory biomarkers and routine hematologic and biochemical parameters. Binary logistic regression analyses were applied to define the indices associated with the extent of lung injury. Enlarged caseous necrotic areas with exacerbated fibrotic encapsulations were found in SN patients with DM-TB, consistent with the higher ratio of thick-walled cavities and more bacilli in the sputum from SP patients with DM-TB. Larger necrotic foci were detected in men compared with women within the SN TB groups. Significantly higher fibrinogen and lower high-density lipoprotein cholesterol (HDL-C) were observed in SN patients with DM-TB. Regression analyses revealed that diabetes, activation of the coagulation pathway (shown by increased platelet distribution width, decreased mean platelet volume, and shortened prothrombin time), and dyslipidemia (shown by decreased low-density lipoprotein cholesterol, HDL-C, and apolipoprotein A) are risk factors for severe lung lesions in both SN and SP patients with TB. Hemostasis and dyslipidemia are associated with granuloma necrosis and fibroplasia leading to exacerbated lung damage in TB, especially in patients with DM-TB. Copyright © 2017 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.

  9. Testing Mediation Using Multiple Regression and Structural Equation Modeling Analyses in Secondary Data

    ERIC Educational Resources Information Center

    Li, Spencer D.

    2011-01-01

    Mediation analysis in child and adolescent development research is possible using large secondary data sets. This article provides an overview of two statistical methods commonly used to test mediated effects in secondary analysis: multiple regression and structural equation modeling (SEM). Two empirical studies are presented to illustrate the…
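
    A minimal sketch of the regression-based route to a mediated effect (the SEM route would use dedicated software such as lavaan or semopy, not shown): regress the mediator on the predictor, regress the outcome on both, and bootstrap the product of the two coefficients. Variable names X, M and Y are hypothetical.

```python
# Minimal regression-based mediation sketch (hypothetical variables X, M, Y):
# a = effect of X on the mediator M; b = effect of M on Y controlling for X;
# the mediated (indirect) effect is a*b, here given a bootstrap interval.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def indirect_effect(df):
    a = smf.ols("M ~ X", data=df).fit().params["X"]
    b = smf.ols("Y ~ X + M", data=df).fit().params["M"]
    return a * b

rng = np.random.default_rng(0)
n = 300
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # X -> M path
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # M -> Y path plus a direct effect
df = pd.DataFrame({"X": x, "M": m, "Y": y})

boot = [indirect_effect(df.sample(frac=1.0, replace=True, random_state=i))
        for i in range(500)]
print("indirect effect a*b =", round(indirect_effect(df), 3),
      "95% bootstrap CI:", np.percentile(boot, [2.5, 97.5]).round(3))
```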

  10. An Exploratory Study of Face-to-Face and Cyberbullying in Sixth Grade Students

    ERIC Educational Resources Information Center

    Accordino, Denise B.; Accordino, Michael P.

    2011-01-01

    In a pilot study, sixth grade students (N = 124) completed a questionnaire assessing students' experience with bullying and cyberbullying, demographic information, quality of parent-child relationship, and ways they have dealt with bullying/cyberbullying in the past. Two multiple regression analyses were conducted. The multiple regression analysis…

  11. Selecting risk factors: a comparison of discriminant analysis, logistic regression and Cox's regression model using data from the Tromsø Heart Study.

    PubMed

    Brenn, T; Arnesen, E

    1985-01-01

    For comparative evaluation, discriminant analysis, logistic regression and Cox's model were used to select risk factors for total and coronary deaths among 6595 men aged 20-49 followed for 9 years. Groups with mortality between 5 and 93 per 1000 were considered. Discriminant analysis selected variable sets only marginally different from the logistic and Cox methods which always selected the same sets. A time-saving option, offered for both the logistic and Cox selection, showed no advantage compared with discriminant analysis. Analysing more than 3800 subjects, the logistic and Cox methods consumed, respectively, 80 and 10 times more computer time than discriminant analysis. When including the same set of variables in non-stepwise analyses, all methods estimated coefficients that in most cases were almost identical. In conclusion, discriminant analysis is advocated for preliminary or stepwise analysis, otherwise Cox's method should be used.

  12. Hypnotism as a Function of Trance State Effects, Expectancy, and Suggestibility: An Italian Replication.

    PubMed

    Pekala, Ronald J; Baglio, Francesca; Cabinio, Monia; Lipari, Susanna; Baglio, Gisella; Mendozzi, Laura; Cecconi, Pietro; Pugnetti, Luigi; Sciaky, Riccardo

    2017-01-01

    Previous research using stepwise regression analyses found self-reported hypnotic depth (srHD) to be a function of suggestibility, trance state effects, and expectancy. This study sought to replicate and expand that research using a general state measure of hypnotic responsivity, the Phenomenology of Consciousness Inventory: Hypnotic Assessment Procedure (PCI-HAP). Ninety-five participants completed an Italian translation of the PCI-HAP, with srHD scores predicted from the PCI-HAP assessment items. The regression analysis replicated the previous research results. Additionally, stepwise regression analyses were able to predict the srHD score equally well using only the PCI dimension scores. These results not only replicated prior research but suggest how this methodology to assess hypnotic responsivity, when combined with more traditional neurophysiological and cognitive-behavioral methodologies, may allow for a more comprehensive understanding of that enigma called hypnosis.

  13. Analyzing industrial energy use through ordinary least squares regression models

    NASA Astrophysics Data System (ADS)

    Golden, Allyson Katherine

    Extensive research has been performed using regression analysis and calibrated simulations to create baseline energy consumption models for residential buildings and commercial institutions. However, few attempts have been made to discuss the applicability of these methodologies to establish baseline energy consumption models for industrial manufacturing facilities. In the few studies of industrial facilities, the presented linear change-point and degree-day regression analyses illustrate ideal cases. It follows that there is a need in the established literature to discuss the methodologies and to determine their applicability for establishing baseline energy consumption models of industrial manufacturing facilities. The thesis determines the effectiveness of simple inverse linear statistical regression models when establishing baseline energy consumption models for industrial manufacturing facilities. Ordinary least squares change-point and degree-day regression methods are used to create baseline energy consumption models for nine different case studies of industrial manufacturing facilities located in the southeastern United States. The influence of ambient dry-bulb temperature and production on total facility energy consumption is observed. The energy consumption behavior of industrial manufacturing facilities is only sometimes sufficiently explained by temperature, production, or a combination of the two variables. This thesis also provides methods for generating baseline energy models that are straightforward and accessible to anyone in the industrial manufacturing community. The methods outlined in this thesis may be easily replicated by anyone that possesses basic spreadsheet software and general knowledge of the relationship between energy consumption and weather, production, or other influential variables. With the help of simple inverse linear regression models, industrial manufacturing facilities may better understand their energy consumption and production behavior, and identify opportunities for energy and cost savings. This thesis study also utilizes change-point and degree-day baseline energy models to disaggregate facility annual energy consumption into separate industrial end-user categories. The baseline energy model provides a suitable and economical alternative to sub-metering individual manufacturing equipment. One case study describes the conjoined use of baseline energy models and facility information gathered during a one-day onsite visit to perform an end-point energy analysis of an injection molding facility conducted by the Alabama Industrial Assessment Center. Applying baseline regression model results to the end-point energy analysis allowed the AIAC to better approximate the annual energy consumption of the facility's HVAC system.
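
    A minimal sketch of the change-point idea described above: a cooling-type baseline model with a constant base load plus a slope above a balance-point temperature, fitted by ordinary least squares with the change point chosen by grid search. The data are synthetic and the variable names are hypothetical; real baseline models would add production and other drivers as further regressors.

```python
# Sketch of a three-parameter cooling change-point baseline model:
# energy = base + slope * max(T - T_cp, 0), with the change point T_cp
# chosen by a grid search that minimises the OLS sum of squared errors.
import numpy as np

def fit_change_point(temp, energy, grid=None):
    if grid is None:
        grid = np.linspace(np.percentile(temp, 5), np.percentile(temp, 95), 50)
    best = None
    for t_cp in grid:
        x = np.maximum(temp - t_cp, 0.0)            # degree-day style regressor
        X = np.column_stack([np.ones_like(x), x])
        coef, *_ = np.linalg.lstsq(X, energy, rcond=None)
        sse = np.sum((energy - X @ coef) ** 2)
        if best is None or sse < best[0]:
            best = (sse, t_cp, coef)
    return best  # (sse, change point, [baseline load, slope])

rng = np.random.default_rng(1)
temp = rng.uniform(0, 35, 200)                       # dry-bulb temperatures, degC
energy = 120 + 4.0 * np.maximum(temp - 18, 0) + rng.normal(0, 5, 200)
sse, t_cp, coef = fit_change_point(temp, energy)
print(f"estimated change point {t_cp:.1f} degC, base load {coef[0]:.1f}, slope {coef[1]:.2f}")
```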

  14. Exploring the application of latent class cluster analysis for investigating pedestrian crash injury severities in Switzerland.

    PubMed

    Sasidharan, Lekshmi; Wu, Kun-Feng; Menendez, Monica

    2015-12-01

    One of the major challenges in traffic safety analyses is the heterogeneous nature of safety data, due to the sundry factors involved in it. This heterogeneity often leads to difficulties in interpreting results and conclusions due to unrevealed relationships. Understanding the underlying relationship between injury severities and influential factors is critical for the selection of appropriate safety countermeasures. A method commonly employed to address systematic heterogeneity is to focus on a subgroup of the data chosen according to the research purpose. However, this need not ensure homogeneity in the data. In this paper, latent class cluster analysis is applied to identify homogenous subgroups for a specific crash type: pedestrian crashes. The manuscript employs data from police-reported pedestrian crashes (2009-2012) in Switzerland. The analyses demonstrate that dividing the pedestrian severity data into seven clusters helps in reducing the systematic heterogeneity of the data and in understanding the hidden relationships between crash severity levels and socio-demographic, environmental, vehicle, temporal, and traffic factors, as well as the main reason for the crash. Pedestrian crash injury severity models were developed for the whole data set and for the individual clusters, and were compared using receiver operating characteristic curves, for which the results favored clustering. Overall, the study suggests that the latent class clustered regression approach is suitable for reducing heterogeneity and revealing important hidden relationships in traffic safety analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. A community-based intervention to reduce alcohol-related accidents and violence in 9th grade students in southern Sweden: the example of the Trelleborg project.

    PubMed

    Stafström, Martin; Ostergren, Per-Olof

    2008-05-01

    The purpose of the present study was to analyse whether a community-based intervention has led to a decrease in alcohol-related accidents and violence, and whether this was mediated by a reduction in excessive drinking and frequency of distilled spirits consumption. We applied logistic regression analyses to cross-sectional, non-repeated data, which were collected from a questionnaire distributed in classrooms to all 9th graders from 1999 to 2001, and in 2003 (n=1376, 724 boys and 652 girls; response rate=92.3%). All alcohol abstainers (n=330) were excluded from the analyses, making the sample 1046 individuals. The odds ratio for alcohol-related accidents was significantly lower, comparing the baseline year (1999) with 2003 (OR 0.5, 95% CI 0.27-0.76). There was also an indication that self-reported alcohol-related violence had decreased between 1999 and 2003 (OR 0.7, 95% CI 0.43-1.01). When controlling these estimates for excessive drinking and frequency of distilled spirits consumption, the differences between survey years were substantially reduced or even eliminated. In conclusion, the decrease in alcohol-related accidents and violence among 15-16-year-olds in Trelleborg between 1999 and 2003 is likely to be attributed to the identified reduction in excessive drinking and frequency of distilled spirits consumption.

  16. Identification of immune correlates of protection in Shigella infection by application of machine learning.

    PubMed

    Arevalillo, Jorge M; Sztein, Marcelo B; Kotloff, Karen L; Levine, Myron M; Simon, Jakub K

    2017-10-01

    Immunologic correlates of protection are important in vaccine development because they give insight into mechanisms of protection, assist in the identification of promising vaccine candidates, and serve as endpoints in bridging clinical vaccine studies. Our goal is the development of a methodology to identify immunologic correlates of protection using the Shigella challenge as a model. The proposed methodology utilizes the Random Forests (RF) machine learning algorithm as well as Classification and Regression Trees (CART) to detect immune markers that predict protection, identify interactions between variables, and define optimal cutoffs. Logistic regression modeling is applied to estimate the probability of protection and the confidence interval (CI) for such a probability is computed by bootstrapping the logistic regression models. The results demonstrate that the combination of Classification and Regression Trees and Random Forests complements the standard logistic regression and uncovers subtle immune interactions. Specific levels of immunoglobulin IgG antibody in blood on the day of challenge predicted protection in 75% (95% CI 67-86). Of those subjects that did not have blood IgG at or above a defined threshold, 100% were protected if they had IgA antibody secreting cells above a defined threshold. Comparison with the results obtained by applying only logistic regression modeling with standard Akaike Information Criterion for model selection shows the usefulness of the proposed method. Given the complexity of the immune system, the use of machine learning methods may enhance traditional statistical approaches. When applied together, they offer a novel way to quantify important immune correlates of protection that may help the development of vaccines. Copyright © 2017 Elsevier Inc. All rights reserved.
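
    A compressed sketch, on synthetic data, of the kind of workflow described: Random Forest importances to rank markers, a shallow CART tree to suggest cutoffs, and a bootstrapped logistic model for a confidence interval. The marker names, effect sizes and thresholds are hypothetical, and the code is illustrative only, not the authors' pipeline.

```python
# Rank markers with a Random Forest, let a small CART tree suggest cutoffs, then
# bootstrap a logistic model to interval-estimate the overall protection probability.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
igg = rng.lognormal(3, 1, n)            # hypothetical serum IgG levels
iga_asc = rng.lognormal(2, 1, n)        # hypothetical IgA antibody-secreting cells
protected = ((igg > 25) | (iga_asc > 12)).astype(int)
X = np.column_stack([igg, iga_asc])

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, protected)
print("RF importances (IgG, IgA ASC):", rf.feature_importances_.round(2))

cart = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, protected)
print(export_text(cart, feature_names=["IgG", "IgA_ASC"]))   # suggested cutoffs

probs = []
for b in range(200):                    # bootstrap the logistic model
    idx = rng.integers(0, n, n)
    lr = LogisticRegression(max_iter=1000).fit(X[idx], protected[idx])
    probs.append(lr.predict_proba(X)[:, 1].mean())
print("mean P(protected), 95% bootstrap CI:", np.percentile(probs, [2.5, 97.5]).round(2))
```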

  17. Factors that influence the efficacy of acarbose and metformin as initial therapy in Chinese patients with newly diagnosed type 2 diabetes: a subanalysis of the MARCH trial.

    PubMed

    Zhang, Jinping; Wang, Na; Xing, Xiaoyan; Yang, Zhaojun; Wang, Xin; Yang, Wenying

    2016-01-01

    To conduct a subanalysis of the randomized MARCH (Metformin and AcaRbose in Chinese as the initial Hypoglycemic treatment) trial to investigate whether specific characteristics are associated with the efficacy of either acarbose or metformin as initial therapy. A total of 657 type 2 diabetes patients who were randomly assigned to 48 weeks of therapy with either acarbose or metformin in the MARCH trial were divided into two groups based upon their hemoglobin A1c (HbA1c) levels at the end of follow-up: HbA1c <7% (<53 mmol/mol) and ≥7% (≥53 mmol/mol). Univariate, multivariate, and stepwise linear regression analyses were applied to identify the factors associated with treatment efficacy. Because this was a subanalysis, no measurement was performed. Univariate analysis showed that the efficacy of acarbose and metformin was influenced by HbA1c, fasting blood glucose (FBG), and 2 hour postprandial venous blood glucose (2hPPG) levels, as well as by changes in body mass index (BMI) (p ≤ 0.006). Multivariate analysis and stepwise linear regression analyses indicated that lower baseline 2hPPG values and greater changes in BMI were factors that positively influenced efficacy in both treatment groups (p ≤ 0.05). Stepwise regression model analysis also revealed that a lower baseline homeostasis model assessment-estimated insulin resistance (HOMA-IR) and higher serum insulin area under the curve (AUC) were factors positively influencing HbA1c normalization in all patients (p ≤ 0.032). Newly diagnosed type 2 diabetes patients with lower baseline 2hPPG and HOMA-IR values are more likely to achieve glucose control with acarbose or metformin treatment. Furthermore, the change in BMI after acarbose or metformin treatment is also a factor influencing HbA1c normalization. A prospective study with a larger sample size is necessary to confirm our results as well as measure β cell function and examine the influence of the patients' dietary habits.

  18. Digital soil classification and elemental mapping using imaging Vis-NIR spectroscopy: How to explicitly quantify stagnic properties of a Luvisol under Norway spruce

    NASA Astrophysics Data System (ADS)

    Kriegs, Stefanie; Buddenbaum, Henning; Rogge, Derek; Steffens, Markus

    2015-04-01

    Laboratory imaging Vis-NIR spectroscopy of soil profiles is a novel technique in soil science that can determine quantity and quality of various chemical soil properties with a hitherto unreached spatial resolution in undisturbed soil profiles. We have applied this technique to soil cores in order to obtain quantitative proof of redoximorphic processes under two different tree species and to prove tree-soil interactions at the microscale. Due to the imaging capabilities of Vis-NIR spectroscopy, a spatially explicit understanding of soil processes and properties can be achieved. Spatial heterogeneity of the soil profile can be taken into account. We took six 30 cm long rectangular soil columns of adjacent Luvisols derived from quaternary aeolian sediments (Loess) in a forest soil near Freising/Bavaria using stainless steel boxes (100×100×300 mm). Three profiles were sampled under Norway spruce and three under European beech. A hyperspectral camera (VNIR, 400-1000 nm in 160 spectral bands) with a spatial resolution of 63×63 µm² per pixel was used for data acquisition. Reference samples were taken at representative spots and analysed for organic carbon (OC) quantity and quality with a CN elemental analyser and for iron oxide (Fe) content using dithionite extraction followed by ICP-OES measurement. We compared two supervised classification algorithms, Spectral Angle Mapper and Maximum Likelihood, using different sets of training areas and spectral libraries. As established in chemometrics, we used multivariate analyses such as partial least-squares regression (PLSR) in addition to multivariate adaptive regression splines (MARS) to correlate the chemical data with the Vis-NIR spectra. As a result, elemental mapping of Fe and OC within the soil core at high spatial resolution has been achieved. The regression model was validated with a new set of reference samples for chemical analysis. Digital soil classification easily visualizes soil properties within the soil profiles. By combining both techniques, detailed soil maps, elemental balances and a deeper understanding of soil-forming processes at the microscale become feasible for complete soil profiles.

  19. Explorative spatial analysis of traffic accident statistics and road mortality among the provinces of Turkey.

    PubMed

    Erdogan, Saffet

    2009-10-01

    The aim of the study is to describe the inter-province differences in traffic accidents and mortality on roads of Turkey. Two different risk indicators were used to evaluate the road safety performance of the provinces in Turkey. These indicators are the ratios between the number of persons killed in road traffic accidents (1) or the number of accidents (2) (numerators) and the exposure to traffic risk (denominator). Population and the number of registered motor vehicles in the provinces were used as denominators individually. Spatial analyses were applied to the mean annual rate of deaths and to the number of fatal accidents calculated for the period of 2001-2006. Empirical Bayes smoothing was used to remove background noise from the raw death and accident rates because of the sparsely populated provinces and the small number of accidents and deaths in some provinces. Global and local spatial autocorrelation analyses were performed to show whether the provinces with high rates of deaths and accidents show clustering or are located closer by chance. The spatial distribution of provinces with high rates of deaths and accidents was nonrandom and detected as clustered with significance of P<0.05 with spatial autocorrelation analyses. Regions with a high concentration of fatal accidents and deaths were located in the provinces that contain the roads connecting the Istanbul, Ankara, and Antalya provinces. Accident and death rates were also modeled with independent variables such as the number of motor vehicles, length of roads, and so forth using geographically weighted regression analysis with forward step-wise elimination. The level of statistical significance was taken as P<0.05. Large differences were found between the rates of deaths and accidents according to denominators in the provinces. The geographically weighted regression analyses gave significantly better predictions for both accident rates and death rates than did ordinary least squares regression, as indicated by adjusted R² values. Geographically weighted regression provided adjusted R² values of 0.89-0.99 for death and accident rates, compared with 0.88-0.95, respectively, from ordinary least squares regression. Geographically weighted regression has the potential to reveal local patterns in the spatial distribution of rates, which would be ignored by the ordinary least squares regression approach. The application of spatial analysis and modeling of accident statistics and death rates at the provincial level in Turkey will help to identify provinces with outstandingly high accident and death rates. This could support more efficient road safety management in Turkey.
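
    Global spatial autocorrelation of province-level rates can be illustrated with Moran's I, computed here from first principles for a toy neighbourhood structure (geographically weighted regression itself usually relies on dedicated packages and is not reproduced). The weight matrix and rates below are hypothetical.

```python
# Minimal sketch of global Moran's I for area-level rates, assuming a binary
# contiguity weight matrix W (1 for neighbouring provinces, 0 otherwise).
import numpy as np

def morans_i(rates, W):
    z = rates - rates.mean()
    W = W / W.sum(axis=1, keepdims=True)        # row-standardise the weights
    num = (W * np.outer(z, z)).sum()
    return (len(rates) / W.sum()) * num / (z ** 2).sum()

# Toy example: 5 provinces on a line, each adjacent to its neighbours
W = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
rates = np.array([12.0, 11.0, 9.5, 4.0, 3.5])   # e.g. deaths per 100,000
print("Moran's I =", round(morans_i(rates, W), 3))
```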

  20. A comparison of three methods of assessing differential item functioning (DIF) in the Hospital Anxiety Depression Scale: ordinal logistic regression, Rasch analysis and the Mantel chi-square procedure.

    PubMed

    Cameron, Isobel M; Scott, Neil W; Adler, Mats; Reid, Ian C

    2014-12-01

    It is important for clinical practice and research that measurement scales of well-being and quality of life exhibit only minimal differential item functioning (DIF). DIF occurs where different groups of people endorse items in a scale to different extents after being matched on the intended scale attribute. We investigate the equivalence or otherwise of common methods of assessing DIF. Three methods of measuring age- and sex-related DIF (ordinal logistic regression, Rasch analysis and the Mantel χ² procedure) were applied to Hospital Anxiety Depression Scale (HADS) data pertaining to a sample of 1,068 patients consulting primary care practitioners. Three items were flagged by all three approaches as having either age- or sex-related DIF with a consistent direction of effect; a further three items that were identified did not meet stricter criteria for important DIF under at least one method. When applying strict criteria for significant DIF, ordinal logistic regression was slightly less sensitive. Ordinal logistic regression, Rasch analysis and contingency table methods yielded consistent results when identifying DIF in the HADS depression and HADS anxiety scales. Regardless of the methods applied, investigators should use a combination of statistical significance, magnitude of the DIF effect and investigator judgement when interpreting the results.
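
    The ordinal-logistic-regression approach to DIF can be sketched as a likelihood-ratio test: for one item, compare a model containing only the matching total score with one that also contains group membership (uniform DIF). The sketch below uses statsmodels' OrderedModel on simulated responses; item, total and group are hypothetical names, and a full analysis would also test the group-by-score interaction for non-uniform DIF.

```python
# Sketch of uniform-DIF testing for one ordinal item via ordinal logistic regression:
# model 1: item ~ total score; model 2: item ~ total score + group (e.g. sex).
# A likelihood-ratio test on the group term flags possible DIF.
import numpy as np
import pandas as pd
from scipy.stats import chi2
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 500
total = rng.normal(0, 1, n)                     # matching variable (scale total score)
group = rng.integers(0, 2, n)                   # 0/1 group indicator (e.g. sex)
latent = 1.2 * total + 0.6 * group + rng.logistic(size=n)              # 0.6 = built-in DIF
item = pd.cut(latent, bins=[-np.inf, -1, 1, 3, np.inf], labels=False)  # 4-category response

df = pd.DataFrame({"total": total, "group": group})
df["item"] = pd.Categorical(item, ordered=True)

m_null = OrderedModel(df["item"], df[["total"]], distr="logit").fit(method="bfgs", disp=False)
m_dif = OrderedModel(df["item"], df[["total", "group"]], distr="logit").fit(method="bfgs", disp=False)

lr = 2 * (m_dif.llf - m_null.llf)               # likelihood-ratio test for the group term
print("LR statistic = %.2f, p = %.4f" % (lr, chi2.sf(lr, df=1)))
```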

  1. Genetic parameters for growth characteristics of free-range chickens under univariate random regression models.

    PubMed

    Rovadoscki, Gregori A; Petrini, Juliana; Ramirez-Diaz, Johanna; Pertile, Simone F N; Pertille, Fábio; Salvian, Mayara; Iung, Laiza H S; Rodriguez, Mary Ana P; Zampar, Aline; Gaya, Leila G; Carvalho, Rachel S B; Coelho, Antonio A D; Savino, Vicente J M; Coutinho, Luiz L; Mourão, Gerson B

    2016-09-01

    Repeated measures from the same individual have been analyzed by using repeatability and finite dimension models under univariate or multivariate analyses. However, in the last decade, the use of random regression models for genetic studies with longitudinal data has become more common. Thus, the aim of this research was to estimate genetic parameters for body weight of four experimental chicken lines by using univariate random regression models. Body weight data from hatching to 84 days of age (n = 34,730) from four experimental free-range chicken lines (7P, Caipirão da ESALQ, Caipirinha da ESALQ and Carijó Barbado) were used. The analysis model included the fixed effects of contemporary group (gender and rearing system), fixed regression coefficients for age at measurement, and random regression coefficients for permanent environmental effects and additive genetic effects. Heterogeneous variances for residual effects were considered, and one residual variance was assigned to each of six subclasses of age at measurement. Random regression curves were modeled by using Legendre polynomials of the second and third orders, with the best model chosen based on the Akaike Information Criterion, Bayesian Information Criterion, and restricted maximum likelihood. Multivariate analyses under the same animal mixed model were also performed for the validation of the random regression models. The Legendre polynomials of second order were better for describing the growth curves of the lines studied. Moderate to high heritabilities (h² = 0.15 to 0.98) were estimated for body weight between one and 84 days of age, suggesting that body weight at all ages can be used as a selection criterion. Genetic correlations among body weight records obtained through multivariate analyses ranged from 0.18 to 0.96, 0.12 to 0.89, 0.06 to 0.96, and 0.28 to 0.96 in the 7P, Caipirão da ESALQ, Caipirinha da ESALQ, and Carijó Barbado chicken lines, respectively. Results indicate that genetic gain for body weight can be achieved by selection. Also, body weight at 42 days of age can be maintained as a selection criterion. © 2016 Poultry Science Association Inc.
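
    A rough sketch of the random regression idea, without the pedigree or genomic relationship information needed for a true animal model: body weight is regressed on a second-order Legendre basis of standardised age, with bird-specific random intercepts and random Legendre slopes fitted as a mixed model. All data and names below are synthetic placeholders.

```python
# Random regression sketch: fixed Legendre regression of weight on standardised age
# plus a random intercept and a random Legendre slope per bird (no genetic effects).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from numpy.polynomial import legendre

rng = np.random.default_rng(2)
n_birds, ages = 60, np.array([1, 7, 14, 28, 42, 63, 84])
rows = []
for bird in range(n_birds):
    a0, a1 = rng.normal(0, [120.0, 80.0])       # bird-specific intercept / slope deviations
    for age in ages:
        t = 2 * (age - ages.min()) / (ages.max() - ages.min()) - 1   # map age to [-1, 1]
        L0, L1, L2 = legendre.legvander([t], 2)[0]                   # Legendre basis values
        weight = 1200 * L0 + 1100 * L1 + 150 * L2 + a0 + a1 * L1 + rng.normal(0, 30)
        rows.append({"bird": bird, "age": age, "leg1": L1, "leg2": L2, "weight": weight})
df = pd.DataFrame(rows)

model = smf.mixedlm("weight ~ leg1 + leg2", df, groups=df["bird"], re_formula="~leg1")
print(model.fit().summary())
```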

  2. The theory of reasoned action and intention to seek cancer information.

    PubMed

    Ross, Levi; Kohler, Connie L; Grimley, Diane M; Anderson-Lewis, Charkarra

    2007-01-01

    To evaluate the applicability of the theory of reasoned action to explain men's intentions to seek prostate cancer information. Three hundred randomly selected African American men participated in telephone interviews. Correlational and regression analyses were conducted to examine relationships among measures. All relationships were significant in regression analyses. Attitudes and subjective norm were significantly related to intentions. Indirect measures of beliefs derived from elicitation research were associated with direct measures of attitude and subjective norms. The data are sufficiently clear to support the applicability of the theory for this behavioral domain with African American men and suggest several important areas for future research.

  3. An Investigation of the Relations Between Student Knowledge, Personal Contact, and Attitudes Toward Individuals with Schizophrenia

    PubMed Central

    Eack, Shaun M.; Newhill, Christina E.

    2013-01-01

    A survey of 118 MSW students was conducted to examine the relationship between social work students’ knowledge about, contact with, and attitudes toward persons with schizophrenia. Hierarchical regression analyses indicated that students’ knowledge about and contact with persons with schizophrenia were significantly related to better attitudes toward this population. Moderated multiple regression analyses revealed a significant interaction between knowledge about and contact with persons with schizophrenia, such that knowledge was only related to positive attitudes among students who had more personal contact with persons with the illness. Implications for social work training in severe mental illness are discussed. PMID:24353396
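
    A moderated multiple regression of the kind reported above reduces to adding a product term; the sketch below tests a knowledge-by-contact interaction on simulated data with hypothetical variable names.

```python
# Sketch of a moderated multiple regression: attitudes regressed on knowledge,
# contact, and their product term; a significant interaction coefficient means
# the knowledge-attitude association depends on the level of contact.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 118
knowledge = rng.normal(0, 1, n)
contact = rng.normal(0, 1, n)
attitude = 0.2 * knowledge + 0.3 * contact + 0.25 * knowledge * contact + rng.normal(0, 1, n)
df = pd.DataFrame({"attitude": attitude, "knowledge": knowledge, "contact": contact})

fit = smf.ols("attitude ~ knowledge * contact", data=df).fit()   # main effects + interaction
print(fit.params.round(3))
print("interaction p-value:", round(fit.pvalues["knowledge:contact"], 4))
```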

  4. Robust regression on noisy data for fusion scaling laws

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verdoolaege, Geert, E-mail: geert.verdoolaege@ugent.be; Laboratoire de Physique des Plasmas de l'ERM - Laboratorium voor Plasmafysica van de KMS

    2014-11-15

    We introduce the method of geodesic least squares (GLS) regression for estimating fusion scaling laws. Based on straightforward principles, the method is easily implemented, yet it clearly outperforms established regression techniques, particularly in cases of significant uncertainty on both the response and predictor variables. We apply GLS for estimating the scaling of the L-H power threshold, resulting in estimates for ITER that are somewhat higher than predicted earlier.

  5. Regression Model Optimization for the Analysis of Experimental Data

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.

    2009-01-01

    A candidate math model search algorithm was developed at Ames Research Center that determines a recommended math model for the multivariate regression analysis of experimental data. The search algorithm is applicable to classical regression analysis problems as well as wind tunnel strain gage balance calibration analysis applications. The algorithm compares the predictive capability of different regression models using the standard deviation of the PRESS residuals of the responses as a search metric. This search metric is minimized during the search. Singular value decomposition is used during the search to reject math models that lead to a singular solution of the regression analysis problem. Two threshold dependent constraints are also applied. The first constraint rejects math models with insignificant terms. The second constraint rejects math models with near-linear dependencies between terms. The math term hierarchy rule may also be applied as an optional constraint during or after the candidate math model search. The final term selection of the recommended math model depends on the regressor and response values of the data set, the user's function class combination choice, the user's constraint selections, and the result of the search metric minimization. A frequently used regression analysis example from the literature is used to illustrate the application of the search algorithm to experimental data.
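
    The PRESS-based search metric can be sketched directly: for ordinary least squares, the leave-one-out prediction residual of observation i equals e_i / (1 - h_ii), where h_ii is the i-th leverage, so the metric needs no explicit refitting. The example below compares the standard deviation of PRESS residuals for two hypothetical candidate models.

```python
# Sketch of the PRESS residual metric used to compare candidate regression models:
# for OLS the leave-one-out prediction residual is e_i / (1 - h_ii), with h_ii the
# i-th diagonal element of the hat matrix.
import numpy as np

def press_residuals(X, y):
    X = np.column_stack([np.ones(len(y)), X])        # add an intercept column
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    H = X @ np.linalg.pinv(X.T @ X) @ X.T             # hat matrix
    return resid / (1 - np.diag(H))

rng = np.random.default_rng(4)
x1, x2 = rng.normal(size=(2, 50))
y = 1.0 + 2.0 * x1 + rng.normal(0, 0.5, 50)

for name, X in [("x1 only", x1[:, None]), ("x1 + x2", np.column_stack([x1, x2]))]:
    print(name, "std of PRESS residuals:", press_residuals(X, y).std().round(3))
```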

  6. Using a social capital framework to enhance measurement of the nursing work environment.

    PubMed

    Sheingold, Brenda Helen; Sheingold, Steven H

    2013-07-01

    To develop, field test and analyse a social capital survey instrument for measuring the nursing work environment. The concept of social capital, which focuses on improving productive capacity by examining relationships and networks, may provide a promising framework to measure and evaluate the nurse work environment in a variety of settings. A survey instrument for measuring social capital in the nurse work environment was developed by adapting the World Bank's Social Capital - Integrated Questionnaire (SC-IQ). Exploratory factor analysis and multiple regression analyses were applied to assess the properties of the instrument. The exploratory factor analysis yielded five factors that align well with the social capital framework, while reflecting unique aspects of the nurse work environment. The results suggest that the social capital framework provides a promising context to assess the nurse work environment. Further work is needed to refine the instrument for a diverse range of health-care providers and to correlate social capital measures with quality of patient care. Social capital measurement of the nurse work environment has the potential to provide managers with an enhanced set of tools for building productive capacity in health-care organisations and achieving desired outcomes. © 2013 John Wiley & Sons Ltd.

  7. Gender inequalities in the health of immigrants and workplace discrimination in Czechia.

    PubMed

    Dzúrová, Dagmar; Drbohlav, Dušan

    2014-01-01

    This study analyses the relationship between immigrants' self-reported/rated health (SRH) and their perceived working conditions in Czechia materialized via discrimination, based on the example of Ukrainian immigrants analyzed by gender dimension. The role of age, education, and marital status is also analyzed. A sample of native-born Czechs serves as a reference frame. A cross-sectional design was applied. Using data from two surveys of Ukrainian immigrants in Czechia and a countrywide health interview survey for Czechs, we analyse inequalities in SRH and workplace discrimination loads. Four binary logistic regression models were computed separately for women and men from Ukraine and Czechia to identify the determinants of fair/poor SRH. We found that only Ukrainian immigrant females were heavily exposed to all four measured types of workplace discrimination, thereby modifying and worsening the quality of their SRH. Determinants which are behind respondents' SRH differ between Ukrainian immigrants vis-à-vis Czechs with one exception. The "oldest age group" (41-62) contributes to poorer assessment of SRH among Ukrainian females, Czech females, and Czech males too. The lowest educational level (primary education) correlates with poor SRH within the sample of Czech males.

  8. The impact of global budgeting on treatment intensity and outcomes.

    PubMed

    Kan, Kamhon; Li, Shu-Fen; Tsai, Wei-Der

    2014-12-01

    This paper investigates the effects of global budgets on the amount of resources devoted to cardio-cerebrovascular disease patients by hospitals of different ownership types and on these patients' outcomes. Theoretical models predict that hospitals have financial incentives to increase the quantity of treatments applied to patients. This is especially true for for-profit hospitals. If so, it is important to examine whether the increase in treatment quantity translates into better treatment outcomes. Our analyses take advantage of the National Health Insurance of Taiwan's implementation of global budgets for hospitals in 2002. Our data come from the National Health Insurance's claim records, covering the universe of hospitalized patients suffering acute myocardial infarction, ischemic heart disease, hemorrhagic stroke, and ischemic stroke. Regression analyses are carried out separately for government, private not-for-profit and for-profit hospitals. We find that for-profit hospitals and private not-for-profit hospitals did increase their treatment intensity for cardio-cerebrovascular disease patients after the 2002 implementation of global budgets. However, this was not accompanied by an improvement in these patients' mortality rates. This reveals a waste of medical resources and implies that aggregate expenditure caps should be supplemented by other designs to prevent resource misallocation.

  9. Exploring the potential uses of value-added metrics in the context of postgraduate medical education.

    PubMed

    Gregory, Simon; Patterson, Fiona; Baron, Helen; Knight, Alec; Walsh, Kieran; Irish, Bill; Thomas, Sally

    2016-10-01

    Increasing pressure is being placed on external accountability and cost efficiency in medical education and training internationally. We present an illustrative data analysis of the value-added of postgraduate medical education. We analysed historical selection (entry) and licensure (exit) examination results for trainees sitting the UK Membership of the Royal College of General Practitioners (MRCGP) licensing examination (N = 2291). Selection data comprised: a clinical problem solving test (CPST); a situational judgement test (SJT); and a selection centre (SC). Exit data comprised the applied knowledge test (AKT) from the MRCGP. Ordinary least squares (OLS) regression analyses were used to model differences in attainment in the AKT based on performance at selection (the value-added score). Results were aggregated to the regional level for comparisons. We discovered significant differences in the value-added score between regional training providers. Whilst three training providers conferred significant value-added, one training provider's value-added was significantly lower than would be predicted based on the attainment of trainees at selection. Value-added analysis in postgraduate medical education potentially offers useful information, although the methodology is complex, controversial, and has significant limitations. Developing the models further could offer important insights to support continuous improvement in medical education in future.

  10. The application of cat swarm optimisation algorithm in classifying small loan performance

    NASA Astrophysics Data System (ADS)

    Kencana, Eka N.; Kiswanti, Nyoman; Sari, Kartika

    2017-10-01

    It is common for banking systems to analyse the feasibility of a credit application before its approval. Although this process is done carefully, there is no guarantee that all credits will be repaid smoothly. This study aimed to assess the accuracy of the Cat Swarm Optimisation (CSO) algorithm in classifying the performance of small loans approved by Bank Rakyat Indonesia (BRI), one of several public banks in Indonesia. Data collected from 200 lenders were used in this work. The data matrix consists of 9 independent variables that represent the profile of the credit, and one categorical dependent variable that reflects the credit's performance. Prior to the analyses, the data were divided into two subsets of equal size. An ordinal logistic regression (OLR) procedure was applied to the first subset; 3 of the 9 independent variables, namely the amount of credit, the credit period, and the monthly income of the lender, proved to significantly affect credit performance. Using the significant parameter estimates from the OLR procedure as initial values for the observations in the second subset, the CSO procedure was then run. This procedure gave 76 percent classification accuracy for credit performance, better than the 64 percent obtained from the OLR procedure.

  11. The importance of personality and life-events in anxious depression: from trait to state anxiety.

    PubMed

    van der Veen, Date C; van Dijk, Silvia D M; Comijs, Hannie C; van Zelst, Willeke H; Schoevers, Robert A; Oude Voshaar, Richard C

    2017-11-01

    Anxious depression is associated with severe impairment and a poor prognosis. We hypothesize that recent life-events are associated with more anxiety in late-life depression and that this association is conditional upon the level of certain personality traits. Baseline data of the Netherlands Study of Depression in Older Persons (NESDO) were used. In 333 patients (≥60 years) suffering from a major depressive disorder, anxiety was assessed with the BAI, personality traits with the NEO-FFI and the Mastery Scale, and life-events with the Brugha questionnaire. Multiple linear regression analyses were applied with anxiety severity as the dependent variable and life-events and personality traits as independent variables. 147 patients (44.1%) had recently experienced one or more life-events. The presence of a life-event was not associated with anxiety (p = .161) or depression severity (p = .440). However, certain personality traits interacted with life-events in explaining anxiety severity. Stratified analyses showed that life-events were associated with higher anxiety levels in the case of high levels of neuroticism and openness and low levels of conscientiousness or mastery. In the face of a life-event, personality traits may play a central role in increased anxiety levels in late-life depression.

  12. Gender Inequalities in the Health of Immigrants and Workplace Discrimination in Czechia

    PubMed Central

    Dzúrová, Dagmar; Drbohlav, Dušan

    2014-01-01

    This study analyses the relationship between immigrants' self-reported/rated health (SRH) and their perceived working conditions in Czechia materialized via discrimination, based on the example of Ukrainian immigrants analyzed by gender dimension. The role of age, education, and marital status is also analyzed. A sample of native-born Czechs serves as a reference frame. A cross-sectional design was applied. Using data from two surveys of Ukrainian immigrants in Czechia and a countrywide health interview survey for Czechs, we analyse inequalities in SRH and workplace discrimination loads. Four binary logistic regression models were computed separately for women and men from Ukraine and Czechia to identify the determinants of fair/poor SRH. We found that only Ukrainian immigrant females were heavily exposed to all four measured types of workplace discrimination, thereby modifying and worsening the quality of their SRH. Determinants which are behind respondents' SRH differ between Ukrainian immigrants vis-à-vis Czechs with one exception. The “oldest age group” (41–62) contributes to poorer assessment of SRH among Ukrainian females, Czech females, and Czech males too. The lowest educational level (primary education) correlates with poor SRH within the sample of Czech males. PMID:25105125

  13. A serum protein-based algorithm for the detection of Alzheimer disease.

    PubMed

    O'Bryant, Sid E; Xiao, Guanghua; Barber, Robert; Reisch, Joan; Doody, Rachelle; Fairchild, Thomas; Adams, Perrie; Waring, Steven; Diaz-Arrastia, Ramon

    2010-09-01

    To develop an algorithm that separates patients with Alzheimer disease (AD) from controls. Longitudinal case-control study. The Texas Alzheimer's Research Consortium project. We analyzed serum protein-based multiplex biomarker data from 197 patients diagnosed with AD and 203 controls. The total sample was randomized equally into training and test sets, and random forest methods were applied to the training set to create a biomarker risk score. The biomarker risk score had a sensitivity and specificity of 0.80 and 0.91, respectively, and an area under the curve of 0.91 in detecting AD. When age, sex, education, and APOE status were added to the algorithm, the sensitivity, specificity, and area under the curve were 0.94, 0.84, and 0.95, respectively. These initial data suggest that serum protein-based biomarkers can be combined with clinical information to accurately classify AD. A disproportionate number of inflammatory and vascular markers were weighted most heavily in the analyses. Additionally, these markers consistently distinguished cases from controls in significance analysis of microarrays, logistic regression, and Wilcoxon analyses, suggesting the existence of an inflammatory-related endophenotype of AD that may provide targeted therapeutic opportunities for this subset of patients.

  14. Does activity limitation predict discharge destination for postacute care patients?

    PubMed

    Chang, Feng-Hang; Ni, Pengsheng; Jette, Alan M

    2014-09-01

    This study aimed to examine the ability of different domains of activity limitation to predict discharge destination (home vs. nonhome settings) 1 mo after hospital discharge for postacute rehabilitation patients. A secondary analysis was conducted using a data set of 518 adults with neurologic, lower extremity orthopedic, and complex medical conditions followed after discharge from a hospital into postacute care. Variables collected at baseline include activity limitations (basic mobility, daily activity, and applied cognitive function, measured by the Activity Measure for Post-Acute Care), demographics, diagnosis, and cognitive status. The discharge destination was recorded at 1 mo after being discharged from the hospital. Correlational analyses revealed that the 1-mo discharge destination was correlated with two domains of activity (basic mobility and daily activity) and cognitive status. However, multiple logistic regression and receiver operating characteristic curve analyses showed that basic mobility functioning performed the best in discriminating home vs. nonhome living. This study supported the evidence that basic mobility functioning is a critical determinant of discharge home for postacute rehabilitation patients. The Activity Measure for Post-Acute Care-basic mobility showed good usability in discriminating home vs. nonhome living. The findings shed light on the importance of basic mobility functioning in the discharge planning process.
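
    The discrimination comparison described above can be sketched by fitting a logistic model per activity domain and comparing areas under the ROC curve; the data and variable names below are simulated placeholders, not the Activity Measure for Post-Acute Care scores themselves.

```python
# Sketch of comparing activity domains as predictors of home vs. non-home discharge:
# fit a logistic model per domain and compare areas under the ROC curve.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 518
basic_mobility = rng.normal(50, 10, n)
applied_cognition = rng.normal(50, 10, n)
logit = 0.15 * (basic_mobility - 50) + 0.03 * (applied_cognition - 50)
home = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

for name, x in [("basic mobility", basic_mobility), ("applied cognition", applied_cognition)]:
    model = LogisticRegression().fit(x[:, None], home)
    auc = roc_auc_score(home, model.predict_proba(x[:, None])[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```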

  15. Chemometric models for the quantitative descriptive sensory analysis of Arabica coffee beverages using near infrared spectroscopy.

    PubMed

    Ribeiro, J S; Ferreira, M M C; Salva, T J G

    2011-02-15

    Mathematical models based on chemometric analyses of coffee beverage sensory data and the NIR spectra of 51 Arabica roasted coffee samples were generated with the aim of predicting the scores for acidity, bitterness, flavour, cleanliness, body and overall quality of the coffee beverage. Partial least squares (PLS) regression was used to construct the models. The ordered predictor selection (OPS) algorithm was applied to select the wavelengths for the regression model of each sensory attribute in order to take only significant regions into account. The regions of the spectrum defined as important for sensory quality were closely related to the NIR spectra of pure caffeine, trigonelline, 5-caffeoylquinic acid, cellulose, coffee lipids, sucrose and casein. The NIR analyses sustained that the relationships between the sensory characteristics of the beverage and the chemical composition of the roasted grain were as listed below: 1 - the lipids and proteins were closely related to the attribute body; 2 - the caffeine and chlorogenic acids were related to bitterness; 3 - the chlorogenic acids were related to acidity and flavour; 4 - the cleanliness and overall quality were related to caffeine, trigonelline, chlorogenic acid, polysaccharides, sucrose and protein. Copyright © 2010 Elsevier B.V. All rights reserved.
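
    A minimal sketch of a PLS calibration of one sensory score from NIR spectra, using cross-validated predictions to report an RMSECV; the spectra are synthetic and the ordered predictor selection (OPS) wavelength-selection step described above is not reproduced.

```python
# Sketch of a PLS calibration of one sensory score from NIR spectra (synthetic data;
# wavelength selection as in the OPS algorithm is not reproduced here).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(6)
n_samples, n_wavelengths = 51, 200
spectra = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)   # smooth-ish curves
score = 0.02 * spectra[:, 60] - 0.015 * spectra[:, 150] + rng.normal(0, 0.1, n_samples)

pls = PLSRegression(n_components=5)
pred = cross_val_predict(pls, spectra, score, cv=10).ravel()
rmsecv = np.sqrt(np.mean((pred - score) ** 2))
print(f"10-fold RMSECV for the sensory score: {rmsecv:.3f}")
```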

  16. Financial ties and concordance between results and conclusions in meta-analyses: retrospective cohort study.

    PubMed

    Yank, Veronica; Rennie, Drummond; Bero, Lisa A

    2007-12-08

    To determine whether financial ties to one drug company are associated with favourable results or conclusions in meta-analyses on antihypertensive drugs. Retrospective cohort study. Meta-analyses published up to December 2004 that were not duplicates and evaluated the effects of antihypertensive drugs compared with any comparator on clinical end points in adults. Financial ties were categorised as one drug company compared with all others. The main outcomes were the results and conclusions of meta-analyses, with both outcomes separately categorised as being favourable or not favourable towards the study drug. We also collected data on characteristics of meta-analyses that the literature suggested might be associated with favourable results or conclusions. 124 meta-analyses were included in the study, 49 (40%) of which had financial ties to one drug company. On univariate logistic regression analyses, meta-analyses of better methodological quality were more likely to have favourable results (odds ratio 1.16, 95% confidence interval 1.07 to 1.27). Although financial ties to one drug company were not associated with favourable results, such ties constituted the only characteristic significantly associated with favourable conclusions (4.09, 1.30 to 12.83). When controlling for other characteristics of meta-analyses in multiple logistic regression analyses, meta-analyses that had financial ties to one drug company remained more likely to report favourable conclusions (5.11, 1.54 to 16.92). Meta-analyses on antihypertensive drugs and with financial ties to one drug company are not associated with favourable results but are associated with favourable conclusions.

  17. Evaluating atmospheric blocking in the global climate model EC-Earth

    NASA Astrophysics Data System (ADS)

    Hartung, Kerstin; Hense, Andreas; Kjellström, Erik

    2013-04-01

    Atmospheric blocking is a phenomenon of the midlatitude troposphere which plays an important role in climate variability. Therefore, a correct representation of blocking in climate models is necessary, especially for evaluating the results of climate projections. In my master's thesis, a validation of blocking in the coupled climate model EC-Earth is performed. Blocking events are detected based on the Tibaldi-Molteni Index. At first, a comparison with the reanalysis dataset ERA-Interim is conducted. The blocking frequency as a function of longitude shows a small general underestimation of blocking in the model, a well-known problem. Scaife et al. (2011) proposed the correction of model bias as a way to solve this problem. However, applying the correction to the higher resolution EC-Earth model does not yield any improvement. Composite maps show a link between blocking events and surface variables. One example is the formation of a positive surface temperature anomaly north and a negative anomaly south of the blocking anticyclone. In winter the surface temperature in EC-Earth can be reproduced quite well, but in summer a cold bias over the inner-European ocean is present. Using generalized linear models (GLMs), I study the connection between regional blocking and global atmospheric variables further. GLMs have the advantage of being applicable to non-Gaussian variables. Therefore the blocking index at each longitude, which is Bernoulli distributed, can be analysed statistically with GLMs. I applied a logistic regression between the blocking index and the geopotential height at 500 hPa to study the teleconnection of blocking events at midlatitudes with global geopotential height. GLMs also offer the possibility of quantifying the connections shown in composite maps. The implementation of the logistic regression can even be expanded to a search for trends in blocking frequency, for example in the scenario simulations.
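
    The logistic regression described above is a GLM with a Bernoulli (binomial) response. A sketch under assumed inputs is given below; the data files and variable names are placeholders, not the thesis code.

    ```python
    # Logistic-regression GLM of a 0/1 blocking index at one longitude on
    # 500 hPa geopotential height, in the spirit of the analysis described.
    import numpy as np
    import statsmodels.api as sm

    blocked = np.load("blocking_index_lon0.npy")   # hypothetical: 0/1 per time step
    z500 = np.load("z500_predictor.npy")           # hypothetical: geopotential height anomaly

    X = sm.add_constant(z500)
    result = sm.GLM(blocked, X, family=sm.families.Binomial()).fit()
    print(result.summary())                        # coefficients on the logit scale
    ```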

  18. Are all quantitative postmarketing signal detection methods equal? Performance characteristics of logistic regression and Multi-item Gamma Poisson Shrinker.

    PubMed

    Berlin, Conny; Blanch, Carles; Lewis, David J; Maladorno, Dionigi D; Michel, Christiane; Petrin, Michael; Sarp, Severine; Close, Philippe

    2012-06-01

    The detection of safety signals with medicines is an essential activity to protect public health. Despite widespread acceptance, it is unclear whether recently applied statistical algorithms provide enhanced performance characteristics when compared with traditional systems. Novartis has adopted a novel system for automated signal detection on the basis of disproportionality methods within a safety data mining application (Empirica™ Signal System [ESS]). ESS uses two algorithms for routine analyses: empirical Bayes Multi-item Gamma Poisson Shrinker and logistic regression (LR). A model was developed comprising 14 medicines, categorized as "new" or "established." A standard was prepared on the basis of safety findings selected from traditional sources. ESS results were compared with the standard to calculate the positive predictive value (PPV), specificity, and sensitivity. PPVs of the lower one-sided 5% and 0.05% confidence limits of the Bayes geometric mean (EB05) and of the LR odds ratio (LR0005) almost coincided for all the drug-event combinations studied. There was no obvious difference comparing the PPV of the leading Medical Dictionary for Regulatory Activities (MedDRA) terms to the PPV for all terms. The PPV of narrow MedDRA query searches was higher than that for broad searches. The widely used threshold value of EB05 = 2.0 or LR0005 = 2.0 together with more than three spontaneous reports of the drug-event combination produced balanced results for PPV, sensitivity, and specificity. Consequently, performance characteristics were best for leading terms with narrow MedDRA query searches irrespective of applying Multi-item Gamma Poisson Shrinker or LR at a threshold value of 2.0. This research formed the basis for the configuration of ESS for signal detection at Novartis. Copyright © 2011 John Wiley & Sons, Ltd.
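
    The performance characteristics named above (PPV, sensitivity, specificity) are obtained by comparing flagged drug-event combinations against a reference standard. The sketch below is a generic illustration with invented numbers, not the ESS configuration.

    ```python
    # Performance of a simple flagging rule (score >= 2.0 and > 3 reports)
    # against a reference standard of known drug-event combinations.
    import numpy as np

    def signal_performance(flagged, reference):
        flagged, reference = np.asarray(flagged, bool), np.asarray(reference, bool)
        tp = np.sum(flagged & reference)     # true signals flagged
        fp = np.sum(flagged & ~reference)    # false alarms
        fn = np.sum(~flagged & reference)    # missed signals
        tn = np.sum(~flagged & ~reference)   # correctly not flagged
        return {"PPV": tp / (tp + fp),
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp)}

    score = np.array([2.5, 1.1, 3.0, 0.8])           # e.g. EB05 or LR0005 (invented values)
    n_reports = np.array([5, 2, 10, 1])
    reference = np.array([True, False, True, False])
    print(signal_performance((score >= 2.0) & (n_reports > 3), reference))
    ```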

  19. Old and New Ideas for Data Screening and Assumption Testing for Exploratory and Confirmatory Factor Analysis

    PubMed Central

    Flora, David B.; LaBrish, Cathy; Chalmers, R. Philip

    2011-01-01

    We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables, and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables. PMID:22403561
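
    As a small illustration of the classical case reviewed here, a linear common factor model for continuous variables, the sketch below fits an exploratory factor analysis to a hypothetical matrix of test scores; an ordinal analysis based on polychoric correlations would require specialized tooling and is not shown.

    ```python
    # Linear exploratory factor analysis of continuous variables (illustrative only).
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.preprocessing import StandardScaler

    X = np.load("cognitive_ability_scores.npy")     # hypothetical: subjects x 9 tests
    Xz = StandardScaler().fit_transform(X)          # screen and standardize first

    fa = FactorAnalysis(n_components=3, rotation="varimax").fit(Xz)
    loadings = fa.components_.T                     # variables x factors loading matrix
    print(np.round(loadings, 2))
    ```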

  20. Removing Batch Effects from Longitudinal Gene Expression - Quantile Normalization Plus ComBat as Best Approach for Microarray Transcriptome Data

    PubMed Central

    Müller, Christian; Schillert, Arne; Röthemeier, Caroline; Trégouët, David-Alexandre; Proust, Carole; Binder, Harald; Pfeiffer, Norbert; Beutel, Manfred; Lackner, Karl J.; Schnabel, Renate B.; Tiret, Laurence; Wild, Philipp S.; Blankenberg, Stefan

    2016-01-01

    Technical variation plays an important role in microarray-based gene expression studies, and batch effects explain a large proportion of this noise. It is therefore mandatory to eliminate technical variation while maintaining biological variability. Several strategies have been proposed for the removal of batch effects, although they have not been evaluated in large-scale longitudinal gene expression data. In this study, we aimed to identify a suitable method for batch effect removal in a large study of microarray-based longitudinal gene expression. Monocytic gene expression was measured in 1092 participants of the Gutenberg Health Study at baseline and at 5-year follow-up. Replicates of selected samples were measured at both time points to identify technical variability. Deming regression, Passing-Bablok regression, linear mixed models, non-linear models as well as ReplicateRUV and ComBat were applied to eliminate batch effects between replicates. In a second step, quantile normalization prior to batch effect correction was performed for each method. Technical variation between batches was evaluated by principal component analysis. Associations between body mass index and transcriptomes were calculated before and after batch removal. Results from association analyses were compared to evaluate maintenance of biological variability. Quantile normalization, separately performed in each batch, combined with ComBat successfully reduced batch effects and maintained biological variability. ReplicateRUV performed perfectly in the replicate data subset of the study, but failed when applied to all samples. All other methods did not substantially reduce batch effects in the replicate data subset. Quantile normalization plus ComBat appears to be a valuable approach for batch correction in longitudinal gene expression data. PMID:27272489
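
    Quantile normalization performed separately within each batch, the first step of the recommended approach, can be sketched as below; the file names and batch labels are hypothetical, and the subsequent ComBat step requires a dedicated implementation that is not reproduced here.

    ```python
    # Within-batch quantile normalization of a genes x samples expression matrix.
    import numpy as np
    import pandas as pd

    def quantile_normalize(df):
        ranks = df.rank(method="first").astype(int)    # per-sample ranks 1..n_genes
        sorted_vals = np.sort(df.to_numpy(), axis=0)   # each sample sorted independently
        reference = sorted_vals.mean(axis=1)           # mean expression at each rank
        return ranks.apply(lambda col: pd.Series(reference[col.to_numpy() - 1], index=col.index))

    expr = pd.read_csv("expression_matrix.csv", index_col=0)          # hypothetical: genes x samples
    batch = pd.read_csv("sample_batches.csv", index_col=0)["batch"]   # hypothetical: batch per sample
    normalized = pd.concat(
        [quantile_normalize(expr.loc[:, batch == b]) for b in batch.unique()], axis=1
    )[expr.columns]                                    # normalize per batch, restore column order
    # ComBat would then be applied to `normalized` before association analyses (not shown).
    ```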
