Sample records for regression analysis proved

  1. From Equal to Equivalent Pay: Salary Discrimination in Academia

    ERIC Educational Resources Information Center

    Greenfield, Ester

    1977-01-01

    Examines the federal statutes barring sex discrimination in employment and argues that the work of any two professors is comparable but not equal. Suggests using regression analysis to prove salary discrimination and discusses the legal justification for adopting regression analysis and the standard of comparable pay for comparable work.…

  2. A novel simple QSAR model for the prediction of anti-HIV activity using multiple linear regression analysis.

    PubMed

    Afantitis, Antreas; Melagraki, Georgia; Sarimveis, Haralambos; Koutentis, Panayiotis A; Markopoulos, John; Igglessi-Markopoulou, Olga

    2006-08-01

    A quantitative structure-activity relationship was obtained by applying Multiple Linear Regression Analysis to a series of 80 1-[2-hydroxyethoxy-methyl]-6-(phenylthio) thymine (HEPT) derivatives with significant anti-HIV activity. For the selection of the best among 37 different descriptors, the Elimination Selection Stepwise Regression Method (ES-SWR) was utilized. The resulting QSAR model (R²(CV) = 0.8160; S(PRESS) = 0.5680) proved to be very accurate both in training and predictive stages.
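
    The record reports a cross-validated R² and a PRESS-based standard error for a multiple-linear-regression QSAR model. The sketch below shows how such leave-one-out statistics are typically computed; the descriptor matrix and activity values are synthetic stand-ins, not the HEPT data, and the S(PRESS) denominator is a simplifying assumption.

      import numpy as np

      # Hypothetical descriptor matrix (80 compounds x 4 selected descriptors) and activities.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(80, 4))
      y = X @ np.array([1.2, -0.8, 0.5, 0.3]) + rng.normal(scale=0.5, size=80)

      def fit_ols(X, y):
          """Ordinary least squares with an intercept column."""
          A = np.column_stack([np.ones(len(X)), X])
          coef, *_ = np.linalg.lstsq(A, y, rcond=None)
          return coef

      def press_and_q2(X, y):
          """Leave-one-out PRESS and cross-validated R^2."""
          press = 0.0
          for i in range(len(y)):
              mask = np.arange(len(y)) != i
              coef = fit_ols(X[mask], y[mask])
              pred = coef[0] + X[i] @ coef[1:]
              press += (y[i] - pred) ** 2
          q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)
          s_press = np.sqrt(press / len(y))      # assumed denominator n
          return q2, s_press

      q2, s_press = press_and_q2(X, y)
      print(f"R2(CV) = {q2:.4f}, S(PRESS) = {s_press:.4f}")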

  3. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking.

    Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition the program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound.
    In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries. The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction- and decomposition-based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are:

    - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure.
    - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution.
    - An implementation of our technique using the LLVM compiler infrastructure, the klee Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6].
    - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
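
    The record mentions discharging equivalence queries with off-the-shelf SMT solvers such as Z3. As a toy illustration only (not the authors' tool chain), the following sketch uses the Z3 Python bindings to prove that two syntactically different expressions, standing in for an original and a refactored program fragment, agree on every input; it assumes the z3-solver package is installed.

      from z3 import Int, Solver, Not, unsat

      x = Int("x")

      # Two syntactically different but semantically equivalent expressions.
      f_old = x * 2 + 4
      f_new = (x + 2) * 2

      s = Solver()
      s.add(Not(f_old == f_new))          # ask the solver for a counterexample
      if s.check() == unsat:
          print("equivalent: no input distinguishes the two versions")
      else:
          print("not equivalent, counterexample:", s.model())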

  4. Linear regression analysis of survival data with missing censoring indicators.

    PubMed

    Wang, Qihua; Dinse, Gregg E

    2011-04-01

    Linear regression analysis has been studied extensively in a random censorship setting, but typically all of the censoring indicators are assumed to be observed. In this paper, we develop synthetic data methods for estimating regression parameters in a linear model when some censoring indicators are missing. We define estimators based on regression calibration, imputation, and inverse probability weighting techniques, and we prove all three estimators are asymptotically normal. The finite-sample performance of each estimator is evaluated via simulation. We illustrate our methods by assessing the effects of sex and age on the time to non-ambulatory progression for patients in a brain cancer clinical trial.

  5. Correlation Equation of Fault Size, Moment Magnitude, and Height of Tsunami Case Study: Historical Tsunami Database in Sulawesi

    NASA Astrophysics Data System (ADS)

    Julius, Musa, Admiral; Pribadi, Sugeng; Muzli, Muzli

    2018-03-01

    Sulawesi, one of the biggest islands in Indonesia, is located at the convergence of two macro plates, Eurasia and Pacific. NOAA and the Novosibirsk Tsunami Laboratory record more than 20 tsunamis in Sulawesi since 1820. Based on these data, the correlation between tsunami and earthquake parameters needs to be determined in order to validate all past events. The complete data on magnitudes, fault sizes and tsunami heights in this study were sourced from the NOAA and Novosibirsk tsunami databases, supplemented with the Pacific Tsunami Warning Center (PTWC) catalog. This study aims to find the correlation between moment magnitude, fault size and tsunami height by simple regression. The steps of this research are data collection, processing, and regression analysis. Results show that moment magnitude, fault size and tsunami height are strongly correlated. This analysis is sufficient to confirm the accuracy of the historical tsunami database for Sulawesi in NOAA, the Novosibirsk Tsunami Laboratory and PTWC.

  6. Regression: The Apple Does Not Fall Far From the Tree.

    PubMed

    Vetter, Thomas R; Schober, Patrick

    2018-05-15

    Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which, when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause-and-effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.
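
    Because the tutorial flags regression toward the mean as an often-underrecognized phenomenon, a small simulation (synthetic numbers, not from the tutorial) makes it concrete: subjects selected for an extreme first measurement tend to score closer to the population mean on a second measurement even when nothing has changed.

      import numpy as np

      rng = np.random.default_rng(1)
      true_value = rng.normal(loc=100, scale=10, size=10_000)    # latent trait
      first = true_value + rng.normal(scale=10, size=10_000)     # noisy measurement 1
      second = true_value + rng.normal(scale=10, size=10_000)    # noisy measurement 2

      extreme = first > 120            # select subjects with extreme first scores
      print("mean first score of selected group :", first[extreme].mean())
      print("mean second score of same group    :", second[extreme].mean())
      # The second mean is pulled back toward 100 even with no intervention:
      # regression toward the mean, not a treatment effect.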

  7. Interactions between cadmium and decabrominated diphenyl ether on blood cells count in rats-Multiple factorial regression analysis.

    PubMed

    Curcic, Marijana; Buha, Aleksandra; Stankovic, Sanja; Milovanovic, Vesna; Bulat, Zorica; Đukić-Ćosić, Danijela; Antonijević, Evica; Vučinić, Slavica; Matović, Vesna; Antonijevic, Biljana

    2017-02-01

    The objective of this study was to assess the toxicity of a Cd and BDE-209 mixture on haematological parameters in subacutely exposed rats and to determine the presence and type of interactions between these two chemicals using multiple factorial regression analysis. Furthermore, for the assessment of interaction type, an isobologram-based methodology was applied and compared with multiple factorial regression analysis. Chemicals were given by oral gavage to male Wistar rats weighing 200-240 g for 28 days. Animals were divided into 16 groups (8/group): a control vehiculum group; three groups of rats treated with 2.5, 7.5 or 15 mg Cd/kg/day, doses chosen on the basis of literature data and reflecting relatively high environmental Cd exposure; three groups of rats treated with 1000, 2000 or 4000 mg BDE-209/kg bw/day, doses proved to induce toxic effects in rats; and nine groups of animals treated with different mixtures of Cd and BDE-209 containing the doses of Cd and BDE-209 stated above. Blood samples were taken at the end of the experiment and red blood cell, white blood cell and platelet counts were determined. For interaction assessment, multiple factorial regression analysis and a fitted isobologram approach were used. In this study, we focused on multiple factorial regression analysis as a method for interaction assessment. We also investigated the interactions between Cd and BDE-209 by the derived model for the description of the obtained fitted isobologram curves. The current study indicated that co-exposure to Cd and BDE-209 can result in a significant decrease in RBC count, increase in WBC count and decrease in PLT count, when compared with controls. Multiple factorial regression analysis used for the assessment of interaction type between Cd and BDE-209 indicated synergism for the effect on RBC count and no interactions, i.e. additivity, for the effects on WBC and PLT counts. On the other hand, the isobologram-based approach showed slight antagonism for the effects on RBC and WBC, while no interactions were proved for the joint effect on PLT count. These results confirm that the assessment of interactions between chemicals in a mixture greatly depends on the concept or method used for this evaluation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. Analysis of cost regression and post-accident absence

    NASA Astrophysics Data System (ADS)

    Wojciech, Drozd

    2017-07-01

    The article presents issues related to the costs of work safety. It proves the thesis that economic aspects cannot be overlooked in effective management of occupational health and safety and that adequate expenditure on safety can bring tangible benefits to the company. A reliable analysis of this problem is essential for describing the problem of work safety, and the article attempts to carry it out using the procedures of mathematical statistics [1, 2, 3].

  9. [On the effectiveness of the homeopathic remedy Arnica montana].

    PubMed

    Lüdtke, Rainer; Hacke, Daniela

    2005-11-01

    Arnica montana is a homeopathic remedy often prescribed after traumata and injuries. To assess whether Arnica is effective beyond placebo and to identify factors which support or contradict this effectiveness. All prospective, controlled trials on the effectiveness of homeopathic Arnica were included. Overall effectiveness was assessed by meta-analysis and meta-regression techniques. 68 comparisons from 49 clinical trials show a significant effectiveness of Arnica in traumatic injuries in random effects meta-analysis (odds ratio [OR], 0.36; 95% confidence interval [CI], 0.24-0.55), but not in meta-regression models (OR, 0.37; CI, 0.11-1.24). We found no evidence for publication bias. Studies from Medline-listed journals and high-quality studies are less likely to report positive results (p = 0.0006 and p = 0.0167). The hypothesis that homeopathic Arnica is effective could neither be proved nor rejected. All trials were highly heterogeneous, meta-regression does not help to explain this heterogeneity substantially.
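
    The record pools odds ratios with a random-effects meta-analysis. Below is a compact sketch of the standard DerSimonian-Laird pooling on the log-odds-ratio scale, with made-up study estimates rather than the Arnica trial data, to show how the pooled OR and its confidence interval arise.

      import numpy as np

      # Hypothetical per-study log odds ratios and their variances.
      log_or = np.array([-1.1, -0.7, -0.2, -1.4, 0.1])
      var = np.array([0.20, 0.15, 0.30, 0.25, 0.10])

      w = 1.0 / var                                  # fixed-effect weights
      mu_fe = np.sum(w * log_or) / np.sum(w)
      Q = np.sum(w * (log_or - mu_fe) ** 2)          # heterogeneity statistic
      k = len(log_or)
      tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

      w_re = 1.0 / (var + tau2)                      # random-effects weights
      mu_re = np.sum(w_re * log_or) / np.sum(w_re)
      se_re = np.sqrt(1.0 / np.sum(w_re))

      or_pooled = np.exp(mu_re)
      lo, hi = np.exp([mu_re - 1.96 * se_re, mu_re + 1.96 * se_re])
      print(f"pooled OR = {or_pooled:.2f}, 95% CI = {lo:.2f}-{hi:.2f}")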

  10. Local linear regression for function learning: an analysis based on sample discrepancy.

    PubMed

    Cervellera, Cristiano; Macciò, Danilo

    2014-11-01

    Local linear regression models, a kind of nonparametric structure that locally performs a linear estimation of the target function, are analyzed in the context of empirical risk minimization (ERM) for function learning. The analysis is carried out with emphasis on geometric properties of the available data. In particular, the discrepancy of the observation points used both to build the local regression models and to compute the empirical risk is considered. This makes it possible to treat indifferently the case in which the samples come from a random external source and the one in which the input space can be freely explored. Both consistency of the ERM procedure and approximating capabilities of the estimator are analyzed, proving conditions to ensure convergence. Since the theoretical analysis shows that the estimation improves as the discrepancy of the observation points becomes smaller, low-discrepancy sequences, a family of sampling methods commonly employed for efficient numerical integration, are also analyzed. Simulation results involving two different examples of function learning are provided.
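
    A local linear model of the kind analyzed here fits a weighted straight line around each query point. The one-dimensional sketch below uses a Gaussian kernel; the bandwidth, test function and sample size are arbitrary illustrative choices, not taken from the paper.

      import numpy as np

      def local_linear(x_query, x_obs, y_obs, bandwidth=0.1):
          """Local linear estimate at x_query using a Gaussian kernel."""
          w = np.exp(-0.5 * ((x_obs - x_query) / bandwidth) ** 2)
          A = np.column_stack([np.ones_like(x_obs), x_obs - x_query])
          W = np.diag(w)
          beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y_obs)
          return beta[0]                 # intercept = estimate at the query point

      rng = np.random.default_rng(2)
      x = rng.uniform(0, 1, 200)         # observation points (their discrepancy matters)
      y = np.sin(2 * np.pi * x) + rng.normal(scale=0.1, size=200)
      print(local_linear(0.25, x, y))    # should be close to sin(pi/2) = 1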

  11. [New method of mixed gas infrared spectrum analysis based on SVM].

    PubMed

    Bai, Peng; Xie, Wen-Jun; Liu, Jun-Hua

    2007-07-01

    A new method of infrared spectrum analysis based on support vector machine (SVM) for mixture gas was proposed. The kernel function in SVM was used to map the seriously overlapping absorption spectrum into a high-dimensional space; after transformation, the high-dimensional data could be processed in the original space, so a regression calibration model was established and then applied to analyze the concentrations of the component gases. Meanwhile, it was proved that the SVM regression calibration model could also be used for component recognition of mixture gas. The method was applied to the analysis of different data samples. Some factors that affect the model, such as scan interval, range of the wavelength, kernel function and penalty coefficient C, were discussed. Experimental results show that the maximal mean absolute error of component concentration is 0.132%, and the component recognition accuracy is higher than 94%. The problems of overlapping absorption spectra, using the same method for qualitative and quantitative analysis, and the limited number of training samples were solved. The method could be used in other mixture gas infrared spectrum analyses, with promising theoretical and application value.

  12. Influence of storage conditions on the stability of monomeric anthocyanins studied by reversed-phase high-performance liquid chromatography.

    PubMed

    Morais, Helena; Ramos, Cristina; Forgács, Esther; Cserháti, Tibor; Oliviera, José

    2002-04-25

    The effect of light, storage time and temperature on the decomposition rate of monomeric anthocyanin pigments extracted from skins of grape (Vitis vinifera var. Red globe) was determined by reversed-phase high-performance liquid chromatography (RP-HPLC). The impact of various storage conditions on the pigment stability was assessed by stepwise regression analysis. RP-HPLC separated well the five anthocyanins identified and proved the presence of other unidentified pigments at lower concentrations. Stepwise regression analysis confirmed that the overall decomposition rate of monomeric anthocyanins, peonidin-3-glucoside and malvidin-3-glucoside significantly depended on the time and temperature of storage, the effect of storage time being the most important. The presence or absence of light exerted a negligible impact on the decomposition rate.

  13. Analysis and improvement measures of flight delay in China

    NASA Astrophysics Data System (ADS)

    Zang, Yuhang

    2017-03-01

    Firstly, this paper establishes a principal component regression model to analyze the data quantitatively, using principal component analysis to obtain the three principal component factors of flight delays. The least squares method is then used to analyze the factors, and the regression equation is obtained by substitution; this shows that the main reason for flight delays is the airlines, followed by weather and traffic. Addressing these problems, this paper improves the controllable aspects of traffic flow control. For traffic flow control, an adaptive genetic queuing model is established for the runway terminal area. An optimization method is established for fifteen planes landing simultaneously on three runways, based on Beijing Capital International Airport; by comparing the results with the existing FCFS algorithm, the superiority of the model is proved.

  14. Effects of Corporate Social Responsibility and Governance on Its Credit Ratings

    PubMed Central

    Kim, Dong-young

    2014-01-01

    This study reviews the impact of corporate social responsibility (CSR) and corporate governance on a firm's credit rating. The results of a regression analysis of credit ratings on the relevant primary independent variables show that both factors have significant effects. As we predicted, both regression coefficients have a positive sign (+), proving that companies with excellent CSR and governance index (CGI) scores have higher credit ratings, and vice versa. The results show that nonfinancial information may also affect corporate credit ratings. Investment in personal data protection could be an example of CSR/CGI activities that have positive effects on corporate credit ratings. PMID:25401134

  15. Effects of corporate social responsibility and governance on its credit ratings.

    PubMed

    Kim, Dong-young; Kim, JeongYeon

    2014-01-01

    This study reviews the impact of corporate social responsibility (CSR) and corporate governance on a firm's credit rating. The results of a regression analysis of credit ratings on the relevant primary independent variables show that both factors have significant effects. As we predicted, both regression coefficients have a positive sign (+), proving that companies with excellent CSR and governance index (CGI) scores have higher credit ratings, and vice versa. The results show that nonfinancial information may also affect corporate credit ratings. Investment in personal data protection could be an example of CSR/CGI activities that have positive effects on corporate credit ratings.

  16. Radiomorphometric analysis of frontal sinus for sex determination.

    PubMed

    Verma, Saumya; Mahima, V G; Patil, Karthikeya

    2014-09-01

    Sex determination of unknown individuals carries crucial significance in forensic research, in cases where fragments of skull persist with no likelihood of identification based on the dental arch. In these instances sex determination becomes important to rule out a certain number of possibilities instantly and helps in establishing a biological profile of human remains. The aim of the study is to evaluate a mathematical method based on logistic regression analysis capable of ascertaining the sex of individuals in the South Indian population. The study was conducted in the department of Oral Medicine and Radiology. The right and left areas, maximum height, and width of the frontal sinus were determined in 100 Caldwell views of 50 women and 50 men aged 20 years and above, with the help of Vernier callipers and a square grid with 1 square measuring 1 mm² in area. Statistical analysis comprised Student's t-test and logistic regression analysis. The mean values of the variables were greater in men, based on Student's t-test at the 5% level of significance. The mathematical model based on logistic regression analysis gave a percentage agreement for correctly predicting the female gender of 55.2% using total area, 60.9% using right area and 55.2% using left area. The areas of the frontal sinus and the logistic regression proved to be unreliable in sex determination (Logit = 0.924 - 0.00217 × right area).
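
    The record closes with the fitted equation Logit = 0.924 - 0.00217 × right area; turning that logit into a predicted probability is a one-line transform, shown below with an invented right-sinus area and an assumed 0.5 classification cut-off (neither is from the paper).

      import math

      def predicted_probability(right_area_mm2):
          """Probability implied by the reported model Logit = 0.924 - 0.00217 * right area."""
          logit = 0.924 - 0.00217 * right_area_mm2
          return 1.0 / (1.0 + math.exp(-logit))

      p = predicted_probability(300.0)           # hypothetical right sinus area in mm^2
      print(f"predicted probability = {p:.2f}")  # classify against a 0.5 cut-off if desired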

  17. Optimizing methods for linking cinematic features to fMRI data.

    PubMed

    Kauttonen, Janne; Hlushchuk, Yevhen; Tikka, Pia

    2015-04-15

    One of the challenges of naturalistic neurosciences using movie-viewing experiments is how to interpret observed brain activations in relation to the multiplicity of time-locked stimulus features. As previous studies have shown less inter-subject synchronization across viewers of random video footage than story-driven films, new methods need to be developed for analysis of less story-driven contents. To optimize the linkage between our fMRI data collected during viewing of a deliberately non-narrative silent film 'At Land' by Maya Deren (1944) and its annotated content, we combined the method of elastic-net regularization with model-driven linear regression and the well-established data-driven independent component analysis (ICA) and inter-subject correlation (ISC) methods. In the linear regression analysis, both IC and region-of-interest (ROI) time-series were fitted with time-series of a total of 36 binary-valued and one real-valued tactile annotation of film features. Elastic-net regularization and cross-validation were applied in the ordinary least-squares linear regression in order to avoid over-fitting due to the multicollinearity of regressors; the results were compared against both partial least-squares (PLS) regression and the un-regularized full-model regression. A non-parametric permutation testing scheme was applied to evaluate the statistical significance of the regression. We found statistically significant correlation between the annotation model and 9 ICs out of 40 ICs. Regression analysis was also repeated for a large set of cubic ROIs covering the grey matter. Both IC- and ROI-based regression analyses revealed activations in parietal and occipital regions, with additional smaller clusters in the frontal lobe. Furthermore, we found elastic-net based regression more sensitive than PLS and un-regularized regression, since it detected a larger number of significant ICs and ROIs. Along with the ISC ranking methods, our regression analysis proved to be a feasible method for ordering the ICs based on their functional relevance to the annotated cinematic features. The novelty of our method, in comparison to hypothesis-driven manual pre-selection and observation of some individual regressors biased by choice, is in applying a data-driven approach to all content features simultaneously. We found especially the combination of regularized regression and ICA useful when analyzing fMRI data obtained using a non-narrative movie stimulus with a large set of complex and correlated features. Copyright © 2015. Published by Elsevier Inc.
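
    A minimal sketch of elastic-net regression with cross-validated regularization, in the spirit of the regularized fitting described above. The synthetic matrix stands in for the 37 annotation time-series and a single ROI signal; variable names, sizes and the scikit-learn ElasticNetCV call are illustrative assumptions, not the authors' pipeline.

      import numpy as np
      from sklearn.linear_model import ElasticNetCV

      rng = np.random.default_rng(3)
      n_timepoints, n_features = 240, 37
      annotations = rng.normal(size=(n_timepoints, n_features))    # stand-in design matrix
      roi_signal = annotations[:, :3] @ np.array([0.8, -0.5, 0.3]) \
          + rng.normal(scale=0.5, size=n_timepoints)               # stand-in ROI time-series

      model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5).fit(annotations, roi_signal)
      print("selected alpha:", model.alpha_)
      print("non-zero regressors:", np.flatnonzero(model.coef_))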

  18. Differential gene expression detection and sample classification using penalized linear regression models.

    PubMed

    Wu, Baolin

    2006-02-15

    Differential gene expression detection and sample classification using microarray data have received much research interest recently. Owing to the large number of genes p and small number of samples n (p > n), microarray data analysis poses big challenges for statistical analysis. An obvious problem owing to the 'large p, small n' setting is over-fitting. Just by chance, we are likely to find some non-differentially expressed genes that can classify the samples very well. The idea of shrinkage is to regularize the model parameters to reduce the effects of noise and produce reliable inferences. Shrinkage has been successfully applied in microarray data analysis. The SAM statistics proposed by Tusher et al. and the 'nearest shrunken centroid' proposed by Tibshirani et al. are ad hoc shrinkage methods. Both methods are simple, intuitive and prove to be useful in empirical studies. Recently Wu proposed penalized t/F-statistics with shrinkage by formally using L1-penalized linear regression models for two-class microarray data, showing good performance. In this paper we systematically discuss the use of penalized regression models for analyzing microarray data. We generalize the two-class penalized t/F-statistics proposed by Wu to multi-class microarray data. We formally derive the ad hoc shrunken centroid used by Tibshirani et al. using the L1-penalized regression models. And we show that the penalized linear regression models provide a rigorous and unified statistical framework for sample classification and differential gene expression detection.
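
    The nearest shrunken centroid and the L1-penalized statistics discussed here both reduce to soft-thresholding of per-gene statistics. The toy snippet below shows that operator in isolation; the threshold value and the statistics are made up for illustration.

      import numpy as np

      def soft_threshold(d, delta):
          """Shrink each statistic toward zero by delta: the shrinkage behind L1 penalties."""
          return np.sign(d) * np.maximum(np.abs(d) - delta, 0.0)

      t_stats = np.array([4.2, -0.3, 1.1, -2.8, 0.05])   # hypothetical per-gene statistics
      print(soft_threshold(t_stats, delta=1.0))
      # Genes whose shrunken statistic stays non-zero are retained as differentially expressed.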

  19. Independent contrasts and PGLS regression estimators are equivalent.

    PubMed

    Blomberg, Simon P; Lefevre, James G; Wells, Jessie A; Waterhouse, Mary

    2012-05-01

    We prove that the slope parameter of the ordinary least squares regression of phylogenetically independent contrasts (PICs) conducted through the origin is identical to the slope parameter of the method of generalized least squares (GLSs) regression under a Brownian motion model of evolution. This equivalence has several implications: 1. Understanding the structure of the linear model for GLS regression provides insight into when and why phylogeny is important in comparative studies. 2. The limitations of the PIC regression analysis are the same as the limitations of the GLS model. In particular, phylogenetic covariance applies only to the response variable in the regression and the explanatory variable should be regarded as fixed. Calculation of PICs for explanatory variables should be treated as a mathematical idiosyncrasy of the PIC regression algorithm. 3. Since the GLS estimator is the best linear unbiased estimator (BLUE), the slope parameter estimated using PICs is also BLUE. 4. If the slope is estimated using different branch lengths for the explanatory and response variables in the PIC algorithm, the estimator is no longer the BLUE, so this is not recommended. Finally, we discuss whether or not and how to accommodate phylogenetic covariance in regression analyses, particularly in relation to the problem of phylogenetic uncertainty. This discussion is from both frequentist and Bayesian perspectives.
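
    The equivalence result concerns the generalized least squares slope under a Brownian-motion covariance matrix. A small numerical sketch of that GLS estimator follows; the covariance matrix V and trait values are invented for four species, not derived from a real tree.

      import numpy as np

      # Hypothetical phylogenetic covariance (shared branch lengths) for 4 species.
      V = np.array([[1.0, 0.6, 0.1, 0.1],
                    [0.6, 1.0, 0.1, 0.1],
                    [0.1, 0.1, 1.0, 0.7],
                    [0.1, 0.1, 0.7, 1.0]])
      x = np.array([1.0, 1.3, 2.1, 2.4])     # explanatory trait
      y = np.array([0.9, 1.5, 2.0, 2.6])     # response trait

      X = np.column_stack([np.ones_like(x), x])
      Vinv = np.linalg.inv(V)
      beta = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)   # GLS estimator
      print("GLS intercept and slope:", beta)
      # The paper proves this slope equals the through-the-origin OLS slope of the PICs.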

  20. Identification of molecular markers associated with mite resistance in coconut (Cocos nucifera L.).

    PubMed

    Shalini, K V; Manjunatha, S; Lebrun, P; Berger, A; Baudouin, L; Pirany, N; Ranganath, R M; Prasad, D Theertha

    2007-01-01

    Coconut mite (Aceria guerreronis 'Keifer') has become a major threat to Indian coconut (Cocos nucifera L.) cultivators and the processing industry. Chemical and biological control measures have proved to be costly, ineffective, and ecologically undesirable. Planting mite-resistant coconut cultivars is the most effective method of preventing yield loss and should form a major component of any integrated pest management stratagem. Coconut genotypes, and mite-resistant and -susceptible accessions were collected from different parts of South India. Thirty-two simple sequence repeat (SSR) and 7 RAPD primers were used for molecular analyses. In single-marker analysis, 9 SSR and 4 RAPD markers associated with mite resistance were identified. In stepwise multiple regression analysis of SSRs, a combination of 6 markers showed 100% association with mite infestation. Stepwise multiple regression analysis for RAPD data revealed that a combination of 3 markers accounted for 83.86% of mite resistance in the selected materials. Combined stepwise multiple regression analysis of RAPD and SSR data showed that a combination of 5 markers explained 100% of the association with mite resistance in coconut. Markers associated with mite resistance are important in coconut breeding programs and will facilitate the selection of mite-resistant plants at an early stage as well as mother plants for breeding programs.

  1. Identification of pesticide varieties by testing microalgae using Visible/Near Infrared Hyperspectral Imaging technology

    NASA Astrophysics Data System (ADS)

    Shao, Yongni; Jiang, Linjun; Zhou, Hong; Pan, Jian; He, Yong

    2016-04-01

    In our study, the feasibility of using visible/near infrared hyperspectral imaging technology to detect the changes of the internal components of Chlorella pyrenoidosa, so as to determine the varieties of pesticides (such as butachlor, atrazine and glyphosate) at three concentrations (0.6 mg/L, 3 mg/L, 15 mg/L), was investigated. Three models (partial least squares discriminant analysis combined with full wavelengths, FW-PLSDA; partial least squares discriminant analysis combined with the competitive adaptive reweighted sampling algorithm, CARS-PLSDA; linear discriminant analysis combined with regression coefficients, RC-LDA) were built from the hyperspectral data of Chlorella pyrenoidosa to find which model produced the most optimal result. The RC-LDA model, which achieved an average correct classification rate of 97.0%, was superior to FW-PLSDA (72.2%) and CARS-PLSDA (84.0%), and it proved that visible/near infrared hyperspectral imaging could be a rapid and reliable technique to identify pesticide varieties. It also proved that microalgae can be a very promising medium for indicating characteristics of pesticides.

  2. Fuzzy Regression Prediction and Application Based on Multi-Dimensional Factors of Freight Volume

    NASA Astrophysics Data System (ADS)

    Xiao, Mengting; Li, Cheng

    2018-01-01

    Based on the reality of the development of air cargo, a multi-dimensional fuzzy regression method is used to determine the influencing factors, and the three most important influencing factors are identified as GDP, total fixed-asset investment and regular flight route mileage. Using a systems viewpoint and analogy methods, fuzzy numbers and multiple regression are combined to predict civil aviation cargo volume. Comparison with the 13th Five-Year Plan for China's Civil Aviation Development (2016-2020) shows that this method can effectively improve forecasting accuracy and reduce forecasting risk, and that the model is feasible for predicting civil aviation freight volume, with high practical significance and operability.

  3. Evaluation of in-line Raman data for end-point determination of a coating process: Comparison of Science-Based Calibration, PLS-regression and univariate data analysis.

    PubMed

    Barimani, Shirin; Kleinebudde, Peter

    2017-10-01

    A multivariate analysis method, Science-Based Calibration (SBC), was used for the first time for endpoint determination of a tablet coating process using Raman data. Two types of tablet cores, placebo and caffeine cores, received a coating suspension comprising a polyvinyl alcohol-polyethylene glycol graft-copolymer and titanium dioxide to a maximum coating thickness of 80 µm. Raman spectroscopy was used as an in-line PAT tool. The spectra were acquired every minute and correlated to the amount of applied aqueous coating suspension. SBC was compared to another well-known multivariate analysis method, Partial Least Squares regression (PLS), and a simpler approach, Univariate Data Analysis (UVDA). All developed calibration models had coefficient of determination values (R²) higher than 0.99. The coating endpoints could be predicted with root mean square errors (RMSEP) of less than 3.1% of the applied coating suspensions. Compared to PLS and UVDA, SBC proved to be an alternative multivariate calibration method with high predictive power. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Reduction of interferences in graphite furnace atomic absorption spectrometry by multiple linear regression modelling

    NASA Astrophysics Data System (ADS)

    Grotti, Marco; Abelmoschi, Maria Luisa; Soggia, Francesco; Tiberiade, Christian; Frache, Roberto

    2000-12-01

    The multivariate effects of Na, K, Mg and Ca as nitrates on the electrothermal atomisation of manganese, cadmium and iron were studied by multiple linear regression modelling. Since the models proved to efficiently predict the effects of the considered matrix elements in a wide range of concentrations, they were applied to correct the interferences occurring in the determination of trace elements in seawater after pre-concentration of the analytes. In order to obtain a statistically significant number of samples, a large volume of the certified seawater reference materials CASS-3 and NASS-3 was treated with Chelex-100 resin; then, the chelating resin was separated from the solution, divided into several sub-samples, each of them was eluted with nitric acid and analysed by electrothermal atomic absorption spectrometry (for trace element determinations) and inductively coupled plasma optical emission spectrometry (for matrix element determinations). To minimise any other systematic error besides that due to matrix effects, accuracy of the pre-concentration step and contamination levels of the procedure were checked by inductively coupled plasma mass spectrometric measurements. Analytical results obtained by applying the multiple linear regression models were compared with those obtained with other calibration methods, such as external calibration using acid-based standards, external calibration using matrix-matched standards and the analyte addition technique. Empirical models proved to efficiently reduce interferences occurring in the analysis of real samples, allowing an improvement of accuracy better than for other calibration methods.
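
    A sketch of the kind of multiple linear regression used here: the observed analyte signal is modelled as a linear function of the matrix-element concentrations, and the fitted model is then used to correct a new measurement. All numbers are invented for illustration, not the CASS-3/NASS-3 data.

      import numpy as np

      rng = np.random.default_rng(4)
      # Columns: Na, K, Mg, Ca concentrations in the measured solutions (arbitrary units).
      matrix_conc = rng.uniform(0, 10, size=(30, 4))
      true_signal = 100.0
      # Invented suppression coefficients for the four matrix elements, plus noise.
      observed = true_signal + matrix_conc @ np.array([-1.5, -0.8, -0.4, -1.1]) \
          + rng.normal(scale=1.0, size=30)

      A = np.column_stack([np.ones(30), matrix_conc])
      coef, *_ = np.linalg.lstsq(A, observed, rcond=None)

      # Correct a new measurement by removing the predicted matrix contribution.
      new_matrix = np.array([5.0, 2.0, 1.0, 3.0])
      new_observed = 88.0
      corrected = new_observed - new_matrix @ coef[1:]
      print("interference-corrected signal:", corrected)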

  5. Screening and clustering of sparse regressions with finite non-Gaussian mixtures.

    PubMed

    Zhang, Jian

    2017-06-01

    This article proposes a method to address the problem that can arise when covariates in a regression setting are not Gaussian, which may give rise to approximately mixture-distributed errors, or when a true mixture of regressions produced the data. The method begins with non-Gaussian mixture-based marginal variable screening, followed by fitting a full but relatively smaller mixture regression model to the selected data with the help of a new penalization scheme. Under certain regularity conditions, the new screening procedure is shown to possess a sure screening property even when the population is heterogeneous. We further prove that there exists an elbow point in the associated scree plot which results in a consistent estimator of the set of active covariates in the model. By simulations, we demonstrate that the new procedure can substantially improve the performance of the existing procedures in the context of variable screening and data clustering. By applying the proposed procedure to motif data analysis in molecular biology, we demonstrate that the new method holds promise in practice. © 2016, The International Biometric Society.

  6. Pedotransfer functions: bridging the gap between available basic soil data and missing soil hydraulic characteristics

    NASA Astrophysics Data System (ADS)

    Wösten, J. H. M.; Pachepsky, Ya. A.; Rawls, W. J.

    2001-10-01

    Water retention and hydraulic conductivity are crucial input parameters in any modelling study on water flow and solute transport in soils. Due to inherent temporal and spatial variability in these hydraulic characteristics, large numbers of samples are required to properly characterise areas of land. Hydraulic characteristics can be obtained from direct laboratory and field measurements. However, these measurements are time consuming, which makes it costly to characterise an area of land. As an alternative, analysis of existing databases of measured soil hydraulic data may result in pedotransfer functions. In practice, these functions often prove to be good predictors for missing soil hydraulic characteristics. Examples are presented of different equations describing hydraulic characteristics and of pedotransfer functions used to predict parameters in these equations. Grouping of data prior to pedotransfer function development is discussed, as well as the use of different soil properties as predictors. In addition to regression analysis, new techniques such as artificial neural networks, group methods of data handling, and classification and regression trees are increasingly being used for pedotransfer function development. Actual development of pedotransfer functions is demonstrated by describing a practical case study. Examples are presented of pedotransfer functions for predicting other than hydraulic characteristics. Accuracy and reliability of pedotransfer functions are demonstrated and discussed. In this respect, functional evaluation of pedotransfer functions proves to be a good tool to assess the desired accuracy of a pedotransfer function for a specific application.

  7. A Simple and Specific Stability- Indicating RP-HPLC Method for Routine Assay of Adefovir Dipivoxil in Bulk and Tablet Dosage Form.

    PubMed

    Darsazan, Bahar; Shafaati, Alireza; Mortazavi, Seyed Alireza; Zarghi, Afshin

    2017-01-01

    A simple and reliable stability-indicating RP-HPLC method was developed and validated for analysis of adefovir dipivoxil (ADV). The chromatographic separation was performed on a C18 column using a mixture of acetonitrile-citrate buffer (10 mM at pH 5.2) 36:64 (%v/v) as mobile phase, at a flow rate of 1.5 mL/min. Detection was carried out at 260 nm and a sharp peak was obtained for ADV at a retention time of 5.8 ± 0.01 min. No interferences were observed from its stress degradation products. The method was validated according to the international guidelines. Linear regression analysis of data for the calibration plot showed a linear relationship between peak area and concentration over the range of 0.5-16 μg/mL; the regression coefficient was 0.9999 and the linear regression equation was y = 24844x - 2941.3. The detection (LOD) and quantification (LOQ) limits were 0.12 and 0.35 μg/mL, respectively. The results proved the method was fast (analysis time less than 7 min), precise, reproducible, and accurate for analysis of ADV over a wide range of concentrations. The proposed specific method was used for routine quantification of ADV in pharmaceutical bulk and a tablet dosage form.
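
    A quick consistency check of how the reported limits relate to the calibration regression: with the common ICH-style formulas LOD = 3.3 σ / slope and LOQ = 10 σ / slope, the reported slope of 24844 and LOD of 0.12 µg/mL imply a residual standard deviation near 900 area units, which in turn reproduces an LOQ close to the reported 0.35 µg/mL. The σ below is back-calculated for illustration, not taken from the paper.

      slope = 24844            # from the reported calibration line y = 24844x - 2941.3
      lod, loq = 0.12, 0.35    # reported limits, ug/mL

      sigma = lod * slope / 3.3          # residual SD implied by LOD = 3.3 * sigma / slope
      print(f"implied sigma is about {sigma:.0f} area units")
      print(f"LOQ predicted from that sigma: {10 * sigma / slope:.2f} ug/mL (reported 0.35)")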

  8. Estimation of Sensory Pork Loin Tenderness Using Warner-Bratzler Shear Force and Texture Profile Analysis Measurements

    PubMed Central

    Choe, Jee-Hwan; Choi, Mi-Hee; Rhee, Min-Suk; Kim, Byoung-Chul

    2016-01-01

    This study investigated the degree to which instrumental measurements explain the variation in pork loin tenderness as assessed by the sensory evaluation of trained panelists. Warner-Bratzler shear force (WBS) had a significant relationship with the sensory tenderness variables, such as softness, initial tenderness, chewiness, and rate of breakdown. In a regression analysis, WBS could account for variation in these sensory variables, though only to a limited extent. On the other hand, three parameters from texture profile analysis (TPA), namely hardness, gumminess, and chewiness, were significantly correlated with all sensory evaluation variables. In particular, from the result of stepwise regression analysis, TPA hardness alone explained over 15% of the variation in all sensory evaluation variables, with the exception of perceptible residue. Based on these results, TPA analysis was found to be better than WBS measurement, with the TPA parameter hardness likely to prove particularly useful, in terms of predicting pork loin tenderness as rated by trained panelists. However, sensory evaluation should be conducted to investigate the practical pork tenderness perceived by consumers, because both instrumental measurements could explain only a small portion (less than 20%) of the variability in sensory evaluation. PMID:26954174

  9. Estimation of Sensory Pork Loin Tenderness Using Warner-Bratzler Shear Force and Texture Profile Analysis Measurements.

    PubMed

    Choe, Jee-Hwan; Choi, Mi-Hee; Rhee, Min-Suk; Kim, Byoung-Chul

    2016-07-01

    This study investigated the degree to which instrumental measurements explain the variation in pork loin tenderness as assessed by the sensory evaluation of trained panelists. Warner-Bratzler shear force (WBS) had a significant relationship with the sensory tenderness variables, such as softness, initial tenderness, chewiness, and rate of breakdown. In a regression analysis, WBS could account for variation in these sensory variables, though only to a limited extent. On the other hand, three parameters from texture profile analysis (TPA), namely hardness, gumminess, and chewiness, were significantly correlated with all sensory evaluation variables. In particular, from the result of stepwise regression analysis, TPA hardness alone explained over 15% of the variation in all sensory evaluation variables, with the exception of perceptible residue. Based on these results, TPA analysis was found to be better than WBS measurement, with the TPA parameter hardness likely to prove particularly useful, in terms of predicting pork loin tenderness as rated by trained panelists. However, sensory evaluation should be conducted to investigate the practical pork tenderness perceived by consumers, because both instrumental measurements could explain only a small portion (less than 20%) of the variability in sensory evaluation.

  10. Cascade Optimization Strategy with Neural Network and Regression Approximations Demonstrated on a Preliminary Aircraft Engine Design

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale A.; Patnaik, Surya N.

    2000-01-01

    A preliminary aircraft engine design methodology is being developed that utilizes a cascade optimization strategy together with neural network and regression approximation methods. The cascade strategy employs different optimization algorithms in a specified sequence. The neural network and regression methods are used to approximate solutions obtained from the NASA Engine Performance Program (NEPP), which implements engine thermodynamic cycle and performance analysis models. The new methodology is proving to be more robust and computationally efficient than the conventional optimization approach of using a single optimization algorithm with direct reanalysis. The methodology has been demonstrated on a preliminary design problem for a novel subsonic turbofan engine concept that incorporates a wave rotor as a cycle-topping device. Computations of maximum thrust were obtained for a specific design point in the engine mission profile. The results (depicted in the figure) show a significant improvement in the maximum thrust obtained using the new methodology in comparison to benchmark solutions obtained using NEPP in a manual design mode.

  11. TLE uncertainty estimation using robust weighted differencing

    NASA Astrophysics Data System (ADS)

    Geul, Jacco; Mooij, Erwin; Noomen, Ron

    2017-05-01

    Accurate knowledge of satellite orbit errors is essential for many types of analyses. Unfortunately, for two-line elements (TLEs) this is not available. This paper presents a weighted differencing method using robust least-squares regression for estimating many important error characteristics. The method is applied to both classic and enhanced TLEs, compared to previous implementations, and validated using Global Positioning System (GPS) solutions for the GOCE satellite in Low-Earth Orbit (LEO), prior to its re-entry. The method is found to be more accurate than previous TLE differencing efforts in estimating initial uncertainty, as well as error growth. The method also proves more reliable and requires no data filtering (such as outlier removal). Sensitivity analysis shows a strong relationship between argument of latitude and covariance (standard deviations and correlations), which the method is able to approximate. Overall, the method proves accurate, computationally fast, and robust, and is applicable to any object in the satellite catalogue (SATCAT).
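
    In the spirit of the robust least-squares regression used above for weighted TLE differencing (though not the authors' implementation), the sketch below fits a line to differenced position errors with Huber-weighted iteratively reweighted least squares, so a few gross outliers do not distort the estimated error growth. All data are synthetic.

      import numpy as np

      def huber_irls(x, y, delta=1.0, iters=20):
          """Robust line fit via iteratively reweighted least squares with Huber weights."""
          A = np.column_stack([np.ones(len(x)), x])
          beta, *_ = np.linalg.lstsq(A, y, rcond=None)
          for _ in range(iters):
              r = y - A @ beta
              w = np.where(np.abs(r) <= delta, 1.0, delta / np.abs(r))
              sw = np.sqrt(w)
              beta, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
          return beta

      rng = np.random.default_rng(6)
      t = np.linspace(0, 10, 50)                         # days from TLE epoch (illustrative)
      err = 0.05 * t + rng.normal(scale=0.02, size=50)   # position differences with error growth
      err[::10] += 1.0                                   # a few gross outliers
      print("robust intercept and growth rate:", huber_irls(t, err))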

  12. The experimental design approach to eluotropic strength of 20 solvents in thin-layer chromatography on silica gel.

    PubMed

    Komsta, Łukasz; Stępkowska, Barbara; Skibiński, Robert

    2017-02-03

    The eluotropic strength on thin-layer silica plates was investigated for 20 chromatographic-grade solvents available on the current market. 35 model compounds were used as test subjects in the investigation. The use of a modern mixture screening design allowed each solvent to be estimated as a separate elution coefficient with an acceptable error of estimation (0.0913 in R M value). An additional bootstrapping technique was used to check the distribution and uncertainty of the eluotropic estimates, giving confidence intervals very similar to those from linear regression. Principal component analysis proved that only one parameter (mean eluotropic strength) is satisfactory to describe the solvent property, as it explains almost 90% of the variance of retention. The obtained eluotropic data are a good appendix to earlier published results and their values can be interpreted in the context of R M differences. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. The experimental design approach to eluotropic strength of 20 solvents in thin-layer chromatography on silica gel.

    PubMed

    Komsta, Łukasz; Stępkowska, Barbara; Skibiński, Robert

    2017-01-04

    The eluotropic strength on thin-layer silica plates was investigated for 20 chromatographic-grade solvents available on the current market. 35 model compounds were used as test subjects in the investigation. The use of a modern mixture screening design allowed each solvent to be estimated as a separate elution coefficient with an acceptable error of estimation (0.0913 in R M value). An additional bootstrapping technique was used to check the distribution and uncertainty of the eluotropic estimates, giving confidence intervals very similar to those from linear regression. Principal component analysis proved that only one parameter (mean eluotropic strength) is satisfactory to describe the solvent property, as it explains almost 90% of the variance of retention. The obtained eluotropic data are a good appendix to earlier published results and their values can be interpreted in the context of R M differences. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Analysis of a database to predict the result of allergy testing in vivo in patients with chronic nasal symptoms.

    PubMed

    Lacagnina, Valerio; Leto-Barone, Maria S; La Piana, Simona; Seidita, Aurelio; Pingitore, Giuseppe; Di Lorenzo, Gabriele

    2014-01-01

    This article uses the logistic regression model for diagnostic decision making in patients with chronic nasal symptoms. We studied the ability of the logistic regression model, obtained by the evaluation of a database, to detect patients with a positive allergy skin-prick test (SPT) and patients with a negative SPT. The model developed was validated using a data set obtained from another medical institution. The analysis was performed using a database obtained from a questionnaire administered to patients with nasal symptoms, containing personal data, clinical data, and results of allergy testing (SPT). All variables found to be significantly different between patients with positive and negative SPT (p < 0.05) were selected for the logistic regression models and were analyzed with backward stepwise logistic regression, evaluated with the area under the receiver operating characteristic curve. A second set of patients from another institution was used to validate the model. The accuracy of the model in identifying, over the second set, both patients whose SPT would be positive and those whose SPT would be negative was high. The model detected 96% of patients with nasal symptoms and positive SPT and classified 94% of those with negative SPT. This study is preliminary to the creation of software that could help primary care doctors in the diagnostic decision-making process (need for allergy testing) in patients complaining of chronic nasal symptoms.

  15. A Diagrammatic Exposition of Regression and Instrumental Variables for the Beginning Student

    ERIC Educational Resources Information Center

    Foster, Gigi

    2009-01-01

    Some beginning students of statistics and econometrics have difficulty with traditional algebraic approaches to explaining regression and related techniques. For these students, a simple and intuitive diagrammatic introduction as advocated by Kennedy (2008) may prove a useful framework to support further study. The author presents a series of…

  16. A Time Series Analysis: Weather Factors, Human Migration and Malaria Cases in Endemic Area of Purworejo, Indonesia, 2005–2014

    PubMed Central

    REJEKI, Dwi Sarwani Sri; NURHAYATI, Nunung; AJI, Budi; MURHANDARWATI, E. Elsa Herdiana; KUSNANTO, Hari

    2018-01-01

    Background: Climatic and weather factors are important determinants of the transmission of vector-borne diseases like malaria. This study aimed to prove relationships between weather factors and malaria cases in endemic areas of Purworejo during 2005–2014, taking human migration and previous case findings into account. Methods: This study employed ecological time series analysis using monthly data. The independent variables were maximum temperature, minimum temperature, maximum humidity, minimum humidity, precipitation, human migration, and previous malaria cases, while the dependent variable was positive malaria cases. Three count-data regression models, i.e. the Poisson model, the quasi-Poisson model, and the negative binomial model, were applied to measure the relationship. The lowest Akaike Information Criterion (AIC) value was used to find the best model. Negative binomial regression was considered the best model. Results: The model showed that humidity (lag 2), precipitation (lag 3), precipitation (lag 12), migration (lag 1) and previous malaria cases (lag 12) had a significant relationship with malaria cases. Conclusion: Weather, migration and previous malaria case factors need to be considered as prominent indicators for projecting increases in malaria cases. PMID:29900134
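
    The model comparison described here (Poisson versus overdispersed count regressions selected by AIC) can be sketched with statsmodels, using synthetic lagged predictors in place of the Purworejo data; the predictor names, the dispersion parameter and the simulated counts are all illustrative assumptions.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      n = 120                                             # monthly observations
      X = np.column_stack([rng.normal(size=n),            # e.g. humidity lag 2
                           rng.normal(size=n),            # e.g. precipitation lag 3
                           rng.normal(size=n)])           # e.g. previous cases lag 12
      mu = np.exp(1.0 + 0.4 * X[:, 0] + 0.3 * X[:, 2])
      cases = rng.negative_binomial(n=2, p=2 / (2 + mu))  # overdispersed counts

      Xc = sm.add_constant(X)
      poisson = sm.GLM(cases, Xc, family=sm.families.Poisson()).fit()
      negbin = sm.GLM(cases, Xc, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
      print("Poisson AIC:", poisson.aic, " NegBin AIC:", negbin.aic)
      # The model with the lower AIC (here typically the negative binomial) is retained.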

  17. Two models of the sound-signal frequency dependence on the animal body size as exemplified by the ground squirrels of Eurasia (mammalia, rodentia).

    PubMed

    Nikol'skii, A A

    2017-11-01

    Dependence of the sound-signal frequency on the animal body length was studied in 14 ground squirrel species (genus Spermophilus) of Eurasia. Regression analysis of the total sample yielded a low determination coefficient (R² = 26%), because the total sample proved to be heterogeneous in terms of signal frequency within the dimension classes of animals. When the total sample was divided into two groups according to signal frequency, two statistically significant models (regression equations) were obtained in which signal frequency depended on the body size at high determination coefficients (R² = 73 and 94%, versus 26% for the total sample). Thus, the problem of correlation between animal body size and the frequency of their vocal signals does not have a unique solution.

  18. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    PubMed Central

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932

  19. Effect of duration of denervation on outcomes of ansa-recurrent laryngeal nerve reinnervation.

    PubMed

    Li, Meng; Chen, Shicai; Wang, Wei; Chen, Donghui; Zhu, Minhui; Liu, Fei; Zhang, Caiyun; Li, Yan; Zheng, Hongliang

    2014-08-01

    To investigate the efficacy of laryngeal reinnervation with the ansa cervicalis among unilateral vocal fold paralysis (UVFP) patients with different denervation durations. We retrospectively reviewed 349 consecutive UVFP cases of delayed ansa cervicalis to recurrent laryngeal nerve (RLN) anastomosis. Potential influencing factors were analyzed in multivariable logistic regression analysis. Stratification analysis was performed on one of the identified significant variables: denervation duration. Videostroboscopy, perceptual evaluation, acoustic analysis, maximum phonation time (MPT), and laryngeal electromyography (EMG) were performed preoperatively and postoperatively. Gender, age, preoperative EMG status and denervation duration were analyzed in multivariable logistic regression analysis. Stratification analysis was performed on denervation duration, which was divided into three groups according to the interval between RLN injury and reinnervation: group A, 6 to 12 months; group B, 12 to 24 months; and group C, > 24 months. Age, preoperative EMG, and denervation duration were identified as significant variables in multivariable logistic regression analysis. Stratification analysis on denervation duration showed significant differences between groups A and C and between groups B and C (P < 0.05), but no significant difference between groups A and B (P > 0.05), with regard to overall grade, jitter, shimmer, noise-to-harmonics ratio, MPT, and postoperative EMG. In addition, videostroboscopic and laryngeal EMG data, perceptual and acoustic parameters, and MPT values were significantly improved postoperatively in each denervation duration group (P < 0.01). Although delayed laryngeal reinnervation is proved valid for UVFP, the surgical outcome is better if the procedure is performed within 2 years after nerve injury than if it is performed more than 2 years after injury. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  20. Identification method of laser gyro error model under changing physical field

    NASA Astrophysics Data System (ADS)

    Wang, Qingqing; Niu, Zhenzhong

    2018-04-01

    In this paper, the mechanisms by which temperature, the rate of temperature change, and the temperature gradient influence inertial devices are studied. A second-order model of the zero bias and a third-order model of the calibration factor of the laser gyro under temperature variation are derived. A calibration scheme for the temperature error is designed, and the experiment is carried out. Two methods, stepwise regression analysis and a BP neural network, are used to identify the parameters of the temperature error model, and the effectiveness of both methods is demonstrated by temperature error compensation.
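
    As a rough illustration of the kind of temperature error model described in this record, the sketch below fits a second-order polynomial for gyro zero bias as a function of temperature by least squares and uses it for compensation. The data are synthetic and the coefficients arbitrary; nothing here reproduces the authors' stepwise regression or BP neural network identification.

      import numpy as np

      # Synthetic calibration data: temperature (deg C) and measured zero bias (deg/h)
      rng = np.random.default_rng(0)
      temp = np.linspace(-20, 60, 41)
      bias = 0.015 - 4e-4 * temp + 6e-6 * temp**2 + rng.normal(0, 5e-4, temp.size)

      # Second-order zero-bias model, bias = a0 + a1*T + a2*T^2, fitted by least squares
      coeffs = np.polyfit(temp, bias, deg=2)
      model = np.poly1d(coeffs)

      # Temperature error compensation: subtract the modeled bias from the raw output
      residual = bias - model(temp)
      print("fitted coefficients (a2, a1, a0):", coeffs)
      print("RMS bias before/after compensation:",
            f"{np.sqrt(np.mean(bias**2)):.2e} / {np.sqrt(np.mean(residual**2)):.2e}")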

  1. Evaluation and statistical judgement of neural responses to sinusoidal stimulation in cases with superimposed drift and noise.

    PubMed

    Jastreboff, P W

    1979-06-01

    Time histograms of neural responses evoked by sinusoidal stimulation often contain a slow drift and irregular noise, which disturb Fourier analysis of these responses. Section 2 of this paper evaluates the extent to which a linear drift influences the Fourier analysis and develops a combined Fourier and linear regression analysis for detecting and correcting such a drift. The usefulness of this correction method is demonstrated for time histograms of actual eye movements and Purkinje cell discharges evoked by sinusoidal rotation of rabbits in the horizontal plane. In Sect. 3, analysis of variance is adopted to estimate the probability that the response curve extracted by Fourier analysis arose from noise by chance. This method proved useful for avoiding false judgements as to whether the response curve was meaningful, particularly when the response was small relative to the contaminating noise.
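
    The combined Fourier and linear-regression correction described in Sect. 2 amounts to fitting a constant, a linear drift term, and sine/cosine terms at the stimulation frequency in one least-squares step, then discarding the drift. A minimal sketch with synthetic data (not the author's procedure or data) follows.

      import numpy as np

      # Synthetic time histogram: sinusoidal response + linear drift + noise
      rng = np.random.default_rng(1)
      t = np.linspace(0.0, 10.0, 500)        # seconds
      f = 0.2                                # stimulation frequency (Hz), assumed known
      y = 5.0 + 0.3 * t + 2.0 * np.sin(2 * np.pi * f * t + 0.7) + rng.normal(0, 0.5, t.size)

      # Design matrix: constant, linear drift, sine and cosine at frequency f
      X = np.column_stack([np.ones_like(t), t,
                           np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      a0, drift, s, c = beta

      amplitude = np.hypot(s, c)             # response amplitude after drift removal
      phase = np.arctan2(c, s)
      print(f"drift = {drift:.3f}/s, amplitude = {amplitude:.3f}, phase = {phase:.3f} rad")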

  2. Deep ensemble learning of sparse regression models for brain disease diagnosis.

    PubMed

    Suk, Heung-Il; Lee, Seong-Whan; Shen, Dinggang

    2017-04-01

    Recent studies on brain imaging analysis have highlighted the core role of machine learning techniques in computer-assisted intervention for brain disease diagnosis. Among the various machine-learning techniques, sparse regression models have proved effective at handling high-dimensional data with a small number of training samples, especially in medical problems. In the meantime, deep learning methods have achieved great success, outperforming state-of-the-art approaches in various applications. In this paper, we propose a novel framework that combines the two conceptually different methods of sparse regression and deep learning for Alzheimer's disease/mild cognitive impairment diagnosis and prognosis. Specifically, we first train multiple sparse regression models, each with a different value of the regularization control parameter. Our multiple sparse regression models thus potentially select different feature subsets from the original feature set and thereby differ in their power to predict the response values, i.e., the clinical label and clinical scores in our work. Regarding the response values from our sparse regression models as target-level representations, we then build a deep convolutional neural network for clinical decision making, which we call the 'Deep Ensemble Sparse Regression Network.' To the best of our knowledge, this is the first work that combines sparse regression models with a deep neural network. In our experiments with the ADNI cohort, we validated the effectiveness of the proposed method by achieving the highest diagnostic accuracies in three classification tasks. We also rigorously analyzed our results and compared them with previous studies on the ADNI cohort in the literature. Copyright © 2017 Elsevier B.V. All rights reserved.
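
    A much-simplified sketch of the first stage described above: several Lasso (sparse regression) models trained with different regularization strengths, whose outputs are stacked as target-level representations for a second-stage learner. A plain logistic regression stands in for the paper's deep convolutional network, and the data, alpha values, and sizes are invented for illustration.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import Lasso, LogisticRegression
      from sklearn.model_selection import train_test_split

      # Synthetic high-dimensional data standing in for neuroimaging features
      X, y = make_classification(n_samples=200, n_features=500,
                                 n_informative=20, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      # Stage 1: multiple sparse regression models, one per regularization value
      alphas = [0.01, 0.05, 0.1, 0.2]
      lassos = [Lasso(alpha=a, max_iter=10000).fit(X_tr, y_tr) for a in alphas]

      # Their predictions form the "target-level representation" of each subject
      Z_tr = np.column_stack([m.predict(X_tr) for m in lassos])
      Z_te = np.column_stack([m.predict(X_te) for m in lassos])

      # Stage 2: a simple classifier in place of the paper's convolutional network
      clf = LogisticRegression().fit(Z_tr, y_tr)
      print("test accuracy on stacked sparse-regression outputs:", clf.score(Z_te, y_te))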

  3. Deep ensemble learning of sparse regression models for brain disease diagnosis

    PubMed Central

    Suk, Heung-Il; Lee, Seong-Whan; Shen, Dinggang

    2018-01-01

    Recent studies on brain imaging analysis have highlighted the core role of machine learning techniques in computer-assisted intervention for brain disease diagnosis. Among the various machine-learning techniques, sparse regression models have proved effective at handling high-dimensional data with a small number of training samples, especially in medical problems. In the meantime, deep learning methods have achieved great success, outperforming state-of-the-art approaches in various applications. In this paper, we propose a novel framework that combines the two conceptually different methods of sparse regression and deep learning for Alzheimer’s disease/mild cognitive impairment diagnosis and prognosis. Specifically, we first train multiple sparse regression models, each with a different value of the regularization control parameter. Our multiple sparse regression models thus potentially select different feature subsets from the original feature set and thereby differ in their power to predict the response values, i.e., the clinical label and clinical scores in our work. Regarding the response values from our sparse regression models as target-level representations, we then build a deep convolutional neural network for clinical decision making, which we call the ‘Deep Ensemble Sparse Regression Network.’ To the best of our knowledge, this is the first work that combines sparse regression models with a deep neural network. In our experiments with the ADNI cohort, we validated the effectiveness of the proposed method by achieving the highest diagnostic accuracies in three classification tasks. We also rigorously analyzed our results and compared them with previous studies on the ADNI cohort in the literature. PMID:28167394

  4. [Research on the method of interference correction for nondispersive infrared multi-component gas analysis].

    PubMed

    Sun, You-Wen; Liu, Wen-Qing; Wang, Shi-Mei; Huang, Shu-Hua; Yu, Xiao-Man

    2011-10-01

    A method of interference correction for nondispersive infrared (NDIR) multi-component gas analysis is described. Based on integrated gas absorption models and methods, the influence of temperature and air pressure on the integrated line strengths and line shapes was considered, and, using Lorentz line shapes, the absorption cross sections and response coefficients of H2O, CO2, CO, and NO on each filter channel were obtained. Four-dimensional linear regression equations for interference correction were established from the response coefficients, and the cross-absorption interference was corrected by solving these equations; after correction, the pure absorbance signal on each filter channel is controlled only by the concentration of the corresponding target gas. When the sample cell was filled with a gas mixture containing CO, NO, and CO2 in fixed proportions, the corrected absorbances were used for concentration inversion; the inversion errors were 2.0% for CO2, 1.6% for CO, and 1.7% for NO. Both theory and experiment show that the proposed interference correction method for NDIR multi-component gas analysis is feasible.
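
    The correction step, recovering each gas's contribution from overlapping channel absorbances by solving the linear regression equations built from the channel response coefficients, reduces to a small linear system. The response matrix and measured absorbances below are invented numbers used only to show the shape of the calculation.

      import numpy as np

      # Hypothetical response coefficients K[i, j]: absorbance of gas j on filter channel i
      # Channels (rows): CO2, CO, NO, H2O; gases (columns): CO2, CO, NO, H2O
      K = np.array([
          [0.95, 0.05, 0.02, 0.08],
          [0.04, 0.90, 0.03, 0.06],
          [0.03, 0.02, 0.88, 0.05],
          [0.02, 0.01, 0.01, 0.92],
      ])

      # Measured (interference-contaminated) absorbance on each channel
      a_measured = np.array([0.412, 0.208, 0.155, 0.301])

      # Solve K @ x = a for the pure per-gas absorbances (interference correction)
      x_pure, *_ = np.linalg.lstsq(K, a_measured, rcond=None)
      for gas, val in zip(["CO2", "CO", "NO", "H2O"], x_pure):
          print(f"corrected absorbance attributable to {gas}: {val:.3f}")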

  5. ORACLE INEQUALITIES FOR THE LASSO IN THE COX MODEL

    PubMed Central

    Huang, Jian; Sun, Tingni; Ying, Zhiliang; Yu, Yi; Zhang, Cun-Hui

    2013-01-01

    We study the absolute penalized maximum partial likelihood estimator in sparse, high-dimensional Cox proportional hazards regression models where the number of time-dependent covariates can be larger than the sample size. We establish oracle inequalities based on natural extensions of the compatibility and cone invertibility factors of the Hessian matrix at the true regression coefficients. Similar results based on an extension of the restricted eigenvalue can be also proved by our method. However, the presented oracle inequalities are sharper since the compatibility and cone invertibility factors are always greater than the corresponding restricted eigenvalue. In the Cox regression model, the Hessian matrix is based on time-dependent covariates in censored risk sets, so that the compatibility and cone invertibility factors, and the restricted eigenvalue as well, are random variables even when they are evaluated for the Hessian at the true regression coefficients. Under mild conditions, we prove that these quantities are bounded from below by positive constants for time-dependent covariates, including cases where the number of covariates is of greater order than the sample size. Consequently, the compatibility and cone invertibility factors can be treated as positive constants in our oracle inequalities. PMID:24086091

  6. ORACLE INEQUALITIES FOR THE LASSO IN THE COX MODEL.

    PubMed

    Huang, Jian; Sun, Tingni; Ying, Zhiliang; Yu, Yi; Zhang, Cun-Hui

    2013-06-01

    We study the absolute penalized maximum partial likelihood estimator in sparse, high-dimensional Cox proportional hazards regression models where the number of time-dependent covariates can be larger than the sample size. We establish oracle inequalities based on natural extensions of the compatibility and cone invertibility factors of the Hessian matrix at the true regression coefficients. Similar results based on an extension of the restricted eigenvalue can be also proved by our method. However, the presented oracle inequalities are sharper since the compatibility and cone invertibility factors are always greater than the corresponding restricted eigenvalue. In the Cox regression model, the Hessian matrix is based on time-dependent covariates in censored risk sets, so that the compatibility and cone invertibility factors, and the restricted eigenvalue as well, are random variables even when they are evaluated for the Hessian at the true regression coefficients. Under mild conditions, we prove that these quantities are bounded from below by positive constants for time-dependent covariates, including cases where the number of covariates is of greater order than the sample size. Consequently, the compatibility and cone invertibility factors can be treated as positive constants in our oracle inequalities.

  7. Industrial and occupational ergonomics in the petrochemical process industry: a regression trees approach.

    PubMed

    Bevilacqua, M; Ciarapica, F E; Giacchetta, G

    2008-07-01

    This work is an attempt to apply classification tree methods to data regarding accidents in a medium-sized refinery, so as to identify the important relationships between the variables, which can be considered as decision-making rules when adopting any measures for improvement. The results obtained using the CART (Classification And Regression Trees) method proved to be the most precise and, in general, they are encouraging concerning the use of tree diagrams as preliminary explorative techniques for the assessment of the ergonomic, management and operational parameters which influence high accident risk situations. The Occupational Injury analysis carried out in this paper was planned as a dynamic process and can be repeated systematically. The CART technique, which considers a very wide set of objective and predictive variables, shows new cause-effect correlations in occupational safety which had never been previously described, highlighting possible injury risk groups and supporting decision-making in these areas. The use of classification trees must not, however, be seen as an attempt to supplant other techniques, but as a complementary method which can be integrated into traditional types of analysis.
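
    A minimal sketch of the kind of CART analysis this record describes, using scikit-learn's decision tree on a handful of made-up accident records; the variable names, encodings, and values are assumptions, not the refinery dataset.

      import pandas as pd
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier, export_text

      # Invented accident records: ergonomic/operational predictors and a severity label
      data = pd.DataFrame({
          "shift_night":   [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0],
          "task_manual":   [1, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1],
          "experience_yr": [2, 1, 8, 10, 3, 1, 12, 2, 7, 9, 1, 4],
          "severe":        [1, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0],
      })
      X, y = data.drop(columns="severe"), data["severe"]
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

      tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
      print(export_text(tree, feature_names=list(X.columns)))  # readable decision rules
      print("held-out accuracy:", tree.score(X_te, y_te))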

  8. Technical note: Fu-Liou-Gu and Corti-Peter model performance evaluation for radiative retrievals from cirrus clouds

    NASA Astrophysics Data System (ADS)

    Lolli, Simone; Campbell, James R.; Lewis, Jasper R.; Gu, Yu; Welton, Ellsworth J.

    2017-06-01

    We compare, for the first time, the performance of a simplified atmospheric radiative transfer algorithm package, the Corti-Peter (CP) model, versus the more complex Fu-Liou-Gu (FLG) model, for resolving top-of-the-atmosphere radiative forcing characteristics from single-layer cirrus clouds obtained from the NASA Micro-Pulse Lidar Network database in 2010 and 2011 at Singapore and in Greenbelt, Maryland, USA, in 2012. Specifically, CP simplifies calculation of both clear-sky longwave and shortwave radiation through regression analysis applied to radiative calculations, which contributes significantly to differences between the two. The results of the intercomparison show that differences in annual net top-of-the-atmosphere (TOA) cloud radiative forcing can reach 65 %. This is particularly true when land surface temperatures are warmer than 288 K, where the CP regression analysis becomes less accurate. CP proves useful for first-order estimates of TOA cirrus cloud forcing, but may not be suitable for quantitative accuracy, including the absolute sign of cirrus cloud daytime TOA forcing that can readily oscillate around zero globally.

  9. The Use of Infrared Thermography for Porosity Assessment of Intact Rock

    NASA Astrophysics Data System (ADS)

    Mineo, S.; Pappalardo, G.

    2016-08-01

    Preliminary results on a new test for the indirect assessment of porosity through infrared thermography are presented. The study of the cooling behavior of rock samples in the laboratory, through the analysis of thermograms, proved to be an innovative tool for estimating this important property, which is one of the main features affecting the mechanical behavior of rocks. Detailed experiments were performed on artificially heated volcanic rock samples with different porosity values. The cooling trend was described both graphically and numerically, with the help of cooling curves and a Cooling Rate Index. The latter, which proved to be closely linked to porosity, was used to derive reliable equations for its indirect estimation. Simple and multiple regression analyses returned satisfactory outcomes, showing close agreement between predicted and measured porosity values and thus confirming the validity of the proposed model. This study introduces a novel approach in rock mechanics, laying the foundation for future research aimed at refining these results and validating the model at a larger scale.

  10. The effectiveness of manual and mechanical instrumentation for the retreatment of three different root canal filling materials.

    PubMed

    Somma, Francesco; Cammarota, Giuseppe; Plotino, Gianluca; Grande, Nicola M; Pameijer, Cornelis H

    2008-04-01

    The aim of this study was to compare the effectiveness of the Mtwo R (Sweden & Martina, Padova, Italy), ProTaper retreatment files (Dentsply-Maillefer, Ballaigues, Switzerland), and a Hedström manual technique in the removal of three different filling materials (gutta-percha, Resilon [Resilon Research LLC, Madison, CT], and EndoRez [Ultradent Products Inc, South Jordan, UT]) during retreatment. Ninety single-rooted straight premolars were instrumented and randomly divided into 9 groups of 10 teeth each (n = 10) with regards to filling material and instrument used. For all roots, the following data were recorded: procedural errors, time of retreatment, apically extruded material, canal wall cleanliness through optical stereomicroscopy (OSM), and scanning electron microscopy (SEM). A linear regression analysis and three logistic regression analyses were performed to assess the level of significance set at p = 0.05. The results indicated that the overall regression models were statistically significant. The Mtwo R, ProTaper retreatment files, and Resilon filling material had a positive impact in reducing the time for retreatment. Both ProTaper retreatment files and Mtwo R showed a greater extrusion of debris. For both OSM and SEM logistic regression models, the root canal apical third had the greatest impact on the score values. EndoRez filling material resulted in cleaner root canal walls using OSM analysis, whereas Resilon filling material and both engine-driven NiTi rotary techniques resulted in less clean root canal walls according to SEM analysis. In conclusion, all instruments left remnants of filling material and debris on the root canal walls irrespective of the root filling material used. Both the engine-driven NiTi rotary systems proved to be safe and fast devices for the removal of endodontic filling material.

  11. Prediction of coagulation and flocculation processes using ANN models and fuzzy regression.

    PubMed

    Zangooei, Hossein; Delnavaz, Mohammad; Asadollahfardi, Gholamreza

    2016-09-01

    Coagulation and flocculation are the two main processes used to aggregate colloidal particles into larger particles and are two main stages of primary water treatment. They are only needed when colloidal particles form a significant part of the total suspended solids fraction. Our objective was to predict the turbidity of water after coagulation and flocculation when other parameters, such as the type and concentration of coagulant, pH, and influent turbidity of the raw water, were known. We used a multilayer perceptron (MLP) and a radial basis function (RBF) artificial neural network (ANN), as well as several kinds of fuzzy regression analysis, to predict turbidity after the coagulation and flocculation processes. The coagulant used in the pilot plant, located at a water treatment plant, was polyaluminum chloride. We used existing data on coagulant type and concentration, pH, and influent turbidity of the raw water, which were available from the pilot plant and had been collected by the Tehran water authority. The results indicated that the ANNs were better than fuzzy regression analysis at simulating the coagulation and flocculation process and predicting turbidity removal across different experimental data, and may be able to reduce the number of jar tests, which are time-consuming and expensive. The MLP neural network proved to be the best model compared with the RBF neural network and fuzzy regression analysis in this study; it can predict the effluent turbidity of the coagulation and flocculation process with a coefficient of determination (R²) of 0.96 and a root mean square error of 0.0106.
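
    To make the modeling setup concrete, here is a minimal MLP regression sketch mapping coagulant dose, pH, and influent turbidity to effluent turbidity with scikit-learn; the data are synthetic and nothing here reproduces the authors' network architecture or the pilot-plant measurements.

      import numpy as np
      from sklearn.metrics import r2_score
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # Synthetic jar-test-like data: [coagulant dose (mg/L), pH, influent turbidity (NTU)]
      rng = np.random.default_rng(0)
      X = np.column_stack([rng.uniform(5, 60, 300),
                           rng.uniform(6.0, 8.5, 300),
                           rng.uniform(20, 400, 300)])
      # Invented response: effluent turbidity rises with influent load, falls with dose
      y = 0.05 * X[:, 2] - 0.4 * X[:, 0] + 3.0 * (X[:, 1] - 7.2) ** 2
      y = np.clip(y + rng.normal(0, 1.5, 300), 0.1, None)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(16, 8),
                                         max_iter=5000, random_state=0))
      model.fit(X_tr, y_tr)
      print("R^2 on held-out data:", r2_score(y_te, model.predict(X_te)))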

  12. Periodic limb movements of sleep: empirical and theoretical evidence supporting objective at-home monitoring

    PubMed Central

    Moro, Marilyn; Goparaju, Balaji; Castillo, Jelina; Alameddine, Yvonne; Bianchi, Matt T

    2016-01-01

    Introduction Periodic limb movements of sleep (PLMS) may increase cardiovascular and cerebrovascular morbidity. However, most people with PLMS are either asymptomatic or have nonspecific symptoms. Therefore, predicting elevated PLMS in the absence of restless legs syndrome remains an important clinical challenge. Methods We undertook a retrospective analysis of demographic data, subjective symptoms, and objective polysomnography (PSG) findings in a clinical cohort with or without obstructive sleep apnea (OSA) from our laboratory (n=443 with OSA, n=209 without OSA). Correlation analysis and regression modeling were performed to determine predictors of periodic limb movement index (PLMI). Markov decision analysis with TreeAge software compared strategies to detect PLMS: in-laboratory PSG, at-home testing, and a clinical prediction tool based on the regression analysis. Results Elevated PLMI values (>15 per hour) were observed in >25% of patients. PLMI values in No-OSA patients correlated with age, sex, self-reported nocturnal leg jerks, restless legs syndrome symptoms, and hypertension. In OSA patients, PLMI correlated only with age and self-reported psychiatric medications. Regression models indicated only a modest predictive value of demographics, symptoms, and clinical history. Decision modeling suggests that at-home testing is favored as the pretest probability of PLMS increases, given plausible assumptions regarding PLMS morbidity, costs, and assumed benefits of pharmacological therapy. Conclusion Although elevated PLMI values were commonly observed, routinely acquired clinical information had only weak predictive utility. As the clinical importance of elevated PLMI continues to evolve, it is likely that objective measures such as PSG or at-home PLMS monitors will prove increasingly important for clinical and research endeavors. PMID:27540316

  13. Convergent Time-Varying Regression Models for Data Streams: Tracking Concept Drift by the Recursive Parzen-Based Generalized Regression Neural Networks.

    PubMed

    Duda, Piotr; Jaworski, Maciej; Rutkowski, Leszek

    2018-03-01

    One of the greatest challenges in data mining is related to processing and analysis of massive data streams. Contrary to traditional static data mining problems, data streams require that each element is processed only once, the amount of allocated memory is constant, and the models incorporate changes in the investigated streams. A vast majority of available methods have been developed for data stream classification, and only a few of them attempt to solve regression problems, using various heuristic approaches. In this paper, we develop mathematically justified regression models working in a time-varying environment. More specifically, we study incremental versions of generalized regression neural networks, called IGRNNs, and we prove their tracking properties - weak (in probability) and strong (with probability one) convergence - under various concept drift scenarios. First, we present the IGRNNs, based on the Parzen kernels, for modeling stationary systems under nonstationary noise. Next, we extend our approach to modeling time-varying systems under nonstationary noise. We present several types of concept drift to be handled by our approach in such a way that weak and strong convergence hold under certain conditions. In a series of simulations, we compare our method with commonly used heuristic approaches based on a forgetting mechanism or sliding windows to deal with concept drift. Finally, we apply our concept in a real-life scenario, solving the problem of currency exchange rate prediction.
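
    The Parzen-kernel generalized regression estimator underlying a GRNN is, in batch form, a Nadaraya-Watson weighted average, and an incremental version can simply accumulate the kernel-weighted sums as new stream elements arrive. The sketch below shows that accumulation on synthetic data; it is not the authors' IGRNN algorithm and ignores concept drift handling.

      import numpy as np

      class IncrementalKernelRegressor:
          """Running Nadaraya-Watson estimate with a Gaussian (Parzen) kernel on a fixed grid."""
          def __init__(self, grid, bandwidth=0.2):
              self.grid = np.asarray(grid)
              self.h = bandwidth
              self.num = np.zeros_like(self.grid)   # running sum of K((g - x)/h) * y
              self.den = np.zeros_like(self.grid)   # running sum of K((g - x)/h)

          def update(self, x, y):
              k = np.exp(-0.5 * ((self.grid - x) / self.h) ** 2)
              self.num += k * y
              self.den += k

          def predict(self):
              return self.num / np.maximum(self.den, 1e-12)

      # Stream of (x, y) pairs from a noisy sine, processed one element at a time
      rng = np.random.default_rng(0)
      model = IncrementalKernelRegressor(grid=np.linspace(0, 2 * np.pi, 50))
      for _ in range(2000):
          x = rng.uniform(0, 2 * np.pi)
          model.update(x, np.sin(x) + rng.normal(0, 0.3))

      print("max absolute error on the grid:",
            np.max(np.abs(model.predict() - np.sin(model.grid))))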

  14. Using within-day hive weight changes to measure environmental effects on honey bee colonies

    PubMed Central

    Holst, Niels; Weiss, Milagra; Carroll, Mark J.; McFrederick, Quinn S.; Barron, Andrew B.

    2018-01-01

    Patterns in within-day hive weight data from two independent datasets in Arizona and California were modeled using piecewise regression, and analyzed with respect to honey bee colony behavior and landscape effects. The regression analysis yielded information on the start and finish of a colony’s daily activity cycle, hive weight change at night, hive weight loss due to departing foragers and weight gain due to returning foragers. Assumptions about the meaning of the timing and size of the morning weight changes were tested in a third study by delaying the forager departure times from one to three hours using screen entrance gates. A regression of planned vs. observed departure delays showed that the initial hive weight loss around dawn was largely due to foragers. In a similar experiment in Australia, hive weight loss due to departing foragers in the morning was correlated with net bee traffic (difference between the number of departing bees and the number of arriving bees) and from those data the payload of the arriving bees was estimated to be 0.02 g. The piecewise regression approach was then used to analyze a fifth study involving hives with and without access to natural forage. The analysis showed that, during a commercial pollination event, hives with previous access to forage had a significantly higher rate of weight gain as the foragers returned in the afternoon, and, in the weeks after the pollination event, a significantly higher rate of weight loss in the morning, as foragers departed. This combination of continuous weight data and piecewise regression proved effective in detecting treatment differences in foraging activity that other methods failed to detect. PMID:29791462
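
    A minimal illustration of the piecewise-regression idea in this record: fit separate straight lines to within-day hive weight over segments delimited by breakpoints (night, morning departure, daytime foraging, evening). The times, weights, and breakpoints below are invented, and unlike this sketch the published analysis estimates the breakpoints from the data.

      import numpy as np

      rng = np.random.default_rng(2)
      hours = np.arange(0, 24, 0.25)
      # Invented daily hive-weight pattern (kg): flat night, morning loss, daytime gain, evening plateau
      true = np.piecewise(hours,
                          [hours < 6, (hours >= 6) & (hours < 9),
                           (hours >= 9) & (hours < 18), hours >= 18],
                          [lambda h: 40.0,
                           lambda h: 40.0 - 0.3 * (h - 6),
                           lambda h: 39.1 + 0.15 * (h - 9),
                           lambda h: 40.45])
      weight = true + rng.normal(0, 0.03, hours.size)

      # Piecewise regression with fixed breakpoints: one least-squares line per segment
      breakpoints = [0, 6, 9, 18, 24]
      for lo, hi in zip(breakpoints[:-1], breakpoints[1:]):
          mask = (hours >= lo) & (hours < hi)
          slope, intercept = np.polyfit(hours[mask], weight[mask], 1)
          print(f"{lo:02d}-{hi:02d} h: slope = {slope:+.3f} kg/h")  # e.g. morning forager departure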

  15. Regression Rates Following the Treatment of Aggressive Posterior Retinopathy of Prematurity with Bevacizumab Versus Laser: 8-Year Retrospective Analysis

    PubMed Central

    Nicoară, Simona D.; Ştefănuţ, Anne C.; Nascutzy, Constanta; Zaharie, Gabriela C.; Toader, Laura E.; Drugan, Tudor C.

    2016-01-01

    Background Retinopathy is a serious complication related to prematurity and a leading cause of childhood blindness. The aggressive posterior form of retinopathy of prematurity (APROP) has a worse anatomical and functional outcome following laser therapy, as compared with the classic form of the disease. The main outcome measures are the APROP regression rate, structural outcomes, and complications associated with intravitreal bevacizumab (IVB) versus laser photocoagulation in APROP. Material/Methods This is a retrospective case series that includes infants with APROP who received either IVB or laser photocoagulation and had a follow-up of at least 60 weeks (for the laser photocoagulation group) and 80 weeks (for the IVB group). In the first group, laser photocoagulation of the retina was carried out and in the second group, 1 bevacizumab injection was administered intravitreally. The following parameters were analyzed in each group: sex, gestational age, birth weight, postnatal age and postmenstrual age at treatment, APROP regression, sequelae, and complications. Statistical analysis was performed using Microsoft Excel and IBM SPSS (version 23.0). Results The laser photocoagulation group consisted of 6 premature infants (12 eyes) and the IVB group consisted of 17 premature infants (34 eyes). Within the laser photocoagulation group, the evolution was favorable in 9 eyes (75%) and unfavorable in 3 eyes (25%). Within the IVB group, APROP regressed in 29 eyes (85.29%) and failed to regress in 5 eyes (14.71%). These differences are statistically significant, as proved by the McNemar test (P<0.001). Conclusions The IVB group had a statistically significant better outcome compared with the laser photocoagulation group, in APROP in our series. PMID:27062023

  16. Using within-day hive weight changes to measure environmental effects on honey bee colonies.

    PubMed

    Meikle, William G; Holst, Niels; Colin, Théotime; Weiss, Milagra; Carroll, Mark J; McFrederick, Quinn S; Barron, Andrew B

    2018-01-01

    Patterns in within-day hive weight data from two independent datasets in Arizona and California were modeled using piecewise regression, and analyzed with respect to honey bee colony behavior and landscape effects. The regression analysis yielded information on the start and finish of a colony's daily activity cycle, hive weight change at night, hive weight loss due to departing foragers and weight gain due to returning foragers. Assumptions about the meaning of the timing and size of the morning weight changes were tested in a third study by delaying the forager departure times from one to three hours using screen entrance gates. A regression of planned vs. observed departure delays showed that the initial hive weight loss around dawn was largely due to foragers. In a similar experiment in Australia, hive weight loss due to departing foragers in the morning was correlated with net bee traffic (difference between the number of departing bees and the number of arriving bees) and from those data the payload of the arriving bees was estimated to be 0.02 g. The piecewise regression approach was then used to analyze a fifth study involving hives with and without access to natural forage. The analysis showed that, during a commercial pollination event, hives with previous access to forage had a significantly higher rate of weight gain as the foragers returned in the afternoon, and, in the weeks after the pollination event, a significantly higher rate of weight loss in the morning, as foragers departed. This combination of continuous weight data and piecewise regression proved effective in detecting treatment differences in foraging activity that other methods failed to detect.

  17. Principal component analysis-based pattern analysis of dose-volume histograms and influence on rectal toxicity.

    PubMed

    Söhn, Matthias; Alber, Markus; Yan, Di

    2007-09-01

    The variability of dose-volume histogram (DVH) shapes in a patient population can be quantified using principal component analysis (PCA). We applied this to rectal DVHs of prostate cancer patients and investigated the correlation of the PCA parameters with late bleeding. PCA was applied to the rectal wall DVHs of 262 patients who had been treated with a four-field box conformal adaptive radiotherapy technique. The correlated changes in the DVH pattern were revealed as "eigenmodes," which were ordered by their importance in representing data set variability. Each DVH is uniquely characterized by its principal components (PCs). The correlation of the first three PCs with chronic rectal bleeding of Grade 2 or greater was investigated with uni- and multivariate logistic regression analyses. Rectal wall DVHs in four-field conformal RT can primarily be represented by the first two or three PCs, which describe approximately 94% or 96% of the DVH shape variability, respectively. The first eigenmode models the total irradiated rectal volume; thus, PC1 correlates with the mean dose. Mode 2 describes interpatient differences in the relative rectal volume in the two- or four-field overlap region. Mode 3 reveals correlations of volumes with intermediate doses (approximately 40-45 Gy) and volumes with doses >70 Gy; thus, PC3 is associated with the maximal dose. According to univariate logistic regression analysis, only PC2 correlated significantly with toxicity. However, multivariate logistic regression analysis with the first two or three PCs revealed an increased probability of bleeding for DVHs with more than one large PC. PCA can reveal the correlation structure of DVHs for a patient population as imposed by the treatment technique and provide information about its relationship to toxicity. It proves useful for augmenting normal tissue complication probability modeling approaches.
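
    For readers who want to experiment with the same kind of analysis, a compact sketch follows: PCA of cumulative DVH curves, then logistic regression of a toxicity label on the leading principal components. The DVHs and outcomes are simulated and do not represent the paper's cohort or its fitted model.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      dose_bins = np.linspace(0, 80, 81)            # Gy
      n = 200

      # Simulate cumulative rectal-wall DVHs with patient-specific position and slope
      d50 = rng.normal(45, 6, n)                    # dose at 50% volume
      slope = rng.normal(0.15, 0.03, n)
      dvh = 1.0 / (1.0 + np.exp(slope[:, None] * (dose_bins[None, :] - d50[:, None])))

      # Simulated bleeding outcome loosely tied to the high-dose tail of each DVH
      p = 1.0 / (1.0 + np.exp(-(dvh[:, 70:].mean(axis=1) * 25 - 3)))
      toxicity = rng.binomial(1, p)

      # Each DVH is summarized by its first few principal components
      pca = PCA(n_components=3)
      pcs = pca.fit_transform(dvh)
      print("variance explained by PC1-3:", np.round(pca.explained_variance_ratio_, 3))

      # Multivariate logistic regression of toxicity on the PCs
      clf = LogisticRegression().fit(pcs, toxicity)
      print("logistic regression coefficients for PC1-3:", np.round(clf.coef_[0], 3))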

  18. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.

  19. Prediction of Welded Joint Strength in Plasma Arc Welding: A Comparative Study Using Back-Propagation and Radial Basis Neural Networks

    NASA Astrophysics Data System (ADS)

    Srinivas, Kadivendi; Vundavilli, Pandu R.; Manzoor Hussain, M.; Saiteja, M.

    2016-09-01

    Welding input parameters such as current, gas flow rate, and torch angle play a significant role in determining the mechanical properties of a weld joint. Traditionally, the weld input parameters must be determined for every new welded product to obtain a quality weld joint, which is time consuming. In the present work, the effect of plasma arc welding parameters on mild steel was studied using a neural network approach. To obtain a response equation governing the input-output relationships, conventional regression analysis was also performed. The experimental data were constructed based on a Taguchi design, and the training data required for the neural networks were randomly generated by varying the input variables within their respective ranges. The responses were calculated for each combination of input variables using the response equations obtained through the conventional regression analysis. The performance of a Levenberg-Marquardt back-propagation neural network and a radial basis neural network (RBNN) was compared on various randomly generated test cases different from the training cases. For these test cases, the RBNN gave better results than the feed-forward back-propagation neural network. The RBNN also showed a pattern of increasing performance as the data points moved away from the initial input values.

  20. Guidelines and Procedures for Computing Time-Series Suspended-Sediment Concentrations and Loads from In-Stream Turbidity-Sensor and Streamflow Data

    USGS Publications Warehouse

    Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Douglas; Ziegler, Andrew C.

    2009-01-01

    In-stream continuous turbidity and streamflow data, calibrated with measured suspended-sediment concentration data, can be used to compute a time series of suspended-sediment concentration and load at a stream site. Development of a simple linear (ordinary least squares) regression model for computing suspended-sediment concentrations from instantaneous turbidity data is the first step in the computation process. If the model standard percentage error (MSPE) of the simple linear regression model meets a minimum criterion, this model should be used to compute a time series of suspended-sediment concentrations. Otherwise, a multiple linear regression model using paired instantaneous turbidity and streamflow data is developed and compared to the simple regression model. If the inclusion of the streamflow variable proves to be statistically significant and the uncertainty associated with the multiple regression model results in an improvement over that for the simple linear model, the turbidity-streamflow multiple linear regression model should be used to compute a suspended-sediment concentration time series. The computed concentration time series is subsequently used with its paired streamflow time series to compute suspended-sediment loads by standard U.S. Geological Survey techniques. Once an acceptable regression model is developed, it can be used to compute suspended-sediment concentration beyond the period of record used in model development with proper ongoing collection and analysis of calibration samples. Regression models to compute suspended-sediment concentrations are generally site specific and should never be considered static, but they represent a set period in a continually dynamic system in which additional data will help verify any change in sediment load, type, and source.
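
    The model-selection step described here, starting from a turbidity-only regression and adopting a turbidity-plus-streamflow model only if streamflow is statistically significant and improves the fit, can be sketched with statsmodels. The data are synthetic, and the p-value/adjusted-R² check below is a simple stand-in for the report's MSPE-based criteria.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 150
      turbidity = rng.lognormal(mean=3.0, sigma=0.6, size=n)            # FNU
      streamflow = rng.lognormal(mean=4.0, sigma=0.8, size=n)           # cubic feet per second
      ssc = 1.8 * turbidity + 0.05 * streamflow + rng.normal(0, 8, n)   # mg/L, invented relation

      # Step 1: simple linear regression, SSC ~ turbidity
      simple = sm.OLS(ssc, sm.add_constant(turbidity)).fit()

      # Step 2: multiple linear regression, SSC ~ turbidity + streamflow
      X2 = sm.add_constant(np.column_stack([turbidity, streamflow]))
      multiple = sm.OLS(ssc, X2).fit()

      print(f"simple model adjusted R2 = {simple.rsquared_adj:.3f}")
      print(f"multiple model adjusted R2 = {multiple.rsquared_adj:.3f}, "
            f"streamflow p-value = {multiple.pvalues[2]:.4f}")
      # Adopt the multiple model only if streamflow is significant and the fit clearly improves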

  1. Solid-phase cadmium speciation in soil using L3-edge XANES spectroscopy with partial least-squares regression.

    PubMed

    Siebers, Nina; Kruse, Jens; Eckhardt, Kai-Uwe; Hu, Yongfeng; Leinweber, Peter

    2012-07-01

    Cadmium (Cd) is highly toxic, and resolving its speciation in soil is challenging but essential for estimating environmental risk. In this study, partial least-squares (PLS) regression was tested for its capability to deconvolute Cd L3-edge X-ray absorption near-edge structure (XANES) spectra of multi-compound mixtures. For this, a library of Cd reference compound spectra and a spectrum of a soil sample were acquired. A good coefficient of determination (R²) for Cd compounds in mixtures was obtained for the PLS model using binary and ternary mixtures of various Cd reference compounds, proving the validity of this approach. In order to describe complex systems like soil, multi-compound mixtures of a variety of Cd compounds must be included in the PLS model. The resulting PLS regression model was then applied to a highly Cd-contaminated soil, revealing Cd3(PO4)2 (36.1%), Cd(NO3)2·4H2O (24.5%), Cd(OH)2 (21.7%), CdCO3 (17.1%), and CdCl2 (0.4%). These preliminary results show that PLS regression is a promising approach for directly determining Cd speciation in the solid phase of a soil sample.
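
    A toy version of the deconvolution step: PLS regression trained on linear mixtures of reference spectra and then used to estimate the fractions of each Cd compound in an unknown spectrum. The spectra (Gaussian peaks) and mixing fractions below are synthetic stand-ins; the real analysis uses measured XANES reference spectra.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      energy = np.linspace(0, 1, 200)

      # Synthetic "reference spectra" for five Cd compounds (Gaussian peaks as stand-ins)
      centers = [0.2, 0.35, 0.5, 0.65, 0.8]
      refs = np.array([np.exp(-((energy - c) / 0.05) ** 2) for c in centers])

      # Training set: random linear combinations of the references, plus noise
      fractions = rng.dirichlet(np.ones(5), size=300)       # each row sums to 1
      spectra = fractions @ refs + rng.normal(0, 0.01, (300, energy.size))
      pls = PLSRegression(n_components=5).fit(spectra, fractions)

      # "Unknown" sample with known composition, to check the recovery
      true_frac = np.array([0.36, 0.25, 0.22, 0.17, 0.00])
      unknown = true_frac @ refs + rng.normal(0, 0.01, energy.size)
      print("estimated fractions:", np.round(pls.predict(unknown.reshape(1, -1))[0], 3))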

  2. Factors affecting metacognition of undergraduate nursing students in a blended learning environment.

    PubMed

    Hsu, Li-Ling; Hsieh, Suh-Ing

    2014-06-01

    This paper is a report of a study to examine the influence of demographic, learning involvement and learning performance variables on metacognition of undergraduate nursing students in a blended learning environment. A cross-sectional, correlational survey design was adopted. Ninety-nine students invited to participate in the study were enrolled in a professional nursing ethics course at a public nursing college. The blended learning intervention integrated classroom learning with online learning. Simple linear regression showed significant associations between frequency of online dialogues, the Case Analysis Attitude Scale scores, the Case Analysis Self Evaluation Scale scores, the Blended Learning Satisfaction Scale scores, and Metacognition Scale scores. Multiple linear regression indicated that frequency of online dialogues, the Case Analysis Self Evaluation Scale and the Blended Learning Satisfaction Scale were significant independent predictors of metacognition. Overall, the model accounted for almost half of the variance in metacognition. The blended learning module developed in this study proved successful as a catalyst for the exercise of metacognitive abilities by the sample of nursing students. Learners are able to develop metacognitive ability in comprehension, argumentation, reasoning and various forms of higher order thinking through the blended learning process. © 2013 Wiley Publishing Asia Pty Ltd.

  3. Dietary consumption patterns and laryngeal cancer risk.

    PubMed

    Vlastarakos, Petros V; Vassileiou, Andrianna; Delicha, Evie; Kikidis, Dimitrios; Protopapas, Dimosthenis; Nikolopoulos, Thomas P

    2016-06-01

    We conducted a case-control study to investigate the effect of diet on laryngeal carcinogenesis. Our study population was made up of 140 participants-70 patients with laryngeal cancer (LC) and 70 controls with a non-neoplastic condition that was unrelated to diet, smoking, or alcohol. A food-frequency questionnaire determined the mean consumption of 113 different items during the 3 years prior to symptom onset. Total energy intake and cooking mode were also noted. The relative risk, odds ratio (OR), and 95% confidence interval (CI) were estimated by multiple logistic regression analysis. We found that the total energy intake was significantly higher in the LC group (p < 0.001), and that the difference remained statistically significant after logistic regression analysis (p < 0.001; OR: 118.70). Notably, meat consumption was higher in the LC group (p < 0.001), and the difference remained significant after logistic regression analysis (p = 0.029; OR: 1.16). LC patients also consumed significantly more fried food (p = 0.036); this difference also remained significant in the logistic regression model (p = 0.026; OR: 5.45). The LC group also consumed significantly more seafood (p = 0.012); the difference persisted after logistic regression analysis (p = 0.009; OR: 2.48), with the consumption of shrimp proving detrimental (p = 0.049; OR: 2.18). Finally, the intake of zinc was significantly higher in the LC group before and after logistic regression analysis (p = 0.034 and p = 0.011; OR: 30.15, respectively). Cereal consumption (including pastas) was also higher among the LC patients (p = 0.043), with logistic regression analysis showing that their negative effect was possibly associated with the sauces and dressings that traditionally accompany pasta dishes (p = 0.006; OR: 4.78). Conversely, a higher consumption of dairy products was found in controls (p < 0.05); logistic regression analysis showed that calcium appeared to be protective at the micronutrient level (p < 0.001; OR: 0.27). We found no difference in the overall consumption of fruits and vegetables between the LC patients and controls; however, the LC patients did have a greater consumption of cooked tomatoes and cooked root vegetables (p = 0.039 for both), and the controls had more consumption of leeks (p = 0.042) and, among controls younger than 65 years, cooked beans (p = 0.037). Lemon (p = 0.037), squeezed fruit juice (p = 0.032), and watermelon (p = 0.018) were also more frequently consumed by the controls. Other differences at the micronutrient level included greater consumption by the LC patients of retinol (p = 0.044), polyunsaturated fats (p = 0.041), and linoleic acid (p = 0.008); LC patients younger than 65 years also had greater intake of riboflavin (p = 0.045). We conclude that the differences in dietary consumption patterns between LC patients and controls indicate a possible role for lifestyle modifications involving nutritional factors as a means of decreasing the risk of laryngeal cancer.

  4. Regression-Based Approach For Feature Selection In Classification Issues. Application To Breast Cancer Detection And Recurrence

    NASA Astrophysics Data System (ADS)

    Belciug, Smaranda; Serbanescu, Mircea-Sebastian

    2015-09-01

    Feature selection is considered a key factor in classification and decision problems. It is currently used in designing intelligent decision systems to choose the features that allow the best performance. This paper proposes a regression-based approach to select the most important predictors in order to significantly increase classification performance. Application to breast cancer detection and recurrence using publicly available datasets demonstrated the efficiency of this technique.

  5. Evaluation of ocular movements in patients with dyslexia.

    PubMed

    Vagge, Aldo; Cavanna, Margherita; Traverso, Carlo Enrico; Iester, Michele

    2015-04-01

    The aims of this study were to analyze the relationship between dyslexia and eye movements and to assess whether this method can be added to the workup of dyslexic patients. The sample comprised 11 children with a diagnosis of dyslexia and 11 normal readers, all between 8 and 13 years of age. All subjects underwent orthoptic evaluation, ophthalmological examination, and eye movement analysis, specifically, stability analysis while fixating a still target, tracking saccades, analysis of fixation pauses, reading speed, and saccades and regressions during the reading of a text. Stability analysis while fixating a still target showed a significant (p < 0.001) difference between the two groups, with a greater loss of fixation among dyslexic subjects (5.36 ± 2.5 s versus 0.82 ± 2.1 s). Tracking saccades (left and right horizontal axis) did not show a significant difference. When reading parameters were examined (number of saccades, number of regressions, and reading time for a text), a significant (p < 0.001) difference was found between the groups. This study supports the view that the alteration of eye movement does not depend on oculo-motor dysfunction but is secondary to a defect in the visual processing of linguistic material. Including an assessment of this defect might prove beneficial in detecting dyslexia at a younger age, so that an earlier intervention could be initiated.

  6. Predicting The Type Of Pregnancy Using Flexible Discriminate Analysis And Artificial Neural Networks: A Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hooman, A.; Mohammadzadeh, M

    Some medical and epidemiological surveys have been designed to predict a nominal response variable with several levels. With regard to the type of pregnancy, there are four possible states: wanted, unwanted by the wife, unwanted by the husband, and unwanted by the couple. In this paper, we predicted the type of pregnancy, as well as the factors influencing it, using three different models and comparing them. Because the type of pregnancy has several levels, we developed a multinomial logistic regression, a neural network, and a flexible discriminant model based on the data and compared their results using two statistical indices: the area under the ROC curve and the kappa coefficient. Based on these two indices, flexible discriminant analysis proved to be a better fit for prediction on these data than the other methods. When the relations among variables are complex, one can use flexible discriminant analysis instead of multinomial logistic regression or a neural network to predict nominal response variables with several levels and obtain more accurate predictions.

  7. Study on grain quality forecasting method and indicators by using hyperspectral data in wheat

    NASA Astrophysics Data System (ADS)

    Huang, Wenjiang; Wang, Jihua; Liu, Liangyun; Wang, Zhijie; Tan, Changwei; Song, Xiaoyu; Wang, Jingdi

    2005-01-01

    Field experiments were conducted to examine the influence of cultivar, nitrogen application, and irrigation on grain protein content, gluten content, and grain hardness in three winter wheat cultivars under four levels of nitrogen and irrigation treatments. First, the influence of cultivar and environmental factors on grain quality was studied; the effective factors were cultivar, irrigation, and fertilization. Second, total nitrogen content around the winter wheat anthesis stage proved to be significantly correlated with grain protein content, and spectral vegetation indices significantly correlated with total nitrogen content around the anthesis stage were potential indicators of grain protein content. Accumulation of total nitrogen and its transfer to the grain is the physical link that produces the final grain protein, and total nitrogen content at the anthesis stage proved to be an indicator of final grain protein content. The selected normalized photochemical reflectance index (NPRI) proved able to predict grain protein content, based on the close correlation between the ratio of total carotenoid to chlorophyll a and total nitrogen content. The method contributes towards developing optimal procedures for predicting wheat grain quality through analysis of the canopy-reflected spectrum at the anthesis stage. Regression equations were established for forecasting grain protein and dry gluten content from total nitrogen content at the anthesis stage, so forecasting grain quality by establishing correlation equations between biochemical constituents and the canopy-reflected spectrum is feasible.

  8. Compressive strength of human openwedges: a selection method

    NASA Astrophysics Data System (ADS)

    Follet, H.; Gotteland, M.; Bardonnet, R.; Sfarghiu, A. M.; Peyrot, J.; Rumelhart, C.

    2004-02-01

    A series of 44 samples of bone wedges of human origin, intended for allograft openwedge osteotomy and obtained without particular precautions during hip arthroplasty, was re-examined. After chemical treatment for viral inactivation, lyophilisation, and radio-sterilisation (intended to produce optimal health safety), the compressive strength, independent of age, sex, and the height of the sample (or angle of cut), proved to be too widely dispersed [10-158 MPa] in the first study. We propose a method for selecting samples which takes into account their geometry (width, length, thicknesses, cortical surface area). Statistical methods (Principal Components Analysis (PCA), Hierarchical Cluster Analysis, and multilinear regression) allowed final selection of 29 samples with a mean compressive strength σmax = 103 ± 26 MPa and a range of [61-158 MPa]. These results are equivalent to or greater than those of the materials currently used in openwedge osteotomy.

  9. Effect of molecular parameters on the binding of phenoxyacetic acid derivatives to albumins.

    PubMed

    Cserháti, T; Forgács, E; Deyl, Z; Miksík, I

    2001-03-25

    The interaction of 12 phenoxyacetic acid derivatives with human and bovine serum albumin, as well as with egg albumin, was studied by charge-transfer reversed-phase (RP) thin-layer chromatography (TLC), and the relative strength of interaction was calculated. Each phenoxyacetic acid derivative interacted with human and bovine serum albumins, whereas no interaction was observed with egg albumin. Stepwise regression analysis showed that the lipophilicity of the derivatives exerts a significant impact on their capacity to bind to serum albumins. This result supports the hypothesis that the binding of phenoxyacetic acid derivatives to albumins may involve hydrophobic forces between the corresponding apolar substructures of these derivatives and the amino acid side chains.

  10. The effect of telehealth systems and satisfaction with health expenditure among patients with metabolic syndrome.

    PubMed

    Uei, Shu-Lin; Tsai, Chung-Hung; Kuo, Yu-Ming

    2016-04-29

    Telehealth cost analysis has become a crucial issue for governments in recent years. In this study, we examined cases of metabolic syndrome in Hualien County, Taiwan. This research adopted the framework proposed by Marchand to establish a study process. In addition, descriptive statistics, a t test, analysis of variance, and regression analysis were employed to analyze 100 questionnaires. The results of the t test revealed significant differences in medical health expenditure, number of clinical visits for medical treatment, average amount of time spent commuting to clinics, amount of time spent undergoing medical treatment, and average number of people accompanying patients to medical care facilities or assisting with other tasks in the past month, indicating that offering telehealth care services can reduce health expenditure. The statistical analysis results revealed that customer satisfaction has a positive effect on reducing health expenditure. Therefore, this study proves that telehealth care systems can effectively reduce health expenditure and directly improve customer satisfaction with medical treatment.

  11. Validity and reliability of dental age estimation of teeth root translucency based on digital luminance determination.

    PubMed

    Ramsthaler, Frank; Kettner, Mattias; Verhoff, Marcel A

    2014-01-01

    In forensic anthropological casework, estimating age-at-death is key to profiling unknown skeletal remains. The aim of this study was to examine the reliability of a new, simple, fast, and inexpensive digital odontological method for age-at-death estimation. The method is based on the original Lamendin method, which is a widely used technique in the repertoire of odontological aging methods in forensic anthropology. We examined 129 single root teeth employing a digital camera and imaging software for the measurement of the luminance of the teeth's translucent root zone. Variability in luminance detection was evaluated using statistical technical error of measurement analysis. The method revealed stable values largely unrelated to observer experience, whereas requisite formulas proved to be camera-specific and should therefore be generated for an individual recording setting based on samples of known chronological age. Multiple regression analysis showed a highly significant influence of the coefficients of the variables "arithmetic mean" and "standard deviation" of luminance for the regression formula. For the use of this primary multivariate equation for age-at-death estimation in casework, a standard error of the estimate of 6.51 years was calculated. Step-by-step reduction of the number of embedded variables to linear regression analysis employing the best contributor "arithmetic mean" of luminance yielded a regression equation with a standard error of 6.72 years (p < 0.001). The results of this study not only support the premise of root translucency as an age-related phenomenon, but also demonstrate that translucency reflects a number of other influencing factors in addition to age. This new digital measuring technique of the zone of dental root luminance can broaden the array of methods available for estimating chronological age, and furthermore facilitate measurement and age classification due to its low dependence on observer experience.

  12. Quantification and regionalization of groundwater recharge in South-Central Kansas: Integrating field characterization, statistical analysis, and GIS

    USGS Publications Warehouse

    Sophocleous, M.

    2000-01-01

    A practical methodology for recharge characterization was developed based on several years of field-oriented research at 10 sites in the Great Bend Prairie of south-central Kansas. This methodology combines the soil-water budget on a storm-by-storm year-round basis with the resulting watertable rises. The estimated 1985-1992 average annual recharge was less than 50 mm/year, with a range from 15 mm/year (during the 1998 drought) to 178 mm/year (during the 1993 flood year). Most of this recharge occurs during the spring months. To regionalize these site-specific estimates, an additional methodology based on multiple (forward) regression analysis combined with classification and GIS overlay analyses was developed and implemented. The multiple regression analysis showed that the most influential variables were, in order of decreasing importance, total annual precipitation, average maximum springtime soil-profile water storage, average shallowest springtime depth to watertable, and average springtime precipitation rate. Therefore, four GIS (ARC/INFO) data "layers" or coverages were constructed for the study region based on these four variables, and each such coverage was classified into the same number of data classes to avoid biasing the results. The normalized regression coefficients were employed to weigh the class rankings of each recharge-affecting variable. This approach resulted in recharge zonations that agreed well with the site recharge estimates. During the "Great Flood of 1993," when rainfall totals exceeded normal levels by approximately 200% in the northern portion of the study region, the developed regionalization methodology was tested against such extreme conditions, and proved to be both practical, based on readily available or easily measurable data, and robust. It was concluded that the combination of multiple regression and GIS overlay analyses is a powerful and practical approach to regionalizing small samples of recharge estimates.
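
    The regionalization step, weighting the class rank of each recharge-controlling variable by its normalized regression coefficient and summing across GIS layers, reduces to a weighted sum per map cell. The small rasters and weights below are invented for illustration; in the study the weights come from the forward multiple regression.

      import numpy as np

      # Invented class ranks (1 = lowest, 5 = highest) for four raster layers on a 4x4 grid
      precip_rank     = np.array([[5, 5, 4, 3], [4, 4, 3, 3], [3, 3, 2, 2], [2, 2, 1, 1]])
      soilstore_rank  = np.array([[3, 4, 4, 5], [3, 3, 4, 4], [2, 3, 3, 4], [2, 2, 3, 3]])
      watertable_rank = np.array([[4, 4, 3, 2], [4, 3, 3, 2], [3, 3, 2, 2], [3, 2, 2, 1]])
      springrate_rank = np.array([[2, 3, 3, 4], [2, 2, 3, 3], [1, 2, 2, 3], [1, 1, 2, 2]])

      # Hypothetical normalized regression coefficients, ordered by importance
      weights = {"precip": 0.45, "soilstore": 0.25, "watertable": 0.20, "springrate": 0.10}

      recharge_score = (weights["precip"] * precip_rank
                        + weights["soilstore"] * soilstore_rank
                        + weights["watertable"] * watertable_rank
                        + weights["springrate"] * springrate_rank)

      # Reclassify the weighted score into recharge zones (0 = low, 1 = moderate, 2 = high)
      zones = np.digitize(recharge_score, bins=[2.5, 3.5])
      print(recharge_score.round(2))
      print(zones)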

  13. Automated particle identification through regression analysis of size, shape and colour

    NASA Astrophysics Data System (ADS)

    Rodriguez Luna, J. C.; Cooper, J. M.; Neale, S. L.

    2016-04-01

    Rapid point-of-care diagnostic tests and tests to provide therapeutic information are now available for a range of specific conditions, from the measurement of blood glucose levels for diabetes to card agglutination tests for parasitic infections. Due to a lack of specificity, these tests are often backed up by more conventional lab-based diagnostic methods; for example, a card agglutination test may be carried out for a suspected parasitic infection in the field and, if positive, a blood sample can then be sent to a lab for confirmation. The eventual diagnosis is often achieved by microscopic examination of the sample. In this paper we propose a computerized vision system for aiding the diagnostic process; this system uses a novel particle recognition algorithm to improve specificity and speed during diagnosis. We show the detection and classification of different types of cells in a diluted blood sample using regression analysis of their size, shape and colour. The first step is to define the objects to be tracked, using a Gaussian Mixture Model for background subtraction and binary opening and closing for noise suppression. After separating the objects of interest from the background, the next challenge is to predict whether a given object belongs to a certain category or not. This is a classification problem, and the output of the algorithm is a Boolean value (true/false); the program should be able to "predict" with a reasonable level of confidence whether a given particle belongs to the kind we are looking for. We show the use of binary logistic regression analysis with three continuous predictors: size, shape and colour histogram. The results suggest these variables could be very useful in a logistic regression equation, as they proved to have relatively high predictive value on their own.
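
    A stripped-down version of the classification stage described above: binary logistic regression with three continuous predictors (size, shape, and a colour summary). The particle measurements are simulated here; in the real pipeline they would be extracted from images after background subtraction and morphological cleaning.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n = 400
      # Simulated features: area (px), circularity (0-1), mean hue from a colour histogram
      target = np.r_[np.ones(n // 2, dtype=int), np.zeros(n // 2, dtype=int)]
      area = np.where(target == 1, rng.normal(55, 6, n), rng.normal(80, 15, n))
      circularity = np.where(target == 1, rng.normal(0.85, 0.05, n), rng.normal(0.60, 0.12, n))
      hue = np.where(target == 1, rng.normal(0.75, 0.05, n), rng.normal(0.45, 0.15, n))
      X = np.column_stack([area, circularity, hue])

      X_tr, X_te, y_tr, y_te = train_test_split(X, target, test_size=0.3, random_state=0)
      clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

      # Boolean decision per particle: does it belong to the cell type of interest?
      print("held-out accuracy:", clf.score(X_te, y_te))
      print("first five test decisions:", clf.predict(X_te[:5]).astype(bool))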

  14. Multiple Linear Regression for Reconstruction of Gene Regulatory Networks in Solving Cascade Error Problems

    PubMed Central

    Zainudin, Suhaila; Arif, Shereena M.

    2017-01-01

    Gene regulatory network (GRN) reconstruction is the process of identifying regulatory gene interactions from experimental data through computational analysis. One of the main reasons for the reduced performance of previous GRN methods has been the inaccurate prediction of cascade motifs. Cascade error is defined as the wrong prediction of cascade motifs, where an indirect interaction is misinterpreted as a direct interaction. Despite the active research on various GRN prediction methods, the discussion of specific methods to solve problems related to cascade errors is still lacking. In fact, the experiments conducted in past studies were not specifically geared towards proving the ability of GRN prediction methods to avoid the occurrence of cascade errors. Hence, this research aims to propose Multiple Linear Regression (MLR) to infer GRN from gene expression data and to avoid wrongly inferring an indirect interaction (A → B → C) as a direct interaction (A → C). Since the number of observations in the real experimental datasets was far less than the number of predictors, some predictors were eliminated by extracting random subnetworks from global interaction networks via an established extraction method. In addition, the experiment was extended to assess the effectiveness of MLR in dealing with cascade error by using a novel experimental procedure proposed in this work. The experiment revealed that the number of cascade errors was very minimal. Apart from that, the Belsley collinearity test proved that multicollinearity did affect the datasets used in this experiment greatly. All the tested subnetworks obtained satisfactory results, with AUROC values above 0.5. PMID:28250767
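
    A toy illustration, assuming a three-gene cascade with made-up expression profiles, of why joint multiple linear regression can avoid reporting the indirect A → C path as a direct edge. This is only the underlying principle, not the paper's subnetwork extraction or evaluation procedure.

        # Minimal sketch (toy data): regress target gene C on both candidate regulators jointly.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(2)
        n_samples = 50
        A = rng.normal(size=n_samples)
        B = 0.9 * A + 0.1 * rng.normal(size=n_samples)   # A -> B (direct)
        C = 0.8 * B + 0.1 * rng.normal(size=n_samples)   # B -> C (direct); A -> C only indirect

        X = np.column_stack([A, B])
        coef = LinearRegression().fit(X, C).coef_
        print(dict(zip(["A->C", "B->C"], coef.round(2))))
        # With B included in the model, the coefficient for A stays near zero, so the
        # indirect A -> C path is not reported as a direct edge (cascade error avoided).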

  15. Simulation of urban land surface temperature based on sub-pixel land cover in a coastal city

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaofeng; Deng, Lei; Feng, Huihui; Zhao, Yanchuang

    2014-11-01

    Sub-pixel urban land cover has been proved to have obvious correlations with land surface temperature (LST). Yet these relationships have seldom been used to simulate LST. In this study we provide a new approach to urban LST simulation based on sub-pixel land cover modeling. Landsat TM/ETM+ images of Xiamen city, China from January 2002 and January 2007 were used to acquire land cover and then to extract the transformation rules using logistic regression. The transformation probability, after normalization, was taken as the class percentage within each pixel. Cellular automata were then used to acquire simulated sub-pixel land cover for 2007 and 2017. In parallel, the correlations between retrieved LST and the 2002 sub-pixel land cover obtained by spectral mixture analysis were examined and a regression model was built. This regression model was then applied to the simulated 2007 land cover to model the LST of 2007. Finally, the LST of 2017 was simulated for urban planning and management. The results showed that our method is useful for LST simulation. Although the simulation accuracy is not yet fully satisfactory, it provides an important idea and a good start in the modeling of urban LST.

  16. Stability indicating high performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in combined dosage form

    PubMed Central

    Bageshwar, Deepak; Khanvilkar, Vineeta; Kadam, Vilasrao

    2011-01-01

    A specific, precise and stability indicating high-performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in pharmaceutical formulations was developed and validated. The method employed TLC aluminium plates precoated with silica gel 60F254 as the stationary phase. The solvent system consisted of methanol:water:ammonium acetate; 4.0:1.0:0.5 (v/v/v). This system was found to give compact and dense spots for both itopride hydrochloride (Rf value of 0.55±0.02) and pantoprazole sodium (Rf value of 0.85±0.04). Densitometric analysis of both drugs was carried out in the reflectance–absorbance mode at 289 nm. The linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9988±0.0012 in the concentration range of 100–400 ng for pantoprazole sodium. Also, the linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9990±0.0008 in the concentration range of 200–1200 ng for itopride hydrochloride. The method was validated for specificity, precision, robustness and recovery. Statistical analysis proves that the method is repeatable and selective for the estimation of both the said drugs. As the method could effectively separate the drug from its degradation products, it can be employed as a stability indicating method. PMID:29403710

  17. Stability indicating high performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in combined dosage form.

    PubMed

    Bageshwar, Deepak; Khanvilkar, Vineeta; Kadam, Vilasrao

    2011-11-01

    A specific, precise and stability indicating high-performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in pharmaceutical formulations was developed and validated. The method employed TLC aluminium plates precoated with silica gel 60F254 as the stationary phase. The solvent system consisted of methanol:water:ammonium acetate; 4.0:1.0:0.5 (v/v/v). This system was found to give compact and dense spots for both itopride hydrochloride (Rf value of 0.55±0.02) and pantoprazole sodium (Rf value of 0.85±0.04). Densitometric analysis of both drugs was carried out in the reflectance-absorbance mode at 289 nm. The linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9988±0.0012 in the concentration range of 100-400 ng for pantoprazole sodium. Also, the linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9990±0.0008 in the concentration range of 200-1200 ng for itopride hydrochloride. The method was validated for specificity, precision, robustness and recovery. Statistical analysis proves that the method is repeatable and selective for the estimation of both the said drugs. As the method could effectively separate the drug from its degradation products, it can be employed as a stability indicating method.

  18. Systolic time interval v heart rate regression equations using atropine: reproducibility studies.

    PubMed Central

    Kelman, A W; Sumner, D J; Whiting, B

    1981-01-01

    1. Systolic time intervals (STI) were recorded in six normal male subjects over a period of 3 weeks. On one day per week, each subject received incremental doses of atropine intravenously to increase heart rate, allowing the determination of individual STI v HR regression equations. On the other days STI were recorded with the subjects resting, in the supine position. 2. There were highly significant regression relationships between heart rate and both LVET and QS2, but not between heart rate and PEP. 3. The regression relationships showed little intra-subject variability, but a large degree of inter-subject variability: they proved adequate to correct the STI for the daily fluctuations in heart rate. 4. Administration of small doses of atropine intravenously provides a satisfactory and convenient method of deriving individual STI v HR regression equations which can be applied over a period of weeks. PMID:7248136

  19. Systolic time interval v heart rate regression equations using atropine: reproducibility studies.

    PubMed

    Kelman, A W; Sumner, D J; Whiting, B

    1981-07-01

    1. Systolic time intervals (STI) were recorded in six normal male subjects over a period of 3 weeks. On one day per week, each subject received incremental doses of atropine intravenously to increase heart rate, allowing the determination of individual STI v HR regression equations. On the other days STI were recorded with the subjects resting, in the supine position. 2. There were highly significant regression relationships between heart rate and both LVET and QS2, but not between heart rate and PEP. 3. The regression relationships showed little intra-subject variability, but a large degree of inter-subject variability: they proved adequate to correct the STI for the daily fluctuations in heart rate. 4. Administration of small doses of atropine intravenously provides a satisfactory and convenient method of deriving individual STI v HR regression equations which can be applied over a period of weeks.

  20. [Results of the ocular hypertension treatment study and the confocal scanning laser ophthalmoscopy ancillary study and evaluation of the heidelberg retina tomograph].

    PubMed

    Klatt, K; Schmidt, E; Scheuerle, A F

    2008-04-01

    The Ocular Hypertension Treatment Study (OHTS) has shown that analyzing changes of the optic disc configuration is superior to evaluating visual field findings for the early detection of primary open angle glaucoma. The Confocal Scanning Laser Ophthalmoscopy Ancillary Study (CSLO) is the first study to reveal that certain topographic baseline measurements of the optic disc are significantly associated with the development of primary open angle glaucoma in patients with ocular hypertension. An abnormally increased "mean height contour" value proved to be the individual parameter connected with the highest risk. The reliability of the Moorfields Regression Analysis of certain individual sectors for the early detection of primary open angle glaucoma is higher than that of the global measurement. The temporal superior and inferior as well as the nasal inferior sectors have the highest positive predictive values and the largest risks in both univariate and multivariate analysis.

  1. Analysis of cohort studies with multivariate and partially observed disease classification data.

    PubMed

    Chatterjee, Nilanjan; Sinha, Samiran; Diver, W Ryan; Feigelson, Heather Spencer

    2010-09-01

    Complex diseases like cancers can often be classified into subtypes using various pathological and molecular traits of the disease. In this article, we develop methods for analysis of disease incidence in cohort studies incorporating data on multiple disease traits using a two-stage semiparametric Cox proportional hazards regression model that allows one to examine the heterogeneity in the effect of the covariates by the levels of the different disease traits. For inference in the presence of missing disease traits, we propose a generalization of an estimating equation approach for handling missing cause of failure in competing-risk data. We prove asymptotic unbiasedness of the estimating equation method under a general missing-at-random assumption and propose a novel influence-function-based sandwich variance estimator. The methods are illustrated using simulation studies and a real data application involving the Cancer Prevention Study II nutrition cohort.

  2. Nomogram Prediction of Overall Survival After Curative Irradiation for Uterine Cervical Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, YoungSeok; Yoo, Seong Yul; Kim, Mi-Sook

    Purpose: The purpose of this study was to develop a nomogram capable of predicting the probability of 5-year survival after radical radiotherapy (RT) without chemotherapy for uterine cervical cancer. Methods and Materials: We retrospectively analyzed 549 patients that underwent radical RT for uterine cervical cancer between March 1994 and April 2002 at our institution. Multivariate analysis using Cox proportional hazards regression was performed and this Cox model was used as the basis for the devised nomogram. The model was internally validated for discrimination and calibration by bootstrap resampling. Results: By multivariate regression analysis, the model showed that age, hemoglobin level before RT, Federation Internationale de Gynecologie Obstetrique (FIGO) stage, maximal tumor diameter, lymph node status, and RT dose at Point A significantly predicted overall survival. The survival prediction model demonstrated good calibration and discrimination. The bootstrap-corrected concordance index was 0.67. The predictive ability of the nomogram proved to be superior to FIGO stage (p = 0.01). Conclusions: The devised nomogram offers a significantly better level of discrimination than the FIGO staging system. In particular, it improves predictions of survival probability and could be useful for counseling patients, choosing treatment modalities and schedules, and designing clinical trials. However, before this nomogram is used clinically, it should be externally validated.

  3. The quantitative structure-insecticidal activity relationships from plant derived compounds against chikungunya and zika Aedes aegypti (Diptera:Culicidae) vector.

    PubMed

    Saavedra, Laura M; Romanelli, Gustavo P; Rozo, Ciro E; Duchowicz, Pablo R

    2018-01-01

    The insecticidal activity of a series of 62 plant derived molecules against the chikungunya, dengue and zika vector, the Aedes aegypti (Diptera:Culicidae) mosquito, is subjected to a Quantitative Structure-Activity Relationships (QSAR) analysis. The Replacement Method (RM) variable subset selection technique based on Multivariable Linear Regression (MLR) proves to be successful for exploring 4885 molecular descriptors calculated with Dragon 6. The predictive capability of the obtained models is confirmed through an external test set of compounds, Leave-One-Out (LOO) cross-validation and Y-Randomization. The present study constitutes a first necessary computational step for designing less toxic insecticides. Copyright © 2017 Elsevier B.V. All rights reserved.
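
    A hedged sketch of the validation scheme mentioned above: a multiple linear regression QSAR model checked by leave-one-out cross-validation. Random numbers stand in for the Dragon descriptors and activities, and the Replacement Method descriptor selection is not reproduced; only the MLR fit and the LOO Q² calculation are shown.

        # Hedged sketch (random data): MLR QSAR model with leave-one-out cross-validation.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        rng = np.random.default_rng(3)
        n_compounds, n_descriptors = 62, 4          # a small, already-selected descriptor subset
        X = rng.normal(size=(n_compounds, n_descriptors))
        y = X @ np.array([1.0, -0.5, 0.8, 0.3]) + 0.2 * rng.normal(size=n_compounds)

        model = LinearRegression()
        y_loo = cross_val_predict(model, X, y, cv=LeaveOneOut())
        press = np.sum((y - y_loo) ** 2)
        q2 = 1 - press / np.sum((y - y.mean()) ** 2)
        print(f"Q^2 (LOO) = {q2:.3f}")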

  4. The mediating effect of calling on the relationship between medical school students' academic burnout and empathy.

    PubMed

    Chae, Su Jin; Jeong, So Mi; Chung, Yoon-Sok

    2017-09-01

    This study is aimed at identifying the relationships between medical school students' academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. A mixed-methods study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students' empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For the qualitative approach, eight medical students participated in a focus group interview. The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. This result demonstrates that calling is a key variable that mediates the relationship between medical students' academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students' empathy skills.

  5. Real time study of amalgam formation and mercury adsorption on thin gold film by total internal reflection ellipsometry

    NASA Astrophysics Data System (ADS)

    Paulauskas, A.; Selskis, A.; Bukauskas, V.; Vaicikauskas, V.; Ramanavicius, A.; Balevicius, Z.

    2018-01-01

    Total internal reflection ellipsometry (TIRE) was utilized in its dynamic data acquisition mode to reveal the percentage of mercury present in an amalgam surface layer. In determining the optical constants of the amalgam film, the non-homogeneities of the formed surface layer were taken into account. The composition of the amalgam layer by percentage was determined using the Bruggemann effective medium approximation (EMA) model for the analysis of the TIRE data. Regression results showed that the amalgam layer consisted of 16.00 ± 0.43% mercury and 84.00 ± 0.43% gold. This real-time TIRE analysis showed that the method can detect 0.6 ± 0.4% of mercury on a gold surface, proving it to be a suitable optical technique for obtaining real-time readouts. Structural analysis by SEM and AFM showed that the amalgam layer had a dendritic structure, whose formation was determined by the weak adhesion of the gold atoms onto its surface.

  6. Exsanguinated blood volume estimation using fractal analysis of digital images.

    PubMed

    Sant, Sonia P; Fairgrieve, Scott I

    2012-05-01

    The estimation of bloodstain volume using fractal analysis of digital images of passive blood stains is presented. Binary digital photos of bloodstains of known volumes (ranging from 1 to 7 mL), dispersed in a defined area, were subjected to image analysis using FracLac V. 2.0 for ImageJ. The box-counting method was used to generate a fractal dimension for each trial. A positive correlation between the generated fractal number and the volume of blood was found (R(2) = 0.99). Regression equations were produced to estimate the volume of blood in blind trials. An error rate ranging from 78% for 1 mL to 7% for 6 mL demonstrated that as the volume increases so does the accuracy of the volume estimation. This method used in the preliminary study proved that bloodstain patterns may be deconstructed into mathematical parameters, thus removing the subjective element inherent in other methods of volume estimation. © 2012 American Academy of Forensic Sciences.
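
    An illustrative sketch of the two steps named above: a box-counting estimate of the fractal dimension of a binary stain image, followed by a linear calibration of known volumes against dimension. The synthetic mask, the reference dimensions and the calibration numbers are all made up; the sketch shows the procedure, not the study's data.

        # Illustrative sketch (synthetic mask): box-counting dimension plus linear calibration.
        import numpy as np

        def box_count_dimension(mask, sizes=(2, 4, 8, 16, 32)):
            counts = []
            for s in sizes:
                h, w = mask.shape
                # count boxes of side s that contain at least one foreground pixel
                boxes = mask[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
                counts.append(np.count_nonzero(boxes.any(axis=(1, 3))))
            # slope of log(count) vs log(1/size) is the box-counting dimension
            slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(4)
        mask = rng.random((128, 128)) < 0.2           # stand-in for a thresholded stain photo
        print("fractal dimension:", round(box_count_dimension(mask), 2))

        # Calibration: regress known volumes (mL) on fractal dimensions of reference stains.
        dims = np.array([1.45, 1.52, 1.60, 1.66, 1.71, 1.75, 1.78])   # hypothetical
        vols = np.arange(1.0, 8.0)                                    # 1..7 mL
        a, b = np.polyfit(dims, vols, 1)
        print("estimated volume for D=1.63:", round(a * 1.63 + b, 2), "mL")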

  7. River flow prediction using hybrid models of support vector regression with the wavelet transform, singular spectrum analysis and chaotic approach

    NASA Astrophysics Data System (ADS)

    Baydaroğlu, Özlem; Koçak, Kasım; Duran, Kemal

    2018-06-01

    Prediction of water amount that will enter the reservoirs in the following month is of vital importance especially for semi-arid countries like Turkey. Climate projections emphasize that water scarcity will be one of the serious problems in the future. This study presents a methodology for predicting river flow for the subsequent month based on the time series of observed monthly river flow with hybrid models of support vector regression (SVR). Monthly river flow over the period 1940-2012 observed for the Kızılırmak River in Turkey has been used for training the method, which then has been applied for predictions over a period of 3 years. SVR is a specific implementation of support vector machines (SVMs), which transforms the observed input data time series into a high-dimensional feature space (input matrix) by way of a kernel function and performs a linear regression in this space. SVR requires a special input matrix. The input matrix was produced by wavelet transforms (WT), singular spectrum analysis (SSA), and a chaotic approach (CA) applied to the input time series. WT convolutes the original time series into a series of wavelets, and SSA decomposes the time series into a trend, an oscillatory and a noise component by singular value decomposition. CA uses a phase space formed by trajectories, which represent the dynamics producing the time series. These three methods for producing the input matrix for the SVR proved successful, while the SVR-WT combination resulted in the highest coefficient of determination and the lowest mean absolute error.
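
    A minimal sketch of the core idea above: building an input matrix from a monthly flow series and fitting support vector regression to predict the next month's flow. A simple lag embedding stands in for the wavelet, SSA or phase-space preprocessing; the series, embedding dimension and SVR parameters are assumptions.

        # Minimal sketch (synthetic series): lagged input matrix plus SVR for one-month-ahead flow.
        import numpy as np
        from sklearn.svm import SVR
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(5)
        t = np.arange(360)                              # 30 years of monthly values
        flow = 100 + 40 * np.sin(2 * np.pi * t / 12) + 10 * rng.normal(size=t.size)

        lags = 12                                       # embedding dimension (assumed)
        X = np.column_stack([flow[i:i + len(flow) - lags] for i in range(lags)])
        y = flow[lags:]                                 # next-month target

        split = len(y) - 36                             # hold out the last 3 years
        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
        model.fit(X[:split], y[:split])
        print("R^2 on hold-out:", round(model.score(X[split:], y[split:]), 3))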

  8. Optimal Wavelength Selection on Hyperspectral Data with Fused Lasso for Biomass Estimation of Tropical Rain Forest

    NASA Astrophysics Data System (ADS)

    Takayama, T.; Iwasaki, A.

    2016-06-01

    Above-ground biomass prediction of tropical rain forest using remote sensing data is of paramount importance to continuous large-area forest monitoring. Hyperspectral data can provide rich spectral information for biomass prediction; however, the prediction accuracy is affected by the small-sample-size problem, which commonly manifests as overfitting when using high-dimensional data in which the number of training samples is smaller than the dimensionality of the samples, owing to the time, cost, and human resources required for field surveys. A common approach to addressing this problem is reducing the dimensionality of the dataset. In addition, acquired hyperspectral data usually have a low signal-to-noise ratio due to the narrow bandwidths, and exhibit local or global shifts of peaks due to instrumental instability or small differences in practical measurement conditions. In this work, we propose a methodology based on fused lasso regression that selects optimal bands for the biomass prediction model by encouraging sparsity and grouping: the sparsity addresses the small-sample-size problem through dimensionality reduction, and the grouping addresses the noise and peak-shift problems. The prediction model provided higher accuracy, with a root-mean-square error (RMSE) of 66.16 t/ha in cross-validation, than the other methods tested (multiple linear regression, partial least squares regression, and lasso regression). Furthermore, fusion of spectral and spatial information derived from a texture index increased the prediction accuracy further, with an RMSE of 62.62 t/ha. This analysis proves the efficiency of the fused lasso and image texture in biomass estimation of tropical forests.
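
    For reference, the standard fused lasso objective (in its generic form, not necessarily the exact formulation or tuning used in the study) can be written as

        \hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{n}\Big(y_i - \sum_{j=1}^{p} x_{ij}\beta_j\Big)^{2}
                      + \lambda_1 \sum_{j=1}^{p}\lvert\beta_j\rvert
                      + \lambda_2 \sum_{j=2}^{p}\lvert\beta_j - \beta_{j-1}\rvert

    where y_i is the biomass of plot i and x_ij the reflectance in band j. The first penalty (weight lambda_1) enforces sparsity, i.e. band selection, while the second (weight lambda_2) penalizes differences between coefficients of neighbouring bands, i.e. the grouping that stabilizes the model against noise and small peak shifts.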

  9. Identification of factors affecting birth rate in Czech Republic

    NASA Astrophysics Data System (ADS)

    Zámková, Martina; Blašková, Veronika

    2013-10-01

    This article is concerned with identifying, primarily, the economic factors that affect birth rates in the Czech Republic. To find the relationship between these quantities, we used multivariate regression analysis, and for modeling we used time series of annual values (1994-2011) of both economic indicators and demographic indicators. Due to potential problems with spurious (apparent) dependence, we first differenced all series obtained from the Czech Statistical Office. It is clear from the final model, which meets all assumptions, that there is a positive correlation between birth rates and the financial situation of households. We described the financial situation of households by GDP per capita, gross wages and the consumer price index. As expected, a positive correlation was proved for GDP per capita and gross wages, and a negative dependence was proved for the consumer price index. In addition to these economic variables, the model also included demographic characteristics of the workforce and the number of employed people. It can be stated that if the Czech Republic wants to support an increase in the birth rate, it is necessary to consider financial support for households with small children.

  10. Discrete post-processing of total cloud cover ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Haiden, Thomas; Pappenberger, Florian

    2017-04-01

    This contribution presents an approach to post-process ensemble forecasts for the discrete and bounded weather variable of total cloud cover. Two methods for discrete statistical post-processing of ensemble predictions are tested. The first approach is based on multinomial logistic regression, the second involves a proportional odds logistic regression model. Applying them to total cloud cover raw ensemble forecasts from the European Centre for Medium-Range Weather Forecasts improves forecast skill significantly. Based on station-wise post-processing of raw ensemble total cloud cover forecasts for a global set of 3330 stations over the period from 2007 to early 2014, the more parsimonious proportional odds logistic regression model proved to slightly outperform the multinomial logistic regression model. Reference Hemri, S., Haiden, T., & Pappenberger, F. (2016). Discrete post-processing of total cloud cover ensemble forecasts. Monthly Weather Review 144, 2565-2577.
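
    A hedged sketch of the first of the two models named above, multinomial logistic regression for a discrete predictand, fitted to fabricated ensemble summary statistics (ensemble mean and spread as predictors, three ordered cloud-cover categories as outcome). The proportional odds model is not shown; it would require an ordinal regression implementation.

        # Hedged sketch (fabricated data): multinomial logistic post-processing of a discrete predictand.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(6)
        n = 2000
        ens_mean = rng.uniform(0, 8, n)               # ensemble-mean cloud cover (okta), assumed predictor
        ens_spread = rng.uniform(0.2, 2.5, n)         # ensemble standard deviation, assumed predictor
        X = np.column_stack([ens_mean, ens_spread])

        # Synthetic "observed" category: 0 = clear, 1 = partly cloudy, 2 = overcast
        obs = np.clip(ens_mean + rng.normal(0, 1.2, n), 0, 8)
        y = np.digitize(obs, [2.5, 6.5])              # three classes

        # With the default lbfgs solver this fits a multinomial model for the three classes.
        clf = LogisticRegression(max_iter=1000).fit(X, y)
        # Full predictive distribution over the categories for one new forecast case:
        print(clf.predict_proba([[5.0, 1.0]]).round(3))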

  11. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals that there is a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables. The models proposed here reveal the approximate relationship in a closer form. The KWT proves that there is no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output. This supports the conclusion that each location has contributed uniformly to the development of the automobile component cluster. The FMT proves that there is no significant difference between industrial units in respect of costs such as production, infrastructure, technology and marketing, or in net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and are now exporting their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity increase.

  12. Efficient Robust Regression via Two-Stage Generalized Empirical Likelihood

    PubMed Central

    Bondell, Howard D.; Stefanski, Leonard A.

    2013-01-01

    Large- and finite-sample efficiency and resistance to outliers are the key goals of robust statistics. Although often not simultaneously attainable, we develop and study a linear regression estimator that comes close. Efficiency obtains from the estimator’s close connection to generalized empirical likelihood, and its favorable robustness properties are obtained by constraining the associated sum of (weighted) squared residuals. We prove maximum attainable finite-sample replacement breakdown point, and full asymptotic efficiency for normal errors. Simulation evidence shows that compared to existing robust regression estimators, the new estimator has relatively high efficiency for small sample sizes, and comparable outlier resistance. The estimator is further illustrated and compared to existing methods via application to a real data set with purported outliers. PMID:23976805

  13. Do drug treatment variables predict cognitive performance in multidrug-treated opioid-dependent patients? A regression analysis study

    PubMed Central

    2012-01-01

    Background Cognitive deficits and multiple psychoactive drug regimens are both common in patients treated for opioid-dependence. Therefore, we examined whether the cognitive performance of patients in opioid-substitution treatment (OST) is associated with their drug treatment variables. Methods Opioid-dependent patients (N = 104) who were treated either with buprenorphine or methadone (n = 52 in both groups) were given attention, working memory, verbal, and visual memory tests after they had been a minimum of six months in treatment. Group-wise results were analysed by analysis of variance. Predictors of cognitive performance were examined by hierarchical regression analysis. Results Buprenorphine-treated patients performed statistically significantly better in a simple reaction time test than methadone-treated ones. No other significant differences between groups in cognitive performance were found. In each OST drug group, approximately 10% of the attention performance could be predicted by drug treatment variables. Use of benzodiazepine (BZD) medication predicted about 10% of performance variance in working memory. Treatment with more than one other psychoactive drug (other than the opioid or a BZD) and frequent substance abuse during the past month predicted about 20% of verbal memory performance. Conclusions Although this study does not prove a causal relationship between multiple prescription drug use and poor cognitive functioning, the results are relevant for psychosocial recovery, vocational rehabilitation, and psychological treatment of OST patients. Especially for patients with BZD treatment, other treatment options should be actively sought. PMID:23121989

  14. A monoclonal cytolytic T-lymphocyte response observed in a melanoma patient vaccinated with a tumor-specific antigenic peptide encoded by gene MAGE-3

    PubMed Central

    Coulie, Pierre G.; Karanikas, Vaios; Colau, Didier; Lurquin, Christophe; Landry, Claire; Marchand, Marie; Dorval, Thierry; Brichard, Vincent; Boon, Thierry

    2001-01-01

    Vaccination of melanoma patients with tumor-specific antigens recognized by cytolytic T lymphocytes (CTL) produces significant tumor regressions in a minority of patients. These regressions appear to occur in the absence of massive CTL responses. To detect low-level responses, we resorted to antigenic stimulation of blood lymphocyte cultures in limiting dilution conditions, followed by tetramer analysis, cloning of the tetramer-positive cells, and T-cell receptor (TCR) sequence analysis of the CTL clones that showed strict specificity for the tumor antigen. A monoclonal CTL response against a MAGE-3 antigen was observed in a melanoma patient, who showed partial rejection of a large metastasis after treatment with a vaccine containing only the tumor-specific antigenic peptide. Tetramer analysis after in vitro restimulation indicated that about 1/40,000 postimmunization CD8+ blood lymphocytes were directed against the antigen. The same TCR was present in all of the positive microcultures. TCR evaluation carried out directly on blood lymphocytes by PCR amplification led to a similar frequency estimate after immunization, whereas the TCR was not found among 2.5 × 106 CD8+ lymphocytes collected before immunization. Our results prove unambiguously that vaccines containing only a tumor-specific antigenic peptide can elicit a CTL response. Even though they provide no information about the effector mechanisms responsible for the observed reduction in tumor mass in this patient, they would suggest that low-level CTL responses can initiate tumor rejection. PMID:11517302

  15. Exploring the social determinants of mental health service use using intersectionality theory and CART analysis.

    PubMed

    Cairney, John; Veldhuizen, Scott; Vigod, Simone; Streiner, David L; Wade, Terrance J; Kurdyak, Paul

    2014-02-01

    Fewer than half of individuals with a mental disorder seek formal care in a given year. Much research has been conducted on the factors that influence service use in this population, but the methods generally used cannot easily identify the complex interactions that are thought to exist. In this paper, we examine predictors of subsequent service use among respondents to a population health survey who met criteria for a past-year mood, anxiety or substance-related disorder. To determine service use, we use an administrative database including all physician consultations in the period of interest. To identify predictors, we use classification tree (CART) analysis, a data mining technique with the ability to identify unsuspected interactions. We compare results to those from logistic regression models. We identify 1213 individuals with a past-year disorder. In the year after the survey, 24% (n=312) of these had a mental health-related physician consultation. Logistic regression revealed that age, sex and marital status predicted service use. CART analysis yielded a set of rules based on age, sex, marital status and income adequacy, with marital status playing a role among men and income adequacy among women. CART analysis proved moderately effective overall, with agreement of 60%, sensitivity of 82% and specificity of 53%. Results highlight the potential of data-mining techniques to uncover complex interactions, and offer support to the view that the intersection of multiple statuses influences health and behaviour in ways that are difficult to identify with conventional statistics. The disadvantages of these methods are also discussed.
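
    A minimal sketch of the comparison described above: a classification tree (CART-style) and a logistic regression fitted to the same predictors of service use. The variables, the interaction built into the synthetic outcome, and the tree settings are invented for illustration; the tree's printed rules show the kind of interaction structure that main-effects logistic regression does not expose.

        # Minimal sketch (synthetic survey data): CART-style tree vs logistic regression.
        import numpy as np
        import pandas as pd
        from sklearn.tree import DecisionTreeClassifier, export_text
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(7)
        n = 1200
        df = pd.DataFrame({
            "age": rng.integers(18, 80, n),
            "female": rng.integers(0, 2, n),
            "married": rng.integers(0, 2, n),
            "low_income": rng.integers(0, 2, n),
        })
        # Hypothetical interaction: income matters mainly for women, marital status for men.
        p = (0.15 + 0.20 * df["female"] * df["low_income"]
             + 0.15 * (1 - df["female"]) * (1 - df["married"]))
        y = rng.random(n) < p

        tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(df, y)
        print(export_text(tree, feature_names=list(df.columns)))   # readable decision rules

        logit = LogisticRegression(max_iter=1000).fit(df, y)
        print(dict(zip(df.columns, logit.coef_[0].round(2))))      # main effects only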

  16. Analysis of the inter- and extracellular formation of platinum nanoparticles by Fusarium oxysporum f. sp. lycopersici using response surface methodology

    NASA Astrophysics Data System (ADS)

    Riddin, T. L.; Gericke, M.; Whiteley, C. G.

    2006-07-01

    A Fusarium oxysporum fungal strain was screened and found to be successful for the inter- and extracellular production of platinum nanoparticles. Nanoparticle formation was visually observed, over time, by the colour of the extracellular solution and/or the fungal biomass turning from yellow to dark brown, and their concentration was determined from the amount of residual hexachloroplatinic acid measured from a standard curve at 456 nm. The extracellular nanoparticles were characterized by transmission electron microscopy. Nanoparticles of varying size (10-100 nm) and shape (hexagons, pentagons, circles, squares, rectangles) were produced at both extracellular and intercellular levels by the Fusarium oxysporum. The particles precipitate out of solution and bioaccumulate by nucleation either intercellularly, on the cell wall/membrane, or extracellularly in the surrounding medium. The importance of pH, temperature and hexachloroplatinic acid (H2PtCl6) concentration in nanoparticle formation was examined through the use of a statistical response surface methodology. Only the extracellular production of nanoparticles proved to be statistically significant, with a concentration yield of 4.85 mg l-1 estimated by a first-order regression model. From a second-order polynomial regression, the predicted yield of nanoparticles increased to 5.66 mg l-1 and, after backward-step regression, a final model gave a yield of 6.59 mg l-1.

  17. Properties of added variable plots in Cox's regression model.

    PubMed

    Lindkvist, M

    2000-03-01

    The added variable plot is useful for examining the effect of a covariate in regression models. The plot provides information regarding the inclusion of a covariate, and is useful in identifying influential observations on the parameter estimates. Hall et al. (1996) proposed a plot for Cox's proportional hazards model derived by regarding the Cox model as a generalized linear model. This paper proves and discusses properties of this plot. These properties make the plot a valuable tool in model evaluation. Quantities considered include parameter estimates, residuals, leverage, case influence measures and correspondence to previously proposed residuals and diagnostics.

  18. Appendiceal outer diameter as an indicator for differentiating appendiceal mucocele from appendicitis.

    PubMed

    Lien, Wan-Ching; Huang, Shih-Pei; Chi, Chun-Lin; Liu, Kao-Lang; Lin, Ming-Tsan; Lai, Ting-I; Liu, Yueh-Ping; Wang, Hsiu-Po

    2006-11-01

    Patients with appendiceal mucocele (AM) commonly present with features indicative of acute appendicitis. In emergency departments, accurate preoperative diagnosis is crucial to prompt appropriate treatment. This study investigates the clinical and sonographic characteristics of AM, which may prove useful in preoperatively differentiating AM from appendicitis. This case-control study compares the clinical and sonographic findings of 16 histologically confirmed AM with sex- and age-matched control subjects (n = 64) with appendicitis by a 1:4 ratio. Conditional logistic regression was applied to estimate the odds ratio (OR) and 95% confidence intervals (CI) of clinical and sonographic parameters associated with AM. Univariate analysis demonstrated that the larger appendiceal outer diameter by sonography was positively correlated with diagnosis of AM (OR, 2.31; 95% CI, 1.42-3.72) and right lower quadrant abdominal pain was negatively correlated (OR, 0.38; 95% CI, 0.17-0.82). However, multiple regression analysis suggested that only outer diameter remained significant (OR, 2.21; 95% CI, 1.36-3.59) after adjusting for age, sex, and right lower quadrant pain. An outer diameter of 15 mm or more was predictive of AM diagnosis, with a sensitivity of 83% and specificity of 92%. When the threshold is set at 15 mm, appendiceal outer diameter by sonography is a useful preoperative measurement for differentiating between AM and acute appendicitis.

  19. Understanding handpump sustainability: Determinants of rural water source functionality in the Greater Afram Plains region of Ghana.

    PubMed

    Fisher, Michael B; Shields, Katherine F; Chan, Terence U; Christenson, Elizabeth; Cronk, Ryan D; Leker, Hannah; Samani, Destina; Apoya, Patrick; Lutz, Alexandra; Bartram, Jamie

    2015-10-01

    Safe drinking water is critical to human health and development. In rural sub-Saharan Africa, most improved water sources are boreholes with handpumps; studies suggest that up to one third of these handpumps are nonfunctional at any given time. This work presents findings from a secondary analysis of cross-sectional data from 1509 water sources in 570 communities in the rural Greater Afram Plains (GAP) region of Ghana; one of the largest studies of its kind. 79.4% of enumerated water sources were functional when visited; in multivariable regressions, functionality depended on source age, management, tariff collection, the number of other sources in the community, and the district. A Bayesian network (BN) model developed using the same data set found strong dependencies of functionality on implementer, pump type, management, and the availability of tools, with synergistic effects from management determinants on functionality, increasing the likelihood of a source being functional from a baseline of 72% to more than 97% with optimal management and available tools. We suggest that functionality may be a dynamic equilibrium between regular breakdowns and repairs, with management a key determinant of repair rate. Management variables may interact synergistically in ways better captured by BN analysis than by logistic regressions. These qualitative findings may prove generalizable beyond the study area, and may offer new approaches to understanding and increasing handpump functionality and safe water access.

  20. Fresh Biomass Estimation in Heterogeneous Grassland Using Hyperspectral Measurements and Multivariate Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Darvishzadeh, R.; Skidmore, A. K.; Mirzaie, M.; Atzberger, C.; Schlerf, M.

    2014-12-01

    Accurate estimation of grassland biomass at its peak productivity can provide crucial information regarding the functioning and productivity of rangelands. Hyperspectral remote sensing has proved to be valuable for the estimation of vegetation biophysical parameters such as biomass using different statistical techniques. However, in statistical analysis of hyperspectral data, multicollinearity is a common problem due to the large number of correlated hyperspectral reflectance measurements. The aim of this study was to examine the prospect of above-ground biomass estimation in a heterogeneous Mediterranean rangeland employing multivariate calibration methods. Canopy spectral measurements were made in the field using a GER 3700 spectroradiometer, along with concomitant in situ measurements of above-ground biomass for 170 sample plots. Multivariate calibrations including partial least squares regression (PLSR), principal component regression (PCR), and Least-Squares Support Vector Machine (LS-SVM) were used to estimate the above-ground biomass. The prediction accuracy of the multivariate calibration methods was assessed using cross-validated R2 and RMSE. The best model performance was obtained using LS-SVM and then PLSR, both calibrated with the first-derivative reflectance dataset, with R2cv = 0.88 and 0.86 and RMSEcv = 1.15 and 1.07, respectively. The weakest prediction accuracy appeared when PCR was used (R2cv = 0.31 and RMSEcv = 2.48). The obtained results highlight the importance of multivariate calibration methods for biomass estimation when hyperspectral data are used.
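
    A hedged sketch of one of the calibrations named above, partial least squares regression with cross-validated R2 and RMSE. The "spectra" are random stand-ins and the number of components is an arbitrary choice, so this illustrates the validation scheme rather than the study's results.

        # Hedged sketch (simulated spectra): PLSR with cross-validated R2 and RMSE.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict, KFold

        rng = np.random.default_rng(8)
        n_plots, n_bands = 170, 600
        spectra = rng.normal(size=(n_plots, n_bands))
        biomass = spectra[:, 50] - 0.5 * spectra[:, 300] + 0.3 * rng.normal(size=n_plots)

        pls = PLSRegression(n_components=5)
        pred = cross_val_predict(pls, spectra, biomass, cv=KFold(5, shuffle=True, random_state=0))
        pred = pred.ravel()

        ss_res = np.sum((biomass - pred) ** 2)
        r2_cv = 1 - ss_res / np.sum((biomass - biomass.mean()) ** 2)
        rmse_cv = np.sqrt(ss_res / n_plots)
        print(f"R2_cv = {r2_cv:.2f}, RMSE_cv = {rmse_cv:.2f}")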

  1. Body composition: A predictive factor of cycle fecundity

    PubMed Central

    Kayatas, Semra; Api, Murat; Kurt, Didar; Eroglu, Mustafa; Arınkan, Sevcan Arzu

    2014-01-01

    Objective To study the effect of body composition on reproduction in women with unexplained infertility treated with a controlled ovarian hyperstimulation and intrauterine insemination programme. Methods This prospective observational study was conducted on 308 women with unexplained infertility who were scheduled for a controlled ovarian hyperstimulation and intrauterine insemination programme and were grouped as pregnant and non-pregnant. Anthropometric measurements were performed using TANITA-420MA before the treatment cycle. Body composition was determined using a bioelectrical impedance analysis system. Results Body fat mass was significantly lower in pregnant women than in non-pregnant women (15.61±3.65 vs. 18.78±5.97, respectively) (p=0.01). In a multiple regression analysis, body fat mass proved to have a stronger association with fecundity than the percentage of body fat, body mass index, or the waist/hip ratio (standardized regression coefficient≥0.277, t-value≥2.537; p<0.05). The cut-off value of fat mass, which was evaluated using the receiver operating characteristics curve, was 16.65 with a sensitivity of 61.8% and a specificity of 70.2%. Below this cut-off value, the odds of pregnancy occurrence were found to be 2.5 times higher. Conclusion Body fat mass can be predictive for pregnancy in patients with unexplained infertility scheduled for a controlled ovarian hyperstimulation and intrauterine insemination programme. PMID:25045631

  2. Trend Estimation and Regression Analysis in Climatological Time Series: An Application of Structural Time Series Models and the Kalman Filter.

    NASA Astrophysics Data System (ADS)

    Visser, H.; Molenaar, J.

    1995-05-01

    The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To prove detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear. However, the regression coefficients corresponding to the explanatory variables may be allowed to be time dependent in order to validate this assumption. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed. Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: Southern Oscillation index (SOI), volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal, induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases. The prediction performance of the model is rather poor, and possible explanations are discussed.
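
    A minimal numpy sketch of the kind of filtering described above: a Kalman filter for a local linear trend model (stochastic level plus slope) run over a synthetic annual temperature-like series. The noise variances and initial state are assumed rather than estimated, and the regression component of the full model is omitted.

        # Minimal sketch (illustrative parameters): Kalman filter for a local linear trend.
        import numpy as np

        rng = np.random.default_rng(9)
        n = 120
        true_trend = 0.01 * np.arange(n)
        y = true_trend + 0.3 * rng.normal(size=n)      # synthetic anomalies (deg C)

        F = np.array([[1.0, 1.0], [0.0, 1.0]])         # level_{t+1} = level_t + slope_t
        H = np.array([[1.0, 0.0]])                     # we observe the level plus noise
        Q = np.diag([1e-4, 1e-6])                      # process noise (assumed)
        R = np.array([[0.3 ** 2]])                     # observation noise (assumed)

        x = np.zeros(2)                                # initial state [level, slope]
        P = np.eye(2)                                  # initial state covariance
        levels = []
        for obs in y:
            # predict
            x = F @ x
            P = F @ P @ F.T + Q
            # update
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + (K @ (np.array([obs]) - H @ x)).ravel()
            P = (np.eye(2) - K @ H) @ P
            levels.append(x[0])

        print("estimated slope per year:", round(x[1], 4))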

  3. Geospatial techniques for allocating vulnerability zoning of geohazards along the Karakorum Highway, Gilgit-Baltistan-Pakistan

    NASA Astrophysics Data System (ADS)

    Khan, K. M.; Rashid, S.; Yaseen, M.; Ikram, M.

    2016-12-01

    The Karakoram Highway (KKH), the 'eighth wonder of the world', was constructed and completed by the agreement of Pakistan and China in 1979 as a Friendship Highway. It connects Gilgit-Baltistan, a strategically prominent region of Pakistan, with the Xinjiang region in China. Due to its manifold geology/geomorphology, soil formation, steep slopes and climate change, as well as unsustainable anthropogenic activities, the KKH remains remarkably vulnerable to natural hazards, i.e. land subsidence, landslides, erosion, rock fall, floods, debris flows, cyclical torrential rainfall and snowfall, lake outbursts, etc. Most of the time the damaging effects of these geohazards jeopardize life in the region. To ascertain the nature and frequency of the disasters and the vulnerability zoning, a rating and management (logistic) analysis was made to investigate the spatiotemporal distribution of the natural hazards. The substantial dynamics of the physiography, geology, geomorphology, soils and climate were carefully examined, while slope, aspect, elevation, profile curvature and rock hardness were calculated by different techniques. To assess the nature and intensity, geospatial analyses were conducted and the magnitude of every factor was gauged using logistic regression. Moreover, every relevant variable was integrated into the evaluation process. Logistic regression and geospatial techniques were used to map the geohazard vulnerability zoning (GVZ). The GVZ model findings were endorsed by reviews of documented hazards in recent years and the precision realized was more than 88.1%. The study proved the model's validity by highlighting the close agreement between the vulnerability mapping and past documented hazards. Using a receiver operating characteristic curve, the logistic regression model gave satisfactory results. The outcomes will be useful in sustainable land use and infrastructure planning, mainly in high risk zones, for reducing economic damages and for community betterment.

  4. A comparison of long-term parallel measurements of sunshine duration obtained with a Campbell-Stokes sunshine recorder and two automated sunshine sensors

    NASA Astrophysics Data System (ADS)

    Baumgartner, D. J.; Pötzi, W.; Freislich, H.; Strutzmann, H.; Veronig, A. M.; Foelsche, U.; Rieder, H. E.

    2017-06-01

    In recent decades, automated sensors for sunshine duration (SD) measurements have been introduced in meteorological networks, thereby replacing traditional instruments, most prominently the Campbell-Stokes (CS) sunshine recorder. Parallel records of automated and traditional SD recording systems are rare. Nevertheless, such records are important to understand the differences/similarities in SD totals obtained with different instruments and how changes in monitoring device type affect the homogeneity of SD records. This study investigates the differences/similarities in parallel SD records obtained with a CS and two automated SD sensors between 2007 and 2016 at the Kanzelhöhe Observatory, Austria. Comparing individual records of daily SD totals, we find differences of both positive and negative sign, with smallest differences between the automated sensors. The larger differences between CS-derived SD totals and those from automated sensors can be attributed (largely) to the higher sensitivity threshold of the CS instrument. Correspondingly, the closest agreement among all sensors is found during summer, the time of year when sensitivity thresholds are least critical. Furthermore, we investigate the performance of various models to create the so-called sensor-type-equivalent (STE) SD records. Our analysis shows that regression models including all available data on daily (or monthly) time scale perform better than simple three- (or four-) point regression models. Despite general good performance, none of the considered regression models (of linear or quadratic form) emerges as the "optimal" model. Although STEs prove useful for relating SD records of individual sensors on daily/monthly time scales, this does not ensure that STE (or joint) records can be used for trend analysis.

  5. One-year test-retest reliability of intrinsic connectivity network fMRI in older adults

    PubMed Central

    Guo, Cong C.; Kurth, Florian; Zhou, Juan; Mayer, Emeran A.; Eickhoff, Simon B; Kramer, Joel H.; Seeley, William W.

    2014-01-01

    “Resting-state” or task-free fMRI can assess intrinsic connectivity network (ICN) integrity in health and disease, suggesting a potential for use of these methods as disease-monitoring biomarkers. Numerous analytical options are available, including model-driven ROI-based correlation analysis and model-free, independent component analysis (ICA). High test-retest reliability will be a necessary feature of a successful ICN biomarker, yet available reliability data remains limited. Here, we examined ICN fMRI test-retest reliability in 24 healthy older subjects scanned roughly one year apart. We focused on the salience network, a disease-relevant ICN not previously subjected to reliability analysis. Most ICN analytical methods proved reliable (intraclass coefficients > 0.4) and could be further improved by wavelet analysis. Seed-based ROI correlation analysis showed high map-wise reliability, whereas graph theoretical measures and temporal concatenation group ICA produced the most reliable individual unit-wise outcomes. Including global signal regression in ROI-based correlation analyses reduced reliability. Our study provides a direct comparison between the most commonly used ICN fMRI methods and potential guidelines for measuring intrinsic connectivity in aging control and patient populations over time. PMID:22446491

  6. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Kumar, Sricharan; Srivistava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
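
    An illustrative sketch of the general idea above: residual-bootstrap prediction intervals around a non-parametric regression fit. A k-nearest-neighbour smoother and a simplified interval recipe stand in for the paper's procedure; the data and the 95% level are made up.

        # Illustrative sketch (toy data): residual-bootstrap prediction intervals for a kNN smoother.
        import numpy as np
        from sklearn.neighbors import KNeighborsRegressor

        rng = np.random.default_rng(10)
        n = 300
        x = np.sort(rng.uniform(0, 10, n)).reshape(-1, 1)
        y = np.sin(x.ravel()) + 0.2 * rng.normal(size=n)

        model = KNeighborsRegressor(n_neighbors=15).fit(x, y)
        fit = model.predict(x)
        resid = y - fit

        x_new = np.array([[2.0], [5.0], [8.0]])
        B = 500
        boot_preds = np.empty((B, len(x_new)))
        for b in range(B):
            idx = rng.integers(0, n, n)                           # resample the design points
            y_b = fit[idx] + rng.choice(resid, n, replace=True)   # add resampled residuals
            m_b = KNeighborsRegressor(n_neighbors=15).fit(x[idx], y_b)
            boot_preds[b] = m_b.predict(x_new) + rng.choice(resid, len(x_new), replace=True)

        lo, hi = np.percentile(boot_preds, [2.5, 97.5], axis=0)
        for xv, l, h in zip(x_new.ravel(), lo, hi):
            print(f"x = {xv:.1f}: 95% prediction interval [{l:.2f}, {h:.2f}]")
        # A new observation falling outside its interval would be flagged as anomalous.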

  7. Simulation Analysis of Helicopter Ground Resonance Nonlinear Dynamics

    NASA Astrophysics Data System (ADS)

    Zhu, Yan; Lu, Yu-hui; Ling, Ai-min

    2017-07-01

    In order to accurately predict the dynamic instability of helicopter ground resonance, a modeling and simulation method for helicopter ground resonance considering the nonlinear dynamic characteristics of components (rotor lead-lag damper, landing gear wheel and absorber) is presented. A numerical integration method is used to calculate the transient responses of the body and rotor after a disturbance is applied. To obtain quantitative instabilities, a Fast Fourier Transform (FFT) is conducted to estimate the modal frequencies, and the moving rectangular window method is employed to predict the modal damping from the response time history. Simulation results show that the ground resonance simulation test can accurately capture the blade lead-lag regressive mode frequency, and that the modal damping obtained from the attenuation curves is close to the test results. The simulation test results are in accordance with the actual accident situation and prove the correctness of the simulation method. This analysis method for ground resonance simulation testing can give results consistent with real helicopter engineering tests.

  8. A simple scoring system for predicting early major complications in spine surgery: the cumulative effect of age and size of surgery.

    PubMed

    Brasil, Albert Vincent Berthier; Teles, Alisson R; Roxo, Marcelo Ricardo; Schuster, Marcelo Neutzling; Zauk, Eduardo Ballverdu; Barcellos, Gabriel da Costa; Costa, Pablo Ramon Fruett da; Ferreira, Nelson Pires; Kraemer, Jorge Luiz; Ferreira, Marcelo Paglioli; Gobbato, Pedro Luis; Worm, Paulo Valdeci

    2016-10-01

    To analyze the cumulative effect of risk factors associated with early major complications in postoperative spine surgery. Retrospective analysis of 583 surgically-treated patients. Early "major" complications were defined as those that may lead to permanent detrimental effects or require further significant intervention. A balanced risk score was built using multiple logistic regression. Ninety-two early major complications occurred in 76 patients (13%). Age > 60 years and surgery of three or more levels proved to be significant independent risk factors in the multivariate analysis. The balanced scoring system was defined as: 0 points (no risk factor), 2 points (1 factor) or 4 points (2 factors). The incidence of early major complications in each category was 7% (0 points), 15% (2 points) and 29% (4 points) respectively. This balanced scoring system, based on two risk factors, represents an important tool for both surgical indication and for patient counseling before surgery.

  9. The mediating effect of calling on the relationship between medical school students’ academic burnout and empathy

    PubMed Central

    2017-01-01

    Purpose This study is aimed at identifying the relationships between medical school students' academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. Methods A mixed-methods study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students' empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For the qualitative approach, eight medical students participated in a focus group interview. Results The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. Conclusion This result demonstrates that calling is a key variable that mediates the relationship between medical students' academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students' empathy skills. PMID:28870019

  10. Models based on ultraviolet spectroscopy, polyphenols, oligosaccharides and polysaccharides for prediction of wine astringency.

    PubMed

    Boulet, Jean-Claude; Trarieux, Corinne; Souquet, Jean-Marc; Ducasse, Maris-Agnés; Caillé, Soline; Samson, Alain; Williams, Pascale; Doco, Thierry; Cheynier, Véronique

    2016-01-01

    Astringency elicited by tannins is usually assessed by tasting. Alternative methods involving tannin precipitation have been proposed, but they remain time-consuming. Our goal was to propose a faster method and investigate the links between wine composition and astringency. Red wines covering a wide range of astringency intensities, assessed by sensory analysis, were selected. Prediction models based on multiple linear regression (MLR) were built using UV spectrophotometry (190-400 nm) and chemical analysis (enological analysis, polyphenols, oligosaccharides and polysaccharides). Astringency intensity was strongly correlated (R(2) = 0.825) with tannin precipitation by bovine serum albumin (BSA). Wine absorbances at 230 nm (A230) proved more suitable for astringency prediction (R(2) = 0.705) than A280 (R(2) = 0.56) or tannin concentration estimated by phloroglucinolysis (R(2) = 0.59). Three-variable models built with A230, oligosaccharides and polysaccharides presented high R(2) and low errors of cross-validation. These models confirmed that polysaccharides decrease astringency perception and indicated a positive relationship between oligosaccharides and astringency. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Modelling and multi objective optimization of WEDM of commercially Monel super alloy using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Varun, Sajja; Reddy, Kalakada Bhargav Bal; Vardhan Reddy, R. R. Vishnu

    2016-09-01

    In this research work, a multi-response optimization technique has been developed, using traditional desirability analysis and a non-traditional particle swarm optimization technique (for different customers' priorities) in wire electrical discharge machining (WEDM). Monel 400 was selected as the work material for experimentation. The effects of key process parameters such as pulse-on time (TON), pulse-off time (TOFF), peak current (IP) and wire feed (WF) on material removal rate (MRR) and surface roughness (SR) in the WEDM operation were investigated. Further, the responses MRR and SR were modelled empirically through regression analysis. The developed models can be used by machinists to predict MRR and SR over a wide range of input parameters. The optimization of multiple responses has been performed to satisfy the priorities of multiple users by using the Taguchi-desirability function method and the particle swarm optimization technique. Analysis of variance (ANOVA) is also applied to investigate the effect of the influential parameters. Finally, confirmation experiments were conducted for the optimal set of machining parameters, and the improvement was verified.
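    A hedged sketch of the desirability-function step used above for the two responses: hypothetical empirical regression models (not the fitted WEDM models) predict MRR and SR over a grid of pulse-on time and peak current, individual desirabilities are computed (maximize MRR, minimize SR), and the setting with the highest composite desirability is reported.

```python
# Composite desirability over a parameter grid; models and weights are illustrative.
import numpy as np

def mrr_model(ton, ip):   # assumed empirical regression model, illustration only
    return 0.5 + 0.04 * ton + 0.06 * ip

def sr_model(ton, ip):    # assumed empirical regression model, illustration only
    return 1.0 + 0.03 * ton + 0.01 * ip

ton_grid, ip_grid = np.meshgrid(np.linspace(100, 130, 31), np.linspace(1, 12, 23))
mrr, sr = mrr_model(ton_grid, ip_grid), sr_model(ton_grid, ip_grid)

d_mrr = (mrr - mrr.min()) / (mrr.max() - mrr.min())      # larger-the-better response
d_sr = (sr.max() - sr) / (sr.max() - sr.min())           # smaller-the-better response
w_mrr, w_sr = 0.5, 0.5                                   # equal customer priorities
D = d_mrr ** w_mrr * d_sr ** w_sr                        # composite desirability (geometric mean)

i, j = np.unravel_index(np.argmax(D), D.shape)
print("best TON:", ton_grid[i, j], "best IP:", ip_grid[i, j], "D:", round(D[i, j], 3))
```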

  12. Detection and discrimination of microorganisms on various substrates with quantum cascade laser spectroscopy

    NASA Astrophysics Data System (ADS)

    Padilla-Jiménez, Amira C.; Ortiz-Rivera, William; Rios-Velazquez, Carlos; Vazquez-Ayala, Iris; Hernández-Rivera, Samuel P.

    2014-06-01

    Investigations into rapid and accurate methods for developing signatures that allow detection, identification, and discrimination of microorganisms that could be used as biological warfare agents have increased significantly in recent years. Quantum cascade laser (QCL)-based spectroscopic systems have revolutionized many areas of defense and security, including this area of research. In this contribution, infrared spectroscopy detection based on QCL was used to obtain the mid-infrared (MIR) spectral signatures of Bacillus thuringiensis, Escherichia coli, and Staphylococcus epidermidis. These bacteria were used as close simulants of biothreat microorganisms (biosimulants). The experiments were conducted in reflection mode with biosimulants deposited on various substrates including cardboard, glass, travel bags, wood, and stainless steel. Chemometrics multivariate statistical routines, such as principal component analysis regression and partial least squares coupled to discriminant analysis, were used to analyze the MIR spectra. Overall, the investigated infrared vibrational techniques were useful for detecting target microorganisms on the studied substrates, and the multivariate data analysis techniques proved to be very efficient for classifying the bacteria and discriminating them in the presence of highly IR-interfering media.

  13. [Stages of behavioral change regarding physical activity in students from a Brazilian town].

    PubMed

    Silva, Diego A S; Smith-Menezes, Aldemir; Almeida-Gomes, Marciusde; de Sousa, Thiago Ferreira

    2010-08-01

    Verifying the association between stages of behavioural change (SBC) for physical activity (PA) and socio-demographic factors, behavioural factors and PA barriers in students from a small town in Brazil. This cross-sectional study's representative sample comprised 281 high school students from Simão Dias, Sergipe State, Brazil, with a mean age of 17.4 (± 1.98) years. Socio-demographic information (gender, age, school grade, economic level (EL) and family head's EL), SBC for PA, behavioural factors (smoking, alcohol and stress) and PA barriers were collected via a self-administered instrument. A hierarchical model involving Poisson regression with respective confidence intervals was used; the significance level was set at 5% for all analyses. 65.8% of the participating students were classified in stages corresponding to physically inactive behaviour. In the final regression model, female students were 1.37 times more likely than males to present inactive behaviour (95%CI 1.14-1.65); low EL remained a risk factor compared with medium-EL students (PR=1.41; 95%CI 1.15-1.72). These findings may prove useful for developing health promotion programmes in school environments, with special attention to female and low-EL students.
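    A minimal sketch of a Poisson regression with robust (sandwich) standard errors used to obtain prevalence ratios for a binary outcome, as in the hierarchical model described above. The covariates and data are simulated stand-ins.

```python
# Prevalence ratios from a Poisson GLM with robust errors, on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 281
female = rng.integers(0, 2, n)
low_econ = rng.integers(0, 2, n)
p = 1 / (1 + np.exp(-(0.2 + 0.3 * female + 0.3 * low_econ)))
inactive = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([female, low_econ]))
fit = sm.GLM(inactive, X, family=sm.families.Poisson()).fit(cov_type="HC0")
print(np.exp(fit.params[1:]))        # prevalence ratios for female and low economic level
print(np.exp(fit.conf_int()[1:]))    # 95% confidence intervals
```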

  14. Pregnancy eHealth and mHealth: user proportions and characteristics of pregnant women using Web-based information sources-a cross-sectional study.

    PubMed

    Wallwiener, Stephanie; Müller, Mitho; Doster, Anne; Laserer, Wolfgang; Reck, Corinna; Pauluschke-Fröhlich, Jan; Brucker, Sara Y; Wallwiener, Christian W; Wallwiener, Markus

    2016-11-01

    To analyze the current proportions and characteristics of women using Internet (eHealth) and smartphone (mHealth) based sources of information during pregnancy, and to investigate the influence this information-seeking behavior has on decision-making. A cross-sectional study was conducted at two major German university hospitals. Questionnaires covering socio-demographic data, medical data, and details of Internet and smartphone application use were administered to 220 pregnant women. Data analysis utilized descriptive statistics and multiple regression analysis. 50.7% of pregnant women were online information seekers. 22.4% used an mHealth pregnancy application. Women using eHealth information showed no specific profile, while women using mHealth applications proved to be younger, were more likely to be in their first pregnancy, felt less healthy, and were more likely to be influenced by the retrieved information. Stepwise backward regression analysis explained 25.8% of the variance in mHealth use. 80.5% of cases were classified correctly by the identified predictors. All types of Web-based information correlated significantly with decision-making during pregnancy. Pregnant women frequently use the Internet and smartphone applications as a source of information. While Web usage was a common phenomenon, this study revealed specific characteristics of mHealth users during pregnancy. Improved, medically accurate smartphone applications might provide a way to specifically target the mHealth user group. As user influenceability was of major relevance to all types of information, all medical content should be carefully reviewed by a multidisciplinary board of medical specialists.

  15. Identifying changes in dissolved organic matter content and characteristics by fluorescence spectroscopy coupled with self-organizing map and classification and regression tree analysis during wastewater treatment.

    PubMed

    Yu, Huibin; Song, Yonghui; Liu, Ruixia; Pan, Hongwei; Xiang, Liancheng; Qian, Feng

    2014-10-01

    Latent tracers of dissolved organic matter (DOM) in wastewater treatment performance were analyzed by three-dimensional excitation-emission matrix (EEM) fluorescence spectroscopy coupled with self-organizing map and classification and regression tree (CART) analysis. DOM in water samples collected from the primary sedimentation, anaerobic, anoxic, oxic and secondary sedimentation tanks of a large-scale wastewater treatment plant contained four fluorescence components: tryptophan-like (C1), tyrosine-like (C2), microbial humic-like (C3) and fulvic-like (C4) materials, extracted by the self-organizing map. These components showed good positive linear correlations with the dissolved organic carbon of DOM. C1 and C2 were the representative components in the wastewater, and they were removed to a greater extent than C3 and C4 in the treatment process. C2 was a latent parameter determined by CART to differentiate water samples of the oxic and secondary sedimentation tanks from the successive treatment units, indirectly proving that most of the tyrosine-like material was degraded by anaerobic microorganisms. C1 was an accurate parameter to comprehensively separate the samples of the five treatment units from each other, indirectly indicating that tryptophan-like material was decomposed by anaerobic and aerobic bacteria. EEM fluorescence spectroscopy in combination with self-organizing map and CART analysis can be a nondestructive and effective method for characterizing the structural components of DOM fractions and monitoring organic matter removal in the wastewater treatment process. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Model-based Bayesian inference for ROC data analysis

    NASA Astrophysics Data System (ADS)

    Lei, Tianhu; Bae, K. Ty

    2013-03-01

    This paper presents a study of model-based Bayesian inference applied to Receiver Operating Characteristic (ROC) data. The model is a simple version of the general non-linear regression model. Unlike the Dorfman model, it uses a probit link function with a zero-one covariate to express the binormal distributions in a single formula. The model also includes a scale parameter. Bayesian inference is implemented by the Markov Chain Monte Carlo (MCMC) method carried out with Bayesian analysis Using Gibbs Sampling (BUGS). In contrast to classical statistical theory, the Bayesian approach treats model parameters as random variables characterized by prior distributions. With a substantial number of simulated samples generated by the sampling algorithm, the posterior distributions of the parameters, and hence the parameters themselves, can be accurately estimated. MCMC-based BUGS adopts the Adaptive Rejection Sampling (ARS) protocol, which requires that the probability density function (pdf) from which samples are drawn be log-concave with respect to the targeted parameters. Our study corrects a common misconception and proves that the pdf of this regression model is log-concave with respect to its scale parameter. Therefore, ARS's requirement is satisfied and a Gaussian prior, which is conjugate and possesses many analytic and computational advantages, is assigned to the scale parameter. A cohort of 20 simulated data sets and 20 simulations from each data set are used in our study. Output analysis and convergence diagnostics for the MCMC method are assessed with the CODA package. Models and methods using a continuous Gaussian prior and a discrete categorical prior are compared. Intensive simulations and performance measures are given to illustrate our practice in the framework of model-based Bayesian inference using the MCMC method.

  17. Censored quantile regression with recursive partitioning-based weights

    PubMed Central

    Wey, Andrew; Wang, Lan; Rudser, Kyle

    2014-01-01

    Censored quantile regression provides a useful alternative to the Cox proportional hazards model for analyzing survival data. It directly models the conditional quantile of the survival time and hence is easy to interpret. Moreover, it relaxes the proportionality constraint on the hazard function associated with the popular Cox model and is natural for modeling heterogeneity of the data. Recently, Wang and Wang (2009. Locally weighted censored quantile regression. Journal of the American Statistical Association 103, 1117–1128) proposed a locally weighted censored quantile regression approach that allows for covariate-dependent censoring and is less restrictive than other censored quantile regression methods. However, their kernel smoothing-based weighting scheme requires all covariates to be continuous and encounters practical difficulty with even a moderate number of covariates. We propose a new weighting approach that uses recursive partitioning, e.g. survival trees, that offers greater flexibility in handling covariate-dependent censoring in moderately high dimensions and can incorporate both continuous and discrete covariates. We prove that this new weighting scheme leads to consistent estimation of the quantile regression coefficients and demonstrate its effectiveness via Monte Carlo simulations. We also illustrate the new method using a widely recognized data set from a clinical trial on primary biliary cirrhosis. PMID:23975800

  18. GWAS with longitudinal phenotypes: performance of approximate procedures

    PubMed Central

    Sikorska, Karolina; Montazeri, Nahid Mostafavi; Uitterlinden, André; Rivadeneira, Fernando; Eilers, Paul HC; Lesaffre, Emmanuel

    2015-01-01

    Analysis of genome-wide association studies with longitudinal data using standard procedures, such as linear mixed model (LMM) fitting, leads to discouragingly long computation times. There is a need to speed up the computations significantly. In our previous work (Sikorska et al: Fast linear mixed model computations for genome-wide association studies with longitudinal data. Stat Med 2012; 32.1: 165–180), we proposed the conditional two-step (CTS) approach as a fast method providing an approximation to the P-value for the longitudinal single-nucleotide polymorphism (SNP) effect. In the first step a reduced conditional LMM is fit, omitting all the SNP terms. In the second step, the estimated random slopes are regressed on SNPs. The CTS has been applied to the bone mineral density data from the Rotterdam Study and proved to work very well even in unbalanced situations. In another article (Sikorska et al: GWAS on your notebook: fast semi-parallel linear and logistic regression for genome-wide association studies. BMC Bioinformatics 2013; 14: 166), we suggested semi-parallel computations, greatly speeding up fitting many linear regressions. Combining CTS with fast linear regression reduces the computation time from several weeks to a few minutes on a single computer. Here, we explore further the properties of the CTS both analytically and by simulations. We investigate the performance of our proposal in comparison with a related but different approach, the two-step procedure. It is analytically shown that for the balanced case, under mild assumptions, the P-value provided by the CTS is the same as from the LMM. For unbalanced data and in realistic situations, simulations show that the CTS method does not inflate the type I error rate and implies only a minimal loss of power. PMID:25712081
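    A simplified sketch of the conditional two-step idea described above: (1) obtain per-subject slopes of the longitudinal phenotype (here ordinary per-subject least squares stands in for the random slopes from the reduced LMM), then (2) regress the slopes on every SNP using a vectorized, "semi-parallel" computation. Dimensions and effect sizes are illustrative.

```python
# Two-step, semi-parallel GWAS sketch on simulated longitudinal data.
import numpy as np

rng = np.random.default_rng(4)
n_subj, n_visits, n_snps = 500, 4, 1000
t = np.arange(n_visits)
snps = rng.integers(0, 3, size=(n_subj, n_snps)).astype(float)
true_beta = np.zeros(n_snps); true_beta[0] = 0.05
slopes_true = -0.1 + snps @ true_beta + rng.normal(scale=0.05, size=n_subj)
y = slopes_true[:, None] * t[None, :] + rng.normal(scale=0.2, size=(n_subj, n_visits))

# Step 1: per-subject slope of phenotype over time (stand-in for LMM random slopes)
t_c = t - t.mean()
slopes = (y * t_c).sum(axis=1) / (t_c ** 2).sum()

# Step 2: semi-parallel simple regression of slopes on each SNP column at once
g = snps - snps.mean(axis=0)
s = slopes - slopes.mean()
beta = (g * s[:, None]).sum(axis=0) / (g ** 2).sum(axis=0)
resid_var = ((s[:, None] - g * beta) ** 2).sum(axis=0) / (n_subj - 2)
se = np.sqrt(resid_var / (g ** 2).sum(axis=0))
print("top SNP by |t|:", np.argmax(np.abs(beta / se)))
```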

  19. Nonconvex Sparse Logistic Regression With Weakly Convex Regularization

    NASA Astrophysics Data System (ADS)

    Shen, Xinyue; Gu, Yuantao

    2018-06-01

    In this work we propose to fit a sparse logistic regression model by a weakly convex regularized nonconvex optimization problem. The idea is based on the finding that a weakly convex function as an approximation of the $\\ell_0$ pseudo norm is able to better induce sparsity than the commonly used $\\ell_1$ norm. For a class of weakly convex sparsity inducing functions, we prove the nonconvexity of the corresponding sparse logistic regression problem, and study its local optimality conditions and the choice of the regularization parameter to exclude trivial solutions. Despite the nonconvexity, a method based on proximal gradient descent is used to solve the general weakly convex sparse logistic regression, and its convergence behavior is studied theoretically. Then the general framework is applied to a specific weakly convex function, and a necessary and sufficient local optimality condition is provided. The solution method is instantiated in this case as an iterative firm-shrinkage algorithm, and its effectiveness is demonstrated in numerical experiments by both randomly generated and real datasets.
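    A hedged sketch of proximal gradient descent for sparse logistic regression in which the soft-thresholding step is replaced by the firm-shrinkage operator (a proximal map associated with a weakly convex, MCP-like sparsity penalty). Parameter choices and data are illustrative, not taken from the paper.

```python
# Iterative firm-shrinkage for sparse logistic regression on simulated data.
import numpy as np

def firm_shrink(x, lam, mu):
    """Firm thresholding: zero below lam, linear transition up to mu, identity above (mu > lam)."""
    return np.where(np.abs(x) > mu, x,
                    np.sign(x) * mu * np.maximum(np.abs(x) - lam, 0.0) / (mu - lam))

def sparse_logreg_firm(X, y, lam=0.05, mu=0.5, n_iter=500):
    n, d = X.shape
    step = 4.0 * n / np.linalg.norm(X, 2) ** 2   # ~1/L for the averaged logistic loss
    w = np.zeros(d)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        grad = X.T @ (p - y) / n
        w = firm_shrink(w - step * grad, step * lam, mu)
    return w

rng = np.random.default_rng(5)
n, d = 200, 50
X = rng.normal(size=(n, d))
w_true = np.zeros(d); w_true[:3] = [1.5, -1.0, 0.8]
y = (rng.random(n) < 1 / (1 + np.exp(-X @ w_true))).astype(float)
w_hat = sparse_logreg_firm(X, y)
print("nonzero coefficients:", np.flatnonzero(np.abs(w_hat) > 1e-8))
```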

  20. The effects of GeoGebra software on pre-service mathematics teachers' attitudes and views toward proof and proving

    NASA Astrophysics Data System (ADS)

    Zengin, Yılmaz

    2017-11-01

    The purpose of this study is to determine the effect of GeoGebra software on pre-service mathematics teachers' attitudes towards proof and proving and to determine pre-service teachers' pre- and post-views regarding proof. The study lasted nine weeks and the participants of the study consisted of 24 pre-service mathematics teachers. The study used the 'Attitude Scale Towards Proof and Proving' and an open-ended questionnaire that were administered before and after the intervention as data collection tools. Paired samples t-test analysis was used for the analysis of quantitative data and content and descriptive analyses were utilized for the analysis of qualitative data. As a result of the data analysis, it was determined that GeoGebra software was an effective tool in increasing pre-service teachers' attitudes towards proof and proving.

  1. Improvement of Stand Jig Sealer and Its Increased Production Capacity

    NASA Astrophysics Data System (ADS)

    Soebandrija, K. E. N.; Astuti, S. W. D.

    2014-03-01

    This paper aims to show that improvement of the Stand Jig Sealer can achieve the cycle time target as part of continuous improvement efforts and increased productivity. Prior research, from classic works such as Quesnay (1766) and Solow (1957) to more recent studies such as Reikard (2011), is reviewed and elaborated. The research is then narrowed down to the automotive industry and, eventually, to the related software tools SPSS and Structural Equation Modeling (SEM). The analysis is based on working-time calculations, reinforced by hypothesis tests in SPSS Version 19 and by parameters of production efficiency, productivity, and financial investment. The results show achievement of the cycle time target of ≤ 80 seconds after improvement of the stand jig sealer. The SPSS-19 calculations comprise the following aspects: the one-sided hypothesis test rejects Ho: μ ≥ 80 seconds; correlation rs = 0.84; regression y = 0.159 + 0.642x; validity R table = 0.4438; reliability (Cronbach's alpha) = 0.885 > 0.70; independence (Chi-square) Asymp. Sig = 0.028 < 0.05; 95% efficiency; an 11% increase in productivity; and financial analysis (NPV 2,340,596 > 0, PI 2.04 > 1, IRR 45.56% > i = 12.68%, PP = 1.86). These results support the hypothesis and align with the objective of this paper: to show that improvement of the Stand Jig Sealer achieves the cycle time target and, consequently, increases production capacity at PT. Astra Daihatsu Motor.
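    An illustrative re-computation of the kinds of checks reported above: a one-sided test that the mean cycle time is below the 80 s target, and basic NPV and payback figures. The numbers are made up for the sketch, not the plant's data.

```python
# One-sided one-sample t-test and a simple investment appraisal, on made-up numbers.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
cycle_times = rng.normal(loc=78.0, scale=2.0, size=30)   # seconds, post-improvement
t_stat, p_value = stats.ttest_1samp(cycle_times, 80.0, alternative="less")
print("H0: mean >= 80 s  ->  t =", round(t_stat, 2), " p =", round(p_value, 4))

# Simple financial appraisal of the improvement (cash flows in arbitrary units)
rate = 0.1268
cash_flows = np.array([-1_000_000, 600_000, 600_000, 600_000])
npv = np.sum(cash_flows / (1 + rate) ** np.arange(len(cash_flows)))
payback_years = np.argmax(np.cumsum(cash_flows) > 0)   # first year with positive cumulative cash
print("NPV:", round(npv), " payback (years):", payback_years)
```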

  2. Selective Weighted Least Squares Method for Fourier Transform Infrared Quantitative Analysis.

    PubMed

    Wang, Xin; Li, Yan; Wei, Haoyun; Chen, Xia

    2017-06-01

    Classical least squares (CLS) regression is a popular multivariate statistical method used frequently for quantitative analysis using Fourier transform infrared (FT-IR) spectrometry. Classical least squares provides the best unbiased estimator for uncorrelated residual errors with zero mean and equal variance. However, the noise in FT-IR spectra, which accounts for a large portion of the residual errors, is heteroscedastic. Thus, if this noise with zero mean dominates the residual errors, the weighted least squares (WLS) regression method described in this paper is a better estimator than CLS. However, if bias errors, such as the residual baseline error, are significant, WLS may perform worse than CLS. In this paper, we compare the effect of noise and bias error on CLS and WLS in quantitative analysis. Results indicated that for wavenumbers with low absorbance, the bias error significantly affected the error, such that the performance of CLS is better than that of WLS. However, for wavenumbers with high absorbance, the noise significantly affected the error, and WLS proves to be better than CLS. Thus, we propose a selective weighted least squares (SWLS) regression that processes data at different wavenumbers using either CLS or WLS based on a selection criterion, i.e., lower or higher than an absorbance threshold. The effects of various factors on the optimal threshold value (OTV) for SWLS have been studied through numerical simulations. These studies reported that: (1) the concentration and the analyte type had minimal effect on OTV; and (2) the major factor that influences OTV is the ratio between the bias error and the standard deviation of the noise. The last part of this paper is dedicated to quantitative analysis of methane gas spectra and methane/toluene mixture gas spectra measured using FT-IR spectrometry, analyzed with CLS, WLS, and SWLS. The standard error of prediction (SEP), bias of prediction (bias), and the residual sum of squares of the errors (RSS) from the three quantitative analyses were compared. In methane gas analysis, SWLS yielded the lowest SEP and RSS among the three methods. In methane/toluene mixture gas analysis, a modification of the SWLS has been presented to tackle the bias error from other components. The SWLS without modification presents the lowest SEP in all cases but not the lowest bias and RSS. The modification of SWLS reduced the bias and showed a lower RSS than CLS, especially for small components.
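    A hedged sketch of the two building blocks behind the selective scheme: a classical least squares (CLS) estimate and a weighted least squares (WLS) estimate of component concentrations from a mixture spectrum, plus the absorbance-threshold split that decides which channels each estimator handles. The spectra, the noise model (assumed known here) and the threshold are simulated placeholders; the paper's exact combination rule is not reproduced.

```python
# CLS and WLS concentration estimates split by an absorbance threshold (simulated spectra).
import numpy as np

rng = np.random.default_rng(7)
n_wavenumbers, n_components = 400, 2
K = np.abs(rng.normal(size=(n_wavenumbers, n_components)))   # pure-component spectra
c_true = np.array([0.7, 0.3])
noise_sd = 0.002 + 0.01 * (K @ c_true)                        # heteroscedastic noise (assumed known)
a = K @ c_true + rng.normal(scale=noise_sd)                   # measured mixture spectrum

def cls(K, a):                      # uniform weights
    return np.linalg.lstsq(K, a, rcond=None)[0]

def wls(K, a, sd):                  # inverse-variance weights
    w = 1.0 / sd ** 2
    return np.linalg.solve(K.T @ (w[:, None] * K), K.T @ (w * a))

threshold = np.median(a)            # split channels by an absorbance threshold
low, high = a < threshold, a >= threshold
print("CLS on low-absorbance channels :", cls(K[low], a[low]))
print("WLS on high-absorbance channels:", wls(K[high], a[high], noise_sd[high]))
```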

  3. Linear models for calculating digestible energy for sheep diets.

    PubMed

    Fonnesbeck, P V; Christiansen, M L; Harris, L E

    1981-05-01

    Equations for estimating the digestible energy (DE) content of sheep diets were generated from the chemical contents and a factorial description of diets fed to lambs in digestion trials. The diet factors were two forages (alfalfa and grass hay), harvested at three stages of maturity (late vegetative, early bloom and full bloom), fed in two ingredient combinations (all hay or a 50:50 hay and corn grain mixture) and prepared by two forage texture processes (coarsely chopped or finely chopped and pelleted). The 2 x 3 x 2 x 2 factorial arrangement produced 24 diet treatments. These were replicated twice, for a total of 48 lamb digestion trials. In model 1 regression equations, DE was calculated directly from the chemical composition of the diet. In model 2, regression equations predicted the percentage of digested nutrient from the chemical contents of the diet, and DE of the diet was then calculated as the sum of the gross energy of the digested organic components. Expanded forms of model 1 and model 2 were also developed that included diet factors as qualitative indicator variables to adjust the regression constant and regression coefficients for the diet description. The expanded forms of the equations accounted for significantly more variation in DE than did the simple models and more accurately estimated DE of the diet. Information provided by the diet description proved as useful as chemical analyses for the prediction of digestibility of nutrients. The statistics indicate that, with model 1, neutral detergent fiber and plant cell wall analyses provided as much information for the estimation of DE as did model 2 with the combined information from crude protein, available carbohydrate, total lipid, cellulose and hemicellulose. Regression equations are presented for estimating DE with the most currently analyzed organic components, including linear and curvilinear variables and diet factors that significantly reduce the standard error of the estimate. To estimate DE of a diet, the user selects the equation that makes the most effective use of the available chemical analysis information and diet description.

  4. Disentangling the Correlates of Drug Use in a Clinic and Community Sample: A Regression Analysis of the Associations between Drug Use, Years-of-School, Impulsivity, IQ, Working Memory, and Psychiatric Symptoms.

    PubMed

    Heyman, Gene M; Dunn, Brian J; Mignone, Jason

    2014-01-01

    Years-of-school is negatively correlated with illicit drug use. However, educational attainment is positively correlated with IQ and negatively correlated with impulsivity, two traits that are also correlated with drug use. Thus, the negative correlation between education and drug use may reflect the correlates of schooling, not schooling itself. To help disentangle these relations we obtained measures of working memory, simple memory, IQ, disposition (impulsivity and psychiatric status), years-of-school and frequency of illicit and licit drug use in methadone clinic and community drug users. We found strong zero-order correlations between all measures, including IQ, impulsivity, years-of-school, psychiatric symptoms, and drug use. However, multiple regression analyses revealed a different picture. The significant predictors of illicit drug use were gender, involvement in a methadone clinic, and years-of-school. That is, psychiatric symptoms, impulsivity, cognition, and IQ no longer predicted illicit drug use in the multiple regression analyses. Moreover, high risk subjects (low IQ and/or high impulsivity) who spent 14 or more years in school used stimulants and opiates less than did low risk subjects who had spent <14 years in school. Smoking and drinking had a different correlational structure. IQ and years-of-school predicted whether someone ever became a smoker, whereas impulsivity predicted the frequency of drinking bouts, but years-of-school did not. Many subjects reported no use of one or more drugs, resulting in a large number of "zeroes" in the data sets. Cragg's Double-Hurdle regression method proved the best approach for dealing with this problem. To our knowledge, this is the first report to show that years-of-school predicts lower levels of illicit drug use after controlling for IQ and impulsivity. This paper also highlights the advantages of Double-Hurdle regression methods for analyzing the correlates of drug use in community samples.
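    A simplified two-part sketch in the spirit of a hurdle model for zero-inflated drug-use counts: a logistic regression for "any use vs none", then a regression on the positive counts only. This is a simplification of Cragg's double-hurdle specification (which models both parts jointly with a truncated distribution); data and covariate names are simulated stand-ins.

```python
# Two-part (hurdle-style) regression on simulated zero-inflated frequency data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 400
years_school = rng.normal(13, 2, n)
iq = rng.normal(100, 15, n)
p_any = 1 / (1 + np.exp(-(3.0 - 0.25 * years_school)))           # hurdle: any use at all
any_use = rng.binomial(1, p_any)
freq = np.where(any_use == 1, rng.poisson(np.exp(3.0 - 0.15 * years_school)), 0)

X = sm.add_constant(np.column_stack([years_school, iq]))

hurdle = sm.Logit(any_use, X).fit(disp=False)                     # part 1: zero vs non-zero
pos = freq > 0
intensity = sm.GLM(freq[pos], X[pos], family=sm.families.Poisson()).fit()  # part 2: positives

print("hurdle (log-odds of any use):", hurdle.params)
print("intensity (log frequency)   :", intensity.params)
```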

  5. Relation between burnout syndrome and job satisfaction among mental health workers.

    PubMed

    Ogresta, Jelena; Rusac, Silvia; Zorec, Lea

    2008-06-01

    To identify predictors of burnout syndrome, such as job satisfaction and manifestations of occupational stress, in mental health workers. The study included a snowball sample of 174 mental health workers in Croatia. The following measurement instruments were used: Maslach Burnout Inventory, Manifestations of Occupational Stress Survey, and Job Satisfaction Survey. We correlated dimensions of burnout syndrome with job satisfaction and manifestations of occupational stress dimensions. We also performed multiple regression analysis using three dimensions of burnout syndrome--emotional exhaustion, depersonalization, and personal accomplishment. Stepwise multiple regression analysis showed that pay and rewards satisfaction (beta=-0.37), work climate (beta=-0.18), advancement opportunities (beta=0.17), and the degree of psychological (beta=0.41) and physical (beta=0.29) manifestations of occupational stress were significant predictors of emotional exhaustion (R=0.76; F=30.02; P<0.001). The frequency of negative emotional and behavioral reactions toward patients and colleagues (beta=0.48), psychological (beta=0.27) and physical (beta=0.24) manifestations of occupational stress, and pay and rewards satisfaction (beta=0.22) were significant predictors of depersonalization (R=0.57; F=13.01; P<0.001). Satisfaction with the work climate (beta=-0.20) was a significant predictor of lower levels of personal accomplishment (R=0.20; F=5.06; P<0.005). Mental health workers exhibited a moderate degree of burnout syndrome, but there were no significant differences regarding their occupation. Generally, both dimensions of job satisfaction and manifestations of occupational stress proved to be relevant predictors of burnout syndrome.

  6. Paediatric bacterial keratitis cases in Shanghai: microbiological profile, antibiotic susceptibility and visual outcomes

    PubMed Central

    Hong, J; Chen, J; Sun, X; Deng, S X; Chen, L; Gong, L; Cao, W; Yu, X; Xu, J

    2012-01-01

    Purpose The purpose of this study was to review the microbiological profile, in vitro antibiotic susceptibility and visual outcomes of paediatric microbial keratitis in Shanghai, China over the past 6 years. Methods Medical records of patients aged ≤16 years were reviewed, who were diagnosed as having bacterial keratitis between 1 January 2005 and 31 December 2010. Bacterial culture results and in vitro antibiotic susceptibility were analysed. A logistic regression analysis was conducted to evaluate the relationship between visual impairment and possible risk factors. Results Eighty consecutive cases of paediatric bacterial keratitis were included, of which 59 had positive cultures. Staphylococcus epidermidis was the most commonly isolated organism (n=23; 39.0%), followed by Streptococcus pneumoniae (n=11; 18.6%) and Pseudomonas aeruginosa (n=6; 10.2%). Antibiotic sensitivities revealed that tested bacteria had low resistance rates to fluoroquinolones and aminoglycosides (8.3–18.4% and 12.5–24.4%, respectively). Multivariate logistic regression analysis proved that visual impairment was significantly associated with Gram-negative bacterial infection (odds ratio (OR)=7.626; P=0.043) and an increasing number of resistant antibiotics (OR=0.385; P=0.040). Conclusions S. epidermidis was the most common isolated organism in Shanghai paediatric keratitis. The fluoroquinolones and aminoglycosides remained good choices for treating these patients. Gram-negative bacterial infection and an increasing number of resistant antibiotics were associated with worse visual prognoses in paediatric keratitis. PMID:23079751

  7. Understanding handpump sustainability: Determinants of rural water source functionality in the Greater Afram Plains region of Ghana†

    PubMed Central

    Shields, Katherine F.; Chan, Terence U.; Christenson, Elizabeth; Cronk, Ryan D.; Leker, Hannah; Samani, Destina; Apoya, Patrick; Lutz, Alexandra

    2015-01-01

    Abstract Safe drinking water is critical to human health and development. In rural sub‐Saharan Africa, most improved water sources are boreholes with handpumps; studies suggest that up to one third of these handpumps are nonfunctional at any given time. This work presents findings from a secondary analysis of cross‐sectional data from 1509 water sources in 570 communities in the rural Greater Afram Plains (GAP) region of Ghana; one of the largest studies of its kind. 79.4% of enumerated water sources were functional when visited; in multivariable regressions, functionality depended on source age, management, tariff collection, the number of other sources in the community, and the district. A Bayesian network (BN) model developed using the same data set found strong dependencies of functionality on implementer, pump type, management, and the availability of tools, with synergistic effects from management determinants on functionality, increasing the likelihood of a source being functional from a baseline of 72% to more than 97% with optimal management and available tools. We suggest that functionality may be a dynamic equilibrium between regular breakdowns and repairs, with management a key determinant of repair rate. Management variables may interact synergistically in ways better captured by BN analysis than by logistic regressions. These qualitative findings may prove generalizable beyond the study area, and may offer new approaches to understanding and increasing handpump functionality and safe water access. PMID:27667863

  8. Understanding handpump sustainability: Determinants of rural water source functionality in the Greater Afram Plains region of Ghana

    NASA Astrophysics Data System (ADS)

    Fisher, Michael B.; Shields, Katherine F.; Chan, Terence U.; Christenson, Elizabeth; Cronk, Ryan D.; Leker, Hannah; Samani, Destina; Apoya, Patrick; Lutz, Alexandra; Bartram, Jamie

    2015-10-01

    Safe drinking water is critical to human health and development. In rural sub-Saharan Africa, most improved water sources are boreholes with handpumps; studies suggest that up to one third of these handpumps are nonfunctional at any given time. This work presents findings from a secondary analysis of cross-sectional data from 1509 water sources in 570 communities in the rural Greater Afram Plains (GAP) region of Ghana; one of the largest studies of its kind. 79.4% of enumerated water sources were functional when visited; in multivariable regressions, functionality depended on source age, management, tariff collection, the number of other sources in the community, and the district. A Bayesian network (BN) model developed using the same data set found strong dependencies of functionality on implementer, pump type, management, and the availability of tools, with synergistic effects from management determinants on functionality, increasing the likelihood of a source being functional from a baseline of 72% to more than 97% with optimal management and available tools. We suggest that functionality may be a dynamic equilibrium between regular breakdowns and repairs, with management a key determinant of repair rate. Management variables may interact synergistically in ways better captured by BN analysis than by logistic regressions. These qualitative findings may prove generalizable beyond the study area, and may offer new approaches to understanding and increasing handpump functionality and safe water access. This article was corrected on 11 Nov 2015. See the end of the full text for details.

  9. Modeling Verdict Outcomes Using Social Network Measures: The Watergate and Caviar Network Cases.

    PubMed

    Masías, Víctor Hugo; Valle, Mauricio; Morselli, Carlo; Crespo, Fernando; Vargas, Augusto; Laengle, Sigifredo

    2016-01-01

    Modelling criminal trial verdict outcomes using social network measures is an emerging research area in quantitative criminology. Few studies have yet analyzed which of these measures are the most important for verdict modelling or which data classification techniques perform best for this application. To compare the performance of different techniques in classifying members of a criminal network, this article applies three different machine learning classifiers-Logistic Regression, Naïve Bayes and Random Forest-with a range of social network measures and the necessary databases to model the verdicts in two real-world cases: the U.S. Watergate Conspiracy of the 1970s and the now-defunct Canada-based international drug trafficking ring known as the Caviar Network. In both cases it was found that the Random Forest classifier did better than either Logistic Regression or Naïve Bayes, and its superior performance was statistically significant. This being so, Random Forest was used not only for classification but also to assess the importance of the measures. For the Watergate case, the most important one proved to be betweenness centrality while for the Caviar Network, it was the effective size of the network. These results are significant because they show that an approach combining machine learning with social network analysis not only can generate accurate classification models but also helps quantify the importance of social network variables in modelling verdict outcomes. We conclude our analysis with a discussion and some suggestions for future work in verdict modelling using social network measures.
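    A sketch of the model comparison described above, using simulated "network measure" features: cross-validated accuracy for Logistic Regression, Naïve Bayes and Random Forest, followed by Random Forest feature importances. The feature names are illustrative, not the study's exact measures.

```python
# Compare three classifiers on synthetic network-measure features, then rank importances.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=200, n_features=6, n_informative=3, random_state=0)
features = ["degree", "betweenness", "closeness", "eigenvector", "effective_size", "constraint"]

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "naive_bayes": GaussianNB(),
    "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
}
for name, model in models.items():
    print(name, round(cross_val_score(model, X, y, cv=5).mean(), 3))

rf = models["random_forest"].fit(X, y)
for f, imp in sorted(zip(features, rf.feature_importances_), key=lambda t: -t[1]):
    print(f, round(imp, 3))
```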

  10. Test Planning, Collection, and Analysis of Pressure Data Resulting from Army Weapon Systems. Volume IV. Data Analysis of the M198 and M109 May 1979 Firings.

    DTIC Science & Technology

    1980-05-01

    the M203 charge during May 1979 at Aberdeen Proving Ground. The data collection and analysis effort is part of a continuing program undertaken by... May to 18 May 1979 the M198 towed howitzer and the M109 self-propelled howitzer were fired with the M203 charge at the Aberdeen Proving Ground... howitzer and the M109 self-propelled howitzer were fired with the M203 charge at the Aberdeen Proving Ground. This section of the report gives the

  11. M1A2 Adjunct Analysis (POSNOV Volume)

    DTIC Science & Technology

    1989-12-01

    MD 20814-2797 Director 2 U.S. Army Materiel Systems Analysis Activity ATTN: AMXSY-CS, AMXSY-GA Aberdeen Proving Ground, MD 21005-5071 U.S. Army... Leonard Wood, MO Commander U.S. Army Ordnance Center & School ATTN: ATSL-CD-CS Aberdeen Proving Ground, MD 21005 Commander 2 U.S. Army Soldier Support... NJ Commander U.S. Army Test and Evaluation Command ATTN: AMSTE-CM-R Aberdeen Proving Ground, MD 21005 Commander U.S. Army Tank Automotive Command

  12. Reduced opsin gene expression in a cave-dwelling fish

    PubMed Central

    Tobler, Michael; Coleman, Seth W.; Perkins, Brian D.; Rosenthal, Gil G.

    2010-01-01

    Regressive evolution of structures associated with vision in cave-dwelling organisms is the focus of intense research. Most work has focused on differences between extreme visual phenotypes: sighted, surface animals and their completely blind, cave-dwelling counterparts. We suggest that troglodytic systems, comprising multiple populations that vary along a gradient of visual function, may prove critical in understanding the mechanisms underlying initial regression in visual pathways. Gene expression assays of natural and laboratory-reared populations of the Atlantic molly (Poecilia mexicana) revealed reduced opsin expression in cave-dwelling populations compared with surface-dwelling conspecifics. Our results suggest that the reduction in opsin expression in cave-dwelling populations is not phenotypically plastic but reflects a hardwired system not rescued by exposure to light during retinal ontogeny. Changes in opsin gene expression may consequently represent a first evolutionary step in the regression of eyes in cave organisms. PMID:19740890

  13. Ballistic Analysis of Firing Table Data for 155MM, M825 Smoke Projectile

    DTIC Science & Technology

    1990-09-01

    PROVING GROUND, MARYLAND... U.S. Army Ballistic Research Laboratory, ATTN: SLCBR-DD-T, BRL-R-3865, Aberdeen Proving Ground... thru September 1988 at Dugway Proving Ground. Such an analysis will consider whether the M825 MOD PIP Base projectile is ballistically matched or

  14. [Contraceptive practices among university students: the use of emergency contraception].

    PubMed

    Borges, Ana Luiza Vilela; Fujimori, Elizabeth; Hoga, Luiza Akiko Komura; Contin, Marcelo Vieira

    2010-04-01

    This study investigated contraceptive practices and especially the use of emergency contraception by 487 young students at a public university in São Paulo State. A structured questionnaire was sent by e-mail and completed online in December 2007. Contraceptive methods and use of emergency contraception were investigated. Female and male students reported a high proportion of contraceptive use, mainly condoms and the pill. Half of the students had already used emergency contraception, often when already using some other highly effective method. Among female students, multiple regression analysis showed that current age, age at sexual initiation, not having used condoms in sexual relations, condom failure, and knowing someone that has used emergency contraception were associated with use of the latter. The option for emergency contraception proved to be more closely related to inconsistencies in the use of regular methods than to lack of their use, and can thus be considered a marker for discontinuity in regular contraception.

  15. Total body water and lean body mass estimated by ethanol dilution

    NASA Technical Reports Server (NTRS)

    Loeppky, J. A.; Myhre, L. G.; Venters, M. D.; Luft, U. C.

    1977-01-01

    A method for estimating total body water (TBW) using breath analyses of blood ethanol content is described. Regression analysis of ethanol concentration curves permits determination of a theoretical concentration that would have existed if complete equilibration had taken place immediately upon ingestion of the ethanol; the water fraction of normal blood may then be used to calculate TBW. The ethanol dilution method is applied to 35 subjects, and comparison with a tritium dilution method of determining TBW indicates that the correlation between the two procedures is highly significant. Lean body mass and fat fraction were determined by hydrostatic weighing, and these data also prove compatible with results obtained from the ethanol dilution method. In contrast to the radioactive tritium dilution method, the ethanol dilution method can be repeated daily with its applicability ranging from diseased individuals to individuals subjected to thermal stress, strenuous exercise, water immersion, or the weightless conditions of space flights.
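    A worked sketch of the dilution principle described above: back-extrapolate the post-absorption ethanol concentration curve to the time of ingestion, divide the dose by that concentration to get total body water, and convert to lean body mass via the usual hydration constant. All numbers and constants here are illustrative assumptions, not the study's values.

```python
# Ethanol-dilution estimate of total body water and lean body mass (illustrative numbers).
import numpy as np

t = np.array([60, 90, 120, 150, 180])                    # min after ingestion
conc = np.array([0.62, 0.59, 0.56, 0.53, 0.50])          # g ethanol per L body water (assumed)

slope, intercept = np.polyfit(t, conc, 1)                # linear elimination phase
c0 = intercept                                           # concentration extrapolated to t = 0

dose_g = 25.0                                            # ethanol ingested (g), assumed
tbw_litres = dose_g / c0                                 # dilution principle: V = dose / C0
lean_body_mass = tbw_litres / 0.73                       # ~73% water in lean tissue (textbook value)
print(round(tbw_litres, 1), "L TBW ->", round(lean_body_mass, 1), "kg lean body mass")
```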

  16. Resolving the percentage of component terrains within single resolution elements

    NASA Technical Reports Server (NTRS)

    Marsh, S. E.; Switzer, P.; Kowalik, W. S.; Lyon, R. J. P.

    1980-01-01

    An approximate maximum likelihood technique employing a widely available discriminant analysis program is discussed that has been developed for resolving the percentage of component terrains within single resolution elements. The method uses all four channels of Landsat data simultaneously and does not require prior knowledge of the percentage of components in mixed pixels. It was tested in five cases that were chosen to represent mixtures of outcrop, soil and vegetation which would typically be encountered in geologic studies with Landsat data. For all five cases, the method proved to be superior to single band weighted average and linear regression techniques and permitted an estimate of the total area occupied by component terrains to within plus or minus 6% of the true area covered. Its major drawback is a consistent overestimation of the pixel component percent of the darker materials (vegetation) and an underestimation of the pixel component percent of the brighter materials (sand).

  17. A new feature constituting approach to detection of vocal fold pathology

    NASA Astrophysics Data System (ADS)

    Hariharan, M.; Polat, Kemal; Yaacob, Sazali

    2014-08-01

    In the last two decades, non-invasive methods based on acoustic analysis of the voice signal have proved to be an excellent and reliable tool for diagnosing vocal fold pathologies. This paper proposes a new feature vector based on the wavelet packet transform and singular value decomposition for the detection of vocal fold pathology. A k-means clustering based feature weighting is proposed to increase the distinguishing performance of the proposed features. In this work, two databases, the Massachusetts Eye and Ear Infirmary (MEEI) voice disorders database and the MAPACI speech pathology database, are used. Four different supervised classifiers, k-nearest neighbour (k-NN), least-square support vector machine, probabilistic neural network and general regression neural network, are employed for testing the proposed features. The experimental results show that the proposed features give a very promising classification accuracy of 100% for both the MEEI database and the MAPACI speech pathology database.
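    A hedged sketch of the feature pipeline described above: wavelet packet decomposition of a voice frame, singular values of the coefficient matrix as the feature vector, and a k-NN classifier. The k-means-based feature weighting step from the paper is omitted, and the signals here are synthetic, so this only illustrates the idea.

```python
# Wavelet-packet + SVD features with a k-NN classifier on synthetic signals.
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier

def wp_svd_features(signal, wavelet="db4", level=3):
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    coeffs = np.vstack([node.data for node in nodes])      # (2**level) x n_coeffs matrix
    return np.linalg.svd(coeffs, compute_uv=False)         # singular values as features

rng = np.random.default_rng(9)
t = np.linspace(0, 1, 1024)
normal = [np.sin(2 * np.pi * 120 * t) + 0.1 * rng.normal(size=t.size) for _ in range(20)]
pathological = [np.sin(2 * np.pi * 120 * t) + 0.6 * rng.normal(size=t.size) for _ in range(20)]

X = np.array([wp_svd_features(s) for s in normal + pathological])
y = np.array([0] * 20 + [1] * 20)
clf = KNeighborsClassifier(n_neighbors=3).fit(X[::2], y[::2])   # simple even/odd split
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))
```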

  18. Analytical solution of Luedeking-Piret equation for a batch fermentation obeying Monod growth kinetics.

    PubMed

    Garnier, Alain; Gaillet, Bruno

    2015-12-01

    Few fermentation models admit analytical solutions of batch process dynamics. The most widely used is the combination of logistic microbial growth kinetics with the Luedeking-Piret bioproduct synthesis relation. However, the logistic equation is principally based on formalistic similarities and only fits a limited range of fermentation types. In this article, we have developed an analytical solution for the combination of Monod growth kinetics with the Luedeking-Piret relation, which can be identified by linear regression and used to simulate batch fermentation evolution. Two classical examples are used to show the quality of fit and the simplicity of the proposed method. A solution for the combination of the Haldane substrate-limited growth model with the Luedeking-Piret relation is also provided. These models could prove useful for the analysis of fermentation data in industry as well as academia. © 2015 Wiley Periodicals, Inc.
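    A numerical companion to the analytical result described above: the Monod growth / Luedeking-Piret product model integrated directly, so that a closed-form solution (or a fit obtained by linear regression, as in the paper) can be checked against it. Parameter values are illustrative, not taken from the article's examples.

```python
# Direct integration of Monod growth with Luedeking-Piret product formation.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, Ks, Yxs = 0.4, 0.5, 0.5       # 1/h, g/L, g biomass per g substrate (assumed)
alpha, beta = 2.0, 0.05               # growth- and non-growth-associated product terms

def rhs(t, z):
    X, S, P = z
    mu = mu_max * S / (Ks + S)        # Monod specific growth rate
    dX = mu * X
    dS = -dX / Yxs
    dP = alpha * dX + beta * X        # Luedeking-Piret: dP/dt = alpha dX/dt + beta X
    return [dX, dS, dP]

sol = solve_ivp(rhs, (0, 24), [0.1, 10.0, 0.0], dense_output=True, max_step=0.1)
for ti, (X, S, P) in zip(np.linspace(0, 24, 7), sol.sol(np.linspace(0, 24, 7)).T):
    print(f"t={ti:5.1f} h  X={X:6.3f}  S={max(S, 0):6.3f}  P={P:6.3f}")
```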

  19. The effectiveness of nutrition education and labeling in Dutch supermarkets.

    PubMed

    Steenhuis, Ingrid; van Assema, Patricia; van Breukelen, Gerard; Glanz, Karen

    2004-01-01

    Nutrition education and labeling may help consumers to eat less fat. The purpose of this study is to assess the effect of nutrition education, with and without shelf labeling, on reducing fat intake in Dutch supermarkets. The study used a randomized, pretest-posttest, experimental control group design. In total, 2203 clients of 13 supermarkets were included in the sample. Clients' total fat intake and behavioral determinants of eating less fat were measured by a questionnaire. A mixed-effect regression model was used for the analysis. No significant effects were found for the educational intervention, alone or with the labeling, on total fat intake or on the psychosocial determinants of eating less fat. Nutrition education and labeling of low-fat food products in supermarkets did not prove to be effective strategies. The fact that the supermarket is a highly competitive environment may have accounted for this lack of effect.

  20. [Sanitation and racial inequality conditions in urban Brazil: an analysis focused on the indigenous population based on the 2010 Population Census].

    PubMed

    Raupp, Ludimila; Fávaro, Thatiana Regina; Cunha, Geraldo Marcelo; Santos, Ricardo Ventura

    2017-01-01

    The aims of this study were to analyze and describe the presence and infrastructure of basic sanitation in the urban areas of Brazil, contrasting indigenous with non-indigenous households. Methods: A cross-sectional study based on microdata from the 2010 Census was conducted. The analyses were based on descriptive statistics (prevalence) and the construction of multiple logistic regression models (adjusted by socioeconomic and demographic covariates). The odds ratios were estimated for the association between the explanatory variables (covariates) and the outcome variables (water supply, sewage, garbage collection, and adequate sanitation). The statistical significance level established was 5%. Among the analyzed services, sewage proved to be the most precarious. Regarding race or color, indigenous households presented the lowest rate of sanitary infrastructure in Urban Brazil. The adjusted regression showed that, in general, indigenous households were at a disadvantage when compared to other categories of race or color, especially in terms of the presence of garbage collection services. These inequalities were much more pronounced in the South and Southeastern regions. The analyses of this study not only confirm the profile of poor conditions and infrastructure of the basic sanitation of indigenous households in urban areas, but also demonstrate the persistence of inequalities associated with race or color in the country.

  1. Photo diagnosis of early pre cancer (LSIL) in genital tissue

    NASA Astrophysics Data System (ADS)

    Vaitkuviene, A.; Andersen-Engels, S.; Auksorius, E.; Bendsoe, N.; Gavriushin, V.; Gustafsson, U.; Oyama, J.; Palsson, S.; Soto Thompson, M.; Stenram, U.; Svanberg, K.; Viliunas, V.; De Weert, M. J.

    2005-11-01

    Persistent infections are recognized as an oncogenic factor, and sexually transmitted diseases are common concomitant conditions in early precancerous genital tract lesions. The aim of this study is simple optical detection of early, regressive pre-cancer of the cervix. Hereditary immunosuppression is most likely a risk factor for cervical cancer development. Light-induced fluorescence point monitoring was fitted to live cervical tissue diagnostics in 42 patients. Human papilloma virus DNA in the cervix was tested by means of the Hybrid Capture II method. Ultraviolet (337 nm) laser-excited fluorescence spectra of live cervical tissue were analyzed by the principal component (PC) regression method and by a spectra decomposition method. The PC regression method best discriminated the pathology group "CIN I and inflammation" (AUC=75%), related to fluorescence emission in the short-wavelength region. The spectra decomposition method suggested a few possible fluorophores in the long-wavelength region. Excitation of live cervix with 398 nm light showed a sharp, selective enhancement of spectral intensity above 600 nm for high-grade cervical lesions. Conclusion: PC analysis of UV (337 nm) excitation fluorescence spectra makes it possible to obtain information related to local immunity and low-grade cervical lesions. The addition of shorter and longer excitation wavelengths is promising for progress in multi-wavelength LIF point monitoring for cervical pre-cancer diagnostics, and for cancer prevention, especially in developing countries.

  2. PACSIN2 polymorphism is associated with thiopurine-induced hematological toxicity in children with acute lymphoblastic leukaemia undergoing maintenance therapy.

    PubMed

    Smid, Alenka; Karas-Kuzelicki, Natasa; Jazbec, Janez; Mlinaric-Rascan, Irena

    2016-07-25

    Adequate maintenance therapy for childhood acute lymphoblastic leukemia (ALL), with 6-mercaptopurine as an essential component, is necessary for retaining durable remission. Interruptions or discontinuations of the therapy due to drug-related toxicities, which can be life threatening, may result in an increased risk of relapse. In this retrospective study including 305 paediatric ALL patients undergoing maintenance therapy, we systematically investigated the individual and combined effects of genetic variants of folate pathway enzymes, as well as of polymorphisms in PACSIN2 and ITPA, on drug-induced toxicities by applying a multi-analytical approach including logistic regression (LR), classification and regression tree (CART) and generalized multifactor dimensionality reduction (GMDR). In addition to the TPMT genotype, confirmed to be a major determinant of drug related toxicities, we identified the PACSIN2 rs2413739TT genotype as being a significant risk factor for 6-MP-induced toxicity in wild-type TPMT patients. A gene-gene interaction between MTRR (rs1801394) and MTHFR (rs1801133) was detected by GMDR and proved to have an independent effect on the risk of stomatitis, as shown by LR analysis. To our knowledge, this is the first study showing PACSIN2 genotype association with hematological toxicity in ALL patients undergoing maintenance therapy.

  3. Parsimonious estimation of the Wechsler Memory Scale, Fourth Edition demographically adjusted index scores: immediate and delayed memory.

    PubMed

    Miller, Justin B; Axelrod, Bradley N; Schutte, Christian

    2012-01-01

    The recent release of the Wechsler Memory Scale Fourth Edition contains many improvements from a theoretical and administration perspective, including demographic corrections using the Advanced Clinical Solutions. Although the administration time has been reduced from previous versions, a shortened version may be desirable in certain situations given practical time limitations in clinical practice. The current study evaluated two- and three-subtest estimations of demographically corrected Immediate and Delayed Memory index scores using both simple arithmetic prorating and regression models. All estimated values were significantly associated with observed index scores. Use of Lin's Concordance Correlation Coefficient as a measure of agreement showed a high degree of precision and virtually zero bias in the models, although the regression models showed a stronger association than prorated models. Regression-based models proved to be more accurate than prorated estimates with less dispersion around observed values, particularly when using three subtest regression models. Overall, the present research shows strong support for estimating demographically corrected index scores on the WMS-IV in clinical practice with an adequate performance using arithmetically prorated models and a stronger performance using regression models to predict index scores.
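    A small sketch of the agreement statistic used above: Lin's concordance correlation coefficient between observed index scores and their estimates (for example, prorated vs regression-based). The score vectors are made-up examples, not patient data.

```python
# Lin's concordance correlation coefficient (CCC) on illustrative score vectors.
import numpy as np

def lins_ccc(x, y):
    """Lin's CCC in its population-moment form."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

observed   = np.array([ 95, 102, 110,  88, 120, 105,  99, 115])
prorated   = np.array([ 93, 100, 113,  90, 118, 108,  96, 117])
regression = np.array([ 95, 101, 111,  88, 119, 106,  98, 115])
print("CCC prorated  :", round(lins_ccc(observed, prorated), 3))
print("CCC regression:", round(lins_ccc(observed, regression), 3))
```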

  4. Attitude towards intimate partner violence against women and risky sexual choices of Jamaican males.

    PubMed

    Gibbison, G A

    2007-01-01

    For young Jamaican men, it is necessary to prove their virility to their peers and prove to their parents that they are of heterosexual orientation. These demands have produced a society in which men are sexually aggressive, even to the point of using violence to control the sexual choices of women. This paper examines whether Jamaican men who support intimate partner violence (IPV) against women are more likely to have unsafe sexual practices and social attitudes that could increase women's risk of contracting sexually transmitted infections. Men who responded 'yes' to violence against women are more likely themselves to have multiple sexual partners and less likely to use condoms consistently. They are also more likely to have forced a partner to have sex within the last year. Multivariate regression analysis shows that men who responded 'yes' to IPV are likely to be young, less educated and living in urban areas. Clearly, women in certain regions or subpopulations face an increased risk of contracting sexually transmitted infections due to the sexual choices of their partners. Intervention programmes to reduce sexually transmitted infections need to be developed with specific aspects of the cultural context of sexual relationships in mind. It seems especially important that male sexual choices and attitudes be directly addressed. Specific suggestions are made about an approach that has a proven record of success in reducing risky practices in high risk groups.

  5. Prevalence of headache in adolescents and association with use of computer and videogames.

    PubMed

    Xavier, Michelle Katherine Andrade; Pitangui, Ana Carolina Rodarti; Silva, Georgia Rodrigues Reis; Oliveira, Valéria Mayaly Alves de; Beltrão, Natália Barros; Araújo, Rodrigo Cappato de

    2015-11-01

    The aim of this study was to determine the prevalence of headache in adolescents and its association with excessive use of electronic devices and games. The sample comprised 954 adolescents of both sexes (14 to 19 years) who answered a questionnaire about use of computers and electronic games, presence of headache, and physical activity. Binary and multinomial logistic regression, with a significance level of 5%, was used for the inferential analysis. The prevalence of headache was 80.6%. Excessive use of electronic devices proved to be a risk factor (OR = 1.21) for headache. Subjects aged between 14 and 16 years were less likely to report headache (OR = 0.64). Regarding classification, 17.9% of adolescents had tension-type headache, 19.3% had migraine and 43.4% other types of headache. Adolescents aged from 14 to 16 years had a lower chance (OR ≤ 0.68) of reporting tension-type headache and other types of headache. Excessive use of digital equipment and electronic games, and attending the third year of high school, proved to be risk factors for developing migraine-type headache (OR ≥ 1.84). There was a high prevalence of headache among adolescents, together with heavy use of electronic devices. We observed an association between excessive use of electronic devices and the presence of headache, and this habit is considered a risk factor, especially for the development of migraine-type headache.

  6. Different brain activations between own- and other-race face categorization: an fMRI study using group independent component analysis

    NASA Astrophysics Data System (ADS)

    Wei, Wenjuan; Liu, Jiangang; Dai, Ruwei; Feng, Lu; Li, Ling; Tian, Jie

    2014-03-01

    Previous behavioral research has shown that individuals process own- and other-race faces differently. One well-known effect is the other-race effect (ORE), which indicates that individuals categorize other-race faces more accurately and faster than own-race faces. Existing functional magnetic resonance imaging (fMRI) studies of the other-race effect have mainly focused on racial prejudice and socio-affective differences towards own- and other-race faces. In the present fMRI study, we adopted a race-categorization task to determine differences in activation level between categorizing own- and other-race faces. Thirty-one Chinese participants, who live in China with Chinese as the majority and who had no direct contact with Caucasian individuals, were recruited. We used group independent component analysis (ICA), a method of blind source signal separation that has proven promising for the analysis of fMRI data. The data were separated into 56 components, a number estimated from one subject using the Minimal Description Length (MDL) criterion. The components were sorted according to the multiple linear regression temporal sorting criterion, and the fitted regression parameters were used in statistical tests to evaluate the task-relatedness of the components. A one-way ANOVA was performed to test the significance of the component time courses across conditions. Our results showed that areas whose coordinates are similar to the right FFA coordinates reported in previous studies were more strongly activated for own-race faces than for other-race faces, while the precuneus showed greater activation for other-race faces than for own-race faces.

  7. Satellite-based prediction of rainfall interception by tropical forest stands of a human-dominated landscape in Central Sulawesi, Indonesia

    NASA Astrophysics Data System (ADS)

    Nieschulze, Jens; Erasmi, Stefan; Dietz, Johannes; Hölscher, Dirk

    2009-01-01

    Rainforest conversion to other land use types drastically alters the hydrological cycle, and changes in rainfall interception contribute significantly to the observed differences. However, little is known about the effects of more gradual changes in forest structure and at regional scales. We studied land use types ranging from natural forest over selectively logged forest to cacao agroforest in a lower montane region in Central Sulawesi, Indonesia, and tested the suitability of high-resolution optical satellite imagery for modeling observed interception patterns. Investigated characteristics indicating canopy structure were mean and standard deviation of reflectance values, local maxima, and self-similarity measures based on the grey level co-occurrence matrix and geostatistical variogram analysis. Previously studied and published rainfall interception data comprised twelve plots, and median values per land use type ranged from 30% in natural forest to 18% in cacao agroforests. A linear regression model with local maxima, mean contrast, and normalized difference vegetation index (NDVI) as regressors was able to explain more than 84% (adjusted R²) of the variation encountered in the data. Other investigated characteristics did not prove significant in the regression analysis. The model yielded stable results with respect to cross-validation and also produced realistic values and spatial patterns when applied at the landscape level (783.6 ha). High values of interception were rare and localized in natural forest stands distant from villages, whereas low interception characterized the intensively used sites close to settlements. We conclude that forest use intensity significantly reduced rainfall interception, that satellite image analysis can successfully be applied for its regional prediction, and that most forest in the study region has already been subject to human-induced structural changes.

  8. Systematic on-site monitoring of compliance dust samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grayson, R.L.; Gandy, J.R.

    1996-12-31

    Maintaining compliance with U.S. respirable coal mine dust standards can be difficult on high-productivity longwall panels. Comprehensive and systematic analysis of compliance dust sample data, coupled with access to the U.S. Bureau of Mines (USBM) DUSTPRO, can yield important information for use in maintaining compliance. The objective of this study was to develop and apply customized software for the collection, storage, modification, and analysis of respirable dust data while providing for flexible export of data and linking with the USBM's expert advisory system on dust control. An executable, IBM-compatible program was created and customized for use by the person in charge of collecting, submitting, analyzing, and monitoring respirable dust compliance samples. Both descriptive statistics and multiple regression analysis were incorporated. The software allows ASCII files to be exported and links directly with DUSTPRO. After development and validation of the software, longwall compliance data from two different mines were analyzed to evaluate the value of the software. Data included variables on respirable dust concentration, tons produced, the existence of roof/floor rock (dummy variable), and the sampling cycle (dummy variables). Because of confidentiality, specific data are not presented, only the equations and ANOVA tables. The final regression models explained 83.8% and 61.1% of the variation in the data for the two panels. Important correlations among variables within sampling cycles showed the value of using dummy variables for sampling cycles. The software proved flexible and fast for its intended use. The insights obtained from its use improved the systematic monitoring of respirable dust compliance data, especially for pinpointing the most effective dust control methods during specific sampling cycles.
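
    As a rough illustration of the type of regression incorporated in such software, the sketch below fits a multiple regression of dust concentration on production tonnage with dummy variables for roof/floor rock and sampling cycle. The data and variable names are synthetic assumptions, not the confidential mine data described above.

```python
# Multiple regression with dummy variables and an ANOVA table (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(2)
n = 120
df = pd.DataFrame({
    "dust_mg_m3": rng.gamma(2.0, 0.5, n),        # respirable dust concentration
    "tons": rng.normal(5000, 800, n),            # tons produced per shift
    "rock": rng.integers(0, 2, n),               # 1 = roof/floor rock present
    "cycle": rng.integers(1, 5, n),              # sampling cycle 1-4
})

# C(cycle) expands the sampling cycle into dummy variables automatically
model = smf.ols("dust_mg_m3 ~ tons + rock + C(cycle)", data=df).fit()
print(model.rsquared)        # share of variation explained
print(anova_lm(model))       # ANOVA table for the fitted model
```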

  9. Predictors of Readmission after Inpatient Plastic Surgery

    PubMed Central

    Jain, Umang; Salgado, Christopher; Mioton, Lauren; Rambachan, Aksharananda

    2014-01-01

    Background Understanding risk factors that increase readmission rates may help enhance patient education and set system-wide expectations. We aimed to provide benchmark data on causes and predictors of readmission following inpatient plastic surgery. Methods The 2011 National Surgical Quality Improvement Program dataset was reviewed for patients with both "Plastics" as their recorded surgical specialty and inpatient status. Readmission was tracked through the "Unplanned Readmission" variable. Patient characteristics and outcomes were compared using chi-squared analysis and Student's t-tests for categorical and continuous variables, respectively. Multivariate regression analysis was used for identifying predictors of readmission. Results A total of 3,671 inpatient plastic surgery patients were included. The unplanned readmission rate was 7.11%. Multivariate regression analysis revealed a history of chronic obstructive pulmonary disease (COPD) (odds ratio [OR], 2.01; confidence interval [CI], 1.12-3.60; P=0.020), previous percutaneous coronary intervention (PCI) (OR, 2.69; CI, 1.21-5.97; P=0.015), hypertension requiring medication (OR, 1.65; CI, 1.22-2.24; P<0.001), bleeding disorders (OR, 1.70; CI, 1.01-2.87; P=0.046), American Society of Anesthesiologists (ASA) class 3 or 4 (OR, 1.57; CI, 1.15-2.15; P=0.004), and obesity (body mass index ≥30) (OR, 1.43; CI, 1.09-1.88, P=0.011) to be significant predictors of readmission. Conclusions Inpatient plastic surgery has an associated 7.11% unplanned readmission rate. History of COPD, previous PCI, hypertension, ASA class 3 or 4, bleeding disorders, and obesity all proved to be significant risk factors for readmission. These findings will help to benchmark inpatient readmission rates and manage patient and hospital system expectations. PMID:24665418
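
    The readmission predictors above are reported as odds ratios with 95% confidence intervals from a multivariate logistic regression. The sketch below shows that style of analysis on synthetic data; the predictor names and prevalences are assumptions, not the NSQIP dataset.

```python
# Logistic regression with odds ratios and 95% confidence intervals (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 3671
df = pd.DataFrame({
    "copd": rng.integers(0, 2, n),
    "prior_pci": rng.integers(0, 2, n),
    "hypertension": rng.integers(0, 2, n),
    "obesity": rng.integers(0, 2, n),            # BMI >= 30
    "readmitted": rng.binomial(1, 0.07, n),      # ~7% unplanned readmission rate
})

X = sm.add_constant(df[["copd", "prior_pci", "hypertension", "obesity"]])
fit = sm.Logit(df["readmitted"], X).fit(disp=False)

or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
print(fit.pvalues)
```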

  10. Exploring public databases to characterize urban flood risks in Amsterdam

    NASA Astrophysics Data System (ADS)

    Gaitan, Santiago; ten Veldhuis, Marie-claire; van de Giesen, Nick

    2015-04-01

    Cities worldwide are challenged by increasing urban flood risks. Precise and realistic measures are required to decide upon investments to reduce their impacts. Obvious factors affecting flood risk include sewer system performance and urban topography. However, currently implemented sewer and topographic models do not provide realistic predictions of local flooding occurrence during heavy rain events. Assessing other factors such as spatially distributed rainfall and socioeconomic characteristics may help to explain the probability and impacts of urban flooding. Several public databases were analyzed: complaints about flooding made by citizens, rainfall depths (15 min and 100 ha spatio-temporal resolution), grids describing number of inhabitants, income, and housing price (1 ha and 25 ha resolution), and building age. Data analysis was done using Python and GIS programming, and included spatial indexing of the data, cluster analysis, and multivariate regression on the complaints. Complaints were used as a proxy to characterize flooding impacts. The cluster analysis, run on all the variables except the complaints, grouped part of the grid cells of central Amsterdam into a highly differentiated group, covering 10% of the analyzed area and accounting for 25% of registered complaints. The configuration of the analyzed variables in central Amsterdam coincides with a high complaint count. The remaining complaints were evenly dispersed among the other groups. An adjusted R² of 0.38 in the multivariate regression suggests that explanatory power can improve if additional variables are considered. While rainfall intensity explained 4% of the incidence of complaints, population density and building age each significantly explained around 20%. Data mining of public databases proved to be a valuable tool to identify factors explaining variability in the occurrence of urban pluvial flooding, though additional variables must be considered to fully explain flood risk variability.

  11. A quantitative analysis to objectively appraise drought indicators and model drought impacts

    NASA Astrophysics Data System (ADS)

    Bachmair, S.; Svensson, C.; Hannaford, J.; Barker, L. J.; Stahl, K.

    2016-07-01

    Drought monitoring and early warning is an important measure to enhance resilience towards drought. While there are numerous operational systems using different drought indicators, there is no consensus on which indicator best represents drought impact occurrence for any given sector. Furthermore, thresholds are widely applied in these indicators but, to date, little empirical evidence exists as to which indicator thresholds trigger impacts on society, the economy, and ecosystems. The main obstacle for evaluating commonly used drought indicators is a lack of information on drought impacts. Our aim was therefore to exploit text-based data from the European Drought Impact report Inventory (EDII) to identify indicators that are meaningful for region-, sector-, and season-specific impact occurrence, and to empirically determine indicator thresholds. In addition, we tested the predictability of impact occurrence based on the best-performing indicators. To achieve these aims we applied a correlation analysis and an ensemble regression tree approach, using Germany and the UK (the most data-rich countries in the EDII) as test beds. As candidate indicators we chose two meteorological indicators (Standardized Precipitation Index, SPI, and Standardized Precipitation Evaporation Index, SPEI) and two hydrological indicators (streamflow and groundwater level percentiles). The analysis revealed that accumulation periods of SPI and SPEI best linked to impact occurrence are longer for the UK compared with Germany, but there is variability within each country, among impact categories and, to some degree, seasons. The median of regression tree splitting values, which we regard as estimates of thresholds of impact occurrence, was around -1 for SPI and SPEI in the UK; distinct differences between northern/northeastern vs. southern/central regions were found for Germany. Predictions with the ensemble regression tree approach yielded reasonable results for regions with good impact data coverage. The predictions also provided insights into the EDII, in particular highlighting drought events where missing impact reports may reflect a lack of recording rather than true absence of impacts. Overall, the presented quantitative framework proved to be a useful tool for evaluating drought indicators, and to model impact occurrence. In summary, this study demonstrates the information gain for drought monitoring and early warning through impact data collection and analysis. It highlights the important role that quantitative analysis with impact data can have in providing "ground truth" for drought indicators, alongside more traditional stakeholder-led approaches.
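
    The thresholds above are derived from the splitting values of regression trees relating drought indicators to impact occurrence. The sketch below shows that idea on synthetic data: fit an ensemble of shallow regression trees and take the median of their root-node splitting values as a threshold estimate. It is not the EDII analysis itself, and the SPI-impact relationship is simulated.

```python
# Ensemble-of-regression-trees threshold estimation (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(4)
spi = rng.normal(0, 1, 400)                                    # Standardized Precipitation Index
impact = (spi < -1 + rng.normal(0, 0.3, 400)).astype(float)   # 1 = impact reported

forest = RandomForestRegressor(n_estimators=100, max_depth=2, random_state=4)
forest.fit(spi.reshape(-1, 1), impact)

# Collect the root-node splitting values of all trees; their median serves as an
# empirical estimate of the indicator threshold at which impacts start to occur.
splits = [tree.tree_.threshold[0] for tree in forest.estimators_]
print("median splitting value:", np.median(splits))
```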

  12. Intercomparison of infrared cavity leak-out spectroscopy and gas chromatography-flame ionization for trace analysis of ethane.

    PubMed

    Thelen, Sven; Miekisch, Wolfram; Halmer, Daniel; Schubert, Jochen; Hering, Peter; Mürtz, Manfred

    2008-04-15

    Comparison of two different methods for the measurement of ethane at the parts-per-billion (ppb) level is reported. We used cavity leak-out spectroscopy (CALOS) in the 3 microm wavelength region and gas chromatography-flame ionization detection (GC-FID) for the analysis of various gas samples containing ethane fractions in synthetic air. Intraday and interday reproducibilities were studied. Intercomparing the results of two series involving seven samples with ethane mixing ratios ranging from 0.5 to 100 ppb, we found a reasonable agreement between both methods. The scatter plot of GC-FID data versus CALOS data yields a linear regression slope of 1.07 +/- 0.03. Furthermore, some of the ethane mixtures were checked over the course of 1 year, which proved the long-term stability of the ethane mixing ratio. We conclude that CALOS shows equivalent ethane analysis precision compared to GC-FID, with the significant advantage of a much higher time resolution (<1 s) since there is no requirement for sample preconcentration. This opens new analytical possibilities, e.g., for real-time monitoring of ethane traces in exhaled human breath.

  13. Predictive factors of clinical response in steroid-refractory ulcerative colitis treated with granulocyte-monocyte apheresis

    PubMed Central

    D'Ovidio, Valeria; Meo, Donatella; Viscido, Angelo; Bresci, Giampaolo; Vernia, Piero; Caprilli, Renzo

    2011-01-01

    AIM: To identify factors predicting the clinical response of ulcerative colitis patients to granulocyte-monocyte apheresis (GMA). METHODS: Sixty-nine ulcerative colitis patients (39 F, 30 M) dependent upon/refractory to steroids were treated with GMA. Steroid dependency, clinical activity index (CAI), C-reactive protein (CRP) and erythrocyte sedimentation rate (ESR) values at baseline, use of immunosuppressants, duration of disease, age, and extent of disease were considered for statistical analysis as predictive factors of clinical response. Univariate and multivariate logistic regression models were used. RESULTS: In the univariate analysis, CAI (P = 0.039) and ESR (P = 0.017) levels at baseline were singled out as predictive of clinical remission. In the multivariate analysis, steroid dependency [Odds ratio (OR) = 0.390, 95% Confidence interval (CI): 0.176-0.865, Wald 5.361, P = 0.0160] and low CAI levels at baseline (4 < CAI < 7) (OR = 0.770, 95% CI: 0.425-1.394, Wald 3.747, P = 0.028) proved to be factors predicting clinical response. CONCLUSION: GMA may be a valid therapeutic option for steroid-dependent ulcerative colitis patients with mild-moderate disease, and its clinical efficacy seems to persist for 12 mo. PMID:21528055

  14. In-Flight Performance Evaluation of Experimental Information Displays

    DTIC Science & Technology

    1979-05-01


  15. Reasoning and Proving Opportunities in Textbooks: A Comparative Analysis

    ERIC Educational Resources Information Center

    Hong, Dae S.; Choi, Kyong Mi

    2018-01-01

    In this study, we analyzed and compared reasoning and proving opportunities in geometry lessons from American standard-based textbooks and Korean textbooks to understand how these textbooks provide student opportunities to engage in reasoning and proving activities. Overall, around 40% of exercise problems in Core Plus Mathematics Project (CPMP)…

  16. Rumination in migraine: Mediating effects of brooding and reflection between migraine and psychological distress

    PubMed Central

    Kokonyei, Gyongyi; Szabo, Edina; Kocsel, Natalia; Edes, Andrea; Eszlari, Nora; Pap, Dorottya; Magyar, Mate; Kovacs, David; Zsombok, Terezia; Elliott, Rebecca; Anderson, Ian Muir; William Deakin, John Francis; Bagdy, Gyorgy; Juhasz, Gabriella

    2016-01-01

    Objective: The relationship between migraine and psychological distress has been consistently reported in cross-sectional and longitudinal studies. We hypothesised that a stable tendency to perseverative thoughts such as rumination would mediate the relationship between migraine and psychological distress. Design and Main Outcome Measures: Self-report questionnaires measuring depressive rumination, current psychological distress, and migraine symptoms in two independent European population cohorts, recruited from Budapest (N = 1139) and Manchester (N = 2004), were used. Structural regression analysis within structural equation modelling was applied to test the mediational role of brooding and reflection, the components of rumination, between migraine and psychological distress. Sex, age and lifetime depression were controlled for in the analysis. Results: Migraine predicted higher brooding and reflection scores, and brooding proved to be a mediator between migraine and psychological distress in both samples, while reflection mediated the relationship significantly only in the Budapest sample. Conclusions: Elevated psychological distress in migraine is partially attributed to ruminative response style. Further studies are needed to expand our findings to clinical samples and to examine how rumination links to the adjustment to migraine. PMID:27616579

  17. On the validity of within-nuclear-family genetic association analysis in samples of extended families.

    PubMed

    Bureau, Alexandre; Duchesne, Thierry

    2015-12-01

    Splitting extended families into their component nuclear families to apply a genetic association method designed for nuclear families is a widespread practice in familial genetic studies. Dependence among genotypes and phenotypes of nuclear families from the same extended family arises because of genetic linkage of the tested marker with a risk variant or because of familial specificity of genetic effects due to gene-environment interaction. This raises concerns about the validity of inference conducted under the assumption of independence of the nuclear families. We indeed prove theoretically that, in a conditional logistic regression analysis applicable to disease cases and their genotyped parents, the naive model-based estimator of the variance of the coefficient estimates underestimates the true variance. However, simulations with realistic effect sizes of risk variants and variation of this effect from family to family reveal that the underestimation is negligible. The simulations also show the greater efficiency of the model-based variance estimator compared to a robust empirical estimator. Our recommendation is therefore to use the model-based estimator of variance for inference on effects of genetic variants.

  18. Portable visible and near-infrared spectrophotometer for triglyceride measurements.

    PubMed

    Kobayashi, Takanori; Kato, Yukiko Hakariya; Tsukamoto, Megumi; Ikuta, Kazuyoshi; Sakudo, Akikazu

    2009-01-01

    An affordable and portable machine is required for the practical use of visible and near-infrared (Vis-NIR) spectroscopy. A portable fruit tester comprising a Vis-NIR spectrophotometer was modified for use in the transmittance mode and employed to quantify triglyceride levels in serum in combination with a chemometric analysis. Transmittance spectra collected in the 600- to 1100-nm region were subjected to a partial least-squares regression analysis and leave-one-out cross-validation to develop a chemometric model for predicting triglyceride concentrations in serum. The model yielded a coefficient of determination in cross-validation (R²VAL) of 0.7831 with a standard error of cross-validation (SECV) of 43.68 mg/dl. The detection limit of the model was 148.79 mg/dl. Furthermore, masked samples predicted by the model yielded a coefficient of determination in prediction (R²PRED) of 0.6856 with a standard error of prediction (SEP) and detection limit of 61.54 and 159.38 mg/dl, respectively. The portable Vis-NIR spectrophotometer may prove convenient for the measurement of triglyceride concentrations in serum, although obstacles remain before practical use, which are discussed.
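
    The modeling step described here, partial least-squares regression of transmittance spectra with leave-one-out cross-validation, can be sketched as follows on synthetic spectra; the dimensions and the simulated relationship between spectra and triglyceride values are assumptions, not the study's data.

```python
# PLS regression with leave-one-out cross-validation (synthetic spectra).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(5)
n_samples, n_wavelengths = 80, 250            # e.g. 600-1100 nm sampled at 2 nm
spectra = rng.standard_normal((n_samples, n_wavelengths))
triglyceride = 150 + 20 * spectra[:, :10].sum(axis=1) + rng.normal(0, 20, n_samples)

pls = PLSRegression(n_components=5)
pred = cross_val_predict(pls, spectra, triglyceride, cv=LeaveOneOut()).ravel()

r2_cv = r2_score(triglyceride, pred)
secv = np.sqrt(mean_squared_error(triglyceride, pred))
print(f"R2(val) = {r2_cv:.3f}, SECV = {secv:.1f} mg/dl")
```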

  19. The impact of social and family-related factors on women's stress experience in household and family work.

    PubMed

    Sperlich, Stefanie; Geyer, Siegfried

    2015-03-01

    This study explores the contribution of social and family-related factors to women's experience of an effort-reward imbalance (ERI) in household and family work. Using a population-based sample of German mothers (n = 3,129), we performed stepwise logistic regression analysis in order to determine the relative impact of social and family-related factors on ERI. All factors investigated showed a significant association with at least one ERI component. Considering all predictors simultaneously in the multivariate analysis resulted in a decrease in the significance of socioeconomic status in explaining the effort-reward ratio, while its impact on low reward partly remained significant. In addition, age of youngest child, number of children, lower levels of perceived social support, domestic work inequity, and negative work-to-family spillover, irrespective of being half- or full-time employed, proved to be important in predicting ERI. The experience of ERI in domestic work is influenced by the social and family environment. Particularly among socially disadvantaged mothers, lack of social recognition for household and family work proved to be a relevant source of psychosocial stress.

  20. Social support network, mental health and quality of life: a cross-sectional study in primary care.

    PubMed

    Portugal, Flávia Batista; Campos, Mônica Rodrigues; Correia, Celina Ragoni; Gonçalves, Daniel Almeida; Ballester, Dinarte; Tófoli, Luis Fernando; Mari, Jair de Jesus; Gask, Linda; Dowrick, Christopher; Bower, Peter; Fortes, Sandra

    2016-12-22

    The objective of this study was to identify the association of emotional distress and social support networks with quality of life in primary care patients. This was a cross-sectional study involving 1,466 patients in the cities of São Paulo and Rio de Janeiro, Brazil, in 2009/2010. The General Health Questionnaire, the Hospital Anxiety and Depression Scale and the brief version of the World Health Organization Quality of Life Instrument were used. The Social Support Network Index classified patients with the highest and lowest index as socially integrated or isolated. A bivariate analysis and four multiple linear regressions were conducted, one for each quality of life outcome. The mean scores for the physical, psychological, social relations, and environment domains were, respectively, 64.7, 64.2, 68.5, and 49.1. In the multivariate analysis, the psychological domain was negatively associated with isolation, whereas the social relations and environment domains were positively associated with integration. Integration and isolation proved to be important factors for those in emotional distress as they minimize or maximize negative effects on quality of life.

  1. Association of adiposity indices with bone density and bone turnover in the Chinese population.

    PubMed

    Wang, J; Yan, D; Hou, X; Chen, P; Sun, Q; Bao, Y; Hu, C; Zhang, Z; Jia, W

    2017-09-01

    Associations of adiposity indices with bone mineral density (BMD) and bone turnover markers were evaluated in Chinese participants. Body mass index, fat mass, and lean mass are positively related to BMD in both genders. Subcutaneous fat area proved to be negatively associated with BMD and positively correlated with osteocalcin in postmenopausal females. Obesity is highly associated with osteoporosis, but the effect of adipose tissue on bone is contradictory. Our study aimed to assess the associations of adiposity indices with bone mineral density (BMD) and bone turnover markers (BTMs) in the Chinese population. Our study recruited 5215 participants from the Shanghai area, evaluated related anthropometric and biochemical traits in all participants, tested serum BTMs, calculated fat distribution using magnetic resonance imaging (MRI) images and image analysis software, and tested BMD with dual-energy X-ray absorptiometry. When controlled for age, all adiposity indices were positively correlated with BMD at all sites for both genders. In the stepwise regression analysis, body mass index (BMI), fat mass, and lean mass were protective for BMD in both genders. However, subcutaneous fat area (SFA) was detrimental for BMD of the L1-4 and femoral neck (β ± SE -0.0742 ± 0.0174; p = 2.11E-05; β ± SE -0.0612 ± 0.0147; p = 3.07E-05). Adiposity indices showed a negative correlation with BTMs after adjusting for age, especially with osteocalcin. In the stepwise regression analysis, fat mass was negatively correlated with osteocalcin (β ± SE -8.8712 ± 1.4902; p = 4.17E-09) and lean mass showed a negative correlation with N-terminal procollagen of type I collagen (PINP) in males (β ± SE -0.3169 ± 0.0917; p = 0.0006). In females, BMI and visceral fat area (VFA) were both negatively associated with osteocalcin (β ± SE -0.4423 ± 0.0663; p = 2.85E-11; β ± SE -7.1982 ± 1.1094; p = 9.95E-11), while SFA showed a positive correlation with osteocalcin (β ± SE 5.5993 ± 1.1753; p = 1.98E-06). BMI, fat mass, and lean mass proved to be beneficial for BMD in both males and postmenopausal females. SFA is negatively associated with BMD and positively correlated with osteocalcin in postmenopausal females.

  2. Dental age estimation in the living after completion of third molar mineralization: new data for Gustafson's criteria.

    PubMed

    Timme, M; Timme, W H; Olze, A; Ottow, C; Ribbecke, S; Pfeiffer, H; Dettmeyer, R; Schmeling, A

    2017-03-01

    There is a need for dental age estimation methods applicable after completion of third molar mineralization. Degenerative dental characteristics appear to be suitable for forensic age diagnostics beyond the 18th year of life. In 2012, Olze et al. investigated the criteria studied by Gustafson using orthopantomograms. The objective of this study was to prove the applicability and reliability of this method with a large cohort and a wide age range, including older individuals. For this purpose, 2346 orthopantomograms of 1167 female and 1179 male Germans aged 15 to 70 years were reviewed. The characteristics of secondary dentin formation, cementum apposition, periodontal recession, and attrition were evaluated in all the mandibular premolars. The correlation of the individual characteristics with chronological age was examined by means of a stepwise multiple regression analysis, in which chronological age formed the dependent variable. In the resulting models, R² values amounted to 0.73 to 0.8, and the standard error of estimate was 6.8 to 8.2 years. Fundamentally, the recommendation for conducting age estimations in the living with these methods can be shared. The values for the quality of the regression are, however, not precise enough for a reliable age estimation around regular retirement ages. More precise regression formulae for the age group of 15 to 40 years are presented separately in this study. Further research should investigate the influence of ethnicity, dietary habits, and modern health care on the degenerative characteristics in question.

  3. Prediction of Mass Spectral Response Factors from Predicted Chemometric Data for Druglike Molecules

    NASA Astrophysics Data System (ADS)

    Cramer, Christopher J.; Johnson, Joshua L.; Kamel, Amin M.

    2017-02-01

    A method is developed for the prediction of mass spectral ion counts of drug-like molecules using in silico calculated chemometric data. Various chemometric data, including polar and molecular surface areas, aqueous solvation free energies, and gas-phase and aqueous proton affinities, were computed, and a statistically significant relationship between measured mass spectral ion counts and the combination of aqueous proton affinity and total molecular surface area was identified. In particular, through multilinear regression of ion counts on predicted chemometric data, we find that log10(MS ion counts) = -4.824 + c1·PA + c2·SA, where PA is the aqueous proton affinity of the molecule computed at the SMD(aq)/M06-L/MIDI!//M06-L/MIDI! level of electronic structure theory, SA is the total surface area of the molecule in its conjugate base form, and c1 and c2 have values of -3.912 × 10⁻² mol kcal⁻¹ and 3.682 × 10⁻³ Å⁻². On a 66-molecule training set, this regression exhibits a multiple R value of 0.791 with p values for the intercept, c1, and c2 of 1.4 × 10⁻³, 4.3 × 10⁻¹⁰, and 2.5 × 10⁻⁶, respectively. Application of this regression to an 11-molecule test set provides a good correlation of prediction with experiment (R = 0.905), albeit with a systematic underestimation of about 0.2 log units. This method may prove useful for semiquantitative analysis of drug metabolites for which MS response factors or authentic standards are not readily available.
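
    The fitted two-parameter model above can be applied directly once a molecule's aqueous proton affinity and conjugate-base surface area are known. The sketch below encodes the published coefficients; the example input values are purely hypothetical.

```python
# Apply the regression log10(MS ion counts) = -4.824 + c1*PA + c2*SA from the abstract.
def predict_log10_ion_counts(pa_kcal_per_mol: float, sa_angstrom2: float) -> float:
    c1 = -3.912e-2   # mol kcal^-1 (coefficient on aqueous proton affinity)
    c2 = 3.682e-3    # Angstrom^-2 (coefficient on conjugate-base surface area)
    return -4.824 + c1 * pa_kcal_per_mol + c2 * sa_angstrom2

# Hypothetical molecule: PA of -270 kcal/mol and surface area of 350 A^2,
# chosen only to illustrate the call, not taken from the training or test set.
print(predict_log10_ion_counts(-270.0, 350.0))
```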

  4. Quantitative skeletal maturation estimation using cone-beam computed tomography-generated cervical vertebral images: a pilot study in 5- to 18-year-old Japanese children.

    PubMed

    Byun, Bo-Ram; Kim, Yong-Il; Yamaguchi, Tetsutaro; Maki, Koutaro; Ko, Ching-Chang; Hwang, Dea-Seok; Park, Soo-Byung; Son, Woo-Sung

    2015-11-01

    The purpose of this study was to establish multivariable regression models for the estimation of skeletal maturation status in Japanese boys and girls using the cone-beam computed tomography (CBCT)-based cervical vertebral maturation (CVM) assessment method and hand-wrist radiography. The analyzed sample consisted of hand-wrist radiographs and CBCT images from 47 boys and 57 girls. To quantitatively evaluate the correlation between the skeletal maturation status and measurement ratios, a CBCT-based CVM assessment method was applied to the second, third, and fourth cervical vertebrae. Pearson's correlation coefficient analysis and multivariable regression analysis were used to determine the ratios for each of the cervical vertebrae (p < 0.05). Four characteristic parameters ((OH2 + PH2)/W2, (OH2 + AH2)/W2, D2, AH3/W3), as independent variables, were used to build the multivariable regression models: for the Japanese boys, the skeletal maturation status according to the CBCT-based quantitative cervical vertebral maturation (QCVM) assessment was 5.90 + 99.11 × AH3/W3 - 14.88 × (OH2 + AH2)/W2 + 13.24 × D2; for the Japanese girls, it was 41.39 + 59.52 × AH3/W3 - 15.88 × (OH2 + PH2)/W2 + 10.93 × D2. The CBCT-generated CVM images proved very useful for defining the cervical vertebral body and the odontoid process. The newly developed CBCT-based QCVM assessment method showed a high correlation with the ratios derived from the second cervical vertebral body and odontoid process. There are high correlations between the skeletal maturation status and the ratios of the second cervical vertebra based on the remnant of dentocentral synchondrosis.

  5. Neural network uncertainty assessment using Bayesian statistics: a remote sensing application

    NASA Technical Reports Server (NTRS)

    Aires, F.; Prigent, C.; Rossow, W. B.

    2004-01-01

    Neural network (NN) techniques have proved successful for many regression problems, in particular for remote sensing; however, uncertainty estimates are rarely provided. In this article, a Bayesian technique to evaluate uncertainties of the NN parameters (i.e., synaptic weights) is first presented. In contrast to more traditional approaches based on point estimation of the NN weights, we assess uncertainties on such estimates to monitor the robustness of the NN model. These theoretical developments are illustrated by applying them to the problem of retrieving surface skin temperature, microwave surface emissivities, and integrated water vapor content from a combined analysis of satellite microwave and infrared observations over land. The weight uncertainty estimates are then used to compute analytically the uncertainties in the network outputs (i.e., error bars and correlation structure of these errors). Such quantities are very important for evaluating any application of an NN model. The uncertainties on the NN Jacobians are then considered in the third part of this article. Used for regression fitting, NN models can be used effectively to represent highly nonlinear, multivariate functions. In this situation, most emphasis is put on estimating the output errors, but almost no attention has been given to errors associated with the internal structure of the regression model. The complex structure of dependency inside the NN is the essence of the model, and assessing its quality, coherency, and physical character makes all the difference between a blackbox model with small output errors and a reliable, robust, and physically coherent model. Such dependency structures are described to the first order by the NN Jacobians: they indicate the sensitivity of one output with respect to the inputs of the model for given input data. We use a Monte Carlo integration procedure to estimate the robustness of the NN Jacobians. A regularization strategy based on principal component analysis is proposed to suppress the multicollinearities in order to make these Jacobians robust and physically meaningful.
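
    The last part of this abstract concerns estimating the robustness of network Jacobians by Monte Carlo integration over the weight uncertainty. A minimal sketch of that general idea is given below for a one-hidden-layer tanh network; the network size, input, and perturbation scale are assumptions, and this is not the authors' Bayesian machinery.

```python
# Monte Carlo estimate of the spread of a small network's Jacobians under
# weight perturbations (illustrative, synthetic network and input).
import numpy as np

rng = np.random.default_rng(6)
n_in, n_hidden, n_out = 3, 8, 2
W1 = rng.standard_normal((n_hidden, n_in))
W2 = rng.standard_normal((n_out, n_hidden))

def jacobian(x, W1, W2):
    """Jacobian dy/dx of a one-hidden-layer tanh network y = W2 tanh(W1 x)."""
    h = np.tanh(W1 @ x)
    return W2 @ (np.diag(1 - h ** 2) @ W1)      # shape (n_out, n_in)

x = rng.standard_normal(n_in)
samples = np.stack([
    jacobian(x,
             W1 + 0.05 * rng.standard_normal(W1.shape),
             W2 + 0.05 * rng.standard_normal(W2.shape))
    for _ in range(1000)
])
print("mean Jacobian:\n", samples.mean(axis=0))
print("std of Jacobian entries:\n", samples.std(axis=0))
```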

  6. Marginal analysis in assessing factors contributing time to physician in the Emergency Department using operations data.

    PubMed

    Pathan, Sameer A; Bhutta, Zain A; Moinudheen, Jibin; Jenkins, Dominic; Silva, Ashwin D; Sharma, Yogdutt; Saleh, Warda A; Khudabakhsh, Zeenat; Irfan, Furqan B; Thomas, Stephen H

    2016-01-01

    Background: Standard Emergency Department (ED) operations goals include minimization of the time interval (tMD) between patients' initial ED presentation and initial physician evaluation. This study assessed factors known (or suspected) to influence tMD with a two-step goal. The first step was generation of a multivariate model identifying parameters associated with prolongation of tMD at a single study center. The second step was the use of a study center-specific multivariate tMD model as a basis for predictive marginal probability analysis; the marginal model allowed for prediction of the degree of ED operations benefit that would be effected by specific ED operations improvements. Methods: The study was conducted using one month (May 2015) of data obtained from an ED administrative database (EDAD) in an urban academic tertiary ED with an annual census of approximately 500,000; during the study month, the ED saw 39,593 cases. The EDAD data were used to generate a multivariate linear regression model assessing the various demographic and operational covariates' effects on the dependent variable tMD. Predictive marginal probability analysis was used to calculate the relative contributions of key covariates as well as demonstrate the likely tMD impact of modifying those covariates with operational improvements. Analyses were conducted with Stata 14MP, with significance defined at p < 0.05 and confidence intervals (CIs) reported at the 95% level. Results: In an acceptable linear regression model that accounted for just over half of the overall variance in tMD (adjusted r² = 0.51), important contributors to tMD included shift census (p = 0.008), shift time of day (p = 0.002), and number of physicians on duty (p = 0.004). These strong associations remained even after adjusting for each other and other covariates. Marginal predictive probability analysis was used to predict the overall tMD impact (improvement from 50 to 43 minutes, p < 0.001) of consistent staffing with 22 physicians. Conclusions: The analysis identified expected variables contributing to tMD, with regression demonstrating the significance and effect magnitude of alterations in covariates including patient census, shift time of day, and number of physicians. Marginal analysis provided an operationally useful demonstration of the need to adjust physician coverage numbers, prompting changes at the study ED. The methods used in this analysis may prove useful in other EDs wishing to analyze operations information with the goal of predicting which interventions may have the most benefit.

  7. Quantitative analysis of the major constituents of St John's wort with HPLC-ESI-MS.

    PubMed

    Chandrasekera, Dhammitha H; Welham, Kevin J; Ashton, David; Middleton, Richard; Heinrich, Michael

    2005-12-01

    A method was developed to profile the major constituents of St John's wort extracts using high-performance liquid chromatography-electrospray mass spectrometry (HPLC-ESI-MS). The objective was to simultaneously separate, identify and quantify hyperforin, hypericin, pseudohypericin, rutin, hyperoside, isoquercetrin, quercitrin and chlorogenic acid using HPLC-MS. Quantification was performed using an external standardisation method with reference standards. The method consisted of two protocols: one for the analysis of flavonoids and glycosides and the other for the analysis of the more lipophilic hypericins and hyperforin. Both protocols used a reverse phase Luna phenyl hexyl column. The separation of the flavonoids and glycosides was achieved within 35 min and that of the hypericins and hyperforin within 9 min. The linear response range in ESI-MS was established for each compound and all had linear regression coefficient values greater than 0.97. Both protocols proved to be very specific for the constituents analysed. MS analysis showed no other signals within the analyte peaks. The method was robust and applicable to alcoholic tinctures, tablet/capsule extracts in various solvents and herb extracts. The method was applied to evaluate the phytopharmaceutical quality of St John's wort preparations available in the UK in order to test the method and investigate if they contain at least the main constituents and at what concentrations.

  8. Principal component regression analysis with SPSS.

    PubMed

    Liu, R X; Kuang, J; Gong, Q; Hou, X L

    2003-06-01

    The paper introduces the indices used in multicollinearity diagnostics, the basic principle of principal component regression, and a method for determining the 'best' equation. An example is used to describe how to carry out principal component regression analysis with SPSS 10.0, including all calculation steps of the principal component regression and all operations of the linear regression, factor analysis, descriptives, compute variable, and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance caused by multicollinearity. A simplified, faster, and accurate statistical analysis is achieved through principal component regression with SPSS.
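
    The same technique is easy to sketch outside SPSS. The example below, a minimal illustration rather than the paper's procedure, projects nearly collinear predictors onto their leading principal components and then runs an ordinary linear regression on those components; all data are synthetic.

```python
# Principal component regression: PCA on the predictors, then linear regression.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 200
x1 = rng.standard_normal(n)
x2 = x1 + 0.05 * rng.standard_normal(n)          # nearly collinear with x1
x3 = rng.standard_normal(n)
X = np.column_stack([x1, x2, x3])
y = 2 * x1 + 0.5 * x3 + rng.normal(0, 0.5, n)

pcr = make_pipeline(StandardScaler(), PCA(n_components=2), LinearRegression())
pcr.fit(X, y)
print("R^2 of the principal component regression:", round(pcr.score(X, y), 3))
```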

  9. A Computer Program to Implement the Chen Method of Dimensional Analysis

    DTIC Science & Technology

    1990-01-01


  10. Detection of relationships between SUDOSCAN with estimated glomerular filtration rate (eGFR) in Chinese patients with type 2 diabetes.

    PubMed

    Mao, Fei; Zhu, Xiaoming; Lu, Bin; Li, Yiming

    2018-04-01

    SUDOSCAN (Impeto Medical, Paris, France) has proven to be a new and non-invasive method for detecting renal dysfunction in type 2 diabetes mellitus (T2DM) patients. In this study, we sought to compare the diabetic kidney dysfunction score (DKD-score) of SUDOSCAN with the estimated glomerular filtration rate (eGFR) using quantile regression analysis, an approach that differs from previous studies. A total of 223 Chinese T2DM patients were enrolled in the study. SUDOSCAN, renal function tests (including blood urea nitrogen, creatinine and uric acid) and 99mTc-diethylenetriamine pentaacetic acid (99mTc-DTPA) renal dynamic imaging were performed in all T2DM patients. The DKD-score of SUDOSCAN was compared with eGFR detected by 99mTc-DTPA renal dynamic imaging through quantile regression analysis. Its validity and utility were further determined through bias and precision tests. The quantile regression analysis demonstrated that the relationship with eGFR was inverse and significant for almost all percentiles of DKD-score. The coefficients decreased as the percentile of DKD-score increased. In the validation data set, both the bias and the precision increased with eGFR (median difference, -21.2 ml/min/1.73 m² for all individuals vs. -4.6 ml/min/1.73 m² for eGFR between 0 and 59 ml/min/1.73 m²; interquartile range [IQR] for the difference, -25.4 ml/min/1.73 m² vs. -14.7 ml/min/1.73 m²). The eGFR category misclassification rates were 10% in the eGFR 0-59 ml/min/1.73 m² group, 57.3% in the 60-90 group, and 87.2% in the eGFR > 90 ml/min/1.73 m² group. The DKD-score of SUDOSCAN could be used to detect renal dysfunction in T2DM patients. A higher prognostic value of DKD-score was observed when the eGFR level was lower. Copyright © 2018 Elsevier B.V. All rights reserved.
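
    Quantile regression fits a separate slope at each percentile of the response, which is how the abstract can report that the DKD-score/eGFR relationship holds across almost all percentiles. A minimal sketch with statsmodels on synthetic data follows; the simulated DKD-score/eGFR relationship is an assumption, not the study's data.

```python
# Quantile regression of eGFR on DKD-score across several percentiles (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n = 223
df = pd.DataFrame({"dkd_score": rng.uniform(0, 100, n)})
df["egfr"] = 110 - 0.6 * df["dkd_score"] + rng.normal(0, 15, n)

model = smf.quantreg("egfr ~ dkd_score", df)
for q in (0.1, 0.25, 0.5, 0.75, 0.9):
    fit = model.fit(q=q)
    print(f"quantile {q}: slope = {fit.params['dkd_score']:.3f}, "
          f"p = {fit.pvalues['dkd_score']:.2g}")
```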

  11. Evaluation of injury criteria for the prediction of commotio cordis from lacrosse ball impacts.

    PubMed

    Dau, Nathan; Cavanaugh, John; Bir, Cynthia; Link, Mark

    2011-11-01

    Commotio Cordis (CC) is the second leading cause of mortality in youth sports. Impacts occurring directly over the left ventricle (LV) during a vulnerable period of the cardiac cycle can cause ventricular fibrillation (VF), which results in CC. In order to better understand the pathophysiology of CC and develop a mechanical model for CC, appropriate injury criteria need to be developed. This effort consisted of impacts to seventeen juvenile porcine specimens (mass 21-45 kg). Impacts were delivered over the cardiac silhouette during the vulnerable period of the cardiac cycle. Four impact speeds were used: 13.4, 17.9, 22.4, and 26.8 m/s. The impactor was a lacrosse ball on an aluminum shaft instrumented with an accelerometer (mass 188 g-215 g). The impacts were recorded using high-speed video. LV pressure was measured with a catheter. Univariate binary logistic regression analyses were performed to evaluate the predictive ability of ten injury criteria. A total of 187 impacts were used in the analysis. The criteria were evaluated on their predictive ability based on Somers' D (D) and the Goodman-Kruskal gamma (γ). Injury risk functions were created for all criteria using a 2-parameter Weibull distribution and survival analysis. The best criteria for predicting CC were impact force (D=0.52, γ=0.52), force*compression (D=0.49, γ=0.49), and impact power (D=0.49, γ=0.49). All of these criteria proved significant in predicting the probability of CC from projectile impacts in youth sports (p<0.01). Force proved to be the most predictive of the ten criteria evaluated.
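
    The predictive-ability statistic used here, Somers' D, can be obtained for a univariate binary logistic regression from the model's ROC area (D = 2·AUC - 1). The sketch below illustrates this on synthetic impact-force data; it is not the porcine data set, and the force-fibrillation relationship is simulated.

```python
# Univariate logistic regression with Somers' D computed from the AUC (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(9)
n = 187
force = rng.uniform(200, 2000, n)                     # impact force (N), synthetic
p_vf = 1 / (1 + np.exp(-(force - 1100) / 200))        # simulated risk curve
vf = (rng.uniform(size=n) < p_vf).astype(int)         # 1 = ventricular fibrillation

model = LogisticRegression().fit(force.reshape(-1, 1), vf)
auc = roc_auc_score(vf, model.predict_proba(force.reshape(-1, 1))[:, 1])
print(f"Somers' D = {2 * auc - 1:.2f}")
```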

  12. Decreased levels of serum complement C3 and natural killer cells add to the predictive value of total immunoglobulin G for severe infection in heart transplant recipients.

    PubMed

    Sarmiento, E; del Pozo, N; Gallego, A; Fernández-Yañez, J; Palomo, J; Villa, A; Ruiz, M; Muñoz, P; Rodríguez, C; Rodríguez-Molina, J; Navarro, J; Kotsch, K; Fernandez-Cruz, E; Carbone, J

    2012-10-01

    Infection remains a source of mortality in heart recipients. We previously reported that post-transplant immunoglobulin G (IgG) quantification can help identify the risk for infection. We assessed whether other standardized parameters of humoral and cellular immunity could prove useful when identifying patients at risk of infection. We prospectively studied 133 heart recipients over a 12-month period. Forty-eight patients had at least one episode of severe infection. An event was defined as an infection requiring intravenous antimicrobial therapy. Cox regression analysis revealed an association between the risk of developing infection and the following: lower IgG2 subclass levels (day 7: relative hazard [RH] 1.71; day 30: RH 1.76), lower IgA levels (day 7: RH 1.61; day 30: RH 1.91), lower complement C3 values (day 7: RH 1.25), lower CD3 absolute counts (day 30: RH 1.10), lower absolute natural killer [NK] cell count (day 7: RH 1.24), and lower IgG concentrations (day 7: RH 1.31; day 30: RH 1.36). Cox regression bivariate analysis revealed that lower day 7 C3 levels, IgG2 concentration, and absolute NK cell count remained significant after adjustment for total IgG levels. Data suggest that early immune monitoring including C3, IgG2, and NK cell testing in addition to IgG concentrations is useful when attempting to identify the risk of infection in heart transplant recipients. © 2012 John Wiley & Sons A/S.

  13. Multilevel Effects of Wealth on Women's Contraceptive Use in Mozambique

    PubMed Central

    Dias, José G.; de Oliveira, Isabel Tiago

    2015-01-01

    Objective This paper analyzes the impact of wealth on the use of contraception in Mozambique, disentangling the contextual effects due to community wealth from the individual effects associated with the women's situation within their community of residence. Methods Data from the 2011 Mozambican Demographic and Health Survey on women who are married or living together are analyzed for the entire country and also for the rural and urban areas separately. We used single-level and multilevel probit regression models. Findings A single-level probit regression reveals that region, religion, age, previous fertility, education, and wealth impact contraceptive behavior. The multilevel analysis shows that average community wealth and the women's relative socioeconomic position within the community have significant positive effects on the use of modern contraceptives. The multilevel framework proved to be necessary in rural settings but not relevant in urban areas. Moreover, the contextual effects due to community wealth are greater in rural than in urban areas, and this feature is associated with the higher socioeconomic heterogeneity within the richest communities. Conclusion This analysis highlights the need for studies on contraceptive behavior to specifically address the individual and contextual effects arising from the poverty-wealth dimension in rural and urban areas separately. Inclusion in a particular community of residence is not relevant in urban areas, but it is an important feature in rural areas. Although the women's individual position within the community of residence has a similar effect on contraceptive adoption in rural and urban settings, the impact of community wealth is greater in rural areas and smaller in urban areas. PMID:25786228

  14. Positive effect of human milk feeding during NICU hospitalization on 24 month neurodevelopment of very low birth weight infants: an Italian cohort study.

    PubMed

    Gibertoni, Dino; Corvaglia, Luigi; Vandini, Silvia; Rucci, Paola; Savini, Silvia; Alessandroni, Rosina; Sansavini, Alessandra; Fantini, Maria Pia; Faldella, Giacomo

    2015-01-01

    The aim of this study was to determine the effect of human milk feeding during NICU hospitalization on neurodevelopment at 24 months of corrected age in very low birth weight infants. A cohort of 316 very low birth weight newborns (weight ≤ 1500 g) was prospectively enrolled in a follow-up program on admission to the Neonatal Intensive Care Unit of S. Orsola Hospital, Bologna, Italy, from January 2005 to June 2011. Neurodevelopment was evaluated at 24 months corrected age using the Griffiths Mental Development Scale. The effect of human milk nutrition on neurodevelopment was first investigated using a multiple linear regression model, to adjust for the effects of gestational age, small for gestational age, complications at birth and during hospitalization, growth restriction at discharge and socio-economic status. Path analysis was then used to refine the multiple regression model, taking into account the relationships among predictors and their temporal sequence. Human milk feeding during NICU hospitalization and higher socio-economic status were associated with better neurodevelopment at 24 months in both models. In the path analysis model intraventricular hemorrhage-periventricular leukomalacia and growth restriction at discharge proved to be directly and independently associated with poorer neurodevelopment. Gestational age and growth restriction at birth had indirect significant effects on neurodevelopment, which were mediated by complications that occurred at birth and during hospitalization, growth restriction at discharge and type of feeding. In conclusion, our findings suggest that mother's human milk feeding during hospitalization can be encouraged because it may improve neurodevelopment at 24 months corrected age.

  15. Career-Success Scale – A new instrument to assess young physicians' academic career steps

    PubMed Central

    Buddeberg-Fischer, Barbara; Stamm, Martina; Buddeberg, Claus; Klaghofer, Richard

    2008-01-01

    Background Within the framework of a prospective cohort study of Swiss medical school graduates, a Career-Success Scale (CSS) was constructed in a sample of young physicians choosing different career paths in medicine. Furthermore the influence of personality factors, the participants' personal situation, and career related factors on their career success was investigated. Methods 406 residents were assessed in terms of career aspired to, and their career progress. The Career-Success Scale, consisting of 7 items, was developed and validated, addressing objective criteria of academic career advancement. The influence of gender and career aspiration was investigated by a two-factorial analysis of variance, the relationships between personality factors, personal situation, career related factors and the Career-Success Scale by a multivariate linear regression analysis. Results The unidimensional Career-Success Scale has an internal consistency of 0.76. It is significantly correlated at the bivariate level with gender, instrumentality, and all career related factors, particularly with academic career and received mentoring. In multiple regression, only gender, academic career, surgery as chosen specialty, and received mentoring are significant predictors. The highest values were observed in participants aspiring to an academic career, followed by those pursuing a hospital career and those wanting to run a private practice. Independent of the career aspired to, female residents have lower scores than their male colleagues. Conclusion The Career-Success Scale proved to be a short, reliable and valid instrument to measure career achievements. As mentoring is an independent predictor of career success, mentoring programs could be an important instrument to specifically enhance careers of female physicians in academia. PMID:18518972

  16. Career-success scale - a new instrument to assess young physicians' academic career steps.

    PubMed

    Buddeberg-Fischer, Barbara; Stamm, Martina; Buddeberg, Claus; Klaghofer, Richard

    2008-06-02

    Within the framework of a prospective cohort study of Swiss medical school graduates, a Career-Success Scale (CSS) was constructed in a sample of young physicians choosing different career paths in medicine. Furthermore the influence of personality factors, the participants' personal situation, and career related factors on their career success was investigated. 406 residents were assessed in terms of career aspired to, and their career progress. The Career-Success Scale, consisting of 7 items, was developed and validated, addressing objective criteria of academic career advancement. The influence of gender and career aspiration was investigated by a two-factorial analysis of variance, the relationships between personality factors, personal situation, career related factors and the Career-Success Scale by a multivariate linear regression analysis. The unidimensional Career-Success Scale has an internal consistency of 0.76. It is significantly correlated at the bivariate level with gender, instrumentality, and all career related factors, particularly with academic career and received mentoring. In multiple regression, only gender, academic career, surgery as chosen specialty, and received mentoring are significant predictors. The highest values were observed in participants aspiring to an academic career, followed by those pursuing a hospital career and those wanting to run a private practice. Independent of the career aspired to, female residents have lower scores than their male colleagues. The Career-Success Scale proved to be a short, reliable and valid instrument to measure career achievements. As mentoring is an independent predictor of career success, mentoring programs could be an important instrument to specifically enhance careers of female physicians in academia.

  17. Cancer patients use hospital-based care until death: a further analysis of the Dutch Bone Metastasis Study.

    PubMed

    Meeuse, Jan J; van der Linden, Yvette M; Post, Wendy J; Wanders, Rinus; Gans, Rijk O B; Leer, Jan Willem H; Reyners, Anna K L

    2011-10-01

    To describe health care utilization (HCU) at the end of life in cancer patients. These data are relevant to plan palliative care services, and to develop training programs for involved health care professionals. The Dutch Bone Metastasis Study (DBMS) was a nationwide study proving equal effectiveness of single fraction palliative radiotherapy compared with multiple fractions for painful bone metastases in 1157 patients. The 860 (74%) patients who died during follow-up were included in the current analysis. The main outcome was the frequency of hospital-based (outpatient contact or admission) and/or general practitioner (GP) contact during the last 12 weeks of life. Changes in HCU towards death were related to data on quality of life and pain intensity using a multilevel regression model. Hospital-based HCU was reported in 1801 (63%) returned questionnaires, whereas GP contact was stated in 1246 (43%). In 573 (20%) questionnaires, both types of HCU were reported. In multilevel regression analyses, the frequency of outpatient contacts remained constant during the weeks towards death, whereas the frequency of GP contacts increased. Lower valuation of quality of life was related to both GP- and hospital-based HCU. There was a high consumption of hospital-based HCU in the last 12 weeks of life of cancer patients with bone metastases. Hospital-based HCU did not decrease during the weeks towards death, despite an increase in GP contacts. Future planning of palliative care and training programs should encompass close collaboration between medical specialists and GPs to optimize end-of-life care.

  18. A new look at patient satisfaction: learning from self-organizing maps.

    PubMed

    Voutilainen, Ari; Kvist, Tarja; Sherwood, Paula R; Vehviläinen-Julkunen, Katri

    2014-01-01

    To some extent, results always depend on the methods used, and the complete picture of the phenomenon of interest can be drawn only by combining results of different data processing techniques. This emphasizes the use of a wide arsenal of methods for processing and analyzing patient satisfaction surveys. The purpose of this study was to introduce the self-organizing map (SOM) to nursing science and to illustrate the use of the SOM with patient satisfaction data. The SOM is a widely used artificial neural network suitable for clustering and exploring all kinds of data sets. The study was partly a secondary analysis of data collected for the Attractive and Safe Hospital Study from four Finnish hospitals in 2008 and 2010 using the Revised Humane Caring Scale. The sample consisted of 5,283 adult patients. The SOM was used to cluster the data set according to (a) respondents and (b) questionnaire items. The SOM was also used as a preprocessor for multinomial logistic regression. An analysis of missing data was carried out to improve the data interpretation. Combining the results of the two SOMs and the logistic regression revealed associations between the level of satisfaction, different components of satisfaction, and item nonresponse. The common conception that the relationship between patient satisfaction and age is positive may partly be due to a positive association between the tendency of item nonresponse and age. The SOM proved to be a useful method for clustering a questionnaire data set even when the data set was low dimensional per se. Inclusion of empty responses in analyses may help to detect possible misleading noncausative relationships.
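
    Clustering respondents with a SOM, as described above, can be sketched with the third-party minisom package; the grid size, training length, and Likert-style synthetic data below are assumptions for illustration only.

```python
# Self-organizing map clustering of questionnaire-style data (synthetic; uses minisom).
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(10)
responses = rng.integers(1, 6, size=(500, 12)).astype(float)   # 12 Likert items, 500 respondents

som = MiniSom(x=6, y=6, input_len=12, sigma=1.0, learning_rate=0.5, random_seed=10)
som.random_weights_init(responses)
som.train_random(responses, num_iteration=5000)

# Each respondent is assigned to the grid node whose weight vector is closest
clusters = [som.winner(r) for r in responses]
print(clusters[:5])
```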

  19. Modeling Verdict Outcomes Using Social Network Measures: The Watergate and Caviar Network Cases

    PubMed Central

    2016-01-01

    Modelling criminal trial verdict outcomes using social network measures is an emerging research area in quantitative criminology. Few studies have yet analyzed which of these measures are the most important for verdict modelling or which data classification techniques perform best for this application. To compare the performance of different techniques in classifying members of a criminal network, this article applies three different machine learning classifiers (Logistic Regression, Naïve Bayes and Random Forest) with a range of social network measures and the necessary databases to model the verdicts in two real-world cases: the U.S. Watergate Conspiracy of the 1970s and the now-defunct Canada-based international drug trafficking ring known as the Caviar Network. In both cases it was found that the Random Forest classifier did better than either Logistic Regression or Naïve Bayes, and its superior performance was statistically significant. This being so, Random Forest was used not only for classification but also to assess the importance of the measures. For the Watergate case, the most important one proved to be betweenness centrality while for the Caviar Network, it was the effective size of the network. These results are significant because they show that an approach combining machine learning with social network analysis not only can generate accurate classification models but also helps quantify the importance of social network variables in modelling verdict outcomes. We conclude our analysis with a discussion and some suggestions for future work in verdict modelling using social network measures. PMID:26824351
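
    As an illustration of the classifier comparison described above, the sketch below compares Logistic Regression, Naive Bayes and Random Forest on a table of social network measures and then reads variable importances from the Random Forest. The file name and feature columns are hypothetical, not the study's data.

    ```python
    # Hedged sketch: compare three classifiers on social-network features and
    # inspect Random Forest variable importances. Illustrative data only.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import GaussianNB
    from sklearn.model_selection import cross_val_score

    df = pd.read_csv("network_measures.csv")   # hypothetical file
    X = df[["betweenness", "degree", "effective_size", "constraint"]]
    y = df["verdict"]                          # e.g., convicted vs. not

    classifiers = {
        "logistic_regression": LogisticRegression(max_iter=1000),
        "naive_bayes": GaussianNB(),
        "random_forest": RandomForestClassifier(n_estimators=500, random_state=0),
    }
    for name, clf in classifiers.items():
        scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
        print(f"{name}: mean accuracy = {scores.mean():.3f}")

    # Variable importance from the Random Forest
    rf = classifiers["random_forest"].fit(X, y)
    for feature, importance in zip(X.columns, rf.feature_importances_):
        print(feature, round(importance, 3))
    ```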

  20. JPSS Preparations at the Satellite Proving Ground for Marine, Precipitation, and Satellite Analysis

    NASA Technical Reports Server (NTRS)

    Folmer, Michael J.; Berndt, E.; Clark, J.; Orrison, A.; Kibler, J.; Sienkiewicz, J.; Nelson, J.; Goldberg, M.; Sjoberg, W.

    2016-01-01

    The Ocean Prediction Center, the National Hurricane Center's Tropical Analysis and Forecast Branch, the Weather Prediction Center, and the Satellite Analysis Branch of NESDIS make up the Satellite Proving Ground for Marine, Precipitation, and Satellite Analysis. These centers had early exposure to JPSS products using the S-NPP satellite that was launched in 2011. Forecasters continue to evaluate new products in anticipation of the launch of JPSS-1 sometime in 2017.

  1. Regression Analysis by Example. 5th Edition

    ERIC Educational Resources Information Center

    Chatterjee, Samprit; Hadi, Ali S.

    2012-01-01

    Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. "Regression Analysis by Example, Fifth Edition" has been expanded and thoroughly…

  2. Generating and Using Examples in the Proving Process

    ERIC Educational Resources Information Center

    Sandefur, J.; Mason, J.; Stylianides, G. J.; Watson, A.

    2013-01-01

    We report on our analysis of data from a dataset of 26 videotapes of university students working in groups of 2 and 3 on different proving problems. Our aim is to understand the role of example generation in the proving process, focusing on deliberate changes in representation and symbol manipulation. We suggest and illustrate four aspects of…

  3. Commognitive Analysis of Undergraduate Mathematics Students' Responses in Proving Subgroup's Non-Emptiness

    ERIC Educational Resources Information Center

    Ioannou, Marios

    2016-01-01

    Proving that a given set is indeed a subgroup, one needs to show that it is non-empty, and closed under operation and inverses. This study focuses on the first condition, analysing students' responses to this task. Results suggest that there are three distinct problematic responses: the total absence of proving this condition, the problematic…

  4. Cancer regression induced by modified CTL therapy is regulated by HLA class II and class I antigens in Japanese patients with advanced cancer.

    PubMed

    Araki, K; Noguchi, Y; Hirouchi, T; Yoshikawa, E; Kataoka, S; Silverni, L; Miyazawa, H; Kuzuhara, H; Suzuki, C; Shimada, Y; Hamasato, S; Maeda, N; Shimamura, Y; Ogawa, Y; Ohtsuki, Y; Fujimoto, S

    2000-12-01

    Autologous cancer-specific bulk CTLs are unlikely to be induced by in vitro CTL generation (ivtCTLG) using peripheral blood mononuclear cells (PBMCs) of cancer patients when autologous cancer cells are used as in vitro stimulators. However, autologous cancer-specific bulk CTLs are frequently activated when allogeneic cancer cells are used as in vitro stimulators, regardless of the type of cancer cell. We have developed a cancer-specific immunotherapy called modified CTL therapy, which involves adoptive immunotherapy of autologous cancer-specific bulk CTLs after active immunization with autologous or allogeneic cancer cells screened as in vitro stimulators according to their ability to induce autologous cancer-specific CTLs (ACS.CTLs). Cancer did not regress in patients in whom ACS.CTLs were not induced by ivtCTLG using the patients' PBMCs during therapy. Cancer regression, albeit temporary, occurred solely in patients in whom ACS.CTLs were induced by ivtCTLG using PBMCs during the therapy. The induction of ACS.CTLs by ivtCTLG using patient PBMCs in therapy was related to patients' HLA class II antigens. HLA DR8 was seen more frequently in ACS.CTL-inducible patients than in ACS.CTL-uninducible patients (P=0.051). In contrast, HLA DQ3 was seen more frequently in ACS.CTL-uninducible patients (P=0.055). On the other hand, the success of therapy, albeit temporary, was related mainly to patients' HLA class I antigens. HLA B61 was seen more frequently in patients whose therapy proved effective than in patients whose therapy proved ineffective (P=0.018). HLA Cw7 was seen more frequently in therapy-ineffective patients (P=0.040).

  5. Papillary type 2 versus clear cell renal cell carcinoma: Survival outcomes.

    PubMed

    Simone, G; Tuderti, G; Ferriero, M; Papalia, R; Misuraca, L; Minisola, F; Costantini, M; Mastroianni, R; Sentinelli, S; Guaglianone, S; Gallucci, M

    2016-11-01

    To compare the cancer specific survival (CSS) between p2-RCC and a Propensity Score Matched (PSM) cohort of cc-RCC patients. Fifty-five (4.6%) patients with p2-RCC and 920 cc-RCC patients were identified within a prospectively maintained institutional dataset of 1205 histologically proved RCC patients treated with either RN or PN. Univariable and multivariable Cox regression analyses were used to identify predictors of CSS after surgical treatment. A 1:2 PSM analysis based on independent predictors of oncologic outcomes was employed and CSS was compared between PSM selected cc-RCC patients using Kaplan-Meier and Cox regression analysis. Overall, 55 (4.6%) p2-RCC and 920 (76.3%) cc-RCC patients were selected from the database; p2-RCC were significantly larger (p = 0.001), more frequently locally advanced (p < 0.001) and node positive (p < 0.001) and had significantly higher Fuhrman grade (p < 0.001) than cc-RCC. On multivariable Cox regression analysis age (p = 0.025), histologic subtype (p = 0.029), pN stage (p = 0.006), size, pT stage, cM stage, sarcomatoid features and Fuhrman grade (all p < 0.001) were independent predictors of CSS. After applying the PSM, 82 cc-RCC selected cases were comparable to 41 p2-RCC for age (p = 0.81), tumor size (p = 0.39), pT (p = 1.00) and pN (p = 0.62) stages, cM stage (p = 0.71) and Fuhrman grade (p = 1). In this PSM cohort, 5 yr CSS was significantly lower in the p2-RCC (63% vs 72.4%; p = 0.047). At multivariable Cox analysis p2 histology was an independent predictor of CSM (HR 2.46, 95% CI 1.04-5.83; p = 0.041). We confirmed the tendency of p2-RCC to present as locally advanced and metastatic disease more frequently than cc-RCC and demonstrated p2-RCC histology as an independent predictor of worse oncologic outcomes. Copyright © 2016 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
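
    The multivariable Cox regression step can be illustrated with the lifelines package. The sketch below assumes a hypothetical per-patient table with follow-up time, a cancer-death indicator and the covariates named in the abstract; the propensity score matching stage is omitted.

    ```python
    # Sketch of a multivariable Cox regression for cancer-specific survival
    # using lifelines. Column names are hypothetical stand-ins.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("rcc_cohort.csv")   # one row per patient (hypothetical)

    cph = CoxPHFitter()
    cph.fit(
        df[["followup_months", "cancer_death", "age", "tumor_size",
            "pT_stage", "pN_stage", "fuhrman_grade", "p2_histology"]],
        duration_col="followup_months",
        event_col="cancer_death",
    )
    cph.print_summary()  # hazard ratios (exp(coef)) with 95% confidence intervals
    ```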

  6. Novel and successful free comments method for sensory characterization of chocolate ice cream: A comparative study between pivot profile and comment analysis.

    PubMed

    Fonseca, Fernando G A; Esmerino, Erick A; Filho, Elson R Tavares; Ferraz, Juliana P; da Cruz, Adriano G; Bolini, Helena M A

    2016-05-01

    Rapid sensory profiling methods have gained ground in the sensory evaluation field. Techniques using direct analysis of the terms generated by consumers are considered easy to perform, without specific training requirements, thus improving knowledge about consumer perceptions of various products. This study aimed to determine the sensory profile of different commercial samples of chocolate ice cream, labeled as conventional and light or diet, using the "comment analysis" and "pivot profile" methods, based on consumers' perceptions. In the comment analysis task, consumers responded to 2 separate open questions describing the sensory attributes they liked or disliked in each sample. In the pivot profile method, samples were served in pairs (consisting of a coded sample and pivot), and consumers indicated the higher and lower intensity attributes in the target sample compared with the pivot. We observed that both methods were able to characterize the different chocolate ice cream samples using consumer perception, with high correlation results and configurational similarity (regression vector coefficient = 0.917) between them. However, it is worth emphasizing that comment analysis is performed intuitively by consumers, whereas the pivot profile method showed high analytical and discriminative power even using consumers, proving to be a promising technique for routine application when classical descriptive methods cannot be used. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  7. Analysis of Moisture Content in Beetroot using Fourier Transform Infrared Spectroscopy and by Principal Component Analysis.

    PubMed

    Nesakumar, Noel; Baskar, Chanthini; Kesavan, Srinivasan; Rayappan, John Bosco Balaguru; Alwarappan, Subbiah

    2018-05-22

    The moisture content of beetroot varies during long-term cold storage. In this work, we propose a strategy to identify the moisture content and age of beetroot using principal component analysis coupled with Fourier transform infrared spectroscopy (FTIR). Frequent FTIR measurements were recorded directly from the beetroot sample surface over a period of 34 days for analysing its moisture content, employing attenuated total reflectance in the spectral ranges of 2614-4000 and 1465-1853 cm-1 with a spectral resolution of 8 cm-1. In order to estimate the transmittance peak height (Tp) and area under the transmittance curve [Formula: see text] over the spectral ranges of 2614-4000 and 1465-1853 cm-1, a Gaussian curve fitting algorithm was applied to the FTIR data. Principal component and nonlinear regression analyses were utilized for FTIR data analysis. The score plot over the ranges of 2614-4000 and 1465-1853 cm-1 allowed beetroot quality discrimination. Beetroot quality predictive models were developed by employing a biphasic dose-response function. Validation experiment results confirmed that the accuracy of the beetroot quality predictive model reached 97.5%. This research work proves that FTIR spectroscopy in combination with principal component analysis and beetroot quality predictive models could serve as an effective tool for discriminating moisture content in the fresh, half-spoiled and completely spoiled stages of beetroot samples and for providing status alerts.
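
    A minimal sketch of the feature-extraction idea, assuming hypothetical spectra arrays: a Gaussian is fitted to a transmittance band to obtain the peak height and band area, and the resulting per-sample features are passed to PCA for the score plot.

    ```python
    # Hedged sketch: Gaussian fit of one FTIR band, then PCA on the per-sample
    # features. Array shapes, file handling and initial guesses are assumptions.
    import numpy as np
    from scipy.optimize import curve_fit
    from sklearn.decomposition import PCA

    def gaussian(x, amplitude, center, width):
        return amplitude * np.exp(-((x - center) ** 2) / (2 * width ** 2))

    def band_features(wavenumbers, transmittance):
        """Peak height and area of one band from a Gaussian fit."""
        p0 = [transmittance.max(), wavenumbers[np.argmax(transmittance)], 50.0]
        params, _ = curve_fit(gaussian, wavenumbers, transmittance, p0=p0)
        amplitude, center, width = params
        area = amplitude * width * np.sqrt(2 * np.pi)   # analytic Gaussian area
        return amplitude, area

    # spectra: (n_samples, n_points) array restricted to e.g. 2614-4000 cm-1,
    # wn: matching wavenumber axis (both hypothetical)
    # features = np.array([band_features(wn, s) for s in spectra])
    # scores = PCA(n_components=2).fit_transform(features)   # score plot axes
    ```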

  8. Quantitative contrast-enhanced ultrasound evaluation of pathological complete response in patients with locally advanced breast cancer receiving neoadjuvant chemotherapy.

    PubMed

    Wan, Cai-Feng; Liu, Xue-Song; Wang, Lin; Zhang, Jie; Lu, Jin-Song; Li, Feng-Hua

    2018-06-01

    To clarify whether the quantitative parameters of contrast-enhanced ultrasound (CEUS) can be used to predict pathological complete response (pCR) in patients with locally advanced breast cancer receiving neoadjuvant chemotherapy (NAC). Fifty-one patients with histologically proved locally advanced breast cancer scheduled for NAC were enrolled. The quantitative data for CEUS and the tumor diameter were collected at baseline and before surgery, and compared with the pathological response. Multiple logistic regression analysis was performed to examine quantitative parameters at CEUS and the tumor diameter to predict the pCR, and receiver operating characteristic (ROC) curve analysis was used as a summary statistic. Multiple logistic regression analysis revealed that PEAK (the maximum intensity of the time-intensity curve during bolus transit), PEAK%, TTP% (time to peak), and diameter% were significant independent predictors of pCR, and the area under the ROC curve was 0.932 (Az1), and the sensitivity and specificity to predict pCR were 93.7% and 80.0%. The area under the ROC curve for the quantitative parameters was 0.927 (Az2), and the sensitivity and specificity to predict pCR were 81.2% and 94.3%. For diameter%, the area under the ROC curve was 0.786 (Az3), and the sensitivity and specificity to predict pCR were 93.8% and 54.3%. The values of Az1 and Az2 were significantly higher than that of Az3 (P = 0.027 and P = 0.034, respectively). However, there was no significant difference between the values of Az1 and Az2 (P = 0.825). Quantitative analysis of tumor blood perfusion with CEUS is superior to diameter% to predict pCR, and can be used as a functional technique to evaluate tumor response to NAC. Copyright © 2018. Published by Elsevier B.V.
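
    The prediction step can be sketched as a multiple logistic regression on the quantitative CEUS parameters followed by an ROC summary. The variable names below loosely mirror the abstract and the data file is hypothetical.

    ```python
    # Hedged sketch: logistic regression on CEUS-derived predictors of pCR,
    # summarized with the area under the ROC curve. Illustrative data only.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    df = pd.read_csv("ceus_nac.csv")         # hypothetical file
    X = df[["peak_pct", "ttp_pct", "diameter_pct"]]
    y = df["pcr"]                            # 1 = pathological complete response

    model = LogisticRegression(max_iter=1000).fit(X, y)
    probs = model.predict_proba(X)[:, 1]
    print("Az (area under ROC curve):", round(roc_auc_score(y, probs), 3))
    ```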

  9. A Dilemma That Underlies an Existence Proof in Geometry

    ERIC Educational Resources Information Center

    Samper, Carmen; Perry, Patricia; Camargo, Leonor; Sáenz-Ludlow, Adalira; Molina, Óscar

    2016-01-01

    Proving an existence theorem is less intuitive than proving other theorems. This article presents a semiotic analysis of significant fragments of classroom meaning-making which took place during the class-session in which the existence of the midpoint of a line-segment was proven. The purpose of the analysis is twofold. First follow the evolution…

  10. Using average cost methods to estimate encounter-level costs for medical-surgical stays in the VA.

    PubMed

    Wagner, Todd H; Chen, Shuo; Barnett, Paul G

    2003-09-01

    The U.S. Department of Veterans Affairs (VA) maintains discharge abstracts, but these do not include cost information. This article describes the methods the authors used to estimate the costs of VA medical-surgical hospitalizations in fiscal years 1998 to 2000. They estimated a cost regression with 1996 Medicare data restricted to veterans receiving VA care in an earlier year. The regression accounted for approximately 74 percent of the variance in cost-adjusted charges, and it proved to be robust to outliers and the year of input data. The beta coefficients from the cost regression were used to impute costs of VA medical-surgical hospital discharges. The estimated aggregate costs were reconciled with VA budget allocations. In addition to the direct medical costs, their cost estimates include indirect costs and physician services; both of these were allocated in proportion to direct costs. They discuss the method's limitations and application in other health care systems.

  11. An ultra low power feature extraction and classification system for wearable seizure detection.

    PubMed

    Page, Adam; Pramod, Siddharth; Oates, Tim; Mohsenin, Tinoosh

    2015-01-01

    In this paper we explore the use of a variety of machine learning algorithms for designing a reliable and low-power, multi-channel EEG feature extractor and classifier for predicting seizures from electroencephalographic data (scalp EEG). Different machine learning classifiers including k-nearest neighbor, support vector machines, naïve Bayes, logistic regression, and neural networks are explored with the goal of maximizing detection accuracy while minimizing power, area, and latency. The input to each machine learning classifier is a 198 feature vector containing 9 features for each of the 22 EEG channels obtained over 1-second windows. All classifiers were able to obtain F1 scores over 80% and onset sensitivity of 100% when tested on 10 patients. Among five different classifiers that were explored, logistic regression (LR) proved to have minimum hardware complexity while providing average F-1 score of 91%. Both ASIC and FPGA implementations of logistic regression are presented and show the smallest area, power consumption, and the lowest latency when compared to the previous work.

  12. Automatic prediction of solar flares and super geomagnetic storms

    NASA Astrophysics Data System (ADS)

    Song, Hui

    Space weather is the response of our space environment to the constantly changing Sun. As technology advances, mankind has become more and more dependent on space systems and satellite-based services. A geomagnetic storm, a disturbance in Earth's magnetosphere, may produce many harmful effects on Earth. Solar flares and Coronal Mass Ejections (CMEs) are believed to be the major causes of geomagnetic storms. Thus, establishing a real-time forecasting method for them is very important in space weather study. The topics covered in this dissertation are: the relationship between magnetic gradient and magnetic shear of solar active regions; the relationship between solar flare index and magnetic features of solar active regions; based on these relationships, a statistical ordinal logistic regression model developed to predict the probability of solar flare occurrences in the next 24 hours; and finally, the relationship between magnetic structures of CME source regions and geomagnetic storms, in particular the super storms for which the Dst index decreases below -200 nT, which is studied and shown to be usable for predicting those super storms. The results are briefly summarized as follows: (1) There is a significant correlation between magnetic gradient and magnetic shear of an active region. Furthermore, compared with magnetic shear, magnetic gradient might be a better proxy to locate where a large flare occurs. It appears to be more accurate in identifying the sources of X-class flares than of M-class flares; (2) The flare index, defined by weighting the SXR flares, is shown to have a positive correlation with three magnetic features of the active region; (3) A statistical ordinal logistic regression model is proposed for solar flare prediction. The results are much better than the data published in the NASA/SDAC service, and comparable to the data provided by the NOAA/SEC complicated expert system. To our knowledge, this is the first time that a logistic regression model has been applied in solar physics to predict flare occurrences; (4) The magnetic orientation angle θ, determined from a potential field model, is shown to be able to predict the probability of super geomagnetic storms (Dst <= -200 nT). The results show that active regions associated with |θ| < 90° are more likely to cause a super geomagnetic storm.
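
    An ordinal logistic regression of the kind described in point (3) can be sketched with statsmodels (version 0.13 or later). The flare-class coding, feature names and data file below are illustrative assumptions, not the dissertation's actual inputs.

    ```python
    # Hedged sketch: ordinal logistic regression of flare activity class on
    # active-region magnetic features. Illustrative data and columns only.
    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    df = pd.read_csv("active_regions.csv")
    # Ordered outcome: 0 = no flare, 1 = C-class, 2 = M-class, 3 = X-class
    y = df["flare_class"].astype(
        pd.CategoricalDtype(categories=[0, 1, 2, 3], ordered=True)
    )
    X = df[["max_gradient", "shear_length", "flare_index"]]

    model = OrderedModel(y, X, distr="logit")
    result = model.fit(method="bfgs")
    print(result.summary())
    print(result.predict(X)[:5])   # per-class probabilities for the next 24 hours
    ```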

  13. Multiplicative Multitask Feature Learning

    PubMed Central

    Wang, Xin; Bi, Jinbo; Yu, Shipeng; Sun, Jiangwen; Song, Minghu

    2016-01-01

    We investigate a general framework of multiplicative multitask feature learning which decomposes individual task’s model parameters into a multiplication of two components. One of the components is used across all tasks and the other component is task-specific. Several previous methods can be proved to be special cases of our framework. We study the theoretical properties of this framework when different regularization conditions are applied to the two decomposed components. We prove that this framework is mathematically equivalent to the widely used multitask feature learning methods that are based on a joint regularization of all model parameters, but with a more general form of regularizers. Further, an analytical formula is derived for the across-task component as related to the task-specific component for all these regularizers, leading to a better understanding of the shrinkage effects of different regularizers. Study of this framework motivates new multitask learning algorithms. We propose two new learning formulations by varying the parameters in the proposed framework. An efficient blockwise coordinate descent algorithm is developed suitable for solving the entire family of formulations with rigorous convergence analysis. Simulation studies have identified the statistical properties of data that would be in favor of the new formulations. Extensive empirical studies on various classification and regression benchmark data sets have revealed the relative advantages of the two new formulations by comparing with the state of the art, which provides instructive insights into the feature learning problem with multiple tasks. PMID:28428735

  14. Assessing eco-efficiency and the determinants of horticultural family-farming in southeast Spain.

    PubMed

    Godoy-Durán, Ángeles; Galdeano-Gómez, Emilio; Pérez-Mesa, Juan C; Piedra-Muñoz, Laura

    2017-12-15

    Eco-efficiency is currently receiving ever increasing interest as an indicator of sustainability, as it links environmental and economic performances in productive activities. In agriculture these indicators and their determinants prove relevant due to the close ties in this activity between the use of often limited natural resources and the provision of basic goods for society. The present paper analyzes eco-efficiency at micro-level, focusing on small-scale family farms as the principal decision-making units (DMUs) of horticulture in southeast Spain, which represents over 30% of fresh vegetables produced in the country. To this end, Data Envelopment Analysis (DEA) framework is applied, computing several combinations of environmental pressures (water usage, phytosanitary contamination, waste management, etc.) and economic value added. In a second stage we analyze the influence of family farms' socio-economic and environmental features on eco-efficiency indicators, as endogenous variables, by using truncated regression and bootstrapping techniques. The results show major inefficiency in aspects such as waste management, among others, while there is relatively minor inefficiency in water usage and nitrogen balance. On the other hand, features such as product specialization, adoption of quality certifications, and belonging to a cooperative all have a positive influence on eco-efficiency. These results are deemed to be of interest to agri-food systems structured on small-scale producers, and they may prove useful to policy-makers as regards managing public environmental programs in agriculture. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Weighted Regressions on Time, Discharge, and Season (WRTDS), with an application to Chesapeake Bay River inputs

    USGS Publications Warehouse

    Hirsch, Robert M.; Moyer, Douglas; Archfield, Stacey A.

    2010-01-01

    A new approach to the analysis of long-term surface water-quality data is proposed and implemented. The goal of this approach is to increase the amount of information that is extracted from the types of rich water-quality datasets that now exist. The method is formulated to allow for maximum flexibility in representations of the long-term trend, seasonal components, and discharge-related components of the behavior of the water-quality variable of interest. It is designed to provide internally consistent estimates of the actual history of concentrations and fluxes as well as histories that eliminate the influence of year-to-year variations in streamflow. The method employs the use of weighted regressions of concentrations on time, discharge, and season. Finally, the method is designed to be useful as a diagnostic tool regarding the kinds of changes that are taking place in the watershed related to point sources, groundwater sources, and surface-water nonpoint sources. The method is applied to datasets for the nine large tributaries of Chesapeake Bay from 1978 to 2008. The results show a wide range of patterns of change in total phosphorus and in dissolved nitrate plus nitrite. These results should prove useful in further examination of the causes of changes, or lack of changes, and may help inform decisions about future actions to reduce nutrient enrichment in the Chesapeake Bay and its watershed.
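
    A heavily simplified sketch of the WRTDS idea is given below: a weighted regression of log concentration on time, log discharge and seasonal terms, with tricube weights centered on the estimation point. Window half-widths and column names are assumptions, and the full method (implemented in the USGS EGRET package) adds censored-data handling and flow normalization that are not shown here.

    ```python
    # Hedged, much-simplified sketch of a WRTDS-style weighted regression.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def tricube(distance, half_width):
        d = np.clip(np.abs(distance) / half_width, 0, 1)
        return (1 - d ** 3) ** 3

    def wrtds_estimate(samples, t0, q0, h_time=7.0, h_lnq=2.0):
        """Estimate log concentration at decimal year t0 and discharge q0."""
        X = pd.DataFrame({
            "t": samples["dec_year"],
            "lnQ": np.log(samples["discharge"]),
            "sin": np.sin(2 * np.pi * samples["dec_year"]),
            "cos": np.cos(2 * np.pi * samples["dec_year"]),
        })
        y = np.log(samples["concentration"])
        # Weights fall off with distance in time and in log discharge.
        w = tricube(X["t"] - t0, h_time) * tricube(X["lnQ"] - np.log(q0), h_lnq)
        fit = sm.WLS(y, sm.add_constant(X), weights=w).fit()
        x0 = [1.0, t0, np.log(q0), np.sin(2 * np.pi * t0), np.cos(2 * np.pi * t0)]
        return fit.predict(np.array(x0).reshape(1, -1))[0]
    ```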

  16. CALIBRATING NON-CONVEX PENALIZED REGRESSION IN ULTRA-HIGH DIMENSION.

    PubMed

    Wang, Lan; Kim, Yongdai; Li, Runze

    2013-10-01

    We investigate high-dimensional non-convex penalized regression, where the number of covariates may grow at an exponential rate. Although recent asymptotic theory established that there exists a local minimum possessing the oracle property under general conditions, it is still largely an open problem how to identify the oracle estimator among potentially multiple local minima. There are two main obstacles: (1) due to the presence of multiple minima, the solution path is nonunique and is not guaranteed to contain the oracle estimator; (2) even if a solution path is known to contain the oracle estimator, the optimal tuning parameter depends on many unknown factors and is hard to estimate. To address these two challenging issues, we first prove that an easy-to-calculate calibrated CCCP algorithm produces a consistent solution path which contains the oracle estimator with probability approaching one. Furthermore, we propose a high-dimensional BIC criterion and show that it can be applied to the solution path to select the optimal tuning parameter which asymptotically identifies the oracle estimator. The theory for a general class of non-convex penalties in the ultra-high dimensional setup is established when the random errors follow the sub-Gaussian distribution. Monte Carlo studies confirm that the calibrated CCCP algorithm combined with the proposed high-dimensional BIC has desirable performance in identifying the underlying sparsity pattern for high-dimensional data analysis.

  17. Relationships between social competence, psychopathology and work performance and their predictive value for vocational rehabilitation of schizophrenic outpatients.

    PubMed

    Hoffmann, H; Kupper, Z

    1997-01-17

    Earlier studies suggest that social competence has a higher predictive value for vocational outcome than psychopathology. These studies, however, show methodological shortcomings, including the fact that the instruments used for assessing social competence, psychopathology and work performance are strongly interrelated. The present study, involving a population of 34 chronically schizophrenic outpatients enrolled in a vocational rehabilitation program, was conducted in order to determine: (1) how closely the Role Play Test, the Positive and Negative Syndrome Scale and the Work Behavior Assessment Scale are related to each other; and (2) whether social competence is a better predictor of work performance and outcome of vocational rehabilitation than psychopathology. Factor analysis has revealed that the instruments are interrelated, mainly in the dimensions of negative symptoms, social relationships, non-verbal measures of social competence and conceptual disorganization. In backward regression analyses, psychopathological indicators proved to be the best predictors of work performance both cross-sectionally and over the long-term course. In the traditional two-syndrome model of schizophrenic psychopathology only negative symptoms were left in the regression model. In a four-dimension model the disorder of relating and the conceptual disorganization dimension were the best predictors. Differences between disorder of relating and social competence, assessed by the Role Play Test, are discussed here as well as the implications of this study for rehabilitation.

  18. Using perinatal morbidity scoring tools as a primary study outcome.

    PubMed

    Hutcheon, Jennifer A; Bodnar, Lisa M; Platt, Robert W

    2017-11-01

    Perinatal morbidity scores are tools that score or weight different adverse events according to their relative severity. Perinatal morbidity scores are appealing for maternal-infant health researchers because they provide a way to capture a broad range of adverse events to mother and newborn while recognising that some events are considered more serious than others. However, they have proved difficult to implement as a primary outcome in applied research studies because of challenges in testing if the scores are significantly different between two or more study groups. We outline these challenges and describe a solution, based on Poisson regression, that allows differences in perinatal morbidity scores to be formally evaluated. The approach is illustrated using an existing maternal-neonatal scoring tool, the Adverse Outcome Index, to evaluate the safety of labour and delivery before and after the closure of obstetrical services in small rural communities. Applying the proposed Poisson regression to the case study showed a protective risk ratio for adverse outcome following closures as compared with the original analysis, where no difference was found. This approach opens the door for considerably broader use of perinatal morbidity scoring tools as a primary outcome in applied population and clinical maternal-infant health research studies. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
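
    The proposed Poisson regression comparison can be sketched as below, assuming a hypothetical table with one row per delivery, a summed severity score and a pre/post indicator; robust variance adjustments used in practice are omitted.

    ```python
    # Hedged sketch: Poisson regression comparing a weighted morbidity score
    # between pre- and post-closure periods. Illustrative column names only.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("adverse_outcome_index.csv")
    # 'score' = summed severity weights per delivery; 'post' = 1 after closure
    model = smf.glm("score ~ post", data=df, family=sm.families.Poisson())
    result = model.fit()
    print(result.summary())
    print("Score ratio (post vs pre):", np.exp(result.params["post"]))
    ```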

  19. CALIBRATING NON-CONVEX PENALIZED REGRESSION IN ULTRA-HIGH DIMENSION

    PubMed Central

    Wang, Lan; Kim, Yongdai; Li, Runze

    2014-01-01

    We investigate high-dimensional non-convex penalized regression, where the number of covariates may grow at an exponential rate. Although recent asymptotic theory established that there exists a local minimum possessing the oracle property under general conditions, it is still largely an open problem how to identify the oracle estimator among potentially multiple local minima. There are two main obstacles: (1) due to the presence of multiple minima, the solution path is nonunique and is not guaranteed to contain the oracle estimator; (2) even if a solution path is known to contain the oracle estimator, the optimal tuning parameter depends on many unknown factors and is hard to estimate. To address these two challenging issues, we first prove that an easy-to-calculate calibrated CCCP algorithm produces a consistent solution path which contains the oracle estimator with probability approaching one. Furthermore, we propose a high-dimensional BIC criterion and show that it can be applied to the solution path to select the optimal tuning parameter which asymptotically identifies the oracle estimator. The theory for a general class of non-convex penalties in the ultra-high dimensional setup is established when the random errors follow the sub-Gaussian distribution. Monte Carlo studies confirm that the calibrated CCCP algorithm combined with the proposed high-dimensional BIC has desirable performance in identifying the underlying sparsity pattern for high-dimensional data analysis. PMID:24948843

  20. The use of segmented regression in analysing interrupted time series studies: an example in pre-hospital ambulance care.

    PubMed

    Taljaard, Monica; McKenzie, Joanne E; Ramsay, Craig R; Grimshaw, Jeremy M

    2014-06-19

    An interrupted time series design is a powerful quasi-experimental approach for evaluating effects of interventions introduced at a specific point in time. To utilize the strength of this design, a modification to standard regression analysis, such as segmented regression, is required. In segmented regression analysis, the change in intercept and/or slope from pre- to post-intervention is estimated and used to test causal hypotheses about the intervention. We illustrate segmented regression using data from a previously published study that evaluated the effectiveness of a collaborative intervention to improve quality in pre-hospital ambulance care for acute myocardial infarction (AMI) and stroke. In the original analysis, a standard regression model was used with time as a continuous variable. We contrast the results from this standard regression analysis with those from segmented regression analysis. We discuss the limitations of the former and advantages of the latter, as well as the challenges of using segmented regression in analysing complex quality improvement interventions. Based on the estimated change in intercept and slope from pre- to post-intervention using segmented regression, we found insufficient evidence of a statistically significant effect on quality of care for stroke, although potential clinically important effects for AMI cannot be ruled out. Segmented regression analysis is the recommended approach for analysing data from an interrupted time series study. Several modifications to the basic segmented regression analysis approach are available to deal with challenges arising in the evaluation of complex quality improvement interventions.
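
    A minimal sketch of a segmented regression for an interrupted time series is shown below: the model allows an immediate level change and a slope change at the intervention point. Column names are illustrative, and the adjustments for autocorrelation and seasonality that such analyses often require are omitted.

    ```python
    # Hedged sketch: basic segmented regression for an interrupted time series.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("quality_indicator_series.csv")   # hypothetical file
    # time: 1..T study period; post: 0 before / 1 after the intervention
    t0 = df.loc[df["post"] == 1, "time"].min()          # first post-intervention point
    df["time_after"] = (df["time"] - t0 + 1).clip(lower=0)

    model = smf.ols("outcome ~ time + post + time_after", data=df).fit()
    print(model.summary())
    # 'post' estimates the immediate level change; 'time_after' the slope change.
    ```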

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dierauf, Timothy; Kurtz, Sarah; Riley, Evan

    This paper provides a recommended method for evaluating the AC capacity of a photovoltaic (PV) generating station. It also presents companion guidance on setting the facility's capacity guarantee value. This is a principles-based approach that incorporates plant fundamental design parameters such as loss factors, module coefficients, and inverter constraints. This method has been used to prove contract guarantees for over 700 MW of installed projects. The method is transparent, and the results are deterministic. In contrast, current industry practices incorporate statistical regression where the empirical coefficients may only characterize the collected data. Though these methods may work well when extrapolation is not required, there are other situations where the empirical coefficients may not adequately model actual performance. This proposed Fundamentals Approach method provides consistent results even where regression methods start to lose fidelity.

  2. Standardized Regression Coefficients as Indices of Effect Sizes in Meta-Analysis

    ERIC Educational Resources Information Center

    Kim, Rae Seon

    2011-01-01

    When conducting a meta-analysis, it is common to find many collected studies that report regression analyses, because multiple regression analysis is widely used in many fields. Meta-analysis uses effect sizes drawn from individual studies as a means of synthesizing a collection of results. However, indices of effect size from regression analyses…

  3. Nonlinear Constitutive Modeling of Piezoelectric Ceramics

    NASA Astrophysics Data System (ADS)

    Xu, Jia; Li, Chao; Wang, Haibo; Zhu, Zhiwen

    2017-12-01

    Nonlinear constitutive modeling of piezoelectric ceramics is discussed in this paper. A Van der Pol term is introduced to explain the simple hysteretic curve. Improved nonlinear difference terms are used to interpret the hysteresis phenomena of piezoelectric ceramics. The fit of the model to experimental data is verified by the partial least-squares regression method. The results show that this method can describe the real curve well. The results of this paper are helpful for constitutive modeling of piezoelectric ceramics.

  4. [Aboveground biomass of three conifers in Qianyanzhou plantation].

    PubMed

    Li, Xuanran; Liu, Qijing; Chen, Yongrui; Hu, Lile; Yang, Fengting

    2006-08-01

    In this paper, regression models of the aboveground biomass of Pinus elliottii, P. massoniana and Cunninghamia lanceolata in Qianyanzhou of subtropical China were established, and regression analysis of the dry weight of leaf biomass and total biomass against branch diameter (d), branch length (L), d^3 and d^2L was conducted with linear, power and exponential functions. A power equation with the single parameter (d) proved better than the rest for P. massoniana and C. lanceolata, and a linear equation with the parameter (d^3) was better for P. elliottii. Canopy biomass was derived from the regression equations over all branches. These equations were also used to fit the relationships of total tree biomass, branch biomass and foliage biomass with tree diameter at breast height (D), tree height (H), D^3 and D^2H, respectively. D^2H was found to be the best parameter for estimating total biomass. For foliage and branch biomass, both the parameters and the equation forms showed some differences among species. Correlations were highly significant (P < 0.001) for foliage, branch and total biomass, with the highest for total biomass. Using these equations, the aboveground biomass and its allocation were estimated: the aboveground biomass of the P. massoniana, P. elliottii and C. lanceolata forests was 83.6, 72.1 and 59 t x hm(-2), respectively, with more stem biomass than foliage and branch biomass. According to previous studies, the underground biomass of these three forests was estimated to be 10.44, 9.42 and 11.48 t x hm(-2), and the amount of fixed carbon 47.94, 45.14 and 37.52 t x hm(-2), respectively.

  5. Effective Surfactants Blend Concentration Determination for O/W Emulsion Stabilization by Two Nonionic Surfactants by Simple Linear Regression.

    PubMed

    Hassan, A K

    2015-01-01

    In this work, sets of O/W emulsions were prepared using different concentrations of two nonionic surfactants. The two surfactants, Tween 80 (HLB = 15.0) and Span 80 (HLB = 4.3), were used in a fixed proportion of 0.55:0.45, respectively, giving a fixed HLB value of 10.185 for the surfactant blend. The surfactant blend concentration ranged from 3% up to 19%. For each O/W emulsion set, the conductivity was measured at room temperature (25±2°), 40, 50, 60, 70 and 80°. Applying simple linear least-squares regression to the temperature-conductivity data determines the effective surfactant blend concentration required for preparing the most stable O/W emulsion. These results were confirmed by physical stability centrifugation testing and phase inversion temperature range measurements. The results indicated that the relation representing the most stable O/W emulsion has the strongest direct linear relationship between temperature and conductivity, and this relationship is linear up to 80°. This work shows that the most stable O/W emulsion is identified by the maximum R² value obtained when simple linear least-squares regression is applied to the temperature-conductivity data up to 80°; in addition, the true maximum slope is given by the equation with the maximum R² value. Because conditions change in more complex formulations, the method for determining the effective surfactant blend concentration was verified by applying it to a more complex formulation of 2% O/W miconazole nitrate cream, and the results indicate its reproducibility.
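
    The selection rule described above amounts to fitting a least-squares line to the temperature-conductivity data of each blend concentration and picking the concentration with the largest R². The sketch below illustrates this with a hypothetical data structure.

    ```python
    # Hedged sketch: fit a least-squares line per blend concentration and pick
    # the one with the maximum R^2 (its slope is the "true maximum slope").
    from scipy.stats import linregress

    def best_blend_concentration(datasets):
        """datasets: {concentration: (temperatures, conductivities)} (assumed)."""
        results = {}
        for conc, (temperature, conductivity) in datasets.items():
            fit = linregress(temperature, conductivity)
            results[conc] = (fit.rvalue ** 2, fit.slope)
        best = max(results, key=lambda c: results[c][0])   # maximum R^2
        return best, results

    # Example with made-up numbers:
    # datasets = {"3%": ([25, 40, 50, 60, 70, 80], [1.1, 1.5, 1.8, 2.2, 2.6, 3.0])}
    # print(best_blend_concentration(datasets))
    ```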

  6. Pan-arctic trends in terrestrial dissolved organic matter from optical measurements

    NASA Astrophysics Data System (ADS)

    Mann, Paul; Spencer, Robert; Hernes, Peter; Six, Johan; Aiken, George; Tank, Suzanne; McClelland, James; Butler, Kenna; Dyda, Rachael; Holmes, Robert

    2016-03-01

    Climate change is causing extensive warming across arctic regions, resulting in permafrost degradation, alterations to regional hydrology, and shifting amounts and composition of dissolved organic matter (DOM) transported by streams and rivers. Here, we characterize the DOM composition and optical properties of the six largest arctic rivers draining into the Arctic Ocean to examine the ability of optical measurements to provide meaningful insights into terrigenous carbon export patterns and biogeochemical cycling. The chemical composition of aquatic DOM varied with season: spring months were typified by the highest lignin phenol and dissolved organic carbon (DOC) concentrations, with greater hydrophobic acid content and lower proportions of hydrophilic compounds relative to summer and winter months. Chromophoric DOM (CDOM) spectral slope (S275-295) tracked seasonal shifts in DOM composition across river basins. Fluorescence and parallel factor analysis identified seven components across the six Arctic rivers. The ratios of 'terrestrial humic-like' versus 'marine humic-like' fluorescent components co-varied with lignin monomer ratios over summer and winter months, suggesting fluorescence may provide information on the age and degradation state of riverine DOM. CDOM absorbance (a350) proved a sensitive proxy for lignin phenol concentrations across all six river basins and over the hydrograph, enabling for the first time the development of a single pan-arctic relationship between a350 and terrigenous DOC (R2 = 0.93). Combining this lignin proxy with high-resolution monitoring of a350, pan-arctic estimates of annual lignin flux were calculated to range from 156 to 185 Gg, resulting in shorter and more constrained estimates of terrigenous DOM residence times in the Arctic Ocean (spanning 7 months to 2½ years). Furthermore, multiple linear regression models incorporating both absorbance and fluorescence variables proved capable of explaining much of the variability in lignin composition across rivers and seasons. Our findings suggest that synoptic, high-resolution optical measurements can provide improved understanding of northern high-latitude organic matter cycling and flux, and prove an important technique for capturing future climate-driven changes.

  7. Are skills learned in nursing transferable to other careers?

    PubMed

    Duffield, Christine; O'Brien-Pallas, Linda; Aitken, Leanne M

    2005-01-01

    To determine the influence of skills gained in nursing on the transition to a non-nursing career. Little is known about the impact that nursing skills have on the transition to new careers or about the transferability of nursing skills to professions outside nursing. A postal questionnaire was mailed to respondents who had left nursing. The questionnaire included demographic, nursing education and practice information, reasons for entering and leaving nursing, perceptions of the skills gained in nursing and the ease of adjustment to a new career. Data analysis included exploratory and confirmatory factor analysis, Pearson product moment correlations and linear and multiple regression analysis. Skills learned as a nurse that were valuable in acquiring a career outside nursing formed two factors, including "management of self and others" and "knowledge and skills learned," explaining 32% of the variation. The highest educational achievement while working as a nurse, choosing nursing as a "default choice," leaving nursing because of "worklife/homelife balance" and the skills of "management of self and others" and "knowledge and skills" had a significant relationship with difficulty adjusting to a non-nursing work role and, overall, explained 28% of the variation in this difficulty adjusting. General knowledge and skills learned in nursing prove beneficial in adjusting to roles outside nursing.

  8. Pulse Rate and Transit Time Analysis to Predict Hypotension Events After Spinal Anesthesia During Programmed Cesarean Labor.

    PubMed

    Bolea, Juan; Lázaro, Jesús; Gil, Eduardo; Rovira, Eva; Remartínez, José M; Laguna, Pablo; Pueyo, Esther; Navarro, Augusto; Bailón, Raquel

    2017-09-01

    Prophylactic treatment has been proved to reduce hypotension incidence after spinal anesthesia during cesarean labor. However, the use of pharmacological prophylaxis could carry undesirable side-effects on mother and fetus. Thus, the prediction of hypotension becomes an important challenge. Hypotension events are hypothesized to be related to a malfunctioning of autonomic nervous system (ANS) regulation of blood pressure. In this work, ANS responses to positional changes of 51 pregnant women programmed for a cesarean labor were explored for hypotension prediction. Lateral and supine decubitus, and sitting position were considered while electrocardiographic and pulse photoplethysmographic signals were recorded. Features based on heart rate variability, pulse rate variability (PRV) and pulse transit time (PTT) analysis were used in a logistic regression classifier. The results showed that the change in PRV irregularity, assessed by approximate entropy, from supine to lateral decubitus, combined with the standard deviation of PTT in supine decubitus, was the combination of features that achieved the best classification results: sensitivity of 76%, specificity of 70% and accuracy of 72%, with normotensive as the positive class. Peripheral regulation and blood pressure changes, measured by PRV and PTT analysis, could help to predict hypotension events, reducing prophylactic side-effects in the low-risk population.

  9. Effect of noise in principal component analysis with an application to ozone pollution

    NASA Astrophysics Data System (ADS)

    Tsakiri, Katerina G.

    This thesis analyzes the effect of independent noise in principal components of k normally distributed random variables defined by a covariance matrix. We prove that the principal components as well as the canonical variate pairs determined from the joint distribution of the original sample affected by noise can be essentially different in comparison with those determined from the original sample. However, when the differences between the eigenvalues of the original covariance matrix are sufficiently large compared to the level of the noise, the effect of noise in principal components and canonical variate pairs proved to be negligible. The theoretical results are supported by a simulation study and examples. Moreover, we compare our results about the eigenvalues and eigenvectors in the two-dimensional case with other models examined before. This theory can be applied in any field for the decomposition of the components in multivariate analysis. One application is the detection and prediction of the main atmospheric factor of ozone concentrations, using the example of Albany, New York. Using daily ozone, solar radiation, temperature, wind speed and precipitation data, we determine the main atmospheric factor for the explanation and prediction of ozone concentrations. A methodology is described for the decomposition of the time series of ozone and other atmospheric variables into the global term component, which describes the long-term trend and the seasonal variations, and the synoptic scale component, which describes the short-term variations. By using Canonical Correlation Analysis, we show that solar radiation is the only main factor among the atmospheric variables considered here for the explanation and prediction of the global and synoptic scale component of ozone. The global term components are modeled by a linear regression model, while the synoptic scale components by a vector autoregressive model and the Kalman filter. The coefficient of determination, R2, for the prediction of the synoptic scale ozone component was found to be the highest when we consider the synoptic scale component of the time series for solar radiation and temperature. KEY WORDS: multivariate analysis; principal component; canonical variate pairs; eigenvalue; eigenvector; ozone; solar radiation; spectral decomposition; Kalman filter; time series prediction

  10. Directed Incremental Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Person, Suzette; Yang, Guowei; Rungta, Neha; Khurshid, Sarfraz

    2011-01-01

    The last few years have seen a resurgence of interest in the use of symbolic execution -- a program analysis technique developed more than three decades ago to analyze program execution paths. Scaling symbolic execution and other path-sensitive analysis techniques to large systems remains challenging despite recent algorithmic and technological advances. An alternative to solving the problem of scalability is to reduce the scope of the analysis. One approach that is widely studied in the context of regression analysis is to analyze the differences between two related program versions. While such an approach is intuitive in theory, finding efficient and precise ways to identify program differences, and characterize their effects on how the program executes has proved challenging in practice. In this paper, we present Directed Incremental Symbolic Execution (DiSE), a novel technique for detecting and characterizing the effects of program changes. The novelty of DiSE is to combine the efficiencies of static analysis techniques to compute program difference information with the precision of symbolic execution to explore program execution paths and generate path conditions affected by the differences. DiSE is a complementary technique to other reduction or bounding techniques developed to improve symbolic execution. Furthermore, DiSE does not require analysis results to be carried forward as the software evolves -- only the source code for two related program versions is required. A case-study of our implementation of DiSE illustrates its effectiveness at detecting and characterizing the effects of program changes.

  11. Using Dominance Analysis to Determine Predictor Importance in Logistic Regression

    ERIC Educational Resources Information Center

    Azen, Razia; Traxel, Nicole

    2009-01-01

    This article proposes an extension of dominance analysis that allows researchers to determine the relative importance of predictors in logistic regression models. Criteria for choosing logistic regression R[superscript 2] analogues were determined and measures were selected that can be used to perform dominance analysis in logistic regression. A…

  12. FT-IR imaging for quantitative determination of liver fat content in non-alcoholic fatty liver.

    PubMed

    Kochan, K; Maslak, E; Chlopicki, S; Baranska, M

    2015-08-07

    In this work we apply FT-IR imaging of large areas of liver tissue cross-section samples (∼5 cm × 5 cm) for quantitative assessment of steatosis in a murine model of Non-Alcoholic Fatty Liver (NAFLD). We quantified the area of liver tissue occupied by lipid droplets (LDs) by FT-IR imaging and Oil Red O (ORO) staining for comparison. Two alternative FT-IR based approaches are presented. The first, straightforward method was based on average spectra from tissues and provided values of the fat content by using a PLS regression model and the reference method. The second, chemometric-based method enabled us to determine the fat content independently of the reference method by means of k-means cluster (KMC) analysis. In summary, FT-IR images of large-size liver sections may prove useful for quantifying liver steatosis without the need for tissue staining.
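
    The first, PLS-based approach can be sketched as a calibration of fat content from average spectra against the reference measurement. The arrays, file names and number of latent variables below are illustrative assumptions.

    ```python
    # Hedged sketch: PLS regression calibration of fat content from average
    # FT-IR spectra against a reference measurement (e.g., ORO-stained area).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # X: (n_sections, n_wavenumbers) average spectra; y: reference fat content (%)
    X = np.load("average_spectra.npy")        # hypothetical files
    y = np.load("reference_fat_content.npy")

    pls = PLSRegression(n_components=5)
    y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
    r2 = 1 - ((y - y_cv) ** 2).sum() / ((y - y.mean()) ** 2).sum()
    print("Cross-validated R^2:", round(r2, 3))
    ```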

  13. Neuropsychological impairments predict the clinical course in schizophrenia.

    PubMed

    Wölwer, Wolfgang; Brinkmeyer, Jürgen; Riesbeck, Mathias; Freimüller, Lena; Klimke, Ansgar; Wagner, Michael; Möller, Hans-Jürgen; Klingberg, Stefan; Gaebel, Wolfgang

    2008-11-01

    To address the open question of whether cognitive impairments predict clinical outcome in schizophrenia, a sample of 125 first-episode patients was assessed at the onset of, and over one year of, controlled long-term treatment within a study of the German Research Network on Schizophrenia. No relapse according to predefined criteria occurred within the first year, but a total of 29 patients fulfilled post-hoc criteria of "clinical deterioration". Impairments in cognitive functioning assessed by the Trail-Making Test B at the onset of long-term treatment differentiated between patients with vs. without later clinical deterioration and proved to be a significant predictor of the clinical course in a regression analysis, outperforming initial clinical status as a predictor. However, low sensitivity (72%) and specificity (51%) limit possibilities of a transfer to individual predictions. As a linear combination of neuropsychological and psychopathological variables obtained the highest predictive validity, such a combination may improve the prediction of the course of schizophrenic disorders and may ultimately lead to more efficient and comprehensive treatment planning.

  14. Sensitive analytical method for simultaneous analysis of some vasoconstrictors with highly overlapped analytical signals

    NASA Astrophysics Data System (ADS)

    Nikolić, G. S.; Žerajić, S.; Cakić, M.

    2011-10-01

    Multivariate calibration is a powerful mathematical tool that can be applied in analytical chemistry when the analytical signals are highly overlapped. A method based on partial least squares regression is proposed for the simultaneous spectrophotometric determination of adrenergic vasoconstrictors in a decongestive solution containing two active components: phenylephrine hydrochloride and trimazoline hydrochloride. These sympathomimetic agents are frequently associated in pharmaceutical formulations against the common cold. The proposed method, which is simple and rapid, offers the advantages of sensitivity and a wide range of determinations without the need for extraction of the vasoconstrictors. In order to minimize the number of factors necessary to obtain the calibration matrix by multivariate calibration, different parameters were evaluated. Adequate selection of the spectral regions proved to be important for the number of factors. In order to simultaneously quantify both hydrochlorides among excipients, the spectral region between 250 and 290 nm was selected. Recovery of the vasoconstrictors was 98-101%. The developed method was applied to the assay of two decongestive pharmaceutical preparations.

  15. Depression and Identity: Are Self-Constructions Negative or Conflictual?

    PubMed Central

    Montesano, Adrián; Feixas, Guillem; Caspar, Franz; Winter, David

    2017-01-01

    Negative self-views have proved to be a consistent marker of vulnerability for depression. However, recent research has shown that a particular kind of cognitive conflict, the implicative dilemma, is highly prevalent in depression. In this study, the relevance of these conflicts is assessed as compared to the cognitive model of depression of a negative view of the self. In so doing, 161 patients with major depression and 110 controls were assessed to explore negative self-construing (self-ideal discrepancy) and conflicts (implicative dilemmas), as well as severity of symptoms. Results showed specificity for the clinical group indicating a pattern of mixed positive and negative self-descriptions with a high rate of conflict. Regression analysis lent support to the conflict hypothesis in relation to clinically relevant indicators such as symptom severity and global functioning. However, self-ideal discrepancy was a stronger predictor of group membership. The findings showed the relevance of cognitive conflicts to complement the well-consolidated theory of negative self-views. Clinical implications for designing interventions are discussed. PMID:28611716

  16. Prevalence of Metabolic Syndrome according to Sasang Constitutional Medicine in Korean Subjects

    PubMed Central

    Song, Kwang Hoon; Yu, Sung-Gon; Kim, Jong Yeol

    2012-01-01

    Metabolic syndrome (MS) is a complex disorder defined by a cluster of abdominal obesity, atherogenic dyslipidemia, hyperglycemia, and hypertension; the condition is recognized as a risk factor for diabetes and cardiovascular disease. This study assessed the effects of the Sasang constitution group (SCG) on the risk of MS in Korean subjects. We have analyzed 1,617 outpatients of Korean oriental medicine hospitals who were classified into three SCGs, So-Yang, So-Eum, and Tae-Eum. Significant differences were noted in the prevalence of MS and the frequencies of all MS risk factors among the three SCGs. The odds ratios for MS as determined via multiple logistic regression analysis were 2.004 for So-Yang and 4.521 for Tae-Eum compared with So-Eum. These results indicate that SCG may function as a significant risk factor of MS; comprehensive knowledge of Sasang constitutional medicine may prove helpful in predicting susceptibility and developing preventive care techniques for MS. PMID:22454673
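
    Odds ratios of this kind are the exponentiated coefficients of a multiple logistic regression with So-Eum as the reference group. The sketch below illustrates the computation with hypothetical column names, not the study's actual covariate set.

    ```python
    # Hedged sketch: odds ratios for metabolic syndrome by constitution group
    # from a multiple logistic regression (So-Eum as reference). Illustrative data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("sasang_ms.csv")   # hypothetical file
    df["scg"] = pd.Categorical(df["scg"],
                               categories=["So-Eum", "So-Yang", "Tae-Eum"])

    model = smf.logit("ms ~ C(scg) + age + sex", data=df).fit()
    odds_ratios = np.exp(model.params)
    print(odds_ratios)   # C(scg)[T.So-Yang] and C(scg)[T.Tae-Eum] vs So-Eum
    ```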

  17. Prevalence of Metabolic Syndrome according to Sasang Constitutional Medicine in Korean Subjects.

    PubMed

    Song, Kwang Hoon; Yu, Sung-Gon; Kim, Jong Yeol

    2012-01-01

    Metabolic syndrome (MS) is a complex disorder defined by a cluster of abdominal obesity, atherogenic dyslipidemia, hyperglycemia, and hypertension; the condition is recognized as a risk factor for diabetes and cardiovascular disease. This study assessed the effects of the Sasang constitution group (SCG) on the risk of MS in Korean subjects. We have analyzed 1,617 outpatients of Korean oriental medicine hospitals who were classified into three SCGs, So-Yang, So-Eum, and Tae-Eum. Significant differences were noted in the prevalence of MS and the frequencies of all MS risk factors among the three SCGs. The odds ratios for MS as determined via multiple logistic regression analysis were 2.004 for So-Yang and 4.521 for Tae-Eum compared with So-Eum. These results indicate that SCG may function as a significant risk factor of MS; comprehensive knowledge of Sasang constitutional medicine may prove helpful in predicting susceptibility and developing preventive care techniques for MS.

  18. [What motivates smoking and alcohol drinking of young people? A behavioural epidemiologic study].

    PubMed

    Pikó, Bettina; Varga, Szabolcs

    2014-01-19

    Adolescence is a life period in which harmful habits are often first tried, so mapping young people's motivations is helpful for prevention. The main goal of the present study was to investigate high school students' motivations related to alcohol and cigarette use. A questionnaire survey was performed in Debrecen including students from four high schools (n = 501; age range, 14 to 22 years; mean age, 16.4 years; 34% boys and 66% girls). Beyond descriptive statistics, logistic regression analysis was used to estimate odds ratios explaining relationships between substance use and motivations. Besides a slight gender difference, there were significant differences by substance-user status in the structure of motivations. In the case of alcohol use, social motivation proved to be a predictor. In the case of cigarette smoking, besides social motivation, boredom relief and affect regulation (coping) were also significant. These data suggest that young people start to smoke cigarettes and drink alcohol in social situations due to peer pressure. Therefore, prevention strategies should be built on social skills training.

  19. Depression and Identity: Are Self-Constructions Negative or Conflictual?

    PubMed

    Montesano, Adrián; Feixas, Guillem; Caspar, Franz; Winter, David

    2017-01-01

    Negative self-views have proved to be a consistent marker of vulnerability for depression. However, recent research has shown that a particular kind of cognitive conflict, the implicative dilemma, is highly prevalent in depression. In this study, the relevance of these conflicts is assessed in comparison with the cognitive model of depression centered on a negative view of the self. To that end, 161 patients with major depression and 110 controls were assessed to explore negative self-construing (self-ideal discrepancy) and conflicts (implicative dilemmas), as well as severity of symptoms. Results showed specificity for the clinical group, indicating a pattern of mixed positive and negative self-descriptions with a high rate of conflict. Regression analysis lent support to the conflict hypothesis in relation to clinically relevant indicators such as symptom severity and global functioning. However, self-ideal discrepancy was a stronger predictor of group membership. The findings showed the relevance of cognitive conflicts as a complement to the well-consolidated theory of negative self-views. Clinical implications for designing interventions are discussed.

  20. Development and Validation of the ADAS Scale and Prediction of Attitudes Toward Affective-Sexual Diversity Among Spanish Secondary Students.

    PubMed

    Garrido-Hernansaiz, Helena; Martín-Fernández, Manuel; Castaño-Torrijos, Aida; Cuevas, Isabel

    2018-01-01

    Violence against non-heterosexual adolescents in educational contexts remains a worrying reality, but no adequate measure of attitudes toward affective-sexual diversity (AtASD) exists for Spanish adolescent students. We developed a 27-item scale including cognitive, affective, and behavioral aspects, which was completed by 696 secondary school students from the Madrid area. Factor analyses suggested a unidimensional model, Cronbach's alpha indicated excellent scale score reliability, and item calibration under the item response theory framework showed that the scale is especially informative for homophobic attitudes. A hierarchical multiple regression analysis showed that variables traditionally related to AtASD (gender, age, religion, nationality, perceived parental/peer attitudes, direct contact with LGB people) were also related in our sample. Moreover, interest in sexuality topics and the center's perceived efforts to provide AtASD education were related to better AtASD. Our scale was reliable and valid, and it may also prove useful in efforts to detect those students with homophobic attitudes and to guide interventions.

  1. Socioeconomic Status Index to Interpret Inequalities in Child Development

    PubMed Central

    AHMADI DOULABI, Mahbobeh; SAJEDI, Firoozeh; VAMEGHI, Roshanak; MAZAHERI, Mohammad Ali; AKBARZADEH BAGHBAN, Alireza

    2017-01-01

    Objective There have been contradictory findings on the relationship between socioeconomic status (SES) and child development, although SES is associated with child development outcomes. The present study intended to define the relationship between SES and child development in Tehran kindergartens, Iran. Materials & Methods This cross-sectional survey studied 1036 children aged 36-60 months in different kindergartens in Tehran City, Iran, in 2014-2015. The principal factor analysis (PFA) model was employed to construct SES indices. The constructed SES variable was employed as an independent variable in a logistic regression model to evaluate its role in developmental delay as the dependent variable. Results The relationship between SES and developmental delay was significant at P=0.003. SES proved to have a significant (P<0.05) impact on developmental delay, both as an independent variable and after controlling for risk factors. Conclusion There should be more emphasis on developmental monitoring and appropriate intervention programs for children to give them a higher chance of a more productive life. PMID:28698723

  2. Bibliometric approach of factors affecting scientific productivity in environmental sciences and ecology.

    PubMed

    Dragos, Cristian Mihai; Dragos, Simona Laura

    2013-04-01

    Different academic bibliometric studies have measured the influence of economic, political and linguistic factors on the academic output of countries. Separate analyses in different fields can reveal specific incentive factors. Our study proves that the Environmental Performance Index, computed by Yale University, is highly significant (p<0.01) for the productivity of research and development activities in environmental sciences and ecology. Control variables such as education financing, the publication of domestic ISI Thomson journals, and the English language are also significant. The methodology uses Ordinary Least Squares multiple regression, with convincing results (R² = 0.752). The relative positions of the 92 countries in the sample are also discussed. We draw up a ranking of the countries' concern for the environment, weighting scientific productivity and environmental quality evenly. We note large differences in the number of inhabitants and population income between the countries that dominate the classification and those occupying the last positions. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Evaluation of multiband, multitemporal, and transformed LANDSAT MSS data for land cover area estimation. [North Central Missouri

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; May, G. A.; Kalcic, M. T. (Principal Investigator)

    1981-01-01

    Sample segments of ground-verified land cover data collected in conjunction with the USDA/ESS June Enumerative Survey were merged with LANDSAT data and served as a focus for unsupervised spectral class development and accuracy assessment. Multitemporal data sets were created from single-date LANDSAT MSS acquisitions from a nominal scene covering an eleven-county area in north central Missouri. Classification accuracies for the four land cover types predominant in the test site showed significant improvement in going from unitemporal to multitemporal data sets. Transformed LANDSAT data sets did not significantly improve classification accuracies. Regression estimators yielded mixed results for different land covers. Misregistration of two LANDSAT data sets by as much as one and one-half pixels did not significantly alter overall classification accuracies. Existing algorithms for scene-to-scene overlay proved adequate for multitemporal data analysis as long as statistical class development and accuracy assessment were restricted to field-interior pixels.

  4. Multivariate analysis of DSC-XRD simultaneous measurement data: a study of multistage crystalline structure changes in a linear poly(ethylene imine) thin film.

    PubMed

    Kakuda, Hiroyuki; Okada, Tetsuo; Otsuka, Makoto; Katsumoto, Yukiteru; Hasegawa, Takeshi

    2009-01-01

    A multivariate analytical technique has been applied to the analysis of simultaneous measurement data from differential scanning calorimetry (DSC) and X-ray diffraction (XRD) in order to study thermal changes in the crystalline structure of a linear poly(ethylene imine) (LPEI) film. A large number of XRD patterns generated from the simultaneous measurements were subjected to an augmented alternating least-squares (ALS) regression analysis, and the XRD patterns were readily decomposed into chemically independent XRD patterns, with their thermal profiles obtained at the same time. The decomposed XRD patterns and the profiles were useful for interpreting the minute peaks in the DSC. The analytical results revealed the following polymorphic changes in detail: an LPEI film prepared by casting an aqueous solution was composed of sesquihydrate and hemihydrate crystals. The sesquihydrate crystal was lost at an early stage of heating, and the film changed into an amorphous state. Once the sesquihydrate was lost by heating, it was not recovered even when the film was cooled back to room temperature. When the sample was heated again, structural changes were found between the hemihydrate and the amorphous components. In this manner, the simultaneous DSC-XRD measurements combined with ALS analysis proved to be powerful for obtaining a better understanding of the thermally induced changes of the crystalline structure in a polymer film.
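
    The core of an ALS decomposition can be sketched in a few lines: the data matrix of patterns is factored into non-negative component patterns and their thermal profiles by alternating two least-squares updates. The sketch below is a generic, non-negativity-constrained ALS on synthetic data, not the augmented procedure the authors used; the component count, initialization, and constraints are assumptions.

```python
# Minimal non-negative alternating least-squares (ALS): D (temperatures x channels) ~= C @ S.
import numpy as np

def als_decompose(D, n_components=2, n_iter=200, seed=0):
    C = np.abs(np.random.default_rng(seed).standard_normal((D.shape[0], n_components)))
    for _ in range(n_iter):
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0.0, None)        # pure patterns
        C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0.0, None)  # thermal profiles
    return C, S

# Synthetic example: two overlapping "diffraction" peaks whose weights trade off with temperature.
rng = np.random.default_rng(1)
channels = np.linspace(0.0, 1.0, 300)
p1 = np.exp(-0.5 * ((channels - 0.3) / 0.03) ** 2)
p2 = np.exp(-0.5 * ((channels - 0.6) / 0.05) ** 2)
t = np.linspace(0.0, 1.0, 80)
true_C = np.column_stack([1 - t, t])            # one phase disappears while the other grows
D = true_C @ np.vstack([p1, p2]) + 0.01 * rng.standard_normal((80, 300))

C_hat, S_hat = als_decompose(D)
print("relative reconstruction error:",
      round(np.linalg.norm(D - C_hat @ S_hat) / np.linalg.norm(D), 4))
```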

  5. An investigation on thermal patterns in Iran based on spatial autocorrelation

    NASA Astrophysics Data System (ADS)

    Fallah Ghalhari, Gholamabbas; Dadashi Roudbari, Abbasali

    2018-02-01

    The present study aimed at investigating temporal-spatial and monthly patterns of temperature in Iran using newer spatial statistical methods such as cluster and outlier analysis and hotspot analysis. To do so, the climatic parameter of monthly average temperature from 122 synoptic stations was assessed. Statistical analysis showed that January, with 120.75%, had the most fluctuation among the studied months. Global Moran's Index revealed that yearly changes of temperature in Iran followed a strongly spatially clustered pattern. Findings showed that the biggest thermal cluster pattern in Iran, 0.975388, occurred in May. Cluster and outlier analyses showed that thermal homogeneity in Iran decreases in cold months, while it increases in warm months. This is due to the radiation angle and the synoptic systems which strongly influence the thermal order in Iran. Elevation, however, plays the most notable part, as shown by a geographically weighted regression model. Iran's thermal analysis through hotspot analysis showed that hot thermal patterns (very hot, hot, and semi-hot) were dominant in the south, covering 33.5% of the area (about 552,145.3 km2). Regions such as foothills and lowlands lack any significant spatial autocorrelation (25.2%, covering about 415,345.1 km2). The last is the cold thermal area (very cold, cold, and semi-cold), with about 25.2%, covering about 552,145.3 km2 of the whole area of Iran.
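
    Global Moran's I, the clustering statistic cited above, can be computed directly from station values and a spatial weight matrix. The sketch below uses row-standardized k-nearest-neighbour weights on synthetic station coordinates and temperatures; the weighting scheme and data are illustrative assumptions, not the study's configuration.

```python
# Global Moran's I for station temperatures with k-nearest-neighbour weights (synthetic data).
import numpy as np

def morans_i(values, coords, k=8):
    x = values - values.mean()
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(d[i])[:k]] = 1.0          # binary k-nearest-neighbour weights
    W /= W.sum(axis=1, keepdims=True)              # row-standardize
    return (n / W.sum()) * (W * np.outer(x, x)).sum() / (x @ x)

rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(122, 2))                     # 122 synthetic "stations"
temps = 20 - 1.2 * coords[:, 1] + rng.normal(0, 0.5, 122)      # smooth gradient -> clustering
print("Moran's I:", round(morans_i(temps, coords), 3))          # values near +1 indicate clustering
```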

  6. Evaluation of Oil-Palm Fungal Disease Infestation with Canopy Hyperspectral Reflectance Data

    PubMed Central

    Lelong, Camille C. D.; Roger, Jean-Michel; Brégand, Simon; Dubertret, Fabrice; Lanore, Mathieu; Sitorus, Nurul A.; Raharjo, Doni A.; Caliman, Jean-Pierre

    2010-01-01

    Fungal disease detection in perennial crops is a major issue in estate management and production. However, such diagnoses are currently slow and difficult when based only on visual symptom observation, and very expensive and damaging when based on chemical analysis of root or stem tissue. As an alternative, we propose in this study to evaluate the potential of hyperspectral reflectance data to help detect the disease efficiently without destruction of tissues. This study focuses on the calibration of a statistical model for discriminating between several stages of Ganoderma attack on oil palm trees, based on field hyperspectral measurements at tree scale. The field protocol and measurements are first described. Then, combinations of pre-processing, partial least squares regression and linear discriminant analysis are tested on about one hundred samples to prove the efficiency of canopy reflectance in providing information about the plant's sanitary status. A robust algorithm is thus derived, allowing oil palms to be classified into a 4-level typology based on disease severity, from healthy to critically sick stages, with a global performance close to 94%. Moreover, this model discriminates sick from healthy trees with a confidence level of almost 98%. Applications and further improvements of this experiment are finally discussed. PMID:22315565

  7. Detection of heavy metal Cd in polluted fresh leafy vegetables by laser-induced breakdown spectroscopy.

    PubMed

    Yao, Mingyin; Yang, Hui; Huang, Lin; Chen, Tianbing; Rao, Gangfu; Liu, Muhua

    2017-05-10

    In seeking a novel method capable of green analysis for monitoring toxic heavy metal residues in fresh leafy vegetables, laser-induced breakdown spectroscopy (LIBS) was applied to prove its capability for this task. The spectra of fresh vegetable samples polluted in the lab were collected with an optimized LIBS experimental setup, and the reference concentrations of cadmium (Cd) in the samples were obtained by conventional atomic absorption spectroscopy after wet digestion. Direct calibration using the intensity of a single Cd line against Cd concentration exposed the weakness of this calibration method. The accuracy of linear calibration could be improved slightly by using three Cd lines as characteristic variables, especially after the spectra were pretreated, but this was still insufficient for predicting Cd in the samples. Therefore, partial least-squares regression (PLSR) was utilized to enhance the robustness of the quantitative analysis. The results of the PLSR model showed that the prediction accuracy for the Cd target can meet the requirements of determination in food safety. This investigation demonstrated that LIBS is a promising and emerging method for analyzing toxic compositions in agricultural products, especially when combined with suitable chemometrics.

  8. Comparative evaluation of the powder and compression properties of various grades and brands of microcrystalline cellulose by multivariate methods.

    PubMed

    Haware, Rahul V; Bauer-Brandl, Annette; Tho, Ingunn

    2010-01-01

    The present work challenges a newly developed approach to tablet formulation development by using chemically identical materials (grades and brands of microcrystalline cellulose). Tablet properties with respect to process and formulation parameters (e.g. compression speed, added lubricant and Emcompress fractions) were evaluated by 2³-factorial designs. Tablets of constant true volume were prepared on a compaction simulator at constant pressure (approx. 100 MPa). The highly repeatable and accurate force-displacement data obtained were evaluated by the simple 'in-die' Heckel method and work descriptors. Relationships and interactions between formulation, process and tablet parameters were identified and quantified by multivariate analysis techniques: principal component analysis (PCA) and partial least squares regression (PLS). The method proved to be able to distinguish between different grades of MCC and even between two different brands of the same grade (Avicel PH 101 and Vivapur 101). One example of an interaction was studied in more detail by a mixed-level design: the interaction effect of lubricant and Emcompress on the elastic recovery of Avicel PH 102 was demonstrated to be complex and non-linear using the development tool under investigation.

  9. Contact between the acetabulum and dome of a Kerboull-type plate influences the stress on the plate and screw.

    PubMed

    Hara, Katsutoshi; Kaku, Nobuhiro; Tabata, Tomonori; Tsumura, Hiroshi

    2015-07-01

    We used a three-dimensional finite element method to investigate the contact conditions behind the dome of a Kerboull-type (KT) plate. The KT plate dome was divided into five areas, and 14 models were created to examine different conditions of dome contact with the acetabulum. The maximum stress on the KT plate and screws was estimated for each model. Furthermore, to investigate the impact of the contact area with the acetabulum on the KT plate, a multiple regression analysis was conducted using the analysis results. The dome-acetabulum contact area affected the maximum equivalent stress on the KT plate; good contact with two specific areas of the vertical and horizontal beams (Areas 3 and 5) reduced the maximum equivalent stress. The maximum equivalent stress on the hook increased when the hardness of the bone representing the acetabulum varied. Thus, we confirmed the technical importance of providing the plate with a broad area of appropriate support from bone and cement in the posterior portion of the dome, and also proved the importance of supporting the area of the plate in the direction of the load at the center of the cross-plate and near the hook.

  10. Predictive models of safety based on audit findings: Part 2: Measurement of model validity.

    PubMed

    Hsiao, Yu-Lin; Drury, Colin; Wu, Changxu; Paquet, Victor

    2013-07-01

    Part 1 of this study sequence developed a human factors/ergonomics (HF/E) based classification system (termed HFACS-MA) for safety audit findings and proved its measurement reliability. In Part 2, we used the human error categories of HFACS-MA as predictors of future safety performance. Audit records and monthly safety incident reports from two airlines, submitted to their regulatory authority, were available for analysis, covering over 6.5 years. Two participants derived consensus HF/E error classifications from the audit reports using HFACS-MA. We adopted Neural Network and Poisson regression methods to establish nonlinear and linear prediction models, respectively. These models were tested for validity in predicting the safety data, and only the Neural Network method resulted in substantial, significant predictive ability for each airline. Alternative predictions based on simple counts of audit findings and on the time sequence of the safety data produced some significant results, but of much smaller magnitude than HFACS-MA. The use of HF/E analysis of audit findings provided proactive predictors of future safety performance in the aviation maintenance field. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
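
    The linear half of this modelling setup, a Poisson regression of monthly incident counts on audit-finding counts, can be sketched with a generalized linear model. The category names and all data in the example below are simulated stand-ins, and the Neural Network counterpart is omitted.

```python
# Poisson regression of monthly incident counts on counts of audit findings (simulated data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
months = 78                                           # roughly 6.5 years of monthly reports
X = pd.DataFrame({
    "unsafe_acts": rng.poisson(5, months),            # hypothetical HF/E error-category counts
    "preconditions": rng.poisson(3, months),
    "supervision": rng.poisson(2, months),
})
rate = np.exp(-0.5 + 0.15 * X["unsafe_acts"] + 0.10 * X["preconditions"])
incidents = rng.poisson(rate)                         # simulated monthly incident counts

Xc = sm.add_constant(X)
model = sm.GLM(incidents, Xc, family=sm.families.Poisson()).fit()
print(model.summary())
last_month = np.asarray(model.predict(Xc.iloc[[-1]]))[0]
print("fitted incident rate for the last month:", round(float(last_month), 2))
```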

  11. Human predisposition to cognitive impairment and its relation with environmental exposure to potentially toxic elements.

    PubMed

    Cabral Pinto, Marina M S; Marinho-Reis, A Paula; Almeida, Agostinho; Ordens, Carlos M; Silva, Maria M V G; Freitas, Sandra; Simões, Mário R; Moreira, Paula I; Dinis, Pedro A; Diniz, M Luísa; Ferreira da Silva, Eduardo A; Condesso de Melo, M Teresa

    2017-03-09

    New lines of evidence suggest that less than 10% of neurodegenerative diseases have a strictly genetic aetiology, and other factors may be prevalent. Environmental exposure to potentially toxic elements (PTEs) appears to be a risk factor for Parkinson's disease, Alzheimer's disease and sclerosis. This study proposes a multidisciplinary approach combining neurosciences, psychology and environmental sciences while integrating socio-economic, neuropsychological, environmental and health data. We present the preliminary results of a neuropsychological assessment carried out in elderly residents of the industrial city of Estarreja. A battery of cognitive tests and a personal questionnaire were administered to the participants. Multivariate analysis and multiple linear regression analysis were used to identify potential relationships between the cognitive status of the participants and environmental exposure to potentially toxic elements. The results suggest a relationship between urinary PTE levels and the incidence of cognitive disorders. They also point towards water consumption habits and profession as relevant factors of exposure. Linear regression models show that aluminium (R² = 38%), cadmium (R² = 11%) and zinc (R² = 6%) are good predictors of the scores of the Mini-Mental State Examination cognitive test. Median contents (µg/l) in groundwater are above admissible levels for drinking water for aluminium (371), iron (860), manganese (250), and zinc (305). While the World Health Organization does not provide health-based reference values for aluminium, the results obtained from this study suggest that it may have an important role in the cognitive status of the elderly. Urine proved to be a suitable biomarker of exposure both to elements with low and high excretion rates.

  12. Risk factors of falls among elderly living in Urban Suez - Egypt

    PubMed Central

    Kamel, Mohammed Hany; Abdulmajeed, Abdulmajeed Ahmed; Ismail, Sally El-Sayed

    2013-01-01

    Introduction Falling is one of the most common geriatric syndromes threatening the independence of older persons. Falls result from a complex and interactive mix of biological or medical, behavioral and environmental factors, many of which are preventable. Studying these diverse risk factors would aid their early detection and management at the primary care level. Methods This cross-sectional study of the risk factors for falls was conducted on 340 elders in Urban Suez; these were all patients over 60 who attended two family practice centers in Urban Suez. Results When asked about falling during the past 12 months, 205 elders recalled at least one incident of falling. Of them, 36% had their falls outdoors, and 24% mentioned that stairs were the most prevalent site for indoor falls. Falls were also reported more among dependent than independent elderly. Using univariate regression analysis, almost all tested risk factors were significantly associated with falls in the studied population. These risk factors include living alone, having chronic diseases, using medications, having a physical deficit, being inactive, and having a high nutritional risk. However, the multivariate regression analysis proved that the strongest risk factors are a low level of physical activity (OR 0.6, P value 0.03), using a cane or walker (OR 1.69, P value 0.001) and impairment of activities of daily living (OR 1.7, P value 0.001). Conclusion Although falling is a serious problem among the elderly, with many consequences, it has many preventable risk factors. Health care providers should advise people to remain active, and more research is needed in this important area of Family Practice. PMID:23504298

  13. Relation Between Burnout Syndrome and Job Satisfaction Among Mental Health Workers

    PubMed Central

    Ogresta, Jelena; Rusac, Silvia; Zorec, Lea

    2008-01-01

    Aim To identify predictors of burnout syndrome, such as job satisfaction and manifestations of occupational stress, in mental health workers. Method The study included a snowball sample of 174 mental health workers in Croatia. The following measurement instruments were used: Maslach Burnout Inventory, Manifestations of Occupational Stress Survey, and Job Satisfaction Survey. We correlated dimensions of burnout syndrome with job satisfaction and manifestations of occupational stress dimensions. We also performed multiple regression analysis using three dimensions of burnout syndrome – emotional exhaustion, depersonalization, and personal accomplishment. Results Stepwise multiple regression analysis showed that pay and rewards satisfaction (β = -0.37), work climate (β = -0.18), advancement opportunities (β = 0.17), the degree of psychological (β = 0.41), and physical manifestations of occupational stress (β = 0.29) were significant predictors of emotional exhaustion (R = 0.76; F = 30.02; P<0.001). The frequency of negative emotional and behavioral reactions toward patients and colleagues (β = 0.48), psychological (β = 0.27) and physical manifestations of occupational stress (β = 0.24), and pay and rewards satisfaction (β = 0.22) were significant predictors of depersonalization (R = 0.57; F = 13.01; P<0.001). Satisfaction with the work climate (β = -0.20) was a significant predictor of lower levels of personal accomplishment (R = 0.20; F = 5.06; P<0.005). Conclusion Mental health workers exhibited a moderate degree of burnout syndrome, but there were no significant differences regarding their occupation. Generally, both dimensions of job satisfaction and manifestations of occupational stress proved to be relevant predictors of burnout syndrome. PMID:18581615

  14. How useful is determination of anti-factor Xa activity to guide bridging therapy with enoxaparin? A pilot study.

    PubMed

    Hammerstingl, Christoph; Omran, Heyder; Tripp, Christian; Poetzsch, Bernd

    2009-02-01

    Low-molecular-weight heparins (LMWH) are commonly used as peri-procedural bridging anticoagulants. The usefulness of measuring anti-factor Xa activity (anti-Xa) to guide bridging therapy with LMWH is unknown. The objective of this study was to determine levels of anti-Xa during standard bridging therapy with enoxaparin, and to examine predictors for residual anti-Xa. Consecutive patients receiving enoxaparin at a dosage of 1 mg/kg body weight/12 hours for temporary interruption of phenprocoumon were prospectively enrolled in the study. Blood samples were obtained 14 hours after LMWH application, immediately pre-procedurally. Procedural details and clinical and demographic data were collected and subsequently analyzed. Seventy patients were included (age 75.2 +/- 10.8 years, CrCl 55.7 +/- 21.7 ml/min, body mass index [BMI] 27.1 +/- 4.9). LMWH therapy was given for a mean of 4.2 +/- 1.6 days; overall anti-Xa was 0.58 +/- 0.32 U/ml. In 37 (52.8%) of patients anti-Xa was > or U/ml, including 10 (14.3%) patients with anti-Xa > 1 U/ml. Linear regression analysis of single variables and logistic multivariable regression analysis failed to prove a correlation between anti-Xa and single or combined factors. No major bleeding, no thromboembolism and four (5.7%) minor haemorrhages were observed. When bridging oral anticoagulation with therapeutic doses of enoxaparin, a high percentage of patients undergo interventions with high residual anti-Xa. The levels of anti-Xa vary widely and are independent of single or combined clinical variables. Since the anti-Xa-related outcome of patients receiving bridging therapy with LMWH has not been investigated, no firm recommendation on the usefulness of monitoring anti-Xa can be given at this stage.

  15. Risk factors of falls among elderly living in urban Suez--Egypt.

    PubMed

    Kamel, Mohammed Hany; Abdulmajeed, Abdulmajeed Ahmed; Ismail, Sally El-Sayed

    2013-01-01

    Falling is one of the most common geriatric syndromes threatening the independence of older persons. Falls result from a complex and interactive mix of biological or medical, behavioral and environmental factors, many of which are preventable. Studying these diverse risk factors would aid their early detection and management at the primary care level. This cross-sectional study of the risk factors for falls was conducted on 340 elders in Urban Suez; these were all patients over 60 who attended two family practice centers in Urban Suez. When asked about falling during the past 12 months, 205 elders recalled at least one incident of falling. Of them, 36% had their falls outdoors, and 24% mentioned that stairs were the most prevalent site for indoor falls. Falls were also reported more among dependent than independent elderly. Using univariate regression analysis, almost all tested risk factors were significantly associated with falls in the studied population. These risk factors include living alone, having chronic diseases, using medications, having a physical deficit, being inactive, and having a high nutritional risk. However, the multivariate regression analysis proved that the strongest risk factors are a low level of physical activity (OR 0.6, P value 0.03), using a cane or walker (OR 1.69, P value 0.001) and impairment of activities of daily living (OR 1.7, P value 0.001). Although falling is a serious problem among the elderly, with many consequences, it has many preventable risk factors. Health care providers should advise people to remain active, and more research is needed in this important area of Family Practice.

  16. Elucidation of chemosensitization effect of acridones in cancer cell lines: Combined pharmacophore modeling, 3D QSAR, and molecular dynamics studies.

    PubMed

    Gade, Deepak Reddy; Makkapati, Amareswararao; Yarlagadda, Rajesh Babu; Peters, Godefridus J; Sastry, B S; Rajendra Prasad, V V S

    2018-06-01

    Overexpression of P-glycoprotein (P-gp) leads to the emergence of multidrug resistance (MDR) in cancer treatment. Acridones have the potential to reverse MDR and sensitize cells. In the present study, we aimed to elucidate the chemosensitization potential of acridones by employing various molecular modelling techniques. Pharmacophore modeling was performed for a dataset of chemosensitizing acridones previously shown to have cytotoxic activity against the MCF7 breast cancer cell line. Gaussian-based QSAR studies were also performed to predict the favored and disfavored regions of the acridone molecules. Molecular dynamics simulations were performed for compound 10 and human P-glycoprotein (obtained from homology modeling). An efficient pharmacophore containing 2 hydrogen bond acceptors and 3 aromatic rings (AARRR.14) was identified. The NCI 2012 chemical database was screened against the AARRR.14 CPH, and the 25 best-fit molecules were identified. Potential regions of the compound were identified through field (Gaussian)-based QSAR. Regression analysis of the atom-based QSAR resulted in r² of 0.95 and q² of 0.72, whereas regression analysis of the field-based QSAR resulted in r² of 0.92 and q² of 0.87, with r²cv of 0.71. The fate of the acridone molecule (compound 10) in the P-glycoprotein environment was analyzed by examining the conformational changes occurring during the molecular dynamics simulations. The combined data from the different in silico techniques provided a basis for a deeper understanding of the structural and mechanistic aspects of the interaction of acridones with P-glycoprotein, and a strategic basis for designing more potent molecules with anti-cancer and multidrug-resistance-reversal activities. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Severe chronic heart failure in patients considered for heart transplantation in Poland.

    PubMed

    Korewicki, Jerzy; Leszek, Przemysław; Zieliński, Tomasz; Rywik, Tomasz; Piotrowski, Walerian; Kurjata, Paweł; Kozar-Kamińska, Katarzyna; Kodziszewska, Katarzyna

    2012-01-01

    Based on the results of clinical trials, the prognosis for patients with severe heart failure (HF) has improved over the last 20 years. However, clinical trials do not reflect 'real life' due to patient selection. Thus, the aim of the POLKARD-HF registry was the analysis of survival of patients with refractory HF referred for orthotopic heart transplantation (OHT). Between 1 November 2003 and 31 October 2007, 983 patients with severe HF, referred for OHT in Poland, were included in the registry. All patients underwent routine clinical and hemodynamic evaluation, with NT-proBNP and hsCRP assessment. Death or emergency OHT was taken as the endpoint. The average observation period was 601 days. Kaplan-Meier curves with log-rank tests, together with univariate and multivariable Cox regression models using stepwise variable selection, were used to determine the predictive value of the analyzed variables. Among the 983 patients, the probability of surviving for one year was approximately 80%, for two years 70%, and for three years 67%. The etiology of the HF did not significantly influence the prognosis. Patients in NYHA class IV had a three-fold higher risk of death or emergency OHT. The univariate/multivariable Cox regression analysis revealed that NYHA class IV (HR 2.578, p < 0.0001), HFSS score (HR 2.572, p < 0.0001) and NT-proBNP plasma level (HR 1.600, p = 0.0200) proved to influence survival free of death or emergency OHT. Despite optimal treatment, the prognosis for patients with refractory HF remains poor. NYHA class IV, NT-proBNP and the HFSS score can help define the highest-risk group. The results are consistent with the prognosis of patients enrolled in randomized trials.
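
    A multivariable Cox model of this kind can be sketched with the lifelines library; the data below are simulated, and the variable coding (NYHA IV and a high-risk HFSS stratum as binary flags, log-transformed NT-proBNP) is an assumption made only to illustrate the hazard-ratio output.

```python
# Multivariable Cox regression on simulated registry-style data (lifelines).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 983
df = pd.DataFrame({
    "nyha_iv": rng.binomial(1, 0.30, n),
    "hfss_high_risk": rng.binomial(1, 0.25, n),
    "log_ntprobnp": rng.normal(8.0, 1.0, n),
})
risk = np.exp(0.9 * df["nyha_iv"] + 0.9 * df["hfss_high_risk"] + 0.4 * (df["log_ntprobnp"] - 8.0))
df["time_days"] = rng.exponential(900.0 / risk)                # time to death or emergency OHT
df["event"] = (df["time_days"] < 1200).astype(int)             # administrative censoring at ~3 years
df["time_days"] = df["time_days"].clip(upper=1200)

cph = CoxPHFitter().fit(df, duration_col="time_days", event_col="event")
print(np.exp(cph.params_).round(3))                            # hazard ratios per covariate
```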

  18. Orthotopic bladder substitution in men revisited: identification of continence predictors.

    PubMed

    Koraitim, M M; Atta, M A; Foda, M K

    2006-11-01

    We determined the impact of the functional characteristics of the neobladder and urethral sphincter on continence results, and determined the most significant predictors of continence. A total of 88 male patients 29 to 70 years old underwent orthotopic bladder substitution with tubularized ileocecal segment (40) and detubularized sigmoid (25) or ileum (23). Uroflowmetry, cystometry and urethral pressure profilometry were performed at 13 to 36 months (mean 19) postoperatively. The correlation between urinary continence and 28 urodynamic variables was assessed. Parameters that correlated significantly with continence were entered into a multivariate analysis using a logistic regression model to determine the most significant predictors of continence. Maximum urethral closure pressure was the only parameter that showed a statistically significant correlation with diurnal continence. Nocturnal continence had not only a statistically significant positive correlation with maximum urethral closure pressure, but also statistically significant negative correlations with maximum contraction amplitude, and baseline pressure at mid and maximum capacity. Three of these 4 parameters, including maximum urethral closure pressure, maximum contraction amplitude and baseline pressure at mid capacity, proved to be significant predictors of continence on multivariate analysis. While daytime continence is determined by maximum urethral closure pressure, during the night it is the net result of 2 forces that have about equal influence but in opposite directions, that is maximum urethral closure pressure vs maximum contraction amplitude plus baseline pressure at mid capacity. Two equations were derived from the logistic regression model to predict the probability of continence after orthotopic bladder substitution, including Z1 (diurnal) = 0.605 + 0.0085 maximum urethral closure pressure and Z2 (nocturnal) = 0.841 + 0.01 [maximum urethral closure pressure - (maximum contraction amplitude + baseline pressure at mid capacity)].
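
    The two reported equations are linear predictors from a logistic model, so (assuming the usual logistic link, which the abstract implies but does not state) the continence probability follows as p = e^Z / (1 + e^Z). The short sketch below evaluates both equations; the example pressures are assumed values used only for illustration.

```python
# Evaluating the reported continence equations under the usual logistic link.
import math

def logistic(z):
    return math.exp(z) / (1.0 + math.exp(z))

def z_diurnal(mucp):
    return 0.605 + 0.0085 * mucp

def z_nocturnal(mucp, max_contraction_amplitude, baseline_pressure_mid):
    return 0.841 + 0.01 * (mucp - (max_contraction_amplitude + baseline_pressure_mid))

mucp, mca, p_mid = 70.0, 30.0, 15.0                 # illustrative urodynamic values (cm H2O)
print("P(diurnal continence):  ", round(logistic(z_diurnal(mucp)), 3))
print("P(nocturnal continence):", round(logistic(z_nocturnal(mucp, mca, p_mid)), 3))
```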

  19. Spectral risk measures: the risk quadrangle and optimal approximation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kouri, Drew P.

    We develop a general risk quadrangle that gives rise to a large class of spectral risk measures. The statistic of this new risk quadrangle is the average value-at-risk at a specific confidence level. As such, this risk quadrangle generates a continuum of error measures that can be used for superquantile regression. For risk-averse optimization, we introduce an optimal approximation of spectral risk measures using quadrature. Lastly, we prove the consistency of this approximation and demonstrate our results through numerical examples.

  20. Spectral risk measures: the risk quadrangle and optimal approximation

    DOE PAGES

    Kouri, Drew P.

    2018-05-24

    We develop a general risk quadrangle that gives rise to a large class of spectral risk measures. The statistic of this new risk quadrangle is the average value-at-risk at a specific confidence level. As such, this risk quadrangle generates a continuum of error measures that can be used for superquantile regression. For risk-averse optimization, we introduce an optimal approximation of spectral risk measures using quadrature. Lastly, we prove the consistency of this approximation and demonstrate our results through numerical examples.
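
    The two building blocks named in this abstract, the average value-at-risk (superquantile) at a given confidence level and its quadrature mixture into a spectral risk measure, can be illustrated numerically. In the sketch below the loss sample, the uniform risk spectrum, and the Gauss-Legendre rule are all illustrative choices standing in for the paper's optimal construction.

```python
# Empirical average value-at-risk and a quadrature approximation of a spectral risk measure.
import numpy as np

def avar(losses, alpha):
    """Empirical average value-at-risk (superquantile) at confidence level alpha."""
    q = np.quantile(losses, alpha)
    return q + np.mean(np.maximum(losses - q, 0.0)) / (1.0 - alpha)

rng = np.random.default_rng(0)
losses = rng.lognormal(mean=0.0, sigma=0.6, size=100_000)       # synthetic loss sample

# rho(X) = integral of AVaR_alpha(X) over alpha in [0, 0.99] against a uniform spectrum,
# approximated with a 16-node Gauss-Legendre rule mapped onto [0, 0.99].
nodes, weights = np.polynomial.legendre.leggauss(16)
a, b = 0.0, 0.99
alphas = 0.5 * (nodes + 1.0) * (b - a) + a
w = 0.5 * weights                          # the 0.5*(b-a) Jacobian cancels the 1/(b-a) density

rho = sum(wi * avar(losses, ai) for wi, ai in zip(w, alphas))
print("AVaR at alpha = 0.95:", round(avar(losses, 0.95), 3))
print("spectral risk (quadrature approximation):", round(rho, 3))
```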

  1. Analyzing big data with the hybrid interval regression methods.

    PubMed

    Huang, Chia-Hui; Yang, Keng-Chieh; Kao, Han-Ying

    2014-01-01

    Big data is a current trend that is having a significant impact on information technologies. In big data applications, one of the main concerns is dealing with large-scale data sets, which often require the computation resources provided by public cloud services; how to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. The SSVM was recently proposed as an alternative to the standard SVM and has been proved more efficient than the traditional SVM in processing large-scale data. In addition, a soft margin method is proposed to modify the excursion of the separation margin and to be effective in the gray zone, where the distribution of the data becomes hard to describe and so does the separation margin between classes.

  2. Analyzing Big Data with the Hybrid Interval Regression Methods

    PubMed Central

    Kao, Han-Ying

    2014-01-01

    Big data is a current trend that is having a significant impact on information technologies. In big data applications, one of the main concerns is dealing with large-scale data sets, which often require the computation resources provided by public cloud services; how to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. The SSVM was recently proposed as an alternative to the standard SVM and has been proved more efficient than the traditional SVM in processing large-scale data. In addition, a soft margin method is proposed to modify the excursion of the separation margin and to be effective in the gray zone, where the distribution of the data becomes hard to describe and so does the separation margin between classes. PMID:25143968

  3. Applied Multiple Linear Regression: A General Research Strategy

    ERIC Educational Resources Information Center

    Smith, Brandon B.

    1969-01-01

    Illustrates some of the basic concepts and procedures for using regression analysis in experimental design, analysis of variance, analysis of covariance, and curvilinear regression. Applications to evaluation of instruction and vocational education programs are illustrated. (GR)

  4. Treatment for liver metastases from breast cancer: Results and prognostic factors

    PubMed Central

    Li, Xiao-Ping; Meng, Zhi-Qiang; Guo, Wei-Jian; Li, Jie

    2005-01-01

    AIM: Liver metastases from breast cancer (BCLM) are associated with a poor prognosis. Cytotoxic chemotherapy can result in regression of tumor lesions and a decrease in symptoms. Available data in the literature also suggest that a subgroup of patients may benefit from surgery, but few reports have addressed transcatheter arterial chemoembolization (TACE). We report the results of TACE and systemic chemotherapy for patients with liver metastases from breast cancer and evaluate the prognostic factors. METHODS: Forty-eight patients with liver metastases from proven primary breast cancer were treated with TACE or systemic chemotherapy between January 1995 and December 2000. Treatment results were assessed according to WHO criteria, along with an analysis of prognostic factors for survival using the Cox regression model. RESULTS: The median follow-up was 28 mo (1-72 mo). Response rates were 35.7% for the TACE group and 7.1% for the chemotherapy group; the difference was significant. The one-, two- and three-year survival rates for the TACE group were 63.04%, 30.35%, and 13.01%, and those for the systemic chemotherapy group were 33.88%, 11.29%, and 0%. According to univariate analysis, the variables significantly associated with survival were the lymph node status of the primary cancer, the clinical stage of the liver metastases, the Child-Pugh grade, and loss of weight. Other factors, such as age, the interval between the primary tumor and the metastases, the maximal diameter of the liver metastases, the number of liver metastases, and extrahepatic metastasis, showed no prognostic significance. The factors mentioned above (lymph node status of the primary cancer, clinical stage of the liver metastases, Child-Pugh grade, and loss of weight) were also independent factors in multivariate analysis. CONCLUSION: TACE treatment of liver metastases from breast cancer may prolong survival in certain patients. This approach offers new promise for the curative treatment of patients with metastatic breast cancer. PMID:15968739

  5. External Tank Liquid Hydrogen (LH2) Prepress Regression Analysis Independent Review Technical Consultation Report

    NASA Technical Reports Server (NTRS)

    Parsons, Vickie s.

    2009-01-01

    The request to conduct an independent review of regression models, developed for determining the expected Launch Commit Criteria (LCC) External Tank (ET)-04 cycle count for the Space Shuttle ET tanking process, was submitted to the NASA Engineering and Safety Center (NESC) on September 20, 2005. The NESC team performed an independent review of the regression models documented in Prepress Regression Analysis, Tom Clark and Angela Krenn, 10/27/05. This consultation consisted of a peer review, by statistical experts, of the proposed regression models provided in the Prepress Regression Analysis. This document is the consultation's final report.

  6. A study of machine learning regression methods for major elemental analysis of rocks using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Boucher, Thomas F.; Ozanne, Marie V.; Carmosino, Marco L.; Dyar, M. Darby; Mahadevan, Sridhar; Breves, Elly A.; Lepore, Kate H.; Clegg, Samuel M.

    2015-05-01

    The ChemCam instrument on the Mars Curiosity rover is generating thousands of LIBS spectra and bringing interest in this technique to public attention. The key to interpreting Martian or any other type of LIBS data is the calibrations that relate laboratory standards to unknowns examined in other settings and enable predictions of chemical composition. Here, LIBS spectral data are analyzed using linear regression methods including partial least squares (PLS-1 and PLS-2), principal component regression (PCR), least absolute shrinkage and selection operator (lasso), elastic net, and linear support vector regression (SVR-Lin). These were compared against results from nonlinear regression methods including kernel principal component regression (K-PCR), polynomial kernel support vector regression (SVR-Py) and k-nearest neighbor (kNN) regression to discern the most effective models for interpreting chemical abundances from LIBS spectra of geological samples. The results were evaluated for 100 samples analyzed with 50 laser pulses at each of five locations averaged together. Wilcoxon signed-rank tests were employed to evaluate the statistical significance of differences among the nine models using their predicted residual sum of squares (PRESS) to make comparisons. For MgO, SiO2, Fe2O3, CaO, and MnO, the sparse models outperform all the others except for linear SVR, while for Na2O, K2O, TiO2, and P2O5, the sparse methods produce inferior results, likely because their emission lines in this energy range have lower transition probabilities. The strong performance of the sparse methods in this study suggests that use of dimensionality-reduction techniques as a preprocessing step may improve the performance of the linear models. Nonlinear methods tend to overfit the data and predict less accurately, while the linear methods proved to be more generalizable with better predictive performance. These results are attributed to the high dimensionality of the data (6144 channels) relative to the small number of samples studied. The best-performing models were SVR-Lin for SiO2, MgO, Fe2O3, and Na2O, lasso for Al2O3, elastic net for MnO, and PLS-1 for CaO, TiO2, and K2O. Although these differences in model performance were identified, most of the models produce comparable results at p ≤ 0.05, and all techniques except kNN produced statistically indistinguishable results. It is likely that a combination of models could be used together to yield a lower total error of prediction, depending on the requirements of the user.
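
    A skeleton of this kind of model comparison, cross-validated PRESS for a few of the listed regressors on a single oxide, is sketched below. The spectra are simulated stand-ins with the same rough dimensionality (many channels, few samples), and the hyperparameters are arbitrary rather than the tuned values used in the study.

```python
# Comparing several regression methods on one oxide via cross-validated PRESS (synthetic spectra).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import Lasso, ElasticNet
from sklearn.svm import LinearSVR
from sklearn.model_selection import KFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples, n_channels = 100, 6144                      # few samples, many channels, as in the study
X = rng.normal(size=(n_samples, n_channels))
w = np.zeros(n_channels)
w[rng.choice(n_channels, 40, replace=False)] = rng.normal(size=40)   # a few informative "lines"
y = X @ w + rng.normal(scale=0.5, size=n_samples)      # proxy for one oxide's abundance

models = {
    "PLS-1":       PLSRegression(n_components=8),
    "lasso":       make_pipeline(StandardScaler(), Lasso(alpha=0.05, max_iter=5000)),
    "elastic net": make_pipeline(StandardScaler(), ElasticNet(alpha=0.05, l1_ratio=0.5, max_iter=5000)),
    "SVR-Lin":     make_pipeline(StandardScaler(), LinearSVR(C=1.0, max_iter=5000)),
}

cv = KFold(n_splits=5, shuffle=True, random_state=1)
for name, model in models.items():
    press = 0.0
    for train, test in cv.split(X):
        model.fit(X[train], y[train])
        press += np.sum((y[test] - np.ravel(model.predict(X[test]))) ** 2)
    print(f"{name:12s} PRESS = {press:.1f}")
```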

  7. Estuarine Sediment Deposition during Wetland Restoration: A GIS and Remote Sensing Modeling Approach

    NASA Technical Reports Server (NTRS)

    Newcomer, Michelle; Kuss, Amber; Kentron, Tyler; Remar, Alex; Choksi, Vivek; Skiles, J. W.

    2011-01-01

    Restoration of the industrial salt flats in the San Francisco Bay, California is an ongoing wetland rehabilitation project. Remote sensing maps of suspended sediment concentration, and other GIS predictor variables were used to model sediment deposition within these recently restored ponds. Suspended sediment concentrations were calibrated to reflectance values from Landsat TM 5 and ASTER using three statistical techniques -- linear regression, multivariate regression, and an Artificial Neural Network (ANN), to map suspended sediment concentrations. Multivariate and ANN regressions using ASTER proved to be the most accurate methods, yielding r2 values of 0.88 and 0.87, respectively. Predictor variables such as sediment grain size and tidal frequency were used in the Marsh Sedimentation (MARSED) model for predicting deposition rates for three years. MARSED results for a fully restored pond show a root mean square deviation (RMSD) of 66.8 mm (<1) between modeled and field observations. This model was further applied to a pond breached in November 2010 and indicated that the recently breached pond will reach equilibrium levels after 60 months of tidal inundation.

  8. Modeling and forecasting US presidential election using learning algorithms

    NASA Astrophysics Data System (ADS)

    Zolghadr, Mohammad; Niaki, Seyed Armin Akhavan; Niaki, S. T. A.

    2017-09-01

    The primary objective of this research is to obtain an accurate forecasting model for the US presidential election. To identify a reliable model, artificial neural networks (ANN) and support vector regression (SVR) models are compared based on some specified performance measures. Moreover, six independent variables such as GDP, unemployment rate, the president's approval rate, and others are considered in a stepwise regression to identify significant variables. The president's approval rate is identified as the most significant variable, based on which eight other variables are identified and considered in the model development. Preprocessing methods are applied to prepare the data for the learning algorithms. The proposed procedure significantly increases the accuracy of the model by 50%. The learning algorithms (ANN and SVR) proved to be superior to linear regression based on each method's calculated performance measures. The SVR model is identified as the most accurate model among the other models as this model successfully predicted the outcome of the election in the last three elections (2004, 2008, and 2012). The proposed approach significantly increases the accuracy of the forecast.

  9. Hybrid modelling based on support vector regression with genetic algorithms in forecasting the cyanotoxins presence in the Trasona reservoir (Northern Spain).

    PubMed

    García Nieto, P J; Alonso Fernández, J R; de Cos Juez, F J; Sánchez Lasheras, F; Díaz Muñiz, C

    2013-04-01

    Cyanotoxins, a class of poisonous substances produced by cyanobacteria, are responsible for health risks in drinking and recreational waters. As a result, anticipating their presence is important for preventing risks. The aim of this study is to use a hybrid approach based on support vector regression (SVR) in combination with genetic algorithms (GAs), known as a genetic algorithm support vector regression (GA-SVR) model, for forecasting the presence of cyanotoxins in the Trasona reservoir (Northern Spain). The GA-SVR approach is aimed at highly nonlinear biological problems with sharp peaks, and the tests carried out proved its high performance. Some physical-chemical parameters have been considered along with the biological ones. The results obtained are two-fold. In the first place, the significance of each biological and physical-chemical variable for the presence of cyanotoxins in the reservoir is determined with success. Finally, a predictive model able to forecast the possible presence of cyanotoxins in the short term was obtained. Copyright © 2013 Elsevier Inc. All rights reserved.
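
    The GA-SVR idea, a small genetic algorithm searching the SVR hyperparameter space against a cross-validated error, can be sketched as below. The population size, genetic operators, parameter ranges, and the synthetic "reservoir" data are all simplifying assumptions and do not reproduce the authors' implementation.

```python
# Minimal GA-SVR sketch: a tiny genetic algorithm tunes (C, gamma, epsilon) of an RBF SVR.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(size=(150, 5))                                    # stand-in predictor variables
y = np.exp(3 * X[:, 0]) * X[:, 1] + 0.1 * rng.normal(size=150)    # sharp-peaked nonlinear response

def fitness(genes):
    C, gamma, eps = np.exp(genes)                                  # genes live in log-space
    return cross_val_score(SVR(C=C, gamma=gamma, epsilon=eps), X, y,
                           cv=5, scoring="neg_mean_squared_error").mean()

pop = rng.uniform(low=[-2, -4, -5], high=[5, 2, 0], size=(20, 3))  # [log C, log gamma, log eps]
for _ in range(15):                                                # generations
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(scores)[-10:]]                          # selection: keep the best half
    children = []
    for _ in range(10):
        a, b = elite[rng.integers(10)], elite[rng.integers(10)]
        child = np.where(rng.random(3) < 0.5, a, b)                # uniform crossover
        children.append(child + rng.normal(0.0, 0.2, 3))           # Gaussian mutation
    pop = np.vstack([elite, children])

best = max(pop, key=fitness)
print("best [log C, log gamma, log eps]:", np.round(best, 2))
```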

  10. Simulating land-use changes by incorporating spatial autocorrelation and self-organization in CLUE-S modeling: a case study in Zengcheng District, Guangzhou, China

    NASA Astrophysics Data System (ADS)

    Mei, Zhixiong; Wu, Hao; Li, Shiyun

    2018-06-01

    The Conversion of Land Use and its Effects at Small regional extent (CLUE-S), which is a widely used model for land-use simulation, utilizes logistic regression to estimate the relationships between land use and its drivers, and thus predict land-use change probabilities. However, logistic regression disregards possible spatial autocorrelation and self-organization in land-use data. Autologistic regression can depict spatial autocorrelation but cannot address self-organization, while logistic regression considering only self-organization (NE-logistic regression) fails to capture spatial autocorrelation. Therefore, this study developed a regression method (NE-autologistic regression) that incorporates both spatial autocorrelation and self-organization, to improve CLUE-S. The Zengcheng District of Guangzhou, China was selected as the study area. The land-use data of 2001, 2005, and 2009, as well as 10 typical driving factors, were used to validate the proposed regression method and the improved CLUE-S model. Then, three future land-use scenarios for 2020 (the natural growth, ecological protection, and economic development scenarios) were simulated using the improved model. Validation results showed that NE-autologistic regression performed better than logistic regression, autologistic regression, and NE-logistic regression in predicting land-use change probabilities. The spatial allocation accuracy and kappa values of NE-autologistic-CLUE-S were higher than those of logistic-CLUE-S, autologistic-CLUE-S, and NE-logistic-CLUE-S for the simulations of two periods, 2001-2009 and 2005-2009, which proved that the improved CLUE-S model achieved the best simulation and was thereby effective to a certain extent. The scenario simulation results indicated that under all three scenarios, traffic land and residential/industrial land would increase, whereas arable land and unused land would decrease during 2009-2020. Apparent differences also existed in the simulated change sizes and locations of each land-use type under different scenarios. The results not only demonstrate the validity of the improved model but also provide a valuable reference for relevant policy-makers.
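
    The essential autologistic idea is to add an "autocovariate", the share of neighbouring cells already occupied by the land-use class, to the usual driver set before fitting the logistic model. The grid, the single driver, and the 8-cell neighbourhood in the sketch below are illustrative assumptions, and the self-organization (NE) component is not modelled.

```python
# Logistic vs. autologistic fit: adding a neighbourhood autocovariate to the driver set.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 100                                              # n x n raster
driver = rng.normal(size=(n, n))                     # e.g., a slope or accessibility surrogate
landuse = (rng.random((n, n)) < 1 / (1 + np.exp(-(1.5 * driver - 0.5)))).astype(int)

def autocovariate(grid):
    """Mean of the 8-neighbourhood for each cell (zero-padded at the edges)."""
    p = np.pad(grid.astype(float), 1)
    neigh = sum(np.roll(np.roll(p, i, 0), j, 1) for i in (-1, 0, 1) for j in (-1, 0, 1)) - p
    return (neigh / 8.0)[1:-1, 1:-1]

y = landuse.ravel()
X_plain = driver.reshape(-1, 1)
X_auto = np.column_stack([driver.ravel(), autocovariate(landuse).ravel()])

for name, X in [("logistic", X_plain), ("autologistic", X_auto)]:
    model = LogisticRegression(max_iter=1000).fit(X, y)
    print(name, "coefficients:", np.round(model.coef_.ravel(), 3),
          "| in-sample accuracy:", round(model.score(X, y), 3))
```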

  11. Propagation of Visible and Infrared Radiation in Fog, Rain, and Snow

    DTIC Science & Technology

    1982-07-01

  12. Seasonal forecasting of high wind speeds over Western Europe

    NASA Astrophysics Data System (ADS)

    Palutikof, J. P.; Holt, T.

    2003-04-01

    As financial losses associated with extreme weather events escalate, there is interest from end users in the forestry and insurance industries, for example, in the development of seasonal forecasting models with a long lead time. This study uses exceedances of the 90th, 95th, and 99th percentiles of daily maximum wind speed over the period 1958 to present to derive predictands of winter wind extremes. The source data are the 6-hourly NCEP Reanalysis gridded surface wind fields. Predictor variables include principal components of Atlantic sea surface temperature and several indices of climate variability, including the NAO and SOI. Lead times of up to a year are considered, in monthly increments. Three regression techniques are evaluated: multiple linear regression (MLR), principal component regression (PCR), and partial least squares regression (PLS). PCR and PLS proved considerably superior to MLR, with much lower standard errors. PLS was chosen to formulate the predictive model since it offers more flexibility in experimental design and gave slightly better results than PCR. The results indicate that winter windiness can be predicted with considerable skill one year ahead for much of coastal Europe, but that this skill deteriorates rapidly in the hinterland. The experiment succeeded in highlighting PLS as a very useful method for developing more precise forecasting models, and in identifying areas of high predictability.

  13. Mapping the spatial pattern of temperate forest above ground biomass by integrating airborne lidar with Radarsat-2 imagery via geostatistical models

    NASA Astrophysics Data System (ADS)

    Li, Wang; Niu, Zheng; Gao, Shuai; Wang, Cheng

    2014-11-01

    Light Detection and Ranging (LiDAR) and Synthetic Aperture Radar (SAR) are two competitive active remote sensing techniques for forest above ground biomass estimation, which is important for forest management and global climate change studies. This study aims to further explore their capabilities in temperate forest above ground biomass (AGB) estimation by emphasizing the spatial autocorrelation of variables obtained from these two remote sensing tools, an often overlooked aspect in remote sensing applications to vegetation studies. Remote sensing variables, including airborne LiDAR metrics, backscattering coefficients for different SAR polarizations, and their ratio variables for the Radarsat-2 imagery, were calculated. First, simple linear regression (SLR) models were established between the field-estimated above ground biomass and the remote sensing variables. Pearson's correlation coefficient (R2) was used to find which LiDAR metric showed the most significant correlation with the regression residuals and could be selected as the co-variable in regression co-kriging (RCoKrig). Second, regression co-kriging was conducted by choosing the regression residuals as the dependent variable and the LiDAR metric with the highest R2 (Hmean) as the co-variable. Third, above ground biomass over the study area was estimated using the SLR model and the RCoKrig model, respectively. The results for these two models were validated using the same ground points. Results showed that both methods achieved satisfactory prediction accuracy, with regression co-kriging showing the lower estimation error. This proved that the regression co-kriging model is feasible and effective for mapping the spatial pattern of AGB in the temperate forest using Radarsat-2 data calibrated by airborne LiDAR metrics.

  14. Maintenance Operations in Mission Oriented Protective Posture Level IV (MOPPIV)

    DTIC Science & Technology

    1987-10-01

    [Table-of-contents fragments from the scanned report: Repair FADAC Printed Circuit Board; Data Analysis Techniques; Multiple Linear Regression; Analysis/Discussion; Example of Regression Analysis; Regression Results for All Tasks; Table 9, Task Grouping for Analysis; Table 10, Remove/Replace H60A3 Power Pack.]

  15. Robust design optimization method for centrifugal impellers under surface roughness uncertainties due to blade fouling

    NASA Astrophysics Data System (ADS)

    Ju, Yaping; Zhang, Chuhua

    2016-03-01

    Blade fouling has been proved to be a great threat to compressor performance in the operating stage. Current research on fouling-induced performance degradation of centrifugal compressors is based mainly on simplified roughness models that do not take into account realistic factors such as the spatial non-uniformity and randomness of the fouling-induced surface roughness. Moreover, little attention has been paid to the robust design optimization of centrifugal compressor impellers with consideration of blade fouling. In this paper, a multi-objective robust design optimization method is developed for centrifugal impellers under surface roughness uncertainties due to blade fouling. A three-dimensional surface roughness map is proposed to describe the non-uniformity and randomness of realistic fouling accumulations on blades. To lower the computational cost of robust design optimization, the support vector regression (SVR) metamodel is combined with the Monte Carlo simulation (MCS) method to conduct the uncertainty analysis of fouled impeller performance. The results show that the critical fouled region associated with impeller performance degradation lies at the leading edge of the blade tip. The SVR metamodel proved to be an efficient and accurate means of detecting impeller performance variations caused by roughness uncertainties. After design optimization, the robust optimal design is found to be more efficient and less sensitive to fouling uncertainties while maintaining good impeller performance in the clean condition. This research proposes a systematic design optimization method for centrifugal compressors with consideration of blade fouling, providing practical guidance for the design of advanced centrifugal compressors.
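
    A toy sketch of the SVR-metamodel-plus-Monte-Carlo pattern referred to above: a cheap surrogate is trained on a few "expensive" evaluations and then sampled many times to propagate roughness uncertainty. The performance function, roughness range, and SVR settings are assumptions of this sketch, not the paper's CFD model.

```python
# Hedged sketch: SVR metamodel + Monte Carlo simulation for uncertainty analysis.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)

def expensive_performance(roughness):
    # placeholder for a costly CFD run: efficiency drops nonlinearly with roughness
    return 0.90 - 0.4 * roughness ** 1.5

# Train the metamodel on a small design of experiments over the roughness range.
x_train = np.linspace(0.0, 0.2, 20).reshape(-1, 1)
y_train = expensive_performance(x_train).ravel()
metamodel = SVR(kernel="rbf", C=100.0, epsilon=1e-4).fit(x_train, y_train)

# Monte Carlo simulation: propagate random fouling roughness through the cheap metamodel.
samples = rng.uniform(0.0, 0.2, size=(10_000, 1))
eff = metamodel.predict(samples)
print(f"mean efficiency = {eff.mean():.4f}, std = {eff.std():.4f}")
```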

  16. Creep-Rupture Data Analysis - Engineering Application of Regression Techniques. Ph.D. Thesis - North Carolina State Univ.

    NASA Technical Reports Server (NTRS)

    Rummler, D. R.

    1976-01-01

    The results are presented of investigations to apply regression techniques to the development of methodology for creep-rupture data analysis. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.

  17. Resting-state functional magnetic resonance imaging: the impact of regression analysis.

    PubMed

    Yeh, Chia-Jung; Tseng, Yu-Sheng; Lin, Yi-Ru; Tsai, Shang-Yueh; Huang, Teng-Yi

    2015-01-01

    To investigate the impact of regression methods on resting-state functional magnetic resonance imaging (rsfMRI). During rsfMRI preprocessing, regression analysis is considered effective for reducing the interference of physiological noise on the signal time course. However, it is unclear whether the regression method benefits rsfMRI analysis. Twenty volunteers (10 men and 10 women; aged 23.4 ± 1.5 years) participated in the experiments. We used node analysis and functional connectivity mapping to assess the brain default mode network by using five combinations of regression methods. The results show that regressing the global mean plays a major role in the preprocessing steps. When a global regression method is applied, the values of functional connectivity are significantly lower (P ≤ .01) than those calculated without a global regression. This step increases inter-subject variation and produces anticorrelated brain areas. rsfMRI data processed using regression should be interpreted carefully. The significance of the anticorrelated brain areas produced by global signal removal is unclear. Copyright © 2014 by the American Society of Neuroimaging.
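
    A small sketch of the preprocessing step at issue here: regressing the global mean signal out of voxel time courses by ordinary least squares and recomputing a connectivity value, illustrating why global regression can lower correlation estimates. The data are synthetic; this is not the study's pipeline.

```python
# Hedged sketch: global signal regression via OLS residuals on synthetic fMRI-like data.
import numpy as np

rng = np.random.default_rng(3)
n_time, n_vox = 200, 500
shared = rng.normal(size=n_time)                       # shared "global" fluctuation
ts = 0.8 * shared[:, None] + rng.normal(size=(n_time, n_vox))

def regress_out(signals, nuisance):
    """Return residuals of each column of `signals` after OLS regression on `nuisance`."""
    X = np.column_stack([np.ones(len(nuisance)), nuisance])
    beta, *_ = np.linalg.lstsq(X, signals, rcond=None)
    return signals - X @ beta

global_mean = ts.mean(axis=1)
ts_clean = regress_out(ts, global_mean)

# Connectivity between two example "regions" before and after global regression.
r_before = np.corrcoef(ts[:, 0], ts[:, 1])[0, 1]
r_after = np.corrcoef(ts_clean[:, 0], ts_clean[:, 1])[0, 1]
print(f"r before = {r_before:.2f}, r after global regression = {r_after:.2f}")
```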

  18. 4D-Fingerprint Categorical QSAR Models for Skin Sensitization Based on Classification Local Lymph Node Assay Measures

    PubMed Central

    Li, Yi; Tseng, Yufeng J.; Pan, Dahua; Liu, Jianzhong; Kern, Petra S.; Gerberick, G. Frank; Hopfinger, Anton J.

    2008-01-01

    Currently, the only validated methods to identify skin sensitization effects are in vivo models, such as the Local Lymph Node Assay (LLNA) and guinea pig studies. There is a tremendous need, in particular due to novel legislation, to develop animal alternatives, e.g., Quantitative Structure-Activity Relationship (QSAR) models. Here, QSAR models for skin sensitization using LLNA data have been constructed. The descriptors used to generate these models are derived from the 4D-molecular similarity paradigm and are referred to as universal 4D-fingerprints. A training set of 132 structurally diverse compounds and a test set of 15 structurally diverse compounds were used in this study. The statistical methodologies used to build the models are logistic regression (LR) and partial least squares coupled logistic regression (PLS-LR), which prove to be effective tools for studying skin sensitization measures expressed in the two categorical terms of sensitizer and non-sensitizer. QSAR models with low values of the Hosmer-Lemeshow goodness-of-fit statistic, χ2HL, are significant and predictive. For the training set, the cross-validated prediction accuracy of the logistic regression models ranges from 77.3% to 78.0%, while that of PLS-logistic regression models ranges from 87.1% to 89.4%. For the test set, the prediction accuracy of logistic regression models ranges from 80.0% to 86.7%, while that of PLS-logistic regression models ranges from 73.3% to 80.0%. The QSAR models are made up of 4D-fingerprints related to aromatic atoms, hydrogen bond acceptors and negatively partially charged atoms. PMID:17226934
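
    A compact sketch of a PLS-coupled logistic regression classifier in the spirit of the PLS-LR approach mentioned above, on random stand-in data (the 4D-fingerprint descriptors and LLNA labels are not reproduced; component count and split sizes are assumptions).

```python
# Hedged sketch: PLS latent variables followed by logistic regression (PLS-LR style).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(4)
X = rng.normal(size=(132, 60))                  # stand-in for fingerprint descriptors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=132) > 0).astype(int)  # sensitizer / non-sensitizer

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=15, random_state=0)

# Step 1: compress descriptors onto a few PLS latent variables.
pls = PLSRegression(n_components=4).fit(X_tr, y_tr)
T_tr, T_te = pls.transform(X_tr), pls.transform(X_te)

# Step 2: logistic regression on the latent variables.
clf = LogisticRegression().fit(T_tr, y_tr)
print(f"test accuracy = {accuracy_score(y_te, clf.predict(T_te)):.2f}")
```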

  19. Agile Acceptance Test–Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software

    PubMed Central

    Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-01-01

    Background Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test–driven development and automated regression testing promotes reliability. Test–driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a “safety net” for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and “living” design documentation. Rapid-cycle development or “agile” methods are being successfully applied to CDS development. The agile practice of automated test–driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as “executable requirements.” Objective We aimed to establish feasibility of acceptance test–driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Methods Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory’s expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. Results We used test–driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the “executable requirements” are shown prior to building the CDS alert, during build, and after successful build. Conclusions Automated acceptance test–driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test–driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization. PMID:29653922

  20. Agile Acceptance Test-Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software.

    PubMed

    Basit, Mujeeb A; Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-04-13

    Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test-driven development and automated regression testing promotes reliability. Test-driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a "safety net" for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and "living" design documentation. Rapid-cycle development or "agile" methods are being successfully applied to CDS development. The agile practice of automated test-driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as "executable requirements." We aimed to establish feasibility of acceptance test-driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory's expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. We used test-driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the "executable requirements" are shown prior to building the CDS alert, during build, and after successful build. Automated acceptance test-driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test-driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization. ©Mujeeb A Basit, Krystal L Baldwin, Vaishnavi Kannan, Emily L Flahaven, Cassandra J Parks, Jason M Ott, Duwayne L Willett. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 13.04.2018.
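
    A small illustration of the "executable requirements" idea described in these two records: expressing an advisory's expected behavior as a table of input rows and expected outcomes, then running them as automated tests. This sketch uses plain pytest rather than FitNesse, and the toy rule function and case table are assumptions, not the actual advisory configuration.

```python
# Hedged sketch: table-driven acceptance tests for a toy CDS rule.
import pytest

def stroke_swallow_alert(department, order_route, suspected_stroke, swallow_screen_done):
    """Toy rule: alert before oral meds in suspected stroke when no swallow screen is recorded."""
    return (
        department == "ED"
        and order_route == "oral"
        and suspected_stroke
        and not swallow_screen_done
    )

# Each row mirrors one line of an acceptance-test table: inputs -> expected alert behavior.
CASES = [
    ("ED",  "oral", True,  False, True),    # should fire
    ("ED",  "oral", True,  True,  False),   # screen already done -> no alert
    ("ED",  "IV",   True,  False, False),   # non-oral route -> no alert
    ("ICU", "oral", True,  False, False),   # outside applicable clinical setting -> no alert
]

@pytest.mark.parametrize("dept,route,stroke,screened,expected", CASES)
def test_stroke_swallow_alert(dept, route, stroke, screened, expected):
    assert stroke_swallow_alert(dept, route, stroke, screened) == expected
```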

  1. Exploring simple, transparent, interpretable and predictive QSAR models for classification and quantitative prediction of rat toxicity of ionic liquids using OECD recommended guidelines.

    PubMed

    Das, Rudra Narayan; Roy, Kunal; Popelier, Paul L A

    2015-11-01

    The present study explores the chemical attributes of diverse ionic liquids responsible for their cytotoxicity in a rat leukemia cell line (IPC-81) by developing predictive classification as well as regression-based mathematical models. Simple and interpretable descriptors derived from a two-dimensional representation of the chemical structures along with quantum topological molecular similarity indices have been used for model development, employing unambiguous modeling strategies that strictly obey the guidelines of the Organization for Economic Co-operation and Development (OECD) for quantitative structure-activity relationship (QSAR) analysis. The structure-toxicity relationships that emerged from both classification and regression-based models were in accordance with the findings of some previous studies. The models suggested that the cytotoxicity of ionic liquids is dependent on the cationic surfactant action, long alkyl side chains, cationic lipophilicity as well as aromaticity, the presence of a dialkylamino substituent at the 4-position of the pyridinium nucleus and a bulky anionic moiety. The models have been transparently presented in the form of equations, thus allowing their easy transferability in accordance with the OECD guidelines. The models have also been subjected to rigorous validation tests proving their predictive potential and can hence be used for designing novel and "greener" ionic liquids. The major strength of the present study lies in the use of a diverse and large dataset, use of simple reproducible descriptors and compliance with the OECD norms. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Heuristics as Bayesian inference under extreme priors.

    PubMed

    Parpart, Paula; Jones, Matt; Love, Bradley C

    2018-05-01

    Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
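
    A loose numerical illustration of the continuum described above: penalizing regression weights toward an equal-weighting ("tallying"-like) rule, so that zero penalty recovers ordinary regression and a very large penalty recovers the heuristic. The penalty form and synthetic data are assumptions of this sketch, not the paper's Bayesian model.

```python
# Hedged sketch: continuum from ordinary least squares to an equal-weight rule.
import numpy as np

rng = np.random.default_rng(5)
n, d = 50, 5
X = rng.normal(size=(n, d))
true_w = np.array([1.2, 0.8, 0.5, 0.3, 0.1])
y = X @ true_w + rng.normal(scale=1.0, size=n)

w_equal = np.ones(d)          # "tallying": every cue gets the same unit weight

def prior_shrunk_weights(X, y, lam, w0):
    """Minimize ||y - Xw||^2 + lam * ||w - w0||^2 (closed form)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y + lam * w0)

for lam in [0.0, 1.0, 10.0, 1e6]:
    w = prior_shrunk_weights(X, y, lam, w_equal)
    print(f"lambda={lam:>9}: weights = {np.round(w, 2)}")
# lambda=0 recovers ordinary least squares; very large lambda recovers the equal-weight rule.
```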

  3. Obesity and severe obesity forecasts through 2030.

    PubMed

    Finkelstein, Eric A; Khavjou, Olga A; Thompson, Hope; Trogdon, Justin G; Pan, Liping; Sherry, Bettylou; Dietz, William

    2012-06-01

    Previous efforts to forecast future trends in obesity applied linear forecasts assuming that the rise in obesity would continue unabated. However, evidence suggests that obesity prevalence may be leveling off. This study presents estimates of adult obesity and severe obesity prevalence through 2030 based on nonlinear regression models. The forecasted results are then used to simulate the savings that could be achieved through modestly successful obesity prevention efforts. The study was conducted in 2009-2010 and used data from the 1990 through 2008 Behavioral Risk Factor Surveillance System (BRFSS). The analysis sample included nonpregnant adults aged ≥ 18 years. The individual-level BRFSS variables were supplemented with state-level variables from the U.S. Bureau of Labor Statistics, the American Chamber of Commerce Research Association, and the Census of Retail Trade. Future obesity and severe obesity prevalence were estimated through regression modeling by projecting trends in explanatory variables expected to influence obesity prevalence. Linear time trend forecasts suggest that by 2030, 51% of the population will be obese. The model estimates a much lower obesity prevalence of 42% and severe obesity prevalence of 11%. If obesity were to remain at 2010 levels, the combined savings in medical expenditures over the next 2 decades would be $549.5 billion. The study estimates a 33% increase in obesity prevalence and a 130% increase in severe obesity prevalence over the next 2 decades. If these forecasts prove accurate, this will further hinder efforts for healthcare cost containment. Copyright © 2012 Elsevier Inc. All rights reserved.

  4. Socioeconomic development and secular trend in height in China.

    PubMed

    Zong, Xin-Nan; Li, Hui; Wu, Hua-Hong; Zhang, Ya-Qin

    2015-12-01

    The objective of this study was to examine the effect of socioeconomic development on the secular trend in height among children and adolescents in China. Body height and spermarcheal/menarcheal ages were obtained from two periodic large-scale nationally representative surveys in China between 1975 and 2010. Chinese socioeconomic development indicators were obtained from the United Nations world population prospects. The effects of plausible determinants were assessed by partial least-squares regression. The average height of children and adolescents improved in tandem with socioeconomic development, without any tendency to plateau. The increment in the height trend was larger around puberty than at earlier or later ages. Partial least-squares regressions with gross national income, life expectancy and spermarcheal/menarcheal age accounted for 88.3% to 98.3% of the increment in the height trend for males and 82.9% to 97.3% for females in adolescence. Further, through the analysis of the variable importance for projection, the contributions of gross national income and life expectancy to the height increment were confirmed to be significant in childhood and adolescence, and the contribution of spermarcheal/menarcheal age was superior to both of them in adolescence. We concluded that the positive secular trend in height in China was significantly associated with socioeconomic status (GNI as indicator) and medical and health conditions (life expectancy as indicator). Earlier onset of spermarche and menarche proved to play an important role in the larger increment over time of height at puberty for a population. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Closed-form solution for static pull-in voltage of electrostatically actuated clamped-clamped micro/nano beams under the effect of fringing field and van der Waals force

    NASA Astrophysics Data System (ADS)

    Bhojawala, V. M.; Vakharia, D. P.

    2017-12-01

    This investigation provides an accurate prediction of static pull-in voltage for clamped-clamped micro/nano beams based on a distributed model. The Euler-Bernoulli beam theory is used, incorporating the geometric non-linearity of the beam, internal (residual) stress, van der Waals force, distributed electrostatic force and fringing field effects, to derive the governing differential equation. The Galerkin discretisation method is used to build a reduced-order model of the governing differential equation. A regime plot is presented in the current work for determining the number of modes required in the reduced-order model to obtain a completely converged pull-in voltage for micro/nano beams. A closed-form relation is developed based on the relationship obtained from curve fitting of the pull-in instability plots and subsequent non-linear regression for the proposed relation. The regression analysis yields a chi-square (χ2) tolerance value of 1 × 10-9, an adjusted R-square value of 0.99929 and a P-value of zero; these statistical parameters indicate the convergence of the non-linear fit, the accuracy of the fitted data and the significance of the proposed model, respectively. The closed-form equation is validated against available experimental and numerical results. A maximum relative error of 4.08% in comparison with several available experimental and numerical data sets proves the reliability of the proposed closed-form equation.
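
    A generic sketch of the curve-fitting and fit-statistics step described above, using SciPy's non-linear least squares on synthetic points; the functional form, data, and reported statistic are placeholders, not the paper's pull-in model.

```python
# Hedged sketch: non-linear regression with curve_fit plus a simple goodness-of-fit check.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)

def model(x, a, b, c):
    return a * x ** b + c       # assumed closed-form template

x = np.linspace(0.1, 2.0, 40)
y = 3.0 * x ** 1.5 + 0.2 + rng.normal(scale=0.02, size=x.size)   # synthetic "simulation" output

params, cov = curve_fit(model, x, y, p0=[1.0, 1.0, 0.0])
residuals = y - model(x, *params)
r_squared = 1 - residuals.var() / y.var()
print(f"fitted parameters = {np.round(params, 3)}, fit R^2 ≈ {r_squared:.5f}")
```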

  6. Can Psychological, Social and Demographical Factors Predict Clinical Characteristics Symptomatology of Bipolar Affective Disorder and Schizophrenia?

    PubMed

    Maciukiewicz, Malgorzata; Pawlak, Joanna; Kapelski, Pawel; Łabędzka, Magdalena; Skibinska, Maria; Zaremba, Dorota; Leszczynska-Rodziewicz, Anna; Dmitrzak-Weglarz, Monika; Hauser, Joanna

    2016-09-01

    Schizophrenia (SCH) is a complex psychiatric disorder affecting 1 % of the population. Its clinical phenotype is heterogeneous, with delusions, hallucinations, depression, disorganized behaviour and negative symptoms. Bipolar affective disorder (BD) refers to periodic changes in mood and activity from depression to mania. It affects 0.5-1.5 % of the population. Two types of the disorder (type I and type II) are distinguished by the severity of mania episodes. In our analysis, we aimed to check whether clinical and demographic characteristics of the sample predict the occurrence of symptom dimensions in BD and SCH cases. We included a total sample of 443 bipolar and 439 schizophrenia patients. Diagnosis was based on DSM-IV criteria using the Structured Clinical Interview for DSM-IV. We applied regression models to analyse associations between clinical and demographic traits from OPCRIT and symptom dimensions. We used previously computed dimensions of schizophrenia and bipolar affective disorder as quantitative traits for the regression models. Male gender seemed a protective factor for the depression dimension in the schizophrenia and bipolar disorder samples. The presence of a definite psychosocial stressor prior to disease onset seemed a risk factor for the depressive and suicidal domains in BD and SCH. OPCRIT items describing premorbid functioning seemed related to the depression, positive and disorganised dimensions in schizophrenia and the psychotic dimension in BD. We proved that clinical and demographic characteristics of the sample are predictors of symptom dimensions of schizophrenia and bipolar disorder. We also observed a relation between clinical dimensions and the course of the disorder and impairment during the disorder.

  7. Using decision trees to understand structure in missing data

    PubMed Central

    Tierney, Nicholas J; Harden, Fiona A; Harden, Maurice J; Mengersen, Kerrie L

    2015-01-01

    Objectives Demonstrate the application of decision trees—classification and regression trees (CARTs), and their cousins, boosted regression trees (BRTs)—to understand structure in missing data. Setting Data taken from employees at 3 different industrial sites in Australia. Participants 7915 observations were included. Materials and methods The approach was evaluated using an occupational health data set comprising results of questionnaires, medical tests and environmental monitoring. Statistical methods included standard statistical tests and the ‘rpart’ and ‘gbm’ packages for CART and BRT analyses, respectively, from the statistical software ‘R’. A simulation study was conducted to explore the capability of decision tree models in describing data with missingness artificially introduced. Results CART and BRT models were effective in highlighting a missingness structure in the data, related to the type of data (medical or environmental), the site in which it was collected, the number of visits, and the presence of extreme values. The simulation study revealed that CART models were able to identify variables and values responsible for inducing missingness. There was greater variation in variable importance for unstructured as compared to structured missingness. Discussion Both CART and BRT models were effective in describing structural missingness in data. CART models may be preferred over BRT models for exploratory analysis of missing data, and selecting variables important for predicting missingness. BRT models can show how values of other variables influence missingness, which may prove useful for researchers. Conclusions Researchers are encouraged to use CART and BRT models to explore and understand missing data. PMID:26124509
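
    A brief sketch of the CART/BRT-for-missingness idea on synthetic data: a missingness indicator is modeled as the outcome, and tree-based feature importances point to the variables driving it. This uses scikit-learn rather than R's rpart/gbm packages, and the variables and missingness mechanism are invented for illustration.

```python
# Hedged sketch: classification tree and boosted trees predicting a missingness indicator.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(7)
n = 2000
df = pd.DataFrame({
    "site": rng.integers(0, 3, n),
    "n_visits": rng.poisson(3, n),
    "exposure": rng.normal(size=n),
})
# Structured missingness: a medical test is more often missing at site 2 with few visits.
p_missing = 0.05 + 0.4 * ((df["site"] == 2) & (df["n_visits"] < 2))
df["test_missing"] = rng.uniform(size=n) < p_missing

X, y = df[["site", "n_visits", "exposure"]], df["test_missing"]
cart = DecisionTreeClassifier(max_depth=3).fit(X, y)
brt = GradientBoostingClassifier(n_estimators=200, max_depth=2).fit(X, y)

for name, model in [("CART", cart), ("BRT", brt)]:
    print(name, dict(zip(X.columns, np.round(model.feature_importances_, 2))))
```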

  8. Estimating conditional proportion curves by regression residuals.

    PubMed

    Han, Bing; Lim, Nelson

    2010-06-15

    Researchers often derive a categorical outcome from an observed continuous measurement y. For example, human obesity status can be defined by the body mass index. They proceed to estimate the conditional proportion curve p(x) = P(y

  9. Altitude Above Sea Level and Body Mass Index as Determinants of Oxygen Saturation in Children: The SON@ Study.

    PubMed

    Gochicoa-Rangel, Laura; Pérez-Padilla, José Rogelio; Rodríguez-Moreno, Luis; Montero-Matamoros, Arturo; Ojeda-Luna, Nancy; Martínez-Carbajal, Gema; Hernández-Raygoza, Roberto; Ruiz-Pedraza, Dolores; Fernández-Plata, María Rosario; Torre-Bouscoulet, Luis

    2015-01-01

    Altitude above sea level and body mass index are well-recognized determinants of oxygen saturation in adult populations; however, the contribution of these factors to oxygen saturation in children is less clear. To explore the contribution of altitude above sea level and body mass index to oxygen saturation in children. A multi-center, cross-sectional study conducted in nine cities in Mexico. Parents signed informed consent forms and completed a health status questionnaire. Height, weight, and pulse oximetry were recorded. We studied 2,200 subjects (52% girls) aged 8.7 ± 3.0 years. Mean body mass index, z-body mass index, and oxygen saturation were 18.1 ± 3.6 kg·m-2, 0.58 ± 1.3, and 95.5 ± 2.4%, respectively. By multiple regression analysis, altitude proved to be the main predictor of oxygen saturation, with non-significant contributions of age, gender, and body mass index. According to quantile regression, the median estimate of oxygen saturation was 98.7 minus 1.7% per km of altitude above sea level, and the oxygen saturation fifth percentile 97.4 minus 2.7% per km of altitude. Altitude was the main determinant of oxygen saturation, which on average decreased 1.7% per km of elevation from a percentage of 98.7 at sea level. In contrast with adults, this study in children found no association between oxygen saturation and obesity or age.
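
    A short sketch of the quantile-regression analysis described above, fitting median and 5th-percentile models of oxygen saturation on altitude with statsmodels. The data are simulated around the reported relationship (roughly 98.7% at sea level minus 1.7% per km); they are not the study data.

```python
# Hedged sketch: quantile regression of oxygen saturation on altitude.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n = 2200
df = pd.DataFrame({"altitude_km": rng.uniform(0, 2.6, n)})
df["spo2"] = 98.7 - 1.7 * df["altitude_km"] + rng.normal(scale=1.2, size=n)

median_fit = smf.quantreg("spo2 ~ altitude_km", df).fit(q=0.5)
p5_fit = smf.quantreg("spo2 ~ altitude_km", df).fit(q=0.05)
print("median model:", median_fit.params.round(2).to_dict())
print("5th percentile model:", p5_fit.params.round(2).to_dict())
```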

  10. Standards for Standardized Logistic Regression Coefficients

    ERIC Educational Resources Information Center

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…

  11. Linear regression analysis: part 14 of a series on evaluation of scientific publications.

    PubMed

    Schneider, Astrid; Hommel, Gerhard; Blettner, Maria

    2010-11-01

    Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.

  12. An improved multiple linear regression and data analysis computer program package

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.

    1972-01-01

    NEWRAP, an improved version of a previous multiple linear regression program called RAPIER, CREDUC, and CRSPLT, allows for a complete regression analysis including cross plots of the independent and dependent variables, correlation coefficients, regression coefficients, analysis of variance tables, t-statistics and their probability levels, rejection of independent variables, plots of residuals against the independent and dependent variables, and a canonical reduction of quadratic response functions useful in optimum seeking experimentation. A major improvement over RAPIER is that all regression calculations are done in double precision arithmetic.

  13. Weighted SGD for ℓp Regression with Randomized Preconditioning.

    PubMed

    Yang, Jiyan; Chow, Yin-Lam; Ré, Christopher; Mahoney, Michael W

    2016-01-01

    In recent years, stochastic gradient descent (SGD) methods and randomized linear algebra (RLA) algorithms have been applied to many large-scale problems in machine learning and data analysis. SGD methods are easy to implement and applicable to a wide range of convex optimization problems. In contrast, RLA algorithms provide much stronger performance guarantees but are applicable to a narrower class of problems. We aim to bridge the gap between these two methods in solving constrained overdetermined linear regression problems, e.g., ℓ2 and ℓ1 regression problems. We propose a hybrid algorithm named pwSGD that uses RLA techniques for preconditioning and constructing an importance sampling distribution, and then performs an SGD-like iterative process with weighted sampling on the preconditioned system. By rewriting a deterministic ℓp regression problem as a stochastic optimization problem, we connect pwSGD to several existing ℓp solvers including RLA methods with algorithmic leveraging (RLA for short). We prove that pwSGD inherits faster convergence rates that only depend on the lower dimension of the linear system, while maintaining low computational complexity. Such SGD convergence rates are superior to those of other related SGD algorithms such as the weighted randomized Kaczmarz algorithm. In particular, when solving ℓ1 regression with size n by d, pwSGD returns an approximate solution with ε relative error in the objective value in O(log n·nnz(A) + poly(d)/ε²) time. This complexity is uniformly better than that of RLA methods in terms of both ε and d when the problem is unconstrained. In the presence of constraints, pwSGD only has to solve a sequence of much simpler and smaller optimization problems over the same constraints. In general this is more efficient than solving the constrained subproblem required in RLA. For ℓ2 regression, pwSGD returns an approximate solution with ε relative error in the objective value and the solution vector measured in prediction norm in O(log n·nnz(A) + poly(d) log(1/ε)/ε) time. We show that for unconstrained ℓ2 regression, this complexity is comparable to that of RLA and is asymptotically better than several state-of-the-art solvers in the regime where the desired accuracy ε, high dimension n and low dimension d satisfy d ≥ 1/ε and n ≥ d²/ε. We also provide lower bounds on the coreset complexity for more general regression problems, indicating that new ideas will still be needed to extend similar RLA preconditioning ideas to weighted SGD algorithms for more general regression problems. Finally, the effectiveness of such algorithms is illustrated numerically on both synthetic and real datasets, and the results are consistent with our theoretical findings and demonstrate that pwSGD converges to a medium-precision solution, e.g., ε = 10⁻³, more quickly.

  14. Weighted SGD for ℓp Regression with Randomized Preconditioning*

    PubMed Central

    Yang, Jiyan; Chow, Yin-Lam; Ré, Christopher; Mahoney, Michael W.

    2018-01-01

    In recent years, stochastic gradient descent (SGD) methods and randomized linear algebra (RLA) algorithms have been applied to many large-scale problems in machine learning and data analysis. SGD methods are easy to implement and applicable to a wide range of convex optimization problems. In contrast, RLA algorithms provide much stronger performance guarantees but are applicable to a narrower class of problems. We aim to bridge the gap between these two methods in solving constrained overdetermined linear regression problems—e.g., ℓ2 and ℓ1 regression problems. We propose a hybrid algorithm named pwSGD that uses RLA techniques for preconditioning and constructing an importance sampling distribution, and then performs an SGD-like iterative process with weighted sampling on the preconditioned system.By rewriting a deterministic ℓp regression problem as a stochastic optimization problem, we connect pwSGD to several existing ℓp solvers including RLA methods with algorithmic leveraging (RLA for short).We prove that pwSGD inherits faster convergence rates that only depend on the lower dimension of the linear system, while maintaining low computation complexity. Such SGD convergence rates are superior to other related SGD algorithm such as the weighted randomized Kaczmarz algorithm.Particularly, when solving ℓ1 regression with size n by d, pwSGD returns an approximate solution with ε relative error in the objective value in 𝒪(log n·nnz(A)+poly(d)/ε2) time. This complexity is uniformly better than that of RLA methods in terms of both ε and d when the problem is unconstrained. In the presence of constraints, pwSGD only has to solve a sequence of much simpler and smaller optimization problem over the same constraints. In general this is more efficient than solving the constrained subproblem required in RLA.For ℓ2 regression, pwSGD returns an approximate solution with ε relative error in the objective value and the solution vector measured in prediction norm in 𝒪(log n·nnz(A)+poly(d) log(1/ε)/ε) time. We show that for unconstrained ℓ2 regression, this complexity is comparable to that of RLA and is asymptotically better over several state-of-the-art solvers in the regime where the desired accuracy ε, high dimension n and low dimension d satisfy d ≥ 1/ε and n ≥ d2/ε. We also provide lower bounds on the coreset complexity for more general regression problems, indicating that still new ideas will be needed to extend similar RLA preconditioning ideas to weighted SGD algorithms for more general regression problems. Finally, the effectiveness of such algorithms is illustrated numerically on both synthetic and real datasets, and the results are consistent with our theoretical findings and demonstrate that pwSGD converges to a medium-precision solution, e.g., ε = 10−3, more quickly. PMID:29782626
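
    A minimal sketch of a related method the abstract names, randomized Kaczmarz with squared-row-norm sampling, applied to an overdetermined least-squares problem. This is the simpler relative of the weighted SGD scheme, not the pwSGD algorithm itself, and no RLA preconditioning step is implemented; data sizes and iteration count are arbitrary.

```python
# Hedged sketch: randomized Kaczmarz iteration with row-norm importance sampling.
import numpy as np

rng = np.random.default_rng(9)
n, d = 5000, 20
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.001 * rng.normal(size=n)

row_norms_sq = np.einsum("ij,ij->i", A, A)
p = row_norms_sq / row_norms_sq.sum()        # sample rows proportionally to squared norm

x = np.zeros(d)
for _ in range(20000):
    i = rng.choice(n, p=p)
    # project the current iterate onto the hyperplane defined by row i
    x += (b[i] - A[i] @ x) / row_norms_sq[i] * A[i]
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```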

  15. [A SAS macro program for batch processing of univariate Cox regression analysis for large databases].

    PubMed

    Yang, Rendong; Xiong, Jie; Peng, Yangqin; Peng, Xiaoning; Zeng, Xiaomin

    2015-02-01

    To implement batch processing of univariate Cox regression analysis for large databases using a SAS macro program. We wrote a SAS macro program in SAS 9.2 that can filter, integrate, and export P values to Excel. The program was used to screen survival-correlated RNA molecules in ovarian cancer. The SAS macro program completed the batch processing of univariate Cox regression analyses, together with the selection and export of the results. The SAS macro program has potential applications in reducing the workload of statistical analysis and providing a basis for batch processing of univariate Cox regression analysis.
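
    A rough sketch of the same batch-processing idea in Python with the lifelines package rather than SAS: one univariate Cox model per candidate variable, with the p-values collected and exported in one table. The simulated survival data, column names, and output file are assumptions of this sketch.

```python
# Hedged sketch: batch univariate Cox regressions with p-values gathered into one table.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(10)
n, n_genes = 300, 50
genes = pd.DataFrame(rng.normal(size=(n, n_genes)),
                     columns=[f"RNA_{i}" for i in range(n_genes)])
surv = pd.DataFrame({
    "time": rng.exponential(scale=24, size=n),      # months of follow-up
    "event": rng.integers(0, 2, size=n),            # 1 = event observed
})

results = []
for col in genes.columns:
    df = pd.concat([surv, genes[[col]]], axis=1)
    fit = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    results.append({"variable": col, "p_value": fit.summary.loc[col, "p"]})

results_df = pd.DataFrame(results).sort_values("p_value")
results_df.to_excel("univariate_cox_pvalues.xlsx", index=False)   # export step, as in the macro
print(results_df.head())
```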

  16. Species Composition at the Sub-Meter Level in Discontinuous Permafrost in Subarctic Sweden

    NASA Astrophysics Data System (ADS)

    Anderson, S. M.; Palace, M. W.; Layne, M.; Varner, R. K.; Crill, P. M.

    2013-12-01

    Northern latitudes are experiencing rapid warming. Wetlands underlain by permafrost are particularly vulnerable to warming, which results in changes in vegetative cover. Specific species have been associated with greenhouse gas emissions; therefore, knowledge of shifts in species composition allows for the systematic quantification of emissions and of changes in those emissions. Species composition varies on the sub-meter scale based on topography and other microsite environmental parameters. This complexity, and the need to scale vegetation to the landscape level, proves vital in our estimation of carbon dioxide (CO2) and methane (CH4) emissions and dynamics. Stordalen Mire (68°21'N, 18°49'E), near Abisko, is located at the edge of the discontinuous permafrost zone. This provides a unique opportunity to analyze multiple vegetation communities in close proximity. To do this, we randomly selected 25 1x1 meter plots that were representative of five major cover types: semi-wet, wet, hummock, tall graminoid, and tall shrub. We used a quadrat with 64 sub plots and measured areal percent cover for 24 species. We collected ground based remote sensing (RS) data at each plot to determine species composition using an ADC-lite (near infrared, red, green) and a GoPro (red, blue, green). We normalized each image based on a Teflon white chip placed in each image. Textural analysis was conducted on each image for entropy, angular second momentum, and lacunarity. A logistic regression was developed to examine vegetation cover types and remote sensing parameters. We used a multiple linear regression with forward stepwise variable selection. We found statistical differences in species composition and diversity indices between vegetation cover types. In addition, we were able to build regression models to significantly estimate vegetation cover type as well as percent cover for specific key vegetative species. This ground-based remote sensing allows for quick quantification of vegetation cover and species and also provides the framework for scaling to satellite image data to estimate species composition and shift at the landscape level. To determine diversity within our plots we calculated species richness and the Shannon Index. We found statistically different species composition within each vegetation cover type and also determined which species were indicative of each cover type. Our logistic regression was able to significantly classify vegetation cover types based on RS parameters. Our multiple regression analysis indicated Betula nana (dwarf birch) (r2 = 0.48, p < 0.0001) and Sphagnum (r2 = 0.59, p < 0.0001) were statistically significant with respect to RS parameters. We suggest that ground based remote sensing methods may provide a unique and efficient method to quantify vegetation across the landscape in northern latitude wetlands.

  17. Exact Analysis of Squared Cross-Validity Coefficient in Predictive Regression Models

    ERIC Educational Resources Information Center

    Shieh, Gwowen

    2009-01-01

    In regression analysis, the notion of population validity is of theoretical interest for describing the usefulness of the underlying regression model, whereas the presumably more important concept of population cross-validity represents the predictive effectiveness for the regression equation in future research. It appears that the inference…

  18. Selective principal component regression analysis of fluorescence hyperspectral image to assess aflatoxin contamination in corn

    USDA-ARS?s Scientific Manuscript database

    Selective principal component regression analysis (SPCR) uses a subset of the original image bands for principal component transformation and regression. For optimal band selection before the transformation, this paper used genetic algorithms (GA). In this case, the GA process used the regression co...

  19. Prevalence and Predictors of Clozapine-Associated Constipation: A Systematic Review and Meta-Analysis

    PubMed Central

    Shirazi, Ayala; Stubbs, Brendon; Gomez, Lucia; Moore, Susan; Gaughran, Fiona; Flanagan, Robert J.; MacCabe, James H.; Lally, John

    2016-01-01

    Constipation is a frequently overlooked side effect of clozapine treatment that can prove fatal. We conducted a systematic review and meta-analysis to estimate the prevalence and risk factors for clozapine-associated constipation. Two authors performed a systematic search of major electronic databases from January 1990 to March 2016 for articles reporting the prevalence of constipation in adults treated with clozapine. A random effects meta-analysis was conducted. A total of 32 studies were meta-analyzed, establishing a pooled prevalence of clozapine-associated constipation of 31.2% (95% CI: 25.6–37.4) (n = 2013). People taking clozapine were significantly more likely to be constipated versus other antipsychotics (OR 3.02 (CI: 1.91–4.77), p < 0.001, n = 11 studies). Meta-regression identified two significant study-level factors associated with constipation prevalence: significantly higher (p = 0.02) rates of constipation were observed for those treated in inpatient versus outpatient or mixed settings and for those studies in which constipation was a primary or secondary outcome measure (36.9%) compared to studies in which constipation was not a specified outcome measure (24.8%, p = 0.048). Clozapine-associated constipation is common and approximately three times more likely than with other antipsychotics. Screening and preventative strategies should be established and appropriate symptomatic treatment applied when required. PMID:27271593

  20. Influence of beta-cyclodextrin complexation on glipizide release from hydroxypropyl methylcellulose matrix tablets.

    PubMed

    Shivakumar, H N; Desai, B G; Pandya, Saumyak; Karki, S S

    2007-01-01

    Glipizide was complexed with beta-cyclodextrin in an attempt to enhance the drug solubility. The phase solubility diagram was classified as A(L) type, characterized by an apparent 1:1 stability constant with a value of 413.82 M(-1). Fourier transform infrared spectrophotometry, differential scanning calorimetry, powder x-ray diffractometry and proton nuclear magnetic resonance spectral analysis indicated considerable interaction between the drug and beta-cyclodextrin. A 2³ factorial design was employed to prepare hydroxypropyl methylcellulose (HPMC) matrix tablets containing the drug or its complex. The effect of the total polymer load (X1), the level of HPMC K100LV (X2), and complexation (X3) on release at the first hour (Y1), release at 24 h (Y2), time taken for 50% release (Y3), and the diffusion exponent (Y4) was systematically analyzed using the F test. Mathematical models containing only the significant terms (P < 0.05) were generated for each parameter by multiple linear regression analysis and analysis of variance. Complexation was found to exert a significant effect on Y1, Y2, and Y3, whereas the total polymer load significantly influenced all the responses. The models generated were validated by developing two new formulations with a combination of factors within the experimental domain. The experimental values of the response parameters were in close agreement with the predicted values, thereby proving the validity of the generated mathematical models.

  1. Assessing the effectiveness of sustainable land management policies for combating desertification: A data mining approach.

    PubMed

    Salvati, L; Kosmas, C; Kairis, O; Karavitis, C; Acikalin, S; Belgacem, A; Solé-Benet, A; Chaker, M; Fassouli, V; Gokceoglu, C; Gungor, H; Hessel, R; Khatteli, H; Kounalaki, A; Laouina, A; Ocakoglu, F; Ouessar, M; Ritsema, C; Sghaier, M; Sonmez, H; Taamallah, H; Tezcan, L; de Vente, J; Kelly, C; Colantoni, A; Carlucci, M

    2016-12-01

    This study investigates the relationship between fine resolution, local-scale biophysical and socioeconomic contexts within which land degradation occurs, and the human responses to it. The research draws on experimental data collected under different territorial and socioeconomic conditions at 586 field sites in five Mediterranean countries (Spain, Greece, Turkey, Tunisia and Morocco). We assess the level of desertification risk under various land management practices (terracing, grazing control, prevention of wildland fires, soil erosion control measures, soil water conservation measures, sustainable farming practices, land protection measures and financial subsidies) taken as possible responses to land degradation. A data mining approach, incorporating principal component analysis, non-parametric correlations, multiple regression and canonical analysis, was developed to identify the spatial relationship between land management conditions, the socioeconomic and environmental context (described using 40 biophysical and socioeconomic indicators) and desertification risk. Our analysis identified a number of distinct relationships between the level of desertification experienced and the underlying socioeconomic context, suggesting that the effectiveness of responses to land degradation is strictly dependent on the local biophysical and socioeconomic context. Assessing the latent relationship between land management practices and the biophysical/socioeconomic attributes characterizing areas exposed to different levels of desertification risk proved to be an indirect measure of the effectiveness of field actions contrasting land degradation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. A novel hybrid method of beta-turn identification in protein using binary logistic regression and neural network

    PubMed Central

    Asghari, Mehdi Poursheikhali; Hayatshahi, Sayyed Hamed Sadat; Abdolmaleki, Parviz

    2012-01-01

    From both the structural and functional points of view, β-turns play important biological roles in proteins. In the present study, a novel two-stage hybrid procedure has been developed to identify β-turns in proteins. Binary logistic regression was initially used, for the first time, to select significant sequence parameters for the identification of β-turns through a re-substitution test procedure. The sequence parameters consisted of 80 amino acid positional occurrences and 20 amino acid percentages in the sequence. Among these parameters, the most significant ones selected by the binary logistic regression model were the percentages of Gly and Ser and the occurrence of Asn in position i+2, respectively, in the sequence. These significant parameters have the highest effect on the constitution of a β-turn sequence. A neural network model was then constructed and fed with the parameters selected by binary logistic regression to build a hybrid predictor. The networks were trained and tested on a non-homologous dataset of 565 protein chains. Applying a nine-fold cross-validation test on the dataset, the network reached an overall accuracy (Qtotal) of 74, which is comparable with the results of other β-turn prediction methods. In conclusion, this study proves that the parameter selection ability of binary logistic regression together with the prediction capability of neural networks leads to the development of more precise models for identifying β-turns in proteins. PMID:27418910

  3. A novel hybrid method of beta-turn identification in protein using binary logistic regression and neural network.

    PubMed

    Asghari, Mehdi Poursheikhali; Hayatshahi, Sayyed Hamed Sadat; Abdolmaleki, Parviz

    2012-01-01

    From both the structural and functional points of view, β-turns play important biological roles in proteins. In the present study, a novel two-stage hybrid procedure has been developed to identify β-turns in proteins. Binary logistic regression was initially used, for the first time, to select significant sequence parameters for the identification of β-turns through a re-substitution test procedure. The sequence parameters consisted of 80 amino acid positional occurrences and 20 amino acid percentages in the sequence. Among these parameters, the most significant ones selected by the binary logistic regression model were the percentages of Gly and Ser and the occurrence of Asn in position i+2, respectively, in the sequence. These significant parameters have the highest effect on the constitution of a β-turn sequence. A neural network model was then constructed and fed with the parameters selected by binary logistic regression to build a hybrid predictor. The networks were trained and tested on a non-homologous dataset of 565 protein chains. Applying a nine-fold cross-validation test on the dataset, the network reached an overall accuracy (Qtotal) of 74, which is comparable with the results of other β-turn prediction methods. In conclusion, this study proves that the parameter selection ability of binary logistic regression together with the prediction capability of neural networks leads to the development of more precise models for identifying β-turns in proteins.
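
    A schematic sketch of the two-stage hybrid pattern described in these two records: a logistic-regression step to pick informative sequence features, followed by a neural network trained on the selected features. Here selection uses an L1-penalized fit rather than the paper's significance-based re-substitution procedure, and the data are random stand-ins for the sequence parameters.

```python
# Hedged sketch: logistic-regression feature selection feeding a neural network classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import SelectFromModel
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(11)
X = rng.normal(size=(600, 100))            # stand-in for positional/percentage sequence features
y = (X[:, 0] - X[:, 7] + 0.5 * X[:, 42] + rng.normal(scale=1.0, size=600) > 0).astype(int)

hybrid = make_pipeline(
    SelectFromModel(LogisticRegression(penalty="l1", solver="liblinear", C=0.2)),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
acc = cross_val_score(hybrid, X, y, cv=9).mean()   # nine-fold cross-validation, as in the study
print(f"cross-validated accuracy ≈ {acc:.2f}")
```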

  4. Prostate cancer localization with endorectal MR imaging and MR spectroscopic imaging: effect of clinical data on reader accuracy.

    PubMed

    Dhingsa, Rajpal; Qayyum, Aliya; Coakley, Fergus V; Lu, Ying; Jones, Kirk D; Swanson, Mark G; Carroll, Peter R; Hricak, Hedvig; Kurhanewicz, John

    2004-01-01

    To determine the effect of digital rectal examination findings, sextant biopsy results, and prostate-specific antigen (PSA) levels on reader accuracy in the localization of prostate cancer with endorectal magnetic resonance (MR) imaging and MR spectroscopic imaging. This was a retrospective study of 37 patients (mean age, 57 years) with biopsy-proved prostate cancer. Transverse T1-weighted, transverse high-spatial-resolution, and coronal T2-weighted MR images and MR spectroscopic images were obtained. Two independent readers, unaware of clinical data, recorded the size and location of suspicious peripheral zone tumor nodules on a standardized diagram of the prostate. Readers also recorded their degree of diagnostic confidence for each nodule on a five-point scale. Both readers repeated this interpretation with knowledge of rectal examination findings, sextant biopsy results, and PSA level. Step-section histopathologic findings were the reference standard. Logistic regression analysis with generalized estimating equations was used to correlate tumor detection with clinical data, and alternative free-response receiver operating characteristic (AFROC) curve analysis was used to examine the overall effect of clinical data on all positive results. Fifty-one peripheral zone tumor nodules were identified at histopathologic evaluation. Logistic regression analysis showed awareness of clinical data significantly improved tumor detection rate (P <.02) from 15 to 19 nodules for reader 1 and from 13 to 19 nodules for reader 2 (27%-37% overall) by using both size and location criteria. AFROC analysis showed no significant change in overall reader performance because there was an associated increase in the number of false-positive findings with awareness of clinical data, from 11 to 21 for reader 1 and from 16 to 25 for reader 2. Awareness of clinical data significantly improves reader detection of prostate cancer nodules with endorectal MR imaging and MR spectroscopic imaging, but there is no overall change in reader accuracy, because of an associated increase in false-positive findings. A stricter definition of a true-positive result is associated with reduced sensitivity for prostate cancer nodule detection. Copyright RSNA, 2004

  5. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and the graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  6. Regression Analysis and the Sociological Imagination

    ERIC Educational Resources Information Center

    De Maio, Fernando

    2014-01-01

    Regression analysis is an important aspect of most introductory statistics courses in sociology but is often presented in contexts divorced from the central concerns that bring students into the discipline. Consequently, we present five lesson ideas that emerge from a regression analysis of income inequality and mortality in the USA and Canada.

  7. Outperforming whom? A multilevel study of performance-prove goal orientation, performance, and the moderating role of shared team identification.

    PubMed

    Dietz, Bart; van Knippenberg, Daan; Hirst, Giles; Restubog, Simon Lloyd D

    2015-11-01

    Performance-prove goal orientation affects performance because it drives people to try to outperform others. A proper understanding of the performance-motivating potential of performance-prove goal orientation requires, however, that we consider the question of whom people desire to outperform. In a multilevel analysis of this issue, we propose that the shared team identification of a team plays an important moderating role here, directing the performance-motivating influence of performance-prove goal orientation to either the team level or the individual level of performance. A multilevel study of salespeople nested in teams supports this proposition, showing that performance-prove goal orientation motivates team performance more with higher shared team identification, whereas performance-prove goal orientation motivates individual performance more with lower shared team identification. Establishing the robustness of these findings, a second study replicates them with individual and team performance in an educational context. (c) 2015 APA, all rights reserved.

  8. External Validation of the European Hernia Society Classification for Postoperative Complications after Incisional Hernia Repair: A Cohort Study of 2,191 Patients.

    PubMed

    Kroese, Leonard F; Kleinrensink, Gert-Jan; Lange, Johan F; Gillion, Jean-Francois

    2018-03-01

    Incisional hernia is a frequent complication after midline laparotomy. Surgical hernia repair is associated with complications, but no clear predictive risk factors have been identified. The European Hernia Society (EHS) classification offers a structured framework to describe hernias and to analyze postoperative complications. Because of its structured nature, it might prove to be useful for preoperative patient or treatment classification. The objective of this study was to investigate the EHS classification as a predictor for postoperative complications after incisional hernia surgery. An analysis was performed using a registry-based, large-scale, prospective cohort study, including all patients undergoing incisional hernia surgery between September 1, 2011 and February 29, 2016. Univariate analyses and multivariable logistic regression analysis were performed to identify risk factors for postoperative complications. A total of 2,191 patients were included, of whom 323 (15%) had 1 or more complications. Factors associated with complications in univariate analyses (p < 0.20) and clinically relevant factors were included in the multivariable analysis. In the multivariable analysis, EHS width class, incarceration, open surgery, duration of surgery, Altemeier wound class, and therapeutic antibiotic treatment were independent risk factors for postoperative complications. Third recurrence and emergency surgery were associated with fewer complications. Incisional hernia repair is associated with a 15% complication rate. The EHS width classification is associated with postoperative complications. To identify patients at risk for complications, the EHS classification is useful. Copyright © 2017. Published by Elsevier Inc.
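    For readers unfamiliar with the method, a minimal sketch of a multivariable logistic regression for a binary complication outcome follows; the predictors, column names, and data are invented stand-ins for the registry variables named above.

    ```python
    # Minimal sketch of a multivariable logistic regression for a binary
    # complication outcome; all column names and values are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({
        "complication": rng.integers(0, 2, n),
        "ehs_width": rng.integers(1, 4, n),        # EHS width class W1-W3 coded 1-3
        "incarceration": rng.integers(0, 2, n),
        "open_surgery": rng.integers(0, 2, n),
        "duration_min": rng.normal(120, 40, n),
    })

    fit = smf.logit("complication ~ C(ehs_width) + incarceration + open_surgery + duration_min",
                    data=df).fit(disp=False)
    odds_ratios = np.exp(fit.params)               # odds ratios for each predictor
    print(pd.concat([odds_ratios, np.exp(fit.conf_int())], axis=1))
    ```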

  9. Monitoring of adherence to headache treatments by means of hair analysis.

    PubMed

    Ferrari, Anna; Licata, Manuela; Rustichelli, Cecilia; Baraldi, Carlo; Vandelli, Daniele; Marchesi, Filippo; Palazzoli, Federica; Verri, Patrizia; Silingardi, Enrico

    2017-02-01

    The aim of this study was to evaluate the potential of hair analysis to monitor medication adherence in headache patients undergoing chronic therapy. For this purpose, the following parameters were analyzed: the detection rate of 23 therapeutic drugs in headache patients' hair, the degree of agreement between the self-reported drug and the drug found in hair, and whether the levels found in hair reflected the drug intake reported by the patients. The study included 93 patients suffering from primary headaches declaring their daily intake of at least one of the following drugs during the 3 months before the hair sampling: alprazolam, amitriptyline, citalopram, clomipramine, clonazepam, delorazepam, diazepam, duloxetine, fluoxetine, flurazepam, levomepromazine, levosulpiride, lorazepam, lormetazepam, mirtazapine, paroxetine, quetiapine, sertraline, topiramate, trazodone, triazolam, venlafaxine, and zolpidem. A detailed pharmacological history and a sample of hair were collected for each patient. Hair samples were analyzed by liquid chromatography-electrospray tandem mass spectrometry, using a previously developed method. All 23 drugs were detected in the examined hair samples. The agreement between the self-reported drug and the drug found in hair was excellent for most analytes (P < 0.001, Cohen's kappa); a statistically significant relationship (P < 0.05, linear regression analysis) between dose and hair level was found for amitriptyline, citalopram, delorazepam, duloxetine, lorazepam, and venlafaxine. Hair analysis proved to be a unique matrix to document chronic drug use in headache patients, and the level found for each individual drug can represent a reliable marker of adherence to pharmacological treatments.

  10. Lymphangiography

    PubMed Central

    Kadin, Michael R.; Thompson, Ronald W.

    1974-01-01

    Since the introduction of “staging laparotomy” (to determine the disease's stage) in assessing Hodgkin's disease, some observers have argued that lymphangiography could be safely omitted in the initial diagnostic evaluation. To test these opinions a series of 75 patients with Hodgkin's disease who had a staging laparotomy and histological correlation with lymphangiograms was reviewed. Of 16 examinations with positive results, one proved to be a false positive. Of the 14 examinations with equivocal results, one proved histologically positive. In the remaining 45 lymphangiograms, five were falsely negative. In all five of these patients abdominal lymph nodes were involved, but in areas that do not routinely opacify on lower extremity lymphangiography. The overall accuracy was 90 percent. Therapeutically, the lymphangiogram permits accurate planning for treatment by radiation therapy so that all known disease is treated and yet bone marrow is not excessively irradiated. Changes in lymph node architecture after therapy provide valuable information as to regression of the disease or signs of its early recurrence. PMID:4816398

  11. Logical errors on proving theorem

    NASA Astrophysics Data System (ADS)

    Sari, C. K.; Waluyo, M.; Ainur, C. M.; Darmaningsih, E. N.

    2018-01-01

    At the tertiary level, students in mathematics education departments attend abstract courses, such as Introduction to Real Analysis, which require the ability to prove mathematical statements almost all the time. In fact, many students have not mastered this ability appropriately. In their Introduction to Real Analysis tests, even though they completed their proofs of theorems, they achieved unsatisfactory scores: they thought that they had succeeded, but their proofs were not valid. In this study, qualitative research was conducted to describe the logical errors that students made in proving a theorem about cluster points. The theorem was given to 54 students. Misconceptions appear to arise in understanding the definitions of cluster point, limit of a function, and limit of a sequence. The habit of using routine symbols might cause these misconceptions. Suggestions for dealing with this condition are described as well.

  12. Multivariate Regression Analysis and Slaughter Livestock,

    DTIC Science & Technology

    (*AGRICULTURE, *ECONOMICS), (*MEAT, PRODUCTION), MULTIVARIATE ANALYSIS, REGRESSION ANALYSIS, ANIMALS, WEIGHT, COSTS, PREDICTIONS, STABILITY, MATHEMATICAL MODELS, STORAGE, BEEF, PORK, FOOD, STATISTICAL DATA, ACCURACY

  13. The Role of Auxiliary Variables in Deterministic and Deterministic-Stochastic Spatial Models of Air Temperature in Poland

    NASA Astrophysics Data System (ADS)

    Szymanowski, Mariusz; Kryza, Maciej

    2017-02-01

    Our study examines the role of auxiliary variables in the process of spatial modelling and mapping of climatological elements, with air temperature in Poland used as an example. Multivariable algorithms are the most frequently applied for spatialization of air temperature, and in many studies their results have proved better than those obtained by various one-dimensional techniques. In most of the previous studies, two main strategies were used to perform multidimensional spatial interpolation of air temperature. First, it was accepted that all variables significantly correlated with air temperature should be incorporated into the model. Second, it was assumed that the more spatial variation of air temperature was deterministically explained, the better was the quality of spatial interpolation. The main goal of the paper was to examine both above-mentioned assumptions. The analysis was performed using data from 250 meteorological stations and for 69 air temperature cases aggregated at different levels: from daily means to 10-year annual mean. Two cases were considered for detailed analysis. The set of potential auxiliary variables covered 11 environmental predictors of air temperature. Another purpose of the study was to compare the results of interpolation given by various multivariable methods using the same set of explanatory variables. Two regression models, multiple linear regression (MLR) and geographically weighted regression (GWR), as well as their extensions to the regression-kriging form (MLRK and GWRK, respectively), were examined. Stepwise regression was used to select variables for the individual models, and the cross-validation method was used to validate the results, with special attention paid to statistically significant improvement of the model using the mean absolute error (MAE) criterion. The main results of this study led to rejection of both assumptions considered. Usually, including more than two or three of the most significantly correlated auxiliary variables does not improve the quality of the spatial model. The effects of introducing certain variables into the model were not climatologically justified and were seen on maps as unexpected and undesired artefacts. The results confirm, in accordance with previous studies, that in the case of air temperature distribution, the spatial process is non-stationary; thus, the local GWR model performs better than the global MLR if they are specified using the same set of auxiliary variables. If only GWR residuals are autocorrelated, the geographically weighted regression-kriging (GWRK) model seems to be optimal for air temperature spatial interpolation.
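    The following sketch illustrates one common way to implement the selection-and-validation idea for the global model: forward selection of auxiliary variables judged by cross-validated MAE. It uses synthetic data and ordinary least squares only; GWR and the kriging extensions would require specialized packages and are not shown, and the predictor names are hypothetical.

    ```python
    # Sketch of forward selection of auxiliary variables for a global MLR model,
    # judged by cross-validated mean absolute error (MAE); synthetic data only.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    n, predictors = 250, ["elevation", "latitude", "dist_to_sea", "slope", "ndvi"]
    X = rng.normal(size=(n, len(predictors)))
    y = 10 - 6 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=0.7, size=n)  # temperature-like target

    selected, remaining, best_mae = [], list(range(len(predictors))), np.inf
    while remaining:
        scores = {}
        for j in remaining:
            cols = selected + [j]
            mae = -cross_val_score(LinearRegression(), X[:, cols], y,
                                   scoring="neg_mean_absolute_error", cv=10).mean()
            scores[j] = mae
        j_best = min(scores, key=scores.get)
        if scores[j_best] >= best_mae:       # stop when adding a variable no longer helps
            break
        best_mae = scores[j_best]
        selected.append(j_best)
        remaining.remove(j_best)

    print("selected:", [predictors[j] for j in selected], "CV MAE:", round(best_mae, 3))
    ```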

  14. Regression Analysis: Legal Applications in Institutional Research

    ERIC Educational Resources Information Center

    Frizell, Julie A.; Shippen, Benjamin S., Jr.; Luna, Andrew L.

    2008-01-01

    This article reviews multiple regression analysis, describes how its results should be interpreted, and instructs institutional researchers on how to conduct such analyses using an example focused on faculty pay equity between men and women. The use of multiple regression analysis will be presented as a method with which to compare salaries of…
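    A hedged sketch of the kind of pay-equity regression the article describes is given below; the data, column names, and coefficients are fabricated purely for illustration and do not reflect any real institution.

    ```python
    # Sketch of a salary-equity regression: log salary on gender plus legitimate
    # pay factors; the simulated data and variable names are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 300
    df = pd.DataFrame({
        "female": rng.integers(0, 2, n),
        "years_experience": rng.uniform(1, 30, n),
        "rank": rng.choice(["assistant", "associate", "full"], n),
    })
    df["log_salary"] = (11.0 + 0.015 * df["years_experience"]
                        + df["rank"].map({"assistant": 0.0, "associate": 0.15, "full": 0.35})
                        - 0.03 * df["female"] + rng.normal(scale=0.08, size=n))

    fit = smf.ols("log_salary ~ female + years_experience + C(rank)", data=df).fit()
    # With log salary as the outcome, the female coefficient approximates the
    # proportional pay gap after controlling for the listed factors.
    print(fit.params["female"], fit.pvalues["female"])
    ```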

  15. RAWS II: A MULTIPLE REGRESSION ANALYSIS PROGRAM,

    DTIC Science & Technology

    This memorandum gives instructions for the use and operation of a revised version of RAWS, a multiple regression analysis program. The program...of preprocessed data, the directed retention of variable, listing of the matrix of the normal equations and its inverse, and the bypassing of the regression analysis to provide the input variable statistics only. (Author)

  16. Identification of the Rice Wines with Different Marked Ages by Electronic Nose Coupled with Smartphone and Cloud Storage Platform

    PubMed Central

    Wei, Zhebo; Xiao, Xize

    2017-01-01

    In this study, a portable electronic nose (E-nose) was self-developed to identify rice wines with different marked ages—all the operations of the E-nose were controlled by a special Smartphone Application. The sensor array of the E-nose was comprised of 12 MOS sensors and the obtained response values were transmitted to the Smartphone through a wireless communication module. Then, Aliyun worked as a cloud storage platform for the storage of responses and identification models. The measurement of the E-nose was composed of the taste information obtained phase (TIOP) and the aftertaste information obtained phase (AIOP). The area feature data obtained from the TIOP and the feature data obtained from the TIOP-AIOP were applied to identify rice wines by using pattern recognition methods. Principal component analysis (PCA), locally linear embedding (LLE) and linear discriminant analysis (LDA) were applied for the classification of those wine samples. LDA based on the area feature data obtained from the TIOP-AIOP proved a powerful tool and showed the best classification results. Partial least-squares regression (PLSR) and support vector machine (SVM) were applied for the predictions of marked ages and SVM (R2 = 0.9942) worked much better than PLSR. PMID:29088076
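    The sketch below assembles a comparable chemometric pipeline (PCA, LDA classification, and support vector regression of marked age) on synthetic sensor data; it is not the authors' code, and the feature construction and age classes are invented.

    ```python
    # Illustrative sketch (synthetic data): PCA overview, LDA classification of
    # wine age groups, and support vector regression of marked age from features.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.svm import SVR
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(4)
    ages = np.repeat([3, 5, 8], 40)                       # marked ages in years (hypothetical)
    X = rng.normal(size=(120, 12)) + ages[:, None] * 0.4  # 12 MOS-sensor features per sample

    X_tr, X_te, y_tr, y_te = train_test_split(X, ages, test_size=0.3, random_state=0)

    pca = PCA(n_components=2).fit(X_tr)                   # unsupervised overview of the data
    print("variance explained by first two PCs:", np.round(pca.explained_variance_ratio_, 2))

    lda_acc = LinearDiscriminantAnalysis().fit(X_tr, y_tr).score(X_te, y_te)
    svr = SVR(kernel="rbf", C=10.0).fit(X_tr, y_tr)
    print("LDA accuracy:", round(lda_acc, 3),
          "| SVR R2:", round(r2_score(y_te, svr.predict(X_te)), 3))
    ```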

  17. Relationships Between Adolescent Sexual Outcomes and Exposure to Sex in Media: Robustness to Propensity-Based Analysis

    PubMed Central

    Collins, Rebecca L.; Martino, Steven C.; Elliott, Marc N.; Miu, Angela

    2013-01-01

    Adolescent sexual health is a substantial problem in the U.S., and two recent studies have linked adolescent sexual behavior and/or outcomes to youths' exposure to sex in the media. Both studies had longitudinal survey designs and used covariate-adjusted regression analysis. Steinberg and Monahan (2010) reanalyzed data from one of these studies (Brown et al., 2006) using a propensity-score approach, arguing that this method better addresses the possibility of unobserved confounders. Based on their reanalysis, which found no relationship between media exposure and sexual behavior, they concluded that “Adolescents' Exposure to Sexy Media Does Not Hasten the Initiation of Sexual Intercourse.” We subject data from the second study (Collins et al., 2004; Chandra et al., 2008) to reanalysis using a propensity-score approach. We find only modest reductions in two of the three previously documented associations, and no reduction in the third. Based on these findings, we conclude that there is an association between exposure to sex in the media and adolescent sexual outcomes. While the evidence does not prove causality, it is sufficient to advise caution among parents, develop interventions for youth, and work with media producers and distributors to reduce youth exposure to sexual content. PMID:24839301
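    As a rough illustration of a propensity-score reanalysis, the sketch below estimates an exposure effect by inverse-probability weighting on simulated data; the covariates and effect sizes are made up, and the original studies' matching and stratification choices are not reproduced.

    ```python
    # Minimal sketch of a propensity-score reanalysis via inverse-probability
    # weighting (IPW); all variables and effect sizes are synthetic.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 2000
    age = rng.normal(15, 1.5, n)
    parent_monitoring = rng.normal(size=n)
    exposure = (0.3 * age - 4.2 - 0.5 * parent_monitoring + rng.logistic(size=n) > 0).astype(float)
    outcome = (0.2 * exposure + 0.1 * age - 1.7 - 0.3 * parent_monitoring
               + rng.logistic(size=n) > 0).astype(float)

    covariates = sm.add_constant(np.column_stack([age, parent_monitoring]))
    ps = sm.Logit(exposure, covariates).fit(disp=False).predict()   # propensity scores
    weights = np.where(exposure == 1, 1.0 / ps, 1.0 / (1.0 - ps))   # IPW weights

    design = sm.add_constant(exposure)
    ipw_fit = sm.GLM(outcome, design, family=sm.families.Binomial(),
                     freq_weights=weights).fit()
    print("IPW exposure coefficient (log-odds):", round(ipw_fit.params[1], 3))
    ```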

  18. Breech delivery in very preterm and very low birthweight infants in The Netherlands.

    PubMed

    Gravenhorst, J B; Schreuder, A M; Veen, S; Brand, R; Verloove-Vanhorick, S P; Verweij, R A; van Zeben-van der Aa, D M; Ens-Dokkum, M H

    1993-05-01

    To study the relation between various perinatal factors and the sequelae of very preterm birth, applying logistic regression analysis. In a nationwide collaborative study in the Netherlands, perinatal and follow up data were collected on 899 liveborn singleton nonmalformed infants with gestational age less than 32 weeks or birthweight less than 1500 g born in 1983. Neonatal mortality rate and total handicap rates (minor and major) in surviving children at two years and five years of age. Comparing breech with vertex presentation, the odds ratio for neonatal mortality (adjusted for duration of pregnancy, birthweight, maternal hypertension and prolonged rupture of membranes) is 1.6 (P < 0.05). Comparing abdominal versus vaginal delivery, the odds ratio indicates equal risks. When breech and vertex presentation are analysed separately it appears that breech presenting infants have a significantly lower mortality risk when born by caesarean section compared with vaginal delivery. However, comparing abdominal versus vaginal delivery in breech presentation, the odds ratio for handicap at five years (0.9) is not significantly different from 1. The data presented suggest a reduced neonatal mortality rate in breech presenting infants born by caesarean section but because of the observational design of the study the statistical analysis described only identifies a possible trend and cannot prove the issue.

  19. Warranty optimisation based on the prediction of costs to the manufacturer using neural network model and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Stamenkovic, Dragan D.; Popovic, Vladimir M.

    2015-02-01

    Warranty is a powerful marketing tool, but it always involves additional costs to the manufacturer. In order to reduce these costs and make use of warranty's marketing potential, the manufacturer needs to master the techniques for warranty cost prediction according to the reliability characteristics of the product. In this paper a combination free replacement and pro rata warranty policy is analysed as warranty model for one type of light bulbs. Since operating conditions have a great impact on product reliability, they need to be considered in such analysis. A neural network model is used to predict light bulb reliability characteristics based on the data from the tests of light bulbs in various operating conditions. Compared with a linear regression model used in the literature for similar tasks, the neural network model proved to be a more accurate method for such prediction. Reliability parameters obtained in this way are later used in Monte Carlo simulation for the prediction of times to failure needed for warranty cost calculation. The results of the analysis make possible for the manufacturer to choose the optimal warranty policy based on expected product operating conditions. In such a way, the manufacturer can lower the costs and increase the profit.
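    A compressed sketch of the two-stage idea, on synthetic data, is shown below: a small neural network predicts a reliability parameter from operating conditions, and Monte Carlo simulation of failure times yields an expected in-warranty cost. For simplicity it assumes a plain free-replacement warranty and a known Weibull shape, unlike the combined policy analysed in the paper, and every number in it is invented.

    ```python
    # Sketch: neural network maps operating conditions to a reliability parameter,
    # then Monte Carlo simulation of failure times gives an expected warranty cost.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(6)

    # Stage 1: learn the Weibull scale (characteristic life, hours) from conditions
    voltage = rng.uniform(210, 250, size=200)
    ambient_temp = rng.uniform(10, 40, size=200)
    X = np.column_stack([voltage, ambient_temp])
    true_scale = 1200 - 3.0 * (voltage - 230) - 8.0 * (ambient_temp - 25)
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
    net.fit(X, true_scale + rng.normal(scale=30, size=200))

    # Stage 2: Monte Carlo cost for a free-replacement warranty of W hours
    W, unit_cost, n_sim = 1000.0, 4.0, 100_000
    scale_hat = net.predict([[230.0, 25.0]])[0]                # predicted characteristic life
    failure_times = scale_hat * rng.weibull(1.5, size=n_sim)   # Weibull shape assumed known
    expected_cost = unit_cost * np.mean(failure_times < W)     # cost per unit from in-warranty failures
    print(round(scale_hat, 1), round(expected_cost, 3))
    ```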

  20. Identification of the Rice Wines with Different Marked Ages by Electronic Nose Coupled with Smartphone and Cloud Storage Platform.

    PubMed

    Wei, Zhebo; Xiao, Xize; Wang, Jun; Wang, Hui

    2017-10-31

    In this study, a portable electronic nose (E-nose) was self-developed to identify rice wines with different marked ages-all the operations of the E-nose were controlled by a special Smartphone Application. The sensor array of the E-nose was comprised of 12 MOS sensors and the obtained response values were transmitted to the Smartphone through a wireless communication module. Then, Aliyun worked as a cloud storage platform for the storage of responses and identification models. The measurement of the E-nose was composed of the taste information obtained phase (TIOP) and the aftertaste information obtained phase (AIOP). The area feature data obtained from the TIOP and the feature data obtained from the TIOP-AIOP were applied to identify rice wines by using pattern recognition methods. Principal component analysis (PCA), locally linear embedding (LLE) and linear discriminant analysis (LDA) were applied for the classification of those wine samples. LDA based on the area feature data obtained from the TIOP-AIOP proved a powerful tool and showed the best classification results. Partial least-squares regression (PLSR) and support vector machine (SVM) were applied for the predictions of marked ages and SVM (R² = 0.9942) worked much better than PLSR.

  1. Effect of Linkage Disequilibrium on the Identification of Functional Variants

    PubMed Central

    Thomas, Alun; Abel, Haley J; Di, Yanming; Faye, Laura L; Jin, Jing; Liu, Jin; Wu, Zheyan; Paterson, Andrew D

    2011-01-01

    We summarize the contributions of Group 9 of Genetic Analysis Workshop 17. This group addressed the problems of linkage disequilibrium and other longer range forms of allelic association when evaluating the effects of genotypes on phenotypes. Issues raised by long-range associations, whether a result of selection, stratification, possible technical errors, or chance, were less expected but proved to be important. Most contributors focused on regression methods of various types to illustrate problematic issues or to develop adaptations for dealing with high-density genotype assays. Study design was also considered, as was graphical modeling. Although no method emerged as uniformly successful, most succeeded in reducing false-positive results either by considering clusters of loci within genes or by applying smoothing metrics that required results from adjacent loci to be similar. Two unexpected results that questioned our assumptions of what is required to model linkage disequilibrium were observed. The first was that correlations between loci separated by large genetic distances can greatly inflate single-locus test statistics, and, whether the result of selection, stratification, possible technical errors, or chance, these correlations seem overabundant. The second unexpected result was that applying principal components analysis to genome-wide genotype data can apparently control not only for population structure but also for linkage disequilibrium. PMID:22128051

  2. [Influence and correlation of attitude, availability and institutional support to research implementation in nursing practice – results from an exploratory, cross-sectional quantitative study].

    PubMed

    Haslinger-Baumann, Elisabeth; Lang, Gert; Müller, Gerhard

    2015-06-01

    The concrete application of research findings in nursing practice is a multidimensional process. In Austria, there are currently no results available that explain the impact of and association with the implementation of research in hospitals. The aim of the study was to investigate influences and relationships of individual attitudes towards research utilization, availability of research results and institutional support of nurses in Austrian hospitals with respect to research application. Using a non-experimental, quantitative cross-sectional design, a multi-centre study (n = 10) was performed in 2011. The sample comprised 178 certified nurses who were interviewed with a survey questionnaire. The multiple regression analysis shows that a positive attitude towards research use (β = 0.388, p < 0.001), the availability of processed research results (β = 0.470, p < 0.001), and adequate institutional support (β = 0.142, p < 0.050) have a significant influence on the application of research results. The path analysis proves that course attendance in evidence-based nursing has a strong positive influence on research application (β = 0.464; p < 0.001). Health institutions are, according to legal instructions, called on to make use of the positive attitude and supply supporting measures in order to introduce research results into daily nursing practice.

  3. Lessons about parks and poverty from a decade of forest loss and economic growth around Kibale National Park, Uganda.

    PubMed

    Naughton-Treves, Lisa; Alix-Garcia, Jennifer; Chapman, Colin A

    2011-08-23

    We use field data linked to satellite image analysis to examine the relationship between biodiversity loss, deforestation, and poverty around Kibale National Park (KNP) in western Uganda, 1996-2006. Over this decade, KNP generally maintained forest cover, tree species, and primate populations, whereas neighboring communal forest patches were reduced by half and showed substantial declines in tree species and primate populations. However, a bad decade for forest outside the park proved a prosperous one for most local residents. Panel data for 252 households show substantial improvement in welfare indicators (e.g., safer water, more durable roof material), with the greatest increases found among those with highest initial assets. A combination of regression analysis and matching estimators shows that although the poor tend to be located on the park perimeter, proximity to the park has no measureable effect on growth of productive assets. The risk for land loss among the poor was inversely correlated with proximity to the park, initial farm size, and decline in adjacent communal forests. We conclude the current disproportionate presence of poor households at the edge of the park does not signal that the park is a poverty trap. Rather, Kibale appears to provide protection against desperation sales and farm loss among those most vulnerable.

  4. Lessons about parks and poverty from a decade of forest loss and economic growth around Kibale National Park, Uganda

    PubMed Central

    Naughton-Treves, Lisa; Alix-Garcia, Jennifer; Chapman, Colin A.

    2011-01-01

    We use field data linked to satellite image analysis to examine the relationship between biodiversity loss, deforestation, and poverty around Kibale National Park (KNP) in western Uganda, 1996–2006. Over this decade, KNP generally maintained forest cover, tree species, and primate populations, whereas neighboring communal forest patches were reduced by half and showed substantial declines in tree species and primate populations. However, a bad decade for forest outside the park proved a prosperous one for most local residents. Panel data for 252 households show substantial improvement in welfare indicators (e.g., safer water, more durable roof material), with the greatest increases found among those with highest initial assets. A combination of regression analysis and matching estimators shows that although the poor tend to be located on the park perimeter, proximity to the park has no measureable effect on growth of productive assets. The risk for land loss among the poor was inversely correlated with proximity to the park, initial farm size, and decline in adjacent communal forests. We conclude the current disproportionate presence of poor households at the edge of the park does not signal that the park is a poverty trap. Rather, Kibale appears to provide protection against desperation sales and farm loss among those most vulnerable. PMID:21873178

  5. Correspondence analysis to evaluate the transmission of Staphylococcus aureus strains in two New York State maximum-security prisons.

    PubMed

    Befus, M; Mukherjee, D V; Herzig, C T A; Lowy, F D; Larson, E

    2017-07-01

    Prisons/jails are thought to amplify the transmission of Staphylococcus aureus (SA) particularly methicillin-resistant SA infection and colonisation. Two independently pooled cross-sectional samples of detainees being admitted or discharged from two New York State maximum-security prisons were used to explore this concept. Private interviews of participants were conducted, during which the anterior nares and oropharynx were sampled and assessed for SA colonisation. Log-binomial regression and correspondence analysis (CA) were used to evaluate the prevalence of colonisation at entry as compared with discharge. Approximately 51% of admitted (N = 404) and 41% of discharged (N = 439) female detainees were colonised with SA. Among males, 59% of those admitted (N = 427) and 49% of those discharged (N = 393) were colonised. Females had a statistically significant higher prevalence (1·26: P = 0·003) whereas males showed no significant difference (1·06; P = 0·003) in SA prevalence between entry and discharge. CA demonstrated that some strains, such as spa types t571 and t002, might have an affinity for certain mucosal sites. Contrary to our hypothesis, the prison setting did not amplify SA transmission, and CA proved to be a useful tool in describing the population structure of strains according to time and/or mucosal site.
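    For readers who want to see the mechanics, a bare-bones correspondence analysis computed directly from the singular value decomposition of standardized residuals is sketched below; the strain-by-site contingency table is invented and is not taken from the study.

    ```python
    # Bare-bones correspondence analysis of a strain-by-site contingency table,
    # computed from the SVD of standardized residuals; counts are made up.
    import numpy as np

    counts = np.array([[30, 10],    # rows: spa types t571, t002, t008 (hypothetical counts)
                       [12, 25],    # columns: nares, oropharynx
                       [20, 18]], dtype=float)

    P = counts / counts.sum()
    r, c = P.sum(axis=1), P.sum(axis=0)                 # row and column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)

    row_coords = (U * sv) / np.sqrt(r)[:, None]          # principal coordinates of rows
    col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]       # principal coordinates of columns
    inertia = sv**2 / (sv**2).sum()                      # share of total inertia per axis
    print(row_coords[:, 0], col_coords[:, 0], inertia)
    ```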

  6. Geochemical Exploration Techniques Applicable in the Search for Copper Deposits

    USGS Publications Warehouse

    Chaffee, Maurice A.

    1975-01-01

    Geochemical exploration is an important part of copper-resource evaluation. A large number of geochemical exploration techniques, both proved and untried, are available to the geochemist to use in the search for new copper deposits. Analyses of whole-rock samples have been used in both regional and local geochemical exploration surveys in the search for copper. Analyses of mineral separates, such as biotite, magnetite, and sulfides, have also been used. Analyses of soil samples are widely used in geochemical exploration, especially for localized surveys. It is important to distinguish between residual and transported soil types. Orientation studies should always be conducted prior to a geochemical investigation in a given area in order to determine the best soil horizon and the best size of soil material for sampling in that area. Silty frost boils, caliche, and desert varnish are specialized types of soil samples that might be useful sampling media. Soil gas is a new and potentially valuable geochemical sampling medium, especially in exploring for buried mineral deposits in arid regions. Gaseous products in samples of soil may be related to base-metal deposits and include mercury vapor, sulfur dioxide, hydrogen sulfide, carbon oxysulfide, carbon dioxide, hydrogen, oxygen, nitrogen, the noble gases, the halogens, and many hydrocarbon compounds. Transported materials that have been used in geochemical sampling programs include glacial float boulders, glacial till, esker gravels, stream sediments, stream-sediment concentrates, and lake sediments. Stream-sediment sampling is probably the most widely used and most successful geochemical exploration technique. Hydrogeochemical exploration programs have utilized hot- and cold-spring waters and their precipitates as well as waters from lakes, streams, and wells. Organic gel found in lakes and at stream mouths is an unproved sampling medium. Suspended material and dissolved gases in any type of water may also be useful media. Samples of ice and snow have been used for limited geochemical surveys. Both geobotanical and biogeochemical surveys have been successful in locating copper deposits in many parts of the world. Micro-organisms, including bacteria and algae, are other unproved media that should be studied. Animals can be used in geochemical-prospecting programs. Dogs have been used quite successfully to sniff out hidden and exposed sulfide minerals. Tennite mounds are commonly composed of subsurface material, but have not as yet proved to be useful in locating buried mineral deposits. Animal tissue and waste products are essentially unproved but potentially valuable sampling media. Knowledge of the location of areas where trace-element-associated diseases in animals and man are endemic as well as a better understanding of these diseases, may aid in identifying regions that are enriched in or depleted of various elements, including copper. Results of analyses of gases in the atmosphere are proving valuable in mineral-exploration surveys. Studies involving metallic compounds exhaled by plants into the atmosphere, and of particulate matter suspended in the atmosphere are reviewed these methods may become important in the future. Remote-sensing techniques are useful for making indirect measurements of geochemical responses. Two techniques applicable to geochemical exploration are neutron-activation analysis and gamma-ray spectrometry. Aerial photography is especially useful in vegetation surveys. 
Radar imagery is an unproved but potentially valuable method for use in studies of vegetation in perpetually clouded regions. With the advent of modern computers, many new techniques, such as correlation analysis, regression analysis, discriminant analysis, factor analysis, cluster analysis, trend-surface analysis, and moving-average analysis can be applied to geochemical data sets. Selective use of these techniques can provide new insights into the interpretatio

  7. [Comparison of application of Cochran-Armitage trend test and linear regression analysis for rate trend analysis in epidemiology study].

    PubMed

    Wang, D Z; Wang, C; Shen, C F; Zhang, Y; Zhang, H; Song, G D; Xue, X D; Xu, Z L; Zhang, S; Jiang, G H

    2017-05-10

    We described the time trend in the incidence rate of acute myocardial infarction (AMI) in Tianjin from 1999 to 2013 with the Cochran-Armitage trend (CAT) test and linear regression analysis, and the results were compared. Based on the actual population, the CAT test had much stronger statistical power than linear regression analysis for both the overall incidence trend and the age-specific incidence trends (Cochran-Armitage trend P value
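    The sketch below implements the Cochran-Armitage trend test from its textbook formula and contrasts it with a simple linear regression on yearly rates; the case counts and population figures are hypothetical and are not taken from the Tianjin data.

    ```python
    # Compact Cochran-Armitage trend (CAT) test for a rate trend across ordered
    # years, alongside ordinary linear regression on the yearly rates.
    import numpy as np
    from scipy import stats

    cases = np.array([120, 132, 141, 158, 170])      # hypothetical AMI cases per year
    population = np.array([500_000] * 5)             # person-years at risk
    scores = np.arange(len(cases))                   # ordered year scores 0..4

    p_bar = cases.sum() / population.sum()
    t_stat = np.sum(scores * (cases - population * p_bar))
    var_t = p_bar * (1 - p_bar) * (np.sum(population * scores**2)
                                   - np.sum(population * scores) ** 2 / population.sum())
    z = t_stat / np.sqrt(var_t)
    cat_p = 2 * stats.norm.sf(abs(z))                # two-sided CAT p-value

    rates = cases / population
    slope, intercept, r, reg_p, se = stats.linregress(scores, rates)
    print(f"CAT z={z:.2f}, p={cat_p:.3g};  linear-regression p={reg_p:.3g}")
    ```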

  8. A primer for biomedical scientists on how to execute model II linear regression analysis.

    PubMed

    Ludbrook, John

    2012-04-01

    1. There are two very different ways of executing linear regression analysis. One is Model I, when the x-values are fixed by the experimenter. The other is Model II, in which the x-values are free to vary and are subject to error. 2. I have received numerous complaints from biomedical scientists that they have great difficulty in executing Model II linear regression analysis. This may explain the results of a Google Scholar search, which showed that the authors of articles in journals of physiology, pharmacology and biochemistry rarely use Model II regression analysis. 3. I repeat my previous arguments in favour of using least products linear regression analysis for Model II regressions. I review three methods for executing ordinary least products (OLP) and weighted least products (WLP) regression analysis: (i) scientific calculator and/or computer spreadsheet; (ii) specific purpose computer programs; and (iii) general purpose computer programs. 4. Using a scientific calculator and/or computer spreadsheet, it is easy to obtain correct values for OLP slope and intercept, but the corresponding 95% confidence intervals (CI) are inaccurate. 5. Using specific purpose computer programs, the freeware computer program smatr gives the correct OLP regression coefficients and obtains 95% CI by bootstrapping. In addition, smatr can be used to compare the slopes of OLP lines. 6. When using general purpose computer programs, I recommend the commercial programs systat and Statistica for those who regularly undertake linear regression analysis and I give step-by-step instructions in the Supplementary Information as to how to use loss functions. © 2011 The Author. Clinical and Experimental Pharmacology and Physiology. © 2011 Blackwell Publishing Asia Pty Ltd.
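    A minimal sketch of ordinary least products (geometric mean) regression with a bootstrap confidence interval for the slope is given below; it is a generic implementation on synthetic data, not the smatr package or the author's spreadsheet procedure.

    ```python
    # Model II regression by ordinary least products (geometric mean / reduced
    # major axis): slope = sign(r) * sd(y)/sd(x), with a bootstrap CI for the slope.
    import numpy as np

    rng = np.random.default_rng(7)
    x_true = rng.uniform(0, 10, 100)
    x = x_true + rng.normal(scale=0.5, size=100)      # x measured with error (Model II setting)
    y = 2.0 * x_true + 1.0 + rng.normal(scale=0.5, size=100)

    def olp(x, y):
        r = np.corrcoef(x, y)[0, 1]
        slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
        return slope, y.mean() - slope * x.mean()

    slope, intercept = olp(x, y)
    boot = np.array([olp(x[idx], y[idx])[0]
                     for idx in rng.integers(0, len(x), size=(2000, len(x)))])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"OLP slope {slope:.3f} (95% CI {lo:.3f}-{hi:.3f}), intercept {intercept:.3f}")
    ```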

  9. Water quality parameter measurement using spectral signatures

    NASA Technical Reports Server (NTRS)

    White, P. E.

    1973-01-01

    Regression analysis is applied to the problem of measuring water quality parameters from remote sensing spectral signature data. The equations necessary to perform regression analysis are presented and methods of testing the strength and reliability of a regression are described. An efficient algorithm for selecting an optimal subset of the independent variables available for a regression is also presented.

  10. Pan-arctic trends in terrestrial dissolved organic matter from optical measurements

    USGS Publications Warehouse

    Mann, Paul J.; Spencer, Robert G.M.; Hernes, Peter J.; Six, Johan; Aiken, George R.; Tank, Suzanne E.; McClelland, James W.; Butler, Kenna D.; Dyda, Rachael Y.; Holmes, Robert M.

    2016-01-01

    Climate change is causing extensive warming across Arctic regions resulting in permafrost degradation, alterations to regional hydrology and shifting amounts and composition of dissolved organic matter (DOM) transported by streams and rivers. Here, we characterize the DOM composition and optical properties of the six largest Arctic rivers draining into the Arctic Ocean to examine the ability of optical measurements to provide meaningful insights into terrigenous carbon export patterns and biogeochemical cycling. The chemical composition of aquatic DOM varied with season, spring months were typified by highest lignin phenol and dissolved organic carbon (DOC) concentrations with greater hydrophobic acid content, and lower proportions of hydrophilic compounds, relative to summer and winter months. Chromophoric DOM (CDOM) spectral slope (S275–295) tracked seasonal shifts in DOM composition across river basins. Fluorescence and parallel factor analysis identified seven components across the six Arctic rivers. The ratios of “terrestrial humic-like” vs. “marine humic-like” fluorescent components co-varied with lignin monomer ratios over summer and winter months, suggesting fluorescence may provide information on the age and degradation state of riverine DOM. CDOM absorbance (a350) proved a sensitive proxy for lignin phenol concentrations across all six river basins and over the hydrograph, enabling for the first time the development of a single pan-arctic relationship between a350 and terrigenous DOC (R2 = 0.93). Combining this lignin proxy with high-resolution monitoring of a350, pan-arctic estimates of annual lignin flux were calculated to range from 156 to 185 Gg, resulting in shorter and more constrained estimates of terrigenous DOM residence times in the Arctic Ocean (spanning 7 months to 2½ years). Furthermore, multiple linear regression models incorporating both absorbance and fluorescence variables proved capable of explaining much of the variability in lignin composition across rivers and seasons. Our findings suggest that synoptic, high-resolution optical measurements can provide improved understanding of northern high-latitude organic matter cycling and flux, and prove an important technique for capturing future climate-driven changes.

  11. Pan-arctic trends in terrestrial dissolved organic matter from optical measurements

    USGS Publications Warehouse

    Mann, Paul J.; Spencer, Robert G. M.; Hernes, Peter J.; Six, Johan; Aiken, George R.; Tank, Suzanne E.; McClelland, James W.; Butler, Kenna D.; Dyda, Rachael Y.; Holmes, Robert M.

    2016-01-01

    Climate change is causing extensive warming across Arctic regions resulting in permafrost degradation, alterations to regional hydrology and shifting amounts and composition of dissolved organic matter (DOM) transported by streams and rivers. Here, we characterize the DOM composition and optical properties of the six largest Arctic rivers draining into the Arctic Ocean to examine the ability of optical measurements to provide meaningful insights into terrigenous carbon export patterns and biogeochemical cycling. The chemical composition of aquatic DOM varied with season, spring months were typified by highest lignin phenol and dissolved organic carbon (DOC) concentrations with greater hydrophobic acid content, and lower proportions of hydrophilic compounds, relative to summer and winter months. Chromophoric DOM (CDOM) spectral slope (S275–295) tracked seasonal shifts in DOM composition across river basins. Fluorescence and parallel factor analysis identified seven components across the six Arctic rivers. The ratios of “terrestrial humic-like” vs. “marine humic-like” fluorescent components co-varied with lignin monomer ratios over summer and winter months, suggesting fluorescence may provide information on the age and degradation state of riverine DOM. CDOM absorbance (a350) proved a sensitive proxy for lignin phenol concentrations across all six river basins and over the hydrograph, enabling for the first time the development of a single pan-arctic relationship between a350 and terrigenous DOC (R2 = 0.93). Combining this lignin proxy with high-resolution monitoring of a350, pan-arctic estimates of annual lignin flux were calculated to range from 156 to 185 Gg, resulting in shorter and more constrained estimates of terrigenous DOM residence times in the Arctic Ocean (spanning 7 months to 2½ years). Furthermore, multiple linear regression models incorporating both absorbance and fluorescence variables proved capable of explaining much of the variability in lignin composition across rivers and seasons. Our findings suggest that synoptic, high-resolution optical measurements can provide improved understanding of northern high-latitude organic matter cycling and flux, and prove an important technique for capturing future climate-driven changes.

  12. Multiplication factor versus regression analysis in stature estimation from hand and foot dimensions.

    PubMed

    Krishan, Kewal; Kanchan, Tanuj; Sharma, Abhilasha

    2012-05-01

    Estimation of stature is an important parameter in identification of human remains in forensic examinations. The present study is aimed to compare the reliability and accuracy of stature estimation and to demonstrate the variability in estimated stature and actual stature using multiplication factor and regression analysis methods. The study is based on a sample of 246 subjects (123 males and 123 females) from North India aged between 17 and 20 years. Four anthropometric measurements; hand length, hand breadth, foot length and foot breadth taken on the left side in each subject were included in the study. Stature was measured using standard anthropometric techniques. Multiplication factors were calculated and linear regression models were derived for estimation of stature from hand and foot dimensions. Derived multiplication factors and regression formula were applied to the hand and foot measurements in the study sample. The estimated stature from the multiplication factors and regression analysis was compared with the actual stature to find the error in estimated stature. The results indicate that the range of error in estimation of stature from regression analysis method is less than that of multiplication factor method thus, confirming that the regression analysis method is better than multiplication factor analysis in stature estimation. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
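    The following sketch reproduces the logic of the comparison on synthetic data: a multiplication factor (the mean ratio of stature to hand length) versus a fitted regression line, judged by mean absolute error; the measurements are simulated, not the study's North Indian sample.

    ```python
    # Worked comparison on synthetic data: multiplication factor versus linear
    # regression for estimating stature from hand length.
    import numpy as np

    rng = np.random.default_rng(8)
    hand_length = rng.normal(18.5, 1.0, 246)                      # cm, hypothetical
    stature = 60.0 + 5.6 * hand_length + rng.normal(0, 3.0, 246)  # cm

    mf = (stature / hand_length).mean()              # multiplication factor
    slope, intercept = np.polyfit(hand_length, stature, 1)

    est_mf = mf * hand_length
    est_reg = intercept + slope * hand_length
    print("MAE, multiplication factor:", round(np.mean(np.abs(est_mf - stature)), 2))
    print("MAE, regression:           ", round(np.mean(np.abs(est_reg - stature)), 2))
    ```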

  13. Seasonal differences in the subjective assessment of outdoor thermal conditions and the impact of analysis techniques on the obtained results

    NASA Astrophysics Data System (ADS)

    Kántor, Noémi; Kovács, Attila; Takács, Ágnes

    2016-11-01

    Wide research attention has been paid in the last two decades to the thermal comfort conditions of different outdoor and semi-outdoor urban spaces. Field studies were conducted in a wide range of geographical regions in order to investigate the relationship between the thermal sensation of people and thermal comfort indices. Researchers found that the original threshold values of these indices did not describe precisely the actual thermal sensation patterns of subjects, and they reported neutral temperatures that vary among nations and with time of the year. For that reason, thresholds of some objective indices were rescaled and new thermal comfort categories were defined. This research investigates the outdoor thermal perception patterns of Hungarians regarding the Physiologically Equivalent Temperature ( PET) index, based on more than 5800 questionnaires. The surveys were conducted in the city of Szeged on 78 days in spring, summer, and autumn. Various, frequently applied analysis approaches (simple descriptive technique, regression analysis, and probit models) were adopted to reveal seasonal differences in the thermal assessment of people. Thermal sensitivity and neutral temperatures were found to be significantly different, especially between summer and the two transient seasons. Challenges of international comparison are also emphasized, since the results prove that neutral temperatures obtained through different analysis techniques may be considerably different. The outcomes of this study underline the importance of the development of standard measurement and analysis methodologies in order to make future studies comprehensible, hereby facilitating the broadening of the common scientific knowledge about outdoor thermal comfort.

  14. Women's empowerment and the intention to continue the practice of female genital cutting in Egypt.

    PubMed

    Afifi, Mustafa

    2009-03-01

    The study aimed to (dis)prove the association of the level of women's empowerment with their future intention to perpetuate female genital cutting for their daughters. In a nationally representative community-based sample of 14,393 currently-married women in Egypt, the level of empowerment, intention to continue the practice, and other socio-demographic variables were collected in the 2000 Egypt Demographic and Health Survey. Secondary in-depth analysis was conducted on data downloaded from the MEASURE Demographic and Health Surveys (MEASURE DHS) website. About 14% of the women intended to discontinue the practice. Twenty-six percent of the women were empowered in all household decisions. Levels of women's empowerment adjusted for age, residence, education, interaction between empowerment and education, work status, and female genital cutting status of currently-married women were entered in six logistic regression models in a sequential way. In the last model, those with high levels of empowerment and education were 8.06 times more likely not to intend to perpetuate female genital cutting for their daughters than low-empowered, low-educated women.

  15. Calibration sets selection strategy for the construction of robust PLS models for prediction of biodiesel/diesel blends physico-chemical properties using NIR spectroscopy

    NASA Astrophysics Data System (ADS)

    Palou, Anna; Miró, Aira; Blanco, Marcelo; Larraz, Rafael; Gómez, José Francisco; Martínez, Teresa; González, Josep Maria; Alcalà, Manel

    2017-06-01

    Even though the feasibility of using near infrared (NIR) spectroscopy combined with partial least squares (PLS) regression for prediction of physico-chemical properties of biodiesel/diesel blends has been widely demonstrated, inclusion in the calibration sets of the whole variability of diesel samples from diverse production origins still remains an important challenge when constructing the models. This work presents a useful strategy for the systematic selection of calibration sets of samples of biodiesel/diesel blends from diverse origins, based on a binary code, principal components analysis (PCA) and the Kennard-Stone algorithm. Results show that using this methodology, the models can keep their robustness over time. PLS calculations have been done using specialized chemometric software as well as the software of the NIR instrument installed in plant, and both produced RMSEP values below the reproducibility of the reference methods. The models have proved suitable for on-line simultaneous determination of seven properties: density, cetane index, fatty acid methyl esters (FAME) content, cloud point, boiling point at 95% of recovery, flash point and sulphur.
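    A sketch of the core selection step, the Kennard-Stone algorithm followed by a PLS model, is given below; the spectra are random stand-ins, the target variable is arbitrary, and the implementation is generic rather than the authors' plant software.

    ```python
    # Kennard-Stone selection of a calibration subset from pooled spectra,
    # followed by a PLS model; the NIR matrix here is random stand-in data.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def kennard_stone(X, n_select):
        """Pick n_select rows of X that are maximally spread in feature space."""
        dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        selected = list(np.unravel_index(np.argmax(dist), dist.shape))  # two most distant samples
        while len(selected) < n_select:
            remaining = [i for i in range(len(X)) if i not in selected]
            # add the sample whose nearest selected neighbour is farthest away
            next_idx = max(remaining, key=lambda i: dist[i, selected].min())
            selected.append(next_idx)
        return np.array(selected)

    rng = np.random.default_rng(9)
    X = rng.normal(size=(120, 200))                              # 120 blend spectra, 200 wavelengths
    y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=120)   # e.g. FAME content (arbitrary)

    cal = kennard_stone(X, 60)
    val = np.setdiff1d(np.arange(120), cal)
    pls = PLSRegression(n_components=5).fit(X[cal], y[cal])
    rmsep = np.sqrt(np.mean((pls.predict(X[val]).ravel() - y[val]) ** 2))
    print("RMSEP on the non-selected samples:", round(rmsep, 3))
    ```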

  16. The association between higher body mass index and poor school performance in high school students.

    PubMed

    Tonetti, L; Fabbri, M; Filardi, M; Martoni, M; Natale, V

    2016-12-01

    This study aimed to examine the association between body mass index (BMI) and school performance in high school students by controlling for relevant mediators such as sleep quality, sleep duration and socioeconomic status. Thirty-seven high school students (mean age: 18.16 ± 0.44 years) attending the same school type, i.e. 'liceo scientifico' (science-based high school), were enrolled. Students' self-reported weight and height were used to calculate BMI. Participants wore an actigraph to objectively assess the quality and duration of sleep. School performance was assessed through the actual grade obtained at the final school-leaving exam, in which higher grades indicate higher performance. BMI, get-up time, mean motor activity, wake after sleep onset and number of awakenings were negatively correlated with the grade, while sleep efficiency was positively correlated. When performing a multiple regression analysis, BMI proved the only significant (negative) predictor of grade. When controlling for sleep quality, sleep duration and socioeconomic status, a higher BMI is associated with a poorer school performance in high school students. © 2015 World Obesity Federation.

  17. Conceptual and statistical problems associated with the use of diversity indices in ecology.

    PubMed

    Barrantes, Gilbert; Sandoval, Luis

    2009-09-01

    Diversity indices, particularly the Shannon-Wiener index, have been used extensively in analyzing patterns of diversity at different geographic and ecological scales. These indices have serious conceptual and statistical problems, which make comparisons of species richness or species abundances across communities nearly impossible. There is often no single statistical method that retains all the information needed to answer even a simple question. However, multivariate analyses, such as cluster analyses or multiple regressions, could be used instead of diversity indices. More complex multivariate analyses, such as Canonical Correspondence Analysis, provide very valuable information on environmental variables associated with the presence and abundance of the species in a community. In addition, particular hypotheses associated with changes in species richness across localities, or changes in the abundance of one species or a group of species, can be tested using univariate, bivariate, and/or rarefaction statistical tests. The rarefaction method has proved to be robust for standardizing all samples to a common size. Even the simplest method, such as reporting the number of species per taxonomic category, possibly provides more information than a diversity index value.
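    To make the contrast concrete, the sketch below computes the Shannon-Wiener index and rarefied species richness for two invented abundance vectors; rarefaction standardizes both samples to a common number of individuals, which is the robustness property mentioned above.

    ```python
    # Shannon-Wiener index versus rarefaction (expected species richness in a
    # standardized subsample) for two invented abundance vectors.
    import numpy as np
    from scipy.special import gammaln

    def shannon(counts):
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def rarefied_richness(counts, n):
        """Expected number of species in a random subsample of n individuals."""
        N = counts.sum()
        log_choose = lambda a, b: gammaln(a + 1) - gammaln(b + 1) - gammaln(a - b + 1)
        # P(species i absent from the subsample) = C(N - N_i, n) / C(N, n)
        absent = np.exp(log_choose(N - counts, n) - log_choose(N, n))
        return np.sum(1.0 - absent)

    site_a = np.array([50, 30, 10, 5, 3, 2])
    site_b = np.array([200, 150, 90, 40, 20, 10, 5, 5, 3, 2])
    n = 80                                            # common subsample size, below both totals
    print("Shannon:", shannon(site_a), shannon(site_b))
    print(f"Rarefied richness at n={n}:",
          rarefied_richness(site_a, n), rarefied_richness(site_b, n))
    ```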

  18. Quality by design (QbD), Process Analytical Technology (PAT), and design of experiment applied to the development of multifunctional sunscreens.

    PubMed

    Peres, Daniela D'Almeida; Ariede, Maira Bueno; Candido, Thalita Marcilio; de Almeida, Tania Santos; Lourenço, Felipe Rebello; Consiglieri, Vladi Olga; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Baby, André Rolim

    2017-02-01

    Multifunctional formulations are of great importance to ensure better skin protection from harm caused by ultraviolet (UV) radiation. Despite the advantages of Quality by Design and Process Analytical Technology approaches to the development and optimization of new products, we found in the literature only a few studies concerning their applications in the cosmetic product industry. Thus, in this research work, we applied the QbD and PAT approaches to the development of multifunctional sunscreens containing bemotrizinol, ethylhexyl triazone, and ferulic acid. In addition, a UV transmittance method was applied to assess qualitative and quantitative critical quality attributes of sunscreens using chemometric analyses. Linear discriminant analysis allowed the classification of unknown formulations, which is useful for the investigation of counterfeiting and adulteration. Simultaneous quantification of ethylhexyl triazone, bemotrizinol, and ferulic acid present in the formulations was performed using PLS regression. This design allowed us to evaluate the compounds in isolation and in combination and to prove the antioxidant action of ferulic acid as well as its contribution to the sunscreen action, since the presence of this component increased the in vitro antioxidant activity by 90%.

  19. Clinical and radiographic assessment of various predictors for healing outcome 1 year after periapical surgery.

    PubMed

    von Arx, Thomas; Jensen, Simon Storgård; Hänni, Stefan

    2007-02-01

    This clinical study prospectively evaluated the influence of various predictors on healing outcome 1 year after periapical surgery. The study cohort included 194 teeth in an equal number of patients. Three teeth were lost for the follow-up (1.5% drop-out rate). Clinical and radiographic measures were used to determine the healing outcome. For statistical analysis, results were dichotomized (healed versus nonhealed). The overall success rate was 83.8% (healed cases). The only individual predictors to prove significant for the outcome were pain at initial examination (p=0.030) and other clinical signs or symptoms at initial examination (p=0.042), meaning that such teeth had lower healing rates 1 year after periapical surgery compared with teeth without such signs or symptoms. Logistic regression revealed that pain at initial examination (odds ratio=2.59, confidence interval=1.2-5.6, p=0.04) was the only predictor reaching significance. Several predictors almost reached statistical significance: lesion size (p=0.06), retrofilling material (p=0.06), and postoperative healing course (p=0.06).

  20. Exploring the impact of mentoring functions on job satisfaction and organizational commitment of new staff nurses

    PubMed Central

    2010-01-01

    Background Although previous studies proved that the implementation of a mentoring program is beneficial for enhancing nursing skills and attitudes, few researchers have explored the impact of mentoring functions on the job satisfaction and organizational commitment of new nurses. In this research, we aimed to examine the effects of mentoring functions on the job satisfaction and organizational commitment of new nurses in Taiwan's hospitals. Methods We employed self-administered questionnaires to collect research data and selected new nurses from three regional hospitals in Taiwan as the sample. In all, 306 nurse samples were obtained. We used multiple regression analysis to test the impact of the mentoring functions. Results Results revealed that the career development and role modeling functions have positive effects on the job satisfaction and organizational commitment of new nurses; however, the psychosocial support function was incapable of providing an adequate explanation for these work outcomes. Conclusion It is suggested in this study that nurse managers should improve the career development and role modeling functions of mentoring in order to enhance the job satisfaction and organizational commitment of new nurses. PMID:20712873

  1. [Measurement of cognitive constriction in suicide notes].

    PubMed

    Heinrich, Monika; Berzlanovich, Andrea; Willinger, Ulrike; Eisenwort, Brigitte

    2008-01-01

    The aim of this paper was to quantify the amount of cognitive constriction in German-language suicide notes by studying quantitative psycholinguistic parameters of the texts. This should give a better understanding of presuicidal events and encourage improvement in the field of suicide prevention and crisis intervention. The study is based on letters of the "Vienna Corpus of Suicide Notes". To test the various hypotheses, factor analysis, a number of regression analyses, and the general linear model were applied, in addition to descriptive methods. The 16 parameters could be reduced to five factors of cognitive constriction: writing style, word usage, dichotomy, length, and grammatical correctness of the suicide notes. Regarding writing style, the highest values of cognitive constriction were found among women (p=0.005), young persons (p ≤ 0.000), in short letters (p=0.027) and if psychological problems were the motive for suicide (p=0.020). The discovery site of the letters (p=0.002) was important as well. The construct of cognitive constriction is a multidimensional and complex phenomenon. Therefore, the quantification must include variables relating to both the persons and the texts.

  2. The impact of communicating information about air pollution events on public health.

    PubMed

    McLaren, J; Williams, I D

    2015-12-15

    Short-term exposure to air pollution has been associated with exacerbation of asthma and chronic obstructive pulmonary disease (COPD). This study investigated the relationship between emergency hospital admissions for asthma, COPD and episodes of poor air quality in an English city (Southampton) from 2008-2013. The city's council provides a forecasting service for poor air quality to individuals with respiratory disease to reduce preventable admissions to hospital and this has been evaluated. Trends in nitrogen dioxide, ozone and particulate matter concentrations were related to hospital admissions data using regression analysis. The impacts of air quality on emergency admissions were quantified using the relative risks associated with each pollutant. Seasonal and weekly trends were apparent for both air pollution and hospital admissions, although there was a weak relationship between the two. The air quality forecasting service proved ineffective at reducing hospital admissions. Improvements to the health forecasting service are necessary to protect the health of susceptible individuals, as there is likely to be an increasing need for such services in the future. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.

    PubMed

    Ritz, Christian; Van der Vliet, Leana

    2009-09-01

    The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions-variance homogeneity and normality-that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable only is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
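    The sketch below applies the two remedies discussed, a Box-Cox transformation of the response and a nonlinear fit under a Poisson likelihood, to a synthetic concentration-response data set; it is illustrative only, omits a zero-concentration control for simplicity, and does not reproduce the authors' test protocols or species data.

    ```python
    # Two remedies for nonnormality/variance heterogeneity in a sublethal
    # concentration-response fit: Box-Cox transformation and a Poisson-likelihood
    # nonlinear regression, on synthetic count data.
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(10)
    conc = np.repeat([0.25, 0.5, 1.0, 2.0, 4.0, 8.0], 5)          # test concentrations

    def loglogistic(c, top, ec50, slope):
        return top / (1.0 + (c / ec50) ** slope)

    counts = rng.poisson(loglogistic(conc, top=100.0, ec50=2.0, slope=2.0))

    # Remedy 1: Box-Cox transformation (requires a strictly positive response)
    transformed, lam = stats.boxcox(counts + 1.0)
    print("estimated Box-Cox lambda:", round(lam, 2))

    # Remedy 2: fit the log-logistic model by maximizing the Poisson likelihood
    def neg_loglik(params):
        log_top, log_ec50, slope = params
        mu = loglogistic(conc, np.exp(log_top), np.exp(log_ec50), slope)
        return -np.sum(stats.poisson.logpmf(counts, np.clip(mu, 1e-9, None)))

    fit = optimize.minimize(neg_loglik, x0=[np.log(counts.max() + 1.0), 0.0, 1.0],
                            method="Nelder-Mead")
    top, ec50, slope = np.exp(fit.x[0]), np.exp(fit.x[1]), fit.x[2]
    print("top, EC50, slope:", np.round([top, ec50, slope], 2))
    ```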

  4. On the Viability of Diffusion MRI-Based Microstructural Biomarkers in Ischemic Stroke

    PubMed Central

    Boscolo Galazzo, Ilaria; Brusini, Lorenza; Obertino, Silvia; Zucchelli, Mauro; Granziera, Cristina; Menegaz, Gloria

    2018-01-01

    Recent tract-based analyses provided evidence for the exploitability of 3D-SHORE microstructural descriptors derived from diffusion MRI (dMRI) in revealing white matter (WM) plasticity. In this work, we focused on the main open issues left: (1) the comparative analysis with respect to classical tensor-derived indices, i.e., Fractional Anisotropy (FA) and Mean Diffusivity (MD); and (2) the ability to detect plasticity processes in gray matter (GM). Although signal modeling in GM is still largely unexplored, we investigated the sensitivity of these descriptors to stroke-induced microstructural modifications occurring in the contralateral hemisphere. A more complete picture could provide hints for investigating the interplay of GM and WM modulations. Ten stroke patients and ten age/gender-matched healthy controls were enrolled in the study and underwent diffusion spectrum imaging (DSI). Acquisitions at three and two time points (tp) were performed on patients and controls, respectively. For all subjects and acquisitions, FA and MD were computed along with 3D-SHORE-based indices [Generalized Fractional Anisotropy (GFA), Propagator Anisotropy (PA), Return To the Axis Probability (RTAP), Return To the Plane Probability (RTPP), and Mean Square Displacement (MSD)]. Tract-based analysis involving the cortical, subcortical and transcallosal motor networks and region-based analysis in GM were successively performed, focusing on the hemisphere contralateral to the stroke. Reproducibility of all the indices on both WM and GM was quantitatively proved on controls. For the tract-based analysis, longitudinal group comparisons revealed the highest significant differences across the subcortical and transcallosal networks for all the indices. The optimal regression model for predicting the clinical motor outcome at tp3 included GFA, PA, RTPP, and MSD in the subcortical network in combination with the main clinical information at baseline. Region-based analysis in the contralateral GM highlighted the ability of anisotropy indices to discriminate between groups mainly at tp1, while diffusivity indices appeared to be altered at tp2. 3D-SHORE indices proved to be suitable in probing plasticity in both WM and GM, further confirming their viability as a novel family of biomarkers in ischemic stroke in WM and revealing their potential exploitability in GM. Their combination with tensor-derived indices can provide more detailed insight into the different tissue modulations related to stroke pathology. PMID:29515362

  5. Using Robust Standard Errors to Combine Multiple Regression Estimates with Meta-Analysis

    ERIC Educational Resources Information Center

    Williams, Ryan T.

    2012-01-01

    Combining multiple regression estimates with meta-analysis has continued to be a difficult task. A variety of methods have been proposed and used to combine multiple regression slope estimates with meta-analysis, however, most of these methods have serious methodological and practical limitations. The purpose of this study was to explore the use…

  6. A Quality Assessment Tool for Non-Specialist Users of Regression Analysis

    ERIC Educational Resources Information Center

    Argyrous, George

    2015-01-01

    This paper illustrates the use of a quality assessment tool for regression analysis. It is designed for non-specialist "consumers" of evidence, such as policy makers. The tool provides a series of questions such consumers of evidence can ask to interrogate regression analysis, and is illustrated with reference to a recent study published…

  7. A retrospective analysis to identify the factors affecting infection in patients undergoing chemotherapy.

    PubMed

    Park, Ji Hyun; Kim, Hyeon-Young; Lee, Hanna; Yun, Eun Kyoung

    2015-12-01

    This study compares the performance of the logistic regression and decision tree analysis methods for assessing the risk factors for infection in cancer patients undergoing chemotherapy. The subjects were 732 cancer patients who were receiving chemotherapy at K university hospital in Seoul, Korea. The data were collected between March 2011 and February 2013 and were processed for descriptive analysis, logistic regression and decision tree analysis using the IBM SPSS Statistics 19 and Modeler 15.1 programs. The most common risk factors for infection in cancer patients receiving chemotherapy were identified as alkylating agents, vinca alkaloid and underlying diabetes mellitus. The logistic regression model achieved a sensitivity of 66.7% and a specificity of 88.9%, whereas the decision tree analysis achieved a sensitivity of 55.0% and a specificity of 89.0%. The overall classification accuracy was 88.0% for logistic regression and 87.2% for the decision tree analysis. The logistic regression analysis therefore showed a higher degree of sensitivity and classification accuracy, and is concluded to be the more effective and useful method for establishing an infection prediction model for patients undergoing chemotherapy. Copyright © 2015 Elsevier Ltd. All rights reserved.
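
    For readers who want to reproduce this kind of comparison outside SPSS, the sketch below fits a logistic regression and a small decision tree to simulated data and reports sensitivity, specificity, and accuracy on a held-out split. The predictor names mirror the risk factors mentioned above, but the data, coefficients, and tree depth are invented for illustration and are not the study's dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, accuracy_score

rng = np.random.default_rng(0)
n = 732
# Hypothetical binary predictors: alkylating agent, vinca alkaloid, diabetes mellitus.
X = rng.integers(0, 2, size=(n, 3)).astype(float)
logit = -2.0 + 1.2 * X[:, 0] + 0.9 * X[:, 1] + 0.8 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("decision tree", DecisionTreeClassifier(max_depth=3, random_state=0))]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    print(f"{name}: sensitivity={sensitivity:.3f}, specificity={specificity:.3f}, "
          f"accuracy={accuracy_score(y_te, pred):.3f}")
```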

  8. Visual grading characteristics and ordinal regression analysis during optimisation of CT head examinations.

    PubMed

    Zarb, Francis; McEntee, Mark F; Rainford, Louise

    2015-06-01

    To evaluate visual grading characteristics (VGC) and ordinal regression analysis during head CT optimisation as a potential alternative to visual grading assessment (VGA), traditionally employed to score anatomical visualisation. Patient images (n = 66) were obtained using current and optimised imaging protocols from two CT suites: a 16-slice scanner at the national Maltese centre for trauma and a 64-slice scanner in a private centre. Local resident radiologists (n = 6) performed VGA followed by VGC and ordinal regression analysis. VGC alone indicated that the optimised protocols provided image quality similar to that of the current protocols. Ordinal logistic regression analysis provided an in-depth evaluation, criterion by criterion, allowing the selective implementation of the protocols. The local radiology review panel supported the implementation of optimised protocols for brain CT examinations (including trauma) in one centre, achieving radiation dose reductions ranging from 24 % to 36 %. In the second centre, a 29 % reduction in radiation dose was achieved for follow-up cases. The combined use of VGC and ordinal logistic regression analysis led to clinical decisions being taken on the implementation of the optimised protocols. This improved method of image quality analysis provided the evidence to support imaging protocol optimisation, resulting in significant radiation dose savings. • There is a need for scientifically based image quality evaluation during CT optimisation. • VGC and ordinal regression analysis in combination led to better informed clinical decisions. • VGC and ordinal regression analysis led to dose reductions without compromising diagnostic efficacy.
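
    As a rough illustration of the ordinal regression step, the sketch below fits an ordered (cumulative) logit model of image-quality scores on protocol and a dose proxy using statsmodels' OrderedModel. The scores, covariates, and cut-points are simulated placeholders, not the Maltese VGA data, and the coding of the protocol variable is an assumption made for the example.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(7)
n = 396                                   # e.g., 66 images scored by 6 observers
protocol = rng.integers(0, 2, n)          # 0 = current, 1 = optimised (hypothetical coding)
dose = rng.normal(0, 1, n)                # standardised dose proxy

# Simulated ordered image-quality scores (1-4) driven weakly by the covariates.
latent = 0.1 * protocol + 0.5 * dose + rng.logistic(0, 1, n)
score = pd.Series(pd.cut(latent, bins=[-np.inf, -1, 0.5, 2, np.inf], labels=[1, 2, 3, 4]))

X = pd.DataFrame({"protocol": protocol, "dose": dose})
model = OrderedModel(score, X, distr="logit")   # cumulative logit with estimated thresholds
res = model.fit(method="bfgs", disp=False)
print(res.summary())   # a non-significant protocol term supports equivalent image quality
```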

  9. REGRESSION ANALYSIS OF SEA-SURFACE-TEMPERATURE PATTERNS FOR THE NORTH PACIFIC OCEAN.

    DTIC Science & Technology

    SEA WATER, *SURFACE TEMPERATURE, *OCEANOGRAPHIC DATA, PACIFIC OCEAN, REGRESSION ANALYSIS, STATISTICAL ANALYSIS, UNDERWATER EQUIPMENT, DETECTION, UNDERWATER COMMUNICATIONS, DISTRIBUTION, THERMAL PROPERTIES, COMPUTERS.

  10. The Colorectal Cancer Mortality-to-Incidence Ratio as an Indicator of Global Cancer Screening and Care

    PubMed Central

    Sunkara, Vasu; Hébert, James R.

    2015-01-01

    BACKGROUND Disparities in cancer screening, incidence, treatment, and survival are worsening globally. The mortality-to-incidence ratio (MIR) has been used previously to evaluate such disparities. METHODS The MIR for colorectal cancer is calculated for all Organisation for Economic Cooperation and Development (OECD) countries using the 2012 GLOBOCAN incidence and mortality statistics. Health system rankings were obtained from the World Health Organization. Two linear regression models were fit with the MIR as the dependent variable and health system ranking as the independent variable; one included all countries and one model had the “divergents” removed. RESULTS The regression model for all countries explained 24% of the total variance in the MIR. Nine countries were found to have regression-calculated MIRs that differed from the actual MIR by >20%. Countries with lower-than-expected MIRs were found to have strong national health systems characterized by formal colorectal cancer screening programs. Conversely, countries with higher-than-expected MIRs lack screening programs. When these divergent points were removed from the data set, the recalculated regression model explained 60% of the total variance in the MIR. CONCLUSIONS The MIR proved useful for identifying disparities in cancer screening and treatment internationally. It has potential as an indicator of the long-term success of cancer surveillance programs and may be extended to other cancer types for these purposes. PMID:25572676

  11. The colorectal cancer mortality-to-incidence ratio as an indicator of global cancer screening and care.

    PubMed

    Sunkara, Vasu; Hébert, James R

    2015-05-15

    Disparities in cancer screening, incidence, treatment, and survival are worsening globally. The mortality-to-incidence ratio (MIR) has been used previously to evaluate such disparities. The MIR for colorectal cancer is calculated for all Organisation for Economic Cooperation and Development (OECD) countries using the 2012 GLOBOCAN incidence and mortality statistics. Health system rankings were obtained from the World Health Organization. Two linear regression models were fit with the MIR as the dependent variable and health system ranking as the independent variable; one included all countries and one model had the "divergents" removed. The regression model for all countries explained 24% of the total variance in the MIR. Nine countries were found to have regression-calculated MIRs that differed from the actual MIR by >20%. Countries with lower-than-expected MIRs were found to have strong national health systems characterized by formal colorectal cancer screening programs. Conversely, countries with higher-than-expected MIRs lack screening programs. When these divergent points were removed from the data set, the recalculated regression model explained 60% of the total variance in the MIR. The MIR proved useful for identifying disparities in cancer screening and treatment internationally. It has potential as an indicator of the long-term success of cancer surveillance programs and may be extended to other cancer types for these purposes. © 2015 American Cancer Society.
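
    A minimal sketch of the two-step modelling described in both records above: fit an ordinary least-squares line of MIR on health system ranking, flag countries whose regression-calculated MIR differs from the observed MIR by more than 20%, and refit without them. The values below are fabricated placeholders, not the GLOBOCAN or WHO figures used in the study.

```python
import numpy as np

# Hypothetical (health system ranking, MIR) pairs standing in for OECD countries.
rank = np.array([1, 5, 9, 12, 17, 20, 25, 30, 34, 38, 45, 52, 60, 70], dtype=float)
mir = np.array([0.30, 0.32, 0.35, 0.33, 0.40, 0.62, 0.42, 0.45, 0.27, 0.50,
                0.55, 0.58, 0.60, 0.66])

def fit_ols(x, y):
    slope, intercept = np.polyfit(x, y, 1)
    pred = slope * x + intercept
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return slope, intercept, pred, 1.0 - ss_res / ss_tot

slope, intercept, pred, r2 = fit_ols(rank, mir)
print(f"all countries: R^2 = {r2:.2f}")

# "Divergent" countries: regression-calculated MIR differs from observed MIR by >20%.
divergent = np.abs(pred - mir) / mir > 0.20
_, _, _, r2_refit = fit_ols(rank[~divergent], mir[~divergent])
print(f"{divergent.sum()} divergent countries removed, refit R^2 = {r2_refit:.2f}")
```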

  12. Least-Squares Data Adjustment with Rank-Deficient Data Covariance Matrices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, J.G.

    2011-07-01

    A derivation of the linear least-squares adjustment formulae is required that avoids the assumption that the covariance matrix of prior parameters can be inverted. Possible proofs are of several kinds, including: (i) extension of standard results for the linear regression formulae, and (ii) minimization by differentiation of a quadratic form of the deviations in parameters and responses. In this paper, the least-squares adjustment equations are derived in both these ways, while explicitly assuming that the covariance matrix of prior parameters is singular. It will be proved that the solutions are unique and that, contrary to statements that have appeared in the literature, the least-squares adjustment problem is not ill-posed. No modification is required to the adjustment formulae that have been used in the past in the case of a singular covariance matrix for the priors. In conclusion: The linear least-squares adjustment formula that has been used in the past is valid in the case of a singular covariance matrix of prior parameters. Furthermore, it provides a unique solution. Statements in the literature to the effect that the problem is ill-posed are wrong. No regularization of the problem is required. This has been proved in the present paper by two methods, while explicitly assuming that the covariance matrix of prior parameters is singular: (i) extension of standard results for the linear regression formulae, and (ii) minimization by differentiation of a quadratic form of the deviations in parameters and responses. No modification is needed to the adjustment formulae that have been used in the past. (author)
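
    One way to see why a singular prior covariance causes no difficulty is that the usual adjustment formula only ever inverts the response-space matrix, never the prior covariance itself. The sketch below is a generic illustration of that formula with a deliberately rank-deficient prior covariance; the numbers and the linear sensitivity matrix are invented for the example and are not taken from the paper.

```python
import numpy as np

# Prior parameter estimate and a deliberately singular (rank-1) prior covariance.
x0 = np.array([1.0, 2.0])
C = np.array([[1.0, 1.0],
              [1.0, 1.0]])           # rank 1, not invertible

# Linear response model y = S x with its own measurement covariance V.
S = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
V = 0.05 * np.eye(3)
y = np.array([1.3, 2.4, 3.9])        # measured responses

# Least-squares / GLS adjustment: x' = x0 + C S^T (S C S^T + V)^{-1} (y - S x0).
# Only (S C S^T + V) is inverted, so a singular C poses no problem.
K = C @ S.T @ np.linalg.inv(S @ C @ S.T + V)
x_adj = x0 + K @ (y - S @ x0)
C_adj = C - K @ S @ C                # updated parameter covariance
print("adjusted parameters:", x_adj)
print("adjusted covariance:\n", C_adj)
```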

  13. Random regression models using different functions to model milk flow in dairy cows.

    PubMed

    Laureano, M M M; Bignardi, A B; El Faro, L; Cardoso, V L; Tonhati, H; Albuquerque, L G

    2014-09-12

    We analyzed 75,555 test-day milk flow records from 2175 primiparous Holstein cows that calved between 1997 and 2005. Milk flow was obtained by dividing the mean milk yield (kg) of the 3 daily milkings by the total milking time (min) and was expressed as kg/min. Milk flow was grouped into 43 weekly classes. The analyses were performed using single-trait random regression models that included direct additive genetic, permanent environmental, and residual random effects. In addition, the contemporary group and linear and quadratic effects of cow age at calving were included as fixed effects. A fourth-order orthogonal Legendre polynomial of days in milk was used to model the mean trend in milk flow. The additive genetic and permanent environmental covariance functions were estimated using random regression Legendre polynomials and B-spline functions of days in milk. The model using a third-order Legendre polynomial for additive genetic effects and a sixth-order polynomial for permanent environmental effects, which contained 7 residual classes, proved to be the most adequate to describe variation in milk flow, and was also the most parsimonious. The heritability of milk flow estimated by the most parsimonious model was of moderate to high magnitude.
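
    The fixed mean trend described above is a regression on Legendre polynomials of days in milk, and numpy can build that basis directly once days in milk are rescaled to [-1, 1]. The sketch fits only a fourth-order mean curve to simulated weekly milk-flow records; it does not attempt the random additive-genetic or permanent-environmental effects, which require mixed-model software, and all values are invented.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(8)
week = rng.integers(1, 44, 2000)                        # weekly classes 1-43
dim = week * 7.0                                        # approximate days in milk
t = 2 * (dim - dim.min()) / (dim.max() - dim.min()) - 1 # rescale to [-1, 1]

# Simulated milk flow (kg/min) with a mild lactation-curve shape plus noise.
flow = 1.8 + 0.4 * t - 0.5 * t ** 2 + rng.normal(0, 0.3, t.size)

# Fourth-order Legendre design matrix for the fixed mean trend.
Z = legendre.legvander(t, 4)
coef, *_ = np.linalg.lstsq(Z, flow, rcond=None)
fitted = Z @ coef
print("mean-trend coefficients:", coef.round(3))
```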

  14. Anti-correlated Networks, Global Signal Regression, and the Effects of Caffeine in Resting-State Functional MRI

    PubMed Central

    Wong, Chi Wah; Olafsson, Valur; Tal, Omer; Liu, Thomas T.

    2012-01-01

    Resting-state functional connectivity magnetic resonance imaging is proving to be an essential tool for the characterization of functional networks in the brain. Two of the major networks that have been identified are the default mode network (DMN) and the task positive network (TPN). Although prior work indicates that these two networks are anti-correlated, the findings are controversial because the anti-correlations are often found only after the application of a pre-processing step, known as global signal regression, that can produce artifactual anti-correlations. In this paper, we show that, for subjects studied in an eyes-closed rest state, caffeine can significantly enhance the detection of anti-correlations between the DMN and TPN without the need for global signal regression. In line with these findings, we find that caffeine also leads to widespread decreases in connectivity and global signal amplitude. Using a recently introduced geometric model of global signal effects, we demonstrate that these decreases are consistent with the removal of an additive global signal confound. In contrast to the effects observed in the eyes-closed rest state, caffeine did not lead to significant changes in global functional connectivity in the eyes-open rest state. PMID:22743194
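
    For readers unfamiliar with the pre-processing step discussed above, global signal regression simply projects the mean whole-brain time course out of every voxel time series before computing correlations. A bare-bones sketch with simulated data follows; the array shapes and the strength of the shared fluctuation are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)
n_timepoints, n_voxels = 200, 500
data = rng.standard_normal((n_timepoints, n_voxels))
data += 0.5 * rng.standard_normal((n_timepoints, 1))    # shared (global) fluctuation

# Global signal regression: regress the mean time course (plus an intercept)
# out of every voxel time series and keep the residuals.
global_signal = data.mean(axis=1, keepdims=True)
design = np.hstack([np.ones((n_timepoints, 1)), global_signal])
beta, *_ = np.linalg.lstsq(design, data, rcond=None)
residuals = data - design @ beta

# Correlations after GSR are shifted toward negative values, which is why
# anti-correlations found only after this step are treated with caution.
print("mean pairwise correlation before GSR:",
      np.corrcoef(data.T)[np.triu_indices(n_voxels, 1)].mean().round(3))
print("mean pairwise correlation after GSR:",
      np.corrcoef(residuals.T)[np.triu_indices(n_voxels, 1)].mean().round(3))
```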

  15. The process and utility of classification and regression tree methodology in nursing research

    PubMed Central

    Kuhn, Lisa; Page, Karen; Ward, John; Worrall-Carter, Linda

    2014-01-01

    Aim: This paper presents a discussion of classification and regression tree analysis and its utility in nursing research. Background: Classification and regression tree analysis is an exploratory research method used to illustrate associations between variables not suited to traditional regression analysis. Complex interactions are demonstrated between covariates and variables of interest in inverted tree diagrams. Design: Discussion paper. Data sources: English language literature was sourced from eBooks, Medline Complete and CINAHL Plus databases, Google and Google Scholar, hard copy research texts and retrieved reference lists for terms including classification and regression tree* and derivatives and recursive partitioning from 1984–2013. Discussion: Classification and regression tree analysis is an important method used to identify previously unknown patterns amongst data. Whilst there are several reasons to embrace this method as a means of exploratory quantitative research, issues regarding quality of data as well as the usefulness and validity of the findings should be considered. Implications for Nursing Research: Classification and regression tree analysis is a valuable tool to guide nurses to reduce gaps in the application of evidence to practice. With the ever-expanding availability of data, it is important that nurses understand the utility and limitations of the research method. Conclusion: Classification and regression tree analysis is an easily interpreted method for modelling interactions between health-related variables that would otherwise remain obscured. Knowledge is presented graphically, providing insightful understanding of complex and hierarchical relationships in an accessible and useful way to nursing and other health professions. PMID:24237048

  16. The process and utility of classification and regression tree methodology in nursing research.

    PubMed

    Kuhn, Lisa; Page, Karen; Ward, John; Worrall-Carter, Linda

    2014-06-01

    This paper presents a discussion of classification and regression tree analysis and its utility in nursing research. Classification and regression tree analysis is an exploratory research method used to illustrate associations between variables not suited to traditional regression analysis. Complex interactions are demonstrated between covariates and variables of interest in inverted tree diagrams. Discussion paper. English language literature was sourced from eBooks, Medline Complete and CINAHL Plus databases, Google and Google Scholar, hard copy research texts and retrieved reference lists for terms including classification and regression tree* and derivatives and recursive partitioning from 1984-2013. Classification and regression tree analysis is an important method used to identify previously unknown patterns amongst data. Whilst there are several reasons to embrace this method as a means of exploratory quantitative research, issues regarding quality of data as well as the usefulness and validity of the findings should be considered. Classification and regression tree analysis is a valuable tool to guide nurses to reduce gaps in the application of evidence to practice. With the ever-expanding availability of data, it is important that nurses understand the utility and limitations of the research method. Classification and regression tree analysis is an easily interpreted method for modelling interactions between health-related variables that would otherwise remain obscured. Knowledge is presented graphically, providing insightful understanding of complex and hierarchical relationships in an accessible and useful way to nursing and other health professions. © 2013 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.

  17. Advantages of the net benefit regression framework for economic evaluations of interventions in the workplace: a case study of the cost-effectiveness of a collaborative mental health care program for people receiving short-term disability benefits for psychiatric disorders.

    PubMed

    Hoch, Jeffrey S; Dewa, Carolyn S

    2014-04-01

    Economic evaluations commonly accompany trials of new treatments or interventions; however, regression methods and their corresponding advantages for the analysis of cost-effectiveness data are not well known. To illustrate regression-based economic evaluation, we present a case study investigating the cost-effectiveness of a collaborative mental health care program for people receiving short-term disability benefits for psychiatric disorders. We implement net benefit regression to illustrate its strengths and limitations. Net benefit regression offers a simple option for cost-effectiveness analyses of person-level data. By placing economic evaluation in a regression framework, regression-based techniques can facilitate the analysis and provide simple solutions to commonly encountered challenges. Economic evaluations of person-level data (eg, from a clinical trial) should use net benefit regression to facilitate analysis and enhance results.
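
    The net benefit regression framework referred to above can be written down in a few lines: for each person, net benefit is the willingness-to-pay threshold times effectiveness minus cost, and the coefficient on the treatment indicator in a regression of net benefit estimates the incremental net benefit. The sketch below uses simulated person-level data and an arbitrary threshold; it is not the study's dataset or model specification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400
treat = rng.integers(0, 2, n)                            # 1 = collaborative care, 0 = usual care
effect = 0.10 + 0.05 * treat + rng.normal(0, 0.08, n)    # hypothetical effectiveness measure
cost = 2000 + 1500 * treat + rng.normal(0, 500, n)       # hypothetical person-level cost

wtp = 50000                                              # willingness-to-pay per unit of effect
net_benefit = wtp * effect - cost

# Net benefit regression: the coefficient on `treat` estimates the incremental
# net benefit; covariates could be added to the design matrix in the same way.
X = sm.add_constant(treat)
res = sm.OLS(net_benefit, X).fit(cov_type="HC1")         # robust SEs are a common choice
print(res.summary())
```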

  18. CADDIS Volume 4. Data Analysis: Basic Analyses

    EPA Pesticide Factsheets

    Use of statistical tests to determine if an observation is outside the normal range of expected values. Details of CART, regression analysis, use of quantile regression analysis, CART in causal analysis, simplifying or pruning resulting trees.

  19. Population heterogeneity in the salience of multiple risk factors for adolescent delinquency.

    PubMed

    Lanza, Stephanie T; Cooper, Brittany R; Bray, Bethany C

    2014-03-01

    To present mixture regression analysis as an alternative to more standard regression analysis for predicting adolescent delinquency. We demonstrate how mixture regression analysis allows for the identification of population subgroups defined by the salience of multiple risk factors. We identified population subgroups (i.e., latent classes) of individuals based on their coefficients in a regression model predicting adolescent delinquency from eight previously established risk indices drawn from the community, school, family, peer, and individual levels. The study included N = 37,763 10th-grade adolescents who participated in the Communities That Care Youth Survey. Standard, zero-inflated, and mixture Poisson and negative binomial regression models were considered. Standard and mixture negative binomial regression models were selected as optimal. The five-class regression model was interpreted based on the class-specific regression coefficients, indicating that risk factors had varying salience across classes of adolescents. Standard regression showed that all risk factors were significantly associated with delinquency. Mixture regression provided more nuanced information, suggesting a unique set of risk factors that were salient for different subgroups of adolescents. Implications for the design of subgroup-specific interventions are discussed. Copyright © 2014 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  20. In vivo suppression of solid Ehrlich cancer via chlorophyllin derivative mediated PDT: an albino mouse tumour model

    NASA Astrophysics Data System (ADS)

    Gomaa, Iman; Saraya, Hend; Zekri, Maha; Abdel-Kader, Mahmoud

    2015-03-01

    In this study, copper chlorophyllin was used as a photosensitizer for photodynamic therapy (PDT) in an Ehrlich tumour mouse model. Six groups of 5 animals each were subcutaneously transplanted with 1×10⁶ Ehrlich tumour cells. A single dose of 200 μg/g body weight chlorophyllin derivative was administered by intravenous (IV) or intratumoral (IT) routes. Mice were exposed to a monochromatic 630 nm red laser for 1 h, and tumour regression was followed up for three consecutive months post treatment. Several biochemical, histological and molecular tests were performed in order to evaluate the efficacy and safety of the applied treatment. Attention was also directed towards investigating the molecular mechanisms underlying chlorophyllin derivative mediated PDT. Animals treated with PDT via either the IV or IT route showed a significant decrease in tumour size 72 h post-treatment. Tumours in the IV-PDT group disappeared totally within a week, with no recurrence over three months of follow-up. In the IT-PDT group, the decrease in tumour size in the first week was interrupted by a slight increase; however, the tumours never reached their original size. Histological examination of tumour tissues of the IV-PDT group at 24 h post treatment demonstrated restoration of the normal muscle tissue architecture, and the biochemical assays indicated normal liver functions. The immunohistochemical analysis of caspase-3, and the quantitative PCR results of caspases-8 and 9, proved the presence of the extrinsic apoptotic pathway after chlorophyllin derivative-mediated PDT. In conclusion, the IV-PDT strategy achieved a better cure rate than IT-PDT, with no recurrence over three months of follow-up.

  1. A simple test of choice stepping reaction time for assessing fall risk in people with multiple sclerosis.

    PubMed

    Tijsma, Mylou; Vister, Eva; Hoang, Phu; Lord, Stephen R

    2017-03-01

    Purpose: To determine (a) the discriminant validity for established fall risk factors and (b) the predictive validity for falls of a simple test of choice stepping reaction time (CSRT) in people with multiple sclerosis (MS). Method: People with MS (n = 210, aged 21-74 years) performed the CSRT, sensorimotor, balance and neuropsychological tests in a single session. They were then followed up for falls using monthly fall diaries for 6 months. Results: The CSRT test had excellent discriminant validity with respect to established fall risk factors. Frequent fallers (≥3 falls) performed significantly worse in the CSRT test than non-frequent fallers (0-2 falls), with the odds of suffering frequent falls increasing 69% with each SD increase in CSRT (OR = 1.69, 95% CI: 1.27-2.26, p < 0.001). In regression analysis, CSRT was best explained by sway, time to complete the 9-Hole Peg test, knee extension strength of the weaker leg, proprioception and the time to complete the Trails B test (multiple R² = 0.449, p < 0.001). Conclusions: A simple low-tech CSRT test has excellent discriminative and predictive validity in relation to falls in people with MS. This test may prove useful in documenting longitudinal changes in fall risk in relation to MS disease progression and effects of interventions. Implications for rehabilitation: Good choice stepping reaction time (CSRT) is required for maintaining balance. A simple low-tech CSRT test has excellent discriminative and predictive validity in relation to falls in people with MS. This test may prove useful in documenting longitudinal changes in fall risk in relation to MS disease progression and effects of interventions.

  2. [Renal insufficiency and clinical outcomes in patients with acute coronary syndrome undergoing percutaneous coronary intervention: a multi-centre study].

    PubMed

    Huo, Yong; Ho, Wa

    2007-12-18

    To investigate the association of renal insufficiency and clinical outcomes in patients with acute coronary syndrome (ACS). The study was a multi-centre registry study including 3,589 ACS patients from 39 centres across China who had received percutaneous coronary intervention (PCI) prior to 1st February, 2007. Estimated glomerular filtration rate (eGFR) was calculated for all patients using the 4-variable MDRD equation with the serum creatinine obtained before angiography. The association between renal insufficiency and clinical outcomes, including in-hospital death and bleeding, was studied by Fisher's exact test. Multivariable analysis of the risk factors for in-hospital bleeding was done by logistic regression. The mean age of the study population was (61.74 ± 11.37) years (range, 23-92 years) and 76.5% (2,746/3,589) of the population was male. Only 90 patients (2.51%) were known to have chronic kidney disease at the time of admission and 144 patients (4.01%) had serum creatinine levels above 133 μmol/L. However, after the evaluation of renal status by the MDRD equation, 2,250 patients (63.1%) showed a reduction in eGFR to less than 90 mL/min, of whom 472 (13.1%) even reached the level of moderate renal insufficiency (eGFR < 60 mL/min) or worse. Seven patients (0.20%) proved to have chronic total occlusion lesions (CTO) and eight (0.22%) required conversion to coronary artery bypass grafting (CABG) after angiography. Both the presence of CTO lesions and conversion to CABG were associated with decreased renal function on Fisher's exact test (P = 0.0058 and 0.041, respectively). The in-hospital mortality rate was 0.47% (17/3,589) and was associated with the degree of renal insufficiency (P = 0.0013). A total of 75 patients (2.09%) experienced in-hospital bleeding, with 26 patients (0.72%) diagnosed with major bleeding events; 92% (69/75) of the bleeding events occurred after PCI. Bleeding was found to be associated with the degree of renal insufficiency in every type of antithrombotic therapy (P < 0.001). After adjusting for other variables by logistic regression, renal insufficiency (per 10 mL/min decrease in eGFR: OR = 1.133, 95% CI 1.011-1.27, P = 0.032) and age (above 65 years: OR = 1.907, 95% CI 1.107-3.28, P = 0.02) proved to be risk factors for in-hospital bleeding. Renal insufficiency is very common in ACS patients but the self-report rate is low in this population. Renal function evaluated by eGFR should be assessed for every patient hospitalized for ACS for risk stratification. Patients with more severe renal insufficiency usually have more complicated clinical manifestations and a higher rate of in-hospital bleeding.

  3. The Nuremberg mind redeemed: a comprehensive analysis of the Rorschachs of Nazi war criminals.

    PubMed

    Resnick, M N; Nunno, V J

    1991-08-01

    We examined a blind, actuarial analysis of the Rorschach data of the Nuremberg war criminals (NWC) using Exner's (1974) Comprehensive System in an attempt to prove the convergence of the NWC construct along dimensions of psychological (personality) functioning and to prove its discriminability from other appropriate psychiatric and nonpsychiatric comparison groups. The weaknesses of previous research methodologies are examined and discussed vis-à-vis the historical and theoretical developments of the concepts of authoritarianism, dogmatism, obedience to authority, and the development of the Rorschach Inkblot Technique.

  4. Effect of the mass center shift for force-free flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Meirovitch, L.; Juang, J.-N.

    1975-01-01

    For a spinning flexible spacecraft the mass center generally shifts relative to the nominal undeformed position. It is thought that this shift of center complicates spacecraft stability analysis. It is proved, on the basis of results achieved by Meirovitch and Calico (1972), that for the general class of force-free single-spin flexible spacecraft it is possible to ignore this shift of center without affecting the stability criteria in any significant way. A new theorem on inequalities for quadratic forms is proved to demonstrate the validity of the stability analysis.

  5. Mathematical analysis on the cosets of subgroup in the group of E-convex sets

    NASA Astrophysics Data System (ADS)

    Abbas, Nada Mohammed; Ajeena, Ruma Kareem K.

    2018-05-01

    In this work, the analysis of cosets of a subgroup in the group of L-convex sets is presented as a new and powerful tool in convex analysis and abstract algebra. The properties of these cosets on L-convex sets are proved mathematically. The most important theorem on finite groups in the theory of L-convex sets, Lagrange's Theorem, is proved, and a mathematical treatment of the quotient group of L-convex sets is also presented.

  6. Dimensional Analysis in Mathematical Modeling Systems: A Simple Numerical Method

    DTIC Science & Technology

    1991-02-01

    US Army Ballistic Research Laboratories, Aberdeen Proving Ground, MD, August 1975. [18] Hürlimann, T., and J. Kohlas, "LPL: A Structured Language...such systems can prove that (a² + ab + b² + ba) = (a + b)². With some effort, since the laws of physical algebra are a minor variant on those of

  7. Conjecturing, Generalizing and Justifying: Building Theory around Teacher Knowledge of Proving

    ERIC Educational Resources Information Center

    Lesseig, Kristin

    2016-01-01

    The purpose of this study was to detail teachers' proving activity and contribute to a framework of Mathematical Knowledge for Teaching Proof (MKT for Proof). While working to justify claims about sums of consecutive numbers, teachers searched for key ideas and productively used examples to make, test and refine conjectures. Analysis of teachers'…

  8. A Note on the Relationship between the Number of Indicators and Their Reliability in Detecting Regression Coefficients in Latent Regression Analysis

    ERIC Educational Resources Information Center

    Dolan, Conor V.; Wicherts, Jelte M.; Molenaar, Peter C. M.

    2004-01-01

    We consider the question of how variation in the number and reliability of indicators affects the power to reject the hypothesis that the regression coefficients are zero in latent linear regression analysis. We show that power remains constant as long as the coefficient of determination remains unchanged. Any increase in the number of indicators…

  9. Evaluation of interpolation techniques for the creation of gridded daily precipitation (1 × 1 km²); Cyprus, 1980-2010

    NASA Astrophysics Data System (ADS)

    Camera, Corrado; Bruggeman, Adriana; Hadjinicolaou, Panos; Pashiardis, Stelios; Lange, Manfred A.

    2014-01-01

    High-resolution gridded daily data sets are essential for natural resource management and the analyses of climate changes and their effects. This study aims to evaluate the performance of 15 simple or complex interpolation techniques in reproducing daily precipitation at a resolution of 1 km² over topographically complex areas. Methods are tested considering two different sets of observation densities and different rainfall amounts. We used rainfall data that were recorded at 74 and 145 observational stations, respectively, spread over the 5760 km² of the Republic of Cyprus, in the Eastern Mediterranean. Regression analyses utilizing geographical copredictors and neighboring interpolation techniques were evaluated both in isolation and combined. Linear multiple regression (LMR) and geographically weighted regression methods (GWR) were tested. These included a step-wise selection of covariables, as well as inverse distance weighting (IDW), kriging, and 3D-thin plate splines (TPS). The relative rank of the different techniques changes with different station density and rainfall amounts. Our results indicate that TPS performs well for low station density and large-scale events and also when coupled with regression models. It performs poorly for high station density. The opposite is observed when using IDW. Simple IDW performs best for local events, while a combination of step-wise GWR and IDW proves to be the best method for large-scale events and high station density. This study indicates that the use of step-wise regression with a variable set of geographic parameters can improve the interpolation of large-scale events because it facilitates the representation of local climate dynamics.
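
    Of the neighbouring-interpolation methods compared above, inverse distance weighting is the simplest to write down; the sketch below interpolates a daily rainfall field onto grid nodes from scattered stations. The station coordinates, rainfall values, and power parameter are placeholders, and no attempt is made to reproduce the study's regression co-predictors.

```python
import numpy as np

def idw(xy_stations, values, xy_targets, power=2.0):
    """Inverse distance weighting: each target value is a weighted average of
    station values, with weights 1 / distance**power."""
    d = np.linalg.norm(xy_targets[:, None, :] - xy_stations[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)                 # avoid division by zero at a station
    w = 1.0 / d ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Hypothetical station coordinates (km) and daily rainfall (mm).
stations = np.array([[10.0, 12.0], [35.0, 40.0], [60.0, 20.0], [80.0, 70.0]])
rain = np.array([2.0, 15.0, 0.0, 8.0])

# A few 1 km grid nodes to interpolate onto.
grid = np.array([[20.0, 20.0], [50.0, 50.0], [75.0, 60.0]])
print(idw(stations, rain, grid))
```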

  10. Contextual and sociopsychological factors in predicting habitual cleaning of water storage containers in rural Benin

    NASA Astrophysics Data System (ADS)

    Stocker, Andrea; Mosler, Hans-Joachim

    2015-04-01

    Recontamination of drinking water occurring between water collection at the source and the point of consumption is a current problem in developing countries. The household drinking water storage container is one source of contamination and should therefore be cleaned regularly. First, the present study investigated contextual factors that stimulate or inhibit the development of habitual cleaning of drinking water storage containers with soap and water. Second, based on the Risk, Attitudes, Norms, Abilities, and Self-regulation (RANAS) Model of behavior, the study aimed to determine which sociopsychological factors should be influenced by an intervention to promote habitual cleaning. In a cross-sectional study, 905 households in rural Benin were interviewed in structured face-to-face interviews. A forced-entry regression analysis was used to determine potential contextual factors related to habitual cleaning. Subsequently, a hierarchical regression was conducted with the only relevant contextual factor entered in the first step (R² = 6.7%) and the sociopsychological factors added in the second step (R² = 62.5%). Results showed that households using a clay container for drinking water storage had a significantly weaker habit of cleaning their water storage containers with soap and water than did households using other types of containers (β = -0.10). The most important sociopsychological predictors of habitual cleaning were commitment (β = 0.35), forgetting (β = -0.22), and self-efficacy (β = 0.14). The combined investigation of contextual and sociopsychological factors proved beneficial in terms of developing intervention strategies. Possible interventions based on these findings are recommended.

  11. Development of European NO2 Land Use Regression Model for present and future exposure assessment: Implications for policy analysis.

    PubMed

    Vizcaino, Pilar; Lavalle, Carlo

    2018-05-04

    A new Land Use Regression model was built to develop pan-European 100 m resolution maps of NO₂ concentrations. The model was built using NO₂ concentrations from routine monitoring stations available in the Airbase database as the dependent variable. Predictor variables included land use, road traffic proxies, population density, climatic and topographical variables, and distance to sea. In order to capture international and inter-regional disparities not accounted for with the mentioned predictor variables, additional proxies of NO₂ concentrations, like levels of activity intensity and NOₓ emissions for specific sectors, were also included. The model was built using Random Forest techniques. Model performance was relatively good given the EU-wide scale (R² = 0.53). Output predictions of annual average concentrations of NO₂ were in line with other existing models in terms of spatial distribution and values of concentration. The model was validated for year 2015, comparing model predictions derived from updated values of independent variables with concentrations in monitoring stations for that year. The algorithm was then used to model future concentrations up to the year 2030, considering different emission scenarios as well as changes in land use, population distribution and economic factors assuming the most likely socio-economic trends. Levels of exposure were derived from maps of concentration. The model proved to be a useful tool for the ex-ante evaluation of specific air pollution mitigation measures, and more broadly, for impact assessment of EU policies on territorial development. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
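
    A schematic of the Random Forest land use regression step described above, using scikit-learn: station NO₂ concentrations are the target and gridded predictors (land use shares, road proxies, population density, distance to sea) are the features. The variable names and simulated values are placeholders, not the Airbase data or the study's predictor set.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_stations = 800
X = np.column_stack([
    rng.uniform(0, 1, n_stations),        # urban land use share in a buffer
    rng.exponential(500, n_stations),     # traffic / road-length proxy
    rng.exponential(2000, n_stations),    # population density
    rng.uniform(0, 200, n_stations),      # distance to sea (km)
])
no2 = (10 + 20 * X[:, 0] + 0.01 * X[:, 1] + 0.002 * X[:, 2] - 0.02 * X[:, 3]
       + rng.normal(0, 5, n_stations))

model = RandomForestRegressor(n_estimators=300, random_state=0)
r2 = cross_val_score(model, X, no2, cv=5, scoring="r2")
print("cross-validated R^2:", r2.mean().round(2))

# Once fitted on all stations, the model would be applied to the full predictor
# grid to produce the concentration map.
model.fit(X, no2)
print("feature importances:", model.feature_importances_.round(2))
```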

  12. Chronic subdural hematoma: Surgical management and outcome in 986 cases: A classification and regression tree approach

    PubMed Central

    Rovlias, Aristedis; Theodoropoulos, Spyridon; Papoutsakis, Dimitrios

    2015-01-01

    Background: Chronic subdural hematoma (CSDH) is one of the most common clinical entities in daily neurosurgical practice which carries a most favorable prognosis. However, because of the advanced age and medical problems of patients, surgical therapy is frequently associated with various complications. This study evaluated the clinical features, radiological findings, and neurological outcome in a large series of patients with CSDH. Methods: A classification and regression tree (CART) technique was employed in the analysis of data from 986 patients who were operated at Asclepeion General Hospital of Athens from January 1986 to December 2011. Burr holes evacuation with closed system drainage has been the operative technique of first choice at our institution for 29 consecutive years. A total of 27 prognostic factors were examined to predict the outcome at 3-month postoperatively. Results: Our results indicated that neurological status on admission was the best predictor of outcome. With regard to the other data, age, brain atrophy, thickness and density of hematoma, subdural accumulation of air, and antiplatelet and anticoagulant therapy were found to correlate significantly with prognosis. The overall cross-validated predictive accuracy of CART model was 85.34%, with a cross-validated relative error of 0.326. Conclusions: Methodologically, CART technique is quite different from the more commonly used methods, with the primary benefit of illustrating the important prognostic variables as related to outcome. Since, the ideal therapy for the treatment of CSDH is still under debate, this technique may prove useful in developing new therapeutic strategies and approaches for patients with CSDH. PMID:26257985

  13. Network intrusion detection based on a general regression neural network optimized by an improved artificial immune algorithm.

    PubMed

    Wu, Jianfa; Peng, Dahao; Li, Zhuping; Zhao, Li; Ling, Huanzhang

    2015-01-01

    To effectively and accurately detect and classify network intrusion data, this paper introduces a general regression neural network (GRNN) based on the artificial immune algorithm with elitist strategies (AIAE). The elitist archive and elitist crossover were combined with the artificial immune algorithm (AIA) to produce the AIAE-GRNN algorithm, with the aim of improving its adaptivity and accuracy. In this paper, the mean square errors (MSEs) were considered the affinity function. The AIAE was used to optimize the smooth factors of the GRNN; then, the optimal smooth factor was solved and substituted into the trained GRNN. Thus, the intrusive data were classified. The paper selected a GRNN that was separately optimized using a genetic algorithm (GA), particle swarm optimization (PSO), and fuzzy C-means clustering (FCM) to enable a comparison of these approaches. The results first showed that AIAE-GRNN achieves a higher classification accuracy than PSO-GRNN, but that its running time is long. FCM and GA-GRNN were eliminated because of their deficiencies in terms of accuracy and convergence. To improve the running speed, the paper adopted principal component analysis (PCA) to reduce the dimensions of the intrusive data. With the reduction in dimensionality, PCA-AIAE-GRNN loses less accuracy and converges better than PCA-PSO-GRNN, and its running speed is relatively improved. The experimental results show that the AIAE-GRNN has a higher robustness and accuracy than the other algorithms considered and can thus be used to classify the intrusive data.

  14. Parameterization of the middle and upper tropospheric water vapor from ATOVS observations over a tropical climate region

    NASA Astrophysics Data System (ADS)

    Makama, Ezekiel Kaura; Lim, Hwee San; Abdullah, Khiruddin

    2018-01-01

    Precipitable water vapor (PWV) is a highly variable but important greenhouse gas that regulates the radiation budget of the earth. Its variability in time and space makes it difficult to quantify. Knowledge of its vertical distribution, in particular, is crucial for many reasons. In this study, empirical relationships between isobaric layers of PWV over Peninsular Malaysia are examined. An analysis of variance (ANOVA) technique applied to Advanced Television and Infrared Observation Satellite Operational Vertical Sounder (ATOVS) observations from 2005 to 2011 was used to propose a relationship of the form W = α(W_L)^β for the middle-layer (MW) and upper-layer (UW) PWV. W is either MW or UW, with α and β as regression coefficients that are functions of latitude. Coefficients of determination (R²) of 0.75-0.86 and root mean square errors (RMSE) of 1.65-2.38 mm across the zones were obtained for both the MW and UW predictions, with a mean bias (MB) below ±1 mm. The predicted and observed PWV agreed better toward the north. Initial predictability tests for each model were done on two independent data sets: ATOVS (2012-2015), and radiosonde (2010-2011) at Penang, Kuantan and Sepang stations, with very good outcomes. The results of the tests revealed remarkable performance when compared with two previously reported models. The inclusion of variable regression coefficients, and the utilization of satellite-derived data, which provide soundings of data-void regions between radiosonde networks, proved to have optimized the results.
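
    If the proposed relation is of the power-law form W = α(W_L)^β, its coefficients can be estimated by ordinary linear regression after taking logarithms, since log W = log α + β log W_L. The sketch below does this for one hypothetical latitude zone with invented PWV values; it is not the ATOVS data or the authors' ANOVA-based fitting procedure, and the interpretation of W_L as the lower-layer PWV is an assumption made for the example.

```python
import numpy as np

# Hypothetical lower-layer and middle-layer PWV samples (mm) for one latitude zone.
rng = np.random.default_rng(4)
w_lower = rng.uniform(20, 45, 300)
w_middle = 0.8 * w_lower ** 0.9 * np.exp(rng.normal(0, 0.1, 300))

# Fit log(W) = log(alpha) + beta * log(W_L) by least squares.
beta, log_alpha = np.polyfit(np.log(w_lower), np.log(w_middle), 1)
alpha = np.exp(log_alpha)

pred = alpha * w_lower ** beta
rmse = np.sqrt(np.mean((pred - w_middle) ** 2))
ss_res = np.sum((w_middle - pred) ** 2)
ss_tot = np.sum((w_middle - w_middle.mean()) ** 2)
print(f"alpha = {alpha:.2f}, beta = {beta:.2f}, "
      f"R^2 = {1 - ss_res / ss_tot:.2f}, RMSE = {rmse:.2f} mm")
```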

  15. Sialendoscopy for Patients with Radioiodine-Induced Sialadenitis and Xerostomia.

    PubMed

    Bhayani, Mihir K; Acharya, Varun; Kongkiatkamon, Suchada; Farah, Sally; Roberts, Dianna B; Sterba, Jennifer; Chambers, Mark S; Lai, Stephen Y

    2015-07-01

    We examined outcomes in patients treated for radioactive iodine-induced sialadenitis (RAIS) and xerostomia with sialendoscopy. Data was prospectively collected for all patients undergoing sialendoscopy for RAIS from a single institution. Interventional details and intraoperative findings were recorded. Qualitative data were obtained through patient examination, telephone interviews, and use of a standard quality of life questionnaire, Xerostomia Questionnaire. Quantitative data were obtained from patients who underwent sialometry. Twenty-six patients (24 women and 2 men; median age, 43 years; age range, 19-57 years) underwent interventional sialendoscopy after conservative management of symptoms proved unsuccessful. Sialadenitis was present in 25 patients and xerostomia in 22 patients. Mucus plugging in the duct of the gland was the most common finding (22 patients) followed by stenosis (18 patients), inflammation (eight patients), and erythema (eight patients). Median follow-up time was 23.4±12.1 months. Sixteen patients (64%) reported complete resolution; seven (28%), partial resolution; one (4%), no change in symptoms; and one (4%), regression in RAIS-related symptoms. Patients subjectively noted the following regarding their xerostomia symptoms: seven (31.8%) had complete resolution; 10 (45.5%), partial resolution; four (18.2%), no change; and one (4.5%), regression. Statistical analysis of the available sialometry data revealed a statistically significant difference in saliva production at 6 months following sialendoscopy for unstimulated saliva production (p=0.028). Sialendoscopy is an effective treatment option for the management of RAIS and xerostomia refractory to conservative therapy and medical management. Patients in our cohort report durable improvement in symptoms after intervention.

  16. Moderation analysis using a two-level regression model.

    PubMed

    Yuan, Ke-Hai; Cheng, Ying; Maxwell, Scott

    2014-10-01

    Moderation analysis is widely used in social and behavioral research. The most commonly used model for moderation analysis is moderated multiple regression (MMR) in which the explanatory variables of the regression model include product terms, and the model is typically estimated by least squares (LS). This paper argues for a two-level regression model in which the regression coefficients of a criterion variable on predictors are further regressed on moderator variables. An algorithm for estimating the parameters of the two-level model by normal-distribution-based maximum likelihood (NML) is developed. Formulas for the standard errors (SEs) of the parameter estimates are provided and studied. Results indicate that, when heteroscedasticity exists, NML with the two-level model gives more efficient and more accurate parameter estimates than the LS analysis of the MMR model. When error variances are homoscedastic, NML with the two-level model leads to essentially the same results as LS with the MMR model. Most importantly, the two-level regression model permits estimating the percentage of variance of each regression coefficient that is due to moderator variables. When applied to data from General Social Surveys 1991, NML with the two-level model identified a significant moderation effect of race on the regression of job prestige on years of education while LS with the MMR model did not. An R package is also developed and documented to facilitate the application of the two-level model.
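
    For reference, the moderated multiple regression (MMR) baseline that the two-level model is compared against is just a linear model with a product term, which the statsmodels formula interface expresses in one line. The variable names below echo the GSS example (prestige, education, race) but the data are simulated with heteroscedastic errors, and this sketch is not the paper's two-level NML estimator.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 1000
education = rng.normal(13, 3, n)
race = rng.integers(0, 2, n)                       # hypothetical binary moderator
prestige = (20 + 1.5 * education + 2.0 * race + 0.6 * race * education
            + rng.normal(0, 8 + 4 * race, n))      # heteroscedastic errors

df = pd.DataFrame({"prestige": prestige, "education": education, "race": race})

# Moderated multiple regression: the education:race product term carries the
# moderation effect; ordinary LS standard errors ignore the heteroscedasticity above.
mmr = smf.ols("prestige ~ education * race", data=df).fit()
print(mmr.params)
print(mmr.pvalues["education:race"])
```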

  17. Multiple Correlation versus Multiple Regression.

    ERIC Educational Resources Information Center

    Huberty, Carl J.

    2003-01-01

    Describes differences between multiple correlation analysis (MCA) and multiple regression analysis (MRA), showing how these approaches involve different research questions and study designs, different inferential approaches, different analysis strategies, and different reported information. (SLD)

  18. Functional Relationships and Regression Analysis.

    ERIC Educational Resources Information Center

    Preece, Peter F. W.

    1978-01-01

    Using a degenerate multivariate normal model for the distribution of organismic variables, the form of least-squares regression analysis required to estimate a linear functional relationship between variables is derived. It is suggested that the two conventional regression lines may be considered to describe functional, not merely statistical,…

  19. Isolating and Examining Sources of Suppression and Multicollinearity in Multiple Linear Regression

    ERIC Educational Resources Information Center

    Beckstead, Jason W.

    2012-01-01

    The presence of suppression (and multicollinearity) in multiple regression analysis complicates interpretation of predictor-criterion relationships. The mathematical conditions that produce suppression in regression analysis have received considerable attention in the methodological literature but until now nothing in the way of an analytic…

  20. General Nature of Multicollinearity in Multiple Regression Analysis.

    ERIC Educational Resources Information Center

    Liu, Richard

    1981-01-01

    Discusses multiple regression, a very popular statistical technique in the field of education. One of the basic assumptions in regression analysis requires that independent variables in the equation should not be highly correlated. The problem of multicollinearity and some of the solutions to it are discussed. (Author)

  1. Logistic Regression: Concept and Application

    ERIC Educational Resources Information Center

    Cokluk, Omay

    2010-01-01

    The main focus of logistic regression analysis is classification of individuals in different groups. The aim of the present study is to explain basic concepts and processes of binary logistic regression analysis intended to determine the combination of independent variables which best explain the membership in certain groups called dichotomous…

  2. Risk factors for Mycobacterium ulcerans infection (Buruli Ulcer) in Togo ─ a case-control study in Zio and Yoto districts of the maritime region.

    PubMed

    Maman, Issaka; Tchacondo, Tchadjobo; Kere, Abiba Banla; Piten, Ebekalisai; Beissner, Marcus; Kobara, Yiragnima; Kossi, Komlan; Badziklou, Kossi; Wiedemann, Franz Xaver; Amekuse, Komi; Bretzel, Gisela; Karou, Damintoti Simplice

    2018-01-19

    Buruli ulcer (BU) is a neglected mycobacterial skin infection caused by Mycobacterium ulcerans. This disease mostly affects poor rural populations, especially in areas with low hygiene standards and sanitation coverage. The objective of this study was to identify risk factors for BU in the districts of Zio and Yoto of the Maritime Region in Togo. We conducted a case-control study from November 2014 to May 2015 in Zio and Yoto, two districts proven to be endemic for BU. BU cases were diagnosed according to the WHO clinical case definition at the Centre Hospitalier Régional de Tsévié (CHR Tsévié) and confirmed by Ziehl-Neelsen (ZN) microscopy and IS2404 polymerase chain reaction (PCR). For each case, up to two controls matched by sex and place of residence were recruited. Socio-demographic, environmental or behavioral data were collected and conditional logistic regression analysis was used to identify and compare risk factors between BU cases and controls. A total of 83 cases and 128 controls were enrolled. The median age was 15 years (range 3-65 years). Multivariate conditional logistic regression analysis after adjustment for potential confounders identified age (< 10 years: OR = 11.48, 95% CI = 3.72-35.43; 10-14 years: OR = 3.63, 95% CI = 1.22-10.83), receiving insect bites near a river (OR = 7.8, 95% CI = 1.48-41.21) and bathing with water from an open borehole (OR = 5.77, 95% CI = 1.11-29.27) as independent predictors of acquiring BU infection. This study identified age, bathing with water from an open borehole and receiving insect bites near a river as potential risk factors for acquiring BU infection in the Zio and Yoto districts of the Maritime Region in south Togo.

  3. Applying Regression Analysis to Problems in Institutional Research.

    ERIC Educational Resources Information Center

    Bohannon, Tom R.

    1988-01-01

    Regression analysis is one of the most frequently used statistical techniques in institutional research. Principles of least squares, model building, residual analysis, influence statistics, and multi-collinearity are described and illustrated. (Author/MSE)

  4. Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies

    PubMed Central

    Vatcheva, Kristina P.; Lee, MinJae; McCormick, Joseph B.; Rahbar, Mohammad H.

    2016-01-01

    The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013, illustrated the need for a greater attention to identifying and minimizing the effect of multicollinearity in analysis of data from epidemiologic studies. We used simulated datasets and real life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in the regression analysis and encourage researchers to consider the diagnostic for multicollinearity as one of the steps in regression analysis. PMID:27274911

  5. Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies.

    PubMed

    Vatcheva, Kristina P; Lee, MinJae; McCormick, Joseph B; Rahbar, Mohammad H

    2016-04-01

    The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013, illustrated the need for a greater attention to identifying and minimizing the effect of multicollinearity in analysis of data from epidemiologic studies. We used simulated datasets and real life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in the regression analysis and encourage researchers to consider the diagnostic for multicollinearity as one of the steps in regression analysis.
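
    The multicollinearity diagnostic the authors encourage is typically the variance inflation factor, which statsmodels exposes directly; the sketch below computes a VIF for each predictor of a deliberately collinear simulated design. A common rule of thumb flags VIF values above roughly 5-10, although the records above do not prescribe a cutoff, and the variables here are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(6)
n = 500
age = rng.normal(45, 12, n)
bmi = rng.normal(28, 5, n)
weight = 2.5 * bmi + rng.normal(0, 2, n)           # deliberately collinear with BMI

X = sm.add_constant(pd.DataFrame({"age": age, "bmi": bmi, "weight": weight}))
vif = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns,
)
print(vif.round(1))   # bmi and weight show inflated values, age stays near 1
```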

  6. Temperature Control and Numerical Analysis for Mass Concrete Pile Cap of Hai-huang Bridge

    NASA Astrophysics Data System (ADS)

    Shi, Han; Hao, Yang; Yong-liang, Wang

    2018-05-01

    In order to study the heat of hydration in mass concrete, this paper takes the Hai-huang bridge as its engineering background and uses the finite element analysis software FEA to analyse the heat-of-hydration effect in the pile cap. Comparison of the measured data with the theoretical data showed that, by adopting reasonable temperature control measures, concrete cracking was controlled effectively and construction quality was ensured. The measured data were consistent with the calculated data, which confirms the accuracy of the finite element analysis. Finally, the study provides a reference and guidance for similar projects.

  7. First derivative spectrophotometric determination of granisetron hydrochloride in presence of its hydrolytic products and preservative and application to pharmaceutical preparations.

    PubMed

    Hewala, Ismail I; Bedair, Mona M; Shousha, Sherif M

    2013-04-01

    Granisetron is a selective 5-HT3 receptor antagonist used in the prevention and treatment of chemotherapy-induced nausea and vomiting. The drug is available in a tablet dosage form and a parenteral dosage form containing benzyl alcohol as a preservative. The main route of degradation of granisetron is hydrolysis. The present work describes the development of a simple, rapid, and reliable first derivative spectrophotometric method for the determination of granisetron in the presence of its hydrolytic products as well as the formulation adjuvant and benzyl alcohol. The method is based on the measurement of the first derivative response of granisetron at 290 nm, where the interference of the hydrolytic products, the co-formulated adjuvant and benzyl alcohol is completely eliminated. The proposed method was validated with respect to specificity, linearity, selectivity, accuracy, precision, robustness, and detection and quantification limits. Regression analysis showed good correlation between the first derivative response and the concentration of granisetron over a range of 8-16 μg ml(-1). Statistical analysis proved the accuracy of the proposed method compared with a reference stability-indicating high-performance liquid chromatography method. The described method was successfully applied to the determination of granisetron in different batches of tablets and ampoules. The assay results obtained in this study strongly encourage us to apply the validated method for the quality control and routine analysis of tablets and parenteral preparations containing granisetron. Copyright © 2012 John Wiley & Sons, Ltd.

  8. New insight of hybrid membrane to degrade Congo red and Reactive yellow under sunlight.

    PubMed

    Rajeswari, A; Jackcina Stobel Christy, E; Pius, Anitha

    2018-02-01

    A study was carried out to investigate the degradation of organic contaminants (Congo red and Reactive yellow-105) using a cellulose acetate-polystyrene (CA-PS) membrane with and without ZnO impregnation. Scanning electron microscopy (SEM), energy dispersive X-ray analysis (EDAX), Fourier transform infrared spectroscopy (FTIR), atomic force microscopy (AFM) and thermogravimetric analysis (TG-DTA) were carried out to characterize the bare and ZnO-impregnated CA-PS membranes. Membrane efficiency was also tested for pure water flux and antifouling performance. The modified membrane showed almost 85% water flux recovery. Blending ZnO nanoparticles into the CA-PS matrix could decrease membrane fouling and increase the permeation quality of the membrane, with above 90% photocatalytic degradation efficiency for the dyes. The rate of dye degradation was monitored using a UV-Vis spectrometer. Reusability of the CA-PS-ZnO membrane was studied and no significant change in degradation efficiency was noted until the fourth cycle. The Langmuir-Hinshelwood kinetic model describes the photodegradation capacity well, and the degradation of the dyes CR and RY-105 exhibited pseudo-first-order kinetics. The regression coefficient (R) for CR and RY-105 was found to be 0.99. The novelty of the prepared CA-PS-ZnO membrane is that it has better efficiency and higher thermal stability than our previously reported material. Therefore, the ZnO-impregnated CA-PS membrane proved to be an innovative alternative for the degradation of CR and RY-105 dyes. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Stepwise versus Hierarchical Regression: Pros and Cons

    ERIC Educational Resources Information Center

    Lewis, Mitzi

    2007-01-01

    Multiple regression is commonly used in social and behavioral data analysis. In multiple regression contexts, researchers are very often interested in determining the "best" predictors in the analysis. This focus may stem from a need to identify those predictors that are supportive of theory. Alternatively, the researcher may simply be interested…

  10. Interpreting Bivariate Regression Coefficients: Going beyond the Average

    ERIC Educational Resources Information Center

    Halcoussis, Dennis; Phillips, G. Michael

    2010-01-01

    Statistics, econometrics, investment analysis, and data analysis classes often review the calculation of several types of averages, including the arithmetic mean, geometric mean, harmonic mean, and various weighted averages. This note shows how each of these can be computed using a basic regression framework. By recognizing when a regression model…

  11. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

  12. Precision Efficacy Analysis for Regression.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.

    When multiple linear regression is used to develop a prediction model, sample size must be large enough to ensure stable coefficients. If the derivation sample size is inadequate, the model may not predict well for future subjects. The precision efficacy analysis for regression (PEAR) method uses a cross- validity approach to select sample sizes…

  13. Estimation of 1RM for knee extension based on the maximal isometric muscle strength and body composition.

    PubMed

    Kanada, Yoshikiyo; Sakurai, Hiroaki; Sugiura, Yoshito; Arai, Tomoaki; Koyama, Soichiro; Tanabe, Shigeo

    2017-11-01

    [Purpose] To create a regression formula in order to estimate 1RM for knee extensors, based on the maximal isometric muscle strength measured using a hand-held dynamometer and data regarding the body composition. [Subjects and Methods] Measurement was performed in 21 healthy males in their twenties to thirties. Single regression analysis was performed, with measurement values representing 1RM and the maximal isometric muscle strength as dependent and independent variables, respectively. Furthermore, multiple regression analysis was performed, with data regarding the body composition incorporated as another independent variable, in addition to the maximal isometric muscle strength. [Results] Through single regression analysis with the maximal isometric muscle strength as an independent variable, the following regression formula was created: 1RM (kg)=0.714 + 0.783 × maximal isometric muscle strength (kgf). On multiple regression analysis, only the total muscle mass was extracted. [Conclusion] A highly accurate regression formula to estimate 1RM was created based on both the maximal isometric muscle strength and body composition. Using a hand-held dynamometer and body composition analyzer, it was possible to measure these items in a short time, and obtain clinically useful results.
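
    Illustrative sketch of fitting and applying a simple 1RM prediction equation of the form reported above; the coefficients are re-estimated from hypothetical data rather than taken from the study.

```python
# Sketch: simple linear regression of 1RM on hand-held dynamometer strength,
# then prediction for a new subject. All measurements are hypothetical.
import numpy as np
import statsmodels.api as sm

isometric_kgf = np.array([28.0, 31.5, 35.2, 40.1, 44.8, 50.3, 55.0])  # dynamometer (kgf)
one_rm_kg     = np.array([22.5, 25.0, 28.5, 32.0, 36.0, 40.5, 44.0])  # measured 1RM (kg)

X = sm.add_constant(isometric_kgf)
fit = sm.OLS(one_rm_kg, X).fit()
intercept, slope = fit.params
print(f"1RM (kg) = {intercept:.3f} + {slope:.3f} x isometric strength (kgf)")

# Predicted 1RM for a subject measured at 38 kgf.
print(fit.predict([[1.0, 38.0]]))
```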

  14. Textural analysis of pre-therapeutic [18F]-FET-PET and its correlation with tumor grade and patient survival in high-grade gliomas.

    PubMed

    Pyka, Thomas; Gempt, Jens; Hiob, Daniela; Ringel, Florian; Schlegel, Jürgen; Bette, Stefanie; Wester, Hans-Jürgen; Meyer, Bernhard; Förster, Stefan

    2016-01-01

    Amino acid positron emission tomography (PET) with [18F]-fluoroethyl-L-tyrosine (FET) is well established in the diagnostic work-up of malignant brain tumors. Analysis of FET-PET data using tumor-to-background ratios (TBR) has been shown to be highly valuable for the detection of viable hypermetabolic brain tumor tissue; however, it has not proven equally useful for tumor grading. Recently, textural features in 18-fluorodeoxyglucose-PET have been proposed as a method to quantify the heterogeneity of glucose metabolism in a variety of tumor entities. Herein we evaluate whether textural FET-PET features are of utility for grading and prognostication in patients with high-grade gliomas. One hundred thirteen patients (70 men, 43 women) with histologically proven high-grade gliomas were included in this retrospective study. All patients received static FET-PET scans prior to first-line therapy. TBR (max and mean), volumetric parameters and textural parameters based on gray-level neighborhood difference matrices were derived from static FET-PET images. Receiver operating characteristic (ROC) and discriminant function analyses were used to assess the value for tumor grading. Kaplan-Meier curves and univariate and multivariate Cox regression were employed for analysis of progression-free and overall survival. All FET-PET textural parameters showed the ability to differentiate between World Health Organization (WHO) grade III and IV tumors (p < 0.001; AUC 0.775). Further improvement in discriminatory power was possible through a combination of texture and metabolic tumor volume, classifying 85 % of tumors correctly (AUC 0.830). TBR and volumetric parameters alone were correlated with tumor grade, but showed lower AUC values (0.644 and 0.710, respectively). Furthermore, a correlation of FET-PET texture but not TBR was shown with patient PFS and OS, proving significant in multivariate analysis as well. Volumetric parameters were predictive for OS, but this correlation did not hold in multivariate analysis. Determination of uptake heterogeneity in pre-therapeutic FET-PET using textural features proved valuable for the (sub-)grading of high-grade glioma as well as prediction of tumor progression and patient survival, and showed improved performance compared to standard parameters such as TBR and tumor volume. Our results underscore the importance of intratumoral heterogeneity in the biology of high-grade glial cell tumors and may contribute to individual therapy planning in the future, although they must be confirmed in prospective studies before incorporation into clinical routine.

  15. Standardization of a traditional polyherbo-mineral formulation - Brahmi vati.

    PubMed

    Mishra, Amrita; Mishra, Arun K; Ghosh, Ashoke K; Jha, Shivesh

    2013-01-01

    The present study deals with standardization of an in-house standard preparation and three marketed samples of Brahmi vati, a traditional medicine known to be effective in mental disorders, convulsions, weak memory, high fever and hysteria. Preparation and standardization were done following modern scientific quality control procedures for the raw material and the finished products. Scanning electron microscopic (SEM) analysis showed the reduction of metals and minerals (particle size range 2-5 µm), which indicates proper preparation of the bhasmas, the important ingredients of Brahmi vati. EDX analysis of all samples of Brahmi vati suggested the absence of gold, an important constituent of Brahmi vati, in two marketed samples. All samples of Brahmi vati were subjected to quantitative estimation of Bacoside A (marker compound) by the HPTLC technique. The samples were extracted in methanol, and the chromatograms were developed in butanol:glacial acetic acid:water (4.5:0.5:5, v/v) and detected at 225 nm. The regression analysis of the calibration plots of Bacoside A exhibited a linear relationship in the concentration range of 50-300 ng, while the % recovery was found to be 96.06% w/w, thus proving the accuracy and precision of the analysis. The Bacoside A content in the in-house preparation was found to be higher than that of the commercial samples. The proposed HPTLC method was found to be rapid, simple and accurate for quantitative estimation of Bacoside A in different formulations. The results of this study could be used as model data in the standardization of Brahmi vati.

  16. Spectral analysis for GNSS coordinate time series using chirp Fourier transform

    NASA Astrophysics Data System (ADS)

    Feng, Shengtao; Bo, Wanju; Ma, Qingzun; Wang, Zifan

    2017-12-01

    Spectral analysis for global navigation satellite system (GNSS) coordinate time series provides a principal tool to understand the intrinsic mechanism that affects tectonic movements. Spectral analysis methods such as the fast Fourier transform, Lomb-Scargle spectrum, evolutionary power spectrum, wavelet power spectrum, etc. are used to find periodic characteristics in time series. Among spectral analysis methods, the chirp Fourier transform (CFT) with less stringent requirements is tested with synthetic and actual GNSS coordinate time series, which proves the accuracy and efficiency of the method. With the length of series only limited to even numbers, CFT provides a convenient tool for windowed spectral analysis. The results of ideal synthetic data prove CFT accurate and efficient, while the results of actual data show that CFT is usable to derive periodic information from GNSS coordinate time series.

  17. Anaesthesia Management for Awake Craniotomy: Systematic Review and Meta-Analysis

    PubMed Central

    Rossaint, Rolf; Veldeman, Michael

    2016-01-01

    Background Awake craniotomy (AC) renders an expanded role in functional neurosurgery. Yet, evidence for optimal anaesthesia management remains limited. We aimed to summarise the latest clinical evidence on anaesthesia management for AC and to explore the relationship between AC failures and the anaesthesia techniques used. Methods Two authors independently performed a systematic search of English-language articles in the PubMed and EMBASE databases (1/2007-12/2015). The search included randomised controlled trials (RCTs), observational trials, and case reports (n>4 cases) that reported the anaesthetic approach for AC and at least one of our pre-specified outcomes: intraoperative seizures, hypoxia, arterial hypertension, nausea and vomiting, neurological dysfunction, conversion into general anaesthesia and failure of AC. Random effects meta-analysis was used to estimate event rates for four outcomes. The relationship with anaesthesia technique was explored using logistic meta-regression, calculating odds ratios (OR) and 95% confidence intervals [95%CI]. Results We included forty-seven studies. Eighteen reported the asleep-awake-asleep technique (SAS), twenty-seven monitored anaesthesia care (MAC), one reported both and one used the awake-awake-awake technique (AAA). The proportions of AC failures, intraoperative seizures, new neurological dysfunction and conversion into general anaesthesia (GA) were 2% [95%CI:1–3], 8% [95%CI:6–11], 17% [95%CI:12–23] and 2% [95%CI:2–3], respectively. Meta-regression of the SAS and MAC techniques did not reveal any relevant differences between outcomes explained by the technique, except for conversion into GA. The estimated OR comparing SAS to MAC was 0.98 [95%CI:0.36–2.69] for AC failures, 1.01 [95%CI:0.52–1.88] for seizures, 1.66 [95%CI:1.35–3.70] for new neurological dysfunction and 2.17 [95%CI:1.22–3.85] for conversion into GA. The latter result has to be interpreted cautiously: it is based on one retrospective high-risk-of-bias study, and significance was abolished in a sensitivity analysis of only prospectively conducted studies. Conclusion The SAS and MAC techniques were feasible and safe, whereas data for the AAA technique are limited. Large RCTs are required to prove the superiority of one anaesthetic regime for AC. PMID:27228013

  18. Anaesthesia Management for Awake Craniotomy: Systematic Review and Meta-Analysis.

    PubMed

    Stevanovic, Ana; Rossaint, Rolf; Veldeman, Michael; Bilotta, Federico; Coburn, Mark

    2016-01-01

    Awake craniotomy (AC) renders an expanded role in functional neurosurgery. Yet, evidence for optimal anaesthesia management remains limited. We aimed to summarise the latest clinical evidence on anaesthesia management for AC and to explore the relationship between AC failures and the anaesthesia techniques used. Two authors independently performed a systematic search of English-language articles in the PubMed and EMBASE databases (1/2007-12/2015). The search included randomised controlled trials (RCTs), observational trials, and case reports (n>4 cases) that reported the anaesthetic approach for AC and at least one of our pre-specified outcomes: intraoperative seizures, hypoxia, arterial hypertension, nausea and vomiting, neurological dysfunction, conversion into general anaesthesia and failure of AC. Random effects meta-analysis was used to estimate event rates for four outcomes. The relationship with anaesthesia technique was explored using logistic meta-regression, calculating odds ratios (OR) and 95% confidence intervals [95%CI]. We included forty-seven studies. Eighteen reported the asleep-awake-asleep technique (SAS), twenty-seven monitored anaesthesia care (MAC), one reported both and one used the awake-awake-awake technique (AAA). The proportions of AC failures, intraoperative seizures, new neurological dysfunction and conversion into general anaesthesia (GA) were 2% [95%CI:1-3], 8% [95%CI:6-11], 17% [95%CI:12-23] and 2% [95%CI:2-3], respectively. Meta-regression of the SAS and MAC techniques did not reveal any relevant differences between outcomes explained by the technique, except for conversion into GA. The estimated OR comparing SAS to MAC was 0.98 [95%CI:0.36-2.69] for AC failures, 1.01 [95%CI:0.52-1.88] for seizures, 1.66 [95%CI:1.35-3.70] for new neurological dysfunction and 2.17 [95%CI:1.22-3.85] for conversion into GA. The latter result has to be interpreted cautiously: it is based on one retrospective high-risk-of-bias study, and significance was abolished in a sensitivity analysis of only prospectively conducted studies. The SAS and MAC techniques were feasible and safe, whereas data for the AAA technique are limited. Large RCTs are required to prove the superiority of one anaesthetic regime for AC.

  19. Regression Model Optimization for the Analysis of Experimental Data

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.

    2009-01-01

    A candidate math model search algorithm was developed at Ames Research Center that determines a recommended math model for the multivariate regression analysis of experimental data. The search algorithm is applicable to classical regression analysis problems as well as wind tunnel strain gage balance calibration analysis applications. The algorithm compares the predictive capability of different regression models using the standard deviation of the PRESS residuals of the responses as a search metric. This search metric is minimized during the search. Singular value decomposition is used during the search to reject math models that lead to a singular solution of the regression analysis problem. Two threshold dependent constraints are also applied. The first constraint rejects math models with insignificant terms. The second constraint rejects math models with near-linear dependencies between terms. The math term hierarchy rule may also be applied as an optional constraint during or after the candidate math model search. The final term selection of the recommended math model depends on the regressor and response values of the data set, the user s function class combination choice, the user s constraint selections, and the result of the search metric minimization. A frequently used regression analysis example from the literature is used to illustrate the application of the search algorithm to experimental data.
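
    Illustrative sketch of the search metric described above: the standard deviation of PRESS (leave-one-out) residuals, computed in closed form for a linear model and used to compare two candidate models. Data are synthetic and hypothetical.

```python
# Sketch: standard deviation of PRESS (leave-one-out) residuals as a model
# comparison metric for ordinary least squares, using the closed-form identity
# e_i / (1 - h_ii). Data are synthetic.
import numpy as np

def press_std(X, y):
    """Standard deviation of leave-one-out prediction residuals for y ~ X."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    H = X @ np.linalg.pinv(X)                  # hat (projection) matrix
    resid = y - H @ y                          # ordinary residuals
    press_resid = resid / (1.0 - np.diag(H))   # leave-one-out residuals
    return press_resid.std(ddof=1)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = 1.0 + 2.0 * x + 0.5 * x**2 + rng.normal(scale=0.05, size=x.size)

X_linear    = np.column_stack([np.ones_like(x), x])
X_quadratic = np.column_stack([np.ones_like(x), x, x**2])
print(press_std(X_linear, y), press_std(X_quadratic, y))   # smaller is better
```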

  20. [Application of negative binomial regression and modified Poisson regression in the research of risk factors for injury frequency].

    PubMed

    Cao, Qingqing; Wu, Zhenqiang; Sun, Ying; Wang, Tiezhu; Han, Tengwei; Gu, Chaomei; Sun, Yehuan

    2011-11-01

    To explore the application of negative binomial regression and modified Poisson regression analysis in analyzing the factors influencing injury frequency and the risk factors leading to an increase in injury frequency. A total of 2917 primary and secondary school students were selected from Hefei by cluster random sampling and surveyed by questionnaire. The count data on injury events were used to fit modified Poisson regression and negative binomial regression models. The risk factors associated with an increase in unintentional injury frequency among these students were explored, in order to compare the performance of the two models in studying the factors influencing injury frequency. The Poisson model showed over-dispersion (P < 0.0001) based on a Lagrange multiplier test; the over-dispersed data were therefore better fitted by the modified Poisson regression and negative binomial regression models. Both showed that male gender, younger age, a father working outside the hometown, a guardian's education level above junior high school, and smoking might be associated with higher injury frequencies. For injury count data with a tendency toward over-dispersion, both modified Poisson regression analysis and negative binomial regression analysis can be used. However, based on our data, the modified Poisson regression fitted better, and this model could give a more accurate interpretation of the relevant factors affecting the frequency of injury.
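
    Illustrative sketch of the two models compared above: a modified Poisson regression (Poisson mean model with robust standard errors) and a negative binomial regression, fit with statsmodels. The file and variable names are hypothetical.

```python
# Sketch: modified Poisson regression (robust standard errors) and negative
# binomial regression for over-dispersed injury counts. File and variable
# names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("injury_survey.csv")   # one row per student (hypothetical)

formula = "injury_count ~ male + age + father_works_away + guardian_education + smoking"

# Modified Poisson: Poisson mean model with a sandwich (robust) covariance.
mod_poisson = smf.glm(formula, data=df, family=sm.families.Poisson()).fit(cov_type="HC1")

# Negative binomial: adds a dispersion parameter for extra-Poisson variation
# (held at its default value here rather than estimated).
neg_binom = smf.glm(formula, data=df, family=sm.families.NegativeBinomial()).fit()

print(mod_poisson.summary())
print(neg_binom.summary())
```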

  1. Logistic regression analysis of conventional ultrasonography, strain elastosonography, and contrast-enhanced ultrasound characteristics for the differentiation of benign and malignant thyroid nodules

    PubMed Central

    Deng, Yingyuan; Wang, Tianfu; Chen, Siping; Liu, Weixiang

    2017-01-01

    The aim of the study is to screen the significant sonographic features by logistic regression analysis and fit a model to diagnose thyroid nodules. A total of 525 pathological thyroid nodules were retrospectively analyzed. All the nodules underwent conventional ultrasonography (US), strain elastosonography (SE), and contrast-enhanced ultrasound (CEUS). Twelve suspicious sonographic features of the nodules were used to assess the thyroid nodules. The significant features for diagnosing thyroid nodules were selected by logistic regression analysis. All variables that were statistically related to the diagnosis of thyroid nodules at a level of p < 0.05 were included in a logistic regression model. The significant features in the logistic regression model for diagnosing thyroid nodules were calcification, suspected cervical lymph node metastasis, hypoenhancement pattern, margin, shape, vascularity, posterior acoustic features, echogenicity, and elastography score. According to the results of the logistic regression analysis, a formula that predicts whether thyroid nodules are malignant was established. The area under the receiver operating characteristic (ROC) curve was 0.930, and the sensitivity, specificity, accuracy, positive predictive value, and negative predictive value were 83.77%, 89.56%, 87.05%, 86.04%, and 87.79%, respectively. PMID:29228030

  2. Logistic regression analysis of conventional ultrasonography, strain elastosonography, and contrast-enhanced ultrasound characteristics for the differentiation of benign and malignant thyroid nodules.

    PubMed

    Pang, Tiantian; Huang, Leidan; Deng, Yingyuan; Wang, Tianfu; Chen, Siping; Gong, Xuehao; Liu, Weixiang

    2017-01-01

    The aim of the study is to screen the significant sonographic features by logistic regression analysis and fit a model to diagnose thyroid nodules. A total of 525 pathological thyroid nodules were retrospectively analyzed. All the nodules underwent conventional ultrasonography (US), strain elastosonography (SE), and contrast-enhanced ultrasound (CEUS). Twelve suspicious sonographic features of the nodules were used to assess the thyroid nodules. The significant features for diagnosing thyroid nodules were selected by logistic regression analysis. All variables that were statistically related to the diagnosis of thyroid nodules at a level of p < 0.05 were included in a logistic regression model. The significant features in the logistic regression model for diagnosing thyroid nodules were calcification, suspected cervical lymph node metastasis, hypoenhancement pattern, margin, shape, vascularity, posterior acoustic features, echogenicity, and elastography score. According to the results of the logistic regression analysis, a formula that predicts whether thyroid nodules are malignant was established. The area under the receiver operating characteristic (ROC) curve was 0.930, and the sensitivity, specificity, accuracy, positive predictive value, and negative predictive value were 83.77%, 89.56%, 87.05%, 86.04%, and 87.79%, respectively.

  3. Use of COD, TOC, and Fluorescence Spectroscopy to Estimate BOD in Wastewater.

    PubMed

    Christian, Evelyn; Batista, Jacimaria R; Gerrity, Daniel

    2017-02-01

      Common to all National Pollutant Discharge Elimination System (NPDES) permits in the United States is a limit on biochemical oxygen demand (BOD). Chemical oxygen demand (COD), total organic carbon (TOC), and fluorescence spectroscopy are also capable of quantifying organic content, although the mechanisms of quantification and the organic fractions targeted differ for each test. This study explores correlations between BOD5 and these alternate test procedures using facility influent, primary effluent, and facility effluent samples from a full-scale water resource recovery facility. Relative reductions of the water quality parameters proved to be strong indicators of their suitability as surrogates for BOD5. Suitable correlations were generally limited to the combined datasets for the three sampling locations or the facility effluent alone. COD exhibited relatively strong linear correlations with BOD5 when considering the three sample points (r = 0.985) and the facility effluent alone (r = 0.914), while TOC exhibited a suitable linear correlation with BOD5 in the facility effluent (r = 0.902). Exponential regressions proved to be useful for estimating BOD5 based on TOC or fluorescence (r > 0.95).
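
    Illustrative sketch of an exponential regression of BOD5 on TOC of the kind mentioned above, fit as a linear model on log-transformed BOD5; all values are hypothetical, not taken from the study.

```python
# Sketch: exponential regression BOD5 = a * exp(b * TOC), fit as a straight line
# on log-transformed BOD5. Values are hypothetical.
import numpy as np

toc  = np.array([4.0, 6.5, 9.0, 12.0, 18.0, 25.0, 40.0])    # mg/L
bod5 = np.array([3.0, 5.0, 8.0, 12.0, 25.0, 45.0, 140.0])   # mg/L

b, ln_a = np.polyfit(toc, np.log(bod5), 1)   # slope, intercept of ln(BOD5) ~ TOC
a = np.exp(ln_a)

pred = a * np.exp(b * toc)
r = np.corrcoef(np.log(bod5), np.log(pred))[0, 1]
print(f"BOD5 ~ {a:.2f} * exp({b:.3f} * TOC), r = {r:.3f}")
```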

  4. Porosity estimation by semi-supervised learning with sparsely available labeled samples

    NASA Astrophysics Data System (ADS)

    Lima, Luiz Alberto; Görnitz, Nico; Varella, Luiz Eduardo; Vellasco, Marley; Müller, Klaus-Robert; Nakajima, Shinichi

    2017-09-01

    This paper addresses the porosity estimation problem from seismic impedance volumes and porosity samples located in a small group of exploratory wells. Regression methods, trained on the impedance as inputs and the porosity as output labels, generally suffer from extremely expensive (and hence sparsely available) porosity samples. To make optimal use of the valuable porosity data, a semi-supervised machine learning method, Transductive Conditional Random Field Regression (TCRFR), was proposed and showed good performance (Görnitz et al., 2017). TCRFR, however, still requires more labeled data than are usually available, which creates a gap when applying the method to the porosity estimation problem in realistic situations. In this paper, we aim to fill this gap by introducing two graph-based preprocessing techniques, which adapt the original TCRFR to extremely weakly supervised scenarios. Our new method outperforms the previous automatic estimation methods on synthetic data and provides a result comparable to the labor-intensive, time-consuming geostatistics approach on real data, proving its potential as a practical industrial tool.

  5. Developing a Model for Forecasting Road Traffic Accident (RTA) Fatalities in Yemen

    NASA Astrophysics Data System (ADS)

    Karim, Fareed M. A.; Abdo Saleh, Ali; Taijoobux, Aref; Ševrović, Marko

    2017-12-01

    The aim of this paper is to develop a model for forecasting RTA fatalities in Yemen. Yearly fatalities were modeled as the dependent variable, while the candidate independent variables included the population, number of vehicles, GNP, GDP and real GDP per capita. It was determined that all these variables are highly correlated (r ≈ 0.9); in order to avoid multicollinearity in the model, the single variable with the highest r value was selected (real GDP per capita). A simple regression model was developed; the model fit was very good (R2 = 0.916); however, the residuals were serially correlated. The Prais-Winsten procedure was used to overcome this violation of the regression assumptions. Data for the 20-year period 1991-2010 were analyzed to build the model, and the model was validated using data for the years 2011-2013; the historical fit for the period 1991-2011 was very good. Also, the validation for 2011-2013 proved accurate.
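
    Illustrative sketch of handling serially correlated residuals in a time-series regression. statsmodels' GLSAR performs an iterative Cochrane-Orcutt-type AR(1) correction, which is closely related to (but not identical with) the Prais-Winsten procedure used in the study; the data below are hypothetical.

```python
# Sketch: OLS with a Durbin-Watson check, then an iterative AR(1) correction
# (Cochrane-Orcutt-type feasible GLS via GLSAR). Series are hypothetical.
import numpy as np
import statsmodels.api as sm

years = np.arange(1991, 2011)
real_gdp_per_capita = np.linspace(900.0, 1500.0, years.size)          # hypothetical
fatalities = (40.0 + 0.02 * real_gdp_per_capita
              + np.random.default_rng(1).normal(0.0, 2.0, years.size))

X = sm.add_constant(real_gdp_per_capita)

ols = sm.OLS(fatalities, X).fit()
print(sm.stats.durbin_watson(ols.resid))     # values far from 2 suggest autocorrelation

ar1 = sm.GLSAR(fatalities, X, rho=1).iterative_fit(maxiter=10)
print(ar1.params)
```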

  6. Robust Joint Graph Sparse Coding for Unsupervised Spectral Feature Selection.

    PubMed

    Zhu, Xiaofeng; Li, Xuelong; Zhang, Shichao; Ju, Chunhua; Wu, Xindong

    2017-06-01

    In this paper, we propose a new unsupervised spectral feature selection model by embedding a graph regularizer into the framework of joint sparse regression for preserving the local structures of data. To do this, we first extract the bases of training data by previous dictionary learning methods and, then, map original data into the basis space to generate their new representations, by proposing a novel joint graph sparse coding (JGSC) model. In JGSC, we first formulate its objective function by simultaneously taking subspace learning and joint sparse regression into account, then, design a new optimization solution to solve the resulting objective function, and further prove the convergence of the proposed solution. Furthermore, we extend JGSC to a robust JGSC (RJGSC) via replacing the least square loss function with a robust loss function, for achieving the same goals and also avoiding the impact of outliers. Finally, experimental results on real data sets showed that both JGSC and RJGSC outperformed the state-of-the-art algorithms in terms of k -nearest neighbor classification performance.

  7. Probabilistic forecasting for extreme NO2 pollution episodes.

    PubMed

    Aznarte, José L

    2017-10-01

    In this study, we investigate the suitability of quantile regression for predicting extreme concentrations of NO2. In contrast to the usual point forecasting, where a single value is forecast for each horizon, probabilistic forecasting through quantile regression allows the full probability distribution to be predicted, which in turn allows models to be built that are specifically fit for the tails of this distribution. Using data from the city of Madrid, including NO2 concentrations as well as meteorological measures, we build models that predict extreme NO2 concentrations, outperforming point-forecasting alternatives, and we prove that the predictions are accurate, reliable and sharp. Besides, we study the relative importance of the independent variables involved, and show how the variables important for the median quantile differ from those important for the upper quantiles. Furthermore, we present a method to compute the probability of exceedance of thresholds, which is a simple and comprehensible way to present probabilistic forecasts while maximizing their usefulness. Copyright © 2017 Elsevier Ltd. All rights reserved.
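
    Illustrative sketch of quantile regression for upper-tail concentrations, in the spirit of the approach described above; the file, column names and quantile levels are hypothetical.

```python
# Sketch: median versus upper-tail quantile regression for NO2.
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("no2_hourly.csv")   # NO2 plus meteorological predictors (hypothetical)

formula = "no2 ~ wind_speed + temperature + solar_radiation + traffic_index"

median_fit = smf.quantreg(formula, df).fit(q=0.5)    # central behaviour
tail_fit   = smf.quantreg(formula, df).fit(q=0.95)   # extreme-pollution tail

# Coefficients often differ between quantiles, which is the point of
# modelling the extremes directly.
print(median_fit.params)
print(tail_fit.params)
```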

  8. Intelligent Optimization of the Film-to-Fiber Ratio of a Degradable Braided Bicomponent Ureteral Stent

    PubMed Central

    Liu, Xiaoyan; Li, Feng; Ding, Yongsheng; Zou, Ting; Wang, Lu; Hao, Kuangrong

    2015-01-01

    A hierarchical support vector regression (SVR) model (HSVRM) was employed for the first time to correlate the compositions and mechanical properties of bicomponent stents composed of poly(lactic-co-glycolic acid) (PGLA) film and poly(glycolic acid) (PGA) fibers for ureteral repair. PGLA film and PGA fibers provide ureteral stents with good compressive and tensile properties, respectively. In bicomponent stents, high film content led to high stiffness, while high fiber content resulted in poor compressional properties. To simplify the procedure for optimizing the ratio of PGLA film to PGA fiber in the stents, the HSVRM and a particle swarm optimization (PSO) algorithm were used to construct relationships between the film-to-fiber weight ratio and the measured compressional/tensile properties of the stents. The experimental data and simulated data fit well, proving that the HSVRM can closely reflect the relationship between the component ratio and the performance properties of the ureteral stents. PMID:28793658

  9. Prediction models for Arabica coffee beverage quality based on aroma analyses and chemometrics.

    PubMed

    Ribeiro, J S; Augusto, F; Salva, T J G; Ferreira, M M C

    2012-11-15

    In this work, soft modeling based on chemometric analyses of coffee beverage sensory data and the chromatographic profiles of volatile roasted coffee compounds is proposed to predict the scores of acidity, bitterness, flavor, cleanliness, body, and overall quality of the coffee beverage. A partial least squares (PLS) regression method was used to construct the models. The ordered predictor selection (OPS) algorithm was applied to select the compounds for the regression model of each sensory attribute in order to take only significant chromatographic peaks into account. The prediction errors of these models, using 4 or 5 latent variables, were equal to 0.28, 0.33, 0.35, 0.33, 0.34 and 0.41, for each of the attributes and compatible with the errors of the mean scores of the experts. Thus, the results proved the feasibility of using a similar methodology in on-line or routine applications to predict the sensory quality of Brazilian Arabica coffee. Copyright © 2012 Elsevier B.V. All rights reserved.
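
    Illustrative sketch of a PLS regression of a sensory score on chromatographic peak areas, analogous to the modelling described above; the data dimensions are simulated placeholders and the predictor-selection step (OPS) is omitted.

```python
# Sketch: PLS regression of a sensory score on chromatographic peak areas with
# cross-validated prediction error. Data are simulated placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(42)
peak_areas = rng.random((40, 120))                    # 40 coffees x 120 volatile peaks
overall_quality = peak_areas[:, :5].sum(axis=1) + rng.normal(0.0, 0.1, 40)

pls = PLSRegression(n_components=5)
pred = cross_val_predict(pls, peak_areas, overall_quality, cv=5)

rmsecv = np.sqrt(np.mean((pred.ravel() - overall_quality) ** 2))
print(f"cross-validated prediction error: {rmsecv:.2f}")
```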

  10. Simple, Efficient Estimators of Treatment Effects in Randomized Trials Using Generalized Linear Models to Leverage Baseline Variables

    PubMed Central

    Rosenblum, Michael; van der Laan, Mark J.

    2010-01-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation. PMID:20628636

  11. Quantum walks with an anisotropic coin I: spectral theory

    NASA Astrophysics Data System (ADS)

    Richard, S.; Suzuki, A.; Tiedra de Aldecoa, R.

    2018-02-01

    We perform the spectral analysis of the evolution operator U of quantum walks with an anisotropic coin, which include one-defect models, two-phase quantum walks, and topological phase quantum walks as special cases. In particular, we determine the essential spectrum of U, we show the existence of locally U-smooth operators, we prove the discreteness of the eigenvalues of U outside the thresholds, and we prove the absence of singular continuous spectrum for U. Our analysis is based on new commutator methods for unitary operators in a two-Hilbert spaces setting, which are of independent interest.

  12. Patterns of medicinal plant use: an examination of the Ecuadorian Shuar medicinal flora using contingency table and binomial analyses.

    PubMed

    Bennett, Bradley C; Husby, Chad E

    2008-03-28

    Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany, Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet the requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. The modified regression models (which accommodated the assumptions of linear regression) reduced R(2) from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
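
    Illustrative sketch of the binomial test for a single family, as described above: the number of medicinal species in one family is compared against the flora-wide proportion. All counts are hypothetical.

```python
# Sketch: binomial test for over- or under-representation of one plant family,
# given the flora-wide proportion of medicinal species. Counts are hypothetical.
from scipy.stats import binomtest

flora_total     = 3000   # species in the regional flora
medicinal_total = 450    # of which are used medicinally
p_overall = medicinal_total / flora_total

family_species   = 120   # species of one family in the flora
family_medicinal = 35    # of which are used medicinally

result = binomtest(family_medicinal, n=family_species, p=p_overall)
print(result.pvalue)     # small p-value: the family departs from the flora-wide rate
```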

  13. The Precision Efficacy Analysis for Regression Sample Size Method.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The general purpose of this study was to examine the efficiency of the Precision Efficacy Analysis for Regression (PEAR) method for choosing appropriate sample sizes in regression studies used for precision. The PEAR method, which is based on the algebraic manipulation of an accepted cross-validity formula, essentially uses an effect size to…

  14. Effect of Contact Damage on the Strength of Ceramic Materials.

    DTIC Science & Technology

    1982-10-01

    variables that are important to erosion, and a multivariate linear regression analysis is used to fit the data to the dimensional analysis; the exponents of Equations 7 and 8 were computed by a multivariable regression analysis of the room-temperature data. [Remainder of this record is unrecoverable OCR residue from a table of regression exponents and the reference list.]

  15. [Study of Cervical Exfoliated Cell's DNA Quantitative Analysis Based on Multi-Spectral Imaging Technology].

    PubMed

    Wu, Zheng; Zeng, Li-bo; Wu, Qiong-shui

    2016-02-01

    Conventional cervical cancer screening methods mainly include TBS (The Bethesda System) classification and quantitative cellular DNA analysis. However, no previous study has achieved both methods simultaneously on a single slide stained with multiple reagents, i.e., with the cytoplasm stained by Papanicolaou reagent and the nucleus by Feulgen reagent. The difficulty of this multiple-staining approach is that the absorbance of non-DNA material may interfere with the absorbance of DNA. We therefore set up a multi-spectral imaging system and established an absorbance unmixing model using multiple linear regression, based on the linear superposition of absorbances, to strip out the absorbance of DNA for quantitative DNA analysis, thereby combining the two conventional screening methods. A series of experiments showed no statistically significant difference, at the 1% test level, between the DNA absorbance calculated by the unmixing model and the measured DNA absorbance. In practical application, the 99% confidence interval of the DNA index of tetraploid cells screened by this method did not intersect the DNA-index interval used to judge cancer cells. These results verify the accuracy and feasibility of quantitative DNA analysis with the multiple-staining method, which therefore has broad application prospects and considerable market potential in the early diagnosis of cervical cancer and other cancers.
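
    Illustrative sketch of absorbance unmixing by multiple linear regression, assuming absorbances superpose linearly as described above; the reference spectra and the measured spectrum below are simulated, not taken from the study.

```python
# Sketch: least-squares unmixing of a multi-spectral absorbance measurement into
# DNA and non-DNA contributions, assuming absorbances superpose linearly.
# Reference spectra and the measured spectrum are simulated.
import numpy as np

wavelengths = np.linspace(450.0, 650.0, 16)                    # nm, spectral channels
dna_ref  = np.exp(-((wavelengths - 560.0) / 30.0) ** 2)        # Feulgen-stained DNA
cyto_ref = np.exp(-((wavelengths - 500.0) / 45.0) ** 2)        # Papanicolaou-stained cytoplasm

# Simulated mixed-pixel absorbance: 0.7 x DNA + 0.3 x cytoplasm plus noise.
measured = (0.7 * dna_ref + 0.3 * cyto_ref
            + np.random.default_rng(0).normal(0.0, 0.01, wavelengths.size))

# Multiple linear regression of the measured spectrum on the reference spectra.
A = np.column_stack([dna_ref, cyto_ref])
coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(coeffs)                       # estimated DNA and non-DNA contributions
dna_only = coeffs[0] * dna_ref      # stripped-out DNA absorbance spectrum
```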

  16. GIS-based groundwater potential mapping using boosted regression tree, classification and regression tree, and random forest machine learning models in Iran.

    PubMed

    Naghibi, Seyed Amir; Pourghasemi, Hamid Reza; Dixon, Barnali

    2016-01-01

    Groundwater is considered one of the most valuable fresh water resources. The main objective of this study was to produce groundwater spring potential maps in the Koohrang Watershed, Chaharmahal-e-Bakhtiari Province, Iran, using three machine learning models: boosted regression tree (BRT), classification and regression tree (CART), and random forest (RF). Thirteen hydrological-geological-physiographical (HGP) factors that influence locations of springs were considered in this research. These factors include slope degree, slope aspect, altitude, topographic wetness index (TWI), slope length (LS), plan curvature, profile curvature, distance to rivers, distance to faults, lithology, land use, drainage density, and fault density. Subsequently, groundwater spring potential was modeled and mapped using CART, RF, and BRT algorithms. The predicted results from the three models were validated using the receiver operating characteristics curve (ROC). From 864 springs identified, 605 (≈70 %) locations were used for the spring potential mapping, while the remaining 259 (≈30 %) springs were used for the model validation. The area under the curve (AUC) for the BRT model was calculated as 0.8103 and for CART and RF the AUC were 0.7870 and 0.7119, respectively. Therefore, it was concluded that the BRT model produced the best prediction results while predicting locations of springs followed by CART and RF models, respectively. Geospatially integrated BRT, CART, and RF methods proved to be useful in generating the spring potential map (SPM) with reasonable accuracy.
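
    Illustrative sketch of fitting boosted-tree, single-tree and random-forest classifiers and comparing them by AUC on a hold-out split, mirroring the workflow above; the file and feature names are hypothetical.

```python
# Sketch: BRT, CART and RF classifiers compared by hold-out AUC.
# File and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

df = pd.read_csv("spring_conditioning_factors.csv")
X = df.drop(columns="spring_present")
y = df["spring_present"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "BRT":  GradientBoostingClassifier(),
    "CART": DecisionTreeClassifier(max_depth=5),
    "RF":   RandomForestClassifier(n_estimators=500),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(name, round(auc, 3))
```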

  17. Common pitfalls in statistical analysis: Linear regression analysis

    PubMed Central

    Aggarwal, Rakesh; Ranganathan, Priya

    2017-01-01

    In a previous article in this series, we explained correlation analysis which describes the strength of relationship between two continuous variables. In this article, we deal with linear regression analysis which predicts the value of one continuous variable from another. We also discuss the assumptions and pitfalls associated with this analysis. PMID:28447022

  18. Predicting water consumption habits for seven arsenic-safe water options in Bangladesh.

    PubMed

    Inauen, Jennifer; Tobias, Robert; Mosler, Hans-Joachim

    2013-05-01

    In Bangladesh, 20 million people are at risk of developing arsenicosis because of excessive arsenic intake. Despite increased awareness, many of the implemented arsenic-safe water options are not being sufficiently used by the population. This study investigated the role of social-cognitive factors in explaining the habitual use of arsenic-safe water options. Eight hundred seventy-two randomly selected households in six arsenic-affected districts of rural Bangladesh, which had access to an arsenic-safe water option, were interviewed using structured face-to-face interviews in November 2009. Habitual use of arsenic-safe water options, severity, vulnerability, affective and instrumental attitudes, injunctive and descriptive norms, self-efficacy, and coping planning were measured. The data were analyzed using multiple linear regressions. Linear regression revealed that self-efficacy (B = 0.42, SE = .03, p < .001), the instrumental attitude towards the safe water option (B = 0.24, SE = .04, p < .001), the affective attitude towards contaminated tube wells (B = -0.04, SE = .02, p = .024), vulnerability (B = -0.20, SE = .02, p < .001), as well as injunctive (B = 0.08, SE = 0.04, p = .049) and descriptive norms (B = 0.34, SE = .03, p < .001) primarily explained the habitual use of arsenic-safe water options (R2 = 0.688). This model proved highly generalizable to all seven arsenic-safe water options investigated, even though the habitual use of single options was predicted on the basis of parameters estimated without these options. This general model for the habitual use of arsenic-safe water options may prove useful for predicting other water consumption habits. Behavior-change interventions are derived from the model to promote the habitual use of arsenic-safe water options.

  19. Libya Country Analysis Brief

    EIA Publications

    2015-01-01

    Libya joined the Organization of the Petroleum Exporting Countries (OPEC) in 1962, a year after Libya began exporting oil. Libya holds the largest amount of proved crude oil reserves in Africa, the fifth-largest amount of proved natural gas reserves on the continent, and in past years was an important contributor to the global supply of light, sweet (low sulfur) crude oil, which Libya mostly exports to European markets.

  20. Demand reduction analysis for Aberdeen Proving Grounds, Aberdeen, Maryland. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-06-01

    The objectives of the project are to research, identify, evaluate, and define energy saving projects that meet the Army's criteria and lead to energy savings at the Aberdeen Proving Grounds, Aberdeen campus, with respect to electrical demand reduction. Details of the authorization and objectives of this report, which delineates our contractual arrangement with the government, may be found in Section 8.11.

  1. Supplemental Analysis on Compressed Sensing Based Interior Tomography

    PubMed Central

    Yu, Hengyong; Yang, Jiansheng; Jiang, Ming; Wang, Ge

    2010-01-01

    Recently, in the compressed sensing framework we proved that an interior ROI can be exactly reconstructed via the total variation minimization if the ROI is piecewise constant. In the proofs, we implicitly utilized the property that if an artifact image assumes a constant value within the ROI then this constant must be zero. Here we prove this property in the space of square integrable functions. PMID:19717891

  2. Quality of life in breast cancer patients--a quantile regression analysis.

    PubMed

    Pourhoseingholi, Mohamad Amin; Safaee, Azadeh; Moghimi-Dehkordi, Bijan; Zeighami, Bahram; Faghihzadeh, Soghrat; Tabatabaee, Hamid Reza; Pourhoseingholi, Asma

    2008-01-01

    Quality of life studies have an important role in health care, especially in chronic diseases, in clinical judgment and in the allocation of medical resources. Statistical tools like linear regression are widely used to assess the predictors of quality of life, but when the response is not normally distributed the results are misleading. The aim of this study is to determine the predictors of quality of life in breast cancer patients using a quantile regression model and to compare the results to linear regression. A cross-sectional study was conducted on 119 breast cancer patients admitted and treated in the chemotherapy ward of Namazi hospital in Shiraz. We used the QLQ-C30 questionnaire to assess quality of life in these patients. Quantile regression was employed to assess the associated factors and the results were compared to linear regression. All analyses were carried out using SAS. The mean score for global health status in the breast cancer patients was 64.92+/-11.42. Linear regression showed that only grade of tumor, occupational status, menopausal status, financial difficulties and dyspnea were statistically significant. In contrast to linear regression, financial difficulties were not significant in the quantile regression analysis, and dyspnea was significant only for the first quartile. Also, emotional functioning and duration of disease statistically predicted the QOL score in the third quartile. The results demonstrate that using quantile regression leads to better interpretation and richer inference about the predictors of quality of life in breast cancer patients.

  3. Bayesian B-spline mapping for dynamic quantitative traits.

    PubMed

    Xing, Jun; Li, Jiahan; Yang, Runqing; Zhou, Xiaojing; Xu, Shizhong

    2012-04-01

    Owing to their ability and flexibility to describe individual gene expression at different time points, random regression (RR) analyses have become a popular procedure for the genetic analysis of dynamic traits whose phenotypes are collected over time. Specifically, when modelling the dynamic patterns of gene expressions in the RR framework, B-splines have been proved successful as an alternative to orthogonal polynomials. In the so-called Bayesian B-spline quantitative trait locus (QTL) mapping, B-splines are used to characterize the patterns of QTL effects and individual-specific time-dependent environmental errors over time, and the Bayesian shrinkage estimation method is employed to estimate model parameters. Extensive simulations demonstrate that (1) in terms of statistical power, Bayesian B-spline mapping outperforms the interval mapping based on the maximum likelihood; (2) for the simulated dataset with complicated growth curve simulated by B-splines, Legendre polynomial-based Bayesian mapping is not capable of identifying the designed QTLs accurately, even when higher-order Legendre polynomials are considered and (3) for the simulated dataset using Legendre polynomials, the Bayesian B-spline mapping can find the same QTLs as those identified by Legendre polynomial analysis. All simulation results support the necessity and flexibility of B-spline in Bayesian mapping of dynamic traits. The proposed method is also applied to a real dataset, where QTLs controlling the growth trajectory of stem diameters in Populus are located.

  4. Game theory based models to analyze water conflicts in the Middle Route of the South-to-North Water Transfer Project in China.

    PubMed

    Wei, Shouke; Yang, Hong; Abbaspour, Karim; Mousavi, Jamshid; Gnauck, Albrecht

    2010-04-01

    This study applied game theory based models to analyze and solve water conflicts concerning water allocation and nitrogen reduction in the Middle Route of the South-to-North Water Transfer Project in China. The game simulation comprised two levels: one main game with five players and four sub-games, each containing three sub-players. We used statistical and econometric regression methods to formulate the payoff functions of the players, economic valuation methods (EVMs) to transform non-monetary values into economic ones, cost-benefit analysis (CBA) to compare the game outcomes, and scenario analysis to investigate future uncertainties. The validity of the game simulation was evaluated by comparing predictions with observations. The main results proved that cooperation would make the players collectively better off, though some players would face losses. However, players were not willing to cooperate, which would result in a prisoners' dilemma. Scenario simulation results showed that players in the water-scarce area could not solve their severe water deficit problem without cooperation with other players, even under an optimistic scenario, while the uncertainty of cooperation would come from the main polluters. The results suggest a need to design a mechanism that reduces the risk of losses for those players through a side payment, which provides them with economic incentives to cooperate. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  5. Discriminating between stabilizing and destabilizing protein design mutations via recombination and simulation.

    PubMed

    Johnson, Lucas B; Gintner, Lucas P; Park, Sehoo; Snow, Christopher D

    2015-08-01

    Accuracy of current computational protein design (CPD) methods is limited by inherent approximations in energy potentials and sampling. These limitations are often used to qualitatively explain design failures; however, relatively few studies provide specific examples or quantitative details that can be used to improve future CPD methods. Expanding the design method to include a library of sequences provides data that is well suited for discriminating between stabilizing and destabilizing design elements. Using thermophilic endoglucanase E1 from Acidothermus cellulolyticus as a model enzyme, we computationally designed a sequence with 60 mutations. The design sequence was rationally divided into structural blocks and recombined with the wild-type sequence. Resulting chimeras were assessed for activity and thermostability. Surprisingly, unlike previous chimera libraries, regression analysis based on one- and two-body effects was not sufficient for predicting chimera stability. Analysis of molecular dynamics simulations proved helpful in distinguishing stabilizing and destabilizing mutations. Reverting to the wild-type amino acid at destabilized sites partially regained design stability, and introducing predicted stabilizing mutations in wild-type E1 significantly enhanced thermostability. The ability to isolate stabilizing and destabilizing elements in computational design offers an opportunity to interpret previous design failures and improve future CPD methods. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. Granulysin as a novel factor for the prognosis of the clinical course of chickenpox.

    PubMed

    Baljic, R; Gojak, R; Konjo, H; Hukic, M

    2018-05-01

    Granulysin is a recently discovered cytolytic protein of natural killer (NK) cells and cytotoxic T lymphocytes. Studies of healthy and immunocompromised patients with primary or recurrent varicella-zoster infections demonstrate the importance of virus-specific cellular immunity in controlling viral replication, and some studies have presented granulysin as a molecule that may play a role in chickenpox immunopathogenesis. This study investigated the possible correlation between serum granulysin levels and the clinical course of chickenpox. A total of 69 patients with chickenpox were included in the study. We measured the levels of granulysin and the percentage counts of CD4+, CD8+ and NK cells in serum for all patients and healthy controls. For detection and quantification of granulysin in sera, we performed an ELISA test, and flow cytometry was used for detection, identification and percentage measurement of T and B lymphocytes. Descriptive methods, analysis of variance and multivariate logistic regression were used for statistical data analysis. We found a correlation between serum granulysin level and the severity of the clinical presentation. These findings can be a good input for further studies, since there is no relevant prognostic parameter for chickenpox in everyday clinical practice. Granulysin as a therapeutic also deserves to be a point of interest in the future: if its potential to stop dissemination of human herpes viruses is proven, it could be very valuable in some life-threatening forms of viral disease.

  7. Quantification of febuxostat polymorphs using powder X-ray diffraction technique.

    PubMed

    Qiu, Jing-bo; Li, Gang; Sheng, Yue; Zhu, Mu-rong

    2015-03-25

    Febuxostat is a pharmaceutical compound with more than 20 polymorphs, of which form A is most widely used and usually exists in a mixed polymorphic form with form G. In the present study, a quantification method for polymorphic forms A and G of febuxostat (FEB) has been developed using powder X-ray diffraction (PXRD). Prior to development of the quantification method, the pure polymorphic forms A and G were characterized. A continuous scan with a scan rate of 3° min(-1) over an angular range of 3-40° 2θ was applied for construction of the calibration curve, using the characteristic peaks of form A at 12.78° 2θ (I/I0 = 100%) and of form G at 11.72° 2θ (I/I0 = 100%). The linear regression analysis data for the calibration plots show a good linear relationship, with R(2) = 0.9985 with respect to peak area in the concentration range 10-60 wt.%. The method was validated for precision, recovery and ruggedness. The limits of detection and quantitation are 1.5% and 4.6%, respectively. The obtained results prove that the method is repeatable, sensitive and accurate. The proposed PXRD method can be applied for the quantitative analysis of mixtures of febuxostat polymorphs (forms A and G). Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Reliability and validity of the personality inventory for DSM-5 (PID-5): predicting DSM-IV personality disorders and psychopathy in community-dwelling Italian adults.

    PubMed

    Fossati, Andrea; Krueger, Robert F; Markon, Kristian E; Borroni, Serena; Maffei, Cesare

    2013-12-01

    In order to assess the internal consistency, factor structure, and ability to recover DSM-IV personality disorders (PDs) of the Personality Inventory for DSM-5 (PID-5) scales, 710 Italian adult community dwelling volunteers were administered the Italian translation of the PID-5, as well as the Italian translation of the Personality Diagnostic Questionnaire-4+ (PDQ-4+). Cronbach's alpha values were >.70 for all PID-5 facet scales and greater than .90 for all PID-5 domain scales. Parallel analysis and confirmatory factor analysis supported the theoretical five-factor model of the PID-5 trait scales. Regression analyses showed that both PID-5 trait and domain scales explained a substantial amount of variance in the PDQ-4+ PD scales, with the exception of the Passive-Aggressive PD scale. When the PID-5 was administered to a second independent sample of 389 Italian adult community dwelling volunteers, the basic psychometric properties of the scale were replicated. In this second sample, the PID-5 trait and domain scales proved to be significant predictors of psychopathy measures. As a whole, the results of the present study support the hypothesis that the PID-5 is a reliable instrument which is able to recover DSM-IV PDs, as well as to capture personality pathology that is not included in the DSM-IV (namely, psychopathy).

  9. Magnetic resonance T1 gradient-echo imaging in hepatolithiasis.

    PubMed

    Safar, F; Kamura, T; Okamuto, K; Sasai, K; Gejyo, F

    2005-01-01

    We examined the role of magnetic resonance T1-weighted gradient-echo (MRT1-GE) imaging in hepatolithiasis. MRT1-GE, precontrast computed tomography (CT), and magnetic resonance cholangiopancreatography (MRCP) of 10 patients with hepatolithiasis were compared for their diagnostic accuracies in the detection and localization of intrahepatic calculi. The diagnosis of hepatolithiasis was confirmed by surgery. For localization of the stone, we divided the bile ducts into six areas: right and left hepatic ducts and bile ducts of the lateral, medial, right anterior, and right posterior segments of the liver. Chemical analysis of the stones was performed in eight patients. The total number of segments proved by surgery to contain stones was 18. Although not significantly different, the sensitivity of MRT1-GE was 77.8% (14 of 18 segments), higher than that of MRCP (66.7%, 12 of 18 segments) and that of CT (50%, nine of 18 segments). The sensitivity of magnetic resonance imaging (MRCP + MRT1) was significantly higher than that of CT (p < 0.01). Multiple logistic regression analysis showed that the result of surgery was significantly affected only by the result of magnetic resonance imaging. On MRT1-GE, all the depicted stones appeared as high-intensity signal areas within the low-intensity bile duct irrespective of their chemical composition. MRT1-GE imaging provides complementary information concerning hepatolithiasis.

  10. Development and validation of a reversed-phase high-performance thin-layer chromatography-densitometric method for determination of atorvastatin calcium in bulk drug and tablets.

    PubMed

    Shirkhedkar, Atul A; Surana, Sanjay J

    2010-01-01

    Atorvastatin calcium is a synthetic HMG-CoA reductase inhibitor that is used as a cholesterol-lowering agent. A simple, sensitive, selective, and precise RP-HPTLC-densitometric determination of atorvastatin calcium both as bulk drug and from pharmaceutical formulation was developed and validated according to International Conference on Harmonization guidelines. The method used aluminum sheets precoated with silica gel 60 RP18F254S as the stationary phase, and the mobile phase consisted of methanol-water (3.5 + 1.5, v/v). The system gave a compact band for atorvastatin calcium with an Rf value of 0.62 +/- 0.02. Densitometric quantification was carried out at 246 nm. The linear regression analysis data for the calibration plots showed a good linear relationship with r = 0.9992 in the working concentration range of 100-800 ng/band. The method was validated for precision, accuracy, ruggedness, robustness, specificity, recovery, LOD, and LOQ. The LOD and LOQ were 6 and 18 ng, respectively. The drug underwent hydrolysis when subjected to acidic conditions and was found to be stable under alkali, oxidation, dry heat, and photodegradation conditions. Statistical analysis proved that the developed RP-HPTLC-densitometry method is reproducible and selective and that it can be applied for identification and quantitative determination of atorvastatin calcium in bulk drug and tablet formulation.

  11. The impact of socioeconomic factors on municipal solid waste generation in São Paulo, Brazil.

    PubMed

    Vieira, Victor H Argentino de Morais; Matheus, Dácio R

    2018-01-01

    Social factors have not been sufficiently explored in municipal solid waste management studies. Latin America has produced even fewer studies with this approach; technical and economic investigations have prevailed. We explored the impacts of socioeconomic factors on municipal solid waste generation in Greater São Paulo, which includes 39 municipalities. We investigated the relations between municipal solid waste generation and social factors using Pearson's correlation coefficient. The Student's t-test (at p < 0.01) proved significance, and further regression analysis was performed with the significant factors. We considered 10 socioeconomic factors: population, rural population, density, life expectancy, education (secondary, high and undergraduate level), income per capita, inequality and human development. A subsequent multicollinearity analysis identified inequality (rp = 0.625) and income per capita (rp = 0.607) as major drivers. The results showed the relevance of considering social aspects in municipal solid waste management and isolated inequality as an important factor in planning. Inequality must be used as a complementary factor to income, rather than being used exclusively. Inequality may explain differences in waste generation between areas with similar incomes because of consumption patterns. Therefore, unequal realities demand unequal measures to avoid exacerbation, for example, pay-as-you-throw policies instead of uniform fees. Unequal realities also highlight the importance of tiering policies beyond the waste sector, such as sustainable consumption.
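
    The two-step procedure above (screen candidate factors with Pearson's r at p < 0.01, then regress waste generation on the significant ones) can be sketched as follows; the municipality values are invented stand-ins, not the study's data:

      import pandas as pd
      from scipy import stats
      import statsmodels.api as sm

      # Hypothetical data for a few municipalities: waste generation and two candidate drivers.
      df = pd.DataFrame({
          "waste_kg_per_capita": [0.8, 1.1, 0.9, 1.4, 1.2, 1.0, 1.3, 0.7],
          "income_per_capita":   [650, 900, 700, 1300, 1100, 850, 1250, 600],
          "inequality_gini":     [0.48, 0.52, 0.47, 0.58, 0.55, 0.50, 0.57, 0.45],
      })

      # Step 1: keep factors whose Pearson correlation with waste generation is significant.
      significant = []
      for col in ["income_per_capita", "inequality_gini"]:
          r, p = stats.pearsonr(df[col], df["waste_kg_per_capita"])
          print(f"{col}: r = {r:.3f}, p = {p:.4f}")
          if p < 0.01:
              significant.append(col)

      # Step 2: regression analysis with the significant factors.
      if significant:
          X = sm.add_constant(df[significant])
          print(sm.OLS(df["waste_kg_per_capita"], X).fit().summary())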

  12. The microcomputer scientific software series 2: general linear model--regression.

    Treesearch

    Harold M. Rauscher

    1983-01-01

    The general linear model regression (GLMR) program provides the microcomputer user with a sophisticated regression analysis capability. The output provides a regression ANOVA table, estimators of the regression model coefficients, their confidence intervals, confidence intervals around the predicted Y-values, residuals for plotting, a check for multicollinearity, a...

  13. Assessing Wheat Traits by Spectral Reflectance: Do We Really Need to Focus on Predicted Trait-Values or Directly Identify the Elite Genotypes Group?

    PubMed Central

    Garriga, Miguel; Romero-Bravo, Sebastián; Estrada, Félix; Escobar, Alejandro; Matus, Iván A.; del Pozo, Alejandro; Astudillo, Cesar A.; Lobos, Gustavo A.

    2017-01-01

    Phenotyping, via remote and proximal sensing techniques, of the agronomic and physiological traits associated with yield potential and drought adaptation could contribute to improvements in breeding programs. In the present study, 384 genotypes of wheat (Triticum aestivum L.) were tested under fully irrigated (FI) and water stress (WS) conditions. The following traits were evaluated and assessed via spectral reflectance: Grain yield (GY), spikes per square meter (SM2), kernels per spike (KPS), thousand-kernel weight (TKW), chlorophyll content (SPAD), stem water soluble carbohydrate concentration and content (WSC and WSCC, respectively), carbon isotope discrimination (Δ13C), and leaf area index (LAI). The performances of spectral reflectance indices (SRIs), four regression algorithms (PCR, PLSR, ridge regression RR, and SVR), and three classification methods (PCA-LDA, PLS-DA, and kNN) were evaluated for the prediction of each trait. For the classification approaches, two classes were established for each trait: The lower 80% of the trait variability range (Class 1) and the remaining 20% (Class 2 or elite genotypes). Both the SRIs and regression methods performed better when data from FI and WS were combined. The traits that were best estimated by SRIs and regression methods were GY and Δ13C. For most traits and conditions, the estimations provided by RR and SVR were the same, or better than, those provided by the SRIs. PLS-DA showed the best performance among the categorical methods and, unlike the SRI and regression models, most traits were relatively well-classified within a specific hydric condition (FI or WS), proving that classification approach is an effective tool to be explored in future studies related to genotype selection. PMID:28337210

  14. Methods for estimating the magnitude and frequency of peak discharges of rural, unregulated streams in Virginia

    USGS Publications Warehouse

    Bisese, James A.

    1995-01-01

    Methods are presented for estimating the peak discharges of rural, unregulated streams in Virginia. A Pearson Type III distribution is fitted to the logarithms of the unregulated annual peak-discharge records from 363 stream-gaging stations in Virginia to estimate the peak discharge at these stations for recurrence intervals of 2 to 500 years. Peak-discharge characteristics for 284 unregulated stations are divided into eight regions based on physiographic province, and regressed on basin characteristics, including drainage area, main channel length, main channel slope, mean basin elevation, percentage of forest cover, mean annual precipitation, and maximum rainfall intensity. Regression equations for each region are computed by use of the generalized least-squares method, which accounts for spatial and temporal correlation between nearby gaging stations. This regression technique weights the significance of each station to the regional equation based on the length of records collected at each station, the correlation between annual peak discharges among the stations, and the standard deviation of the annual peak discharge for each station. Drainage area proved to be the only significant explanatory variable in four regions, while other regions have as many as three significant variables. Standard errors of the regression equations range from 30 to 80 percent. Alternate equations using drainage area only are provided for the five regions with more than one significant explanatory variable. Methods and sample computations are provided to estimate peak discharges at gaged and ungaged sites in Virginia for recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, and to adjust the regression estimates for sites on gaged streams where nearby gaging-station records are available.

  15. Evaluating Machine Learning Regression Algorithms for Operational Retrieval of Biophysical Parameters: Opportunities for Sentinel

    NASA Astrophysics Data System (ADS)

    Verrelst, Jochem; Rivera, J. P.; Alonso, L.; Guanter, L.; Moreno, J.

    2012-04-01

    ESA’s upcoming satellites Sentinel-2 (S2) and Sentinel-3 (S3) aim to ensure continuity for Landsat 5/7, SPOT-5, SPOT-Vegetation and Envisat MERIS observations by providing superspectral images of high spatial and temporal resolution. S2 and S3 will deliver near real-time operational products with a high accuracy for land monitoring. This unprecedented data availability leads to an urgent need for developing robust and accurate retrieval methods. Machine learning regression algorithms could be powerful candidates for the estimation of biophysical parameters from satellite reflectance measurements because of their ability to perform adaptive, nonlinear data fitting. By using data from the ESA-led field campaign SPARC (Barrax, Spain), it was recently found [1] that Gaussian processes regression (GPR) outperformed competitive machine learning algorithms such as neural networks, support vector regression and kernel ridge regression, both in terms of accuracy and computational speed. For various Sentinel configurations (S2-10m, S2-20m, S2-60m and S3-300m) three important biophysical parameters were estimated: leaf chlorophyll content (Chl), leaf area index (LAI) and fractional vegetation cover (FVC). GPR was the only method that reached the 10% precision required by end users in the estimation of Chl. In view of implementing the regressor into operational monitoring applications, here the portability of locally trained GPR models to other images was evaluated. The associated confidence maps proved to be a good indicator for evaluating the robustness of the trained models. Consistent retrievals were obtained across the different images, particularly over agricultural sites. To make the method suitable for operational use, however, the poorer confidences over bare soil areas suggest that the training dataset should be expanded with inputs from various land cover types.
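
    The Gaussian process regression mentioned above is available in standard machine-learning libraries, and its predictive standard deviation plays the role of the per-pixel confidence maps discussed in the record. A small sketch on synthetic reflectance data (not the SPARC campaign measurements):

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import ConstantKernel, RBF, WhiteKernel

      rng = np.random.default_rng(0)

      # Synthetic stand-in for band reflectances (samples x bands) and a target such as Chl.
      X_train = rng.uniform(0.0, 0.6, size=(60, 4))
      chl_train = 40 * X_train[:, 1] - 25 * X_train[:, 3] + rng.normal(0, 1.0, 60)

      kernel = ConstantKernel(1.0) * RBF(length_scale=0.2) + WhiteKernel(noise_level=1.0)
      gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
      gpr.fit(X_train, chl_train)

      # Predictions on new pixels; the returned standard deviation can be mapped
      # spatially as a confidence estimate for the retrieval.
      X_new = rng.uniform(0.0, 0.6, size=(5, 4))
      chl_pred, chl_std = gpr.predict(X_new, return_std=True)
      print(np.c_[chl_pred, chl_std])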

  16. Assessing Wheat Traits by Spectral Reflectance: Do We Really Need to Focus on Predicted Trait-Values or Directly Identify the Elite Genotypes Group?

    PubMed

    Garriga, Miguel; Romero-Bravo, Sebastián; Estrada, Félix; Escobar, Alejandro; Matus, Iván A; Del Pozo, Alejandro; Astudillo, Cesar A; Lobos, Gustavo A

    2017-01-01

    Phenotyping, via remote and proximal sensing techniques, of the agronomic and physiological traits associated with yield potential and drought adaptation could contribute to improvements in breeding programs. In the present study, 384 genotypes of wheat (Triticum aestivum L.) were tested under fully irrigated (FI) and water stress (WS) conditions. The following traits were evaluated and assessed via spectral reflectance: Grain yield (GY), spikes per square meter (SM2), kernels per spike (KPS), thousand-kernel weight (TKW), chlorophyll content (SPAD), stem water soluble carbohydrate concentration and content (WSC and WSCC, respectively), carbon isotope discrimination (Δ13C), and leaf area index (LAI). The performances of spectral reflectance indices (SRIs), four regression algorithms (PCR, PLSR, ridge regression RR, and SVR), and three classification methods (PCA-LDA, PLS-DA, and kNN) were evaluated for the prediction of each trait. For the classification approaches, two classes were established for each trait: The lower 80% of the trait variability range (Class 1) and the remaining 20% (Class 2 or elite genotypes). Both the SRIs and regression methods performed better when data from FI and WS were combined. The traits that were best estimated by SRIs and regression methods were GY and Δ13C. For most traits and conditions, the estimations provided by RR and SVR were the same, or better than, those provided by the SRIs. PLS-DA showed the best performance among the categorical methods and, unlike the SRI and regression models, most traits were relatively well-classified within a specific hydric condition (FI or WS), proving that classification approach is an effective tool to be explored in future studies related to genotype selection.

  17. USAF (United States Air Force) Stability and Control DATCOM (Data Compendium)

    DTIC Science & Technology

    1978-04-01

    ...regression analysis of mathematical statistics. In general, a regression analysis involves the study of a group of variables to determine their effect on a given parameter. Because of the empirical nature of this...

  18. A Method of Calculating Functional Independence Measure at Discharge from Functional Independence Measure Effectiveness Predicted by Multiple Regression Analysis Has a High Degree of Predictive Accuracy.

    PubMed

    Tokunaga, Makoto; Watanabe, Susumu; Sonoda, Shigeru

    2017-09-01

    Multiple linear regression analysis is often used to predict the outcome of stroke rehabilitation. However, the predictive accuracy may not be satisfactory. The objective of this study was to elucidate the predictive accuracy of a method of calculating motor Functional Independence Measure (mFIM) at discharge from mFIM effectiveness predicted by multiple regression analysis. The subjects were 505 patients with stroke who were hospitalized in a convalescent rehabilitation hospital. The formula "mFIM at discharge = mFIM effectiveness × (91 points - mFIM at admission) + mFIM at admission" was used. By including the predicted mFIM effectiveness obtained through multiple regression analysis in this formula, we obtained the predicted mFIM at discharge (A). We also used multiple regression analysis to directly predict mFIM at discharge (B). The correlation between the predicted and the measured values of mFIM at discharge was compared between A and B. The correlation coefficients were .916 for A and .878 for B. Calculating mFIM at discharge from mFIM effectiveness predicted by multiple regression analysis had a higher degree of predictive accuracy of mFIM at discharge than that directly predicted. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.
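
    The record's formula can be applied directly once a predicted mFIM effectiveness is available from the regression model. A short sketch of method (A), with made-up admission and effectiveness values (the study's regression coefficients are not reported in the record):

      MAX_MOTOR_FIM = 91  # maximum motor FIM score used in the effectiveness formula

      def mfim_at_discharge(mfim_admission: float, predicted_effectiveness: float) -> float:
          """Convert a predicted mFIM effectiveness into a predicted discharge score."""
          return predicted_effectiveness * (MAX_MOTOR_FIM - mfim_admission) + mfim_admission

      # Example: a patient admitted with mFIM 40 and a predicted effectiveness of 0.60
      # gets a predicted discharge score of 40 + 0.60 * (91 - 40) = 70.6.
      print(mfim_at_discharge(40, 0.60))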

  19. Large signal-to-noise ratio quantification in MLE for ARARMAX models

    NASA Astrophysics Data System (ADS)

    Zou, Yiqun; Tang, Xiafei

    2014-06-01

    It has been shown that closed-loop linear system identification by the indirect method can generally be transferred to open-loop ARARMAX (AutoRegressive AutoRegressive Moving Average with eXogenous input) estimation. For such models, gradient-related optimisation with a large enough signal-to-noise ratio (SNR) can avoid potential local convergence in maximum likelihood estimation. To ease the application of this condition, the threshold SNR needs to be quantified. In this paper, we construct the amplitude coefficient, which is equivalent to the SNR, and prove the finiteness of the threshold amplitude coefficient within the stability region. The quantification of the threshold is achieved by minimising an elaborately designed multi-variable cost function that unifies all the restrictions on the amplitude coefficient. The corresponding algorithm, based on two sets of physically realisable system input-output data, details the minimisation and also shows how to use the gradient-related method to estimate ARARMAX parameters when a local minimum is present because the SNR is small. The algorithm is then tested on a theoretical AutoRegressive Moving Average with eXogenous input model for the derivation of the threshold, and on a real gas turbine engine system for model identification. Finally, the graphical validation of the threshold on a two-dimensional plot is discussed.

  20. Vaccine-instructed intratumoral IFN-γ enables regression of autochthonous mouse prostate cancer in allogeneic T-cell transplantation.

    PubMed

    Hess Michelini, Rodrigo; Manzo, Teresa; Sturmheit, Tabea; Basso, Veronica; Rocchi, Martina; Freschi, Massimo; Listopad, Joanna; Blankenstein, Thomas; Bellone, Matteo; Mondino, Anna

    2013-08-01

    Vaccination can synergize with transplantation of allogeneic hematopoietic stem cells to cure hematologic malignancies, but the basis for this synergy is not understood to the degree where such approaches could be effective for treating solid tumors. We investigated this issue in a transgenic mouse model of prostate cancer treated by transplantation of a nonmyeloablative MHC-matched, single Y chromosome-encoded, or multiple minor histocompatibility antigen-mismatched hematopoietic cell preparation. Here, we report that tumor-directed vaccination after allogeneic hematopoietic stem cell transplantation and donor lymphocyte infusion is essential for acute graft versus tumor responses, tumor regression, and prolonged survival. Vaccination proved essential for generation of CD8(+) IFN-γ(+) tumor-directed effector cells in secondary lymphoid organs and also for IFN-γ(+) upregulation at the tumor site, which in turn instructed local expression of proinflammatory chemokines and intratumoral recruitment of donor-derived T cells for disease regression. Omitting vaccination, transplanting IFN-γ-deficient donor T cells, or depleting alloreactive T cells all compromised intratumoral IFN-γ-driven inflammation and lymphocyte infiltration, abolishing antitumor responses and therapeutic efficacy of the combined approach. Our findings argue that posttransplant tumor-directed vaccination is critical to effectively direct donor T cells to the tumor site in cooperation with allogeneic hematopoietic cell transplantation. ©2013 AACR.

  1. Anti-correlated networks, global signal regression, and the effects of caffeine in resting-state functional MRI.

    PubMed

    Wong, Chi Wah; Olafsson, Valur; Tal, Omer; Liu, Thomas T

    2012-10-15

    Resting-state functional connectivity magnetic resonance imaging is proving to be an essential tool for the characterization of functional networks in the brain. Two of the major networks that have been identified are the default mode network (DMN) and the task positive network (TPN). Although prior work indicates that these two networks are anti-correlated, the findings are controversial because the anti-correlations are often found only after the application of a pre-processing step, known as global signal regression, that can produce artifactual anti-correlations. In this paper, we show that, for subjects studied in an eyes-closed rest state, caffeine can significantly enhance the detection of anti-correlations between the DMN and TPN without the need for global signal regression. In line with these findings, we find that caffeine also leads to widespread decreases in connectivity and global signal amplitude. Using a recently introduced geometric model of global signal effects, we demonstrate that these decreases are consistent with the removal of an additive global signal confound. In contrast to the effects observed in the eyes-closed rest state, caffeine did not lead to significant changes in global functional connectivity in the eyes-open rest state. Copyright © 2012 Elsevier Inc. All rights reserved.
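
    Global signal regression, the pre-processing step at issue in this record, removes the brain-wide mean time course from every voxel's time series before connectivity is computed. A minimal sketch of that step in Python, assuming a generic time-by-voxel data matrix rather than any particular fMRI dataset:

      import numpy as np

      def global_signal_regression(data):
          """Regress the global mean time course out of each voxel's time series.

          data: 2-D array of shape (n_timepoints, n_voxels); returns the residuals.
          """
          global_signal = data.mean(axis=1, keepdims=True)                      # (T, 1)
          regressors = np.hstack([np.ones_like(global_signal), global_signal])  # intercept + GS
          beta, *_ = np.linalg.lstsq(regressors, data, rcond=None)              # (2, n_voxels)
          return data - regressors @ beta

      # Toy example: 200 timepoints, 500 voxels sharing an additive global confound.
      rng = np.random.default_rng(1)
      confound = rng.normal(size=(200, 1))
      voxels = rng.normal(size=(200, 500)) + confound
      print(global_signal_regression(voxels).shape)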

  2. Evaluation of energy consumption during aerobic sewage sludge treatment in dairy wastewater treatment plant.

    PubMed

    Dąbrowski, Wojciech; Żyłka, Radosław; Malinowski, Paweł

    2017-02-01

    The subject of the research conducted in an operating dairy wastewater treatment plant (WWTP) was to examine electric energy consumption during sewage sludge treatment. The excess sewage sludge was aerobically stabilized and dewatered with a screw press. Organic matter varied from 48% to 56% in sludge after stabilization and dewatering, which proves that the sludge was properly stabilized and could be applied as a fertilizer. Measurement factors for electric energy consumption for mechanically dewatered sewage sludge were determined, ranging between 0.94 and 1.5 kWh m(-3) with an average value of 1.17 kWh m(-3). The shares of the devices used for sludge dewatering and aerobic stabilization in the total energy consumption of the plant were also established, at 3% and 25% respectively. A model of energy consumption during sewage sludge treatment was estimated from the experimental data. Two models were applied: linear regression for the dewatering process and segmented linear regression for aerobic stabilization. The segmented linear regression model was also applied to total energy consumption during sewage sludge treatment in the examined dairy WWTP. The research constitutes an introduction to further studies on defining a mathematical model used to optimize electric energy consumption by dairy WWTPs. Copyright © 2016 Elsevier Inc. All rights reserved.
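
    Segmented (piecewise) linear regression, as used above for the aerobic stabilization data, can be sketched by scanning candidate breakpoints, fitting a separate line on each side and keeping the breakpoint with the smallest total squared error. The data below are synthetic; no values from the study are reused:

      import numpy as np

      def fit_line(x, y):
          """Least-squares coefficients and sum of squared errors for one segment."""
          A = np.c_[x, np.ones_like(x)]
          coef, *_ = np.linalg.lstsq(A, y, rcond=None)
          return coef, float(np.sum((y - A @ coef) ** 2))

      def segmented_regression(x, y, min_points=3):
          """Two-segment linear regression via exhaustive breakpoint search."""
          order = np.argsort(x)
          x, y = x[order], y[order]
          best = None
          for i in range(min_points, len(x) - min_points):
              (c1, e1), (c2, e2) = fit_line(x[:i], y[:i]), fit_line(x[i:], y[i:])
              if best is None or e1 + e2 < best[0]:
                  best = (e1 + e2, x[i], c1, c2)
          return best  # (total SSE, breakpoint, left coefficients, right coefficients)

      rng = np.random.default_rng(2)
      x = np.linspace(0, 10, 40)
      y = np.where(x < 6, 2.0 * x, 12.0 + 0.3 * (x - 6)) + rng.normal(0, 0.4, x.size)
      sse, brk, left, right = segmented_regression(x, y)
      print(f"breakpoint ~ {brk:.2f}, left slope {left[0]:.2f}, right slope {right[0]:.2f}")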

  3. Spectroscopic Determination of Aboveground Biomass in Grasslands Using Spectral Transformations, Support Vector Machine and Partial Least Squares Regression

    PubMed Central

    Marabel, Miguel; Alvarez-Taboada, Flor

    2013-01-01

    Aboveground biomass (AGB) is one of the strategic biophysical variables of interest in vegetation studies. The main objective of this study was to evaluate the Support Vector Machine (SVM) and Partial Least Squares Regression (PLSR) for estimating the AGB of grasslands from field spectrometer data and to find out which data pre-processing approach was the most suitable. The most accurate model to predict the total AGB involved PLSR and the Maximum Band Depth index derived from the continuum removed reflectance in the absorption features between 916–1,120 nm and 1,079–1,297 nm (R2 = 0.939, RMSE = 7.120 g/m2). Regarding the green fraction of the AGB, the Area Over the Minimum index derived from the continuum removed spectra provided the most accurate model overall (R2 = 0.939, RMSE = 3.172 g/m2). Identifying the appropriate absorption features proved to be crucial for improving the performance of PLSR in estimating the total and green aboveground biomass, by using the indices derived from those spectral regions. Ordinary Least Square Regression could be used as a surrogate for the PLSR approach with the Area Over the Minimum index as the independent variable, although the resulting model would not be as accurate. PMID:23925082
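
    Partial least squares regression of biomass on reflectance-derived predictors, as used in this record, is available in standard libraries. A brief sketch with synthetic spectra standing in for the field-spectrometer data:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(3)

      # Synthetic "spectra": 120 samples x 200 bands, with biomass driven by two bands.
      X = rng.normal(size=(120, 200))
      agb = 5.0 * X[:, 50] - 3.0 * X[:, 150] + rng.normal(0, 0.5, 120)

      X_tr, X_te, y_tr, y_te = train_test_split(X, agb, random_state=0)
      pls = PLSRegression(n_components=5)
      pls.fit(X_tr, y_tr)

      pred = pls.predict(X_te).ravel()
      rmse = float(np.sqrt(np.mean((pred - y_te) ** 2)))
      print(f"RMSE on held-out samples: {rmse:.3f}")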

  4. Cross-Sectional HIV Incidence Surveillance: A Benchmarking of Approaches for Estimating the 'Mean Duration of Recent Infection'.

    PubMed

    Kassanjee, Reshma; De Angelis, Daniela; Farah, Marian; Hanson, Debra; Labuschagne, Jan Phillipus Lourens; Laeyendecker, Oliver; Le Vu, Stéphane; Tom, Brian; Wang, Rui; Welte, Alex

    2017-03-01

    The application of biomarkers for 'recent' infection in cross-sectional HIV incidence surveillance requires the estimation of critical biomarker characteristics. Various approaches have been employed for using longitudinal data to estimate the Mean Duration of Recent Infection (MDRI) - the average time in the 'recent' state. In this systematic benchmarking of MDRI estimation approaches, a simulation platform was used to measure accuracy and precision of over twenty approaches, in thirty scenarios capturing various study designs, subject behaviors and test dynamics that may be encountered in practice. Results highlight that assuming a single continuous sojourn in the 'recent' state can produce substantial bias. Simple interpolation provides useful MDRI estimates provided subjects are tested at regular intervals. Regression performs the best - while 'random effects' describe the subject-clustering in the data, regression models without random effects proved easy to implement, stable, and of similar accuracy in scenarios considered; robustness to parametric assumptions was improved by regressing 'recent'/'non-recent' classifications rather than continuous biomarker readings. All approaches were vulnerable to incorrect assumptions about subjects' (unobserved) infection times. Results provided show the relationships between MDRI estimation performance and the number of subjects, inter-visit intervals, missed visits, loss to follow-up, and aspects of biomarker signal and noise.

  5. Use of Multiple Regression and Use-Availability Analyses in Determining Habitat Selection by Gray Squirrels (Sciurus Carolinensis)

    Treesearch

    John W. Edwards; Susan C. Loeb; David C. Guynn

    1994-01-01

    Multiple regression and use-availability analyses are two methods for examining habitat selection. Use-availability analysis is commonly used to evaluate macrohabitat selection whereas multiple regression analysis can be used to determine microhabitat selection. We compared these techniques using behavioral observations (n = 5534) and telemetry locations (n = 2089) of...

  6. A simple linear regression method for quantitative trait loci linkage analysis with censored observations.

    PubMed

    Anderson, Carl A; McRae, Allan F; Visscher, Peter M

    2006-07-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.

  7. Enzyme replacement therapy for Anderson-Fabry disease: A complementary overview of a Cochrane publication through a linear regression and a pooled analysis of proportions from cohort studies

    PubMed Central

    El Dib, Regina; Gomaa, Huda; Ortiz, Alberto; Politei, Juan; Kapoor, Anil; Barreto, Fellype

    2017-01-01

    Background Anderson-Fabry disease (AFD) is an X-linked recessive inborn error of glycosphingolipid metabolism caused by a deficiency of alpha-galactosidase A. Renal failure, heart and cerebrovascular involvement reduce survival. A Cochrane review provided little evidence on the use of enzyme replacement therapy (ERT). We now complement this review through a linear regression and a pooled analysis of proportions from cohort studies. Objectives To evaluate the efficacy and safety of ERT for AFD. Materials and methods For the systematic review, a literature search was performed, from inception to March 2016, using Medline, EMBASE and LILACS. Inclusion criteria were cohort studies, patients with AFD on ERT or natural history, and at least one patient-important outcome (all-cause mortality, renal, cardiovascular or cerebrovascular events, and adverse events) reported. The pooled proportion and the confidence interval (CI) are shown for each outcome. Simple linear regressions for composite endpoints were performed. Results 77 cohort studies involving 15,305 participants proved eligible. The pooled proportions were as follows: a) for renal complications, agalsidase alfa 15.3% [95% CI 0.048, 0.303; I2 = 77.2%, p = 0.0005]; agalsidase beta 6% [95% CI 0.04, 0.07; I2 = not applicable]; and untreated patients 21.4% [95% CI 0.1522, 0.2835; I2 = 89.6%, p<0.0001]. Effect differences favored agalsidase beta compared to untreated patients; b) for cardiovascular complications, agalsidase alfa 28% [95% CI 0.07, 0.55; I2 = 96.7%, p<0.0001]; agalsidase beta 7% [95% CI 0.05, 0.08; I2 = not applicable]; and untreated patients 26.2% [95% CI 0.149, 0.394; I2 = 98.8%, p<0.0001]. Effect differences favored agalsidase beta compared to untreated patients; and c) for cerebrovascular complications, agalsidase alfa 11.1% [95% CI 0.058, 0.179; I2 = 70.5%, p = 0.0024]; agalsidase beta 3.5% [95% CI 0.024, 0.046; I2 = 0%, p = 0.4209]; and untreated patients 18.3% [95% CI 0.129, 0.245; I2 = 95% p < 0.0001]. Effect differences favored agalsidase beta over agalsidase alfa or untreated patients. A linear regression showed that Fabry patients receiving agalsidase alfa are more likely to have higher rates of composite endpoints compared to those receiving agalsidase beta. Conclusions Agalsidase beta is associated to a significantly lower incidence of renal, cardiovascular and cerebrovascular events than no ERT, and to a significantly lower incidence of cerebrovascular events than agalsidase alfa. In view of these results, the use of agalsidase beta for preventing major organ complications related to AFD can be recommended. PMID:28296917

  8. Enzyme replacement therapy for Anderson-Fabry disease: A complementary overview of a Cochrane publication through a linear regression and a pooled analysis of proportions from cohort studies.

    PubMed

    El Dib, Regina; Gomaa, Huda; Ortiz, Alberto; Politei, Juan; Kapoor, Anil; Barreto, Fellype

    2017-01-01

    Anderson-Fabry disease (AFD) is an X-linked recessive inborn error of glycosphingolipid metabolism caused by a deficiency of alpha-galactosidase A. Renal failure, heart and cerebrovascular involvement reduce survival. A Cochrane review provided little evidence on the use of enzyme replacement therapy (ERT). We now complement this review through a linear regression and a pooled analysis of proportions from cohort studies. To evaluate the efficacy and safety of ERT for AFD. For the systematic review, a literature search was performed, from inception to March 2016, using Medline, EMBASE and LILACS. Inclusion criteria were cohort studies, patients with AFD on ERT or natural history, and at least one patient-important outcome (all-cause mortality, renal, cardiovascular or cerebrovascular events, and adverse events) reported. The pooled proportion and the confidence interval (CI) are shown for each outcome. Simple linear regressions for composite endpoints were performed. 77 cohort studies involving 15,305 participants proved eligible. The pooled proportions were as follows: a) for renal complications, agalsidase alfa 15.3% [95% CI 0.048, 0.303; I2 = 77.2%, p = 0.0005]; agalsidase beta 6% [95% CI 0.04, 0.07; I2 = not applicable]; and untreated patients 21.4% [95% CI 0.1522, 0.2835; I2 = 89.6%, p<0.0001]. Effect differences favored agalsidase beta compared to untreated patients; b) for cardiovascular complications, agalsidase alfa 28% [95% CI 0.07, 0.55; I2 = 96.7%, p<0.0001]; agalsidase beta 7% [95% CI 0.05, 0.08; I2 = not applicable]; and untreated patients 26.2% [95% CI 0.149, 0.394; I2 = 98.8%, p<0.0001]. Effect differences favored agalsidase beta compared to untreated patients; and c) for cerebrovascular complications, agalsidase alfa 11.1% [95% CI 0.058, 0.179; I2 = 70.5%, p = 0.0024]; agalsidase beta 3.5% [95% CI 0.024, 0.046; I2 = 0%, p = 0.4209]; and untreated patients 18.3% [95% CI 0.129, 0.245; I2 = 95% p < 0.0001]. Effect differences favored agalsidase beta over agalsidase alfa or untreated patients. A linear regression showed that Fabry patients receiving agalsidase alfa are more likely to have higher rates of composite endpoints compared to those receiving agalsidase beta. Agalsidase beta is associated to a significantly lower incidence of renal, cardiovascular and cerebrovascular events than no ERT, and to a significantly lower incidence of cerebrovascular events than agalsidase alfa. In view of these results, the use of agalsidase beta for preventing major organ complications related to AFD can be recommended.

  9. Long-term outcomes in patients with rheumatologic disorders undergoing percutaneous coronary intervention: a BAsel Stent Kosten-Effektivitäts Trial-PROspective Validation Examination (BASKET-PROVE) sub-study.

    PubMed

    Nochioka, Kotaro; Biering-Sørensen, Tor; Hansen, Kim Wadt; Sørensen, Rikke; Pedersen, Sune; Jørgensen, Peter Godsk; Iversen, Allan; Shimokawa, Hiroaki; Jeger, Raban; Kaiser, Christoph; Pfisterer, Matthias; Galatius, Søren

    2017-12-01

    Rheumatologic disorders are characterised by inflammation and an increased risk of coronary artery disease (CAD). However, the association between rheumatologic disorders and long-term prognosis in CAD patients undergoing percutaneous coronary intervention (PCI) is unknown. Thus, we aimed to examine the association between rheumatologic disorders and long-term prognosis in CAD patients undergoing PCI. A post-hoc analysis was performed in 4605 patients (age: 63.3 ± 11.0 years; male: 76.6%) with ST-segment elevation myocardial infarction (STEMI; n = 1396), non-STEMI (n = 1541), and stable CAD (n = 1668) from the all-comer stent trials, the BAsel Stent Kosten-Effektivitäts Trial-PROspective Validation Examination (BASKET-PROVE) I and II trials. We evaluated the association between rheumatologic disorders and 2-year major adverse cardiac events (MACEs; cardiac death, nonfatal myocardial infarction (MI), and target vessel revascularisation (TVR)) by Cox regression analysis. Patients with rheumatologic disorders (n = 197) were older, more often female, had a higher prevalence of renal disease, multi-vessel coronary disease, and bifurcation lesions, and had longer total stent lengths. During the 2-year follow-up, the MACE rate was 8.6% in the total cohort. After adjustment for potential confounders, rheumatologic disorders were associated with MACEs in the total cohort (adjusted hazard ratio: 1.55; 95% confidence interval (CI): 1.04-2.31) driven by the STEMI subgroup (adjusted hazard ratio: 2.38; 95% CI: 1.26-4.51). In all patients, rheumatologic disorders were associated with all-cause death (adjusted hazard ratio: 2.05; 95% CI: 1.14-3.70), cardiac death (adjusted hazard ratio: 2.63; 95% CI: 1.27-5.43), and non-fatal MI (adjusted hazard ratio: 2.64; 95% CI: 1.36-5.13), but not with TVR (adjusted hazard ratio: 0.81; 95% CI: 0.41-1.58). The presence of rheumatologic disorders appears to be independently associated with worse outcome in CAD patients undergoing PCI. This calls for further studies and focus on this high-risk group of patients following PCI.
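
    The adjusted hazard ratios reported above come from a Cox proportional hazards regression with follow-up time, an event indicator and covariates. A generic sketch of such a fit with the lifelines package, using its bundled example dataset as a stand-in for the patient-level BASKET-PROVE data:

      from lifelines import CoxPHFitter
      from lifelines.datasets import load_rossi

      # Example dataset shipped with lifelines: 'week' is follow-up time, 'arrest' the event flag.
      rossi = load_rossi()

      cph = CoxPHFitter()
      cph.fit(rossi, duration_col="week", event_col="arrest")
      cph.print_summary()  # hazard ratios (exp(coef)) with 95% confidence intervals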

  10. [Application of SAS macro to evaluate multiplicative and additive interaction in logistic and Cox regression in clinical practices].

    PubMed

    Nie, Z Q; Ou, Y Q; Zhuang, J; Qu, Y J; Mai, J Z; Chen, J M; Liu, X Q

    2016-05-01

    Conditional and unconditional logistic regression analyses are commonly used in case-control studies, while the Cox proportional hazards model is often used in survival data analysis. Most of the literature refers only to main-effect models; however, generalized linear models differ from general linear models, and interaction comprises both multiplicative and additive interaction. The former is only of statistical significance, but the latter has biological significance. In this paper, macros were written using SAS 9.4 to calculate the contrast ratio, the attributable proportion due to interaction and the synergy index alongside the logistic and Cox regression interaction terms, and Wald, delta and profile-likelihood confidence intervals were used to evaluate additive interaction, as a reference for big data analysis in clinical epidemiology and for the analysis of genetic multiplicative and additive interactions.
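
    The additive-interaction measures named above are conventionally computed from the exponentiated coefficients of a logistic (or Cox) model that contains two exposures and their product term; the relative excess risk due to interaction (RERI, apparently what the record calls the contrast ratio), the attributable proportion (AP) and the synergy index (S) then follow from the three odds or hazard ratios. A sketch of the point estimates only, with hypothetical coefficients; the Wald, delta and profile-likelihood confidence intervals mentioned in the record require additional work:

      import math

      def additive_interaction(b1, b2, b3):
          """Additive interaction measures from regression coefficients.

          b1, b2: main-effect coefficients of the two exposures;
          b3: coefficient of their product (interaction) term.
          """
          or10 = math.exp(b1)            # exposure 1 only
          or01 = math.exp(b2)            # exposure 2 only
          or11 = math.exp(b1 + b2 + b3)  # both exposures
          reri = or11 - or10 - or01 + 1               # relative excess risk due to interaction
          ap = reri / or11                            # attributable proportion due to interaction
          s = (or11 - 1) / ((or10 - 1) + (or01 - 1))  # synergy index
          return reri, ap, s

      # Hypothetical coefficients from a fitted logistic model.
      print(additive_interaction(b1=0.40, b2=0.55, b3=0.30))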

  11. Prediction by regression and intrarange data scatter in surface-process studies

    USGS Publications Warehouse

    Toy, T.J.; Osterkamp, W.R.; Renard, K.G.

    1993-01-01

    Modeling is a major component of contemporary earth science, and regression analysis occupies a central position in the parameterization, calibration, and validation of geomorphic and hydrologic models. Although this methodology can be used in many ways, we are primarily concerned with the prediction of values for one variable from another variable. Examination of the literature reveals considerable inconsistency in the presentation of the results of regression analysis and the occurrence of patterns in the scatter of data points about the regression line. Both circumstances confound utilization and evaluation of the models. Statisticians are well aware of various problems associated with the use of regression analysis and offer improved practices; often, however, their guidelines are not followed. After a review of the aforementioned circumstances and until standard criteria for model evaluation become established, we recommend, as a minimum, inclusion of scatter diagrams, the standard error of the estimate, and sample size in reporting the results of regression analyses for most surface-process studies. ?? 1993 Springer-Verlag.

  12. Quantile regression for the statistical analysis of immunological data with many non-detects.

    PubMed

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an implementation to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
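
    Quantile regression of the kind advocated above is implemented in standard statistical packages; the sketch below fits the median and an upper quantile of a response in which many observations sit at a detection limit, using synthetic data rather than the clinical-trial measurements:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(4)

      # Synthetic immunological marker: non-detects are recorded at the detection limit (0.5).
      dose = rng.uniform(0, 10, 300)
      marker = np.maximum(0.5, 0.2 * dose + rng.normal(0, 1.0, 300))
      df = pd.DataFrame({"dose": dose, "marker": marker})

      # Median (and 75th percentile) regression tolerates the censored lower tail better than OLS.
      print(smf.quantreg("marker ~ dose", df).fit(q=0.50).params)
      print(smf.quantreg("marker ~ dose", df).fit(q=0.75).params)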

  13. Noninvasive spectral imaging of skin chromophores based on multiple regression analysis aided by Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Nishidate, Izumi; Wiswadarma, Aditya; Hase, Yota; Tanaka, Noriyuki; Maeda, Takaaki; Niizeki, Kyuichi; Aizu, Yoshihisa

    2011-08-01

    In order to visualize melanin and blood concentrations and oxygen saturation in human skin tissue, a simple imaging technique based on multispectral diffuse reflectance images acquired at six wavelengths (500, 520, 540, 560, 580 and 600nm) was developed. The technique utilizes multiple regression analysis aided by Monte Carlo simulation for diffuse reflectance spectra. Using the absorbance spectrum as a response variable and the extinction coefficients of melanin, oxygenated hemoglobin, and deoxygenated hemoglobin as predictor variables, multiple regression analysis provides regression coefficients. Concentrations of melanin and total blood are then determined from the regression coefficients using conversion vectors that are deduced numerically in advance, while oxygen saturation is obtained directly from the regression coefficients. Experiments with a tissue-like agar gel phantom validated the method. In vivo experiments with human skin of the human hand during upper limb occlusion and of the inner forearm exposed to UV irradiation demonstrated the ability of the method to evaluate physiological reactions of human skin tissue.
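
    The regression step described above treats the measured absorbance spectrum as the response and the known extinction-coefficient spectra of the chromophores as predictors, and the fitted coefficients are then converted to concentrations. A sketch of that multiple regression with made-up extinction coefficients in place of the published values:

      import numpy as np

      rng = np.random.default_rng(5)
      wavelengths = np.array([500, 520, 540, 560, 580, 600])  # nm, as in the record

      # Hypothetical extinction coefficients of the three chromophores at the six wavelengths.
      eps_melanin = np.array([5.0, 4.5, 4.0, 3.6, 3.2, 2.9])
      eps_hbo2 = np.array([1.2, 1.5, 2.1, 1.9, 1.4, 0.4])
      eps_hb = np.array([1.6, 1.8, 1.7, 2.0, 1.6, 0.9])

      # Simulated measured absorbance for some "true" chromophore contents plus noise.
      true_coeffs = np.array([0.8, 0.5, 0.3])
      absorbance = np.column_stack([eps_melanin, eps_hbo2, eps_hb]) @ true_coeffs
      absorbance = absorbance + rng.normal(0, 0.01, absorbance.size)

      # Multiple regression: absorbance ~ extinction coefficients (plus an intercept for baseline).
      X = np.column_stack([eps_melanin, eps_hbo2, eps_hb, np.ones(6)])
      coeffs, *_ = np.linalg.lstsq(X, absorbance, rcond=None)
      print(coeffs)  # regression coefficients, converted to concentrations in the original method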

  14. CADDIS Volume 4. Data Analysis: PECBO Appendix - R Scripts for Non-Parametric Regressions

    EPA Pesticide Factsheets

    Script for computing nonparametric regression analysis. Overview of using scripts to infer environmental conditions from biological observations, statistically estimating species-environment relationships, statistical scripts.

  15. On Blowing Trumpets to the Tulips: To Prove or Not to Prove the Null Hypothesis--Comment on Bosch, Steinkamp, and Boller (2006)

    ERIC Educational Resources Information Center

    Wilson, David B.; Shadish, William R.

    2006-01-01

    The H. Bosch, F. Steinkamp, and E. Boller (see record 2006-08436-001) meta-analysis reaches mixed and cautious conclusions about the possibility of psychokinesis. The authors argue that, for both methodological and philosophical reasons, it is nearly impossible to draw any conclusions from this body of research. The authors do not agree that any…

  16. Jordan Country Analysis Brief

    EIA Publications

    2014-01-01

    Jordan, unlike its immediate neighbors, does not possess significant energy resources. As of January 2014, the Oil & Gas Journal estimated Jordan's proved oil reserves at just 1 million barrels and its proved natural gas reserves at slightly more than 200 billion cubic feet (Bcf). Oil shale resources have the potential to increase Jordan's reserves significantly, and the country plans to build the first oil shale-fired electricity generation facility in the Middle East after 2017.

  17. Clinical evaluation of a novel population-based regression analysis for detecting glaucomatous visual field progression.

    PubMed

    Kovalska, M P; Bürki, E; Schoetzau, A; Orguel, S F; Orguel, S; Grieshaber, M C

    2011-04-01

    The distinction of real progression from test variability in visual field (VF) series may be based on clinical judgment, on trend analysis based on follow-up of test parameters over time, or on identification of a significant change related to the mean of baseline exams (event analysis). The aim of this study was to compare a new population-based method (Octopus field analysis, OFA) with classic regression analyses and clinical judgment for detecting glaucomatous VF changes. 240 VF series of 240 patients with at least 9 consecutive examinations available were included into this study. They were independently classified by two experienced investigators. The results of such a classification served as a reference for comparison for the following statistical tests: (a) t-test global, (b) r-test global, (c) regression analysis of 10 VF clusters and (d) point-wise linear regression analysis. 32.5 % of the VF series were classified as progressive by the investigators. The sensitivity and specificity were 89.7 % and 92.0 % for r-test, and 73.1 % and 93.8 % for the t-test, respectively. In the point-wise linear regression analysis, the specificity was comparable (89.5 % versus 92 %), but the sensitivity was clearly lower than in the r-test (22.4 % versus 89.7 %) at a significance level of p = 0.01. A regression analysis for the 10 VF clusters showed a markedly higher sensitivity for the r-test (37.7 %) than the t-test (14.1 %) at a similar specificity (88.3 % versus 93.8 %) for a significant trend (p = 0.005). In regard to the cluster distribution, the paracentral clusters and the superior nasal hemifield progressed most frequently. The population-based regression analysis seems to be superior to the trend analysis in detecting VF progression in glaucoma, and may eliminate the drawbacks of the event analysis. Further, it may assist the clinician in the evaluation of VF series and may allow better visualization of the correlation between function and structure owing to VF clusters. © Georg Thieme Verlag KG Stuttgart · New York.

  18. Using "Excel" for White's Test--An Important Technique for Evaluating the Equality of Variance Assumption and Model Specification in a Regression Analysis

    ERIC Educational Resources Information Center

    Berenson, Mark L.

    2013-01-01

    There is consensus in the statistical literature that severe departures from its assumptions invalidate the use of regression modeling for purposes of inference. The assumptions of regression modeling are usually evaluated subjectively through visual, graphic displays in a residual analysis but such an approach, taken alone, may be insufficient…

  19. Rex fortran 4 system for combinatorial screening or conventional analysis of multivariate regressions

    Treesearch

    L.R. Grosenbaugh

    1967-01-01

    Describes an expansible computerized system that provides data needed in regression or covariance analysis of as many as 50 variables, 8 of which may be dependent. Alternatively, it can screen variously generated combinations of independent variables to find the regression with the smallest mean-squared-residual, which will be fitted if desired. The user can easily...

  20. Online ranking by projecting.

    PubMed

    Crammer, Koby; Singer, Yoram

    2005-01-01

    We discuss the problem of ranking instances. In our framework, each instance is associated with a rank or a rating, which is an integer in 1 to k. Our goal is to find a rank-prediction rule that assigns each instance a rank that is as close as possible to the instance's true rank. We discuss a group of closely related online algorithms, analyze their performance in the mistake-bound model, and prove their correctness. We describe two sets of experiments, with synthetic data and with the EachMovie data set for collaborative filtering. In the experiments we performed, our algorithms outperform online algorithms for regression and classification applied to ranking.
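
    The setting above (integer ranks 1 to k, mistake-driven online updates of a weight vector and ordered thresholds) can be illustrated with a simple perceptron-style ordinal ranker in the spirit of the PRank family; this is a simplified sketch, not necessarily the exact update rule analyzed in the paper:

      import numpy as np

      class SimpleOrdinalPerceptron:
          """Mistake-driven online ranking with k ordered thresholds (PRank-style sketch)."""

          def __init__(self, n_features, k):
              self.w = np.zeros(n_features)
              self.b = np.zeros(k - 1)  # thresholds b_1 <= ... <= b_{k-1}
              self.k = k

          def predict(self, x):
              # Rank = 1 + number of thresholds that the score exceeds.
              return 1 + int(np.sum(self.w @ x >= self.b))

          def update(self, x, y):
              score = self.w @ x
              # Target signs: +1 for thresholds below the true rank, -1 otherwise.
              targets = np.where(np.arange(1, self.k) < y, 1.0, -1.0)
              tau = targets * ((score - self.b) * targets <= 0)
              self.w += tau.sum() * x
              self.b -= tau

      # Tiny usage example with synthetic instances ranked 1..3.
      rng = np.random.default_rng(6)
      model = SimpleOrdinalPerceptron(n_features=2, k=3)
      for _ in range(200):
          x = rng.normal(size=2)
          y = int(np.digitize(x[0], [-0.5, 0.5])) + 1
          if model.predict(x) != y:
              model.update(x, y)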

  1. NOAA AVHRR and its uses for rainfall and evapotranspiration monitoring

    NASA Technical Reports Server (NTRS)

    Kerr, Yann H.; Imbernon, J.; Dedieu, G.; Hautecoeur, O.; Lagouarde, J. P.

    1989-01-01

    NOAA-7 Advanced Very High Resolution Radiometer (AVHRR) Global Vegetation Indices (GVI) were used during the 1986 rainy season (June-September) over Senegal to monitor rainfall. The satellite data were used in conjunction with ground-based measurements so as to derive empirical relationships between rainfall and GVI. The regression obtained was then used to map the total rainfall corresponding to the growing season, yielding good results. Normalized Difference Vegetation Indices (NDVI) derived from High Resolution Picture Transmission (HRPT) data were also compared with actual evapotranspiration (ET) data and proved to be closely correlated with it with a time lapse of 20 days.

  2. Increased depression and anxiety in infertile Japanese women resulting from lack of husband's support and feelings of stress.

    PubMed

    Matsubayashi, Hidehiko; Hosaka, Takashi; Izumi, Shun-ichiro; Suzuki, Takahiro; Kondo, Akane; Makino, Tsunehisa

    2004-01-01

    We report that infertile women in Japan as well as in the Western world have high levels of emotional distress, anxiety, and depression. The reasons for anxiety and depression in infertile women are easy to presume but remain unclear. We conducted the present study to assess the relationship between the anxiety and depression of infertile Japanese women and their thought processes and emotional well-being with regard to their infertility. A cross-sectional questionnaire was administered to 101 infertile Japanese women who visited the infertility clinic at Tokai University. Inventories included the Hospital Anxiety and Depression Scale (HADS) and our original infertility questionnaire, which is composed of 22 questions to assess attitudes and emotional status in facing the stigma of infertility. After factor analysis, comparison between the HADS and the infertility questionnaire was made with simultaneous multiple regression analyses. Anxiety and depression in childless Japanese women were significantly associated with lack of husband's support and feeling stress. Our findings should prove useful in designing and implementing psychological support programs for infertile Japanese women. Psychological interventions to relieve or diminish these conditions might have significant therapeutic benefits for women attending infertility clinics in Japan.

  3. Contrast-enhanced ultrasound may distinguish gallbladder adenoma from cholesterol polyps: a prospective case-control study.

    PubMed

    Fei, Xiang; Lu, Wen-Ping; Luo, Yu-Kun; Xu, Jian-Hon; Li, Yan-Mi; Shi, Huai-Yin; Jiao, Zi-Yu; Li, Hong-tian

    2015-10-01

    The aim of this study was to identify independent risk factors associated with gallbladder (GB) adenoma, as compared with cholesterol polyps, by contrast-enhanced ultrasound (CEUS). Between January 2010 and September 2014, a total of 122 consecutive patients undergoing cholecystectomy for GB polypoid lesions were enrolled. Before cholecystectomy, each patient underwent conventional US and CEUS examination and all image features were documented. The patients were divided into an adenoma group and a cholesterol polyp group according to the pathological findings. All image features were statistically compared between the two groups. There were differences in patient age, lesion size, echogenicity, and lesion vascularity between the two groups (P < 0.05). There were also differences in stalk width and enhancement intensity between the two groups (P < 0.05). Multiple logistic regression analysis proved that enhancement intensity, the stalk of the lesion, and vascularity were independent risk factors associated with GB adenoma (P < 0.05). CEUS could offer useful information to distinguish adenoma from cholesterol polyp. The treatment algorithm for gallbladder polyp lesions would likely benefit from CEUS as a routine imaging investigation, especially in cases where the polyp is larger than 1 cm.

  4. Formulation design and optimization of mouth dissolve tablets of nimesulide using vacuum drying technique.

    PubMed

    Gohel, Mukesh; Patel, Madhabhai; Amin, Avani; Agrawal, Ruchi; Dave, Rikita; Bariya, Nehal

    2004-04-26

    The purpose of this research was to develop mouth dissolve tablets of nimesulide. Granules containing nimesulide, camphor, crospovidone, and lactose were prepared by a wet granulation technique. Camphor was sublimed from the dried granules by exposure to vacuum. The porous granules were then compressed. Alternatively, tablets were first prepared and later exposed to vacuum. The tablets were evaluated for percentage friability, wetting time, and disintegration time. In the investigation, a 3(2) full factorial design was used to investigate the joint influence of two formulation variables: the amounts of camphor and crospovidone. The results of multiple linear regression analysis revealed that for obtaining a rapidly disintegrating dosage form, tablets should be prepared using an optimum concentration of camphor and a higher percentage of crospovidone. A contour plot is also presented to graphically represent the effect of the independent variables on the disintegration time and percentage friability. A checkpoint batch was also prepared to prove the validity of the evolved mathematical model. Sublimation of camphor from tablets resulted in superior tablets as compared with the tablets prepared from granules that were exposed to vacuum. The systematic formulation approach helped in understanding the effect of formulation processing variables.

  5. Marker-based quantitative genetics in the wild?: the heritability and genetic correlation of chemical defenses in eucalyptus.

    PubMed

    Andrew, R L; Peakall, R; Wallis, I R; Wood, J T; Knight, E J; Foley, W J

    2005-12-01

    Marker-based methods for estimating heritability and genetic correlation in the wild have attracted interest because traditional methods may be impractical or introduce bias via G x E effects, mating system variation, and sampling effects. However, they have not been widely used, especially in plants. A regression-based approach, which uses a continuous measure of genetic relatedness, promises to be particularly appropriate for use in plants with mixed-mating systems and overlapping generations. Using this method, we found significant narrow-sense heritability of foliar defense chemicals in a natural population of Eucalyptus melliodora. We also demonstrated a genetic basis for the phenotypic correlation underlying an ecological example of conditioned flavor aversion involving different biosynthetic pathways. Our results revealed that heritability estimates depend on the spatial scale of the analysis in a way that offers insight into the distribution of genetic and environmental variance. This study is the first to successfully use a marker-based method to measure quantitative genetic parameters in a tree. We suggest that this method will prove to be a useful tool in other studies and offer some recommendations for future applications of the method.

  6. Fractal structures and fractal functions as disease indicators

    USGS Publications Warehouse

    Escos, J.M; Alados, C.L.; Emlen, J.M.

    1995-01-01

    Developmental instability is an early indicator of stress, and has been used to monitor the impacts of human disturbance on natural ecosystems. Here we investigate the use of different measures of developmental instability on two species, green peppers (Capsicum annuum), a plant, and Spanish ibex (Capra pyrenaica), an animal. For green peppers we compared the variance in allometric relationship between control plants, and a treatment group infected with the tomato spotted wilt virus. The results show that infected plants have a greater variance about the allometric regression line than the control plants. We also observed a reduction in complexity of branch structure in green pepper with a viral infection. Box-counting fractal dimension of branch architecture declined under stress infection. We also tested the reduction in complexity of behavioral patterns under stress situations in Spanish ibex (Capra pyrenaica). Fractal dimension of head-lift frequency distribution measures predator detection efficiency. This dimension decreased under stressful conditions, such as advanced pregnancy and parasitic infection. Feeding distribution activities reflect food searching efficiency. Power spectral analysis proves to be the most powerful tool for characterizing fractal behavior, revealing a reduction in complexity of time distribution activity under parasitic infection.

  7. Optimization of extraction of chitin from Procambarus clarkii shell by Box-Behnken design

    NASA Astrophysics Data System (ADS)

    Dong, Fang; Qiu, Hailong; Jia, Shaoqian; Dai, Cuiping; Kong, Qingxin; Xu, Changliang

    2018-06-01

    This paper investigated the optimization of the extraction of chitin from Procambarus clarkii shell by Box-Behnken design. First, four independent variables were explored in single-factor experiments, namely the concentration of hydrochloric acid, soaking time, concentration of sodium hydroxide, and reaction time. Then, based on the results of these experiments, a four-factor, three-level experiment was planned by Box-Behnken design. From the experimental results, a second-order polynomial equation was obtained using multiple regression analysis. The optimum extraction conditions given by the model were: HCl concentration 1.54 mol/L, soaking time 19.87 h, NaOH concentration 2.9 mol/L, and reaction time 3.54 h. To check the accuracy of the model, a verification experiment was carried out under the following conditions: hydrochloric acid concentration 1.5 mol/L, soaking time 20 h, sodium hydroxide concentration 3 mol/L, and reaction time 3.5 h. The actual yield of chitin reached 18.76%, which was very close to the yield predicted by the model (18.66%). The result indicated that the optimized extraction process for chitin is feasible and practical.
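
    After a Box-Behnken design, the usual workflow is to fit a quadratic response-surface model by multiple regression and then locate the factor settings that maximise the predicted yield. A minimal sketch under assumed, illustrative design data (not the paper's measurements):

    ```python
    # Hedged sketch: quadratic response-surface fit plus optimisation over coded
    # factor levels.  The 27 runs below are random placeholders standing in for
    # the Box-Behnken design points, and the yield surface is synthetic.
    import numpy as np
    from scipy.optimize import minimize
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(2)
    X = rng.uniform(-1, 1, size=(27, 4))                             # coded levels of the 4 factors
    y = 18 - ((X - 0.2) ** 2).sum(axis=1) + rng.normal(0, 0.1, 27)   # hypothetical chitin yield (%)

    poly = PolynomialFeatures(degree=2, include_bias=False)
    model = LinearRegression().fit(poly.fit_transform(X), y)         # second-order polynomial model

    res = minimize(lambda x: -model.predict(poly.transform(x.reshape(1, -1)))[0],
                   x0=np.zeros(4), bounds=[(-1, 1)] * 4)             # maximise predicted yield
    print("optimum (coded levels):", res.x, "predicted yield:", -res.fun)
    ```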

  8. Development of a rapid method for the automatic classification of biological agents' fluorescence spectral signatures

    NASA Astrophysics Data System (ADS)

    Carestia, Mariachiara; Pizzoferrato, Roberto; Gelfusa, Michela; Cenciarelli, Orlando; Ludovici, Gian Marco; Gabriele, Jessica; Malizia, Andrea; Murari, Andrea; Vega, Jesus; Gaudio, Pasquale

    2015-11-01

    Biosecurity and biosafety are key concerns of modern society. Although nanomaterials are improving the capacities of point detectors, standoff detection still appears to be an open issue. Laser-induced fluorescence of biological agents (BAs) has proved to be one of the most promising optical techniques to achieve early standoff detection, but its strengths and weaknesses are still to be fully investigated. In particular, different BAs tend to have similar fluorescence spectra due to the ubiquity of biological endogenous fluorophores producing a signal in the UV range, making data analysis extremely challenging. The Universal Multi Event Locator (UMEL), a general method based on support vector regression, is commonly used to identify characteristic structures in arrays of data. In the first part of this work, we investigate the fluorescence emission spectra of different simulants of BAs and apply UMEL for their automatic classification. In the second part, we elaborate a strategy for applying UMEL to the discrimination of the spectra of different BA simulants. Through this strategy, it has been possible to discriminate between these simulants despite the high similarity of their fluorescence spectra. These preliminary results support the use of SVR methods to classify BAs' spectral signatures.
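
    UMEL itself is not reproduced here, but the underlying idea of using support vector regression to pick out characteristic structures in a signal can be sketched: fit an SVR to a spectrum and flag wavelengths where the residual is unusually large. The spectrum, kernel settings and threshold below are illustrative assumptions:

    ```python
    # Hedged sketch: SVR-based detection of characteristic spectral features via
    # large fit residuals.  The "fluorescence spectrum" is synthetic.
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(3)
    wl = np.linspace(300, 500, 400)                               # wavelength grid, nm
    spectrum = np.exp(-((wl - 340) / 15) ** 2) + 0.4 * np.exp(-((wl - 450) / 5) ** 2)
    spectrum += rng.normal(0, 0.01, wl.size)

    svr = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(wl.reshape(-1, 1), spectrum)
    residuals = spectrum - svr.predict(wl.reshape(-1, 1))
    flagged = wl[np.abs(residuals) > 3 * residuals.std()]         # candidate characteristic structures
    print("flagged wavelengths (nm):", np.round(flagged[:10], 1))
    ```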

  9. Heart rate variability is reduced in underweight and overweight healthy adult women.

    PubMed

    Triggiani, Antonio Ivano; Valenzano, Anna; Ciliberti, Michela Anna Pia; Moscatelli, Fiorenzo; Villani, Stefano; Monda, Marcellino; Messina, Giovanni; Federici, Antonio; Babiloni, Claudio; Cibelli, Giuseppe

    2017-03-01

    Heart rate variability (HRV) is altered in obese subjects, but whether this is true also in underweight (UW) subjects is still under debate. We investigated the HRV profile in a sample of healthy adult women and its association with adiposity. Five-minute resting-state electrocardiographic activity was recorded in 69 subjects grouped according to their body mass index (23 normal weight (NW), 23 overweight/obese (OW) and 23 UW). Body fat mass (FM) was measured by bio-impedance. Frequency- and time-domain analyses were performed. Compared with NW, UW and OW subjects showed a significant decrease in HRV indices, as revealed by spectral analysis. No differences were observed between UW and OW subjects. A second-order polynomial regression unveiled an inverted U-shaped relationship between FM extent and HRV indices. A decrease of HRV indices was associated with changes in FM extent, proving that in UW and OW subjects the adaptive flexibility of autonomic cardiac function was reduced. These findings provide important clues to guide future studies addressed to determine how changes in adiposity and autonomic cardiac function may contribute to health risk. © 2015 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.
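
    The inverted U-shape reported above comes from a second-order polynomial fit of an HRV index on fat mass. A minimal sketch with simulated values; the vertex of the fitted parabola marks the fat mass at which the index peaks:

    ```python
    # Hedged sketch: quadratic regression of an HRV index on fat mass (synthetic data).
    import numpy as np

    rng = np.random.default_rng(4)
    fat_mass = rng.uniform(10, 40, 69)                                  # % body fat, hypothetical
    hrv = -0.05 * (fat_mass - 25) ** 2 + 30 + rng.normal(0, 2, 69)      # inverted-U plus noise

    a, b, c = np.polyfit(fat_mass, hrv, deg=2)                          # a*x^2 + b*x + c
    vertex = -b / (2 * a)                                               # fat mass at which HRV peaks
    print("quadratic coefficients:", (a, b, c), " peak at FM =", round(vertex, 1))
    ```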

  10. The effect of environmental performance and accounting characteristics to earnings informativeness

    NASA Astrophysics Data System (ADS)

    Herawaty, V.

    2018-01-01

    The objective of this empirical study is to analyze the influence of environmental performance and companies' accounting characteristics on earnings informativeness, proxied by the earnings response coefficient (ERC), for manufacturing companies listed on the Indonesia Stock Exchange that consistently followed the PROPER assessment in 2010-2014. One consideration for a company is to create a green environment reflecting its environmental measures, drawing investors to respond to the company's environmental performance. The data were obtained from the Indonesian Capital Market Directory (ICMD), the Indonesia Stock Exchange homepage, the companies' annual reports, and the decrees of the Minister of Environment. The sample consists of 27 publicly listed manufacturing companies on the Indonesia Stock Exchange that consistently followed PROPER in 2010-2014, selected purposively. The study uses multiple regression analysis. The results show that environmental performance and profitability have a positive influence on earnings informativeness, while leverage has a negative influence. Growth opportunity, as a control variable, has a positive effect on earnings informativeness. This research has proved that environmental performance is crucial, as observed in investors' reactions in the capital market.

  11. [Person-organization fit as a mediator of relationship between work environment and stress among social workers].

    PubMed

    Waszkowska, Małgorzata; Andysz, Aleksandra; Merecz, Dorota

    2014-01-01

    Occupational stress of social workers is associated with various psychosocial hazards in the work environment, some of which affect person-organization fit (P-O fit). The aim of the study was to verify a hypothesis on the mediating role of P-O fit in the relationship between work environment and stress. The research was based on a sample of 500 social workers directly involved in social work. The data were obtained using the Person-Organization Fit Questionnaire by Czarnota-Bojarska, the Work Environment Questionnaire developed by the Department of Occupational Psychology, Nofer Institute of Occupational Medicine, Łódź, and the Perceived Stress Scale (PSS-10) by Cohen et al. As revealed by the regression analysis, of the 4 analyzed work environment factors only organizational politics was significantly related to perceived stress. The complementary and supplementary dimensions of P-O fit and identification with the organization were mediators of the relationship between organizational politics and stress, but only complementary fit proved to be a total mediator. The results of the study suggest that person-organization fit, especially its complementary aspect, is an essential determinant of accomplishing the core functions of social work and good practice among social workers.
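
    The mediation claim above can be checked with a simple product-of-coefficients test: regress the mediator on the predictor, regress the outcome on both, and multiply the two path coefficients. A minimal sketch with simulated questionnaire scores (not the study's data, and without the bootstrap confidence interval a full analysis would add):

    ```python
    # Hedged sketch of a product-of-coefficients mediation test:
    # work environment -> person-organization fit (mediator) -> perceived stress.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 500
    work_env = rng.normal(size=n)                         # e.g. organizational politics score
    po_fit = -0.5 * work_env + rng.normal(size=n)         # mediator: person-organization fit
    stress = -0.6 * po_fit + 0.1 * work_env + rng.normal(size=n)

    a = sm.OLS(po_fit, sm.add_constant(work_env)).fit().params[1]                             # X -> M
    b = sm.OLS(stress, sm.add_constant(np.column_stack([po_fit, work_env]))).fit().params[1]  # M -> Y given X
    print("indirect (mediated) effect a*b =", a * b)
    ```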

  12. Plasma homovanillic acid in adolescents with bulimia nervosa.

    PubMed

    Castro-Fornieles, Josefina; Deulofeu, Ramón; Martínez-Mallen, Esteve; Baeza, Immaculada; Fernández, Lorena; Lázaro, Luisa; Toro, Josep; Vila, Montserrat; Bernardo, Miquel

    2009-12-30

    Dopaminergic abnormalities in bulimia nervosa have been reported in some studies, but results are not consistent across studies. In the present study, clinical characteristics, plasma level of homovanillic acid (pHVA) and two scales - the Eating Attitudes Test (EAT) and the Beck Depression Inventory (BDI) - were assessed in 36 adolescent bulimia nervosa patients (mean age 16.3 years, S.D. 1.1) who were consecutively seen on an Eating Disorder Unit. Levels of pHVA were also measured in 16 healthy control adolescents from the general population. Patients had significantly higher mean pHVA than controls. Eighteen patients (50%) had a pHVA level equal to or higher than the mean of control subjects plus one standard deviation, and this group of patients had significantly higher mean BDI scores and non-significantly higher mean EAT scores, although they did not differ from the other patients in age, time elapsed since the onset of disorder, body mass index and number of binges or vomits. Moreover, in logistic regression analysis the BDI score proved to be an independent predictor of high pHVA. The level of pHVA is increased in bulimia nervosa patients with high scores on measures of depressive and eating symptomatology.

  13. Pleiotrophin as a potential biomarker in breast cancer patients.

    PubMed

    Ma, Jiequn; Kong, Ying; Nan, Haocheng; Qu, Shengyang; Fu, Xiao; Jiang, Lili; Wang, Wenjuan; Guo, Hui; Zhao, Shounian; He, Jianjun; Nan, Kejun

    2017-03-01

    Pleiotrophin (PTN), a multifunctional growth factor, is up-regulated in many tumors and is reported to play an important role in the regulation of several cellular processes. The objective of this study was to evaluate the clinical significance of PTN as a tumor marker in breast cancer (BC). Serum PTN levels were measured in 105 BC patients and 40 healthy volunteers using ELISA. In addition, PTN expression was examined in 80 BC tissues in a nested case-control study by immunohistochemistry. Serum PTN levels were elevated in BC patients compared with healthy controls. The area under the receiver operating characteristic (ROC) curve was 0.878 (95% CI: 0.824-0.932). The sensitivity of serum PTN was superior to that of CEA and CA15-3. High serum PTN levels were associated with TNM stage, histology grade, and distant metastasis. Moreover, serum PTN levels decreased significantly after surgical treatment. PTN expression was significantly higher in BC tissues than in paired paracancerous tissues. Tissue PTN expression proved to be a prognostic factor for breast cancer according to multivariable logistic regression analysis. PTN could be considered a potential biomarker for the presence of breast cancer. Copyright © 2016. Published by Elsevier B.V.
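
    The reported AUC of 0.878 summarises how well serum PTN separates patients from controls. The sketch below computes an AUC and a Youden-index cut-off for a simulated biomarker with the same group sizes; the values are placeholders, not the study's measurements:

    ```python
    # Hedged sketch: ROC analysis of a serum biomarker (simulated values).
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    rng = np.random.default_rng(6)
    ptn_patients = rng.normal(loc=6.0, scale=1.5, size=105)   # hypothetical serum PTN, BC patients
    ptn_controls = rng.normal(loc=4.0, scale=1.2, size=40)    # hypothetical serum PTN, healthy controls

    scores = np.concatenate([ptn_patients, ptn_controls])
    labels = np.concatenate([np.ones(105), np.zeros(40)])

    auc = roc_auc_score(labels, scores)
    fpr, tpr, thr = roc_curve(labels, scores)
    cutoff = thr[np.argmax(tpr - fpr)]                        # Youden-index optimal cut-off
    print("AUC = %.3f, optimal cut-off = %.2f" % (auc, cutoff))
    ```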

  14. A quick on-line state of health estimation method for Li-ion battery with incremental capacity curves processed by Gaussian filter

    NASA Astrophysics Data System (ADS)

    Li, Yi; Abdel-Monem, Mohamed; Gopalakrishnan, Rahul; Berecibar, Maitane; Nanini-Maury, Elise; Omar, Noshin; van den Bossche, Peter; Van Mierlo, Joeri

    2018-01-01

    This paper proposes an advanced state of health (SoH) estimation method for high-energy NMC lithium-ion batteries based on incremental capacity (IC) analysis. IC curves are used because of their ability to detect and quantify battery degradation mechanisms. A simple and robust smoothing method based on a Gaussian filter is proposed to reduce the noise on IC curves, so that the signatures associated with battery ageing can be accurately identified. A linear regression relationship is found between the battery capacity and the positions of features of interest (FOIs) on the IC curves. Results show that the SoH estimation function developed from a single battery cell is able to evaluate the SoH of other batteries cycled at different cycling depths with less than 2.5% maximum error, which proves the robustness of the proposed method. With this technique, partial charging voltage curves can be used for SoH estimation and the testing time can therefore be largely reduced. The method shows great potential for practical application, as it only requires static charging curves and can be easily implemented in a battery management system (BMS).
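
    The processing chain described above (differentiate the charge curve, smooth with a Gaussian filter, track a feature of interest, regress capacity on its position) can be sketched as follows. The charging data, filter width and the capacity/peak-position pairs are illustrative assumptions, not the paper's cells:

    ```python
    # Hedged sketch: Gaussian-filtered incremental-capacity (dQ/dV) curve and a
    # linear capacity-vs-feature regression for state-of-health estimation.
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    rng = np.random.default_rng(7)
    voltage = np.linspace(3.5, 4.2, 700)
    charge = np.cumsum(np.exp(-((voltage - 3.8) / 0.05) ** 2) + 0.2) * 0.01   # synthetic Q(V)
    dq_dv = np.gradient(charge, voltage) + rng.normal(0, 0.05, voltage.size)  # noisy IC curve

    smooth_ic = gaussian_filter1d(dq_dv, sigma=5)             # Gaussian smoothing of the IC curve
    foi = voltage[np.argmax(smooth_ic)]                       # feature of interest: main peak position
    print("IC peak located at %.3f V" % foi)

    # With peak positions and capacities from several ageing states (illustrative),
    # SoH follows from a simple linear fit:
    peak_positions = np.array([3.80, 3.81, 3.83, 3.85, 3.87])
    capacities = np.array([1.00, 0.97, 0.93, 0.88, 0.84])     # normalised capacity (SoH)
    slope, intercept = np.polyfit(peak_positions, capacities, 1)
    print("estimated SoH for a 3.84 V peak: %.3f" % (slope * 3.84 + intercept))
    ```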

  15. Deductive Evaluation: Formal Code Analysis With Low User Burden

    NASA Technical Reports Server (NTRS)

    Di Vito, Ben L.

    2016-01-01

    We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering work flows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.

  16. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    PubMed

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected functionals and values of covariates. The software is illustrated through the BNP regression analysis of real data.

  17. Applications of statistics to medical science, III. Correlation and regression.

    PubMed

    Watanabe, Hiroshi

    2012-01-01

    In this third part of a series surveying medical statistics, the concepts of correlation and regression are reviewed. In particular, methods of linear regression and logistic regression are discussed. Arguments related to survival analysis will be made in a subsequent paper.

  18. Identification of molecular descriptors for design of novel Isoalloxazine derivatives as potential Acetylcholinesterase inhibitors against Alzheimer's disease.

    PubMed

    Gurung, Arun Bahadur; Aguan, Kripamoy; Mitra, Sivaprasad; Bhattacharjee, Atanu

    2017-06-01

    In Alzheimer's disease (AD), the level of the neurotransmitter acetylcholine (ACh) is reduced. Since acetylcholinesterase (AChE) cleaves ACh, inhibitors of AChE are much sought after for AD treatment. The side effects of current inhibitors necessitate the development of newer AChE inhibitors. Isoalloxazine derivatives have proved to be promising AChE inhibitors; however, their structure-activity relationship studies have not been reported to date. In the present work, various quantitative structure-activity relationship (QSAR) model-building methods, such as multiple linear regression (MLR), partial least squares, and principal component regression, were employed to derive 3D-QSAR models using steric and electrostatic field descriptors. A statistically significant model was obtained using MLR coupled with a stepwise selection method, having r² = 0.9405, cross-validated r² (q²) = 0.6683, and high predictability (pred_r² = 0.6206; standard error pred_r²se = 0.2491). The steric and electrostatic contribution plot revealed three electrostatic fields (E_496, E_386 and E_577) and one steric field (S_60) contributing towards biological activity. A ligand-based 3D-pharmacophore model consisting of eight pharmacophore features was generated. Isoalloxazine derivatives were docked against human AChE, which revealed critical residues implicated in hydrogen bonds as well as hydrophobic interactions. The binding modes of the docked complexes (AChE_IA1 and AChE_IA14) were validated by molecular dynamics simulation, which showed stable trajectories in terms of root mean square deviation, and molecular mechanics/Poisson-Boltzmann surface area binding free energy analysis revealed key residues contributing significantly to the overall binding energy. The present study may be useful in the design of more potent Isoalloxazine derivatives as AChE inhibitors.

  19. Hybrid boosters for future launch vehicles

    NASA Astrophysics Data System (ADS)

    Dargies, E.; Lo, R. E.

    There is a striking similarity in the design of the US Space Transportation System, the European ARIANE 5P and the Japanese H-II: they all use a high-energy cryogenic core stage along with two large solid propellant rocket boosters (SRBs) to provide a high lift-off thrust level. Prior to last year's disasters with Challenger and Titan it was widely held that SRBs were cheap, uncomplicated and safe. Even before these accidents revealed severe safety hazards, shuttle operations demonstrated that the SRBs were by no means as cheap as reusable systems ought to be. In addition, they became known as sources of considerable environmental pollution. In contrast, hybrid rocket propulsion systems offer the following potential advantages: a much higher safety level (TNT equivalent of almost zero, shut-down capability in case of ignition failure of one unit, inert against unbonding); a choice of non-toxic propellant combinations; and equal or higher specific performance. For these reasons, system analyses were carried out to examine hybrids as a potential alternative to SRBs. Various analytical tools (mass and performance models, trajectory simulation, etc.) were developed for parametric studies of hybrid propulsion systems. Special attention was devoted to the well-known primary concerns of hybrids: the geometrical design of the solid fuel grain and the regression rate of the ablating surface. Experimental data were used as input wherever possible. In 1985 the first studies were carried out to find possible fields of application for hybrid rocket engines. A mass model and a performance model for hybrid rocket motors were developed, taking into account the peculiarities of hybrid combustion, namely the low regression rate and the shifting mixture ratio during operation. After this analytical work, hybrids proved to be a promising alternative to SRBs. Compared with solids, hybrids offer many advantages.

  20. [Predictors of physical incapacity degree to chronic hemodialysis patients in Kinshasa : Key role of the residual diuresis].

    PubMed

    Mokoli, Vieux Momeme; Bukabau, Justine Busanga; Izeidi, Patrick Parmba Osa; Luse, Jeanine Losa; Mukendi, Stéphane Kalambay; Mashinda, Désiré Kulimba; Makulo, Jean Robert Rissassy; Sumaili, Ernest Kiswaya; Lepira, François Bompeka; Nseka, Nazaire Mangani

    2016-12-01

    To identify predictors of the degree of physical incapacity in patients on chronic hemodialysis in Kinshasa. Bicentric analytical study between January 2007 and July 2013. The degree of physical incapacity was evaluated at 6 months of hemodialysis based on the Rosser scale. Logistic regression sought the predictors of no or light physical incapacity (Rosser < 3) vs. moderate to maximum (Rosser ≥ 3). P was set at 0.05. One hundred twenty-seven (127) patients received at least 6 months of hemodialysis (53.3 ± 11 years; 73.2% male); 79 (62.2%) had no or light incapacity and 48 (37.8%) moderate to maximum. Predictors of lower physical incapacity in univariate analysis were: secured funding, high socioeconomic level, absence of diabetes mellitus, high body weight, normal systolic and diastolic blood pressure, residual diuresis at 3 months, hemoglobin and hematocrit, low comorbidity, arteriovenous fistula, erythropoietin, at least 12 hours of hemodialysis per week, and absence of intradialytic complications. After logistic regression, a high residual diuresis at 3 months of hemodialysis proved to be an independent predictor of lower physical incapacity (aOR 0.998; P=0.024), along with the absence of diabetes mellitus (aOR 0.239; P=0.024), good control of systolic (aOR 0.958; P=0.013) and diastolic (aOR 1.089; P=0.003) blood pressure, and the use of erythropoietin (aOR 5.687; P=0.004). Preserving residual diuresis is associated with lower physical incapacity and must be integrated into hemodialysis management. Copyright © 2016 Association Société de néphrologie. Published by Elsevier SAS. All rights reserved.

  1. Development and validation of a predictive model for excessive postpartum blood loss: A retrospective, cohort study.

    PubMed

    Rubio-Álvarez, Ana; Molina-Alarcón, Milagros; Arias-Arias, Ángel; Hernández-Martínez, Antonio

    2018-03-01

    Postpartum haemorrhage is one of the leading causes of maternal morbidity and mortality worldwide. Despite the use of uterotonic agents as a preventive measure, it remains a challenge to identify those women who are at increased risk of postpartum bleeding. The aim was to develop and validate a predictive model to assess the risk of excessive bleeding in women with vaginal birth. Retrospective cohort study at "Mancha-Centro Hospital" (Spain). The predictive model was built on a derivation cohort of 2336 women between 2009 and 2011; for validation, a prospective cohort of 953 women between 2013 and 2014 was employed. Women with antenatal fetal demise, multiple pregnancies and gestations under 35 weeks were excluded. We used multivariate analysis with binary logistic regression, ridge regression and areas under the receiver operating characteristic curves to determine the predictive ability of the proposed model. There were 197 (8.43%) women with excessive bleeding in the derivation cohort and 63 (6.61%) in the validation cohort. Predictive factors in the final model were: maternal age, primiparity, duration of the first and second stages of labour, neonatal birth weight and antepartum haemoglobin levels. The predictive ability of this model in the derivation cohort was 0.90 (95% CI: 0.85-0.93), and 0.83 (95% CI: 0.74-0.92) in the validation cohort. The model proved to have excellent predictive ability in the derivation cohort, and its validation in a later population likewise showed good predictive ability. This model can be employed to identify women at higher risk of postpartum haemorrhage. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Developmental trajectories of paediatric headache - sex-specific analyses and predictors.

    PubMed

    Isensee, Corinna; Fernandez Castelao, Carolin; Kröner-Herwig, Birgit

    2016-01-01

    Headache is the most common pain disorder in children and adolescents and is associated with diverse dysfunctions and psychological symptoms. Several studies evidenced sex-specific differences in headache frequency. Until now no study exists that examined sex-specific patterns of change in paediatric headache across time and included pain-related somatic and (socio-)psychological predictors. Latent Class Growth Analysis (LCGA) was used in order to identify different trajectory classes of headache across four annual time points in a population-based sample (n = 3 227; mean age 11.34 years; 51.2 % girls). In multinomial logistic regression analyses the influence of several predictors on the class membership was examined. For girls, a four-class model was identified as the best fitting model. While the majority of girls reported no (30.5 %) or moderate headache frequencies (32.5 %) across time, one class with a high level of headache days (20.8 %) and a class with an increasing headache frequency across time (16.2 %) were identified. For boys a two class model with a 'no headache class' (48.6 %) and 'moderate headache class' (51.4 %) showed the best model fit. Regarding logistic regression analyses, migraine and parental headache proved to be stable predictors across sexes. Depression/anxiety was a significant predictor for all pain classes in girls. Life events, dysfunctional stress coping and school burden were also able to differentiate at least between some classes in both sexes. The identified trajectories reflect sex-specific differences in paediatric headache, as seen in the number and type of classes extracted. The documented risk factors can deliver ideas for preventive actions and considerations for treatment programmes.

  3. Applications of stochastic models and geostatistical analyses to study sources and spatial patterns of soil heavy metals in a metalliferous industrial district of China.

    PubMed

    Zhong, Buqing; Liang, Tao; Wang, Lingqing; Li, Kexin

    2014-08-15

    An extensive soil survey was conducted to study pollution sources and delineate the contamination of heavy metals in one of the metalliferous industrial bases in the karst areas of southwest China. A total of 597 topsoil samples were collected and the concentrations of five heavy metals, namely Cd, As (metalloid), Pb, Hg and Cr, were analyzed. Stochastic models, including a conditional inference tree (CIT) and a finite mixture distribution model (FMDM), were applied to identify the sources and partition the contributions from natural and anthropogenic sources for heavy metals in the topsoils of the study area. The regression trees for Cd, As, Pb and Hg proved to depend mostly on indicators of anthropogenic activities such as industrial type and distance from the urban area, while the regression tree for Cr was found to be mainly influenced by geogenic characteristics. The FMDM analysis showed that the geometric means of the modeled background values for Cd, As, Pb, Hg and Cr were close to their background values previously reported in the study area, while the contamination of Cd and Hg was widespread, imposing potentially detrimental effects on organisms through the food chain. Finally, the probabilities of single and multiple heavy metals exceeding the threshold values derived from the FMDM were estimated using indicator kriging (IK) and multivariate indicator kriging (MVIK). High probabilities of exceeding the thresholds were associated with metalliferous production and atmospheric deposition of heavy metals transported from the urban and industrial areas. Geostatistics coupled with stochastic models provide an effective way to delineate multiple heavy metal pollution and facilitate improved environmental management. Copyright © 2014 Elsevier B.V. All rights reserved.
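
    The finite mixture distribution model used above separates a geogenic background population from an anthropogenically enriched one. A minimal sketch of the idea with a two-component Gaussian mixture on log-transformed concentrations (the values are simulated, not the survey data):

    ```python
    # Hedged sketch: two-component Gaussian mixture on log concentrations; the
    # lower-mean component is read as the geogenic background.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(8)
    background = rng.lognormal(mean=np.log(0.3), sigma=0.3, size=450)   # hypothetical Cd, mg/kg
    enriched = rng.lognormal(mean=np.log(1.5), sigma=0.4, size=147)     # anthropogenically enriched
    cd = np.concatenate([background, enriched])

    gmm = GaussianMixture(n_components=2, random_state=0).fit(np.log(cd).reshape(-1, 1))
    geo_means = np.exp(np.sort(gmm.means_.ravel()))
    print("component geometric means (mg/kg):", geo_means)   # lower value ~ modeled background
    ```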

  4. Combined Socio-Behavioral Evaluation Improves the Differential Diagnosis Between the Behavioral Variant of Frontotemporal Dementia and Alzheimer's Disease: In Search of Neuropsychological Markers.

    PubMed

    Dodich, Alessandra; Cerami, Chiara; Cappa, Stefano F; Marcone, Alessandra; Golzi, Valeria; Zamboni, Michele; Giusti, Maria Cristina; Iannaccone, Sandro

    2018-01-01

    Current diagnostic criteria for the behavioral variant of frontotemporal dementia (bvFTD) and typical Alzheimer's disease (AD) include a differential pattern of neuropsychological impairments (episodic memory deficit in typical AD and dysexecutive syndrome in bvFTD). There is, however, substantial evidence of frequent overlap in neuropsychological features, making the differential diagnosis extremely difficult. In this retrospective study, we evaluated the diagnostic value of different cognitive and neurobehavioral markers in bvFTD and AD patient groups. We included 95 dementia patients with clinical and biomarker evidence of bvFTD (n = 48) or typical AD (n = 47) pathology. A 2-year clinical follow-up confirmed the clinical classification. Performance on basic cognitive tasks (memory, executive functions, visuo-spatial abilities, language) as well as social cognition skills and neurobehavioral profiles were recorded. A stepwise logistic regression model compared the neuropsychological profiles between groups and assessed the accuracy of cognitive and neurobehavioral markers in discriminating bvFTD from AD. Statistical comparison between patient groups proved social cognition and episodic memory impairments to be the main cognitive signatures of the bvFTD and AD neuropsychological profiles, respectively. Only half of the bvFTD patients showed attentive/executive deficits, questioning their role as a cognitive marker of bvFTD. Notably, the large majority of the bvFTD sample (i.e., 70%) performed poorly at delayed recall tasks. Logistic regression analysis identified social cognition performance, Frontal Behavioral Inventory and Mini-Mental State Examination scores as the best combination for distinguishing bvFTD from AD. Social cognition tasks and socio-behavioral questionnaires are recommended in clinical settings to improve the accuracy of early diagnosis of bvFTD.

  5. Network Intrusion Detection Based on a General Regression Neural Network Optimized by an Improved Artificial Immune Algorithm

    PubMed Central

    Wu, Jianfa; Peng, Dahao; Li, Zhuping; Zhao, Li; Ling, Huanzhang

    2015-01-01

    To effectively and accurately detect and classify network intrusion data, this paper introduces a general regression neural network (GRNN) based on the artificial immune algorithm with elitist strategies (AIAE). The elitist archive and elitist crossover were combined with the artificial immune algorithm (AIA) to produce the AIAE-GRNN algorithm, with the aim of improving its adaptivity and accuracy. The mean square errors (MSEs) were taken as the affinity function. The AIAE was used to optimize the smooth factors of the GRNN; the optimal smooth factor was then solved and substituted into the trained GRNN, and the intrusive data were classified. For comparison, GRNNs separately optimized by a genetic algorithm (GA), particle swarm optimization (PSO), and fuzzy C-means clustering (FCM) were also evaluated. The results first showed that the AIAE-GRNN achieves a higher classification accuracy than the PSO-GRNN, but that the running time of the AIAE-GRNN is long; FCM and GA-GRNN were eliminated because of their deficiencies in accuracy and convergence. To improve the running speed, principal component analysis (PCA) was adopted to reduce the dimensions of the intrusive data. With the reduction in dimensionality, the PCA-AIAE-GRNN loses less accuracy and converges better than the PCA-PSO-GRNN, and its running speed is considerably improved. The experimental results show that the AIAE-GRNN has higher robustness and accuracy than the other algorithms considered and can thus be used to classify intrusive data. PMID:25807466
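
    A GRNN is essentially a Gaussian-kernel weighted average of the training targets, governed by a single smooth factor sigma; tuning sigma is what the AIAE does in the paper. A minimal sketch on toy data, with a plain grid over sigma standing in for the immune-algorithm search:

    ```python
    # Hedged sketch of a general regression neural network (Nadaraya-Watson form).
    import numpy as np

    def grnn_predict(X_train, y_train, X_query, sigma):
        d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)   # squared distances
        w = np.exp(-d2 / (2 * sigma ** 2))                                # Gaussian kernel weights
        return (w @ y_train) / w.sum(axis=1)

    rng = np.random.default_rng(9)
    X_train = rng.uniform(-3, 3, size=(200, 2))
    y_train = np.sin(X_train[:, 0]) + 0.1 * rng.normal(size=200)
    X_test = rng.uniform(-3, 3, size=(20, 2))

    for sigma in (0.1, 0.5, 1.0):                     # the smooth factor the AIAE would optimise
        mse = np.mean((grnn_predict(X_train, y_train, X_test, sigma) - np.sin(X_test[:, 0])) ** 2)
        print("sigma = %.1f  MSE = %.4f" % (sigma, mse))
    ```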

  6. Ambiguity and uncertainty tolerance, need for cognition, and their association with stress. A study among Italian practicing physicians

    PubMed Central

    Iannello, Paola; Mottini, Anna; Tirelli, Simone; Riva, Silvia; Antonietti, Alessandro

    2017-01-01

    ABSTRACT Medical practice is inherently ambiguous and uncertain. The physicians’ ability to tolerate ambiguity and uncertainty has been proved to have a great impact on clinical practice. The primary aim of the present study was to test the hypothesis that higher degree of physicians’ ambiguity and uncertainty intolerance and higher need for cognitive closure will predict higher work stress. Two hundred and twelve physicians (mean age = 42.94 years; SD = 10.72) from different medical specialties with different levels of expertise were administered a set of questionnaires measuring perceived levels of work-related stress, individual ability to tolerate ambiguity, stress deriving from uncertainty, and personal need for cognitive closure. A linear regression analysis was performed to examine which variables predict the perceived level of stress. The regression model was statistically significant [R2 = .32; F(10,206) = 8.78, p ≤ .001], thus showing that, after controlling for gender and medical specialty, ambiguity and uncertainty tolerance, decisiveness (a dimension included in need for closure), and the years of practice were significant predictors of perceived work-related stress. Findings from the present study have some implications for medical education. Given the great impact that the individual ability to tolerate ambiguity and uncertainty has on the physicians’ level of perceived work-related stress, it would be worth paying particular attention to such a skill in medical education settings. It would be crucial to introduce or to empower educational tools and strategies that could increase medical students’ ability to tolerate ambiguity and uncertainty. Abbreviations: JSQ: Job stress questionnaire; NFCS: Need for cognitive closure scale; PRU: Physicians’ reactions to uncertainty; TFA: Tolerance for ambiguity PMID:28178917

  7. Ambiguity and uncertainty tolerance, need for cognition, and their association with stress. A study among Italian practicing physicians.

    PubMed

    Iannello, Paola; Mottini, Anna; Tirelli, Simone; Riva, Silvia; Antonietti, Alessandro

    2017-01-01

    Medical practice is inherently ambiguous and uncertain. The physicians' ability to tolerate ambiguity and uncertainty has been proved to have a great impact on clinical practice. The primary aim of the present study was to test the hypothesis that a higher degree of physicians' ambiguity and uncertainty intolerance and a higher need for cognitive closure will predict higher work stress. Two hundred and twelve physicians (mean age = 42.94 years; SD = 10.72) from different medical specialties with different levels of expertise were administered a set of questionnaires measuring perceived levels of work-related stress, individual ability to tolerate ambiguity, stress deriving from uncertainty, and personal need for cognitive closure. A linear regression analysis was performed to examine which variables predict the perceived level of stress. The regression model was statistically significant [R² = .32; F(10,206) = 8.78, p ≤ .001], thus showing that, after controlling for gender and medical specialty, ambiguity and uncertainty tolerance, decisiveness (a dimension included in need for closure), and the years of practice were significant predictors of perceived work-related stress. Findings from the present study have some implications for medical education. Given the great impact that the individual ability to tolerate ambiguity and uncertainty has on the physicians' level of perceived work-related stress, it would be worth paying particular attention to such a skill in medical education settings. It would be crucial to introduce or to empower educational tools and strategies that could increase medical students' ability to tolerate ambiguity and uncertainty. Abbreviations: JSQ: Job stress questionnaire; NFCS: Need for cognitive closure scale; PRU: Physicians' reactions to uncertainty; TFA: Tolerance for ambiguity.

  8. Process model comparison and transferability across bioreactor scales and modes of operation for a mammalian cell bioprocess.

    PubMed

    Craven, Stephen; Shirsat, Nishikant; Whelan, Jessica; Glennon, Brian

    2013-01-01

    A Monod kinetic model, a logistic equation model, and a statistical regression model were developed for a Chinese hamster ovary cell bioprocess operated under three different modes of operation (batch, bolus fed-batch, and continuous fed-batch) and grown at two different bioreactor scales (3 L bench-top and 15 L pilot-scale). The Monod kinetic model was developed for all modes of operation under study and predicted cell density and glucose, glutamine, lactate, and ammonia concentrations well for the bioprocess. However, it was computationally demanding due to the large number of parameters necessary to produce a good model fit. The transferability of the Monod kinetic model structure and parameter set across bioreactor scales and modes of operation was investigated and a parameter sensitivity analysis performed. The experimentally determined parameters had the greatest influence on model performance; they changed with scale and mode of operation, but were easily calculated. The remaining parameters, which were fitted using a differential evolutionary algorithm, were not as crucial. Logistic equation and statistical regression models were investigated as alternatives to the Monod kinetic model. They were less computationally intensive to develop due to the absence of a large parameter set. However, modeling of the nutrient and metabolite concentrations proved to be troublesome due to the logistic equation model structure and the inability of both models to incorporate a feed. The complexity, computational load, and effort required for model development have to be balanced against the necessary level of model sophistication when choosing which model type to develop for a particular application. Copyright © 2012 American Institute of Chemical Engineers (AIChE).
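
    The structure of a Monod-type model like the one compared above can be sketched as a small ODE system: the specific growth rate saturates in the limiting substrate, and a feed term distinguishes fed-batch from batch operation. Parameter values and the feed profile below are illustrative assumptions, not the fitted CHO parameters:

    ```python
    # Hedged sketch: minimal Monod kinetics for a continuously fed culture
    # (cell density X, glucose S, volume V).
    import numpy as np
    from scipy.integrate import solve_ivp

    mu_max, Ks, Yxs = 0.04, 0.5, 1.5e8        # 1/h, g/L, cells per g glucose (assumed)
    feed_rate, feed_glc = 0.02, 50.0          # L/h and g/L (assumed continuous feed)

    def monod(t, y):
        X, S, V = y                            # cells/L, g/L, L
        mu = mu_max * S / (Ks + S)
        dX = mu * X - (feed_rate / V) * X      # growth minus dilution by the feed
        dS = -mu * X / Yxs + (feed_rate / V) * (feed_glc - S)
        dV = feed_rate
        return [dX, dS, dV]

    sol = solve_ivp(monod, (0, 200), [2e8, 6.0, 3.0])
    print("final cell density: %.2e cells/L, residual glucose: %.2f g/L"
          % (sol.y[0, -1], sol.y[1, -1]))
    ```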

  9. Roundness variation in JPEG images affects the automated process of nuclear immunohistochemical quantification: correction with a linear regression model.

    PubMed

    López, Carlos; Jaén Martinez, Joaquín; Lejeune, Marylène; Escrivà, Patricia; Salvadó, Maria T; Pons, Lluis E; Alvaro, Tomás; Baucells, Jordi; García-Rojo, Marcial; Cugat, Xavier; Bosch, Ramón

    2009-10-01

    The volume of digital image (DI) storage continues to be an important problem in computer-assisted pathology. DI compression enables the size of files to be reduced, but with the disadvantage of a loss of quality. Previous results indicated that the efficiency of computer-assisted quantification of immunohistochemically stained cell nuclei may be significantly reduced when compressed DIs are used. This study shows which morphometric parameters of immunohistochemically stained nuclei may be altered by different levels of JPEG compression and what these alterations imply for automated nuclear counts, and it develops a method for correcting the resulting discrepancy in the nuclear count. For this purpose, 47 DIs from different tissues were captured in uncompressed TIFF format and converted to 1:3, 1:23 and 1:46 compression JPEG images. Sixty-five positive objects were selected from these images, and six morphological parameters were measured and compared for each object in the TIFF images and in the images at each compression level, using a set of previously developed and tested macros. Roundness proved to be the only morphological parameter that was significantly affected by image compression. Factors to correct the discrepancy in the roundness estimate were derived from linear regression models for each compression level, thereby eliminating the statistically significant differences between measurements in the equivalent images. These correction factors were incorporated in the automated macros, where they reduced the nuclear quantification differences arising from image compression. Our results demonstrate that it is possible to carry out unbiased automated immunohistochemical nuclear quantification in compressed DIs with a methodology that could be easily incorporated in different systems of digital image analysis.
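
    The correction described above amounts to a calibration line per compression level: regress the roundness measured on compressed images against the uncompressed (TIFF) reference and invert the fit for new measurements. A minimal sketch with simulated roundness values; the bias, slope and noise are invented for illustration:

    ```python
    # Hedged sketch: linear-regression correction of a compression-biased roundness measurement.
    import numpy as np

    rng = np.random.default_rng(10)
    roundness_tiff = rng.uniform(0.6, 1.0, 65)                                 # reference values
    roundness_jpeg = 0.92 * roundness_tiff + 0.03 + rng.normal(0, 0.01, 65)    # biased by compression

    slope, intercept = np.polyfit(roundness_tiff, roundness_jpeg, 1)           # calibration fit

    def correct(measured_jpeg):
        return (measured_jpeg - intercept) / slope                             # apply correction factor

    print("corrected examples:", np.round(correct(np.array([0.75, 0.85, 0.95])), 3))
    ```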

  10. In silico prediction of nematic transition temperature for liquid crystals using quantitative structure-property relationship approaches.

    PubMed

    Fatemi, Mohammad Hossein; Ghorbanzad'e, Mehdi

    2009-11-01

    Quantitative structure-property relationship models for the prediction of the nematic transition temperature (T_N) were developed using multilinear regression analysis and a feedforward artificial neural network (ANN). A collection of 42 thermotropic liquid crystals was chosen as the data set and divided into three sets: training, internal test, and external test. The training and internal test sets were used for ANN model development, and the external test set was used to evaluate the predictive power of the model. To build the models, a set of six descriptors was selected by the best multilinear regression procedure of the CODESSA program: atomic charge weighted partial negatively charged surface area, relative negative charged surface area, polarity parameter/square distance, minimum most negative atomic partial charge, molecular volume, and the A component of the moment of inertia, which encode geometrical and electronic characteristics of the molecules. These descriptors were used as inputs to the ANN. The optimized ANN model had a 6:6:1 topology. The standard errors in the calculation of T_N for the training, internal, and external test sets using the ANN model were 1.012, 4.910, and 4.070, respectively. To further evaluate the ANN model, a cross-validation test was performed, which produced Q² = 0.9796 and a standard deviation of 2.67 based on the predicted residual sum of squares. A diversity test was also performed to ensure the model's stability and prove its predictive capability. The obtained results reveal the suitability of ANN for the prediction of T_N for liquid crystals using molecular structural descriptors.

  11. Methods for inferring health-related social networks among coworkers from online communication patterns.

    PubMed

    Matthews, Luke J; DeWan, Peter; Rula, Elizabeth Y

    2013-01-01

    Studies of social networks, mapped using self-reported contacts, have demonstrated the strong influence of social connections on the propensity for individuals to adopt or maintain healthy behaviors and on their likelihood to adopt health risks such as obesity. Social network analysis may prove useful for businesses and organizations that wish to improve the health of their populations by identifying key network positions. Health traits have been shown to correlate across friendship ties, but evaluating network effects in large coworker populations presents the challenge of obtaining sufficiently comprehensive network data. The purpose of this study was to evaluate methods for using online communication data to generate comprehensive network maps that reproduce the health-associated properties of an offline social network. In this study, we examined three techniques for inferring social relationships from email traffic data in an employee population using thresholds based on: (1) the absolute number of emails exchanged, (2) logistic regression probability of an offline relationship, and (3) the highest ranked email exchange partners. As a model of the offline social network in the same population, a network map was created using social ties reported in a survey instrument. The email networks were evaluated based on the proportion of survey ties captured, comparisons of common network metrics, and autocorrelation of body mass index (BMI) across social ties. Results demonstrated that logistic regression predicted the greatest proportion of offline social ties, thresholding on number of emails exchanged produced the best match to offline network metrics, and ranked email partners demonstrated the strongest autocorrelation of BMI. Since each method had unique strengths, researchers should choose a method based on the aspects of offline behavior of interest. Ranked email partners may be particularly useful for purposes related to health traits in a social network.

  12. Methods for Inferring Health-Related Social Networks among Coworkers from Online Communication Patterns

    PubMed Central

    Matthews, Luke J.; DeWan, Peter; Rula, Elizabeth Y.

    2013-01-01

    Studies of social networks, mapped using self-reported contacts, have demonstrated the strong influence of social connections on the propensity for individuals to adopt or maintain healthy behaviors and on their likelihood to adopt health risks such as obesity. Social network analysis may prove useful for businesses and organizations that wish to improve the health of their populations by identifying key network positions. Health traits have been shown to correlate across friendship ties, but evaluating network effects in large coworker populations presents the challenge of obtaining sufficiently comprehensive network data. The purpose of this study was to evaluate methods for using online communication data to generate comprehensive network maps that reproduce the health-associated properties of an offline social network. In this study, we examined three techniques for inferring social relationships from email traffic data in an employee population using thresholds based on: (1) the absolute number of emails exchanged, (2) logistic regression probability of an offline relationship, and (3) the highest ranked email exchange partners. As a model of the offline social network in the same population, a network map was created using social ties reported in a survey instrument. The email networks were evaluated based on the proportion of survey ties captured, comparisons of common network metrics, and autocorrelation of body mass index (BMI) across social ties. Results demonstrated that logistic regression predicted the greatest proportion of offline social ties, thresholding on number of emails exchanged produced the best match to offline network metrics, and ranked email partners demonstrated the strongest autocorrelation of BMI. Since each method had unique strengths, researchers should choose a method based on the aspects of offline behavior of interest. Ranked email partners may be particularly useful for purposes related to health traits in a social network. PMID:23418436

  13. High Serum Phosphorus Level Is Associated with Left Ventricular Diastolic Dysfunction in Peritoneal Dialysis Patients.

    PubMed

    Ye, Min; Tian, Na; Liu, Yanqiu; Li, Wei; Lin, Hong; Fan, Rui; Li, Cuiling; Liu, Donghong; Yao, Fengjuan

    We initiated this study to explore the relationships of serum phosphorus level with left ventricular ultrasound features and diastolic function in peritoneal dialysis (PD) patients. 174 patients with end-stage renal disease (ESRD) receiving PD were enrolled in this retrospective observational study. Conventional echocardiography examination and tissue Doppler imaging (TDI) were performed in each patient. Clinical information and laboratory data were also collected. Analyses of echocardiographic features were performed according to phosphorus quartile groups, and multivariate regression models were used to determine the association between serum phosphorus and left ventricular diastolic dysfunction (LVDD). With the increase of serum phosphorus levels, patients on PD showed an increased tissue Doppler-derived E/e' ratio of the lateral wall (P < 0.001), indicating a deterioration of left ventricular diastolic function. Steady growth of left atrium and left ventricular diameters as well as an increase of left ventricular muscle mass were also observed across the increasing quartiles of phosphorus, while left ventricular ejection fraction remained normal. In a multivariate analysis, the regression coefficient for the E/e' ratio in the highest phosphorus quartile was almost threefold higher relative to that in the lowest quartile group, and compared with patients in the lowest phosphorus quartile (<1.34 mmol/L) those in the highest quartile (>1.95 mmol/L) had a more than fivefold increased odds of an E/e' ratio >15. Our study showed an early impairment of left ventricular diastolic function in peritoneal dialysis patients. High serum phosphorus level was independently associated with greater risk of LVDD in these patients. Whether serum phosphorus will be a useful target for prevention or improvement of LVDD remains to be proved by further studies.

  14. The effectiveness of formative assessment with understanding by design (UbD) stages in forming habits of mind in prospective teachers

    NASA Astrophysics Data System (ADS)

    Gloria, R. Y.; Sudarmin, S.; Wiyanto; Indriyanti, D. R.

    2018-03-01

    Habits of mind are intelligent thinking dispositions that every individual needs to have, and effort is needed to form them as expected. A behavior can be formed by continuous practice; therefore the student's habits of mind can also be formed and trained. One effort that can be used to encourage the formation of habits of mind is a formative assessment strategy with UbD (Understanding by Design) stages, and a study is needed to prove it. This study aims to determine the contribution of formative assessment to the habits of mind of prospective teachers. The method used is a quantitative method with a quasi-experimental design. To determine the effectiveness of formative assessment with UbD stages on the formation of habits of mind, a correlation test and regression analysis were conducted on the formative assessment questionnaire, which consists of three components (feedback, peer assessment and self assessment), and on habits of mind. The results show that of the three components of formative assessment, only the feedback component does not show a correlation with students' habits of mind (r = 0.323), while the peer assessment component (r = 0.732) and the self assessment component (r = 0.625) both indicate a correlation. From the regression test, the formative assessment components together contributed 57.1% to habits of mind. It can be concluded that formative assessment with UbD stages is effective and contributes to forming the students' habits of mind; the components that contributed the most are peer assessment and self assessment, and the greatest contribution is to the 'thinking interdependently' category.

  15. Development of variable pathlength UV-vis spectroscopy combined with partial-least-squares regression for wastewater chemical oxygen demand (COD) monitoring.

    PubMed

    Chen, Baisheng; Wu, Huanan; Li, Sam Fong Yau

    2014-03-01

    To overcome the challenging task of selecting an appropriate pathlength for accurate wastewater chemical oxygen demand (COD) monitoring by UV-vis spectroscopy in the wastewater treatment process, a variable pathlength approach combined with partial least-squares regression (PLSR) was developed in this study. Two new strategies were proposed to extract relevant information from the UV-vis spectral data of variable pathlength measurements. The first strategy was data fusion, at two levels: low-level data fusion (LLDF) and mid-level data fusion (MLDF). Predictive accuracy was found to improve, indicated by lower root-mean-square errors of prediction (RMSEP) compared with those obtained for single-pathlength measurements. Both fusion levels delivered very robust PLSR models with residual predictive deviations (RPD) greater than 3 (3.22 and 3.29, respectively). The second strategy involved calculating the slope of absorbance against pathlength at each wavelength to generate slope-derived spectra. Without the requirement to select an optimal pathlength, the predictive accuracy (RMSEP) was improved by 20-43% compared with single-pathlength spectroscopy. Compared with the nine-factor models from the fusion strategy, the PLSR model from slope-derived spectroscopy was more parsimonious, with only five factors, and more robust, with a residual predictive deviation (RPD) of 3.72. It also offered excellent correlation between predicted and measured COD values, with R² of 0.936. In sum, variable pathlength spectroscopy with the two proposed data analysis strategies proved successful in enhancing the prediction performance for COD in wastewater and showed high potential for on-line water quality monitoring. Copyright © 2013 Elsevier B.V. All rights reserved.
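
    The slope-derived-spectra strategy follows from the Beer-Lambert law: at each wavelength the absorbance is linear in pathlength, so the per-wavelength slope carries the concentration information and can be fed to a PLS regression for COD. A minimal sketch on simulated spectra (pathlengths, noise level and COD values are assumptions):

    ```python
    # Hedged sketch: slope-derived spectra from variable-pathlength absorbances,
    # followed by PLS regression against reference COD values (all simulated).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(11)
    n_samples, n_wl = 60, 200
    pathlengths = np.array([2.0, 5.0, 10.0, 20.0])                    # mm, assumed pathlengths
    eps = np.abs(rng.normal(size=n_wl))                               # effective absorptivity spectrum
    cod = rng.uniform(50, 500, n_samples)                             # mg/L, hypothetical reference COD

    # Absorbance cube (sample x pathlength x wavelength) with measurement noise
    A = cod[:, None, None] * pathlengths[None, :, None] * eps[None, None, :] * 1e-4
    A += rng.normal(0, 0.002, A.shape)

    # Slope of absorbance vs pathlength at every wavelength -> slope-derived spectrum
    slopes = np.array([[np.polyfit(pathlengths, A[i, :, j], 1)[0] for j in range(n_wl)]
                       for i in range(n_samples)])

    pls = PLSRegression(n_components=5).fit(slopes[:40], cod[:40])
    rmsep = np.sqrt(np.mean((pls.predict(slopes[40:]).ravel() - cod[40:]) ** 2))
    print("RMSEP on held-out samples: %.1f mg/L" % rmsep)
    ```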

  16. Impacts of education level and employment status on health-related quality of life in multiple sclerosis patients.

    PubMed

    Šabanagić-Hajrić, Selma; Alajbegović, Azra

    2015-02-01

    To evaluate the impacts of education level and employment status on health-related quality of life (HRQoL) in multiple sclerosis patients. This study included 100 multiple sclerosis patients treated at the Department of Neurology, Clinical Center of the University of Sarajevo. Inclusion criteria were an Expanded Disability Status Scale (EDSS) score between 1.0 and 6.5, age between 18 and 65 years, and stable disease on enrollment. Quality of life (QoL) was evaluated by the Multiple Sclerosis Quality of Life-54 questionnaire (MSQoL-54). Mann-Whitney and Kruskal-Wallis tests were used for comparisons. Linear regression analyses were performed to evaluate the value of educational level and employment status in predicting MSQoL-54 physical and mental composite scores. Full employment status had a positive impact on the physical health (54.85 vs. 37.90; p < 0.001) and mental health (59.55 vs. 45.90; p < 0.001) composite scores. Employment status retained its independent predictability for both the physical (R² = 0.105) and mental (R² = 0.076) composite scores in linear regression analysis. Patients with a college degree had slightly higher median values of the physical (49.36 vs. 45.30) and mental health (66.74 vs. 55.62) composite scores compared with the others, without a statistically significant difference. Employment proved to be an important factor in predicting quality of life in multiple sclerosis patients. Higher education level may determine better QoL but without significant predictive value. Sustained employment and the development of vocational rehabilitation programs for MS patients living in a country with a high unemployment level are important factors in improving both physical and mental health outcomes in MS patients.

  17. INNOVATIVE INSTRUMENTATION AND ANALYSIS OF THE TEMPERATURE MEASUREMENT FOR HIGH TEMPERATURE GASIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seong W. Lee

    During this reporting period, the literature survey covering the gasifier temperature measurement literature, the application of ultrasonics and its background in cleaning applications, and the spray coating process has been completed. The gasifier simulator (cold model) testing has been successfully conducted. Four factors (blower voltage, ultrasonic application, injection time intervals, particle weight) were considered as significant factors that affect the temperature measurement. Analysis of Variance (ANOVA) was applied to analyze the test data; the analysis shows that all four factors are significant to the temperature measurements in the gasifier simulator (cold model). The regression analysis for the case with the normalized room temperature shows that a linear model fits the temperature data with 82% accuracy (18% error), while the case without the normalized room temperature shows 72.5% accuracy (27.5% error). The nonlinear regression analysis indicates a better fit than the linear regression: its accuracy is 88.7% (11.3% error) for the normalized room temperature case. The hot model thermocouple sleeve design and fabrication are completed, as are the design and fabrication of the gasifier simulator (hot model). System tests of the gasifier simulator (hot model) have been conducted and some modifications have been made. Based on the system tests and results analysis, the gasifier simulator (hot model) has met the proposed design requirements and is ready for system testing. The ultrasonic cleaning method is under evaluation and will be further studied for the gasifier simulator (hot model) application. The progress of this project has been on schedule.

  18. The Economic Value of Mangroves: A Meta-Analysis

    Treesearch

    Marwa Salem; D. Evan Mercer

    2012-01-01

    This paper presents a synthesis of the mangrove ecosystem valuation literature through a meta-regression analysis. The main contribution of this study is that it is the first meta-analysis focusing solely on mangrove forests, whereas previous studies have included different types of wetlands. The number of studies included in the regression analysis is 44 for a total...

  19. Estimation of diffusion coefficients from voltammetric signals by support vector and gaussian process regression

    PubMed Central

    2014-01-01

    Background: Support vector regression (SVR) and Gaussian process regression (GPR) were used for the analysis of electroanalytical experimental data to estimate diffusion coefficients. Results: For simulated cyclic voltammograms based on the EC, Eqr, and EqrC mechanisms these regression algorithms in combination with nonlinear kernel/covariance functions yielded diffusion coefficients with higher accuracy as compared to the standard approach of calculating diffusion coefficients relying on the Nicholson-Shain equation. The level of accuracy achieved by SVR and GPR is virtually independent of the rate constants governing the respective reaction steps. Further, the reduction of high-dimensional voltammetric signals by manual selection of typical voltammetric peak features decreased the performance of both regression algorithms compared to a reduction by downsampling or principal component analysis. After training on simulated data sets, diffusion coefficients were estimated by the regression algorithms for experimental data comprising voltammetric signals for three organometallic complexes. Conclusions: Estimated diffusion coefficients closely matched the values determined by the parameter fitting method, but reduced the required computational time considerably for one of the reaction mechanisms. The automated processing of voltammograms according to the regression algorithms yields better results than the conventional analysis of peak-related data. PMID:24987463
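    A minimal sketch of the approach described (nonlinear-kernel SVR and GPR trained on dimension-reduced signals) is given below, using synthetic curves in place of simulated voltammograms; the library calls are standard scikit-learn, and all data and parameter values are illustrative assumptions.

```python
# Minimal sketch (synthetic data, not the study's simulator): SVR and Gaussian
# process regression on high-dimensional "voltammogram-like" signals reduced by
# PCA, to estimate an underlying coefficient.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(2)
n_curves, n_points = 200, 400
D = rng.uniform(1e-6, 1e-5, n_curves)            # "diffusion coefficients"
t = np.linspace(0.1, 1.0, n_points)
# Toy signals whose shape depends smoothly on D (stand-in for simulated CVs).
X = np.sqrt(D[:, None]) / np.sqrt(t[None, :]) + rng.normal(0, 1e-4, (n_curves, n_points))
y = np.log10(D)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

svr = make_pipeline(StandardScaler(), PCA(n_components=10),
                    SVR(kernel="rbf", C=10.0, epsilon=0.01)).fit(X_tr, y_tr)
gpr = make_pipeline(StandardScaler(), PCA(n_components=10),
                    GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                             normalize_y=True)).fit(X_tr, y_tr)

print("SVR R^2:", round(svr.score(X_te, y_te), 3))
print("GPR R^2:", round(gpr.score(X_te, y_te), 3))
```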

  20. Introduction to methodology of dose-response meta-analysis for binary outcome: With application on software.

    PubMed

    Zhang, Chao; Jia, Pengli; Yu, Liu; Xu, Chang

    2018-05-01

    Dose-response meta-analysis (DRMA) is widely applied to investigate the dose-specific relationship between independent and dependent variables. Such methods have been in use for over 30 years and are increasingly employed in healthcare and clinical decision-making. In this article, we give an overview of the methodology used in DRMA, summarizing the commonly used regression models and pooling methods, and we use an example to illustrate how to perform a DRMA with these methods. Five regression models (linear, piecewise, natural polynomial, fractional polynomial, and restricted cubic spline regression) are illustrated for fitting the dose-response relationship, and two pooling approaches, the one-stage and the two-stage approach, are illustrated for pooling the dose-response relationship across studies. The example showed similar results among these models. Several dose-response meta-analysis methods can be used for investigating the relationship between exposure level and the risk of an outcome; however, the methodology of DRMA still needs to be improved. © 2018 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
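    The sketch below illustrates one of the listed models, a spline-based dose-response trend fitted with inverse-variance weights, on synthetic study-level data. It uses patsy's cr() natural cubic spline basis as a stand-in for the restricted cubic spline and omits the within-study covariance handling of a full one- or two-stage DRMA.

```python
# Minimal sketch (synthetic data, hypothetical values): a crude dose-response
# trend of log relative risks fitted with a natural cubic spline basis and
# inverse-variance weights.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 60
df = pd.DataFrame({"dose": rng.uniform(0, 40, n)})
df["se"] = rng.uniform(0.1, 0.3, n)        # standard error of each log RR
df["logrr"] = 0.04 * df["dose"] - 0.0006 * df["dose"] ** 2 + rng.normal(0, df["se"])

# cr() provides a natural (restricted) cubic spline basis via patsy.
fit = smf.wls("logrr ~ cr(dose, df=3)", data=df, weights=1.0 / df["se"] ** 2).fit()
grid = pd.DataFrame({"dose": np.linspace(0, 40, 9)})
print(pd.concat([grid, fit.predict(grid).rename("pred_logRR")], axis=1))
```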

  1. Predictors of postoperative outcomes of cubital tunnel syndrome treatments using multiple logistic regression analysis.

    PubMed

    Suzuki, Taku; Iwamoto, Takuji; Shizu, Kanae; Suzuki, Katsuji; Yamada, Harumoto; Sato, Kazuki

    2017-05-01

    This retrospective study was designed to investigate prognostic factors for postoperative outcomes of cubital tunnel syndrome (CubTS) using multiple logistic regression analysis with a large number of patients. Eighty-three patients with CubTS who underwent surgery were enrolled. The following potential prognostic factors for disease severity were selected according to previous reports: sex, age, type of surgery, disease duration, body mass index, cervical lesion, presence of diabetes mellitus, Workers' Compensation status, preoperative severity, and preoperative electrodiagnostic testing. Postoperative severity of disease was assessed 2 years after surgery by Messina's criteria, an outcome measure specific to CubTS. Bivariate analysis was performed to select candidate prognostic factors for the multiple regression analyses. Multiple logistic regression analysis was conducted to identify the association between postoperative severity and the selected prognostic factors. Both bivariate and multiple regression analysis revealed only preoperative severity as an independent risk factor for poor prognosis, while other factors did not show any significant association. Although conflicting results exist regarding the prognosis of CubTS, this study supports evidence from previous studies and concludes that early surgical intervention portends the most favorable prognosis. Copyright © 2017 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.

  2. Drug treatment rates with beta-blockers and ACE-inhibitors/angiotensin receptor blockers and recurrences in takotsubo cardiomyopathy: A meta-regression analysis.

    PubMed

    Brunetti, Natale Daniele; Santoro, Francesco; De Gennaro, Luisa; Correale, Michele; Gaglione, Antonio; Di Biase, Matteo

    2016-07-01

    In a recent paper, Singh et al. analyzed the effect of drug treatment on recurrence of takotsubo cardiomyopathy (TTC) in a comprehensive meta-analysis. The study found that recurrence rates were independent of rates of BB prescription but inversely correlated with ACEi/ARB prescription; the authors therefore conclude that ACEi/ARB rather than BB may reduce the risk of recurrence. We aimed to re-analyze the data reported in the study, now weighted for population size, in a meta-regression analysis. After multiple meta-regression analysis, we found a significant regression between rates of ACEi prescription and rates of TTC recurrence; the regression was not statistically significant for BBs. On the basis of our re-analysis, we confirm that rates of recurrence of TTC are lower in populations of patients with higher rates of treatment with ACEi/ARB. That does not necessarily imply that ACEi prevent recurrence of TTC; it may merely indicate, for example, that rates of recurrence are lower in cohorts that are more compliant with therapy or that are prescribed ACEi more often because they are more carefully followed. Randomized prospective studies are surely warranted. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
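    A weighted meta-regression of the kind described (cohort-level recurrence rates regressed on prescription rates, weighted by cohort size) can be sketched as follows; the data are synthetic and the variable names (acei_rate, bb_rate, recurrence) are hypothetical.

```python
# Minimal sketch (synthetic cohort-level data): meta-regression of recurrence
# rate on ACEi/ARB and BB prescription rates, weighted by cohort size.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
k = 25                                            # number of cohorts
df = pd.DataFrame({
    "n": rng.integers(20, 400, k),                # cohort size (used as weight)
    "acei_rate": rng.uniform(0.2, 0.9, k),        # proportion on ACEi/ARB
    "bb_rate": rng.uniform(0.3, 0.9, k),          # proportion on beta-blockers
})
df["recurrence"] = np.clip(0.10 - 0.08 * df["acei_rate"]
                           + rng.normal(0, 0.02, k), 0, None)

fit = smf.wls("recurrence ~ acei_rate + bb_rate", data=df, weights=df["n"]).fit()
print(fit.params)
print(fit.pvalues)
```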

  3. A general framework for the use of logistic regression models in meta-analysis.

    PubMed

    Simmonds, Mark C; Higgins, Julian Pt

    2016-12-01

    Where individual participant data are available for every randomised trial in a meta-analysis of dichotomous event outcomes, "one-stage" random-effects logistic regression models have been proposed as a way to analyse these data. Such models can also be used even when individual participant data are not available and we have only summary contingency table data. One benefit of this one-stage regression model over conventional meta-analysis methods is that it maximises the correct binomial likelihood for the data and so does not require the common assumption that effect estimates are normally distributed. A second benefit of using this model is that it may be applied, with only minor modification, in a range of meta-analytic scenarios, including meta-regression, network meta-analyses and meta-analyses of diagnostic test accuracy. This single model can potentially replace the variety of often complex methods used in these areas. This paper considers, with a range of meta-analysis examples, how random-effects logistic regression models may be used in a number of different types of meta-analyses. This one-stage approach is compared with widely used meta-analysis methods including Bayesian network meta-analysis and the bivariate and hierarchical summary receiver operating characteristic (ROC) models for meta-analyses of diagnostic test accuracy. © The Author(s) 2014.

  4. Examination of influential observations in penalized spline regression

    NASA Astrophysics Data System (ADS)

    Türkan, Semra

    2013-10-01

    In parametric or nonparametric regression models, the results of regression analysis are affected by anomalous observations in the data set. Thus, detection of these observations is one of the major steps in regression analysis. Such observations are precisely detected by well-known influence measures, of which Pena's statistic is one. In this study, Pena's approach is formulated for penalized spline regression in terms of ordinary residuals and leverages. Real and artificial data are used to illustrate the effectiveness of Pena's statistic relative to Cook's distance in detecting influential observations. The results of the study clearly reveal that the proposed measure is superior to Cook's distance for detecting these observations in large data sets.
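    Pena's statistic itself is not available in common statistics libraries; the sketch below only shows the baseline comparator, Cook's distance, computed from an ordinary least squares fit on synthetic data with one planted outlier.

```python
# Minimal sketch (synthetic data): flagging influential observations with
# Cook's distance. Pena's statistic for penalized splines is not reproduced here.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
x = rng.uniform(0, 10, 100)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 100)
y[-1] += 10                                   # plant one gross outlier

X = sm.add_constant(x)
fit = sm.OLS(y, X).fit()
cooks_d, _ = fit.get_influence().cooks_distance
threshold = 4 / len(y)                        # a common rule of thumb
print("flagged observations:", np.where(cooks_d > threshold)[0])
```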

  5. Robust analysis of trends in noisy tokamak confinement data using geodesic least squares regression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verdoolaege, G., E-mail: geert.verdoolaege@ugent.be; Laboratory for Plasma Physics, Royal Military Academy, B-1000 Brussels; Shabbir, A.

    Regression analysis is a very common activity in fusion science for unveiling trends and parametric dependencies, but it can be a difficult matter. We have recently developed the method of geodesic least squares (GLS) regression that is able to handle errors in all variables, is robust against data outliers and uncertainty in the regression model, and can be used with arbitrary distribution models and regression functions. We here report on first results of the application of GLS to estimation of the multi-machine scaling law for the energy confinement time in tokamaks, demonstrating improved consistency of the GLS results compared to standard least squares.

  6. Study of the role of tumor necrosis factor-α (-308 G/A) and interleukin-10 (-1082 G/A) polymorphisms as potential risk factors to acute kidney injury in patients with severe sepsis using high-resolution melting curve analysis.

    PubMed

    Hashad, Doaa I; Elsayed, Eman T; Helmy, Tamer A; Elawady, Samier M

    2017-11-01

    Septic acute kidney injury (AKI) is a prevalent complication in intensive care units with an increased incidence of complications. The aim of the present study was to assess the use of high-resolution melting curve (HRM) analysis in investigating whether the genetic polymorphisms -308 G/A of tumor necrosis factor-α (TNF-α) and -1082 G/A of interleukin-10 (IL-10) may predispose patients diagnosed with severe sepsis to the development of AKI. One hundred and fifty patients with severe sepsis participated in the present study, of whom sixty-six developed AKI. Both polymorphisms were studied using HRM analysis. The low-producer genotype of both studied polymorphisms of the TNF-α and IL-10 genes was associated with AKI. Using logistic regression analysis, the low-producer genotypes remained an independent risk factor for AKI. A statistically significant difference was detected between the two studied groups, with the low-producer genotype of both the TNF-α (-308 G/A) and IL-10 (-1082 G/A) polymorphisms being more prevalent in patients who developed AKI. Principal conclusions: the low-producer genotypes of both the TNF-α (-308 G/A) and IL-10 (-1082 G/A) polymorphisms could be considered a risk factor for the development of AKI in critically ill patients with severe sepsis; management of this category of patients should therefore be adjusted to protect them from deterioration to acute kidney injury. HRM genotyping proved to be a highly efficient, simple, cost-effective technique that is well suited to the routine study of large-scale samples.

  7. Switching from subcutaneous insulin injection to oral vildagliptin administration in hemodialysis patients with type 2 diabetes: a pilot study.

    PubMed

    Yoshida, Naoshi; Babazono, Tetsuya; Hanai, Ko; Uchigata, Yasuko

    2016-08-01

    We conducted this pilot study to examine the efficacy and safety of switching from subcutaneous injection of insulin to oral administration of a DPP-4 inhibitor, vildagliptin, in type 2 diabetic patients undergoing hemodialysis. Consecutive type 2 diabetic patients on hemodialysis who were switched from insulin to vildagliptin between August 2010 and April 2011 were extracted from the hospital database. In patients whose post-switch increase in glycated albumin (GA) levels was <1.5% without resuming insulin for at least 24 weeks, the switch was defined as efficacious. In patients who resumed insulin therapy due to worsening of glycemic control, or whose GA levels increased by 1.5% or more, the switch was considered inefficacious. To predict the patients in whom the switch to vildagliptin would prove efficacious, receiver-operating characteristic (ROC) analysis and logistic regression analysis were performed. A total of 20 patients were extracted; the insulin dose was 12 ± 4 units/day, and levels of GA and HbA1c were 21.0 ± 3.7% and 6.5 ± 0.6%, respectively. Among them, 11 patients were efficaciously switched to vildagliptin. ROC and logistic regression analysis showed that patients with a shorter duration of diabetes, as well as lower levels of GA and HbA1c, appeared to have a higher likelihood of a successful treatment switch. None of the patients developed hypoglycemic symptoms, ketoacidosis, or serious adverse events. In conclusion, an efficacious change from insulin to vildagliptin was possible in approximately half of the type 2 diabetic dialysis patients. Long-term follow-up studies including a large number of patients are needed to confirm these results.

  8. Tumor invasiveness defined by IASLC/ATS/ERS classification of ground-glass nodules can be predicted by quantitative CT parameters.

    PubMed

    Zhou, Qian-Jun; Zheng, Zhi-Chun; Zhu, Yong-Qiao; Lu, Pei-Ji; Huang, Jia; Ye, Jian-Ding; Zhang, Jie; Lu, Shun; Luo, Qing-Quan

    2017-05-01

    To investigate the potential value of CT parameters to differentiate ground-glass nodules between noninvasive adenocarcinoma and invasive pulmonary adenocarcinoma (IPA) as defined by the IASLC/ATS/ERS classification. We retrospectively reviewed 211 patients with pathologically proven stage 0-IA lung adenocarcinoma appearing as subsolid nodules, from January 2012 to January 2013, including 137 pure ground-glass nodules (pGGNs) and 74 part-solid nodules (PSNs). Pathological data were classified under the 2011 IASLC/ATS/ERS classification. Both quantitative and qualitative CT parameters were used to determine tumor invasiveness between noninvasive adenocarcinomas and IPAs. There were 154 noninvasive adenocarcinomas and 57 IPAs. In pGGNs, CT size and area, one-dimensional mean CT value, and bubble lucency were significantly different between noninvasive adenocarcinomas and IPAs on univariate analysis. Multivariate regression and ROC analysis revealed that CT size and one-dimensional mean CT value were predictive of noninvasive adenocarcinomas compared to IPAs. The optimal cutoff values were 13.60 mm (sensitivity, 75.0%; specificity, 99.6%) and -583.60 HU (sensitivity, 68.8%; specificity, 66.9%). In PSNs, there were significant differences in CT size and area, solid component area, solid proportion, one-dimensional mean and maximum CT value, and three-dimensional (3D) mean CT value between noninvasive adenocarcinomas and IPAs on univariate analysis. Multivariate and ROC analysis showed that CT size and 3D mean CT value were significant differentiators. The optimal cutoff values were 19.64 mm (sensitivity, 53.7%; specificity, 93.9%) and -571.63 HU (sensitivity, 85.4%; specificity, 75.8%). For pGGNs, CT size and one-dimensional mean CT value are determinants of tumor invasiveness. For PSNs, tumor invasiveness can be predicted by CT size and 3D mean CT value.

  9. Teaching Content Analysis through "Harry Potter"

    ERIC Educational Resources Information Center

    Messinger, Adam M.

    2012-01-01

    Content analysis is a valuable research tool for social scientists that unfortunately can prove challenging to teach to undergraduate students. Published classroom exercises designed to teach content analysis have thus far been predominantly envisioned as lengthy projects for upper-level courses. A brief and engaging exercise may be more…

  10. The influence of baseline marijuana use on treatment of cocaine dependence: application of an informative-priors bayesian approach.

    PubMed

    Green, Charles; Schmitz, Joy; Lindsay, Jan; Pedroza, Claudia; Lane, Scott; Agnelli, Rob; Kjome, Kimberley; Moeller, F Gerard

    2012-01-01

    Marijuana use is prevalent among patients with cocaine dependence and often non-exclusionary in clinical trials of potential cocaine medications. The dual-focus of this study was to (1) examine the moderating effect of baseline marijuana use on response to treatment with levodopa/carbidopa for cocaine dependence; and (2) apply an informative-priors, Bayesian approach for estimating the probability of a subgroup-by-treatment interaction effect. A secondary data analysis of two previously published, double-blind, randomized controlled trials provided complete data for the historical (Study 1: N = 64 placebo), and current (Study 2: N = 113) data sets. Negative binomial regression evaluated Treatment Effectiveness Scores (TES) as a function of medication condition (levodopa/carbidopa, placebo), baseline marijuana use (days in past 30), and their interaction. Bayesian analysis indicated that there was a 96% chance that baseline marijuana use predicts differential response to treatment with levodopa/carbidopa. Simple effects indicated that among participants receiving levodopa/carbidopa the probability that baseline marijuana confers harm in terms of reducing TES was 0.981; whereas the probability that marijuana confers harm within the placebo condition was 0.163. For every additional day of marijuana use reported at baseline, participants in the levodopa/carbidopa condition demonstrated a 5.4% decrease in TES; while participants in the placebo condition demonstrated a 4.9% increase in TES. The potential moderating effect of marijuana on cocaine treatment response should be considered in future trial designs. Applying Bayesian subgroup analysis proved informative in characterizing this patient-treatment interaction effect.
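    A frequentist stand-in for the regression model described (negative binomial regression of TES on medication condition, baseline marijuana use, and their interaction) is sketched below on synthetic data; the informative-priors Bayesian estimation used in the study is not reproduced.

```python
# Minimal sketch (synthetic data): negative binomial regression of Treatment
# Effectiveness Scores on medication condition, baseline marijuana use, and
# their interaction.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 177
df = pd.DataFrame({
    "levodopa": rng.integers(0, 2, n),             # 1 = levodopa/carbidopa arm
    "mj_days": rng.integers(0, 31, n),             # marijuana days in past 30
})
mu = np.exp(2.5 - 0.05 * df["levodopa"] * df["mj_days"] + 0.04 * df["mj_days"])
df["tes"] = rng.poisson(mu * rng.gamma(5, 0.2, n))  # over-dispersed counts

fit = smf.glm("tes ~ levodopa * mj_days", data=df,
              family=sm.families.NegativeBinomial()).fit()
# Interaction term: does baseline marijuana use modify the treatment effect?
print(fit.params["levodopa:mj_days"], fit.pvalues["levodopa:mj_days"])
```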

  11. Spatial and Temporal Characteristics of Insulator Contaminations Revealed by Daily Observations of Equivalent Salt Deposit Density

    PubMed Central

    Ruan, Ling; Han, Ge; Zhu, Zhongmin; Zhang, Miao; Gong, Wei

    2015-01-01

    The accurate estimation of deposits adhering to insulators is of great significance to prevent pollution flashovers, which cause huge costs worldwide. Researchers have developed sensors using different technologies to monitor insulator contamination on a fine time scale. However, there is a lack of analysis of these data to reveal the spatial and temporal characteristics of insulator contamination, and as a result the scheduling of periodic maintenance of power facilities is highly dependent on personal experience. Owing to the deployment of novel sensors, daily Equivalent Salt Deposit Density (ESDD) observations spanning over two years were collected and analyzed for the first time. Results from 16 sites distributed across four regions of Hubei demonstrated that spatial heterogeneity can be seen at both fine and coarse geographical scales, suggesting that current polluted-area maps are necessary but not sufficient to guide the maintenance of power facilities. Both local emissions and regional air pollution conditions exert an evident influence on deposit accumulation. A relationship between ESDD and PM10 was revealed by regression analysis, showing that air pollution influences pollution accumulation on insulators. Moreover, the seasonality of ESDD was discovered for the first time by means of time series analysis, which could help engineers select appropriate times to clean the contamination. In addition, the trend component shows that ESDD increases in a negative exponential fashion with accumulation time (ESDD = a − b × exp(−time)) over the long term in real environments. PMID:25643058
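    The reported trend ESDD = a − b × exp(−time) can be fitted by nonlinear least squares; the sketch below does so on synthetic observations, with an extra rate parameter c assumed for flexibility, and is not based on the study's data.

```python
# Minimal sketch (synthetic observations): fitting the saturating accumulation
# trend ESDD = a - b * exp(-c * t) with nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

def esdd_trend(t, a, b, c):
    # Negative exponential growth of deposit density with accumulation time.
    return a - b * np.exp(-c * t)

rng = np.random.default_rng(7)
t = np.arange(1, 200)                                  # accumulation days
y = 0.08 - 0.07 * np.exp(-0.02 * t) + rng.normal(0, 0.003, t.size)

popt, pcov = curve_fit(esdd_trend, t, y, p0=[0.1, 0.1, 0.01])
a, b, c = popt
print(f"a={a:.3f} mg/cm^2, b={b:.3f}, c={c:.4f} per day")
```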

  12. A new computer aided diagnosis system for evaluation of chronic liver disease with ultrasound shear wave elastography imaging.

    PubMed

    Gatos, Ilias; Tsantis, Stavros; Spiliopoulos, Stavros; Karnabatidis, Dimitris; Theotokas, Ioannis; Zoumpoulis, Pavlos; Loupas, Thanasis; Hazle, John D; Kagadis, George C

    2016-03-01

    To classify chronic liver disease (CLD) from ultrasound shear-wave elastography (SWE) imaging by means of a computer-aided diagnosis (CAD) system. The proposed algorithm employs an inverse mapping technique (red-green-blue to stiffness) to quantify 85 SWE images (54 healthy and 31 with CLD). Texture analysis is then applied, involving the automatic calculation of 330 first- and second-order textural features from every transformed stiffness value map, to determine functional features that characterize liver elasticity and describe liver condition for all available stages. A stepwise regression analysis feature selection procedure is then used to obtain a reduced feature subset that is fed into the support vector machine (SVM) classification algorithm in the design of the CAD system. With regard to the mapping procedure accuracy, the stiffness map values had an average difference of 0.01 ± 0.001 kPa compared to the quantification results derived from the color box provided by the built-in software of the ultrasound system. The highest classification accuracy of the SVM model was 87.0%, with sensitivity and specificity values of 83.3% and 89.1%, respectively. Receiver operating characteristic curve analysis gave an area under the curve of 0.85 with a [0.77-0.89] confidence interval. The proposed CAD system, employing color-to-stiffness mapping and classification algorithms, offered superior results compared to already published clinical studies. It could prove to be of value to physicians by improving the diagnostic accuracy of CLD and can be employed as a second-opinion tool for avoiding unnecessary invasive procedures.
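    The classification stage described (feature selection followed by an SVM and ROC analysis) can be sketched as follows on synthetic features of matching shape; univariate selection is used here as a simple stand-in for the stepwise regression feature selection, and no claim is made about reproducing the reported accuracy.

```python
# Minimal sketch (synthetic features): feature selection + SVM classification
# with ROC-AUC evaluation, mirroring the pipeline described.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# 85 "images", 330 texture features, 2 classes (healthy vs CLD) -- shapes only.
X, y = make_classification(n_samples=85, n_features=330, n_informative=15,
                           weights=[0.64, 0.36], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=20),
                    SVC(kernel="rbf", probability=True)).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print("test ROC AUC:", round(auc, 2))
```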

  13. Fast classification and compositional analysis of cornstover fractions using Fourier transform near-infrared techniques.

    PubMed

    Philip Ye, X; Liu, Lu; Hayes, Douglas; Womac, Alvin; Hong, Kunlun; Sokhansanj, Shahab

    2008-10-01

    The objectives of this research were to determine the variation of chemical composition across botanical fractions of cornstover, and to probe the potential of Fourier transform near-infrared (FT-NIR) techniques for qualitatively classifying separated cornstover fractions and for quantitatively analyzing chemical compositions of cornstover by developing calibration models that predict chemical composition from FT-NIR spectra. Large variations in cornstover chemical composition, providing the wide calibration ranges required by a reliable calibration model, were achieved by manually separating the cornstover samples into six botanical fractions; their chemical compositions were determined by conventional wet chemical analyses, which showed that chemical composition varies significantly among the botanical fractions. In descending order of total saccharide content, the botanical fractions are husk, sheath, pith, rind, leaf, and node. Based on FT-NIR spectra acquired on the biomass, classification by Soft Independent Modeling of Class Analogy (SIMCA) was employed for qualitative classification of cornstover fractions, and partial least squares (PLS) regression was used for quantitative chemical composition analysis. SIMCA successfully classified the botanical fractions of cornstover. The developed PLS model yielded root mean square errors of prediction (RMSEP, %w/w) of 0.92, 1.03, 0.17, 0.27, 0.21, 1.12, and 0.57 for glucan, xylan, galactan, arabinan, mannan, lignin, and ash, respectively. The results showed the potential of FT-NIR techniques in combination with multivariate analysis to be utilized by biomass feedstock suppliers, bioethanol manufacturers, and bio-power producers to better manage bioenergy feedstocks and enhance bioconversion.
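    A minimal PLS calibration of the kind described, predicting a constituent concentration from spectra and reporting RMSEP on held-out samples, is sketched below on simulated spectra; the SIMCA classification step is not shown.

```python
# Minimal sketch (simulated spectra): PLS regression predicting a constituent
# concentration from NIR-like spectra, with RMSEP on a held-out set.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
n_samples, n_wavelengths = 120, 500
concentration = rng.uniform(10, 40, n_samples)          # e.g. % glucan (hypothetical)
bands = np.exp(-0.5 * ((np.arange(n_wavelengths) - 200) / 30) ** 2)
spectra = (concentration[:, None] * bands[None, :]
           + rng.normal(0, 0.5, (n_samples, n_wavelengths)))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, concentration, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
rmsep = mean_squared_error(y_te, pls.predict(X_te).ravel()) ** 0.5
print("RMSEP (%w/w):", round(rmsep, 2))
```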

  14. The Influence of Baseline Marijuana Use on Treatment of Cocaine Dependence: Application of an Informative-Priors Bayesian Approach

    PubMed Central

    Green, Charles; Schmitz, Joy; Lindsay, Jan; Pedroza, Claudia; Lane, Scott; Agnelli, Rob; Kjome, Kimberley; Moeller, F. Gerard

    2012-01-01

    Background: Marijuana use is prevalent among patients with cocaine dependence and often non-exclusionary in clinical trials of potential cocaine medications. The dual-focus of this study was to (1) examine the moderating effect of baseline marijuana use on response to treatment with levodopa/carbidopa for cocaine dependence; and (2) apply an informative-priors, Bayesian approach for estimating the probability of a subgroup-by-treatment interaction effect. Method: A secondary data analysis of two previously published, double-blind, randomized controlled trials provided complete data for the historical (Study 1: N = 64 placebo), and current (Study 2: N = 113) data sets. Negative binomial regression evaluated Treatment Effectiveness Scores (TES) as a function of medication condition (levodopa/carbidopa, placebo), baseline marijuana use (days in past 30), and their interaction. Results: Bayesian analysis indicated that there was a 96% chance that baseline marijuana use predicts differential response to treatment with levodopa/carbidopa. Simple effects indicated that among participants receiving levodopa/carbidopa the probability that baseline marijuana confers harm in terms of reducing TES was 0.981; whereas the probability that marijuana confers harm within the placebo condition was 0.163. For every additional day of marijuana use reported at baseline, participants in the levodopa/carbidopa condition demonstrated a 5.4% decrease in TES; while participants in the placebo condition demonstrated a 4.9% increase in TES. Conclusion: The potential moderating effect of marijuana on cocaine treatment response should be considered in future trial designs. Applying Bayesian subgroup analysis proved informative in characterizing this patient-treatment interaction effect. PMID:23115553

  15. Retro-regression--another important multivariate regression improvement.

    PubMed

    Randić, M

    2001-01-01

    We review the serious problem associated with instabilities of the coefficients of regression equations, referred to as the MRA (multivariate regression analysis) "nightmare of the first kind". This is manifested when a descriptor is included in or excluded from a stepwise regression: the consequence is an unpredictable change in the coefficients of the descriptors that remain in the regression equation. We then consider an even more serious problem, referred to as the MRA "nightmare of the second kind", arising when optimal descriptors are selected from a large pool of descriptors. This process typically causes, at different steps of the stepwise regression, a replacement of several previously used descriptors by new ones. We describe a procedure that resolves these difficulties. The approach is illustrated on boiling points of nonanes, which are considered (1) by using an ordered connectivity basis; (2) by using an ordering resulting from application of a greedy algorithm; and (3) by using an ordering derived from an exhaustive search for optimal descriptors. A novel variant of multiple regression analysis, called retro-regression (RR), is outlined, showing how it resolves the ambiguities associated with both "nightmares" of the first and the second kind of MRA.

  16. Regression analysis for solving diagnosis problem of children's health

    NASA Astrophysics Data System (ADS)

    Cherkashina, Yu A.; Gerget, O. M.

    2016-04-01

    The paper presents the results of research devoted to the application of statistical techniques, namely regression analysis, to assess the health status of children in the neonatal period based on medical data (hemostatic parameters, blood test parameters, gestational age, vascular endothelial growth factor) measured at 3-5 days of life. A detailed description of the studied medical data is given, and a binary logistic regression procedure is discussed. The basic results of the research are presented: a classification table of predicted versus observed values is shown and the overall percentage of correct recognition is determined; the regression coefficients are calculated and the general regression equation is written from them. Based on the results of the logistic regression, ROC analysis was performed, the sensitivity and specificity of the model were calculated, and ROC curves were constructed. These mathematical techniques allow diagnostics of children's health with a high quality of recognition. The results make a significant contribution to the development of evidence-based medicine and have high practical importance in the professional activity of the author.

  17. Regression analysis using dependent Polya trees.

    PubMed

    Schörgendorfer, Angela; Branscum, Adam J

    2013-11-30

    Many commonly used models for linear regression analysis force overly simplistic shape and scale constraints on the residual structure of data. We propose a semiparametric Bayesian model for regression analysis that produces data-driven inference by using a new type of dependent Polya tree prior to model arbitrary residual distributions that are allowed to evolve across increasing levels of an ordinal covariate (e.g., time, in repeated measurement studies). By modeling residual distributions at consecutive covariate levels or time points using separate, but dependent Polya tree priors, distributional information is pooled while allowing for broad pliability to accommodate many types of changing residual distributions. We can use the proposed dependent residual structure in a wide range of regression settings, including fixed-effects and mixed-effects linear and nonlinear models for cross-sectional, prospective, and repeated measurement data. A simulation study illustrates the flexibility of our novel semiparametric regression model to accurately capture evolving residual distributions. In an application to immune development data on immunoglobulin G antibodies in children, our new model outperforms several contemporary semiparametric regression models based on a predictive model selection criterion. Copyright © 2013 John Wiley & Sons, Ltd.

  18. A comparison of methods for the analysis of binomial clustered outcomes in behavioral research.

    PubMed

    Ferrari, Alberto; Comelli, Mario

    2016-12-01

    In behavioral research, data consisting of a per-subject proportion of "successes" and "failures" over a finite number of trials often arise. These clustered binary data are usually non-normally distributed, which can distort inference if the usual general linear model is applied and the sample size is small. A number of more advanced methods are available, but they are often technically challenging, and a comparative assessment of their performance in behavioral setups has not been performed. We studied the performance of several methods applicable to the analysis of proportions, namely linear regression, Poisson regression, beta-binomial regression, and generalized linear mixed models (GLMMs). We report on a simulation study evaluating the power and Type I error rate of these models in hypothetical scenarios met by behavioral researchers, and we describe results from applying these methods to data from real experiments. Our results show that, while GLMMs are powerful instruments for the analysis of clustered binary outcomes, beta-binomial regression can outperform them in a range of scenarios. Linear regression gave results consistent with the nominal level of significance but was overall less powerful. Poisson regression, instead, mostly led to anticonservative inference. GLMMs and beta-binomial regression are generally more powerful than linear regression; yet linear regression is robust to model misspecification in some conditions, whereas Poisson regression suffers heavily from violations of its assumptions when used to model proportion data. We conclude by providing directions to behavioral scientists dealing with clustered binary data and small sample sizes. Copyright © 2016 Elsevier B.V. All rights reserved.
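    Beta-binomial regression, the best-performing method in several of the scenarios above, is not part of the common Python regression toolkits; the sketch below fits it by direct maximum likelihood on synthetic success/trial data using scipy's betabinom distribution, purely as an illustration of the model.

```python
# Minimal sketch (synthetic data): hand-rolled maximum-likelihood beta-binomial
# regression for per-subject success/trial counts with a treatment effect.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from scipy.stats import betabinom

rng = np.random.default_rng(9)
n_subj = 60
trials = rng.integers(10, 30, n_subj)
group = rng.integers(0, 2, n_subj)                 # treatment indicator
p_true = expit(-0.5 + 1.0 * group)
successes = rng.binomial(trials, rng.beta(p_true * 10, (1 - p_true) * 10))

def neg_loglik(theta):
    b0, b1, log_s = theta
    p = expit(b0 + b1 * group)                     # mean success probability
    s = np.exp(log_s)                              # precision (inverse overdispersion)
    return -betabinom.logpmf(successes, trials, p * s, (1 - p) * s).sum()

fit = minimize(neg_loglik, x0=[0.0, 0.0, 1.0], method="Nelder-Mead")
b0, b1, log_s = fit.x
print(f"treatment log-odds effect: {b1:.2f}, precision: {np.exp(log_s):.1f}")
```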

  19. A Survey of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, Chris; Holloway, C. M.

    2003-01-01

    Mishap investigations provide important information about adverse events and near-miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety-critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. These proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems: such mechanisms cannot accurately capture the wealth of inductive, deductive, and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might be used to support mishap analysis.

  20. Regression analysis of informative current status data with the additive hazards model.

    PubMed

    Zhao, Shishun; Hu, Tao; Ma, Ling; Wang, Peijie; Sun, Jianguo

    2015-04-01

    This paper discusses regression analysis of current status failure time data arising from the additive hazards model in the presence of informative censoring. Many methods have been developed for regression analysis of current status data under various regression models if the censoring is noninformative, and also there exists a large literature on parametric analysis of informative current status data in the context of tumorgenicity experiments. In this paper, a semiparametric maximum likelihood estimation procedure is presented and in the method, the copula model is employed to describe the relationship between the failure time of interest and the censoring time. Furthermore, I-splines are used to approximate the nonparametric functions involved and the asymptotic consistency and normality of the proposed estimators are established. A simulation study is conducted and indicates that the proposed approach works well for practical situations. An illustrative example is also provided.
