Sample records for factor analysis method

  1. Using Horn's Parallel Analysis Method in Exploratory Factor Analysis for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Çokluk, Ömay; Koçak, Duygu

    2016-01-01

    In this study, the number of factors obtained from parallel analysis, a method used for determining the number of factors in exploratory factor analysis, was compared, in terms of consistency, with the number of factors obtained from the eigenvalue criterion and the scree plot--two traditional methods for determining the number of factors. Parallel analysis is based on…
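
    A quick way to see what the comparison involves: Horn's method retains factors whose observed eigenvalues exceed those of random data of the same dimensions. Below is a minimal NumPy sketch of parallel analysis, not the authors' implementation; the synthetic two-factor dataset and all names are illustrative.

```python
import numpy as np

def parallel_analysis(data, n_sims=1000, percentile=95, seed=0):
    """Horn's parallel analysis: count observed correlation-matrix eigenvalues
    that exceed the chosen percentile of eigenvalues from random normal data
    of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.empty((n_sims, p))
    for i in range(n_sims):
        r = rng.standard_normal((n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    threshold = np.percentile(rand, percentile, axis=0)
    return int(np.sum(obs > threshold))

# Synthetic check: 300 cases, 10 variables generated from 2 factors
rng = np.random.default_rng(1)
scores = rng.standard_normal((300, 2))
x = scores @ rng.uniform(0.5, 0.9, (2, 10)) + 0.7 * rng.standard_normal((300, 10))
print(parallel_analysis(x))  # typically 2, where the Kaiser rule may disagree
```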

  2. Determining the Number of Factors in P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…

  3. Comparisons of Exploratory and Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Daniel, Larry G.

    Historically, most researchers conducting factor analysis have used exploratory methods. However, more recently, confirmatory factor analytic methods have been developed that can directly test theory either during factor rotation using "best fit" rotation methods or during factor extraction, as with the LISREL computer programs developed…

  4. Factor Retention in Exploratory Factor Analysis: A Comparison of Alternative Methods.

    ERIC Educational Resources Information Center

    Mumford, Karen R.; Ferron, John M.; Hines, Constance V.; Hogarty, Kristine Y.; Kromrey, Jeffery D.

    This study compared the effectiveness of 10 methods of determining the number of factors to retain in exploratory common factor analysis. The 10 methods included the Kaiser rule and a modified Kaiser criterion, 3 variations of parallel analysis, 4 regression-based variations of the scree procedure, and the minimum average partial procedure. The…

  5. Global sensitivity analysis for urban water quality modelling: Terminology, convergence and comparison of different methods

    NASA Astrophysics Data System (ADS)

    Vanrolleghem, Peter A.; Mannina, Giorgio; Cosenza, Alida; Neumann, Marc B.

    2015-03-01

    Sensitivity analysis represents an important step in improving the understanding and use of environmental models. Indeed, by means of global sensitivity analysis (GSA), modellers may identify both important (factor prioritisation) and non-influential (factor fixing) model factors. No general rule has yet been defined for verifying the convergence of the GSA methods. In order to fill this gap, this paper presents a convergence analysis of three widely used GSA methods (SRC, Extended FAST and Morris screening) for an urban drainage stormwater quality-quantity model. After convergence was achieved, the results of each method were compared. In particular, a discussion on the peculiarities, applicability, and reliability of the three methods is presented. Moreover, a graphical Venn-diagram-based classification scheme and a precise terminology for better identifying important, interacting and non-influential factors for each method are proposed. In terms of convergence, it was shown that sensitivity indices related to factors of the quantity model achieve convergence faster. Results for the Morris screening method deviated considerably from the other methods. Factors related to the quality model require a much higher number of simulations than the number suggested in the literature for achieving convergence with this method. In fact, the results have shown that the term "screening" is improperly used, as the method may exclude important factors from further analysis. Moreover, for the presented application the convergence analysis shows more stable sensitivity coefficients for the Extended FAST method compared to SRC and Morris screening. Substantial agreement in terms of factor fixing was found between the Morris screening and Extended FAST methods. In general, the water quality related factors exhibited more important interactions than factors related to water quantity. Furthermore, in contrast to water quantity model outputs, water quality model outputs were found to be characterised by high non-linearity.
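
    Of the three GSA methods compared, the standardized regression coefficient (SRC) method is the simplest to illustrate. The sketch below is ours, not the paper's code; the toy model and all names are invented, and SRCs are only meaningful for near-linear models (check the regression R² first), which is precisely the limitation that motivates Extended FAST and Morris screening.

```python
import numpy as np

def src_indices(model, bounds, n_samples=2000, seed=0):
    """Standardized regression coefficients: regress Monte Carlo model outputs
    on the sampled factors; SRC_i = b_i * std(x_i) / std(y)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(n_samples, len(bounds)))
    y = np.apply_along_axis(model, 1, x)
    a = np.column_stack([np.ones(n_samples), x])
    coef, *_ = np.linalg.lstsq(a, y, rcond=None)
    return coef[1:] * x.std(axis=0) / y.std()

# Toy stormwater-quality stand-in: y = 3*x1 + x2 + 0.1*x3 (purely illustrative)
toy = lambda v: 3 * v[0] + v[1] + 0.1 * v[2]
print(src_indices(toy, bounds=[(0, 1)] * 3))  # x1 dominates, x3 is near-fixable
```

    Re-running such a function with increasing n_samples until the indices stabilise is, in miniature, the convergence analysis the paper formalises.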

  6. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    PubMed

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.

  7. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    ERIC Educational Resources Information Center

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  8. A comparison study on detection of key geochemical variables and factors through three different types of factor analysis

    NASA Astrophysics Data System (ADS)

    Hoseinzade, Zohre; Mokhtari, Ahmad Reza

    2017-10-01

    Large numbers of variables have been measured to explain different phenomena. Factor analysis has widely been used in order to reduce the dimension of datasets. Additionally, the technique has been employed to highlight underlying factors hidden in a complex system. As geochemical studies benefit from multivariate assays, application of this method is widespread in geochemistry. However, the conventional protocols for implementing factor analysis have some drawbacks in spite of their advantages. In the present study, a geochemical dataset of 804 soil samples, collected from a mining area in central Iran in order to search for MVT-type Pb-Zn deposits, was used to compare different factor analysis protocols. Routine factor analysis, sequential factor analysis, and staged factor analysis were applied to the dataset, after opening the data with an alr (additive log-ratio) transformation, to extract the mineralization factor. A comparison between these methods indicated that sequential factor analysis most clearly revealed the MVT paragenesis elements in surface samples, with nearly 50% of the variation in F1. In addition, staged factor analysis gave acceptable results while being easy to apply: it could detect mineralization-related elements and assigned them larger factor loadings, resulting in a clearer expression of the mineralization.

  9. Product competitiveness analysis for e-commerce platform of special agricultural products

    NASA Astrophysics Data System (ADS)

    Wan, Fucheng; Ma, Ning; Yang, Dongwei; Xiong, Zhangyuan

    2017-09-01

    On the basis of analyzing the factors influencing product competitiveness on an e-commerce platform for special agricultural products, and the characteristics of methods for analyzing the competitiveness of such products, the price, sales volume, postage-included service, store reputation, popularity, etc. were selected in this paper as the dimensions for analyzing the competitiveness of the agricultural products, and principal component factor analysis was taken as the competitiveness analysis method. Specifically, a web crawler was used to capture the information of various special agricultural products on the e-commerce platform chi.taobao.com. The captured raw data were preprocessed, and a MySQL database was used to establish an information library for the special agricultural products. Principal component factor analysis was then used to establish the analysis model for the competitiveness of the special agricultural products, with SPSS used in the principal component factor analysis process to obtain the competitiveness evaluation factor system (support degree factor, price factor, service factor and evaluation factor) of the special agricultural products. Finally, linear regression was used to establish a competitiveness index equation of the special agricultural products for estimating their competitiveness.

  10. Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR

    ERIC Educational Resources Information Center

    Baglin, James

    2014-01-01

    Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many…

  11. Prediction of quality attributes of chicken breast fillets by using Vis/NIR spectroscopy combined with factor analysis method

    USDA-ARS?s Scientific Manuscript database

    Visible/near-infrared (Vis/NIR) spectroscopy over the 400-2500 nm wavelength range, combined with a factor analysis method, was tested to predict quality attributes of chicken breast fillets. Quality attributes, including color (L*, a*, b*), pH, and drip loss, were analyzed using factor analysis ...

  12. A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research

    ERIC Educational Resources Information Center

    Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.

    2014-01-01

    Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
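
    The general resampling idea is easy to sketch. The code below is an illustrative NumPy/scikit-learn version of bootstrap EFA (the article itself demonstrates SPSS syntax): resample cases with replacement, refit, align signs, and take percentile intervals for the loadings.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

def bootstrap_loadings(data, n_factors=2, n_boot=500, seed=0):
    """95% percentile intervals for factor loadings via the nonparametric
    bootstrap. Signs are aligned to the full-sample solution; a fuller
    implementation would also match factor order across resamples
    (e.g., via Procrustes rotation)."""
    rng = np.random.default_rng(seed)
    ref = FactorAnalysis(n_components=n_factors).fit(data).components_
    draws = []
    for _ in range(n_boot):
        sample = data[rng.integers(0, len(data), len(data))]
        load = FactorAnalysis(n_components=n_factors).fit(sample).components_
        signs = np.sign(np.sum(load * ref, axis=1, keepdims=True))
        draws.append(load * signs)
    return np.percentile(np.array(draws), [2.5, 97.5], axis=0)

# Demo on synthetic two-factor data (illustrative only)
rng = np.random.default_rng(1)
scores = rng.standard_normal((200, 2))
x = scores @ rng.uniform(0.4, 0.9, (2, 8)) + 0.6 * rng.standard_normal((200, 8))
lo, hi = bootstrap_loadings(x)
```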

  13. Methods for Improving Information from ’Undesigned’ Human Factors Experiments.

    DTIC Science & Technology

    Human factors engineering, Information processing, Regression analysis, Experimental design, Least squares method, Analysis of variance, Correlation techniques, Matrices (Mathematics), Multiple disciplines, Mathematical prediction

  14. Application of the Bootstrap Methods in Factor Analysis.

    ERIC Educational Resources Information Center

    Ichikawa, Masanori; Konishi, Sadanori

    1995-01-01

    A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal theory maximum likelihood factor analysis when the distributional assumption was satisfied or unsatisfied. Problems arising with the use of bootstrap methods are highlighted. (SLD)

  15. A Review of CEFA Software: Comprehensive Exploratory Factor Analysis Program

    ERIC Educational Resources Information Center

    Lee, Soon-Mook

    2010-01-01

    CEFA 3.02 (Browne, Cudeck, Tateneni, & Mels, 2008) is a factor analysis computer program designed to perform exploratory factor analysis. It provides the main properties that are needed for exploratory factor analysis, namely a variety of factoring methods employing eight different discrepancy functions to be minimized to yield initial…

  16. Proposal for a recovery prediction method for patients affected by acute mediastinitis

    PubMed Central

    2012-01-01

    Background An attempt to find a method for predicting death risk in patients affected by acute mediastinitis; no such tool for this serious disease is described in the available literature. Methods The study comprised 44 consecutive cases of acute mediastinitis. General anamnesis and biochemical data were included. Factor analysis was used to extract the risk characteristics of the patients. The most valuable results were obtained for 8 parameters, which were selected for further statistical analysis (all collected within a few hours of admission). Three factors reached an eigenvalue >1. The clinical interpretations of these combined statistical factors are: Factor 1, proteinic status (serum total protein, albumin, and hemoglobin level); Factor 2, inflammatory status (white blood cells, CRP, procalcitonin); and Factor 3, general risk (age, number of coexisting diseases). Threshold values of the prediction factors were estimated by means of statistical analysis (factor analysis, Statgraphics Centurion XVI). Results The final prediction for a patient is constructed as a simultaneous evaluation of all factor scores. A high probability of death should be predicted if the Factor 1 value decreases with a simultaneous increase of Factors 2 and 3. The diagnostic power of the proposed method was high overall [sensitivity = 90%, specificity = 64%]; for Factor 1 [SNC = 87%, SPC = 79%], for Factor 2 [SNC = 87%, SPC = 50%], and for Factor 3 [SNC = 73%, SPC = 71%]. Conclusion The proposed prediction method seems a useful emergency signal during the management of acute mediastinitis in affected patients. PMID:22574625

  17. Multiplication factor versus regression analysis in stature estimation from hand and foot dimensions.

    PubMed

    Krishan, Kewal; Kanchan, Tanuj; Sharma, Abhilasha

    2012-05-01

    Estimation of stature is an important parameter in the identification of human remains in forensic examinations. The present study aimed to compare the reliability and accuracy of stature estimation, and to demonstrate the variability between estimated stature and actual stature, using the multiplication factor and regression analysis methods. The study is based on a sample of 246 subjects (123 males and 123 females) from North India aged between 17 and 20 years. Four anthropometric measurements--hand length, hand breadth, foot length and foot breadth--taken on the left side of each subject were included in the study. Stature was measured using standard anthropometric techniques. Multiplication factors were calculated and linear regression models were derived for estimation of stature from hand and foot dimensions. The derived multiplication factors and regression formulae were applied to the hand and foot measurements in the study sample. The estimated stature from the multiplication factors and regression analysis was compared with the actual stature to find the error in estimated stature. The results indicate that the range of error in estimation of stature from the regression analysis method is less than that of the multiplication factor method, confirming that regression analysis is better than multiplication factor analysis in stature estimation.
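
    The contrast between the two estimators is easy to show numerically. The sketch below uses hypothetical data and coefficients, not the study's published values: the multiplication factor forces the fit through the origin, while regression also estimates an intercept, which is why its errors tend to be smaller.

```python
import numpy as np

rng = np.random.default_rng(0)
hand_length = rng.normal(18.5, 1.0, 100)                    # cm, hypothetical
stature = 85 + 4.5 * hand_length + rng.normal(0, 3, 100)    # cm, hypothetical

# Multiplication factor method: stature ~ MF * measurement (no intercept)
mf = np.mean(stature / hand_length)
est_mf = mf * hand_length

# Regression method: stature ~ a + b * measurement
b, a = np.polyfit(hand_length, stature, 1)
est_reg = a + b * hand_length

print(np.mean(np.abs(est_mf - stature)),    # larger mean absolute error
      np.mean(np.abs(est_reg - stature)))   # smaller, as the study concludes
```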

  18. A Note on Procrustean Rotation in Exploratory Factor Analysis: A Computer Intensive Approach to Goodness-of-Fit Evaluation.

    ERIC Educational Resources Information Center

    Raykov, Tenko; Little, Todd D.

    1999-01-01

    Describes a method for evaluating results of Procrustean rotation to a target factor pattern matrix in exploratory factor analysis. The approach, based on the bootstrap method, yields empirical approximations of the sampling distributions of: (1) differences between target elements and rotated factor pattern matrices; and (2) the overall…

  19. Application of Gray Relational Analysis Method in Comprehensive Evaluation on the Customer Satisfaction of Automobile 4S Enterprises

    NASA Astrophysics Data System (ADS)

    Cenglin, Yao

    For car sales enterprises (4S enterprises) to continuously boost sales and expand their customer base, an important method is to enhance customer satisfaction. The customer satisfaction of 4S enterprises depends on many factors. By using the grey relational analysis method, the various factors bearing on customer satisfaction can be combined in a single evaluation. Through vertical comparison, car sales enterprises can then find the specific factors that will improve customer satisfaction, and thereby increase sales volume and benefits. Grey relational analysis has become a good method and means of analyzing and evaluating such enterprises.
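
    For concreteness, here is a minimal sketch of the grey relational grade computation in Deng's standard formulation; the 4S-store factor scores are hypothetical and the code is not the author's.

```python
import numpy as np

def grey_relational_grades(x, rho=0.5):
    """x: rows = alternatives (stores), cols = satisfaction factors, normalized
    to [0, 1]. Grades are mean grey relational coefficients against the ideal
    (column-wise maximum) reference series; rho is the usual distinguishing
    coefficient."""
    x = np.asarray(x, dtype=float)
    delta = np.abs(x - x.max(axis=0))
    d_min, d_max = delta.min(), delta.max()
    coeff = (d_min + rho * d_max) / (delta + rho * d_max)
    return coeff.mean(axis=1)

# Hypothetical normalized scores: [service, price, after-sales, environment]
stores = [[0.9, 0.6, 0.8, 0.7],
          [0.7, 0.9, 0.6, 0.8],
          [0.5, 0.7, 0.9, 0.6]]
print(grey_relational_grades(stores))  # higher grade = closer to the ideal store
```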

  20. Factor Analysis via Components Analysis

    ERIC Educational Resources Information Center

    Bentler, Peter M.; de Leeuw, Jan

    2011-01-01

    When the factor analysis model holds, component loadings are linear combinations of factor loadings, and vice versa. This interrelation permits us to define new optimization criteria and estimation methods for exploratory factor analysis. Although this article is primarily conceptual in nature, an illustrative example and a small simulation show…

  21. On the stability analysis of approximate factorization methods for 3D Euler and Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1993-01-01

    The convergence characteristics of various approximate factorizations for the 3D Euler and Navier-Stokes equations are examined using the von Neumann stability analysis method. Three upwind-difference based factorizations and several central-difference based factorizations are considered for the Euler equations. In the upwind factorizations both the flux-vector splitting methods of Steger and Warming and van Leer are considered. Analysis of the Navier-Stokes equations is performed only on the Beam and Warming central-difference scheme. The range of CFL numbers over which each factorization is stable is presented for one-, two-, and three-dimensional flow. Also presented for each factorization is the CFL number at which the maximum eigenvalue is minimized, for all Fourier components, as well as for the high frequency range only. The latter is useful for predicting the effectiveness of multigrid procedures with these schemes as smoothers. Further, local mode analysis is performed to test the suitability of using a uniform flow field in the stability analysis. Some inconsistencies in the results from previous analyses are resolved.

  22. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis

    USDA-ARS?s Scientific Manuscript database

    Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...

  23. Adjusting for multiple prognostic factors in the analysis of randomised trials

    PubMed Central

    2013-01-01

    Background When multiple prognostic factors are adjusted for in the analysis of a randomised trial, it is unclear (1) whether it is necessary to account for each of the strata, formed by all combinations of the prognostic factors (stratified analysis), when randomisation has been balanced within each stratum (stratified randomisation), or whether adjusting for the main effects alone will suffice, and (2) the best method of adjustment in terms of type I error rate and power, irrespective of the randomisation method. Methods We used simulation to (1) determine if a stratified analysis is necessary after stratified randomisation, and (2) compare different methods of adjustment in terms of power and type I error rate. We considered the following methods of analysis: adjusting for covariates in a regression model, adjusting for each stratum using either fixed or random effects, and Mantel-Haenszel or a stratified Cox model depending on outcome. Results Stratified analysis is required after stratified randomisation to maintain correct type I error rates when (a) there are strong interactions between prognostic factors, and (b) there are approximately equal numbers of patients in each stratum. However, simulations based on real trial data found that type I error rates were unaffected by the method of analysis (stratified vs unstratified), indicating these conditions were not met in real datasets. Comparison of different analysis methods found that with small sample sizes and a binary or time-to-event outcome, most analysis methods lead to either inflated type I error rates or a reduction in power; the lone exception was a stratified analysis using random effects for strata, which gave nominal type I error rates and adequate power. Conclusions It is unlikely that a stratified analysis is necessary after stratified randomisation except in extreme scenarios. Therefore, the method of analysis (accounting for the strata, or adjusting only for the covariates) will not generally need to depend on the method of randomisation used. Most methods of analysis work well with large sample sizes; however, treating strata as random effects should be the analysis method of choice with binary or time-to-event outcomes and a small sample size. PMID:23898993

  24. The Relation between Factor Score Estimates, Image Scores, and Principal Component Scores

    ERIC Educational Resources Information Center

    Velicer, Wayne F.

    1976-01-01

    Investigates the relation between factor score estimates, principal component scores, and image scores. The three methods compared are maximum likelihood factor analysis, principal component analysis, and a variant of rescaled image analysis. (RC)

  25. Effective factors in providing holistic care: a qualitative study.

    PubMed

    Zamanzadeh, Vahid; Jasemi, Madineh; Valizadeh, Leila; Keogh, Brian; Taleghani, Fariba

    2015-01-01

    Holistic care is a comprehensive model of caring. Previous studies have shown that most nurses do not apply this method. Examining the effective factors in nurses' provision of holistic care can help with enhancing it. Studying these factors from the point of view of nurses will generate real and meaningful concepts and can help to extend this method of caring. A qualitative study was used to identify effective factors in holistic care provision. Data gathered by interviewing 14 nurses from university hospitals in Iran were analyzed with a conventional qualitative content analysis method, using MAXQDA (professional software for qualitative and mixed methods data analysis). Analysis of the data revealed three main themes as effective factors in providing holistic care: the structure of the educational system, the professional environment, and personality traits. Establishing appropriate educational and management systems, and promoting religiousness and encouragement, will induce nurses to provide holistic care and ultimately improve the quality of their caring.

  26. The Infinitesimal Jackknife with Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Preacher, Kristopher J.; Jennrich, Robert I.

    2012-01-01

    The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both…

  27. Analysis of Social Cohesion in Health Data by Factor Analysis Method: The Ghanaian Perspective

    ERIC Educational Resources Information Center

    Saeed, Bashiru I. I.; Xicang, Zhao; Musah, A. A. I.; Abdul-Aziz, A. R.; Yawson, Alfred; Karim, Azumah

    2013-01-01

    We investigated the overall social cohesion of Ghanaians. In this study, we considered the paramount interest of the involvement of Ghanaians in their communities, their views of other people and institutions, and their level of interest in both local and national politics. The factor analysis method was employed for analysis using R…

  28. Multiple Interacting Risk Factors: On Methods for Allocating Risk Factor Interactions.

    PubMed

    Price, Bertram; MacNicoll, Michael

    2015-05-01

    A persistent problem in health risk analysis where it is known that a disease may occur as a consequence of multiple risk factors with interactions is allocating the total risk of the disease among the individual risk factors. This problem, referred to here as risk apportionment, arises in various venues, including: (i) public health management, (ii) government programs for compensating injured individuals, and (iii) litigation. Two methods have been described in the risk analysis and epidemiology literature for allocating total risk among individual risk factors. One method uses weights to allocate interactions among the individual risk factors. The other method is based on risk accounting axioms and finding an optimal and unique allocation that satisfies the axioms using a procedure borrowed from game theory. Where relative risk or attributable risk is the risk measure, we find that the game-theory-determined allocation is the same as the allocation where risk factor interactions are apportioned to individual risk factors using equal weights. Therefore, the apportionment problem becomes one of selecting a meaningful set of weights for allocating interactions among the individual risk factors. Equal weights and weights proportional to the risks of the individual risk factors are discussed.
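
    For two risk factors on a relative-risk scale, the allocation the abstract describes reduces to splitting the interaction term with chosen weights; a small sketch under that reading, with hypothetical excess risks:

```python
def apportion_two_factors(r1, r2, r12, w1=0.5):
    """Allocate total excess risk r1 + r2 + r12 between two factors, splitting
    the interaction r12 with weights (w1, 1 - w1). With w1 = 0.5 this matches
    the game-theoretic (equal-weights) allocation described in the abstract;
    other weights give the alternative allocations it discusses."""
    return r1 + w1 * r12, r2 + (1 - w1) * r12

# Hypothetical excess relative risks: factor A = 4.0, factor B = 5.0,
# interaction = 41.0 (a strong synergy)
print(apportion_two_factors(4.0, 5.0, 41.0))   # (24.5, 25.5)
```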

  29. Effective Factors in Providing Holistic Care: A Qualitative Study

    PubMed Central

    Zamanzadeh, Vahid; Jasemi, Madineh; Valizadeh, Leila; Keogh, Brian; Taleghani, Fariba

    2015-01-01

    Background: Holistic care is a comprehensive model of caring. Previous studies have shown that most nurses do not apply this method. Examining the effective factors in nurses’ provision of holistic care can help with enhancing it. Studying these factors from the point of view of nurses will generate real and meaningful concepts and can help to extend this method of caring. Materials and Methods: A qualitative study was used to identify effective factors in holistic care provision. Data gathered by interviewing 14 nurses from university hospitals in Iran were analyzed with a conventional qualitative content analysis method, using MAXQDA (professional software for qualitative and mixed methods data analysis) software. Results: Analysis of the data revealed three main themes as effective factors in providing holistic care: the structure of the educational system, the professional environment, and personality traits. Conclusion: Establishing appropriate educational and management systems, and promoting religiousness and encouragement, will induce nurses to provide holistic care and ultimately improve the quality of their caring. PMID:26009677

  30. A Comparison of Imputation Methods for Bayesian Factor Analysis Models

    ERIC Educational Resources Information Center

    Merkle, Edgar C.

    2011-01-01

    Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amenable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…

  31. Analyzing the Validity of the Adult-Adolescent Parenting Inventory for Low-Income Populations

    ERIC Educational Resources Information Center

    Lawson, Michael A.; Alameda-Lawson, Tania; Byrnes, Edward

    2017-01-01

    Objectives: The purpose of this study was to examine the construct and predictive validity of the Adult-Adolescent Parenting Inventory (AAPI-2). Methods: The validity of the AAPI-2 was evaluated using multiple statistical methods, including exploratory factor analysis, confirmatory factor analysis, and latent class analysis. These analyses were…

  32. Parameter Accuracy in Meta-Analyses of Factor Structures

    ERIC Educational Resources Information Center

    Gnambs, Timo; Staufenbiel, Thomas

    2016-01-01

    Two new methods for the meta-analysis of factor loadings are introduced and evaluated by Monte Carlo simulations. The direct method pools each factor loading individually, whereas the indirect method synthesizes correlation matrices reproduced from factor loadings. The results of the two simulations demonstrated that the accuracy of…

  33. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain.

    PubMed

    Frolov, Alexander A; Húsek, Dušan; Polyakov, Pavel Yu

    2016-03-01

    A common task in large data set analysis is searching for an appropriate data representation in a space of fewer dimensions. One of the most efficient methods to solve this task is factor analysis. In this paper, we compare seven methods for Boolean factor analysis (BFA) in solving the so-called bars problem (BP), which is a BFA benchmark. The performance of the methods is evaluated by means of information gain. Study of the results obtained in solving BP at different levels of complexity has allowed us to reveal strengths and weaknesses of these methods. It is shown that the Likelihood maximization Attractor Neural Network with Increasing Activity (LANNIA) is the most efficient BFA method in solving BP in many cases. The efficacy of the LANNIA method is also shown when applied to real data from the Kyoto Encyclopedia of Genes and Genomes database, which contains full genome sequencing for 1368 organisms, and to the text data set R52 (from Reuters 21578), typically used for label categorization.

  34. A Comparison of Measurement Equivalence Methods Based on Confirmatory Factor Analysis and Item Response Theory.

    ERIC Educational Resources Information Center

    Flowers, Claudia P.; Raju, Nambury S.; Oshima, T. C.

    Current interest in the assessment of measurement equivalence emphasizes two methods of analysis, linear and nonlinear procedures. This study simulated data using the graded response model to examine the performance of linear (confirmatory factor analysis, or CFA) and nonlinear (item-response-theory-based differential item functioning, or IRT-Based…

  35. A Comparison of Component and Factor Patterns: A Monte Carlo Approach.

    ERIC Educational Resources Information Center

    Velicer, Wayne F.; And Others

    1982-01-01

    Factor analysis, image analysis, and principal component analysis are compared with respect to the factor patterns they would produce under various conditions. The general conclusion that is reached is that the three methods produce results that are equivalent. (Author/JKS)

  36. How Factor Analysis Can Be Used in Classification.

    ERIC Educational Resources Information Center

    Harman, Harry H.

    This is a methodological study that suggests a taxometric technique for objective classification of yeasts. It makes use of the minres method of factor analysis and groups strains of yeast according to their factor profiles. The similarities are judged in the higher-dimensional space determined by the factor analysis, but otherwise rely on the…

  37. The Hull Method for Selecting the Number of Common Factors

    ERIC Educational Resources Information Center

    Lorenzo-Seva, Urbano; Timmerman, Marieke E.; Kiers, Henk A. L.

    2011-01-01

    A common problem in exploratory factor analysis is how many factors need to be extracted from a particular data set. We propose a new method for selecting the number of major common factors: the Hull method, which aims to find a model with an optimal balance between model fit and number of parameters. We examine the performance of the method in an…

  38. Evaluation of Parallel Analysis Methods for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.

    2010-01-01

    Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…

  39. Identification of atmospheric organic sources using the carbon hollow tube-gas chromatography method and factor analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cobb, G.P.; Braman, R.S.; Gilbert, R.A.

    Atmospheric organics were sampled and analyzed by using the carbon hollow tube-gas chromatography method. Chromatograms from spice mixtures, cigarettes, and ambient air were analyzed. Principal factor analysis of row-order chromatographic data produces factors which are eigenchromatograms of the components in the samples. Component sources are identified from the eigenchromatograms in all experiments, and the individual eigenchromatogram corresponding to a particular source is determined in most cases. Organic sources in ambient air and in cigarettes are identified with 87% certainty. Analysis of clove cigarettes allows the determination of the relative amount of clove in different cigarettes. A new nondestructive quality control method using the hollow tube-gas chromatography analysis is discussed.

  40. Bayesian Factor Analysis When Only a Sample Covariance Matrix Is Available

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Arav, Marina

    2006-01-01

    In traditional factor analysis, the variance-covariance matrix or the correlation matrix has often been a form of inputting data. In contrast, in Bayesian factor analysis, the entire data set is typically required to compute the posterior estimates, such as Bayes factor loadings and Bayes unique variances. We propose a simple method for computing…

  41. A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.

    2018-02-01

    Oil pipeline networks are among the most important facilities of energy transportation, but their accidents may result in serious disasters. Analysis models for these accidents have been established mainly on the basis of three methods: event trees, accident simulation, and Bayesian networks. Among these, the Bayesian network is suitable for probabilistic analysis, but not all the important influencing factors have been considered, and no deployment rule for the factors has been established. This paper proposes a probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including the key environmental conditions and emergency response, are considered in this model. Moreover, the paper also introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.

  42. A Comparison of Rule-based Analysis with Regression Methods in Understanding the Risk Factors for Study Withdrawal in a Pediatric Study.

    PubMed

    Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F; Vehik, Kendra; Huang, Shuai

    2016-08-26

    Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions.
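
    The contrast can be sketched generically: a regression reports average main effects, while a rule learner reports the ranges and interactions that define subgroups. Below, a shallow decision tree stands in for the (here unspecified) rule-based method, on invented withdrawal-risk data; none of this is TEDDY code or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Hypothetical factors: maternal age (years), clinic distance (km), first-born flag
X = np.column_stack([rng.normal(30, 6, 500),
                     rng.gamma(2.0, 20.0, 500),
                     rng.integers(0, 2, 500)])
# Withdrawal risk concentrated in one subgroup: young mothers living far away
p = 0.1 + 0.5 * ((X[:, 0] < 25) & (X[:, 1] > 40))
y = rng.random(500) < p

# Regression view: one averaged coefficient per risk factor
Xs = StandardScaler().fit_transform(X)
print(LogisticRegression().fit(Xs, y).coef_)

# Rule-based view: explicit ranges and their interaction define the subgroup
tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=25).fit(X, y)
print(export_text(tree, feature_names=["maternal_age", "distance_km", "first_born"]))
```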

  43. Carrageenan: the difference between PNG and KCl gel precipitation methods as Lactobacillus acidophilus encapsulation material

    NASA Astrophysics Data System (ADS)

    Setijawati, D.; Nursyam, H.; Salis, H.

    2018-04-01

    A study of the effects of material type and preparation method on the viability of Lactobacillus acidophilus microcapsules was carried out, using an experimental laboratory design. The first factor was the kind of material (A), with sub-factors A1 = Eucheuma cottonii, A2 = Eucheuma spinosum, and A3 = a 1:1 mixture of Eucheuma cottonii and Eucheuma spinosum; the second factor was the extraction method used to produce the carrageenan (B), with sub-factors B1 = modified Philippine Natural Grade and B2 = KCl gel press precipitation. Differences were analyzed by analysis of variance followed by Fisher's test, using Minitab 16. The results show that the material and extraction-method factors had significantly different effects on the viability of Lactobacillus acidophilus. The highest mean viability, 7.14 log (CFU/mL), was obtained with the mixture of Eucheuma cottonii and Eucheuma spinosum and the KCl gel press method. The use of a kappa-iota carrageenan mixture as the encapsulation material, with the KCl gel press method, is therefore suggested for the Lactobacillus acidophilus microencapsulation process, because this treatment gave the highest average viability.

  44. Scale-Free Nonparametric Factor Analysis: A User-Friendly Introduction with Concrete Heuristic Examples.

    ERIC Educational Resources Information Center

    Mittag, Kathleen Cage

    Most researchers using factor analysis extract factors from a matrix of Pearson product-moment correlation coefficients. A method is presented for extracting factors in a non-parametric way, by extracting factors from a matrix of Spearman rho (rank correlation) coefficients. It is possible to factor analyze a matrix of association such that…

  45. Strategic Analysis and Plan for Implementing Telemedicine at Fort Greely

    DTIC Science & Technology

    2003-03-01

    The Situational Analysis tool assessed the environmental, market, and organizational factors involved in a Fort Greely telemedicine... Factors): Medicaid reimbursement is now approved for Alaska regardless of method of healthcare delivery. Market Factors (Customers): The influx of...arrive are Active National Guardsmen and their families. Market Factors (Services): Fairbanks Memorial Hospital (FMH) can

  46. Toward Reflective Judgment in Exploratory Factor Analysis Decisions: Determining the Extraction Method and Number of Factors To Retain.

    ERIC Educational Resources Information Center

    Knight, Jennifer L.

    This paper considers some decisions that must be made by the researcher conducting an exploratory factor analysis. The primary purpose is to aid the researcher in making informed decisions during the factor analysis instead of relying on defaults in statistical programs or traditions of previous researchers. Three decision areas are addressed.…

  47. Cross-Cultural Adaptation and Validation of the MPAM-R to Brazilian Portuguese and Proposal of a New Method to Calculate Factor Scores

    PubMed Central

    Albuquerque, Maicon R.; Lopes, Mariana C.; de Paula, Jonas J.; Faria, Larissa O.; Pereira, Eveline T.; da Costa, Varley T.

    2017-01-01

    In order to understand the reasons that lead individuals to practice physical activity, researchers developed the Motives for Physical Activity Measure-Revised (MPAM-R) scale. In 2010, a translation of the MPAM-R to Portuguese and its validation were performed; however, its psychometric measures were not acceptable. In addition, factor scores in some sports psychology scales are calculated as the mean of the scores on the items of the factor. Nevertheless, it seems appropriate that items with higher factor loadings, extracted by factor analysis, should have greater weight in the factor score, and items with lower factor loadings less weight. The aims of the present study were to translate and validate a Portuguese version of the MPAM-R and to investigate the agreement between two methods used to calculate factor scores. Three hundred volunteers who had been involved in physical activity programs for at least 6 months were recruited. Confirmatory factor analysis of the 30 items indicated that the version did not fit the model. After excluding four items, the final model with 26 items showed acceptable model fit measures by exploratory factor analysis, and it conceptually supports the five factors of the original proposal. When the two methods of calculating factor scores were compared, our results showed that only the "Enjoyment" and "Appearance" factors showed agreement between methods. The Portuguese version of the MPAM-R can therefore be used in a Brazilian context, and the new proposal for calculating factor scores seems promising. PMID:28293203

  48. Estimation of the behavior factor of existing RC-MRF buildings

    NASA Astrophysics Data System (ADS)

    Vona, Marco; Mastroberti, Monica

    2018-01-01

    In recent years, several research groups have studied a new generation of analysis methods for the seismic response assessment of existing buildings. Nevertheless, many important developments are still needed in order to define more reliable and effective assessment procedures. Moreover, regarding existing buildings, it should be highlighted that, due to the low knowledge level, linear elastic analysis is often the only analysis method allowed. Codes such as NTC2008 and EC8 consider linear dynamic analysis with a behavior factor as the reference method for the evaluation of seismic demand. This type of analysis is based on a linear-elastic structural model subjected to a design spectrum, obtained by reducing the elastic spectrum through a behavior factor. The behavior factor (reduction factor, or q factor in some codes) is used to reduce the elastic spectrum ordinates or the forces obtained from a linear analysis in order to take the nonlinear structural capacities into account. Behavior factors should be defined based on the several parameters that influence the seismic nonlinear capacity, such as the mechanical characteristics of materials, the structural system, irregularity, and design procedures. In practical applications, there is still an evident lack of detailed rules and of accurate behavior factor values adequate for existing buildings. In this work, investigations of the seismic capacity of the main existing RC-MRF building types have been carried out. In order to make a correct evaluation of the seismic force demand, behavior factor values coherent with a force-based seismic safety assessment procedure have been proposed and compared with the values reported in the Italian seismic code, NTC2008.

  49. Assessing Management Support for Worksite Health Promotion: Psychometric Analysis of the Leading by Example (LBE) Instrument

    PubMed Central

    Della, Lindsay J.; DeJoy, David M.; Goetzel, Ron Z.; Ozminkowski, Ronald J.; Wilson, Mark G.

    2009-01-01

    Objective This paper describes the development of the Leading by Example (LBE) instrument. Methods Exploratory factor analysis was used to obtain an initial factor structure. Factor validity was evaluated using confirmatory factor analysis methods. Cronbach’s alpha and item-total correlations provided information on the reliability of the factor subscales. Results Four subscales were identified: business alignment with health promotion objectives; awareness of the health-productivity link; worksite support for health promotion; and leadership support for health promotion. Factor-by-group comparisons revealed that the initial factor structure is effective in detecting differences in organizational support for health promotion across different employee groups. Conclusions Management support for health promotion can be assessed using the LBE, a brief, self-report questionnaire. Researchers can use the LBE to diagnose, track, and evaluate worksite health promotion programs. PMID:18517097

  50. Quantitative Methods for Analysing Joint Questionnaire Data: Exploring the Role of Joint in Force Design

    DTIC Science & Technology

    2015-08-01

    ...the nine questions. The Statistical Package for the Social Sciences (SPSS) [11] was used to conduct statistical analysis on the sample. Two types...constructs. SPSS was again used to conduct statistical analysis on the sample. This time factor analysis was conducted. Factor analysis attempts to... [11] IBM SPSS Statistics (2012). [12] Burns, R.B., Burns, R.A. (2008) 'Business Research Methods and Statistics using SPSS', p. 432.

  51. An Analysis of Turnover Intentions: A Reexamination of Air Force Civil Engineering Company Grade Officers

    DTIC Science & Technology

    2012-03-01

    Appendix C gives factor analyses of the measurement items (interrole conflict, POS, and tempo scales): KMO and Bartlett's tests (e.g., KMO measure of sampling adequacy = .733, with Bartlett's test of sphericity approximate chi-square), extraction by principal component analysis, and varimax rotation with Kaiser normalization.

  52. Analysis of algae growth mechanism and water bloom prediction under the effect of multi-affecting factor.

    PubMed

    Wang, Li; Wang, Xiaoyi; Jin, Xuebo; Xu, Jiping; Zhang, Huiyan; Yu, Jiabin; Sun, Qian; Gao, Chong; Wang, Lingbin

    2017-03-01

    The formation process of algae is described inaccurately, and water blooms are predicted with low precision, by current methods. In this paper, the chemical mechanism of algae growth is analyzed, and a correlation analysis of chlorophyll-a and algal density is conducted by chemical measurement. Taking into account the influence of multiple factors on algae growth and water blooms, a comprehensive prediction method combining multivariate time series analysis with intelligent models is put forward. First, through the process of photosynthesis, the main factors that affect the reproduction of algae are analyzed. A compensation prediction method for multivariate time series analysis, based on a neural network and a Support Vector Machine, is put forward, combined with Kernel Principal Component Analysis to reduce the dimension of the factors influencing blooms. A Genetic Algorithm is then applied to improve the generalization ability of the BP network and the Least Squares Support Vector Machine. Experimental results show that this method can better compensate the multivariate time series prediction model and is an effective way to improve the description accuracy of algae growth and the prediction precision of water blooms.
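
    The dimension-reduction-plus-learner pipeline described here can be sketched with scikit-learn stand-ins: KernelPCA for the kernel principal component step and SVR in place of the GA-tuned BP network and least squares SVM. The data, factor names, and hyperparameters below are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# Hypothetical daily records of bloom factors (temperature, illumination, pH,
# dissolved oxygen, N, P) and a nonlinear algal-density response
X = rng.random((200, 6))
y = np.sin(3 * X[:, 0]) + X[:, 4] * X[:, 5] + 0.1 * rng.standard_normal(200)

model = make_pipeline(
    StandardScaler(),
    KernelPCA(n_components=3, kernel="rbf"),  # reduce the correlated factors
    SVR(C=10.0),                              # nonlinear predictor (stand-in)
)
model.fit(X[:150], y[:150])
print(model.score(X[150:], y[150:]))          # R^2 on held-out days
```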

  53. Combined target factor analysis and Bayesian soft-classification of interference-contaminated samples: forensic fire debris analysis.

    PubMed

    Williams, Mary R; Sigman, Michael E; Lewis, Jennifer; Pitan, Kelly McHugh

    2012-10-10

    A Bayesian soft classification method combined with target factor analysis (TFA) is described and tested for the analysis of fire debris data. The method relies on analysis of the average mass spectrum across the chromatographic profile (i.e., the total ion spectrum, TIS) from multiple samples taken from a single fire scene. A library of TIS from reference ignitable liquids with assigned ASTM classifications is used as the target factors in TFA. The class-conditional distributions of correlations between the target and predicted factors for each ASTM class are represented by kernel functions and analyzed by Bayesian decision theory. The soft classification approach assists in assessing the probability that ignitable liquid residue from a specific ASTM E1618 class is present in a set of samples from a single fire scene, even in the presence of unspecified background contributions from pyrolysis products. The method is demonstrated with sample data sets and then tested on laboratory-scale burn data and large-scale field test burns. The overall performance achieved in laboratory and field tests of the method is approximately 80% correct classification of fire debris samples.
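
    The target-testing step at the heart of TFA is compact enough to sketch; the Bayesian kernel-density layer is omitted, and the function below is ours, not the authors' code.

```python
import numpy as np

def target_factor_correlation(tis_matrix, target_tis, n_factors=5):
    """Project a reference total ion spectrum (target) onto the abstract factor
    space of the fire-scene TIS matrix (rows = samples, cols = m/z channels)
    and return the correlation between the target and its TFA prediction."""
    data = np.asarray(tis_matrix, dtype=float)
    _, _, vt = np.linalg.svd(data, full_matrices=False)  # abstract factors
    basis = vt[:n_factors]                               # row-space basis
    predicted = basis.T @ (basis @ target_tis)           # least-squares projection
    return float(np.corrcoef(target_tis, predicted)[0, 1])
```

    A high correlation indicates the reference liquid's spectrum lies close to the factor space spanned by the scene samples; the abstract's contribution is turning such correlations into class probabilities.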

  54. The Empirical Verification of an Assignment of Items to Subtests: The Oblique Multiple Group Method versus the Confirmatory Common Factor Method

    ERIC Educational Resources Information Center

    Stuive, Ilse; Kiers, Henk A. L.; Timmerman, Marieke E.; ten Berge, Jos M. F.

    2008-01-01

    This study compares two confirmatory factor analysis methods on their ability to verify whether correct assignments of items to subtests are supported by the data. The confirmatory common factor (CCF) method is used most often and defines nonzero loadings so that they correspond to the assignment of items to subtests. Another method is the oblique…

  55. Method for factor analysis of GC/MS data

    DOEpatents

    Van Benthem, Mark H; Kotula, Paul G; Keenan, Michael R

    2012-09-11

    The method of the present invention provides a fast, robust, and automated multivariate statistical analysis of gas chromatography/mass spectroscopy (GC/MS) data sets. The method can involve systematic elimination of undesired, saturated peak masses to yield data that follow a linear, additive model. The cleaned data can then be subjected to a combination of PCA and orthogonal factor rotation followed by refinement with MCR-ALS to yield highly interpretable results.

  56. Affective Outcomes of Schooling: Full-Information Item Factor Analysis of a Student Questionnaire.

    ERIC Educational Resources Information Center

    Muraki, Eiji; Engelhard, George, Jr.

    Recent developments in dichotomous factor analysis based on multidimensional item response models (Bock and Aitkin, 1981; Muthen, 1978) provide an effective method for exploring the dimensionality of questionnaire items. Implemented in the TESTFACT program, this "full information" item factor analysis accounts not only for the pairwise joint…

  57. Item Factor Analysis: Current Approaches and Future Directions

    ERIC Educational Resources Information Center

    Wirth, R. J.; Edwards, Michael C.

    2007-01-01

    The rationale underlying factor analysis applies to continuous and categorical variables alike; however, the models and estimation methods for continuous (i.e., interval or ratio scale) data are not appropriate for item-level data that are categorical in nature. The authors provide a targeted review and synthesis of the item factor analysis (IFA)…

  58. Q-Type Factor Analysis of Healthy Aged Men.

    ERIC Educational Resources Information Center

    Kleban, Morton H.

    Q-type factor analysis was used to re-analyze baseline data collected in 1957, on 47 men aged 65-91. Q-type analysis is the use of factor methods to study persons rather than tests. Although 550 variables were originally studied involving psychiatry, medicine, cerebral metabolism and chemistry, personality, audiometry, dichotic and diotic memory,…

  59. Replace-approximation method for ambiguous solutions in factor analysis of ultrasonic hepatic perfusion

    NASA Astrophysics Data System (ADS)

    Zhang, Ji; Ding, Mingyue; Yuchi, Ming; Hou, Wenguang; Ye, Huashan; Qiu, Wu

    2010-03-01

    Factor analysis is an efficient technique for the analysis of dynamic structures in medical image sequences and has recently been used in contrast-enhanced ultrasound (CEUS) of hepatic perfusion. Time-intensity curves (TICs) extracted by factor analysis can provide much more diagnostic information for radiologists and improve the diagnostic rate of focal liver lesions (FLLs). However, one of the major drawbacks of factor analysis of dynamic structures (FADS) is the nonuniqueness of the result when only the non-negativity criterion is used. In this paper, we propose a new replace-approximation method, based on apex-seeking, for ambiguous FADS solutions. Due to the partial overlap of different structures, factor curves are assumed to be approximately replaceable by curves existing in the medical image sequences. Therefore, how to find the optimal curves is the key point of the technique. No matter how many structures are assumed, our method always starts seeking apexes from the one-dimensional space onto which the original high-dimensional data are mapped. By finding two stable apexes in one-dimensional space, the method can ascertain the third one. The process can be continued until all structures are found. The technique was tested on two blood perfusion phantoms and compared to two variants of the apex-seeking method. The results showed that the technique outperformed the two variants in comparisons of region-of-interest measurements from the phantom data. It can be applied to the estimation of TICs derived from CEUS images and the separation of different physiological regions in hepatic perfusion.

  60. On Bi-Grid Local Mode Analysis of Solution Techniques for 3-D Euler and Navier-Stokes Equations

    NASA Technical Reports Server (NTRS)

    Ibraheem, S. O.; Demuren, A. O.

    1994-01-01

    A procedure is presented for utilizing a bi-grid stability analysis as a practical tool for predicting multigrid performance in a range of numerical methods for solving Euler and Navier-Stokes equations. Model problems based on the convection, diffusion and Burger's equation are used to illustrate the superiority of the bi-grid analysis as a predictive tool for multigrid performance in comparison to the smoothing factor derived from conventional von Neumann analysis. For the Euler equations, bi-grid analysis is presented for three upwind difference based factorizations, namely Spatial, Eigenvalue and Combination splits, and two central difference based factorizations, namely LU and ADI methods. In the former, both the Steger-Warming and van Leer flux-vector splitting methods are considered. For the Navier-Stokes equations, only the Beam-Warming (ADI) central difference scheme is considered. In each case, estimates of multigrid convergence rates from the bi-grid analysis are compared to smoothing factors obtained from single-grid stability analysis. Effects of grid aspect ratio and flow skewness are examined. Both predictions are compared with practical multigrid convergence rates for 2-D Euler and Navier-Stokes solutions based on the Beam-Warming central scheme.

  61. Analysis of the influencing factors of global energy interconnection development

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; He, Yongxiu; Ge, Sifan; Liu, Lin

    2018-04-01

    Against the background of building a global energy interconnection and achieving green and low-carbon development, and in view of a new round of energy restructuring and ongoing change in energy technology, this paper establishes an index system for the factors influencing global energy interconnection development, based on the present state of that development globally and in China. Subjective and objective weights for the influencing factors were derived separately, by network-level analysis and by the entropy method, and then combined by additive integration to give each factor's comprehensive weight and a ranking of its influence.
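
    As an illustration of the weighting scheme described above, the following sketch computes entropy-method (objective) weights and combines them with given subjective weights by additive integration; the data matrix, the subjective weights, and the coefficient alpha are all hypothetical placeholders.

    ```python
    import numpy as np

    # Rows = alternatives (e.g., scenarios/years), columns = influencing
    # factors. Hypothetical positive indicator data; real inputs would come
    # from the paper's index system.
    X = np.array([[0.8, 120.0, 3.2],
                  [0.6, 150.0, 2.9],
                  [0.9,  90.0, 3.8],
                  [0.7, 110.0, 3.1]])

    # 1) Entropy method (objective weights).
    P = X / X.sum(axis=0)                    # column-wise proportions
    k = 1.0 / np.log(X.shape[0])
    E = -k * (P * np.log(P)).sum(axis=0)     # entropy of each factor
    w_obj = (1.0 - E) / (1.0 - E).sum()      # more dispersion -> more weight

    # 2) Subjective weights, e.g. from a network-level (ANP-style) analysis
    # (assumed given here).
    w_subj = np.array([0.5, 0.2, 0.3])

    # 3) Additive integration with preference coefficient alpha in [0, 1].
    alpha = 0.5
    w = alpha * w_subj + (1.0 - alpha) * w_obj
    print(w, w.sum())
    ```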

  2. Selecting risk factors: a comparison of discriminant analysis, logistic regression and Cox's regression model using data from the Tromsø Heart Study.

    PubMed

    Brenn, T; Arnesen, E

    1985-01-01

    For comparative evaluation, discriminant analysis, logistic regression and Cox's model were used to select risk factors for total and coronary deaths among 6595 men aged 20-49 followed for 9 years. Groups with mortality between 5 and 93 per 1000 were considered. Discriminant analysis selected variable sets only marginally different from the logistic and Cox methods which always selected the same sets. A time-saving option, offered for both the logistic and Cox selection, showed no advantage compared with discriminant analysis. Analysing more than 3800 subjects, the logistic and Cox methods consumed, respectively, 80 and 10 times more computer time than discriminant analysis. When including the same set of variables in non-stepwise analyses, all methods estimated coefficients that in most cases were almost identical. In conclusion, discriminant analysis is advocated for preliminary or stepwise analysis, otherwise Cox's method should be used.
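
    A minimal sketch of the comparison, assuming the lifelines and statsmodels packages and wholly synthetic follow-up data: fit Cox's model to the time-to-event outcome and logistic regression to the binary end-of-follow-up outcome with the same covariates, then compare coefficients.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from lifelines import CoxPHFitter  # assumed available

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "age": rng.uniform(20, 49, n),
        "chol": rng.normal(6.0, 1.2, n),
        "sbp": rng.normal(135.0, 18.0, n),
    })
    # Synthetic survival times whose hazard rises with each risk factor,
    # with administrative censoring at 9 years of follow-up.
    risk = 0.03 * df["age"] + 0.4 * df["chol"] + 0.02 * df["sbp"]
    t = rng.exponential(9.0 / np.exp(risk - risk.mean()))
    df["time"] = np.minimum(t, 9.0)
    df["dead"] = (t <= 9.0).astype(int)

    # Cox proportional-hazards model on the censored time-to-event data.
    cph = CoxPHFitter().fit(df, duration_col="time", event_col="dead")

    # Logistic regression on the binary died-within-follow-up outcome.
    logit = sm.Logit(df["dead"],
                     sm.add_constant(df[["age", "chol", "sbp"]])).fit(disp=0)

    print(cph.params_)   # log hazard ratios
    print(logit.params)  # log odds ratios; the study found similar patterns
    ```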

  3. The Effect of Missing Data Handling Methods on Goodness of Fit Indices in Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Köse, Alper

    2014-01-01

    The primary objective of this study was to examine the effect of missing data on goodness-of-fit statistics in confirmatory factor analysis (CFA). To this end, four missing data handling methods, namely listwise deletion, full information maximum likelihood, regression imputation and expectation-maximization (EM) imputation, were examined in terms of…

  4. A projection operator method for the analysis of magnetic neutron form factors

    NASA Astrophysics Data System (ADS)

    Kaprzyk, S.; Van Laar, B.; Maniawski, F.

    1981-03-01

    A set of projection operators in matrix form has been derived on the basis of a decomposition of the spin density into a series of fully symmetrized cubic harmonics. This set of projection operators allows the Fourier analysis of magnetic form factors to be formulated in a convenient way, and it makes it possible to check the validity of the various theoretical models that have so far been used for spin-density analysis. The general formalism is worked out in explicit form for the fcc and bcc structures and deals with the part of the spin density contained within the sphere inscribed in the Wigner-Seitz cell. The projection operator method has been tested on the magnetic form factors of nickel and iron.

  5. Factor Analysis Methods and Validity Evidence: A Systematic Review of Instrument Development across the Continuum of Medical Education

    ERIC Educational Resources Information Center

    Wetzel, Angela Payne

    2011-01-01

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across…

  6. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    PubMed

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents an important new approach to improving the timeliness of total petroleum hydrocarbon (TPH) analysis in soil by gas chromatography - flame ionization detection (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method to the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to differing gas chromatography (GC) conditions and other steps involved in the method, as well as to soil properties. In addition, there are differences in the interpretation of the GC results, which affects the determination of remediation effectiveness at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate an analytical method for faster quantitative analysis of TPH in contaminated soil. A fractional factorial design (fFD) was used to screen six factors and identify those with the greatest impact on the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compared favourably with the experimental results, with a difference of 2.78% in analysis time. This work reduced the method's standard analytical time from 20 to 8 min, with all carbon fractions eluting, and the method was successfully applied to fast TPH analysis of Bunker C oil-contaminated soil. The reduced analytical time offers many benefits, including improved laboratory reporting times and overall improved clean-up efficiency. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
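
    The two-stage design strategy can be sketched as follows, assuming the pyDOE2 package; the factor labels, levels, and design choices here are illustrative stand-ins for the paper's actual settings.

    ```python
    import numpy as np
    from pyDOE2 import fracfact, ccdesign  # assumed available (pip install pyDOE2)

    # Stage 1: 2^(6-2) fractional factorial to screen the six GC factors
    # (injection volume, injection temp, oven ramp, detector temp,
    # carrier flow, solvent ratio), coded at -1/+1 levels.
    screen = fracfact("a b c d ab ac")
    print(screen.shape)  # 16 runs instead of the 64 of a full factorial

    # Stage 2: central composite design on the two significant factors,
    # carrier gas flow rate and oven temperature program.
    ccd = ccdesign(2, center=(2, 2), face="circumscribed")

    # Map coded levels to hypothetical physical ranges, e.g. flow 1-3 mL/min
    # and ramp 10-30 degC/min; axial points fall outside the box by design.
    lo, hi = np.array([1.0, 10.0]), np.array([3.0, 30.0])
    runs = (lo + hi) / 2 + ccd * (hi - lo) / 2
    print(np.round(runs, 2))
    ```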

  7. A Study of Algorithms for Covariance Structure Analysis with Specific Comparisons Using Factor Analysis.

    ERIC Educational Resources Information Center

    Lee, S. Y.; Jennrich, R. I.

    1979-01-01

    A variety of algorithms for analyzing covariance structures are considered. In addition, two methods of estimation, maximum likelihood and weighted least squares, are considered. Comparisons among these algorithms are made using factor analysis. (Author/JKS)

  8. Hybrid PV/diesel solar power system design using multi-level factor analysis optimization

    NASA Astrophysics Data System (ADS)

    Drake, Joshua P.

    Solar power systems represent a large area of interest across a spectrum of organizations at a global level. It was determined that a clear understanding of current state-of-the-art software and design methods, as well as optimization methods, could be used to improve the design methodology. The solar power design literature was reviewed for an in-depth understanding of solar power system design methods and algorithms. Multiple software packages for the design and optimization of solar power systems were analyzed for a critical understanding of their design workflow. In addition, several methods of optimization were studied, including brute force, Pareto analysis, Monte Carlo, linear and nonlinear programming, and multi-way factor analysis. Factor analysis was selected as the most efficient optimization method for engineering design as applied to solar power system design. The solar power design algorithms, software workflow analysis, and factor analysis optimization were combined to develop a solar power system design optimization software package called FireDrake. This software was used for the design of multiple solar power systems in conjunction with an energy audit case study performed in seven Tibetan refugee camps located in Mainpat, India. A report of solar system designs for the camps, as well as a proposed schedule for future installations, was generated. It was determined that several improvements could be made to the state of the art in modern solar power system design, though the complexity of current applications is significant.

  9. Factor analysis of an instrument to measure the impact of disease on daily life.

    PubMed

    Pedrosa, Rafaela Batista Dos Santos; Rodrigues, Roberta Cunha Matheus; Padilha, Kátia Melissa; Gallani, Maria Cecília Bueno Jayme; Alexandre, Neusa Maria Costa

    2016-01-01

    To verify the factor structure of an instrument measuring the Heart Valve Disease Impact on Daily Life (IDCV) when applied to coronary artery disease patients, the study included 153 coronary artery disease patients undergoing outpatient follow-up care. The IDCV factor structure was initially assessed by means of confirmatory factor analysis and, subsequently, by exploratory factor analysis. The varimax rotation method was used to estimate the main components of the analysis, with eigenvalues greater than one for extraction of factors and factor loadings greater than 0.40 for selection of items. Internal consistency was estimated using Cronbach's alpha coefficient. Confirmatory factor analysis did not confirm the original factor structure of the IDCV. Exploratory factor analysis showed three dimensions, which together explained 78% of the measurement variance. Future studies with expanded case selection are necessary to confirm the new IDCV factor structure.
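
    The extraction and selection rules used above (eigenvalues greater than one, varimax rotation, loadings above 0.40, Cronbach's alpha) can be illustrated with the factor_analyzer package on placeholder data:

    ```python
    import numpy as np
    import pandas as pd
    from factor_analyzer import FactorAnalyzer  # assumed available

    def cronbach_alpha(items: pd.DataFrame) -> float:
        k = items.shape[1]
        return k / (k - 1) * (1 - items.var(ddof=1).sum()
                              / items.sum(axis=1).var(ddof=1))

    # Placeholder data standing in for 153 patients x 14 questionnaire items.
    X = pd.DataFrame(np.random.default_rng(1).normal(size=(153, 14)))

    # Kaiser criterion: retain factors with eigenvalues greater than one.
    ev, _ = FactorAnalyzer(rotation=None).fit(X).get_eigenvalues()
    n_factors = int((ev > 1).sum())

    # Principal-components extraction with varimax rotation.
    fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax",
                        method="principal")
    fa.fit(X)
    loadings = pd.DataFrame(fa.loadings_, index=X.columns)

    # Keep items loading above 0.40 and check internal consistency per factor.
    for f in range(n_factors):
        items = loadings.index[loadings[f].abs() > 0.40]
        if len(items) > 1:
            print(f, list(items), round(cronbach_alpha(X[items]), 2))
    ```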

  10. More efficient parameter estimates for factor analysis of ordinal variables by ridge generalized least squares.

    PubMed

    Yuan, Ke-Hai; Jiang, Ge; Cheng, Ying

    2017-11-01

    Data in psychology are often collected using Likert-type scales, and it has been shown that factor analysis of Likert-type data is better performed on the polychoric correlation matrix than on the product-moment covariance matrix, especially when the distributions of the observed variables are skewed. In theory, factor analysis of the polychoric correlation matrix is best conducted using generalized least squares with an asymptotically correct weight matrix (AGLS). However, simulation studies showed that both least squares (LS) and diagonally weighted least squares (DWLS) perform better than AGLS, and thus LS or DWLS is routinely used in practice. In either LS or DWLS, the associations among the polychoric correlation coefficients are completely ignored. To bridge this gap between statistical theory and empirical practice, this paper proposes new methods, called ridge GLS, for factor analysis of ordinal data. Monte Carlo results show that, for a wide range of sample sizes, ridge GLS methods yield uniformly more accurate parameter estimates than existing methods (LS, DWLS, AGLS). A real-data example indicates that estimates by ridge GLS are 9-20% more efficient than those by existing methods. Rescaled and adjusted test statistics as well as sandwich-type standard errors following the ridge GLS methods also perform reasonably well. © 2017 The British Psychological Society.

  11. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimation of microbial counts, while highly significant effects of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) were established. The most appropriate method for the statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Instruments measuring perceived racism/racial discrimination: review and critique of factor analytic techniques.

    PubMed

    Atkins, Rahshida

    2014-01-01

    Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated if the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81% of studies. The majority of studies reported appropriate criteria for the acceptance of un-rotated factors (81%) and justified the rotation method (75%). Exactly 94% of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81% of the studies, the conceptualized dimensions were supported by factor analysis.

  13. Impacts of a flash flood on drinking water quality: case study of areas most affected by the 2012 Beijing flood.

    PubMed

    Sun, Rubao; An, Daizhi; Lu, Wei; Shi, Yun; Wang, Lili; Zhang, Can; Zhang, Ping; Qi, Hongjuan; Wang, Qiang

    2016-02-01

    In this study, we present a method for identifying sources of water pollution and their relative contributions in pollution disasters, using a combination of principal component analysis and factor analysis. We carried out a case study in three rural villages close to Beijing after the torrential rain of July 21, 2012. Nine water samples were analyzed for eight parameters: turbidity, total hardness, total dissolved solids, sulfates, chlorides, nitrates, total bacterial count, and total coliform groups. All of the samples showed some degree of pollution, and most were unsuitable as drinking water because concentrations of various parameters exceeded recommended thresholds. Principal component analysis and factor analysis identified two factors, the first reflecting the degree of mineralization together with agricultural runoff and the second flood entrainment, which together explained 82.50% of the total variance. The case study demonstrates that this method is useful for evaluating and interpreting large, complex water-quality data sets.
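
    A minimal sketch of the first step of such an analysis, using scikit-learn on placeholder measurements (the parameter list matches the study; the values do not):

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    params = ["turbidity", "hardness", "TDS", "sulfate", "chloride",
              "nitrate", "bacteria", "coliforms"]
    X = np.random.default_rng(7).lognormal(size=(9, len(params)))  # 9 samples

    Z = StandardScaler().fit_transform(X)     # standardize before PCA
    pca = PCA(n_components=2).fit(Z)

    print(pca.explained_variance_ratio_.sum())  # cf. 82.5% for the two real factors
    loadings = pd.DataFrame(pca.components_.T, index=params,
                            columns=["F1", "F2"])
    print(loadings.round(2))  # parameters loading together suggest a common source
    ```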

  14. Gene Ranking of RNA-Seq Data via Discriminant Non-Negative Matrix Factorization.

    PubMed

    Jia, Zhilong; Zhang, Xiang; Guan, Naiyang; Bo, Xiaochen; Barnes, Michael R; Luo, Zhigang

    2015-01-01

    RNA-sequencing is rapidly becoming the method of choice for studying the full complexity of transcriptomes; however, with increasing dimensionality, accurate gene ranking is becoming increasingly challenging. This paper proposes an accurate and sensitive gene ranking method that implements discriminant non-negative matrix factorization (DNMF) for RNA-seq data. To the best of our knowledge, this is the first work to explore the utility of DNMF for gene ranking. Incorporating Fisher's discriminant criterion and setting the reduced dimension to two, DNMF learns two factors that approximate the original gene expression data, abstracting the up-regulated or down-regulated metagene by using the sample label information. The first factor holds each gene's weights on the two metagenes, expressing each metagene as an additive combination of all genes, while the second factor represents the expression values of the two metagenes across samples. In the gene ranking stage, all genes are ranked in descending order of the difference between their metagene weights. Leveraging the nature of NMF and Fisher's criterion, DNMF can robustly boost gene ranking performance. Area-under-the-curve analysis of differential expression on two benchmarking tests of four RNA-seq data sets with similar phenotypes showed that the proposed DNMF-based gene ranking method outperforms other widely used methods, and Gene Set Enrichment Analysis likewise favoured DNMF. DNMF is also computationally efficient, substantially outperforming all other benchmarked methods. Consequently, we suggest DNMF is an effective method for the analysis of differential gene expression and gene ranking for RNA-seq data.
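
    The ranking step can be illustrated with plain NMF from scikit-learn; note this omits the Fisher discriminant term that distinguishes DNMF, so it is only a schematic of the rank-2 factorization and metagene-weight ranking described above:

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    # Genes x samples, non-negative expression (placeholder counts).
    rng = np.random.default_rng(3)
    V = rng.poisson(5.0, size=(2000, 12)).astype(float)

    # Rank-2 factorization V ~ W @ H: W (genes x 2) holds the per-gene
    # weights of the two metagenes, H (2 x samples) their expression.
    model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(V)
    H = model.components_

    # Rank genes by the difference between their two metagene weights:
    # a large |w_up - w_down| marks genes driving the group contrast.
    score = W[:, 0] - W[:, 1]
    ranking = np.argsort(-np.abs(score))
    print(ranking[:10])
    ```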

  15. Applying parallel factor analysis and Tucker-3 methods on sensory and instrumental data to establish preference maps: case study on sweet corn varieties.

    PubMed

    Gere, Attila; Losó, Viktor; Györey, Annamária; Kovács, Sándor; Huzsvai, László; Nábrádi, András; Kókai, Zoltán; Sipos, László

    2014-12-01

    Traditional internal and external preference mapping methods are based on principal component analysis (PCA), but parallel factor analysis (PARAFAC) and Tucker-3 methods can be a better choice. To evaluate the methods, internal and external preference maps of eight sweet corn varieties were established using PARAFAC and Tucker-3 models, employing sensory data (from a trained panel and from consumers) and instrumental parameters simultaneously. The triplot created by the PARAFAC model explains better how odour is separated from texture or appearance, and how some varieties are separated from others. Triplots of the applied three-way models have a competitive advantage over the traditional biplots of PCA-based external preference maps. The PARAFAC and Tucker-3 solutions are very similar in the interpretation of the first and third factors; the main difference lies in the second factor, which differentiated the attributes better. Consumers who prefer 'super sweet' varieties (placing particular emphasis on taste) are much younger, have significantly higher incomes, and buy sweet corn products rarely (once a month). Consumers who consume sweet corn products mainly for their texture and appearance are significantly older and include a higher proportion of men. © 2014 Society of Chemical Industry.
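
    Fitting both three-way models is straightforward with the tensorly package; a sketch on a hypothetical products x attributes x assessors array (sizes and data are placeholders):

    ```python
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac, tucker  # assumed available

    # Three-way array: products x attributes x assessors (placeholder scores).
    rng = np.random.default_rng(4)
    X = tl.tensor(rng.uniform(1, 9, size=(8, 10, 15)))

    # PARAFAC: one loading matrix per mode with a shared rank.
    weights, (A_prod, A_attr, A_judge) = parafac(X, rank=2)

    # Tucker-3: mode-specific ranks plus a core array mixing the components.
    core, (T_prod, T_attr, T_judge) = tucker(X, rank=[2, 2, 2])

    # A triplot places rows of the three loading matrices in one component
    # space, which is the preference map discussed above.
    print(A_prod.shape, A_attr.shape, A_judge.shape, core.shape)
    ```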

  16. A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment

    ERIC Educational Resources Information Center

    Finch, Holmes; Monahan, Patrick

    2008-01-01

    This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…

  17. Application of factor analysis of infrared spectra for quantitative determination of beta-tricalcium phosphate in calcium hydroxylapatite.

    PubMed

    Arsenyev, P A; Trezvov, V V; Saratovskaya, N V

    1997-01-01

    This work presents a method for determining the phase composition of calcium hydroxylapatite from its infrared spectrum. The method applies factor analysis to the spectral data of a calibration set of samples to determine the minimal number of factors required to reproduce the spectra within experimental error. Multiple linear regression is then applied to establish a correlation between the factor scores of the calibration standards and their properties, and the resulting regression equations can be used to predict the property value of an unknown sample. A regression model was built for the determination of beta-tricalcium phosphate content in hydroxylapatite, and the quality of the model was estimated statistically. Applying factor analysis to the spectral data increases the accuracy of beta-tricalcium phosphate determination and extends the range of determination toward lower concentrations, while reproducibility of results is retained.
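
    A compact analogue of this factor-scores-plus-regression pipeline, using PCA as the factorization step on synthetic two-phase spectra (pure-component spectra, noise level, and contents are all invented for illustration):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(5)
    n_cal, n_wn = 12, 400                    # calibration samples, wavenumbers
    pure_hap, pure_tcp = rng.random(n_wn), rng.random(n_wn)
    y_cal = np.linspace(0.0, 0.15, n_cal)    # known beta-TCP mass fractions
    S = np.outer(1 - y_cal, pure_hap) + np.outer(y_cal, pure_tcp)
    S += rng.normal(0.0, 0.002, S.shape)     # "experimental error"

    # Factor-analysis step: keep just enough factors to reproduce the
    # spectra within noise (three is generous for this two-phase system).
    pca = PCA(n_components=3).fit(S)
    scores = pca.transform(S)

    # Multiple linear regression of the property on the factor scores.
    reg = LinearRegression().fit(scores, y_cal)

    # Predict an unknown sample from its spectrum.
    unknown = 0.92 * pure_hap + 0.08 * pure_tcp
    print(reg.predict(pca.transform(unknown[None, :])))  # close to 0.08
    ```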

  18. Scale Development: Perceived Barriers to Public Use of School Recreational Facilities

    ERIC Educational Resources Information Center

    Spengler, John O.; Ko, Yong Jae; Connaughton, Daniel P.

    2012-01-01

    Objectives: To test an original scale assessing perceived barriers among school administrators to allowing community use of school recreational facilities outside of regular school hours. Methods: Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). Results: Using EFA and CFA, we found that a model including factors of…

  19. Analyse Factorielle d'une Batterie de Tests de Comprehension Orale et Ecrite (Factor Analysis of a Battery of Tests of Listening and Reading Comprehension). Melanges Pedagogiques, 1971.

    ERIC Educational Resources Information Center

    Lonchamp, F.

    This is a presentation of the results of a factor analysis of a battery of tests intended to measure listening and reading comprehension in English as a second language. The analysis sought to answer the following questions: (1) whether the factor analysis method yields results when applied to tests which are not specifically designed for this…

  1. Chemometrics Methods for Specificity, Authenticity and Traceability Analysis of Olive Oils: Principles, Classifications and Applications.

    PubMed

    Messai, Habib; Farman, Muhammad; Sarraj-Laabidi, Abir; Hammami-Semmar, Asma; Semmar, Nabil

    2016-11-17

    Olive oils (OOs) show high chemical variability due to several factors of genetic, environmental and anthropic type. Genetic and environmental factors are responsible for natural compositions and polymorphic diversification, resulting in different varietal patterns and phenotypes. Anthropic factors, however, are at the origin of different blend preparations, leading to normative, labelled or adulterated commercial products. Control of complex OO samples requires their (i) characterization by specific markers; (ii) authentication by fingerprint patterns; and (iii) monitoring by traceability analysis. These quality control and management aims require several multivariate statistical tools: highlighting specificity requires ordination methods; checking authenticity calls for classification and pattern recognition methods; traceability analysis implies network-based approaches able to separate or extract mixed information and memorized signals from complex matrices. This chapter reviews the different chemometrics methods applied to the control of OO variability on the basis of measured metabolic and physico-chemical characteristics. The methods are illustrated by case studies on monovarietal and blended OOs originating from different countries. Chemometrics tools offer multiple ways for quantitative evaluation and qualitative control of the complex chemical variability of OO in relation to several intrinsic and extrinsic factors.

  2. Multilevel poisson regression modelling for determining factors of dengue fever cases in bandung

    NASA Astrophysics Data System (ADS)

    Arundina, Davila Rubianti; Tantular, Bertho; Pontoh, Resa Septiani

    2017-03-01

    Dengue fever is caused by dengue virus, a virus of the genus Flavivirus, and is transmitted through the bites of Aedes aegypti mosquitoes infected with the virus. The study was conducted in 151 villages in Bandung. Health analysts believe that two groups of factors affect dengue cases: internal (individual) factors and external (environmental) factors. The data used in this research are hierarchical, with villages at level 1 and sub-districts at level 2, so a multilevel method is used for the modelling. According to exploratory data analysis, the suitable multilevel specification is a random-intercept model, and a penalized quasi-likelihood (PQL) approach to multilevel Poisson regression is a proper analysis for determining the factors affecting dengue cases in the city of Bandung. The clean-and-healthy-behaviour factor at the village level has an effect on the number of dengue fever cases in the city of Bandung, while the sub-district-level factor has no effect.

  3. Improving social connection through a communities-of-practice-inspired cognitive work analysis approach.

    PubMed

    Euerby, Adam; Burns, Catherine M

    2014-03-01

    Increasingly, people work in socially networked environments. With growing adoption of enterprise social network technologies, supporting effective social community is becoming an important factor in organizational success. Relatively few human factors methods have been applied to social connection in communities. Although team methods provide a contribution, they do not suit design for communities. Wenger's community of practice concept, combined with cognitive work analysis, provided one way of designing for community. We used a cognitive work analysis approach modified with principles for supporting communities of practice to generate a new website design. Over several months, the community using the site was studied to examine their degree of social connectedness and communication levels. Social network analysis and communications analysis, conducted at three different intervals, showed increases in connections between people and between people and organizations, as well as increased communication following the launch of the new design. In this work, we suggest that human factors approaches can be effective in social environments, when applied considering social community principles. This work has implications for the development of new human factors methods as well as the design of interfaces for sociotechnical systems that have community building requirements.

  4. Spectral compression algorithms for the analysis of very large multivariate images

    DOEpatents

    Keenan, Michael R.

    2007-10-16

    A method for spectrally compressing data sets enables the efficient analysis of very large multivariate images. The spectral compression algorithm uses a factored representation of the data that can be obtained from Principal Components Analysis or other factorization technique. Furthermore, a block algorithm can be used for performing common operations more efficiently. An image analysis can be performed on the factored representation of the data, using only the most significant factors. The spectral compression algorithm can be combined with a spatial compression algorithm to provide further computational efficiencies.
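
    The core idea, sketched with a truncated SVD in NumPy (array sizes and the number of retained factors are arbitrary; the patented block algorithm is not reproduced):

    ```python
    import numpy as np

    # Compress a multivariate image cube (rows x cols x channels) to a
    # factored representation, then run analyses on the scores only.
    rng = np.random.default_rng(6)
    cube = rng.random((256, 256, 128))
    X = cube.reshape(-1, 128)                # pixels x spectral channels

    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    k = 10                                   # most significant factors
    scores, basis = U[:, :k] * s[:k], Vt[:k] # factored representation

    print(X.nbytes / (scores.nbytes + basis.nbytes + mean.nbytes))  # ratio
    # Downstream image analysis (clustering, classification) can operate on
    # `scores` (pixels x k) and reconstruct via scores @ basis + mean.
    ```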

  5. The Columbia Impairment Scale: Factor Analysis Using a Community Mental Health Sample

    ERIC Educational Resources Information Center

    Singer, Jonathan B.; Eack, Shaun M.; Greeno, Catherine M.

    2011-01-01

    Objective: The objective of this study was to test the factor structure of the parent version of the Columbia Impairment Scale (CIS) in a sample of mothers who brought their children for community mental health (CMH) services (n = 280). Method: Confirmatory factor analysis (CFA) was used to test the fit of the hypothesized four-factor structure…

  6. A factor analysis of the SSQ (Speech, Spatial, and Qualities of Hearing Scale).

    PubMed

    Akeroyd, Michael A; Guy, Fiona H; Harrison, Dawn L; Suller, Sharon L

    2014-02-01

    The Speech, Spatial, and Qualities of Hearing Scale (SSQ) is a self-report test of auditory disability. Its 49 items ask how well a listener would do in many complex listening situations illustrative of real life, and the item scores are often combined into the three main sections or into 10 pragmatic subscales. We report here a factor analysis of the SSQ, conducted to further investigate its statistical properties and to determine its structure. The analysis used parallel analysis to determine the number of factors to retain, oblique rotation of factors, and a bootstrap method to estimate confidence intervals, applied to questionnaire data from 1220 people who attended MRC IHR over the last decade. We found three clear factors, essentially corresponding to the three main sections of the SSQ, termed "speech understanding", "spatial perception", and "clarity, separation, and identification"; 35 of the SSQ questions were included in these three factors. There was partial evidence for a fourth factor, "effort and concentration", representing two more questions. These results aid the interpretation and application of the SSQ and indicate potential methods for generating average scores.
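
    Parallel analysis, the retention rule used here, compares observed eigenvalues with those of random data of the same size; a self-contained sketch:

    ```python
    import numpy as np

    def parallel_analysis(X, n_iter=100, quantile=95, seed=0):
        """Retain factors whose observed eigenvalues exceed random-data ones."""
        rng = np.random.default_rng(seed)
        n, p = X.shape
        obs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
        rand = np.empty((n_iter, p))
        for i in range(n_iter):
            R = rng.normal(size=(n, p))
            rand[i] = np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]
        threshold = np.percentile(rand, quantile, axis=0)
        return int(np.sum(obs > threshold)), obs, threshold

    # Placeholder questionnaire data: 1220 respondents x 49 items.
    X = np.random.default_rng(8).normal(size=(1220, 49))
    k, obs, thr = parallel_analysis(X)
    print(k)  # ~0 for pure noise; three factors were found for the real SSQ
    ```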

  7. Non-negative matrix factorization in texture feature for classification of dementia with MRI data

    NASA Astrophysics Data System (ADS)

    Sarwinda, D.; Bustamam, A.; Ardaneswari, G.

    2017-07-01

    This paper investigates the application of non-negative matrix factorization as a feature selection method to select features from the gray-level co-occurrence matrix. The proposed approach is used to classify dementia using MRI data. In this study, texture analysis with the gray-level co-occurrence matrix is used for feature extraction; seven features were extracted from the gray-level co-occurrence matrix of the MRI data. Non-negative matrix factorization then selected the three most influential of these features. A Naïve Bayes classifier was adopted to classify dementia, i.e. Alzheimer's disease, mild cognitive impairment (MCI) and normal controls. The experimental results show that non-negative matrix factorization as a feature selection method is able to achieve an accuracy of 96.4% for the classification of Alzheimer's disease versus normal controls. The proposed method was also compared with another feature selection method, principal component analysis (PCA).

  8. Logistic Regression and Path Analysis Method to Analyze Factors influencing Students’ Achievement

    NASA Astrophysics Data System (ADS)

    Noeryanti, N.; Suryowati, K.; Setyawan, Y.; Aulia, R. R.

    2018-04-01

    Students' academic achievement cannot be separated from the influence of two groups of factors: internal and external. Internal (student) factors consist of intelligence (X1), health (X2), interest (X3), and motivation (X4); external factors consist of the family environment (X5), school environment (X6), and society environment (X7). The subjects of this research are eighth-grade students of the 2016/2017 school year at SMPN 1 Jiwan Madiun, sampled by simple random sampling, with primary data obtained by distributing questionnaires. The method used in this study is binary logistic regression analysis, which aims to identify the internal and external factors that affect students' achievement and their trends; path analysis was then used to determine the factors that influence achievement directly, indirectly or totally. Based on the results of the binary logistic regression, the variables that affect students' achievement are interest and motivation. Based on the path analysis, the factors that have a direct impact on students' achievement are students' interest (59%) and students' motivation (27%), while the factors that have indirect influences on students' achievement are the family environment (97%) and school environment (37%).

  9. Influential Observations in Principal Factor Analysis.

    ERIC Educational Resources Information Center

    Tanaka, Yutaka; Odaka, Yoshimasa

    1989-01-01

    A method is proposed for detecting influential observations in iterative principal factor analysis. Theoretical influence functions are derived for two components of the common variance decomposition. The major mathematical tool is the influence function derived by Tanaka (1988). (SLD)

  10. Detecting Outliers in Factor Analysis Using the Forward Search Algorithm

    ERIC Educational Resources Information Center

    Mavridis, Dimitris; Moustaki, Irini

    2008-01-01

    In this article we extend and implement the forward search algorithm for identifying atypical subjects/observations in factor analysis models. The forward search has been mainly developed for detecting aberrant observations in regression models (Atkinson, 1994) and in multivariate methods such as cluster and discriminant analysis (Atkinson, Riani,…

  11. Exploratory Bi-factor Analysis: The Oblique Case.

    PubMed

    Jennrich, Robert I; Bentler, Peter M

    2012-07-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford (Psychometrika 2:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler (Psychometrika 76:537-549, 2011) introduced an exploratory form of bi-factor analysis that does not require one to provide an explicit bi-factor structure a priori; it uses exploratory factor analysis and a bi-factor rotation criterion designed to produce a rotated loading matrix that has an approximate bi-factor structure. Among other things, this can be used as an aid in finding an explicit bi-factor structure for use in a confirmatory bi-factor analysis. They considered only orthogonal rotation. The purpose of this paper is to consider oblique rotation and to compare it to orthogonal rotation. Because there are many more oblique rotations of an initial loading matrix than orthogonal rotations, one expects the oblique results to approximate a bi-factor structure better than orthogonal rotations do, and this is indeed the case. A surprising result arises when oblique bi-factor rotation methods are applied to ideal data.

  12. An effective method to accurately calculate the phase space factors for β⁻β⁻ decay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neacsu, Andrei; Horoi, Mihai

    2016-01-01

    Accurate calculations of the electron phase space factors are necessary for reliable predictions of double-beta decay rates and for the analysis of the associated electron angular and energy distributions. Here, we present an effective method to calculate these phase space factors that takes into account the distorted Coulomb field of the daughter nucleus yet allows one to calculate the phase space factors easily, with good accuracy relative to the most exact methods available in the recent literature.

  13. Method for exploiting bias in factor analysis using constrained alternating least squares algorithms

    DOEpatents

    Keenan, Michael R.

    2008-12-30

    Bias plays an important role in factor analysis and is often implicitly made use of, for example, to constrain solutions to factors that conform to physical reality. However, when components are collinear, a large range of solutions may exist that satisfy the basic constraints and fit the data equally well. In such cases, the introduction of mathematical bias through the application of constraints may select solutions that are less than optimal. The biased alternating least squares algorithm of the present invention can offset mathematical bias introduced by constraints in the standard alternating least squares analysis to achieve factor solutions that are most consistent with physical reality. In addition, these methods can be used to explicitly exploit bias to provide alternative views and provide additional insights into spectral data sets.
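
    A bare-bones non-negativity-constrained ALS loop of the kind this patent builds on, for context (illustrative only; simple clipping stands in for a proper non-negative least-squares step, and none of the patent's bias-offsetting logic is shown):

    ```python
    import numpy as np

    # Alternating least squares for X ~ C @ S.T with non-negativity
    # constraints, the basic scheme underlying constrained factor analysis.
    rng = np.random.default_rng(9)
    C_true, S_true = rng.random((100, 3)), rng.random((50, 3))
    X = C_true @ S_true.T + rng.normal(0, 0.01, (100, 50))

    C = rng.random((100, 3))
    for _ in range(200):
        # Unconstrained LS update, then clip: the projection to the feasible
        # set is precisely where constraint-induced bias enters the solution.
        S = np.clip(np.linalg.lstsq(C, X, rcond=None)[0].T, 0.0, None)
        C = np.clip(np.linalg.lstsq(S, X.T, rcond=None)[0].T, 0.0, None)

    print(np.linalg.norm(X - C @ S.T) / np.linalg.norm(X))  # relative misfit
    ```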

  14. Enhancing the estimation of blood pressure using pulse arrival time and two confounding factors.

    PubMed

    Baek, Hyun Jae; Kim, Ko Keun; Kim, Jung Soo; Lee, Boreom; Park, Kwang Suk

    2010-02-01

    A new method of blood pressure (BP) estimation using multiple regression with pulse arrival time (PAT) and two confounding factors was evaluated in clinical and unconstrained monitoring situations. For the first analysis with clinical data, electrocardiogram (ECG), photoplethysmogram (PPG) and invasive BP signals were obtained by a conventional patient monitoring device during surgery. In the second analysis, ECG, PPG and non-invasive BP were measured using systems developed to obtain data under conditions in which the subject was not constrained. To enhance the performance of BP estimation methods, heart rate (HR) and arterial stiffness were considered as confounding factors in regression analysis. The PAT and HR were easily extracted from ECG and PPG signals. For arterial stiffness, the duration from the maximum derivative point to the maximum of the dicrotic notch in the PPG signal, a parameter called TDB, was employed. In two experiments that normally cause BP variation, the correlation between measured BP and the estimated BP was investigated. Multiple-regression analysis with the two confounding factors improved correlation coefficients for diastolic blood pressure and systolic blood pressure to acceptable confidence levels, compared to existing methods that consider PAT only. In addition, reproducibility for the proposed method was determined using constructed test sets. Our results demonstrate that non-invasive, non-intrusive BP estimation can be obtained using methods that can be applied in both clinical and daily healthcare situations.
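
    The regression model itself is simple; the sketch below fits BP on PAT, HR, and a TDB-like stiffness surrogate using synthetic values, and compares against a PAT-only baseline as in the paper's evaluation (all numbers invented):

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Beat-by-beat features extracted from ECG and PPG (placeholder values):
    # PAT (s), HR (beats/min), and TDB (s) as an arterial-stiffness surrogate.
    rng = np.random.default_rng(10)
    n = 300
    PAT = rng.normal(0.25, 0.03, n)
    HR = rng.normal(70.0, 10.0, n)
    TDB = rng.normal(0.18, 0.02, n)
    SBP = 180 - 220 * PAT + 0.3 * HR - 60 * TDB + rng.normal(0, 3, n)

    X = np.column_stack([PAT, HR, TDB])
    model = LinearRegression().fit(X, SBP)

    # PAT-only baseline, the existing approach the paper improves on.
    baseline = LinearRegression().fit(PAT[:, None], SBP)
    print(model.score(X, SBP), baseline.score(PAT[:, None], SBP))
    ```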

  15. A symmetrical subtraction combined with interpolated values for eliminating scattering from fluorescence EEM data

    NASA Astrophysics Data System (ADS)

    Xu, Jing; Liu, Xiaofei; Wang, Yutian

    2016-08-01

    Parallel factor analysis is a widely used method for extracting qualitative and quantitative information about the analyte of interest from fluorescence excitation-emission matrices (EEMs) containing unknown components. Large-amplitude scattering influences the results of parallel factor analysis, and many methods for eliminating scattering have been proposed, each with its own advantages and disadvantages. This paper discusses the combination of symmetrical subtraction and interpolated values, where "combination" refers both to combining results and to combining methods. Nine methods were used for comparison. The results show that the combination of results gives better concentration predictions for all components.
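
    One common ingredient of such schemes, interpolation across the Rayleigh scatter band, can be sketched as follows (wavelength grids, bandwidth, and data are placeholders):

    ```python
    import numpy as np

    # Remove Rayleigh scatter from an EEM by blanking a band around em == ex
    # and interpolating across it along each emission spectrum.
    ex = np.arange(250, 451, 5.0)    # excitation wavelengths (nm)
    em = np.arange(260, 601, 2.0)    # emission wavelengths (nm)
    eem = np.random.default_rng(11).random((ex.size, em.size))

    half_width = 15.0                # nm, assumed scatter band half-width
    for i, wl in enumerate(ex):
        mask = np.abs(em - wl) < half_width
        if mask.any():
            good = ~mask
            eem[i, mask] = np.interp(em[mask], em[good], eem[i, good])
    ```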

  16. Confirmatory Factor Analysis of the Elementary School Success Profile for Teachers

    ERIC Educational Resources Information Center

    Webber, Kristina C.; Rizo, Cynthia F.; Bowen, Natasha K.

    2012-01-01

    Objectives: This study examines the factor structure and scale quality of data collected with the online Elementary School Success Profile (ESSP) for Teachers from a sample of teachers of 1,145 third through fifth graders. Methods: Confirmatory factor analysis (CFA) using Mplus and weighted least squares means and variances adjusted (WLSMV)…

  17. Contribution of artificial intelligence to the knowledge of prognostic factors in Hodgkin's lymphoma.

    PubMed

    Buciński, Adam; Marszałł, Michał Piotr; Krysiński, Jerzy; Lemieszek, Andrzej; Załuski, Jerzy

    2010-07-01

    Hodgkin's lymphoma is one of the most curable malignancies, and most patients achieve a lasting complete remission. In this study, artificial neural network (ANN) analysis was shown to identify significant factors with regard to 5-year recurrence after lymphoma treatment. Data from 114 patients treated for Hodgkin's disease were available for evaluation and comparison, and a total of 31 variables were subjected to ANN analysis. The ANN approach, an advanced multivariate data processing method, was shown to provide objective prognostic data. Some of these prognostic factors are consistent with, or even identical to, the factors identified earlier by other statistical methods.

  18. Job Performance as Multivariate Dynamic Criteria: Experience Sampling and Multiway Component Analysis.

    PubMed

    Spain, Seth M; Miner, Andrew G; Kroonenberg, Pieter M; Drasgow, Fritz

    2010-08-06

    Questions about the dynamic processes that drive behavior at work have been the focus of increasing attention in recent years. Models describing behavior at work and research on momentary behavior indicate that substantial variation exists within individuals. This article examines the rationale behind this body of work and explores a method of analyzing momentary work behavior using experience sampling methods. The article also examines a previously unused set of methods for analyzing data produced by experience sampling. These methods are known collectively as multiway component analysis. Two archetypal techniques of multimode factor analysis, the Parallel factor analysis and the Tucker3 models, are used to analyze data from Miner, Glomb, and Hulin's (2010) experience sampling study of work behavior. The efficacy of these techniques for analyzing experience sampling data is discussed as are the substantive multimode component models obtained.

  19. Exploratory factor analysis of the 12-item Functional Assessment of Chronic Illness Therapy-Spiritual Well-Being Scale in people newly diagnosed with advanced cancer.

    PubMed

    Bai, Mei; Dixon, Jane K

    2014-01-01

    The purpose of this study was to reexamine the factor pattern of the 12-item Functional Assessment of Chronic Illness Therapy-Spiritual Well-Being Scale (FACIT-Sp-12) using exploratory factor analysis in people newly diagnosed with advanced cancer. Principal components analysis (PCA) and three common factor analysis methods were used to explore the factor pattern of the FACIT-Sp-12, and factorial validity was assessed in association with quality of life (QOL). Principal factor analysis (PFA), iterative PFA, and maximum likelihood all suggested retaining three factors: Peace, Meaning, and Faith. Both Peace and Meaning related positively to QOL, whereas only Peace contributed uniquely to QOL. This study supported the 3-factor model of the FACIT-Sp-12. Suggestions for revision of items and further validation of the identified factor pattern are provided.

  1. Comparison of Two- and Three-Dimensional Methods for Analysis of Trunk Kinematic Variables in the Golf Swing.

    PubMed

    Smith, Aimée C; Roberts, Jonathan R; Wallace, Eric S; Kong, Pui; Forrester, Stephanie E

    2016-02-01

    Two-dimensional methods have been used to compute trunk kinematic variables (flexion/extension, lateral bend, axial rotation) and X-factor (difference in axial rotation between trunk and pelvis) during the golf swing. Recent X-factor studies advocated three-dimensional (3D) analysis due to the errors associated with two-dimensional (2D) methods, but this has not been investigated for all trunk kinematic variables. The purpose of this study was to compare trunk kinematic variables and X-factor calculated by 2D and 3D methods to examine how different approaches influenced their profiles during the swing. Trunk kinematic variables and X-factor were calculated for golfers from vectors projected onto the global laboratory planes and from 3D segment angles. Trunk kinematic variable profiles were similar in shape; however, there were statistically significant differences in trunk flexion (-6.5 ± 3.6°) at top of backswing and trunk right-side lateral bend (8.7 ± 2.9°) at impact. Differences between 2D and 3D X-factor (approximately 16°) could largely be explained by projection errors introduced to the 2D analysis through flexion and lateral bend of the trunk and pelvis segments. The results support the need to use a 3D method for kinematic data calculation to accurately analyze the golf swing.
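
    The difference between the two approaches can be reproduced in a few lines with SciPy rotations: compute axial rotation from the full 3-D relative orientation versus from horizontally projected segment axes (the posture values and angle conventions here are invented for illustration):

    ```python
    import numpy as np
    from scipy.spatial.transform import Rotation as R

    # One hypothetical top-of-backswing posture: Z vertical, angles in
    # degrees, "ZXY" = axial rotation, lateral bend, flexion (assumed order).
    trunk = R.from_euler("ZXY", [60.0, 20.0, -30.0], degrees=True)
    pelvis = R.from_euler("ZXY", [40.0, 5.0, -10.0], degrees=True)

    # 3-D method: axial component of the pelvis-to-trunk relative rotation.
    xfactor_3d = (pelvis.inv() * trunk).as_euler("ZXY", degrees=True)[0]

    # 2-D method: angles of the segments' forward axes projected onto the
    # horizontal plane, as in video-based analyses.
    def heading(rot: R) -> float:
        v = rot.apply([1.0, 0.0, 0.0])
        return np.degrees(np.arctan2(v[1], v[0]))

    xfactor_2d = heading(trunk) - heading(pelvis)
    print(xfactor_3d, xfactor_2d)  # the gap grows with flexion and lateral bend
    ```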

  2. Tensor-Dictionary Learning with Deep Kruskal-Factor Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Andrew J.; Pu, Yunchen; Sun, Yannan

    We introduce new dictionary learning methods for tensor-variate data of any order. We represent each data item as a sum of Kruskal-decomposed dictionary atoms within the framework of beta-process factor analysis (BPFA). Our model is nonparametric and can infer the tensor rank of each dictionary atom. This Kruskal-Factor Analysis (KFA) is a natural generalization of BPFA. We also extend KFA to a deep convolutional setting and develop online learning methods. We test our approach on image processing and classification tasks, achieving state-of-the-art results for 2D and 3D inpainting and Caltech 101. The experiments also show that atom rank impacts both overcompleteness and sparsity.

  3. Missing in space: an evaluation of imputation methods for missing data in spatial analysis of risk factors for type II diabetes.

    PubMed

    Baker, Jannah; White, Nicole; Mengersen, Kerrie

    2014-11-20

    Spatial analysis is increasingly important for identifying modifiable geographic risk factors for disease. However, spatial health data from surveys are often incomplete, ranging from missing data for only a few variables, to missing data for many variables. For spatial analyses of health outcomes, selection of an appropriate imputation method is critical in order to produce the most accurate inferences. We present a cross-validation approach to select between three imputation methods for health survey data with correlated lifestyle covariates, using as a case study, type II diabetes mellitus (DM II) risk across 71 Queensland Local Government Areas (LGAs). We compare the accuracy of mean imputation to imputation using multivariate normal and conditional autoregressive prior distributions. Choice of imputation method depends upon the application and is not necessarily the most complex method. Mean imputation was selected as the most accurate method in this application. Selecting an appropriate imputation method for health survey data, after accounting for spatial correlation and correlation between covariates, allows more complete analysis of geographic risk factors for disease with more confidence in the results to inform public policy decision-making.
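
    The cross-validation selection loop can be sketched as follows (placeholder data; mean imputation shown, with model-based imputers substitutable in the same loop):

    ```python
    import numpy as np
    import pandas as pd

    # Cross-validate imputation accuracy: hide a fraction of observed values,
    # impute, and score RMSE against the hidden truth.
    rng = np.random.default_rng(12)
    df = pd.DataFrame(rng.normal(size=(71, 5)),
                      columns=["smoking", "obesity", "activity",
                               "diet", "alcohol"])

    def rmse_of_imputer(df, impute, frac=0.1, reps=20):
        errs = []
        for _ in range(reps):
            mask = rng.random(df.shape) < frac
            held_out = df.values[mask]
            corrupted = df.mask(mask)          # hide "known" cells
            filled = impute(corrupted)
            errs.append(np.sqrt(np.mean((filled.values[mask] - held_out) ** 2)))
        return float(np.mean(errs))

    mean_imputer = lambda d: d.fillna(d.mean())
    print(rmse_of_imputer(df, mean_imputer))
    ```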

  4. Quantitative analysis of Si1-xGex alloy films by SIMS and XPS depth profiling using a reference material

    NASA Astrophysics Data System (ADS)

    Oh, Won Jin; Jang, Jong Shik; Lee, Youn Seoung; Kim, Ansoon; Kim, Kyung Joong

    2018-02-01

    Quantitative analysis methods for multi-element alloy films were compared. The atomic fractions of Si1-xGex alloy films were measured by depth-profiling analysis with secondary ion mass spectrometry (SIMS) and X-ray photoelectron spectroscopy (XPS). An intensity-to-composition conversion factor (ICF) was used as a means to convert intensities to compositions, instead of relative sensitivity factors. The ICFs were determined from a reference Si1-xGex alloy film by the conventional method, the average intensity (AI) method and the total number counting (TNC) method. In the case of SIMS, although the atomic fractions measured with oxygen ion beams were not quantitative due to a severe matrix effect, the results obtained with a cesium ion beam were highly quantitative; the quantitative SIMS results using MCs2+ ions are comparable to those by XPS. In the case of XPS, the measurement uncertainty was greatly improved by the AI and TNC methods.
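
    Assuming the ICF is defined so that measured intensity equals the ICF times the atomic fraction (an assumption for this sketch), the conversion step reduces to a normalization:

    ```python
    import numpy as np

    # Intensity-to-composition conversion with per-element ICFs determined
    # from a reference film; all numbers are invented for illustration.
    I = np.array([4.2e5, 1.1e5])    # measured Si, Ge intensities
    ICF = np.array([2.8e5, 1.6e5])  # counts per unit atomic fraction

    x = (I / ICF) / (I / ICF).sum() # atomic fractions of Si and Ge, sum to 1
    print(x)
    ```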

  5. Exploring factors that influence work analysis data: A meta-analysis of design choices, purposes, and organizational context.

    PubMed

    DuVernet, Amy M; Dierdorff, Erich C; Wilson, Mark A

    2015-09-01

    Work analysis is fundamental to designing effective human resource systems. The current investigation extends previous research by identifying the differential effects of common design decisions, purposes, and organizational contexts on the data generated by work analyses. The effects of 19 distinct factors that span choices of descriptor, collection method, rating scale, and data source, as well as project purpose and organizational features, are explored. Meta-analytic results cumulated from 205 articles indicate that many of these variables hold significant consequences for work analysis data. Factors pertaining to descriptor choice, collection method, rating scale, and the purpose for conducting the work analysis each showed strong associations with work analysis data. The source of the work analysis information and organizational context in which it was conducted displayed fewer relationships. Findings can be used to inform choices work analysts make about methodology and postcollection evaluations of work analysis information. (c) 2015 APA, all rights reserved.

  6. Generalized Structured Component Analysis

    ERIC Educational Resources Information Center

    Hwang, Heungsun; Takane, Yoshio

    2004-01-01

    We propose an alternative method to partial least squares for path analysis with components, called generalized structured component analysis. The proposed method replaces factors by exact linear combinations of observed variables. It employs a well-defined least squares criterion to estimate model parameters. As a result, the proposed method…

  7. An independent component analysis confounding factor correction framework for identifying broad impact expression quantitative trait loci

    PubMed Central

    Ju, Jin Hyun; Shenoy, Sushila A.; Crystal, Ronald G.; Mezey, Jason G.

    2017-01-01

    Genome-wide expression Quantitative Trait Loci (eQTL) studies in humans have provided numerous insights into the genetics of both gene expression and complex diseases. While the majority of eQTL identified in genome-wide analyses impact a single gene, eQTL that impact many genes are particularly valuable for network modeling and disease analysis. To enable the identification of such broad impact eQTL, we introduce CONFETI: Confounding Factor Estimation Through Independent component analysis. CONFETI is designed to address two conflicting issues when searching for broad impact eQTL: the need to account for non-genetic confounding factors that can lower the power of the analysis or produce broad impact eQTL false positives, and the tendency of methods that account for confounding factors to model broad impact eQTL as non-genetic variation. The key advance of the CONFETI framework is the use of Independent Component Analysis (ICA) to identify variation likely caused by broad impact eQTL when constructing the sample covariance matrix used for the random effect in a mixed model. We show that CONFETI has better performance than other mixed model confounding factor methods when considering broad impact eQTL recovery from synthetic data. We also used the CONFETI framework and these same confounding factor methods to identify eQTL that replicate between matched twin pair datasets in the Multiple Tissue Human Expression Resource (MuTHER), the Depression Genes Networks study (DGN), the Netherlands Study of Depression and Anxiety (NESDA), and multiple tissue types in the Genotype-Tissue Expression (GTEx) consortium. These analyses identified both cis-eQTL and trans-eQTL impacting individual genes, and CONFETI had better or comparable performance to other mixed model confounding factor analysis methods when identifying such eQTL. In these analyses, we were able to identify and replicate a few broad impact eQTL although the overall number was small even when applying CONFETI. In light of these results, we discuss the broad impact eQTL that have been previously reported from the analysis of human data and suggest that considerable caution should be exercised when making biological inferences based on these reported eQTL. PMID:28505156
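
    A heavily simplified sketch of the CONFETI idea using scikit-learn's FastICA (the component-selection rule, which is the heart of the method, is not reproduced; all data are placeholders):

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(13)
    Y = rng.normal(size=(200, 5000))   # samples x genes (placeholder expression)

    ica = FastICA(n_components=10, random_state=0)
    S = ica.fit_transform(Y)           # samples x independent components

    # CONFETI's key step, not reproduced here, decides which components look
    # like confounders rather than broad-impact genetic signal; this sketch
    # keeps all of them.
    K = S @ S.T                        # sample-sample covariance for the
    K /= np.trace(K) / K.shape[0]      # mixed-model random effect, normalized
    ```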

  9. Confirmatory factor analysis of the Child Health Questionnaire-Parent Form 50 in a predominantly minority sample.

    PubMed

    Hepner, Kimberly A; Sechrest, Lee

    2002-12-01

    The Child Health Questionnaire-Parent Form 50 (CHQ-PF50; Landgraf JM et al., The CHQ User's Manual. Boston, MA: The Health Institute, New England Medical Centre, 1996) appears to be a useful method of assessing children's health. The CHQ-PF50 is designed to measure general functional status and well-being and is available in several versions to suit the needs of the health researcher. Several publications have reported favorably on the psychometric properties of the CHQ. Landgraf et al. reported the results of an exploratory factor analysis at the scale level that provided evidence for a two-factor structure representing physical and psychosocial dimensions of health. In order to cross-validate and extend these results, a confirmatory factor analysis was conducted with an independent sample of generally healthy, predominantly minority children. Results of the analysis indicate that a two-factor model provides a good fit to the data, confirming previous exploratory analyses with this questionnaire. One additional method factor seems likely because of the substantial similarity of three of the scales, but that does not affect the substantive two-factor interpretation overall.

  10. Towards a Probabilistic Preliminary Design Criterion for Buckling Critical Composite Shells

    NASA Technical Reports Server (NTRS)

    Arbocz, Johann; Hilburger, Mark W.

    2003-01-01

    A probability-based analysis method for predicting buckling loads of compression-loaded laminated-composite shells is presented, and its potential as a basis for a new shell-stability design criterion is demonstrated and discussed. In particular, a database containing information about specimen geometry, material properties, and measured initial geometric imperfections for a selected group of laminated-composite cylindrical shells is used to calculate new buckling-load "knockdown factors". These knockdown factors are shown to be substantially improved, and hence much less conservative, than the corresponding deterministic knockdown factors that are presently used by industry. The probability integral associated with the analysis is evaluated by using two methods: the exact Monte Carlo method and an approximate First-Order Second-Moment method. A comparison of the results from these two methods indicates that the First-Order Second-Moment method yields results that are conservative for the shells considered. Furthermore, the results show that the improved, reliability-based knockdown factor presented always yields a safe estimate of the buckling load for the shells examined.
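
    As a rough illustration of the two probability evaluations compared above, the sketch below computes the probability that a shell's buckling load falls below a design load both by Monte Carlo sampling and by a First-Order Second-Moment normal approximation; the lognormal scatter and all numbers are invented, not the paper's shell database.

```python
# Monte Carlo vs. FOSM estimate of P(buckling load < design load).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
mu_ln, sigma_ln = np.log(100.0), 0.08     # assumed lognormal buckling load (kN)
p_design = 80.0

samples = rng.lognormal(mu_ln, sigma_ln, size=1_000_000)
p_mc = np.mean(samples < p_design)        # "exact" Monte Carlo estimate

mu = np.exp(mu_ln + sigma_ln**2 / 2)      # first two moments of the load
sd = mu * np.sqrt(np.exp(sigma_ln**2) - 1.0)
beta = (mu - p_design) / sd               # reliability index
p_fosm = norm.cdf(-beta)                  # FOSM normal approximation

print(f"Monte Carlo: {p_mc:.2e}  FOSM: {p_fosm:.2e}")
```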

  11. Determination of effective loss factors in reduced SEA models

    NASA Astrophysics Data System (ADS)

    Chimeno Manguán, M.; Fernández de las Heras, M. J.; Roibás Millán, E.; Simón Hidalgo, F.

    2017-01-01

    The definition of Statistical Energy Analysis (SEA) models for large complex structures is highly conditioned by the classification of the structure elements into a set of coupled subsystems and the subsequent determination of the loss factors representing both the internal damping and the coupling between subsystems. The accurate definition of the complete system can lead to excessively large models as the size and complexity increase. This fact can also raise practical issues for the experimental determination of the loss factors. This work presents a formulation of reduced SEA models for incomplete systems defined by a set of effective loss factors. The reduced SEA model provides a feasible number of subsystems for the application of the Power Injection Method (PIM). For structures of high complexity, access to some components, such as internal equipment or panels, can be restricted; in these cases the use of PIM to carry out an experimental SEA analysis is not possible. New methods are presented for this case in combination with the reduced SEA models. These methods allow some of the model loss factors that cannot be obtained through PIM to be determined. The methods are validated with a numerical analysis case and are also applied to an actual spacecraft structure with accessibility restrictions: a solar wing in folded configuration.

  12. Application of Monte Carlo techniques to transient thermal modeling of cavity radiometers having diffuse-specular surfaces

    NASA Technical Reports Server (NTRS)

    Mahan, J. R.; Eskin, L. D.

    1981-01-01

    A viable alternative to the net exchange method of radiative analysis, equally applicable to diffuse and diffuse-specular enclosures, is presented. It is particularly advantageous compared with the net exchange method in the case of a transient thermal analysis involving conduction and storage of energy as well as radiative exchange. A new quantity, called the distribution factor, is defined which replaces the angle factor and the configuration factor. Once obtained, the array of distribution factors for an ensemble of surface elements defining an enclosure permits the instantaneous net radiative heat fluxes to all of the surfaces to be computed directly in terms of the known surface temperatures at that instant. The formulation of the thermal model is described, as is the determination of distribution factors by application of a Monte Carlo analysis. The results show that when fewer than 10,000 packets are emitted, an unsatisfactory approximation for the distribution factors is obtained, but that 10,000 packets is sufficient.
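
    The distribution-factor idea lends itself to a compact sketch: emit many energy packets from each surface, follow reflections until absorption, and tally the absorbing surface. The toy below abstracts the ray tracing into an assumed view-factor matrix with purely diffuse reflection, so it is a conceptual stand-in rather than the paper's cavity-radiometer model.

```python
# Toy Monte Carlo estimate of distribution factors for a 3-surface enclosure.
import numpy as np

rng = np.random.default_rng(2)
F = np.array([[0.0, 0.6, 0.4],        # assumed view factors (rows sum to 1)
              [0.6, 0.0, 0.4],
              [0.5, 0.5, 0.0]])
eps = np.array([0.9, 0.5, 0.7])       # assumed surface emissivities
n_packets = 20_000

D = np.zeros((3, 3))                  # D[i, j]: emitted by i, absorbed by j
for i in range(3):
    for _ in range(n_packets):
        s = np.searchsorted(np.cumsum(F[i]), rng.random())  # first hit
        while rng.random() > eps[s]:                        # diffuse reflection
            s = np.searchsorted(np.cumsum(F[s]), rng.random())
        D[i, s] += 1
print(D / n_packets)
```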

  13. Necessary but Insufficient

    PubMed Central

    2017-01-01

    Abstract Cross-national data production in social science research has increased dramatically in recent decades. Assessing the comparability of data is necessary before drawing substantive conclusions that are based on cross-national data. Researchers assessing data comparability typically use either quantitative methods such as multigroup confirmatory factor analysis or qualitative methods such as online probing. Because both methods have complementary strengths and weaknesses, this study applies both multigroup confirmatory factor analysis and online probing in a mixed-methods approach to assess the comparability of constructive patriotism and nationalism, two important concepts in the study of national identity. Previous measurement invariance tests failed to achieve scalar measurement invariance, which prohibits a cross-national comparison of latent means (Davidov 2009). The arrival of the 2013 ISSP Module on National Identity has encouraged a reassessment of both constructs and a push to understand why scalar invariance cannot be achieved. Using the example of constructive patriotism and nationalism, this study demonstrates how the combination of multigroup confirmatory factor analysis and online probing can uncover and explain issues related to cross-national comparability. PMID:28579643

  14. Development of quantitative security optimization approach for the picture archives and carrying system between a clinic and a rehabilitation center

    NASA Astrophysics Data System (ADS)

    Haneda, Kiyofumi; Kajima, Toshio; Koyama, Tadashi; Muranaka, Hiroyuki; Dojo, Hirofumi; Aratani, Yasuhiko

    2002-05-01

    The target of our study is to analyze the level of necessary security requirements, to search for suitable security measures, and to optimize the distribution of security across every portion of the medical practice. Where possible, quantitative expressions are introduced to simplify follow-up security procedures and to ease the evaluation of security outcomes. System analysis using fault tree analysis (FTA) showed that subdividing system elements into detailed groups results in a much more accurate analysis. Such subdivided composition factors depend greatly on the behavior of staff, interactive terminal devices, the kinds of services provided, and network routes. Security measures were then implemented based on the analysis results. In conclusion, we identified the methods needed to determine the required level of security, proposed security measures for each medical information system, and identified the basic events and combinations of events that comprise the threat composition factors. Methods for identifying suitable security measures were found and implemented. Risk factors for each basic event, a number of elements for each composition factor, and potential security measures were found. Methods to optimize the security measures for each medical information system were proposed, developing the most efficient distribution of risk factors for basic events.

  15. Grouping of Bulgarian wines according to grape variety by using statistical methods

    NASA Astrophysics Data System (ADS)

    Milev, M.; Nikolova, Kr.; Ivanova, Ir.; Minkova, St.; Evtimov, T.; Krustev, St.

    2017-12-01

    68 different types of Bulgarian wines were studied with respect to 9 optical parameters as follows: color parameters in the XYZ and CIE Lab color systems, lightness, hue angle, chroma, fluorescence intensity, and emission wavelength. The main objective of this research is to use hierarchical cluster analysis to evaluate the similarity and the distance between the examined types of Bulgarian wines and to group them based on physical parameters. We found that the wines are grouped in clusters on the basis of the degree of identity between them. There are two main clusters, each with two subclusters: the first contains the white wines and Sira, the second the red wines and rosé. The results from the cluster analysis are presented graphically by a dendrogram. The other statistical technique used is factor analysis performed by the method of principal components (PCA). The aim is to reduce the large number of variables to a few factors by grouping the correlated variables into one factor and subdividing the noncorrelated variables into different factors. Moreover, the factor analysis made it possible to determine the parameters with the greatest influence on the distribution of samples into different clusters. In our study, after rotation of the factors with the varimax method, the parameters were combined into two factors, which explain about 80% of the total variation: the first explains 61.49% and correlates with the color characteristics, the second explains 18.34% of the variation and correlates with the parameters connected with fluorescence spectroscopy.
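
    The two-step analysis described, Ward-style hierarchical clustering followed by a two-factor reduction of the optical parameters, can be sketched as follows; the data matrix here is a random stand-in, and an unrotated PCA is used where the study applied a varimax rotation.

```python
# Sketch: hierarchical clustering of wines plus a two-factor PCA reduction.
import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# stand-in for the 68 wines x 9 optical parameters matrix
data = StandardScaler().fit_transform(rng.normal(size=(68, 9)))

Z = linkage(data, method="ward")          # hierarchical clustering
dendrogram(Z, no_plot=True)               # plot in practice to read clusters

pca = PCA(n_components=2).fit(data)
print(pca.explained_variance_ratio_)      # variance explained per factor
loadings = pca.components_.T              # parameter weights per factor
                                          # (apply varimax rotation in practice)
```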

  16. Multivariate meta-analysis of prognostic factor studies with multiple cut-points and/or methods of measurement.

    PubMed

    Riley, Richard D; Elia, Eleni G; Malin, Gemma; Hemming, Karla; Price, Malcolm P

    2015-07-30

    A prognostic factor is any measure that is associated with the risk of future health outcomes in those with existing disease. Often, the prognostic ability of a factor is evaluated in multiple studies. However, meta-analysis is difficult because primary studies often use different methods of measurement and/or different cut-points to dichotomise continuous factors into 'high' and 'low' groups; selective reporting is also common. We illustrate how multivariate random effects meta-analysis models can accommodate multiple prognostic effect estimates from the same study, relating to multiple cut-points and/or methods of measurement. The models account for within-study and between-study correlations, which utilises more information and reduces the impact of unreported cut-points and/or measurement methods in some studies. The applicability of the approach is improved with individual participant data and by assuming a functional relationship between prognostic effect and cut-point to reduce the number of unknown parameters. The models provide important inferential results for each cut-point and method of measurement, including the summary prognostic effect, the between-study variance and a 95% prediction interval for the prognostic effect in new populations. Two applications are presented. The first reveals that, in a multivariate meta-analysis using published results, the Apgar score is prognostic of neonatal mortality but effect sizes are smaller at most cut-points than previously thought. In the second, a multivariate meta-analysis of two methods of measurement provides weak evidence that microvessel density is prognostic of mortality in lung cancer, even when individual participant data are available so that a continuous prognostic trend is examined (rather than cut-points). © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  17. Assessment of current AASHTO LRFD methods for static pile capacity analysis in Rhode Island soils.

    DOT National Transportation Integrated Search

    2013-07-01

    This report presents an assessment of current AASHTO LRFD methods for static pile capacity analysis in Rhode Island soils. Current static capacity methods and associated resistance factors are based on pile load test data in sands and clays. Some...

  18. Influencing factors and kinetics analysis on the leaching of iron from boron carbide waste-scrap with ultrasound-assisted method.

    PubMed

    Li, Xin; Xing, Pengfei; Du, Xinghong; Gao, Shuaibo; Chen, Chen

    2017-09-01

    In this paper, the ultrasound-assisted leaching of iron from boron carbide waste-scrap was investigated, and the different influencing factors were optimized. The factors investigated were acid concentration, liquid-solid ratio, leaching temperature, and ultrasonic power and frequency. Leaching of iron with the conventional method at various temperatures was also performed. The results show maximum iron leaching ratios of 87.4% after 80 min of leaching with the conventional method and 94.5% after 50 min of leaching with ultrasound assistance. Leaching of the waste-scrap with the conventional method fits the chemical reaction-controlled model, while leaching with ultrasound assistance fits the chemical reaction-controlled model in the first stage and the diffusion-controlled model in the second stage. Compared with the conventional method, the assistance of ultrasound can greatly improve the iron leaching ratio, accelerate the leaching rate, shorten the leaching time, and lower the residual iron. The advantages of ultrasound-assisted leaching were also confirmed by SEM-EDS analysis and elemental analysis of the raw material and leached solid samples. Copyright © 2017 Elsevier B.V. All rights reserved.
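
    The two kinetic models named above are the standard shrinking-core forms, g(x) = 1 - (1-x)^(1/3) for chemical reaction control and g(x) = 1 - 3(1-x)^(2/3) + 2(1-x) for diffusion control; fitting g(x) against time and comparing R^2 selects the controlling step. The sketch below uses invented conversion data.

```python
# Fit shrinking-core kinetic models to (hypothetical) leaching conversions.
import numpy as np

t = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # leaching time, min (invented)
x = np.array([0.35, 0.58, 0.74, 0.85, 0.92])   # iron conversion (invented)

def fit_through_origin(g, t):
    k = np.sum(g * t) / np.sum(t * t)           # least squares for g = k t
    r2 = 1.0 - np.sum((g - k * t) ** 2) / np.sum((g - g.mean()) ** 2)
    return k, r2

g_chem = 1.0 - (1.0 - x) ** (1.0 / 3.0)                           # reaction control
g_diff = 1.0 - 3.0 * (1.0 - x) ** (2.0 / 3.0) + 2.0 * (1.0 - x)   # diffusion control
for name, g in (("chemical", g_chem), ("diffusion", g_diff)):
    k, r2 = fit_through_origin(g, t)
    print(f"{name}: k = {k:.4f} 1/min, R^2 = {r2:.3f}")
```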

  19. The scalar and electromagnetic form factors of the nucleon in dispersively improved Chiral EFT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alarcon, Jose Manuel

    We present a method for calculating the nucleon form factors of G-parity-even operators. This method combines chiral effective field theory (χEFT) and dispersion theory. Through unitarity we factorize the imaginary part of the form factors into a perturbative part, calculable with χEFT, and a non-perturbative part, obtained through other methods. We consider the scalar and electromagnetic (EM) form factors of the nucleon. The results show a substantial improvement over standard chiral calculations and can be used in analyses of the low-energy properties of the nucleon.

  20. Effects of measurement errors on psychometric measurements in ergonomics studies: Implications for correlations, ANOVA, linear regression, factor analysis, and linear discriminant analysis.

    PubMed

    Liu, Yan; Salvendy, Gavriel

    2009-05-01

    This paper aims to demonstrate the effects of measurement errors on psychometric measurements in ergonomics studies. A variety of sources can cause random measurement errors in ergonomics studies, and these errors can distort virtually every statistic computed and lead investigators to erroneous conclusions. The effects of measurement errors on the five most widely used statistical analysis tools are discussed and illustrated: correlation; ANOVA; linear regression; factor analysis; linear discriminant analysis. It is shown that measurement errors can greatly attenuate correlations between variables, reduce the statistical power of ANOVA, distort (overestimate, underestimate or even change the sign of) regression coefficients, underrate the explanatory contributions of the most important factors in factor analysis, and depreciate the significance of the discriminant function and the discrimination abilities of individual variables in discriminant analysis. The discussion is restricted to subjective scales and survey methods and their reliability estimates. Other methods applied in ergonomics research, such as physical and electrophysiological measurements and chemical and biomedical analysis methods, also have issues of measurement errors, but they are beyond the scope of this paper. As there has been increasing interest in the development and testing of theories in ergonomics research, it has become very important for ergonomics researchers to understand the effects of measurement errors on their experimental results, which the authors believe is critical to progress in theory development and cumulative knowledge in the ergonomics field.
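
    The attenuation effect on correlations has a simple closed form, r_observed = r_true * sqrt(rel_x * rel_y), which a short simulation makes concrete (the reliabilities and true correlation below are arbitrary choices):

```python
# Simulate how random measurement error attenuates an observed correlation.
import numpy as np

rng = np.random.default_rng(4)
n, r_true, rel = 100_000, 0.6, 0.7            # rel: reliability of both scales
x = rng.normal(size=n)
y = r_true * x + np.sqrt(1 - r_true**2) * rng.normal(size=n)

x_obs = np.sqrt(rel) * x + np.sqrt(1 - rel) * rng.normal(size=n)
y_obs = np.sqrt(rel) * y + np.sqrt(1 - rel) * rng.normal(size=n)

print(np.corrcoef(x_obs, y_obs)[0, 1])        # ~ r_true * rel = 0.42
```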

  1. A factor analysis of the SSQ (Speech, Spatial, and Qualities of Hearing Scale)

    PubMed Central

    2014-01-01

    Objective The speech, spatial, and qualities of hearing questionnaire (SSQ) is a self-report test of auditory disability. The 49 items ask how well a listener would do in many complex listening situations illustrative of real life. The scores on the items are often combined into the three main sections or into 10 pragmatic subscales. We report here a factor analysis of the SSQ that we conducted to further investigate its statistical properties and to determine its structure. Design Statistical factor analysis of questionnaire data, using parallel analysis to determine the number of factors to retain, oblique rotation of factors, and a bootstrap method to estimate the confidence intervals. Study sample 1220 people who had attended MRC IHR over the preceding decade. Results We found three clear factors, essentially corresponding to the three main sections of the SSQ. They are termed “speech understanding”, “spatial perception”, and “clarity, separation, and identification”. Thirty-five of the SSQ questions were included in the three factors. There was partial evidence for a fourth factor, “effort and concentration”, representing two more questions. Conclusions These results aid in the interpretation and application of the SSQ and indicate potential methods for generating average scores. PMID:24417459
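
    Parallel analysis, the retention rule used in this study, keeps a factor only if its eigenvalue beats the corresponding eigenvalue from random data of the same dimensions. A minimal version, run here on pure noise of the same 1220 x 49 shape, might look like this (the percentile and iteration count are common but arbitrary choices):

```python
# Minimal parallel analysis: compare sample eigenvalues to random-data ones.
import numpy as np

def parallel_analysis(data, n_iter=200, percentile=95, seed=0):
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        r = rng.normal(size=(n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    return int(np.sum(obs > np.percentile(rand, percentile, axis=0)))

demo = np.random.default_rng(5).normal(size=(1220, 49))  # noise stand-in
print(parallel_analysis(demo))                            # ~0 factors retained
```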

  2. Sufficient Forecasting Using Factor Models

    PubMed Central

    Fan, Jianqing; Xue, Lingzhou; Yao, Jiawei

    2017-01-01

    We consider forecasting a single time series when there is a large number of predictors and a possible nonlinear effect. The dimensionality is first reduced via a high-dimensional (approximate) factor model implemented by principal component analysis. Using the extracted factors, we develop a novel forecasting method called the sufficient forecasting, which provides a set of sufficient predictive indices, inferred from high-dimensional predictors, to deliver additional predictive power. The projected principal component analysis is employed to enhance the accuracy of inferred factors when a semi-parametric (approximate) factor model is assumed. Our method is also applicable to cross-sectional sufficient regression using extracted factors. The connection between the sufficient forecasting and the deep learning architecture is explicitly stated. The sufficient forecasting correctly estimates projection indices of the underlying factors even in the presence of a nonparametric forecasting function. The proposed method extends the sufficient dimension reduction to high-dimensional regimes by condensing the cross-sectional information through factor models. We derive asymptotic properties for the estimate of the central subspace spanned by these projection directions as well as the estimates of the sufficient predictive indices. We further show that the natural method of running multiple regression of target on estimated factors yields a linear estimate that actually falls into this central subspace. Our method and theory allow the number of predictors to be larger than the number of observations. We finally demonstrate that the sufficient forecasting improves upon the linear forecasting in both simulation studies and an empirical study of forecasting macroeconomic variables. PMID:29731537
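
    The first stage of the procedure, extracting factors from a large predictor panel by PCA and regressing the target on the estimated factors, reduces in the linear special case to the sketch below; the full sufficient forecasting additionally estimates projection indices of the factors, which is not shown here, and all data are synthetic.

```python
# Sketch: factor extraction via PCA plus a one-step-ahead factor regression.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
T, p, k = 200, 500, 3                          # more predictors than observations
f = rng.normal(size=(T, k))                    # latent factors
X = f @ rng.normal(size=(k, p)) + rng.normal(size=(T, p))
y = np.sin(f[:, 0]) + 0.5 * f[:, 1] + 0.1 * rng.normal(size=T)

f_hat = PCA(n_components=k).fit_transform(X)   # estimated factors
model = LinearRegression().fit(f_hat[:-1], y[1:])   # forecast next-period y
print(model.score(f_hat[:-1], y[1:]))
```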

  3. A symmetrical subtraction combined with interpolated values for eliminating scattering from fluorescence EEM data.

    PubMed

    Xu, Jing; Liu, Xiaofei; Wang, Yutian

    2016-08-05

    Parallel factor analysis is a widely used method to extract qualitative and quantitative information about the analyte of interest from a fluorescence emission-excitation matrix containing unknown components. Large-amplitude scattering influences the results of parallel factor analysis, and many methods for eliminating scattering have been proposed, each with its own advantages and disadvantages. The combination of symmetrical subtraction and interpolated values is discussed here; the combination refers both to the combination of results and to the combination of methods. Nine methods were used for comparison. The results show that the combination of results gives a better concentration prediction for all the components. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    NASA Astrophysics Data System (ADS)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006--2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction of 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize this gap in best practices and subsequently to promote instrument development research that is more consistent through the peer-review process.

  5. Source apportionment of PAH in Hamilton Harbour suspended sediments: comparison of two factor analysis methods.

    PubMed

    Sofowote, Uwayemi M; McCarry, Brian E; Marvin, Christopher H

    2008-08-15

    A total of 26 suspended sediment samples collected over a 5-year period in Hamilton Harbour, Ontario, Canada and surrounding creeks were analyzed for a suite of polycyclic aromatic hydrocarbons and sulfur heterocycles. Hamilton Harbour sediments contain relatively high levels of polycyclic aromatic compounds and heavy metals due to emissions from industrial and mobile sources. Two receptor modeling methods using factor analyses were compared to determine the profiles and relative contributions of pollution sources to the harbor; these methods are principal component analyses (PCA) with multiple linear regression analysis (MLR) and positive matrix factorization (PMF). Both methods identified four factors and gave excellent correlation coefficients between predicted and measured levels of 25 aromatic compounds; both methods predicted similar contributions from coal tar/coal combustion sources to the harbor (19 and 26%, respectively). One PCA factor was identified as contributions from vehicular emissions (61%); PMF was able to differentiate vehicular emissions into two factors, one attributed to gasoline emissions sources (28%) and the other to diesel emissions sources (24%). Overall, PMF afforded better source identification than PCA with MLR. This work constitutes one of the few examples of the application of PMF to the source apportionment of sediments; the addition of sulfur heterocycles to the analyte list greatly aided in the source identification process.

  6. Loss Factor Estimation Using the Impulse Response Decay Method on a Stiffened Structure

    NASA Technical Reports Server (NTRS)

    Cabell, Randolph; Schiller, Noah; Allen, Albert; Moeller, Mark

    2009-01-01

    High-frequency vibroacoustic modeling is typically performed using energy-based techniques such as Statistical Energy Analysis (SEA). Energy models require an estimate of the internal damping loss factor. Unfortunately, the loss factor is difficult to estimate analytically, and experimental methods such as the power injection method can require extensive measurements over the structure of interest. This paper discusses the implications of estimating damping loss factors using the impulse response decay method (IRDM) from a limited set of response measurements. An automated procedure for implementing IRDM is described and then evaluated using data from a finite element model of a stiffened, curved panel. Estimated loss factors are compared with loss factors computed using a power injection method and a manual curve fit. The paper discusses the sensitivity of the IRDM loss factor estimates to damping of connected subsystems and the number and location of points in the measurement ensemble.
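
    In its simplest form, the impulse response decay method band-filters an impulse response, takes the dB envelope, fits the decay rate DR (dB/s), and converts it to a loss factor via eta = DR / (27.3 f_c). The sketch below applies that recipe to a synthetic single-mode response with a known loss factor; the filter order, band edges, and fit window are arbitrary choices.

```python
# Estimate a damping loss factor from an impulse response decay (IRDM).
import numpy as np
from scipy.signal import butter, hilbert, sosfiltfilt

fs, fc, eta_true = 8192, 500.0, 0.02
t = np.arange(0, 1.0, 1 / fs)
h = np.exp(-np.pi * fc * eta_true * t) * np.sin(2 * np.pi * fc * t)

band = [fc / 2 ** (1 / 6), fc * 2 ** (1 / 6)]        # third-octave band
sos = butter(4, band, btype="band", fs=fs, output="sos")
env_db = 20 * np.log10(np.abs(hilbert(sosfiltfilt(sos, h))) + 1e-12)

i0, i1 = int(0.05 * fs), int(0.40 * fs)              # fit the linear decay
decay_rate = -np.polyfit(t[i0:i1], env_db[i0:i1], 1)[0]   # dB/s
print(decay_rate / (27.3 * fc))                      # recovered eta ~ 0.02
```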

  7. Franck-Condon Factors for Diatomics: Insights and Analysis Using the Fourier Grid Hamiltonian Method

    ERIC Educational Resources Information Center

    Ghosh, Supriya; Dixit, Mayank Kumar; Bhattacharyya, S. P.; Tembe, B. L.

    2013-01-01

    Franck-Condon factors (FCFs) play a crucial role in determining the intensities of the vibrational bands in electronic transitions. In this article, a relatively simple method to calculate the FCFs is illustrated. An algorithm for the Fourier Grid Hamiltonian (FGH) method for computing the vibrational wave functions and the corresponding energy…

  8. ADHD and Method Variance: A Latent Variable Approach Applied to a Nationally Representative Sample of College Freshmen

    ERIC Educational Resources Information Center

    Konold, Timothy R.; Glutting, Joseph J.

    2008-01-01

    This study employed a correlated trait-correlated method application of confirmatory factor analysis to disentangle trait and method variance from measures of attention-deficit/hyperactivity disorder obtained at the college level. The two trait factors were "Diagnostic and Statistical Manual of Mental Disorders-Fourth Edition" ("DSM-IV")…

  9. Using Module Analysis for Multiple Choice Responses: A New Method Applied to Force Concept Inventory Data

    ERIC Educational Resources Information Center

    Brewe, Eric; Bruun, Jesper; Bearden, Ian G.

    2016-01-01

    We describe "Module Analysis for Multiple Choice Responses" (MAMCR), a new methodology for carrying out network analysis on responses to multiple choice assessments. This method is used to identify modules of non-normative responses which can then be interpreted as an alternative to factor analysis. MAMCR allows us to identify conceptual…

  10. Calibrated Bayes Factors Should Not Be Used: A Reply to Hoijtink, van Kooten, and Hulsker.

    PubMed

    Morey, Richard D; Wagenmakers, Eric-Jan; Rouder, Jeffrey N

    2016-01-01

    Hoijtink, van Kooten, and Hulsker (2016) present a method for choosing the prior distribution for an analysis with Bayes factors that is based on controlling error rates, which they advocate as an alternative to our more subjective methods (Morey & Rouder, 2014; Rouder, Speckman, Sun, Morey, & Iverson, 2009; Wagenmakers, Wetzels, Borsboom, & van der Maas, 2011). We show that the method they advocate amounts to a simple significance test and that the resulting Bayes factors are not interpretable. Additionally, their method fails in common circumstances and has the potential to yield arbitrarily high Type II error rates. After critiquing their method, we outline the position on subjectivity that underlies our advocacy of Bayes factors.

  11. A Qualitative Study on Organizational Factors Affecting Occupational Accidents

    PubMed Central

    ESKANDARI, Davood; JAFARI, Mohammad Javad; MEHRABI, Yadollah; KIAN, Mostafa Pouya; CHARKHAND, Hossein; MIRGHOTBI, Mostafa

    2017-01-01

    Background: Technical, human, operational, and organizational factors influence the sequence of events leading to occupational accidents. Among them, organizational factors play a major role in causing occupational accidents. The aim of this research was to understand Iranian safety experts’ experiences and perceptions of organizational factors. Methods: This qualitative study was conducted in 2015 using the content analysis technique. Data were collected through semi-structured interviews with 17 safety experts working in Iranian universities and industries and analyzed with a conventional qualitative content analysis method using the MAXQDA software. Results: Eleven organizational-factor sub-themes were identified: management commitment, management participation, employee involvement, communication, blame culture, education and training, job satisfaction, interpersonal relationship, supervision, continuous improvement, and reward system. The participants considered these factors as effective on occupational accidents. Conclusion: The mentioned 11 organizational factors are probably involved in occupational accidents in Iran. Naturally, improving organizational factors can increase safety performance and reduce occupational accidents. PMID:28435824

  12. Risk Factors of Falls in Community-Dwelling Older Adults: Logistic Regression Tree Analysis

    ERIC Educational Resources Information Center

    Yamashita, Takashi; Noe, Douglas A.; Bailer, A. John

    2012-01-01

    Purpose of the Study: A novel logistic regression tree-based method was applied to identify fall risk factors and possible interaction effects of those risk factors. Design and Methods: A nationally representative sample of American older adults aged 65 years and older (N = 9,592) in the Health and Retirement Study 2004 and 2006 modules was used.…

  13. Guidelines for Analysis of Socio-Cultural Factors in Health. Volume 4: Socio-Cultural Factors in Health Planning. International Health Planning Methods Series.

    ERIC Educational Resources Information Center

    Fraser, Renee White

    Intended to assist Agency for International Development (AID) officers, advisors, and health officials in incorporating health planning into national plans for economic development, this fourth of ten manuals in the International Health Planning Methods Series deals with sociocultural, psychological, and behavioral factors that affect the planning…

  14. Triangular covariance factorizations for Kalman filtering. Ph.D. Thesis. - Calif. Univ.

    NASA Technical Reports Server (NTRS)

    Thornton, C. L.

    1976-01-01

    An improved computational form of the discrete Kalman filter is derived using an upper triangular factorization of the error covariance matrix. The covariance P is factored such that P = UDU^T, where U is unit upper triangular and D is diagonal. Recursions are developed for propagating the U-D covariance factors together with the corresponding state estimate. The resulting algorithm, referred to as the U-D filter, combines the superior numerical precision of square root filtering techniques with an efficiency comparable to that of Kalman's original formulation. Moreover, this method is easily implemented and involves no more computer storage than the Kalman algorithm. These characteristics make the U-D method an attractive real-time filtering technique. A new covariance error analysis technique is obtained from an extension of the U-D filter equations. This evaluation method is flexible and efficient and may provide significantly improved numerical results. Cost comparisons show that for a large class of problems the U-D evaluation algorithm is noticeably less expensive than conventional error analysis methods.
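
    The factorization at the heart of the filter is easy to state on its own: any symmetric positive-definite P can be written P = UDU^T with U unit upper triangular and D diagonal. A minimal sketch of that decomposition (just the factorization, not the full U-D measurement and time updates) follows.

```python
# UDU^T factorization of a symmetric positive-definite covariance matrix.
import numpy as np

def udu(P):
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    U, d = np.eye(n), np.zeros(n)
    for j in range(n - 1, -1, -1):               # work from the last column back
        d[j] = P[j, j] - np.sum(U[j, j+1:] ** 2 * d[j+1:])
        for i in range(j):
            U[i, j] = (P[i, j] - np.sum(U[i, j+1:] * d[j+1:] * U[j, j+1:])) / d[j]
    return U, d

rng = np.random.default_rng(7)
A = rng.normal(size=(4, 4))
P = A @ A.T + 4 * np.eye(4)                      # a random SPD covariance
U, d = udu(P)
print(np.allclose(U @ np.diag(d) @ U.T, P))      # True
```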

  15. Exploring the Factor Structure of Neurocognitive Measures in Older Individuals

    PubMed Central

    Santos, Nadine Correia; Costa, Patrício Soares; Amorim, Liliana; Moreira, Pedro Silva; Cunha, Pedro; Cotter, Jorge; Sousa, Nuno

    2015-01-01

    Here we focus on factor analysis from a best practices point of view, by investigating the factor structure of neuropsychological tests and using the results obtained to illustrate on choosing a reasonable solution. The sample (n=1051 individuals) was randomly divided into two groups: one for exploratory factor analysis (EFA) and principal component analysis (PCA), to investigate the number of factors underlying the neurocognitive variables; the second to test the “best fit” model via confirmatory factor analysis (CFA). For the exploratory step, three extraction (maximum likelihood, principal axis factoring and principal components) and two rotation (orthogonal and oblique) methods were used. The analysis methodology allowed exploring how different cognitive/psychological tests correlated/discriminated between dimensions, indicating that to capture latent structures in similar sample sizes and measures, with approximately normal data distribution, reflective models with oblimin rotation might prove the most adequate. PMID:25880732

  16. The Nurturant Fathering Scale: A Confirmatory Factor Analysis with an African American Sample of College Students

    ERIC Educational Resources Information Center

    Doyle, Otima; Pecukonis, Edward; Harrington, Donna

    2011-01-01

    Objective: The objective of this study was to test the factor structure of the "Nurturant Fathering Scale" (NFS) among an African American sample in the mid-Atlantic region that have neither Caribbean heritage nor immigration experiences but who do have diverse family structures (N = 212). Method: A confirmatory factor analysis (CFA) was conducted…

  17. Common factor analysis versus principal component analysis: choice for symptom cluster research.

    PubMed

    Kim, Hee-Ju

    2008-03-01

    The purpose of this paper is to examine differences between two factor analytical methods and their relevance for symptom cluster research: common factor analysis (CFA) versus principal component analysis (PCA). Literature was critically reviewed to elucidate the differences between CFA and PCA. A secondary analysis (N = 84) was utilized to show the actual result differences from the two methods. CFA analyzes only the reliable common variance of data, while PCA analyzes all the variance of data. An underlying hypothetical process or construct is involved in CFA but not in PCA. PCA tends to increase factor loadings especially in a study with a small number of variables and/or low estimated communality. Thus, PCA is not appropriate for examining the structure of data. If the study purpose is to explain correlations among variables and to examine the structure of the data (this is usual for most cases in symptom cluster research), CFA provides a more accurate result. If the purpose of a study is to summarize data with a smaller number of variables, PCA is the choice. PCA can also be used as an initial step in CFA because it provides information regarding the maximum number and nature of factors. In using factor analysis for symptom cluster research, several issues need to be considered, including subjectivity of solution, sample size, symptom selection, and level of measure.
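
    The loading-inflation point is easy to demonstrate: fit both a common factor model and PCA to the same low-communality data and compare loadings. The sketch below uses sklearn's FactorAnalysis and PCA on a synthetic one-factor symptom set (all numbers invented):

```python
# Common factor analysis vs. PCA loadings on low-communality data.
import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis

rng = np.random.default_rng(8)
f = rng.normal(size=(300, 1))                      # one common "symptom" factor
X = f @ np.array([[0.6, 0.6, 0.5, 0.5]]) + 0.8 * rng.normal(size=(300, 4))
X = (X - X.mean(0)) / X.std(0)                     # standardize items

fa = FactorAnalysis(n_components=1).fit(X)
pca = PCA(n_components=1).fit(X)
pca_load = pca.components_ * np.sqrt(pca.explained_variance_[:, None])
print("FA loadings: ", fa.components_.round(2))
print("PCA loadings:", pca_load.round(2))          # noticeably larger
```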

  18. Insight into dementia care management using social-behavioral theory and mixed methods.

    PubMed

    Connor, Karen; McNeese-Smith, Donna; van Servellen, Gwen; Chang, Betty; Lee, Martin; Cheng, Eric; Hajar, Abdulrahman; Vickrey, Barbara G

    2009-01-01

    For health organizations (private and public) to advance their care-management programs, to use resources effectively and efficiently, and to improve patient outcomes, it is germane to isolate and quantify care-management activities and to identify overarching domains. The aims of this study were to identify and report on an application of mixed methods of qualitative statistical techniques, based on a theoretical framework, and to construct variables for factor analysis and exploratory factor analytic steps for identifying domains of dementia care management. Care-management activity data were extracted from the care plans of 181 pairs of individuals (with dementia and their informal caregivers) who had participated in the intervention arm of a randomized controlled trial of a dementia care-management program. Activities were organized into types, using card-sorting methods, influenced by published theoretical constructs on self-efficacy and general strain theory. These activity types were mapped in the initial data set to construct variables for exploratory factor analysis. Principal components extraction with varimax and promax rotations was used to estimate the number of factors. Cronbach's alpha was calculated for the items in each factor to assess internal consistency reliability. The two-phase card-sorting technique yielded 45 activity types out of 450 unique activities. Exploratory factor analysis produced four care-management domains (factors): behavior management, clinical strategies and caregiver support, community agency, and safety. Internal consistency reliability (Cronbach's alpha) of items for each factor ranged from .63 for the factor "safety" to .89 for the factor "behavior management" (Factor 1). Applying a systematic method to a large set of care-management activities can identify a parsimonious number of higher order categories of variables and factors to guide the understanding of dementia care-management processes. Further application of this methodology in outcome analyses and to other data sets is necessary to test its practicality.
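
    The internal-consistency check reported for each factor is the usual Cronbach's alpha, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A small helper, run on invented item data shaped like the study's 181 care plans, is below.

```python
# Cronbach's alpha for an (n_respondents x n_items) score matrix.
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(9)
latent = rng.normal(size=(181, 1))                   # one underlying domain
items = latent + 0.8 * rng.normal(size=(181, 5))     # five related items
print(round(cronbach_alpha(items), 2))               # ~0.89 for these data
```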

  19. [Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].

    PubMed

    Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie

    2013-11-01

    In order to improve the accuracy of AES quantitative analysis, we combined XPS with AES and studied how to reduce the error of AES quantitative analysis. Pt-Co, Cu-Au, and Cu-Ag binary alloy thin films were selected as samples, and XPS was used to correct the AES quantitative results by adjusting the Auger relative sensitivity factors so that the two methods gave closely matching results. We then verified the accuracy of AES quantitative analysis using the revised sensitivity factors on other samples with different composition ratios; the results showed that the corrected relative sensitivity factors reduce the error in AES quantitative analysis to less than 10%. Peak definition is difficult in integral-spectrum AES analysis because choosing the starting and ending points when determining the characteristic Auger peak intensity area involves great uncertainty. To make the analysis easier, we also processed the data as differential spectra, performed quantitative analysis on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors, and verified the accuracy of quantitative analysis with the other samples of different composition ratios. The result showed that the analytical error in AES quantitative analysis was reduced to less than 9%. This shows that the accuracy of AES quantitative analysis can be greatly improved by associating XPS with AES to correct the Auger sensitivity factors, since matrix effects are taken into account. The good consistency obtained demonstrates the feasibility of this method.
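
    The quantification these corrections feed into is the standard relative-sensitivity-factor formula, C_i = (I_i / S_i) / sum_j (I_j / S_j); the intensities and corrected factors below are placeholders, not values from the paper.

```python
# Atomic fractions from peak intensities and relative sensitivity factors.
def atomic_fractions(intensities, sensitivity_factors):
    corrected = [i / s for i, s in zip(intensities, sensitivity_factors)]
    total = sum(corrected)
    return [c / total for c in corrected]

# hypothetical Pt-Co film: two peak intensities, two corrected factors
print(atomic_fractions([1200.0, 800.0], [1.9, 1.1]))
```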

  20. Calculating the nutrient composition of recipes with computers.

    PubMed

    Powers, P M; Hoover, L W

    1989-02-01

    The objective of this research project was to compare the nutrient values computed by four commonly used computerized recipe calculation methods. The four methods compared were the yield factor, retention factor, summing, and simplified retention factor methods. Two versions of the summing method were modeled. Four pork entrée recipes were selected for analysis: roast pork, pork and noodle casserole, pan-broiled pork chops, and pork chops with vegetables. Assumptions were made about changes expected to occur in the ingredients during preparation and cooking. Models were designed to simulate the algorithms of the calculation methods using a microcomputer spreadsheet software package. Identical results were generated in the yield factor, retention factor, and summing-cooked models for roast pork. The retention factor and summing-cooked models also produced identical results for the recipe for pan-broiled pork chops. The summing-raw model gave the highest value for water in all four recipes and the lowest values for most of the other nutrients. A superior method or methods was not identified. However, on the basis of the capabilities provided with the yield factor and retention factor methods, more serious consideration of these two methods is recommended.
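
    A toy calculation makes the difference between two of the compared approaches concrete: the yield factor method adjusts the dish weight and uses cooked-food nutrient data, while the retention factor method starts from raw-ingredient data and applies a nutrient retention fraction. All numbers below are invented for illustration.

```python
# Yield-factor vs. retention-factor nutrient calculation (invented numbers).
raw_weight_g = 500.0
vit_c_per_100g_raw = 10.0        # mg/100 g raw ingredient
yield_factor = 0.75              # cooked weight = 75% of raw weight
retention = 0.70                 # 70% of vitamin C survives cooking
vit_c_per_100g_cooked = 9.3      # from a cooked-food database entry

cooked_weight_g = raw_weight_g * yield_factor

# yield-factor method: cooked weight x cooked-food composition data
print(cooked_weight_g / 100.0 * vit_c_per_100g_cooked)        # ~34.9 mg per dish

# retention-factor method: raw nutrient total x retention fraction
print(raw_weight_g / 100.0 * vit_c_per_100g_raw * retention)  # 35.0 mg per dish
```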

  1. Testing all six person-oriented principles in dynamic factor analysis.

    PubMed

    Molenaar, Peter C M

    2010-05-01

    All six person-oriented principles identified by Sterba and Bauer's Keynote Article can be tested by means of dynamic factor analysis in its current form. In particular, it is shown how complex interactions and interindividual differences/intraindividual change can be tested in this way. In addition, the necessity to use single-subject methods in the analysis of developmental processes is emphasized, and attention is drawn to the possibility of optimally treating developmental psychopathology by means of new computational techniques that can be integrated with dynamic factor analysis.

  2. Multiple Statistical Models Based Analysis of Causative Factors and Loess Landslides in Tianshui City, China

    NASA Astrophysics Data System (ADS)

    Su, Xing; Meng, Xingmin; Ye, Weilin; Wu, Weijiang; Liu, Xingrong; Wei, Wanhong

    2018-03-01

    Tianshui City is one of the mountainous cities threatened by severe geo-hazards in Gansu Province, China. Statistical probability models have been widely used in analyzing and evaluating geo-hazards such as landslides. In this research, three approaches (the Certainty Factor Method, the Weight of Evidence Method, and the Information Quantity Method) were adopted to quantitatively analyze the relationship between the causative factors and the landslides. The source data used in this study include the SRTM DEM and local geological maps at a scale of 1:200,000. 12 causative factors (altitude, slope, aspect, curvature, plan curvature, profile curvature, roughness, relief amplitude, distance to rivers, distance to faults, distance to roads, and stratum lithology) were selected for correlation analysis after a thorough investigation of the geological conditions and historical landslides. The results indicate that the outcomes of the three models are fairly consistent.
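
    Two of the bivariate statistics used here reduce to simple log-ratios computed per factor class from cell counts: the information value compares landslide density in a class with the overall density, and the weight-of-evidence contrast W+ compares how often the class hosts landslide cells versus non-landslide cells. The counts below are invented stand-ins for one factor class.

```python
# Information value and weight of evidence for one causative-factor class.
import numpy as np

area_cells, slide_cells = 1_000_000, 2_000       # whole study area (invented)
class_cells, class_slides = 50_000, 400          # one slope class (invented)

iv = np.log((class_slides / class_cells) / (slide_cells / area_cells))
w_plus = np.log((class_slides / slide_cells) /
                ((class_cells - class_slides) / (area_cells - slide_cells)))
print(round(iv, 3), round(w_plus, 3))            # positive: class favours slides
```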

  3. Being Single as a Social Barrier to Access Reproductive Healthcare Services by Iranian Girls

    PubMed Central

    Kohan, Shahnaz; Mohammadi, Fatemeh; Mostafavi, Firoozeh; Gholami, Ali

    2017-01-01

    Background: Iranian single women are deprived of reproductive healthcare services, though the provision of such services to the public has increased. This study aimed to explore the experiences of Iranian single women on their access to reproductive health services. Methods: A qualitative design using a conventional content analysis method was used. Semi-structured interviews were held with 17 single women and nine health providers chosen using the purposive sampling method. Results: Data analysis resulted in the development of three categories: ‘family’s attitudes and performance about single women’s reproductive healthcare,’ ‘socio-cultural factors influencing reproductive healthcare,’ and ‘cultural factors influencing being a single woman.’ Conclusion: Cultural and contextual factors affect being a single woman in every society. Therefore, healthcare providers need to identify such factors during the designing of strategies for improving the facilitation of access to reproductive healthcare services. PMID:28812794

  4. Analysis of Risk Factors for Postoperative Morbidity in Perforated Peptic Ulcer

    PubMed Central

    Kim, Jae-Myung; Jeong, Sang-Ho; Park, Soon-Tae; Choi, Sang-Kyung; Hong, Soon-Chan; Jung, Eun-Jung; Ju, Young-Tae; Jeong, Chi-Young; Ha, Woo-Song

    2012-01-01

    Purpose Emergency operations for perforated peptic ulcer are associated with a high incidence of postoperative complications. While several studies have investigated the impact of perioperative risk factors and underlying diseases on the postoperative morbidity after abdominal surgery, only a few have analyzed their role in perforated peptic ulcer disease. The purpose of this study was to determine any possible associations between postoperative morbidity and comorbid disease or perioperative risk factors in perforated peptic ulcer. Materials and Methods In total, 142 consecutive patients, who underwent surgery for perforated peptic ulcer, at a single institution, between January 2005 and October 2010 were included in this study. The clinical data concerning the patient characteristics, operative methods, and complications were collected retrospectively. Results The postoperative morbidity rate associated with perforated peptic ulcer operations was 36.6% (52/142). Univariate analysis revealed that a long operating time, the open surgical method, age (≥60), sex (female), high American Society of Anesthesiologists (ASA) score and presence of preoperative shock were significant perioperative risk factors for postoperative morbidity. Significant comorbid risk factors included hypertension, diabetes mellitus and pulmonary disease. Multivariate analysis revealed a long operating time, the open surgical method, high ASA score and the presence of preoperative shock were all independent risk factors for the postoperative morbidity in perforated peptic ulcer. Conclusions A high ASA score, preoperative shock, open surgery and long operating time of more than 150 minutes are high risk factors for morbidity. However, there is no association between postoperative morbidity and comorbid disease in patients with a perforated peptic ulcer. PMID:22500261

  5. A Dimensional Analysis of College Student Satisfaction.

    ERIC Educational Resources Information Center

    Betz, Ellen L.; And Others

    Further research on the College Student Satisfaction Questionnaire (CSSQ) is reported herein (see TM 000 049). Item responses of two groups of university students were separately analyzed by three different factor analytic methods. Three factors consistently appeared across groups and methods: Compensation, Social Life, and Working Conditions. Two…

  6. Bayesian Exploratory Factor Analysis

    PubMed Central

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  7. Combining task analysis and fault tree analysis for accident and incident analysis: a case study from Bulgaria.

    PubMed

    Doytchev, Doytchin E; Szwillus, Gerd

    2009-11-01

    Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand the human behaviour in incident occurrence we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter analyses the way people perform the tasks in their work environment and how they interact with machines or colleagues. These methods were complemented with the use of the Human Error Identification in System Tools (HEIST) methodology and the concept of Performance Shaping Factors (PSF) to deepen the insight into the error modes of an operator's behaviour. HEIST shows the external error modes that caused the human error and the factors that prompted the human to err. To show the validity of the approach, a case study at a Bulgarian hydropower plant was carried out. An incident - the flooding of the plant's basement - was analysed by combining the aforementioned methods. The case study shows that Task Analysis in combination with other methods can be applied successfully to human error analysis, revealing details about erroneous actions in a realistic situation.

  8. Philosophy of ATHEANA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bley, D.C.; Cooper, S.E.; Forester, J.A.

    ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents identified through retrospective analysis of serious operational events to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.

  9. Automated processing of first-pass radionuclide angiocardiography by factor analysis of dynamic structures.

    PubMed

    Cavailloles, F; Bazin, J P; Capderou, A; Valette, H; Herbert, J L; Di Paola, R

    1987-05-01

    A method for automatic processing of cardiac first-pass radionuclide studies is presented. This technique, factor analysis of dynamic structures (FADS), provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. FADS has been applied to 76 studies. A description of factor patterns obtained in various pathological categories is presented. FADS provides easy diagnosis of shunts and tricuspid insufficiency. Quantitative information derived from the factors (cardiac output and mean transit time) was compared to that obtained by the region-of-interest method. Using FADS, a higher correlation with cardiac catheterization was found for cardiac output calculation. Thus, compared to the ROI method, FADS presents obvious advantages: a good separation of overlapping cardiac chambers is obtained, and this operator-independent method provides more objective and reproducible results. A number of parameters of the cardio-pulmonary function can be assessed by first-pass radionuclide angiocardiography (RNA) [1,2]. Usually, they are calculated using time-activity curves (TAC) from regions of interest (ROI) drawn on the cardiac chambers and the lungs. This method has two main drawbacks: (1) the lack of inter- and intra-observer reproducibility; (2) the problem of crosstalk, which affects the evaluation of the cardio-pulmonary performance. The crosstalk on planar imaging is due to anatomical superimposition of the cardiac chambers and lungs. The activity measured in any ROI is the sum of the activity in several organs, and 'decontamination' of the TAC cannot easily be performed using the ROI method [3]. Factor analysis of dynamic structures (FADS) [4,5] can solve the two problems mentioned above. It provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. The resulting factors are estimates of the time evolution of the activity in each structure (underlying physiological components), and the associated factor images are estimates of the spatial distribution of each factor. The aim of this study was to assess the reliability of FADS in first-pass RNA and compare the results to those obtained by the ROI method, which is generally considered the routine procedure.
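
    FADS itself estimates oblique, positivity-constrained factors; as a rough conceptual stand-in, non-negative matrix factorization can likewise split a dynamic frame sequence into time-activity curves and factor images, as this synthetic two-structure sketch shows (an analogy, not the published algorithm).

```python
# NMF as a FADS-like decomposition of a synthetic dynamic study.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(10)
t = np.linspace(0.0, 30.0, 60)                        # 60 frames
tac1 = np.exp(-((t - 8.0) / 3.0) ** 2)                # early-chamber bolus
tac2 = np.exp(-((t - 14.0) / 4.0) ** 2)               # later-chamber bolus
img1, img2 = rng.random(400), rng.random(400)         # overlapping 20x20 maps
frames = np.outer(tac1, img1) + np.outer(tac2, img2)  # (time, pixels)
frames += 0.01 * rng.random(frames.shape)

nmf = NMF(n_components=2, init="nndsvd", max_iter=500)
tacs = nmf.fit_transform(frames)      # estimated factor curves (time x 2)
factor_images = nmf.components_       # estimated spatial maps (2 x pixels)
```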

  10. A neuro-data envelopment analysis approach for optimization of uncorrelated multiple response problems with smaller the better type controllable factors

    NASA Astrophysics Data System (ADS)

    Bashiri, Mahdi; Farshbaf-Geranmayeh, Amir; Mogouie, Hamed

    2013-11-01

    In this paper, a new method is proposed for solving multi-response optimization problems based on the Taguchi method for processes where the controllable factors are smaller-the-better (STB)-type variables and the analyst seeks an optimal solution that uses smaller amounts of the controllable factors. In such processes, the overall output quality of the product should be maximized while the usage of the process inputs, the controllable factors, should be minimized. Since not all possible combinations of factor levels are considered in the Taguchi method, the response values of the unpracticed treatments are estimated using an artificial neural network (ANN). The neural network is tuned by the central composite design (CCD) and the genetic algorithm (GA). Then data envelopment analysis (DEA) is applied to determine the efficiency of each treatment. Although the philosophy of DEA, maximization of outputs versus minimization of inputs, is central to its implementation, this issue has been neglected in previous similar studies of multi-response problems. Finally, the most efficient treatment is determined using the maximin weight model approach. The performance of the proposed method is verified on a plastic molding process. Moreover, a sensitivity analysis was performed with an efficiency-estimator neural network. The results show the efficiency of the proposed approach.
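
    The DEA building block used here can be sketched as an input-oriented CCR linear program: each treatment (DMU) gets an efficiency score theta by asking how far its inputs could be scaled down while a convex combination of all DMUs still matches its outputs. The data below are invented stand-ins for factor usages and responses.

```python
# Input-oriented CCR DEA efficiency per DMU via scipy's linprog.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 2.0, 3.0, 5.0],       # input 1 per DMU (invented)
              [3.0, 1.0, 4.0, 2.0]])      # input 2 per DMU
Y = np.array([[60.0, 40.0, 55.0, 50.0]])  # one output per DMU

def ccr_efficiency(j0):
    n = X.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                         # minimize theta
    A_ub = np.vstack([
        np.hstack([-X[:, [j0]], X]),                   # X @ lam <= theta * x0
        np.hstack([np.zeros((Y.shape[0], 1)), -Y]),    # Y @ lam >= y0
    ])
    b_ub = np.concatenate([np.zeros(X.shape[0]), -Y[:, j0]])
    bounds = [(None, None)] + [(0, None)] * n
    return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).x[0]

print([round(ccr_efficiency(j), 3) for j in range(X.shape[1])])
```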

  11. Underlying risk factors for prescribing errors in long-term aged care: a qualitative study.

    PubMed

    Tariq, Amina; Georgiou, Andrew; Raban, Magdalena; Baysari, Melissa Therese; Westbrook, Johanna

    2016-09-01

    To identify system-related risk factors perceived to contribute to prescribing errors in Australian long-term care settings, that is, residential aged care facilities (RACFs). The study used qualitative methods to explore factors that contribute to unsafe prescribing in RACFs. Data were collected at three RACFs in metropolitan Sydney, Australia between May and November 2011. Participants included RACF managers, doctors, pharmacists and RACF staff actively involved in prescribing-related processes. Methods included non-participant observations (74 h), in-depth semistructured interviews (n=25) and artefact analysis. Detailed process activity models were developed for observed prescribing episodes supplemented by triangulated analysis using content analysis methods. System-related factors perceived to increase the risk of prescribing errors in RACFs were classified into three overarching themes: communication systems, team coordination and staff management. Factors associated with communication systems included limited point-of-care access to information, inadequate handovers, information storage across different media (paper, electronic and memory), poor legibility of charts, information double handling, multiple faxing of medication charts and reliance on manual chart reviews. Team factors included lack of established lines of responsibility, inadequate team communication and limited participation of doctors in multidisciplinary initiatives like medication advisory committee meetings. Factors related to staff management and workload included doctors' time constraints and their accessibility, lack of trained RACF staff and high RACF staff turnover. The study highlights several system-related factors including laborious methods for exchanging medication information, which often act together to contribute to prescribing errors. Multiple interventions (eg, technology systems, team communication protocols) are required to support the collaborative nature of RACF prescribing. Published by the BMJ Publishing Group Limited.

  12. Comparison of point-of-care methods for preparation of platelet concentrate (platelet-rich plasma).

    PubMed

    Weibrich, Gernot; Kleis, Wilfried K G; Streckbein, Philipp; Moergel, Maximilian; Hitzler, Walter E; Hafner, Gerd

    2012-01-01

    This study analyzed the concentrations of platelets and growth factors in platelet-rich plasma (PRP), which are likely to depend on the method used for its production. The cellular composition and growth factor content of platelet concentrates (platelet-rich plasma) produced by six different procedures were quantitatively analyzed and compared. Platelet and leukocyte counts were determined on an automatic cell counter, and analysis of growth factors was performed using enzyme-linked immunosorbent assay. The principal differences between the analyzed PRP production methods (the blood bank method of an intermittent-flow centrifuge system/platelet apheresis and the five point-of-care methods) and the resulting platelet concentrates were evaluated with regard to the resulting platelet, leukocyte, and growth factor levels. The platelet counts in both whole blood and PRP were generally higher in women than in men; no differences were observed with regard to age. Statistical analysis of platelet-derived growth factor AB (PDGF-AB) and transforming growth factor β1 (TGF-β1) showed no differences with regard to age or gender. Platelet counts and TGF-β1 concentration correlated closely, as did platelet counts and PDGF-AB levels. Correlations between leukocyte counts and PDGF-AB levels were rare, but a comparison of the two demonstrated certain parallel tendencies. These findings indicate that TGF-β1 levels derive in substantial part from platelets, and they emphasize the role of leukocytes, in addition to that of platelets, as a source of growth factors in PRP. All methods of producing PRP showed high variability in platelet counts and growth factor levels. The highest growth factor levels were found in the PRP prepared using the Platelet Concentrate Collection System manufactured by Biomet 3i.

  13. Development of the Career Anchors Scale among Occupational Health Nurses in Japan

    PubMed Central

    Kubo, Yoshiko; Hatono, Yoko; Kubo, Tomohide; Shimamoto, Satoko; Nakatani, Junko; Burgel, Barbara J.

    2016-01-01

    Objectives: This study aimed to develop the Career Anchors Scale among Occupational Health Nurses (CASOHN) and evaluate its reliability and validity. Methods: Scale items were developed through a qualitative inductive analysis of interview data, and items were revised following an examination of content validity by experts and occupational health nurses (OHNs), resulting in a provisional scale of 41 items. A total of 745 OHNs (response rate 45.2%) affiliated with the Japan Society for Occupational Health participated in the self-administered questionnaire survey. Results: Two items were deleted based on item-total correlations. Factor analysis was then conducted on the remaining 39 items to examine construct validity. An exploratory factor analysis with the principal factor method and promax rotation resulted in the extraction of six factors. The variance contribution ratios of the six factors were 37.45, 7.01, 5.86, 4.95, 4.16, and 3.19%. The cumulative contribution ratio was 62.62%. The factors were named as follows: Demonstrating expertise and considering position in work (Factor 1); Management skills for effective work (Factor 2); Supporting health improvement in groups and organizations (Factor 3); Providing employee-focused support (Factor 4); Collaborating with occupational health team members and personnel (Factor 5); and Compatibility of work and private life (Factor 6). The confidence coefficient determined by the split-half method was 0.85. Cronbach's alpha coefficient for the overall scale was 0.95, whereas those of the six subscales were 0.88, 0.90, 0.91, 0.80, 0.85, and 0.79, respectively. Conclusions: CASOHN was found to be valid and reliable for measuring career anchors among OHNs in Japan. PMID:27725484
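
    For readers who want to reproduce the reliability computations reported here, the following minimal Python sketch computes Cronbach's alpha and a Spearman-Brown corrected split-half coefficient; the 39-item matrix is randomly simulated and merely stands in for the real CASOHN responses.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def split_half(items):
    """Spearman-Brown corrected split-half reliability (odd/even split)."""
    items = np.asarray(items, dtype=float)
    odd = items[:, ::2].sum(axis=1)
    even = items[:, 1::2].sum(axis=1)
    r = np.corrcoef(odd, even)[0, 1]
    return 2 * r / (1 + r)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                       # one common trait
data = latent + rng.normal(size=(200, 39))               # 39 simulated items
print(round(cronbach_alpha(data), 2), round(split_half(data), 2))
```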

  14. Examining the Factor Structure and Discriminant Validity of the 12-Item General Health Questionnaire (GHQ-12) Among Spanish Postpartum Women

    ERIC Educational Resources Information Center

    Aguado, Jaume; Campbell, Alistair; Ascaso, Carlos; Navarro, Purificacion; Garcia-Esteve, Lluisa; Luciano, Juan V.

    2012-01-01

    In this study, the authors tested alternative factor models of the 12-item General Health Questionnaire (GHQ-12) in a sample of Spanish postpartum women, using confirmatory factor analysis. The authors report the results of modeling three different methods for scoring the GHQ-12 using estimation methods recommended for categorical and binary data.…

  15. Resolving and quantifying overlapped chromatographic bands by transmutation

    PubMed

    Malinowski

    2000-09-15

    A new chemometric technique called "transmutation" is developed for the purpose of sharpening overlapped chromatographic bands in order to quantify the components. The "transmutation function" is created from the chromatogram of the pure component of interest, obtained from the same instrument, operating under the same experimental conditions used to record the unresolved chromatogram of the sample mixture. The method is used to quantify mixtures containing toluene, ethylbenzene, m-xylene, naphthalene, and biphenyl from unresolved chromatograms previously reported. The results are compared to those obtained using window factor analysis, rank annihilation factor analysis, and matrix regression analysis. Unlike the latter methods, the transmutation method is not restricted to two-dimensional arrays of data, such as those obtained from HPLC/DAD, but is also applicable to chromatograms obtained from single detector experiments. Limitations of the method are discussed.

  16. Factor and prevention method of landslide event at FELCRA Semungkis, Hulu Langat, Selangor

    NASA Astrophysics Data System (ADS)

    Manap, N.; Jeyaramah, N.; Syahrom, N.

    2017-12-01

    Landslides are powerful geological events that happen unpredictably due to natural or human factors. A study was carried out at FELCRA Semungkis, Hulu Langat, an area affected by a landslide involving 16 casualties. The purpose of this study is to identify the main factors that caused the landslide at FELCRA Semungkis, Hulu Langat and to identify suitable protection methods. Data were collected from three respondents working in government bodies through interview sessions and were analysed using the content analysis method. From the results, it can be concluded that the landslide was caused by a combination of human and natural factors. The protection method that can be applied to stabilize FELCRA Semungkis, Hulu Langat is soil nailing with the support of a soilcrete system.

  17. Experimental design methods for bioengineering applications.

    PubMed

    Keskin Gündoğdu, Tuğba; Deniz, İrem; Çalışkan, Gülizar; Şahin, Erdem Sefa; Azbar, Nuri

    2016-01-01

    Experimental design is a form of process analysis in which certain factors are selected to obtain the desired responses of interest. It may also be used to determine the effects of various independent factors on a dependent factor. The bioengineering discipline includes many different areas of scientific interest, and each study area is affected and governed by many different factors. Analyzing the important factors and selecting an experimental design for optimization are very effective tools for the design of any bioprocess in question. This review summarizes experimental design methods that can be used to investigate various factors relating to bioengineering processes. The experimental methods generally used in bioengineering are as follows: full factorial design, fractional factorial design, Plackett-Burman design, Taguchi design, Box-Behnken design and central composite design. These design methods are briefly introduced, and their application to different bioengineering processes is analyzed.
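
    A minimal sketch of the simplest of these designs, the two-level full factorial, can be generated with plain Python; the factor names and levels below are hypothetical.

```python
from itertools import product

# Two-level full factorial design for three hypothetical bioprocess factors
factors = {
    "temperature": [30, 37],    # degrees C
    "pH":          [6.5, 7.5],
    "agitation":   [100, 200],  # rpm
}

names = list(factors)
runs = [dict(zip(names, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, 1):
    print(f"run {i}: {run}")
# 2^3 = 8 runs; a fractional factorial design would use a chosen subset.
```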

  18. Teaching learning methods of an entrepreneurship curriculum.

    PubMed

    Esmi, Keramat; Marzoughi, Rahmatallah; Torkzadeh, Jafar

    2015-10-01

    One of the most significant elements of entrepreneurship curriculum design is teaching-learning methods, which plays a key role in studies and researches related to such a curriculum. It is the teaching method, and systematic, organized and logical ways of providing lessons that should be consistent with entrepreneurship goals and contents, and should also be developed according to the learners' needs. Therefore, the current study aimed to introduce appropriate, modern, and effective methods of teaching entrepreneurship and their validation. This is a mixed method research of a sequential exploratory kind conducted through two stages: a) developing teaching methods of entrepreneurship curriculum, and b) validating developed framework. Data were collected through "triangulation" (study of documents, investigating theoretical basics and the literature, and semi-structured interviews with key experts). Since the literature on this topic is very rich, and views of the key experts are vast, directed and summative content analysis was used. In the second stage, qualitative credibility of research findings was obtained using qualitative validation criteria (credibility, confirmability, and transferability), and applying various techniques. Moreover, in order to make sure that the qualitative part is reliable, reliability test was used. Moreover, quantitative validation of the developed framework was conducted utilizing exploratory and confirmatory factor analysis methods and Cronbach's alpha. The data were gathered through distributing a three-aspect questionnaire (direct presentation teaching methods, interactive, and practical-operational aspects) with 29 items among 90 curriculum scholars. Target population was selected by means of purposive sampling and representative sample. Results obtained from exploratory factor analysis showed that a three factor structure is an appropriate method for describing elements of teaching-learning methods of entrepreneurship curriculum. Moreover, the value for Kaiser Meyer Olkin measure of sampling adequacy equaled 0.72 and the value for Bartlett's test of variances homogeneity was significant at the 0.0001 level. Except for internship element, the rest had a factor load of higher than 0.3. Also, the results of confirmatory factor analysis showed the model appropriateness, and the criteria for qualitative accreditation were acceptable. Developed model can help instructors in selecting an appropriate method of entrepreneurship teaching, and it can also make sure that the teaching is on the right path. Moreover, the model is comprehensive and includes all the effective teaching methods in entrepreneurship education. It is also based on qualities, conditions, and requirements of Higher Education Institutions in Iranian cultural environment.

  19. Validation of the Child and Youth Resilience Measure (CYRM-28) on a Sample of At-Risk New Zealand Youth

    ERIC Educational Resources Information Center

    Sanders, Jackie; Munford, Robyn; Thimasarn-Anwar, Tewaporn; Liebenberg, Linda

    2017-01-01

    Purpose: This article reports on an examination of the psychometric properties of the 28-item Child and Youth Resilience Measure (CYRM-28). Methods: Exploratory factor analysis, confirmatory factor analysis, Cronbach's α, "t"-tests, correlations, and multivariate analysis of variance were applied to data collected via interviews from 593…

  20. Type I and Type II Error Rates and Overall Accuracy of the Revised Parallel Analysis Method for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Green, Samuel B.; Thompson, Marilyn S.; Levy, Roy; Lo, Wen-Juo

    2015-01-01

    Traditional parallel analysis (T-PA) estimates the number of factors by sequentially comparing sample eigenvalues with eigenvalues for randomly generated data. Revised parallel analysis (R-PA) sequentially compares the "k"th eigenvalue for sample data to the "k"th eigenvalue for generated data sets, conditioned on "k"-…
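
    A compact Python sketch of traditional parallel analysis (T-PA), assuming normally distributed random comparison data and a 95th-percentile retention rule, might look like this; it is an illustration of T-PA only, not the R-PA variant studied here.

```python
import numpy as np

def parallel_analysis(data, n_sims=500, quantile=0.95, seed=0):
    """Horn's traditional parallel analysis: retain leading factors whose
    sample eigenvalues exceed the chosen quantile of random-data eigenvalues."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    sample_eigs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand_eigs = np.empty((n_sims, p))
    for s in range(n_sims):
        r = rng.normal(size=(n, p))
        rand_eigs[s] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    thresholds = np.quantile(rand_eigs, quantile, axis=0)
    keep = 0
    for lam, thr in zip(sample_eigs, thresholds):
        if lam <= thr:
            break
        keep += 1
    return keep

rng = np.random.default_rng(1)
# Simulated data driven by two latent factors
data = rng.normal(size=(300, 2)) @ rng.normal(size=(2, 10)) + rng.normal(size=(300, 10))
print("factors retained:", parallel_analysis(data))
```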

  1. Multiway analysis methods applied to the fluorescence excitation-emission dataset for the simultaneous quantification of valsartan and amlodipine in tablets

    NASA Astrophysics Data System (ADS)

    Dinç, Erdal; Ertekin, Zehra Ceren; Büker, Eda

    2017-09-01

    In this study, excitation-emission matrix datasets with strongly overlapping bands were processed using four different chemometric calibration algorithms, parallel factor analysis, Tucker3, three-way partial least squares, and unfolded partial least squares, for the simultaneous quantitative estimation of valsartan and amlodipine besylate in tablets. No preliminary separation step was used before applying these approaches to the analysis of the related drug substances in the samples. A three-way excitation-emission matrix data array was obtained by concatenating the excitation-emission matrices of the calibration set, validation set, and commercial tablet samples. This data array was used to build the parallel factor analysis, Tucker3, three-way partial least squares and unfolded partial least squares calibrations and to predict the amounts of valsartan and amlodipine besylate in the samples. For all methods, calibration and prediction of valsartan and amlodipine besylate were performed in the working concentration range of 0.25-4.50 μg/mL. The validity and performance of all the proposed methods were checked using validation parameters. From the analysis results, it was concluded that the described two-way and three-way algorithmic methods are very useful for the simultaneous quantitative resolution and routine analysis of the related drug substances in marketed samples.
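
    A hedged sketch of the PARAFAC step on a simulated EEM-like array, using the tensorly library (assumed available), is shown below; the tensor dimensions, spectral profiles, and concentrations are invented for illustration and do not reproduce the paper's calibration.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)
# samples x excitation x emission, built from two "analytes" plus noise
conc = rng.uniform(0.25, 4.5, size=(20, 2))   # invented concentrations
ex = np.abs(rng.normal(size=(2, 30)))         # invented excitation profiles
em = np.abs(rng.normal(size=(2, 40)))         # invented emission profiles
eem = np.einsum('ik,kj,kl->ijl', conc, ex, em)
eem += 0.01 * rng.normal(size=eem.shape)

weights, factors = parafac(tl.tensor(eem), rank=2, normalize_factors=True)
scores = factors[0]    # sample-mode scores, proportional to concentration
print(scores.shape)    # (20, 2)
```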

  2. Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.

    PubMed

    Saccenti, Edoardo; Timmerman, Marieke E

    2017-03-01

    Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.

  3. Characterization of primary standards for use in the HPLC analysis of the procyanidin content of cocoa and chocolate containing products.

    PubMed

    Hurst, William J; Stanley, Bruce; Glinski, Jan A; Davey, Matthew; Payne, Mark J; Stuart, David A

    2009-10-15

    This report describes the characterization of a series of commercially available procyanidin standards, ranging from dimers (DP = 2) to decamers (DP = 10), for the determination of procyanidins in cocoa and chocolate. Using a combination of HPLC with fluorescence detection and MALDI-TOF mass spectrometry, the purity of each standard was determined, and these data were used to determine relative response factors. These response factors were compared with response factors obtained from published methods. Data comparing the procyanidin analysis of a commercially available US dark chocolate calculated using each of the calibration methods indicate divergent results and demonstrate that previous methods may significantly underreport the procyanidins in cocoa-containing products. These results have far-reaching implications because the previous calibration methods have been used to develop data for a variety of scientific reports, including food databases and clinical studies.

  4. Testing Group Mean Differences of Latent Variables in Multilevel Data Using Multiple-Group Multilevel CFA and Multilevel MIMIC Modeling.

    PubMed

    Kim, Eun Sook; Cao, Chunhua

    2015-01-01

    Considering that group comparisons are common in social science, we examined two latent group mean testing methods when the groups of interest were either at the between or within level of multilevel data: multiple-group multilevel confirmatory factor analysis (MG ML CFA) and multilevel multiple-indicators multiple-causes modeling (ML MIMIC). The performance of these methods was investigated through three Monte Carlo studies. In Studies 1 and 2, either factor variances or residual variances were manipulated to be heterogeneous between groups. In Study 3, which focused on within-level multiple-group analysis, six different model specifications were considered, depending on how the intra-class group correlation (i.e., the correlation between random effect factors for groups within a cluster) was modeled. The results of the simulations generally supported the adequacy of MG ML CFA and ML MIMIC for multiple-group analysis with multilevel data. The two methods did not show any notable difference in latent group mean testing across the three studies. Finally, a demonstration with real data and guidelines for selecting an appropriate approach to multilevel multiple-group analysis are provided.

  5. An Analysis of Selected Factors Influencing Enrollment Patterns.

    ERIC Educational Resources Information Center

    Heck, James

    This report presents an analysis of factors influencing enrollment patterns at Lake City Community College (LCCC; Florida) and recommends ways to increase enrollments at the college. Section I reviews the methods of collecting data for the report, which included interviews with key college personnel, an examination of social indicators such as…

  6. Factor Analysis for Clustered Observations.

    ERIC Educational Resources Information Center

    Longford, N. T.; Muthen, B. O.

    1992-01-01

    A two-level model for factor analysis is defined, and formulas for a scoring algorithm for this model are derived. A simple noniterative method based on decomposition of total sums of the squares and cross-products is discussed and illustrated with simulated data and data from the Second International Mathematics Study. (SLD)

  7. Factor analysis of some socio-economic and demographic variables for Bangladesh.

    PubMed

    Islam, S M

    1986-01-01

    The author carries out an exploratory factor analysis of some socioeconomic and demographic variables for Bangladesh using the classical or common factor approach with the varimax rotation method. The socioeconomic and demographic indicators used in this study include literacy, rate of growth, female employment, economic development, urbanization, population density, childlessness, sex ratio, proportion of women ever married, and fertility. The 18 administrative districts of Bangladesh constitute the unit of analysis. 3 common factors--modernization, fertility, and social progress--are identified in this study to explain the correlations among the set of selected socioeconomic and demographic variables.
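
    A present-day analogue of this analysis could be run with scikit-learn's maximum-likelihood factor analysis and varimax rotation, as sketched below on simulated district-level data; note that this is not the classical common factor extraction used in the original study, and the data here are random stand-ins.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# Simulated stand-in for 18 districts x 10 socioeconomic indicators
X = rng.normal(size=(18, 10))

fa = FactorAnalysis(n_components=3, rotation="varimax")
fa.fit(X)
loadings = fa.components_.T   # (indicators x factors) loading matrix
print(np.round(loadings, 2))
```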

  8. Complex amplitude reconstruction for dynamic beam quality M2 factor measurement with self-referencing interferometer wavefront sensor.

    PubMed

    Du, Yongzhao; Fu, Yuqing; Zheng, Lixin

    2016-12-20

    A real-time complex amplitude reconstruction method for determining the dynamic beam quality M2 factor, based on a Mach-Zehnder self-referencing interferometer wavefront sensor, is developed. With the proposed complex amplitude reconstruction method, full characterization of the laser beam, including amplitude (intensity profile) and phase information, can be reconstructed from a single interference pattern with the Fourier fringe pattern analysis method in a one-shot measurement. With the reconstructed complex amplitude, the beam field at any position z along the propagation direction can be obtained using diffraction integral theory. The beam quality M2 factor of the dynamic beam is then calculated according to the method specified in the ISO 11146 standard. The feasibility of the proposed method is demonstrated by theoretical analysis and experiment, covering both static and dynamic beam processes. The method is simple and fast, operates without movable parts, and makes it possible to investigate laser beams under conditions inaccessible to existing methods.
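
    The second-moment (D4σ) width computation at the core of the ISO 11146 procedure can be sketched in a few lines of Python; the Gaussian test profile below is illustrative only.

```python
import numpy as np

def d4sigma_width(intensity, x):
    """ISO 11146 second-moment (D4sigma) beam width along one axis.
    intensity: 1-D intensity profile sampled at positions x."""
    p = intensity / intensity.sum()
    centroid = np.sum(x * p)
    variance = np.sum((x - centroid) ** 2 * p)
    return 4.0 * np.sqrt(variance)

x = np.linspace(-5e-3, 5e-3, 1001)    # metres
w = 0.8e-3                            # 1/e^2 radius of a test Gaussian
profile = np.exp(-2 * (x / w) ** 2)
print(d4sigma_width(profile, x))      # ~2*w for an ideal Gaussian beam
```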

  9. Text mining factor analysis (TFA) in green tea patent data

    NASA Astrophysics Data System (ADS)

    Rahmawati, Sela; Suprijadi, Jadi; Zulhanif

    2017-03-01

    Factor analysis has become one of the most widely used multivariate statistical procedures in applied research across a multitude of domains. There are two main types of analysis based on factor analysis: Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA). Both EFA and CFA aim to model the relationships between a group of observed indicators and a latent variable, but they differ fundamentally in the a priori restrictions placed on the factor model. This method is applied to patent data in the green tea technology sector to trace the development of green tea technology in the world. Patent analysis is useful in identifying future technological trends in a specific field of technology. The patent database was obtained from the European Patent Organization (EPO). In this paper, the CFA model is applied to nominal data obtained from a presence-absence matrix; for nominal data, the CFA is based on the tetrachoric correlation matrix. Meanwhile, the EFA model is applied to titles from the dominant technology sector, which are first pre-processed using text mining analysis.

  10. Identifying Items to Assess Methodological Quality in Physical Therapy Trials: A Factor Analysis

    PubMed Central

    Cummings, Greta G.; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

    2014-01-01

    Background Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. Objective The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). Design A methodological research design was used, and an EFA was performed. Methods Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Results Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Limitation Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. Conclusions To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items. PMID:24786942

  11. Psychometric properties of Connor-Davidson Resilience Scale in a Spanish sample of entrepreneurs.

    PubMed

    Manzano-García, Guadalupe; Ayala Calvo, Juan Carlos

    2013-01-01

    The literature regarding entrepreneurship suggests that the resilience of entrepreneurs may help to explain entrepreneurial success, but there is no resilience measure widely accepted by researchers. This study analyzes the psychometric properties of the Connor and Davidson Resilience Scale (CD-RISC) in a sample of Spanish entrepreneurs. A telephone survey research method was used. The participants were entrepreneurs operating in the business services sector. Interviewers telephoned a total of 900 entrepreneurs, of whom 783 produced usable questionnaires. The CD-RISC was used as the data collection instrument. We used principal component factor analysis and confirmatory factor analysis to determine the factor structure of the CD-RISC. Confirmatory factor analysis failed to verify the original five-factor structure of the CD-RISC, whereas principal component factor analysis yielded a 3-factor structure of resilience (hardiness, resourcefulness and optimism). In this research, 47.48% of the total variance was accounted for by three factors, and the obtained factor structure was verified through confirmatory factor analysis. The CD-RISC has been shown to be a reliable and valid tool for measuring entrepreneurs' resilience.

  12. Dysmenorrhea Characteristics of Female Students of Health School and Affecting Factors and Their Knowledge and Use of Complementary and Alternative Medicine Methods.

    PubMed

    Midilli, Tulay Sagkal; Yasar, Eda; Baysal, Ebru

    2015-01-01

    The purpose of this study was to examine the menstruation and dysmenorrhea characteristics of health school students, the factors affecting dysmenorrhea, and the knowledge and use of complementary and alternative medicine (CAM) methods among the students with dysmenorrhea. This is a descriptive study. Descriptive analysis was performed by calculating numbers, percentages, and means, together with Pearson χ2 tests and logistic regression analysis. A total of 488 female students participated in the research, and 87.7% (n = 428) of them experienced dysmenorrhea. A family history of dysmenorrhea and regular menstrual cycles were found to be factors affecting dysmenorrhea (P < .05). Seven out of 10 students with dysmenorrhea used CAM methods. Heat application was the most commonly used and best known CAM method for dysmenorrhea management. Students who experienced severe pain used analgesics (P < .05) and CAM methods (P < .05).

  13. [Methods of the multivariate statistical analysis of so-called polyetiological diseases using the example of coronary heart disease].

    PubMed

    Lifshits, A M

    1979-01-01

    A general characterization of multivariate statistical analysis (MSA) is given. Methodological premises and criteria for selecting an adequate MSA method applicable to pathoanatomical investigations of the epidemiology of multicausal diseases are presented. The experience of using MSA with computers and standard computing programs in studies of coronary artery atherosclerosis on material from 2060 autopsies is described. The combined use of four MSA methods, sequential, correlational, regression, and discriminant analysis, made it possible to quantify the contribution of each of the eight examined risk factors to the development of atherosclerosis. The most important factors were found to be age, arterial hypertension, and heredity. Occupational hypodynamia and increased fatness were more important in men, whereas diabetes mellitus was more important in women. Accounting for this combination of risk factors with MSA methods provides a more reliable prognosis of the likelihood of coronary heart disease with a fatal outcome than prognosis based on the degree of coronary atherosclerosis alone.

  14. Simulation of multi-element multispectral UV radiation source for optical-electronic system of minerals luminescence analysis

    NASA Astrophysics Data System (ADS)

    Peretyagin, Vladimir S.; Korolev, Timofey K.; Chertov, Aleksandr N.

    2017-02-01

    The problem of dressability of solid minerals attracts the attention of specialists wherever the extraction of mineral raw materials is a significant sector of the economy. A significant number of mineral ore dressability methods exist; at present, radiometric methods are considered the most promising, one of which is the photoluminescence method. This method is based on the spectral, amplitude, and kinetic parameters of mineral luminescence under UV radiation, as well as the color parameters of the emitted light. The absence of developed scientific and methodological approaches for analyzing the area irradiated by UV radiation, as well as the absence of suitable radiation sources, hinders the development and use of the photoluminescence method. The present work is devoted to the development of a multi-element UV radiation source designed to solve the problem of analyzing and sorting minerals by their selective luminescence. The article presents a method for the theoretical modeling of radiation devices based on UV LEDs. The models consider factors such as the spectral composition and the spatial and energy parameters of the LEDs. The article also presents the results of experimental studies of several mineral samples.

  15. A simple method for determining stress intensity factors for a crack in bi-material interface

    NASA Astrophysics Data System (ADS)

    Morioka, Yuta

    Because of the violently oscillating nature of the stress and displacement fields near the crack tip, it is difficult to obtain stress intensity factors for a crack between two dissimilar media. For a crack in a homogeneous medium, it is common practice to find stress intensity factors through strain energy release rates. However, individual strain energy release rates do not exist for a bi-material interface crack, so alternative methods are needed to evaluate stress intensity factors. Several methods have been proposed in the past, but they involve mathematical complexity and sometimes require additional finite element analysis. The purpose of this research is to develop a simple method to find stress intensity factors for bi-material interface cracks. A finite element based projection method is proposed. It is shown that the projection method yields very accurate stress intensity factors for a crack in isotropic and anisotropic bi-material interfaces. The projection method is also compared to the displacement ratio method and the energy method proposed by other authors. The comparison shows that the projection method is much simpler to apply, with accuracy comparable to that of the displacement ratio method.

  16. Secondary School Students' Views of Inhibiting Factors in Seeking Counselling

    ERIC Educational Resources Information Center

    Chan, Stephanie; Quinn, Philip

    2012-01-01

    This study examines secondary school students' perceptions of inhibiting factors in seeking counselling. Responses to a questionnaire completed by 1346 secondary school students were analysed using quantitative and qualitative methods. Exploratory factor analysis highlighted that within 21 pre-defined inhibiting factors, items loaded strongly on…

  17. Application of factor analysis to the water quality in reservoirs

    NASA Astrophysics Data System (ADS)

    Silva, Eliana Costa e.; Lopes, Isabel Cristina; Correia, Aldina; Gonçalves, A. Manuela

    2017-06-01

    In this work we present a factor analysis of chemical and environmental variables of the water column and hydro-morphological features of several Portuguese reservoirs. The objective is to reduce the initial number of variables while keeping their common characteristics. Using factor analysis, the environmental variables measured in the epilimnion and in the hypolimnion, together with the hydromorphological characteristics of the dams, were reduced from 63 variables to only 13 factors, which explained a total of 83.348% of the variance in the original data. After rotation using the varimax method, the relations between the factors and the original variables became clearer and easier to interpret, providing a factor analysis model for these environmental variables with 13 varifactors: Water quality and distance to the source, Hypolimnion chemical composition, Sulfite-reducing bacteria and nutrients, Coliforms and faecal streptococci, Reservoir depth, Temperature, and Location, among other factors.

  18. Total Ambient Dose Equivalent Buildup Factor Determination for Nbs04 Concrete.

    PubMed

    Duckic, Paulina; Hayes, Robert B

    2018-06-01

    Buildup factors are dimensionless multiplicative factors required by the point kernel method to account for scattered radiation through a shielding material. The accuracy of the point kernel method is strongly affected by how well the analyzed parameters correspond to the experimental configuration, which we attempt to simplify here. The point kernel method has not found widespread practical use for neutron shielding calculations, due to the complex neutron transport behavior through shielding materials (i.e., the variety of interaction mechanisms that neutrons may undergo while traversing the shield) as well as the non-linear energy dependence of the neutron total cross section. In this work, total ambient dose buildup factors for NBS04 concrete are calculated in terms of neutron and secondary gamma ray transmission factors. The neutron and secondary gamma ray transmission factors are calculated using the MCNP6™ code with updated cross sections. Both transmission factors and buildup factors are given in tabulated form. Practical use of neutron transmission and buildup factors warrants rigorously calculated results with all associated uncertainties. In this work, a sensitivity analysis of neutron transmission factors and total buildup factors with varying water content has been conducted; the analysis showed a significant impact of the varying water content in concrete on both quantities. Finally, support vector regression, a machine learning technique, was used to build a model from the calculated data for predicting the buildup factors. The developed model can predict most of the data with 20% relative error.
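
    A minimal sketch of the support vector regression step, using scikit-learn on an invented toy data set (the feature set and functional form are assumptions, not the paper's data), might look like this:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical inputs: (energy in MeV, shield thickness in mfp, water fraction)
X = rng.uniform([0.1, 1.0, 0.02], [14.0, 20.0, 0.10], size=(500, 3))
# Toy "buildup factor" response, not a physical model
y = 1.0 + 0.3 * X[:, 1] * np.exp(-0.05 * X[:, 0]) * (1 + 5 * X[:, 2])

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X, y)
print(model.predict([[2.0, 10.0, 0.05]]))   # predicted buildup factor
```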

  19. Methodology development for quantitative optimization of security enhancement in medical information systems -Case study in a PACS and a multi-institutional radiotherapy database-.

    PubMed

    Haneda, Kiyofumi; Umeda, Tokuo; Koyama, Tadashi; Harauchi, Hajime; Inamura, Kiyonari

    2002-01-01

    The goal of our study is to establish a methodology for analyzing the level of security requirements, for finding suitable security measures, and for optimizing the distribution of security across every portion of medical practice. Quantitative expressions are introduced wherever possible, to allow easy follow-up of security procedures and easy evaluation of security outcomes. System analysis by fault tree analysis (FTA) clarified that subdividing system elements in detail contributes to much more accurate analysis. Such subdivided composition factors depend heavily on staff behavior, interactive terminal devices, kinds of service, and network routes. In conclusion, we established methods to analyze the level of security requirements for each medical information system by employing FTA, with basic events for each composition factor and combinations of basic events. Methods for finding suitable security measures were established: the risk factors for each basic event, the number of elements for each composition factor, and candidate security measure elements were identified. A method to optimize the security measures for each medical information system was proposed: the optimal distribution of risk factors in terms of basic events was determined, making comparison between medical information systems possible.
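
    A toy illustration of how FTA combines basic-event probabilities through AND/OR gates is sketched below; the event names and probabilities are hypothetical and chosen only to show the arithmetic.

```python
from math import prod

def p_and(probs):
    """AND gate: all independent basic events must occur."""
    return prod(probs)

def p_or(probs):
    """OR gate: at least one independent basic event occurs."""
    return 1.0 - prod(1.0 - p for p in probs)

# Hypothetical basic-event probabilities
staff_error    = 0.02
terminal_fault = 0.005
network_breach = 0.001

# Top event: (staff error AND terminal fault) OR network breach
top = p_or([p_and([staff_error, terminal_fault]), network_breach])
print(f"top-event probability: {top:.6f}")
```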

  20. Multiscale weighted colored graphs for protein flexibility and rigidity analysis

    NASA Astrophysics Data System (ADS)

    Bramer, David; Wei, Guo-Wei

    2018-02-01

    Protein structural fluctuation, measured by Debye-Waller factors or B-factors, is known to correlate to protein flexibility and function. A variety of methods has been developed for protein Debye-Waller factor prediction and related applications to domain separation, docking pose ranking, entropy calculation, hinge detection, stability analysis, etc. Nevertheless, none of the current methodologies are able to deliver an accuracy of 0.7 in terms of the Pearson correlation coefficients averaged over a large set of proteins. In this work, we introduce a paradigm-shifting geometric graph model, multiscale weighted colored graph (MWCG), to provide a new generation of computational algorithms to significantly change the current status of protein structural fluctuation analysis. Our MWCG model divides a protein graph into multiple subgraphs based on interaction types between graph nodes and represents the protein rigidity by generalized centralities of subgraphs. MWCGs not only predict the B-factors of protein residues but also accurately analyze the flexibility of all atoms in a protein. The MWCG model is validated over a number of protein test sets and compared with many standard methods. An extensive numerical study indicates that the proposed MWCG offers an accuracy of over 0.8 and thus provides perhaps the first reliable method for estimating protein flexibility and B-factors. It also simultaneously predicts all-atom flexibility in a molecule.

  1. A Systematic Review of Methodology: Time Series Regression Analysis for Environmental Factors and Infectious Diseases

    PubMed Central

    Imai, Chisato; Hashizume, Masahiro

    2015-01-01

    Background: Time series analysis is suitable for investigations of relatively direct and short-term effects of exposures on outcomes. In environmental epidemiology studies, this method has been one of the standard approaches for assessing the impacts of environmental factors on acute non-infectious diseases (e.g. cardiovascular deaths), conventionally with generalized linear or additive models (GLM and GAM). However, the same analysis practices are often observed with infectious diseases, despite substantial differences from non-infectious diseases that may result in analytical challenges. Methods: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic review was conducted to elucidate important issues in assessing the associations between environmental factors and infectious diseases using time series analysis with GLM and GAM. Published studies on the associations between weather factors and malaria, cholera, dengue, and influenza were targeted. Findings: Our review raised issues regarding the estimation of susceptible populations and exposure lag times, the adequacy of seasonal adjustments, the presence of strong autocorrelations, and the lack of a smaller observation time unit for outcomes (i.e. daily data). These concerns may be attributable to features specific to infectious diseases, such as transmission among individuals and complicated causal mechanisms. Conclusion: The consequence of not taking adequate measures to address these issues is distortion of the appropriate risk quantification of exposure factors. Future studies should pay careful attention to details and examine alternative models or methods that improve studies using time series regression analysis for environmental determinants of infectious diseases. PMID:25859149
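
    A minimal sketch of the conventional GLM approach discussed here, a Poisson regression of weekly case counts on a lagged weather covariate with seasonal harmonic terms, is given below using statsmodels and simulated data; the lag, coefficients, and series are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
weeks = 260
t = np.arange(weeks)
temp = 25 + 5 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 1, weeks)

lag = 2                                   # exposure acts at a 2-week lag
lam = np.exp(1.0 + 0.05 * temp[:-lag])    # true rate for weeks lag..end
cases = rng.poisson(lam)

X = np.column_stack([temp[:-lag],                      # lagged temperature
                     np.sin(2 * np.pi * t[lag:] / 52), # seasonal adjustment
                     np.cos(2 * np.pi * t[lag:] / 52)])
X = sm.add_constant(X)
fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()
print(fit.params)   # x1 should recover roughly 0.05 for the lagged effect
```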

  2. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.

  3. [Confrontation of knowledge on alcohol concentration in blood and in exhaled air].

    PubMed

    Bauer, Miroslav; Bauerová, Jiřina; Šikuta, Ján; Šidlo, Jozef

    2015-01-01

    The authors give a brief historical overview of the development of experimental alcohology in the former Czechoslovakia. Particular attention is paid to quality-control testing in toxicological laboratories. Information on the results of control tests of blood samples using gas chromatography in Slovakia and within the world-wide study "Eurotox 1990" is presented. The pitfalls related to the objective evaluation of analytical results when interpreting alcohol concentrations in biological materials are pointed out, along with the associated need to eliminate the negative influence of the human factor. The authors recommend performing analyses of alcohol in biological materials only at accredited workplaces and, when samples are stored, securing mandatory inhibition of the phosphorylation process. The reasons for numerical differences between analyses when taking evidence of alcohol in blood and in exhaled air are analysed. The authors confirm the accuracy of analysis using gas chromatography together with breath analysers of exhaled air. They highlight the need to make analytical results more objective, also through confrontation with the results of clinical examination and with the examined circumstances. Finally, the authors suggest a method for reducing the human factor, which is most frequently responsible for inaccuracy, to a tolerable level (safety factor), and recommend analysing a sample by two mutually independent methods or analysing two different biological materials.

  4. Binocular optical axis parallelism detection precision analysis based on Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ying, Jiaju; Liu, Bingqi

    2018-02-01

    Based on the working principle of the binocular photoelectric instrument optical axis parallelism digital calibration instrument, the various factors affecting system precision are analyzed for all components of the instrument, and a precision analysis model is established. Based on the error distribution, the Monte Carlo method is used to analyze the relationship between the comprehensive error and the change of the center coordinate of the circular target image. The method can further guide the error distribution, optimize and control the factors that have the greatest influence on the comprehensive error, and improve the measurement accuracy of the optical axis parallelism digital calibration instrument.
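
    A generic Monte Carlo error-budget sketch of this kind, with invented component error distributions, can be written in a few lines of numpy; the error sources and magnitudes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical component error sources (arcseconds, zero-mean)
collimator   = rng.normal(0, 2.0, N)      # collimator axis error
camera_align = rng.normal(0, 1.5, N)      # camera alignment error
centroiding  = rng.uniform(-1.0, 1.0, N)  # target-image centroiding error

total = collimator + camera_align + centroiding
print(f"combined std: {total.std():.2f} arcsec, "
      f"95% bound: +/-{np.quantile(np.abs(total), 0.95):.2f} arcsec")
```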

  5. A dynamic factor model of the evaluation of the financial crisis in Turkey.

    PubMed

    Sezgin, F; Kinay, B

    2010-01-01

    Factor analysis has been widely used in economics and finance in situations where a relatively large number of variables are believed to be driven by a few common causes of variation. Dynamic factor analysis (DFA), a combination of factor and time series analysis, involves autocorrelation matrices calculated from multivariate time series. Dynamic factor models were traditionally used to construct economic indicators and for macroeconomic analysis, business cycle analysis, and forecasting. In recent years, dynamic factor models have become more popular in empirical macroeconomics, as they have advantages over other methods in various respects. Factor models can, for instance, cope with many variables without running into the scarce degrees of freedom problems often faced in regression-based analysis. In this study, a model that determines the effect of the global crisis on Turkey is proposed. The main aim of the paper is to analyze how several macroeconomic quantities change before the evolution of a crisis and to decide whether a crisis can be forecasted or not.
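
    A small sketch of fitting a one-factor dynamic factor model with statsmodels on simulated series is given below; the series, loadings, and AR structure are invented and do not correspond to the Turkish data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
T = 200
f = np.zeros(T)                       # one common AR(1) factor
for t in range(1, T):
    f[t] = 0.8 * f[t - 1] + rng.normal()
loadings = np.array([1.0, 0.7, -0.5, 0.9])
Y = np.outer(f, loadings) + rng.normal(scale=0.5, size=(T, 4))

mod = sm.tsa.DynamicFactor(Y, k_factors=1, factor_order=1)
res = mod.fit(disp=False)
latent = res.factors.filtered[0]      # estimated path of the common factor
print(np.round(res.params, 3))
```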

  6. Analysis of risk factors for postoperative morbidity in perforated peptic ulcer.

    PubMed

    Kim, Jae-Myung; Jeong, Sang-Ho; Lee, Young-Joon; Park, Soon-Tae; Choi, Sang-Kyung; Hong, Soon-Chan; Jung, Eun-Jung; Ju, Young-Tae; Jeong, Chi-Young; Ha, Woo-Song

    2012-03-01

    Emergency operations for perforated peptic ulcer are associated with a high incidence of postoperative complications. While several studies have investigated the impact of perioperative risk factors and underlying diseases on the postoperative morbidity after abdominal surgery, only a few have analyzed their role in perforated peptic ulcer disease. The purpose of this study was to determine any possible associations between postoperative morbidity and comorbid disease or perioperative risk factors in perforated peptic ulcer. In total, 142 consecutive patients, who underwent surgery for perforated peptic ulcer, at a single institution, between January 2005 and October 2010 were included in this study. The clinical data concerning the patient characteristics, operative methods, and complications were collected retrospectively. The postoperative morbidity rate associated with perforated peptic ulcer operations was 36.6% (52/142). Univariate analysis revealed that a long operating time, the open surgical method, age (≥60), sex (female), high American Society of Anesthesiologists (ASA) score and presence of preoperative shock were significant perioperative risk factors for postoperative morbidity. Significant comorbid risk factors included hypertension, diabetes mellitus and pulmonary disease. Multivariate analysis revealed a long operating time, the open surgical method, high ASA score and the presence of preoperative shock were all independent risk factors for the postoperative morbidity in perforated peptic ulcer. A high ASA score, preoperative shock, open surgery and long operating time of more than 150 minutes are high risk factors for morbidity. However, there is no association between postoperative morbidity and comorbid disease in patients with a perforated peptic ulcer.
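
    The multivariate step described here is a standard logistic regression; a minimal sketch with statsmodels on simulated binary risk factors (coefficients and prevalences invented) follows.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 142
df = pd.DataFrame({
    "long_op":   rng.integers(0, 2, n),   # operating time > 150 min
    "open_surg": rng.integers(0, 2, n),   # open surgical method
    "high_asa":  rng.integers(0, 2, n),   # high ASA score
    "shock":     rng.integers(0, 2, n),   # preoperative shock
}).astype(float)
logit_p = -1.5 + 0.9*df.long_op + 0.8*df.open_surg + 1.1*df.high_asa + 1.3*df.shock
df["morbidity"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(float)

X = sm.add_constant(df[["long_op", "open_surg", "high_asa", "shock"]])
fit = sm.Logit(df["morbidity"], X).fit(disp=False)
print(np.exp(fit.params))   # adjusted odds ratios per risk factor
```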

  7. A Structural and Correlational Analysis of Two Common Measures of Personal Epistemology

    ERIC Educational Resources Information Center

    Laster, Bonnie Bost

    2010-01-01

    Scope and Method of Study: The current inquiry is a factor analytic study which utilizes first and second order factor analytic methods to examine the internal structures of two measurements of personal epistemological beliefs: the Schommer Epistemological Questionnaire (SEQ) and Epistemic Belief Inventory (EBI). The study also examines the…

  8. Risky Business: An Ecological Analysis of Intimate Partner Violence Disclosure

    ERIC Educational Resources Information Center

    Alaggia, Ramona; Regehr, Cheryl; Jenney, Angelique

    2012-01-01

    Objective: A multistage, mixed-methods study using grounded theory with descriptive data was conducted to examine factors in disclosure of intimate partner violence (IPV). Method: In-depth interviews with individuals and focus groups were undertaken to collect data from 98 IPV survivors and service providers to identify influential factors.…

  9. Factors, Practices, and Policies Influencing Students' Upward Transfer to Baccalaureate-Degree Programs and Institutions: A Mixed Methods Analysis

    ERIC Educational Resources Information Center

    LaSota, Robin Rae

    2013-01-01

    My dissertation utilizes an explanatory, sequential mixed-methods research design to assess factors influencing community college students' transfer probability to baccalaureate-granting institutions and to present promising practices in colleges and states directed at improving upward transfer, particularly for low-income and first-generation…

  10. Predictive analysis effectiveness in determining the epidemic disease infected area

    NASA Astrophysics Data System (ADS)

    Ibrahim, Najihah; Akhir, Nur Shazwani Md.; Hassan, Fadratul Hafinaz

    2017-10-01

    Epidemic disease outbreaks have raised great public concern over methods for controlling, preventing, and handling infectious disease in order to diminish the rate of disease dissemination and the size of the infected area. The backpropagation method was used for countermeasure and prediction analysis of epidemic disease. Predictive analysis based on backpropagation is realized through a machine learning process that applies artificial intelligence to pattern recognition, statistics, and feature selection. This computational learning process is integrated with data mining by treating the score output as a classifier for the given set of input features. The classification uses, as features, the disease dissemination factors that are likely to be strongly interconnected in causing infectious disease outbreaks. In this preliminary study, a predictive analysis for determining the infected area of an epidemic disease is introduced, using the backpropagation method and drawing on others' findings. The study classifies the epidemic disease dissemination factors as features for weight adjustment in the prediction of epidemic disease outbreaks. Through this preliminary study, predictive analysis is shown to be an effective method for determining the epidemic disease infected area by minimizing the error value through feature classification.
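
    A minimal backpropagation-trained classifier of the kind described, sketched with scikit-learn's MLPClassifier on invented dissemination features, might look like this:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Hypothetical dissemination features: density, mobility, sanitation
X = rng.random((n, 3))
y = (0.8*X[:, 0] + 0.6*X[:, 1] - 0.5*X[:, 2] + 0.1*rng.normal(size=n)) > 0.45

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)                    # weights adjusted by backpropagation
print("infected-area accuracy:", clf.score(X_te, y_te))
```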

  11. Realist identification of group-level latent variables for perinatal social epidemiology theory building.

    PubMed

    Eastwood, John Graeme; Jalaludin, Bin Badrudin; Kemp, Lynn Ann; Phung, Hai Ngoc

    2014-01-01

    We have previously reported in this journal on an ecological study of perinatal depressive symptoms in South Western Sydney. In that article, we briefly reported on a factor analysis that was utilized to identify empirical indicators for analysis. In this article, we report on the mixed method approach that was used to identify those latent variables. Social epidemiology has been slow to embrace a latent variable approach to the study of social, political, economic, and cultural structures and mechanisms, partly for philosophical reasons. Critical realist ontology and epistemology have been advocated as an appropriate methodological approach to both theory building and theory testing in the health sciences. We describe here an emergent mixed method approach that uses qualitative methods to identify latent constructs followed by factor analysis using empirical indicators chosen to measure identified qualitative codes. Comparative analysis of the findings is reported together with a limited description of realist approaches to abstract reasoning.

  12. Evaluation of hierarchical agglomerative cluster analysis methods for discrimination of primary biological aerosol

    NASA Astrophysics Data System (ADS)

    Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.

    2015-07-01

    In this paper we present improved methods for discriminating and quantifying Primary Biological Aerosol Particles (PBAP) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1×10^6 points on a desktop computer, allowing each fluorescent particle in a data set to be explicitly clustered. This reduces the potential for misattribution found in the subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient data set. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4), and the optical size, asymmetry factor and fluorescence measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98 and 98.1 % of the data points respectively. The best performing methods were applied to the BEACHON-RoMBAS ambient data set, where it was found that the z-score and range normalisation methods yield similar results, each producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in an overestimation of the fungal spore concentration by a factor of 1.5 and an underestimation of the bacterial aerosol concentration by a factor of 5. We suggest that this is likely due to errors arising from misattribution caused by poor centroid definition and the failure to assign particles to a cluster in the subsampling and comparative attribution method employed by WASP. The methods used here allow the entire fluorescent particle population to be analysed, yielding an explicit cluster attribution for each particle, improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
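
    A small-scale sketch of the best-performing combination reported here, z-score normalisation followed by Ward-linkage hierarchical clustering, is given below with scipy on simulated two-population data; note that clustering ~10^6 particles as in the paper requires more scalable tooling than this naive example.

```python
import numpy as np
from scipy.stats import zscore
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Simulated stand-in for WIBS-like inputs: size, asymmetry, 3 fluor channels
a = rng.normal([0.8, 5, 50, 10, 5],  [0.1, 1, 5, 2, 1], size=(500, 5))
b = rng.normal([2.5, 20, 10, 60, 30], [0.3, 3, 2, 6, 4], size=(500, 5))
X = zscore(np.vstack([a, b]), axis=0)          # z-score normalisation

Z = linkage(X, method="ward")                  # Ward-linkage HAC
labels = fcluster(Z, t=2, criterion="maxclust")
print(np.bincount(labels)[1:])                 # cluster sizes
```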

  13. Configurations of Common Childhood Psychosocial Risk Factors

    ERIC Educational Resources Information Center

    Copeland, William; Shanahan, Lilly; Costello, E. Jane; Angold, Adrian

    2009-01-01

    Background: Co-occurrence of psychosocial risk factors is commonplace, but little is known about psychiatrically-predictive configurations of psychosocial risk factors. Methods: Latent class analysis (LCA) was applied to 17 putative psychosocial risk factors in a representative population sample of 920 children ages 9 to 17. The resultant class…

  14. A GRAPHICAL DIAGNOSTIC METHOD FOR ASSESSING THE ROTATION IN FACTOR ANALYTICAL MODELS OF ATMOSPHERIC POLLUTION. (R831078)

    EPA Science Inventory

    Factor analytic tools such as principal component analysis (PCA) and positive matrix factorization (PMF), suffer from rotational ambiguity in the results: different solutions (factors) provide equally good fits to the measured data. The PMF model imposes non-negativity of both...

  15. Redefining the WISC-R: Implications for Professional Practice and Public Policy.

    ERIC Educational Resources Information Center

    Macmann, Gregg M.; Barnett, David W.

    1992-01-01

    The factor structure of the Wechsler Intelligence Scale for Children (Revised) was examined in the standardization sample using new methods of factor analysis. The substantial overlap across factors was most parsimoniously represented by a single general factor. Implications for public policy regarding the purposes and outcomes of special…

  16. Improving Alcohol Screening for College Students: Screening for Alcohol Misuse amongst College Students with a Simple Modification to the CAGE Questionnaire

    ERIC Educational Resources Information Center

    Taylor, Purcell; El-Sabawi, Taleed; Cangin, Causenge

    2016-01-01

    Objective: To improve the CAGE (Cut down, Annoyed, Guilty, Eye opener) questionnaire's predictive accuracy in screening college students. Participants: The sample consisted of 219 midwestern university students who self-administered a confidential survey. Methods: Exploratory factor analysis, confirmatory factor analysis, receiver operating…

  17. Factors Affecting Accuracy and Time Requirements of a Glucose Oxidase-Peroxidase Assay for Determination of Glucose

    USDA-ARS?s Scientific Manuscript database

    Accurate and rapid assays for glucose are desirable for analysis of glucose and starch in food and feedstuffs. An established colorimetric glucose oxidase-peroxidase method for glucose was modified to reduce analysis time, and evaluated for factors that affected accuracy. Time required to perform t...

  18. A Confirmatory Factor Analysis of the Professional Opinion Scale

    ERIC Educational Resources Information Center

    Greeno, Elizabeth J.; Hughes, Anne K.; Hayward, R. Anna; Parker, Karen L.

    2007-01-01

    The Professional Opinion Scale (POS) was developed to measure social work values orientation. Objective: A confirmatory factor analysis was performed on the POS. Method: This cross-sectional study used a mailed survey design with a national random (simple) sample of members of the National Association of Social Workers. Results: The study…

  19. STAMP-Based HRA Considering Causality Within a Sociotechnical System: A Case of Minuteman III Missile Accident.

    PubMed

    Rong, Hao; Tian, Jin

    2015-05-01

    The study contributes to human reliability analysis (HRA) by proposing a method that focuses more on human error causality within a sociotechnical system, illustrating its rationality and feasibility with a case study of the Minuteman (MM) III missile accident. Due to the complexity and dynamics within a sociotechnical system, previous analyses of accidents involving human and organizational factors clearly demonstrated that methods using a sequential accident model are inadequate for analyzing human error within a sociotechnical system. The system-theoretic accident model and processes (STAMP) was used to develop a universal framework of human error causal analysis. To elaborate the causal relationships and demonstrate the dynamics of human error, system dynamics (SD) modeling was conducted based on the framework. A total of 41 contributing factors, categorized into four types of human error, were identified through the STAMP-based analysis. All factors relate to a broad view of sociotechnical systems and are more comprehensive than the causation presented in the officially issued accident investigation report. Recommendations regarding both technical and managerial improvements for lowering the risk of the accident are proposed. The interdisciplinary approach provides complementary support between system safety and human factors. The integrated method based on STAMP and the SD model contributes to HRA effectively. The proposed method will be beneficial to HRA, risk assessment, and control of the MM III operating process, as well as other sociotechnical systems. © 2014, Human Factors and Ergonomics Society.

  20. Background recovery via motion-based robust principal component analysis with matrix factorization

    NASA Astrophysics Data System (ADS)

    Pan, Peng; Wang, Yongli; Zhou, Mingyuan; Sun, Zhipeng; He, Guoping

    2018-03-01

    Background recovery is a key technique in video analysis, but it still suffers from many challenges, such as camouflage, lighting changes, and diverse types of image noise. Robust principal component analysis (RPCA), which aims to recover a low-rank matrix and a sparse matrix, is a general framework for background recovery. The nuclear norm is widely used as a convex surrogate for the rank function in RPCA, which requires computing the singular value decomposition (SVD), a task that is increasingly costly as matrix sizes and ranks increase. However, matrix factorization greatly reduces the dimension of the matrix for which the SVD must be computed. Motion information has been shown to improve low-rank matrix recovery in RPCA, but this method still finds it difficult to handle original video data sets because of its batch-mode formulation and implementation. Hence, in this paper, we propose a motion-assisted RPCA model with matrix factorization (FM-RPCA) for background recovery. Moreover, an efficient linear alternating direction method of multipliers with a matrix factorization (FL-ADM) algorithm is designed for solving the proposed FM-RPCA model. Experimental results illustrate that the method provides stable results and is more efficient than the current state-of-the-art algorithms.
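
    As a rough illustration of the RPCA decomposition this record builds on, the sketch below splits a toy "video" matrix into a low-rank background and a sparse foreground. It is a naive alternating scheme (truncated SVD plus soft-thresholding), assumed here for illustration only; it is not the paper's FM-RPCA/FL-ADM algorithm and ignores the motion information and factorization speed-ups the authors introduce.

      import numpy as np

      def soft_threshold(X, tau):
          # Elementwise shrinkage operator used for the sparse term.
          return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

      def simple_rpca(D, rank, lam, n_iter=50):
          # Naive RPCA-style split of D into low-rank L and sparse S:
          # L from a rank-r truncated SVD, S by shrinking the residual.
          L = np.zeros_like(D)
          S = np.zeros_like(D)
          for _ in range(n_iter):
              U, s, Vt = np.linalg.svd(D - S, full_matrices=False)
              L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
              S = soft_threshold(D - L, lam)
          return L, S

      # Toy video: 100 frames of 64-pixel signals = static (rank-1)
      # background plus a few sparse foreground entries.
      rng = np.random.default_rng(1)
      background = np.outer(rng.random(64), np.ones(100))
      foreground = np.zeros((64, 100))
      foreground[rng.integers(0, 64, 50), rng.integers(0, 100, 50)] = 5.0
      D = background + foreground

      L, S = simple_rpca(D, rank=1, lam=0.5)
      print(np.linalg.norm(L - background) / np.linalg.norm(background))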

  1. Analysis of exergy efficiency of a super-critical compressed carbon dioxide energy-storage system based on the orthogonal method.

    PubMed

    He, Qing; Hao, Yinping; Liu, Hui; Liu, Wenyi

    2018-01-01

    Super-critical compressed carbon dioxide energy-storage (SC-CCES) technology is a new type of gas energy-storage technology. This paper used the orthogonal method and variance analysis to assess the significance of the factors affecting the thermodynamic characteristics of the SC-CCES system, and identified the significant factors and interactions in the energy-storage process, the energy-release process and the whole energy-storage system. Results show that the interactions among the components have little influence on the energy-storage process, the energy-release process and the whole energy-storage process of the SC-CCES system; the significant factors relate mainly to the characteristics of the system components themselves, which provides a reference for optimizing the thermal properties of the energy-storage system.

  2. Analysis of exergy efficiency of a super-critical compressed carbon dioxide energy-storage system based on the orthogonal method

    PubMed Central

    He, Qing; Liu, Hui; Liu, Wenyi

    2018-01-01

    Super-critical compressed carbon dioxide energy-storage (SC-CCES) technology is a new type of gas energy-storage technology. This paper used the orthogonal method and variance analysis to assess the significance of the factors affecting the thermodynamic characteristics of the SC-CCES system, and identified the significant factors and interactions in the energy-storage process, the energy-release process and the whole energy-storage system. Results show that the interactions among the components have little influence on the energy-storage process, the energy-release process and the whole energy-storage process of the SC-CCES system; the significant factors relate mainly to the characteristics of the system components themselves, which provides a reference for optimizing the thermal properties of the energy-storage system. PMID:29634742
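
    For readers unfamiliar with the orthogonal method mentioned in the two records above, the sketch below runs a variance analysis on a small L9(3^4) orthogonal array using statsmodels. The factor names and efficiency values are placeholders, not the SC-CCES component parameters; one array column (D) is left out of the model so residual degrees of freedom remain for the F tests.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.formula.api import ols

      # L9(3^4) orthogonal array: 9 runs screening three-level factors.
      L9 = np.array([
          [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
          [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
          [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
      ])
      df = pd.DataFrame(L9, columns=["A", "B", "C", "D"]).astype("category")
      # Hypothetical exergy-efficiency response for each run.
      df["eff"] = [0.62, 0.66, 0.64, 0.70, 0.69, 0.71, 0.60, 0.63, 0.61]

      # Variance analysis (ANOVA) to rank the significance of A, B, C;
      # column D is pooled into the error term.
      model = ols("eff ~ A + B + C", data=df).fit()
      print(sm.stats.anova_lm(model, typ=2))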

  3. Contextual factors affecting autonomy for patients in Iranian hospitals: A qualitative study

    PubMed Central

    Ebrahimi, Hossein; Sadeghian, Efat; Seyedfatemi, Naeimeh; Mohammadi, Eesa; Crowley, Maureen

    2016-01-01

    Background: Consideration of patient autonomy is an essential element in individualized, patient-centered, ethical care. Internal and external factors associated with patient autonomy are related to culture, and it is not clear what they are in Iran. The aim of this study was to explore contextual factors affecting the autonomy of patients in Iranian hospitals. Materials and Methods: This was a qualitative study using conventional content analysis methods. Thirty-four participants (23 patients, 9 nurses, and 2 doctors) from three Iranian teaching hospitals, selected using purposive sampling, participated in semi-structured interviews. Unstructured observation and field notes were additional methods of data collection. The data were subjected to qualitative content analysis and analyzed using the MAXQDA-10 software. Results: Five categories and sixteen subcategories were identified. The five main categories related to patient autonomy were: intrapersonal factors, physical health status, supportive family and friends, communication style, and organizational constraints. Conclusions: In summary, this study uncovered contextual factors that the care team, managers, and planners in the health field should target in order to improve patient autonomy in Iranian hospitals. PMID:27186203

  4. Teaching learning methods of an entrepreneurship curriculum

    PubMed Central

    ESMI, KERAMAT; MARZOUGHI, RAHMATALLAH; TORKZADEH, JAFAR

    2015-01-01

    Introduction One of the most significant elements of entrepreneurship curriculum design is the teaching-learning methods, which play a key role in studies and research related to such a curriculum. It is the teaching method - the systematic, organized and logical way of providing lessons - that should be consistent with entrepreneurship goals and contents, and should also be developed according to the learners' needs. Therefore, the current study aimed to introduce appropriate, modern, and effective methods of teaching entrepreneurship and to validate them. Methods This is a mixed-methods study of a sequential exploratory kind conducted in two stages: a) developing teaching methods for the entrepreneurship curriculum, and b) validating the developed framework. Data were collected through “triangulation” (study of documents, investigation of theoretical foundations and the literature, and semi-structured interviews with key experts). Since the literature on this topic is very rich and the views of the key experts are vast, directed and summative content analysis was used. In the second stage, the qualitative credibility of the research findings was established using qualitative validation criteria (credibility, confirmability, and transferability) and various techniques, and a reliability test was used to ensure that the qualitative part was reliable. Quantitative validation of the developed framework was conducted using exploratory and confirmatory factor analysis methods and Cronbach’s alpha. The data were gathered by distributing a three-aspect questionnaire (direct presentation, interactive, and practical-operational teaching methods) with 29 items among 90 curriculum scholars. The target population was selected by means of purposive sampling with a representative sample. Results Results obtained from exploratory factor analysis showed that a three-factor structure is appropriate for describing the elements of the teaching-learning methods of an entrepreneurship curriculum. The value of the Kaiser-Meyer-Olkin measure of sampling adequacy was 0.72, and Bartlett's test was significant at the 0.0001 level. Except for the internship element, all items had factor loadings higher than 0.3. The results of confirmatory factor analysis also showed the appropriateness of the model, and the criteria for qualitative accreditation were acceptable. Conclusion The developed model can help instructors in selecting an appropriate method of entrepreneurship teaching, and it can also help ensure that the teaching is on the right path. Moreover, the model is comprehensive and includes all the effective teaching methods in entrepreneurship education. It is also based on the qualities, conditions, and requirements of higher education institutions in the Iranian cultural environment. PMID:26457314
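
    The sampling-adequacy and factor-extraction steps reported in this record can be reproduced in outline with the factor_analyzer package. The data below are simulated stand-ins for the 29-item, 90-respondent questionnaire (three correlated item blocks mimicking the three expected aspects); none of the numbers come from the study.

      import numpy as np
      import pandas as pd
      from factor_analyzer import FactorAnalyzer
      from factor_analyzer.factor_analyzer import (
          calculate_bartlett_sphericity, calculate_kmo)

      # Simulated questionnaire: 90 respondents x 29 items loading on
      # three latent aspects, plus noise.
      rng = np.random.default_rng(2)
      latent = rng.normal(size=(90, 3))
      loadings = np.zeros((29, 3))
      for j in range(29):
          loadings[j, j % 3] = 0.7
      items = latent @ loadings.T + 0.5 * rng.normal(size=(90, 29))
      df = pd.DataFrame(items, columns=[f"item{j+1}" for j in range(29)])

      # KMO sampling adequacy and Bartlett's test, as in the abstract.
      _, kmo_total = calculate_kmo(df)
      chi2, p = calculate_bartlett_sphericity(df)
      print(f"KMO = {kmo_total:.2f}, Bartlett chi2 = {chi2:.1f}, p = {p:.3g}")

      # Three-factor EFA; count items loading above 0.3 per factor.
      fa = FactorAnalyzer(n_factors=3, rotation="varimax")
      fa.fit(df)
      print((np.abs(fa.loadings_) > 0.3).sum(axis=0))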

  5. A comparative analysis of ethnomedicinal practices for treating gastrointestinal disorders used by communities living in three national parks (Korea).

    PubMed

    Kim, Hyun; Song, Mi-Jang; Brian, Heldenbrand; Choi, Kyoungho

    2014-01-01

    The purpose of this study is to comparatively analyze the ethnomedicinal practices for gastrointestinal disorders within communities in Jirisan National Park, Gayasan National Park, and Hallasan National Park of Korea. Data were collected through participant observation and in-depth interviews with semi-structured questionnaires. Comparative analysis was accomplished using the informant consensus factor, fidelity level, and internetwork analysis. A total of 490 ethnomedicinal practices recorded from the communities were classified into 110 families, 176 genera, and 220 species that included plants, animals, fungi, and algae. The highest informant consensus factor values among the disorder categories were for enteritis and gastralgia (1.0), followed by indigestion (0.94), constipation (0.93), and abdominal pain and gastroenteric trouble (0.92). In terms of fidelity levels, 71 plant species showed fidelity levels of 100%. In the internetwork analysis between disorders and all medicinal species, the maps are grouped in the center around the four categories of indigestion, diarrhea, abdominal pain, and gastroenteric trouble, respectively. Regarding the research method of this study, the comparative analysis methods will contribute to the preservation of orally transmitted ethnomedicinal knowledge. Among the methods of analysis, the use of internetwork analysis as an analytical tool in this study provides informative internetwork maps between gastrointestinal disorders and medicinal species.
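
    The two quantitative indices used above have simple closed forms. The abstract does not restate them, but the standard ethnobotanical definitions are ICF = (Nur - Nt)/(Nur - 1), where Nur is the number of use-reports in a disorder category and Nt the number of taxa cited for it, and FL(%) = Np/N x 100 for a given species. The counts in the sketch below are hypothetical.

      def informant_consensus_factor(n_use_reports, n_taxa):
          # ICF = (Nur - Nt) / (Nur - 1); 1.0 means every informant
          # cites the same taxa for the disorder category.
          return (n_use_reports - n_taxa) / (n_use_reports - 1)

      def fidelity_level(n_citations_for_disorder, n_citations_total):
          # FL(%) = Np / N * 100 for one species and one disorder.
          return 100.0 * n_citations_for_disorder / n_citations_total

      # Hypothetical counts, not the Korean national-park data.
      print(informant_consensus_factor(n_use_reports=25, n_taxa=1))  # 1.0
      print(fidelity_level(n_citations_for_disorder=18,
                           n_citations_total=18))                    # 100.0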

  6. Methods for analysis of cracks in three-dimensional solids

    NASA Technical Reports Server (NTRS)

    Raju, I. S.; Newman, J. C., Jr.

    1984-01-01

    Analytical and numerical methods for evaluating the stress-intensity factors of three-dimensional cracks in solids are presented, with reference to fatigue failure in aerospace structures. The exact solutions for embedded elliptical and circular cracks in infinite solids, and the approximate methods, including the finite-element, boundary-integral-equation, line-spring, and mixed methods, are discussed. Among the mixed methods, the superposition of analytical and finite element methods, and the stress-difference, discretization-error, alternating, and finite-element-alternating methods, are reviewed. Comparison of the stress-intensity factor solutions for some three-dimensional crack configurations showed good agreement. Thus, the choice of a particular method for evaluating the stress-intensity factor is limited only by the availability of resources and computer programs.

  7. The Shock and Vibration Digest. Volume 16, Number 1

    DTIC Science & Technology

    1984-01-01

    investigation of the measurement of frequency band average loss factors of structural components for use in the statistical energy analysis method of... stiffness. Matrix methods. Key Words: Finite element technique; Statistical energy analysis; Experimental techniques; Framed structures; Computer programs. In order to further understand the practical application of the statistical energy analysis, a two-section plate-like frame structure is

  8. Single-Molecule Studies of Actin Assembly and Disassembly Factors

    PubMed Central

    Smith, Benjamin A.; Gelles, Jeff; Goode, Bruce L.

    2014-01-01

    The actin cytoskeleton is very dynamic and highly regulated by multiple associated proteins in vivo. Understanding how this system of proteins functions in the processes of actin network assembly and disassembly requires methods to dissect the mechanisms of activity of individual factors and of multiple factors acting in concert. The advent of single-filament and single-molecule fluorescence imaging methods has provided a powerful new approach to discovering actin-regulatory activities and obtaining direct, quantitative insights into the pathways of molecular interactions that regulate actin network architecture and dynamics. Here we describe techniques for acquisition and analysis of single-molecule data, applied to the novel challenges of studying the filament assembly and disassembly activities of actin-associated proteins in vitro. We discuss the advantages of single-molecule analysis in directly visualizing the order of molecular events, measuring the kinetic rates of filament binding and dissociation, and studying the coordination among multiple factors. The methods described here complement traditional biochemical approaches in elucidating actin-regulatory mechanisms in reconstituted filamentous networks. PMID:24630103

  9. Shuttle user analysis (study 2.2). Volume 4: Standardized subsystem modules analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The capability to analyze payloads constructed of standardized modules was provided for the planning of future mission models. An inventory of standardized module designs previously obtained was used as a starting point. Some of the conclusions and recommendations are: (1) the two growth factor synthesis methods provide logical configurations for satellite type selection; (2) the recommended method is the one that determines the growth factor as a function of the baseline subsystem weight, since it provides a larger growth factor for small subsystem weights and results in a greater overkill due to standardization; (3) the method that is not recommended is the one that depends upon a subsystem similarity selection, since care must be used in the subsystem similarity selection; (4) it is recommended that the application of standardized subsystem factors be limited to satellites with baseline dry weights between about 700 and 6,500 lbs; and (5) the standardized satellite design approach applies to satellites maintainable in orbit or retrieved for ground maintenance.

  10. [Study on depressive disorder and related factors in surgical inpatients].

    PubMed

    Ge, Hong-min; Liu, Lan-fen; Han, Jian-bo

    2008-03-01

    To investigate the prevalence and possible influencing factors of depressive disorder in surgical inpatients. Two hundred and sixty-six surgical inpatients meeting the inclusion criteria were first screened with the self-rating depression scale (SDS); then the subjects who screened positive, and 20% of those who screened negative, were evaluated with the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID) as the gold standard for the diagnosis of depressive disorder. Possible influencing factors were also analyzed by experienced psychiatrists. The standard score of the SDS in the surgical inpatients was significantly higher than the Chinese norm, and the incidence of depressive disorder in the surgical inpatients was 37.2%. Univariate analysis showed that depressive disorder was associated with gender, education, economic condition, variety of diseases, hospitalization duration, and treatment methods. Logistic regression analysis revealed that gender, economic condition, treatment methods and previous history were the main influencing factors. The incidence of depressive disorder in surgical inpatients is high, and it is mainly influenced by gender, economic condition, treatment methods and previous history.
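
    A minimal sketch of the kind of logistic regression analysis reported above, on simulated data: depression status is regressed on binary candidate factors with statsmodels. The variable names and effect sizes are invented for illustration and do not reflect the study's data.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n = 266
      df = pd.DataFrame({
          "female": rng.integers(0, 2, n),
          "low_income": rng.integers(0, 2, n),
          "surgery_type": rng.integers(0, 2, n),
          "prior_history": rng.integers(0, 2, n),
      })
      # Simulated outcome with assumed effect sizes.
      lp = (-1.2 + 0.8 * df.female + 0.9 * df.low_income
            + 0.5 * df.surgery_type + 1.1 * df.prior_history)
      df["depressed"] = (rng.random(n) < 1 / (1 + np.exp(-lp))).astype(float)

      X = sm.add_constant(df[["female", "low_income", "surgery_type",
                              "prior_history"]])
      result = sm.Logit(df["depressed"], X).fit(disp=0)
      print(result.params.round(3))  # log-odds for each candidate factor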

  11. An Analysis Method for Superconducting Resonator Parameter Extraction with Complex Baseline Removal

    NASA Technical Reports Server (NTRS)

    Cataldo, Giuseppe

    2014-01-01

    A new semi-empirical model is proposed for extracting the quality (Q) factors of arrays of superconducting microwave kinetic inductance detectors (MKIDs). The determination of the total internal and coupling Q factors enables the computation of the loss in the superconducting transmission lines. The method used allows the simultaneous analysis of multiple interacting discrete resonators with the presence of a complex spectral baseline arising from reflections in the system. The baseline removal allows an unbiased estimate of the device response as measured in a cryogenic instrumentation setting.
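
    A much-simplified stand-in for the resonator fitting described above: a single Lorentzian dip in |S21| on a linear baseline, fitted with scipy.optimize.curve_fit to recover the total quality factor. The real method handles multiple interacting resonators and a complex (not merely linear) baseline, and the frequency and Q values below are invented.

      import numpy as np
      from scipy.optimize import curve_fit

      def s21_mag(f, f0, Q, depth, a0, a1):
          # One resonator dip (total Q) times a linear baseline term.
          dip = 1.0 - depth / (1.0 + (2.0 * Q * (f - f0) / f0) ** 2)
          return (a0 + a1 * (f - f0)) * dip

      # Simulated sweep around a 4.5 GHz MKID-like resonance.
      f = np.linspace(4.4995e9, 4.5005e9, 2001)
      rng = np.random.default_rng(3)
      data = s21_mag(f, 4.5e9, 5e4, 0.8, 1.0, 2e-12)
      data = data + 0.005 * rng.normal(size=f.size)

      popt, _ = curve_fit(s21_mag, f, data, p0=[4.5e9, 1e4, 0.5, 1.0, 0.0])
      print(f"fitted Q = {popt[1]:.3g}")  # ~5e4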

  12. Simple analysis of the effect of construction materials on bridge impact factors

    DOT National Transportation Integrated Search

    1999-06-01

    The purpose of this research is to study the influence of different construction materials on the dynamic impact factor of bridges. Initially, introduction of the dynamic impact factor and some evaluation methods are presented. A comparison of the re...

  13. Factors influencing the results of faculty evaluation in Isfahan University of Medical Sciences

    PubMed Central

    Kamali, Farahnaz; Yamani, Nikoo; Changiz, Tahereh; Zoubin, Fatemeh

    2018-01-01

    OBJECTIVE: This study aimed to explore factors influencing the results of faculty member evaluation from the viewpoints of faculty members affiliated with Isfahan University of Medical Sciences, Isfahan, Iran. MATERIALS AND METHODS: This qualitative study was done using a conventional content analysis method. Participants were faculty members of Isfahan University of Medical Sciences who, considering maximum variation in sampling, were chosen with a purposive sampling method. Semi-structured interviews were held with 11 faculty members until data saturation was reached. The interviews were transcribed verbatim and analyzed with the conventional content analysis method for theme development. Further, the MAXQDA software was used for data management. RESULTS: The data analysis led to the development of two main themes, namely, “characteristics of the educational system” and “characteristics of the faculty member evaluation system.” The first main theme consists of three categories, i.e. “characteristics of influential people in evaluation,” “features of the courses,” and “background characteristics.” The other theme has the following categories: “evaluation methods,” “evaluation tools,” “evaluation process,” and “application of evaluation results.” Each category has its own subcategories. CONCLUSIONS: Many factors affect the evaluation of faculty members that should be taken into account by educational policymakers for improving the quality of the educational process. In addition to the factors that directly influence the educational system, methodological problems in the evaluation system need special attention. PMID:29417073

  14. Stress Intensity Factors of Semi-Circular Bend Specimens with Straight-Through and Chevron Notches

    NASA Astrophysics Data System (ADS)

    Ayatollahi, M. R.; Mahdavi, E.; Alborzi, M. J.; Obara, Y.

    2016-04-01

    The semi-circular bend specimen is one of the most useful test specimens for determining the fracture toughness of rock and geo-materials. Generally, in rock test specimens, initial cracks are produced in two shapes: straight-edge cracks and chevron notches. In this study, the minimum dimensionless stress intensity factors of the semi-circular bend (SCB) specimen with straight-through and chevron notches are calculated. First, using finite element analysis, a suitable relation for the dimensionless stress intensity factor of the SCB specimen with a straight-through crack is presented, based on the normalized crack length and the half-distance between supports. For evaluating the validity and accuracy of this relation, the obtained results are then compared with numerical and experimental results reported in the literature. Subsequently, by performing some experiments and also finite element analysis of the SCB specimen with a chevron notch, the minimum dimensionless stress intensity factor of this specimen is obtained. Using the new equation for the dimensionless stress intensity factor of the SCB specimen with a straight-through crack and an analytical method, i.e., Bluhm's slice synthesis method, the minimum (critical) dimensionless stress intensity factor of chevron-notched semi-circular bend specimens is calculated. Good agreement is observed between the results of the two methods.

  15. Old and New Ideas for Data Screening and Assumption Testing for Exploratory and Confirmatory Factor Analysis

    PubMed Central

    Flora, David B.; LaBrish, Cathy; Chalmers, R. Philip

    2011-01-01

    We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables, and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables. PMID:22403561
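
    The point about product-moment correlations and ordered-categorical items can be seen directly in simulation. The sketch below (NumPy only; the thresholds and correlation are arbitrary choices, not from the paper) shows the Pearson correlation of four-category Likert-style items understating the correlation of the underlying continuous variables, which is why polychoric correlations are preferred for item-level factor analysis.

      import numpy as np

      rng = np.random.default_rng(4)
      rho = 0.6
      x, y = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]],
                                     size=100_000).T

      # Discretize into 4 ordered categories with unequal thresholds,
      # as a skewed Likert item might produce.
      cuts = [-0.5, 0.5, 1.5]
      x_ord = np.digitize(x, cuts)
      y_ord = np.digitize(y, cuts)

      print(f"latent r  = {np.corrcoef(x, y)[0, 1]:.3f}")          # ~0.60
      print(f"ordinal r = {np.corrcoef(x_ord, y_ord)[0, 1]:.3f}")  # attenuated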

  16. Design of experiments as a tool for LC-MS/MS method development for the trace analysis of the potentially genotoxic 4-dimethylaminopyridine impurity in glucocorticoids.

    PubMed

    Székely, Gy; Henriques, B; Gil, M; Ramos, A; Alvarez, C

    2012-11-01

    The present study reports on a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method development strategy supported by design of experiments (DoE) for the trace analysis of 4-dimethylaminopyridine (DMAP). Conventional approaches to the development of LC-MS/MS methods usually proceed via trial and error, varying the experimental factors intentionally, which is time consuming; interactions between experimental factors are also not considered. The LC factors chosen for the DoE study include flow (F), gradient (G) and injection volume (V(inj)), while cone voltage (E(con)) and collision energy (E(col)) were chosen as MS parameters. All five factors were studied simultaneously. The method was optimized with respect to four responses: separation of peaks (Sep), peak area (A(peak)), length of the analysis (T) and the signal-to-noise ratio (S/N). A quadratic model, namely a central composite face-centred (CCF) design featuring 29 runs, was used instead of a less powerful linear model, since the increase in the number of injections was insignificant. In order to determine the robustness of the method, a new set of DoE experiments was carried out: robustness around the optimal conditions was evaluated by applying a fractional factorial design of resolution III with 11 runs, wherein additional factors - such as column temperature and quadrupole resolution - were considered. The method utilizes a Phenomenex Gemini NX C-18 HPLC analytical column with electrospray ionization and a triple quadrupole mass detector in multiple reaction monitoring (MRM) mode, resulting in short analyses with a 10 min runtime. Drawbacks of derivatization, namely incomplete reaction and time-consuming sample preparation, have been avoided, and the change from SIM to MRM mode resulted in increased sensitivity and a lower LOQ. The DoE method development strategy led to a method allowing the trace analysis of DMAP at 0.5 ng/ml absolute concentration, which corresponds to a 0.1 ppm limit of quantification in 5 mg/ml mometasone furoate glucocorticoid. The obtained method was validated in a linear range of 0.1-10 ppm and presented a %RSD of 0.02% for system precision. Regarding DMAP recovery in mometasone furoate, spiked samples produced recoveries between 83 and 113% in the range of 0.1-2 ppm. Copyright © 2012 Elsevier B.V. All rights reserved.
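
    One coded-factor layout that reproduces the 29-run CCF design mentioned above is a 2^(5-1) half-fraction (16 runs, generator E = ABCD) plus 10 face-centred axial runs and 3 centre points; whether the authors used exactly this fraction and number of centre points is an assumption. The sketch below builds such a design matrix for the five factors (F, G, Vinj, Econ, Ecol) in coded units.

      import itertools
      import numpy as np

      # 2^(5-1) half-fraction: four free columns, fifth from E = ABCD.
      base = np.array(list(itertools.product([-1, 1], repeat=4)))
      factorial = np.hstack([base, base.prod(axis=1, keepdims=True)])

      # Face-centred star points (alpha = 1): one factor at +/-1, rest 0.
      axial = np.zeros((10, 5))
      for i in range(5):
          axial[2 * i, i] = -1
          axial[2 * i + 1, i] = 1

      center = np.zeros((3, 5))               # centre-point replicates
      design = np.vstack([factorial, axial, center])
      print(design.shape)                     # (29, 5) coded runs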

  17. Psychometric Properties of the Persian Version of the Social Anxiety - Acceptance and Action Questionnaire

    PubMed Central

    Soltani, Esmail; Bahrainian, Seyed Abdolmajid; Masjedi Arani, Abbas; Farhoudian, Ali; Gachkar, Latif

    2016-01-01

    Background Social anxiety disorder is often related to specific impairment or distress in different areas of life, including occupational, social and family settings. Objective The purpose of the present study was to examine the psychometric properties of the Persian version of the Social Anxiety-Acceptance and Action Questionnaire (SA-AAQ) in university students. Materials and Methods In this descriptive cross-sectional study, 324 students from Shahid Beheshti University of Medical Sciences participated via the cluster sampling method during 2015. Factor analysis by the principal component analysis method, internal consistency analysis, and convergent and divergent validity were conducted to examine the validity of the SA-AAQ. To calculate the reliability of the SA-AAQ, Cronbach’s alpha and test-retest reliability were used. Results The factor analysis by the principal component analysis method yielded three factors, which were named acceptance, action and non-judging of experience. The three-factor solution explained 51.82% of the variance. Evidence for the internal consistency of the SA-AAQ was obtained by calculating correlations between the SA-AAQ and its subscales. Support for the convergent and discriminant validity of the SA-AAQ was obtained via its correlations with the Acceptance and Action Questionnaire-II, the Social Interaction Anxiety Scale, the Cognitive Fusion Questionnaire, the Believability of Anxious Feelings and Thoughts Questionnaire, the Valued Living Questionnaire and the WHOQOL-BREF. The reliability of the SA-AAQ, assessed via Cronbach’s alpha and test-retest coefficients, yielded values of 0.84 and 0.84, respectively. Conclusions The Iranian version of the SA-AAQ has acceptable psychometric properties in university students. The SA-AAQ is a valid and reliable measure for use in research investigations and therapeutic interventions. PMID:27803719

  18. Integration of ANFIS, NN and GA to determine core porosity and permeability from conventional well log data

    NASA Astrophysics Data System (ADS)

    Ja'fari, Ahmad; Hamidzadeh Moghadam, Rasoul

    2012-10-01

    Routine core analysis provides useful information for petrophysical study of hydrocarbon reservoirs. Effective porosity and fluid conductivity (permeability) can be obtained from core analysis in the laboratory. However, coring hydrocarbon-bearing intervals and analyzing the obtained cores in the laboratory is expensive and time consuming. In this study, an improved method is proposed for establishing a quantitative correlation between the porosity and permeability obtained from cores and conventional well log data by integrating different artificial intelligence systems. The proposed method combines the results of adaptive neuro-fuzzy inference system (ANFIS) and neural network (NN) algorithms for overall estimation of core data from conventional well log data. These methods multiply the output of each algorithm by a weight factor. Simple averaging and weighted averaging were used for determining the weight factors; in the weighted averaging method, the genetic algorithm (GA) is used to determine the weight factors. The overall algorithm was applied in one of SW Iran’s oil fields with two cored wells. One-third of all data were used as the test dataset and the rest were used for training the networks. Results show that the output of the GA averaging method provided the best mean square error and also the best correlation coefficient with real core data.
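
    The GA-weighted averaging step can be illustrated with a toy hold-out set. In the sketch below, two arrays stand in for the predictions of the already-trained ANFIS and NN models, and a minimal genetic algorithm (selection plus mutation only; crossover is trivial for a single gene) searches for the weight factor minimizing the mean square error. All data and settings are invented.

      import numpy as np

      rng = np.random.default_rng(5)
      # Hypothetical core porosity and two models' predictions of it.
      core = rng.uniform(0.05, 0.30, size=200)
      pred_anfis = core + rng.normal(0, 0.015, size=200)
      pred_nn = core + rng.normal(0, 0.025, size=200)

      def mse(w):
          # Error of the weighted average w*ANFIS + (1 - w)*NN.
          return np.mean((w * pred_anfis + (1 - w) * pred_nn - core) ** 2)

      # Minimal GA over the single weight factor in [0, 1].
      pop = rng.uniform(0, 1, size=40)
      for _ in range(60):
          fitness = np.array([mse(w) for w in pop])
          parents = pop[np.argsort(fitness)[:20]]            # selection
          children = rng.choice(parents, 20) + rng.normal(0, 0.05, 20)
          pop = np.concatenate([parents, np.clip(children, 0, 1)])

      best = pop[np.argmin([mse(w) for w in pop])]
      print(f"GA weight for ANFIS: {best:.2f}, MSE: {mse(best):.2e}")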

  19. [Study on ecological suitability regionalization of Eucommia ulmoides in Guizhou].

    PubMed

    Kang, Chuan-Zhi; Wang, Qing-Qing; Zhou, Tao; Jiang, Wei-Ke; Xiao, Cheng-Hong; Xie, Yu

    2014-05-01

    To study the ecological suitability regionalization of Eucommia ulmoides, for selecting artificial planting bases and high-quality industrial raw material purchase areas of the herb in Guizhou. Based on an investigation of 14 Eucommia ulmoides producing areas, pinoresinol diglucoside content and ecological factors were obtained, and a spatial analysis method was used to carry out the ecological suitability regionalization. Meanwhile, combining the pinoresinol diglucoside content, the correlation between the major active components and environmental factors was analyzed by statistical analysis. The most suitable planting area of Eucommia ulmoides was the northwest of Guizhou. The distribution of Eucommia ulmoides was mainly affected by the type and pH value of the soil and by monthly precipitation. The spatial structure of the major active components in Eucommia ulmoides was randomly distributed in global space, but had only one aggregation point with a high positive correlation in local space. The major active components of Eucommia ulmoides had no correlation with altitude, longitude or latitude. Using the spatial analysis method and the statistical analysis method, based on environmental factors and pinoresinol diglucoside content, the ecological suitability regionalization of Eucommia ulmoides can provide a reference for the selection of suitable planting areas and artificial planting bases and for directing production layout.

  20. On the Preferred Flesh Color of Japanese and Chinese and the Determining Factors —Investigation of the Younger Generation Using Method of Successive Categories and Semantic Differential Method—

    NASA Astrophysics Data System (ADS)

    Fan, Ying; Deng, Pei; Tsuruoka, Hideki; Aoki, Naokazu; Kobayashi, Hiroyuki

    The preferred flesh color was surveyed by the successive five categories method and the SD method in Japan and China to investigate its determining factors. The Chinese most preferred flesh color was more reddish than the Japanese one, while the flesh color accepted by 50% and more of the observers in China was larger in chromaticness and more yellowish than in Japan. In the determining factors for selection of the preferred color extracted by a factor analysis, a big difference between Japanese and Chinese men was observed. The first factor of the former was kind personality, whereas that of the latter was showy appearance.

  1. Adhesive blood microsampling systems for steroid measurement via LC-MS/MS in the rat.

    PubMed

    Heussner, Kirsten; Rauh, Manfred; Cordasic, Nada; Menendez-Castro, Carlos; Huebner, Hanna; Ruebner, Matthias; Schmidt, Marius; Hartner, Andrea; Rascher, Wolfgang; Fahlbusch, Fabian B

    2017-04-01

    Liquid chromatography tandem mass spectrometry (LC-MS/MS) allows for the direct analysis of multiple hormones in a single probe with minimal sample volume. Rodent-based animal studies rely strongly on microsampling, such as the dried blood spot (DBS) method. However, DBS suffers the drawback of hematocrit dependence (it is non-volumetric). Hence, novel volumetric microsampling techniques, allowing the sampling of fixed, accurate volumes, were introduced recently. We compared these methods for steroid analysis in the rat to improve inter-system comparability. We analyzed steroid levels in blood using the absorptive microsampling devices Whatman® 903 Protein Saver Cards, Noviplex™ Plasma Prep Cards and the Mitra™ Microsampling device, and compared the obtained results to the respective EDTA plasma levels. Quantitative steroid analysis was performed via LC-MS/MS. For the determination of the plasma volume factor for each steroid, levels in pooled blood samples from human adults and from rats (18 weeks) were compared, and the transferability of these factors was evaluated in a new set of juvenile (21 days) and adult (18 weeks) rats. Hematocrit was determined concomitantly. Using these approaches, we were unable to apply one single volume factor for each steroid. Instead, plasma volume factors had to be adjusted for the recovery rate of each steroid and device individually. The tested microsampling systems did not allow the use of one single volume factor for adult and juvenile rats, owing to an unexpectedly strong hematocrit dependency and other steroid-specific (pre-analytic) factors. Our study provides correction factors for LC-MS/MS steroid analysis of volumetric and non-volumetric microsampling systems in comparison to plasma. It argues for thorough analysis of chromatographic effects before the use of novel volumetric systems for steroid analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Factors influencing the results of faculty evaluation in Isfahan University of Medical Sciences.

    PubMed

    Kamali, Farahnaz; Yamani, Nikoo; Changiz, Tahereh; Zoubin, Fatemeh

    2018-01-01

    This study aimed to explore factors influencing the results of faculty member evaluation from the viewpoints of faculty members affiliated with Isfahan University of Medical Sciences, Isfahan, Iran. This qualitative study was done using a conventional content analysis method. Participants were faculty members of Isfahan University of Medical Sciences who, considering maximum variation in sampling, were chosen with a purposive sampling method. Semi-structured interviews were held with 11 faculty members until data saturation was reached. The interviews were transcribed verbatim and analyzed with the conventional content analysis method for theme development. Further, the MAXQDA software was used for data management. The data analysis led to the development of two main themes, namely, "characteristics of the educational system" and "characteristics of the faculty member evaluation system." The first main theme consists of three categories, i.e. "characteristics of influential people in evaluation," "features of the courses," and "background characteristics." The other theme has the following categories: "evaluation methods," "evaluation tools," "evaluation process," and "application of evaluation results." Each category has its own subcategories. Many factors affect the evaluation of faculty members that should be taken into account by educational policymakers for improving the quality of the educational process. In addition to the factors that directly influence the educational system, methodological problems in the evaluation system need special attention.

  3. Physical and Cognitive-Affective Factors Associated with Fatigue in Individuals with Fibromyalgia: A Multiple Regression Analysis

    ERIC Educational Resources Information Center

    Muller, Veronica; Brooks, Jessica; Tu, Wei-Mo; Moser, Erin; Lo, Chu-Ling; Chan, Fong

    2015-01-01

    Purpose: The main objective of this study was to determine the extent to which physical and cognitive-affective factors are associated with fibromyalgia (FM) fatigue. Method: A quantitative descriptive design using correlation techniques and multiple regression analysis. The participants consisted of 302 members of the National Fibromyalgia &…

  4. A New, More Powerful Approach to Multitrait-Multimethod Analyses: An Application of Second-Order Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Marsh, Herbert W.; Hocevar, Dennis

    The advantages of applying confirmatory factor analysis (CFA) to multitrait-multimethod (MTMM) data are widely recognized. However, because CFA as traditionally applied to MTMM data incorporates single indicators of each scale (i.e., each trait/method combination), important weaknesses are the failure to: (1) correct appropriately for measurement…

  5. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale.

    PubMed

    Sediyama, Cristina Y N; Moura, Ricardo; Garcia, Marina S; da Silva, Antonio G; Soraggi, Carolina; Neves, Fernando S; Albuquerque, Maicon R; Whiteside, Setephen P; Malloy-Diniz, Leandro F

    2017-01-01

    Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency; (b) lack of premeditation; (c) lack of perseverance; (d) sensation seeking. In the present study, 384 participants (278 women and 106 men), recruited from schools, universities, leisure centers and workplaces, completed the UPPS scale. An exploratory factor analysis was performed using Varimax factor rotation and Kaiser normalization, and we also conducted two confirmatory analyses to test the independence of the UPPS components found in previous analyses. Results: Mean UPPS total scores decreased with age, and the youngest participants (below 30 years) scored significantly higher than the groups over 30 years. No gender difference was found. Cronbach's alpha values were satisfactory for all subscales, with similarly high values across subscales, although confirmatory factor analysis indexes indicated a poor model fit. The results of the two exploratory factor analyses were satisfactory. Conclusion: Our results showed that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS.

  6. Research on the factors of return on equity: empirical analysis in Chinese port industries from 2000-2008

    NASA Astrophysics Data System (ADS)

    Li, Wei

    2012-01-01

    Port industries are basic industries in the national economy and have become among the most modernized sectors in every country. The development of the port industry is advantageous not only for promoting the optimal arrangement of social resources, but also for promoting the growth of foreign trade volume by enhancing transportation functions. Return on equity (ROE) is a direct indicator related to the maximization of a company's wealth, and it makes up for the shortcomings of earnings per share (EPS). The aim of this paper is to establish the correlation between ROE and other financial indicators by choosing the listed port companies as the research objects and selecting the data of these companies from 2000 to 2008 as the empirical sample, with statistical analysis of the charted figures and coefficients. The detailed analysis method used in the paper is a combination of trend analysis, comparative analysis and ratio-based factor analysis. This paper analyzes and compares all these factors and draws the following conclusions. Firstly, ROE has a positive correlation with total assets turnover, main profit margin and fixed asset ratio, while it has a negative correlation with the assets-liabilities ratio, total assets growth rate and DOL. Secondly, main profit margin has the greatest positive effect on ROE among all these factors. The second greatest factor is total assets turnover, which shows that operating capacity is also an important indicator, after profitability. Thirdly, the assets-liabilities ratio has the greatest negative effect on ROE among all these factors.

  7. Research on the factors of return on equity: empirical analysis in Chinese port industries from 2000-2008

    NASA Astrophysics Data System (ADS)

    Li, Wei

    2011-12-01

    Port industries are basic industries in the national economy and have become among the most modernized sectors in every country. The development of the port industry is advantageous not only for promoting the optimal arrangement of social resources, but also for promoting the growth of foreign trade volume by enhancing transportation functions. Return on equity (ROE) is a direct indicator related to the maximization of a company's wealth, and it makes up for the shortcomings of earnings per share (EPS). The aim of this paper is to establish the correlation between ROE and other financial indicators by choosing the listed port companies as the research objects and selecting the data of these companies from 2000 to 2008 as the empirical sample, with statistical analysis of the charted figures and coefficients. The detailed analysis method used in the paper is a combination of trend analysis, comparative analysis and ratio-based factor analysis. This paper analyzes and compares all these factors and draws the following conclusions. Firstly, ROE has a positive correlation with total assets turnover, main profit margin and fixed asset ratio, while it has a negative correlation with the assets-liabilities ratio, total assets growth rate and DOL. Secondly, main profit margin has the greatest positive effect on ROE among all these factors. The second greatest factor is total assets turnover, which shows that operating capacity is also an important indicator, after profitability. Thirdly, the assets-liabilities ratio has the greatest negative effect on ROE among all these factors.

  8. The effect of push factors in the leisure sports participation of the retired elderly on re-socialization recovery resilience.

    PubMed

    Lee, Kwang-Uk; Kim, Hong-Rok; Yi, Eun-Surk

    2014-04-01

    This study aimed to provide useful materials for the realization of a healthy and happy welfare society through the re-socialization of the retired elderly, by identifying the effect of the push factors in the leisure sports participation of the retired elderly on re-socialization and recovery resilience. To achieve the study purpose, 304 subjects over the age of 55 residing in Seoul and Gyeonggi among the retired elderly were selected using systematic stratified cluster random sampling. Questionnaires were used as the research instrument. Data judged to be incomplete or unreliable were excluded from the analysis, and the remaining data were analyzed with the SPSS 18.0 program using factor analysis, correlation analysis, and multiple regression analysis. The study results obtained from this analysis are as follows: First, psychological stability among the push factors in the leisure sports participation of the elderly had a significant effect on re-socialization, while health pursuit had a significant effect on personal exchange and economic activity among the sub-factors of re-socialization. Second, psychological stability among the push factors in the leisure sports participation of the retired elderly had a significant effect on recovery resilience; personal relationships had an effect on empathy skills, impulse control, and self-efficacy; and health pursuit had a significant effect on impulse control, optimism, and self-efficacy.

  9. The effect of push factors in the leisure sports participation of the retired elderly on re-socialization recovery resilience

    PubMed Central

    Lee, Kwang-Uk; Kim, Hong-Rok; Yi, Eun-Surk

    2014-01-01

    This study aimed to provide useful materials for the realization of a healthy and happy welfare society through the re-socialization of the retired elderly, by identifying the effect of the push factors in the leisure sports participation of the retired elderly on re-socialization and recovery resilience. To achieve the study purpose, 304 subjects over the age of 55 residing in Seoul and Gyeonggi among the retired elderly were selected using systematic stratified cluster random sampling. Questionnaires were used as the research instrument. Data judged to be incomplete or unreliable were excluded from the analysis, and the remaining data were analyzed with the SPSS 18.0 program using factor analysis, correlation analysis, and multiple regression analysis. The study results obtained from this analysis are as follows: First, psychological stability among the push factors in the leisure sports participation of the elderly had a significant effect on re-socialization, while health pursuit had a significant effect on personal exchange and economic activity among the sub-factors of re-socialization. Second, psychological stability among the push factors in the leisure sports participation of the retired elderly had a significant effect on recovery resilience; personal relationships had an effect on empathy skills, impulse control, and self-efficacy; and health pursuit had a significant effect on impulse control, optimism, and self-efficacy. PMID:24877044

  10. Determination of the reference air kerma rate for 192Ir brachytherapy sources and the related uncertainty.

    PubMed

    van Dijk, Eduard; Kolkman-Deurloo, Inger-Karine K; Damen, Patricia M G

    2004-10-01

    Different methods exist to determine the air kerma calibration factor of an ionization chamber for the spectrum of a 192Ir high-dose-rate (HDR) or pulsed-dose-rate (PDR) source. An analysis of two methods to obtain such a calibration factor was performed: (i) the method recommended by [Goetsch et al., Med. Phys. 18, 462-467 (1991)] and (ii) the method employed by the Dutch national standards institute NMi [Petersen et al., Report S-EI-94.01 (NMi, Delft, The Netherlands, 1994)]. This analysis showed a systematic difference on the order of 1% in the determination of the strength of 192Ir HDR and PDR sources, depending on the method used for determining the air kerma calibration factor. The definitive significance of the difference between these methods can only be addressed after performing an accurate analysis of the associated uncertainties. For an NE 2561 (or equivalent) ionization chamber and an in-air jig, a typical uncertainty budget of 0.94% was found with the NMi method. The largest contribution to the type-B uncertainty is the uncertainty in the air kerma calibration factor for isotope i, N(i)k, as determined by the primary or secondary standards laboratories. This uncertainty is dominated by the uncertainties in the physical constants for the average mass-energy absorption coefficient ratio and the stopping power ratios. This means that it is not foreseeable that the standards laboratories can decrease the uncertainty in the air kerma calibration factors for ionization chambers in the short term. When the results of determinations of the 192Ir reference air kerma rate in, e.g., different institutes are compared, the uncertainties in the physical constants are the same. To compare the applied techniques, the ratio of the results can be judged by leaving out the uncertainties due to these physical constants; in that case an uncertainty budget of 0.40% (coverage factor = 2) should be taken into account. Due to the differences in approach between the method used by NMi and the method recommended by Goetsch et al., an extra type-B uncertainty of 0.9% (k = 1) has to be taken into account when the method of Goetsch et al. is applied. Compared to the uncertainty of 1% (k = 2) found for the air calibration of 192Ir, the difference of 0.9% found is significant.
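
    For orientation, the sketch below combines relative standard uncertainty components in quadrature and expands the result with a coverage factor k = 2, which is the arithmetic behind budgets like those quoted above. The component names and values are placeholders, not the NMi budget itself.

      import numpy as np

      # Illustrative k=1 relative standard uncertainties, in percent.
      components = {
          "air kerma calibration factor N_k": 0.40,
          "chamber positioning": 0.15,
          "charge measurement": 0.10,
          "temperature/pressure correction": 0.08,
      }
      combined_k1 = np.sqrt(sum(u ** 2 for u in components.values()))
      expanded_k2 = 2.0 * combined_k1  # coverage factor k = 2
      print(f"combined (k=1): {combined_k1:.2f} %")
      print(f"expanded (k=2): {expanded_k2:.2f} %")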

  11. Assessment Methods of Groundwater Overdraft Area and Its Application

    NASA Astrophysics Data System (ADS)

    Dong, Yanan; Xing, Liting; Zhang, Xinhui; Cao, Qianqian; Lan, Xiaoxun

    2018-05-01

    Groundwater is an important source of water, and long-term high demand has made groundwater over-exploited. Over-exploitation causes many environmental and geological problems. This paper explores the concept of the over-exploitation area, summarizes the natural and social attributes of over-exploitation areas, and expounds their evaluation methods, including single-factor evaluation, multi-factor system analysis and numerical methods. At the same time, the different methods are compared and analyzed. Then, taking Northern Weifang as an example, this paper demonstrates the practicality of the assessment methods.

  12. PARTIAL RESTRAINING FORCE INTRODUCTION METHOD FOR DESIGNING CONSTRUCTION COUNTERMESURE ON ΔB METHOD

    NASA Astrophysics Data System (ADS)

    Nishiyama, Taku; Imanishi, Hajime; Chiba, Noriyuki; Ito, Takao

    Landslide or slope failure is a three-dimensional movement phenomenon, so a three-dimensional treatment makes it easier to understand stability. The ΔB method (a simplified three-dimensional slope stability analysis method) is based on the limit equilibrium method and is equivalent to an approximate three-dimensional slope stability analysis that extends two-dimensional cross-section stability analysis results to assess stability. This analysis can be conducted using conventional spreadsheets or two-dimensional slope stability computational software. This paper describes the concept of the partial restraining force introduction method for designing construction countermeasures using the distribution of the restraining force found along survey lines, which is based on the distribution of survey-line safety factors derived from the above-stated analysis. This paper also presents the transverse distributive method of restraining force used for planning ground stabilization on the basis of the example analysis.

  13. Factors Affecting the Communication Competence in Iranian Nursing Students: A Qualitative Study

    PubMed Central

    Jouzi, Mina; Vanaki, Zohreh; Mohammadi, Easa

    2015-01-01

    Background: Communication competence in nursing students is one of the nursing education requirements, especially during the internship period, the final stage of the bachelor nursing education in Iran. Several factors can influence this competence and identifying them could help provide safe care by nursing students in the future. Objectives: This study aimed to investigate factors that influence nursing students' communication competence. Patients and Methods: A purposeful sampling technique was used to select 18 nursing students who had completed their internship. Semi-structured interviews were conducted and data were analyzed by the conventional qualitative content analysis method. Results: After data analysis, three main categories were achieved: organizational factors, humanistic factors and socio-cultural factors. The main and latent theme that affected the students' communication competence was not being accepted as a caregiver in the clinical environment. Conclusions: With regards to students not being accepted in health care environments, it is recommended to plan special programs for empowering students to acquire better social state and acceptance by the health care team. PMID:26019902

  14. Stability-Derivative Determination from Flight Data

    NASA Technical Reports Server (NTRS)

    Holowicz, Chester H.; Holleman, Euclid C.

    1958-01-01

    A comprehensive discussion of the various factors affecting the determination of stability and control derivatives from flight data is presented, based on the experience of the NASA High-Speed Flight Station. Factors relating to test techniques, determination of mass characteristics, instrumentation, and methods of analysis are discussed. For most longitudinal-stability-derivative analyses, simple equations utilizing period and damping have been found to be as satisfactory as more comprehensive methods. The graphical time-vector method has been the basis of lateral-derivative analysis, although simple approximate methods can be useful if applied with caution. Control effectiveness has generally been obtained by relating the peak acceleration to the rapid control input, and consideration must be given to aerodynamic contributions if reasonable accuracy is to be realized. Because of the many factors involved in the determination of stability derivatives, it is believed that the primary stability and control derivatives are probably accurate to within 10 to 25 percent, depending upon the specific derivative. Static-stability derivatives at low angle of attack show the greatest accuracy.

  15. Shear-wave velocity and site-amplification factors for 50 Australian sites determined by the spectral analysis of surface waves method

    USGS Publications Warehouse

    Kayen, Robert E.; Carkin, Bradley A.; Allen, Trevor; Collins, Clive; McPherson, Andrew; Minasian, Diane L.

    2015-01-01

    One-dimensional shear-wave velocity (VS ) profiles are presented at 50 strong motion sites in New South Wales and Victoria, Australia. The VS profiles are estimated with the spectral analysis of surface waves (SASW) method. The SASW method is a noninvasive method that indirectly estimates the VS at depth from variations in the Rayleigh wave phase velocity at the surface.

  16. Investigation of High-Angle-of-Attack Maneuver-Limiting Factors. Part 1. Analysis and Simulation

    DTIC Science & Technology

    1980-12-01

    useful, are not so satisfying or instructive as the more positive identification of causal factors offered by the methods developed in Reference 5...same methods be applied to additional high-performance fighter aircraft having widely differing high AOA handling characteristics to see if further...predictions and the nonlinear model results were resolved. The second task involved development of methods, criteria, and an associated pilot rating scale, for

  17. [The impact of social and hygienic lifestyle factors on health status of students].

    PubMed

    Sakharova, O B; Kiku, P F; Gorborukova, T V

    2012-01-01

    The complex estimation of the impact of socio-hygienic lifestyle factors on the health of students has been performed. In this work, the data of a sociological analysis (questionnaire) and the methods of multivariate statistics (correlation, regression analysis, the method of correlation pleiades by P. V. Terentiev) were used. Among the analyzed components, the average monthly income was found to make the greatest contribution to the health state and physical capacity of the studied contingent of students. The influence of this factor is most pronounced in the group of students with average wealth. The quality of nutrition and the mode of life depend on the level of material well-being of students. Students with a deficient or excess body weight are more susceptible to the effects of such lifestyle factors as nutrition, physical activity, bad habits and prosperity.

  18. Factor analysis as a tool for spectral line component separation: 21cm emission in the direction of L1780

    NASA Technical Reports Server (NTRS)

    Toth, L. V.; Mattila, K.; Haikala, L.; Balazs, L. G.

    1992-01-01

    The spectra of the 21cm HI radiation from the direction of L1780, a small high-galactic-latitude dark/molecular cloud, were analyzed by multivariate methods. Factor analysis was performed on the HI (21cm) spectra in order to separate the different components responsible for the spectral features. The rotated, orthogonal factors explain the spectra as a sum of radiation from the background (an extended HI emission layer) and from the L1780 dark cloud. The coefficients of the cloud-indicator factors were used to locate the HI 'halo' of the molecular cloud. Our statistically derived 'background' and 'cloud' spectral profiles, as well as the spatial distribution of the HI halo emission, were compared to the results of a previous study which used conventional methods to analyze nearly the same data set.
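
    As an illustration of the general technique (not the authors' code), orthogonally rotated factors can be extracted from a matrix of spectra with off-the-shelf tools; the data array below is a random placeholder standing in for the 21cm profiles:

      import numpy as np
      from sklearn.decomposition import FactorAnalysis  # rotation needs sklearn >= 0.24

      rng = np.random.default_rng(0)
      spectra = rng.normal(size=(200, 64))  # (sky positions, velocity channels), placeholder

      fa = FactorAnalysis(n_components=2, rotation="varimax")
      scores = fa.fit_transform(spectra)    # factor coefficients per sky position
      profiles = fa.components_             # factor spectral profiles
      # Mapping scores[:, k] over the sky locates where factor k
      # (e.g., background layer vs. cloud) dominates the emission.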

  19. Stress intensity factors for surface and corner cracks emanating from a wedge-loaded hole

    NASA Technical Reports Server (NTRS)

    Zhao, W.; Sutton, M. A.; Shivakumar, K. N.; Newman, J. C., Jr.

    1994-01-01

    To assist analysis of riveted lap joints, stress intensity factors are determined for surface and corner cracks emanating from a wedge-loaded hole by using a 3-D weight function method in conjunction with a 3-D finite element method. A stress intensity factor equation for surface cracks is also developed to provide a closed-form solution. The equation covers commonly-encountered geometrical ranges and retains high accuracy over the entire range.

  20. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors.

    PubMed

    Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism. The comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Based on the importance and deficiencies of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in the paper. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of hazard factors and that the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails of hazard factors. However, for dust storm disasters with short return periods, three-dimensional joint return period simulation shows no obvious advantage. If the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest the multivariate analysis method may be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters. Copyright © 2015 Elsevier B.V. All rights reserved.
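
    The joint return period computation is compact enough to sketch for the bivariate case. Assuming a Frank copula and the 'AND' definition (both hazard factors exceed their levels), survival-probability algebra gives the result; theta, u, v and mu below are illustrative, not fitted values:

      import numpy as np

      def frank_copula(u, v, theta):
          """Bivariate Frank copula C(u, v; theta), theta != 0."""
          num = np.expm1(-theta * u) * np.expm1(-theta * v)
          return -np.log1p(num / np.expm1(-theta)) / theta

      def joint_return_period_and(u, v, theta, mu=1.0):
          """Return period for both factors exceeding their levels at once;
          mu is the mean interarrival time of events in years."""
          return mu / (1.0 - u - v + frank_copula(u, v, theta))

      print(joint_return_period_and(0.9, 0.9, theta=5.0))  # ~29.5 years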

  1. Indicators of economic security of the region: a risk-based approach to assessing and rating

    NASA Astrophysics Data System (ADS)

    Karanina, Elena; Loginov, Dmitri

    2017-10-01

    The article presents the results of research into theoretical and methodological problems of developing an economic security strategy for a particular region, grounded in the composition of risk factors. An analysis of those risk factors is performed. The threshold values of indicators of economic security of regions were determined using the methods of socioeconomic statistics. The authors concluded that in modern Russian conditions it is necessary to pay close attention to the analysis of the composition and level of indicators of economic security of the region and, based on the materials of this analysis, to formulate more accurate decisions concerning the strategy of socio-economic development.

  2. a Cognitive Approach to Teaching a Graduate-Level Geobia Course

    NASA Astrophysics Data System (ADS)

    Bianchetti, Raechel A.

    2016-06-01

    Remote sensing image analysis training occurs both in the classroom and the research lab. Education in the classroom for traditional pixel-based image analysis has been standardized across college curriculums. However, with the increasing interest in Geographic Object-Based Image Analysis (GEOBIA), there is a need to develop classroom instruction for this method of image analysis. While traditional remote sensing courses emphasize the expansion of skills and knowledge related to the use of computer-based analysis, GEOBIA courses should examine the cognitive factors underlying visual interpretation. This paper provides an initial analysis of the development, implementation, and outcomes of a GEOBIA course that considers not only the computational methods of GEOBIA, but also the cognitive factors of expertise that such software attempts to replicate. Finally, a reflection on the first instantiation of this course is presented, in addition to plans for development of an open-source repository for course materials.

  3. Leading for the long haul: a mixed-method evaluation of the Sustainment Leadership Scale (SLS).

    PubMed

    Ehrhart, Mark G; Torres, Elisa M; Green, Amy E; Trott, Elise M; Willging, Cathleen E; Moullin, Joanna C; Aarons, Gregory A

    2018-01-19

    Despite our progress in understanding the organizational context for implementation and specifically the role of leadership in implementation, its role in sustainment has received little attention. This paper took a mixed-method approach to examine leadership during the sustainment phase of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Utilizing the Implementation Leadership Scale as a foundation, we sought to develop a short, practical measure of sustainment leadership that can be used for both applied and research purposes. Data for this study were collected as part of a larger mixed-method study of the sustainment of an evidence-based intervention, SafeCare®. Quantitative data were collected from 157 providers using web-based surveys. Confirmatory factor analysis was used to examine the factor structure of the Sustainment Leadership Scale (SLS). Qualitative data were collected from 95 providers who participated in one of 15 focus groups. A framework approach guided qualitative data analysis. Mixed-method integration was also utilized to examine convergence of quantitative and qualitative findings. Confirmatory factor analysis supported the a priori higher order factor structure of the SLS, with subscales indicating a single higher order sustainment leadership factor. The SLS demonstrated excellent internal consistency reliability. Qualitative analyses offered support for the dimensions of sustainment leadership captured by the quantitative measure, in addition to uncovering a fifth possible factor, available leadership. This study found qualitative and quantitative support for the pragmatic SLS measure. The SLS can be used for assessing leadership of first-level leaders to understand how staff perceive leadership during sustainment and to suggest areas where leaders could direct more attention in order to increase the likelihood that EBIs are institutionalized into the normal functioning of the organization.
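
    The internal consistency reliability reported for the SLS is conventionally quantified with Cronbach's alpha, which is simple to compute directly; a minimal sketch (the item-score matrix is hypothetical):

      import numpy as np

      def cronbach_alpha(items):
          """items: (n_respondents, n_items) matrix of item scores."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_vars / total_var)

      rng = np.random.default_rng(0)
      scores = rng.integers(1, 6, size=(157, 4))  # hypothetical 5-point items
      print(cronbach_alpha(scores))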

  4. Recurrent-neural-network-based Boolean factor analysis and its application to word clustering.

    PubMed

    Frolov, Alexander A; Husek, Dusan; Polyakov, Pavel Yu

    2009-07-01

    The objective of this paper is to introduce a neural-network-based algorithm for word clustering as an extension of the neural-network-based Boolean factor analysis algorithm (Frolov, 2007). It is shown that this extended algorithm supports even the more complex model of signals that are supposed to be related to textual documents. It is hypothesized that every topic in textual data is characterized by a set of words which coherently appear in documents dedicated to a given topic. The appearance of each word in a document is coded by the activity of a particular neuron. In accordance with the Hebbian learning rule implemented in the network, sets of coherently appearing words (treated as factors) create tightly connected groups of neurons, hence revealing them as attractors of the network dynamics. The found factors are eliminated from the network memory by the Hebbian unlearning rule, facilitating the search for other factors. Topics related to the found sets of words can be identified based on the words' semantics. To make the method complete, a special technique based on a Bayesian procedure has been developed for the following purposes: first, to provide a complete description of factors in terms of component probability, and second, to enhance the accuracy of classification of signals to determine whether they contain the factor. Since it is assumed that every word may possibly contribute to several topics, the proposed method might be related to the method of fuzzy clustering. In this paper, we show that the results of Boolean factor analysis and fuzzy clustering are not contradictory, but complementary. To demonstrate the capabilities of this approach, the method is applied to two types of textual data on neural networks in two different languages. The obtained topics and corresponding words are at a good level of agreement despite the fact that identical topics in Russian and English conferences contain different sets of keywords.
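
    A toy sketch of the attractor idea, assuming binary word-in-document codes (a simplification for illustration, not the authors' algorithm): Hebbian weights accumulate word co-occurrence, and iterating a thresholded update from a cue word settles on one tightly connected word group.

      import numpy as np

      rng = np.random.default_rng(0)
      n_words = 50
      docs = (rng.random((200, n_words)) < 0.1).astype(float)  # word-in-document codes

      W = docs.T @ docs            # Hebbian learning: co-occurrence weights
      np.fill_diagonal(W, 0.0)     # no self-connections

      x = np.zeros(n_words)
      x[0] = 1.0                   # cue with a single word
      for _ in range(10):
          h = W @ x                                      # input to each unit
          x = (h > np.percentile(h, 90)).astype(float)   # keep strongest units
      # the nonzero entries of x approximate one coherently appearing word set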

  5. Slope stability analysis using limit equilibrium method in nonlinear criterion.

    PubMed

    Lin, Hang; Zhong, Wenwen; Xiong, Wei; Tang, Wenyu

    2014-01-01

    In slope stability analysis, the limit equilibrium method is usually used to calculate the safety factor of slope based on the Mohr-Coulomb criterion. However, the Mohr-Coulomb criterion is of limited use for the description of rock mass. To overcome its shortcomings, this paper combined the Hoek-Brown criterion and the limit equilibrium method and proposed an equation for calculating the safety factor of slope with the limit equilibrium method in the Hoek-Brown criterion through equivalent cohesive strength and friction angle. Moreover, this paper investigates the impact of Hoek-Brown parameters on the safety factor of slope, which reveals that there is a linear relation between equivalent cohesive strength and weakening factor D. However, there are nonlinear relations between equivalent cohesive strength and the Geological Strength Index (GSI), the uniaxial compressive strength of intact rock σci, and the parameter of intact rock mi. There is a nonlinear relation between the friction angle and all Hoek-Brown parameters. With the increase of D, the safety factor of slope F decreases linearly; with the increase of GSI, F increases nonlinearly; when σci is relatively small, the relation between F and σci is nonlinear, but when σci is relatively large, the relation is linear; with the increase of mi, F decreases first and then increases.
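
    The equivalent cohesive strength and friction angle used in such analyses commonly follow the Hoek-Carranza-Torres-Corkum (2002) fitting; a sketch of that standard published form (not necessarily this paper's exact implementation), assuming the upper confining-stress limit sigma_3max is supplied by the user:

      import numpy as np

      def hoek_brown_equivalent_mc(gsi, mi, sigma_ci, D, sigma_3max):
          """Equivalent Mohr-Coulomb cohesion (units of sigma_ci) and friction
          angle (degrees) per Hoek et al. (2002), fitted over 0 < sigma_3 < sigma_3max."""
          mb = mi * np.exp((gsi - 100.0) / (28.0 - 14.0 * D))
          s = np.exp((gsi - 100.0) / (9.0 - 3.0 * D))
          a = 0.5 + (np.exp(-gsi / 15.0) - np.exp(-20.0 / 3.0)) / 6.0
          s3n = sigma_3max / sigma_ci
          t = (s + mb * s3n) ** (a - 1.0)
          phi = np.degrees(np.arcsin(
              6.0 * a * mb * t / (2.0 * (1.0 + a) * (2.0 + a) + 6.0 * a * mb * t)))
          c = (sigma_ci * ((1.0 + 2.0 * a) * s + (1.0 - a) * mb * s3n) * t
               / ((1.0 + a) * (2.0 + a)
                  * np.sqrt(1.0 + 6.0 * a * mb * t / ((1.0 + a) * (2.0 + a)))))
          return c, phi

      print(hoek_brown_equivalent_mc(gsi=45, mi=10, sigma_ci=50.0, D=0.0, sigma_3max=5.0))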

  6. Slope Stability Analysis Using Limit Equilibrium Method in Nonlinear Criterion

    PubMed Central

    Lin, Hang; Zhong, Wenwen; Xiong, Wei; Tang, Wenyu

    2014-01-01

    In slope stability analysis, the limit equilibrium method is usually used to calculate the safety factor of slope based on the Mohr-Coulomb criterion. However, the Mohr-Coulomb criterion is of limited use for the description of rock mass. To overcome its shortcomings, this paper combined the Hoek-Brown criterion and the limit equilibrium method and proposed an equation for calculating the safety factor of slope with the limit equilibrium method in the Hoek-Brown criterion through equivalent cohesive strength and friction angle. Moreover, this paper investigates the impact of Hoek-Brown parameters on the safety factor of slope, which reveals that there is a linear relation between equivalent cohesive strength and weakening factor D. However, there are nonlinear relations between equivalent cohesive strength and the Geological Strength Index (GSI), the uniaxial compressive strength of intact rock σci, and the parameter of intact rock mi. There is a nonlinear relation between the friction angle and all Hoek-Brown parameters. With the increase of D, the safety factor of slope F decreases linearly; with the increase of GSI, F increases nonlinearly; when σci is relatively small, the relation between F and σci is nonlinear, but when σci is relatively large, the relation is linear; with the increase of mi, F decreases first and then increases. PMID:25147838

  7. Soil analysis based on samples withdrawn from different volumes: correlation versus calibration

    Treesearch

    Lucian Weilopolski; Kurt Johnsen; Yuen Zhang

    2010-01-01

    Soil, particularly in forests, is replete with spatial variation with respect to soil C. The present standard chemical method for soil analysis by dry combustion (DC) is destructive, and comprehensive sampling is labor intensive and time consuming. These, among other factors, are contributing to the development of new methods for soil analysis. These include a near...

  8. Estimating Cyanobacteria Community Dynamics and its Relationship with Environmental Factors

    PubMed Central

    Luo, Wenhuai; Chen, Huirong; Lei, Anping; Lu, Jun; Hu, Zhangli

    2014-01-01

    The cyanobacteria community dynamics in two eutrophic freshwater bodies (Tiegang Reservoir and Shiyan Reservoir) was studied with both a traditional microscopic counting method and a PCR-DGGE genotyping method. Results showed that cyanobacterium Phormidium tenue was the predominant species; twenty-six cyanobacteria species were identified in water samples collected from the two reservoirs, among which fourteen were identified with the morphological method and sixteen with the PCR-DGGE method. The cyanobacteria community composition analysis showed a seasonal fluctuation from July to December. The cyanobacteria population peaked in August in both reservoirs, with cell abundances of 3.78 × 10⁸ cells L⁻¹ and 1.92 × 10⁸ cells L⁻¹ in the Tiegang and Shiyan reservoirs, respectively. Canonical Correspondence Analysis (CCA) was applied to further investigate the correlation between cyanobacteria community dynamics and environmental factors. The result indicated that the cyanobacteria community dynamics was mostly correlated with pH, temperature and total nitrogen. This study demonstrated that data obtained from PCR-DGGE combined with a traditional morphological method could reflect cyanobacteria community dynamics and its correlation with environmental factors in eutrophic freshwater bodies. PMID:24448632

  9. Method and apparatus for determining material structural integrity

    DOEpatents

    Pechersky, M.J.

    1994-01-01

    Disclosed are a nondestructive method and apparatus for determining the structural integrity of materials by combining laser vibrometry with damping analysis to determine the damping loss factor. The method comprises the steps of vibrating the area being tested over a known frequency range and measuring vibrational force and velocity vs time over the known frequency range. Vibrational velocity is preferably measured by a laser vibrometer. Measurement of the vibrational force depends on the vibration method: if an electromagnetic coil is used to vibrate a magnet secured to the area being tested, then the vibrational force is determined by the coil current. If a reciprocating transducer is used, the vibrational force is determined by a force gauge in the transducer. Using vibrational analysis, a plot of the drive point mobility of the material over the preselected frequency range is generated from the vibrational force and velocity data. Damping loss factor is derived from a plot of the drive point mobility over the preselected frequency range using the resonance dwell method and compared with a reference damping loss factor for structural integrity evaluation.
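
    For illustration, a damping loss factor can also be read off a drive-point mobility curve with the standard half-power bandwidth rule (a common textbook alternative to the resonance-dwell method named in the patent); this sketch assumes a single, well-separated resonance:

      import numpy as np

      def loss_factor_half_power(freqs, mobility):
          """Estimate the loss factor eta = (f2 - f1) / fn from the -3 dB
          bandwidth of the (complex) drive-point mobility around its peak."""
          mag = np.abs(mobility)
          i_pk = np.argmax(mag)
          half = mag[i_pk] / np.sqrt(2.0)
          above = np.where(mag >= half)[0]   # assumes one clean resonance band
          f1, f2 = freqs[above[0]], freqs[above[-1]]
          return (f2 - f1) / freqs[i_pk]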

  10. An Analysis of the U.S. Navy Enlisted Separation Questionnaire

    DTIC Science & Technology

    1981-06-01

    on the amount of variance which will give a satisfactory and acceptable solution, only a small number (less than n) of factors will be needed to...three factors rather than nine initial categories of data classification. C. FACTOR ANALYSIS OF SUBSETS OF THE DATA During this phase of the analysis...Harman, H. H. & Holzinger, K. J., Factor Analysis: A Synthesis of Factorial Methods, University of Chicago Press, 1941. 27. Fruchter, B., Introduction to

  11. Pathway-based factor analysis of gene expression data produces highly heritable phenotypes that associate with age.

    PubMed

    Anand Brown, Andrew; Ding, Zhihao; Viñuela, Ana; Glass, Dan; Parts, Leopold; Spector, Tim; Winn, John; Durbin, Richard

    2015-03-09

    Statistical factor analysis methods have previously been used to remove noise components from high-dimensional data prior to genetic association mapping and, in a guided fashion, to summarize biologically relevant sources of variation. Here, we show how the derived factors summarizing pathway expression can be used to analyze the relationships between expression, heritability, and aging. We used skin gene expression data from 647 twins from the MuTHER Consortium and applied factor analysis to concisely summarize patterns of gene expression to remove broad confounding influences and to produce concise pathway-level phenotypes. We derived 930 "pathway phenotypes" that summarized patterns of variation across 186 KEGG pathways (five phenotypes per pathway). We identified 69 significant associations of age with phenotype from 57 distinct KEGG pathways at a stringent Bonferroni threshold (P < 5.38 × 10⁻⁵). These phenotypes are more heritable (h² = 0.32) than gene expression levels. On average, expression levels of 16% of genes within these pathways are associated with age. Several significant pathways relate to metabolizing sugars and fatty acids; others relate to insulin signaling. We have demonstrated that factor analysis methods combined with biological knowledge can produce more reliable phenotypes with less stochastic noise than the individual gene expression levels, which increases our power to discover biologically relevant associations. These phenotypes could also be applied to discover associations with other environmental factors. Copyright © 2015 Brown et al.

  12. Pathway-Based Factor Analysis of Gene Expression Data Produces Highly Heritable Phenotypes That Associate with Age

    PubMed Central

    Anand Brown, Andrew; Ding, Zhihao; Viñuela, Ana; Glass, Dan; Parts, Leopold; Spector, Tim; Winn, John; Durbin, Richard

    2015-01-01

    Statistical factor analysis methods have previously been used to remove noise components from high-dimensional data prior to genetic association mapping and, in a guided fashion, to summarize biologically relevant sources of variation. Here, we show how the derived factors summarizing pathway expression can be used to analyze the relationships between expression, heritability, and aging. We used skin gene expression data from 647 twins from the MuTHER Consortium and applied factor analysis to concisely summarize patterns of gene expression to remove broad confounding influences and to produce concise pathway-level phenotypes. We derived 930 “pathway phenotypes” that summarized patterns of variation across 186 KEGG pathways (five phenotypes per pathway). We identified 69 significant associations of age with phenotype from 57 distinct KEGG pathways at a stringent Bonferroni threshold (P < 5.38 × 10⁻⁵). These phenotypes are more heritable (h² = 0.32) than gene expression levels. On average, expression levels of 16% of genes within these pathways are associated with age. Several significant pathways relate to metabolizing sugars and fatty acids; others relate to insulin signaling. We have demonstrated that factor analysis methods combined with biological knowledge can produce more reliable phenotypes with less stochastic noise than the individual gene expression levels, which increases our power to discover biologically relevant associations. These phenotypes could also be applied to discover associations with other environmental factors. PMID:25758824

  13. A GIS-based automated procedure for landslide susceptibility mapping by the Conditional Analysis method: the Baganza valley case study (Italian Northern Apennines)

    NASA Astrophysics Data System (ADS)

    Clerici, Aldo; Perego, Susanna; Tellini, Claudio; Vescovi, Paolo

    2006-08-01

    Among the many GIS-based multivariate statistical methods for landslide susceptibility zonation, the so-called “Conditional Analysis method” holds a special place for its conceptual simplicity. In fact, in this method landslide susceptibility is simply expressed as landslide density in correspondence with different combinations of instability-factor classes. To overcome the operational complexity connected to the long, tedious and error-prone sequence of commands required by the procedure, a shell script mainly based on the GRASS GIS was created. The script, starting from a landslide inventory map and a number of factor maps, automatically carries out the whole procedure, resulting in the construction of a map with five landslide susceptibility classes. A validation procedure makes it possible to assess the reliability of the resulting model, while the simple mean deviation of the density values in the factor-class combinations helps to evaluate the goodness of the landslide density distribution. The procedure was applied to a relatively small basin (167 km²) in the Italian Northern Apennines, considering three landslide types, namely rotational slides, flows and complex landslides, for a total of 1,137 landslides, and five factors, namely lithology, slope angle and aspect, elevation and slope/bedding relations. The analysis of the resulting 31 different models obtained by combining the five factors confirms the role of lithology, slope angle and slope/bedding relations in influencing slope stability.
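
    The Conditional Analysis step itself reduces to a density computation over factor-class combinations, which a groupby expresses directly; the column names and values below are hypothetical stand-ins for the factor maps:

      import pandas as pd

      # one row per terrain cell: factor classes plus a mapped-landslide flag
      cells = pd.DataFrame({
          "lithology": ["A", "A", "B", "B", "B", "A"],
          "slope_cls": [1, 2, 1, 2, 2, 1],
          "landslide": [0, 1, 0, 1, 1, 0],
      })

      # susceptibility = landslide density within each class combination
      density = (cells.groupby(["lithology", "slope_cls"])["landslide"]
                      .mean()
                      .rename("landslide_density"))
      cells = cells.join(density, on=["lithology", "slope_cls"])
      print(cells)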

  14. Patient Involvement in Safe Delivery: A Qualitative Study.

    PubMed

    Olfati, Forozun; Asefzadeh, Saeid; Changizi, Nasrin; Keramat, Afsaneh; Yunesian, Masud

    2015-09-28

    Patient involvement in safe delivery planning is considered important yet not widely practiced. The present study aimed to identify the factors that affect patient involvement in safe delivery, as recommended by parturient women. This study was part of a qualitative research project conducted with the content analysis method and purposive sampling in 2013. The data were collected through 63 semi-structured interviews in 4 hospitals and analyzed using thematic content analysis. The participants in this research were women before discharge and after delivery. Findings were analyzed using Colaizzi's method. Four categories of factors that could affect patient involvement in safe delivery emerged from our analysis: patient-related (true and false beliefs, literacy, privacy, respect for the patient), illness-related (pain, type of delivery, patient safety incidents), health care professional- and task-related (behavior, monitoring and training), and health care setting-related (financial aspects, facilities). More research is needed to explore the factors affecting the participation of mothers. It is therefore recommended to: 1) take notice of the education of mothers, their husbands, midwives and specialists; 2) provide pregnant women with insurance coverage from the outset of pregnancy, especially during the prenatal period; 3) form a labor pain committee consisting of midwives, obstetricians, and anesthesiologists in order to identify the preferred painless labor methods based on the existing facilities and conditions; 4) carry out research on observing patients' privacy and dignity; and 5) pay more attention to the factors affecting cesarean sections.

  15. Structural analysis of a Petri net model of oxidative stress in atherosclerosis.

    PubMed

    Kozak, Adam; Formanowicz, Dorota; Formanowicz, Piotr

    2018-06-01

    Atherosclerosis is a complex process in which sub-endothelial plaques accumulate and decrease the lumen of the blood vessels. This disorder affects people of all ages, but its progression is asymptomatic for many years. It is regulated by many typical and atypical factors, including the immune system response, chronic kidney disease, a diet rich in lipids, a local inflammatory process and local oxidative stress, the last of which is one of the key factors. In this study, a Petri net model of atherosclerosis regulation is presented. The model also includes information about stoichiometric relationships between its components and covers all the mentioned factors. For the model, a structural analysis based on invariants was performed and biological conclusions are presented. Since the model contains inhibitor arcs, a heuristic method for the analysis of such cases is presented. This method can be used to extend the concept of feasible t-invariants.
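
    For context, the t-invariants underpinning the structural analysis are nonnegative integer vectors x satisfying C·x = 0, where C is the net's incidence matrix; a minimal sketch on an illustrative three-place cycle (not the atherosclerosis model itself):

      import math
      from sympy import Matrix

      # incidence matrix C: rows = places, columns = transitions
      C = Matrix([[-1,  0,  1],
                  [ 1, -1,  0],
                  [ 0,  1, -1]])

      for v in C.nullspace():
          scale = math.lcm(*[int(e.q) for e in v])  # clear fractions
          print((v * scale).T)                      # e.g. Matrix([[1, 1, 1]])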

  16. Application of the FTA and ETA Method for Gas Hazard Identification for the Performance of Safety Systems in the Industrial Department

    NASA Astrophysics Data System (ADS)

    Ignac-Nowicka, Jolanta

    2018-03-01

    The paper analyzes the conditions of safe use of industrial gas systems and the factors influencing gas hazards. A typical gas installation and its basic features are characterized. The results of a gas threat analysis in an industrial enterprise using the FTA fault tree method and the ETA event tree method are presented. Selected methods of identifying hazards in the gas industry are compared with respect to the scope of their use. The paper presents an analysis of two exemplary hazards: an industrial gas catastrophe (FTA) and a gas explosion (ETA). In both cases, technical risks and human errors (the human factor) were taken into account. The cause-effect relationships of hazards and their causes are presented in the form of diagrams in the drawings.
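
    The gate arithmetic behind an FTA probability estimate is simple when the basic events are independent; a minimal sketch with purely illustrative event probabilities (not values from the paper):

      import math

      def and_gate(*ps):
          return math.prod(ps)                           # all inputs must occur

      def or_gate(*ps):
          return 1.0 - math.prod(1.0 - p for p in ps)    # any input occurs

      p_leak = or_gate(1e-3, 5e-4)   # corrosion OR joint failure (illustrative)
      p_ignition = 1e-2              # ignition source present (illustrative)
      print(f"P(top event) = {and_gate(p_leak, p_ignition):.2e}")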

  17. An Efficient Taguchi Approach for the Performance Optimization of Health, Safety, Environment and Ergonomics in Generation Companies

    PubMed Central

    Azadeh, Ali; Sheikhalishahi, Mohammad

    2014-01-01

    Background A unique framework for performance optimization of generation companies (GENCOs) based on health, safety, environment, and ergonomics (HSEE) indicators is presented. Methods To rank this sector of industry, a combination of data envelopment analysis (DEA), principal component analysis (PCA), and Taguchi methods is used for all branches of GENCOs. These methods are applied in an integrated manner to measure the performance of GENCOs. The preferred model among DEA, PCA, and Taguchi is selected based on sensitivity analysis and maximum correlation between rankings. To achieve the stated objectives, noise is introduced into the input data. Results The results show that Taguchi outperforms the other methods. Moreover, a comprehensive experiment is carried out to identify the most influential factor for ranking GENCOs. Conclusion The approach developed in this study could be used for continuous assessment and improvement of GENCOs' performance in supplying energy with respect to HSEE factors. The results of such studies would help managers to have a better understanding of weak and strong points in terms of HSEE factors. PMID:26106505

  18. Adjusted Analyses in Studies Addressing Therapy and Harm: Users' Guides to the Medical Literature.

    PubMed

    Agoritsas, Thomas; Merglen, Arnaud; Shah, Nilay D; O'Donnell, Martin; Guyatt, Gordon H

    2017-02-21

    Observational studies almost always have bias because prognostic factors are unequally distributed between patients exposed or not exposed to an intervention. The standard approach to dealing with this problem is adjusted or stratified analysis. Its principle is to use measurement of risk factors to create prognostically homogeneous groups and to combine effect estimates across groups. The purpose of this Users' Guide is to introduce readers to fundamental concepts underlying adjustment as a way of dealing with prognostic imbalance and to the basic principles and relative trustworthiness of various adjustment strategies. One alternative to the standard approach is propensity analysis, in which groups are matched according to the likelihood of membership in exposed or unexposed groups. Propensity methods can deal with multiple prognostic factors, even if there are relatively few patients having outcome events. However, propensity methods do not address other limitations of traditional adjustment: investigators may not have measured all relevant prognostic factors (or not accurately), and unknown factors may bias the results. A second approach, instrumental variable analysis, relies on identifying a variable associated with the likelihood of receiving the intervention but not associated with any prognostic factor or with the outcome (other than through the intervention); this could mimic randomization. However, as with assumptions of other adjustment approaches, it is never certain if an instrumental variable analysis eliminates bias. Although all these approaches can reduce the risk of bias in observational studies, none replace the balance of both known and unknown prognostic factors offered by randomization.
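
    A minimal sketch of the propensity step described above, using logistic regression for the score and greedy nearest-neighbor matching; the data are simulated placeholders, and a real analysis would add balance diagnostics:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      X = rng.normal(size=(500, 4))                          # measured prognostic factors
      treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))  # exposure depends on X[:, 0]

      # propensity score: modeled probability of receiving the intervention
      ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

      # greedy 1:1 matching (with replacement) on the propensity score
      controls = np.where(treated == 0)[0]
      matches = {i: controls[np.argmin(np.abs(ps[controls] - ps[i]))]
                 for i in np.where(treated == 1)[0]}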

  19. Analysis of case-only studies accounting for genotyping error.

    PubMed

    Cheng, K F

    2007-03-01

    The case-only design provides one approach to assess possible interactions between genetic and environmental factors. It has been shown that if these factors are conditionally independent, then a case-only analysis is not only valid but also very efficient. However, a drawback of the case-only approach is that its conclusions may be biased by genotyping errors. In this paper, our main aim is to propose a method for analysis of case-only studies when these errors occur. We show that the bias can be adjusted through the use of internal validation data, which are obtained by genotyping some sampled individuals twice. Our analysis is based on a simple and yet highly efficient conditional likelihood approach. Simulation studies considered in this paper confirm that the new method has acceptable performance under genotyping errors.

  20. Estimating gene function with least squares nonnegative matrix factorization.

    PubMed

    Wang, Guoli; Ochs, Michael F

    2007-01-01

    Nonnegative matrix factorization is a machine learning algorithm that has extracted information from data in a number of fields, including imaging and spectral analysis, text mining, and microarray data analysis. One limitation with the method for linking genes through microarray data in order to estimate gene function is the high variance observed in transcription levels between different genes. Least squares nonnegative matrix factorization uses estimates of the uncertainties on the mRNA levels for each gene in each condition, to guide the algorithm to a local minimum in normalized chi2, rather than a Euclidean distance or divergence between the reconstructed data and the data itself. Herein, application of this method to microarray data is demonstrated in order to predict gene function.
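
    The modification described, weighting the reconstruction error by per-gene, per-condition uncertainties, drops naturally into the classic multiplicative update scheme; a sketch of weighted updates assuming Gaussian noise with known standard deviations sigma (an illustration of the idea, not the authors' exact implementation):

      import numpy as np

      def ls_nmf(X, sigma, k, n_iter=500, eps=1e-9):
          """Minimize sum(((X - W @ H) / sigma)**2) with W, H >= 0.
          X, sigma: (n_genes, n_conditions) nonnegative data and its errors."""
          rng = np.random.default_rng(0)
          n, m = X.shape
          Lam = 1.0 / sigma**2                  # per-entry weights
          W, H = rng.random((n, k)), rng.random((k, m))
          for _ in range(n_iter):
              H *= (W.T @ (Lam * X)) / (W.T @ (Lam * (W @ H)) + eps)
              W *= ((Lam * X) @ H.T) / ((Lam * (W @ H)) @ H.T + eps)
          return W, H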

  1. Analysis of stationary availability factor of two-level backbone computer networks with arbitrary topology

    NASA Astrophysics Data System (ADS)

    Rahman, P. A.

    2018-05-01

    This scientific paper deals with two-level backbone computer networks with arbitrary topology. A specialized method, offered by the author for calculation of the stationary availability factor of two-level backbone computer networks, based on Markov reliability models for a set of independent repairable elements with given failure and repair rates and on methods of discrete mathematics, is discussed. A specialized algorithm, offered by the author for analysis of network connectivity, taking into account different kinds of network equipment failures, is also described. Finally, this paper presents an example of calculation of the stationary availability factor for a backbone computer network with a given topology.
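
    The element-level building block is the two-state Markov model of a repairable unit, whose stationary availability is mu/(lambda + mu); series and parallel compositions then follow from independence. A minimal sketch (the paper's connectivity algorithm for arbitrary topologies is not reproduced here):

      import math

      def element_availability(lam, mu):
          """Stationary availability from failure rate lam and repair rate mu."""
          return mu / (lam + mu)

      def series(*avail):
          """All elements needed (links along one path): availabilities multiply."""
          return math.prod(avail)

      def parallel(*avail):
          """Redundant elements: down only if every element is down."""
          return 1.0 - math.prod(1.0 - a for a in avail)

      link = element_availability(lam=1e-4, mu=1e-2)   # illustrative rates, 1/h
      print(parallel(series(link, link), series(link, link)))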

  2. Analysis of soybean production and import trends and its import factors in Indonesia

    NASA Astrophysics Data System (ADS)

    Ningrum, I. H.; Irianto, H.; Riptanti, E. W.

    2018-03-01

    This study aims to analyze the factors affecting soybean imports in Indonesia and to determine the trend and projection of Indonesian soybean production and imports in 2016-2020. The basic method used in this research is the descriptive analysis method. The data used are secondary time series data from 1979-2015. Data were analyzed using a simultaneous equations model estimated with the 2SLS (Two-Stage Least Squares) method, together with trend analysis. The results showed that the factors affecting soybean imports in Indonesia are consumption and production. Consumption has a positive effect while production has a negative effect. The percentage change in soybean imports is greater than the percentage change in consumption and production of soybeans. Consumption is positively influenced by imports and production, while production is influenced positively by consumption and negatively by imports. The production trend of soybean in 2016-2020 has a tendency to increase, with a percentage of 11.18% per year. Production in 2016 is projected at 1,110,537 tons, while in 2020 it will increase to 1,721,350 tons. The import trend in 2016-2020 has a tendency to increase, with an average percentage of 4.13% per year. Imports in 2016 are projected at 2,224,188 tons, while in 2020 they will increase to 2,611,270 tons.
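
    Two-stage least squares itself is compact; below is a manual sketch under illustrative variable roles (imports as the outcome, consumption as the endogenous regressor, with user-supplied instruments), not the authors' full simultaneous-equations model:

      import numpy as np

      def two_stage_least_squares(y, x_endog, Z, X_exog):
          """All inputs are 2-D arrays with one row per year. Stage 1 regresses
          the endogenous regressor on instruments Z plus exogenous regressors;
          stage 2 regresses y on the fitted values plus exogenous regressors."""
          ones = np.ones((len(y), 1))
          Z1 = np.hstack([ones, Z, X_exog])
          x_hat = Z1 @ np.linalg.lstsq(Z1, x_endog, rcond=None)[0]  # stage 1
          X2 = np.hstack([ones, x_hat, X_exog])
          return np.linalg.lstsq(X2, y, rcond=None)[0]              # stage 2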

  3. Analysis of Developmental Data: Comparison Among Alternative Methods

    ERIC Educational Resources Information Center

    Wilson, Ronald S.

    1975-01-01

    To examine the ability of the correction factor epsilon to counteract statistical bias in univariate analysis, an analysis of variance (adjusted by epsilon) and a multivariate analysis of variance were performed on the same data. The results indicated that univariate analysis is a fully protected design when used with epsilon. (JMB)

  4. A three-step approach for the derivation and validation of high-performing predictive models using an operational dataset: congestive heart failure readmission case study.

    PubMed

    AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku

    2014-05-27

    The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as being strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators. The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting classifier which averaged the results of multinomial logistic regression and voting feature intervals classifiers. Of 42 final model risk factors, discharge disposition, discretized age, and indicators of anemia were the most significant. This model achieved a c-statistic of 86.8%. The proposed three-step analytical approach enhanced predictive model performance for CHF readmissions. It could potentially be leveraged to improve predictive model performance in other areas of clinical medicine.

  5. Comparison of Factor Simplicity Indices for Dichotomous Data: DETECT R, Bentler's Simplicity Index, and the Loading Simplicity Index

    ERIC Educational Resources Information Center

    Finch, Holmes; Stage, Alan Kirk; Monahan, Patrick

    2008-01-01

    A primary assumption underlying several of the common methods for modeling item response data is unidimensionality, that is, test items tap into only one latent trait. This assumption can be assessed several ways, using nonlinear factor analysis and DETECT, a method based on the item conditional covariances. When multidimensionality is identified,…

  6. Modern Methods of Rail Welding

    NASA Astrophysics Data System (ADS)

    Kozyrev, Nikolay A.; Kozyreva, Olga A.; Usoltsev, Aleksander A.; Kryukov, Roman E.; Shevchenko, Roman A.

    2017-10-01

    Existing methods of rail welding that make it possible to obtain a continuous welded rail track are reviewed in this article. An analysis of the existing welding methods allows the issue of continuous rail track to be considered in detail. Metallurgical and welding technologies of rail welding, as well as process technologies reducing the aftereffects of temperature exposure, are important factors determining the quality and reliability of the continuous rail track. The analysis of the existing methods of rail welding makes it possible to identify a line of research for solving this problem.

  7. Psychometric Properties of a Korean Measure of Person-Directed Care in Nursing Homes

    ERIC Educational Resources Information Center

    Choi, Jae-Sung; Lee, Minhong

    2014-01-01

    Objective: This study examined the validity and reliability of a person-directed care (PDC) measure for nursing homes in Korea. Method: Managerial personnel from 223 nursing homes in 2010 and 239 in 2012 were surveyed. Results: Item analysis and exploratory factor analysis for the first sample generated a 33-item PDC measure with eight factors.…

  8. Accuracy of Revised and Traditional Parallel Analyses for Assessing Dimensionality with Binary Data

    ERIC Educational Resources Information Center

    Green, Samuel B.; Redell, Nickalus; Thompson, Marilyn S.; Levy, Roy

    2016-01-01

    Parallel analysis (PA) is a useful empirical tool for assessing the number of factors in exploratory factor analysis. On conceptual and empirical grounds, we argue for a revision to PA that makes it more consistent with hypothesis testing. Using Monte Carlo methods, we evaluated the relative accuracy of the revised PA (R-PA) and traditional PA…
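
    Traditional PA, the baseline against which the revision is argued, is easy to sketch: factors are retained while the observed eigenvalues exceed a chosen quantile of eigenvalues obtained from random data of the same size (the 95th percentile used here is one common choice):

      import numpy as np

      def parallel_analysis(data, n_sim=1000, q=0.95, seed=0):
          """Horn-style parallel analysis on the correlation matrix of data."""
          rng = np.random.default_rng(seed)
          n, p = data.shape
          obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
          sims = np.empty((n_sim, p))
          for i in range(n_sim):
              r = rng.normal(size=(n, p))
              sims[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
          ref = np.quantile(sims, q, axis=0)
          keep = 0
          while keep < p and obs[keep] > ref[keep]:   # retain the leading run
              keep += 1
          return keep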

  9. Reduced angiogenic factor expression in intrauterine fetal growth restriction using semiquantitative immunohistochemistry and digital image analysis.

    PubMed

    Alahakoon, Thushari I; Zhang, Weiyi; Arbuckle, Susan; Zhang, Kewei; Lee, Vincent

    2018-05-01

    To localize, quantify and compare the angiogenic factors vascular endothelial growth factor (VEGF) and placental growth factor (PlGF), as well as their receptors, fms-like tyrosine kinase receptor (Flt-1) and kinase insert domain receptor (KDR), in the placentas of normal pregnancy and of pregnancies complicated by preeclampsia (PE), intrauterine fetal growth restriction (IUGR) and PE + IUGR. In a prospective cross-sectional case-control study, 30 pregnant women between 24-40 weeks of gestation were recruited into four clinical groups. Representative placental samples were stained for VEGF, PlGF, Flt-1 and KDR. Analysis was performed using semiquantitative methods and digital image analysis. Overall, VEGF and Flt-1 were strongly expressed and did not show any conclusive difference in expression between study groups. PlGF and KDR showed significantly reduced expression in the placentas from pregnancies complicated by IUGR compared with normal and preeclamptic pregnancies. The lack of PlGF and KDR may be a cause of the development of IUGR and may explain the loss of vasculature and villous architecture in IUGR. Automated digital image analysis software is a viable alternative to the manual reading of placental immunohistochemical staining. © 2018 Japan Society of Obstetrics and Gynecology.

  10. Validity and consistency assessment of accident analysis methods in the petroleum industry.

    PubMed

    Ahmadi, Omran; Mortazavi, Seyed Bagher; Khavanin, Ali; Mokarami, Hamidreza

    2017-11-17

    Accident analysis is the main aspect of accident investigation. It includes the method of connecting different causes in a procedural way. Therefore, it is important to use valid and reliable methods for the investigation of the different causal factors of accidents, especially the noteworthy ones. This study aimed to assess the accuracy (sensitivity index [SI]) and consistency of the six most commonly used accident analysis methods in the petroleum industry. In order to evaluate the methods of accident analysis, two real case studies (a process safety accident and a personal accident) from the petroleum industry were analyzed by 10 assessors. The accuracy and consistency of these methods were then evaluated. The assessors were trained in a workshop on accident analysis methods. The systematic cause analysis technique and bowtie methods gained the greatest SI scores for the personal and process safety accidents, respectively. The best average results for the consistency of a single method (based on 10 independent assessors) were in the region of 70%. This study confirmed that the application of methods with pre-defined causes and a logic tree could enhance the sensitivity and consistency of accident analysis.

  11. Seismic Hazard Analysis — Quo vadis?

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2008-05-01

    The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications such as the design of critical and general (non-critical) civil infrastructures, technical and financial risk analysis. A set of criteria is developed for and applied to an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large scale studies. These deficiencies result in the lack of ability of a correct treatment of dependencies between physical parameters and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario-earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, current methods are based on a probabilistic approach with its unsolved deficiencies. Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in the safety factors. These factors are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to overly conservative results, especially if applied to generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.

  12. The Network Structure of Human Personality According to the NEO-PI-R: Matching Network Community Structure to Factor Structure

    PubMed Central

    Goekoop, Rutger; Goekoop, Jaap G.; Scholte, H. Steven

    2012-01-01

    Introduction Human personality is described preferentially in terms of factors (dimensions) found using factor analysis. An alternative and highly related method is network analysis, which may have several advantages over factor analytic methods. Aim To directly compare the ability of network community detection (NCD) and principal component factor analysis (PCA) to examine modularity in multidimensional datasets such as the neuroticism-extraversion-openness personality inventory revised (NEO-PI-R). Methods 434 healthy subjects were tested on the NEO-PI-R. PCA was performed to extract factor structures (FS) of the current dataset using both item scores and facet scores. Correlational network graphs were constructed from univariate correlation matrices of interactions between both items and facets. These networks were pruned in a link-by-link fashion while calculating the network community structure (NCS) of each resulting network using the Wakita Tsurumi clustering algorithm. NCSs were matched against FS and networks of best matches were kept for further analysis. Results At facet level, NCS showed a best match (96.2%) with a ‘confirmatory’ 5-FS. At item level, NCS showed a best match (80%) with the standard 5-FS and involved a total of 6 network clusters. Lesser matches were found with ‘confirmatory’ 5-FS and ‘exploratory’ 6-FS of the current dataset. Network analysis did not identify facets as a separate level of organization in between items and clusters. A small-world network structure was found in both item- and facet level networks. Conclusion We present the first optimized network graph of personality traits according to the NEO-PI-R: a ‘Personality Web’. Such a web may represent the possible routes that subjects can take during personality development. NCD outperforms PCA by producing plausible modularity at item level in non-standard datasets, and can identify the key roles of individual items and clusters in the network. PMID:23284713
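
    A minimal stand-in for the NCD pipeline, assuming modularity maximization in place of the Wakita-Tsurumi algorithm and an arbitrary pruning threshold; the item matrix is simulated with three latent traits so that the recovered communities have structure to find:

      import numpy as np
      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities

      rng = np.random.default_rng(0)
      latent = rng.normal(size=(434, 3))                       # three latent traits
      items = np.repeat(latent, 10, axis=1) + rng.normal(size=(434, 30))

      R = np.corrcoef(items, rowvar=False)                     # item correlation matrix
      G = nx.Graph()
      G.add_nodes_from(range(R.shape[0]))
      for i in range(R.shape[0]):
          for j in range(i + 1, R.shape[0]):
              if R[i, j] > 0.3:                                # prune weak links
                  G.add_edge(i, j, weight=R[i, j])

      communities = greedy_modularity_communities(G, weight="weight")
      print([sorted(c) for c in communities])                  # should recover 3 item groups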

  13. Characterization of Residual Stress Effects on Fatigue Crack Growth of a Friction Stir Welded Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    Newman, John A.; Smith, Stephen W.; Seshadri, Banavara R.; James, Mark A.; Brazill, Richard L.; Schultz, Robert W.; Donald, J. Keith; Blair, Amy

    2015-01-01

    An on-line compliance-based method to account for residual stress effects in stress-intensity factor and fatigue crack growth property determinations has been evaluated. Residual stress intensity factor results determined from specimens containing friction stir weld induced residual stresses are presented, and the on-line method results were found to be in excellent agreement with residual stress-intensity factor data obtained using the cut compliance method. Variable stress-intensity factor tests were designed to demonstrate that a simple superposition model, summing the applied stress-intensity factor with the residual stress-intensity factor, can be used to determine the total crack-tip stress-intensity factor. Finite element, VCCT (virtual crack closure technique), and J-integral analysis methods have been used to characterize weld-induced residual stress using thermal expansion/contraction in the form of an equivalent delta T (change in local temperature during welding) to simulate the welding process. This equivalent delta T was established and applied to analyze different specimen configurations to predict residual stress distributions and associated residual stress-intensity factor values. The predictions were found to agree well with experimental results obtained using the crack- and cut-compliance methods.

  14. Determination of important topographic factors for landslide mapping analysis using MLP network.

    PubMed

    Alkhasawneh, Mutasem Sh; Ngah, Umi Kalthum; Tay, Lea Tien; Mat Isa, Nor Ashidi; Al-batah, Mohammad Subhi

    2013-01-01

    Landslide is one of the natural disasters that occur in Malaysia. Topographic factors such as elevation, slope angle, slope aspect, general curvature, plan curvature, and profile curvature are considered as the main causes of landslides. In order to determine the dominant topographic factors in landslide mapping analysis, a study was conducted and presented in this paper. There are three main stages involved in this study. The first stage is the extraction of extra topographic factors. Previous landslide studies had identified mainly six topographic factors. Seven new additional factors have been proposed in this study. They are longitude curvature, tangential curvature, cross section curvature, surface area, diagonal line length, surface roughness, and rugosity. The second stage is the specification of the weight of each factor using two methods. The methods are multilayer perceptron (MLP) network classification accuracy and Zhou's algorithm. At the third stage, the factors with higher weights were used to improve the MLP performance. Out of the thirteen factors, eight factors were considered as important factors, which are surface area, longitude curvature, diagonal length, slope angle, elevation, slope aspect, rugosity, and profile curvature. The classification accuracy of multilayer perceptron neural network has increased by 3% after the elimination of five less important factors.

  15. Determination of Important Topographic Factors for Landslide Mapping Analysis Using MLP Network

    PubMed Central

    Alkhasawneh, Mutasem Sh.; Ngah, Umi Kalthum; Mat Isa, Nor Ashidi; Al-batah, Mohammad Subhi

    2013-01-01

    Landslide is one of the natural disasters that occur in Malaysia. Topographic factors such as elevation, slope angle, slope aspect, general curvature, plan curvature, and profile curvature are considered as the main causes of landslides. In order to determine the dominant topographic factors in landslide mapping analysis, a study was conducted and presented in this paper. There are three main stages involved in this study. The first stage is the extraction of extra topographic factors. Previous landslide studies had identified mainly six topographic factors. Seven new additional factors have been proposed in this study. They are longitude curvature, tangential curvature, cross section curvature, surface area, diagonal line length, surface roughness, and rugosity. The second stage is the specification of the weight of each factor using two methods. The methods are multilayer perceptron (MLP) network classification accuracy and Zhou's algorithm. At the third stage, the factors with higher weights were used to improve the MLP performance. Out of the thirteen factors, eight factors were considered as important factors, which are surface area, longitude curvature, diagonal length, slope angle, elevation, slope aspect, rugosity, and profile curvature. The classification accuracy of multilayer perceptron neural network has increased by 3% after the elimination of five less important factors. PMID:24453846

  16. Development of the Career Anchors Scale among Occupational Health Nurses in Japan.

    PubMed

    Kubo, Yoshiko; Hatono, Yoko; Kubo, Tomohide; Shimamoto, Satoko; Nakatani, Junko; Burgel, Barbara J

    2016-11-29

    This study aimed to develop the Career Anchors Scale among Occupational Health Nurses (CASOHN) and evaluate its reliability and validity. Scale items were developed through a qualitative inductive analysis of interview data, and items were revised following an examination of content validity by experts and occupational health nurses (OHNs), resulting in a provisional scale of 41 items. A total of 745 OHNs (response rate 45.2%) affiliated with the Japan Society for Occupational Health participated in the self-administered questionnaire survey. Two items were deleted based on item-total correlations. Factor analysis was then conducted on the remaining 39 items to examine construct validity. An exploratory factor analysis with the principal factor method and promax rotation resulted in the extraction of six factors. The variance contribution ratios of the six factors were 37.45, 7.01, 5.86, 4.95, 4.16, and 3.19%. The cumulative contribution ratio was 62.62%. The factors were named as follows: Demonstrating expertise and considering position in work (Factor 1); Management skills for effective work (Factor 2); Supporting health improvement in groups and organizations (Factor 3); Providing employee-focused support (Factor 4); Collaborating with occupational health team members and personnel (Factor 5); and Compatibility of work and private life (Factor 6). The reliability coefficient determined by the split-half method was 0.85. Cronbach's alpha coefficient for the overall scale was 0.95, whereas those of the six subscales were 0.88, 0.90, 0.91, 0.80, 0.85, and 0.79, respectively. CASOHN was found to be valid and reliable for measuring career anchors among OHNs in Japan.

  17. Dynamic and quantitative method of analyzing service consistency evolution based on extended hierarchical finite state automata.

    PubMed

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of the primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in a service's evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving toward large-scale, reusable, compositional, pervasive, and flexible systems, which makes traditional analysis methods difficult to apply. To resolve these problems, a novel dynamic evolution model, extended hierarchical service finite state automata (EHS-FSA), is constructed on the basis of finite state automata (FSA) to formally depict the overall changing processes of service consistency states. Service consistency evolution algorithms (SCEAs) based on EHS-FSA are then developed to quantitatively assess the impact factors. Experimental results show that poor reusability (17.93% on average) is the most influential factor, noncomposition of atomic services (13.12%) is the second most influential, and service version confusion (1.2%) is the least. Compared with previous qualitative analyses, the SCEAs show good effectiveness and feasibility. This research can guide engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA.

  19. Forecast and analysis of the ratio of electric energy to terminal energy consumption for global energy internet

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Zhong, Ming; Cheng, Ling; Jin, Lu; Shen, Si

    2018-02-01

    Against the background of building the global energy internet, forecasting and analysing the ratio of electric energy to terminal energy consumption has both theoretical and practical significance. This paper first analyses the factors influencing the ratio of electric energy to terminal energy and then uses a combination method to forecast and analyse the global proportion of electric energy. A cointegration model for the proportion of electric energy is constructed using influencing factors such as the electricity price index, GDP, economic structure, energy use efficiency, and total population. Finally, the proportion of electric energy is projected with a combination-forecasting model based on the multiple linear regression, trend analysis, and variance-covariance methods. The projection describes the development trend of the proportion of electric energy over 2017-2050, and the proportion in 2050 is analysed in detail using scenario analysis.
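
    The variance-covariance method named above has a standard closed form: with Σ the covariance matrix of the individual models' historical forecast errors, the minimum-variance combination weights are w = Σ⁻¹1 / (1ᵀΣ⁻¹1). A sketch with made-up numbers:

      import numpy as np

      # Hypothetical hold-out errors of three forecasting models
      # (multiple linear regression, trend analysis, and a third method);
      # rows = periods, columns = models.
      errors = np.array([[ 0.8,  1.2,  0.5],
                         [-0.4,  0.9,  0.7],
                         [ 0.6, -1.1, -0.3],
                         [-0.5,  0.8,  0.4]])

      sigma = np.cov(errors, rowvar=False)   # error covariance matrix
      ones = np.ones(sigma.shape[0])
      w = np.linalg.solve(sigma, ones)
      w /= ones @ w                          # minimum-variance weights, summing to 1

      forecasts = np.array([52.1, 49.8, 51.0])  # the models' forecasts for one year
      print(w, w @ forecasts)                   # combined forecast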

  20. A retrospective analysis to identify the factors affecting infection in patients undergoing chemotherapy.

    PubMed

    Park, Ji Hyun; Kim, Hyeon-Young; Lee, Hanna; Yun, Eun Kyoung

    2015-12-01

    This study compares the performance of the logistic regression and decision tree analysis methods for assessing the risk factors for infection in cancer patients undergoing chemotherapy. The subjects were 732 cancer patients who were receiving chemotherapy at K university hospital in Seoul, Korea. The data were collected between March 2011 and February 2013 and were processed for descriptive analysis, logistic regression, and decision tree analysis using the IBM SPSS Statistics 19 and Modeler 15.1 programs. The most common risk factors for infection in cancer patients receiving chemotherapy were identified as alkylating agents, vinca alkaloids, and underlying diabetes mellitus. The logistic regression model achieved a sensitivity of 66.7% and a specificity of 88.9%, while the decision tree analysis achieved a sensitivity of 55.0% and a specificity of 89.0%; overall classification accuracy was 88.0% for the logistic regression and 87.2% for the decision tree analysis. Since the logistic regression analysis showed higher sensitivity and classification accuracy, it is concluded to be the more effective and useful method for establishing an infection prediction model for patients undergoing chemotherapy. Copyright © 2015 Elsevier Ltd. All rights reserved.
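
    A compact sketch of this model comparison with scikit-learn stand-ins rather than SPSS (the data below are synthetic placeholders, not the study's patient records):

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import accuracy_score, confusion_matrix
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier

      # Placeholder for the patient data: binary infection outcome, mixed predictors.
      X, y = make_classification(n_samples=732, n_features=10, weights=[0.7],
                                 random_state=1)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

      for name, model in [("logistic", LogisticRegression(max_iter=1000)),
                          ("tree", DecisionTreeClassifier(max_depth=4, random_state=1))]:
          pred = model.fit(X_tr, y_tr).predict(X_te)
          tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
          print(name,
                "sensitivity=%.3f" % (tp / (tp + fn)),
                "specificity=%.3f" % (tn / (tn + fp)),
                "accuracy=%.3f" % accuracy_score(y_te, pred))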

  1. Factors Affecting Optimal Surface Roughness of AISI 4140 Steel in Turning Operation Using Taguchi Experiment

    NASA Astrophysics Data System (ADS)

    Novareza, O.; Sulistiyarini, D. H.; Wiradmoko, R.

    2018-02-01

    This paper presents the results of applying the Taguchi method to the turning of AISI 4140 medium-carbon steel. The primary concern is to find the optimal surface roughness after the turning process. The Taguchi method is used to find the combination of factors and factor levels that yields the optimum surface roughness. Four important factors at three levels each were used in the experiment, and 27 experimental runs were carried out and analysed using the analysis of variance (ANOVA) method. Surface finish was measured as Ra surface roughness. The depth of cut was found to be the most important factor for reducing the surface roughness of AISI 4140 steel. By contrast, the other factors considered, spindle speed and the side rake angle of the tool, were shown to have less effect on the surface finish. Interestingly, coolant composition emerged as the second most important factor for reducing roughness, a result that may need further research to explain.
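
    Taguchi analysis ranks factors by their effect on a signal-to-noise ratio; for a smaller-the-better response such as Ra, the usual form is S/N = -10·log10(mean(y²)). A sketch with hypothetical roughness replicates from one of the 27 runs:

      import numpy as np

      def sn_smaller_is_better(y):
          """Taguchi S/N ratio for a smaller-the-better response such as Ra."""
          y = np.asarray(y, dtype=float)
          return -10.0 * np.log10(np.mean(y ** 2))

      # Hypothetical Ra measurements (micrometres) from replicates of one run.
      print(sn_smaller_is_better([1.82, 1.75, 1.90]))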

  2. Donor retention in health care in Iran: a factor analysis

    PubMed Central

    Aghababa, Sara; Nasiripour, Amir Ashkan; Maleki, Mohammadreza; Gohari, Mahmoodreza

    2017-01-01

    Background: Long-term financial support is essential for the survival of a charitable organization, so health charities need to identify the factors influencing donor retention. Methods: In the present study, the items of a questionnaire were derived from both a literature review and semi-structured interviews related to donor retention. Using purposive sampling, 300 academic and executive practitioners were selected. After follow-up, a total of 243 usable questionnaires were available for factor analysis. The questionnaire was validated through face and content validity, and its reliability was assessed with Cronbach’s α-coefficient. Results: Exploratory factor analysis extracted 2 retention factors: a donor factor (variance = 33.841%; Cronbach’s α-coefficient = 90.2) and a charity factor (variance = 29.038%; Cronbach’s α-coefficient = 82.8). Subsequently, confirmatory factor analysis supported an overall reasonable fit. Conclusions: Repeated monetary donations are supplied to charitable organizations when both aspects of retention, the donor factor and the charity factor, are taken into consideration. This model could provide a perspective for making sustainable donations and charitable giving. PMID:28955663

  3. Evaluation of hierarchical agglomerative cluster analysis methods for discrimination of primary biological aerosol

    NASA Astrophysics Data System (ADS)

    Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.

    2015-11-01

    In this paper we present improved methods for discriminating and quantifying primary biological aerosol particles (PBAPs) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet-light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1 × 10⁶ points on a desktop computer, allowing each fluorescent particle in a data set to be explicitly clustered. This reduces the potential for misattribution found in the subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient data set. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4), where the optical size, asymmetry factor and fluorescent measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98 and 98.1% of the data points, respectively. The best-performing methods were applied to the BEACHON-RoMBAS (Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H₂O, Organics and Nitrogen-Rocky Mountain Biogenic Aerosol Study) ambient data set, where it was found that the z-score and range normalisation methods yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in overestimation of the fungal spore concentration by a factor of 1.5 and underestimation of the bacterial aerosol concentration by a factor of 5. We suggest that this is likely due to errors arising from misattribution caused by poor centroid definition and failure to assign particles to a cluster under the subsampling and comparative attribution method employed by WASP. The methods used here allow the entire fluorescent population of particles to be analysed, yielding an explicit cluster attribution for each particle and improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
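
    The core clustering step (Ward linkage on z-score-normalised data) can be sketched with SciPy. Note that this plain implementation stores all pairwise distances, so the 10⁶-particle scale reported above requires the more memory-efficient tooling the authors describe:

      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage

      # Placeholder WIBS-style inputs: optical size, asymmetry factor, and
      # fluorescence channels, one row per particle.
      rng = np.random.default_rng(2)
      X = rng.normal(size=(1000, 5))

      Xz = (X - X.mean(axis=0)) / X.std(axis=0)        # z-score normalisation
      Z = linkage(Xz, method="ward")                   # Ward requires Euclidean distance
      labels = fcluster(Z, t=4, criterion="maxclust")  # cut the tree into 4 clusters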

  4. Full-Information Item Bi-Factor Analysis. ONR Technical Report. [Biometric Lab Report No. 90-2.

    ERIC Educational Resources Information Center

    Gibbons, Robert D.; And Others

    A plausible "s"-factor solution for many types of psychological and educational tests is one in which there is one general factor and "s - 1" group- or method-related factors. The bi-factor solution results from the constraint that each item has a non-zero loading on the primary dimension α_j1 and at most…

  5. The use of multicomponent statistical analysis in hydrogeological environmental research.

    PubMed

    Lambrakis, Nicolaos; Antonakos, Andreas; Panagopoulos, George

    2004-04-01

    The present article examines the possibilities of investigating NO₃⁻ spread in aquifers by applying multicomponent statistical methods (factor, cluster, and discriminant analysis) to hydrogeological, hydrochemical, and environmental parameters. A four-factor R-mode model determined from the analysis proved useful in investigating the hydrogeological parameters affecting NO₃⁻ concentration, such as its dilution by upcoming groundwater of the recharge areas. The relationship between NO₃⁻ concentration and agricultural activities is determined sufficiently by the first factor, which relies on NO₃⁻ and SO₄²⁻ of the same origin, namely agricultural fertilizers. The other three factors of the R-mode analysis are not connected directly to the NO₃⁻ problem; they do, however, by extracting the role of the unsaturated zone, show an interesting relationship between organic matter content, thickness, and saturated hydraulic conductivity. The application of hierarchical cluster analysis, based on all possible combinations of classification methods, showed two main groups of samples: the first comprises samples from the edges and the second from the central part of the study area. Discriminant analysis showed that the NO₃⁻ and SO₄²⁻ ions are the most significant variables in the discriminant function. Therefore, the first group is considered to comprise all samples from areas not influenced by fertilizers, lying at the edges of contaminating activities such as crop cultivation, while the second comprises all the other samples.

  6. Methods for Analysis of Urban Energy Systems: A New York City Case Study

    NASA Astrophysics Data System (ADS)

    Howard, Bianca

    This dissertation describes methods developed for analysis of the New York City energy system. The analysis specifically aims to consider the built environment and its impacts on greenhouse gas (GHG) emissions. Several contributions to the urban energy systems literature were made. First, estimates of annual energy intensities of the New York building stock were derived using a statistical analysis that leveraged energy consumption and tax assessor data collected by the Office of the Mayor. These estimates provided the basis for an assessment of the spatial distribution of building energy consumption. The energy consumption estimates were then leveraged to estimate the potential for combined heat and power (CHP) systems in New York City at both the building and microgrid scales. In aggregate, given the 2009 non-baseload GHG emissions factors for electricity production, these systems could reduce citywide GHG emissions by 10%. The operational characteristics of CHP systems were explored further considering different prime movers, climates, and GHG emissions factors. A combination of mixed-integer linear programming and controlled random search algorithms was used to determine the optimal capacity and operating strategies for the CHP systems under the various scenarios. Lastly, a multi-regional unit commitment model of electricity and GHG emissions production for New York State was developed using data collected from several publicly available sources. The model was used to estimate average and marginal GHG emissions factors for New York State and New York City. The analysis found that marginal GHG emissions factors could fall by 30%, to 370 g CO2e/kWh, over the next 10 years.

  7. SCA with rotation to distinguish common and distinctive information in linked data.

    PubMed

    Schouteden, Martijn; Van Deun, Katrijn; Pattyn, Sven; Van Mechelen, Iven

    2013-09-01

    Often data are collected that consist of different blocks that all contain information about the same entities (e.g., items, persons, or situations). In order to unveil both information that is common to all data blocks and information that is distinctive for one or a few of them, an integrated analysis of the whole of all data blocks may be most useful. Interesting classes of methods for such an approach are simultaneous-component and multigroup factor analysis methods. These methods yield dimensions underlying the data at hand. Unfortunately, however, in the results from such analyses, common and distinctive types of information are mixed up. This article proposes a novel method to disentangle the two kinds of information, by making use of the rotational freedom of component and factor models. We illustrate this method with data from a cross-cultural study of emotions.

  8. Study of the location of testing area in residual stress measurement by Moiré interferometry combined with hole-drilling method

    NASA Astrophysics Data System (ADS)

    Qin, Le; Xie, HuiMin; Zhu, RongHua; Wu, Dan; Che, ZhiGang; Zou, ShiKun

    2014-04-01

    This paper investigates the effect of the location of the testing area in residual stress measurement by Moiré interferometry combined with the hole-drilling method. The selection of the location of the testing area is analyzed both theoretically and experimentally. In the theoretical study, the factors that affect the released surface radial strain ε_r were analyzed on the basis of the formulae of the hole-drilling method, and the relations between those factors and ε_r were established. By combining Moiré interferometry with the hole-drilling method, the residual stress of an interference-fit specimen was measured to verify the theoretical analysis. According to the analysis results, the testing area that minimizes the error of the strain measurement is determined. Moreover, if the orientation of the maximum principal stress is known, the strain can be measured with higher precision by the Moiré interferometry method.
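
    For context, the classical hole-drilling relation (standardised in ASTM E837, and presumably the general form behind the formulae referred to above) expresses the relieved radial strain at an angle θ from the direction of the maximum principal stress as

      \varepsilon_r(\theta) = A\,(\sigma_1 + \sigma_2) + B\,(\sigma_1 - \sigma_2)\cos 2\theta

    where σ₁ and σ₂ are the principal residual stresses and A and B are calibration constants fixed by the material and the hole geometry.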

  9. Forensic Schedule Analysis of Construction Delay in Military Projects in the Middle East

    DTIC Science & Technology

    This research performs forensic schedule analysis of delay factors that impacted recent large-scale military construction projects in the Middle East. … The methodologies for analysis are adapted from the Professional Practice Guide to Forensic Schedule Analysis, particularly Method 3.7 Modeled…

  10. Analysis and compensation for the effect of the catheter position on image intensities in intravascular optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Liu, Shengnan; Eggermont, Jeroen; Wolterbeek, Ron; Broersen, Alexander; Busk, Carol A. G. R.; Precht, Helle; Lelieveldt, Boudewijn P. F.; Dijkstra, Jouke

    2016-12-01

    Intravascular optical coherence tomography (IVOCT) is an imaging technique that is used to analyze the underlying cause of cardiovascular disease. Because a catheter is used during imaging, the intensities can be affected by the catheter position. This work aims to analyze the effect of the catheter position on IVOCT image intensities and to propose a compensation method to minimize this effect in order to improve the visualization and the automatic analysis of IVOCT images. The effect of catheter position is modeled with respect to the distance between the catheter and the arterial wall (distance-dependent factor) and the incident angle onto the arterial wall (angle-dependent factor). A light transmission model incorporating both factors is introduced. On the basis of this model, the interaction effect of both factors is estimated with a hierarchical multivariate linear regression model. Statistical analysis shows that IVOCT intensities are significantly affected by both factors (p<0.001): as either increases, the intensity decreases. This effect differs between pullbacks. The regression results were used to compensate for this effect. Experiments show that the proposed compensation method can improve the performance of automatic bioresorbable vascular scaffold strut detection.

  11. An impact analysis of forecasting methods and forecasting parameters on bullwhip effect

    NASA Astrophysics Data System (ADS)

    Silitonga, R. Y. H.; Jelly, N.

    2018-04-01

    The bullwhip effect is the amplification of demand variance from the downstream to the upstream end of a supply chain. Forecasting methods and forecasting parameters are recognized as factors that affect the bullwhip phenomenon. Previous studies have simulated the bullwhip effect in several ways, including mathematical equation modelling, information control modelling, and computer programs. In this study, a spreadsheet program named Bullwhip Explorer was used to simulate the bullwhip effect. Several scenarios were developed to show how the bullwhip-effect ratio changes with the forecasting method and forecasting parameters. The forecasting methods used were mean demand, moving average, exponential smoothing, demand signalling, and minimum expected mean squared error. The forecasting parameters were the moving-average period, smoothing parameter, signalling factor, and safety stock factor. The simulations showed that decreasing the moving-average period, increasing the smoothing parameter, or increasing the signalling factor produces a larger bullwhip-effect ratio, whereas the safety stock factor had no impact on the bullwhip effect.
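
    The spirit of such a simulation is easy to reproduce: drive an order-up-to inventory policy with an exponential-smoothing forecast and compare the variance of orders with the variance of demand. A sketch under assumed demand and policy parameters (not the Bullwhip Explorer implementation):

      import numpy as np

      rng = np.random.default_rng(3)
      demand = rng.normal(100, 10, size=5000)

      def bullwhip_ratio(alpha, lead_time=2, safety=1.0):
          """Var(orders)/Var(demand) for an order-up-to policy whose base stock
          tracks an exponential-smoothing forecast of demand."""
          forecast = np.empty_like(demand)
          forecast[0] = demand[0]
          for t in range(1, len(demand)):
              forecast[t] = alpha * demand[t - 1] + (1 - alpha) * forecast[t - 1]
          base_stock = (lead_time + safety) * forecast
          orders = demand + np.diff(base_stock, prepend=base_stock[0])
          return orders.var() / demand.var()

      for a in (0.1, 0.3, 0.6):
          print(a, round(bullwhip_ratio(a), 2))  # ratio grows with the smoothing parameter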

  12. Practical state of health estimation of power batteries based on Delphi method and grey relational grade analysis

    NASA Astrophysics Data System (ADS)

    Sun, Bingxiang; Jiang, Jiuchun; Zheng, Fangdan; Zhao, Wei; Liaw, Bor Yann; Ruan, Haijun; Han, Zhiqiang; Zhang, Weige

    2015-05-01

    State of health (SOH) estimation is critical for a battery management system to ensure the safety and reliability of EV battery operation. Here, we used a unique hybrid approach to enable complex SOH estimations. The approach hybridizes the Delphi method, known for its simplicity and effectiveness in applying weighting factors for complicated decision-making, with grey relational grade analysis (GRGA) for multi-factor optimization. Six critical factors were considered for SOH estimation: peak power at 30% state of charge (SOC); capacity; the voltage drop at 30% SOC under a C/3 pulse; the temperature rises at the end of 1C discharge and 1C charge, respectively; and the open-circuit voltage at the end of charge after a 1-h rest. The weighting of these factors for SOH estimation was scored by the experts in the Delphi method, indicating the influence of each factor on SOH. The parameters for these factors expressing the battery state variations are optimized by GRGA. Eight battery cells were used to illustrate the principle and methodology of estimating SOH by this hybrid approach, and the results were compared with those based on capacity and power capability. The contrast among the different SOH estimations is discussed.
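
    The GRGA step has a standard closed form: deviations Δᵢ(k) of each candidate series from a reference series become grey relational coefficients ξᵢ(k) = (Δmin + ρΔmax)/(Δᵢ(k) + ρΔmax), which are then averaged, here with optional weights such as the Delphi scores. A sketch with hypothetical cell indicators:

      import numpy as np

      def grey_relational_grade(reference, candidates, rho=0.5, weights=None):
          """Grey relational grade of each candidate series against the reference."""
          delta = np.abs(candidates - reference)        # deviation sequences
          d_min, d_max = delta.min(), delta.max()
          xi = (d_min + rho * d_max) / (delta + rho * d_max)
          if weights is None:
              weights = np.full(xi.shape[1], 1 / xi.shape[1])
          return xi @ weights                           # one grade per candidate

      # Hypothetical normalised indicators for three cells against an ideal
      # "fresh cell" reference; six factors as in the study, equal weights here
      # (Delphi scores would replace them in the hybrid approach).
      ref = np.ones(6)
      cells = np.array([[0.95, 0.91, 0.97, 0.90, 0.93, 0.96],
                        [0.80, 0.78, 0.85, 0.76, 0.82, 0.84],
                        [0.60, 0.66, 0.70, 0.58, 0.65, 0.69]])
      print(grey_relational_grade(ref, cells))  # higher grade = closer to fresh state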

  13. Chi-squared Automatic Interaction Detection Decision Tree Analysis of Risk Factors for Infant Anemia in Beijing, China

    PubMed Central

    Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin

    2016-01-01

    Background: In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of the population in recent years, available information on infant anemia is inconclusive in large cities of China, especially regarding comparisons between native residents and the floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. Methods: As useful methods for building a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6–12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. Results: The prevalence of anemia was 12.60%, with a range of 3.47%–40.00% across different subgroups. The CHAID decision tree model demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model (maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population), the CHAID decision tree analysis identified a fourth risk factor, maternal educational level, with higher overall classification accuracy and a larger area under the receiver operating characteristic curve. Conclusions: The infant anemic status in a metropolis is complex and should be carefully considered by basic health care practitioners. CHAID decision tree analysis demonstrated a better performance in hierarchical analysis of a population with great heterogeneity. Risk factors identified by this study might be meaningful for the early detection and prompt treatment of infant anemia in large cities. PMID:27174328

  14. Sparse multivariate factor analysis regression models and its applications to integrative genomics analysis.

    PubMed

    Zhou, Yan; Wang, Pei; Wang, Xianlong; Zhu, Ji; Song, Peter X-K

    2017-01-01

    The multivariate regression model is a useful tool to explore complex associations between two kinds of molecular markers, which enables the understanding of the biological pathways underlying disease etiology. For a set of correlated response variables, accounting for such dependency can increase statistical power. Motivated by integrative genomic data analyses, we propose a new methodology, the sparse multivariate factor analysis regression model (smFARM), in which correlations of response variables are assumed to follow a factor analysis model with latent factors. The proposed method not only allows us to address the challenge that the number of association parameters is larger than the sample size, but also to adjust for unobserved genetic and/or nongenetic factors that potentially conceal the underlying response-predictor associations. The proposed smFARM is implemented by the EM algorithm and the blockwise coordinate descent algorithm. The proposed methodology is evaluated and compared to the existing methods through extensive simulation studies. Our results show that accounting for latent factors through the proposed smFARM can improve the sensitivity of signal detection and the accuracy of sparse association map estimation. We illustrate smFARM by two integrative genomics analysis examples, a breast cancer dataset and an ovarian cancer dataset, to assess the relationship between DNA copy numbers and gene expression arrays and thereby understand genetic regulatory patterns relevant to the disease. We identify two trans-hub regions: one in cytoband 17q12, whose amplification influences the RNA expression levels of important breast cancer genes, and the other in cytoband 9q21.32-33, which is associated with chemoresistance in ovarian cancer. © 2016 WILEY PERIODICALS, INC.

  15. Design and Analysis of Subscale and Full-Scale Buckling-Critical Cylinders for Launch Vehicle Technology Development

    NASA Technical Reports Server (NTRS)

    Hilburger, Mark W.; Lovejoy, Andrew E.; Thornburgh, Robert P.; Rankin, Charles

    2012-01-01

    NASA's Shell Buckling Knockdown Factor (SBKF) project has the goal of developing new analysis-based shell buckling design factors (knockdown factors) and design and analysis technologies for launch vehicle structures. Preliminary design studies indicate that implementation of these new knockdown factors can enable significant reductions in mass and mass-growth in these vehicles. However, in order to validate any new analysis-based design data or methods, a series of carefully designed and executed structural tests is required at both the subscale and full-scale levels. This paper describes the design and analysis of three different orthogrid-stiffened metallic cylindrical-shell test articles: two 8-ft-diameter, 6-ft-long test articles and one 27.5-ft-diameter, 20-ft-long Space Shuttle External Tank-derived test article.

  16. Factors that Affect Poverty Areas in North Sumatera Using Discriminant Analysis

    NASA Astrophysics Data System (ADS)

    Nasution, D. H.; Bangun, P.; Sitepu, H. R.

    2018-04-01

    In Indonesia, and especially in North Sumatera, poverty is a fundamental problem and a focus of both central and local government. Although the poverty rate has decreased, many people remain poor. Poverty involves several aspects, such as education, health, and demographics, as well as structural and cultural factors. This research discusses several factors that affect poverty in Indonesia, such as population density, unemployment rate, GDP per capita at constant prices (ADHK), GDP per capita at current prices (ADHB), economic growth, and life expectancy. The discriminant analysis method was used to determine the factors that most influence and differentiate the poverty levels of the regencies/cities of North Sumatera. Discriminant analysis is a multivariate technique used to classify data into groups on the basis of a dependent variable and independent variables. Using discriminant analysis, it is evident that the factor affecting poverty is the unemployment rate.
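
    A minimal sketch of such a discriminant analysis with scikit-learn (the indicators and group labels below are random placeholders, not the North Sumatera data):

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Placeholder regency/city indicators: density, unemployment rate, GDP per
      # capita (constant and current prices), growth, life expectancy.
      rng = np.random.default_rng(4)
      X = rng.normal(size=(33, 6))
      y = rng.integers(0, 2, size=33)  # 0 = lower-poverty, 1 = higher-poverty group

      lda = LinearDiscriminantAnalysis().fit(X, y)
      print(lda.coef_)        # discriminant-function coefficients per factor
      print(lda.score(X, y))  # classification accuracy on the fitted data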

  17. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale

    PubMed Central

    Sediyama, Cristina Y. N.; Moura, Ricardo; Garcia, Marina S.; da Silva, Antonio G.; Soraggi, Carolina; Neves, Fernando S.; Albuquerque, Maicon R.; Whiteside, Setephen P.; Malloy-Diniz, Leandro F.

    2017-01-01

    Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency; (b) lack of premeditation; (c) lack of perseverance; and (d) sensation seeking. In the present study, 384 participants (278 women and 106 men), recruited from schools, universities, leisure centers, and workplaces, completed the UPPS scale. An exploratory factor analysis was performed using Varimax factor rotation and Kaiser normalization, and two confirmatory analyses were conducted to test the independence of the UPPS components found in previous analyses. Results: Mean UPPS total scores decreased with age; the youngest participants (below 30 years) scored significantly higher than the groups over 30 years. No gender difference was found. Cronbach's alpha values were satisfactory and similarly high for all subscales, although the confirmatory factor analysis indexes indicated a poor model fit; the results of the two exploratory factor analyses were satisfactory. Conclusion: Our results showed that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS. PMID:28484414

  18. Research on the recycling industry development model for typical exterior plastic components of end-of-life passenger vehicle based on the SWOT method.

    PubMed

    Zhang, Hongshen; Chen, Ming

    2013-11-01

    In-depth studies on the recycling of typical automotive exterior plastic parts are significant and beneficial for environmental protection, energy conservation, and sustainable development in China. In the current study, several methods were used to analyze the recycling industry model for typical exterior parts of passenger vehicles in China. The strengths, weaknesses, opportunities, and challenges of the current recycling industry for typical exterior parts of passenger vehicles were analyzed comprehensively based on the SWOT method. The internal factor evaluation matrix and external factor evaluation matrix were used to evaluate the internal and external factors of the recycling industry. The recycling industry was found to respond well to all these factors and to face good development opportunities. Cross-link strategies for the typical exterior parts of the passenger car industry of China were then analyzed on the basis of the SWOT analysis strategies and the established SWOT matrix. Finally, based on the aforementioned research, a recycling industry model led by automobile manufacturers was recommended. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Parallel-vector solution of large-scale structural analysis problems on supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Agarwal, Tarun K.

    1989-01-01

    A direct linear equation solution method based on the Choleski factorization procedure is presented which exploits both parallel and vector features of supercomputers. The new equation solver is described, and its performance is evaluated by solving structural analysis problems on three high-performance computers. The method has been implemented using Force, a generic parallel FORTRAN language.
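
    The serial core of such a solver is a Cholesky factorization followed by forward and back substitution; a small dense sketch with SciPy (the paper's contribution, the parallel-vector Force implementation, is not reproduced here):

      import numpy as np
      from scipy.linalg import cho_factor, cho_solve

      # Small stand-in for a structural system K u = f, with K symmetric
      # positive definite as in linear static analysis.
      rng = np.random.default_rng(5)
      A = rng.normal(size=(6, 6))
      K = A @ A.T + 6 * np.eye(6)   # guarantees positive definiteness
      f = rng.normal(size=6)

      c, low = cho_factor(K)        # Cholesky factorization K = L L^T
      u = cho_solve((c, low), f)    # forward/back substitution
      print(np.allclose(K @ u, f))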

  20. A Systematic Review and Meta-Analysis of Predictors of Expressive-Language Outcomes among Late Talkers

    ERIC Educational Resources Information Center

    Fisher, Evelyn L.

    2017-01-01

    Purpose: The purpose of this study was to explore the literature on predictors of outcomes among late talkers using systematic review and meta-analysis methods. We sought to answer the question: What factors predict preschool-age expressive-language outcomes among late-talking toddlers? Method: We entered carefully selected search terms into the…

  1. Accounting for measurement error in biomarker data and misclassification of subtypes in the analysis of tumor data

    PubMed Central

    Nevo, Daniel; Zucker, David M.; Tamimi, Rulla M.; Wang, Molin

    2017-01-01

    A common paradigm in dealing with heterogeneity across tumors in cancer analysis is to cluster the tumors into subtypes using marker data on the tumor, and then to analyze each of the clusters separately. A more specific target is to investigate the association between risk factors and specific subtypes and to use the results for personalized preventive treatment. This task is usually carried out in two steps: clustering and risk factor assessment. However, two sources of measurement error arise in these problems. The first is the measurement error in the biomarker values. The second is the misclassification error when assigning observations to clusters. We consider the case with a specified set of relevant markers and propose a unified single-likelihood approach for normally distributed biomarkers. As an alternative, we consider a two-step procedure with the tumor type misclassification error taken into account in the second-step risk factor analysis. We describe our method for binary data and also for survival analysis data using a modified version of the Cox model. We present asymptotic theory for the proposed estimators. Simulation results indicate that our methods significantly lower the bias with a small price being paid in terms of variance. We present an analysis of breast cancer data from the Nurses’ Health Study to demonstrate the utility of our method. PMID:27558651

  2. To Identify the Important Soil Properties Affecting Dinoseb Adsorption with Statistical Analysis

    PubMed Central

    Guan, Yiqing; Wei, Jianhui; Zhang, Danrong; Zu, Mingjuan; Zhang, Liru

    2013-01-01

    Investigating the influence of soil characteristic factors on the dinoseb adsorption parameter with different statistical methods is valuable for explicitly quantifying the extent of these influences. The correlation coefficients and the direct and indirect effects of soil characteristic factors on the dinoseb adsorption parameter were analyzed through bivariate correlation analysis and path analysis. Stepwise regression analysis excluded the factors with little influence on the adsorption parameter. Results indicate that pH and CEC had a moderate relationship with, and a lower direct effect on, the dinoseb adsorption parameter because of multicollinearity with other soil factors, whereas organic carbon and clay contents were the most significant soil factors affecting the dinoseb adsorption process. A regression was thereby set up to explore the relationship between the dinoseb adsorption parameter and these two soil factors; 92% of the variation in the dinoseb sorption coefficient could be attributed to variation in the soil organic carbon and clay contents. PMID:23737715
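
    A toy illustration of the final regression step with statsmodels (synthetic soil data standing in for the study's samples):

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical soil samples: organic carbon (%), clay content (%), and the
      # measured dinoseb sorption coefficient Kd.
      rng = np.random.default_rng(8)
      df = pd.DataFrame({"oc": rng.uniform(0.5, 4, 40),
                         "clay": rng.uniform(5, 40, 40)})
      df["kd"] = 2.0 * df["oc"] + 0.15 * df["clay"] + rng.normal(0, 0.5, 40)

      # Regression of the sorption coefficient on the two retained soil factors.
      fit = smf.ols("kd ~ oc + clay", data=df).fit()
      print(fit.rsquared, fit.params)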

  3. Seismic analysis for translational failure of landfills with retaining walls.

    PubMed

    Feng, Shi-Jin; Gao, Li-Ya

    2010-11-01

    In the seismic impact zone, seismic force can be a major triggering mechanism for translational failures of landfills. The scope of this paper is to develop a three-part wedge method for seismic analysis of translational failures of landfills with retaining walls, from which an approximate solution for the factor of safety can be calculated. Unlike previous conventional limit equilibrium methods, the new method is capable of revealing the effects of both the solid waste shear strength and the retaining wall on the translational failures of landfills during an earthquake. Parameter studies of the developed method show that the factor of safety decreases as the seismic coefficient increases, while it increases quickly with the minimum friction angle beneath the waste mass for various horizontal seismic coefficients. Increasing the minimum friction angle beneath the waste mass appears to be more effective than any other parameter for increasing the factor of safety under the conditions considered. Thus, selecting liner materials with a higher friction angle will considerably reduce the potential for translational failures of landfills during an earthquake. The factor of safety gradually increases with the height of the retaining wall for various horizontal seismic coefficients; a higher retaining wall is beneficial to the seismic stability of the landfill, and simply ignoring the retaining wall leads to serious underestimation of the factor of safety. In addition, an approximate solution for the yield acceleration coefficient of the landfill is presented on the basis of the proposed method. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. The Business Policy Course: Multiple Methods for Multiple Goals.

    ERIC Educational Resources Information Center

    Thomas, Anisya S.

    1998-01-01

    Outlines the objectives of a capstone business policy and strategy course; the use of case analysis, article critiques, storytelling, and computer simulation; and contextual factors in matching objectives and methods. (SK)

  5. A procedure for landslide susceptibility zonation by the conditional analysis method

    NASA Astrophysics Data System (ADS)

    Clerici, Aldo; Perego, Susanna; Tellini, Claudio; Vescovi, Paolo

    2002-12-01

    Numerous methods have been proposed for landslide probability zonation of the landscape by means of a Geographic Information System (GIS). Among the multivariate methods, i.e. those which simultaneously take into account all the factors contributing to instability, the Conditional Analysis method applied to a subdivision of the territory into Unique Condition Units is particularly straightforward from a conceptual point of view and particularly suited to the use of a GIS. Working on the principle that future landslides are more likely to occur under the conditions which led to past instability, landslide susceptibility is defined by computing the landslide density in correspondence with different combinations of instability factors. The conceptual simplicity of this method, however, does not necessarily imply that it is simple to implement, as it requires rather complex operations and a high number of GIS commands; moreover, in order to achieve satisfactory results, the procedure may have to be repeated a few times, changing the factors or modifying the class subdivision. To solve this problem, we created a shell program which, by combining shell commands, commands of the GIS Geographic Resources Analysis Support System (GRASS), and gawk language commands, carries out the whole procedure automatically. This makes the construction of a Landslide Susceptibility Map easy and fast for large areas too, even when a high spatial resolution is adopted, as shown by the application of the procedure to the Parma River basin in the Italian Northern Apennines.
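
    The density computation at the heart of the method reduces to a grouped mean. A sketch with pandas over a toy cell table (the GRASS and gawk plumbing is omitted):

      import pandas as pd

      # Toy raster cells: one row per cell, a class per instability factor,
      # and a flag marking cells with mapped past landslides.
      cells = pd.DataFrame({
          "lithology": ["A", "A", "B", "B", "B", "A"],
          "slope":     ["steep", "gentle", "steep", "steep", "gentle", "steep"],
          "landslide": [1, 0, 1, 1, 0, 0],
      })

      # Unique Condition Units = distinct factor combinations; susceptibility is
      # the landslide density within each unit.
      ucu = cells.groupby(["lithology", "slope"])["landslide"].agg(["mean", "size"])
      print(ucu.rename(columns={"mean": "density", "size": "n_cells"}))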

  6. A new technique for ordering asymmetrical three-dimensional data sets in ecology.

    PubMed

    Pavoine, Sandrine; Blondel, Jacques; Baguette, Michel; Chessel, Daniel

    2007-02-01

    The aim of this paper is to tackle the problem that arises from asymmetrical data cubes formed by two crossed factors fixed by the experimenter (factor A and factor B, e.g., sites and dates) and a factor which is not controlled for (the species). The entries of this cube are species densities. We approach this kind of data through the comparison of patterns, that is to say, by analyzing first the effect of factor B on the species-factor A pattern, and second the effect of factor A on the species-factor B pattern. The analysis of patterns instead of individual responses requires a correspondence analysis. We use a method we call Foucart's correspondence analysis to coordinate the correspondence analyses of several independent matrices of the species x factor A (respectively B) type, corresponding to each modality of factor B (respectively A). Such coordination makes it possible to evaluate the effect of factor B (respectively A) on the species-factor A (respectively B) pattern. The results obtained by such a procedure are much more insightful than those resulting from a classical single correspondence analysis applied to the global matrix obtained by simply unrolling the data cube, juxtaposing for example the individual species x factor A matrices across the modalities of factor B. This is because a single global correspondence analysis combines three effects (factor A, factor B, and the factor A x factor B interaction) in a way that cannot be determined from factorial maps, whereas the applications of Foucart's correspondence analysis clearly discriminate the two different issues. Using two data sets, we illustrate that this technique proves particularly powerful in analyses of ecological convergence, which include several distinct data sets, and in analyses of spatiotemporal variations of species distributions.

  7. The Development and Validation of the Protective Factors Survey: A Self-Report Measure of Protective Factors against Child Maltreatment

    ERIC Educational Resources Information Center

    Counts, Jacqueline M.; Buffington, Elenor S.; Chang-Rios, Karin; Rasmussen, Heather N.; Preacher, Kristopher J.

    2010-01-01

    Objective: The objective of this study was to evaluate the internal structure of a self-report measure of multiple family-level protective factors against abuse and neglect and explore the relationship of this instrument to other measures of child maltreatment. Methods: For the exploratory factor analysis, 11 agencies from 4 states administered…

  8. An Automatic Method for Generating an Unbiased Intensity Normalizing Factor in Positron Emission Tomography Image Analysis After Stroke.

    PubMed

    Nie, Binbin; Liang, Shengxiang; Jiang, Xiaofeng; Duan, Shaofeng; Huang, Qi; Zhang, Tianhao; Li, Panlong; Liu, Hua; Shan, Baoci

    2018-06-07

    Positron emission tomography (PET) imaging of functional metabolism has been widely used to investigate functional recovery and to evaluate therapeutic efficacy after stroke. The voxel intensity of a PET image is the most important indicator of cellular activity, but it is affected by other factors such as the basal metabolic ratio of each subject. In order to locate dysfunctional regions accurately, intensity normalization by a scale factor is a prerequisite in the data analysis, and the global mean value is most widely used for this purpose; however, it is unsuitable for stroke studies. Alternatively, a scale factor calculated from a reference region comprising neither hyper- nor hypo-metabolic voxels is also used, but there is no such recognized reference region for stroke studies. Therefore, we propose a fully data-driven automatic method for unbiased scale factor generation. The factor is generated iteratively until the residual deviation between two adjacent scale factors falls below 5%. Both simulated and real stroke data were used for evaluation, and they suggest that our proposed unbiased scale factor has better sensitivity and accuracy for stroke studies.
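
    A sketch of the iterative idea with NumPy (the authors' exact exclusion rule is not reproduced; here, voxels deviating strongly from the current normalised mean are dropped until the factor changes by less than 5%):

      import numpy as np

      def unbiased_scale_factor(voxels, tol=0.05, max_iter=50):
          """Iteratively re-estimate a normalising factor, dropping voxels that
          look hyper- or hypo-metabolic under the current scaling."""
          factor = voxels.mean()
          for _ in range(max_iter):
              normalised = voxels / factor
              keep = np.abs(normalised - 1.0) < normalised.std()
              new_factor = voxels[keep].mean()
              if abs(new_factor - factor) / factor < tol:
                  return new_factor
              factor = new_factor
          return factor

      # Hypothetical voxel intensities with a hypometabolic lesion region.
      rng = np.random.default_rng(6)
      voxels = np.concatenate([rng.normal(100, 10, 9000), rng.normal(40, 5, 1000)])
      print(unbiased_scale_factor(voxels))  # close to the healthy-tissue mean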

  9. A systematic review of methodology: time series regression analysis for environmental factors and infectious diseases.

    PubMed

    Imai, Chisato; Hashizume, Masahiro

    2015-03-01

    Time series analysis is suitable for investigations of relatively direct and short-term effects of exposures on outcomes. In environmental epidemiology studies, this method has been one of the standard approaches for assessing the impacts of environmental factors on acute non-infectious diseases (e.g. cardiovascular deaths), conventionally with generalized linear or additive models (GLM and GAM). However, the same analysis practices are often observed with infectious diseases, despite substantial differences from non-infectious diseases that may result in analytical challenges. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic review was conducted to elucidate important issues in assessing the associations between environmental factors and infectious diseases using time series analysis with GLM and GAM. Published studies on the associations between weather factors and malaria, cholera, dengue, and influenza were targeted. Our review raised issues regarding the estimation of susceptible populations and exposure lag times, the adequacy of seasonal adjustments, the presence of strong autocorrelations, and the lack of a smaller observation time unit for outcomes (i.e. daily data). These concerns may be attributable to features specific to infectious diseases, such as transmission among individuals and complicated causal mechanisms. The consequence of not taking adequate measures to address these issues is distortion of the appropriate risk quantification of exposure factors. Future studies should pay careful attention to these details and examine alternative models or methods that improve studies using time series regression analysis for environmental determinants of infectious diseases.
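
    A typical model of the kind reviewed, sketched with statsmodels: a Poisson GLM of weekly case counts on a lagged weather exposure with harmonic seasonal adjustment (all data simulated):

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Simulated weekly data: disease counts driven partly by temperature.
      rng = np.random.default_rng(7)
      n = 260
      temp = 20 + 8 * np.sin(2 * np.pi * np.arange(n) / 52) + rng.normal(0, 2, n)
      cases = rng.poisson(np.exp(1.5 + 0.05 * temp))

      df = pd.DataFrame({"cases": cases, "temp": temp})
      df["temp_lag4"] = df["temp"].shift(4)            # 4-week exposure lag
      df["sin52"] = np.sin(2 * np.pi * df.index / 52)  # seasonal adjustment terms
      df["cos52"] = np.cos(2 * np.pi * df.index / 52)
      df = df.dropna()

      X = sm.add_constant(df[["temp_lag4", "sin52", "cos52"]])
      fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
      print(fit.params)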

  10. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis.

    PubMed

    Cheng, Feon W; Gao, Xiang; Bao, Le; Mitchell, Diane C; Wood, Craig; Sliwinski, Martin J; Smiciklas-Wright, Helen; Still, Christopher D; Rolston, David D K; Jensen, Gordon L

    2017-07-01

    The aim was to examine the risk factors for developing functional decline and to make probabilistic predictions using a tree-based method that allows higher-order polynomials and interactions of the risk factors. Conditional inference tree analysis, a data mining approach, was used to construct a risk stratification algorithm for developing functional limitation based on BMI and other potential risk factors for disability in 1,951 older adults without functional limitations at baseline (baseline age 73.1 ± 4.2 y). We also analyzed the data with multivariate stepwise logistic regression and compared the two approaches (e.g., cross-validation). Over a mean of 9.2 ± 1.7 years of follow-up, 221 individuals developed functional limitation. Higher BMI, age, and comorbidity were consistently identified as significant risk factors for functional decline using both methods. Based on these factors, individuals were stratified into four risk groups via the conditional inference tree analysis. Compared to the low-risk group, all other groups had a significantly higher risk of developing functional limitation. The odds ratio comparing the two extreme categories was 9.09 (95% confidence interval: 4.68, 17.6). Higher BMI, age, and comorbid disease were consistently identified as significant risk factors for functional decline among older individuals across all approaches and analyses. © 2017 The Obesity Society.

  11. Analysis of Traffic Crashes Involving Pedestrians Using Big Data: Investigation of Contributing Factors and Identification of Hotspots.

    PubMed

    Xie, Kun; Ozbay, Kaan; Kurkcu, Abdullah; Yang, Hong

    2017-08-01

    This study aims to explore the potential of using big data to advance pedestrian risk analysis, including the investigation of contributing factors and the identification of hotspots. Massive amounts of data on Manhattan from a variety of sources were collected, integrated, and processed, including taxi trips, subway turnstile counts, traffic volumes, road network, land use, sociodemographic, and social media data. The whole study area was uniformly split into grid cells as the basic geographical units of analysis; the cell-structured framework makes it easy to incorporate rich and diversified data into the risk analysis. The cost of each crash, weighted by injury severity, was assigned to the cells based on the relative distance to the crash site using a kernel density function. A tobit model was developed to relate grid-cell-specific contributing factors to crash costs that are left-censored at zero. The potential for safety improvement (PSI), obtained as the actual crash cost minus the cost of "similar" sites estimated by the tobit model, was used as a measure to identify and rank pedestrian crash hotspots. The proposed hotspot identification method takes into account two important factors that are generally ignored: injury severity and the effects of exposure indicators. Big data, on the one hand, enable more precise estimation of the effects of risk factors by providing richer data for modeling, and, on the other hand, enable large-scale hotspot identification with higher resolution than conventional methods based on census tracts or traffic analysis zones. © 2017 Society for Risk Analysis.

  12. A Confirmatory Factor Analysis of the Structure of Statistics Anxiety Measure: An examination of four alternative models

    PubMed Central

    Vahedi, Shahram; Farrokhi, Farahman

    2011-01-01

    Objective: The aim of this study is to explore the confirmatory factor analysis results of the Persian adaptation of the Statistics Anxiety Measure (SAM), proposed by Earp. Method: The validity and reliability assessments of the scale were performed on 298 college students chosen randomly from Tabriz University in Iran. Confirmatory factor analysis (CFA) was carried out to determine the factor structure of the Persian adaptation of the SAM. Results: As expected, the second-order model provided a better fit to the data than the three alternative models. Conclusions: Hence, the SAM provides an equally valid measure for use among college students. The study both expands and adds support to the existing body of math anxiety literature. PMID:22952530

  13. A systematic review and appraisal of methods of developing and validating lifestyle cardiovascular disease risk factors questionnaires.

    PubMed

    Nse, Odunaiya; Quinette, Louw; Okechukwu, Ogah

    2015-09-01

    Well-developed and validated lifestyle cardiovascular disease (CVD) risk factor questionnaires are key to obtaining accurate information for planning CVD prevention programs, a necessity in developing countries. We conducted this review to assess the methods and processes used for the development and content validation of lifestyle CVD risk factor questionnaires and, where possible, to develop an evidence-based guideline for their development and content validation. Relevant databases at the Stellenbosch University library (PubMed, CINAHL, PsycINFO, and ProQuest) were searched for studies conducted between 2008 and 2012, in English, and among humans. The search terms used were CVD risk factors, questionnaires, smoking, alcohol, physical activity, and diet. The methods identified for questionnaire development were: literature review, either systematic or traditional; involvement of experts and/or the target population through focus group discussions or interviews; the authors' clinical experience; and the authors' deductive reasoning. For validation, the methods used were the involvement of an expert panel, the use of the target population, and factor analysis. Combining methods produces questionnaires with good content validity and other good psychometric properties.

  14. Historical Evolution of Old-Age Mortality and New Approaches to Mortality Forecasting

    PubMed Central

    Gavrilov, Leonid A.; Gavrilova, Natalia S.; Krut'ko, Vyacheslav N.

    2017-01-01

    Knowledge of future mortality levels and trends is important for actuarial practice but poses a challenge to actuaries and demographers. The Lee-Carter method, currently used for mortality forecasting, is based on the assumption that the historical evolution of mortality at all age groups is driven by one factor only. This approach cannot capture an additive manner of mortality decline observed before the 1960s. To overcome the limitation of the one-factor model of mortality and to determine the true number of factors underlying mortality changes over time, we suggest a new approach to mortality analysis and forecasting based on the method of latent variable analysis. The basic assumption of this approach is that most variation in mortality rates over time is a manifestation of a small number of latent variables, variation in which gives rise to the observed mortality patterns. To extract major components of mortality variation, we apply factor analysis to mortality changes in developed countries over the period of 1900–2014. Factor analysis of time series of age-specific death rates in 12 developed countries (data taken from the Human Mortality Database) identified two factors capable of explaining almost 94 to 99 percent of the variance in the temporal changes of adult death rates at ages 25 to 85 years. Analysis of these two factors reveals that the first factor is a “young-age” or background factor with high factor loadings at ages 30 to 45 years. The second factor can be called an “old-age” or senescent factor because of high factor loadings at ages 65 to 85 years. It was found that the senescent factor was relatively stable in the past but now is rapidly declining for both men and women. The decline of the senescent factor is faster for men, although in most countries, it started almost 30 years later. Factor analysis of time series of age-specific death rates conducted for the oldest-old ages (65 to 100 years) found two factors explaining variation of mortality at extremely old ages in the United States. The first factor is comparable to the senescent factor found for adult mortality. The second factor, however, is specific to extreme old ages (96 to 100 years) and shows peaks in 1960 and 2000. Although mortality below 90 to 95 years shows a steady decline with time driven by the senescent factor, mortality of centenarians does not decline and remains relatively stable. The approach suggested in this paper has several advantages. First, it is able to determine the total number of independent factors affecting mortality changes over time. Second, this approach allows researchers to determine the time interval in which underlying factors remain stable or undergo rapid changes. Most methods of mortality projections are not able to identify the best base period for mortality projections, attempting to use the longest-possible time period instead. We observe that the senescent factor of mortality continues to decline, and this decline does not demonstrate any indications of slowing down. At the same time, mortality of centenarians does not decline and remains stable. The lack of mortality decline at extremely old ages may diminish anticipated longevity gains in the future. PMID:29170765

  15. Testing for measurement invariance and latent mean differences across methods: interesting incremental information from multitrait-multimethod studies

    PubMed Central

    Geiser, Christian; Burns, G. Leonard; Servera, Mateu

    2014-01-01

    Models of confirmatory factor analysis (CFA) are frequently applied to examine the convergent validity of scores obtained from multiple raters or methods in so-called multitrait-multimethod (MTMM) investigations. We show that interesting incremental information about method effects can be gained from including mean structures and tests of measurement invariance (MI) across methods in MTMM models. We present a modeling framework for testing MI in the first step of a CFA-MTMM analysis. We also discuss the relevance of MI in the context of four more complex CFA-MTMM models with method factors. We focus on three recently developed multiple-indicator CFA-MTMM models for structurally different methods [the correlated traits-correlated (methods – 1), latent difference, and latent means models; Geiser et al., 2014a; Pohl and Steyer, 2010; Pohl et al., 2008] and one model for interchangeable methods (Eid et al., 2008). We demonstrate that some of these models require or imply MI by definition for a proper interpretation of trait or method factors, whereas others do not, and explain why MI may or may not be required in each model. We show that in the model for interchangeable methods, testing for MI is critical for determining whether methods can truly be seen as interchangeable. We illustrate the theoretical issues in an empirical application to an MTMM study of attention deficit and hyperactivity disorder (ADHD) with mother, father, and teacher ratings as methods. PMID:25400603

  16. Applications of FEM and BEM in two-dimensional fracture mechanics problems

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Steeve, B. E.; Swanson, G. R.

    1992-01-01

    A comparison of the finite element method (FEM) and boundary element method (BEM) for the solution of two-dimensional plane strain problems in fracture mechanics is presented in this paper. Stress intensity factors (SIFs) were calculated using both methods for elastic plates with either a single-edge crack or an inclined-edge crack. In particular, two currently available programs, ANSYS for finite element analysis and BEASY for boundary element analysis, were used.

  17. Broadband Studies of Seismic Sources at Regional and Teleseismic Distances Using Advanced Time Series Analysis Methods. Volume 1.

    DTIC Science & Technology

    1991-03-21

    discussion of spectral factorability and motivations for broadband analysis, the report is subdivided into four main sections. In Section 1.0, we... estimates. The motivation for developing our multi-channel deconvolution method was to gain information about seismic sources, most notably, nuclear... with complex constraints for estimating the rupture history. Such methods (applied mostly to data sets that also include strong motion data) were

  18. CLUSFAVOR 5.0: hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles

    PubMed Central

    Peterson, Leif E

    2002-01-01

    CLUSFAVOR (CLUSter and Factor Analysis with Varimax Orthogonal Rotation) 5.0 is a Windows-based computer program for hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles. CLUSFAVOR 5.0 standardizes input data; sorts data according to gene-specific coefficient of variation, standard deviation, average and total expression, and Shannon entropy; performs hierarchical cluster analysis using nearest-neighbor, unweighted pair-group method using arithmetic averages (UPGMA), or furthest-neighbor joining methods, and Euclidean, correlation, or jack-knife distances; and performs principal-component analysis. PMID:12184816
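
    The workflow CLUSFAVOR implements can be approximated with open-source tools. The sketch below is illustrative only (CLUSFAVOR itself is a Windows program, and `X` is a hypothetical genes-by-arrays matrix): it standardizes profiles, builds a UPGMA tree on correlation distances, and computes principal components.

    ```python
    # Illustrative approximation of a CLUSFAVOR-style analysis (not its code).
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 20))    # stand-in expression profiles (genes x arrays)
    X = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)  # standardize

    d = pdist(X, metric="correlation")   # correlation distance, one of the offered choices
    tree = linkage(d, method="average")  # UPGMA; 'single'/'complete' are also offered
    clusters = fcluster(tree, t=10, criterion="maxclust")

    pcs = PCA(n_components=3).fit_transform(X)  # principal-component view of the profiles
    ```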

  19. College Students' Motivation to Achieve and Maintain a Healthy Weight

    ERIC Educational Resources Information Center

    Furia, Andrea C.; Lee, Rebecca E.; Strother, Myra L.; Huang, Terry T-K.

    2009-01-01

    Objectives: To develop and refine a scale of motivational factors related to healthy weight achievement and maintenance and to examine differences by gender and weight status. Methods: A cross-sectional survey of 300 university students aged 18-24 years. Results: Factor analysis yielded 6 factors--Intrinsic (Cronbach's alpha = 0.73): affective…

  20. Detecting and correcting the bias of unmeasured factors using perturbation analysis: a data-mining approach.

    PubMed

    Lee, Wen-Chung

    2014-02-05

    The randomized controlled study is the gold-standard research method in biomedicine. In contrast, the validity of a (nonrandomized) observational study is often questioned because of unknown/unmeasured factors, which may have confounding and/or effect-modifying potential. In this paper, the author proposes a perturbation test to detect the bias of unmeasured factors and a perturbation adjustment to correct for such bias. The proposed method circumvents the problem of measuring unknowns by collecting the perturbations of unmeasured factors instead. Specifically, a perturbation is a variable that is readily available (or can be measured easily) and is potentially associated, though perhaps only very weakly, with unmeasured factors. The author conducted extensive computer simulations to provide a proof of concept. The simulations show that, as the number of perturbation variables obtained from data mining increases, the power of the perturbation test increases progressively, up to nearly 100%. In addition, after the perturbation adjustment, the bias decreases progressively, down to nearly 0%. The data-mining perturbation analysis described here is recommended for use in detecting and correcting the bias of unmeasured factors in observational studies.
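
    A toy simulation conveys the idea (this is one reading of the approach, not the author's code): an unmeasured confounder U biases the estimated exposure effect, and adjusting for many weak proxies of U shrinks the bias toward zero.

    ```python
    # Toy simulation of bias from an unmeasured factor and its "perturbation" adjustment.
    import numpy as np

    rng = np.random.default_rng(2)
    n, n_pert = 5000, 50
    U = rng.normal(size=n)                      # unmeasured factor
    x = 0.8 * U + rng.normal(size=n)            # exposure confounded by U
    y = 0.5 * x + 1.0 * U + rng.normal(size=n)  # true exposure effect is 0.5
    P = U[:, None] + 3.0 * rng.normal(size=(n, n_pert))  # weak proxies for U

    def ols_coef(design, y):
        return np.linalg.lstsq(design, y, rcond=None)[0]

    naive = ols_coef(np.column_stack([np.ones(n), x]), y)[1]
    adjusted = ols_coef(np.column_stack([np.ones(n), x, P]), y)[1]
    print(f"naive estimate: {naive:.2f}, perturbation-adjusted: {adjusted:.2f}")
    ```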

  1. Factors that influence the efficiency of beef and dairy cattle recording system in Kenya: A SWOT-AHP analysis.

    PubMed

    Wasike, Chrilukovian B; Magothe, Thomas M; Kahi, Alexander K; Peters, Kurt J

    2011-01-01

    Animal recording in Kenya is characterised by erratic producer participation and high drop-out rates from the national recording scheme. This study evaluates factors influencing the efficiency of the beef and dairy cattle recording system. Factors influencing the efficiency of animal identification and registration, pedigree and performance recording, and genetic evaluation and information utilisation were generated using qualitative and participatory methods. Pairwise comparison of factors was done by strengths, weaknesses, opportunities and threats-analytical hierarchical process (SWOT-AHP) analysis, and priority scores expressing each factor's relative importance to the system were calculated using the eigenvalue method. For identification and registration, and for evaluation and information utilisation, external factors had high priority scores. For pedigree and performance recording, threats and weaknesses had the highest priority scores. The strength factors could not sustain the required efficiency of the system, and its weaknesses predisposed it to threats. Available opportunities could be explored as interventions to restore efficiency in the system. Defensive strategies, such as reorienting the system to offer utility benefits to recording, forming symbiotic and binding collaboration between recording organisations and NARS, and developing institutions to support recording, were feasible.
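
    For readers unfamiliar with the eigenvalue step of AHP: priority scores are the normalized principal eigenvector of a pairwise-comparison matrix. A minimal sketch with a hypothetical 3x3 comparison matrix (not the study's data):

    ```python
    # Eigenvalue method for AHP priority weights; the comparison matrix is hypothetical.
    import numpy as np

    A = np.array([[1,   3,   5  ],
                  [1/3, 1,   2  ],
                  [1/5, 1/2, 1  ]], dtype=float)   # Saaty-style pairwise comparisons

    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                   # priority scores
    ci = (vals.real[k] - len(A)) / (len(A) - 1)    # consistency index
    print(w.round(3), round(ci, 3))
    ```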

  2. Discriminative factor analysis of juvenile delinquency in South Korea.

    PubMed

    Kim, Hyun Sil; Kim, Hun Soo

    2006-12-01

    The present study was intended to compare differences in research variables between delinquent adolescents and student adolescents, and to analyze discriminative factors of delinquent behaviors among Korean adolescents. The research design of this study was a questionnaire survey. Questionnaires were administered to 2,167 adolescents (1,196 students and 971 delinquents), sampled from 8 middle and high schools and 6 juvenile corrective institutions using the proportional stratified random sampling method. Statistical methods employed were the chi-square test, t-test, and logistic regression analysis. The discriminative factors of delinquent behaviors were smoking, alcohol use, other drug use, being sexually abused, and viewing time of media violence and pornography. Among these discriminative factors, the factor most strongly associated with delinquency was smoking (odds ratio: 32.32); that is, an adolescent who smokes has a 32-fold higher odds of becoming delinquent than one who does not. Our finding that smoking was the strongest discriminative factor of delinquent behavior suggests that educational strategies to prevent adolescent smoking may reduce the rate of juvenile delinquency. Antismoking educational efforts are therefore urgently needed in South Korea.
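
    The analysis type reported here — logistic regression over binary risk factors, with odds ratios obtained as exponentiated coefficients — can be sketched as follows (simulated data; the coefficients below are illustrative, not the study's estimates):

    ```python
    # Sketch of logistic-regression discriminative-factor analysis on simulated data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n = 2000
    X = rng.integers(0, 2, size=(n, 3))            # smoking, alcohol, media violence (0/1)
    logit = -2.0 + 3.5 * X[:, 0] + 0.8 * X[:, 1] + 0.5 * X[:, 2]
    y = rng.random(n) < 1 / (1 + np.exp(-logit))   # delinquent vs. not

    # note: sklearn applies mild L2 regularization by default, shrinking the ORs slightly
    model = LogisticRegression().fit(X, y)
    odds_ratios = np.exp(model.coef_[0])
    print(dict(zip(["smoking", "alcohol", "media_violence"], odds_ratios.round(2))))
    ```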

  3. Analysis and evaluation of processes and equipment in tasks 2 and 4 of the low-cost solar array project

    NASA Technical Reports Server (NTRS)

    Goldman, H.; Wolf, M.

    1978-01-01

    Several experimental and projected Czochralski crystal growing process methods were studied and compared to available operations and cost-data of recent production Cz-pulling, in order to elucidate the role of the dominant cost contributing factors. From this analysis, it becomes apparent that the specific add-on costs of the Cz-process can be expected to be reduced by about a factor of three by 1982, and about a factor of five by 1986. A format to guide in the accumulation of the data needed for thorough techno-economic analysis of solar cell production processes was developed.

  4. Bem Sex Role Inventory Validation in the International Mobility in Aging Study.

    PubMed

    Ahmed, Tamer; Vafaei, Afshin; Belanger, Emmanuelle; Phillips, Susan P; Zunzunegui, Maria-Victoria

    2016-09-01

    This study investigated the measurement structure of the Bem Sex Role Inventory (BSRI) with different factor analysis methods. Most previous validity studies applied exploratory factor analysis (EFA) to examine the BSRI. We aimed to assess the psychometric properties and construct validity of the 12-item short-form BSRI administered to 1,995 older adults in wave 1 of the International Mobility in Aging Study (IMIAS). We used Cronbach's alpha to assess internal consistency reliability and confirmatory factor analysis (CFA) to assess psychometric properties. EFA revealed a three-factor model, which was further examined with CFA and compared with the original two-factor structure model. Results revealed that a two-factor solution (instrumentality-expressiveness) has satisfactory construct validity and superior fit to the data compared to the three-factor solution. The two-factor solution confirms expected gender differences in older adults. The 12-item BSRI provides a brief, psychometrically sound, and reliable instrument for international samples of older adults.

  5. Impact of the Choice of Normalization Method on Molecular Cancer Class Discovery Using Nonnegative Matrix Factorization.

    PubMed

    Yang, Haixuan; Seoighe, Cathal

    2016-01-01

    Nonnegative Matrix Factorization (NMF) has proved to be an effective method for unsupervised clustering analysis of gene expression data. Owing to the nonnegativity constraint, NMF provides a decomposition of the data matrix into two matrices that have been used for clustering analysis. However, the decomposition is not unique, which allows different clustering results to be obtained and leads to different interpretations of the decomposition. To alleviate this problem, some existing methods directly enforce uniqueness to some extent by adding regularization terms to the NMF objective function. Alternatively, various normalization methods have been applied to the factor matrices; however, the effects of the choice of normalization have not been carefully investigated. Here we investigate the performance of NMF for the task of cancer class discovery under a wide range of normalization choices. After extensive evaluations, we observe that the maximum norm showed the best performance, although the maximum norm has not previously been used for NMF. Matlab codes are freely available from: http://maths.nuigalway.ie/~haixuanyang/pNMF/pNMF.htm.
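
    The normalization question studied here concerns how the factor matrices are rescaled after fitting. A brief sketch (not the authors' Matlab code) of NMF-based class discovery with the max-norm rescaling they favor, on a hypothetical nonnegative matrix V:

    ```python
    # NMF class-discovery sketch with max-norm rescaling of the factor matrices.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(4)
    V = rng.random((200, 30))                       # stand-in expression data (genes x samples)
    model = NMF(n_components=3, init="nndsvd", max_iter=500, random_state=0)
    W = model.fit_transform(V)                      # V ~ W @ H
    H = model.components_

    scale = W.max(axis=0)                           # max norm of each factor column
    W_n, H_n = W / scale, H * scale[:, None]        # rescale; W_n @ H_n == W @ H
    classes = H_n.argmax(axis=0)                    # assign each sample to a factor/class
    ```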

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.

    Here, we present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography–mass spectrometry methods. Here, we specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration. In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.
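
    The binning step itself (setting aside the PMF model) amounts to summing mass spectra within fixed retention-time windows and unfolding the result into one row per sample. A sketch with illustrative array shapes (not the paper's code):

    ```python
    # Chromatogram binning into a PMF input matrix; shapes are illustrative.
    import numpy as np

    rng = np.random.default_rng(5)
    n_samples, n_scans, n_mz = 40, 6000, 300
    chrom = rng.random((n_samples, n_scans, n_mz))   # per sample: scans x m/z

    bin_width = 100                                  # scans per retention-time bin
    n_bins = n_scans // bin_width
    binned = (chrom[:, :n_bins * bin_width, :]
              .reshape(n_samples, n_bins, bin_width, n_mz)
              .sum(axis=2))                          # sum spectra within each bin

    pmf_input = binned.reshape(n_samples, n_bins * n_mz)  # rows: samples; cols: bin x m/z
    ```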

  7. Application of principal component analysis to ecodiversity assessment of postglacial landscape (on the example of Debnica Kaszubska commune, Middle Pomerania)

    NASA Astrophysics Data System (ADS)

    Wojciechowski, Adam

    2017-04-01

    In order to assess ecodiversity, understood as a comprehensive natural landscape factor (Jedicke 2001), it is necessary to apply research methods which recognize the environment in a holistic way. Principal component analysis may be considered one such method, as it allows the main factors determining landscape diversity to be distinguished on the one hand, and enables the discovery of regularities shaping the relationships between various elements of the environment under study on the other. The procedure adopted to assess ecodiversity with the use of principal component analysis involves: a) determining and selecting appropriate factors of the assessed environment qualities (hypsometric, geological, hydrographic, plant, and others); b) calculating the absolute value of individual qualities for the basic areas under analysis (e.g. river length, forest area, altitude differences, etc.); c) principal components analysis and obtaining factor maps (maps of selected components); d) generating a resultant, detailed map and isolating several classes of ecodiversity (a sketch of steps b)-d) is given after this abstract). An assessment of ecodiversity with the use of principal component analysis was conducted in a test area of 299.67 km2 in Debnica Kaszubska commune. The whole commune is situated in the Weichselian glaciation area of high hypsometric and morphological diversity as well as high geo- and biodiversity. The analysis was based on topographical maps of the commune area in scale 1:25000 and maps of forest habitats. Consequently, nine factors reflecting basic environment elements were calculated: maximum height (m), minimum height (m), average height (m), the length of watercourses (km), the area of water reservoirs (m2), total forest area (ha), coniferous forest habitats area (ha), deciduous forest habitats area (ha), alder habitats area (ha). The values for individual factors were analysed for 358 grid cells of 1 km2. Based on the principal components analysis, four major factors affecting commune ecodiversity were distinguished: a hypsometric component (PC1), a deciduous forest habitats component (PC2), a river valleys and alder habitats component (PC3), and a lakes component (PC4). The distinguished factors characterise the natural qualities of the postglacial area and reflect well the role of the four most important groups of environment components in shaping the ecodiversity of the area under study. The map of ecodiversity of Debnica Kaszubska commune was created on the basis of the first four principal component scores, and five classes of diversity were isolated: very low, low, average, high and very high. As a result of the assessment, five commune regions of very high ecodiversity were identified. These regions are also very attractive for tourists and valuable in terms of their rich nature, which includes protected areas such as Slupia Valley Landscape Park. The suggested method of ecodiversity assessment with the use of principal component analysis may constitute an alternative methodological proposition to the research methods used so far. Literature: Jedicke E., 2001. Biodiversität, Geodiversität, Ökodiversität. Kriterien zur Analyse der Landschaftsstruktur - ein konzeptioneller Diskussionsbeitrag. Naturschutz und Landschaftsplanung, 33(2/3), 59-68.
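
    Steps b)-d) of the procedure above translate directly into a standard standardize-then-PCA computation. A minimal sketch (simulated grid-cell data, not the commune dataset):

    ```python
    # Standardize per-cell environmental variables, extract PCs, classify cells.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(6)
    cells = rng.random((358, 9))                  # 358 grid cells x 9 environmental factors

    Z = StandardScaler().fit_transform(cells)
    pca = PCA(n_components=4)
    scores = pca.fit_transform(Z)                 # PC1..PC4 per cell -> "factor maps"

    # weighted composite score, then five ecodiversity classes by quintile
    resultant = scores @ pca.explained_variance_ratio_[:4]
    classes = np.digitize(resultant, np.quantile(resultant, [0.2, 0.4, 0.6, 0.8]))
    ```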

  8. Quantitative immunohistochemistry of factor VIII-related antigen in breast carcinoma: a comparison of computer-assisted image analysis with established counting methods.

    PubMed

    Kohlberger, P D; Obermair, A; Sliutz, G; Heinzl, H; Koelbl, H; Breitenecker, G; Gitsch, G; Kainz, C

    1996-06-01

    Microvessel density in the area of the most intense neovascularization in invasive breast carcinoma is reported to be an independent prognostic factor. The established method of enumerating microvessel density is to count the vessels using an ocular raster (counted microvessel density [CMVD]). The vessels were detected by staining endothelial cells for factor VIII-related antigen. The aim of the study was to compare the CMVD results with the percentage of factor VIII-related antigen-stained area obtained using computer-assisted image analysis. A true color red-green-blue (RGB) image analyzer based on a morphologically reduced instruction set computer processor was used to evaluate the area of stained endothelial cells. Sixty invasive breast carcinomas were included in the analysis. There was no significant correlation between the CMVD and the percentage of factor VIII-related antigen-stained area (Spearman correlation coefficient = 0.24, confidence interval = 0.02-0.46). Although high CMVD was significantly correlated with poorer recurrence-free survival (P = .024), the percentage of factor VIII-related antigen-stained area showed no prognostic value. Counted microvessel density and percentage of factor VIII-related antigen-stained area both showed a highly significant correlation with vessel invasion (P = .0001 and P = .02, respectively). Neither CMVD nor the percentage of factor VIII-related antigen-stained area correlated with other prognostic factors. In contrast to the CMVD within malignant tissue, the percentage of factor VIII-related antigen-stained area is not suitable as an indicator of prognosis in breast cancer patients.

  9. Causation mechanism analysis for haze pollution related to vehicle emission in Guangzhou, China by employing the fault tree approach.

    PubMed

    Huang, Weiqing; Fan, Hongbo; Qiu, Yongfu; Cheng, Zhiyu; Xu, Pingru; Qian, Yu

    2016-05-01

    Recently, China has frequently experienced large-scale, severe and persistent haze pollution due to surging urbanization and industrialization and rapid growth in the number of motor vehicles and in energy consumption. Vehicle emissions from the consumption of large quantities of fossil fuels are without doubt a critical factor in haze pollution. This work focuses on the causation mechanism of haze pollution related to vehicle emissions in Guangzhou city, employing the Fault Tree Analysis (FTA) method for the first time. With the establishment of the fault tree system "Haze weather - Vehicle exhausts explosive emission", all of the important risk factors are discussed and identified using this deductive FTA method. Qualitative and quantitative assessments of the fault tree system are carried out based on the structure, probability and critical importance degree analysis of the risk factors. The study may provide a new, simple and effective tool/strategy for the causation mechanism analysis and risk management of haze pollution in China.
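
    The quantitative side of FTA reduces to combining basic-event probabilities through AND/OR gates. A generic sketch (the paper's actual tree and probabilities are not reproduced; the event names and values below are hypothetical):

    ```python
    # Generic fault-tree arithmetic for independent basic events.
    from math import prod

    def gate_or(ps):  return 1 - prod(1 - p for p in ps)   # any input event occurs
    def gate_and(ps): return prod(ps)                      # all input events occur

    # hypothetical basic-event probabilities
    p_traffic_jam, p_old_fleet, p_fuel_quality, p_weak_inspection = 0.3, 0.2, 0.25, 0.15

    p_excess_exhaust = gate_or([p_old_fleet, p_fuel_quality, p_weak_inspection])
    p_haze_from_vehicles = gate_and([p_traffic_jam, p_excess_exhaust])
    print(f"top event probability: {p_haze_from_vehicles:.3f}")
    ```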

  10. A feasibility study on age-related factors of wrist pulse using principal component analysis.

    PubMed

    Jang-Han Bae; Young Ju Jeon; Sanghun Lee; Jaeuk U Kim

    2016-08-01

    Various analysis methods for examining wrist pulse characteristics are needed for accurate pulse diagnosis. In this feasibility study, principal component analysis (PCA) was performed to observe age-related factors of the wrist pulse across various analysis parameters. Forty subjects in their 20s and 40s participated, and their wrist pulse and respiration signals were acquired with a pulse tonometric device. After pre-processing of the signals, twenty analysis parameters regarded as reflecting pulse characteristics were calculated and PCA was performed. As a result, we could reduce the complex parameters to a lower dimension, and age-related factors of the wrist pulse were observed by combining new analysis parameters derived from PCA. These results demonstrate that PCA can be a useful tool for analyzing wrist pulse signals.

  11. Quantitative super-resolution single molecule microscopy dataset of YFP-tagged growth factor receptors.

    PubMed

    Lukeš, Tomáš; Pospíšil, Jakub; Fliegel, Karel; Lasser, Theo; Hagen, Guy M

    2018-03-01

    Super-resolution single molecule localization microscopy (SMLM) is a method for achieving resolution beyond the classical limit in optical microscopes (approx. 200 nm laterally). Yellow fluorescent protein (YFP) has been used for super-resolution single molecule localization microscopy, but less frequently than other fluorescent probes. Working with YFP in SMLM is a challenge because a lower number of photons are emitted per molecule compared with organic dyes, which are more commonly used. Publicly available experimental data can facilitate development of new data analysis algorithms. Four complete, freely available single molecule super-resolution microscopy datasets on YFP-tagged growth factor receptors expressed in a human cell line are presented, including both raw and analyzed data. We report methods for sample preparation, for data acquisition, and for data analysis, as well as examples of the acquired images. We also analyzed the SMLM datasets using a different method: super-resolution optical fluctuation imaging (SOFI). The 2 modes of analysis offer complementary information about the sample. A fifth single molecule super-resolution microscopy dataset acquired with the dye Alexa 532 is included for comparison purposes. This dataset has potential for extensive reuse. Complete raw data from SMLM experiments have typically not been published. The YFP data exhibit low signal-to-noise ratios, making data analysis a challenge. These datasets will be useful to investigators developing their own algorithms for SMLM, SOFI, and related methods. The data will also be useful for researchers investigating growth factor receptors such as ErbB3.

  12. Entrance and exit region friction factor models for annular seal analysis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Elrod, David Alan

    1988-01-01

    The Mach number definition and boundary conditions in Nelson's nominally-centered, annular gas seal analysis are revised. A method is described for determining the wall shear stress characteristics of an annular gas seal experimentally. Two friction factor models are developed for annular seal analysis; one model is based on flat-plate flow theory; the other uses empirical entrance and exit region friction factors. The friction factor predictions of the models are compared to experimental results. Each friction model is used in an annular gas seal analysis. The seal characteristics predicted by the two seal analyses are compared to experimental results and to the predictions of Nelson's analysis. The comparisons are for smooth-rotor seals with smooth and honeycomb stators. The comparisons show that the analysis which uses empirical entrance and exit region shear stress models predicts the static and stability characteristics of annular gas seals better than the other analyses. The analyses predict direct stiffness poorly.

  13. Dynamical Behaviors between the PM10 and the meteorological factor using the detrended cross-correlation analysis method

    NASA Astrophysics Data System (ADS)

    Kim, Kyungsik; Lee, Dong-In

    2013-04-01

    There is considerable interest in cross-correlations in collective modes of real data from atmospheric geophysics, seismology, finance, physiology, genomics, and nanodevices. If two systems interact mutually, that interaction gives rise to collective modes. This phenomenon can be analyzed using the cross-correlation of traditional methods, random matrix theory, and the detrended cross-correlation analysis method. The detrended cross-correlation analysis method has previously been used to analyze several models such as autoregressive fractionally integrated moving average processes, stock prices and their trading volumes, and taxi accidents. Particulate matter is composed of organic and inorganic mixtures such as natural sea salt, soil particles, vehicle exhaust, construction dust, and soot. PM10 denotes particles with an aerodynamic diameter of less than 10 microns, which are able to enter the human respiratory system. The PM10 concentration affects climate change by causing an imbalance in the global radiative equilibrium: through the direct effect it blocks the stomata of plants and cuts off solar radiation, as distinct from the indirect effect, which changes the optical properties of clouds, cloudiness, and cloud lifetime. Various factors contribute to the degree of PM10 concentration, notably land-use types and surface vegetation coverage, as well as meteorological factors. In this study, we analyze and simulate cross-correlations across time scales between the PM10 concentration and meteorological factors (temperature, wind speed and humidity) using the detrended cross-correlation analysis method, after removing specific trends, at eight cities on the Korean peninsula. We divide the time series data into Asian dust events and non-Asian dust events to analyze the effect of meteorological factors on the fluctuation of the PM10 concentration during Asian dust events. In particular, our results are compared to analytic findings from the published literature. This work was supported by the Center for the ASER (CATER 2012-6110) and by the NRFK through a grant provided by the KMEST (No. K1663000201107900).
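
    A compact, textbook-style DCCA sketch (not the study's code): integrate both demeaned series, detrend them linearly in non-overlapping boxes of size s, and average the covariance of the residuals.

    ```python
    # Detrended cross-correlation analysis (DCCA), basic non-overlapping-box form.
    import numpy as np

    def dcca(x, y, s):
        X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())  # integrated profiles
        n = (len(X) // s) * s
        cov, t = 0.0, np.arange(s)
        for i in range(0, n, s):
            px = np.polyval(np.polyfit(t, X[i:i+s], 1), t)       # local linear trend
            py = np.polyval(np.polyfit(t, Y[i:i+s], 1), t)
            cov += np.mean((X[i:i+s] - px) * (Y[i:i+s] - py))
        return cov / (n // s)                                    # detrended covariance F^2(s)

    rng = np.random.default_rng(7)
    pm10 = rng.normal(size=2000).cumsum()            # stand-ins for PM10 and temperature
    temp = 0.4 * pm10 + rng.normal(size=2000).cumsum()
    print([dcca(np.diff(pm10), np.diff(temp), s) for s in (16, 64, 256)])
    ```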

  14. Specialty Selections of Jefferson Medical College Students: A Conjoint Analysis.

    ERIC Educational Resources Information Center

    Diamond, James J.; And Others

    1994-01-01

    A consumer research technique, conjoint analysis, was used to assess the relative importance of several factors in 104 fourth-year medical students' selection of specialty. Conjoint analysis appears to be a useful method for investigating the complex process of specialty selection. (SLD)

  15. Linear regression analysis: part 14 of a series on evaluation of scientific publications.

    PubMed

    Schneider, Astrid; Hommel, Gerhard; Blettner, Maria

    2010-11-01

    Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.
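
    A small worked example of the method the article discusses — fitting and interpreting a univariable linear regression on simulated data (the variable names and coefficients are illustrative):

    ```python
    # Univariable linear regression: fit, predict, and report slope and R^2.
    import numpy as np

    rng = np.random.default_rng(8)
    age = rng.uniform(20, 80, size=200)
    systolic_bp = 95 + 0.6 * age + rng.normal(scale=8, size=200)  # simulated outcome

    slope, intercept = np.polyfit(age, systolic_bp, 1)
    pred = intercept + slope * age
    r2 = 1 - np.sum((systolic_bp - pred) ** 2) / np.sum((systolic_bp - systolic_bp.mean()) ** 2)
    print(f"BP rises ~{slope:.2f} mmHg per year of age; R^2 = {r2:.2f}")
    ```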

  16. Multivariate Statistical Analysis of MSL APXS Bulk Geochemical Data

    NASA Astrophysics Data System (ADS)

    Hamilton, V. E.; Edwards, C. S.; Thompson, L. M.; Schmidt, M. E.

    2014-12-01

    We apply cluster and factor analyses to bulk chemical data of 130 soil and rock samples measured by the Alpha Particle X-ray Spectrometer (APXS) on the Mars Science Laboratory (MSL) rover Curiosity through sol 650. Multivariate approaches such as principal components analysis (PCA), cluster analysis, and factor analysis complement more traditional approaches (e.g., Harker diagrams), with the advantage of simultaneously examining the relationships between multiple variables for large numbers of samples. Principal components analysis has been applied with success to APXS, Pancam, and Mössbauer data from the Mars Exploration Rovers. Factor analysis and cluster analysis have been applied with success to thermal infrared (TIR) spectral data of Mars. Cluster analyses group the input data by similarity, where there are a number of different methods for defining similarity (hierarchical, density, distribution, etc.). For example, without any assumptions about the chemical contributions of surface dust, preliminary hierarchical and K-means cluster analyses clearly distinguish the physically adjacent rock targets Windjana and Stephen as being distinctly different from lithologies observed prior to Curiosity's arrival at The Kimberley. In addition, they are separated from each other, consistent with chemical trends observed in variation diagrams but without requiring assumptions about chemical relationships. We will discuss the variation in cluster analysis results as a function of clustering method and pre-processing (e.g., log transformation, correction for dust cover) and implications for interpreting chemical data. Factor analysis shares some similarities with PCA, and examines the variability among observed components of a dataset so as to reveal variations attributable to unobserved components. Factor analysis has been used to extract the TIR spectra of components that are typically observed in mixtures and only rarely in isolation; there is the potential for similar results with data from APXS. These techniques offer new ways to understand the chemical relationships between the materials interrogated by Curiosity, and potentially their relation to materials observed by APXS instruments on other landed missions.

  17. Unsupervised Bayesian linear unmixing of gene expression microarrays.

    PubMed

    Bazot, Cécile; Dobigeon, Nicolas; Tourneret, Jean-Yves; Zaas, Aimee K; Ginsburg, Geoffrey S; Hero, Alfred O

    2013-03-19

    This paper introduces a new constrained model and the corresponding algorithm, called unsupervised Bayesian linear unmixing (uBLU), to identify biological signatures from high dimensional assays like gene expression microarrays. The basis for uBLU is a Bayesian model for the data samples which are represented as an additive mixture of random positive gene signatures, called factors, with random positive mixing coefficients, called factor scores, that specify the relative contribution of each signature to a specific sample. The particularity of the proposed method is that uBLU constrains the factor loadings to be non-negative and the factor scores to be probability distributions over the factors. Furthermore, it also provides estimates of the number of factors. A Gibbs sampling strategy is adopted here to generate random samples according to the posterior distribution of the factors, factor scores, and number of factors. These samples are then used to estimate all the unknown parameters. Firstly, the proposed uBLU method is applied to several simulated datasets with known ground truth and compared with previous factor decomposition methods, such as principal component analysis (PCA), nonnegative matrix factorization (NMF), Bayesian factor regression modeling (BFRM), and the gradient-based algorithm for general matrix factorization (GB-GMF). Secondly, we illustrate the application of uBLU on a real time-evolving gene expression dataset from a recent viral challenge study in which individuals have been inoculated with influenza A/H3N2/Wisconsin. We show that the uBLU method significantly outperforms the other methods on the simulated and real data sets considered here. The results obtained on synthetic and real data illustrate the accuracy of the proposed uBLU method when compared to other factor decomposition methods from the literature (PCA, NMF, BFRM, and GB-GMF). The uBLU method identifies an inflammatory component closely associated with clinical symptom scores collected during the study. Using a constrained model allows recovery of all the inflammatory genes in a single factor.

  18. Improved Variable Selection Algorithm Using a LASSO-Type Penalty, with an Application to Assessing Hepatitis B Infection Relevant Factors in Community Residents

    PubMed Central

    Guo, Pi; Zeng, Fangfang; Hu, Xiaomin; Zhang, Dingmei; Zhu, Shuming; Deng, Yu; Hao, Yuantao

    2015-01-01

    Objectives In epidemiological studies, it is important to identify independent associations between collective exposures and a health outcome. The current stepwise selection technique ignores stochastic errors and suffers from a lack of stability. The alternative LASSO-penalized regression model can be applied to detect significant predictors from a pool of candidate variables. However, this technique is prone to false positives and tends to create excessive biases. It remains challenging to develop robust variable selection methods and enhance predictability. Material and methods Two improved algorithms denoted the two-stage hybrid and bootstrap ranking procedures, both using a LASSO-type penalty, were developed for epidemiological association analysis. The performance of the proposed procedures and other methods including conventional LASSO, Bolasso, stepwise and stability selection models were evaluated using intensive simulation. In addition, methods were compared by using an empirical analysis based on large-scale survey data of hepatitis B infection-relevant factors among Guangdong residents. Results The proposed procedures produced comparable or less biased selection results when compared to conventional variable selection models. In total, the two newly proposed procedures were stable with respect to various scenarios of simulation, demonstrating a higher power and a lower false positive rate during variable selection than the compared methods. In empirical analysis, the proposed procedures yielding a sparse set of hepatitis B infection-relevant factors gave the best predictive performance and showed that the procedures were able to select a more stringent set of factors. The individual history of hepatitis B vaccination, family and individual history of hepatitis B infection were associated with hepatitis B infection in the studied residents according to the proposed procedures. Conclusions The newly proposed procedures improve the identification of significant variables and enable us to derive a new insight into epidemiological association analysis. PMID:26214802
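
    One way to render the bootstrap-ranking idea in code (a sketch of the general approach, not the authors' exact procedure): refit a cross-validated LASSO on bootstrap resamples and retain predictors selected in a large fraction of them.

    ```python
    # Bootstrap-ranked variable selection with a LASSO-type penalty (illustrative).
    import numpy as np
    from sklearn.linear_model import LassoCV

    rng = np.random.default_rng(9)
    n, p = 300, 30
    X = rng.normal(size=(n, p))
    y = 1.5 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(size=n)   # two true predictors

    counts = np.zeros(p)
    for _ in range(100):
        idx = rng.integers(0, n, n)                          # bootstrap resample
        fit = LassoCV(cv=5).fit(X[idx], y[idx])
        counts += fit.coef_ != 0                             # tally selected predictors

    stable = np.where(counts / 100 >= 0.8)[0]                # selection frequency >= 80%
    print("stable predictors:", stable)
    ```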

  19. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces

    NASA Astrophysics Data System (ADS)

    Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene

    2015-06-01

    When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.

  20. Hybrid-finite-element analysis of some nonlinear and 3-dimensional problems of engineering fracture mechanics

    NASA Technical Reports Server (NTRS)

    Atluri, S. N.; Nakagaki, M.; Kathiresan, K.

    1980-01-01

    In this paper, efficient numerical methods for the analysis of crack-closure effects on fatigue-crack-growth-rates, in plane stress situations, and for the solution of stress-intensity factors for arbitrary shaped surface flaws in pressure vessels, are presented. For the former problem, an elastic-plastic finite element procedure valid for the case of finite deformation gradients is developed and crack growth is simulated by the translation of near-crack-tip elements with embedded plastic singularities. For the latter problem, an embedded-elastic-singularity hybrid finite element method, which leads to a direct evaluation of K-factors, is employed.

  1. Determination of pKa values of some antipsychotic drugs by HPLC--correlations with the Kamlet and Taft solvatochromic parameters and HPLC analysis in dosage forms.

    PubMed

    Sanli, Senem; Akmese, Bediha; Altun, Yuksel

    2013-01-01

    In this study, ionization constant (pKa) values were determined for four ionizable drugs, namely risperidone (RI), clozapine (CL), olanzapine (OL), and sertindole (SE), by using the dependence of the retention factor on the pH of the mobile phase. The effect of the mobile phase composition on the pKa was studied by measuring the pKa in different acetonitrile-water mixtures with an HPLC-UV method. To explain the variation of the pKa values obtained over the whole composition range studied, the quasi-lattice quasi-chemical theory of preferential solvation was applied. The pKa values of the drugs were correlated with the Kamlet and Taft solvatochromic parameters. Kamlet and Taft's general equation was reduced to two terms by using combined factor analysis and target factor analysis in these mixtures: the independent term and the hydrogen-bond donating ability α. The HPLC-UV method was successfully applied to the determination of RI, OL, and SE in pharmaceutical dosage forms, with CL chosen as an internal standard. Additionally, the repeatability, reproducibility, selectivity, precision, and accuracy of the method in all media were investigated and calculated.

  2. Exploring the Sensitivity of Horn's Parallel Analysis to the Distributional Form of Random Data

    ERIC Educational Resources Information Center

    Dinno, Alexis

    2009-01-01

    Horn's parallel analysis (PA) is the method of consensus in the literature on empirical methods for deciding how many components/factors to retain. Different authors have proposed various implementations of PA. Horn's seminal 1965 article, a 1996 article by Thompson and Daniel, and a 2004 article by Hayton, Allen, and Scarpello all make assertions…
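
    One common implementation of parallel analysis (as the article notes, published variants differ, including in the distributional form of the random data) retains components whose observed eigenvalues exceed the mean eigenvalues from random data of the same dimensions:

    ```python
    # Horn's parallel analysis, mean-eigenvalue variant with normal random data.
    import numpy as np

    def parallel_analysis(X, n_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        n, p = X.shape
        obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
        rand = np.zeros(p)
        for _ in range(n_iter):
            R = rng.normal(size=(n, p))          # the distributional form at issue above
            rand += np.sort(np.linalg.eigvalsh(np.corrcoef(R, rowvar=False)))[::-1]
        return int(np.sum(obs > rand / n_iter))  # number of components to retain

    X = np.random.default_rng(1).normal(size=(300, 10))
    X[:, :3] += np.random.default_rng(2).normal(size=(300, 1))  # one shared factor
    print(parallel_analysis(X))                  # expected: 1
    ```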

  3. Determination of the mechanical parameters of rock mass based on a GSI system and displacement back analysis

    NASA Astrophysics Data System (ADS)

    Kang, Kwang-Song; Hu, Nai-Lian; Sin, Chung-Sik; Rim, Song-Ho; Han, Eun-Cheol; Kim, Chol-Nam

    2017-08-01

    It is very important to obtain the mechanical parameters of rock mass for excavation design, support design, slope design and stability analysis of underground structures. In order to estimate the mechanical parameters of rock mass accurately, a new method combining a geological strength index (GSI) system with intelligent displacement back analysis is proposed in this paper. Firstly, the average spacing of joints (d), rock mass block rating (RBR, a new quantitative factor), surface condition rating (SCR) and joint condition factor (Jc) are obtained for in situ rock masses using the scanline method, and the GSI values of rock masses are obtained from a new quantitative GSI chart. A correction method for the GSI value is newly introduced by considering the influence of joint orientation and groundwater on rock mass mechanical properties, and value ranges of rock mass mechanical parameters are then chosen via the Hoek-Brown failure criterion. Secondly, on the basis of the measured vault settlements and horizontal convergence displacements of an in situ tunnel, optimal parameters are estimated by a combination of a genetic algorithm (GA) and numerical simulation analysis using FLAC3D. This method has been applied in a lead-zinc mine. By utilizing the improved GSI quantization, the correction method and displacement back analysis, the mechanical parameters of the ore body, hanging wall and footwall rock mass were determined, so that reliable foundations were provided for mining design and stability analysis.
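
    The Hoek-Brown step referenced above can be sketched with the standard 2002-edition formulas (the paper's GSI correction and displacement back analysis are not reproduced here; the input values are hypothetical):

    ```python
    # Hoek-Brown (2002) rock-mass strength from GSI; inputs are hypothetical.
    import math

    def hoek_brown(sigma_ci, mi, gsi, D=0.0, sigma_3=0.0):
        """sigma_1 (MPa) from intact strength sigma_ci (MPa), material constant mi,
        GSI, disturbance factor D, and confining stress sigma_3 (MPa)."""
        mb = mi * math.exp((gsi - 100) / (28 - 14 * D))
        s = math.exp((gsi - 100) / (9 - 3 * D))
        a = 0.5 + (math.exp(-gsi / 15) - math.exp(-20 / 3)) / 6
        return sigma_3 + sigma_ci * (mb * sigma_3 / sigma_ci + s) ** a

    print(f"{hoek_brown(sigma_ci=80, mi=12, gsi=55):.1f} MPa")  # unconfined rock-mass strength
    ```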

  4. Overall uncertainty measurement for near infrared analysis of cryptotanshinone in tanshinone extract

    NASA Astrophysics Data System (ADS)

    Xue, Zhong; Xu, Bing; Shi, Xinyuan; Yang, Chan; Cui, Xianglong; Luo, Gan; Qiao, Yanjiang

    2017-01-01

    This study presented a new strategy of overall uncertainty measurement for near infrared (NIR) quantitative analysis of cryptotanshinone in tanshinone extract powders. The overall uncertainty of the NIR analysis was fully investigated and discussed from validation data of precision, trueness and robustness studies. Quality by design (QbD) elements, such as risk assessment and design of experiment (DOE), were utilized to organize the validation data. An "I × J × K" (series I, number of repetitions J and concentration levels K) full factorial design was used to calculate uncertainty from the precision and trueness data. A 2^(7-4) Plackett-Burman matrix with four different influence factors, derived from a failure mode and effect analysis (FMEA), was adopted for the robustness study. The overall uncertainty profile was introduced as a graphical decision-making tool to evaluate the validity of the NIR method over the predefined concentration range. In comparison with T. Saffaj's method (Analyst, 2013, 138, 4677) for overall uncertainty assessment, the proposed approach gave almost the same results, demonstrating that it is reasonable and valid. Moreover, the proposed method can help identify critical factors that influence the NIR prediction performance, which could be used for further optimization of NIR analytical procedures in routine use.

  5. STATISTICAL ANALYSIS OF SPECTROPHOTOMETRIC DETERMINATIONS OF BORON; Estudo Estatistico de Determinacoes Espectrofotometricas de Boro

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lima, F.W.; Pagano, C.; Schneiderman, B.

    1959-07-01

    Boron can be determined quantitatively by absorption spectrophotometry of solutions of the red compound formed by the reaction of boric acid with curcumin. This reaction is affected by various factors, some of which can be detected easily in the data interpretation; others, however, present more difficulty. The application of modern statistical methods to the study of the influence of these factors on the quantitative determination of boron is presented. These methods provide objective ways of establishing significant effects of the factors involved. (auth)

  6. Driver Performance Problems of Intercity Bus Public Transportation Safety in Indonesia

    NASA Astrophysics Data System (ADS)

    Suraji, A.; Harnen, S.; Wicaksono, A.; Djakfar, L.

    2017-11-01

    The risk of inter-city public bus accidents can be influenced by various factors, including the driver's performance. Knowing the factors related to driver performance is therefore necessary as an effort to improve road traffic safety. This study aims to determine the driver-related factors involved in accidents and to build a mathematical model of the factors that influence them. Data were obtained from NTSC secondary records and processed by identifying factors that cause accidents. The PCA method was then applied to obtain a mathematical model of the factors influencing inter-city bus accidents. The results showed that the main factors causing accidents are health, discipline, and driver competence.

  7. Factors Influencing Achievement in Undergraduate Social Science Research Methods Courses: A Mixed Methods Analysis

    ERIC Educational Resources Information Center

    Markle, Gail

    2017-01-01

    Undergraduate social science research methods courses tend to have higher than average rates of failure and withdrawal. Lack of success in these courses impedes students' progression through their degree programs and negatively impacts institutional retention and graduation rates. Grounded in adult learning theory, this mixed methods study…

  8. A Comparison of Pseudo-Maximum Likelihood and Asymptotically Distribution-Free Dynamic Factor Analysis Parameter Estimation in Fitting Covariance Structure Models to Block-Toeplitz Matrices Representing Single-Subject Multivariate Time-Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    1998-01-01

    Pseudo-Maximum Likelihood (p-ML) and Asymptotically Distribution Free (ADF) estimation methods for estimating dynamic factor model parameters within a covariance structure framework were compared through a Monte Carlo simulation. Both methods appear to give consistent model parameter estimates, but only ADF gives standard errors and chi-square…
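
    The data structure both estimators are fit to — a block-Toeplitz matrix assembled from lagged covariance matrices of one subject's multivariate time series — can be built as follows (a sketch of the input only; p-ML and ADF estimation themselves are not shown):

    ```python
    # Build a block-Toeplitz matrix from lagged covariance matrices.
    import numpy as np

    def block_toeplitz(X, n_lags):
        """X: (time x variables). Returns the (n_lags*p) x (n_lags*p) block-Toeplitz
        matrix whose (i, j) block is the lag-(i-j) covariance matrix."""
        T, p = X.shape
        Xc = X - X.mean(axis=0)
        C = [Xc[k:].T @ Xc[:T - k] / (T - k) if k else Xc.T @ Xc / T
             for k in range(n_lags)]
        blocks = [[C[i - j] if i >= j else C[j - i].T for j in range(n_lags)]
                  for i in range(n_lags)]
        return np.block(blocks)

    X = np.random.default_rng(10).normal(size=(200, 3))  # one subject's series
    print(block_toeplitz(X, n_lags=3).shape)             # (9, 9)
    ```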

  9. Small-Chamber Measurements of Chemical-Specific Emission Factors for Drywall

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maddalena, Randy; Russell, Marion; Apte, Michael G.

    2010-06-01

    Imported drywall installed in U.S. homes is suspected of being a source of odorous and potentially corrosive indoor pollutants. To support an investigation of those building materials by the Consumer Products Safety Commission (CPSC), Lawrence Berkeley National Laboratory (LBNL) measured chemical-specific emission factors for 30 samples of drywall materials. Emission factors are reported for 75 chemicals and 30 different drywall samples encompassing both domestic and imported stock and incorporating natural, synthetic, or mixed gypsum core material. CPSC supplied all drywall materials. First the drywall samples were isolated and conditioned in dedicated chambers, then they were transferred to small chambers where emission testing was performed. Four sampling and analysis methods were utilized to assess (1) volatile organic compounds, (2) low molecular weight carbonyls, (3) volatile sulfur compounds, and (4) reactive sulfur gases. LBNL developed a new method that combines the use of solid phase microextraction (SPME) with small emission chambers to measure the reactive sulfur gases, then extended that technique to measure the full suite of volatile sulfur compounds. The testing procedure and analysis methods are described in detail herein. Emission factors were measured under a single set of controlled environmental conditions. The results are compared graphically for each method and in detailed tables for use in estimating indoor exposure concentrations.

  10. Factor Structure of the Kessler Psychological Distress Scale (K6) among Emerging Adults

    ERIC Educational Resources Information Center

    Bessaha, Melissa L.

    2017-01-01

    Objective: Confirmatory factor analysis was used to assess the factor structure of the 6-item version of the Kessler Psychological Distress Scale (K6). Methods: A subsample of emerging adults, aged 18-29 (n = 20,699), from the 2013 National Survey of Drug Use and Health were used in this study. Results: Each of the models (one-factor, two-factor…

  11. An Efficient Taguchi Approach for the Performance Optimization of Health, Safety, Environment and Ergonomics in Generation Companies.

    PubMed

    Azadeh, Ali; Sheikhalishahi, Mohammad

    2015-06-01

    A unique framework for performance optimization of generation companies (GENCOs) based on health, safety, environment, and ergonomics (HSEE) indicators is presented. To rank this sector of industry, a combination of data envelopment analysis (DEA), principal component analysis (PCA), and the Taguchi method is used for all branches of GENCOs. These methods are applied in an integrated manner to measure the performance of GENCOs. The preferred model among DEA, PCA, and Taguchi is selected based on sensitivity analysis and maximum correlation between rankings. To achieve the stated objectives, noise is introduced into the input data. The results show that Taguchi outperforms the other methods. Moreover, a comprehensive experiment is carried out to identify the most influential factor for ranking GENCOs. The approach developed in this study could be used for continuous assessment and improvement of GENCOs' performance in supplying energy with respect to HSEE factors. The results of such studies would help managers to gain a better understanding of weak and strong points in terms of HSEE factors.

  12. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, and, more recently, in the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in an image retrieval method and of correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.

  13. Research on the relationship between the elements and pharmacological activities in velvet antler using factor analysis and cluster analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Libing

    2017-04-01

    Velvet antler has a certain effect on improving the body's immune cells and on regulating immune system function, the nervous system, anti-stress and anti-aging responses, and osteoporosis. It has medicinal applications in a wide range of conditions such as tissue wound healing, tumors and cardiovascular disease. Research on the relationship between the pharmacological activities and the elements in velvet antler is therefore of great significance. The objective of this study was to comprehensively evaluate 15 elements in different varieties of velvet antler and to study the relationship between these elements and the efficacy of traditional Chinese medicine for humans. Factor analysis and factor cluster analysis were used to analyze the elemental data of sika velvet antler, cervus elaphus linnaeus, flower horse hybrid velvet antler, apiti (elk) velvet antler and male reindeer velvet antler, and to find the relationships among 15 elements: Ca, P, Mg, Na, K, Fe, Cu, Mn, Al, Ba, Co, Sr, Cr, Zn and Ni. Using MATLAB 2010 and SPSS software, chemometric methods were applied to the relationship between the elements in velvet antler and its pharmacological activities. The first common factor F1 loaded heavily on Ca, P, Mg, Co, Sr and Ni; the second common factor F2 on K, Mn, Zn and Cr; the third common factor F3 on Na, Cu and Ba; and the fourth common factor F4 on Fe and Al. The overall elemental content ranked: elk velvet antler > flower horse hybrid velvet antler > cervus elaphus linnaeus > sika velvet antler > male reindeer velvet antler. Based on the factor analysis and the factor cluster analysis, a model for evaluating traditional Chinese medicine quality was constructed. These studies provide a scientific base and theoretical foundation for the future large-scale rational development of velvet antler resources, as well as for the relationship between the elements and the efficacy of traditional Chinese medicine for humans.

  14. Clinical Trials With Large Numbers of Variables: Important Advantages of Canonical Analysis.

    PubMed

    Cleophas, Ton J

    2016-01-01

    Canonical analysis assesses the combined effects of a set of predictor variables on a set of outcome variables, but it is little used in clinical trials despite the omnipresence of multiple variables. The aim of this study was to assess the performance of canonical analysis as compared with traditional multivariate methods using multivariate analysis of covariance (MANCOVA). As an example, a simulated data file with 12 gene expression levels and 4 drug efficacy scores was used. The correlation coefficient between the 12 predictor and 4 outcome variables was 0.87 (P = 0.0001), meaning that 76% of the variability in the outcome variables was explained by the 12 covariates. Repeated testing after the removal of 5 unimportant predictor and 1 outcome variable produced virtually the same overall result. The MANCOVA identified identical unimportant variables, but it was unable to provide overall statistics. (1) Canonical analysis is remarkable, because it can handle many more variables than traditional multivariate methods such as MANCOVA can. (2) At the same time, it accounts for the relative importance of the separate variables, their interactions and differences in units. (3) Canonical analysis provides overall statistics of the effects of sets of variables, whereas traditional multivariate methods only provide the statistics of the separate variables. (4) Unlike other methods for combining the effects of multiple variables such as factor analysis/partial least squares, canonical analysis is scientifically entirely rigorous. (5) Limitations include that it is less flexible than factor analysis/partial least squares, because only 2 sets of variables are used and because multiple solutions instead of one are offered. We do hope that this article will stimulate clinical investigators to start using this remarkable method.
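
    A brief sketch of the core computation, on simulated data shaped like the example above (12 predictors, 4 outcomes); sklearn's CCA stands in here for the canonical analysis reported, and the data-generating values are illustrative:

    ```python
    # Canonical correlation between a predictor set and an outcome set.
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(11)
    n = 150
    latent = rng.normal(size=(n, 2))                                       # shared structure
    X = latent @ rng.normal(size=(2, 12)) + 0.5 * rng.normal(size=(n, 12)) # 12 predictors
    Y = latent @ rng.normal(size=(2, 4)) + 0.5 * rng.normal(size=(n, 4))   # 4 outcomes

    cca = CCA(n_components=2).fit(X, Y)
    U, V = cca.transform(X, Y)
    canonical_corrs = [np.corrcoef(U[:, k], V[:, k])[0, 1] for k in range(2)]
    print(np.round(canonical_corrs, 2))   # overall association between the two sets
    ```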

  15. Discriminatory Analysis II. Factor Analysis and Discrimination

    DTIC Science & Technology

    1950-10-01

    ...present in a given test. The last phase of factor analysis is concerned with assigning psychological significance to the factors extracted from the tests... However, as Hotelling's procedure is an iterative one, it can be stopped after the extraction of any number of components if one deems... one need not extract all p factors, and one can factor R1 by this method as well as R. However, the fact remains that the components do not have...

  16. Human Modeling for Ground Processing Human Factors Engineering Analysis

    NASA Technical Reports Server (NTRS)

    Stambolian, Damon B.; Lawrence, Brad A.; Stelges, Katrine S.; Steady, Marie-Jeanne O.; Ridgwell, Lora C.; Mills, Robert E.; Henderson, Gena; Tran, Donald; Barth, Tim

    2011-01-01

    There have been many advancements and accomplishments over the last few years using human modeling for human factors engineering analysis in spacecraft design. The key methods used for this are motion capture and computer-generated human models. The focus of this paper is to explain the human modeling currently used at Kennedy Space Center (KSC), and to explain the plans for human modeling for future spacecraft designs.

  17. Psychometric Properties of the Persian Version of the Social Anxiety - Acceptance and Action Questionnaire.

    PubMed

    Soltani, Esmail; Bahrainian, Seyed Abdolmajid; Masjedi Arani, Abbas; Farhoudian, Ali; Gachkar, Latif

    2016-06-01

    Social anxiety disorder is often related to specific impairment or distress in different areas of life, including occupational, social, and family settings. The purpose of the present study was to examine the psychometric properties of the Persian version of the Social Anxiety - Acceptance and Action Questionnaire (SA-AAQ) in university students. In this descriptive cross-sectional study, 324 students from Shahid Beheshti University of Medical Sciences were recruited via cluster sampling in 2015. Factor analysis by the principal component analysis method, internal consistency analysis, and convergent and divergent validity analyses were conducted to examine the validity of the SA-AAQ. To assess the reliability of the SA-AAQ, Cronbach's alpha and test-retest reliability were used. Factor analysis by the principal component analysis method yielded three factors, named acceptance, action, and non-judging of experience. The three-factor solution explained 51.82% of the variance. Evidence for the internal consistency of the SA-AAQ was obtained by calculating correlations between the SA-AAQ and its subscales. Support for the convergent and discriminant validity of the SA-AAQ was obtained via its correlations with the Acceptance and Action Questionnaire - II, Social Interaction Anxiety Scale, Cognitive Fusion Questionnaire, Believability of Anxious Feelings and Thoughts Questionnaire, Valued Living Questionnaire, and WHOQOL-BREF. The reliability of the SA-AAQ via Cronbach's alpha and test-retest coefficients yielded values of 0.84 and 0.84, respectively. The Iranian version of the SA-AAQ has acceptable psychometric properties in university students and is a valid and reliable measure for use in research investigations and therapeutic interventions.

  18. Fighting for Intelligence: A Brief Overview of the Academic Work of John L. Horn

    PubMed Central

    McArdle, John J.; Hofer, Scott M.

    2015-01-01

    John L. Horn (1928–2006) was a pioneer in multivariate thinking and the application of multivariate methods to research on intelligence and personality. His key works on individual differences in the methodological areas of factor analysis and the substantive areas of cognition are reviewed here. John was also our mentor, teacher, colleague, and friend. We overview John Horn’s main contributions to the field of intelligence by highlighting 3 issues about his methods of factor analysis and 3 of his substantive debates about intelligence. We first focus on Horn’s methodological demonstrations describing (a) the many uses of simulated random variables in exploratory factor analysis; (b) the exploratory uses of confirmatory factor analysis; and (c) the key differences between states, traits, and trait-changes. On a substantive basis, John believed that there were important individual differences among people in terms of cognition and personality. These sentiments led to his intellectual battles about (d) Spearman’s g theory of a unitary intelligence, (e) Guilford’s multifaceted model of intelligence, and (f) the Schaie and Baltes approach to defining the lack of decline of intelligence earlier in the life span. We conclude with a summary of John Horn’s unique approaches to dealing with common issues. PMID:26246642

  19. Rasch analysis and impact factor methods both yield valid and comparable measures of health status in interstitial lung disease.

    PubMed

    Patel, Amit S; Siegert, Richard J; Bajwah, Sabrina; Brignall, Kate; Gosker, Harry R; Moxham, John; Maher, Toby M; Renzoni, Elisabetta A; Wells, Athol U; Higginson, Irene J; Birring, Surinder S

    2015-09-01

    Rasch analysis has largely replaced impact factor methodology for developing health status measures. The aim of this study was to develop a health status questionnaire for patients with interstitial lung disease (ILD) using impact factor methodology and to compare its validity with that of another version developed using Rasch analysis. A preliminary 71-item questionnaire was developed and evaluated in 173 patients with ILD. Items were reduced by the impact factor method (King's Brief ILD questionnaire, KBILD-I) and by Rasch analysis (KBILD-R). Both questionnaires were validated by assessing their relationship with forced vital capacity (FVC) and the St George's Respiratory Questionnaire (SGRQ) and by evaluating internal reliability, repeatability, and longitudinal responsiveness. The KBILD-R and KBILD-I comprised 15 items each. The content of eight items differed between the KBILD-R and KBILD-I. Internal and test-retest reliability were good for the total scores of both questionnaires. There was a good relationship with the SGRQ and a moderate relationship with FVC for both questionnaires. Effect sizes were comparable. Both questionnaires discriminated between patients with differing disease severity. Despite considerable differences in the content of retained items, both the KBILD-R and KBILD-I questionnaires demonstrated acceptable measurement properties and performed comparably in a clinical setting. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Thinking beyond Opisthorchis viverrini for risk of cholangiocarcinoma in the lower Mekong region: a systematic review and meta-analysis.

    PubMed

    Steele, Jennifer A; Richter, Carsten H; Echaubard, Pierre; Saenna, Parichat; Stout, Virginia; Sithithaworn, Paiboon; Wilcox, Bruce A

    2018-05-17

    Cholangiocarcinoma (CCA) is a fatal bile duct cancer associated with infection by the liver fluke, Opisthorchis viverrini, in the lower Mekong region. Numerous public health interventions have focused on reducing exposure to O. viverrini, but the incidence of CCA in the region remains high. While this may indicate the inefficacy of public health interventions due to complex social and cultural factors, it may also indicate that other risk factors, or interactions with the parasite, are important in the pathogenesis of CCA. This systematic review aims to provide a comprehensive analysis of described risk factors for CCA in addition to O. viverrini to guide future integrative interventions. We searched five international and seven Thai research databases to identify studies relevant to risk factors for CCA in the lower Mekong region. Selected studies were assessed for risk of bias and quality in terms of study design, population, CCA diagnostic methods, and statistical methods. The final 18 included studies reported numerous risk factors, which were grouped into behaviors, socioeconomics, diet, genetics, gender, immune response, other infections, and treatment for O. viverrini. Seventeen risk factors were reported by two or more studies and were assessed with random effects models during meta-analysis. This meta-analysis indicates that the combination of alcohol and smoking (OR = 11.1, 95% CI: 5.63-21.92, P < 0.0001) is most significantly associated with increased risk for CCA and is an even greater risk factor than O. viverrini exposure. This analysis also suggests that family history of cancer, consumption of raw cyprinoid fish, consumption of high-nitrate foods, and praziquantel treatment are associated with significantly increased risk. These risk factors may have complex relationships with the host, parasite, or pathogenesis of CCA, and many of them were found to interact with each other in one or more studies. Our findings suggest that a complex variety of risk factors in addition to O. viverrini infection should be addressed in future public health interventions to reduce CCA in affected regions. In particular, smoking and alcohol use, dietary patterns, and socioeconomic factors should be considered when developing intervention programs to reduce CCA.
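
    A minimal sketch of the random-effects pooling named above, using the DerSimonian-Laird estimator in plain numpy. The three study odds ratios and confidence intervals are invented inputs for illustration, not values extracted from the reviewed studies.

      import numpy as np

      or_est = np.array([8.0, 12.5, 10.0])        # hypothetical study ORs
      ci_low = np.array([4.0, 5.5, 4.8])          # hypothetical 95% CI bounds
      ci_high = np.array([16.0, 28.4, 20.8])

      y = np.log(or_est)                          # work on the log-OR scale
      se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
      v = se**2

      w = 1 / v                                   # fixed-effect weights
      y_fe = np.sum(w * y) / np.sum(w)
      Q = np.sum(w * (y - y_fe) ** 2)             # Cochran's Q heterogeneity
      k = len(y)
      tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

      w_re = 1 / (v + tau2)                       # random-effects weights
      y_re = np.sum(w_re * y) / np.sum(w_re)
      se_re = 1 / np.sqrt(np.sum(w_re))
      lo, hi = np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re)
      print(f"pooled OR = {np.exp(y_re):.2f} (95% CI {lo:.2f}-{hi:.2f}), tau2 = {tau2:.3f}")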

  1. Method and apparatus for determining material structural integrity

    DOEpatents

    Pechersky, Martin

    1996-01-01

    A non-destructive method and apparatus for determining the structural integrity of materials by combining laser vibrometry with damping analysis techniques to determine the damping loss factor of a material. The method comprises the steps of vibrating the area being tested over a known frequency range and measuring vibrational force and velocity as a function of time over the known frequency range. Vibrational velocity is preferably measured by a laser vibrometer. Measurement of the vibrational force depends on the vibration method. If an electromagnetic coil is used to vibrate a magnet secured to the area being tested, then the vibrational force is determined by the amount of coil current used in vibrating the magnet. If a reciprocating transducer is used to vibrate a magnet secured to the area being tested, then the vibrational force is determined by a force gauge in the reciprocating transducer. Using known vibrational analysis methods, a plot of the drive point mobility of the material over the preselected frequency range is generated from the vibrational force and velocity measurements. The damping loss factor is derived from a plot of the drive point mobility over the preselected frequency range using the resonance dwell method and compared with a reference damping loss factor for structural integrity evaluation.
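
    The patent derives the damping loss factor from a drive-point mobility plot via the resonance dwell method; as a simpler stand-in, the sketch below applies the classic half-power (3 dB) bandwidth estimate, eta = (f2 - f1)/fn, to a synthetic single-resonance mobility curve. The resonance frequency, loss factor, and curve shape are all illustrative assumptions, not the patent's procedure.

      import numpy as np

      fn, eta_true = 120.0, 0.02                  # assumed resonance (Hz) and loss factor
      f = np.linspace(80, 160, 4001)
      # Mobility magnitude of a single-DOF system with hysteretic damping (shape only).
      mob = f / np.sqrt((fn**2 - f**2) ** 2 + (eta_true * fn**2) ** 2)

      i_pk = np.argmax(mob)
      half = mob[i_pk] / np.sqrt(2)               # -3 dB (half-power) level

      # Interpolate the half-power crossing frequency on each side of the peak.
      left = np.interp(half, mob[: i_pk + 1], f[: i_pk + 1])
      right = np.interp(half, mob[i_pk:][::-1], f[i_pk:][::-1])
      eta_est = (right - left) / f[i_pk]
      print(f"estimated loss factor: {eta_est:.4f} (true {eta_true})")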

  2. Data for factor analysis of hydro-geochemical characteristics of groundwater resources in Iranshahr.

    PubMed

    Biglari, Hamed; Saeidi, Mehdi; Karimyan, Kamaleddin; Narooie, Mohammad Reza; Sharafi, Hooshmand

    2018-08-01

    Detection of hydrogeological and hydro-geochemical changes affecting the quality of aquifer water is very important. The aim of this study was to perform a factor analysis of the hydro-geochemical characteristics of Iranshahr groundwater resources during the warm and cool seasons. In this study, 248 samples (with two repeated measurements) of groundwater resources were collected by cluster-random sampling during 2017 in the villages of Iranshahr city. After transferring the samples to the laboratory, the concentrations of 13 important chemical parameters were determined according to standard methods for water and wastewater. The results indicated that 45.45% and 55.55% of the correlations between parameters showed a significant decrease and increase, respectively, in the transition from warm to cold seasons. According to the factor analysis, three factors (land hydro-geochemical processes, recharge of the resources by surface water and sewage, and human activities) were identified as influencing the chemical composition of these resources. The largest increase in correlation, 0.37, was observed between phosphate and nitrate ions, while the largest decrease, -0.33, was seen between fluoride ion and both calcium and chloride ions. A significant increase in the correlation between magnesium and nitrate ions from warm to cold seasons also indicates a strong seasonal effect on the relation between these two parameters.

  3. Study on the Influence of Elevation of Tailing Dam on Stability

    NASA Astrophysics Data System (ADS)

    Wan, Shuai; Wang, Kun; Kong, Songtao; Zhao, Runan; Lan, Ying; Zhang, Run

    2017-12-01

    This paper takes a tailings dam in Yunnan as its study object and uses theoretical analysis and numerical calculation to examine the effect of elevation on the stability of the dam under seismic load, analysing the dynamic safety factor and the liquefaction area. The simplified Bishop method is adopted to calculate the dynamic safety factor, and the liquefaction area is analysed by a shear-stress comparison method, yielding the influence of elevation on dam stability. Under earthquake loading, the safety factor of the dam body decreases as elevation increases, and shallow tailings are susceptible to liquefaction. The liquefaction area is concentrated mainly in the bank below the water surface. These results provide a scientific basis for the design and safety management of tailings dams.
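
    A minimal sketch of the simplified Bishop computation mentioned above: because the safety factor FS appears on both sides of the slice-equilibrium equation, it is found by fixed-point iteration over the slices. The slice weights, base angles, pore pressures, and soil strengths below are invented for illustration, not the Yunnan dam's data.

      import numpy as np

      # Hypothetical slices: weight W (kN/m), base angle alpha, base width b (m),
      # pore pressure u (kPa), and soil strength parameters c' (kPa), phi'.
      W = np.array([150.0, 260.0, 300.0, 240.0, 120.0])
      alpha = np.radians([-5.0, 8.0, 20.0, 32.0, 45.0])
      b = np.full(5, 2.0)
      u = np.array([10.0, 25.0, 30.0, 20.0, 5.0])
      c, phi = 12.0, np.radians(28.0)

      fs = 1.0                                    # initial guess
      for _ in range(50):
          m_a = np.cos(alpha) * (1 + np.tan(alpha) * np.tan(phi) / fs)
          resisting = np.sum((c * b + (W - u * b) * np.tan(phi)) / m_a)
          driving = np.sum(W * np.sin(alpha))
          fs_new = resisting / driving
          if abs(fs_new - fs) < 1e-6:
              break
          fs = fs_new
      print(f"Bishop simplified safety factor: {fs:.3f}")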

  4. A needs analysis method for land-use planning of illegal dumping sites: a case study in Aomori-Iwate, Japan.

    PubMed

    Ishii, Kazuei; Furuichi, Toru; Nagao, Yukari

    2013-02-01

    Land use at contaminated sites, following remediation, is often needed for regional redevelopment. However, few methods exist for developing economically and socially feasible land-use plans based on regional needs, because of the wide variety of land-use requirements. This study proposes a new needs analysis method for the conceptual land-use planning of contaminated sites and illustrates the method with a case study of an illegal dumping site for hazardous waste. In this method, planning factors consisting of land-use attributes and related facilities are extracted from the potential needs of residents through a preliminary questionnaire. Using the extracted attributes and related facilities, land-use cases are designed for selection-based conjoint analysis. A second questionnaire, administered to respondents of the first who indicated an interest in further participation, is used for the conjoint analysis to determine the utility function and marginal cost of each attribute, so that the planning factors can be prioritized in developing a quantitative, economically and socially feasible land-use plan. Based on the results, site-specific land-use alternatives are developed and evaluated using the utility function obtained from the conjoint analysis. In this case study of an illegal dumping site for hazardous waste, the uses preferred as part of a conceptual land-use plan following remediation were (1) agricultural land and a biogas plant designed to recover energy from biomass or (2) a park with a welfare facility and an athletic field. Our needs analysis method with conjoint analysis is applicable to the development of conceptual land-use plans for similar sites following remediation, particularly when added value is considered. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Poster — Thur Eve — 03: Application of the non-negative matrix factorization technique to [{sup 11}C]-DTBZ dynamic PET data for the early detection of Parkinson's disease

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Dong-Chang; Jans, Hans; McEwan, Sandy

    2014-08-15

    In this work, a class of non-negative matrix factorization (NMF) technique known as alternating non-negative least squares, combined with the projected gradient method, is used to analyze twenty-five [{sup 11}C]-DTBZ dynamic PET/CT brain data sets. For each subject, a two-factor model is assumed and two factors representing the striatum (factor 1) and the non-striatum (factor 2) tissues are extracted using the proposed NMF technique and the commercially available factor analysis software “Pixies”. The extracted factor 1 and 2 curves represent the binding site of the radiotracer and describe the uptake and clearance of the radiotracer by soft tissues in the brain, respectively. The proposed NMF technique uses prior information about the dynamic data to obtain sample time-activity curves representing the striatum and the non-striatum tissues. These curves are then used for “warm” starting the optimization. Factor solutions from the two methods are compared graphically and quantitatively. In healthy subjects, radiotracer uptake by factors 1 and 2 is approximately 35–40% and 60–65%, respectively. The solutions are also used to develop a factor-based metric for the detection of early, untreated Parkinson's disease. The metric stratifies healthy subjects from suspected Parkinson's patients (based on the graphical method). The analysis shows that both techniques produce comparable results with similar computational time. The “semi-automatic” approach used by the NMF technique allows clinicians to manually set a starting condition for “warm” starting the optimization in order to facilitate control and efficient interaction with the data.
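
    A minimal sketch of the alternating non-negative least squares idea described above, solving each subproblem exactly with scipy's NNLS rather than the projected gradient step used in the study; the random initial W stands in for the "warm" start from prior time-activity curves. All matrices are synthetic, not PET data.

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(1)
      W_true = np.abs(rng.normal(size=(30, 2)))   # 30 time frames, 2 factors
      H_true = np.abs(rng.normal(size=(2, 100)))  # 100 voxels
      X = W_true @ H_true + 0.01 * rng.random((30, 100))

      W = np.abs(rng.normal(size=(30, 2)))        # a "warm" start would replace this
      H = np.zeros((2, 100))
      for _ in range(30):
          # Fix W, solve one NNLS problem per voxel column for H.
          for j in range(X.shape[1]):
              H[:, j], _ = nnls(W, X[:, j])
          # Fix H, solve one NNLS problem per time-frame row for W.
          for i in range(X.shape[0]):
              W[i, :], _ = nnls(H.T, X[i, :])

      print("relative reconstruction error:",
            np.linalg.norm(X - W @ H) / np.linalg.norm(X))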

  6. Method of Testing and Predicting Failures of Electronic Mechanical Systems

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Patterson-Hine, Frances A.

    1996-01-01

    A method employing a knowledge base of human expertise, comprising a reliability model analysis implemented for diagnostic routines, is disclosed. The reliability analysis comprises digraph models that determine target events created by hardware failures, human actions, and other factors affecting system operation. The reliability analysis contains a wealth of human expertise that is used to build automatic diagnostic routines and provides a knowledge base that can be used to solve other artificial intelligence problems.

  7. Multiscale Characterization of PM2.5 in Southern Taiwan based on Noise-assisted Multivariate Empirical Mode Decomposition and Time-dependent Intrinsic Correlation

    NASA Astrophysics Data System (ADS)

    Hsiao, Y. R.; Tsai, C.

    2017-12-01

    As the WHO Air Quality Guideline indicates, ambient air pollution exposes world populations to the threat of fatal illnesses (e.g. heart disease, lung cancer, asthma), raising concerns about air pollution sources and related factors. This study presents a novel approach to investigating the multiscale variations of PM2.5 in southern Taiwan over the past decade, together with four meteorological influencing factors (temperature, relative humidity, precipitation and wind speed), based on the Noise-assisted Multivariate Empirical Mode Decomposition (NAMEMD) algorithm, Hilbert Spectral Analysis (HSA) and the Time-dependent Intrinsic Correlation (TDIC) method. The NAMEMD algorithm is a fully data-driven approach designed for nonlinear and nonstationary multivariate signals, and is used to decompose multivariate signals into a collection of channels of Intrinsic Mode Functions (IMFs). The TDIC method is an EMD-based method that uses a set of sliding window sizes to quantify localized correlation coefficients for multiscale signals. With the alignment property and quasi-dyadic filter bank of the NAMEMD algorithm, one can produce the same number of IMFs for all variables and estimate cross correlations more accurately. The spectral representation of the NAMEMD-HSA method is compared with Complementary Ensemble Empirical Mode Decomposition/Hilbert Spectral Analysis (CEEMD-HSA) and wavelet analysis, and the NAMEMD-based TDIC analysis is compared with CEEMD-based TDIC analysis and traditional correlation analysis.

  8. Factor Structure and Psychometric Properties of the Brief Illness Perception Questionnaire in Turkish Cancer Patients

    PubMed Central

    Karataş, Tuğba; Özen, Şükrü; Kutlutürkan, Sevinç

    2017-01-01

    Objective: The main aim of this study was to investigate the factor structure and psychometric properties of the Brief Illness Perception Questionnaire (BIPQ) in Turkish cancer patients. Methods: This methodological study involved 135 cancer patients. Statistical methods included confirmatory or exploratory factor analysis and Cronbach alpha coefficients for internal consistency. Results: The values of fit indices are within the acceptable range. The alpha coefficients for emotional illness representations, cognitive illness representations, and total scale are 0.83, 0.80, and 0.85, respectively. Conclusions: The results confirm the two-factor structure of the Turkish BIPQ and demonstrate its reliability and validity. PMID:28217734

  9. [Methods of a posteriori identification of food patterns in Brazilian children: a systematic review].

    PubMed

    Carvalho, Carolina Abreu de; Fonsêca, Poliana Cristina de Almeida; Nobre, Luciana Neri; Priore, Silvia Eloiza; Franceschini, Sylvia do Carmo Castro

    2016-01-01

    The objective of this study is to provide guidance for identifying dietary patterns using the a posteriori approach and to analyze the methodological aspects of studies conducted in Brazil that identified dietary patterns of children. Articles were selected from the Latin American and Caribbean Literature on Health Sciences, Scientific Electronic Library Online, and PubMed databases. The key words were: dietary pattern; food pattern; principal components analysis; factor analysis; cluster analysis; reduced rank regression. We included studies that identified dietary patterns of children using the a posteriori approach. Seven studies published between 2007 and 2014 were selected, six of which were cross-sectional and one a cohort study. Five studies used a food frequency questionnaire for dietary assessment; one used a 24-hour dietary recall and another a food list. The exploratory method used in most publications was principal components factor analysis, followed by cluster analysis. The sample sizes of the studies ranged from 232 to 4231, the values of the Kaiser-Meyer-Olkin test from 0.524 to 0.873, and Cronbach's alpha from 0.51 to 0.69. Few Brazilian studies have identified dietary patterns of children using the a posteriori approach, and principal components factor analysis was the technique most used.

  10. A Confirmatory Factor Analysis of the Structure of Abbreviated Math Anxiety Scale

    PubMed Central

    Farrokhi, Farahman

    2011-01-01

    Objective The aim of this study is to explore the confirmatory factor analysis results of the Persian adaptation of Abbreviated Math Anxiety Scale (AMAS), proposed by Hopko, Mahadevan, Bare & Hunt. Method The validity and reliability assessments of the scale were performed on 298 college students chosen randomly from Tabriz University in Iran. The confirmatory factor analysis (CFA) was carried out to determine the factor structures of the Persian version of AMAS. Results As expected, the two-factor solution provided a better fit to the data than a single factor. Moreover, multi-group analyses showed that this two-factor structure was invariant across sex. Hence, AMAS provides an equally valid measure for use among college students. Conclusions Brief AMAS demonstrates adequate reliability and validity. The AMAS scores can be used to compare symptoms of math anxiety between male and female students. The study both expands and adds support to the existing body of math anxiety literature. PMID:22952521

  11. Analysis of Tourists Preferences on Smart Tourism in Yogyakarta (Case: Vredeburg Fort Museum)

    NASA Astrophysics Data System (ADS)

    Amanda, Rima; Santosa, PInsap; Nur Rizal, M.

    2018-04-01

    Smart tourism supports individual tourism through comprehensive, integrated information and technology services. An educational tourist destination such as a museum is expected to offer an informative and interactive atmosphere. Vredeburg Fort Museum, one of the tourist destinations in Yogyakarta, has begun to lose visitors. The public's lack of interest in the museum, and the perception that it is an old, poorly maintained, and boring place, are the main obstacles to attracting tourists. This research aims to find the important factors underlying tourists' preferences for visiting the Vredeburg Museum in Yogyakarta. The research method used is Principal Component Analysis. The analysis shows that there are four main factors with eigenvalues greater than 1: the first factor at 8.623, the second at 1.920, the third at 1.175, and the fourth at 1.082. These four factors result from the grouping of 20 preference determinant variables.
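
    A minimal sketch of the eigenvalue-greater-than-one (Kaiser) retention rule applied above: eigenvalues of the item correlation matrix are computed and counted. The 20 simulated variables stand in for the questionnaire's preference items; the underlying factor structure is an assumption for illustration.

      import numpy as np

      rng = np.random.default_rng(2)
      latent = rng.normal(size=(300, 4))          # 4 underlying preference factors
      loadings = rng.normal(size=(4, 20))
      items = latent @ loadings + rng.normal(size=(300, 20))

      R = np.corrcoef(items, rowvar=False)        # 20 x 20 item correlation matrix
      eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
      n_retain = int(np.sum(eigvals > 1.0))
      print("leading eigenvalues:", np.round(eigvals[:6], 3), "-> retain", n_retain)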

  12. Generalized Structured Component Analysis with Uniqueness Terms for Accommodating Measurement Error

    PubMed Central

    Hwang, Heungsun; Takane, Yoshio; Jung, Kwanghee

    2017-01-01

    Generalized structured component analysis (GSCA) is a component-based approach to structural equation modeling (SEM), where latent variables are approximated by weighted composites of indicators. It has no formal mechanism to incorporate errors in indicators, which in turn renders components prone to the errors as well. We propose to extend GSCA to account for errors in indicators explicitly. This extension, called GSCAM, considers both common and unique parts of indicators, as postulated in common factor analysis, and estimates a weighted composite of indicators with their unique parts removed. Adding such unique parts or uniqueness terms serves to account for measurement errors in indicators in a manner similar to common factor analysis. Simulation studies are conducted to compare parameter recovery of GSCAM and existing methods. These methods are also applied to fit a substantively well-established model to real data. PMID:29270146

  13. Constraint factor graph cut-based active contour method for automated cellular image segmentation in RNAi screening.

    PubMed

    Chen, C; Li, H; Zhou, X; Wong, S T C

    2008-05-01

    Image-based, high-throughput genome-wide RNA interference (RNAi) experiments are increasingly carried out to facilitate the understanding of gene functions in intricate biological processes. Automated screening of such experiments generates a large number of images with great variations in image quality, which makes manual analysis unreasonably time-consuming. Therefore, effective techniques for automatic image analysis are urgently needed, of which segmentation is one of the most important steps. This paper proposes a fully automatic method for cell segmentation in genome-wide RNAi screening images. The method consists of two steps: nuclei and cytoplasm segmentation. Nuclei are extracted and labelled to initialize cytoplasm segmentation. Since the quality of RNAi images is rather poor, a novel scale-adaptive steerable filter is designed to enhance the image in order to extract the long and thin protrusions on spiky cells. Then, the constraint factor GCBAC method and morphological algorithms are combined into an integrated method to segment tightly clustered cells. Compared with the results obtained using seeded watershed and with the ground truth (manual labelling by experts on RNAi screening data), our method achieves higher accuracy. Compared with active contour methods, our method consumes much less time. These positive results indicate that the proposed method can be applied to automatic image analysis of multi-channel image screening data.

  14. Analysis of Influential Factors Associated with the Smoking Behavior of Aboriginal Schoolchildren in Remote Taiwanese Mountainous Areas

    ERIC Educational Resources Information Center

    Huang, Hsiao-Ling; Hsu, Chih-Cheng; Peng, Wu-Der; Yen, Yea-Yin; Chen, Ted; Hu, Chih-Yang; Shi, Hon-Yi; Lee, Chien-Hung; Chen, Fu-Li; Lin, Pi-Li

    2012-01-01

    Background: A disparity in smoking behavior exists between the general and minority populations residing in Taiwan's mountainous areas. This study analyzed individual and environmental factors associated with children's smoking behavior in these areas of Taiwan. Methods: In this school-based study, data on smoking behavior and related factors for…

  15. Mixed time integration methods for transient thermal analysis of structures

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1982-01-01

    The computational methods used to predict and optimize the thermal structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally useful method of estimating the critical time step for linear quadrilateral elements is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods to the fully implicit method or the fully explicit method is also demonstrated.

  16. Mixed time integration methods for transient thermal analysis of structures

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1983-01-01

    The computational methods used to predict and optimize the thermal-structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally useful method of estimating the critical time step for linear quadrilateral elements is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods to the fully implicit method or the fully explicit method is also demonstrated.
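
    A minimal sketch of the two algorithm classes reviewed above, for a 1-D transient conduction model: an explicit forward-Euler update, stable only below a critical time step (roughly dx^2/(2*alpha) for this stencil), and an unconditionally stable implicit backward-Euler solve. Mesh, material, and boundary values are illustrative assumptions, not the paper's finite element formulation.

      import numpy as np

      alpha, dx, nx = 1.0e-5, 0.01, 51            # diffusivity (m^2/s), grid spacing, nodes
      dt_crit = dx**2 / (2 * alpha)               # explicit stability limit
      dt = 0.5 * dt_crit

      T = np.zeros(nx)
      T[0] = 100.0                                # fixed hot end, initially cold rod

      def explicit_step(T):
          Tn = T.copy()                           # forward Euler on interior nodes
          Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
          return Tn

      def implicit_step(T):
          r = alpha * dt / dx**2                  # backward Euler: solve a linear system
          A = np.eye(nx) + r * (2 * np.eye(nx) - np.eye(nx, k=1) - np.eye(nx, k=-1))
          A[0, :], A[-1, :] = 0, 0
          A[0, 0] = A[-1, -1] = 1                 # Dirichlet boundary rows
          return np.linalg.solve(A, T)

      for _ in range(200):
          T = explicit_step(T)                    # swap in implicit_step(T) to compare
      print(f"critical explicit dt: {dt_crit:.3f} s, midpoint T: {T[nx // 2]:.3f}")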

  17. Evaluation of standardized and applied variables in predicting treatment outcomes of polytrauma patients.

    PubMed

    Aksamija, Goran; Mulabdic, Adi; Rasic, Ismar; Muhovic, Samir; Gavric, Igor

    2011-01-01

    Polytrauma is defined as injury affecting at least two different organ systems or body regions, with at least one injury being life-threatening. Given the multilevel model of care for polytrauma patients within KCUS, weaknesses in the management of this category of patients are inevitable. The aims were to determine the dynamics of existing procedures in the treatment of polytrauma patients on admission to KCUS and, based on statistical analysis of the applied variables, to determine and define the factors that influence the final outcome of treatment and their mutual relationships, with a view to eliminating flaws in the approach to the problem. The study was based on 263 polytrauma patients. Parametric and non-parametric statistical methods were used. Basic statistics were calculated and, building on the calculated parameters, multicorrelation analysis, image analysis, discriminant analysis and multifactorial analysis were used to achieve the research objectives. From the universe of variables we selected a sample of n = 25 variables, of which the first two are modular, while the others belong to the common measurement space (n = 23) and are defined in this paper as the system of variables for the methods, procedures and assessments of polytrauma patients. After the multicorrelation analysis, and since the image analysis gave reliable measurement results, we proceeded to the analysis of eigenvalues, that is, to defining the factors that provide information for solving the problems of the existing model and its correlation with treatment outcome. The study singled out the essential factors that determine the current organizational model of care, which may affect the treatment and improve the outcome of polytrauma patients. The analysis revealed the strongest correlative relationships between these practices and contributed to the development of guidelines defined by the isolated factors.

  18. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    PubMed

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square, but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and the Satorra-Bentler mean-scaled statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of the Satorra-Bentler statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic, in order to improve its robustness in small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at very small sample sizes offers superior Type I error rates under a properly specified model. Data from Mardia, Kent, and Bibby's study of students tested for their ability in five content areas, with examinations either open or closed book, are used to illustrate the real-world performance of this statistic.

  19. [On measuring of factors influencing the complex need for cultural entertainments of the inhabitants in geriatric nursing homes (3rd information) (author's transl)].

    PubMed

    Kuhlmey, J; Lautsch, E

    1980-01-01

    In our second report on the investigation of the need for cultural entertainment among inhabitants of geriatric nursing homes, we tested the influence of the factors age, sex, kind of work, and duration of stay in the home, singly and successively, for each single indicator of this complex need. In this third report, the influence of these four factors on the indicators was investigated under simultaneous consideration of the factors' mutual dependency. This mutual dependency was represented by typification (cluster analysis). The cluster analysis yielded classes in which similarly disposed inhabitants were grouped together. The average profile of each class was obtained, and differences between classes were analysed by the statistical methods of multidimensional analysis of variance and discriminant analysis.

  20. Examining the factor structure of MUIS-C scale among baby boomers with hepatitis C.

    PubMed

    Reinoso, Humberto; Türegün, Mehmet

    2016-11-01

    Baby boomers account for two out of every three cases of hepatitis C infection in the U.S. The aim of this study was to conduct an exploratory factor analysis supporting the use of the MUIS-C as a reliable instrument for measuring illness uncertainty among baby boomers with hepatitis C. The steps of a typical principal component analysis (PCA) with an oblique rotation were conducted on a sample of 146 participants; the sampling adequacy of items was examined via the Kaiser-Meyer-Olkin (KMO) measure, and Bartlett's sphericity test was used to assess the appropriateness of conducting a factor analysis. A two-factor structure was obtained by using Horn's parallel analysis method. The two factors explained a cumulative total of 45.8% of the variance. The results of the analyses indicated that the MUIS-C is a valid and reliable instrument, potentially suitable for use in the baby boomer population diagnosed with hepatitis C. Published by Elsevier Inc.
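
    A minimal sketch of Horn's parallel analysis as used above (and elsewhere in this collection): observed eigenvalues are retained only while they exceed the 95th percentile of eigenvalues obtained from random data of the same shape. The sample size echoes the study (n = 146), but the item data are simulated stand-ins, not MUIS-C responses.

      import numpy as np

      rng = np.random.default_rng(3)
      n_obs, n_items = 146, 23
      latent = rng.normal(size=(n_obs, 2))        # assume 2 true factors
      data = latent @ rng.normal(size=(2, n_items)) + rng.normal(size=(n_obs, n_items))

      obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

      n_sims = 500
      rand_eig = np.empty((n_sims, n_items))
      for s in range(n_sims):
          noise = rng.normal(size=(n_obs, n_items))
          rand_eig[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False)))[::-1]
      threshold = np.percentile(rand_eig, 95, axis=0)

      # Retain factors while the observed eigenvalue beats the random benchmark.
      n_retain = int(np.argmax(obs_eig <= threshold))
      print("retain", n_retain, "factors")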

  1. Risk Assessment of Groundwater Contamination: A Multilevel Fuzzy Comprehensive Evaluation Approach Based on DRASTIC Model

    PubMed Central

    Zhang, Yan; Zhong, Ming

    2013-01-01

    Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, impacted by different geological and hydrological factors. In order to deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach that integrates the analytic hierarchy process and fuzzy comprehensive evaluation. First, the risk factors of groundwater contamination are identified by the source-pathway-receptor-consequence method, and a corresponding index system for risk assessment based on the DRASTIC model is established. Given the complexity of the transitions between possible pollution risks and the uncertainties of the factors, the analytic hierarchy process is applied to determine the weight of each factor, and fuzzy set theory is adopted to calculate the membership degree of each factor. Finally, a case study is presented to illustrate and test the methodology. It is concluded that the proposed approach integrates the advantages of both the analytic hierarchy process and fuzzy comprehensive evaluation, providing a more flexible and reliable way to deal with the linguistic and mechanistic uncertainty in groundwater contamination without losing important information. PMID:24453883
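
    A minimal sketch of the analytic-hierarchy-process weighting step described above: factor weights are taken from the principal eigenvector of a pairwise comparison matrix, with a consistency ratio checking the judgments. The 3 x 3 comparison matrix is an invented example, not the study's DRASTIC factors.

      import numpy as np

      # Saaty-style pairwise comparisons among three hypothetical risk factors.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)                 # principal eigenpair
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                                # normalized priority weights

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
      ri = 0.58                                   # Saaty random index for n = 3
      print("weights:", np.round(w, 3), "consistency ratio:", round(ci / ri, 3))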

  2. An Examination of the Nature of Erotic Talk.

    PubMed

    Jonason, Peter K; Betteridge, Gabrielle L; Kneebone, Ian I

    2016-01-01

    Using a mixed-methods study, we provided the first systematic documentation and exploration of erotic talk. In Study 1 (N = 95), participants provided 569 erotic talk statements in an anonymous online survey, which we classified, using a modified thematic analysis, as being representative of eight themes. In Study 2 (N = 238), we quantified individual differences in these themes, subjected them to factor analysis, and examined the nomological network surrounding them with measures of relationship and sexual satisfaction, sociosexuality, and personality. The eight initial categories represented two higher order factors, which we call individualist talk and mutualistic talk. These factors were orthogonal in factor analysis and distinct in their nomological network. While the majority of people reported using erotic talk, we found few sex differences in its use.

  3. [Priorization of facilitators for the implementation of medication review with follow-up service in Spanish community pharmacies through exploratory factor analysis].

    PubMed

    Gil, Modesta Inmaculada; Benrimoj, Shalom Isaac; Martínez-Martínez, Fernando; Cardero, Manuel; Gastelurrutia, Miguel Ángel

    2013-01-01

    To prioritize facilitators, previously identified in Spain, for the implementation of new pharmaceutical services, in order to design strategies for the implementation of the Medication Review with Follow-up (MRFup) service. Exploratory factor analysis (EFA). A draft questionnaire was produced based on a previous literature review and following the RAND/UCLA methodology. An expert panel worked with it and generated a definitive questionnaire which, after piloting, was administered to a representative sample of pharmacists, owners or staff members working in community pharmacy, using computer-assisted telephone interviewing (CATI). To understand the underlying constructs in the questionnaire, an EFA was performed. Different approaches were tested, such as principal components factor analysis and the principal axis factoring method. The best interpretability was achieved using principal axis factoring with direct oblimin rotation, which explained 40.0% of the total variance. This produced four factors, defined as: «Incentives», «External campaigns», «Expert in MRFup» and «Professionalism of the pharmacist». It can be stated that for the implementation and sustainability of the MRFup service, remuneration is necessary; the service must also be explained to health professionals and society in general. Practice of the MRFup service demands that pharmacists receive a more clinical education and assume more responsibilities as health professionals. Copyright © 2012 Elsevier España, S.L. All rights reserved.
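
    A minimal sketch of the pipeline above (principal axis factoring with direct oblimin rotation, preceded by a KMO adequacy check), written against the third-party factor_analyzer package. The package choice and the simulated questionnaire responses are assumptions; the study does not name its software.

      import numpy as np
      from factor_analyzer import FactorAnalyzer, calculate_kmo

      rng = np.random.default_rng(4)
      latent = rng.normal(size=(400, 4))          # assume 4 underlying constructs
      X = latent @ rng.normal(size=(4, 16)) + rng.normal(size=(400, 16))

      kmo_per_item, kmo_overall = calculate_kmo(X)      # sampling adequacy
      fa = FactorAnalyzer(n_factors=4, method="principal", rotation="oblimin")
      fa.fit(X)

      print(f"overall KMO: {kmo_overall:.2f}")
      # get_factor_variance() returns (variance, proportion, cumulative).
      print("proportion of variance per factor:",
            np.round(fa.get_factor_variance()[1], 3))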

  4. [Determination of five naphthaquinones in Arnebia euchroma by quantitative analysis multi-components with single-marker].

    PubMed

    Zhao, Wen-Wen; Wu, Zhi-Min; Wu, Xia; Zhao, Hai-Yu; Chen, Xiao-Qing

    2016-10-01

    This study aimed to determine five naphthaquinones (acetylshikonin, β-acetoxyisovalerylalkannin, isobutylshikonin, β,β'-dimethylacrylalkannin and α-methyl-n-butylshikonin) by quantitative analysis of multi-components with a single marker (QAMS). β,β'-Dimethylacrylalkannin was selected as the internal reference substance, and the relative correction factors (RCFs) of acetylshikonin, β-acetoxyisovalerylalkannin, isobutylshikonin and α-methyl-n-butylshikonin were calculated. The ruggedness of the relative correction factors was then tested on different instruments and columns. Meanwhile, 16 batches of Arnebia euchroma were analyzed by the external standard method (ESM) and QAMS, respectively. The peaks were identified by LC-MS. The ruggedness of the relative correction factors was good, and the analytical results calculated by ESM and QAMS showed no difference. The established quantitative method is feasible and suitable for the quality evaluation of A. euchroma. Copyright© by the Chinese Pharmaceutical Association.
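
    A minimal sketch of the single-marker arithmetic behind QAMS: a relative correction factor (RCF) ties each analyte's detector response to that of the internal reference, so routine samples need only the reference calibration. The peak areas and concentrations below are invented numbers, not the study's data.

      # Calibration with standards: response factor RF = peak area / concentration.
      area_ref, conc_ref = 5200.0, 10.0           # internal reference substance
      area_i, conc_i = 3100.0, 8.0                # another analyte's standard

      rf_ref = area_ref / conc_ref
      rf_i = area_i / conc_i
      rcf = rf_i / rf_ref                         # relative correction factor

      # Routine sample: only the reference is calibrated; the analyte's
      # concentration follows from its area, the RCF, and the reference RF.
      sample_area_i = 2480.0
      conc_estimate = sample_area_i / (rcf * rf_ref)
      print(f"RCF = {rcf:.3f}, estimated concentration = {conc_estimate:.2f}")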

  5. In-house validation of a liquid chromatography-tandem mass spectrometry method for the determination of selective androgen receptor modulators (SARMS) in bovine urine.

    PubMed

    Schmidt, Kathrin S; Mankertz, Joachim

    2018-06-01

    A sensitive and robust LC-MS/MS method allowing the rapid screening and confirmation of selective androgen receptor modulators in bovine urine was developed and successfully validated according to Commission Decision 2002/657/EC, chapter 3.1.3 'alternative validation', by applying a matrix-comprehensive in-house validation concept. The confirmation of the analytes in the validation samples was achieved both on the basis of the MRM ion ratios as laid down in Commission Decision 2002/657/EC and by comparison of their enhanced product ion (EPI) spectra with a reference mass spectral library by making use of the QTRAP technology. Here, in addition to the MRM survey scan, EPI spectra were generated in a data-dependent way according to an information-dependent acquisition criterion. Moreover, stability studies of the analytes in solution and in matrix according to an isochronous approach proved the stability of the analytes in solution and in matrix for at least the duration of the validation study. To identify factors that have a significant influence on the test method in routine analysis, a factorial effect analysis was performed. To this end, factors considered to be relevant for the method in routine analysis (e.g. operator, storage duration of the extracts before measurement, different cartridge lots and different hydrolysis conditions) were systematically varied on two levels. The examination of the extent to which these factors influence the measurement results of the individual analytes showed that none of the validation factors exerts a significant influence on the measurement results.

  6. Regression Analysis and Calibration Recommendations for the Characterization of Balance Temperature Effects

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Volden, T.

    2018-01-01

    Analysis and use of temperature-dependent wind tunnel strain-gage balance calibration data are discussed in the paper. First, three different methods are presented and compared that may be used to process temperature-dependent strain-gage balance data. The first method uses an extended set of independent variables in order to process the data and predict balance loads. The second method applies an extended load iteration equation during the analysis of balance calibration data. The third method uses temperature-dependent sensitivities for the data analysis. Physical interpretations of the most important temperature-dependent regression model terms are provided that relate temperature compensation imperfections and the temperature-dependent nature of the gage factor to sets of regression model terms. Finally, balance calibration recommendations are listed so that temperature-dependent calibration data can be obtained and successfully processed using the reviewed analysis methods.

  7. Geospatial and machine learning techniques for wicked social science problems: analysis of crash severity on a regional highway corridor

    NASA Astrophysics Data System (ADS)

    Effati, Meysam; Thill, Jean-Claude; Shabani, Shahin

    2015-04-01

    The contention of this paper is that many social science research problems are too "wicked" to be suitably studied using conventional statistical and regression-based methods of data analysis. This paper argues that an integrated geospatial approach based on methods of machine learning is well suited to this purpose. Recognizing the intrinsic wickedness of traffic safety issues, such an approach is used to unravel the complexity of traffic crash severity on highway corridors as an example of such problems. The support vector machine (SVM) and coactive neuro-fuzzy inference system (CANFIS) algorithms are tested as inferential engines to predict crash severity and uncover spatial and non-spatial factors that systematically relate to crash severity, while a sensitivity analysis is conducted to determine the relative influence of crash severity factors. Different specifications of the two methods are implemented, trained, and evaluated against crash events recorded over a 4-year period on a regional highway corridor in Northern Iran. Overall, the SVM model outperforms CANFIS by a notable margin. The combined use of spatial analysis and artificial intelligence is effective at identifying leading factors of crash severity, while explicitly accounting for spatial dependence and spatial heterogeneity effects. Thanks to the demonstrated effectiveness of a sensitivity analysis, this approach produces comprehensive results that are consistent with existing traffic safety theories and supports the prioritization of effective safety measures that are geographically targeted and behaviorally sound on regional highway corridors.
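
    A minimal sketch of the SVM side of the approach above: a support vector classifier predicts a binary severity label, and scikit-learn's permutation importance serves as a simple stand-in for the paper's sensitivity analysis. The five features and all data are synthetic assumptions, not the Iranian corridor records.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import train_test_split
      from sklearn.inspection import permutation_importance

      rng = np.random.default_rng(5)
      n = 600
      X = rng.normal(size=(n, 5))                 # e.g. speed, curvature, traffic, ...
      y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.8, size=n) > 0).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
      model.fit(X_tr, y_tr)

      # Permutation importance as a crude severity-factor sensitivity analysis.
      imp = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
      print("test accuracy:", round(model.score(X_te, y_te), 3))
      print("feature importances:", np.round(imp.importances_mean, 3))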

  8. Accounting for measurement error in biomarker data and misclassification of subtypes in the analysis of tumor data.

    PubMed

    Nevo, Daniel; Zucker, David M; Tamimi, Rulla M; Wang, Molin

    2016-12-30

    A common paradigm in dealing with heterogeneity across tumors in cancer analysis is to cluster the tumors into subtypes using marker data on the tumor, and then to analyze each of the clusters separately. A more specific target is to investigate the association between risk factors and specific subtypes and to use the results for personalized preventive treatment. This task is usually carried out in two steps: clustering and risk factor assessment. However, two sources of measurement error arise in these problems. The first is the measurement error in the biomarker values. The second is the misclassification error when assigning observations to clusters. We consider the case with a specified set of relevant markers and propose a unified single-likelihood approach for normally distributed biomarkers. As an alternative, we consider a two-step procedure with the tumor type misclassification error taken into account in the second-step risk factor analysis. We describe our method for binary data and also for survival analysis data using a modified version of the Cox model. We present asymptotic theory for the proposed estimators. Simulation results indicate that our methods significantly lower the bias with a small price being paid in terms of variance. We present an analysis of breast cancer data from the Nurses' Health Study to demonstrate the utility of our method. Copyright © 2016 John Wiley & Sons, Ltd.

  9. Quantitative topographic differentiation of the neonatal EEG.

    PubMed

    Paul, Karel; Krajca, Vladimír; Roth, Zdenek; Melichar, Jan; Petránek, Svojmil

    2006-09-01

    To test the discriminatory topographic potential of a new method of automatic EEG analysis in neonates. A quantitative description of the neonatal EEG can contribute to the objective assessment of the functional state of the brain, and may improve the precision of diagnosing cerebral dysfunctions manifested by 'disorganization', 'dysrhythmia' or 'dysmaturity'. Twenty-one healthy, full-term newborns were examined polygraphically during sleep (EEG with 8 referential derivations, respiration, ECG, EOG, EMG). From each EEG record, two 5-min samples (one from the middle of quiet sleep, the other from the middle of active sleep) were subjected to automatic analysis and described by 13 variables: spectral features and features describing the shape and variability of the signal. The data from individual infants were averaged and the number of variables was reduced by factor analysis. All factors identified by the factor analysis were statistically significantly influenced by the location of derivation. A large number of statistically significant differences were also established when comparing the effects of individual derivations on each of the 13 measured variables. Both the spectral features and the features describing the shape and variability of the signal account largely for the topographic differentiation of the neonatal EEG. The presented method of automatic EEG analysis can assess the topographic characteristics of the neonatal EEG; it is adequately sensitive and describes the neonatal electroencephalogram with sufficient precision. Its discriminatory capability is promising for application in clinical practice.

  10. Comparing direct and iterative equation solvers in a large structural analysis software system

    NASA Technical Reports Server (NTRS)

    Poole, E. L.

    1991-01-01

    Two direct Choleski equation solvers and two iterative preconditioned conjugate gradient (PCG) equation solvers used in a large structural analysis software system are described. The two direct solvers are implementations of the Choleski method for variable-band matrix storage and sparse matrix storage. The two iterative PCG solvers include the Jacobi conjugate gradient method and an incomplete Choleski conjugate gradient method. The performance of the direct and iterative solvers is compared by solving several representative structural analysis problems. Some key factors affecting the performance of the iterative solvers relative to the direct solvers are identified.
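
    A minimal sketch contrasting the two solver families compared above: a direct Cholesky factorization and a Jacobi (diagonal) preconditioned conjugate gradient iteration, applied to a small synthetic symmetric positive definite system standing in for a stiffness matrix. Sizes and conditioning are illustrative.

      import numpy as np
      from scipy.linalg import cho_factor, cho_solve
      from scipy.sparse.linalg import cg, LinearOperator

      rng = np.random.default_rng(6)
      n = 200
      B = rng.normal(size=(n, n))
      A = B @ B.T + n * np.eye(n)                 # SPD test matrix
      b = rng.normal(size=n)

      # Direct: factor once, then solve (cheap for repeated right-hand sides).
      c, low = cho_factor(A)
      x_direct = cho_solve((c, low), b)

      # Iterative: Jacobi preconditioner M approximating inv(diag(A)).
      M = LinearOperator((n, n), matvec=lambda v: v / np.diag(A))
      x_iter, info = cg(A, b, M=M)

      print("direct vs PCG difference:", np.linalg.norm(x_direct - x_iter))
      print("CG converged:", info == 0)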

  11. The X-Factor: an evaluation of common methods used to analyse major inter-segment kinematics during the golf swing.

    PubMed

    Brown, Susan J; Selbie, W Scott; Wallace, Eric S

    2013-01-01

    A common biomechanical feature of a golf swing, described in various ways in the literature, is the interaction between the thorax and pelvis, often termed the X-Factor. There is, however, no consistent method used within the golf biomechanics literature to calculate these segment interactions. The purpose of this study was to examine X-Factor data calculated using three reported methods in order to determine the similarity or otherwise of the data calculated using each method. A twelve-camera three-dimensional motion capture system was used to capture the driver swings of 19 participants, and a subject-specific three-dimensional biomechanical model was created, with the position and orientation of each model estimated using a global optimisation algorithm. Comparison of the X-Factor methods showed significant differences for events during the swing (P < 0.05). Data for each kinematic measure were derived as a time series for all three methods, and regression analysis of these data showed that whilst one method could be successfully mapped to another, the mappings between methods are subject dependent (P < 0.05). Findings suggest that a consistent methodology considering the X-Factor from a joint angle approach is most insightful in describing a golf swing.

  12. An analysis of the functioning of mental healthcare in northwestern Poland.

    PubMed

    Bażydło, Marta; Karakiewicz, Beata

    Modern psychiatry faces numerous challenges related to changes in the epidemiology of mental disorders and the development of knowledge in this area of science. The introduction of community psychiatry is intended as an answer to this situation, and the implementation of this model in Poland was the aim of the National Mental Health Protection Programme. The aim of the study was to analyse the functioning of mental healthcare using the example of the West Pomeranian Province in Poland. The analysis relied on a qualitative method: three group interviews with an interdisciplinary advisory panel were conducted, with people representing various areas of work with people with mental disorders participating in each meeting. Based on the conclusions drawn, PEST and SWOT analyses of the functioning of mental healthcare were performed. Within the analysis of the macro-environment of mental healthcare, the influence of political and legal, economic, socio-cultural, and technological factors was evaluated through the PEST analysis; all of these factors were assessed as negative for the functioning of mental healthcare. A SWOT analysis was then performed to indicate the strengths, weaknesses, opportunities, and threats in the functioning of mental healthcare. The conclusions were: 1. Mental healthcare is influenced more by external than by internal factors. 2. Macro-environmental factors influence the functioning of mental healthcare in a significantly negative manner. 3. The basic problem in the functioning of mental healthcare is insufficient funding. 4. To improve the functioning of mental healthcare, it is necessary to change the funding methods, the regulations, the way society perceives mental disorders, and the system of monitoring mental healthcare services.

  13. Is It Feasible to Identify Natural Clusters of TSC-Associated Neuropsychiatric Disorders (TAND)?

    PubMed

    Leclezio, Loren; Gardner-Lubbe, Sugnet; de Vries, Petrus J

    2018-04-01

    Tuberous sclerosis complex (TSC) is a genetic disorder with multisystem involvement. The lifetime prevalence of TSC-Associated Neuropsychiatric Disorders (TAND) is in the region of 90% in an apparently unique, individual pattern. This "uniqueness" poses significant challenges for diagnosis, psycho-education, and intervention planning. To date, no studies have explored whether there may be natural clusters of TAND. The purpose of this feasibility study was (1) to investigate the practicability of identifying natural TAND clusters, and (2) to identify appropriate multivariate data analysis techniques for larger-scale studies. TAND Checklist data were collected from 56 individuals with a clinical diagnosis of TSC (n = 20 from South Africa; n = 36 from Australia). Using R, the open-source statistical platform, mean squared contingency coefficients were calculated to produce a correlation matrix, and various cluster analyses and exploratory factor analysis were examined. Ward's method rendered six TAND clusters with good face validity and significant convergence with a six-factor exploratory factor analysis solution. The "bottom-up" data-driven strategies identified a "scholastic" cluster of TAND manifestations, an "autism spectrum disorder-like" cluster, a "dysregulated behavior" cluster, a "neuropsychological" cluster, a "hyperactive/impulsive" cluster, and a "mixed/mood" cluster. These feasibility results suggest that a combination of cluster analysis and exploratory factor analysis methods may be able to identify clinically meaningful natural TAND clusters. Findings require replication and expansion in larger dataset, and could include quantification of cluster or factor scores at an individual level. Copyright © 2018 Elsevier Inc. All rights reserved.
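
    A minimal sketch of the Ward-linkage step reported above, using scipy's hierarchical clustering on synthetic binary checklist data and cutting the tree at six clusters to mirror the paper's solution. The subject count echoes the study (n = 56), but the items and responses are invented, and raw Euclidean distance stands in for the paper's contingency-coefficient matrix.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(7)
      # 56 subjects x 30 binary checklist items (simulated stand-ins).
      items = rng.integers(0, 2, size=(56, 30)).astype(float)

      Z = linkage(items.T, method="ward")         # cluster the 30 items, not subjects
      labels = fcluster(Z, t=6, criterion="maxclust")
      for c in range(1, 7):
          print(f"cluster {c}: items {np.where(labels == c)[0].tolist()}")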

  14. A stiffness derivative finite element technique for determination of crack tip stress intensity factors

    NASA Technical Reports Server (NTRS)

    Parks, D. M.

    1974-01-01

    A finite element technique for determination of elastic crack tip stress intensity factors is presented. The method, based on the energy release rate, requires no special crack tip elements. Further, the solution for only a single crack length is required, and the crack is 'advanced' by moving nodal points rather than by removing nodal tractions at the crack tip and performing a second analysis. The promising straightforward extension of the method to general three-dimensional crack configurations is presented and contrasted with the practical impossibility of conventional energy methods.

  15. Stability analysis of Caisson Cofferdam Based on Strength Reduction Method

    NASA Astrophysics Data System (ADS)

    Xu, B. B.; Zhang, N. S.

    2018-05-01

    The working mechanism of a caisson cofferdam depends on the self-weight of the structure and its internal filling to ensure stability against sliding and overturning. Using the strength reduction method, the safety factor of the caisson cofferdam can be obtained, and the potential slip surface can be searched for automatically without constraining the range of the arc center. According to the results, the slip surface passes through the bottom of the caisson. Based on the judgement criterion of the strength reduction method, the final safety factor is about 1.65.
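
    The strength reduction loop itself is compact: the shear strength parameters c and tan(phi) are divided by a trial factor F until the model no longer reaches equilibrium, and the critical F is the safety factor. The sketch below bisects on F; `is_stable` is a hypothetical placeholder for the nonlinear FE equilibrium check, which the paper obtains from a full caisson model.

    ```python
    import math

    def strength_reduction_factor(c, phi, is_stable, f_lo=0.5, f_hi=3.0, tol=1e-3):
        """Bisect on the reduction factor F. `is_stable(c_f, phi_f)` stands in
        for a nonlinear FE equilibrium check (hypothetical placeholder)."""
        while f_hi - f_lo > tol:
            f = 0.5 * (f_lo + f_hi)
            c_f = c / f                           # reduced cohesion
            phi_f = math.atan(math.tan(phi) / f)  # reduced friction angle
            if is_stable(c_f, phi_f):
                f_lo = f   # still converges: strength can be reduced further
            else:
                f_hi = f   # no equilibrium: safety factor lies below f
        return 0.5 * (f_lo + f_hi)
    ```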

  16. Vibration-based structural health monitoring using adaptive statistical method under varying environmental condition

    NASA Astrophysics Data System (ADS)

    Jin, Seung-Seop; Jung, Hyung-Jo

    2014-03-01

    It is well known that the dynamic properties of a structure, such as its natural frequencies, depend not only on damage but also on environmental conditions (e.g., temperature). Variation in the dynamic characteristics of a structure due to environmental conditions may mask damage to the structure. Without taking changing environmental conditions into account, false-positive or false-negative damage diagnoses may occur, making structural health monitoring unreliable. To address this problem, many researchers construct a regression model that relates structural responses to environmental factors. The key to the success of this approach is formulating the input and output variables of the regression model so as to account for the environmental variations. However, it is quite challenging to determine in advance the proper environmental variables and measurement locations that fully represent the relationship between the structural responses and the environmental variations. One alternative (i.e., novelty detection) is to remove the variation caused by environmental factors from the structural responses by using multivariate statistical analysis (e.g., principal component analysis (PCA), factor analysis, etc.). The success of this method depends strongly on the accuracy of the description of the normal condition. Generally, no prior information on the normal condition is available during data acquisition, so the normal condition is determined subjectively, with human intervention. The proposed method is a novel adaptive multivariate statistical analysis for structural damage detection under environmental change. One advantage of this method is the ability of generative learning to capture the intrinsic characteristics of the normal condition. The proposed method is tested on numerically simulated data for a range of measurement noise levels under environmental variation. A comparative study with conventional methods (i.e., a fixed reference scheme) demonstrates the superior performance of the proposed method for structural damage detection.
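
    The PCA-based novelty detection the abstract builds on can be sketched in a few lines: fit PCA on baseline (assumed normal) natural-frequency data so the retained components absorb the temperature-driven variation, then flag observations whose residual distance from that subspace exceeds a control limit. Everything below is synthetic and illustrative; the paper's contribution is making the normal-condition reference adaptive rather than fixed.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(2)
    temp = rng.uniform(-5, 35, size=500)           # ambient temperature record
    # Three natural frequencies with a common temperature trend plus noise
    baseline = np.outer(temp, [0.010, 0.007, 0.012]) \
        + rng.normal(scale=0.02, size=(500, 3)) + [5.1, 12.3, 19.8]

    pca = PCA(n_components=1).fit(baseline)        # environmental subspace

    def novelty_index(Y):
        resid = Y - pca.inverse_transform(pca.transform(Y))
        return np.linalg.norm(resid, axis=1)       # distance from normality

    ni = novelty_index(baseline)
    threshold = ni.mean() + 3.0 * ni.std()         # simple 3-sigma control limit
    damaged = baseline.copy()
    damaged[:, 0] -= 0.10                          # simulated damage frequency drop
    print((novelty_index(damaged) > threshold).mean())   # fraction flagged
    ```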

  17. SU-F-T-192: Study of Robustness Analysis Method of Multiple Field Optimized IMPT Plans for Head & Neck Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Y; Wang, X; Li, H

    Purpose: Proton therapy is more sensitive to uncertainties than photon treatments because the finite range of protons depends on tissue density. The worst-case scenario (WCS) method, originally proposed by Lomax, has been adopted at our institution for robustness analysis of IMPT plans. This work demonstrates that the WCS method is sufficient to account for the uncertainties that could be encountered during daily clinical treatment. Methods: A fast, approximate dose calculation method was developed to calculate the dose for an IMPT plan under different setup and range uncertainties. The effects of two factors, the inverse-square factor and range uncertainty, were explored. The WCS robustness analysis method was evaluated using this fast dose calculation method. The worst-case dose distribution was generated by shifting the isocenter by 3 mm along the x, y, and z directions and modifying stopping-power ratios by ±3.5%. 1000 randomly perturbed cases in proton range and the x, y, and z directions were created, and the corresponding dose distributions were calculated using the approximate method. DVHs and dosimetric indices of all 1000 perturbed cases were calculated and compared with the worst-case scenario result. Results: The distributions of dosimetric indices of the 1000 perturbed cases were generated and compared with the worst-case scenario results. For D95 of the CTVs, at least 97% of the 1000 perturbed cases showed higher values than the worst-case scenario. For D5 of the CTVs, at least 98% of the perturbed cases had lower values than the worst-case scenario. Conclusion: By extensively calculating the dose distributions under random uncertainties, the WCS method was verified to be reliable in evaluating the robustness of MFO IMPT plans for H&N patients. The extensive sampling approach using the fast approximate method could be used in the future to evaluate the effects of different factors on the robustness of IMPT plans.

  18. The Shock and Vibration Digest. Volume 18, Number 12

    DTIC Science & Technology

    1986-12-01

    practical methods for fracture mechanics analysis. Linear elastic methods can yield useful results. Elastic-plastic methods are becoming useful with...geometry factors. Fracture mechanics analysis based on linear elastic concepts developed in the 1960s has become established during the last decade as...2) is slightly conservative [2,3]. Materials that can be treated with linear elastic fracture mechanics usually belong in this category. No

  19. Analysis of Franck-Condon factors for CO+ molecule using the Fourier Grid Hamiltonian method

    NASA Astrophysics Data System (ADS)

    Syiemiong, Arnestar; Swer, Shailes; Jha, Ashok Kumar; Saxena, Atul

    2018-04-01

    Franck-Condon factors (FCFs) are important parameters that play a central role in determining the intensities of the vibrational bands in electronic transitions. In this paper, we illustrate the Fourier Grid Hamiltonian (FGH) method, a relatively simple method for calculating FCFs. The FGH method is used for calculating the vibrational eigenvalues and eigenfunctions of bound electronic states of diatomic molecules. The vibrational wave functions obtained for the ground and excited states are used to calculate the vibrational overlap integrals and then the FCFs. In this computation, we used the Morse potential and a bi-exponential potential model for constructing and diagonalizing the molecular Hamiltonians. The effects of changes in the equilibrium internuclear distance (xe), the dissociation energy (De), and the nature of the excited-state electronic energy curve on the FCFs have been determined. Here we present our work on the qualitative analysis of Franck-Condon factors using the Fourier Grid Hamiltonian method.
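
    For a flavour of the computation, the sketch below diagonalizes grid Hamiltonians for two Morse curves and squares the overlaps of their eigenvectors. It uses the Colbert-Miller sinc-DVR kinetic matrix, a close relative of the FGH construction, and loosely CO+-like parameters in atomic units that are illustrative only, not the paper's values.

    ```python
    import numpy as np

    def dvr_states(V, dx, mu, nstates=5):
        """Diagonalize H = T + V on a uniform grid (sinc-DVR kinetic matrix,
        hbar = 1). Grid eigenvectors come out orthonormal."""
        n = V.size
        idx = np.arange(n)
        dij = idx[:, None] - idx[None, :]
        sign = np.where(dij % 2 == 0, 1.0, -1.0)        # (-1)^(i-j)
        with np.errstate(divide='ignore'):
            T = 2.0 * sign / dij.astype(float) ** 2
        np.fill_diagonal(T, np.pi ** 2 / 3.0)
        T /= 2.0 * mu * dx ** 2
        E, psi = np.linalg.eigh(T + np.diag(V))
        return E[:nstates], psi[:, :nstates]

    def morse(x, De, a, xe):
        return De * (1.0 - np.exp(-a * (x - xe))) ** 2

    x = np.linspace(1.2, 6.0, 400)                      # grid in bohr
    mu = 12500.0                                        # approx. CO reduced mass, a.u.
    E0, psi0 = dvr_states(morse(x, De=0.30, a=1.15, xe=2.11), x[1] - x[0], mu)
    E1, psi1 = dvr_states(morse(x, De=0.25, a=1.10, xe=2.25), x[1] - x[0], mu)

    fcf = (psi1.T @ psi0) ** 2      # Franck-Condon factors |<v'|v>|^2
    print(np.round(fcf[:, 0], 4))   # band intensities out of the v = 0 level
    ```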

  20. Analysis of the financial factors governing the profitability of lunar helium-3

    NASA Technical Reports Server (NTRS)

    Kulcinski, G. L.; Thompson, H.; Ott, S.

    1989-01-01

    Financial factors influencing the profitability of the mining and utilization of lunar helium-3 are examined. The analysis addressed the following questions: (1) which financial factors have the greatest leverage on the profitability of He-3; (2) over what range can these factors be varied while keeping the He-3 option profitable; and (3) what ultimate effect could this energy source have on the price of electricity for U.S. consumers. Two complementary methods of analysis were used in the assessment: the rate of return on the incremental investment required, and the reduction in revenue requirements (total cost to customers) achieved. Some of the factors addressed include energy demand, power generation costs with and without fusion, profitability for D-He(3) fusion, annual capital and operating costs, launch mass and costs, He-3 price, and government funding. Specific conclusions are drawn for each of the company types considered: utilities, a lunar mining company, and an integrated energy company.

  1. Global sensitivity analysis of a filtration model for submerged anaerobic membrane bioreactors (AnMBR).

    PubMed

    Robles, A; Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2014-04-01

    The results of a global sensitivity analysis of a filtration model for submerged anaerobic MBRs (AnMBRs) are assessed in this paper. This study aimed to (1) identify the less- (or non-) influential factors of the model in order to facilitate model calibration and (2) validate the modelling approach (i.e. to determine the need for each of the proposed factors to be included in the model). The sensitivity analysis was conducted using a revised version of the Morris screening method. The dynamic simulations were conducted using long-term data obtained from an AnMBR plant fitted with industrial-scale hollow-fibre membranes. Of the 14 factors in the model, six were identified as influential, i.e. those calibrated using off-line protocols. A dynamic calibration (based on optimisation algorithms) of these influential factors was conducted. The resulting estimated model factors accurately predicted membrane performance. Copyright © 2014 Elsevier Ltd. All rights reserved.
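
    As a reminder of what the Morris scheme computes (the sketch shows the textbook method, not the paper's revised variant), the code below builds r one-at-a-time trajectories, collects elementary effects per factor, and summarizes them as mu* (importance) and sigma (non-linearity/interaction). The model function and bounds are placeholders, not the AnMBR filtration model.

    ```python
    import numpy as np

    def scale(u, bounds):
        lo, hi = np.asarray(bounds, float).T
        return lo + u * (hi - lo)

    def morris_screening(model, bounds, r=50, levels=4, seed=0):
        """Elementary effects: mu_star flags influential factors, sigma flags
        non-linear or interacting ones."""
        rng = np.random.default_rng(seed)
        k = len(bounds)
        delta = levels / (2.0 * (levels - 1))   # standard step on the unit grid
        ee = np.empty((r, k))
        for t in range(r):
            x = rng.integers(0, levels // 2, size=k) / (levels - 1)
            y0 = model(scale(x, bounds))
            for i in rng.permutation(k):        # move one factor at a time
                x[i] += delta
                y1 = model(scale(x, bounds))
                ee[t, i] = (y1 - y0) / delta
                y0 = y1
        return np.abs(ee).mean(axis=0), ee.std(axis=0)

    mu_star, sigma = morris_screening(lambda z: z[0] + 2.0 * z[1] * z[2],
                                      bounds=[(0.0, 1.0)] * 3)
    print(mu_star.round(2), sigma.round(2))     # factor 0 linear; 1 and 2 interact
    ```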

  2. Implementation of MCA Method for Identification of Factors for Conceptual Cost Estimation of Residential Buildings

    NASA Astrophysics Data System (ADS)

    Juszczyk, Michał; Leśniak, Agnieszka; Zima, Krzysztof

    2013-06-01

    Conceptual cost estimation is important for construction projects; either underestimation or overestimation of construction cost may lead to the failure of a project. In this paper the authors present an application of multicriteria comparative analysis (MCA) to select the factors influencing the cost of constructing a residential building. The aim of the analysis is to indicate key factors useful for conceptual cost estimation in the early design stage. The key factors are investigated on the basis of elementary information about the function, form, and structure of the building, and the primary assumptions about the technological and organizational solutions applied in the construction process. These factors are treated as variables of a model whose aim is to make fast conceptual cost estimation possible with satisfactory accuracy. The analysis comprised three steps: preliminary research, the choice of a set of potential variables, and the reduction of this set to the final set of variables. Multicriteria comparative analysis is applied to solve the problem. The analysis made it possible to select a group of factors, defined well enough at the conceptual stage of the design process, to be used as the describing variables of the model.

  3. Suppression of vapor cell temperature error for spin-exchange-relaxation-free magnetometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Jixi, E-mail: lujixi@buaa.edu.cn; Qian, Zheng; Fang, Jiancheng

    2015-08-15

    This paper presents a method to reduce the vapor cell temperature error of the spin-exchange-relaxation-free (SERF) magnetometer. The fluctuation of cell temperature can induce variations of the optical rotation angle, resulting in a scale factor error of the SERF magnetometer. In order to suppress this error, we employ the variation of the probe beam absorption to offset the variation of the optical rotation angle. The theoretical discussion of our method indicates that the scale factor error introduced by the fluctuation of the cell temperature could be suppressed by setting the optical depth close to one. In our experiment, we adjust the probe frequency to obtain various optical depths and then measure the variation of scale factor with respect to the corresponding cell temperature changes. Our experimental results show good agreement with our theoretical analysis. Under our experimental conditions, the error has been reduced significantly compared with the case in which the probe wavelength is adjusted to maximize the probe signal. The cost of this method is a reduction of the scale factor of the magnetometer. However, according to our analysis, this has only a minor effect on the sensitivity under proper operating parameters.

  4. Development of probabilistic emission inventories of air toxics for Jacksonville, Florida, USA.

    PubMed

    Zhao, Yuchao; Frey, H Christopher

    2004-11-01

    Probabilistic emission inventories were developed for 1,3-butadiene, mercury (Hg), arsenic (As), benzene, formaldehyde, and lead for Jacksonville, FL. To quantify inter-unit variability in empirical emission factor data, the Maximum Likelihood Estimation (MLE) method or the Method of Matching Moments was used to fit parametric distributions. For data sets that contain nondetected measurements, a method based upon MLE was used for parameter estimation. To quantify the uncertainty in urban air toxic emission factors, parametric bootstrap simulation and empirical bootstrap simulation were applied to uncensored and censored data, respectively. The probabilistic emission inventories were developed based on the product of the uncertainties in the emission factors and in the activity factors. The uncertainties in the urban air toxics emission inventories range from as small as -25 to +30% for Hg to as large as -83 to +243% for As. The key sources of uncertainty in the emission inventory for each toxic are identified based upon sensitivity analysis. Typically, uncertainty in the inventory of a given pollutant can be attributed primarily to a small number of source categories. Priorities for improving the inventories and for refining the probabilistic analysis are discussed.
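
    The parametric-bootstrap step for an uncensored emission-factor dataset can be sketched as follows: fit a distribution by MLE, repeatedly resample from the fit, and propagate each bootstrap mean through the inventory product. The lognormal choice, the sample, and the activity level are illustrative assumptions, not the study's data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    ef_data = stats.lognorm.rvs(0.8, scale=2.0, size=25, random_state=rng)

    # MLE fit of a lognormal emission-factor distribution (location fixed at 0)
    shape, _, scale = stats.lognorm.fit(ef_data, floc=0)

    B, n = 5000, len(ef_data)
    boot_means = np.array([
        stats.lognorm.rvs(shape, loc=0, scale=scale, size=n,
                          random_state=rng).mean()
        for _ in range(B)                    # parametric bootstrap replicates
    ])

    activity = 1.0e4                         # activity factor (illustrative units)
    inventory = boot_means * activity        # inventory = EF x activity
    lo, mid, hi = np.percentile(inventory, [2.5, 50.0, 97.5])
    print(f"{(lo / mid - 1) * 100:+.0f}% / {(hi / mid - 1) * 100:+.0f}% "
          f"about the median inventory")
    ```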

  5. Human factors issues in the design of user interfaces for planning and scheduling

    NASA Technical Reports Server (NTRS)

    Murphy, Elizabeth D.

    1991-01-01

    The purpose is to provide an overview of human factors issues that impact the effectiveness of user interfaces to automated scheduling tools. The following methods are employed: (1) a survey of planning and scheduling tools; (2) the identification and analysis of human factors issues; (3) the development of design guidelines based on the human factors literature; and (4) the generation of display concepts to illustrate the guidelines.

  6. Column Subset Selection, Matrix Factorization, and Eigenvalue Optimization

    DTIC Science & Technology

    2008-07-01

    Pietsch and Grothendieck, which are regarded as basic instruments in modern functional analysis [Pis86]. • The methods for computing these... Pietsch factorization and the maxcut semidefinite program [GW95]. 1.2. Overview. We focus on the algorithmic version of the Kashin–Tzafriri theorem...will see that the desired subset is exposed by factoring the random submatrix. This factorization, which was invented by Pietsch, is regarded as a basic

  7. New classification methods on singularity of mechanism

    NASA Astrophysics Data System (ADS)

    Luo, Jianguo; Han, Jianyou

    2010-07-01

    Based on an analysis of the existing bases and methods for classifying mechanism singularity, four approaches are identified, derived from the moving state of the mechanism, the cause of the singularity, the property of the linear complex of the singularity, and the method used to study the singularity. These bases and methods do not reflect the direct, systematic, and controllable properties of the mechanism structure at the macro level, and therefore offer little guidance for evading singular configurations before they appear. In view of these shortcomings, six new classification methods are proposed that are tied directly and closely to the structure, external phenomena, and motion control of the mechanism; classification is carried out according to the moving base, joint components, executors, branches, actuating sources, and input parameters. Because these factors display the systematic properties of the mechanism at the macro level, good guiding performance can be expected in singularity evasion, machine design, and machine control based on these new bases and methods.

  8. A multifactorial analysis of obesity as CVD risk factor: use of neural network based methods in a nutrigenetics context.

    PubMed

    Valavanis, Ioannis K; Mougiakakou, Stavroula G; Grimaldi, Keith A; Nikita, Konstantina S

    2010-09-08

    Obesity is a multifactorial trait and an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology underlying obesity and identify genetic variations and/or nutrition-related factors that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject a total of 63 factors describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake of calories and cholesterol, were measured. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used for the analysis of the available data: (i) a multi-layer feed-forward ANN combined with a parameter decreasing method (PDM-ANN), and (ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) that combines genetic algorithms and the popular back-propagation training algorithm. PDM-ANN and GA-ANN were comparatively assessed in terms of their ability to identify, among the initial 63 variables describing genetic variations, nutrition, and gender, the most important factors for classifying a subject into one of the BMI-related classes: normal and overweight. The methods were designed and evaluated using training and testing sets provided by 3-fold cross-validation (3-CV) resampling. Classification accuracy, sensitivity, specificity, and area under the receiver operating characteristic curve were used to evaluate the resulting predictive ANN models. The most parsimonious set of factors was obtained by the GA-ANN method and included gender, six genetic variations, and 18 nutrition-related variables. The corresponding predictive model had a mean accuracy of 61.46% on the 3-CV testing sets. The ANN-based methods revealed factors that interactively contribute to the obesity trait and provided predictive models with promising generalization ability. In general, the results show that ANNs and their hybrids can provide useful tools for the study of complex traits in the context of nutrigenetics.

  9. Three Dimensional Parametric Analyses of Stress Concentration Factor and Its Mitigation in Isotropic and Orthotropic Plate with Central Circular Hole Under Axial In-Plane Loading

    NASA Astrophysics Data System (ADS)

    Nagpal, Shubhrata; Jain, Nitin Kumar; Sanyal, Shubhashis

    2016-01-01

    The problem of finding the stress concentration factor of a loaded rectangular plate has offered considerable analytical difficulty. The present work focuses on the behavior of isotropic and orthotropic plates subjected to static in-plane loading, studied using the finite element method. The complete plate model was analyzed with the finite-element-based software ANSYS. Two parameters were varied: the thickness-to-width ratio of the plate (T/A) and the hole-diameter-to-width ratio (D/A), in order to analyze the stress concentration factor (SCF) and its mitigation. Plates of five different materials were considered in the complete analysis to determine the sensitivity of the SCF. The D/A ratio was varied from 0.1 to 0.7 for the SCF analysis and from 0.1 to 0.5 for the mitigation analysis; T/A ratios of 0.01, 0.05, and 0.1 were considered in all cases. The results are presented in graphical form and discussed. The mitigation of the SCF reported is very encouraging. The SCF is more sensitive to the D/A ratio than to T/A.

  10. Confirmatory factor analysis using Microsoft Excel.

    PubMed

    Miles, Jeremy N V

    2005-11-01

    This article presents a method for using Microsoft (MS) Excel for confirmatory factor analysis (CFA). CFA is often seen as an impenetrable technique, and thus, when it is taught, there is frequently little explanation of the mechanisms or underlying calculations. The aim of this article is to demonstrate that this is not the case; it is relatively straightforward to produce a spreadsheet in MS Excel that can carry out simple CFA. It is possible, with few or no programming skills, to effectively program a CFA analysis and, thus, to gain insight into the workings of the procedure.
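
    The underlying calculation is indeed small enough for a spreadsheet; in Python it fits in a few lines. The sketch below fits a one-factor CFA model Sigma = L L' + diag(psi) by minimizing the maximum-likelihood discrepancy on simulated data with known loadings. It is a bare-bones illustration of the mechanics, not a replacement for a full CFA package.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def cfa_one_factor(S, n_obs):
        """Minimize F = log|Sigma| + tr(S Sigma^-1) - log|S| - p for
        Sigma = lam lam' + diag(psi)."""
        p = S.shape[0]

        def discrepancy(theta):
            lam, psi = theta[:p], theta[p:]
            Sigma = np.outer(lam, lam) + np.diag(psi)
            sign, logdet = np.linalg.slogdet(Sigma)
            if sign <= 0:
                return 1e10                      # keep Sigma positive definite
            return logdet + np.trace(S @ np.linalg.inv(Sigma)) \
                - np.linalg.slogdet(S)[1] - p

        theta0 = np.full(2 * p, 0.5)
        bounds = [(None, None)] * p + [(1e-4, None)] * p
        res = minimize(discrepancy, theta0, bounds=bounds, method='L-BFGS-B')
        return res.x[:p], res.x[p:], (n_obs - 1) * res.fun  # loadings, uniquenesses, chi2

    rng = np.random.default_rng(0)
    F = rng.normal(size=(500, 1))                # one latent factor
    X = F @ np.array([[0.8, 0.7, 0.6, 0.5]]) + rng.normal(scale=0.6, size=(500, 4))
    lam, psi, chi2 = cfa_one_factor(np.cov(X.T), n_obs=500)
    print(lam.round(2), round(chi2, 2))
    ```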

  11. [Analysis and experimental verification of sensitivity and SNR of laser warning receiver].

    PubMed

    Zhang, Ji-Long; Wang, Ming; Tian, Er-Ming; Li, Xiao; Wang, Zhi-Bin; Zhang, Yue

    2009-01-01

    In order to counter the increasingly serious threat posed by hostile lasers in modern warfare, research on laser warning technology and systems is urgently needed; sensitivity and signal-to-noise ratio (SNR) are two important performance parameters of a laser warning system. In the present paper, based on signal statistical detection theory, a method for calculating the sensitivity and SNR of a coherent-detection laser warning receiver (LWR) is proposed. First, the probability distributions of the laser signal and the receiver noise were analyzed. Second, based on threshold detection theory and the Neyman-Pearson criterion, the signal current equation was established by introducing a detection probability factor and a false alarm rate factor; the mathematical expressions for sensitivity and SNR were then derived. Finally, using this method, the sensitivity and SNR of the sinusoidal-grating laser warning receiver developed by our group were analyzed. The theoretical calculations and experimental results indicate that the SNR analysis method is feasible and can be used in the performance analysis of LWRs.
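
    The Neyman-Pearson step is compact enough to show. Assuming zero-mean Gaussian receiver noise and a simple threshold detector (a simplification of the paper's coherent-detection model), the threshold is set by the false-alarm rate, and the minimum detectable signal (the sensitivity) then follows from the required detection probability.

    ```python
    from scipy.stats import norm

    def np_threshold_design(sigma_noise, p_fa=1e-6, p_d=0.99):
        """Gaussian-noise threshold detector under the Neyman-Pearson criterion."""
        threshold = sigma_noise * norm.isf(p_fa)         # P(noise > th) = p_fa
        s_min = threshold - sigma_noise * norm.isf(p_d)  # P(s + noise > th) = p_d
        return threshold, s_min, s_min / sigma_noise     # last value: required SNR

    th, s_min, snr = np_threshold_design(sigma_noise=1.0)
    print(f"threshold = {th:.2f}, minimum detectable signal = {s_min:.2f} "
          f"(SNR = {snr:.1f})")
    ```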

  12. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
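
    The original Beta Factor model that the new approach extends is a one-parameter arithmetic exercise: a fraction beta of each train's failure rate is assumed common to all trains, so a redundant system can never be more reliable than that shared term allows. A minimal numeric illustration (the rate, beta, and mission time are made-up values):

    ```python
    import math

    lam_total = 1.2e-5   # per-train failure rate, 1/h (illustrative)
    beta = 0.10          # fraction of failures attributed to common cause
    t = 24.0             # mission time, h

    q_ind = 1.0 - math.exp(-(1.0 - beta) * lam_total * t)  # independent part
    q_ccf = 1.0 - math.exp(-beta * lam_total * t)          # common-cause part

    # A one-out-of-three system fails only if all three trains fail:
    q_no_ccf = q_ind ** 3               # independent failures only
    q_beta = q_ind ** 3 + q_ccf         # rare-event approximation with CCF
    print(f"{q_no_ccf:.2e} vs {q_beta:.2e}")   # the shared term dominates
    ```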

  13. [Analysis of dietary pattern and diabetes mellitus influencing factors identified by classification tree model in adults of Fujian].

    PubMed

    Yu, F L; Ye, Y; Yan, Y S

    2017-05-10

    Objective: To identify dietary patterns and explore the relationship between environmental factors (especially dietary patterns) and diabetes mellitus in adults in Fujian. Methods: A multi-stage sampling method was used to survey residents aged ≥18 years by questionnaire, physical examination, and laboratory testing at 10 disease surveillance points in Fujian. Factor analysis was used to identify the dietary patterns, logistic regression was applied to analyze the relationship between dietary patterns and diabetes mellitus, and a classification tree model was adopted to identify the influencing factors for diabetes mellitus. Results: There were four dietary patterns in the population: meat; plant; high-quality protein; and fried food and beverages. Logistic analysis showed that the plant pattern, which has high factor loadings on fresh fruit and vegetables and on cereals and tubers, was protective against diabetes mellitus: the odds of diabetes mellitus at the T2 and T3 levels of the factor score were 0.727 (95% CI: 0.561-0.943) and 0.736 (95% CI: 0.573-0.944) times, respectively, those of participants whose factor score was in the lowest tertile. Thirteen influencing factors and eleven high-risk groups for diabetes mellitus were identified by the classification tree model. The influencing factors were dyslipidemia, age, family history of diabetes, hypertension, physical activity, career, sex, sedentary time, abdominal adiposity, BMI, marital status, sleep time, and the high-quality protein pattern. Conclusion: There is a close association between dietary patterns and diabetes mellitus. It is necessary to promote a healthy and reasonable diet, strengthen the monitoring and control of blood lipids, blood pressure, and body weight, and encourage good lifestyles for the prevention and control of diabetes mellitus.

  14. Contextual factors affecting autonomy for patients in Iranian hospitals: A qualitative study.

    PubMed

    Ebrahimi, Hossein; Sadeghian, Efat; Seyedfatemi, Naeimeh; Mohammadi, Eesa; Crowley, Maureen

    2016-01-01

    Consideration of patient autonomy is an essential element in individualized, patient-centered, ethical care. Internal and external factors associated with patient autonomy are related to culture, and it is not clear what they are in Iran. The aim of this study was to explore contextual factors affecting the autonomy of patients in Iranian hospitals. This was a qualitative study using conventional content analysis methods. Thirty-four participants (23 patients, 9 nurses, and 2 doctors) from three Iranian teaching hospitals, selected using purposive sampling, participated in semi-structured interviews. Unstructured observation and field notes were further methods of data collection. The data were subjected to qualitative content analysis and analyzed using the MAXQDA-10 software. Five categories and sixteen subcategories were identified. The five main categories related to patient autonomy were: intrapersonal factors, physical health status, supportive family and friends, communication style, and organizational constraints. In summary, this study uncovered contextual factors that the care team, managers, and planners in the health field should target in order to improve patient autonomy in Iranian hospitals.

  15. Centrifugal ultrafiltration of human serum for improving immunoglobulin A quantification using attenuated total reflectance infrared spectroscopy.

    PubMed

    Elsohaby, Ibrahim; McClure, J Trenton; Riley, Christopher B; Bryanton, Janet; Bigsby, Kathryn; Shaw, R Anthony

    2018-02-20

    Attenuated total reflectance infrared (ATR-IR) spectroscopy is a simple, rapid and cost-effective method for the analysis of serum. However, the complex nature of serum remains a limiting factor for the reliability of this method. We investigated the benefits of coupling centrifugal ultrafiltration with ATR-IR spectroscopy for the quantification of human serum IgA concentration. Human serum samples (n = 196) were analyzed for IgA using an immunoturbidimetric assay. ATR-IR spectra were acquired for whole serum samples and for the retentate (residue) reconstituted with saline following 300 kDa centrifugal ultrafiltration. IR-based analytical methods were developed for each of the two spectroscopic datasets, and the accuracy of the two methods was compared. The analytical methods were based upon partial least squares regression (PLSR) calibration models - one with 5 PLS factors (for whole serum) and the second with 9 PLS factors (for the reconstituted retentate). Comparison of the two sets of IR-based analytical results to reference IgA values revealed improvements in the Pearson correlation coefficient (from 0.66 to 0.76) and in the root mean squared error of prediction of IR-based IgA concentrations (from 102 to 79 mg/dL) for the ultrafiltration retentate-based method as compared to the method built upon whole serum spectra. Depleting human serum of low molecular weight proteins using a 300 kDa centrifugal filter thus enhances the accuracy of IgA quantification by ATR-IR spectroscopy. Further evaluation and optimization of this general approach may ultimately lead to routine analysis of a range of high molecular-weight analytical targets that are otherwise unsuitable for IR-based analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
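
    A PLSR calibration of the kind described can be reproduced with scikit-learn; the spectra below are random stand-ins for measured absorbances, and the 9-component setting simply mirrors the retentate model's factor count.

    ```python
    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.normal(size=(196, 800))       # stand-in ATR-IR spectra (absorbances)
    y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.5, size=196)  # "IgA"

    pls = PLSRegression(n_components=9)   # 9 PLS factors, as in the retentate model
    y_cv = cross_val_predict(pls, X, y, cv=5).ravel()   # cross-validated predictions

    rmsep = float(np.sqrt(np.mean((y - y_cv) ** 2)))    # prediction error
    r, _ = pearsonr(y, y_cv)
    print(f"RMSEP = {rmsep:.2f}, Pearson r = {r:.2f}")
    ```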

  16. The influence of biological and technical factors on quantitative analysis of amyloid PET: Points to consider and recommendations for controlling variability in longitudinal data.

    PubMed

    Schmidt, Mark E; Chiao, Ping; Klein, Gregory; Matthews, Dawn; Thurfjell, Lennart; Cole, Patricia E; Margolin, Richard; Landau, Susan; Foster, Norman L; Mason, N Scott; De Santi, Susan; Suhy, Joyce; Koeppe, Robert A; Jagust, William

    2015-09-01

    In vivo imaging of amyloid burden with positron emission tomography (PET) provides a means for studying the pathophysiology of Alzheimer's and related diseases. Measurement of subtle changes in amyloid burden requires quantitative analysis of image data. Reliable quantitative analysis of amyloid PET scans acquired at multiple sites and over time requires rigorous standardization of acquisition protocols, subject management, tracer administration, image quality control, and image processing and analysis methods. We review critical points in the acquisition and analysis of amyloid PET, identify ways in which technical factors can contribute to measurement variability, and suggest methods for mitigating these sources of noise. Improved quantitative accuracy could reduce the sample size necessary to detect intervention effects when amyloid PET is used as a treatment end point and allow more reliable interpretation of change in amyloid burden and its relationship to clinical course. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Global analysis of bacterial transcription factors to predict cellular target processes.

    PubMed

    Doerks, Tobias; Andrade, Miguel A; Lathe, Warren; von Mering, Christian; Bork, Peer

    2004-03-01

    Whole-genome sequences are now available for >100 bacterial species, giving unprecedented power to comparative genomics approaches. We have applied genome-context methods to predict target processes that are regulated by transcription factors (TFs). Of 128 orthologous groups of proteins annotated as TFs, to date, 36 are functionally uncharacterized; in our analysis we predict a probable cellular target process or biochemical pathway for half of these functionally uncharacterized TFs.

  18. A General Procedure to Assess the Internal Structure of a Noncognitive Measure--The Student360 Insight Program (S360) Time Management Scale. Research Report. ETS RR-11-42

    ERIC Educational Resources Information Center

    Ling, Guangming; Rijmen, Frank

    2011-01-01

    The factorial structure of the Time Management (TM) scale of the Student 360: Insight Program (S360) was evaluated based on a national sample. A general procedure with a variety of methods was introduced and implemented, including the computation of descriptive statistics, exploratory factor analysis (EFA), and confirmatory factor analysis (CFA).…

  19. Risk factors for technical failure of endoscopic double self-expandable metallic stent placement by partial stent-in-stent method.

    PubMed

    Kawakubo, Kazumichi; Kawakami, Hiroshi; Toyokawa, Yoshihide; Otani, Koichi; Kuwatani, Masaki; Abe, Yoko; Kawahata, Shuhei; Kubo, Kimitoshi; Kubota, Yoshimasa; Sakamoto, Naoya

    2015-01-01

    Endoscopic double self-expandable metallic stent (SEMS) placement by the partial stent-in-stent (PSIS) method has been reported to be useful for the management of unresectable hilar malignant biliary obstruction. However, it is technically challenging, and the optimal SEMS for the procedure remains unknown. The aim of this study was to identify the risk factors for technical failure of endoscopic double SEMS placement for unresectable malignant hilar biliary obstruction (MHBO). Between December 2009 and May 2013, 50 consecutive patients with MHBO underwent endoscopic double SEMS placement by the PSIS method. We retrospectively evaluated the rate of successful double SEMS placement and identified the risk factors for technical failure. The technical success rate for double SEMS placement was 82.0% (95% confidence interval [CI]: 69.2-90.2). On univariate analysis, the rate of technical failure was high in patients with metastatic disease and unilateral placement. Multivariate analysis revealed that metastatic disease was a significant risk factor for technical failure (odds ratio: 9.63, 95% CI: 1.11-105.5). The subgroup analysis after double guidewire insertion showed that the rate of technical success was higher in the laser-cut type SEMS with a large mesh and thick delivery system than in the braided type SEMS with a small mesh and thick delivery system. Metastatic disease was a significant risk factor for technical failure of double SEMS placement for unresectable MHBO. The laser-cut type SEMS with a large mesh and thin delivery system might be preferable for the PSIS procedure. © 2014 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  20. Candida Species From Eye Infections: Drug Susceptibility, Virulence Factors, and Molecular Characterization.

    PubMed

    Ranjith, Konduri; Sontam, Bhavani; Sharma, Savitri; Joseph, Joveeta; Chathoth, Kanchana N; Sama, Kalyana C; Murthy, Somasheila I; Shivaji, Sisinthy

    2017-08-01

    To determine the type of Candida species in ocular infections and to investigate the relationship of the antifungal susceptibility profile to virulence factors. Fifty isolates of yeast-like fungi from patients with keratitis, endophthalmitis, and orbital cellulitis were identified by the Vitek-2 compact system and by DNA sequencing of the ITS1-5.8S-ITS2 regions of the rRNA gene, followed by phylogenetic analysis, for phenotypic and genotypic identification, respectively. Minimum inhibitory concentrations of six antifungal drugs were determined by E-test/microbroth dilution methods. Phenotypic and genotypic methods were used to determine the virulence factors. Phylogenetic analysis showed the clustering of all isolates into eight distinct groups, with a major cluster formed by Candida parapsilosis (n = 21), which was the most common species by both Vitek 2 and DNA sequencing. Using the χ2 test, no significant difference was noted between the techniques, except that Vitek 2 did not identify C. viswanathii, C. orthopsilosis, and two non-Candida genera. Of the 43 Candida isolates tested, high susceptibility to amphotericin B (39/43, 90.6%) and natamycin (43/43, 100%) was noted. While none of the isolates produced coagulase, all produced esterase and catalase. The potential to form biofilm was detected in 23/43 (53.4%) isolates. Distribution of virulence factors by heat map analysis showed differences in the metabolic activity of biofilm producers and non-biofilm producers. C. parapsilosis, identified by both Vitek 2 and DNA sequencing, was the most common species associated with eye infections. Irrespective of the virulence factors elaborated, the Candida isolates were susceptible to commonly used antifungal drugs such as amphotericin B and natamycin.

  1. Factor analysis of the Zung self-rating depression scale in a large sample of patients with major depressive disorder in primary care

    PubMed Central

    Romera, Irene; Delgado-Cohen, Helena; Perez, Teresa; Caballero, Luis; Gilaberte, Immaculada

    2008-01-01

    Background The aim of this study was to examine the symptomatic dimensions of depression in a large sample of patients with major depressive disorder (MDD) in the primary care (PC) setting by means of a factor analysis of the Zung self-rating depression scale (ZSDS). Methods A factor analysis was performed, based on the polychoric correlations matrix, between ZSDS items using promax oblique rotation in 1049 PC patients with a diagnosis of MDD (DSM-IV). Results A clinical interpretable four-factor solution consisting of a core depressive factor (I); a cognitive factor (II); an anxiety factor (III) and a somatic factor (IV) was extracted. These factors accounted for 36.9% of the variance on the ZSDS. The 4-factor structure was validated and high coefficients of congruence were obtained (0.98, 0.95, 0.92 and 0.87 for factors I, II, III and IV, respectively). The model seemed to fit the data well with fit indexes within recommended ranges (GFI = 0.9330, AGFI = 0.9112 and RMR = 0.0843). Conclusion Our findings suggest that depressive symptoms in patients with MDD in the PC setting cluster into four dimensions: core depressive, cognitive, anxiety and somatic, by means of a factor analysis of the ZSDS. Further research is needed to identify possible diagnostic, therapeutic or prognostic implications of the different depressive symptomatic profiles. PMID:18194524
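
    The coefficients of congruence reported for the validation step are Tucker congruence coefficients; computing one is a single line, shown here with made-up loading vectors rather than the published ones.

    ```python
    import numpy as np

    def tucker_congruence(a, b):
        """Tucker's coefficient of congruence between two loading vectors;
        values near 1 indicate the same factor recovered in both samples."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        return float(a @ b / np.sqrt((a @ a) * (b @ b)))

    # Illustrative split-half loadings for one factor (not the study's values)
    print(tucker_congruence([0.71, 0.66, 0.58, 0.49],
                            [0.69, 0.70, 0.55, 0.52]))
    ```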

  2. Impacts of Tourism in Ubud Bali Indonesia: a community-based tourism perspective

    NASA Astrophysics Data System (ADS)

    Ernawati, N. M.; Sudarmini, N. M.; Sukmawati, N. M. R.

    2018-01-01

    The impact of tourism is vital to assess in order to measure the results of development and maximize the benefits gained from tourism, and academics are encouraged to conduct research in this field. This study aims to identify the impact of tourism in the Ubud tourist destination, Bali, Indonesia. It is a quantitative study using a survey method, with factor analysis and frequency and mean analyses as analytical tools. The impact of tourism is assessed with an impact measurement instrument developed by Koster and Randall. The study used a sample of 170 respondents consisting of teenagers, the productive-age population, and senior citizens of Ubud. The mean analysis shows that the overall impact score of tourism in Ubud is 1.9, which indicates that residents agree that the impact of tourism in Ubud is positive. Factor analysis classified the impacts of tourism according to the positive or negative influences inflicted on society. Of the four factors extracted, Factor 1 indicates the areas of the most obvious positive impact, while Factor 4 contains the issues on which community members disagree that tourism affects Ubud positively. It is expected that this analysis of tourism impacts in Ubud can be used as an input by tourism stakeholders in planning future tourism in the Ubud tourist destination, and in anticipating and mitigating the undesirable impacts that may occur, in order to maximise the positive results from tourism.

  3. Factor structure of the Halstead-Reitan Neuropsychological Battery for children: a brief report supplement.

    PubMed

    Ross, Sylvia An; Allen, Daniel N; Goldstein, Gerald

    2014-01-01

    The Halstead-Reitan Neuropsychological Battery (HRNB) is the first factor-analyzed neuropsychological battery and consists of three batteries for young children, older children, and adults. Halstead's original factor analysis extracted four factors from the adult version of the battery, which were the basis for his theory of biological intelligence. These factors were called Central Integrative Field, Abstraction, Power, and Directional. Since this original analysis, Reitan's additions to the battery, and the development of the child versions of the test, this factor-analytic research continued. An introduction and the adult literature are reviewed in Ross, Allen, and Goldstein ( in press ). In this supplemental article, factor-analytic studies of the HRNB with children are reviewed. It is concluded that factor analysis of the HRNB or Reitan-Indiana Neuropsychological Battery with children does not replicate the extensiveness of the adult literature, although there is some evidence that when the traditional battery for older children is used, the factor structure is similar to what is found in adult studies. Reitan's changes to the battery appear to have added factors including language and sensory-perceptual factors. When other tests and scoring methods are used in addition to the core battery, differing solutions are produced.

  4. Analysis of factors that influencing the interest of Bali State Polytechnic’s students in entrepreneurship

    NASA Astrophysics Data System (ADS)

    Ayuni, N. W. D.; Sari, I. G. A. M. K. K.

    2018-01-01

    High unemployment hampers economic growth. To address this situation, the government is trying to change students' mindset from that of job seeker to that of job creator, or entrepreneur. One concrete action regularly held at Bali State Polytechnic is the Student Entrepreneurial Program. The purpose of this research is to identify and analyze the factors that influence the interest of Bali State Polytechnic's students in entrepreneurship, especially in the Student Entrepreneurial Program. The method used in this research is factor analysis, including the Bartlett test, the Kaiser-Meyer-Olkin (KMO) measure, the Measure of Sampling Adequacy (MSA), factor extraction using principal component analysis (PCA), factor selection using eigenvalues and the scree plot, and factor rotation using orthogonal varimax rotation. Results show that four factors influence the interest of Bali State Polytechnic's students in entrepreneurship: a contextual factor (including entrepreneurship training, academic support, perceived confidence, and economic challenge), a self-efficacy factor (including leadership, mental maturity, relation with entrepreneurs, and authority), a subjective norm factor (including support of important relatives, support of friends, and family role), and an attitude factor (including self-realization).
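
    The whole pipeline (Bartlett's test, KMO/MSA, principal-component extraction, eigenvalue/scree inspection, varimax rotation) maps directly onto the factor_analyzer package, assuming the survey items sit in a CSV of numeric responses; the file name and the four-factor setting below are placeholders.

    ```python
    import pandas as pd
    from factor_analyzer import FactorAnalyzer
    from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                                 calculate_kmo)

    df = pd.read_csv("entrepreneurship_survey.csv")   # hypothetical item file

    chi2, p = calculate_bartlett_sphericity(df)   # H0: correlation matrix = I
    kmo_items, kmo_total = calculate_kmo(df)      # MSA per item, overall KMO
    print(f"Bartlett p = {p:.4g}, overall KMO = {kmo_total:.2f}")

    fa = FactorAnalyzer(n_factors=4, rotation="varimax", method="principal")
    fa.fit(df)
    eigenvalues, _ = fa.get_eigenvalues()         # for the eigenvalue/scree rule
    print(eigenvalues[:6].round(2))
    print(fa.loadings_.round(2))                  # rotated loading matrix
    ```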

  5. Analysis of multi lobe journal bearings with surface roughness using finite difference method

    NASA Astrophysics Data System (ADS)

    PhaniRaja Kumar, K.; Bhaskar, SUdaya; Manzoor Hussain, M.

    2018-04-01

    Multi-lobe journal bearings are used at high operating speeds and high loads in machines. In this paper, symmetrical multi-lobe journal bearings are analyzed to find the effect of surface roughness during nonlinear loading. Using the fourth-order Runge-Kutta method, a time-transient analysis was performed to calculate and plot the journal centre trajectories. The flow factor method is used to evaluate the roughness, and the finite difference method (FDM) is used to predict the pressure distribution over the bearing surface. The transient analysis is performed on the multi-lobe journal bearings for three different surface roughness orientations. Longitudinal surface roughness is more effective when compared with isotropic and transverse surface roughness.

  6. Double temporal sparsity based accelerated reconstruction of compressively sensed resting-state fMRI.

    PubMed

    Aggarwal, Priya; Gupta, Anubha

    2017-12-01

    A number of reconstruction methods have been proposed recently for accelerated functional magnetic resonance imaging (fMRI) data collection. However, existing methods suffer from greater artifacts at high acceleration factors. This paper addresses the issue of accelerating fMRI collection via undersampled k-space measurements combined with a proposed method based on l1-l1 norm constraints, wherein we impose the first l1-norm sparsity on the voxel time series (temporal data) in the transformed domain and the second l1-norm sparsity on the successive differences of the same temporal data. Hence, we name the proposed method the Double Temporal Sparsity based Reconstruction (DTSR) method. The robustness of the proposed DTSR method has been thoroughly evaluated both at the subject level and at the group level on real fMRI data. Results are presented at various acceleration factors. Quantitative analysis in terms of peak signal-to-noise ratio (PSNR) and other metrics, and qualitative analysis in terms of the reproducibility of brain resting-state networks (RSNs), demonstrate that the proposed method is accurate and robust. In addition, the proposed DTSR method preserves brain networks that are important for studying fMRI data. Compared to existing methods, the DTSR method shows promising potential, with an improvement of 10-12 dB in PSNR at acceleration factors up to 3.5 on resting-state fMRI data. Simulation results on real data demonstrate that the DTSR method can be used to acquire accelerated fMRI with accurate detection of RSNs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. An Analysis of Effects of Variable Factors on Weapon Performance

    DTIC Science & Technology

    1993-03-01

    ALTERNATIVE ANALYSIS A. CATEGORICAL DATA ANALYSIS Statistical methodology for categorical data analysis traces its roots to the work of Francis Galton in the...choice of statistical tests. This thesis examines an analysis performed by Surface Warfare Development Group (SWDG). The SWDG analysis is shown to be...incorrect due to the misapplication of testing methods. A corrected analysis is presented and recommendations suggested for changes to the testing

  8. A Confirmatory Factor Analysis of an Abbreviated Social Support Instrument: The MOS-SSS

    ERIC Educational Resources Information Center

    Gjesfjeld, Christopher D.; Greeno, Catherine G.; Kim, Kevin H.

    2008-01-01

    Objective: Confirm the factor structure of the original 18-item Medical Outcome Study Social Support Survey (MOS-SSS) as well as two abbreviated versions in a sample of mothers with a child in mental health treatment. Method: The factor structure, internal consistency, and concurrent validity of the MOS-SSS were assessed using a convenience sample…

  9. Development and Factor Analysis of an Instrument to Measure Preservice Teachers' Perceptions of Learning Objects

    ERIC Educational Resources Information Center

    Sahin, Sami

    2010-01-01

    The purpose of this study was to develop a questionnaire to measure student teachers' perception of digital learning objects. The participants included 308 voluntary senior students attending courses in a college of education of a public university in Turkey. The items were extracted to their related factors by the principal axis factoring method.…

  10. Analysis of Factors that Affect the Teacher Certification Exam Results in a University System in Puerto Rico

    ERIC Educational Resources Information Center

    Garofalo, Jorge H.

    2009-01-01

    The purpose of this study was to analyze the factors that affect a teacher preparation exam results within a University System in Puerto Rico. Using Bertalanffy's System Theory as theoretical framework, this mixed methods study examined factors in the university system that could have affected student's preparation for a teacher exam (PCMAS by its…

  11. Classification and identification of molecules through factor analysis method based on terahertz spectroscopy

    NASA Astrophysics Data System (ADS)

    Huang, Jianglou; Liu, Jinsong; Wang, Kejia; Yang, Zhengang; Liu, Xiaming

    2018-06-01

    By means of a factor analysis approach, a method of molecule classification is built based on the measured terahertz absorption spectra of the molecules. A data matrix is obtained by sampling the absorption spectra at different frequency points. The data matrix is then decomposed into the product of two matrices: a weight matrix and a characteristic matrix. By applying K-means clustering to the weight matrix, the molecules can be classified. A group of samples (spirobenzopyran, indole, styrene derivatives, and inorganic salts) was prepared and measured with a terahertz time-domain spectrometer. These samples were classified with 75% accuracy relative to the classification given directly by their molecular formulas.
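
    The decompose-then-cluster recipe can be sketched with a factor decomposition followed by K-means on the weight matrix; the two synthetic spectral shapes below stand in for measured THz absorption spectra.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(4)
    freq = np.linspace(0.0, 3.0, 200)             # THz sampling points
    bases = (np.sin(freq), np.cos(freq))          # two stand-in spectral shapes
    A = np.vstack([b + rng.normal(scale=0.05, size=freq.size)
                   for b in bases for _ in range(10)])   # 20 "samples"

    fa = FactorAnalysis(n_components=3).fit(A)    # data matrix = weights x factors
    W = fa.transform(A)                           # weight matrix (samples x factors)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(W)
    print(labels)                                 # recovered sample classes
    ```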

  12. Comparison of mixed-mode stress-intensity factors obtained through displacement correlation, J-integral formulation, and modified crack-closure integral

    NASA Astrophysics Data System (ADS)

    Bittencourt, Tulio N.; Barry, Ahmabou; Ingraffea, Anthony R.

    This paper presents a comparison among stress-intensity factors for mixed-mode two-dimensional problems obtained through three different approaches: displacement correlation, J-integral, and modified crack-closure integral. All mentioned procedures involve only one analysis step and are incorporated in the post-processor page of a finite element computer code for fracture mechanics analysis (FRANC). Results are presented for a closed-form solution problem under mixed-mode conditions. The accuracy of these described methods then is discussed and analyzed in the framework of their numerical results. The influence of the differences among the three methods on the predicted crack trajectory of general problems is also discussed.

  13. Application of multi response optimization with grey relational analysis and fuzzy logic method

    NASA Astrophysics Data System (ADS)

    Winarni, Sri; Wahyu Indratno, Sapto

    2018-01-01

    Multi-response optimization is an optimization process that considers multiple responses simultaneously. The purpose of this research is to obtain the optimum point in a multi-response optimization process using grey relational analysis and the fuzzy logic method. The optimum point is determined from the Fuzzy-GRG (Grey Relational Grade) variable, which is a conversion of the signal-to-noise ratios of the responses involved. The case study used in this research is the optimization of electrical process parameters in electrical discharge machining. It was found that the combination of treatments resulting in optimum material removal rate (MRR) and surface roughness (SR) was a gap voltage of 70 V, a peak current of 9 A, and a duty factor of 0.8.
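
    The grey relational part of the recipe is short: normalize the (larger-the-better) S/N ratios of the responses, measure each trial's deviation from the ideal sequence, convert to grey relational coefficients, and average them into a grade. The S/N numbers below are invented for illustration; the fuzzy conversion step is omitted.

    ```python
    import numpy as np

    def grey_relational_grade(sn, rho=0.5):
        """GRG from an (experiments x responses) matrix of S/N ratios, which
        are larger-the-better by construction after the Taguchi transform."""
        sn = np.asarray(sn, float)
        x = (sn - sn.min(0)) / (sn.max(0) - sn.min(0))   # grey relational generating
        delta = 1.0 - x                                  # deviation from the ideal
        xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
        return xi.mean(axis=1)                           # equal response weights

    # Invented S/N ratios for (MRR, SR) over four trial settings
    grg = grey_relational_grade([[30.1, -12.0], [28.4, -10.5],
                                 [31.2, -13.1], [29.0, -11.2]])
    print(grg.round(3), "-> best trial:", int(grg.argmax()))
    ```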

  14. Lunar Regolith Particle Shape Analysis

    NASA Technical Reports Server (NTRS)

    Kiekhaefer, Rebecca; Hardy, Sandra; Rickman, Douglas; Edmunson, Jennifer

    2013-01-01

    Future engineering of structures and equipment on the lunar surface requires significant understanding of particle characteristics of the lunar regolith. Nearly all sediment characteristics are influenced by particle shape; therefore a method of quantifying particle shape is useful both in lunar and terrestrial applications. We have created a method to quantify particle shape, specifically for lunar regolith, using image processing. Photomicrographs of thin sections of lunar core material were obtained under reflected light. Three photomicrographs were analyzed using ImageJ and MATLAB. From the image analysis measurements for area, perimeter, Feret diameter, orthogonal Feret diameter, Heywood factor, aspect ratio, sieve diameter, and sieve number were recorded. Probability distribution functions were created from the measurements of Heywood factor and aspect ratio.
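
    Outside ImageJ/MATLAB, the same shape measurements follow directly from labelled-region properties, for example with scikit-image; the synthetic elliptical "particles" below stand in for segmented photomicrographs of regolith grains.

    ```python
    import numpy as np
    from skimage import measure

    # Synthetic binary mask of elliptical particles (stand-in for a thresholded
    # photomicrograph)
    rr, cc = np.ogrid[:200, :200]
    mask = np.zeros((200, 200), dtype=bool)
    for y, x, a, b in [(50, 60, 20, 12), (140, 120, 15, 15), (60, 150, 25, 8)]:
        mask |= ((rr - y) / a) ** 2 + ((cc - x) / b) ** 2 <= 1.0

    for p in measure.regionprops(measure.label(mask)):
        heywood = p.perimeter / (2.0 * np.sqrt(np.pi * p.area))  # 1.0 for a circle
        aspect = p.major_axis_length / p.minor_axis_length
        print(f"area={p.area:.0f}  Heywood={heywood:.2f}  aspect={aspect:.2f}")
    ```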

  15. Evaluation of the Spiritual Well-Being Scale in a Sample of Korean Adults.

    PubMed

    You, Sukkyung; Yoo, Ji Eun

    2016-08-01

    This study explored the psychometric qualities and construct validity of the Spiritual Well-Being Scale (SWBS; Ellison in J Psychol Theol 11:330-340, 1983) using a sample of 470 Korean adults. Two factor analyses, exploratory factor analysis and confirmatory factor analysis, were conducted in order to test the validity of the SWBS. The results of the factor analyses supported the original two-dimensional structure of the SWBS, religious well-being (RWB) and existential well-being (EWB), with method effects associated with negatively worded items. Controlling for method effects allowed the two-factor structure of the SWBS to be confirmed with clarity. Further, the differential pattern and magnitude of correlations between the SWB subscales and the religious and psychological variables suggested that the two factors of the SWBS were valid for the Protestant, Catholic, and religiously unaffiliated groups, but not for Buddhists. The Protestant group scored higher on RWB than the Buddhist, Catholic, and unaffiliated groups, and higher on EWB than the unaffiliated groups. Future studies may need to include more Buddhist samples to gain solid evidence for the validity of the SWBS in a non-Western religious tradition.

  16. Factors Influencing Cecal Intubation Time during Retrograde Approach Single-Balloon Enteroscopy

    PubMed Central

    Chen, Peng-Jen; Shih, Yu-Lueng; Huang, Hsin-Hung; Hsieh, Tsai-Yuan

    2014-01-01

    Background and Aim. The predisposing factors for prolonged cecal intubation time (CIT) during colonoscopy have been well identified. However, the factors influencing CIT during retrograde single-balloon enteroscopy (SBE) have not been addressed. The aim of this study was to determine the factors influencing CIT during retrograde SBE. Methods. We investigated patients who underwent retrograde SBE at a medical center from January 2011 to March 2014. The medical charts and SBE reports were reviewed, and the patients' characteristics and procedure-associated data were recorded. These data were analyzed with univariate analysis as well as multivariate logistic regression analysis to identify the possible predisposing factors. Results. We enrolled 66 patients in this study. The median CIT was 17.4 minutes. On univariate analysis, there was no statistical difference in age, sex, BMI, or history of abdominal surgery; the exception was bowel preparation (P = 0.021). Multivariate logistic regression analysis showed that inadequate bowel preparation (odds ratio 30.2, 95% confidence interval 4.63–196.54; P < 0.001) was the independent predisposing factor for prolonged CIT during retrograde SBE. Conclusions. For experienced endoscopists, inadequate bowel preparation was the independent predisposing factor for prolonged CIT during retrograde SBE. PMID:25505904

  17. Population-Based Questionnaire Survey on Health Effects of Aircraft Noise on Residents Living around U.S. Airfields in the RYUKYUS—PART II: AN Analysis of the Discriminant Score and the Factor Score

    NASA Astrophysics Data System (ADS)

    HIRAMATSU, K.; MATSUI, T.; MIYAKITA, T.; ITO, A.; TOKUYAMA, T.; OSADA, Y.; YAMAMOTO, T.

    2002-02-01

    Discriminant function values of psychosomatics and neurosis are calculated using the 12 scale scores of the Todai Health Index, a general health questionnaire, obtained in the survey done around the Kadena and Futenma U.S. airfields in Okinawa, Japan. The total number of answers available for the analysis is 6301. Factor analysis is applied to the 12 scale scores by means of the principal factor method, and Oblimin rotation is done because the factors extracted are considered likely to correlate with each other to a greater or lesser extent. The logistic regression analysis is made with the independent variables of discriminant function (DF) values and factor scores and with the dependent variables of Ldn, age (six levels), sex, occupation (four categories) and the interaction of age and sex. Results indicate that the odds ratio of the DF values regarding psychosomatic disorder and of the score of somatic factor have clear dose-response relationship. The odds ratios of the DF value of neurosis and of the score of the mental factor increase in the area where noise exposure is very intense.

  18. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  19. Efficient multitasking of Choleski matrix factorization on CRAY supercomputers

    NASA Technical Reports Server (NTRS)

    Overman, Andrea L.; Poole, Eugene L.

    1991-01-01

    A Choleski method is described and used to solve linear systems of equations that arise in large scale structural analysis. The method uses a novel variable-band storage scheme and is structured to exploit fast local memory caches while minimizing data access delays between main memory and vector registers. Several parallel implementations of this method are described for the CRAY-2 and CRAY Y-MP computers demonstrating the use of microtasking and autotasking directives. A portable parallel language, FORCE, is used for comparison with the microtasked and autotasked implementations. Results are presented comparing the matrix factorization times for three representative structural analysis problems from runs made in both dedicated and multi-user modes on both computers. CPU and wall clock timings are given for the parallel implementations and are compared to single processor timings of the same algorithm.
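
    For reference, a dense textbook Cholesky factorization is easy to state; the paper's actual contribution (variable-band storage and multitasked execution on CRAY hardware) is not reproduced in this sketch.

    ```python
    # Minimal dense Cholesky factorization A = L @ L.T (A symmetric positive definite).
    import numpy as np

    def cholesky(A):
        n = A.shape[0]
        L = np.zeros_like(A, dtype=float)
        for j in range(n):
            # Diagonal entry: subtract squared terms of row j already computed.
            L[j, j] = np.sqrt(A[j, j] - L[j, :j] @ L[j, :j])
            # Column j below the diagonal.
            for i in range(j + 1, n):
                L[i, j] = (A[i, j] - L[i, :j] @ L[j, :j]) / L[j, j]
        return L

    A = np.array([[4.0, 2.0], [2.0, 3.0]])
    L = cholesky(A)
    assert np.allclose(L @ L.T, A)
    ```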

  20. Evaluation of the comprehensive palatability of Japanese sake paired with dishes by multiple regression analysis based on subdomains.

    PubMed

    Nakamura, Ryo; Nakano, Kumiko; Tamura, Hiroyasu; Mizunuma, Masaki; Fushiki, Tohru; Hirata, Dai

    2017-08-01

    Many factors contribute to palatability. In order to evaluate the palatability of Japanese sake paired with certain dishes by integrating multiple factors, we applied an evaluation method previously reported for the palatability of cheese: multiple regression analysis based on 3 subdomain factors (rewarding, cultural, and informational). We asked 94 Japanese participants to evaluate the palatability of sake (1st evaluation/E1 for the first cup; 2nd/E2 and 3rd/E3 for the palatability with the aftertaste/afterglow of certain dishes) and to respond to a questionnaire related to the 3 subdomains. In E1, 3 factors were extracted by a factor analysis, and the subsequent multiple regression analyses indicated that the palatability of sake was explained mainly by the rewarding factor. Further, the results of attribution dissections in E1 indicated that 2 factors (rewarding and informational) contributed to the palatability. Finally, our results indicated that the palatability of sake was influenced by the dish eaten just before drinking.

  1. Bioinformatics Identification of Modules of Transcription Factor Binding Sites in Alzheimer's Disease-Related Genes by In Silico Promoter Analysis and Microarrays

    PubMed Central

    Augustin, Regina; Lichtenthaler, Stefan F.; Greeff, Michael; Hansen, Jens; Wurst, Wolfgang; Trümbach, Dietrich

    2011-01-01

    The molecular mechanisms and genetic risk factors underlying Alzheimer's disease (AD) pathogenesis are only partly understood. To identify new factors, which may contribute to AD, different approaches are taken including proteomics, genetics, and functional genomics. Here, we used a bioinformatics approach and found that distinct AD-related genes share modules of transcription factor binding sites, suggesting a transcriptional coregulation. To detect additional coregulated genes, which may potentially contribute to AD, we established a new bioinformatics workflow with known multivariate methods like support vector machines, biclustering, and predicted transcription factor binding site modules by using in silico analysis and over 400 expression arrays from human and mouse. Two significant modules are composed of three transcription factor families: CTCF, SP1F, and EGRF/ZBPF, which are conserved between human and mouse APP promoter sequences. The specific combination of in silico promoter and multivariate analysis can identify regulation mechanisms of genes involved in multifactorial diseases. PMID:21559189

  2. Exploratory and Confirmatory Factor Analysis of the Career Decision-Making Difficulties Questionnaire

    PubMed Central

    Farrokhi, Farahman; Mahdavi, Ali; Moradi, Samad

    2012-01-01

    Objective The present study aimed at validating the structure of the Career Decision-making Difficulties Questionnaire (CDDQ). Methods Five hundred and eleven undergraduate students took part in this research; 63 males and 200 females took part in the first study, and 63 males and 185 females completed the survey for the second study. Results The results of exploratory factor analysis (EFA) indicated strong support for the three-factor structure, consisting of lack of readiness, lack of information, and inconsistent information factors. A confirmatory factor analysis was run with the second sample using structural equation modeling. As expected, the three-factor solution provided a better fit to the data than the alternative models. Conclusion The CDDQ was recommended for use with college students in this study because the instrument measures all three aspects of the model. Future research is needed to learn whether this model would fit other samples. PMID:22952549

  3. Novel Method of Production Decline Analysis

    NASA Astrophysics Data System (ADS)

    Xie, Shan; Lan, Yifei; He, Lei; Jiao, Yang; Wu, Yong

    2018-02-01

    ARPS decline curves are the most commonly used in oil and gas fields due to their minimal data requirements and ease of application. Prediction of production decline based on ARPS analysis relies on a known decline type. However, when the decline exponents under different decline types are very close, it is difficult to recognize the decline trend of matched curves directly. Given these difficulties, a new dynamic decline prediction model using multiple linear regression of influencing factors is introduced, based on the simulation results of multi-factor response experiments. First, interaction experimental schemes are designed according to a study of the factors affecting production decline. Based on the simulated results, the annual decline rate is predicted by the decline model. The new method is then applied to gas field A of the Ordos Basin as an example to illustrate its reliability. The results confirm that the new model can predict the decline tendency directly, without needing to recognize the decline type, and that it is also highly accurate. Finally, the new method improves the evaluation of gas well production decline in low-permeability gas reservoirs and provides technical support for further understanding of tight gas field development behavior.

  4. Recollection, not familiarity, decreases in healthy aging: Converging evidence from four estimation methods

    PubMed Central

    Koen, Joshua D.; Yonelinas, Andrew P.

    2014-01-01

    Although it is generally accepted that aging is associated with recollection impairments, there is considerable disagreement surrounding how healthy aging influences familiarity-based recognition. One factor that might contribute to the mixed findings regarding age differences in familiarity is the estimation method used to quantify the two mnemonic processes. Here, this issue is examined by having a group of older adults (N = 39) between 40 and 81 years of age complete Remember/Know (RK), receiver operating characteristic (ROC), and process dissociation (PD) recognition tests. Estimates of recollection, but not familiarity, showed a significant negative correlation with chronological age. Inconsistent with previous findings, the estimation method did not moderate the relationship between age and estimates of recollection and familiarity. In a final analysis, recollection and familiarity were estimated as latent factors in a confirmatory factor analysis (CFA) that modeled the covariance between measures of free recall and recognition, and the results converged with the results from the RK, PD, and ROC tasks. These results are consistent with the hypothesis that episodic memory declines in older adults are primarily driven by recollection deficits, and also suggest that the estimation method plays little to no role in age-related decreases in familiarity. PMID:25485974

  5. A Qualitative Study on Organizational Factors Affecting Occupational Accidents.

    PubMed

    Eskandari, Davood; Jafari, Mohammad Javad; Mehrabi, Yadollah; Kian, Mostafa Pouya; Charkhand, Hossein; Mirghotbi, Mostafa

    2017-03-01

    Technical, human, operational, and organizational factors influence the sequence of events in occupational accidents. Among them, organizational factors play a major role in causing occupational accidents. The aim of this research was to understand Iranian safety experts' experiences and perceptions of organizational factors. This qualitative study was conducted in 2015 using the content analysis technique. Data were collected through semi-structured interviews with 17 safety experts working in Iranian universities and industries and analyzed with a conventional qualitative content analysis method using the MAXQDA software. Eleven organizational-factor sub-themes were identified: management commitment, management participation, employee involvement, communication, blame culture, education and training, job satisfaction, interpersonal relationship, supervision, continuous improvement, and reward system. The participants considered these factors to influence occupational accidents. The mentioned 11 organizational factors are probably involved in occupational accidents in Iran. Naturally, improving organizational factors can increase safety performance and reduce occupational accidents.

  6. Methods of estimating or accounting for neighborhood associations with health using complex survey data.

    PubMed

    Brumback, Babette A; Cai, Zhuangyu; Dailey, Amy B

    2014-05-15

    Reasons for health disparities may include neighborhood-level factors, such as availability of health services, social norms, and environmental determinants, as well as individual-level factors. Investigating health inequalities using nationally or locally representative data often requires an approach that can accommodate a complex sampling design, in which individuals have unequal probabilities of selection into the study. The goal of the present article is to review and compare methods of estimating or accounting for neighborhood influences with complex survey data. We considered 3 types of methods, each generalized for use with complex survey data: ordinary regression, conditional likelihood regression, and generalized linear mixed-model regression. The relative strengths and weaknesses of each method differ from one study to another; we provide an overview of the advantages and disadvantages of each method theoretically, in terms of the nature of the estimable associations and the plausibility of the assumptions required for validity, and also practically, via a simulation study and 2 epidemiologic data analyses. The first analysis addresses determinants of repeat mammography screening use using data from the 2005 National Health Interview Survey. The second analysis addresses disparities in preventive oral health care using data from the 2008 Florida Behavioral Risk Factor Surveillance System Survey.
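
    As a rough illustration of the first of the three approaches (ordinary regression adapted to a complex design), one can fit a weighted logistic model. A sketch assuming statsmodels, with hypothetical file and variable names; note that freq_weights is only a crude stand-in for full design-based variance estimation, which also requires the survey's strata and cluster information.

    ```python
    # Weighted logistic regression as a first pass at complex-survey analysis.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("survey.csv")  # hypothetical survey extract
    model = smf.glm(
        "screened ~ income + neighborhood_svc",  # hypothetical outcome/covariates
        data=df,
        family=sm.families.Binomial(),
        freq_weights=df["wt"],  # sampling weights; SEs here are approximate
    ).fit()
    print(model.summary())
    ```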

  7. Finite element analysis of contributing factors to the horizontal splitting cracks in concrete crossties pretensioned with seven-wire strands.

    DOT National Transportation Integrated Search

    2017-04-04

    This paper employs the finite element (FE) modeling method to investigate the contributing factors to the horizontal splitting cracks observed in the upper strand plane in some concrete crossties made with seven-wire strands. The concrete...

  8. Overview of mycotoxin methods, present status and future needs.

    PubMed

    Gilbert, J

    1999-01-01

    This article reviews current requirements for the analysis for mycotoxins in foods and identifies legislative as well as other factors that are driving development and validation of new methods. New regulatory limits for mycotoxins and analytical quality assurance requirements for laboratories to only use validated methods are seen as major factors driving developments. Three major classes of methods are identified which serve different purposes and can be categorized as screening, official and research. In each case the present status and future needs are assessed. In addition to an overview of trends in analytical methods, some other areas of analytical quality assurance such as participation in proficiency testing and reference materials are identified.

  9. Incompressible boundary-layer stability analysis of LFC experimental data for sub-critical Mach numbers. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Berry, S. A.

    1986-01-01

    An incompressible boundary-layer stability analysis of Laminar Flow Control (LFC) experimental data was completed and the results are presented. This analysis was undertaken for three reasons: to study laminar boundary-layer stability on a modern swept LFC airfoil; to calculate incompressible design limits of linear stability theory as applied to a modern airfoil at high subsonic speeds; and to verify the use of linear stability theory as a design tool. The experimental data were taken from the slotted LFC experiment recently completed in the NASA Langley 8-Foot Transonic Pressure Tunnel. Linear stability theory was applied and the results were compared with transition data to arrive at correlated n-factors. Results of the analysis showed that for the configuration and cases studied, Tollmien-Schlichting (TS) amplification was the dominating disturbance influencing transition. For these cases, incompressible linear stability theory correlated with an n-factor for TS waves of approximately 10 at transition. The n-factor method correlated rather consistently to this value despite a number of non-ideal conditions which indicates the method is useful as a design tool for advanced laminar flow airfoils.

  10. The Development of the Problematic Online Gaming Questionnaire (POGQ)

    PubMed Central

    Demetrovics, Zsolt; Urbán, Róbert; Nagygyörgy, Katalin; Farkas, Judit; Griffiths, Mark D.; Pápay, Orsolya; Kökönyei, Gyöngyi; Felvinczi, Katalin; Oláh, Attila

    2012-01-01

    Background Online gaming has become increasingly popular. However, this has led to concerns that these games might induce serious problems and/or lead to dependence for a minority of players. Aim: The aim of this study was to uncover and operationalize the components of problematic online gaming. Methods A total of 3415 gamers (90% males; mean age 21 years), were recruited through online gaming websites. A combined method of exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) was applied. Latent profile analysis was applied to identify persons at-risk. Results EFA revealed a six-factor structure in the background of problematic online gaming that was also confirmed by a CFA. For the assessment of the identified six dimensions – preoccupation, overuse, immersion, social isolation, interpersonal conflicts, and withdrawal – the 18-item Problematic Online Gaming Questionnaire (POGQ) proved to be exceedingly suitable. Based on the latent profile analysis, 3.4% of the gamer population was considered to be at high risk, while another 15.2% was moderately problematic. Conclusions The POGQ seems to be an adequate measurement tool for the differentiated assessment of gaming related problems on six subscales. PMID:22590541

  11. Life-table methods for detecting age-risk factor interactions in long-term follow-up studies.

    PubMed

    Logue, E E; Wing, S

    1986-01-01

    Methodological investigation has suggested that age-risk factor interactions should be more evident in age of experience life tables than in follow-up time tables due to the mixing of ages of experience over follow-up time in groups defined by age at initial examination. To illustrate the two approaches, age modification of the effect of total cholesterol on ischemic heart disease mortality in two long-term follow-up studies was investigated. Follow-up time life table analysis of 116 deaths over 20 years in one study was more consistent with a uniform relative risk due to cholesterol, while age of experience life table analysis was more consistent with a monotonic negative age interaction. In a second follow-up study (160 deaths over 24 years), there was no evidence of a monotonic negative age-cholesterol interaction by either method. It was concluded that age-specific life table analysis should be used when age-risk factor interactions are considered, but that both approaches yield almost identical results in absence of age interaction. The identification of the more appropriate life-table analysis should be ultimately guided by the nature of the age or time phenomena of scientific interest.
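
    A modern route to the same age-risk factor interaction question is a Cox proportional hazards model with an explicit interaction term, rather than stratified life tables. A hedged sketch using the third-party lifelines package, with hypothetical column names.

    ```python
    # Testing an age x cholesterol interaction in a Cox model (sketch).
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("followup.csv")  # hypothetical cohort data
    df["age_x_chol"] = df["age"] * df["chol"]  # interaction term

    cph = CoxPHFitter()
    cph.fit(df[["time", "died", "age", "chol", "age_x_chol"]],
            duration_col="time", event_col="died")
    # A significant interaction coefficient indicates age modification
    # of the cholesterol effect.
    cph.print_summary()
    ```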

  12. Boundary conditions for the solution of compressible Navier-Stokes equations by an implicit factored method

    NASA Technical Reports Server (NTRS)

    Shih, T. I.-P.; Smith, G. E.; Springer, G. S.; Rimon, Y.

    1983-01-01

    A method is presented for formulating the boundary conditions in implicit finite-difference form needed for obtaining solutions to the compressible Navier-Stokes equations by the Beam and Warming implicit factored method. The usefulness of the method was demonstrated (a) by establishing the boundary conditions applicable to the analysis of the flow inside an axisymmetric piston-cylinder configuration and (b) by calculating velocities and mass fractions inside the cylinder for different geometries and different operating conditions. Stability, selection of time step and grid sizes, and computer time requirements are discussed in reference to the piston-cylinder problem analyzed.

  13. [Success factors in hospital management].

    PubMed

    Heberer, M

    1998-12-01

    The hospital environment of most Western countries is currently undergoing dramatic changes. Competition among hospitals is increasing, and economic issues have become decisive factors for the allocation of medical care. Hospitals therefore require management tools to respond to these changes adequately. The balanced scorecard is a method of enabling development and implementation of a business strategy that equally respects the financial requirements, the needs of the customers, process development, and organizational learning. This method was used to derive generally valid success factors for hospital management based on an analysis of an academic hospital in Switzerland. Strategic management, the focus of medical services, customer orientation, and integration of professional groups across the hospital value chain were identified as success factors for hospital management.

  14. Direct immersion single drop micro-extraction method for multi-class pesticides analysis in mango using GC-MS.

    PubMed

    Pano-Farias, Norma S; Ceballos-Magaña, Silvia G; Muñiz-Valencia, Roberto; Jurado, Jose M; Alcázar, Ángela; Aguayo-Villarreal, Ismael A

    2017-12-15

    Due to the negative effects of pesticides on the environment and human health, more efficient and environmentally friendly analytical methods are needed. In this sense, a simple, fast, economical direct-immersion single drop micro-extraction (SDME) method, free from memory effects, combined with GC-MS was developed for multi-class pesticide determination in mango samples. Sample pre-treatment using ultrasound-assisted solvent extraction and factors affecting the SDME procedure (extractant solvent, drop volume, stirring rate, ionic strength, time, pH and temperature) were optimized using a factorial experimental design. The method presented high sensitivity (LOD: 0.14-169.20 μg/kg), acceptable precision (RSD: 0.7-19.1%), satisfactory recovery (69-119%) and high enrichment factors (20-722). Several of the obtained LOQs are below the MRLs established by the European Commission; therefore, the method could be applied for pesticide determination in routine analysis and customs laboratories. Moreover, the method has been shown to be suitable for determination of some of the studied pesticides in lime, melon, papaya, banana, tomato, and lettuce. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Factor analysis and multiple regression between topography and precipitation on Jeju Island, Korea

    NASA Astrophysics Data System (ADS)

    Um, Myoung-Jin; Yun, Hyeseon; Jeong, Chang-Sam; Heo, Jun-Haeng

    2011-11-01

    In this study, new factors that influence precipitation were extracted from geographic variables using factor analysis, allowing for an accurate estimation of orographic precipitation. Correlation analysis was also used to examine the relationship between nine topographic variables from digital elevation models (DEMs) and the precipitation on Jeju Island. In addition, a spatial analysis was performed in order to verify the validity of the regression model. From the results of the correlation analysis, it was found that all of the topographic variables had a positive correlation with the precipitation. The relations between the variables also changed in accordance with a change in the precipitation duration. However, upon examining the correlation matrix, no significant relationship between the latitude and the aspect was found. According to the factor analysis, eight topographic variables (latitude being the exception) were found to have a direct influence on the precipitation. Three factors were then extracted from the eight topographic variables. By directly comparing the multiple regression model with the factors (model 1) to the multiple regression model with the topographic variables (model 3), it was found that model 1 did not violate the limits of statistical significance and multicollinearity. As such, model 1 was considered appropriate for estimating the precipitation when taking the topography into account. Overall, the multiple regression model using factor analysis was found to be the best method for estimating the orographic precipitation on Jeju Island.
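
    The two-step procedure of this abstract (factor extraction, then regression on factor scores) can be sketched as follows. This assumes the third-party factor_analyzer package and hypothetical file and variable names, not the authors' actual data or software.

    ```python
    # Factor extraction followed by regression on factor scores (sketch).
    import pandas as pd
    import statsmodels.api as sm
    from factor_analyzer import FactorAnalyzer

    topo = pd.read_csv("topography.csv")      # hypothetical topographic variables
    precip = pd.read_csv("precip.csv")["p"]   # precipitation at the same sites

    fa = FactorAnalyzer(n_factors=3, rotation="varimax")
    fa.fit(topo)
    scores = fa.transform(topo)               # factor scores, one column per factor

    # Regressing on (near-)orthogonal factor scores avoids the multicollinearity
    # that the raw correlated variables would introduce.
    ols = sm.OLS(precip, sm.add_constant(scores)).fit()
    print(ols.summary())
    ```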

  16. Design, processing and testing of LSI arrays: Hybrid microelectronics task

    NASA Technical Reports Server (NTRS)

    Himmel, R. P.; Stuhlbarg, S. M.; Ravetti, R. G.; Zulueta, P. J.

    1979-01-01

    Mathematical cost factors were generated for both hybrid microcircuit and printed wiring board packaging methods. A mathematical cost model was created for analysis of microcircuit fabrication costs. The costing factors were refined and reduced to formulae for computerization. Efficient methods were investigated for low cost packaging of LSI devices as a function of density and reliability. Technical problem areas such as wafer bumping, inner/outer lead bonding, testing on tape, and tape processing were investigated.

  17. DoIT Right: Measuring Effectiveness of Different eConsultation Designs

    NASA Astrophysics Data System (ADS)

    Grönlund, Åke; Åström, Joachim

    eConsultations have been used in many countries over many years, yet most research in the field consists of case descriptions, and there is so far little systematic evidence on the effectiveness of consultations as a tool for enhancing democracy. Using a case survey method, we investigate what factors make a consultation succeed or fail based on data from 57 cases reported in the literature. Success is measured as high participation, a deliberative mode of discussion, and impact on policy. We test three hypotheses from the literature claiming, respectively, that institutional design, democratic intent, and quality of research are the most important factors behind the reported success. We find support for all three hypotheses. Using consultation at the analysis/decision-making stage, mixing online and offline methods, and active strategic recruiting are institutional factors that contribute positively. Democratic intent and content analysis research both have a positive influence.

  18. Feasibility of commercial space manufacturing, production of pharmaceuticals. Volume 2: Technical analysis

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A technical analysis of the feasibility of commercial manufacturing of pharmaceuticals in space is presented. The method of obtaining pharmaceutical company involvement, laboratory results of the separation of serum proteins by the continuous flow electrophoresis process, the selection and study of candidate products, and their production requirements are described. The candidate products are antihemophilic factor, beta cells, erythropoietin, epidermal growth factor, alpha-1-antitrypsin, and interferon. Production mass balances for antihemophilic factor, beta cells, and erythropoietin were compared for space versus ground operation. A conceptual description of a multiproduct processing system for space operation is discussed. Production requirements for epidermal growth factor, alpha-1-antitrypsin, and interferon are presented.

  19. Probabilistic Analysis of a Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

    An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10^-11 due to the conservative nature of the factors of safety on the deterministic loads.
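
    Of the three probability-estimation methods named, Monte Carlo simulation is the simplest to illustrate. A toy sketch with stand-in distributions and a stand-in failure index, not the NESC model; note that crude Monte Carlo cannot resolve probabilities near 10^-11, which is why FORM and conditional sampling were also used.

    ```python
    # Crude Monte Carlo estimate of a probability of failure (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    load_scale = rng.normal(1.0, 0.1, n)   # assumed CoV on the load scale factor
    strength = rng.normal(2.0, 0.15, n)    # assumed strength allowable

    failure_index = load_scale / strength   # failure when index >= 1
    p_fail = np.mean(failure_index >= 1.0)
    print(f"estimated P(failure) = {p_fail:.2e}")
    ```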

  20. Analysis of Urinary Calculi Using Infrared Spectroscopic Imaging

    NASA Astrophysics Data System (ADS)

    Sablinskas, Valdas; Lesciute, Daiva; Hendrixson, Vaiva

    2009-06-01

    Kidney stone disease is a cosmopolitan disease, occurring in both industrialized and developing countries and mainly affecting adults aged 20-60 years. The formation of kidney stones is a process that involves many factors. Its primary and contributing pathogenic factors are genetic, nutritional and environmental, but also include personal habits. Information about the chemical structure of kidney stones is of great importance to the treatment of kidney diseases. The usefulness of such information was first recognized in the early 1950s. Analysis of urinary stones by various chemical methods, polarization microscopy, x-ray diffraction, porosity determination, solid phase NMR, and thermoanalytical procedures has been widely used. Unfortunately, no single method is sufficient to provide all the clinically useful information about the structure and composition of the stones. Infrared spectroscopy can be considered a relatively new method of kidney stone analysis. It allows identification of any organic or inorganic molecular constituents of kidney stones. So far this method had never been used to collect information about kidney stone component patterns in Lithuania. Since no epidemiological studies have been performed in this field, the medical treatment of kidney stone disease is empirical and often ineffective in hospitals around the country. The aim of this paper is to present results of the analysis of kidney stones extracted from local patients using FTIR spectroscopic microscopy.

  1. High-order fuzzy time-series based on multi-period adaptation model for forecasting stock markets

    NASA Astrophysics Data System (ADS)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Teoh, Hia-Jong

    2008-02-01

    Stock investors usually make their short-term investment decisions according to recent stock information such as late market news, technical analysis reports, and price fluctuations. To reflect these short-term factors which impact stock price, this paper proposes a comprehensive fuzzy time-series model, which factors both linear relationships between recent periods of stock prices and fuzzy logical relationships (nonlinear relationships) mined from the time-series into its forecasting processes. In the empirical analysis, the TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) and HSI (Hang Seng Index) are employed as experimental datasets, and four recent fuzzy time-series models, Chen's (1996), Yu's (2005), Cheng's (2006) and Chen's (2007), are used as comparison models. In addition, for comparison with a conventional statistical method, the method of least squares is utilized to estimate auto-regressive models for the testing periods within the datasets. The performance comparisons indicate that the multi-period adaptation model proposed in this paper can effectively improve the forecasting performance of conventional fuzzy time-series models, which factor only fuzzy logical relationships into their forecasting processes. From the empirical study, both the traditional statistical method and the proposed model reveal that stock price patterns in the Taiwan and Hong Kong stock markets are short-term.

  2. Modelling runway incursion severity.

    PubMed

    Wilke, Sabine; Majumdar, Arnab; Ochieng, Washington Y

    2015-06-01

    Analysis of the causes underlying runway incursions is fundamental for the development of effective mitigation measures. However, there are significant weaknesses in the current methods to model these factors. This paper proposes a structured framework for modelling causal factors and their relationship to severity, which includes a description of the airport surface system architecture, establishment of terminological definitions, the determination and collection of appropriate data, the analysis of occurrences for severity and causes, and the execution of a statistical analysis framework. It is implemented in the context of U.S. airports, enabling the identification of a number of priority interventions, including the need for better investigation and causal factor capture, recommendations for airfield design, operating scenarios and technologies, and better training for human operators in the system. The framework is recommended for the analysis of runway incursions to support safety improvements and the methodology is transferable to other areas of aviation safety risk analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Use of Language Sample Analysis by School-Based SLPs: Results of a Nationwide Survey

    ERIC Educational Resources Information Center

    Pavelko, Stacey L.; Owens, Robert E., Jr.; Ireland, Marie; Hahs-Vaughn, Debbie L.

    2016-01-01

    Purpose: This article examines use of language sample analysis (LSA) by school-based speech-language pathologists (SLPs), including characteristics of language samples, methods of transcription and analysis, barriers to LSA use, and factors affecting LSA use, such as American Speech-Language-Hearing Association certification, number of years'…

  4. Use of Latent Profile Analysis in Studies of Gifted Students

    ERIC Educational Resources Information Center

    Mammadov, Sakhavat; Ward, Thomas J.; Cross, Jennifer Riedl; Cross, Tracy L.

    2016-01-01

    To date, in gifted education and related fields various conventional factor analytic and clustering techniques have been used extensively for investigation of the underlying structure of data. Latent profile analysis is a relatively new method in the field. In this article, we provide an introduction to latent profile analysis for gifted education…

  5. Instream-Flow Analysis for the Luquillo Experimental Forest, Puerto Rico: Methods and Analysis

    Treesearch

    F.N. Scatena; S.L. Johnson

    2001-01-01

    This study develops two habitat-based approaches for evaluating instream-flow requirements within the Luquillo Experimental Forest in northeastern Puerto Rico. The analysis is restricted to instream-flow requirements in upland streams dominated by the common communities of freshwater decapods. In headwater streams, pool volume was the most consistent factor...

  6. Accuracy of the Parallel Analysis Procedure with Polychoric Correlations

    ERIC Educational Resources Information Center

    Cho, Sun-Joo; Li, Feiming; Bandalos, Deborah

    2009-01-01

    The purpose of this study was to investigate the application of the parallel analysis (PA) method for choosing the number of factors in component analysis for situations in which data are dichotomous or ordinal. Although polychoric correlations are sometimes used as input for component analyses, the random data matrices generated for use in PA…
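
    For orientation, Horn's parallel analysis on ordinary Pearson correlations can be written compactly; a sketch in Python. The study above replaces the Pearson matrix with polychoric correlations, which would require a polychoric estimator not shown here.

    ```python
    # Horn's parallel analysis: retain components whose eigenvalues exceed
    # the 95th percentile of eigenvalues from random data of the same shape.
    import numpy as np

    def parallel_analysis(X, n_sims=100, seed=0):
        rng = np.random.default_rng(seed)
        n, p = X.shape
        real_eigs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
        rand_eigs = np.empty((n_sims, p))
        for s in range(n_sims):
            R = rng.standard_normal((n, p))
            rand_eigs[s] = np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]
        threshold = np.percentile(rand_eigs, 95, axis=0)
        return int(np.sum(real_eigs > threshold))

    X = np.random.default_rng(1).standard_normal((300, 10))  # stand-in data
    print("factors to retain:", parallel_analysis(X))
    ```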

  7. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... nature of the rail system, each carrier must select and document the analysis method/model used and identify the routes to be analyzed. D. The safety and security risk analysis must consider current data and... curvature; 7. Presence or absence of signals and train control systems along the route (“dark” versus...

  8. Four points function fitted and first derivative procedure for determining the end points in potentiometric titration curves: statistical analysis and method comparison.

    PubMed

    Kholeif, S A

    2001-06-01

    A new method belonging to the differential category for determining end points from potentiometric titration curves is presented. It uses a preprocessing step to find first-derivative values by fitting four data points in and around the region of inflection to a non-linear function, and then locates the end point, usually as a maximum or minimum, using an inverse parabolic interpolation procedure that has an analytical solution. The behavior and accuracy of the sigmoid and cumulative non-linear functions used are investigated against three factors. A statistical evaluation of the new method using linear least-squares method validation and multifactor data analysis is covered. The new method applies generally to symmetrical and unsymmetrical potentiometric titration curves, and the end point is calculated using numerical procedures only. It outperforms the "parent" regular differential method at almost all factor levels and gives accurate results comparable to the true or estimated true end points. Calculated end points from selected experimental titration curves are also compared between the new method and equivalence-point methods such as Gran's or Fortuin's.
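
    The end-point location step, finding the vertex of a parabola through three derivative points, indeed has a closed-form solution. A sketch with illustrative numbers (not the paper's data):

    ```python
    # Inverse parabolic interpolation: analytical vertex of the parabola
    # through three points (x0,y0), (x1,y1), (x2,y2).
    def parabola_vertex(x0, y0, x1, y1, x2, y2):
        num = (x1 - x0) ** 2 * (y1 - y2) - (x1 - x2) ** 2 * (y1 - y0)
        den = (x1 - x0) * (y1 - y2) - (x1 - x2) * (y1 - y0)
        return x1 - 0.5 * num / den

    # First-derivative values of a titration curve near the inflection
    # (illustrative numbers only):
    v = [9.8, 10.0, 10.2]   # titrant volume, mL
    dpH = [2.1, 3.5, 2.4]   # dpH/dV estimates
    end_point = parabola_vertex(v[0], dpH[0], v[1], dpH[1], v[2], dpH[2])
    print(f"end point at {end_point:.3f} mL")
    ```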

  9. Towards improved migraine management: Determining potential trigger factors in individual patients.

    PubMed

    Peris, Francesc; Donoghue, Stephen; Torres, Ferran; Mian, Alec; Wöber, Christian

    2017-04-01

    Background Certain chronic diseases such as migraine result in episodic, debilitating attacks for which neither cause nor timing is well understood. Historically, possible triggers were identified through analysis of aggregated data from populations of patients. However, triggers common in populations may not be wholly responsible for an individual's attacks. To explore this hypothesis we developed a method to identify individual 'potential trigger' profiles and analysed the degree of inter-individual variation. Methods We applied N = 1 statistical analysis to a 326-migraine-patient database from a study in which patients used paper-based diaries for 90 days to track 33 factors (potential triggers or premonitory symptoms) associated with their migraine attacks. For each patient, univariate associations between factors and migraine events were analysed using Cox proportional hazards models. Results We generated individual factor-attack association profiles for 87% of the patients. The average number of factors associated with attacks was four per patient: Factor profiles were highly individual and were unique in 85% of patients with at least one identified association. Conclusion Accurate identification of individual factor-attack profiles is a prerequisite for testing which are true triggers and for development of trigger avoidance or desensitisation strategies. Our methodology represents a necessary development toward this goal.

  10. A Study on Regional Frequency Analysis using Artificial Neural Network - the Sumjin River Basin

    NASA Astrophysics Data System (ADS)

    Jeong, C.; Ahn, J.; Ahn, H.; Heo, J. H.

    2017-12-01

    Regional frequency analysis compensates for the main shortcoming of at-site frequency analysis, a lack of sample size, through the regional concept. Regional rainfall quantiles depend on the identification of hydrologically homogeneous regions; hence regional classification based on the assumption of hydrological homogeneity is very important. For regional clustering of rainfall, multidimensional variables and factors related to geographical features and meteorological characteristics are considered, such as mean annual precipitation, number of days with precipitation in a year, and average maximum daily precipitation in a month. The Self-Organizing Feature Map (SOM) method, an artificial neural network algorithm among the unsupervised learning techniques, solves N-dimensional, nonlinear problems and presents results simply as a data visualization technique. In this study, for the Sumjin river basin in South Korea, cluster analysis was performed based on the SOM method using high-dimensional geographical features and meteorological factors as input data. Then, to evaluate the homogeneity of the resulting regions, the L-moment-based discordancy and heterogeneity measures were used. Rainfall quantiles were estimated with the index flood method, a standard regional rainfall frequency analysis technique. The clustering results from the SOM method and the consequent variation in rainfall quantiles were analyzed. This research was supported by a grant (2017-MPSS31-001) from the Supporting Technology Development Program for Disaster Management funded by the Ministry of Public Safety and Security (MPSS) of the Korean government.
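
    A minimal version of SOM-based clustering of sites can be sketched with the third-party MiniSom package (not necessarily the authors' implementation); the feature matrix below is a random stand-in for standardized site attributes.

    ```python
    # SOM clustering of rain-gauge sites by geographic/meteorological features.
    import numpy as np
    from minisom import MiniSom

    # rows = sites; columns = e.g. mean annual precipitation, wet days per
    # year, max daily precipitation (standardized) -- stand-in values here
    data = np.random.default_rng(0).standard_normal((40, 3))

    som = MiniSom(2, 2, input_len=3, sigma=0.8, learning_rate=0.5, random_seed=0)
    som.train_random(data, 1000)

    # Each site is assigned to the map node (cluster) of its winning neuron.
    clusters = [som.winner(x) for x in data]
    print(clusters[:5])
    ```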

  11. [Research on the reliability and validity of postural workload assessment method and the relation to work-related musculoskeletal disorders of workers].

    PubMed

    Qin, D L; Jin, X N; Wang, S J; Wang, J J; Mamat, N; Wang, F J; Wang, Y; Shen, Z A; Sheng, L G; Forsman, M; Yang, L Y; Wang, S; Zhang, Z B; He, L H

    2018-06-18

    To develop a new method for comprehensively assessing postural workload by analyzing the dynamic and static postural workload of workers during their work process, to examine its reliability and validity, and to study the relation between workers' postural workload and work-related musculoskeletal disorders (WMSDs). In the study, 844 workers from electronic and railway vehicle manufacturing factories were selected as subjects and investigated using the China Musculoskeletal Questionnaire (CMQ) to form the comprehensive postural workload assessment method. Cronbach's α, cluster analysis, and factor analysis were used to assess the reliability and validity of the new method. Unconditional logistic regression was used to analyze the relation between workers' postural workload and WMSDs. Reliability of the assessment method: internal consistency analysis showed a Cronbach's α of 0.934; split-half reliability analysis gave a Spearman-Brown coefficient of 0.881, and the correlation coefficient between the first and second halves was 0.787. Validity of the assessment method: cluster analysis indicated that the squared Euclidean distance between the dynamic and static postural workload assessments of the same body part or work posture was the shortest. Factor analysis extracted 2 components with a cumulative percentage of variance of 65.604%. The postural workload scores of different occupational groups showed significant differences (P<0.05) by covariance analysis. Unconditional logistic regression indicated that alcohol intake (OR=2.141, 95%CI 1.337-3.428) and obesity (OR=3.408, 95%CI 1.629-7.130) were risk factors for WMSDs, and that the risk of WMSDs rose as workers' postural workload rose (OR=1.035, 95%CI 1.022-1.048). The risk of WMSDs differed significantly among groups of workers distinguished by work type, gender, and age: female workers exhibited a higher prevalence of WMSDs (OR=2.626, 95%CI 1.414-4.879), as did workers 30-40 years of age compared with those under 30 (OR=1.909, 95%CI 1.237-2.946). This method for comprehensively assessing postural workload is reliable and effective when used with assembly workers, and there is a clear relation between postural workload and WMSDs.
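
    The two reliability statistics reported here (Cronbach's α and Spearman-Brown split-half reliability) are straightforward to compute; a sketch over a hypothetical respondents-by-items score matrix.

    ```python
    # Cronbach's alpha and split-half (Spearman-Brown) reliability.
    import numpy as np

    def cronbach_alpha(items):
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    def split_half_spearman_brown(items):
        half1 = items[:, ::2].sum(axis=1)    # odd-numbered items
        half2 = items[:, 1::2].sum(axis=1)   # even-numbered items
        r = np.corrcoef(half1, half2)[0, 1]
        return 2 * r / (1 + r)               # Spearman-Brown correction

    # Stand-in data: 844 respondents x 12 items scored 1-5.
    items = np.random.default_rng(2).integers(1, 6, size=(844, 12)).astype(float)
    print(cronbach_alpha(items), split_half_spearman_brown(items))
    ```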

  12. Evaluation of a Partial Genome Screening of Two Asthma Susceptibility Regions Using Bayesian Network Based Bayesian Multilevel Analysis of Relevance

    PubMed Central

    Antal, Péter; Kiszel, Petra Sz.; Gézsi, András; Hadadi, Éva; Virág, Viktor; Hajós, Gergely; Millinghoffer, András; Nagy, Adrienne; Kiss, András; Semsei, Ágnes F.; Temesi, Gergely; Melegh, Béla; Kisfali, Péter; Széll, Márta; Bikov, András; Gálffy, Gabriella; Tamási, Lilla; Falus, András; Szalai, Csaba

    2012-01-01

    Genetic studies indicate a high number of potential factors related to asthma. Based on earlier linkage analyses we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). This method uses Bayesian network representation to provide detailed characterization of the relevance of factors, such as joint significance, the type of dependency, and multi-target aspects. We estimated posteriors for these relations within the Bayesian statistical framework, in order to determine whether a variable is directly relevant or whether its association is only mediated. With frequentist methods one SNP (rs3751464 in the FRMD6 gene) provided evidence for an association with asthma (OR = 1.43(1.2–1.8); p = 3×10−4). The possible role of the FRMD6 gene in asthma was also confirmed in an animal model and human asthmatics. In the BN-BMLA analysis altogether 5 SNPs in 4 genes were found relevant in connection with asthma phenotype: PRPF19 on chromosome 11, and FRMD6, PTGER2 and PTGDR on chromosome 14. In a subsequent step a partial dataset containing rhinitis and further clinical parameters was used, which allowed the analysis of the relevance of SNPs for asthma and multiple targets. These analyses suggested that SNPs in the AHNAK and MS4A2 genes were indirectly associated with asthma. This paper indicates that BN-BMLA explores the relevant factors more comprehensively than traditional statistical methods and extends the scope of strong relevance based methods to include partial relevance, global characterization of relevance and multi-target relevance. PMID:22432035

  13. Three Factors Are Critical in Order to Synthesize Intelligible Noise-Vocoded Japanese Speech

    PubMed Central

    Kishida, Takuya; Nakajima, Yoshitaka; Ueda, Kazuo; Remijn, Gerard B.

    2016-01-01

    Factor analysis (principal component analysis followed by varimax rotation) had shown that 3 common factors appear across 20 critical-band power fluctuations derived from spoken sentences of eight different languages [Ueda et al. (2010). Fechner Day 2010, Padua]. The present study investigated the contributions of such power-fluctuation factors to speech intelligibility. The method of factor analysis was modified to obtain factors suitable for resynthesizing speech sounds as 20-critical-band noise-vocoded speech. The resynthesized speech sounds were used for an intelligibility test. The modification of factor analysis ensured that the resynthesized speech sounds were not accompanied by a steady background noise caused by the data reduction procedure. Spoken sentences of British English, Japanese, and Mandarin Chinese were subjected to this modified analysis. Confirming the earlier analysis, indeed 3–4 factors were common to these languages. The number of power-fluctuation factors needed to make noise-vocoded speech intelligible was then examined. Critical-band power fluctuations of the Japanese spoken sentences were resynthesized from the obtained factors, resulting in noise-vocoded-speech stimuli, and the intelligibility of these speech stimuli was tested by 12 native Japanese speakers. Japanese mora (syllable-like phonological unit) identification performances were measured when the number of factors was 1–9. Statistically significant improvement in intelligibility was observed when the number of factors was increased stepwise up to 6. The 12 listeners identified 92.1% of the morae correctly on average in the 6-factor condition. The intelligibility improved sharply when the number of factors changed from 2 to 3. In this step, the cumulative contribution ratio of factors improved only by 10.6%, from 37.3 to 47.9%, but the average mora identification leaped from 6.9 to 69.2%. The results indicated that, if the number of factors is 3 or more, elementary linguistic information is preserved in such noise-vocoded speech. PMID:27199790
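
    The varimax step mentioned above is a standard SVD-based iteration; a reference sketch follows (the authors' modified factor-analysis variant is not reproduced).

    ```python
    # Standard varimax rotation of a loading matrix (SVD-based algorithm).
    import numpy as np

    def varimax(loadings, gamma=1.0, max_iter=50, tol=1e-6):
        p, k = loadings.shape
        R = np.eye(k)
        d = 0.0
        for _ in range(max_iter):
            L = loadings @ R
            u, s, vt = np.linalg.svd(
                loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag(np.diag(L.T @ L)))
            )
            R = u @ vt
            d_old, d = d, s.sum()
            if d_old != 0 and d / d_old < 1 + tol:  # convergence of the criterion
                break
        return loadings @ R  # rotated loading matrix

    # Stand-in loadings: 20 critical bands x 3 factors.
    loadings = np.random.default_rng(3).standard_normal((20, 3))
    print(varimax(loadings).shape)
    ```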

  14. Logistic regression analysis of factors associated with avascular necrosis of the femoral head following femoral neck fractures in middle-aged and elderly patients.

    PubMed

    Ai, Zi-Sheng; Gao, You-Shui; Sun, Yuan; Liu, Yue; Zhang, Chang-Qing; Jiang, Cheng-Hua

    2013-03-01

    Risk factors for femoral neck fracture-induced avascular necrosis of the femoral head have not been clearly elucidated in middle-aged and elderly patients. Moreover, the high incidence of screw removal in China and its effect on the fate of the involved femoral head require statistical methods that reflect their intrinsic relationship. Ninety-nine patients older than 45 years with femoral neck fracture were treated by internal fixation between May 1999 and April 2004. Descriptive analysis, interaction analysis between associated factors, single-factor logistic regression, multivariate logistic regression, and detailed interaction analysis were employed to explore potential relationships among associated factors. Avascular necrosis of the femoral head was found in 15 cases (15.2%). Age × the status of implants (removal vs. maintenance) and gender × the timing of reduction were interactive according to two-factor interaction analysis. Age, the displacement of fractures, the quality of reduction, and the status of implants were significant factors in single-factor logistic regression analysis. Age, age × the status of implants, and the quality of reduction were significant factors in multivariate logistic regression analysis. In the detailed interaction analysis after multivariate logistic regression, implant removal was the most important risk factor for avascular necrosis in 56-to-85-year-old patients, with a risk ratio of 26.00 (95% CI = 3.076-219.747). Middle-aged and elderly patients have a low incidence of avascular necrosis of the femoral head following femoral neck fractures treated with cannulated screws. The removal of cannulated screws can induce a significantly higher incidence of avascular necrosis of the femoral head in elderly patients, while a high-quality reduction helps reduce avascular necrosis.
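
    The age × implant-status interaction highlighted here is the kind of term a regression formula interface expresses directly; a sketch using statsmodels, with hypothetical column names.

    ```python
    # Logistic regression with an explicit age x implant-removal interaction.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("femoral_neck.csv")  # hypothetical patient data
    # `age * removed` expands to age + removed + age:removed
    m = smf.logit("avn ~ age * removed + reduction_quality + displacement",
                  data=df).fit(disp=0)
    print(m.summary())
    ```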

  15. Elimination of chromatographic and mass spectrometric problems in GC-MS analysis of Lavender essential oil by multivariate curve resolution techniques: Improving the peak purity assessment by variable size moving window-evolving factor analysis.

    PubMed

    Jalali-Heravi, Mehdi; Moazeni-Pourasil, Roudabeh Sadat; Sereshti, Hassan

    2015-03-01

    In the analysis of complex natural matrices by gas chromatography-mass spectrometry (GC-MS), many disturbing factors such as baseline drift, spectral background, homoscedastic and heteroscedastic noise, peak shape deformation (non-Gaussian peaks), low S/N ratio and co-elution (overlapped and/or embedded peaks) must be handled to save time, money and experimental effort. This study aimed to improve the GC-MS analysis of complex natural matrices utilizing multivariate curve resolution (MCR) methods. In addition, to assess the peak purity of the two-dimensional data, a method called variable size moving window-evolving factor analysis (VSMW-EFA) is introduced and examined. The proposed methodology was applied to the GC-MS analysis of Iranian Lavender essential oil, which extended the number of identified constituents from 56 to 143 components. It was found that the most abundant constituents of the Iranian Lavender essential oil are α-pinene (16.51%), camphor (10.20%), 1,8-cineole (9.50%), bornyl acetate (8.11%) and camphene (6.50%). This indicates that the Iranian type Lavender contains a relatively high percentage of α-pinene. Comparison of different types of Lavender essential oils showed compositional similarity between Iranian and Italian (Sardinia Island) Lavenders. Published by Elsevier B.V.

  16. Mapping brain activity in gradient-echo functional MRI using principal component analysis

    NASA Astrophysics Data System (ADS)

    Khosla, Deepak; Singh, Manbir; Don, Manuel

    1997-05-01

    The detection of sites of brain activation in functional MRI has been a topic of immense research interest, and many techniques have been proposed to this end. Recently, principal component analysis (PCA) has been applied to extract the activated regions and their time course of activation. This method is based on the assumption that the activation is orthogonal to other signal variations such as brain motion, physiological oscillations and other uncorrelated noises. A distinct advantage of this method is that it does not require any knowledge of the time course of the true stimulus paradigm. This technique is well suited to EPI image sequences where the sampling rate is high enough to capture the effects of physiological oscillations. In this work, we propose and apply two methods that are based on PCA to conventional gradient-echo images and investigate their usefulness as tools to extract reliable information on brain activation. The first method is a conventional technique in which a single image sequence with alternating on and off stages is subjected to a principal component analysis. The second method is a PCA-based approach called the common spatial factor analysis technique (CSF). As the name suggests, this method relies on common spatial factors between the above fMRI image sequence and a background fMRI. We have applied these methods to identify active brain areas during visual stimulation and motor tasks. The results from these methods are compared to those obtained by using the standard cross-correlation technique. We found good agreement in the areas identified as active across all three techniques. The results suggest that PCA and CSF methods have good potential in detecting the true stimulus-correlated changes in the presence of other interfering signals.
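
    The first method described, PCA on a single on/off sequence with post hoc comparison to the paradigm, can be sketched as follows; shapes and data are illustrative stand-ins.

    ```python
    # PCA on an fMRI run; component time courses inspected against the paradigm.
    import numpy as np
    from sklearn.decomposition import PCA

    n_timepoints, n_voxels = 80, 5000
    data = np.random.default_rng(4).standard_normal((n_timepoints, n_voxels))
    paradigm = np.tile([0] * 10 + [1] * 10, 4)   # alternating off/on blocks

    pca = PCA(n_components=10)
    timecourses = pca.fit_transform(data)        # component time courses
    maps = pca.components_                       # spatial maps (components x voxels)

    # Components whose time course correlates with the paradigm are candidate
    # activation patterns; note the paradigm itself is not used in the fit.
    corr = [np.corrcoef(timecourses[:, i], paradigm)[0, 1] for i in range(10)]
    best = int(np.argmax(np.abs(corr)))
    print("component", best, "correlation", corr[best])
    ```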

  17. Individual Factors Predicting Mental Health Court Diversion Outcome

    ERIC Educational Resources Information Center

    Verhaaff, Ashley; Scott, Hannah

    2015-01-01

    Objective: This study examined which individual factors predict mental health court diversion outcome among a sample of persons with mental illness participating in a postcharge diversion program. Method: The study employed secondary analysis of existing program records for 419 persons with mental illness in a court diversion program. Results:…

  18. Olympic Education as a Factor of Socialization of Preschoolers

    ERIC Educational Resources Information Center

    Varfolomeeva, Zoya S.; Surinov, Ilya A.

    2016-01-01

    The purpose of this study is theoretical substantiation and experimental confirmation of importance of the Olympic education as a socialization factor of the preschoolers. To address the study issues, theoretical methods of analysis, generalization and systematization as well as personal and activity approaches were applied. The older preschoolers…

  19. Scale Development for Perceived School Climate for Girls' Physical Activity

    ERIC Educational Resources Information Center

    Birnbaum, Amanda S.; Evenson, Kelly R.; Motl, Robert W.; Dishman, Rod K.; Voorhees, Carolyn C.; Sallis, James F.; Elder, John P.; Dowda, Marsha

    2005-01-01

    Objectives: To test an original scale assessing perceived school climate for girls' physical activity in middle school girls. Methods: Confirmatory factor analysis (CFA) and structural equation modeling (SEM). Results: CFA retained 5 of 14 original items. A model with 2 correlated factors, perceptions about teachers' and boys' behaviors,…

  20. Personality Factors and Instructional Methods

    ERIC Educational Resources Information Center

    Hunt, Dennis; Randhawa, Bikkar S.

    The Children's Personality Questionnaire (CPQ) was administered to 23 academically handicapped children (mean IQ, 79) and 35 academically gifted students (mean IQ, 147). The CPQ measures 14 distinct personality factors; data on these variables were analyzed using a 2 x 2 (academic ability x sex) analysis of variance design. A stepwise discriminant…

  1. Psychometric Properties of the Commitment to Physical Activity Scale

    ERIC Educational Resources Information Center

    DeBate, Rita DiGioacchino; Huberty, Jennifer; Pettee, Kelley

    2009-01-01

    Objective: To assess psychometric properties of the Commitment to Physical Activity Scale (CPAS). Methods: Girls in third to fifth grades (n = 932) completed the CPAS before and after a physical activity intervention. Psychometric measures included internal consistency, factor analysis, and concurrent validity. Results: Three CPAS factors emerged:…

  2. Measuring the Impact of Education on Productivity. Working Paper #261.

    ERIC Educational Resources Information Center

    Plant, Mark; Welch, Finis

    A theoretical and conceptual analysis of techniques used to measure education's contribution to productivity is followed by a discussion of the empirical measures implemented by various researchers. Standard methods of growth accounting make sense for simple measurement of factor contributions where outputs are well measured and when factor growth…

  3. Measuring Therapeutic Alliance with Children in Residential Treatment and Therapeutic Day Care

    ERIC Educational Resources Information Center

    Roest, Jesse; van der Helm, Peer; Strijbosch, Eefje; van Brandenburg, Mariëtte; Stams, Geert Jan

    2016-01-01

    Purpose: This study examined the construct validity and reliability of a therapeutic alliance measure (Children's Alliance Questionnaire [CAQ]) for children with psychosocial and/or behavioral problems receiving therapeutic residential care or day care in the Netherlands. Methods: Confirmatory factor analysis of a one-factor model "therapeutic…

  4. RECENT APPLICATIONS OF SOURCE APPORTIONMENT METHODS AND RELATED NEEDS

    EPA Science Inventory

    Traditional receptor modeling studies have utilized factor analysis (like principal component analysis, PCA) and/or Chemical Mass Balance (CMB) to assess source influences. The limitation of these approaches is that PCA is qualitative and CMB requires the input of source pr...

  5. Analysis of a boron-carbide-drum-controlled critical reactor experiment

    NASA Technical Reports Server (NTRS)

    Mayo, W. T.

    1972-01-01

    In order to validate methods and cross sections used in the neutronic design of compact fast-spectrum reactors for generating electric power in space, an analysis of a boron-carbide-drum-controlled critical reactor was made. For this reactor the transport analysis gave generally satisfactory results. The calculated multiplication factor for the most detailed calculation was only 0.7-percent Delta k too high. The calculated reactivity worth of the control drums was $11.61, compared to measurements of $11.58 by the inverse kinetics method and $11.98 by the inverse counting method. Calculated radial and axial power distributions were in good agreement with experiment.

  6. Factor Structure and Item Level Psychometrics of the Social Problem Solving Inventory Revised-Short Form in Traumatic Brain Injury

    PubMed Central

    Li, Chih-Ying; Waid-Ebbs, Julia; Velozo, Craig A.; Heaton, Shelley C.

    2016-01-01

    Primary Objective Social problem solving deficits characterize individuals with traumatic brain injury (TBI). Poor social problem solving interferes with daily functioning and productive lifestyles. Therefore, it is of vital importance to use an appropriate instrument to identify deficits in social problem solving in individuals with TBI. This study investigates the factor structure and item-level psychometrics of the Social Problem Solving Inventory-Revised Short Form (SPSI-R:S) for adults with moderate and severe TBI. Research Design Secondary analysis of 90 adults with moderate and severe TBI who completed the SPSI-R:S. Methods and Procedures An exploratory factor analysis (EFA), principal components analysis (PCA) and Rasch analysis examined the factor structure and item-level psychometrics of the SPSI-R:S. Main Outcomes and Results The EFA showed three dominant factors, with the positively worded items forming the most clearly defined factor; the other two factors were negative problem solving orientation and skills, and negative problem solving emotion. Rasch analyses confirmed that each of the three factors is a unidimensional construct. Conclusions Interpreting the total score of the SPSI-R:S may be problematic because of the multidimensional structure of the measure. Instead, we propose using three separate SPSI-R:S subscores to measure social problem solving in the TBI population. PMID:26052731
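
    As a sketch of the exploratory-factor-analysis step, the Python snippet below fits a three-factor model to simulated 25-item responses, using scikit-learn's maximum-likelihood factor analysis with varimax rotation as a stand-in for the study's EFA/PCA/Rasch workflow; the simulated loading structure is an assumption.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(1)
      n_resp, n_items = 90, 25
      latent = rng.normal(size=(n_resp, 3))            # three latent factors
      loadings = np.zeros((3, n_items))
      for f in range(3):                               # simple structure
          loadings[f, f * 8:(f + 1) * 8] = 0.8
      X = latent @ loadings + rng.normal(0.0, 0.5, (n_resp, n_items))

      fa = FactorAnalysis(n_components=3, rotation="varimax").fit(X)
      # Report items loading >= .40 on each rotated factor, a common cutoff.
      for f, row in enumerate(fa.components_):
          print(f"factor {f}: items {np.where(abs(row) >= 0.40)[0]}")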

  7. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
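
    A toy stress-strength example makes the contrast between the safety-factor approach and a probabilistic one concrete. The Monte Carlo estimate below stands in for the more elaborate FPI and response-surface machinery surveyed in the report; the distributions and parameters are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 1_000_000
      strength = rng.normal(500.0, 40.0, n)    # material strength, MPa
      stress = rng.normal(300.0, 60.0, n)      # applied stress, MPa

      safety_factor = 500.0 / 300.0            # mean strength / mean stress
      p_fail = np.mean(stress >= strength)     # probabilistic failure measure

      print(f"deterministic safety factor: {safety_factor:.2f}")
      print(f"estimated failure probability: {p_fail:.2e}")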

  8. An Investigation of the Overlap Between the Statistical Discrete Gust and the Power Spectral Density Analysis Methods

    NASA Technical Reports Server (NTRS)

    Perry, Boyd, III; Pototzky, Anthony S.; Woods, Jessica A.

    1989-01-01

    The results of a NASA investigation of a claimed overlap between two gust response analysis methods, the Statistical Discrete Gust (SDG) method and the Power Spectral Density (PSD) method, are presented. The claim is that the ratio of an SDG response to the corresponding PSD response is 10.4. Analytical results presented for several different airplanes at several different flight conditions indicate that such an overlap does appear to exist. However, the claim does not hold precisely: a scatter of up to about 10 percent about the 10.4 factor can be expected.

  9. Factors affecting the overcrowding in outpatient healthcare

    PubMed Central

    Bahadori, Mohammadkarim; Teymourzadeh, Ehsan; Ravangard, Ramin; Raadabadi, Mehdi

    2017-01-01

    Background: The expansion of outpatient services and the desire to provide more outpatient care than inpatient care create problems such as overcrowding in outpatient clinics. Given the importance of this problem, this qualitative study aimed to determine the factors influencing overcrowding in the specialty and subspecialty clinic of a teaching hospital. Materials and Methods: This was a qualitative study conducted in the specialty and subspecialty clinic of a hospital using the content analysis method in the period from January to March 2014. The study population was all managers and heads of the outpatient wards. The studied sample consisted of 22 managers of the clinic wards, selected using the purposive sampling method. The required data were collected using semi-structured interviews and analyzed using conventional content analysis and the MAXQDA 10.0 software. Results: Three themes were identified as the main factors affecting overcrowding: internal positive factors, internal negative factors, and external factors. Conclusions: Despite the efforts made to eliminate overcrowding, reduce waiting times and increase access to services for patients, the problem of overcrowding has remained unresolved. The use of strategies such as clarifying the clinic's working processes for staff and patients, clarifying the relationships between the clinic and other wards (especially the emergency department), and using a simple triage system on the patients' arrival at the clinic is recommended. PMID:28546986

  10. Quantitative influence of risk factors on blood glucose level.

    PubMed

    Chen, Songjing; Luo, Senlin; Pan, Limin; Zhang, Tiemei; Han, Longfei; Zhao, Haixiu

    2014-01-01

    The aim of this study is to quantitatively analyze the influence of risk factors on the blood glucose level, and to provide a theoretical basis for understanding the characteristics of blood glucose change and confirming the intervention index for type 2 diabetes. A quantitative method is proposed to analyze the influence of risk factors on blood glucose using a back propagation (BP) neural network. Ten risk factors are screened first. Then the cohort is divided into nine groups by gender and age, and nine BP models are trained according to the minimum error principle. The quantitative influence of different risk factors on blood glucose change is obtained by sensitivity calculation. The experimental results indicate that weight is the leading cause of blood glucose change (0.2449), followed by cholesterol, age and triglyceride. Together these four factors account for 77% of the influence of the nine screened risk factors, and the sensitivity rankings can guide individual intervention. This method can be applied to the quantitative analysis of risk factors for other diseases and could potentially be used by clinical practitioners to identify populations at high risk for type 2 diabetes as well as other diseases.
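
    The general idea can be sketched on synthetic data: train a back-propagation network, then rank each input by a finite-difference sensitivity of the prediction. The factor names, data and perturbation size below are assumptions, and the paper's exact sensitivity calculation may differ.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(3)
      factors = ["weight", "cholesterol", "age", "triglyceride"]
      X = rng.normal(size=(500, len(factors)))
      true_w = np.array([0.9, 0.5, 0.4, 0.3])       # synthetic ground truth
      y = X @ true_w + rng.normal(0.0, 0.1, 500)

      net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000,
                         random_state=0).fit(X, y)

      # Sensitivity: mean absolute change in predicted glucose per unit
      # change in each (standardized) risk factor.
      eps, base = 0.1, net.predict(X)
      for j, name in enumerate(factors):
          Xp = X.copy()
          Xp[:, j] += eps
          s = np.mean(np.abs(net.predict(Xp) - base)) / eps
          print(f"{name}: sensitivity {s:.3f}")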

  11. Ultrasound-assisted extraction for total sulphur measurement in mine tailings.

    PubMed

    Khan, Adnan Hossain; Shang, Julie Q; Alam, Raquibul

    2012-10-15

    A sample preparation method for percentage recovery of total sulphur (%S) in reactive mine tailings, based on ultrasound-assisted digestion (USAD) and inductively coupled plasma-optical emission spectroscopy (ICP-OES), was developed. The influence of various methodological factors was screened by employing a two-level, three-factor (2³) full factorial design and using KZK-1, a sericite schist certified reference material (CRM), to find the optimal combination of the studied factors and %S. Factors such as sonication time, temperature and acid combination were studied; the best result, 20 min of sonication at 80°C with 1 ml of HNO3 : 1 ml of HCl, achieved 100% recovery for the selected CRM. Subsequently, a fraction of the 2³ full factorial design was applied to mine tailings. The percentage relative standard deviation (%RSD) for the ultrasound method is less than 3.0% for the CRM and less than 6% for the mine tailings. The investigated method was verified by X-ray diffraction analysis and compared favorably with existing methods such as the hot-plate-assisted digestion method, X-ray fluorescence and the LECO™-CNS method.
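
    A coded 2³ design and its main-effects analysis take only a few lines; the simulated recoveries below are hypothetical and only illustrate how such a design is analyzed.

      import itertools
      import numpy as np

      # Coded design matrix: -1 = low level, +1 = high level.
      design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
      factors = ["sonication time", "temperature", "acid ratio"]

      # Simulated %S recovery for the eight runs (hypothetical numbers).
      recovery = np.array([82, 88, 85, 93, 84, 91, 89, 100], dtype=float)

      # Main effect: mean response at the high level minus the low level.
      for j, name in enumerate(factors):
          effect = (recovery[design[:, j] == 1].mean()
                    - recovery[design[:, j] == -1].mean())
          print(f"{name}: main effect {effect:+.1f} %S")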

  12. Quantification method analysis of the relationship between occupant injury and environmental factors in traffic accidents.

    PubMed

    Ju, Yong Han; Sohn, So Young

    2011-01-01

    Injury analysis following a vehicle crash is one of the most important research areas. However, most injury analyses have focused on a single one-dimensional injury variable, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), at a time in relation to various traffic accident factors. Such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between categorical injury phenomena, such as injury scale, injury position, and injury type, and various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, airbag deployment, and higher speed are associated with severe, rather than minor, injury to the thigh, ankle, and leg in the form of dislocation, abrasion, or laceration.
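
    Quantification method II is, roughly, a discriminant analysis over categorical predictors. As a stand-in, the sketch below one-hot encodes hypothetical accident factors and fits a linear discriminant model; the categories and data are illustrative, not actual NASS CDS fields.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.preprocessing import OneHotEncoder

      rng = np.random.default_rng(4)
      n = 400
      X_cat = np.column_stack([
          rng.choice(["frontal", "side", "rear"], n),   # collision direction
          rng.choice(["car", "suv", "truck"], n),       # vehicle type
          rng.choice(["belted", "unbelted"], n),        # restraint use
      ])
      # Simulated outcome: unbelted occupants are more often severely injured.
      p_severe = np.where(X_cat[:, 2] == "unbelted", 0.7, 0.3)
      severity = np.where(rng.random(n) < p_severe, "severe", "minor")

      X = OneHotEncoder().fit_transform(X_cat).toarray()
      lda = LinearDiscriminantAnalysis().fit(X, severity)
      print("category weights:", lda.coef_.round(2))
      print("classification accuracy:", lda.score(X, severity))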

  13. Improvement of Quench Factor Analysis in Phase and Hardness Prediction of a Quenched Steel

    NASA Astrophysics Data System (ADS)

    Kianezhad, M.; Sajjadi, S. A.

    2013-05-01

    The accurate prediction of alloy properties produced by heat treatment has long interested researchers. The advantages of such predictions are a reduction in test trials and material consumption, as well as savings in time and energy. One of the most important methods for predicting the hardness of quenched steel parts is Quench Factor Analysis (QFA). Classical QFA is based on the Johnson-Mehl-Avrami-Kolmogorov (JMAK) equation. In this study, a modified form of QFA based on the work by Rometsch et al. is compared with classical QFA, and both are applied to the prediction of steel hardness. For this purpose, samples of CK60 steel were used as the raw material. They were austenitized at 1103 K (830 °C). After quenching in different environments, they were sectioned and their hardness was measured. In addition, the hardness values of the samples were fitted using the classical and modified quench factor equations, and the results were compared. The results showed a significant improvement in the fitted hardness values and demonstrated the higher efficiency of the new method.
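
    The classical calculation can be sketched as follows: accumulate the quench factor tau = sum(dt / C_T(T)) along a discretized cooling curve, then map tau to hardness through the JMAK-based exponential form. Every curve and constant below is hypothetical, not a value from the paper.

      import numpy as np

      def c_t(T):
          """Hypothetical critical time C_T(T) from a C-curve (seconds)."""
          nose, width = 870.0, 120.0       # K; nose of the C-curve
          return 2.0 + 200.0 * ((T - nose) / width) ** 2

      # Discretized cooling curve: temperature (K) sampled every 0.5 s,
      # starting from the 1103 K austenitizing temperature.
      t = np.arange(0.0, 30.0, 0.5)
      T = 1103.0 * np.exp(-0.05 * t)

      mask = (T > 600.0) & (T < 1050.0)    # assumed transformation range
      tau = np.sum(0.5 / c_t(T[mask]))     # quench factor

      H_max, H_min, k1 = 65.0, 20.0, np.log(0.995)   # HRC; assumed values
      hardness = H_min + (H_max - H_min) * np.exp(k1 * tau)
      print(f"quench factor = {tau:.2f}, hardness = {hardness:.1f} HRC")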

  14. Resolving the double tension: Toward a new approach to measurement modeling in cross-national research

    NASA Astrophysics Data System (ADS)

    Medina, Tait Runnfeldt

    The increasing global reach of survey research provides sociologists with new opportunities to pursue theory building and refinement through comparative analysis. However, comparison across a broad array of diverse contexts introduces methodological complexities related to the development of constructs (i.e., measurement modeling) that, if not adequately recognized and properly addressed, undermine the quality of research findings and cast doubt on the validity of substantive conclusions. The motivation for this dissertation arises from a concern that the availability of cross-national survey data has outpaced sociologists' ability to appropriately analyze and draw meaningful conclusions from such data. I examine the implicit assumptions and detail the limitations of three commonly used measurement models in cross-national analysis---summative scale, pooled factor model, and multiple-group factor model with measurement invariance. Using the orienting lens of the double tension, I argue that a new approach to measurement modeling that incorporates important cross-national differences into the measurement process is needed. Two such measurement models---multiple-group factor model with partial measurement invariance (Byrne, Shavelson and Muthen 1989) and the alignment method (Asparouhov and Muthen 2014; Muthen and Asparouhov 2014)---are discussed in detail and illustrated using a sociologically relevant substantive example. I demonstrate that the former approach is vulnerable to an identification problem that arbitrarily impacts substantive conclusions. I conclude that the alignment method is built on model assumptions that are consistent with theoretical understandings of cross-national comparability and provides an approach to measurement modeling and construct development that is uniquely suited for cross-national research. The dissertation makes three major contributions: First, it provides theoretical justification for a new cross-national measurement model and explicates a link between theoretical conceptions of cross-national comparability and a statistical method. Second, it provides a clear and detailed discussion of model identification in multiple-group confirmatory factor analysis that is missing from the literature. This discussion sets the stage for the introduction of the identification problem within multiple-group confirmatory factor analysis with partial measurement invariance and the alternative approach to model identification employed by the alignment method. Third, it offers the first pedagogical presentation of the alignment method using a sociologically relevant example.

  15. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis

    PubMed Central

    Lin, Johnny; Bentler, Peter M.

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square, but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and the Satorra-Bentler mean-scaled statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application for such methods in the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of the Satorra-Bentler statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic, in order to improve its robustness in small samples. A simple simulation study shows that this third-moment-adjusted statistic asymptotically performs on par with previously proposed methods and, at very small sample sizes, offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas, under either open-book or closed-book conditions, were used to illustrate the real-world performance of this statistic. PMID:23144511
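
    The adjustment idea can be sketched numerically: estimate the mean and skewness of the statistic, choose the chi-square degrees of freedom that reproduce the skewness (for a chi-square with d degrees of freedom, skewness = sqrt(8/d)), and rescale the statistic to match the mean. The bootstrap-style moment estimates below stand in for the paper's analytic ones; this illustrates the idea, not the paper's exact estimator.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      # Stand-in draws of a fit statistic that is skewed in small samples.
      T_draws = rng.gamma(shape=3.0, scale=2.0, size=2000)

      mean_T = T_draws.mean()
      skew_T = stats.skew(T_draws)

      d_adj = 8.0 / skew_T ** 2        # match skewness: sqrt(8/d) = skew
      T_obs = 9.5                      # observed statistic (assumed)
      T_adj = T_obs * d_adj / mean_T   # match mean: E[T_adj] = d_adj

      p = stats.chi2.sf(T_adj, d_adj)
      print(f"adjusted df = {d_adj:.1f}, T_adj = {T_adj:.2f}, p = {p:.3f}")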

  16. A refined method for multivariate meta-analysis and meta-regression.

    PubMed

    Jackson, Daniel; Riley, Richard D

    2014-02-20

    Making inferences about the average treatment effect using the random effects model for meta-analysis is problematic in the common situation where there is a small number of studies. This is because estimates of the between-study variance are not precise enough to accurately apply the conventional methods for testing and deriving a confidence interval for the average effect. We have found that a refined method for univariate meta-analysis, which applies a scaling factor to the estimated effects' standard error, provides more accurate inference. We explain how to extend this method to the multivariate scenario and show that our proposal for refined multivariate meta-analysis and meta-regression can provide more accurate inferences than the more conventional approach. We explain how our proposed approach can be implemented using standard output from multivariate meta-analysis software packages and apply our methodology to two real examples.
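
    A univariate sketch conveys the refinement: after a standard DerSimonian-Laird fit, scale the standard error by the observed dispersion and use a t-based interval, in the spirit of Hartung-Knapp-type corrections. The study effects and variances below are made up, and truncating the scaling factor at 1 is one common design choice rather than necessarily the authors' exact rule.

      import numpy as np
      from scipy import stats

      y = np.array([0.30, 0.12, 0.45, 0.25, 0.18])   # study effects (assumed)
      v = np.array([0.02, 0.03, 0.04, 0.02, 0.05])   # within-study variances
      k = len(y)

      # DerSimonian-Laird between-study variance.
      w = 1.0 / v
      mu_f = np.sum(w * y) / np.sum(w)
      Q = np.sum(w * (y - mu_f) ** 2)
      tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

      # Random-effects estimate and conventional (Wald) standard error.
      ws = 1.0 / (v + tau2)
      mu = np.sum(ws * y) / np.sum(ws)
      se_wald = np.sqrt(1.0 / np.sum(ws))

      # Refinement: scale the SE by the observed dispersion, use t_{k-1}.
      q = np.sum(ws * (y - mu) ** 2) / (k - 1)
      se_ref = se_wald * np.sqrt(max(q, 1.0))
      ci = mu + np.array([-1, 1]) * stats.t.ppf(0.975, k - 1) * se_ref
      print(f"mu = {mu:.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")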

  17. Profiling the different needs and expectations of patients for population-based medicine: a case study using segmentation analysis

    PubMed Central

    2012-01-01

    Background This study illustrates an evidence-based method for the segmentation analysis of patients that could greatly improve the approach to population-based medicine by filling a gap in the empirical analysis of this topic. Segmentation facilitates individual patient care in the context of the culture, health status, and health needs of the entire population to which that patient belongs. Because many health systems are engaged in developing better chronic care management initiatives, patient profiles are critical to understanding whether some patients can move toward effective self-management and can play a central role in determining their own care, which fosters a sense of responsibility for their own health. A review of the literature on patient segmentation provided the background for this research. Method First, we conducted a literature review on patient satisfaction and segmentation to build a survey. Then, we collected 3,461 surveys from users of outpatient services. The key structures on which the subjects' perception of outpatient services was based were extracted using principal component factor analysis with varimax rotation. After the factor analysis, segmentation was performed through cluster analysis to better analyze the influence of individual attitudes on the results. Results Four segments were identified through factor and cluster analysis: the "unpretentious," the "informed and supported," the "experts" and the "advanced" patients. Their policy and managerial implications are outlined. Conclusions With this research, we provide the following: – a method for profiling patients based on common patient satisfaction surveys that is easily replicable in all health systems and contexts; – a proposal for segments based on the results of a broad-based analysis conducted in the Italian National Health System (INHS). Segments represent profiles of patients requiring different strategies for delivering health services. Knowledge and analysis of these segments might support an effort to build an effective population-based medicine approach. PMID:23256543
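
    The two-stage analysis translates directly into code: factor-analyze the items with varimax rotation, then cluster respondents on their factor scores. The simulated 20-item survey below only loosely mirrors the paper's data.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(6)
      n_resp, n_items = 3461, 20
      X = rng.normal(size=(n_resp, n_items))       # stand-in survey answers

      fa = FactorAnalysis(n_components=4, rotation="varimax")
      scores = fa.fit_transform(X)                 # respondent factor scores

      seg = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
      for s in range(4):
          profile = scores[seg == s].mean(0).round(2)
          print(f"segment {s}: n = {np.sum(seg == s)}, profile {profile}")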

  18. A technique for rapid source apportionment applied to ambient organic aerosol measurements from a thermal desorption aerosol gas chromatograph (TAG)

    DOE PAGES

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.; ...

    2016-11-25

    We present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography–mass spectrometry methods. Here, we specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration. In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.

  19. A technique for rapid source apportionment applied to ambient organic aerosol measurements from a thermal desorption aerosol gas chromatograph (TAG)

    NASA Astrophysics Data System (ADS)

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.; Docherty, Kenneth S.; Jimenez, Jose L.

    2016-11-01

    We present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography-mass spectrometry methods. Here, we specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration. In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.
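
    The binning-plus-factorization pipeline can be sketched on synthetic chromatograms, with scikit-learn's non-negative matrix factorization standing in for PMF (true PMF additionally weights residuals by measurement uncertainty). All shapes and data below are illustrative assumptions.

      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(7)
      n_samples, n_rt, n_mz, n_bins = 50, 600, 100, 30

      # Synthetic chromatograms: intensity over (retention time, m/z).
      chroms = rng.random((n_samples, n_rt, n_mz))

      # Sum mass spectra within evenly spaced retention-time bins, then
      # unfold (bin, m/z) into the column dimension; rows are samples.
      binned = chroms.reshape(n_samples, n_bins, n_rt // n_bins, n_mz).sum(axis=2)
      X = binned.reshape(n_samples, n_bins * n_mz)

      nmf = NMF(n_components=6, init="nndsvda", max_iter=500)
      G = nmf.fit_transform(X)    # factor contributions per sample
      F = nmf.components_         # factor profiles over (bin, m/z)
      print("contributions:", G.shape, "profiles:", F.shape)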

  20. Organizational and provider level factors in implementation of trauma-informed care after a city-wide training: an explanatory mixed methods assessment.

    PubMed

    Damian, April Joy; Gallo, Joseph; Leaf, Philip; Mendelson, Tamar

    2017-11-21

    While there is increasing support for training youth-serving providers in trauma-informed care (TIC) as a means of addressing the high prevalence of childhood trauma in the U.S., we know little about the effects of TIC training on organizational culture and providers' professional quality of life. This mixed-methods study evaluated changes in organizational- and provider-level factors following participation in a citywide TIC training. Government workers and nonprofit professionals (N = 90) who participated in a nine-month citywide TIC training completed a survey before and after the training to assess organizational culture and professional quality of life. Survey data were analyzed using multiple regression analyses. A subset of participants (n = 16) was interviewed using a semi-structured format, and themes related to organizational and provider factors were identified using qualitative methods. Analysis of the survey data indicated significant improvements in participants' organizational culture and professional satisfaction at training completion. Participants' perceptions of their own burnout and secondary traumatic stress also increased. Four themes emerged from the interview data: "Implementation of more flexible, less-punitive policies towards clients," "Adoption of trauma-informed workplace design," "Heightened awareness of own traumatic stress and need for self-care," and "Greater sense of camaraderie and empathy for colleagues." The mixed-methods approach provided a nuanced understanding of the impact of TIC training and suggested potential benefits of the training on organizational- and provider-level factors associated with implementation of trauma-informed policies and practices. Future trainings should explicitly address organizational factors such as safety climate and morale, managerial support, teamwork climate and collaboration, and individual factors including providers' compassion satisfaction, burnout, and secondary traumatic stress, to better support TIC implementation.
