Sample records for factor analysis approach

  1. An Evaluation on Factors Influencing Decision making for Malaysia Disaster Management: The Confirmatory Factor Analysis Approach

    NASA Astrophysics Data System (ADS)

    Zubir, S. N. A.; Thiruchelvam, S.; Mustapha, K. N. M.; Che Muda, Z.; Ghazali, A.; Hakimie, H.

    2017-12-01

    For the past few years, natural disasters have been the subject of debate in disaster management, especially flood disasters. Each year, natural disasters result in significant loss of life, destruction of homes and public infrastructure, and economic hardship. Hence, effective and efficient flood disaster management would ensure that life-saving efforts are not wasted. The aim of this article is to examine the relationship between approach, decision maker, influence factor, result, and ethics and decision making for flood disaster management in Malaysia. The key elements of decision making in disaster management were identified from the literature. Questionnaire surveys were administered among lead agencies on the East Coast of Malaysia, in the states of Kelantan and Pahang. A total of 307 valid responses were obtained for further analysis. Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were carried out to analyse the measurement model involved in the study. The CFA for the second-order and first-order reflective measurement models indicates that approach, decision maker, influence factor, result, and ethics have a significant and direct effect on decision making during a disaster. The results of this study show that decision making during a disaster is an important element of disaster management and underpins successful collaborative decision making. The measurement model is accepted for further analysis with Structural Equation Modeling (SEM) and can be assessed in future research.
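
The EFA step described above typically begins by deciding how many factors to retain. A minimal sketch of one common first pass, the Kaiser eigenvalue-greater-than-one rule, on invented toy data (not the study's survey data or its actual retention procedure):

```python
import numpy as np

def kaiser_retained_factors(X):
    """Return the eigenvalues of the item correlation matrix and the
    number exceeding 1.0 (Kaiser criterion), a common first pass in EFA."""
    R = np.corrcoef(X, rowvar=False)               # item correlation matrix
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]  # descending order
    return eigvals, int(np.sum(eigvals > 1.0))

# Toy data: two latent factors driving six observed items.
rng = np.random.default_rng(0)
n = 500
f = rng.normal(size=(n, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.0], [0.7, 0.0],
                     [0.0, 0.9], [0.0, 0.8], [0.0, 0.7]])
X = f @ loadings.T + 0.4 * rng.normal(size=(n, 6))

eigvals, k = kaiser_retained_factors(X)
print(k)  # two factors retained for this two-factor toy structure
```

In practice the Kaiser rule is only a screening device; the CFA stage then tests the hypothesised loading pattern explicitly.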

  2. A computational intelligent approach to multi-factor analysis of violent crime information system

    NASA Astrophysics Data System (ADS)

    Liu, Hongbo; Yang, Chao; Zhang, Meng; McLoone, Seán; Sun, Yeqing

    2017-02-01

    Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. Pairwise relationships between factors have also been extensively studied, including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour, and as such there is a need for a greater level of insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). The identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interactions between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.

  3. Factors influencing crime rates: an econometric analysis approach

    NASA Astrophysics Data System (ADS)

    Bothos, John M. A.; Thomopoulos, Stelios C. A.

    2016-05-01

    The scope of the present study is to research the dynamics that determine the commission of crimes in US society. Our study is part of a model we are developing to understand urban crime dynamics and to enhance citizens' "perception of security" in large urban environments. The main targets of our research are to highlight the dependence of crime rates on certain social and economic factors and on basic elements of state anticrime policies. In conducting our research, we use as guides previous relevant studies on crime dependence that have been performed with similar quantitative analyses in mind, regarding the dependence of crime on certain social and economic factors using statistics and econometric modelling. Our first approach consists of conceptual state-space dynamic cross-sectional econometric models that incorporate a feedback loop describing crime as a feedback process. To define the model variables dynamically, we use statistical analysis of crime records and of records on social and economic conditions and policing characteristics (such as police force size and policing results - crime arrests) to determine their influence, as independent variables, on crime, the dependent variable of our model. The econometric models we apply in this first approach are an exponential log-linear model and a logit model. In a second approach, we study the evolution of violent crime through time in the US, independently as an autonomous social phenomenon, using autoregressive and moving-average time-series econometric models. Our findings show that there are certain social and economic characteristics that affect the formation of crime rates in the US, either positively or negatively. Furthermore, the results of our time-series econometric modelling show that violent crime, viewed solely and independently as a social phenomenon, correlates with previous years' crime rates and depends on the social and economic conditions of previous years.

  4. Bayesian Exploratory Factor Analysis

    PubMed Central

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  5. Item Factor Analysis: Current Approaches and Future Directions

    ERIC Educational Resources Information Center

    Wirth, R. J.; Edwards, Michael C.

    2007-01-01

    The rationale underlying factor analysis applies to continuous and categorical variables alike; however, the models and estimation methods for continuous (i.e., interval or ratio scale) data are not appropriate for item-level data that are categorical in nature. The authors provide a targeted review and synthesis of the item factor analysis (IFA)…

  6. A Note on Procrustean Rotation in Exploratory Factor Analysis: A Computer Intensive Approach to Goodness-of-Fit Evaluation.

    ERIC Educational Resources Information Center

    Raykov, Tenko; Little, Todd D.

    1999-01-01

    Describes a method for evaluating results of Procrustean rotation to a target factor pattern matrix in exploratory factor analysis. The approach, based on the bootstrap method, yields empirical approximations of the sampling distributions of: (1) differences between target elements and rotated factor pattern matrices; and (2) the overall…
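
Procrustean rotation to a target pattern has a closed-form solution via the SVD, and the bootstrap idea from the abstract can be sketched as follows. The "loading estimate" here is a crude stand-in (leading right singular vectors), not the authors' estimator, and the data are invented:

```python
import numpy as np

def procrustes_rotate(A, T):
    """Orthogonal Procrustes: the rotation R minimising ||A @ R - T||_F
    has the closed form R = U @ Vt, where U, s, Vt = svd(A.T @ T)."""
    U, _, Vt = np.linalg.svd(A.T @ T)
    return U @ Vt

def bootstrap_fit(X, target, n_boot=200, seed=0):
    """Bootstrap the Frobenius discrepancy between a loading estimate
    and the target pattern after Procrustean rotation."""
    rng = np.random.default_rng(seed)
    fits = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(X), len(X))   # resample respondents
        Xc = X[idx] - X[idx].mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        L = Vt[: target.shape[1]].T             # stand-in loading matrix
        R = procrustes_rotate(L, target)
        fits.append(np.linalg.norm(L @ R - target))
    return np.asarray(fits)

# Toy data generated from the target pattern itself.
rng = np.random.default_rng(2)
target = np.array([[0.8, 0.0], [0.8, 0.0], [0.0, 0.8], [0.0, 0.8]])
F = rng.normal(size=(300, 2))
X = F @ target.T + 0.3 * rng.normal(size=(300, 4))
fits = bootstrap_fit(X, target)
```

The resulting `fits` array approximates the sampling distribution of the overall discrepancy, which is the quantity the note proposes to use for goodness-of-fit evaluation.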

  7. Integrative Analysis of Transcription Factor Combinatorial Interactions Using a Bayesian Tensor Factorization Approach

    PubMed Central

    Ye, Yusen; Gao, Lin; Zhang, Shihua

    2017-01-01

    Transcription factors (TFs) play a key role in the transcriptional regulation of genes and the determination of cellular identity through combinatorial interactions. However, current studies of combinatorial regulation are limited by the lack of experimental data from the same cellular environment and by extensive data noise. Here, we adopt a Bayesian CANDECOMP/PARAFAC (CP) factorization approach (BCPF) to integrate multiple datasets in a network paradigm for determining precise TF interaction landscapes. In our first application, we apply BCPF to integrate three networks built from diverse datasets of multiple cell lines from ENCODE to predict a global and precise TF interaction network. This network yields 38 novel TF interactions with distinct biological functions. In our second application, we apply BCPF to the TF regulatory networks of seven cell types and predict seven cell-lineage TF interaction networks. By further exploring their dynamics and modularity, we find that cell lineage-specific hub TFs participate in cell type- or lineage-specific regulation by interacting with non-specific TFs. Furthermore, we illustrate the biological function of hub TFs, taking those of the cancer and blood lineages as examples. Taken together, our integrative analysis reveals a more precise and extensive description of human TF combinatorial interactions. PMID:29033978
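
The CANDECOMP/PARAFAC model underlying BCPF factorises a 3-way tensor into a sum of rank-1 components. A minimal deterministic sketch using plain alternating least squares, without the Bayesian priors the paper adds, on an invented tensor:

```python
import numpy as np

def cp_als(T, rank, n_iter=100, seed=0):
    """Plain ALS for a rank-`rank` CANDECOMP/PARAFAC model of a
    3-way tensor T: T[i,j,k] ~= sum_r A[i,r] * B[j,r] * C[k,r]."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.normal(size=(I, rank))
    B = rng.normal(size=(J, rank))
    C = rng.normal(size=(K, rank))
    T1 = T.reshape(I, J * K)                     # mode-1 unfolding
    T2 = np.moveaxis(T, 1, 0).reshape(J, I * K)  # mode-2 unfolding
    T3 = np.moveaxis(T, 2, 0).reshape(K, I * J)  # mode-3 unfolding

    def khatri_rao(U, V):
        # Column-wise Kronecker product, rows ordered to match the unfoldings.
        return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

    for _ in range(n_iter):
        A = T1 @ np.linalg.pinv(khatri_rao(B, C).T)  # solve T1 = A (B.C)^T
        B = T2 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = T3 @ np.linalg.pinv(khatri_rao(A, B).T)
    recon = np.einsum('ir,jr,kr->ijk', A, B, C)
    return A, B, C, recon

# Exact rank-2 toy tensor: ALS should recover it almost perfectly.
rng = np.random.default_rng(3)
A0, B0, C0 = (rng.normal(size=(s, 2)) for s in (5, 6, 7))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
*_, recon = cp_als(T, rank=2)
err = np.linalg.norm(recon - T) / np.linalg.norm(T)
```

In the paper's setting each tensor slice would be a TF-TF interaction network from one data source, and the Bayesian machinery handles noise and missing entries that plain ALS does not.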

  9. Experienced quality factors: qualitative evaluation approach to audiovisual quality

    NASA Astrophysics Data System (ADS)

    Jumisko-Pyykkö, Satu; Häkkinen, Jukka; Nyman, Göte

    2007-02-01

    Subjective evaluation is used to identify impairment factors of multimedia quality. The final quality is often formulated via quantitative experiments, but this approach has its constraints, as subjects' quality interpretations, experiences and quality evaluation criteria are disregarded. To identify these quality evaluation factors, this study examined qualitatively the criteria participants used to evaluate audiovisual video quality. A semi-structured interview was conducted with 60 participants after a subjective audiovisual quality evaluation experiment. The assessment compared several relatively low audio-video bitrate ratios with five different television contents on a mobile device. In the analysis, methodological triangulation (grounded theory, Bayesian networks and correspondence analysis) was applied to the qualitative quality data. The results showed that the most important evaluation criteria were factors of visual quality, content, factors of audio quality, usefulness - followability, and audiovisual interaction. Several relations between the quality factors, and similarities between the contents, were identified. As a methodological recommendation, content- and usage-related factors need to be examined further to improve quality evaluation experiments.

  10. A Markov Chain Monte Carlo Approach to Confirmatory Item Factor Analysis

    ERIC Educational Resources Information Center

    Edwards, Michael C.

    2010-01-01

    Item factor analysis has a rich tradition in both the structural equation modeling and item response theory frameworks. The goal of this paper is to demonstrate a novel combination of various Markov chain Monte Carlo (MCMC) estimation routines to estimate parameters of a wide variety of confirmatory item factor analysis models. Further, I show…

  11. Stabilization and robustness of non-linear unity-feedback system - Factorization approach

    NASA Technical Reports Server (NTRS)

    Desoer, C. A.; Kabuli, M. G.

    1988-01-01

    The paper is a self-contained discussion of a right factorization approach in the stability analysis of the nonlinear continuous-time or discrete-time, time-invariant or time-varying, well-posed unity-feedback system S1(P, C). It is shown that a well-posed stable feedback system S1(P, C) implies that P and C have right factorizations. In the case where C is stable, P has a normalized right-coprime factorization. The factorization approach is used in stabilization and simultaneous stabilization results.

  12. Bayesian Factor Analysis as a Variable Selection Problem: Alternative Priors and Consequences

    PubMed Central

    Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric

    2016-01-01

    Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, a Bayesian structural equation modeling (BSEM) approach (Muthén & Asparouhov, 2012) has been proposed as a way to explore the presence of cross-loadings in CFA models. We show that the issue of determining factor loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov’s approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike and slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set (Byrne, 2012; Pettegrew & Wolf, 1982) is used to demonstrate our approach. PMID:27314566

  13. Analysis of stationary and dynamic factors affecting highway accident occurrence: A dynamic correlated grouped random parameters binary logit approach.

    PubMed

    Fountas, Grigorios; Sarwar, Md Tawfiq; Anastasopoulos, Panagiotis Ch; Blatt, Alan; Majka, Kevin

    2018-04-01

    Traditional accident analysis typically explores non-time-varying (stationary) factors that affect accident occurrence on roadway segments. However, the impact of time-varying (dynamic) factors is not thoroughly investigated. This paper seeks to simultaneously identify pre-crash stationary and dynamic factors of accident occurrence, while accounting for unobserved heterogeneity. Using highly disaggregate information for the potential dynamic factors, and aggregate data for the traditional stationary elements, a dynamic binary random parameters (mixed) logit framework is employed. With this approach, the dynamic nature of weather-related, driving-condition and pavement-condition information is jointly investigated with traditional roadway geometric and traffic characteristics. To additionally account for the combined effect of the dynamic and stationary factors on accident occurrence, the developed random parameters logit framework allows for possible correlations among the random parameters. The analysis is based on crash and non-crash observations between 2011 and 2013, drawn from urban and rural highway segments in the state of Washington. The findings show that the proposed methodological framework can account for both stationary and dynamic factors affecting accident occurrence probabilities, for panel effects, for unobserved heterogeneity through the use of random parameters, and for possible correlation among the latter. The comparative evaluation among the correlated grouped random parameters logit model, the uncorrelated random parameters logit model, and their fixed-parameters counterpart demonstrates the potential of random parameters modeling in general, and the benefits of the correlated grouped random parameters approach in particular, in terms of statistical fit and explanatory power.

  14. Determinants of job stress in chemical process industry: A factor analysis approach.

    PubMed

    Menon, Balagopal G; Praveensal, C J; Madhu, G

    2015-01-01

    Job stress is one of the active research domains in industrial safety research. Job stress can result in accidents and health-related issues for workers in chemical process industries. Hence it is important to measure the level of job stress in workers so that it can be mitigated and workers' safety-related problems avoided. The objective of this study is to determine the job stress factors in the chemical process industry in Kerala state, India. This study also aims to propose a comprehensive model and an instrument framework for measuring job stress levels in the chemical process industries in Kerala, India. The data were collected through a questionnaire survey conducted in chemical process industries in Kerala. The data from 1197 completed surveys were subjected to principal component and confirmatory factor analysis to develop the job stress factor structure. The factor analysis revealed eight factors that influence job stress in process industries. It was also found that job stress in employees is influenced most by role ambiguity and least by work environment. The study has developed an instrument framework for measuring job stress using exploratory factor analysis and structural equation modeling.

  15. Assessing Saudi medical students learning approach using the revised two-factor study process questionnaire.

    PubMed

    Shaik, Shaffi Ahamed; Almarzuqi, Ahmed; Almogheer, Rakan; Alharbi, Omar; Jalal, Abdulaziz; Alorainy, Majed

    2017-08-17

    To assess the learning approaches of 1st-, 2nd-, and 3rd-year medical students using the revised two-factor study process questionnaire, and to assess the reliability and validity of the questionnaire. This cross-sectional study was conducted at the College of Medicine, Riyadh, Saudi Arabia in 2014. The revised two-factor study process questionnaire (R-SPQ-2F) was completed by 610 medical students of both genders, from the foundation (first year), central nervous system (second year), and medicine and surgery (third year) courses. The study process was evaluated by computing mean scores for the two study approaches (deep and surface) using Student's t-test and one-way analysis of variance. The internal consistency and construct validity of the questionnaire were assessed using Cronbach's α and factor analysis. The mean score of the deep approach was significantly higher than that of the surface approach among participants (t(770) = 7.83, p < 0.001) across the four courses. Mean deep-approach scores were significantly higher among participants with a higher grade point average (F(2,768) = 13.31, p = 0.001) and among participants reporting more study hours (F(2,768) = 20.08, p = 0.001). Cronbach's α values of 0.70 indicate good internal consistency of the questionnaire. Factor analysis confirmed the two factors (deep and surface approaches) of the R-SPQ-2F. The deep approach to learning was the primary approach among 1st-, 2nd- and 3rd-year King Saud University medical students. This study confirms the reliability and validity of the revised two-factor study process questionnaire. Medical educators could use the results of such studies to make required changes in the curriculum.
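
The internal-consistency statistic used above, Cronbach's α, is straightforward to compute. A sketch on simulated Likert-style responses; the data are invented, and only the formula matches the abstract:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Toy responses: ten items all driven by one latent trait, so the
# scale should be highly internally consistent.
rng = np.random.default_rng(4)
trait = rng.normal(size=(400, 1))
scores = trait + 0.5 * rng.normal(size=(400, 10))
alpha = cronbach_alpha(scores)
```

Values near 0.70 or above, as reported in the abstract, are conventionally read as acceptable internal consistency.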

  16. Determination of apparent coupling factors for adhesive bonded acrylic plates using SEAL approach

    NASA Astrophysics Data System (ADS)

    Pankaj, Achuthan. C.; Shivaprasad, M. V.; Murigendrappa, S. M.

    2018-04-01

    Apparent coupling loss factors (CLF) and velocity responses have been computed for two lap-joined adhesive-bonded plates using finite element analysis and an experimental statistical energy analysis-like approach. A finite element model of the plates was created using ANSYS software. The statistical energy parameters were computed using the velocity responses obtained from a harmonic forced-excitation analysis. Experiments were carried out for two different cases of adhesive-bonded joints, and the results were compared with the apparent coupling factors and velocity responses obtained from the finite element analysis. The results signify the importance of modeling adhesive-bonded joints when computing the apparent coupling factors, and their further use in computing energies and velocity responses with a statistical energy analysis-like approach.

  17. Aggregation factor analysis for protein formulation by a systematic approach using FTIR, SEC and design of experiments techniques.

    PubMed

    Feng, Yan Wen; Ooishi, Ayako; Honda, Shinya

    2012-01-05

    A simple systematic approach using Fourier transform infrared (FTIR) spectroscopy, size exclusion chromatography (SEC) and design of experiments (DOE) techniques was applied to the analysis of aggregation factors for protein formulations in stress and accelerated testing. FTIR and SEC were used to evaluate protein conformational and storage stabilities, respectively. DOE was used to determine the suitable formulation and to analyze both the main effects of single factors and the interaction effects of combined factors on aggregation. Our results indicated that (i) analysis at a low protein concentration is not always applicable to high-concentration formulations; (ii) investigating the interaction effects of combined factors as well as the main effects of single factors is effective for improving the conformational stability of proteins; (iii) with the exception of pH, stress-testing results on aggregation factors can inform the choice of a suitable formulation without time-consuming accelerated testing; (iv) a suitable pH condition should be determined not in stress testing but in accelerated testing, because of the inconsistent effects of pH on conformational and storage stabilities. In summary, we propose a three-step strategy, using FTIR, SEC and DOE techniques, to effectively analyze aggregation factors and perform rapid screening for suitable protein formulation conditions.
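
The DOE analysis of main and interaction effects can be illustrated with the classical contrasts of a 2^2 full factorial. The factors and response values below are hypothetical stand-ins (e.g. pH and temperature affecting an aggregation response), not the paper's formulation data:

```python
import numpy as np

# Signs for a 2^2 full factorial: columns are factor A and factor B,
# rows are the four runs (-,-), (+,-), (-,+), (+,+).
design = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]], dtype=float)

def factorial_effects(y):
    """Main effects of A and B and the AB interaction from the four
    run means of a 2^2 design, via the classical contrast formulas
    (each contrast sum divided by half the number of runs)."""
    a, b = design[:, 0], design[:, 1]
    main_a = (a * y).sum() / 2
    main_b = (b * y).sum() / 2
    inter_ab = (a * b * y).sum() / 2
    return main_a, main_b, inter_ab

# Hypothetical responses for the four runs.
y = np.array([10.0, 14.0, 12.0, 22.0])
ma, mb, mab = factorial_effects(y)
print(ma, mb, mab)  # 7.0 5.0 3.0
```

A non-zero interaction contrast (here 3.0) is exactly the "interaction effect of combined factors" the abstract argues should be analyzed alongside main effects.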

  18. A neuro-data envelopment analysis approach for optimization of uncorrelated multiple response problems with smaller the better type controllable factors

    NASA Astrophysics Data System (ADS)

    Bashiri, Mahdi; Farshbaf-Geranmayeh, Amir; Mogouie, Hamed

    2013-11-01

    In this paper, a new method is proposed to optimize a multi-response optimization problem based on the Taguchi method, for processes whose controllable factors are smaller-the-better (STB)-type variables and where the analyst seeks an optimal solution using smaller amounts of the controllable factors. In such processes, the overall output quality of the product should be maximized while the usage of the process inputs, the controllable factors, should be minimized. Since not all possible combinations of factor levels are considered in the Taguchi method, the response values of the unpracticed treatments are estimated using an artificial neural network (ANN). The neural network is tuned using central composite design (CCD) and a genetic algorithm (GA). Data envelopment analysis (DEA) is then applied to determine the efficiency of each treatment. Although the philosophy of DEA, maximization of outputs versus minimization of inputs, is central to its implementation, this issue has been neglected in previous similar studies of multi-response problems. Finally, the most efficient treatment is determined using the maximin weight model approach. The performance of the proposed method is verified on a plastic molding process. Moreover, a sensitivity analysis was performed using an efficiency-estimator neural network. The results show the efficiency of the proposed approach.
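
The DEA step, scoring each treatment by its outputs relative to its inputs, can be sketched with the generic input-oriented CCR model in multiplier form. This is a textbook formulation, not the paper's exact model, and the treatment data below are hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0 (multiplier form):
    maximise u'y0 subject to v'x0 = 1 and u'y_j <= v'x_j for all j.
    X: (n_units, n_inputs), Y: (n_units, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[j0], np.zeros(m)])        # maximise u'y0
    A_ub = np.hstack([Y, -X])                         # u'y_j - v'x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None]  # v'x0 = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Hypothetical treatments: one input (controllable-factor usage),
# one output (quality response).
X = np.array([[2.0], [4.0], [8.0]])
Y = np.array([[4.0], [8.0], [8.0]])
effs = [dea_ccr_efficiency(X, Y, j) for j in range(3)]
```

With one input and one output the CCR score reduces to each unit's output/input ratio scaled by the best ratio, so units 1 and 2 score 1.0 and unit 3 scores 0.5; efficient treatments then feed the maximin weight selection step.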

  19. Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling

    NASA Astrophysics Data System (ADS)

    Wada, Yoshihisa; Tsuji, Hiroshi

    To analyze the success and failure factors in offshore software development services by structural equation modeling, this paper proposes following two approaches together: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former serves to generate and verify hypotheses to find factors and causalities. The latter serves to verify factors introduced by theory, building the model without heuristics. Applying the proposed combined approaches to questionnaire responses from skilled project managers, this paper finds that vendor properties have a stronger causal effect on success than software properties and project properties.

  20. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    NASA Astrophysics Data System (ADS)

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai

    2017-08-01

    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks, the conventional practice is to decode motor intentions using scalp EEG. However, scalp EEG reveals only limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We propose a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG into an unsupervised factor analysis to make it a supervised learning method. Main results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, which is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis of the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and may thus lead to new strategies for BCI-based neurorehabilitation.

  1. Hierarchical Factoring Based On Image Analysis And Orthoblique Rotations.

    PubMed

    Stankov, L

    1979-07-01

    The procedure for hierarchical factoring suggested by Schmid and Leiman (1957) is applied within the framework of image analysis and orthoblique rotational procedures. It is shown that this approach necessarily leads to correlated higher order factors. Also, one can obtain a smaller number of factors than produced by typical hierarchical procedures.

  2. Evaluation of Parallel Analysis Methods for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.

    2010-01-01

    Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…
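
Horn's parallel analysis, the method compared above, retains factors whose observed eigenvalues exceed those obtained from random data of the same dimensions. A minimal PA-PCA sketch with the 95th percentile criterion, on invented data rather than the article's simulation design:

```python
import numpy as np

def parallel_analysis_pca(X, n_sims=100, percentile=95, seed=0):
    """Horn's parallel analysis with PCA eigenvalues: retain components
    whose observed correlation-matrix eigenvalues exceed the chosen
    percentile of eigenvalues from random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    sims = np.empty((n_sims, p))
    for i in range(n_sims):
        Z = rng.normal(size=(n, p))
        sims[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False)))[::-1]
    thresh = np.percentile(sims, percentile, axis=0)
    return int(np.sum(obs > thresh))

# Toy data with two genuine factors behind eight indicators.
rng = np.random.default_rng(5)
F = rng.normal(size=(400, 2))
L = np.zeros((8, 2))
L[:4, 0] = 0.8
L[4:, 1] = 0.8
X = F @ L.T + 0.5 * rng.normal(size=(400, 8))
n_factors = parallel_analysis_pca(X)
```

The PA-PAF variant studied in the article differs only in extracting eigenvalues from a reduced (common-factor) correlation matrix rather than the full one used here.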

  3. Radiogenomics: a systems biology approach to understanding genetic risk factors for radiotherapy toxicity?

    PubMed Central

    Herskind, Carsten; Talbot, Christopher J.; Kerns, Sarah L.; Veldwijk, Marlon R.; Rosenstein, Barry S.; West, Catharine M. L.

    2016-01-01

    Adverse reactions in normal tissue after radiotherapy (RT) limit the dose that can be given to tumour cells. Since 80% of individual variation in clinical response is estimated to be caused by patient-related factors, identifying these factors might allow prediction of patients with increased risk of developing severe reactions. While inactivation of cell renewal is considered a major cause of toxicity in early-reacting normal tissues, complex interactions involving multiple cell types, cytokines, and hypoxia seem important for late reactions. Here, we review ‘omics’ approaches such as screening of genetic polymorphisms or gene expression analysis, and assess the potential of epigenetic factors, posttranslational modification, signal transduction, and metabolism. Furthermore, functional assays have suggested possible associations with clinical risk of adverse reaction. Pathway analysis incorporating different ‘omics’ approaches may be more efficient in identifying critical pathways than pathway analysis based on single ‘omics’ data sets. Integrating these pathways with functional assays may be powerful in identifying multiple subgroups of RT patients characterized by different mechanisms. Thus ‘omics’ and functional approaches may synergize if they are integrated into radiogenomics ‘systems biology’ to facilitate the goal of individualised radiotherapy. PMID:26944314

  4. Using Formative Scenario Analysis approach for landslide risk analysis in a relatively scarce data environment: preliminary results

    NASA Astrophysics Data System (ADS)

    Zumpano, Veronica; Balteanu, Dan; Mazzorana, Bruno; Micu, Mihai

    2014-05-01

    It is increasingly important to provide stakeholders with tools that enable them to better understand the state of the environment in which they live and which they manage, and to help them make decisions that minimize the consequences of hydro-meteorological hazards. Very often, however, quantitative studies, especially for large areas, are difficult to perform, because the extensive data required for the analysis are frequently unavailable. In addition, it has been shown that deterministic approaches to scenario analysis are often unable to detect features of the system that reveal unexpected behaviors, resulting in the underestimation or omission of some impact factors. Here we present preliminary results obtained by applying Formative Scenario Analysis, which can be considered a possible solution for landslide risk analysis in cases where the necessary data, even if they exist, are not available. This method is an expert-based approach that integrates intuitions and qualitative evaluations of impact factors with a quantitative analysis of the relations between these factors: a group of experts with different but pertinent expertise determine, by a rating procedure, quantitative relations between the factors; then, through mathematical operations, scenarios describing a certain state of the system are obtained. The approach is applied to Buzau County (Romania), an area belonging to the Curvature Romanian Carpathians and Subcarpathians, a region strongly affected by environmental hazards. The region has previously experienced numerous episodes of severe hydro-meteorological events that caused considerable damage (1975, 2005, 2006). In this application we refer only to one type of landslide, which can be described as shallow to medium-seated with a (mainly) translational movement that can range from slide to flow. The material involved can be either soil, debris or a mixture of both, in Romanian

  5. Analysis of factors affecting satisfaction level on problem based learning approach using structural equation modeling

    NASA Astrophysics Data System (ADS)

    Hussain, Nur Farahin Mee; Zahid, Zalina

    2014-12-01

    Nowadays, in the job market, graduates are expected not only to perform well academically but also to excel in soft skills. Problem-Based Learning (PBL) has a number of distinct advantages as a learning method, as it can deliver graduates who will be highly prized by industry. This study attempts to determine the satisfaction level of engineering students with the PBL approach and to evaluate its determinant factors. Structural Equation Modeling (SEM) was used to investigate how the factors Good Teaching Scale, Clear Goals, Student Assessment and Levels of Workload affected student satisfaction with the PBL approach.

  6. Risk Factors Predicting Infectious Lactational Mastitis: Decision Tree Approach versus Logistic Regression Analysis.

    PubMed

    Fernández, Leónides; Mediano, Pilar; García, Ricardo; Rodríguez, Juan M; Marín, María

    2016-09-01

    Objectives: Lactational mastitis frequently leads to a premature abandonment of breastfeeding; its development has been associated with several risk factors. This study aims to use a decision tree (DT) approach to establish the main risk factors involved in mastitis and to compare its performance for predicting this condition with a stepwise logistic regression (LR) model. Methods: Data from 368 cases (breastfeeding women with mastitis) and 148 controls were collected by a questionnaire about risk factors related to medical history of mother and infant, pregnancy, delivery, postpartum, and breastfeeding practices. The performance of the DT and LR analyses was compared using the area under the receiver operating characteristic (ROC) curve. Sensitivity, specificity and accuracy of both models were calculated. Results: Cracked nipples, antibiotics and antifungal drugs during breastfeeding, infant age, breast pumps, familial history of mastitis and throat infection were significant risk factors associated with mastitis in both analyses. Bottle-feeding and milk supply were related to mastitis for certain subgroups in the DT model. The areas under the ROC curves were similar for the LR and DT models (0.870 and 0.835, respectively). The LR model had better classification accuracy and sensitivity than the DT model, but the latter presented better specificity at the optimal threshold of each curve. Conclusions: The DT and LR models constitute useful and complementary analytical tools to assess the risk of lactational infectious mastitis. The DT approach identifies high-risk subpopulations that need specific mastitis prevention programs and, therefore, could be used to make the most of public health resources.
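    The model comparison described here rests on standard confusion-matrix metrics. A minimal sketch of how sensitivity, specificity and accuracy are computed (the counts below are hypothetical and not the study's data):

    ```python
    def confusion_metrics(tp, fp, fn, tn):
        """Sensitivity, specificity and accuracy from a 2x2 confusion matrix."""
        sensitivity = tp / (tp + fn)                # true-positive rate
        specificity = tn / (tn + fp)                # true-negative rate
        accuracy = (tp + tn) / (tp + fp + fn + tn)  # overall fraction correct
        return sensitivity, specificity, accuracy

    # Hypothetical counts for a classifier on 516 cases (368 mastitis, 148 controls)
    sens, spec, acc = confusion_metrics(tp=331, fp=30, fn=37, tn=118)
    ```

    A DT and an LR model with the same ROC AUC can still trade these metrics off differently at their optimal thresholds, which is exactly the pattern the study reports.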

  7. FACTOR 9.2: A Comprehensive Program for Fitting Exploratory and Semiconfirmatory Factor Analysis and IRT Models

    ERIC Educational Resources Information Center

    Lorenzo-Seva, Urbano; Ferrando, Pere J.

    2013-01-01

    FACTOR 9.2 was developed for three reasons. First, exploratory factor analysis (FA) is still an active field of research, although most recent developments have not been incorporated into available programs. Second, there is now renewed interest in semiconfirmatory (SC) solutions as suitable approaches to the complex structures commonly found…

  8. Factor Analysis via Components Analysis

    ERIC Educational Resources Information Center

    Bentler, Peter M.; de Leeuw, Jan

    2011-01-01

    When the factor analysis model holds, component loadings are linear combinations of factor loadings, and vice versa. This interrelation permits us to define new optimization criteria and estimation methods for exploratory factor analysis. Although this article is primarily conceptual in nature, an illustrative example and a small simulation show…

  9. A contrarian view of the five-factor approach to personality description.

    PubMed

    Block, J

    1995-03-01

    The 5-factor approach (FFA) to personality description has been represented as a comprehensive and compelling rubric for assessment. In this article, various misgivings about the FFA are delineated. The algorithmic method of factor analysis may not provide dimensions that are incisive. The "discovery" of the five factors may be influenced by unrecognized constraints on the variable sets analyzed. Lexical analyses are based on questionable conceptual and methodological assumptions, and have achieved uncertain results. The questionnaire version of the FFA has not demonstrated the special merits and sufficiencies of the five factors settled upon. Serious uncertainties have arisen in regard to the claimed 5-factor structure and the substantive meanings of the factors. Some implications of these problems are drawn.

  10. The contribution of psychological factors to recovery after mild traumatic brain injury: is cluster analysis a useful approach?

    PubMed

    Snell, Deborah L; Surgenor, Lois J; Hay-Smith, E Jean C; Williman, Jonathan; Siegert, Richard J

    2015-01-01

    Outcomes after mild traumatic brain injury (MTBI) vary, with slow or incomplete recovery for a significant minority. This study examines whether groups of cases with shared psychological factors but with different injury outcomes could be identified using cluster analysis. This is a prospective observational study following 147 adults presenting to a hospital-based emergency department or concussion services in Christchurch, New Zealand. This study examined associations between baseline demographic, clinical, psychological variables (distress, injury beliefs and symptom burden) and outcome 6 months later. A two-step approach to cluster analysis was applied (Ward's method to identify clusters, K-means to refine results). Three meaningful clusters emerged (high-adapters, medium-adapters, low-adapters). Baseline cluster-group membership was significantly associated with outcomes over time. High-adapters appeared recovered by 6-weeks and medium-adapters revealed improvements by 6-months. The low-adapters continued to endorse many symptoms, negative recovery expectations and distress, being significantly at risk for poor outcome more than 6-months after injury (OR (good outcome) = 0.12; CI = 0.03-0.53; p < 0.01). Cluster analysis supported the notion that groups could be identified early post-injury based on psychological factors, with group membership associated with differing outcomes over time. Implications for clinical care providers regarding therapy targets and cases that may benefit from different intensities of intervention are discussed.
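    The two-step procedure above (Ward's hierarchical method to choose initial cluster centres, K-means to refine them) can be sketched in outline. This one-dimensional toy implementation illustrates only the refinement step, with the initial centres assumed to come from a prior Ward's clustering:

    ```python
    def kmeans_refine(points, centroids, iters=20):
        """K-means refinement of initial cluster centres (e.g. from Ward's method)."""
        for _ in range(iters):
            # assign each point to its nearest current centre
            clusters = [[] for _ in centroids]
            for p in points:
                nearest = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
                clusters[nearest].append(p)
            # move each centre to the mean of its assigned points
            centroids = [sum(c) / len(c) if c else centroids[i]
                         for i, c in enumerate(clusters)]
        return centroids
    ```

    In the study itself the "points" are multivariate psychological profiles (distress, injury beliefs, symptom burden) rather than scalars, but the assign-then-update loop is the same.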

  11. Single-diffractive production of dijets within the kt-factorization approach

    NASA Astrophysics Data System (ADS)

    Łuszczak, Marta; Maciuła, Rafał; Szczurek, Antoni; Babiarz, Izabela

    2017-09-01

    We discuss single-diffractive production of dijets. The cross section is calculated within the resolved Pomeron picture, for the first time in the kt-factorization approach, neglecting the transverse momentum of the Pomeron. We use Kimber-Martin-Ryskin unintegrated parton (gluon, quark, antiquark) distributions in both the proton and in the Pomeron or subleading Reggeon. The unintegrated parton distributions are calculated based on conventional mmht2014nlo parton distribution functions in the proton and on the H1 Collaboration diffractive parton distribution functions used previously in analyses of the diffractive structure function and dijets at HERA. For comparison, we present results of calculations performed within the collinear-factorization approach. Our results resemble those obtained in the next-to-leading-order approach. The calculation is (must be) supplemented by the so-called gap survival factor, which may, in general, depend on kinematical variables. We try to describe the existing data from the Tevatron and make detailed predictions for possible LHC measurements. Several differential distributions are calculated. The E¯T, η¯ and xp¯ distributions are compared with the Tevatron data. A reasonable agreement is obtained for the first two distributions. The last one requires introducing a gap survival factor which depends on kinematical variables. We discuss how the phenomenological dependence on one kinematical variable may influence the dependence on other variables, such as E¯T and η¯. Several distributions for the LHC are shown.

  12. Factors affecting the surgical approach and timing of bilateral adrenalectomy.

    PubMed

    Lan, Billy Y; Taskin, Halit E; Aksoy, Erol; Birsen, Onur; Dural, Cem; Mitchell, Jamie; Siperstein, Allan; Berber, Eren

    2015-07-01

    Laparoscopic adrenalectomy has gained widespread acceptance. However, the optimal surgical approach to laparoscopic bilateral adrenalectomy has not been clearly defined. The aim of this study is to analyze the patient and intraoperative factors affecting the feasibility and outcome of different surgical approaches, to define an algorithm for bilateral adrenalectomy. Between 2000 and 2013, all patients who underwent bilateral adrenalectomy at a single institution were selected for retrospective analysis. Patient factors, surgical approach, operative outcomes, and complications were analyzed. From 2000 to 2013, 28 patients underwent bilateral adrenalectomy. Patient diagnoses included Cushing's disease (n = 19), pheochromocytoma (n = 7), and adrenal metastasis (n = 2). Of these 28 patients, successful laparoscopic adrenalectomy was performed in all but 2 patients. Twenty-three of the 26 adrenalectomies were completed in a single stage, while three were performed as a staged approach, due to deterioration in intraoperative respiratory status in two patients and patient body habitus in one. Of the adrenalectomies completed using the minimally invasive approach, a posterior retroperitoneal (PR) approach was performed in 17 patients and a lateral transabdominal (LT) approach in 9 patients. Patients who underwent a LT approach had higher BMI, larger tumor size, and other concomitant intraabdominal pathology. Hospital stay for laparoscopic adrenalectomy was 3.5 days, compared to 5 and 12 days for the two open cases. There was no 30-day hospital mortality, and 5 patients in the entire cohort had minor complications. A minimally invasive operation is feasible in 93% of patients undergoing bilateral adrenalectomy, with 65% of adrenalectomies performed using the PR approach. Indications for the LT approach include morbid obesity, tumor size >6 cm, and other concomitant intraabdominal pathology. 
Single-stage adrenalectomies are feasible in most patients, with prolonged operative

  13. Gender and education impact on brain aging: a general cognitive factor approach.

    PubMed

    Proust-Lima, Cécile; Amieva, Hélène; Letenneur, Luc; Orgogozo, Jean-Marc; Jacqmin-Gadda, Hélène; Dartigues, Jean-François

    2008-09-01

    In cognitive aging research, the study of a general cognitive factor has been shown to have a substantial explanatory power over the study of isolated tests. The authors aimed at differentiating the impact of gender and education on global cognitive change with age from their differential impact on 4 psychometric tests using a new latent process approach, which intermediates between a single-factor longitudinal model for sum scores and an item-response theory approach for longitudinal data. The analysis was conducted on a sample of 2,228 subjects from PAQUID, a population-based cohort of older adults followed for 13 years with repeated measures of cognition. Adjusted for vascular factors, the analysis confirmed that women performed better in tests involving verbal components, while men performed better in tests involving visuospatial skills. In addition, the model suggested that women had a slightly steeper global cognitive decline with oldest age than men, even after excluding incident dementia or death. Subjects with higher education exhibited a better mean score for the 4 tests, but this difference tended to attenuate with age for tests involving a speed component. (c) 2008 APA, all rights reserved

  14. [Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].

    PubMed

    Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina

    2012-09-01

    The validation of a measurement tool in mental health is a complex process that usually starts by estimating reliability and then addresses validity. Factor analysis is a way to establish the number of dimensions, domains or factors of a measuring tool, generally related to the construct validity of the scale. The analysis can be exploratory or confirmatory, and helps in the selection of the items with the best performance. For an acceptable factor analysis, it is necessary to follow some steps and recommendations, conduct some statistical tests, and rely on a proper sample of participants. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  15. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, and, more recently, in the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in an image retrieval method and of correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include the various attacks on the network, such as DDoS attacks and network scanning.
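    The correlation-based anomaly detection mentioned above starts from a Pearson correlation matrix of traffic features; a shift in the matrix relative to a baseline flags anomalous behaviour. A minimal sketch (the feature series below are invented for illustration):

    ```python
    def pearson(x, y):
        """Pearson correlation coefficient of two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    def correlation_matrix(features):
        """Pairwise correlations between traffic features (e.g. packet rate,
        byte rate, connection count); anomalies show up as shifts in this matrix."""
        return [[pearson(f, g) for g in features] for f in features]
    ```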

  16. Factor Analysis and Counseling Research

    ERIC Educational Resources Information Center

    Weiss, David J.

    1970-01-01

    Topics discussed include factor analysis versus cluster analysis, analysis of Q correlation matrices, ipsativity and factor analysis, and tests for the significance of a correlation matrix prior to application of factor analytic techniques. Techniques for factor extraction discussed include principal components, canonical factor analysis, alpha…

  17. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    G. W. Parry; J.A Forester; V.N. Dang

    2013-09-01

    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), and the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.

  18. Factors Mediating the Interactions between Adviser and Advisee during the Master's Thesis Project: A Quantitative Approach

    ERIC Educational Resources Information Center

    Rodrigues Jr., Jose Florencio; Lehmann, Angela Valeria Levay; Fleith, Denise De Souza

    2005-01-01

    Building on previous studies centred on the interaction between adviser and advisee in masters thesis projects, in which a qualitative approach was used, the present study uses factor analysis to identify the factors that determine either a successful or unsuccessful outcome for the masters thesis project. There were five factors relating to the…

  19. Tensor-Dictionary Learning with Deep Kruskal-Factor Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Andrew J.; Pu, Yunchen; Sun, Yannan

    We introduce new dictionary learning methods for tensor-variate data of any order. We represent each data item as a sum of Kruskal decomposed dictionary atoms within the framework of beta-process factor analysis (BPFA). Our model is nonparametric and can infer the tensor-rank of each dictionary atom. This Kruskal-Factor Analysis (KFA) is a natural generalization of BPFA. We also extend KFA to a deep convolutional setting and develop online learning methods. We test our approach on image processing and classification tasks, achieving state-of-the-art results for 2D & 3D inpainting and Caltech 101. The experiments also show that atom-rank impacts both overcompleteness and sparsity.

  20. Network Analysis: A Novel Approach to Understand Suicidal Behaviour

    PubMed Central

    de Beurs, Derek

    2017-01-01

    Although suicide is a major public health issue worldwide, we understand little of the onset and development of suicidal behaviour. Suicidal behaviour is argued to be the end result of a complex interaction between psychological, social and biological factors. Epidemiological studies have identified a range of risk factors for suicidal behaviour, but we do not yet understand how their interaction increases the risk of suicidal behaviour. A new approach called network analysis can help us better understand this process, as it allows us to visualize and quantify the complex associations between many different symptoms or risk factors. A network analysis of data containing information on suicidal patients can help us understand how risk factors interact and how their interaction is related to suicidal thoughts and behaviour. A network perspective has been successfully applied to the fields of depression and psychosis, but not yet to the field of suicidology. In this theoretical article, I introduce the concept of network analysis to the field of suicide prevention and offer directions for future applications and studies.

  1. Generalized five-dimensional dynamic and spectral factor analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    El Fakhri, Georges; Sitek, Arkadiusz; Zimmerman, Robert E.

    2006-04-15

    We have generalized the spectral factor analysis and the factor analysis of dynamic sequences (FADS) in SPECT imaging to a five-dimensional general factor analysis model (5D-GFA), where the five dimensions are the three spatial dimensions, photon energy, and time. The generalized model yields a significant advantage in terms of the ratio of the number of equations to that of unknowns in the factor analysis problem in dynamic SPECT studies. We solved the 5D model using a least-squares approach. In addition to the traditional non-negativity constraints, we constrained the solution using a priori knowledge of both time and energy, assuming that primary factors (spectra) are Gaussian-shaped with full-width at half-maximum equal to gamma camera energy resolution. 5D-GFA was validated in a simultaneous pre-/post-synaptic dual isotope dynamic phantom study where 99mTc and 123I activities were used to model early Parkinson disease studies. 5D-GFA was also applied to simultaneous perfusion/dopamine transporter (DAT) dynamic SPECT in rhesus monkeys. In the striatal phantom, 5D-GFA yielded significantly more accurate and precise estimates of both primary 99mTc (bias = 6.4% ± 4.3%) and 123I (-1.7% ± 6.9%) time activity curves (TAC) compared to conventional FADS (biases = 15.5% ± 10.6% for 99mTc and 8.3% ± 12.7% for 123I, p < 0.05). Our technique was also validated in two primate dynamic dual isotope perfusion/DAT transporter studies. Biases of 99mTc-HMPAO and 123I-DAT activity estimates with respect to estimates obtained in the presence of only one radionuclide (sequential imaging) were significantly lower with 5D-GFA (9.4% ± 4.3% for 99mTc-HMPAO and 8.7% ± 4.1% for 123I-DAT) compared to biases greater than 15% for volumes of interest (VOI) over the reconstructed volumes (p < 0.05). 5D-GFA is a novel and promising approach in dynamic SPECT imaging that can also be used in other modalities. It allows accurate and

  2. Bioinformatics approaches to predict target genes from transcription factor binding data.

    PubMed

    Essebier, Alexandra; Lamprecht, Marnie; Piper, Michael; Bodén, Mikael

    2017-12-01

    Transcription factors regulate gene expression and play an essential role in development by maintaining proliferative states, driving cellular differentiation and determining cell fate. Transcription factors are capable of regulating multiple genes over potentially long distances making target gene identification challenging. Currently available experimental approaches to detect distal interactions have multiple weaknesses that have motivated the development of computational approaches. Although an improvement over experimental approaches, existing computational approaches are still limited in their application, with different weaknesses depending on the approach. Here, we review computational approaches with a focus on data dependency, cell type specificity and usability. With the aim of identifying transcription factor target genes, we apply available approaches to typical transcription factor experimental datasets. We show that approaches are not always capable of annotating all transcription factor binding sites; binding sites should be treated disparately; and a combination of approaches can increase the biological relevance of the set of genes identified as targets. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Modelling and analysis of FMS productivity variables by ISM, SEM and GTMA approach

    NASA Astrophysics Data System (ADS)

    Jain, Vineet; Raj, Tilak

    2014-09-01

    Productivity has often been cited as a key factor in flexible manufacturing system (FMS) performance, and actions to increase it are said to improve profitability and the wage-earning capacity of employees. Improving productivity is seen as a key issue for the long-term survival and success of a manufacturing system. The purpose of this paper is to model and analyse the productivity variables of FMS. The study was performed using different approaches, viz. interpretive structural modelling (ISM), structural equation modelling (SEM), graph theory and matrix approach (GTMA), and a cross-sectional survey within manufacturing firms in India. ISM has been used to develop a model of the productivity variables, which is then analysed. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) are powerful statistical techniques; CFA is carried out by SEM. EFA is applied to extract the factors in FMS using the Statistical Package for the Social Sciences (SPSS 20) software, and these factors are confirmed by CFA through Analysis of Moment Structures (AMOS 20) software. Twenty productivity variables were identified through the literature, and four factors involved in FMS productivity were extracted: people, quality, machine and flexibility. SEM using AMOS 20 was used to fit the first-order four-factor structure. GTMA is a multiple attribute decision making (MADM) methodology used to quantify the intensity of productivity variables in an organization. An FMS productivity index is proposed to quantify the factors that affect FMS.

  4. A retrospective likelihood approach for efficient integration of multiple omics factors in case-control association studies.

    PubMed

    Balliu, Brunilda; Tsonaka, Roula; Boehringer, Stefan; Houwing-Duistermaat, Jeanine

    2015-03-01

    Integrative omics, the joint analysis of outcome and multiple types of omics data, such as genomics, epigenomics, and transcriptomics data, constitute a promising approach for powerful and biologically relevant association studies. These studies often employ a case-control design, and often include nonomics covariates, such as age and gender, that may modify the underlying omics risk factors. An open question is how to best integrate multiple omics and nonomics information to maximize statistical power in case-control studies that ascertain individuals based on the phenotype. Recent work on integrative omics have used prospective approaches, modeling case-control status conditional on omics, and nonomics risk factors. Compared to univariate approaches, jointly analyzing multiple risk factors with a prospective approach increases power in nonascertained cohorts. However, these prospective approaches often lose power in case-control studies. In this article, we propose a novel statistical method for integrating multiple omics and nonomics factors in case-control association studies. Our method is based on a retrospective likelihood function that models the joint distribution of omics and nonomics factors conditional on case-control status. The new method provides accurate control of Type I error rate and has increased efficiency over prospective approaches in both simulated and real data. © 2015 Wiley Periodicals, Inc.
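    The contrast between prospective and retrospective modelling can be made explicit. With case-control status D, omics factors X and nonomics covariates Z, a prospective model maximizes the likelihood of disease status given the risk factors, whereas the retrospective likelihood used here conditions the other way around (a schematic form, not the paper's exact parameterization):

    ```latex
    % Prospective: model disease status given risk factors
    L_{\mathrm{pro}}(\theta) = \prod_{i=1}^{n} P_\theta\!\left(D_i \mid X_i, Z_i\right)

    % Retrospective: model risk factors given the ascertained case-control status
    L_{\mathrm{retro}}(\theta) = \prod_{i=1}^{n} P_\theta\!\left(X_i, Z_i \mid D_i\right)
    ```

    Because case-control samples are ascertained on D, conditioning on D respects the sampling design, which is the source of the efficiency gain the abstract reports.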

  5. Factor analysis of some socio-economic and demographic variables for Bangladesh.

    PubMed

    Islam, S M

    1986-01-01

    The author carries out an exploratory factor analysis of some socioeconomic and demographic variables for Bangladesh using the classical or common factor approach with the varimax rotation method. The socioeconomic and demographic indicators used in this study include literacy, rate of growth, female employment, economic development, urbanization, population density, childlessness, sex ratio, proportion of women ever married, and fertility. The 18 administrative districts of Bangladesh constitute the unit of analysis. 3 common factors--modernization, fertility, and social progress--are identified in this study to explain the correlations among the set of selected socioeconomic and demographic variables.

  6. Quantitative Analysis of Guanine Nucleotide Exchange Factors (GEFs) as Enzymes

    PubMed Central

    Randazzo, Paul A; Jian, Xiaoying; Chen, Pei-Wen; Zhai, Peng; Soubias, Olivier; Northup, John K

    2014-01-01

    The proteins that possess guanine nucleotide exchange factor (GEF) activity, which include ~800 G protein coupled receptors (GPCRs), 15 Arf GEFs, 81 Rho GEFs, 8 Ras GEFs, and others for other families of GTPases, catalyze the exchange of GTP for GDP on all regulatory guanine nucleotide binding proteins. Despite their importance as catalysts, relatively few exchange factors (we are aware of only eight for Ras superfamily members) have been rigorously characterized kinetically. In some cases, kinetic analysis has been simplistic, leading to erroneous conclusions about mechanism (as discussed in a recent review). In this paper, we compare two approaches for determining the kinetic properties of exchange factors: (i) examining individual equilibria, and (ii) analyzing the exchange factors as enzymes. Each approach, when thoughtfully used, provides important mechanistic information about the exchange factors. The analysis as enzymes is described in further detail. With the focus on the production of the biologically relevant guanine nucleotide binding protein complexed with GTP (G•GTP), we believe it is conceptually simpler to connect the kinetic properties to cellular effects. Further, the experiments are often more tractable than those used to analyze the equilibrium system and, therefore, more widely accessible to scientists interested in the function of exchange factors. PMID:25332840
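    Treating a GEF as an enzyme means fitting steady-state catalytic parameters for the production of G•GTP. In the standard Michaelis–Menten form (a generic textbook expression, not the authors' specific model), with the GDP-bound G protein as substrate:

    ```latex
    v \;=\; \frac{d[\mathrm{G{\cdot}GTP}]}{dt}
      \;=\; \frac{k_{\mathrm{cat}}\,[\mathrm{GEF}]\,[\mathrm{G{\cdot}GDP}]}{K_m + [\mathrm{G{\cdot}GDP}]}
    ```

    The ratio k_cat/K_m then serves as the specificity constant for comparing a GEF's activity toward different GTPase substrates, which is the kind of mechanistic quantity the equilibrium-by-equilibrium approach does not yield directly.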

  7. Risk Factors for Central Serous Chorioretinopathy: Multivariate Approach in a Case-Control Study.

    PubMed

    Chatziralli, Irini; Kabanarou, Stamatina A; Parikakis, Efstratios; Chatzirallis, Alexandros; Xirou, Tina; Mitropoulos, Panagiotis

    2017-07-01

    The purpose of this prospective study was to investigate the potential risk factors independently associated with central serous chorioretinopathy (CSR) in a Greek population, using a multivariate approach. Participants in the study were 183 consecutive patients diagnosed with CSR and 183 controls, matched for age. All participants underwent a complete ophthalmological examination, and information regarding their sociodemographic, clinical, medical and ophthalmological history was recorded, so as to assess potential risk factors for CSR. Univariate and multivariate analyses were performed. Univariate analysis showed that male sex, high educational status, high income, alcohol consumption, smoking, hypertension, coronary heart disease, obstructive sleep apnea, autoimmune disorders, H. pylori infection, type A personality and stress, steroid use, pregnancy and hyperopia were associated with CSR, while myopia was found to protect against CSR. In multivariate analysis, alcohol consumption, hypertension, coronary heart disease and autoimmune disorders lost their significance, while the remaining factors were all independently associated with CSR. It is important to take into account the various risk factors for CSR, so as to define vulnerable groups and to shed light on the pathogenesis of the disease.

  8. Psychometric properties of the Epworth Sleepiness Scale: A factor analysis and item-response theory approach.

    PubMed

    Pilcher, June J; Switzer, Fred S; Munc, Alec; Donnelly, Janet; Jellen, Julia C; Lamm, Claus

    2018-04-01

    The purpose of this study is to examine the psychometric properties of the Epworth Sleepiness Scale (ESS) in two languages, German and English. Students from a university in Austria (N = 292; 55 males, mean age = 18.71 ± 1.71 years; 237 females, mean age = 18.24 ± 0.88 years) and a university in the US (N = 329; 128 males, mean age = 18.71 ± 0.88 years; 201 females, mean age = 21.59 ± 2.27 years) completed the ESS. An exploratory factor analysis was completed to examine the dimensionality of the ESS. Item response theory (IRT) analyses were used to provide information about the response rates on the items of the ESS, and differential item functioning (DIF) analyses to examine whether the items were interpreted differently between the two languages. The factor analyses suggest that the ESS measures two distinct sleepiness constructs, indicating that the ESS probes sleepiness in settings requiring active versus passive responding. The IRT analyses found that, overall, the items on the ESS perform well as a measure of sleepiness. However, Item 8, and to a lesser extent Item 6, were interpreted differently by respondents in comparison to the other items. In addition, the DIF analyses showed that the responses in German and English were very similar, indicating that there are only minor measurement differences between the two language versions of the ESS. These findings suggest that the ESS provides a reliable measure of propensity to sleepiness, albeit with a two-factor structure. Researchers and clinicians can use the German and English versions of the ESS but may wish to exclude Item 8 when calculating a total sleepiness score.
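    The IRT analyses referred to here model the probability of endorsing an item as a logistic function of the respondent's latent sleepiness; a minimal two-parameter logistic (2PL) item response function (the parameter values below are illustrative, not the study's estimates):

    ```python
    import math

    def irt_2pl(theta, a, b):
        """2PL item response: probability of endorsing an item given latent
        trait theta, item discrimination a and item difficulty b."""
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    # When theta equals the item difficulty b, the endorsement probability is 0.5.
    # DIF analyses ask whether a and b for the same item differ between groups
    # (here, the German- and English-language respondents).
    p = irt_2pl(theta=0.0, a=1.2, b=0.0)
    ```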

  9. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
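    The dependency trees described above repeatedly split process instances on the lower-level metric value that best separates runs meeting a KPI target from runs missing it. A minimal single-split sketch (the metric values and KPI labels below are invented for illustration):

    ```python
    def best_split(metric, kpi_met):
        """Find the metric threshold that best separates KPI-met (1) from
        KPI-missed (0) process instances, by majority-vote accuracy."""
        pairs = sorted(zip(metric, kpi_met))
        best_thr, best_correct = None, -1
        for i in range(1, len(pairs)):
            thr = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint candidate
            left = [m for v, m in pairs if v <= thr]
            right = [m for v, m in pairs if v > thr]
            # cases classified correctly if each side predicts its majority label
            correct = (max(sum(left), len(left) - sum(left)) +
                       max(sum(right), len(right) - sum(right)))
            if correct > best_correct:
                best_thr, best_correct = thr, correct
        return best_thr

    # e.g. service response times (ms) vs. whether an order-fulfilment KPI was met
    threshold = best_split([100, 120, 300, 350], [1, 1, 0, 0])
    ```

    A full dependency tree applies this split recursively over all process and QoS metrics, which is what lets an analyst drill down from a missed KPI to the metric most responsible.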

  10. Developing Multidimensional Likert Scales Using Item Factor Analysis: The Case of Four-Point Items

    ERIC Educational Resources Information Center

    Asún, Rodrigo A.; Rdz-Navarro, Karina; Alvarado, Jesús M.

    2016-01-01

    This study compares the performance of two approaches in analysing four-point Likert rating scales with a factorial model: the classical factor analysis (FA) and the item factor analysis (IFA). For FA, maximum likelihood and weighted least squares estimations using Pearson correlation matrices among items are compared. For IFA, diagonally weighted…

  11. Improving social connection through a communities-of-practice-inspired cognitive work analysis approach.

    PubMed

    Euerby, Adam; Burns, Catherine M

    2014-03-01

    Increasingly, people work in socially networked environments. With the growing adoption of enterprise social network technologies, supporting an effective social community is becoming an important factor in organizational success. Relatively few human factors methods have been applied to social connection in communities. Although team methods make a contribution, they are not suited to designing for communities. Wenger's community-of-practice concept, combined with cognitive work analysis, provided one way of designing for community. We used a cognitive work analysis approach, modified with principles for supporting communities of practice, to generate a new website design. Over several months, the community using the site was studied to examine its degree of social connectedness and its communication levels. Social network analysis and communications analysis, conducted at three different intervals, showed increases in connections between people and between people and organizations, as well as increased communication following the launch of the new design. In this work, we suggest that human factors approaches can be effective in social environments when applied with social community principles in mind. This work has implications for the development of new human factors methods as well as for the design of interfaces for sociotechnical systems that have community-building requirements.

  12. Electromagnetic N* Transition Form Factors in the ANL-Osaka Dynamical Coupled-Channels Approach

    NASA Astrophysics Data System (ADS)

    Kamano, Hiroyuki

    2018-05-01

    We give an overview of our recent efforts to extract electromagnetic transition form factors for N^* and Δ^* baryon resonances through a global analysis of single-pion electroproduction off the proton within the ANL-Osaka dynamical coupled-channels approach. Preliminary results for the extracted form factors associated with Δ(1232)3/2^+ and the Roper resonance are presented, with emphasis on the complex-valued nature of the transition form factors defined at the resonance poles.

  13. Using Horn's Parallel Analysis Method in Exploratory Factor Analysis for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Çokluk, Ömay; Koçak, Duygu

    2016-01-01

    In this study, the number of factors obtained from parallel analysis, a method used for determining the number of factors in exploratory factor analysis, was compared to that of the factors obtained from eigenvalue and scree plot--two traditional methods for determining the number of factors--in terms of consistency. Parallel analysis is based on…
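    The parallel-analysis procedure compared above can be sketched in a few lines: retain a factor only if its observed eigenvalue exceeds the mean eigenvalue obtained from uncorrelated random data of the same dimensions. This is a minimal assumed implementation (not the study's code), demonstrated on synthetic two-factor data.

```python
# Horn's parallel analysis: compare observed correlation-matrix eigenvalues
# against the average eigenvalues of random (uncorrelated) data.
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand_eig = np.zeros(p)
    for _ in range(n_iter):
        r = rng.standard_normal((n, p))
        rand_eig += np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    rand_eig /= n_iter
    # Number of factors = observed eigenvalues exceeding the random baseline.
    return int(np.sum(obs_eig > rand_eig))

# Synthetic data: 6 items, 3 loading on each of 2 latent factors.
rng = np.random.default_rng(1)
f = rng.standard_normal((300, 2))
loadings = np.array([[.8, 0], [.7, 0], [.6, 0], [0, .8], [0, .7], [0, .6]])
items = f @ loadings.T + 0.5 * rng.standard_normal((300, 6))
print(parallel_analysis(items))
```

    Unlike the eigenvalue-greater-than-one rule, the baseline here adapts to sample size and number of items, which is why parallel analysis tends to be more consistent.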

  14. Learning Approaches, Demographic Factors to Predict Academic Outcomes

    ERIC Educational Resources Information Center

    Nguyen, Tuan Minh

    2016-01-01

    Purpose: The purpose of this paper is to predict academic outcome in math and math-related subjects using learning approaches and demographic factors. Design/Methodology/Approach: ASSIST was used as the instrumentation to measure learning approaches. The study was conducted in the International University of Vietnam with 616 participants. An…

  15. A Factor Analytic and Regression Approach to Functional Age: Potential Effects of Race.

    ERIC Educational Resources Information Center

    Colquitt, Alan L.; And Others

    Factor analysis and multiple regression are two major approaches used to look at functional age, which takes account of the extensive variation in the rate of physiological and psychological maturation throughout life. To examine the role of racial or cultural influences on the measurement of functional age, a battery of 12 tests concentrating on…

  16. Determining the Number of Factors in P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…

  17. Precipitation areal-reduction factor estimation using an annual-maxima centered approach

    USGS Publications Warehouse

    Asquith, W.H.; Famiglietti, J.S.

    2000-01-01

    The adjustment of precipitation depth of a point storm to an effective (mean) depth over a watershed is important for characterizing rainfall-runoff relations and for cost-effective designs of hydraulic structures when design storms are considered. A design storm is the precipitation point depth having a specified duration and frequency (recurrence interval). Effective depths are often computed by multiplying point depths by areal-reduction factors (ARF). ARF range from 0 to 1, vary according to storm characteristics such as recurrence interval, and are a function of watershed characteristics such as watershed size, shape, and geographic location. This paper presents a new approach for estimating ARF and includes applications for the 1-day design storm in Austin, Dallas, and Houston, Texas. The approach, termed 'annual-maxima centered,' specifically considers the distribution of concurrent precipitation surrounding an annual-precipitation maxima, which is a feature not seen in other approaches. The approach does not require the prior spatial averaging of precipitation, explicit determination of spatial correlation coefficients, or explicit definition of a representative area of a particular storm in the analysis. The annual-maxima centered approach was designed to exploit the wide availability of dense precipitation gauge data in many regions of the world. The approach produces ARF that decrease more rapidly than those from TP-29. Furthermore, the ARF from the approach decay rapidly with increasing recurrence interval of the annual-precipitation maxima.

  18. An approach to market analysis for lighter than air transportation of freight

    NASA Technical Reports Server (NTRS)

    Roberts, P. O.; Marcus, H. S.; Pollock, J. H.

    1975-01-01

    An approach is presented to market analysis for lighter-than-air vehicles in a commercial freight market. After a discussion of key characteristics of supply and demand factors, a three-phase approach to market analysis is described. The existing transportation systems are quantitatively defined and possible roles for lighter-than-air vehicles within this framework are postulated. The market analysis views the situation from the perspectives of both the shipper and the carrier. A demand for freight service is assumed and the resulting supply characteristics are determined. Then these supply characteristics are used to establish the demand for competing modes. The process is iterated to arrive at the market solution.

  19. A phasor approach analysis of multiphoton FLIM measurements of three-dimensional cell culture models

    NASA Astrophysics Data System (ADS)

    Lakner, P. H.; Möller, Y.; Olayioye, M. A.; Brucker, S. Y.; Schenke-Layland, K.; Monaghan, M. G.

    2016-03-01

    Fluorescence lifetime imaging microscopy (FLIM) is a useful approach to obtain information regarding the endogenous fluorophores present in biological samples. The concise evaluation of FLIM data requires the use of robust mathematical algorithms. In this study, we developed a user-friendly phasor approach for analyzing FLIM data and applied this method on three-dimensional (3D) Caco-2 models of polarized epithelial luminal cysts in a supporting extracellular matrix environment. These Caco-2 based models were treated with epidermal growth factor (EGF), to stimulate proliferation in order to determine if FLIM could detect such a change in cell behavior. Autofluorescence from nicotinamide adenine dinucleotide (phosphate) (NAD(P)H) in luminal Caco-2 cysts was stimulated by 2-photon laser excitation. Using a phasor approach, the lifetimes of involved fluorophores and their contribution were calculated with fewer initial assumptions when compared to multiexponential decay fitting. The phasor approach simplified FLIM data analysis, making it an interesting tool for non-experts in numerical data analysis. We observed that an increased proliferation stimulated by EGF led to a significant shift in fluorescence lifetime and a significant alteration of the phasor data shape. Our data demonstrates that multiphoton FLIM analysis with the phasor approach is a suitable method for the non-invasive analysis of 3D in vitro cell culture models qualifying this method for monitoring basic cellular features and the effect of external factors.
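    The phasor transform referred to above maps each decay curve to a point (g, s), with g the cosine and s the sine transform of the intensity at the laser's angular modulation frequency ω, each normalized by the total intensity. A minimal sketch under the standard definition follows; the 80 MHz repetition rate, 2.5 ns lifetime, and sampling window are invented for the example, and for a mono-exponential decay the point should land on the "universal semicircle".

```python
# Phasor transform of a fluorescence decay:
#   g = ∫ I(t) cos(ωt) dt / ∫ I(t) dt,   s = ∫ I(t) sin(ωt) dt / ∫ I(t) dt
import numpy as np

def phasor(t, intensity, omega):
    area = np.trapz(intensity, t)
    g = np.trapz(intensity * np.cos(omega * t), t) / area
    s = np.trapz(intensity * np.sin(omega * t), t) / area
    return g, s

omega = 2 * np.pi * 80e6           # 80 MHz repetition rate (assumed)
t = np.linspace(0, 50e-9, 20000)   # 50 ns acquisition window
tau = 2.5e-9                       # 2.5 ns lifetime (assumed)
g, s = phasor(t, np.exp(-t / tau), omega)

# For a single exponential: g = 1/(1+(ωτ)²), s = ωτ/(1+(ωτ)²).
wt = omega * tau
print(g, 1 / (1 + wt**2))
print(s, wt / (1 + wt**2))
```

    Mixtures of fluorophores fall inside the semicircle at the intensity-weighted average of the component phasors, which is why shifts of the phasor cloud (as with the EGF-treated cysts) can be read off without multiexponential fitting.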

  20. Exploratory graph analysis: A new approach for estimating the number of dimensions in psychological research

    PubMed Central

    Golino, Hudson F.; Epskamp, Sacha

    2017-01-01

    The estimation of the correct number of dimensions is a long-standing problem in psychometrics. Several methods have been proposed, such as parallel analysis (PA), Kaiser-Guttman's eigenvalue-greater-than-one rule, the multiple average partial procedure (MAP), maximum-likelihood approaches that use fit indices such as BIC and EBIC, and the less widely used and studied approach called very simple structure (VSS). In the present paper a new approach to estimating the number of dimensions is introduced and compared via simulation to the traditional techniques listed above. The proposed approach is called exploratory graph analysis (EGA), since it is based on the graphical lasso with the regularization parameter specified using EBIC. The number of dimensions is verified using walktrap, a random-walk algorithm used to identify communities in networks. In total, 32,000 data sets were simulated to fit known factor structures, with the data sets varying across different criteria: number of factors (2 and 4), number of items (5 and 10), sample size (100, 500, 1000 and 5000) and correlation between factors (orthogonal, .20, .50 and .70), resulting in 64 different conditions. For each condition, 500 data sets were simulated using lavaan. The results show that EGA performs comparably to parallel analysis, EBIC, eBIC and the Kaiser-Guttman rule in a number of situations, especially when the number of factors was two. However, EGA was the only technique able to correctly estimate the number of dimensions in the four-factor structure when the correlation between factors was .70, showing an accuracy of 100% for a sample size of 5,000 observations. Finally, EGA was used to estimate the number of factors in a real dataset, in order to compare its performance with the other six techniques tested in the simulation study. PMID:28594839
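    The EGA pipeline described above (a regularized partial-correlation network, then community detection) can be sketched as follows. This is a rough stand-in, not the authors' implementation: scikit-learn's GraphicalLassoCV replaces the EBIC-tuned graphical lasso, networkx's greedy modularity communities replace the walktrap algorithm, and the 0.01 edge threshold is an assumption.

```python
# EGA-style dimension count: glasso network + community detection.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(2)
# Synthetic two-factor data: items 0-4 load on factor A, items 5-9 on factor B.
f = rng.standard_normal((1000, 2))
loadings = np.zeros((10, 2))
loadings[:5, 0] = 0.8
loadings[5:, 1] = 0.8
X = f @ loadings.T + 0.6 * rng.standard_normal((1000, 10))

# Sparse inverse covariance -> partial correlation network.
precision = GraphicalLassoCV().fit(X).precision_
partial = -precision / np.sqrt(np.outer(np.diag(precision), np.diag(precision)))
np.fill_diagonal(partial, 0)

# Keep only non-negligible edges, then count communities = dimensions.
G = nx.from_numpy_array(np.abs(partial) * (np.abs(partial) > 0.01))
communities = greedy_modularity_communities(G, weight="weight")
print(len(communities))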

  1. Exploratory graph analysis: A new approach for estimating the number of dimensions in psychological research.

    PubMed

    Golino, Hudson F; Epskamp, Sacha

    2017-01-01

    The estimation of the correct number of dimensions is a long-standing problem in psychometrics. Several methods have been proposed, such as parallel analysis (PA), Kaiser-Guttman's eigenvalue-greater-than-one rule, the multiple average partial procedure (MAP), maximum-likelihood approaches that use fit indices such as BIC and EBIC, and the less widely used and studied approach called very simple structure (VSS). In the present paper a new approach to estimating the number of dimensions is introduced and compared via simulation to the traditional techniques listed above. The proposed approach is called exploratory graph analysis (EGA), since it is based on the graphical lasso with the regularization parameter specified using EBIC. The number of dimensions is verified using walktrap, a random-walk algorithm used to identify communities in networks. In total, 32,000 data sets were simulated to fit known factor structures, with the data sets varying across different criteria: number of factors (2 and 4), number of items (5 and 10), sample size (100, 500, 1000 and 5000) and correlation between factors (orthogonal, .20, .50 and .70), resulting in 64 different conditions. For each condition, 500 data sets were simulated using lavaan. The results show that EGA performs comparably to parallel analysis, EBIC, eBIC and the Kaiser-Guttman rule in a number of situations, especially when the number of factors was two. However, EGA was the only technique able to correctly estimate the number of dimensions in the four-factor structure when the correlation between factors was .70, showing an accuracy of 100% for a sample size of 5,000 observations. Finally, EGA was used to estimate the number of factors in a real dataset, in order to compare its performance with the other six techniques tested in the simulation study.

  2. Examining evolving performance on the Force Concept Inventory using factor analysis

    NASA Astrophysics Data System (ADS)

    Semak, M. R.; Dietz, R. D.; Pearson, R. H.; Willis, C. W.

    2017-06-01

    The application of factor analysis to the Force Concept Inventory (FCI) has proven to be problematic. Some studies have suggested that factor analysis of test results serves as a helpful tool in assessing the recognition of Newtonian concepts by students. Other work has produced at best ambiguous results. For the FCI administered as a pre- and post-test, we see factor analysis as a tool by which the changes in conceptual associations made by our students may be gauged given the evolution of their response patterns. This analysis allows us to identify and track conceptual linkages, affording us insight into how our students have matured due to instruction. We report on our analysis of 427 pre- and post-tests. The factor models for the pre- and post-tests are explored and compared, along with the methodology by which these models were fit to the data. The post-test factor pattern is more aligned with an expert's interpretation of the questions' content, as it allows for a more readily identifiable relationship between factors and physical concepts. We discuss this evolution in the context of approaching the characteristics of an expert with force concepts. We also find that certain test items do not contribute significantly to the pre- or post-test factor models and offer possible explanations why. This suggests that such questions may not be effective in probing the conceptual understanding of our students.

  3. A Factorization Approach to the Linear Regulator Quadratic Cost Problem

    NASA Technical Reports Server (NTRS)

    Milman, M. H.

    1985-01-01

    A factorization approach to the linear regulator quadratic cost problem is developed. This approach makes some new connections between optimal control, factorization, Riccati equations and certain Wiener-Hopf operator equations. Applications of the theory to systems describable by evolution equations in Hilbert space and differential delay equations in Euclidean space are presented.

  4. Hierarchical Bayes approach for subgroup analysis.

    PubMed

    Hsu, Yu-Yi; Zalkikar, Jyoti; Tiwari, Ram C

    2017-01-01

    In clinical data analysis, both treatment effect estimation and consistency assessment are important for a better understanding of drug efficacy for the benefit of subjects in individual subgroups. The linear mixed-effects model has been used for subgroup analysis to describe treatment differences among subgroups with great flexibility. The hierarchical Bayes approach has been applied to the linear mixed-effects model to derive the posterior distributions of overall and subgroup treatment effects. In this article, we discuss prior selection for variance components in hierarchical Bayes, estimation of and decision making about the overall treatment effect, as well as consistency assessment of treatment effects across subgroups based on the posterior predictive p-value. Decision procedures are suggested using either the posterior probability or the Bayes factor. These decision procedures and their properties are illustrated using a simulated example with normally distributed responses and repeated measurements.

  5. E-Education Applications: Human Factors and Innovative Approaches

    ERIC Educational Resources Information Center

    Ghaoui, Claude, Ed.

    2004-01-01

    "E-Education Applications: Human Factors and Innovative Approaches" enforces the need to take multi-disciplinary and/or inter-disciplinary approaches, when solutions for e-education (or online-, e-learning) are introduced. By focusing on the issues that have impact on the usability of e-learning, the book specifically fills-in a gap in this area,…

  6. A quasi-likelihood approach to non-negative matrix factorization

    PubMed Central

    Devarajan, Karthik; Cheung, Vincent C.K.

    2017-01-01

    A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511

  7. Linear mixed-effects modeling approach to FMRI group analysis

    PubMed Central

    Chen, Gang; Saad, Ziad S.; Britton, Jennifer C.; Pine, Daniel S.; Cox, Robert W.

    2013-01-01

    Conventional group analysis is usually performed with Student-type t-tests, regression, or standard AN(C)OVA, in which the variance–covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at the group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) modeling continuous explanatory variables (covariates) in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or a mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of the hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to many complicated cases, including the six prototypes delineated above, whose analyses would otherwise be difficult or unfeasible under traditional frameworks such as AN(C)OVA and the general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance–covariance structures for both random effects and residuals. Intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. The simulations of one prototypical scenario indicate that the LME modeling keeps a balance between the control for false positives and the

  8. Linear mixed-effects modeling approach to FMRI group analysis.

    PubMed

    Chen, Gang; Saad, Ziad S; Britton, Jennifer C; Pine, Daniel S; Cox, Robert W

    2013-06-01

    Conventional group analysis is usually performed with Student-type t-tests, regression, or standard AN(C)OVA, in which the variance-covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at the group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) modeling continuous explanatory variables (covariates) in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or a mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of the hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to many complicated cases, including the six prototypes delineated above, whose analyses would otherwise be difficult or unfeasible under traditional frameworks such as AN(C)OVA and the general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance-covariance structures for both random effects and residuals. Intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. The simulations of one prototypical scenario indicate that the LME modeling keeps a balance between the control for false positives and the sensitivity
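    The core LME idea, a group-level effect estimated with subject-specific random intercepts, can be illustrated with statsmodels as a stand-in for the FMRI tooling the authors describe. The data, effect size (0.4), and variable names below are entirely synthetic.

```python
# LME sketch: two conditions, multiple runs per subject, random
# per-subject intercepts around a fixed condition effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_subj, n_runs = 20, 4
subj = np.repeat(np.arange(n_subj), n_runs * 2)
cond = np.tile(np.repeat([0, 1], n_runs), n_subj)   # two conditions per subject
subj_eff = rng.normal(0, 0.5, n_subj)[subj]         # random intercepts

# Simulated effect estimates: intercept 1.0, condition effect 0.4.
y = 1.0 + 0.4 * cond + subj_eff + rng.normal(0, 0.3, subj.size)
df = pd.DataFrame({"effect": y, "cond": cond, "subject": subj})

model = smf.mixedlm("effect ~ cond", df, groups=df["subject"]).fit()
print(model.params["cond"])   # fixed effect of condition, near 0.4
```

    Because the model partitions between-subject variance into the random intercept, the condition effect is estimated from within-subject contrasts, which is what makes scenario (1) above (multiple runs per subject) tractable.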

  9. Factors affecting construction performance: exploratory factor analysis

    NASA Astrophysics Data System (ADS)

    Soewin, E.; Chinda, T.

    2018-04-01

    The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors, with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) applied to the collected data gave rise to 10 factors, with 57 items, affecting construction performance. The ten key performance factors are: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multidimensional performance evaluation framework for effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technological aspects. It is important to establish a multidimensional performance evaluation framework that includes all key factors affecting the construction performance of a company, so that management can plan and implement an effective performance development plan matched to the mission and vision of the company.

  10. Examining Differential Item Functioning: IRT-Based Detection in the Framework of Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Dimitrov, Dimiter M.

    2017-01-01

    This article offers an approach to examining differential item functioning (DIF) under its item response theory (IRT) treatment in the framework of confirmatory factor analysis (CFA). The approach is based on integrating IRT- and CFA-based testing of DIF and using bias-corrected bootstrap confidence intervals with a syntax code in Mplus.

  11. NEUROTROPHIC FACTORS IN COMBINATORIAL APPROACHES FOR SPINAL CORD REGENERATION

    PubMed Central

    McCall, Julianne; Weidner, Norbert; Blesch, Armin

    2012-01-01

    Axonal regeneration is inhibited by a plethora of different mechanisms in the adult central nervous system (CNS). While neurotrophic factors have been shown to stimulate axonal growth in numerous animal models of nervous system injury, a lack of suitable growth substrates, an insufficient activation of neuron-intrinsic regenerative programs and extracellular inhibitors of regeneration limit the efficacy of neurotrophic factor delivery for anatomical and functional recovery after spinal cord injury. Thus, growth-stimulating factors will likely have to be combined with other treatment approaches to tap into the full potential of growth factor therapy for axonal regeneration. In addition, the temporal and spatial distribution of growth factors have to be tightly controlled to achieve biologically active concentrations, to allow for the chemotropic guidance of axons and to prevent adverse effects related to the widespread distribution of neurotrophic factors. Here, we will review the rationale for combinatorial treatments in axonal regeneration and summarize some recent progress in promoting axonal regeneration in the injured CNS using such approaches. PMID:22526621

  12. Extension Procedures for Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Nagy, Gabriel; Brunner, Martin; Lüdtke, Oliver; Greiff, Samuel

    2017-01-01

    We present factor extension procedures for confirmatory factor analysis that provide estimates of the relations of common and unique factors with external variables that do not undergo factor analysis. We present identification strategies that build upon restrictions of the pattern of correlations between unique factors and external variables. The…

  13. Statistical approach to partial equilibrium analysis

    NASA Astrophysics Data System (ADS)

    Wang, Yougui; Stanley, H. E.

    2009-04-01

    A statistical approach to market equilibrium and efficiency analysis is proposed in this paper. One factor that governs the exchange decisions of traders in a market, termed the willingness price, is highlighted and forms the basis of the theory. The supply and demand functions are formulated as the distributions of corresponding willing exchange over the willingness price. The laws of supply and demand can be derived directly from these distributions. The characteristics of the excess demand function are analyzed and the necessary conditions for the existence and uniqueness of the market's equilibrium point are specified. The rationing rates of buyers and sellers are introduced to describe the ratio of realized exchange to willing exchange, and their dependence on the market price is studied for the cases of shortage and surplus. The realized market surplus, which is the criterion of market efficiency, can be written as a function of the distributions of willing exchange and the rationing rates. With this approach we can strictly prove that a market is efficient in the state of equilibrium.

  14. Chemical factor analysis of skin cancer FTIR-FEW spectroscopic data

    NASA Astrophysics Data System (ADS)

    Bruch, Reinhard F.; Sukuta, Sydney

    2002-03-01

    Chemical factor analysis (CFA) algorithms were applied to transform complex Fourier transform infrared fiberoptic evanescent wave (FTIR-FEW) spectra of normal and malignant skin tissue into factor spaces for analysis and classification. The factor-space approach classified melanoma beyond prior pathological classifications by relating specific biochemical alterations to health states in cluster diagrams, allowing diagnosis with greater biochemical specificity, resolving biochemical component spectra, and employing health-state eigenvector angular configurations as disease-state sensors. This study extracted a wealth of new information from in vivo FTIR-FEW spectral tissue data without extensive a priori information or clinically invasive procedures. In particular, we employed a variety of CFA methods to select the rank of spectroscopic data sets of normal, benign, and cancerous skin tissue. We used the Malinowski indicator function (IND), significance levels, and F-tests to rank our data matrices. Normal skin tissue, melanoma, and benign tumors were modeled by four, two, and seven principal abstract factors, respectively. We also showed that the spectrum of the first eigenvalue is equivalent to the mean spectrum. The graphical depiction of angular disparities between the first abstract factors can be adopted as a new way to characterize and diagnose melanoma.
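    The Malinowski indicator function used above can be sketched from its textbook definition: for an r x c data matrix with eigenvalues λ_j of the covariance-like matrix DᵀD, IND(n) = RE(n)/(c - n)^2 with RE(n) = sqrt(Σ_{j>n} λ_j / (r(c - n))), and the rank estimate is the n that minimizes IND. The following is an illustrative implementation on synthetic three-component "spectra", not the paper's data.

```python
# Malinowski indicator function (IND) for estimating the number of
# abstract factors (rank) of a spectroscopic data matrix.
import numpy as np

def malinowski_ind(D):
    """Return IND(n) for n = 1..c-1; the minimizing n estimates the rank."""
    r, c = D.shape
    ev = np.linalg.eigvalsh(D.T @ D)[::-1]           # eigenvalues, descending
    ind = []
    for n in range(1, c):
        re = np.sqrt(ev[n:].sum() / (r * (c - n)))   # real error RE(n)
        ind.append(re / (c - n) ** 2)
    return np.array(ind)

rng = np.random.default_rng(5)
# Synthetic "spectra": 100 samples, 3 underlying components, small noise.
D = rng.uniform(0, 1, (100, 3)) @ rng.uniform(0, 1, (3, 25))
D += 0.01 * rng.standard_normal(D.shape)
ind = malinowski_ind(D)
print(int(np.argmin(ind)) + 1)   # estimated rank
```

    IND falls steeply while genuine factors are still being removed and rises again once only noise eigenvalues remain, so its minimum marks the rank.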

  15. Comparative Performance Analysis of a Hyper-Temporal Ndvi Analysis Approach and a Landscape-Ecological Mapping Approach

    NASA Astrophysics Data System (ADS)

    Ali, A.; de Bie, C. A. J. M.; Scarrott, R. G.; Ha, N. T. T.; Skidmore, A. K.

    2012-07-01

    Both agricultural area expansion and intensification are necessary to cope with the growing demand for food, and the growing threat of food insecurity which is rapidly engulfing poor and under-privileged sections of the global population. It is therefore of paramount importance to be able to estimate crop area and spatial distribution accurately. Remote sensing has become a valuable tool for estimating and mapping cropland areas, useful in food security monitoring. This work contributes to addressing this broad issue by focusing on a comparative performance analysis of two mapping approaches: (i) a hyper-temporal Normalized Difference Vegetation Index (NDVI) analysis approach and (ii) a landscape-ecological approach. The hyper-temporal NDVI analysis approach utilized SPOT 10-day NDVI imagery from April 1998 to December 2008, whilst the landscape-ecological approach used multitemporal Landsat-7 ETM+ imagery acquired intermittently between 1992 and 2002. Pixels in the time-series NDVI dataset were clustered using an ISODATA clustering algorithm adapted to determine the optimal number of pixel clusters for generalizing hyper-temporal datasets. Clusters were then characterized with crop-cycle and flooding information to produce an NDVI unit map of rice classes with flood regime and NDVI profile information. A landscape-ecological map was generated using a combination of digitized homogeneous map units in the Landsat-7 ETM+ imagery, a 2005 land use map of the Mekong Delta, and supplementary datasets on the region's terrain, geomorphology and flooding depths. The output maps were validated using reported crop statistics, and regression analyses were used to ascertain the relationship between land use areas estimated from the maps and those reported in district crop statistics. The regression analysis showed that the hyper-temporal NDVI analysis approach explained 74% and 76% of the variability in reported crop statistics in two rice crop and three
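    The clustering step can be sketched as follows; plain k-means stands in for the adapted ISODATA algorithm (which additionally splits and merges clusters to choose the cluster count), and the NDVI profiles, class shapes, and noise levels are synthetic illustrations, not SPOT data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hyper-temporal NDVI stack: 200 pixels x 36 dekads (10-day periods).
# Two hypothetical cropping patterns with distinct seasonal NDVI profiles.
t = np.linspace(0, 2 * np.pi, 36)
single_season = 0.3 + 0.4 * np.maximum(np.sin(t), 0)    # one growing season
double_season = 0.3 + 0.35 * np.abs(np.sin(2 * t))      # two growing seasons
pixels = np.vstack([
    single_season + rng.normal(0, 0.03, (100, 36)),
    double_season + rng.normal(0, 0.03, (100, 36)),
])

def cluster_profiles(X, k, iters=50):
    """Plain k-means on NDVI time-series profiles (a simplified stand-in for
    the adapted ISODATA algorithm used in the study)."""
    centers = X[:: len(X) // k][:k].copy()   # deterministic spread-out seeds
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels, centers

labels, centers = cluster_profiles(pixels, k=2)
# Pixels sharing a seasonal profile fall into the same NDVI unit, which is
# then characterised with crop-cycle and flooding information.
```

    Each resulting cluster centre is a mean seasonal NDVI profile, which is the object the study labels with crop-cycle and flood-regime information.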

  16. Precipitation areal-reduction factor estimation using an annual-maxima centered approach

    NASA Astrophysics Data System (ADS)

    Asquith, W. H.; Famiglietti, J. S.

    2000-04-01

    The adjustment of the precipitation depth of a point storm to an effective (mean) depth over a watershed is important for characterizing rainfall-runoff relations and for cost-effective design of hydraulic structures when design storms are considered. A design storm is the precipitation point depth having a specified duration and frequency (recurrence interval). Effective depths are often computed by multiplying point depths by areal-reduction factors (ARF). ARF range from 0 to 1, vary with storm characteristics such as recurrence interval, and are a function of watershed characteristics such as watershed size, shape, and geographic location. This paper presents a new approach for estimating ARF and includes applications for the 1-day design storm in Austin, Dallas, and Houston, Texas. The approach, termed "annual-maxima centered," specifically considers the distribution of concurrent precipitation surrounding an annual-precipitation maximum, a feature not seen in other approaches. The approach does not require prior spatial averaging of precipitation, explicit determination of spatial correlation coefficients, or explicit definition of a representative area of a particular storm in the analysis. The annual-maxima centered approach was designed to exploit the wide availability of dense precipitation gauge data in many regions of the world. The approach produces ARF that decrease more rapidly than those from TP-29. Furthermore, the ARF from the approach decay rapidly with increasing recurrence interval of the annual-precipitation maximum.
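    A minimal numerical sketch of the annual-maxima centered idea follows; the gauge grid and the storm decay profile are invented for illustration, not taken from the Texas data.

```python
import numpy as np

# Idealised storm: concurrent point depths decay away from the gauge that
# recorded the annual maximum. Gauges on a 21 x 21 grid at 1 km spacing.
x, y = np.meshgrid(np.arange(-10, 11), np.arange(-10, 11))
dist = np.hypot(x, y)                        # km from the central maximum
depth = 100.0 * np.exp(-dist / 15.0)         # mm of concurrent precipitation

def annual_maxima_centered_arf(depth, dist, radius_km):
    """ARF = mean concurrent depth within radius_km of the annual maximum,
    divided by the point depth at the maximum itself."""
    inside = dist <= radius_km
    return depth[inside].mean() / depth[dist == 0].item()

arf_small = annual_maxima_centered_arf(depth, dist, 5.0)
arf_large = annual_maxima_centered_arf(depth, dist, 10.0)
# Effective (mean) depth over a watershed = ARF x point design-storm depth;
# the ARF shrinks as the averaging area grows.
```

    In the actual method this averaging is done over observed gauge records centred on each year's maximum, rather than over an analytic decay surface.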

  17. Detecting and correcting the bias of unmeasured factors using perturbation analysis: a data-mining approach.

    PubMed

    Lee, Wen-Chung

    2014-02-05

    The randomized controlled study is the gold-standard research method in biomedicine. In contrast, the validity of a (nonrandomized) observational study is often questioned because of unknown/unmeasured factors, which may have confounding and/or effect-modifying potential. In this paper, the author proposes a perturbation test to detect the bias of unmeasured factors and a perturbation adjustment to correct for such bias. The proposed method circumvents the problem of measuring unknowns by collecting the perturbations of unmeasured factors instead. Specifically, a perturbation is a variable that is readily available (or can be measured easily) and is potentially associated, though perhaps only very weakly, with unmeasured factors. The author conducted extensive computer simulations to provide a proof of concept. The simulations show that, as the number of perturbation variables mined from the data increased, the power of the perturbation test increased progressively, up to nearly 100%. In addition, after the perturbation adjustment, the bias decreased progressively, down to nearly 0%. The data-mining perturbation analysis described here is recommended for use in detecting and correcting the bias of unmeasured factors in observational studies.

  18. Sampling factors influencing accuracy of sperm kinematic analysis.

    PubMed

    Owen, D H; Katz, D F

    1993-01-01

    Sampling conditions that influence the accuracy of experimental measurement of sperm head kinematics were studied by computer simulation methods. Several archetypal sperm trajectories were studied. First, mathematical models of typical flagellar beats were input to hydrodynamic equations of sperm motion. The instantaneous swimming velocities of such sperm were computed over sequences of flagellar beat cycles, from which the resulting trajectories were determined. In a second, idealized approach, direct mathematical models of trajectories were utilized, based upon similarities to the previous hydrodynamic constructs. In general, it was found that analyses of sampling factors produced similar results for the hydrodynamic and idealized trajectories. A number of experimental sampling factors were studied, including the number of sperm head positions measured per flagellar beat, and the time interval over which these measurements are taken. It was found that when one flagellar beat is sampled, values of amplitude of lateral head displacement (ALH) and linearity (LIN) approached their actual values when five or more sample points per beat were taken. Mean angular displacement (MAD) values, however, remained sensitive to sampling rate even when large sampling rates were used. Values of MAD were also much more sensitive to the initial starting point of the sampling procedure than were ALH or LIN. On the basis of these analyses of measurement accuracy for individual sperm, simulations were then performed of cumulative effects when studying entire populations of motile cells. It was found that substantial (double digit) errors occurred in the mean values of curvilinear velocity (VCL), LIN, and MAD under the conditions of 30 video frames per second and 0.5 seconds of analysis time. Increasing the analysis interval to 1 second did not appreciably improve the results. However, increasing the analysis rate to 60 frames per second significantly reduced the errors. 
These findings
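    The sampling-rate effect described in the abstract can be sketched for an idealised trajectory. The sketch below covers VCL and LIN only (ALH and MAD are omitted), and the beat frequency, amplitude, and swimming speed are illustrative values, not the paper's measurements.

```python
import numpy as np

# Idealised trajectory in the spirit of the paper's models: steady forward
# drift plus a sinusoidal lateral wobble of the sperm head.
def trajectory(fps, duration=1.0, beat_hz=10.0, amp=2.0, speed=50.0):
    t = np.arange(0, duration, 1.0 / fps)
    x = speed * t                                  # um, forward progression
    y = amp * np.sin(2 * np.pi * beat_hz * t)      # um, lateral displacement
    return np.column_stack([x, y]), t

def kinematics(pos, t):
    steps = np.linalg.norm(np.diff(pos, axis=0), axis=1)
    vcl = steps.sum() / (t[-1] - t[0])             # curvilinear velocity (VCL)
    vsl = np.linalg.norm(pos[-1] - pos[0]) / (t[-1] - t[0])  # straight-line
    return vcl, vsl / vcl                          # (VCL, LIN = VSL/VCL)

vcl30, lin30 = kinematics(*trajectory(fps=30))     # 3 samples per 10 Hz beat
vcl60, lin60 = kinematics(*trajectory(fps=60))     # 6 samples per beat
# Undersampling at 30 fps smooths out the lateral wobble, biasing VCL low
# and LIN high relative to the 60 fps analysis.
```

    This reproduces the qualitative finding that raising the frame rate, not the analysis interval, is what reduces the measurement error.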

  19. Identifying Risk and Protective Factors in Recidivist Juvenile Offenders: A Decision Tree Approach

    PubMed Central

    Ortega-Campos, Elena; García-García, Juan; Gil-Fenoy, Maria José; Zaldívar-Basurto, Flor

    2016-01-01

    Research on juvenile justice aims to identify profiles of risk and protective factors in juvenile offenders. This paper presents a study of profiles of risk factors that influence young offenders toward committing sanctionable antisocial behavior (S-ASB). Decision tree analysis is used as a multivariate approach to the phenomenon of repeated sanctionable antisocial behavior in juvenile offenders in Spain. The study sample was made up of the set of juveniles who were charged in a court case in the Juvenile Court of Almeria (Spain). The period of study of recidivism was two years from the baseline. The results are presented through the implementation of a decision tree. Two profiles of risk and protective factors are found. The risk factors associated with higher rates of recidivism are antisocial peers, age at baseline S-ASB, problems in school, and criminality in family members. PMID:27611313
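    The core of a decision-tree profile analysis is choosing the risk factor that best splits the sample. The sketch below implements a single CART-style Gini split on simulated records; all variable names, effect sizes, and data are invented to mirror the reported finding that antisocial peers dominate, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated juvenile-offender records: two binary risk factors, with
# antisocial peers the stronger (assumed) driver of recidivism.
n = 400
peers = rng.integers(0, 2, n)       # 1 = antisocial peer group present
school = rng.integers(0, 2, n)      # 1 = problems in school
p_recid = 0.15 + 0.45 * peers + 0.10 * school
recid = (rng.random(n) < p_recid).astype(int)

def gini(y):
    if len(y) == 0:
        return 0.0
    q = y.mean()
    return 2 * q * (1 - q)

def root_split(X, y):
    """Choose the feature whose binary split yields the largest Gini
    decrease, i.e. the root node a CART-style tree learner would pick."""
    parent = gini(y)
    gains = []
    for j in range(X.shape[1]):
        left, right = y[X[:, j] == 1], y[X[:, j] == 0]
        w = len(left) / len(y)
        gains.append(parent - (w * gini(left) + (1 - w) * gini(right)))
    return int(np.argmax(gains)), gains

X = np.column_stack([peers, school])
root, gains = root_split(X, recid)
# The stronger risk factor (antisocial peers, column 0) wins the root split.
```

    A full tree learner simply applies this split recursively to each child node, which is how the two risk/protective profiles emerge.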

  20. Evaluating voice characteristics of first-year acting students in Israel: factor analysis.

    PubMed

    Amir, Ofer; Primov-Fever, Adi; Kushnir, Tami; Kandelshine-Waldman, Osnat; Wolf, Michael

    2013-01-01

    Acting students require diverse, high-quality, and high-intensity vocal performance from early stages of their training. Demanding vocal activities, before developing the appropriate vocal skills, put them in high risk for developing vocal problems. A retrospective analysis of voice characteristics of first-year acting students using several voice evaluation tools. A total of 79 first-year acting students (55 women and 24 men) were assigned into two study groups: laryngeal findings (LFs) and no laryngeal findings, based on stroboscopic findings. Their voice characteristics were evaluated using acoustic analysis, aerodynamic examination, perceptual scales, and self-report questionnaires. Results obtained from each set of measures were examined using a factor analysis approach. Significant differences between the two groups were found for a single fundamental frequency (F(0))-Regularity factor; a single Grade, Roughness, Breathiness, Asthenia, Strain perceptual factor; and the three self-evaluation factors. Gender differences were found for two acoustic analysis factors, which were based on F(0) and its derivatives, namely an aerodynamic factor that represents expiratory volume measurements and a single self-evaluation factor that represents the tendency to seek therapy. Approximately 50% of the first-year acting students had LFs. These students differed from their peers in the control group in a single acoustic analysis factor, as well as perceptual and self-report factors. No group differences, however, were found for the aerodynamic factors. Early laryngeal examination and voice evaluation of future professional voice users could provide a valuable individual baseline, to which later examinations could be compared, and assist in providing personally tailored treatment. Copyright © 2013 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  1. SEPARABLE FACTOR ANALYSIS WITH APPLICATIONS TO MORTALITY DATA

    PubMed Central

    Fosdick, Bailey K.; Hoff, Peter D.

    2014-01-01

    Human mortality data sets can be expressed as multiway data arrays, the dimensions of which correspond to categories by which mortality rates are reported, such as age, sex, country and year. Regression models for such data typically assume an independent error distribution or an error model that allows for dependence along at most one or two dimensions of the data array. However, failing to account for other dependencies can lead to inefficient estimates of regression parameters, inaccurate standard errors and poor predictions. An alternative to assuming independent errors is to allow for dependence along each dimension of the array using a separable covariance model. However, the number of parameters in this model increases rapidly with the dimensions of the array and, for many arrays, maximum likelihood estimates of the covariance parameters do not exist. In this paper, we propose a submodel of the separable covariance model that estimates the covariance matrix for each dimension as having factor analytic structure. This model can be viewed as an extension of factor analysis to array-valued data, as it uses a factor model to estimate the covariance along each dimension of the array. We discuss properties of this model as they relate to ordinary factor analysis, describe maximum likelihood and Bayesian estimation methods, and provide a likelihood ratio testing procedure for selecting the factor model ranks. We apply this methodology to the analysis of data from the Human Mortality Database, and show in a cross-validation experiment how it outperforms simpler methods. Additionally, we use this model to impute mortality rates for countries that have no mortality data for several years. Unlike other approaches, our methodology is able to estimate similarities between the mortality rates of countries, time periods and sexes, and use this information to assist with the imputations. PMID:25489353
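    The separable factor-analytic covariance can be sketched numerically. The array dimensions, ranks, and loading values below are invented for illustration, not estimates from the Human Mortality Database analysis.

```python
import numpy as np

rng = np.random.default_rng(3)

def factor_cov(p, k):
    """Factor-analytic covariance for one array dimension:
    Sigma = Lambda Lambda' + diag(psi), with a rank-k loading matrix."""
    lam = rng.normal(size=(p, k))
    psi = rng.uniform(0.5, 1.5, p)
    return lam @ lam.T + np.diag(psi)

# Hypothetical array dimensions, e.g. (age groups, sexes, years).
dims, ranks = (5, 2, 4), (2, 1, 2)
sigmas = [factor_cov(p, k) for p, k in zip(dims, ranks)]

# Separable covariance of the vectorised array: Sigma_1 (x) Sigma_2 (x) Sigma_3.
sigma = np.kron(np.kron(sigmas[0], sigmas[1]), sigmas[2])

# The factor structure keeps each mode's covariance positive definite while
# using far fewer parameters than an unstructured covariance would.
n_factor = sum(p * k + p for p, k in zip(dims, ranks))
n_unstructured = np.prod(dims) * (np.prod(dims) + 1) // 2
```

    The parameter-count comparison is the point of the submodel: the factor structure along each mode keeps the separable covariance estimable even when the full separable model's maximum likelihood estimates do not exist.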

  2. Factorization approach to superintegrable systems: Formalism and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballesteros, Á., E-mail: angelb@ubu.es; Herranz, F. J., E-mail: fjherranz@ubu.es; Kuru, Ş., E-mail: kuru@science.ankara.edu.tr

    2017-03-15

    The factorization technique for superintegrable Hamiltonian systems is revisited and applied in order to obtain additional (higher-order) constants of the motion. In particular, the factorization approach to the classical anisotropic oscillator on the Euclidean plane is reviewed, and new classical (super) integrable anisotropic oscillators on the sphere are constructed. The Tremblay–Turbiner–Winternitz system on the Euclidean plane is also studied from this viewpoint.

  3. Multilevel Factor Analysis by Model Segregation: New Applications for Robust Test Statistics

    ERIC Educational Resources Information Center

    Schweig, Jonathan

    2014-01-01

    Measures of classroom environments have become central to policy efforts that assess school and teacher quality. This has sparked a wide interest in using multilevel factor analysis to test measurement hypotheses about classroom-level variables. One approach partitions the total covariance matrix and tests models separately on the…

  4. Advances in the indirect, descriptive, and experimental approaches to the functional analysis of problem behavior.

    PubMed

    Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier

    2014-05-01

    Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.

  5. Indonesian railway accidents--utilizing Human Factors Analysis and Classification System in determining potential contributing factors.

    PubMed

    Iridiastadi, Hardianto; Ikatrinasari, Zulfa Fitri

    2012-01-01

    The prevalence of Indonesian railway accidents has not been declining, with hundreds of fatalities reported in the past decade. As an effort to help the National Transportation Safety Committee (NTSC), this study aimed to understand the factors that might have contributed to the accidents. The Human Factors Analysis and Classification System (HFACS) was utilized for this purpose. A total of nine accident reports (provided by the Indonesian NTSC) involving fatalities were studied using the technique. The analysis identified 72 factors that were closely related to the accidents. Of these, roughly 22% were considered operator acts, while about 39% were related to preconditions for operator acts. Supervisory factors represented 14%, and the remaining (about 25%) were associated with organizational factors. It was concluded that, while train drivers indeed played an important role in the accidents, interventions directed solely toward train drivers may not be adequate. A more comprehensive approach to minimizing the accidents should address all four levels of HFACS.

  6. Moving Aerospace Structural Design Practice to a Load and Resistance Factor Approach

    NASA Technical Reports Server (NTRS)

    Larsen, Curtis E.; Raju, Ivatury S.

    2016-01-01

    Aerospace structures are traditionally designed using the factor of safety (FOS) approach. The limit load on the structure is determined, and the structure is then designed for FOS times the limit load (the ultimate load). Probabilistic approaches utilize distributions for loads and strengths; failures are predicted to occur in the region where the two distributions overlap. The load and resistance factor design (LRFD) approach judiciously combines these two approaches through intensive calibration studies on loads and strengths, resulting in structures that are both efficient and reliable. This paper discusses these three approaches.
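    The three design philosophies can be contrasted with a toy check. All factors, loads, and distributions below are illustrative placeholders, not values from any design code or NASA standard.

```python
import math

limit_load = 100.0            # kN, design limit load (illustrative)

# 1. Factor-of-safety approach: design for FOS x limit load (ultimate load).
fos = 1.4
ultimate_load = fos * limit_load

# 2. Probabilistic approach: failure probability from the overlap of normal
#    load and strength distributions, P(strength < load).
mu_load, sd_load = 100.0, 10.0
mu_strength, sd_strength = 150.0, 15.0
beta = (mu_strength - mu_load) / math.hypot(sd_load, sd_strength)
p_fail = 0.5 * math.erfc(beta / math.sqrt(2))

# 3. LRFD: factored resistance must exceed the sum of factored loads,
#    phi * R_n >= sum(gamma_i * Q_i).
phi = 0.9                             # resistance factor
gammas, loads = [1.2, 1.6], [60.0, 25.0]   # e.g. dead and live load cases
lrfd_ok = phi * mu_strength >= sum(g * q for g, q in zip(gammas, loads))
```

    The calibration studies mentioned in the abstract are what tie the LRFD factors (phi, gamma) to a target failure probability like `p_fail`.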

  7. Exploratory Bi-factor Analysis: The Oblique Case.

    PubMed

    Jennrich, Robert I; Bentler, Peter M

    2012-07-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford (Psychometrika 2:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler (Psychometrika 76:537-549, 2011) introduced an exploratory form of bi-factor analysis that does not require one to provide an explicit bi-factor structure a priori. They use exploratory factor analysis and a bi-factor rotation criterion designed to produce a rotated loading matrix that has an approximate bi-factor structure. Among other things, this can be used as an aid in finding an explicit bi-factor structure for use in a confirmatory bi-factor analysis. They considered only orthogonal rotation. The purpose of this paper is to consider oblique rotation and to compare it to orthogonal rotation. Because there are many more oblique rotations of an initial loading matrix than orthogonal rotations, one expects the oblique results to approximate a bi-factor structure better than orthogonal rotations do, and this is indeed the case. A surprising result arises when oblique bi-factor rotation methods are applied to ideal data.

  8. Optimization of healthcare supply chain in context of macro-ergonomics factors by a unique mathematical programming approach.

    PubMed

    Azadeh, A; Motevali Haghighi, S; Gaeini, Z; Shabanpour, N

    2016-07-01

    This study presents an integrated approach for analyzing the impact of macro-ergonomics factors in the healthcare supply chain (HCSC) by data envelopment analysis (DEA). The case of this study is the supply chain (SC) of a real hospital. Healthcare standards and macro-ergonomics factors are modeled by the mathematical programming approach. Over 28 subsidiary SC divisions with parallel missions and objectives are evaluated by analyzing inputs and outputs through DEA. Each division in this HCSC is considered a decision-making unit (DMU). This approach can analyze the impact of macro-ergonomics factors on supply chain management (SCM) in the healthcare sector, and it ranks the relative performance efficiencies of the HCSC divisions. Using the proposed method, the most influential macro-ergonomics factor on the HCSC is identified as teamwork. The study would also help managers identify areas of weakness in their SCM system and set improvement targets for the related SCM system in the healthcare industry. To the best of our knowledge, this is the first study on macro-ergonomics optimization of an HCSC. Copyright © 2016 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  9. A pedagogical approach to the Boltzmann factor through experiments and simulations

    NASA Astrophysics Data System (ADS)

    Battaglia, O. R.; Bonura, A.; Sperandeo-Mineo, R. M.

    2009-09-01

    The Boltzmann factor is the basis of a huge amount of thermodynamic and statistical physics, both classical and quantum. It governs the behaviour of all systems in nature that exchange energy with their environment. Understanding why the expression has this specific form involves a deep mathematical analysis whose flow of logic is hard to see and is beyond the preparation of high school or college students. We here present some experiments and simulations aimed at directly deriving its mathematical expression and illustrating the fundamental concepts on which it is grounded. The experiments use easily available apparatus, and the simulations are developed in the NetLogo environment which, besides having a user-friendly interface, allows easy interaction with the algorithm. The approach supplies pedagogical support for introducing the Boltzmann factor at the undergraduate level to students without a background in statistical mechanics.
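    The kind of simulation the abstract describes can be sketched with a standard energy-exchange toy model (our own illustration in Python, not the article's NetLogo code): agents randomly trade energy quanta, and the exponential occupation of energy levels emerges on its own.

```python
import random
from collections import Counter

random.seed(4)

# N agents each start with m quanta; at each step a random donor with
# nonzero energy gives one quantum to a random receiver. Total energy
# is conserved, which is all the Boltzmann distribution needs.
N, m, steps = 2000, 4, 200_000
energy = [m] * N
for _ in range(steps):
    i, j = random.randrange(N), random.randrange(N)
    if energy[i] > 0:
        energy[i] -= 1
        energy[j] += 1

counts = Counter(energy)
# At equilibrium the occupation of level E decays roughly geometrically,
# the discrete analogue of the Boltzmann factor exp(-E / <E>).
ratio1 = counts[1] / counts[0]
ratio2 = counts[2] / counts[1]
```

    The near-equality of successive ratios is the observable signature of the exponential form, which is the point such classroom simulations are designed to make.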

  10. Risk analysis of the thermal sterilization process. Analysis of factors affecting the thermal resistance of microorganisms.

    PubMed

    Akterian, S G; Fernandez, P S; Hendrickx, M E; Tobback, P P; Periago, P M; Martinez, A

    1999-03-01

    A risk analysis was applied to experimental heat resistance data. This analysis is an approach for processing experimental thermobacteriological data in order to study the variability of D and z values of target microorganisms over the deviation range of environmental factors, to determine the critical factors, and to specify their critical tolerances. The analysis is based on sets of sensitivity functions applied to a specific case of experimental data on the thermal resistance of Clostridium sporogenes and Bacillus stearothermophilus spores. The effect of the following factors was analyzed: the type of target microorganism, the nature of the heating substrate, pH, temperature, the type of acid employed, and NaCl concentration. The type of target microorganism to be inactivated, the nature of the substrate (reference or real food), and the heating temperature were identified as critical factors, determining about 90% of the alteration of the microbiological risk. The effects of the type of acid used for acidification and of the NaCl concentration can be assumed negligible for the purposes of engineering calculations. The critical non-uniformity in temperature during thermobacteriological studies was set at 0.5%, and the critical tolerances of pH value and NaCl concentration were 5%. These results relate to a specific case study and therefore should not be generalized directly.
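    The D and z values at the heart of this analysis follow the standard Bigelow (log-linear) model. A short sketch shows why a small temperature non-uniformity is critical; the reference values are illustrative, of the order used for C. sporogenes spores, not the paper's measured data.

```python
# D value: minutes at temperature T for a tenfold (1 log) reduction in the
# microbial population; z value: the temperature rise that cuts D tenfold.
D_ref, T_ref, z = 1.0, 121.1, 10.0   # min, deg C, deg C (illustrative)

def d_value(T):
    """Bigelow model: D(T) = D_ref * 10**((T_ref - T) / z)."""
    return D_ref * 10 ** ((T_ref - T) / z)

def log_reductions(T, minutes):
    """Delivered lethality: process time divided by D at that temperature."""
    return minutes / d_value(T)

# A 0.5% temperature deficit (the paper's critical tolerance) erodes the
# delivered lethality noticeably, because the model is log-linear in T.
nominal = log_reductions(121.1, 12.0)          # 12 log reductions
low = log_reductions(121.1 * 0.995, 12.0)      # ~0.6 deg C cooler
```

    The sensitivity functions in the paper generalise exactly this kind of calculation across all the environmental factors at once.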

  11. A Transformational Approach to Slip-Slide Factoring

    ERIC Educational Resources Information Center

    Steckroth, Jeffrey

    2015-01-01

    In this "Delving Deeper" article, the author introduces the slip-slide method for solving Algebra 1 mathematics problems. This article compares the traditional method approach of trial and error to the slip-slide method of factoring. Tools that used to be taken for granted now make it possible to investigate relationships visually,…

  12. A simplified approach for slope stability analysis of uncontrolled waste dumps.

    PubMed

    Turer, Dilek; Turer, Ahmet

    2011-02-01

    Slope stability analysis of municipal solid waste has always been problematic because of the heterogeneous nature of the waste materials. The need for large testing equipment to obtain representative samples has motivated simplified approaches for obtaining the unit weight and shear strength parameters of the waste. In the present study, two of the most recently published approaches for determining the unit weight and shear strength parameters of the waste have been incorporated into a slope stability analysis using the Bishop method to prepare slope stability charts. The slope stability charts were prepared for uncontrolled waste dumps having no liner and leachate collection systems, with pore pressure ratios of 0, 0.1, 0.2, 0.3, 0.4 and 0.5, considering the most critical slip surface passing through the toe of the slope. As the proposed slope stability charts were prepared by treating unit weight as a function of height, they reflect field conditions better than a constant unit weight approach in the stability analysis. They also streamline the selection of slope or height as a function of the desired factor of safety.

  13. Random Initialisation of the Spectral Variables: an Alternate Approach for Initiating Multivariate Curve Resolution Alternating Least Square (MCR-ALS) Analysis.

    PubMed

    Kumar, Keshav

    2017-11-01

    Multivariate curve resolution alternating least squares (MCR-ALS) analysis is the most commonly used curve resolution technique. The MCR-ALS model is fitted using the alternating least squares (ALS) algorithm, which needs initialisation of either the contribution profiles or the spectral profiles of each factor. The contribution profiles can be initialised using evolving factor analysis; in principle, however, this approach requires that the data belong to a sequential process. The initialisation of the spectral profiles is usually carried out using a pure-variable approach such as the SIMPLISMA algorithm, which demands that each factor have pure variables in the data set. Despite these limitations, the existing approaches have been quite successful in initiating MCR-ALS analysis. The present work proposes an alternative: initialising the spectral variables by generating random values within the limits spanned by the maxima and minima of each spectral variable of the data set. The proposed approach does not require pure variables for each component of the multicomponent system, nor that the concentration direction follow a sequential process. The approach is successfully validated using excitation-emission matrix fluorescence data sets acquired for fluorophores with significant spectral overlap. The calculated contribution and spectral profiles of these fluorophores correlate well with the experimental results. In summary, the present work proposes an alternative way to initiate MCR-ALS analysis.
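    The proposed random initialisation is simple to state in code. The sketch below uses synthetic Gaussian "spectra" and a bare-bones ALS loop with non-negativity imposed by clipping, which is a crude surrogate for the constrained least squares of full MCR-ALS; none of it is the author's implementation.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic bilinear data D = C S^T: 30 mixtures of two overlapping
# Gaussian-shaped components (a stand-in for the fluorescence data sets).
wl = np.linspace(0.0, 1.0, 80)
S_true = np.vstack([np.exp(-((wl - 0.35) / 0.08) ** 2),
                    np.exp(-((wl - 0.60) / 0.10) ** 2)])
C_true = rng.uniform(0.0, 1.0, (30, 2))
D = C_true @ S_true

def random_spectral_init(D, n_components, rng):
    """The proposed initialisation: draw each spectral variable uniformly
    between the column-wise minimum and maximum of the data matrix."""
    lo, hi = D.min(axis=0), D.max(axis=0)
    return rng.uniform(lo, hi, size=(n_components, D.shape[1]))

def als(D, S, iters=300):
    """Bare-bones alternating least squares with clipping to non-negativity."""
    for _ in range(iters):
        C = np.clip(D @ np.linalg.pinv(S), 0.0, None)
        S = np.clip(np.linalg.pinv(C) @ D, 0.0, None)
    return C, S

S0 = random_spectral_init(D, 2, rng)
C_hat, S_hat = als(D, S0)
resid = np.linalg.norm(D - C_hat @ S_hat) / np.linalg.norm(D)
```

    Note that no pure variables and no sequential concentration direction are assumed anywhere in the initialisation, which is exactly the point of the proposal.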

  14. Demographic Factors, Personality, and Ability as Predictors of Learning Approaches

    ERIC Educational Resources Information Center

    Xie, Qiuzhi; Zhang, Li-fang

    2015-01-01

    This study investigated the extent to which learning approaches can be accounted for by personal factors (i.e., demographics, ability, and personality). The participants were 443 students in a university in mainland China. The Revised Two-factor Study Process Questionnaire, the NEO Five-Factor Inventory-3, and the short form of Raven's Advanced…

  15. CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach

    EPA Pesticide Factsheets

    An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.

  16. Malay sentiment analysis based on combined classification approaches and Senti-lexicon algorithm.

    PubMed

    Al-Saffar, Ahmed; Awang, Suryanti; Tao, Hai; Omar, Nazlia; Al-Saiagh, Wafaa; Al-Bared, Mohammed

    2018-01-01

    Sentiment analysis techniques are increasingly exploited to categorize opinion text into one or more predefined sentiment classes for the creation and automated maintenance of review-aggregation websites. In this paper, a Malay sentiment analysis classification model is proposed to improve classification performance based on semantic orientation and machine learning approaches. First, a total of 2,478 Malay sentiment-lexicon phrases and words are assigned synonyms and stored with the help of more than one Malay native speaker, and the polarity is manually allotted a score. In addition, the supervised machine learning approaches and the lexicon knowledge method are combined for Malay sentiment classification, evaluating thirteen features. Finally, three individual classifiers and a combined classifier are used to evaluate classification accuracy. In the experiments, a wide range of comparative tests is conducted on a Malay Reviews Corpus (MRC), demonstrating that feature extraction improves the performance of Malay sentiment analysis based on the combined classification. The results, however, depend on three factors: the features, the number of features, and the classification approach.

  17. Malay sentiment analysis based on combined classification approaches and Senti-lexicon algorithm

    PubMed Central

    Awang, Suryanti; Tao, Hai; Omar, Nazlia; Al-Saiagh, Wafaa; Al-bared, Mohammed

    2018-01-01

    Sentiment analysis techniques are increasingly exploited to categorize opinion text into one or more predefined sentiment classes for the creation and automated maintenance of review-aggregation websites. In this paper, a Malay sentiment analysis classification model is proposed to improve classification performance based on semantic orientation and machine learning approaches. First, a total of 2,478 Malay sentiment-lexicon phrases and words are assigned synonyms and stored with the help of more than one Malay native speaker, and the polarity is manually allotted a score. In addition, the supervised machine learning approaches and the lexicon knowledge method are combined for Malay sentiment classification, evaluating thirteen features. Finally, three individual classifiers and a combined classifier are used to evaluate classification accuracy. In the experiments, a wide range of comparative tests is conducted on a Malay Reviews Corpus (MRC), demonstrating that feature extraction improves the performance of Malay sentiment analysis based on the combined classification. The results, however, depend on three factors: the features, the number of features, and the classification approach. PMID:29684036

  18. A Factor Graph Approach to Automated GO Annotation.

    PubMed

    Spetale, Flavio E; Tapia, Elizabeth; Krsticevic, Flavia; Roda, Fernando; Bulacio, Pilar

    2016-01-01

    As the volume of genomic data grows, computational methods become essential for providing a first glimpse into gene annotations. Automated Gene Ontology (GO) annotation methods based on hierarchical ensemble classification techniques are particularly interesting when interpretability of annotation results is a main concern. In these methods, raw GO-term predictions computed by base binary classifiers are leveraged by checking the consistency of predefined GO relationships. Both formal leveraging strategies, with a main focus on annotation precision, and heuristic alternatives, with a main focus on scalability, have been described in the literature. In this contribution, a factor graph approach to the hierarchical ensemble formulation of the automated GO annotation problem is presented. In this formal framework, a core factor graph is first built based on the GO structure and then enriched to take into account the noisy nature of GO-term predictions. Starting from raw GO-term predictions, an iterative message passing algorithm between nodes of the factor graph is used to compute marginal probabilities of target GO terms. Evaluations on Saccharomyces cerevisiae, Arabidopsis thaliana and Drosophila melanogaster protein sequences from the GO Molecular Function domain showed significant improvements over competing approaches, even when protein sequences were naively characterized by their physicochemical and secondary structure properties or when loose, noisy annotation datasets were considered. Based on these promising results, and using Arabidopsis thaliana annotation data, we extend our approach to the identification of the most promising molecular function annotations for a set of proteins of unknown function in Solanum lycopersicum.

  19. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  20. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  1. Combined target factor analysis and Bayesian soft-classification of interference-contaminated samples: forensic fire debris analysis.

    PubMed

    Williams, Mary R; Sigman, Michael E; Lewis, Jennifer; Pitan, Kelly McHugh

    2012-10-10

    A Bayesian soft classification method combined with target factor analysis (TFA) is described and tested for the analysis of fire debris data. The method relies on analysis of the average mass spectrum across the chromatographic profile (i.e., the total ion spectrum, TIS) from multiple samples taken from a single fire scene. A library of TIS from reference ignitable liquids with assigned ASTM classification is used as the target factors in TFA. The class-conditional distributions of correlations between the target and predicted factors for each ASTM class are represented by kernel functions and analyzed by Bayesian decision theory. The soft classification approach assists in assessing the probability that ignitable liquid residue from a specific ASTM E1618 class is present in a set of samples from a single fire scene, even in the presence of unspecified background contributions from pyrolysis products. The method is demonstrated with sample data sets and then tested on laboratory-scale burn data and large-scale field test burns. The overall performance achieved in laboratory and field tests of the method is approximately 80% correct classification of fire debris samples. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
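
    The target-testing step of TFA can be sketched in a few lines. This is a simplified illustration only: the toy spectra s1/s2 and mixing proportions are invented, and the published method's kernel-density and Bayesian decision steps are omitted.

```python
import numpy as np

def target_factor_test(samples, target, n_factors=3):
    """Target factor analysis (TFA) sketch: how well can a reference spectrum
    be reproduced from the principal factors of the measured data?

    samples: (n_samples, n_channels) total ion spectra from one fire scene.
    target:  (n_channels,) reference spectrum for a candidate liquid class.
    Returns the correlation between the target and its projection onto the
    span of the first n_factors principal factors; values near 1 suggest the
    target class is consistent with the measured samples.
    """
    X = samples - samples.mean(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    basis = vt[:n_factors]                   # principal factors (rows)
    predicted = basis.T @ (basis @ target)   # projection into factor space
    return float(np.corrcoef(target, predicted)[0, 1])

# toy scene: every sample is a mixture of two reference spectra s1 and s2
s1 = np.array([1.0, 0.0, 2.0, 0.0, 3.0, 0.0])
s2 = np.array([0.0, 2.0, 0.0, 1.0, 0.0, 2.0])
mix = np.array([[1.0, 0.2], [0.5, 1.0], [0.8, 0.6], [0.3, 0.9]])
samples = mix @ np.vstack([s1, s2])
r_present = target_factor_test(samples, s1, n_factors=2)
r_absent = target_factor_test(
    samples, np.array([1.0, -1.0, 1.0, -1.0, 1.0, -1.0]), n_factors=2)
```

    A target actually present in the mixtures projects almost perfectly into the factor space; an unrelated spectrum does not.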

  2. [Application of risk-based approach for determination of critical factors in technology transfer of production of medicinal products].

    PubMed

    Beregovykh, V V; Spitskiy, O R

    2014-01-01

    A risk-based approach is used to examine the impact of different factors on the quality of medicinal products during technology transfer. A general diagram is offered for carrying out risk analysis in technology transfer from pharmaceutical development to production. When transferring technology to full-scale commercial production, it is necessary to investigate and simulate the application of the production process beforehand under the new, real conditions. The manufacturing process is the core factor for risk analysis, having the most impact on the quality attributes of a medicinal product. Further important factors are linked to the materials and products to be handled and to manufacturing environmental conditions such as premises, equipment and personnel. The use of the risk-based approach in the design of a multipurpose production facility for medicinal products is shown, where the quantitative risk analysis tool RAMM (Risk Analysis and Mitigation Matrix) was applied.

  3. Exploratory Bi-Factor Analysis: The Oblique Case

    ERIC Educational Resources Information Center

    Jennrich, Robert I.; Bentler, Peter M.

    2012-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford ("Psychometrika" 47:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler ("Psychometrika" 76:537-549, 2011) introduced an exploratory form of bi-factor…

  4. Analysis of the intellectual structure of human space exploration research using a bibliometric approach: Focus on human related factors

    NASA Astrophysics Data System (ADS)

    Lee, Tai Sik; Lee, Yoon-Sun; Lee, Jaeho; Chang, Byung Chul

    2018-02-01

    Human space exploration (HSE) is an interdisciplinary field composed of a range of subjects that have developed dramatically over the last few decades. This paper investigates the intellectual structure of HSE research with a focus on human related factors. A bibliometric approach with quantitative analytical techniques is applied to study the development and growth of the research. This study retrieves 1921 papers on HSE related to human factors from the year 1990 to the year 2016 from Web of Science and constructs a critical citation network composed of 336 papers. Edge-betweenness-based clustering is used to classify the citation network into twelve distinct research clusters based on four research themes: "biological risks from space radiation," "health and performance during long-duration spaceflight," "program and in-situ resources for HSE missions," and "habitat and life support systems in the space environment." These research themes are also similar to the classification results of a co-occurrence analysis on keywords for a total of 1921 papers. Papers with high centrality scores are identified as important papers in terms of knowledge flow. Moreover, the intermediary role of papers in exchanging knowledge between HSE sub-areas is identified using brokerage analysis. The key-route main path highlights the theoretical development trajectories. Due to the recent dramatic increase in investment by international governments and the private sector, the theoretical development trajectories of key research themes have been expanding from furthering scientific and technical knowledge to include various social and economic issues, thus encouraging massive public participation. This study contributes to an understanding of research trends and popular issues in the field of HSE by introducing a powerful way of determining major research themes and development trajectories. This study will help researchers seek the underlying knowledge diffusion flow from multifaceted
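
    Edge-betweenness-based clustering, used above to partition the citation network, can be illustrated on a toy graph. The network below is hypothetical (the paper worked with a 336-paper citation network); the sketch computes Brandes-style edge betweenness and removes the most central edge until the graph first splits, one step of the Girvan-Newman procedure.

```python
from collections import deque

def edge_betweenness(adj):
    """Edge betweenness for an undirected, unweighted graph {node: set_of_neighbours}."""
    eb = {frozenset((u, v)): 0.0 for u in adj for v in adj[u]}
    for s in adj:
        dist, sigma = {s: 0}, {s: 1.0}   # BFS distances and shortest-path counts
        preds = {v: [] for v in adj}
        order, q = [], deque([s])
        while q:
            v = q.popleft()
            order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] = sigma.get(w, 0.0) + sigma[v]
                    preds[w].append(v)
        delta = {v: 0.0 for v in adj}
        for w in reversed(order):         # accumulate path dependencies
            for v in preds[w]:
                c = sigma[v] / sigma[w] * (1.0 + delta[w])
                eb[frozenset((v, w))] += c
                delta[v] += c
    return {e: b / 2.0 for e, b in eb.items()}   # undirected: halve

def components(adj):
    seen, comps = set(), []
    for s in adj:
        if s in seen:
            continue
        comp, q = {s}, deque([s])
        seen.add(s)
        while q:
            v = q.popleft()
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    comp.add(w)
                    q.append(w)
        comps.append(comp)
    return comps

def girvan_newman_split(adj):
    """Remove highest-betweenness edges until the graph first splits."""
    adj = {u: set(vs) for u, vs in adj.items()}
    while len(components(adj)) == 1:
        eb = edge_betweenness(adj)
        u, v = max(eb, key=eb.get)
        adj[u].discard(v)
        adj[v].discard(u)
    return components(adj)

# toy "citation" network: two tight clusters joined by one bridge edge (2, 3)
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
       3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
parts = girvan_newman_split(adj)
```

    The bridge carries every shortest path between the two clusters, so it has the highest betweenness and is removed first, cleanly separating the communities.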

  5. Alternate solution to generalized Bernoulli equations via an integrating factor: an exact differential equation approach

    NASA Astrophysics Data System (ADS)

    Tisdell, C. C.

    2017-08-01

    Solution methods for exact differential equations via integrating factors have a rich history dating back to Euler (1740), and the ideas enjoy applications in thermodynamics and electromagnetism. Recently, Azevedo and Valentino presented an analysis of the generalized Bernoulli equation, constructing a general solution by linearizing the problem through a substitution. The purpose of this note is to present an alternative approach using 'exact methods', illustrating that a substitution and linearization of the problem is unnecessary. The ideas may be seen as forming a complementary and arguably simpler approach to that of Azevedo and Valentino, one that has the potential to be assimilated and adapted to the pedagogical needs of those learning and teaching exact differential equations in schools, colleges, universities and polytechnics. We illustrate how to apply the ideas through an analysis of the Gompertz equation, which is of interest in biomathematical models of tumour growth.
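
    The Gompertz equation mentioned above, y' = y(a - b ln y), has the closed-form solution y(t) = exp(a/b + C e^(-bt}), which is the kind of outcome such integrating-factor manipulations deliver. The sketch below simply verifies that solution numerically; the parameter values are arbitrary, and this is a check of the formula rather than a reproduction of the note's derivation.

```python
import math

def gompertz(t, a=1.0, b=0.5, C=-1.0):
    """Closed-form solution y(t) = exp(a/b + C*exp(-b*t)) of the Gompertz
    equation y' = y*(a - b*ln(y))."""
    return math.exp(a / b + C * math.exp(-b * t))

def residual(t, a=1.0, b=0.5, C=-1.0, h=1e-6):
    """|y'(t) - y*(a - b*ln(y))| using a central-difference derivative."""
    dydt = (gompertz(t + h, a, b, C) - gompertz(t - h, a, b, C)) / (2 * h)
    y = gompertz(t, a, b, C)
    return abs(dydt - y * (a - b * math.log(y)))
```

    The residual is at the level of finite-difference round-off at every sampled time, confirming the candidate solution satisfies the ODE.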

  6. Combined acute ecotoxicity of malathion and deltamethrin to Daphnia magna (Crustacea, Cladocera): comparison of different data analysis approaches.

    PubMed

    Toumi, Héla; Boumaiza, Moncef; Millet, Maurice; Radetski, Claudemir Marcos; Camara, Baba Issa; Felten, Vincent; Masfaraud, Jean-François; Férard, Jean-François

    2018-04-19

    We studied the combined acute effect (i.e., after 48 h) of deltamethrin (a pyrethroid insecticide) and malathion (an organophosphate insecticide) on Daphnia magna. Two approaches were used to examine the potential interaction effects of eight mixtures of deltamethrin and malathion: (i) calculation of the mixture toxicity index (MTI) and safety factor index (SFI) and (ii) response surface methodology coupled with an isobole-based statistical model (using a generalized linear model). According to the MTI and SFI calculations, one tested mixture was found to be additive while the two other tested mixtures were found to be non-additive (MTI) or antagonistic (SFI); these differences between index responses are due only to differences in the terminology associated with the two indexes. Through the response surface approach and isobologram analysis, we concluded that there was a significant antagonistic effect of the binary mixtures of deltamethrin and malathion on D. magna immobilization after 48 h of exposure. The index approaches and the response surface approach with isobologram analysis are complementary. Calculation of the mixture toxicity index and safety factor index identifies the type of interaction for each tested mixture individually, while the response surface approach with isobologram analysis integrates all the data, providing a global outcome for the type of interactive effect. Only the response surface approach with isobologram analysis allowed a statistical assessment of the ecotoxicological interaction. Nevertheless, we recommend the use of both approaches (i) to identify the combined effects of contaminants and (ii) to improve risk assessment and environmental management.

  7. Structural analysis and design of multivariable control systems: An algebraic approach

    NASA Technical Reports Server (NTRS)

    Tsay, Yih Tsong; Shieh, Leang-San; Barnett, Stephen

    1988-01-01

    The application of algebraic system theory to the design of controllers for multivariable (MV) systems is explored analytically using an approach based on state-space representations and matrix-fraction descriptions. Chapters are devoted to characteristic lambda matrices and canonical descriptions of MIMO systems; spectral analysis, divisors, and spectral factors of nonsingular lambda matrices; feedback control of MV systems; and structural decomposition theories and their application to MV control systems.

  8. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis

    USDA-ARS?s Scientific Manuscript database

    Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...

  9. A Factor Graph Approach to Automated GO Annotation

    PubMed Central

    Spetale, Flavio E.; Tapia, Elizabeth; Krsticevic, Flavia; Roda, Fernando; Bulacio, Pilar

    2016-01-01

    As the volume of genomic data grows, computational methods become essential for providing a first glimpse into gene annotations. Automated Gene Ontology (GO) annotation methods based on hierarchical ensemble classification techniques are particularly interesting when interpretability of annotation results is a main concern. In these methods, raw GO-term predictions computed by base binary classifiers are leveraged by checking the consistency of predefined GO relationships. Both formal leveraging strategies, with a main focus on annotation precision, and heuristic alternatives, with a main focus on scalability issues, have been described in the literature. In this contribution, a factor graph approach to the hierarchical ensemble formulation of the automated GO annotation problem is presented. In this formal framework, a core factor graph is first built based on the GO structure and then enriched to take into account the noisy nature of GO-term predictions. Hence, starting from raw GO-term predictions, an iterative message passing algorithm between nodes of the factor graph is used to compute marginal probabilities of target GO-terms. Evaluations on Saccharomyces cerevisiae, Arabidopsis thaliana and Drosophila melanogaster protein sequences from the GO Molecular Function domain showed significant improvements over competing approaches, even when protein sequences were naively characterized by their physicochemical and secondary structure properties or when loose noisy annotation datasets were considered. Based on these promising results and using Arabidopsis thaliana annotation data, we extend our approach to the identification of the most promising molecular function annotations for a set of proteins of unknown function in Solanum lycopersicum. PMID:26771463
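
    The consistency-leveraging idea can be made concrete on a toy three-term ontology obeying the true-path rule (an annotated child implies its parent). The sketch below computes the factor-graph marginals exactly by enumeration rather than by the paper's iterative message passing (on a tree-shaped graph the sum-product algorithm returns the same marginals); the raw classifier scores and the tolerance for inconsistent configurations are invented.

```python
from itertools import product

# toy ontology: "b" is a child of "a", "c" a child of "b"
edges = [("a", "b"), ("b", "c")]          # (parent, child)
terms = ["a", "b", "c"]

# hypothetical raw scores from base binary classifiers, read as P(term = 1)
raw = {"a": 0.9, "b": 0.6, "c": 0.7}
EPS = 0.05                                # weight of inconsistent configurations

def consistency(parent_val, child_val):
    """Hierarchical-constraint factor: penalize child = 1 with parent = 0."""
    return EPS if (child_val == 1 and parent_val == 0) else 1.0

def marginals():
    """Exact marginals P(term = 1) of the factor-graph joint, by enumeration."""
    weight = {t: 0.0 for t in terms}
    Z = 0.0
    for assign in product([0, 1], repeat=len(terms)):
        x = dict(zip(terms, assign))
        w = 1.0
        for t in terms:                   # evidence factors from raw scores
            w *= raw[t] if x[t] == 1 else 1.0 - raw[t]
        for p, c in edges:                # hierarchical-constraint factors
            w *= consistency(x[p], x[c])
        Z += w
        for t in terms:
            if x[t] == 1:
                weight[t] += w
    return {t: weight[t] / Z for t in terms}
```

    The constraints pull the noisy predictions toward consistency: a confidently predicted child raises its parent's marginal, while a leaf whose ancestors are uncertain is revised downward.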

  10. The common risk factor approach: a rational basis for promoting oral health.

    PubMed

    Sheiham, A; Watt, R G

    2000-12-01

    Conventional oral health education is neither effective nor efficient. Many oral health programmes are developed and implemented in isolation from other health programmes. This often leads, at best, to a duplication of effort or, at worst, to conflicting messages being delivered to the public. In addition, oral health programmes tend to concentrate on individual behaviour change and largely ignore the influence of socio-political factors as the key determinants of health. Based upon the general principles of health promotion, this paper presents a rationale for an alternative approach to oral health policy. The common risk factor approach addresses risk factors common to many chronic conditions within the context of the wider socio-environmental milieu. Oral health is determined by diet, hygiene, smoking, alcohol use, stress and trauma. As these causes are common to a number of other chronic diseases, adopting a collaborative approach is more rational than one that is disease specific. The common risk factor approach can be implemented in a variety of ways. Food policy development and the Health Promoting Schools initiative are used as examples of effective ways of promoting oral health.

  11. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis.

    PubMed

    Cheng, Feon W; Gao, Xiang; Bao, Le; Mitchell, Diane C; Wood, Craig; Sliwinski, Martin J; Smiciklas-Wright, Helen; Still, Christopher D; Rolston, David D K; Jensen, Gordon L

    2017-07-01

    To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. The conditional inference tree analysis, a data mining approach, was used to construct a risk stratification algorithm for developing functional limitation based on BMI and other potential risk factors for disability in 1,951 older adults without functional limitations at baseline (baseline age 73.1 ± 4.2 y). We also analyzed the data with multivariate stepwise logistic regression and compared the two approaches (e.g., cross-validation). Over a mean of 9.2 ± 1.7 years of follow-up, 221 individuals developed functional limitation. Higher BMI, age, and comorbidity were consistently identified as significant risk factors for functional decline using both methods. Based on these factors, individuals were stratified into four risk groups via the conditional inference tree analysis. Compared to the low-risk group, all other groups had a significantly higher risk of developing functional limitation. The odds ratio comparing two extreme categories was 9.09 (95% confidence interval: 4.68, 17.6). Higher BMI, age, and comorbid disease were consistently identified as significant risk factors for functional decline among older individuals across all approaches and analyses. © 2017 The Obesity Society.
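
    The study's trees were built with the conditional inference tree (ctree) algorithm from R's party/partykit ecosystem. As a bare-bones, purely illustrative stand-in, the sketch below searches one binary split on a single continuous risk factor using a two-sample z-statistic; the BMI values, outcomes and resulting cut point are invented and are not the study's thresholds, and the real algorithm adds permutation-based significance testing and recursive partitioning.

```python
import math

def best_split(xs, ys):
    """Search binary splits on a continuous risk factor; keep the cut point
    that maximizes the two-sample z-statistic for the difference in event
    rates between the two resulting groups (one node of a tree)."""
    values = sorted(set(xs))
    best_t, best_z = None, 0.0
    for lo, hi in zip(values, values[1:]):
        t = (lo + hi) / 2.0
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        n1, n2 = len(left), len(right)
        p1, p2 = sum(left) / n1, sum(right) / n2
        p = (sum(left) + sum(right)) / (n1 + n2)   # pooled event rate
        if p in (0.0, 1.0):
            continue
        z = abs(p1 - p2) / math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
        if z > best_z:
            best_t, best_z = t, z
    return best_t, best_z

# hypothetical toy data: BMI values and 0/1 functional-limitation outcomes
bmi = [22, 23, 24, 25, 26, 27, 28, 29, 31, 32, 33, 34, 35, 36, 37, 38]
limited = [0, 0, 0, 1, 0, 0, 0, 0, 1, 1, 1, 0, 1, 1, 1, 1]
threshold, zstat = best_split(bmi, limited)
```

    On these fabricated data the search recovers the built-in cut near BMI 30, the point at which the event rate jumps.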

  12. Using an interdisciplinary approach to identify factors that affect clinicians' compliance with evidence-based guidelines.

    PubMed

    Gurses, Ayse P; Marsteller, Jill A; Ozok, A Ant; Xiao, Yan; Owens, Sharon; Pronovost, Peter J

    2010-08-01

    Our objective was to identify factors that affect clinicians' compliance with the evidence-based guidelines using an interdisciplinary approach and develop a conceptual framework that can provide a comprehensive and practical guide for designing effective interventions. A literature review and a brainstorming session with 11 researchers from a variety of scientific disciplines were used to identify theoretical and conceptual models describing clinicians' guideline compliance. MEDLINE, EMBASE, CINAHL, and the bibliographies of the papers identified were used as data sources for identifying the relevant theoretical and conceptual models. Thirteen different models that originated from various disciplines including medicine, rural sociology, psychology, human factors and systems engineering, organizational management, marketing, and health education were identified. Four main categories of factors that affect compliance emerged from our analysis: clinician characteristics, guideline characteristics, system characteristics, and implementation characteristics. Based on these findings, we developed an interdisciplinary conceptual framework that specifies the expected interrelationships among these four categories of factors and their impact on clinicians' compliance. An interdisciplinary approach is needed to improve clinicians' compliance with evidence-based guidelines. The conceptual framework from this research can provide a comprehensive and systematic guide to identify barriers to guideline compliance and design effective interventions to improve patient safety.

  13. Factor Analysis of Intern Effectiveness

    ERIC Educational Resources Information Center

    Womack, Sid T.; Hannah, Shellie Louise; Bell, Columbus David

    2012-01-01

    Four factors in teaching intern effectiveness, as measured by a Praxis III-similar instrument, were found among observational data of teaching interns during the 2010 spring semester. Those factors were lesson planning, teacher/student reflection, fairness & safe environment, and professionalism/efficacy. This factor analysis was as much of a…

  14. Global analysis of bacterial transcription factors to predict cellular target processes.

    PubMed

    Doerks, Tobias; Andrade, Miguel A; Lathe, Warren; von Mering, Christian; Bork, Peer

    2004-03-01

    Whole-genome sequences are now available for >100 bacterial species, giving unprecedented power to comparative genomics approaches. We have applied genome-context methods to predict target processes that are regulated by transcription factors (TFs). Of 128 orthologous groups of proteins annotated as TFs, to date, 36 are functionally uncharacterized; in our analysis we predict a probable cellular target process or biochemical pathway for half of these functionally uncharacterized TFs.

  15. "Revisiting" the South Pacific Approaches to Learning: A Confirmatory Factor Analysis Study

    ERIC Educational Resources Information Center

    Phan, Huy P.; Deo, Bisun

    2008-01-01

    There has been substantial research evidence concerning the learning approaches of students in Western and non-Western contexts. Nonetheless, it has been a decade since research in the South Pacific was conducted on the learning approaches of tertiary students. The present research examined the learning approaches of Fijian and other Pacific…

  16. Factors that influence the approach to leadership: directors of nursing working in rural health services.

    PubMed

    Bish, Melanie; Kenny, Amanda; Nay, Rhonda

    2015-04-01

    To identify factors that influence directors of nursing in their approach to leadership when working in rural Victoria, Australia. In rural areas, nurses account for the largest component of the health workforce and must be equipped with leadership knowledge and skills to lead reform at a service level. A qualitative descriptive design was used. In-depth semi-structured interviews were undertaken with directors of nursing from rural Victoria. Data were analysed using thematic analysis and a thematic network was developed. Empowerment emerged as the highest order category in the thematic network. This was derived from three organising themes: influence, capital and contextual understanding and the respective basic themes: formal power, informal power, self-knowledge; information, support, resources; and situational factors, career trajectory, connectedness. Rural nurse leaders contend with several issues that influence their approach to leadership. This study provides a platform for further research to foster nurse leadership in rural healthcare services. Acknowledgement of what influences the rural nurse leaders' approach to leadership may assist in the implementation of initiatives designed to develop leadership in a manner that is contextually sensitive. © 2013 John Wiley & Sons Ltd.

  17. Robust Bayesian Factor Analysis

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Yuan, Ke-Hai

    2003-01-01

    Bayesian factor analysis (BFA) assumes the normal distribution of the current sample conditional on the parameters. Practical data in social and behavioral sciences typically have significant skewness and kurtosis. If the normality assumption is not attainable, the posterior analysis will be inaccurate, although the BFA depends less on the current…

  18. A sequential factorial analysis approach to characterize the effects of uncertainties for supporting air quality management

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Veawab, A.

    2013-03-01

    This study proposes a sequential factorial analysis (SFA) approach for supporting regional air quality management under uncertainty. SFA is capable not only of examining the interactive effects of input parameters, but also of analyzing the effects of constraints. When there are too many factors involved in practical applications, SFA has the advantage of conducting a sequence of factorial analyses for characterizing the effects of factors in a systematic manner. The factor-screening strategy employed in SFA is effective in greatly reducing the computational effort. The proposed SFA approach is applied to a regional air quality management problem for demonstrating its applicability. The results indicate that the effects of factors are evaluated quantitatively, which can help decision makers identify the key factors that have significant influence on system performance and explore the valuable information that may be veiled beneath their interrelationships.
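
    The effect-evaluation step at the core of a factorial analysis can be sketched for a full 2^k design with coded factor levels of -1 and +1. The toy response below (a hypothetical linear model with one interaction) stands in for the air-quality model runs; the sequential screening over many factors described in the abstract is not reproduced here.

```python
from itertools import product

def factorial_effects(response):
    """Main and interaction effects for a full 2^k factorial design.

    response maps tuples of coded factor levels in {-1, +1} to the model
    output; each effect is the usual contrast (mean response at +1 minus
    mean response at -1 for that factor combination). Screening the largest
    |effect| values mirrors the factor-screening strategy of SFA.
    """
    k = len(next(iter(response)))
    effects = {}
    for subset in product([0, 1], repeat=k):
        if not any(subset):
            continue
        name = "x".join(f"F{i + 1}" for i in range(k) if subset[i])
        contrast = 0.0
        for levels, yv in response.items():
            sign = 1
            for i in range(k):
                if subset[i]:
                    sign *= levels[i]
            contrast += sign * yv
        effects[name] = contrast / (len(response) / 2)
    return effects

# toy response surface: y = 3*F1 + 1*F2 + 0.5*F1*F2
resp = {(-1, -1): -3.5, (-1, 1): -2.5, (1, -1): 1.5, (1, 1): 4.5}
eff = factorial_effects(resp)
```

    Each recovered effect is twice the corresponding model coefficient (6, 2 and 1 here), so ranking |effects| immediately identifies F1 as the key factor.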

  19. Charmless two-body B decays: A global analysis with QCD factorization

    NASA Astrophysics Data System (ADS)

    Du, Dongsheng; Sun, Junfeng; Yang, Deshan; Zhu, Guohuai

    2003-01-01

    In this paper, we perform a global analysis of B→PP and PV decays with the QCD factorization approach. It is encouraging to observe that the predictions of QCD factorization are in good agreement with experiment. The best fit γ is around 79 °. The penguin-diagram to tree-diagram ratio |Pππ/Tππ| of π+π- decays is preferred to be larger than 0.3. We also show the confidence levels for some interesting channels: B0→π0π0, K+K-, and B+→ωπ+, ωK+. For B→πK* decays, they are expected to have smaller branching ratios with more precise measurements.

  20. Assessing the Structure of the Ways of Coping Questionnaire in Fibromyalgia Patients Using Common Factor Analytic Approaches.

    PubMed

    Van Liew, Charles; Santoro, Maya S; Edwards, Larissa; Kang, Jeremy; Cronan, Terry A

    2016-01-01

    The Ways of Coping Questionnaire (WCQ) is a widely used measure of coping processes. Despite its use in a variety of populations, there has been concern about the stability and structure of the WCQ across different populations. This study examines the factor structure of the WCQ in a large sample of individuals diagnosed with fibromyalgia. The participants were 501 adults (478 women) who were part of a larger intervention study. Participants completed the WCQ at their 6-month assessment. Foundational factoring approaches were performed on the data (i.e., maximum likelihood factoring [MLF], iterative principal factoring [IPF], principal axis factoring [PAF], and principal components factoring [PCF]) with oblique oblimin rotation. Various criteria were evaluated to determine the number of factors to be extracted, including Kaiser's rule, Scree plot visual analysis, 5 and 10% unique variance explained, 70 and 80% communal variance explained, and Horn's parallel analysis (PA). It was concluded that the 4-factor PAF solution was the preferable solution, based on PA extraction and the fact that this solution minimizes nonvocality and multivocality. The present study highlights the need for more research focused on defining the limits of the WCQ and the degree to which population-specific and context-specific subscale adjustments are needed.
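
    Horn's parallel analysis, the extraction criterion the authors ultimately relied on, can be sketched as follows. The two-factor toy data are simulated, not the WCQ responses, and the iteration count and seed are arbitrary.

```python
import numpy as np

def parallel_analysis(data, n_iter=200, seed=0):
    """Horn's parallel analysis: retain components whose eigenvalues of the
    sample correlation matrix exceed the mean eigenvalues obtained from
    random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    real = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.zeros(p)
    for _ in range(n_iter):
        sim = rng.standard_normal((n, p))
        rand += np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
    rand /= n_iter
    return int((real > rand).sum()), real, rand

# simulated questionnaire with two clean factors loading on four items each
rng = np.random.default_rng(1)
scores = rng.standard_normal((300, 2))
loadings = np.zeros((8, 2))
loadings[:4, 0] = 1.0
loadings[4:, 1] = 1.0
data = scores @ loadings.T + 0.1 * rng.standard_normal((300, 8))
n_keep, real, rand = parallel_analysis(data)
```

    Only the eigenvalues exceeding what chance correlation alone would produce are retained, which here correctly recovers the two simulated factors.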

  1. Assessing the Structure of the Ways of Coping Questionnaire in Fibromyalgia Patients Using Common Factor Analytic Approaches

    PubMed Central

    Edwards, Larissa; Kang, Jeremy

    2016-01-01

    The Ways of Coping Questionnaire (WCQ) is a widely used measure of coping processes. Despite its use in a variety of populations, there has been concern about the stability and structure of the WCQ across different populations. This study examines the factor structure of the WCQ in a large sample of individuals diagnosed with fibromyalgia. The participants were 501 adults (478 women) who were part of a larger intervention study. Participants completed the WCQ at their 6-month assessment. Foundational factoring approaches were performed on the data (i.e., maximum likelihood factoring [MLF], iterative principal factoring [IPF], principal axis factoring [PAF], and principal components factoring [PCF]) with oblique oblimin rotation. Various criteria were evaluated to determine the number of factors to be extracted, including Kaiser's rule, Scree plot visual analysis, 5 and 10% unique variance explained, 70 and 80% communal variance explained, and Horn's parallel analysis (PA). It was concluded that the 4-factor PAF solution was the preferable solution, based on PA extraction and the fact that this solution minimizes nonvocality and multivocality. The present study highlights the need for more research focused on defining the limits of the WCQ and the degree to which population-specific and context-specific subscale adjustments are needed. PMID:28070160

  2. Human factors systems approach to healthcare quality and patient safety

    PubMed Central

    Carayon, Pascale; Wetterneck, Tosha B.; Rivera-Rodriguez, A. Joy; Hundt, Ann Schoofs; Hoonakker, Peter; Holden, Richard; Gurses, Ayse P.

    2013-01-01

    Human factors systems approaches are critical for improving healthcare quality and patient safety. The SEIPS (Systems Engineering Initiative for Patient Safety) model of work system and patient safety is a human factors systems approach that has been successfully applied in healthcare research and practice. Several research and practical applications of the SEIPS model are described. Important implications of the SEIPS model for healthcare system and process redesign are highlighted. Principles for redesigning healthcare systems using the SEIPS model are described. Balancing the work system and encouraging the active and adaptive role of workers are key principles for improving healthcare quality and patient safety. PMID:23845724

  3. Testing of technology readiness index model based on exploratory factor analysis approach

    NASA Astrophysics Data System (ADS)

    Ariani, AF; Napitupulu, D.; Jati, RK; Kadar, JA; Syafrullah, M.

    2018-04-01

    SMEs' readiness to use ICT will determine their adoption of ICT in the future. This study aims to evaluate the technology readiness model as a basis for applying technology in SMEs. The model is tested to determine whether the TRI model is relevant for measuring ICT adoption, especially for SMEs in Indonesia. The research method used in this paper is a survey of a group of SMEs in South Tangerang. The survey measures readiness to adopt ICT based on four variables: Optimism, Innovativeness, Discomfort, and Insecurity. Each variable contains several indicators to ensure the variable is measured thoroughly. The data collected through the survey are analysed using the factor analysis method with the help of SPSS software. The result of this study shows that the TRI model produces additional sub-factors for some indicators and variables. This result may be caused by the fact that SME owners' knowledge is not homogeneous, whether about the technology they use or about the type of their business.

  4. [Causal analysis approaches in epidemiology].

    PubMed

    Dumas, O; Siroux, V; Le Moual, N; Varraso, R

    2014-02-01

    Epidemiological research is mostly based on observational studies. Whether such studies can provide evidence of causation remains debated. Several causal analysis methods have been developed in epidemiology. This paper aims at presenting an overview of these methods: graphical models, path analysis and its extensions, and models based on the counterfactual approach, with a special emphasis on marginal structural models. Graphical approaches have been developed to allow synthetic representations of supposed causal relationships in a given problem. They serve as qualitative support in the study of causal relationships. The sufficient-component cause model has been developed to deal with the issue of multicausality raised by the emergence of chronic multifactorial diseases. Directed acyclic graphs are mostly used as a visual tool to identify possible confounding sources in a study. Structural equation models, the main extension of path analysis, combine a system of equations and a path diagram, representing a set of possible causal relationships. They allow quantifying direct and indirect effects in a general model in which several relationships can be tested simultaneously. Dynamic path analysis further takes into account the role of time. The counterfactual approach defines causality by comparing the observed event and the counterfactual event (the event that would have been observed if, contrary to the fact, the subject had received a different exposure than the one he actually received). This theoretical approach has shown the limits of traditional methods to address some causality questions. In particular, in longitudinal studies, when there is time-varying confounding, classical methods (regressions) may be biased. Marginal structural models have been developed to address this issue. In conclusion, "causal models", though they were developed partly independently, are based on equivalent logical foundations. A crucial step in the application of these models is the

  5. Common and Specific Factors Approaches to Home-Based Treatment: I-FAST and MST

    ERIC Educational Resources Information Center

    Lee, Mo Yee; Greene, Gilbert J.; Fraser, J. Scott; Edwards, Shivani G.; Grove, David; Solovey, Andrew D.; Scott, Pamela

    2013-01-01

    Objectives: This study examined the treatment outcomes of integrated families and systems treatment (I-FAST), a moderated common factors approach, in reference to multisystemic therapy (MST), an established specific factors approach, for treating at-risk children and adolescents and their families in an intensive community-based setting. Method:…

  6. The Infinitesimal Jackknife with Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Preacher, Kristopher J.; Jennrich, Robert I.

    2012-01-01

    The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both…

  7. Do we need sustainability as a new approach in human factors and ergonomics?

    PubMed

    Zink, Klaus J; Fischer, Klaus

    2013-01-01

    The International Ergonomics Association Technical Committee 'Human Factors and Sustainable Development' was established to contribute to a broad discourse about opportunities and risks resulting from current societal 'mega-trends' and their impacts on the interactions among humans and other elements of a system, e.g. in work systems. This paper focuses on the underlying key issues: how do the sustainability paradigm and human factors/ergonomics interact, and is sustainability necessary as a new approach for our discipline? Based on a discussion of the sustainability concept, some general principles for designing new and enhancing existing approaches of human factors and ergonomics regarding their orientation towards sustainability are proposed. The increasing profile of sustainability on the international stage presents new opportunities for human factors/ergonomics. Positioning of the sustainability paradigm within human factors/ergonomics is discussed. Approaches to incorporating sustainability in the design of work systems are considered.

  8. Replica Analysis for Portfolio Optimization with Single-Factor Model

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2017-06-01

    In this paper, we use replica analysis to investigate the influence of correlation among the return rates of assets on the solution of the portfolio optimization problem. We consider the behavior of an optimal solution for the case where the return rate is described with a single-factor model and compare the findings obtained from our proposed methods with correlated return rates with those obtained with independent return rates. We then analytically assess the increase in the investment risk when correlation is included. Furthermore, we also compare our approach with analytical procedures for minimizing the investment risk from operations research.

  9. Factor Retention in Exploratory Factor Analysis: A Comparison of Alternative Methods.

    ERIC Educational Resources Information Center

    Mumford, Karen R.; Ferron, John M.; Hines, Constance V.; Hogarty, Kristine Y.; Kromrey, Jeffery D.

    This study compared the effectiveness of 10 methods of determining the number of factors to retain in exploratory common factor analysis. The 10 methods included the Kaiser rule and a modified Kaiser criterion, 3 variations of parallel analysis, 4 regression-based variations of the scree procedure, and the minimum average partial procedure. The…
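    Of the retention methods compared in that study, Horn's parallel analysis lends itself to a short code sketch: factors are retained as long as their observed eigenvalues exceed a chosen percentile of eigenvalues computed from random data of the same dimensions. The function name and defaults below are illustrative, not taken from the study:

```python
import numpy as np

def parallel_analysis(data, n_iter=100, percentile=95, seed=0):
    """Horn's parallel analysis: retain components whose observed
    eigenvalues exceed the given percentile of eigenvalues from
    random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    # observed eigenvalues of the correlation matrix, descending
    obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand_eig = np.empty((n_iter, p))
    for i in range(n_iter):
        r = rng.standard_normal((n, p))
        rand_eig[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    threshold = np.percentile(rand_eig, percentile, axis=0)
    # count the leading run of eigenvalues above the random threshold
    k = 0
    for o, t in zip(obs_eig, threshold):
        if o > t:
            k += 1
        else:
            break
    return k
```

    The same loop structure accommodates the variations the study compares (e.g., mean rather than 95th-percentile random eigenvalues) by changing the `percentile` argument.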

  10. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    PubMed

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.
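    Since principal component extraction with varimax rotation dominates the reviewed articles, a compact sketch of the rotation step may help. This is the standard SVD-based form of Kaiser's varimax criterion (a generic implementation, not code from any reviewed study): it orthogonally rotates a loading matrix so the variance of squared loadings within each factor is maximized:

```python
import numpy as np

def varimax(loadings, tol=1e-6, max_iter=100):
    """Varimax rotation (Kaiser criterion, SVD formulation).
    Returns the orthogonally rotated p x k loading matrix."""
    p, k = loadings.shape
    R = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        # gradient of the varimax criterion w.r.t. the rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - L * (np.sum(L**2, axis=0) / p)))
        R = u @ vt
        new_var = np.sum(s)
        if new_var - var < tol:
            break
        var = new_var
    return loadings @ R
```

    Because the rotation is orthogonal, each variable's communality (row sum of squared loadings) is unchanged; only the distribution of loading across factors becomes simpler to interpret.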

  11. Assessing the safety effects of cooperative intelligent transport systems: A bowtie analysis approach.

    PubMed

    Ehlers, Ute Christine; Ryeng, Eirin Olaussen; McCormack, Edward; Khan, Faisal; Ehlers, Sören

    2017-02-01

    The safety effects of cooperative intelligent transport systems (C-ITS) are mostly unknown and associated with uncertainties, because these systems represent emerging technology. This study proposes a bowtie analysis as a conceptual framework for evaluating the safety effects of cooperative intelligent transport systems. These systems seek to prevent road traffic accidents or mitigate their consequences. Under the assumption of the potential occurrence of a particular single vehicle accident, three case studies demonstrate the application of the bowtie analysis approach in road traffic safety. The approach utilizes exemplary expert estimates and knowledge from literature on the probability of the occurrence of accident risk factors and of the success of safety measures. Fuzzy set theory is applied to handle uncertainty in expert knowledge. Based on this approach, a useful tool is developed to estimate the effects of safety-related cooperative intelligent transport systems in terms of the expected change in accident occurrence and consequence probability. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. A Comparison of Component and Factor Patterns: A Monte Carlo Approach.

    ERIC Educational Resources Information Center

    Velicer, Wayne F.; And Others

    1982-01-01

    Factor analysis, image analysis, and principal component analysis are compared with respect to the factor patterns they would produce under various conditions. The general conclusion that is reached is that the three methods produce results that are equivalent. (Author/JKS)

  13. Systems analysis of multiple regulator perturbations allows discovery of virulence factors in Salmonella

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Hyunjin; Ansong, Charles; McDermott, Jason E.

    Background: Systemic bacterial infections are highly regulated and complex processes that are orchestrated by numerous virulence factors. Genes that are coordinately controlled by the set of regulators required for systemic infection are potentially required for pathogenicity. Results: In this study we present a systems biology approach in which sample-matched multi-omic measurements of fourteen virulence-essential regulator mutants were coupled with computational network analysis to efficiently identify Salmonella virulence factors. Immunoblot experiments verified network-predicted virulence factors and a subset was determined to be secreted into the host cytoplasm, suggesting that they are virulence factors directly interacting with host cellular components. Two of these, SrfN and PagK2, were required for full mouse virulence and were shown to be translocated independent of either of the type III secretion systems in Salmonella or the type III injectisome-related flagellar mechanism. Conclusions: Integrating multi-omic datasets from Salmonella mutants lacking virulence regulators not only identified novel virulence factors but also defined a new class of translocated effectors involved in pathogenesis. The success of this strategy at discovery of known and novel virulence factors suggests that the approach may have applicability for other bacterial pathogens.

  14. An Approach to Knowledge-Directed Image Analysis,

    DTIC Science & Technology

    1977-09-01

    AN APPROACH TO KNOWLEDGE-DIRECTED IMAGE ANALYSIS. D.H. Ballard, C.M. Brown, J.A. Feldman, Computer Science Department, The University of Rochester, Rochester, New York 14627. …semantic network model and a distributed control structure to accomplish the image analysis process. The process of "understanding an image" leads to

  15. Total Factor Productivity Growth, Technical Progress & Efficiency Change in Vietnam Coal Industry - Nonparametric Approach

    NASA Astrophysics Data System (ADS)

    Phuong, Vu Hung

    2018-03-01

    This research applies a Data Envelopment Analysis (DEA) approach to analyze Total Factor Productivity (TFP) and efficiency changes in the Vietnam coal mining industry from 2007 to 2013. The TFP of Vietnam coal mining companies decreased due to slow technological progress and stagnant efficiency. The decline in technical efficiency in many enterprises indicates that the coal mining industry has large potential to increase productivity through improved technical efficiency. Enhancing human resource training and investing in technology and research & development could help the industry improve its efficiency and productivity.
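    The DEA building block behind such TFP comparisons is one linear program per decision-making unit. A minimal input-oriented CCR sketch using scipy's `linprog` (the data shapes, function name, and model form below are illustrative assumptions; the study does not specify its software or exact model):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR DEA efficiency score of unit j0.
    X: (m inputs x n units), Y: (s outputs x n units).
    Solves: min theta s.t. X@lam <= theta*x0, Y@lam >= y0, lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)          # variables: [theta, lambda_1..lambda_n]
    c[0] = 1.0                   # minimize theta
    A_in = np.hstack([-X[:, [j0]], X])       # X@lam - theta*x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # -Y@lam <= -y0
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[:, j0]]),
                  bounds=[(0, None)] * (1 + n),
                  method="highs")
    return res.x[0]
```

    For example, with one input and one output and two units consuming 2 and 4 units of input for the same 2 units of output, the first unit scores 1.0 (efficient) and the second 0.5; a Malmquist TFP index then compares such scores across years.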

  16. A risk-factor analysis of medical litigation judgments related to fall injuries in Korea.

    PubMed

    Kim, Insook; Won, Seonae; Lee, Mijin; Lee, Won

    2018-01-01

    The aim of this study was to identify risk factors through an analysis of seven medical malpractice judgments related to fall injuries. The risk factors were analysed using a framework that approaches falls from a systems perspective, comprising people, organisational and environmental factors, each composed of subfactors. The risk factors found in each of the seven judgments were aggregated into one framework. The risk factors related to patients (the people factor) were age, pain, related disease, activities and functional status, urination state, cognitive function impairment, past history of falls, blood transfusion, sleep endoscopy state and uncooperative attitude. The risk factors related to medical staff and caregivers (also the people factor) were observation negligence, absence of fall prevention activities and negligence in managing the high-risk group for falls. Organisational risk factors were a lack of workforce, a lack of training, neglecting the management of the high-risk group, neglecting the management of caregivers and the absence of a fall prevention procedure. Regarding the environment, the risk factors were the emergency room, chairs without a backrest and the examination table. Identifying risk factors is essential for preventing falls, since falls are preventable patient-safety incidents. Falls do not happen as a result of a single risk factor; therefore, a systems approach is effective for identifying risk factors, especially organisational and environmental ones.

  17. Analysis of functional redundancies within the Arabidopsis TCP transcription factor family.

    PubMed

    Danisman, Selahattin; van Dijk, Aalt D J; Bimbo, Andrea; van der Wal, Froukje; Hennig, Lars; de Folter, Stefan; Angenent, Gerco C; Immink, Richard G H

    2013-12-01

    Analyses of the functions of TEOSINTE-LIKE1, CYCLOIDEA, and PROLIFERATING CELL FACTOR1 (TCP) transcription factors have been hampered by functional redundancy among the family's individual members. In general, putative functionally redundant genes are predicted based on sequence similarity and confirmed by genetic analysis. In the TCP family, however, identification is impeded by relatively low overall sequence similarity. In a search for functionally redundant TCP pairs that control Arabidopsis leaf development, this work performed an integrative bioinformatics analysis, combining protein sequence similarities, gene expression data, and results of pair-wise protein-protein interaction studies for the 24 members of the Arabidopsis TCP transcription factor family. For this, the work experimentally completed the missing gene expression and protein-protein interaction data and then performed a comprehensive prediction of potentially functionally redundant TCP pairs. Subsequently, redundant functions could be confirmed for selected predicted TCP pairs by genetic and molecular analyses. It is demonstrated that the previously uncharacterized class I TCP19 gene plays a role in the control of leaf senescence in a redundant fashion with TCP20. Altogether, this work shows the power of combining classical genetic and molecular approaches with bioinformatics predictions to unravel functional redundancies in the TCP transcription factor family.

  19. Principal components analysis of an evaluation of the hemiplegic subject based on the Bobath approach.

    PubMed

    Corriveau, H; Arsenault, A B; Dutil, E; Lepage, Y

    1992-01-01

    An evaluation based on the Bobath approach to treatment has previously been developed and partially validated. The purpose of the present study was to verify the content validity of this evaluation using the statistical approach known as principal components analysis. Thirty-eight hemiplegic subjects participated in the study. Scores on each of six parameters (sensorium, active movements, muscle tone, reflex activity, postural reactions, and pain) were evaluated on three occasions across a 2-month period. Each analysis produced three factors that together contained 70% of the variation in the data set. The first component mainly reflected variations in mobility, the second mainly variations in muscle tone, and the third mainly variations in sensorium and pain. The results of this exploratory analysis highlight the fact that some of the parameters are not only important but also interrelated. These results seem to partially support the conceptual framework substantiating the Bobath approach to treatment.
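    The computation underlying such an analysis is compact: eigendecomposition of the correlation matrix of the parameter scores, with each eigenvalue's share of the total giving the proportion of variation a component explains. A generic numpy sketch (not the study's actual code):

```python
import numpy as np

def principal_components(scores):
    """PCA on a subjects x parameters score matrix: returns the
    component loadings (eigenvectors of the correlation matrix)
    and the proportion of variance each component explains."""
    R = np.corrcoef(scores, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(R)      # ascending order
    order = np.argsort(eigvals)[::-1]         # sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = eigvals / eigvals.sum()
    return eigvecs, explained
```

    Retaining the three leading components and checking that their cumulative `explained` share reaches about 0.70 reproduces the kind of summary the study reports.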

  20. Interstage Flammability Analysis Approach

    NASA Technical Reports Server (NTRS)

    Little, Jeffrey K.; Eppard, William M.

    2011-01-01

    The Interstage of the Ares I launch platform houses several key components which are on standby during First Stage operation: the Reaction Control System (ReCS), the Upper Stage (US) Thrust Vector Control (TVC) and the J-2X with the Main Propulsion System (MPS) propellant feed system. Potentially dangerous propellant leaks could therefore develop. The Interstage leaks analysis addresses the concerns of localized mixing of hydrogen and oxygen gases to produce deflagration zones in the Interstage of the Ares I launch vehicle during First Stage operation. This report details the approach taken to accomplish the analysis. Specified leakage profiles and actual flammability results are not presented due to proprietary and security restrictions. The interior volume formed by the Interstage walls, bounding interfaces with the Upper and First Stages, and surrounding the J-2X engine was modeled using Loci-CHEM to assess the potential for flammable gas mixtures to develop during First Stage operations. The transient analysis included a derived flammability indicator based on mixture ratios to maintain achievable simulation times. Validation of results was based on a comparison to Interstage pressure profiles outlined in prior NASA studies. The approach proved useful in the bounding of flammability risk in supporting program hazard reviews.

  1. A New, More Powerful Approach to Multitrait-Multimethod Analyses: An Application of Second-Order Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Marsh, Herbert W.; Hocevar, Dennis

    The advantages of applying confirmatory factor analysis (CFA) to multitrait-multimethod (MTMM) data are widely recognized. However, because CFA as traditionally applied to MTMM data incorporates single indicators of each scale (i.e., each trait/method combination), important weaknesses are the failure to: (1) correct appropriately for measurement…

  2. Systemic Analysis Approaches for Air Transportation

    NASA Technical Reports Server (NTRS)

    Conway, Sheila

    2005-01-01

    Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling of safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, let alone predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historical and potential further use for air transportation analysis.

  3. Maximizing the Information and Validity of a Linear Composite in the Factor Analysis Model for Continuous Item Responses

    ERIC Educational Resources Information Center

    Ferrando, Pere J.

    2008-01-01

    This paper develops results and procedures for obtaining linear composites of factor scores that maximize: (a) test information, and (b) validity with respect to external variables in the multiple factor analysis (FA) model. I treat FA as a multidimensional item response theory model, and use Ackerman's multidimensional information approach based…

  4. A Futures Approach to Policy Analysis.

    ERIC Educational Resources Information Center

    Morrison, James L.

    An approach to policy analysis for college officials is described that is based on evaluating and using information about the external environment to consider policy options for the future. The futures approach involves the following tasks: establishing an environmental scanning system to identify critical trends and emerging issues, identifying…

  5. A Quantitative Approach to Scar Analysis

    PubMed Central

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-01-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794
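    Box counting is the usual way to estimate a fractal dimension from a segmented image: count the boxes N(s) of side s that contain foreground at a series of sizes, then fit the slope of log N(s) against log(1/s). A minimal sketch under that assumption (the study's confocal pipeline, and its lacunarity computation, are more involved):

```python
import numpy as np

def box_counting_dimension(img):
    """Estimate the fractal dimension of a 2-D binary image as the
    slope of log N(s) vs log(1/s), halving the box size each step."""
    img = np.asarray(img, dtype=bool)
    sizes, counts = [], []
    s = min(img.shape) // 2
    while s >= 1:
        # count boxes of side s containing at least one foreground pixel
        n = 0
        for i in range(0, img.shape[0], s):
            for j in range(0, img.shape[1], s):
                if img[i:i + s, j:j + s].any():
                    n += 1
        sizes.append(s)
        counts.append(n)
        s //= 2
    slope = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]
    return slope
```

    As a sanity check, a solid filled square yields a dimension near 2 and a single straight line of pixels yields a dimension near 1; scar collagen networks fall in between.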

  6. Benefit-Risk Analysis for Decision-Making: An Approach.

    PubMed

    Raju, G K; Gurumurthi, K; Domike, R

    2016-12-01

    The analysis of benefit and risk is an important aspect of decision-making throughout the drug lifecycle. In this work, the use of a benefit-risk analysis approach to support decision-making was explored. The proposed approach builds on the qualitative US Food and Drug Administration (FDA) approach to include a more explicit analysis based on international standards and guidance that enables aggregation and comparison of benefit and risk on a common basis and a lifecycle focus. The approach is demonstrated on six decisions over the lifecycle (e.g., accelerated approval, withdrawal, and traditional approval) using two case studies: natalizumab for multiple sclerosis (MS) and bedaquiline for multidrug-resistant tuberculosis (MDR-TB). © 2016 American Society for Clinical Pharmacology and Therapeutics.

  7. Mean structure analysis from an IRT approach: an application in the context of organizational psychology.

    PubMed

    Revuelta Menéndez, Javier; Ximénez Gómez, Carmen

    2012-11-01

    The application of mean and covariance structure analysis with quantitative data is increasing. However, latent means analysis with qualitative data is not as widespread. This article summarizes the procedures to conduct an analysis of latent means of dichotomous data from an item response theory approach. We illustrate the implementation of these procedures in an empirical example referring to the organizational context, where a multi-group analysis was conducted to compare the latent means of three employee groups in two factors measuring personal preferences and the perceived degree of rewards from the organization. Results show that higher personal motivations are associated with higher perceived importance of the organization, and that these perceptions differ across groups, so that higher-level employees have a lower level of personal and perceived motivation. The article shows how to estimate the factor means and the factor correlation from dichotomous data, and how to assess goodness of fit. Lastly, we provide the M-Plus syntax code in order to facilitate the latent means analyses for applied researchers.

  8. Phylogenetic Factor Analysis.

    PubMed

    Tolkoff, Max R; Alfaro, Michael E; Baele, Guy; Lemey, Philippe; Suchard, Marc A

    2018-05-01

    Phylogenetic comparative methods explore the relationships between quantitative traits adjusting for shared evolutionary history. This adjustment often occurs through a Brownian diffusion process along the branches of the phylogeny that generates model residuals or the traits themselves. For high-dimensional traits, inferring all pair-wise correlations within the multivariate diffusion is limiting. To circumvent this problem, we propose phylogenetic factor analysis (PFA) that assumes a small unknown number of independent evolutionary factors arise along the phylogeny and these factors generate clusters of dependent traits. Set in a Bayesian framework, PFA provides measures of uncertainty on the factor number and groupings, combines both continuous and discrete traits, integrates over missing measurements and incorporates phylogenetic uncertainty with the help of molecular sequences. We develop Gibbs samplers based on dynamic programming to estimate the PFA posterior distribution, over 3-fold faster than for multivariate diffusion and a further order-of-magnitude more efficiently in the presence of latent traits. We further propose a novel marginal likelihood estimator for previously impractical models with discrete data and find that PFA also provides a better fit than multivariate diffusion in evolutionary questions in columbine flower development, placental reproduction transitions and triggerfish fin morphometry.

  9. Exploratory factor analysis of the Research and Development Culture Index among qualified nurses.

    PubMed

    Watson, Bill; Clarke, Charlotte; Swallow, Vera; Forster, Stewart

    2005-10-01

    This paper presents the exploratory factor analysis of a rating instrument for assessing the strength of organizational Research and Development (R&D) culture. Despite nursing's limited research capacity, the discipline is capitalizing upon opportunities to become involved in research and is making strong progress. Within the context of the debate on nursing research capacity, the R&D Culture Index was developed as a means of appraising R&D culture within health care organizations. Factor analysis was carried out on data collected from 485 nursing staff. The method of extraction was Principal Components Analysis with oblique rotation. The Index was developed from the findings of qualitative research conducted with NHS staff. Eighteen items, encompassing the main themes from the data, were initially included in the Index. This pilot instrument was distributed to nursing staff within three different types of NHS Trust. Factor analysis resulted in rejection of two items and the analysis was repeated using the remaining 16 items. Three latent factors were extracted accounting for 58.0% of the variance in the data. The factors were: R&D Support, describing the perceived support within the working environment for R&D activity; Personal R&D Skills and Aptitude, describing an individual's perception of their ability towards R&D activity; and Personal R&D Intention, describing an individual's willingness to engage in R&D activity. Each factor had good internal reliability, as did the overall index. The R&D Culture Index provides an efficient means of assessing the strength of an organization's R&D culture in a way that captures the role of the individual practitioner and the organizational environment. These findings suggest that the continuing promotion of R&D within health care organizations is dependent upon a multi-faceted approach that addresses the learning needs of the organization as well as those of the individual practitioners.

  10. Risk factors for incidental durotomy during lumbar surgery: a retrospective study by multivariate analysis.

    PubMed

    Chen, Zhixiang; Shao, Peng; Sun, Qizhao; Zhao, Dong

    2015-03-01

    The purpose of the present study was to use prospectively collected data to evaluate the rate of incidental durotomy (ID) during lumbar surgery and to determine the associated risk factors using univariate and multivariate analysis. We retrospectively reviewed 2184 patients who underwent lumbar surgery from January 1, 2009 to December 31, 2011 at a single hospital. Patients with ID (n=97) were compared with patients without ID (n=2019). The influences of several potential risk factors that might affect the occurrence of ID were assessed using univariate and multivariate analyses. The overall incidence of ID was 4.62%. Univariate analysis demonstrated that older age, diabetes, lumbar central stenosis, posterior approach, revision surgery, prior lumbar surgery and minimally invasive surgery are risk factors for ID during lumbar surgery. However, multivariate analysis identified older age, prior lumbar surgery, revision surgery, and minimally invasive surgery as independent risk factors. Older age, prior lumbar surgery, revision surgery, and minimally invasive surgery were independent risk factors for ID during lumbar surgery. These findings may guide clinicians in making future surgical decisions regarding ID and aid in the patient counseling process to alleviate risks and complications. Copyright © 2015 Elsevier B.V. All rights reserved.
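    The univariate screening step in a study like this often reduces, per candidate factor, to an odds ratio from a 2x2 table with a Woolf (log-based) confidence interval; factors significant there are then entered into the multivariate logistic model. A stdlib sketch with illustrative counts (not the paper's actual data):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, (lo, hi)
```

    A confidence interval excluding 1 flags the factor as a univariate risk factor; the multivariate step then adjusts these estimates for each other.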

  11. E-Learning Personalization Using Triple-Factor Approach in Standard-Based Education

    NASA Astrophysics Data System (ADS)

    Laksitowening, K. A.; Santoso, H. B.; Hasibuan, Z. A.

    2017-01-01

    E-Learning can be a tool for monitoring the learning process and progress towards a targeted competency. Process and progress can differ from one learner to another, since every learner may have a different learning type. The learning type itself can be identified by taking into account learning style, motivation, and knowledge ability. This study explores personalization of the learning type based on the Triple-Factor Approach. Considering that the factors in the Triple-Factor Approach are dynamic, the personalization system needs to accommodate the changes that may occur. Arising from this issue, this study proposes personalization that dynamically guides learner progression through the stages of the learning process. The personalization is implemented in the form of interventions that prompt learners to access learning content and discussion forums more often, as well as improve their level of knowledge ability based on the state of their learning type.

  12. The Structure of Temperament in Preschoolers: A Two-Stage Factor Analytic Approach

    PubMed Central

    Dyson, Margaret W.; Olino, Thomas M.; Durbin, C. Emily; Goldsmith, H. Hill; Klein, Daniel N.

    2012-01-01

    The structure of temperament traits in young children has been the subject of extensive debate, with separate models proposing different trait dimensions. This research has relied almost exclusively on parent-report measures. The present study used an alternative approach, a laboratory observational measure, to explore the structure of temperament in preschoolers. A 2-stage factor analytic approach, exploratory factor analyses (n = 274) followed by confirmatory factor analyses (n = 276), was used. We retrieved an adequately fitting model that consisted of 5 dimensions: Sociability, Positive Affect/Interest, Dysphoria, Fear/Inhibition, and Constraint versus Impulsivity. This solution overlaps with, but is also distinct from, the major models derived from parent-report measures. PMID:21859196

  13. Factors Influencing Cecal Intubation Time during Retrograde Approach Single-Balloon Enteroscopy

    PubMed Central

    Chen, Peng-Jen; Shih, Yu-Lueng; Huang, Hsin-Hung; Hsieh, Tsai-Yuan

    2014-01-01

    Background and Aim. The predisposing factors for prolonged cecal intubation time (CIT) during colonoscopy have been well identified. However, the factors influencing CIT during retrograde single-balloon enteroscopy (SBE) have not been addressed. The aim of this study was to determine the factors influencing CIT during retrograde SBE. Methods. We investigated patients who underwent retrograde SBE at a medical center from January 2011 to March 2014. The medical charts and SBE reports were reviewed, and the patients' characteristics and procedure-associated data were recorded. These data were analyzed with univariate analysis as well as multivariate logistic regression analysis to identify possible predisposing factors. Results. We enrolled 66 patients in this study. The median CIT was 17.4 minutes. On univariate analysis, age, sex, BMI, and history of abdominal surgery showed no statistically significant association with prolonged CIT; only bowel preparation did (P = 0.021). Multivariate logistic regression analysis showed that inadequate bowel preparation (odds ratio 30.2, 95% confidence interval 4.63–196.54; P < 0.001) was the sole independent predisposing factor for prolonged CIT during retrograde SBE. Conclusions. For experienced endoscopists, inadequate bowel preparation was the independent predisposing factor for prolonged CIT during retrograde SBE. PMID:25505904
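    The unadjusted version of such an odds ratio comes straight from a 2×2 cross-tabulation. A minimal sketch with made-up counts (not the study's data, which report only the adjusted estimate):

```python
# Unadjusted odds-ratio sketch for a 2x2 table (hypothetical counts).
# Rows: bowel preparation (inadequate / adequate); columns: prolonged CIT (yes / no).
a, b = 12, 3   # inadequate preparation: prolonged CIT, not prolonged
c, d = 14, 37  # adequate preparation: prolonged CIT, not prolonged

odds_ratio = (a / b) / (c / d)  # odds of prolonged CIT with inadequate vs adequate prep
print(round(odds_ratio, 2))  # → 10.57
```

    The multivariate estimate in the abstract additionally adjusts for the other covariates, which a raw 2×2 table cannot do.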

  14. A multilayered approach for the analysis of perinatal mortality using different classification systems.

    PubMed

    Gordijn, Sanne J; Korteweg, Fleurisca J; Erwich, Jan Jaap H M; Holm, Jozien P; van Diem, Mariet Th; Bergman, Klasien A; Timmer, Albertus

    2009-06-01

    Many classification systems for perinatal mortality are available, all with their own strengths and weaknesses; none has been universally accepted. We present a systematic multilayered approach for the analysis of perinatal mortality based on information related to the moment of death, the conditions associated with death, and the underlying cause of death, using a combination of representatives of existing classification systems. We compared the existing classification systems regarding their definition of the perinatal period, their level of complexity, their inclusion of maternal, foetal and/or placental factors, and whether they take a clinical or a pathological viewpoint. Furthermore, we allocated the classification systems to one of three categories, 'when', 'what' or 'why', depending on whether the allocation of individual cases of perinatal mortality is based on the moment of death ('when'), the clinical conditions associated with death ('what'), or the underlying cause of death ('why'). A multilayered approach to the analysis and classification of perinatal mortality is possible using combinations of existing systems, for example the Wigglesworth or Nordic-Baltic ('when'), ReCoDe ('what') and Tulip ('why') classification systems. This approach is useful not only for in-depth analysis of perinatal mortality in the developed world but also for analysis of perinatal mortality in developing countries, where resources to investigate death are often limited.

  15. A SYSTEMIC APPROACH TO MITIGATING URBAN STORM WATER RUNOFF VIA DEVELOPMENT PLANS BASED ON LAND SUITABILITY ANALYSIS

    EPA Science Inventory

    We advocate an approach to reduce the anticipated increase in stormwater runoff from conventional development by demonstrating a low-impact development that incorporates hydrologic factors into an expanded land suitability analysis. This methodology was applied to a 3 hectare exp...

  16. Factors promoting marine invasions: A chemoecological approach

    PubMed Central

    Mollo, Ernesto; Gavagnin, Margherita; Carbone, Marianna; Castelluccio, Francesco; Pozone, Ferdinando; Roussis, Vassilios; Templado, José; Ghiselin, Michael T.; Cimino, Guido

    2008-01-01

    The Mediterranean Sea is losing its biological distinctiveness, and the same phenomenon is occurring in other seas. It gives urgency to a better understanding of the factors that affect marine biological invasions. A chemoecological approach is proposed here to define biotic conditions that promote biological invasions in terms of enemy escape and resource opportunities. Research has focused on the secondary metabolite composition of three exotic sea slugs found in Greece that have most probably entered the Mediterranean basin by Lessepsian migration, an exchange that contributes significantly to Mediterranean biodiversity. We have found toxic compounds with significant activity as feeding deterrents both in the cephalaspidean Haminoea cyanomarginata and in the nudibranch Melibe viridis. These findings led us to propose aposematism in the former and dietary autonomy in producing defensive metabolites in the latter case, as predisposing factors to the migration. In the third mollusk investigated, the anaspidean Syphonota geographica, the topic of marine invasions has been approached through a study of its feeding biology. The identification of the same compounds from both the viscera of each individual, separately analyzed, and their food, the seagrass Halophila stipulacea, implies a dietary dependency. The survival of S. geographica in the Mediterranean seems to be related to the presence of H. stipulacea. The initial invasion of this exotic pest would seem to have paved the way for the subsequent invasion of a trophic specialist that takes advantage of niche opportunities. PMID:18337492

  17. A single factor underlies the metabolic syndrome: a confirmatory factor analysis.

    PubMed

    Pladevall, Manel; Singal, Bonita; Williams, L Keoki; Brotons, Carlos; Guyer, Heidi; Sadurni, Josep; Falces, Carles; Serrano-Rios, Manuel; Gabriel, Rafael; Shaw, Jonathan E; Zimmet, Paul Z; Haffner, Steven

    2006-01-01

    Confirmatory factor analysis (CFA) was used to test the hypothesis that the components of the metabolic syndrome are manifestations of a single common factor. Three different datasets were used to test and validate the model. The Spanish and Mauritian studies included 207 men and 203 women and 1,411 men and 1,650 women, respectively. A third analytical dataset including 847 men was obtained from a previously published CFA of a U.S. population. The one-factor model included the metabolic syndrome core components (central obesity, insulin resistance, blood pressure, and lipid measurements). We also tested an expanded one-factor model that included uric acid and leptin levels. Finally, we used CFA to compare the goodness of fit of one-factor models with the fit of two previously published four-factor models. The simplest one-factor model showed the best goodness-of-fit indexes (comparative fit index 1, root mean-square error of approximation 0.00). Comparisons of one-factor with four-factor models in the three datasets favored the one-factor model structure. The selection of variables to represent the different metabolic syndrome components and model specification explained why previous exploratory and confirmatory factor analysis, respectively, failed to identify a single factor for the metabolic syndrome. These analyses support the current clinical definition of the metabolic syndrome, as well as the existence of a single factor that links all of the core components.

  18. Risk assessment for enterprise resource planning (ERP) system implementations: a fault tree analysis approach

    NASA Astrophysics Data System (ADS)

    Zeng, Yajun; Skibniewski, Miroslaw J.

    2013-08-01

    Enterprise resource planning (ERP) system implementations are often characterised with large capital outlay, long implementation duration, and high risk of failure. In order to avoid ERP implementation failure and realise the benefits of the system, sound risk management is the key. This paper proposes a probabilistic risk assessment approach for ERP system implementation projects based on fault tree analysis, which models the relationship between ERP system components and specific risk factors. Unlike traditional risk management approaches that have been mostly focused on meeting project budget and schedule objectives, the proposed approach intends to address the risks that may cause ERP system usage failure. The approach can be used to identify the root causes of ERP system implementation usage failure and quantify the impact of critical component failures or critical risk events in the implementation process.
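    The gate arithmetic behind such a fault tree is simple once basic events are assumed independent: AND gates multiply probabilities, OR gates complement-multiply. A minimal sketch with hypothetical ERP risk events (the paper's actual tree and probabilities are not reproduced here):

```python
# Fault-tree gate arithmetic, assuming independent basic events.

def and_gate(probs):
    """All inputs must occur: multiply probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Any input may occur: complement of none occurring."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

# Hypothetical basic-event probabilities for ERP usage failure:
p_training = 0.10       # inadequate user training
p_customization = 0.08  # faulty customization
p_migration = 0.05      # data migration errors
p_vendor = 0.02         # vendor support failure

# Top event: (training AND customization) OR migration OR vendor.
p_top = or_gate([and_gate([p_training, p_customization]), p_migration, p_vendor])
print(round(p_top, 4))  # → 0.0764
```

    Quantifying the tree this way also makes it easy to rank which basic events contribute most to the top-event probability.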

  19. Methodological factors affecting gas and methane production during in vitro rumen fermentation evaluated by meta-analysis approach.

    PubMed

    Maccarana, Laura; Cattani, Mirko; Tagliapietra, Franco; Schiavon, Stefano; Bailoni, Lucia; Mantovani, Roberto

    2016-01-01

    The effects of several methodological factors on in vitro measures of gas production (GP, mL/g DM), CH4 production (mL/g DM) and CH4 proportion (% of total GP) were investigated by meta-analysis. The following factors were considered: pressure in the GP equipment (0 = constant; 1 = increasing), incubation time (0 = 24; 1 = ≥ 48 h), time of rumen fluid collection (0 = before feeding; 1 = after feeding of donor animals), donor species of rumen fluid (0 = sheep; 1 = bovine), presence of N in the buffer solution (0 = presence; 1 = absence), and the ratio between the amount of buffered rumen fluid and the feed sample (BRF/FS; 0 = ≤ 130 mL/g DM; 1 = 130-140 mL/g DM; 2 = ≥ 140 mL/g DM). The NDF content of the incubated feed sample (NDF) was considered as a continuous variable. From an initial database of 105 papers, 58 were discarded because one of the above-mentioned factors was not stated. After discarding a further 17 papers, the final dataset comprised 30 papers (339 observations). A preliminary mixed model analysis was carried out on the experimental data, considering the study as a random factor. Variables adjusted for the study effect were then analyzed using a backward stepwise analysis including the above-mentioned factors. The analysis showed that extending the incubation time and reducing NDF increased GP and CH4 values. Values of GP and CH4 also increased when rumen fluid was collected after feeding rather than before feeding (+26.4 and +9.0 mL/g DM for GP and CH4), from bovine rather than sheep donors (+32.8 and +5.2 mL/g DM), and when the buffer solution did not contain N (+24.7 and +6.7 mL/g DM). Increasing the BRF/FS ratio enhanced GP and CH4 production (+7.7 and +3.3 mL/g DM per class of increase, respectively). In vitro techniques for measuring GP and CH4 production are mostly used as screening methods, so full standardization of such techniques is not feasible. However, a greater harmonization

  20. Human factors engineering approaches to patient identification armband design.

    PubMed

    Probst, C Adam; Wolf, Laurie; Bollini, Mara; Xiao, Yan

    2016-01-01

    The task of patient identification is performed many times each day by nurses and other members of the care team. Armbands are used for both direct verification and barcode scanning during patient identification. Armbands and information layout are critical to reducing patient identification errors and dangerous workarounds. We report the effort at two large, integrated healthcare systems that employed human factors engineering approaches to the information layout design of new patient identification armbands. The different methods used illustrate potential pathways to obtain standardized armbands across healthcare systems that incorporate human factors principles. By extension, how the designs have been adopted provides examples of how to incorporate human factors engineering into key clinical processes. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  1. Anthropometric data reduction using confirmatory factor analysis.

    PubMed

    Rohani, Jafri Mohd; Olusegun, Akanbi Gabriel; Rani, Mat Rebi Abdul

    2014-01-01

    The unavailability of anthropometric data, especially in developing countries, has remained a limiting factor in the design of learning facilities with sufficient ergonomic consideration. Attempts to use anthropometric data from developed countries have led to the provision of school facilities unfit for the users. The purpose of this paper is to use factor analysis to investigate the suitability of the collected anthropometric data as a database for school design in Nigerian tertiary institutions. Anthropometric data were collected from 288 male students, aged 18-25 years, at a Federal Polytechnic in north-west Nigeria. Nine vertical anthropometric dimensions related to heights were collected using conventional equipment. Exploratory factor analysis was used to categorize the variables into a model consisting of two factors. Thereafter, confirmatory factor analysis was used to investigate the fit of the data to the proposed model. A just-identified model, made up of two factors, each with three variables, was developed. The variables within the model accounted for 81% of the total variation of the entire data. The model was found to demonstrate adequate validity and reliability, and various fit indices were used to verify that the model fits the data properly. The final model reveals that stature height and eye height sitting were the most stable variables for designs involving standing and sitting constructs. The study has shown the application of factor analysis in anthropometric data analysis and highlighted the relevance of these statistical tools for investigating variability in anthropometric data from diverse populations, for which they have not been widely used. The collected data are therefore suitable for use in designing for Nigerian students.
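    The 81% figure is the share of total variance captured by the retained factors, which for standardized variables follows from the eigenvalues of the correlation matrix. A minimal sketch on an illustrative correlation matrix with two blocks of three height dimensions (invented values, not the study's data):

```python
import numpy as np

# Illustrative correlation matrix: two blocks of three height-related
# dimensions, highly correlated within blocks (0.8), weakly across (0.3).
R = np.array([
    [1.0, 0.8, 0.8, 0.3, 0.3, 0.3],
    [0.8, 1.0, 0.8, 0.3, 0.3, 0.3],
    [0.8, 0.8, 1.0, 0.3, 0.3, 0.3],
    [0.3, 0.3, 0.3, 1.0, 0.8, 0.8],
    [0.3, 0.3, 0.3, 0.8, 1.0, 0.8],
    [0.3, 0.3, 0.3, 0.8, 0.8, 1.0],
])

eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]   # largest first
explained = eigvals[:2].sum() / eigvals.sum()    # variance share of first two factors
print(round(explained, 2))  # → 0.87
```

    Here two factors capture about 87% of the total variance; the study's 81% reflects its real data rather than this stylized matrix.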

  2. Assessing Model Fit: Caveats and Recommendations for Confirmatory Factor Analysis and Exploratory Structural Equation Modeling

    ERIC Educational Resources Information Center

    Perry, John L.; Nicholls, Adam R.; Clough, Peter J.; Crust, Lee

    2015-01-01

    Despite the limitations of overgeneralizing cutoff values for confirmatory factor analysis (CFA; e.g., Marsh, Hau, & Wen, 2004), they are still often employed as golden rules for assessing factorial validity in sport and exercise psychology. The purpose of this study was to investigate the appropriateness of using the CFA approach with these…

  3. The constellation of dietary factors in adolescent acne: a semantic connectivity map approach.

    PubMed

    Grossi, E; Cazzaniga, S; Crotti, S; Naldi, L; Di Landro, A; Ingordo, V; Cusano, F; Atzori, L; Tripodi Cutrì, F; Musumeci, M L; Pezzarossa, E; Bettoli, V; Caproni, M; Bonci, A

    2016-01-01

    Different lifestyle and dietetic factors have been linked with the onset and severity of acne. To assess the complex interconnection between dietetic variables and acne, we reanalysed data from a case-control study using a semantic connectivity map approach. 563 subjects, aged 10-24 years, involved in a case-control study of acne between March 2009 and February 2010, were considered in this study. The analysis evaluated the link between moderate to severe acne and anthropometric variables, family history, and dietetic factors. Analyses were conducted with an artificial adaptive system, the Auto Semantic Connectivity Map (AutoCM). The AutoCM map showed that moderate-severe acne was closely associated with a family history of acne in first-degree relatives, obesity (BMI ≥ 30), high consumption of milk (in particular skim milk), cheese/yogurt, sweets/cakes and chocolate, low consumption of fish, and limited intake of fruits/vegetables. Our analyses confirm the link between several dietetic items and acne. When providing care, dermatologists should be aware of the complex interconnection between dietetic factors and acne. © 2014 European Academy of Dermatology and Venereology.

  4. Bioinformatics Identification of Modules of Transcription Factor Binding Sites in Alzheimer's Disease-Related Genes by In Silico Promoter Analysis and Microarrays

    PubMed Central

    Augustin, Regina; Lichtenthaler, Stefan F.; Greeff, Michael; Hansen, Jens; Wurst, Wolfgang; Trümbach, Dietrich

    2011-01-01

    The molecular mechanisms and genetic risk factors underlying Alzheimer's disease (AD) pathogenesis are only partly understood. To identify new factors, which may contribute to AD, different approaches are taken including proteomics, genetics, and functional genomics. Here, we used a bioinformatics approach and found that distinct AD-related genes share modules of transcription factor binding sites, suggesting a transcriptional coregulation. To detect additional coregulated genes, which may potentially contribute to AD, we established a new bioinformatics workflow with known multivariate methods like support vector machines, biclustering, and predicted transcription factor binding site modules by using in silico analysis and over 400 expression arrays from human and mouse. Two significant modules are composed of three transcription factor families: CTCF, SP1F, and EGRF/ZBPF, which are conserved between human and mouse APP promoter sequences. The specific combination of in silico promoter and multivariate analysis can identify regulation mechanisms of genes involved in multifactorial diseases. PMID:21559189

  5. Modeling the interplay of multilevel risk factors for future academic and behavior problems: a person-centered approach.

    PubMed

    Lanza, Stephanie T; Rhoades, Brittany L; Nix, Robert L; Greenberg, Mark T

    2010-05-01

    This study identified profiles of 13 risk factors across child, family, school, and neighborhood domains in a diverse sample of children in kindergarten from four US locations (n = 750; 45% minority). It then examined the relation of those early risk profiles to externalizing problems, school failure, and low academic achievement in Grade 5. A person-centered approach, latent class analysis, revealed four unique risk profiles, which varied considerably across urban African American, urban White, and rural White children. Profiles characterized by several risks that cut across multiple domains conferred the highest risk for negative outcomes. Compared to a variable-centered approach, such as a cumulative risk index, these findings provide a more nuanced understanding of the early precursors to negative outcomes. For example, results suggested that urban children in single-parent homes that have few other risk factors (i.e., show at least average parenting warmth and consistency and report relatively low stress and high social support) are at quite low risk for externalizing problems, but at relatively high risk for poor grades and low academic achievement. These findings provide important information for refining and targeting preventive interventions to groups of children who share particular constellations of risk factors.

  6. Comparisons of Exploratory and Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Daniel, Larry G.

    Historically, most researchers conducting factor analysis have used exploratory methods. However, more recently, confirmatory factor analytic methods have been developed that can directly test theory either during factor rotation using "best fit" rotation methods or during factor extraction, as with the LISREL computer programs developed…

  7. A Bayes Factor Meta-Analysis of Recent Extrasensory Perception Experiments: Comment on Storm, Tressoldi, and Di Risio (2010)

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.; Morey, Richard D.; Province, Jordan M.

    2013-01-01

    Psi phenomena, such as mental telepathy, precognition, and clairvoyance, have garnered much recent attention. We reassess the evidence for psi effects from Storm, Tressoldi, and Di Risio's (2010) meta-analysis. Our analysis differs from Storm et al.'s in that we rely on Bayes factors, a Bayesian approach for stating the evidence from data for…
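    The JZS Bayes factors used by Rouder and colleagues require numerical integration over a prior on effect size; a common rough stand-in is the BIC approximation BF01 ≈ exp((BIC1 − BIC0)/2). A sketch on hypothetical data (not the Storm et al. studies), comparing a zero-mean null against a free-mean alternative:

```python
import math

# BIC-approximation Bayes factor sketch (hypothetical data, Gaussian model).

def bic_gaussian(data, mean, n_params):
    """BIC of a Gaussian model with the given fixed or fitted mean."""
    n = len(data)
    var = sum((x - mean) ** 2 for x in data) / n  # ML variance given the mean
    loglik = -0.5 * n * (math.log(2 * math.pi * var) + 1)
    return -2 * loglik + n_params * math.log(n)

data = [0.1, -0.2, 0.3, -0.25, -0.1, 0.15, 0.0, 0.05]
xbar = sum(data) / len(data)

bic0 = bic_gaussian(data, 0.0, n_params=1)   # H0: mean fixed at 0 (variance free)
bic1 = bic_gaussian(data, xbar, n_params=2)  # H1: mean and variance both free

bf01 = math.exp((bic1 - bic0) / 2)  # > 1 favours the null hypothesis
print(round(bf01, 2))  # → 2.81
```

    With data this close to zero the Bayes factor favours the null, which is exactly the kind of statement a p-value cannot make.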

  8. A Bayesian approach for incorporating economic factors in sample size design for clinical trials of individual drugs and portfolios of drugs.

    PubMed

    Patel, Nitin R; Ankolekar, Suresh

    2007-11-30

    Classical approaches to clinical trial design ignore economic factors that determine economic viability of a new drug. We address the choice of sample size in Phase III trials as a decision theory problem using a hybrid approach that takes a Bayesian view from the perspective of a drug company and a classical Neyman-Pearson view from the perspective of regulatory authorities. We incorporate relevant economic factors in the analysis to determine the optimal sample size to maximize the expected profit for the company. We extend the analysis to account for risk by using a 'satisficing' objective function that maximizes the chance of meeting a management-specified target level of profit. We extend the models for single drugs to a portfolio of clinical trials and optimize the sample sizes to maximize the expected profit subject to budget constraints. Further, we address the portfolio risk and optimize the sample sizes to maximize the probability of achieving a given target of expected profit.

  9. A Systemic Approach to Implementing a Protective Factors Framework

    ERIC Educational Resources Information Center

    Parsons, Beverly; Jessup, Patricia; Moore, Marah

    2014-01-01

    The leadership team of the national Quality Improvement Center on early Childhood ventured into the frontiers of deep change in social systems by funding four research projects. The purpose of the research projects was to learn about implementing a protective factors approach with the goal of reducing the likelihood of child abuse and neglect. In…

  10. Bootstrap Standard Error Estimates in Dynamic Factor Analysis

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Browne, Michael W.

    2010-01-01

    Dynamic factor analysis summarizes changes in scores on a battery of manifest variables over repeated measurements in terms of a time series in a substantially smaller number of latent factors. Algebraic formulae for standard errors of parameter estimates are more difficult to obtain than in the usual intersubject factor analysis because of the…

  11. A novel approach to the analysis of squeezed-film air damping in microelectromechanical systems

    NASA Astrophysics Data System (ADS)

    Yang, Weilin; Li, Hongxia; Chatterjee, Aveek N.; Elfadel, Ibrahim (Abe) M.; Ocak, Ilker Ender; Zhang, TieJun

    2017-01-01

    Squeezed-film damping (SFD) is a phenomenon that significantly affects the performance of micro-electro-mechanical systems (MEMS). The total damping force in MEMS mainly comprises a viscous damping force and an elastic damping force. The quality factor (Q factor) is usually used to evaluate damping in MEMS. In this work, we measure the Q factor of a resonator through experiments over a wide range of pressure levels. Experimental characterization of MEMS has limitations, however, because it is difficult to conduct experiments at very high vacuum and hard to differentiate the damping mechanisms from the overall Q factor measurements. On the other hand, classical theoretical analysis of SFD is restricted to strong assumptions and simple geometries. In this paper, a novel numerical approach based on lattice Boltzmann simulations is proposed to investigate SFD in MEMS. Our method considers the dynamics of the squeezed air flow as well as fluid-solid interactions in MEMS. It is demonstrated that the Q factor can be predicted directly by numerical simulation, and our simulation results agree well with experimental data. Factors that influence SFD, such as pressure, oscillation amplitude, and driving frequency, are investigated separately. Furthermore, viscous and elastic damping forces are quantitatively compared based on comprehensive simulations. The proposed numerical approach, together with the experimental characterization, reveals the underlying physics of squeezed-film air damping in MEMS.
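    For a lightly damped resonator, the Q factor measured here has a simple operating definition: resonance frequency divided by the half-power bandwidth, equivalently 1/(2ζ) for damping ratio ζ. A toy computation with hypothetical numbers (not the paper's device):

```python
# Q-factor arithmetic for a lightly damped resonator (hypothetical values).
f0 = 20_000.0   # resonance frequency in Hz
delta_f = 40.0  # full width of the resonance at half the power maximum, Hz

Q = f0 / delta_f
print(Q)  # → 500.0

# Equivalent description via the damping ratio zeta = 1 / (2 Q):
zeta = 1.0 / (2.0 * Q)
```

    Higher damping, for example from squeezed air films at higher pressure, broadens the resonance, so delta_f grows and Q drops.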

  12. Deuteron electromagnetic form factors with the light-front approach

    NASA Astrophysics Data System (ADS)

    Sun, Bao-dong; Dong, Yu-bing

    2017-01-01

    The electromagnetic form factors and low-energy observables of the deuteron are studied with the light-front approach, in which the deuteron is regarded as a weakly bound state of a proton and a neutron. Both the S and D wave interaction vertices among the deuteron, proton, and neutron are taken into account, and regularization functions are introduced. In our calculations, the vertex and regularization functions are employed to simulate the momentum distribution inside the deuteron. Our numerical results show that the light-front approach can roughly reproduce the deuteron electromagnetic form factors, namely the charge G0, magnetic G1, and quadrupole G2 form factors, in the low Q2 region. The important effect of the D wave vertex on G2 is also addressed. Supported by the National Natural Science Foundation of China (10975146, 11475192) and by the Sino-German CRC 110 "Symmetries and the Emergence of Structure in QCD" project. YBD thanks FAPESP grant 2011/11973-4 for funding his visit to ICTP-SAIFR.

  13. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
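    The contrast between the two philosophies is easy to see in a toy stress-strength problem: a safety factor computed from mean values can look comfortable while the scatter still produces a non-trivial failure probability. A Monte Carlo sketch with hypothetical distributions (not from the report):

```python
import random

# Stress-strength Monte Carlo sketch (hypothetical normal distributions).
random.seed(42)

def failure_probability(trials=100_000):
    """Estimate P(load > strength) by sampling both distributions."""
    failures = 0
    for _ in range(trials):
        load = random.gauss(100.0, 20.0)      # applied load: mean 100, sd 20
        strength = random.gauss(150.0, 10.0)  # strength: mean 150, sd 10
        if load > strength:
            failures += 1
    return failures / trials

safety_factor = 150.0 / 100.0  # mean strength / mean load; ignores the scatter
print(safety_factor, failure_probability())
```

    With these numbers the deterministic safety factor is 1.5, yet roughly 1% of sampled cases still fail, the kind of information only the probabilistic view exposes.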

  14. How Factor Analysis Can Be Used in Classification.

    ERIC Educational Resources Information Center

    Harman, Harry H.

    This is a methodological study that suggests a taxometric technique for objective classification of yeasts. It makes use of the minres method of factor analysis and groups strains of yeast according to their factor profiles. The similarities are judged in the higher-dimensional space determined by the factor analysis, but otherwise rely on the…

  15. Use of traffic displays for general aviation approach spacing : a human factors study

    DOT National Transportation Integrated Search

    2007-12-01

    A flight experiment was conducted to assess human factors issues associated with pilot use of traffic displays for approach spacing. Sixteen multi-engine rated pilots participated. Eight flew approaches in a twin-engine Piper Aztec originating in ...

  16. Ideas for a pattern-oriented approach towards a VERA analysis ensemble

    NASA Astrophysics Data System (ADS)

    Gorgas, T.; Dorninger, M.

    2010-09-01

    interpolation errors. With the concept of an analysis ensemble we hope to get a more detailed view of both sources of analysis errors. For the computation of the VERA ensemble members, a sample of Gaussian random perturbations is produced for each station and parameter. The spread of the perturbations is based on the correction proposals of the VERA QC scheme, which provides "natural" limits for the ensemble. In order to put more emphasis on the weather situation, we aim to integrate the main synoptic field structures as weighting factors for the perturbations. Two widely used approaches are employed to define these main field structures: principal component analysis and a 2D discrete wavelet transform. Test results concerning the implementation of this pattern-supported analysis ensemble system, together with a comparison of the two approaches, are given in the presentation.

  17. A global optimization approach to multi-polarity sentiment analysis.

    PubMed

    Li, Xinmiao; Li, Jing; Wu, Yukeng

    2015-01-01

    Following the rapid development of social media, sentiment analysis has become an important social media mining technique. The performance of automatic sentiment analysis primarily depends on feature selection and sentiment classification. While information gain (IG) and support vector machines (SVM) are two important techniques, few studies have optimized both approaches in sentiment analysis. The effectiveness of applying a global optimization approach to sentiment analysis remains unclear. We propose a global optimization-based sentiment analysis (PSOGO-Senti) approach to improve sentiment analysis with IG for feature selection and SVM as the learning engine. The PSOGO-Senti approach utilizes a particle swarm optimization algorithm to obtain a global optimal combination of feature dimensions and parameters in the SVM. We evaluate the PSOGO-Senti model on two datasets from different fields. The experimental results showed that the PSOGO-Senti model can improve binary and multi-polarity Chinese sentiment analysis. We compared the optimal feature subset selected by PSOGO-Senti with the features in the sentiment dictionary. The results of this comparison indicated that PSOGO-Senti can effectively remove redundant and noisy features and can select a domain-specific feature subset with a higher-explanatory power for a particular sentiment analysis task. The experimental results showed that the PSOGO-Senti approach is effective and robust for sentiment analysis tasks in different domains. By comparing the improvements of two-polarity, three-polarity and five-polarity sentiment analysis results, we found that the five-polarity sentiment analysis delivered the largest improvement. The improvement of the two-polarity sentiment analysis was the smallest. We conclude that the PSOGO-Senti achieves higher improvement for a more complicated sentiment analysis task. We also compared the results of PSOGO-Senti with those of the genetic algorithm (GA) and grid search method. 
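    Particle swarm optimization itself is generic: particles move through the search space pulled toward their own best position and the swarm's global best. A minimal sketch on a toy objective (the sphere function), not the paper's IG+SVM search space:

```python
import random

# Minimal particle swarm optimization (PSO) on a toy objective.
random.seed(0)

def pso(f, dim=2, n_particles=20, iters=100, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5):
    """Return (best position, best value) found by a basic PSO."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # personal bests
    pbest_val = [f(p) for p in pos]
    g = pbest_val.index(min(pbest_val))
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda p: sum(x * x for x in p)  # global minimum 0 at the origin
best, best_val = pso(sphere)
print(best_val < 1e-3)
```

    In PSOGO-Senti the position vector would instead encode the number of IG-selected features and the SVM parameters, with cross-validated accuracy as the objective.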

  18. Educating the ambulance technician, paramedic, and clinical supervisor: using factor analysis to inform the curriculum

    PubMed Central

    Kilner, T

    2004-01-01

    Methods: Data generated by a Delphi study investigating the desirable attributes of the ambulance technician, paramedic, and clinical supervisor were subjected to factor analysis to explore inter-relations between the variables, or desirable attributes. Variables that loaded onto any factor at a correlation level of >0.3 were included in the analysis. Results: Three factors emerged for each of the occupational groups. For the ambulance technician these factors may be described as: core professional skills, individual and collaborative approaches to health and safety, and the management of self and clinical situations. For the paramedic the themes are: core professional skills, management of self and clinical situations, and approaches to health and safety. For the clinical supervisor there is again a theme described as core professional skills, with a further two themes described as role model and lifelong learning. Conclusions: The profile of desirable attributes emerging from this study is remarkably similar to the generic benchmark statements for health care programmes outlined by the Quality Assurance Agency for Higher Education. A case is emerging for a revision of the curriculum currently used for the education and training of ambulance staff, one more suited to a consumer-led health service and reflecting the broader professional base seen in programmes associated with other healthcare professions. This study has suggested outline content and module structure for the education of the technician, paramedic, and clinical supervisor, based on empirical evidence. PMID:15107389

  19. A Proposed Solution to the Problem with Using Completely Random Data to Assess the Number of Factors with Parallel Analysis

    ERIC Educational Resources Information Center

    Green, Samuel B.; Levy, Roy; Thompson, Marilyn S.; Lu, Min; Lo, Wen-Juo

    2012-01-01

    A number of psychometricians have argued for the use of parallel analysis to determine the number of factors. However, parallel analysis must be viewed at best as a heuristic approach rather than a mathematically rigorous one. The authors suggest a revision to parallel analysis that could improve its accuracy. A Monte Carlo study is conducted to…
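
    For context, Horn's parallel analysis, the heuristic the authors propose to revise, retains factors whose observed eigenvalues exceed those obtained from completely random data of the same dimensions. A minimal numpy sketch on synthetic two-factor data (the sample size, loadings, and 95th-percentile cutoff are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: 300 cases, 6 variables driven by 2 latent factors
# (three variables load on each factor).
n, p, k = 300, 6, 2
F = rng.normal(size=(n, k))
L = np.zeros((p, k))
L[:3, 0] = 0.8
L[3:, 1] = 0.8
X = F @ L.T + 0.5 * rng.normal(size=(n, p))

def eigvals_corr(data):
    # Descending eigenvalues of the sample correlation matrix.
    return np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

obs = eigvals_corr(X)

# Parallel analysis: eigenvalues from completely random normal data of
# the same shape; retain factors whose observed eigenvalue exceeds the
# 95th percentile of the random eigenvalues at the same position.
reps = np.array([eigvals_corr(rng.normal(size=(n, p))) for _ in range(200)])
threshold = np.percentile(reps, 95, axis=0)
n_factors = int(np.sum(obs > threshold))
```

    The article's critique targets exactly the "completely random data" comparison baseline used above.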

  20. A Review of CEFA Software: Comprehensive Exploratory Factor Analysis Program

    ERIC Educational Resources Information Center

    Lee, Soon-Mook

    2010-01-01

    CEFA 3.02 (Browne, Cudeck, Tateneni, & Mels, 2008) is a factor analysis computer program designed to perform exploratory factor analysis. It provides the main properties that are needed for exploratory factor analysis, namely a variety of factoring methods employing eight different discrepancy functions to be minimized to yield initial…

  1. Factors That Modulate Neurogenesis: A Top-Down Approach.

    PubMed

    LaDage, Lara D

    2016-08-24

    Although hippocampal neurogenesis in the adult brain has been conserved across the vertebrate lineage, laboratory studies have primarily examined this phenomenon in rodent models. This approach has been successful in elucidating important factors and mechanisms that can modulate rates of hippocampal neurogenesis, including hormones, environmental complexity, learning and memory, motor stimulation, and stress. However, recent studies have found that neurobiological research on neurogenesis in rodents may not easily translate to, or explain, neurogenesis patterns in nonrodent systems, particularly in species examined in the field. This review examines some of the evolutionary and ecological variables that may also modulate neurogenesis patterns. This 'top-down' and more naturalistic approach, which incorporates ecology and natural history, particularly of nonmodel species, may allow for a more comprehensive understanding of the functional significance of neurogenesis. © 2016 S. Karger AG, Basel.

  2. BFDCA: A Comprehensive Tool of Using Bayes Factor for Differential Co-Expression Analysis.

    PubMed

    Wang, Duolin; Wang, Juexin; Jiang, Yuexu; Liang, Yanchun; Xu, Dong

    2017-02-03

    Comparing the gene-expression profiles between biological conditions is useful for understanding gene regulation underlying complex phenotypes. Along this line, analysis of differential co-expression (DC) has gained attention in the recent years, where genes under one condition have different co-expression patterns compared with another. We developed an R package Bayes Factor approach for Differential Co-expression Analysis (BFDCA) for DC analysis. BFDCA is unique in integrating various aspects of DC patterns (including Shift, Cross, and Re-wiring) into one uniform Bayes factor. We tested BFDCA using simulation data and experimental data. Simulation results indicate that BFDCA outperforms existing methods in accuracy and robustness of detecting DC pairs and DC modules. Results of using experimental data suggest that BFDCA can cluster disease-related genes into functional DC subunits and estimate the regulatory impact of disease-related genes well. BFDCA also achieves high accuracy in predicting case-control phenotypes by using significant DC gene pairs as markers. BFDCA is publicly available at http://dx.doi.org/10.17632/jdz4vtvnm3.1. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Cement Leakage in Percutaneous Vertebral Augmentation for Osteoporotic Vertebral Compression Fractures: Analysis of Risk Factors.

    PubMed

    Xie, Weixing; Jin, Daxiang; Ma, Hui; Ding, Jinyong; Xu, Jixi; Zhang, Shuncong; Liang, De

    2016-05-01

    The risk factors for cement leakage were retrospectively reviewed in 192 patients who underwent percutaneous vertebral augmentation (PVA), to discuss the factors related to cement leakage in the PVA procedure for the treatment of osteoporotic vertebral compression fractures. PVA is widely applied for the treatment of osteoporotic vertebral fractures; cement leakage is a major complication of this procedure, and its risk factors remain controversial. A retrospective review of 192 patients who underwent PVA was conducted. The following data were recorded: age, sex, bone density, number of fractured vertebrae before surgery, number of treated vertebrae, severity of the treated vertebrae, operative approach, volume of injected bone cement, preoperative vertebral compression ratio, preoperative local kyphosis angle, intraosseous clefts, preoperative vertebral cortical bone defect, and ratio and type of cement leakage. To study the correlation between each factor and the cement leakage ratio, bivariate regression analysis was employed for univariate analysis, and multivariate linear regression analysis was employed for multivariate analysis. The study included 192 patients (282 treated vertebrae), and cement leakage occurred in 100 vertebrae (35.46%). Vertebrae with preoperative cortical bone defects generally exhibited a higher cement leakage ratio, and the leakage was typically type C; vertebrae with intact cortical bones before the procedure tended to experience type S leakage. Univariate analysis showed that patient age, bone density, number of fractured vertebrae before surgery, and vertebral cortical bone were associated with the cement leakage ratio (P<0.05). Multivariate analysis showed that the main factors influencing bone cement leakage are bone density and vertebral cortical bone defect, with standardized partial regression coefficients of -0.085 and 0.144, respectively. High bone density and vertebral cortical bone defect are

  4. How to Construct More Accurate Student Models: Comparing and Optimizing Knowledge Tracing and Performance Factor Analysis

    ERIC Educational Resources Information Center

    Gong, Yue; Beck, Joseph E.; Heffernan, Neil T.

    2011-01-01

    Student modeling is a fundamental concept applicable to a variety of intelligent tutoring systems (ITS). However, there is not a lot of practical guidance on how to construct and train such models. This paper compares two approaches for student modeling, Knowledge Tracing (KT) and Performance Factors Analysis (PFA), by evaluating their predictive…

  5. Analysis of factors related to vagally mediated reflex bradycardia during gastrectomy.

    PubMed

    Kim, Duk-Kyung; Ahn, Hyun Joo; Lee, Seung Won; Choi, Ji Won

    2015-12-01

    Because vagally mediated reflex bradycardia occurs frequently during gastrectomy and is potentially harmful, we compared the incidence of clinically significant reflex bradycardia between patients undergoing laparoscopic gastrectomy (LG) and open gastrectomy (OG) and examined whether the type of surgery (OG vs. LG) was an independent risk factor for clinically significant reflex bradycardia. This prospective observational study evaluated 358 adult patients (age 18-70 years) who were undergoing elective OG or LG for gastric cancer resection. Symptomatic reflex bradycardia was defined as a sudden decrease in heart rate to <50 beats per minute (bpm), or to 50-59 bpm with a systolic blood pressure <70 mmHg, associated with a specific surgical maneuver. If bradycardia or hypotension developed, atropine or ephedrine was administered, in accordance with a predefined treatment protocol. The overall incidence of symptomatic reflex bradycardia was 24.6% (88/358). Univariate analysis revealed the incidence of symptomatic reflex bradycardia in the LG group was significantly lower than that in the OG group [13.0% (13/100) vs. 29.1% (75/258), p = 0.002]. Multivariate logistic regression analysis revealed that the type of surgery (OG vs. LG), advanced age, preoperative bradycardia, type of muscle relaxant (vecuronium vs. rocuronium), no use of intravenous remifentanil, and low core temperature, were independent risk factors for symptomatic reflex bradycardia (odds ratio 3.184; 95% confidence interval 1.490-6.800; p = 0.003). The LG approach was associated with a reduced risk of clinically significant reflex bradycardia compared with the OG approach.
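
    The multivariate logistic model behind such adjusted odds ratios can be sketched as follows. The data here are synthetic (a binary exposure with true odds ratio 3 and one standardized covariate, both assumptions for illustration), and the fit uses plain Newton-Raphson rather than any particular statistics package:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic cohort: binary exposure (e.g. open vs laparoscopic surgery,
# a hypothetical labeling) with true adjusted odds ratio of 3.
n = 2000
exposure = rng.integers(0, 2, n)
age = rng.normal(0, 1, n)                 # standardized covariate
logit = -1.0 + np.log(3.0) * exposure + 0.3 * age
y = rng.random(n) < 1 / (1 + np.exp(-logit))

# Design matrix with intercept; Newton-Raphson for the logistic MLE.
X = np.column_stack([np.ones(n), exposure, age])
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)                       # IRLS weights
    grad = X.T @ (y - p)
    hess = (X * W[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)

odds_ratio = np.exp(beta[1])              # adjusted OR for the exposure
```

    The exponentiated coefficient recovers the exposure's odds ratio adjusted for the covariate, the same quantity reported with its confidence interval in the abstract.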

  6. Physics Metacognition Inventory Part II: Confirmatory factor analysis and Rasch analysis

    NASA Astrophysics Data System (ADS)

    Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John

    2015-11-01

    The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition, planning, monitoring, evaluation, debugging, and information management. The college students' scores on the inventory were found to be reliable and related to students' physics motivation and physics grade. However, the results of the exploratory factor analysis indicated that the questionnaire could be revised to improve its construct validity. The goal of this study was to revise the questionnaire and establish its construct validity through a confirmatory factor analysis. In addition, a Rasch analysis was applied to the data to better understand the psychometric properties of the inventory and to further evaluate the construct validity. Results indicated that the final, revised inventory is a valid, reliable, and efficient tool for assessing student metacognition for physics problem solving.

  7. An Overview of Focal Approaches of Critical Discourse Analysis

    ERIC Educational Resources Information Center

    Jahedi, Maryam; Abdullah, Faiz Sathi; Mukundan, Jayakaran

    2014-01-01

    This article aims to present detailed accounts of central approaches to Critical Discourse Analysis. It focuses on the work of three prominent scholars: Fairclough's critical approach, Wodak's discourse-historical approach, and Van Dijk's socio-cognitive approach. This study concludes that a combination of these three approaches can be…

  8. Risk factor investigation for cardiovascular health through WHO STEPS approach in Ardabil, Iran.

    PubMed

    Sadeghi-Bazargani, H; Jafarzadeh, H; Fallah, M; Hekmat, S; Bashiri, J; Hosseingolizadeh, Gh; Soltanmohammadzadeh, M S; Mortezazadeh, A; Shaker, A; Danehzan, M; Zohouri, A; Khosravi, O; Nasimidoust, R; Malekpour, N; Kharazmi, E; Babaei, M; Nadirmohammadi, M; Mashhadi-Abdollahi, H

    2011-01-01

    Reliable evidence is the keystone for any noncommunicable disease (NCD) prevention plan to be initiated. In this study we carried out a risk factor investigation based on the WHO Stepwise approach to Surveillance (STEPS). The study was conducted during 2006 on 1000 adults between 15 and 64 years of age living in Ardabil province, north-west Iran, based on the WHO STEPS approach to surveillance of risk factors for NCD; at this stage only the first and second steps were carried out. Data were collected through standard questionnaires and methods, and analyzed using the STATA version 8 statistical software package. 29.0% of men and 2.6% of women were current daily tobacco smokers; the mean number of manufactured cigarettes smoked per day was 18.9 among current daily smokers. Smoking was most prevalent among men of low-income families and those of lower education. The mean body mass index (BMI) was 26.6 kg/m(2) and was significantly correlated with systolic blood pressure. 58.9% were overweight or obese; 18.0% had raised blood pressure and 3.7% had isolated systolic hypertension. The mean number of servings of fruit consumed per day was 1.1; 33.1% had low levels of activity. Combined risk factor analysis showed that 4.1% of participants were in the low-risk group (up to 5.1% among men and 3.2% among women). Those in the high-risk group comprised 25.6% of the 25- to 44-year age group and 49.7% of the 45- to 64-year age group. Mean BMI increased with age in both sexes, at least in the first three decades of adult life. Based on the observed status of risk for cardiovascular health, the burden of cardiovascular diseases is expected to increase if an effective prevention strategy is not undertaken.

  9. Wind Tunnel Strain-Gage Balance Calibration Data Analysis Using a Weighted Least Squares Approach

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Volden, T.

    2017-01-01

    A new approach is presented that uses a weighted least squares fit to analyze wind tunnel strain-gage balance calibration data. The weighted least squares fit is specifically designed to increase the influence of single-component loadings during the regression analysis. The weighted least squares fit also reduces the impact of calibration load schedule asymmetries on the predicted primary sensitivities of the balance gages. A weighting factor between zero and one is assigned to each calibration data point that depends on a simple count of its intentionally loaded load components or gages. The greater the number of a data point's intentionally loaded load components or gages is, the smaller its weighting factor becomes. The proposed approach is applicable to both the Iterative and Non-Iterative Methods that are used for the analysis of strain-gage balance calibration data in the aerospace testing community. The Iterative Method uses a reasonable estimate of the tare corrected load set as input for the determination of the weighting factors. The Non-Iterative Method, on the other hand, uses gage output differences relative to the natural zeros as input for the determination of the weighting factors. Machine calibration data of a six-component force balance is used to illustrate benefits of the proposed weighted least squares fit. In addition, a detailed derivation of the PRESS residuals associated with a weighted least squares fit is given in the appendices of the paper as this information could not be found in the literature. These PRESS residuals may be needed to evaluate the predictive capabilities of the final regression models that result from a weighted least squares fit of the balance calibration data.
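
    A hedged sketch of the core idea, down-weighting calibration points by a count of their intentionally loaded components, might look like this in numpy. The specific weighting rule (w = 1/count) and the toy two-component "calibration" data are assumptions for illustration; the paper assigns its weights between zero and one by its own formula:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy calibration data: gage output as a linear function of two loads.
n = 60
loads = rng.uniform(-1, 1, (n, 2))
output = loads @ np.array([2.0, -0.5]) + 0.01 * rng.normal(size=n)

# Count of intentionally loaded components per data point (1 or 2 here,
# using a small threshold as a stand-in for the load schedule).
n_loaded = (np.abs(loads) > 0.05).sum(axis=1).clip(min=1)

# Assumed weighting rule: w = 1 / n_loaded, so single-component
# loadings influence the regression most.
w = 1.0 / n_loaded

# Weighted least squares via square-root-weight scaling of the rows.
sw = np.sqrt(w)[:, None]
coef, *_ = np.linalg.lstsq(loads * sw, output * sw.ravel(), rcond=None)
```

    Scaling each row by the square root of its weight makes an ordinary least-squares solver minimize the weighted sum of squared residuals, which is the standard reduction of WLS to OLS.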

  10. Assessing risk factors for dental caries: a statistical modeling approach.

    PubMed

    Trottini, Mario; Bossù, Maurizio; Corridore, Denise; Ierardo, Gaetano; Luzzi, Valeria; Saccucci, Matteo; Polimeni, Antonella

    2015-01-01

    The problem of identifying potential determinants and predictors of dental caries is of key importance in caries research and it has received considerable attention in the scientific literature. From the methodological side, a broad range of statistical models is currently available to analyze dental caries indices (DMFT, dmfs, etc.). These models have been applied in several studies to investigate the impact of different risk factors on the cumulative severity of dental caries experience. However, in most of the cases (i) these studies focus on a very specific subset of risk factors; and (ii) in the statistical modeling only few candidate models are considered and model selection is at best only marginally addressed. As a result, our understanding of the robustness of the statistical inferences with respect to the choice of the model is very limited; the richness of the set of statistical models available for analysis is only marginally exploited; and inferences could be biased due to the omission of potentially important confounding variables in the model's specification. In this paper we argue that these limitations can be overcome considering a general class of candidate models and carefully exploring the model space using standard model selection criteria and measures of global fit and predictive performance of the candidate models. Strengths and limitations of the proposed approach are illustrated with a real data set. In our illustration the model space contains more than 2.6 million models, which require inferences to be adjusted for 'optimism'.

  11. Modeling the Interplay of Multilevel Risk Factors for Future Academic and Behavior Problems: A Person-Centered Approach

    PubMed Central

    Lanza, Stephanie T.; Rhoades, Brittany L.; Nix, Robert L.; Greenberg, Mark T.

    2010-01-01

    This study identified profiles of 13 risk factors across child, family, school, and neighborhood domains in a diverse sample of children in kindergarten from 4 US locations (n = 750; 45% minority). It then examined the relation of those early risk profiles to externalizing problems, school failure, and low academic achievement in Grade 5. A person-centered approach, latent class analysis, revealed four unique risk profiles, which varied considerably across urban African American, urban white, and rural white children. Profiles characterized by several risks that cut across multiple domains conferred the highest risk for negative outcomes. Compared to a variable-centered approach, such as a cumulative risk index, these findings provide a more nuanced understanding of the early precursors to negative outcomes. For example, results suggested that urban children in single-parent homes that have few other risk factors (i.e., show at least average parenting warmth and consistency and report relatively low stress and high social support) are at quite low risk for externalizing problems, but at relatively high risk for poor grades and low academic achievement. These findings provide important information for refining and targeting preventive interventions to groups of children who share particular constellations of risk factors. PMID:20423544

  12. A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing

    PubMed Central

    Wang, Guoli; Ebrahimi, Nader

    2014-01-01

    Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data. PMID:25821345
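
    The multiplicative updates whose monotonicity the paper proves can be sketched for the classical KL/Poisson special case, which the Rényi-divergence framework generalizes. A minimal numpy sketch on synthetic count data; the matrix sizes and rank are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

# Nonnegative data matrix V (e.g. a small document-term count matrix);
# a tiny offset keeps the divergence finite at zero counts.
V = rng.poisson(5, size=(20, 12)).astype(float) + 1e-9
k = 3
W = rng.random((20, k)) + 0.1
H = rng.random((k, 12)) + 0.1

def kl_div(V, WH):
    # Generalized KL (Poisson) divergence between V and its model WH.
    return np.sum(V * np.log(V / WH) - V + WH)

prev = kl_div(V, W @ H)
for _ in range(200):
    # Lee-Seung multiplicative updates for the KL objective; each
    # step is non-increasing in the divergence and preserves
    # nonnegativity because every factor in the update is nonnegative.
    WH = W @ H
    H *= W.T @ (V / WH) / W.sum(axis=0)[:, None]
    WH = W @ H
    W *= (V / WH) @ H.T / H.sum(axis=1)[None, :]
final = kl_div(V, W @ H)
```

    Under the Poisson likelihood this objective is exactly the one that links NMF to PLSI, which is why the two methods can share one algorithmic framework.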

  13. A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing.

    PubMed

    Devarajan, Karthik; Wang, Guoli; Ebrahimi, Nader

    2015-04-01

    Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data.

  14. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    ERIC Educational Resources Information Center

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…

  15. Factors influencing oncology nurses' approaches to accommodating cultural needs in palliative care.

    PubMed

    Huang, Ya-Ling; Yates, Patsy; Prior, Deborah

    2009-12-01

    The purpose of this study is to explore the social construction of cultural issues in palliative care amongst oncology nurses. Australia is a nation composed of people from different cultural origins with diverse linguistic, spiritual, religious and social backgrounds. The challenge of working with an increasingly culturally diverse population is a common theme expressed by many healthcare professionals from a variety of countries. Grounded theory was used to investigate the processes by which nurses provide nursing care to cancer patients from diverse cultural backgrounds. Semi-structured interviews with seven Australian oncology nurses provided the data for the study; the data were analysed using grounded theory data analysis techniques. The core category emerging from the study was that of accommodating cultural needs. This paper focuses on describing the series of subcategories that were identified as factors which could influence the process by which nurses would accommodate cultural needs. These factors included nurses' views and understandings of culture and cultural mores, their philosophy of cultural care, nurses' previous experiences with people from other cultures and organisational approaches to culture and cultural care. This study demonstrated that previous experiences with people from other cultures and organisational approaches to culture and cultural care often influenced nurses' views and understandings of culture and cultural mores and their beliefs, attitudes and behaviours in providing cultural care. Relevance to clinical practice: it is imperative to appreciate how nurses' experiences with people from other cultures can be recognised and built upon or, if necessary, challenged. Furthermore, nurses' cultural competence and experiences with people from other cultures need to be further investigated in clinical practice.

  16. Analysis of older driver safety interventions : a human factors taxonomic approach

    DOT National Transportation Integrated Search

    1999-03-01

    The careful application of human factors design principles and guidelines is integral to : the development of safe, efficient and usable Intelligent Transportation Systems (ITS). One : segment of the driving population that may significantly benefit ...

  17. A comparison study on detection of key geochemical variables and factors through three different types of factor analysis

    NASA Astrophysics Data System (ADS)

    Hoseinzade, Zohre; Mokhtari, Ahmad Reza

    2017-10-01

    Large numbers of variables have been measured to explain different phenomena, and factor analysis has widely been used to reduce the dimension of datasets. Additionally, the technique has been employed to highlight underlying factors hidden in a complex system. As geochemical studies benefit from multivariate assays, application of this method is widespread in geochemistry. However, the conventional protocols for implementing factor analysis have some drawbacks in spite of their advantages. In the present study, a geochemical dataset including 804 soil samples, collected from a mining area in central Iran in order to search for MVT-type Pb-Zn deposits, was considered to outline geochemical analysis through various factor analysis methods. Routine factor analysis, sequential factor analysis, and staged factor analysis were applied to the dataset, after opening the data with an additive log-ratio (alr) transformation, to extract the mineralization factor in the dataset. A comparison between these methods indicated that sequential factor analysis most clearly revealed the MVT paragenesis elements in surface samples, with nearly 50% variation in F1. In addition, staged factor analysis gave acceptable results while being easy to apply: it could detect mineralization-related elements while assigning them larger factor loadings, resulting in a clearer pronunciation of the mineralization.
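
    The preprocessing and extraction steps mentioned, opening compositional data with an alr transformation and then factoring the correlation matrix, can be sketched as follows. The synthetic composition and the principal-factor (eigendecomposition) extraction are illustrative assumptions; the study's sequential and staged procedures add further element-selection steps on top of this:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic compositional "geochemical" data: 100 samples, 5 elements,
# rows forced to sum to 1 (closure), as in raw concentration data.
raw = rng.lognormal(sigma=0.4, size=(100, 5))
comp = raw / raw.sum(axis=1, keepdims=True)

# Additive log-ratio (alr) transform: log of each part over a chosen
# divisor part (here the last column), opening the closed data.
alr = np.log(comp[:, :-1] / comp[:, -1:])

# Principal-factor extraction from the correlation matrix: loadings
# are eigenvectors scaled by the square roots of their eigenvalues.
R = np.corrcoef(alr, rowvar=False)
vals, vecs = np.linalg.eigh(R)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]
loadings = vecs * np.sqrt(np.clip(vals, 0, None))

# Share of total variance carried by the first factor (the paper's F1).
var_explained_f1 = vals[0] / vals.sum()
```

    The alr step matters because closed (constant-sum) data induce spurious negative correlations that would distort the factor structure if factored directly.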

  18. Selective Prevention Approaches to Build Protective Factors in Early Intervention

    ERIC Educational Resources Information Center

    Shapiro, Cheri J.

    2014-01-01

    Young children with disabilities may be at elevated risk for behavior problems as well as maltreatment. Preventive approaches that can be infused into early intervention services are needed to support parents, build competencies among young children, and enhance protective factors that may temper risk. Two interventions--Stepping Stones Triple P,…

  19. Job compensable factors and factor weights derived from job analysis data.

    PubMed

    Chi, Chia-Fen; Chang, Tin-Chang; Hsia, Ping-Ling; Song, Jen-Chieh

    2007-06-01

    Government data on 1,039 job titles in Taiwan were analyzed to assess possible relationships between job attributes and compensation. For each job title, 79 specific variables in six major classes (required education and experience, aptitude, interest, work temperament, physical demands, task environment) were coded to derive the statistical predictors of wage for managers, professionals, technical, clerical, service, farm, craft, operatives, and other workers. Of the 79 variables, only 23 significantly related to pay rate were subjected to a factor and multiple regression analysis for predicting monthly wages. Given the heterogeneous nature of collected job titles, a 4-factor solution (occupational knowledge and skills, human relations skills, work schedule hardships, physical hardships) explaining 43.8% of the total variance but predicting only 23.7% of the monthly pay rate was derived. On the other hand, multiple regression with 9 job analysis items (required education, professional training, professional certificate, professional experience, coordinating, leadership and directing, demand on hearing, proportion of shift working indoors, outdoors and others, rotating shift) better predicted pay and explained 32.5% of the variance. A direct comparison of factors and subfactors of job evaluation plans indicated mental effort and responsibility (accountability) had not been measured with the current job analysis data. Cross-validation of job evaluation factors and ratings with the wage rates is required to calibrate both.

  20. Factor-Analytic and Individualized Approaches to Constructing Brief Measures of ADHD Behaviors

    ERIC Educational Resources Information Center

    Volpe, Robert J.; Gadow, Kenneth D.; Blom-Hoffman, Jessica; Feinberg, Adam B.

    2009-01-01

    Two studies were performed to examine a factor-analytic and an individualized approach to creating short progress-monitoring measures from the longer "ADHD-Symptom Checklist-4" (ADHD-SC4). In Study 1, teacher ratings on items of the ADHD:Inattentive (IA) and ADHD:Hyperactive-Impulsive (HI) scales of the ADHD-SC4 were factor analyzed in a normative…

  1. Analysis of stock investment selection based on CAPM using covariance and genetic algorithm approach

    NASA Astrophysics Data System (ADS)

    Sukono; Susanti, D.; Najmia, M.; Lesmana, E.; Napitupulu, H.; Supian, S.; Putra, A. S.

    2018-03-01

    Investment is one of the economic growth factors of countries, especially in Indonesia, and stocks are a liquid form of investment. In making stock investment decisions, investors need to choose stocks that can generate maximum returns at a minimum level of risk; they therefore need to know how to allocate capital so that it yields the optimal benefit. This study discusses stock investment selection based on the CAPM, with the model's beta parameter estimated using a covariance approach and a genetic algorithm approach. It is assumed that the stocks analyzed follow the CAPM. The beta parameter of the CAPM equation is estimated in two ways: first by the covariance approach, and second by genetic algorithm optimization. As a numerical illustration, this paper analyzes ten stocks traded on the Indonesian capital market. The results show that estimating the beta parameter with the covariance approach and with the genetic algorithm approach leads to the same decision: six underpriced stocks with a buy decision and four overpriced stocks with a sell decision. Based on the analysis, the results can serve as a consideration for investors to buy the six underpriced stocks and sell the four overpriced stocks.
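
    The covariance estimate of beta and the pricing decision described above can be sketched as follows; the synthetic return series, risk-free rate, and single-stock setup are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic daily returns: a market index and one stock with true
# beta = 1.2 plus idiosyncratic noise.
T = 1000
market = rng.normal(0.0005, 0.01, T)
stock = 1.2 * market + rng.normal(0, 0.005, T)

# Covariance estimate of CAPM beta: beta = Cov(r_i, r_m) / Var(r_m).
beta = np.cov(stock, market)[0, 1] / np.var(market, ddof=1)

# CAPM required return for this beta; the risk-free rate is assumed.
rf = 0.0001
required = rf + beta * (market.mean() - rf)

# Decision rule from the abstract: a stock whose realized mean return
# exceeds its CAPM-required return is underpriced (buy), else
# overpriced (sell).
decision = "buy" if stock.mean() > required else "sell"
```

    In the paper's second approach, a genetic algorithm searches for the beta that best fits the same return data instead of computing it in closed form from the covariance.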

  2. Factors associated with escalation and problematic approaches toward public figures.

    PubMed

    Meloy, J Reid; James, David V; Mullen, Paul E; Pathé, Michele T; Farnham, Frank R; Preston, Lulu F; Darnley, Brian J

    2011-01-01

    Detailed comparison of factors associated with abnormal approach to the prominent and with escalation from communication to approach has not hitherto been undertaken. This partially reflects the failure of individual studies to adopt compatible terminologies. This study involves a careful dissection of six public figure studies, three involving U.S. politicians, two Hollywood celebrities, and one the British Royal Family. Common findings were unearthed across six headings. Approachers were significantly more likely to exhibit serious mental illness, engage in multiple means of communication, involve multiple contacts/targets, and to incorporate into their communication requests for help. They were significantly less likely to use threatening or antagonistic language in their communications, except in those cases involving security breaches. These results emphasize the importance of integrating mental health findings and preventive measures into risk management. Approach should not be regarded as a single behavioral category and has multiple motivations. Future studies should adopt standard terminology, preferably taken from the general stalking research. © 2010 American Academy of Forensic Sciences.

  3. Using BMDP and SPSS for a Q factor analysis.

    PubMed

    Tanner, B A; Koning, S M

    1980-12-01

    While Euclidean distances and Q factor analysis may sometimes be preferred to correlation coefficients and cluster analysis for developing a typology, commercially available software does not always facilitate their use. Commands are provided for using BMDP and SPSS in a Q factor analysis with Euclidean distances.
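
    A rough numpy sketch of the Q-mode idea, analyzing relations among persons via Euclidean distances between their score profiles rather than correlations between variables, independent of the BMDP/SPSS command sets the note supplies. The toy two-type data and the classical-scaling factoring step are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy profiles: 10 subjects x 8 test scores, with two subject "types"
# (the first five elevated on the first half of the scores, the last
# five on the second half).
scores = np.vstack([
    rng.normal(0, 1, (5, 8)) + np.array([2] * 4 + [0] * 4),
    rng.normal(0, 1, (5, 8)) + np.array([0] * 4 + [2] * 4),
])

# Q-mode: distances are between rows (subjects), not columns
# (variables), so the typology groups people, not tests.
diff = scores[:, None, :] - scores[None, :, :]
D = np.sqrt((diff ** 2).sum(axis=-1))        # 10 x 10 Euclidean distances

# Convert squared distances to a double-centered similarity matrix
# (as in classical scaling) and factor it by eigendecomposition.
J = np.eye(10) - np.ones((10, 10)) / 10
B = -0.5 * J @ (D ** 2) @ J
vals, vecs = np.linalg.eigh(B)
f1 = vecs[:, np.argmax(vals)]                # first person-factor

# Subjects of the same type should fall on the same side of f1.
```

    The double-centering step is what lets a distance matrix, rather than a correlation matrix, serve as input to the factoring, which is the substitution the note advocates.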

  4. Transcription Factor NRF2 as a Therapeutic Target for Chronic Diseases: A Systems Medicine Approach.

    PubMed

    Cuadrado, Antonio; Manda, Gina; Hassan, Ahmed; Alcaraz, María José; Barbas, Coral; Daiber, Andreas; Ghezzi, Pietro; León, Rafael; López, Manuela G; Oliva, Baldo; Pajares, Marta; Rojo, Ana I; Robledinos-Antón, Natalia; Valverde, Angela M; Guney, Emre; Schmidt, Harald H H W

    2018-04-01

    Systems medicine has a mechanism-based rather than a symptom- or organ-based approach to disease and identifies therapeutic targets in a nonhypothesis-driven manner. In this work, we apply this to transcription factor nuclear factor (erythroid-derived 2)-like 2 (NRF2) by cross-validating its position in a protein-protein interaction network (the NRF2 interactome) functionally linked to cytoprotection in low-grade stress, chronic inflammation, metabolic alterations, and reactive oxygen species formation. Multiscale network analysis of these molecular profiles suggests alterations of NRF2 expression and activity as a common mechanism in a subnetwork of diseases (the NRF2 diseasome). This network joins apparently heterogeneous phenotypes such as autoimmune, respiratory, digestive, cardiovascular, metabolic, and neurodegenerative diseases, along with cancer. Importantly, this approach matches and confirms in silico several applications for NRF2-modulating drugs validated in vivo at different phases of clinical development. Pharmacologically, their profile is as diverse as electrophilic dimethyl fumarate, synthetic triterpenoids like bardoxolone methyl and sulforaphane, protein-protein or DNA-protein interaction inhibitors, and even registered drugs such as metformin and statins, which activate NRF2 and may be repurposed for indications within the NRF2 cluster of disease phenotypes. Thus, NRF2 represents one of the first targets fully embraced by classic and systems medicine approaches to facilitate both drug development and drug repurposing by focusing on a set of disease phenotypes that appear to be mechanistically linked. The resulting NRF2 drugome may therefore rapidly advance several surprising clinical options for this subset of chronic diseases. Copyright © 2018 by The Author(s).

  5. Quality factor analysis for aberrated laser beam

    NASA Astrophysics Data System (ADS)

    Ghafary, B.; Alavynejad, M.; Kashani, F. D.

    2006-12-01

    The quality factor of laser beams has attracted considerable attention, and several different approaches have been reported to treat the problem. In this paper we analyze the quality factor of a laser beam and compare the effect of different aberrations on beam quality by expanding the pure phase term of the wavefront in terms of Zernike polynomials. We also analyze experimentally the change in beam quality for different astigmatism aberrations and compare the theoretical results with the experimental ones. The experimental and theoretical results are in good agreement.

  6. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    PubMed

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  7. A more rational, theory-driven approach to analysing the factor structure of the Edinburgh Postnatal Depression Scale.

    PubMed

    Kozinszky, Zoltan; Töreki, Annamária; Hompoth, Emőke A; Dudas, Robert B; Németh, Gábor

    2017-04-01

    We endeavoured to analyze the factor structure of the Edinburgh Postnatal Depression Scale (EPDS) during a screening programme in Hungary, using exploratory (EFA) and confirmatory factor analysis (CFA), testing both previously published models and newly developed theory-driven ones, after a critical analysis of the literature. Between April 2011 and January 2015, a sample of 2967 pregnant women (between 12th and 30th weeks of gestation) and 714 women 6 weeks after delivery completed the Hungarian version of the EPDS in South-East Hungary. EFAs suggested unidimensionality in both samples. 33 out of 42 previously published models showed good and 6 acceptable fit with our antepartum data in CFAs, whilst 10 of them showed good and 28 acceptable fit in our postpartum sample. Using multiple fit indices, our theory-driven anhedonia (items 1,2) - anxiety (items 4,5) - low mood (items 8,9) model provided the best fit in the antepartum sample. In the postpartum sample, our theory-driven models were again among the best performing models, including an anhedonia and an anxiety factor together with either a low mood or a suicidal risk factor (items 3,6,10). The EPDS showed moderate within- and between-culture invariability, although this would also need to be re-examined with a theory-driven approach. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  8. Parametric and experimental analysis using a power flow approach

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1990-01-01

    A structural power flow approach for the analysis of structure-borne transmission of vibrations is used to analyze the influence of structural parameters on transmitted power. The parametric analysis is also performed using the Statistical Energy Analysis approach, and the results are compared with those obtained using the power flow approach. The advantages of structural power flow analysis are demonstrated by comparing the types of results obtained by the two analytical methods. Also, to demonstrate that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental study of structural power flow is presented. This experimental study presents results for an L-shaped beam for which a solution was already available. Various methods to measure vibrational power flow are compared to assess their advantages and disadvantages.

  9. Likelihood-Based Confidence Intervals in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Oort, Frans J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…

  10. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  11. Risk Factors for Overweight/Obesity in Preschool Children: An Ecological Approach

    PubMed Central

    McBride, Brent A.; Fiese, Barbara H.; Jones, Blake L.; Cho, Hyunkeun

    2013-01-01

    Background: Identification of risk factors is critical to preventing the childhood obesity epidemic. Risk factors that contribute to obesity are multifactorial. However, limited research has focused on identifying obesity risk factors using an ecological approach. Methods: Baseline self-report survey data from the STRONG Kids program were used. The sample consisted of 329 parent-child dyads recruited from childcare programs in east-central Illinois. Child height and weight were measured and converted to age- and sex-specific z-scores using standard growth charts. An ecological model provided the theoretical framework for the selection of 22 previously reported childhood obesity risk factors. Multiple logistic regression analyses were used to identify risk factors. Results: Of 22 potential risk factors, three were found to be significantly associated with child overweight/obesity: child nighttime sleep duration (χ2=8.56; p=0.003), parent BMI (χ2=5.62; p=0.01), and parental restrictive feeding for weight control (χ2=4.77; p=0.02). Children who slept 8 hours or less were 2.2 times more likely to be overweight/obese (95% confidence interval (CI): 1.3–3.7), whereas children with an overweight/obese parent were 1.9 times more likely to be overweight/obese (95% CI: 1.12–3.2). Finally, children whose parents used restrictive feeding practices were 1.75 times more likely to be overweight/obese (95% CI: 1.06–2.9). Conclusions: Using an ecological approach, we conclude that childhood obesity prevention efforts may benefit from targeting the key risk factors of child sleep duration, parent BMI, and parental restrictive feeding practices. PMID:24020790
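    As a toy illustration of the kind of odds ratio reported above, the sketch below computes an unadjusted odds ratio with a 95% CI from a hypothetical 2x2 table. The study itself used multiple logistic regression with covariate adjustment, which this deliberately does not reproduce:

```python
import numpy as np

# Hypothetical 2x2 table: short sleep (<=8 h) vs. overweight/obesity,
# mimicking the style of association reported in the study.
#                   overweight  not overweight
table = np.array([[40,  60],     # short sleepers
                  [25, 110]])    # longer sleepers

# Cross-product odds ratio and a Wald 95% CI on the log-odds scale.
odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
se_log_or = np.sqrt((1.0 / table).sum())
ci_low, ci_high = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
```

    An adjusted analysis would instead exponentiate the fitted logistic regression coefficient, which equals this cross-product ratio only in the single-predictor case.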

  12. Work-Centered Approach to Insurgency Campaign Analysis

    DTIC Science & Technology

    2007-06-01

    …a constructivist or sensemaking philosophy by defining data, information, situation awareness, and situation understanding in the following manner… The present paper explores a new approach to understanding transnational insurgency movements, an approach based on a fundamental analysis of the knowledge … country or region. By focusing at the fundamental level of knowledge creation, the resulting framework allows an understanding of insurgency

  13. Knowledge-driven binning approach for rare variant association analysis: application to neuroimaging biomarkers in Alzheimer's disease.

    PubMed

    Kim, Dokyoon; Basile, Anna O; Bang, Lisa; Horgusluoglu, Emrin; Lee, Seunggeun; Ritchie, Marylyn D; Saykin, Andrew J; Nho, Kwangsik

    2017-05-18

    Rapid advancement of next generation sequencing technologies such as whole genome sequencing (WGS) has facilitated the search for genetic factors that influence disease risk in the field of human genetics. To identify rare variants associated with human diseases or traits, an efficient genome-wide binning approach is needed. In this study we developed a novel biological knowledge-based binning approach for rare-variant association analysis and then applied the approach to structural neuroimaging endophenotypes related to late-onset Alzheimer's disease (LOAD). For rare-variant analysis, we used the knowledge-driven binning approach implemented in Bin-KAT, an automated tool, that provides 1) binning/collapsing methods for multi-level variant aggregation with a flexible, biologically informed binning strategy and 2) an option of performing unified collapsing and statistical rare variant analyses in one tool. A total of 750 non-Hispanic Caucasian participants from the Alzheimer's Disease Neuroimaging Initiative (ADNI) cohort who had both WGS data and magnetic resonance imaging (MRI) scans were used in this study. Mean bilateral cortical thickness of the entorhinal cortex extracted from MRI scans was used as an AD-related neuroimaging endophenotype. SKAT was used for a genome-wide gene- and region-based association analysis of rare variants (MAF (minor allele frequency) < 0.05) and potential confounding factors (age, gender, years of education, intracranial volume (ICV) and MRI field strength) for entorhinal cortex thickness were used as covariates. Significant associations were determined using FDR adjustment for multiple comparisons. Our knowledge-driven binning approach identified 16 functional exonic rare variants in FANCC significantly associated with entorhinal cortex thickness (FDR-corrected p-value < 0.05). In addition, the approach identified 7 evolutionary conserved regions, which were mapped to FAF1, RFX7, LYPLAL1 and GOLGA3, significantly associated
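    A much-simplified sketch of the binning idea, with hypothetical genotypes and bin assignments: rare variants assigned to a gene bin are collapsed into a per-person burden score and regressed against the phenotype. This is a burden-style stand-in for illustration, not the SKAT statistic or the Bin-KAT tool used in the study, and the gene labels are placeholders:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
genotypes = rng.binomial(2, 0.01, size=(n, 12))      # 12 rare variants (MAF ~1%)
gene_bins = {"FANCC": [0, 1, 2, 3], "FAF1": [4, 5, 6, 7, 8], "OTHER": [9, 10, 11]}

phenotype = rng.normal(size=n)                       # e.g. cortical thickness
phenotype = phenotype - 0.8 * genotypes[:, gene_bins["FANCC"]].sum(axis=1)

def burden_test(bin_cols):
    """Collapse the bin's rare variants into a burden score and
    return the z-statistic of a simple regression on the phenotype."""
    burden = genotypes[:, bin_cols].sum(axis=1).astype(float)
    burden -= burden.mean()
    slope = (burden @ phenotype) / (burden @ burden)
    resid = phenotype - slope * burden
    se = np.sqrt((resid @ resid) / (n - 2) / (burden @ burden))
    return slope / se

z_fancc = burden_test(gene_bins["FANCC"])            # simulated true effect
z_other = burden_test(gene_bins["OTHER"])            # null bin
```

    Real region-based tests such as SKAT use a variance-component statistic that is robust to variants with opposing effect directions, which a plain burden score is not.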

  14. Analysis of spatio-temporal variability of C-factor derived from remote sensing data

    NASA Astrophysics Data System (ADS)

    Pechanec, Vilem; Benc, Antonin; Purkyt, Jan; Cudlin, Pavel

    2016-04-01

    In some risk areas, water erosion strongly influences agriculture and can threaten inhabitants. In our country a combination of the USLE and RUSLE models has been used for water erosion assessment (Krása et al., 2013). The role of vegetation cover is characterized by the vegetation protection factor, the so-called C-factor. The value of the C-factor is given by the ratio of soil loss from a plot with arable crops to that from a standard plot kept as fallow and regularly tilled after any rain (Janeček et al., 2012). When the crop structure and rotation cannot be identified, determining the C-factor over large areas can be a problem; in such cases the C-factor is determined only from the average crop representation. New technologies open possibilities for accelerating and refining the approach. The present-day approach to C-factor determination is based on the analysis of multispectral image data. The red and infrared bands are extracted and used to compute a series of vegetation indices (NDVI, TSAVI). Values acquired for fractional time sections (during the vegetation period) are averaged. At the same time, vegetation index values for a forest and a cleared area are determined, and regression coefficients are computed. The final calculation uses regression equations expressing the relation between NDVI values and the C-factor (De Jong, 1994; Van der Knijff, 1999; Karaburun, 2010). An up-to-date land use layer is used to determine erosion-threatened areas, based on the selection of individual landscape segments in erosion-susceptible land use categories. By means of Landsat 7 data, the C-factor was determined for the whole area of the Czech Republic for every month of 2014. In a small model watershed, the C-factor was also determined by the conventional (tabular) procedure. The analysis focused on: i) variability assessment of C-factor values while using the conventional
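    The NDVI-to-C-factor step can be sketched as follows. The reflectances are hypothetical, and the exponential form with alpha = 2 and beta = 1 is the parameterization commonly attributed to Van der Knijff et al. (1999); it is assumed here rather than taken from this abstract:

```python
import numpy as np

# Hypothetical red and near-infrared reflectances for three pixels,
# ordered from dense vegetation to nearly bare soil.
red = np.array([0.08, 0.12, 0.20])
nir = np.array([0.45, 0.30, 0.22])

ndvi = (nir - red) / (nir + red)

# Van der Knijff-style exponential relation between NDVI and C-factor;
# alpha = 2, beta = 1 are the commonly cited parameter values (assumed).
alpha, beta = 2.0, 1.0
c_factor = np.exp(-alpha * ndvi / (beta - ndvi))
```

    Denser vegetation (higher NDVI) yields a smaller C-factor, i.e. more protection against erosion, which is the monotone relation the regression equations encode.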

  15. What School Psychologists Need to Know about Factor Analysis

    ERIC Educational Resources Information Center

    McGill, Ryan J.; Dombrowski, Stefan C.

    2017-01-01

    Factor analysis is a versatile class of psychometric techniques used by researchers to provide insight into the psychological dimensions (factors) that may account for the relationships among variables in a given dataset. The primary goal of a factor analysis is to determine a more parsimonious set of variables (i.e., fewer than the number of…

  16. Q-Type Factor Analysis of Healthy Aged Men.

    ERIC Educational Resources Information Center

    Kleban, Morton H.

    Q-type factor analysis was used to re-analyze baseline data collected in 1957, on 47 men aged 65-91. Q-type analysis is the use of factor methods to study persons rather than tests. Although 550 variables were originally studied involving psychiatry, medicine, cerebral metabolism and chemistry, personality, audiometry, dichotic and diotic memory,…

  17. Meta-Analysis for Sociology – A Measure-Driven Approach

    PubMed Central

    Roelfs, David J.; Shor, Eran; Falzon, Louise; Davidson, Karina W.; Schwartz, Joseph E.

    2013-01-01

    Meta-analytic methods are becoming increasingly important in sociological research. In this article we present an approach for meta-analysis which is especially helpful for sociologists. Conventional approaches to meta-analysis often prioritize “concept-driven” literature searches. However, in disciplines with high theoretical diversity, such as sociology, this search approach might constrain the researcher’s ability to fully exploit the entire body of relevant work. We explicate a “measure-driven” approach, in which iterative searches and new computerized search techniques are used to increase the range of publications found (and thus the range of possible analyses) and to traverse time and disciplinary boundaries. We demonstrate this measure-driven search approach with two meta-analytic projects, examining the effects of various social variables on all-cause mortality. PMID:24163498

  18. Text mining factor analysis (TFA) in green tea patent data

    NASA Astrophysics Data System (ADS)

    Rahmawati, Sela; Suprijadi, Jadi; Zulhanif

    2017-03-01

    Factor analysis has become one of the most widely used multivariate statistical procedures in applied research across a multitude of domains. There are two main types of analysis based on factor analysis: Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA). Both EFA and CFA aim to model the observed relationships among a group of indicators with a latent variable, but they differ fundamentally in the a priori specification of, and restrictions placed on, the factor model. This method is applied to patent data from the green tea technology sector to trace the worldwide development of green tea technology. Patent analysis is useful for identifying future technological trends in a specific field of technology. The patent database was obtained from the European Patent Organization (EPO). In this paper, the CFA model is applied to nominal data obtained from a presence/absence matrix; the CFA for nominal data is based on the tetrachoric correlation matrix. Meanwhile, the EFA model is applied to titles from the dominant technology sector, which are first pre-processed using text mining.
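    The pre-processing step, building the presence/absence matrix from patent titles, can be sketched with hypothetical titles; the tetrachoric correlations and the factor models themselves are not shown:

```python
import numpy as np

# Hypothetical patent titles from a green tea technology sector.
titles = [
    "green tea extract beverage",
    "tea polyphenol extraction process",
    "green tea polyphenol capsule",
]

# Binary term-document (presence/absence) matrix: one row per patent,
# one column per term; this is the nominal data the tetrachoric
# correlations would be computed from.
vocab = sorted({w for t in titles for w in t.split()})
pa = np.array([[1 if w in t.split() else 0 for w in vocab] for t in titles])
```

    Real patent text mining would add tokenization, stop-word removal, and stemming before forming the matrix.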

  19. Mixture Factor Analysis for Approximating a Nonnormally Distributed Continuous Latent Factor with Continuous and Dichotomous Observed Variables

    ERIC Educational Resources Information Center

    Wall, Melanie M.; Guo, Jia; Amemiya, Yasuo

    2012-01-01

    Mixture factor analysis is examined as a means of flexibly estimating nonnormally distributed continuous latent factors in the presence of both continuous and dichotomous observed variables. A simulation study compares mixture factor analysis with normal maximum likelihood (ML) latent factor modeling. Different results emerge for continuous versus…

  20. Interpersonal Tension: A Two-Factor Approach to the POX Situation.

    ERIC Educational Resources Information Center

    Gupta, Mahesh

    1985-01-01

    A theoretical explanation, in terms of a two-factor approach to a Person-Other-Issue (POX) Situation, is offered in an attempt to fill the void that exists in the face of the Heider-Newcomb controversy about POX balance. Validity and parsimony are demonstrated by applying it to some of the POX data reported in earlier studies. (Author/BL)

  1. The Impact of Redundancy and Teamwork on Resilience Engineering Factors by Fuzzy Mathematical Programming and Analysis of Variance in a Large Petrochemical Plant.

    PubMed

    Azadeh, Ali; Salehi, Vahid; Mirzayi, Mahsa

    2016-12-01

    Resilience engineering (RE) is a new paradigm that can control incidents and reduce their consequences. Integrated RE includes four new factors (self-organization, teamwork, redundancy, and fault-tolerance) in addition to conventional RE factors. This study aimed to evaluate the impacts of these four factors on RE and determine the most efficient factor in an uncertain environment. The required data were collected through a questionnaire in a petrochemical plant in June 2013. The questionnaire was completed by 115 respondents including 37 managers and 78 operators. Fuzzy data envelopment analysis was used in different α-cuts in order to calculate the impact of each factor. Analysis of variance was employed to compare the efficiency score means of the four above-mentioned factors. The results showed that as α approached 0 and the system became fuzzier (α = 0.3 and α = 0.1), teamwork played a significant role and had the highest impact on the resilient system. In contrast, as α approached 1 and the fuzzy system went toward a certain mode (α = 0.9 and α = 1), redundancy had a vital role in the selected resilient system. Therefore, redundancy and teamwork were the most efficient factors. The approach developed in this study could be used for identifying the most important factors in such environments. The results of this study may help managers to have better understanding of weak and strong points in such industries.

  2. Using Linear Regression To Determine the Number of Factors To Retain in Factor Analysis and the Number of Issues To Retain in Delphi Studies and Other Surveys.

    ERIC Educational Resources Information Center

    Jurs, Stephen; And Others

    The scree test and its linear regression technique are reviewed, and results of its use in factor analysis and Delphi data sets are described. The scree test was originally a visual approach for making judgments about eigenvalues, which considered the relationships of the eigenvalues to one another as well as their actual values. The graph that is…
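    One way to operationalize the scree-by-regression idea, under assumptions, is to fit a line to the trailing "rubble" of small eigenvalues and retain every eigenvalue that sits clearly above the line's prediction. The eigenvalues, the choice of tail, and the 0.5 margin below are all hypothetical illustration, not the paper's exact procedure:

```python
import numpy as np

# Hypothetical eigenvalues from a factor analysis, in descending order.
eigenvalues = np.array([4.1, 2.3, 1.6, 0.45, 0.40, 0.36, 0.33, 0.30])

# Fit a straight line to the assumed scree (the last five eigenvalues).
tail = np.arange(3, len(eigenvalues))
slope, intercept = np.polyfit(tail, eigenvalues[tail], 1)

# Retain eigenvalues that rise clearly above the line's prediction.
predicted = slope * np.arange(len(eigenvalues)) + intercept
n_factors = int(np.sum(eigenvalues > predicted + 0.5))   # 0.5 = ad hoc margin
```

    This formalizes the visual judgment the scree plot usually demands: the regression line stands in for the "elbow" the eye would otherwise locate.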

  3. Common factor analysis versus principal component analysis: choice for symptom cluster research.

    PubMed

    Kim, Hee-Ju

    2008-03-01

    The purpose of this paper is to examine differences between two factor analytical methods and their relevance for symptom cluster research: common factor analysis (CFA) versus principal component analysis (PCA). Literature was critically reviewed to elucidate the differences between CFA and PCA. A secondary analysis (N = 84) was utilized to show the actual result differences from the two methods. CFA analyzes only the reliable common variance of data, while PCA analyzes all the variance of data. An underlying hypothetical process or construct is involved in CFA but not in PCA. PCA tends to increase factor loadings especially in a study with a small number of variables and/or low estimated communality. Thus, PCA is not appropriate for examining the structure of data. If the study purpose is to explain correlations among variables and to examine the structure of the data (this is usual for most cases in symptom cluster research), CFA provides a more accurate result. If the purpose of a study is to summarize data with a smaller number of variables, PCA is the choice. PCA can also be used as an initial step in CFA because it provides information regarding the maximum number and nature of factors. In using factor analysis for symptom cluster research, several issues need to be considered, including subjectivity of solution, sample size, symptom selection, and level of measure.
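    The core distinction the paper draws, PCA analyzing total variance versus common factor analysis analyzing only common variance, can be made concrete with a hypothetical correlation matrix: PCA eigendecomposes R with unities on the diagonal, while common factor analysis factors a reduced matrix whose diagonal holds communality estimates (squared multiple correlations are used below as the standard initial estimate):

```python
import numpy as np

# Hypothetical correlation matrix for four symptoms.
R = np.array([
    [1.00, 0.60, 0.55, 0.50],
    [0.60, 1.00, 0.50, 0.45],
    [0.55, 0.50, 1.00, 0.40],
    [0.50, 0.45, 0.40, 1.00],
])

# PCA factors the full matrix: total variance, diagonal of 1s.
pca_eigs = np.linalg.eigvalsh(R)[::-1]

# Common factor analysis factors a reduced matrix: the diagonal is
# replaced by communality estimates (squared multiple correlations).
smc = 1.0 - 1.0 / np.diag(np.linalg.inv(R))
R_reduced = R.copy()
np.fill_diagonal(R_reduced, smc)
cfa_eigs = np.linalg.eigvalsh(R_reduced)[::-1]
```

    Because the reduced matrix carries less variance, its leading eigenvalue (and hence the loadings) is smaller, which is exactly the loading inflation of PCA that the paper warns about for small variable sets.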

  4. How should health service organizations respond to diversity? A content analysis of six approaches.

    PubMed

    Seeleman, Conny; Essink-Bot, Marie-Louise; Stronks, Karien; Ingleby, David

    2015-11-16

    Health care organizations need to be responsive to the needs of increasingly diverse patient populations. We compared the contents of six publicly available approaches to organizational responsiveness to diversity. The central questions addressed in this paper are: what are the most consistently recommended issues for health care organizations to address in order to be responsive to the needs of diverse groups that differ from the majority population? How much consensus is there between various approaches? We purposively sampled six approaches from the US, Australia and Europe and used qualitative textual analysis to categorize the content of each approach into domains (conceptually distinct topic areas) and, within each domain, into dimensions (operationalizations). The resulting classification framework was used for comparative analysis of the content of the six approaches. We identified seven domains that were represented in most or all approaches: organizational commitment, empirical evidence on inequalities and needs, a competent and diverse workforce, ensuring access for all users, ensuring responsiveness in care provision, fostering patient and community participation, and actively promoting responsiveness. Variations in the operationalization of these domains related to different scopes, contexts and types of diversity. For example, approaches that focus on ethnic diversity mostly provide recommendations to handle cultural and language differences; approaches that take an intersectional approach and broaden their target population to vulnerable groups in a more general sense also pay attention to factors such as socio-economic status and gender. Despite differences in labeling, there is a broad consensus about what health care organizations need to do in order to be responsive to patient diversity. This opens the way to full scale implementation of organizational responsiveness in healthcare and structured evaluation of its effectiveness in improving

  5. Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis

    NASA Technical Reports Server (NTRS)

    Babcock, P.; Schor, A.; Rosch, G.

    1998-01-01

    This document is an adjunct to the final report, An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach we have taken in performing the safety analysis for the IAPR concept.

  6. A systematic review of methodology: time series regression analysis for environmental factors and infectious diseases.

    PubMed

    Imai, Chisato; Hashizume, Masahiro

    2015-03-01

    Time series analysis is suitable for investigating relatively direct and short-term effects of exposures on outcomes. In environmental epidemiology, this method has been one of the standard approaches to assessing the impact of environmental factors on acute non-infectious diseases (e.g. cardiovascular deaths), conventionally with generalized linear or additive models (GLM and GAM). However, the same analysis practices are often applied to infectious diseases despite substantial differences from non-infectious diseases that may pose analytical challenges. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic review was conducted to elucidate important issues in assessing the associations between environmental factors and infectious diseases using time series analysis with GLM and GAM. Published studies on the associations between weather factors and malaria, cholera, dengue, and influenza were targeted. Our review raised issues regarding the estimation of the susceptible population and exposure lag times, the adequacy of seasonal adjustments, the presence of strong autocorrelations, and the lack of a smaller observation time unit for outcomes (i.e. daily data). These concerns may be attributable to features specific to infectious diseases, such as transmission among individuals and complicated causal mechanisms. The consequence of not taking adequate measures to address these issues is distortion of the risk quantification of exposure factors. Future studies should pay careful attention to these details and examine alternative models or methods that improve time series regression analysis for environmental determinants of infectious diseases.
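    A minimal sketch of the GLM setup the review discusses, on simulated data: a lagged weather exposure plus annual harmonics for seasonal adjustment, fit as a Poisson regression by iteratively reweighted least squares. All values are simulated; a real infectious disease analysis would add the further corrections (susceptible population, autocorrelation) the review calls for:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(365)
temp = 20.0 + 8.0 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, 365)

lag = 14
temp_lag = np.roll(temp, lag)            # exposure lagged by two weeks
X = np.column_stack([
    np.ones(365),
    temp_lag,
    np.sin(2 * np.pi * t / 365),         # annual harmonics adjust for
    np.cos(2 * np.pi * t / 365),         # seasonality in the outcome
])[lag:]                                  # drop rows wrapped around by the lag
true_beta = np.array([2.0, 0.05, 0.4, 0.1])
y = rng.poisson(np.exp(X @ true_beta))   # simulated daily case counts

# Fit the Poisson GLM (log link) by iteratively reweighted least squares,
# starting from an OLS fit on the log scale to keep exp() stable.
beta = np.linalg.lstsq(X, np.log(y + 1.0), rcond=None)[0]
for _ in range(25):
    mu = np.exp(X @ beta)
    z = X @ beta + (y - mu) / mu         # working response
    W = mu                               # Poisson working weights
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
```

    Leaving the harmonic terms out of `X` would let the strong seasonal cycle shared by temperature and incidence masquerade as an exposure effect, which is the confounding issue the review emphasizes.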

  7. A Qualitative Analysis of Faculty and Student Perceptions of Effective Online Class Communities Using Herzberg's Motivator-Hygiene Factors

    ERIC Educational Resources Information Center

    Costello, Rebecca; Welch, S. A.

    2014-01-01

    This article describes a qualitative approach in understanding factors that are evident in effective online class communities. Instructors and students in the same class were asked about their perceptions regarding what constitutes an effective online experience. The analysis was done using both Herzberg's (1962, 1965) motivator-hygiene factors…

  8. Face Aging Effect Simulation Using Hidden Factor Analysis Joint Sparse Representation.

    PubMed

    Yang, Hongyu; Huang, Di; Wang, Yunhong; Wang, Heng; Tang, Yuanyan

    2016-06-01

    Face aging simulation has received increasing attention, but it remains a challenge to generate convincing and natural age-progressed face images. In this paper, we present a novel approach to this problem using hidden factor analysis joint sparse representation. In contrast to the majority of approaches in the literature that handle the facial texture integrally, the proposed aging approach separately models the person-specific facial properties that tend to be stable over a relatively long period and the age-specific clues that gradually change over time. It then transforms the age component to a target age group via sparse reconstruction, yielding aging effects, which is finally combined with the identity component to achieve the aged face. Experiments are carried out on three face aging databases, and the results achieved clearly demonstrate the effectiveness and robustness of the proposed method in rendering a face with aging effects. In addition, a series of evaluations prove its validity with respect to identity preservation and aging effect generation.

  9. Inclusive Higgs boson production at the LHC in the kT-factorization approach

    NASA Astrophysics Data System (ADS)

    Abdulov, N. A.; Lipatov, A. V.; Malyshev, M. A.

    2018-03-01

    We investigate inclusive Higgs boson production in proton-proton collisions under CERN LHC conditions using the kT-factorization approach. Our analysis is based on the dominant off-shell gluon-gluon fusion subprocess (where the transverse momenta of the initial gluons are taken into account) and covers the H → γγ, H → ZZ* → 4l (where l = e, μ) and H → W+W− → e±μ∓νν̄ decay channels. The transverse momentum dependent (or unintegrated) gluon densities in a proton were derived from the Ciafaloni-Catani-Fiorani-Marchesini equation, which resums large logarithmic terms proportional to ln s ∼ ln 1/x, important at high energies. As an alternative choice, we apply the Kimber-Martin-Ryskin prescription, where the transverse momentum dependent gluon density is constructed from the known conventional parton distributions. We estimate the theoretical uncertainties of our calculations and compare our results with next-to-next-to-leading-order plus next-to-next-to-leading-logarithmic ones obtained using collinear QCD factorization. Our predictions agree well with the latest experimental data taken by the CMS and ATLAS Collaborations at √s = 8 and 13 TeV.

  10. Derived Basic Ability Factors: A Factor Analysis Replication Study.

    ERIC Educational Resources Information Center

    Lee, Mickey, M.; Lee, Lynda Newby

    The purpose of this study was to replicate the study conducted by Potter, Sagraves, and McDonald to determine whether their recommended analysis could separate criterion variables into similar factors that were stable from year to year and from school to school. The replication samples consisted of all students attending Louisiana State University…

  11. Logistic regression analysis of risk factors for postoperative recurrence of spinal tumors and analysis of prognostic factors.

    PubMed

    Zhang, Shanyong; Yang, Lili; Peng, Chuangang; Wu, Minfei

    2018-02-01

    The aim of the present study was to investigate the risk factors for postoperative recurrence of spinal tumors by logistic regression analysis and analysis of prognostic factors. In total, 77 male and 48 female patients with spinal tumor were selected in our hospital from January, 2010 to December, 2015 and divided into the benign (n=76) and malignant groups (n=49). All the patients underwent microsurgical resection of spinal tumors and were reviewed regularly 3 months after operation. The McCormick grading system was used to evaluate the postoperative spinal cord function. Data were subjected to statistical analysis. Of the 125 cases, 63 cases showed improvement after operation, 50 cases were stable, and deterioration was found in 12 cases. The improvement rate of patients with cervical spine tumor, which reached 56.3%, was the highest. Fifty-two cases of sensory disturbance, 34 cases of pain, 30 cases of inability to exercise, 26 cases of ataxia, and 12 cases of sphincter disorders were found after operation. Seventy-two cases (57.6%) underwent total resection, 18 cases (14.4%) received subtotal resection, 23 cases (18.4%) received partial resection, and 12 cases (9.6%) were only treated with biopsy/decompression. Postoperative recurrence was found in 57 cases (45.6%). The mean recurrence time of patients in the malignant group was 27.49±6.09 months, and the mean recurrence time of patients in the benign group was 40.62±4.34 months. The results were significantly different (P<0.001). Recurrence was found in 18 cases of the benign group and 39 cases of the malignant group, and results were significantly different (P<0.001). Tumor recurrence was shorter in patients with a higher McCormick grade (P<0.001). Recurrence was found in 13 patients with resection and all the patients with partial resection or biopsy/decompression. The results were significantly different (P<0.001). Logistic regression analysis of total resection-related factors showed that total resection
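A minimal sketch of this kind of recurrence analysis, fit on synthetic data (the variable names, effect sizes, and cohort are illustrative assumptions, not the study's data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 300

# Synthetic cohort (illustrative only, not the paper's patients):
malignant = rng.integers(0, 2, n)          # 0 = benign, 1 = malignant
mccormick = rng.integers(1, 5, n)          # McCormick grade I-IV
total_resection = rng.integers(0, 2, n)    # 1 = total resection achieved

# Simulate recurrence with the directions reported in the abstract:
# malignancy and higher grade raise risk, total resection lowers it.
lin = -1.0 + 1.5 * malignant + 0.5 * (mccormick - 1) - 1.2 * total_resection
recurrence = rng.random(n) < 1 / (1 + np.exp(-lin))

X = np.column_stack([malignant, mccormick, total_resection])
model = LogisticRegression().fit(X, recurrence)

odds_ratios = np.exp(model.coef_[0])
print(dict(zip(["malignant", "mccormick", "total_resection"],
               odds_ratios.round(2))))
```

Exponentiated coefficients read directly as odds ratios: values above 1 indicate a risk factor for recurrence, below 1 a protective factor.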

  12. Logistic regression analysis of risk factors for postoperative recurrence of spinal tumors and analysis of prognostic factors

    PubMed Central

    Zhang, Shanyong; Yang, Lili; Peng, Chuangang; Wu, Minfei

    2018-01-01

    The aim of the present study was to investigate the risk factors for postoperative recurrence of spinal tumors by logistic regression analysis and analysis of prognostic factors. In total, 77 male and 48 female patients with spinal tumor were selected in our hospital from January, 2010 to December, 2015 and divided into the benign (n=76) and malignant groups (n=49). All the patients underwent microsurgical resection of spinal tumors and were reviewed regularly 3 months after operation. The McCormick grading system was used to evaluate the postoperative spinal cord function. Data were subjected to statistical analysis. Of the 125 cases, 63 cases showed improvement after operation, 50 cases were stable, and deterioration was found in 12 cases. The improvement rate of patients with cervical spine tumor, which reached 56.3%, was the highest. Fifty-two cases of sensory disturbance, 34 cases of pain, 30 cases of inability to exercise, 26 cases of ataxia, and 12 cases of sphincter disorders were found after operation. Seventy-two cases (57.6%) underwent total resection, 18 cases (14.4%) received subtotal resection, 23 cases (18.4%) received partial resection, and 12 cases (9.6%) were only treated with biopsy/decompression. Postoperative recurrence was found in 57 cases (45.6%). The mean recurrence time of patients in the malignant group was 27.49±6.09 months, and the mean recurrence time of patients in the benign group was 40.62±4.34 months. The results were significantly different (P<0.001). Recurrence was found in 18 cases of the benign group and 39 cases of the malignant group, and results were significantly different (P<0.001). Tumor recurrence was shorter in patients with a higher McCormick grade (P<0.001). Recurrence was found in 13 patients with resection and all the patients with partial resection or biopsy/decompression. The results were significantly different (P<0.001). Logistic regression analysis of total resection-related factors showed that total resection

  13. On the Likelihood Ratio Test for the Number of Factors in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Bentler, Peter M.; Yuan, Ke-Hai

    2007-01-01

    In exploratory factor analysis, when the number of factors exceeds the true number of factors, the likelihood ratio test statistic no longer follows the chi-square distribution due to a problem of rank deficiency and nonidentifiability of model parameters. As a result, decisions regarding the number of factors may be incorrect. Several…
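The likelihood ratio test in question can be sketched by comparing nested factor models on simulated data. This sketch uses scikit-learn's Gaussian FactorAnalysis; the degrees-of-freedom bookkeeping (one extra column of loadings minus one rotational constraint) is a common convention, stated here as an assumption.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n, p, k_true = 500, 8, 2

# Simulate data from a true two-factor model.
loadings = rng.standard_normal((p, k_true))
scores = rng.standard_normal((n, k_true))
X = scores @ loadings.T + 0.5 * rng.standard_normal((n, p))

def log_likelihood(k):
    fa = FactorAnalysis(n_components=k, random_state=0).fit(X)
    return n * fa.score(X)      # score() returns the mean log-likelihood

def lrt_pvalue(k):
    """P-value for H0: k factors suffice, vs. the (k+1)-factor model."""
    stat = max(0.0, 2 * (log_likelihood(k + 1) - log_likelihood(k)))
    df = p - k    # extra loadings column minus one rotational constraint
    return chi2.sf(stat, df)

print(f"k=1 vs k=2: p = {lrt_pvalue(1):.4g}")   # expect a strong rejection
print(f"k=2 vs k=3: p = {lrt_pvalue(2):.4g}")
```

The abstract's caveat applies to the second comparison: once k reaches the true number of factors, the chi-square reference distribution is no longer strictly valid.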

  14. A Brief History of the Philosophical Foundations of Exploratory Factor Analysis.

    ERIC Educational Resources Information Center

    Mulaik, Stanley A.

    1987-01-01

    Exploratory factor analysis derives its key ideas from many sources, including Aristotle, Francis Bacon, Descartes, Pearson and Yule, and Kant. The conclusions of exploratory factor analysis are never complete without subsequent confirmatory factor analysis. (Author/GDC)

  15. Temperature effects on the strainrange partitioning approach for creep-fatigue analysis

    NASA Technical Reports Server (NTRS)

    Halford, G. R.; Hirschberg, M. H.; Manson, S. S.

    1972-01-01

    Examination is made of the influence of temperature on the strainrange partitioning approach to creep-fatigue. Results for Cr-Mo steel and Type 316 stainless steel show the four partitioned strainrange-life relationships to be temperature insensitive to within a factor of two on cyclic life. Monotonic creep and tensile ductilities were also found to be temperature insensitive to within a factor of two. The approach provides bounds on cyclic life that can be readily established for any type of inelastic strain cycle. Continuous strain cycling results obtained over a broad range of high temperatures and frequencies are in excellent agreement with bounds provided by the approach. The observed transition from one bound to the other is also in good agreement with the approach.

  16. g-factor calculations from the generalized seniority approach

    NASA Astrophysics Data System (ADS)

    Maheshwari, Bhoomika; Jain, Ashok Kumar

    2018-05-01

    The generalized seniority approach we previously proposed to understand the B(E1)/B(E2)/B(E3) properties of semi-magic nuclei has been widely successful in explaining these properties and has led to an expansion in the scope of seniority isomers. In the present paper, we apply the generalized seniority scheme to understand the behavior of g-factors in semi-magic nuclei. We find that the magnetic moments and g-factors show a particle-number-independent behavior, as expected, and the understanding is consistent with the explanation of the transition probabilities.

  17. Contribution of biotic and abiotic factors in the natural attenuation of sulfamethoxazole: A path analysis approach.

    PubMed

    Li, Yan; Rashid, Azhar; Wang, Hongjie; Hu, Anyi; Lin, Lifeng; Yu, Chang-Ping; Chen, Meng; Sun, Qian

    2018-08-15

    Sulfamethoxazole (SMX) is a sulfonamide antibiotic, widely used as a curative and preventive drug for human, animal, and aquaculture bacterial infections. Its residues have been ubiquitously detected in surface waters and sediments. In the present study, SMX dissipation kinetics were studied in natural water samples from the Jiulong River under simulated complex natural conditions, as well as under conditions mimicking individual biotic and abiotic environmental factors in isolation. Structural equation modeling (SEM), employing the partial least squares technique in path coefficient analysis, was used to investigate the direct and indirect contributions of different environmental factors to the natural attenuation of SMX. The model explained 81% of the variability in natural attenuation as a dependent variable under the influence of the sole effects of direct photo-degradation, indirect photo-degradation, hydrolysis, microbial degradation and bacterial degradation. The results of the SEM suggested that direct and indirect photo-degradation were the major pathways in SMX natural attenuation. However, other biotic and abiotic factors also played a mediating role during natural attenuation. Furthermore, the potential transformation products of SMX were identified and their toxicity was evaluated. Copyright © 2018 Elsevier B.V. All rights reserved.
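Dissipation kinetics of this kind are commonly fit with a first-order decay model, C(t) = C0·exp(−kt). A minimal sketch with hypothetical concentration data (the values and noise level are assumptions, not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical SMX concentrations over time under one condition.
t = np.array([0, 1, 2, 4, 8, 16, 32], dtype=float)      # days
rng = np.random.default_rng(7)
true_C0, true_k = 100.0, 0.15                            # μg/L, 1/day
C = true_C0 * np.exp(-true_k * t) * (1 + 0.02 * rng.standard_normal(t.size))

def first_order(t, C0, k):
    """First-order dissipation model."""
    return C0 * np.exp(-k * t)

(C0_hat, k_hat), _ = curve_fit(first_order, t, C, p0=(90.0, 0.1))
half_life = np.log(2) / k_hat                            # DT50
print(f"k = {k_hat:.3f} /day, DT50 = {half_life:.1f} days")
```

Fitting the rate constant k separately for each isolated condition (dark control, sterilized, irradiated, etc.) is one way to apportion the contributions that the SEM then relates.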

  18. Human factors and systems engineering approach to patient safety for radiotherapy.

    PubMed

    Rivera, A Joy; Karsh, Ben-Tzion

    2008-01-01

    The traditional approach to solving patient safety problems in healthcare is to blame the last person to touch the patient. Since the publication of To Err is Human, however, the call has instead been to use human factors and systems engineering methods and principles to solve patient safety problems. Yet understanding of human factors and systems engineering is lacking, and confusion remains about what it means to apply their principles. This paper provides a primer on these disciplines and their applications to patient safety.

  19. Parametric and experimental analysis using a power flow approach

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1988-01-01

    Having defined and developed a structural power flow approach for the analysis of structure-borne transmission of structural vibrations, the technique is used to perform an analysis of the influence of structural parameters on the transmitted energy. As a base for comparison, the parametric analysis is first performed using a Statistical Energy Analysis approach and the results compared with those obtained using the power flow approach. The advantages of using structural power flow are thus demonstrated by comparing the type of results obtained by the two methods. Additionally, to demonstrate the advantages of using the power flow method and to show that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental investigation of structural power flow is also presented. Results are presented for an L-shaped beam for which an analytical solution has already been obtained. Furthermore, the various methods available to measure vibrational power flow are compared to investigate the advantages and disadvantages of each method.

  20. Network based transcription factor analysis of regenerating axolotl limbs

    PubMed Central

    2011-01-01

    Background Studies on amphibian limb regeneration began in the early 1700s, but we still do not completely understand the cellular and molecular events of this unique process. Understanding a complex biological process such as limb regeneration requires more than knowledge of the individual genes or proteins involved. Here we followed a systems biology approach in an effort to construct the networks and pathways of protein interactions involved in formation of the accumulation blastema in regenerating axolotl limbs. Results We used the human orthologs of proteins previously identified by our research team as bait to identify the transcription factor (TF) pathways and networks that regulate blastema formation in amputated axolotl limbs. The five most connected factors, c-Myc, SP1, HNF4A, ESR1 and p53 regulate ~50% of the proteins in our data. Among these, c-Myc and SP1 regulate 36.2% of the proteins. c-Myc was the most highly connected TF (71 targets). Network analysis showed that TGF-β1 and fibronectin (FN) lead to the activation of these TFs. We found that other TFs known to be involved in epigenetic reprogramming, such as Klf4, Oct4, and Lin28 are also connected to c-Myc and SP1. Conclusions Our study provides a systems biology approach to how different molecular entities inter-connect with each other during the formation of an accumulation blastema in regenerating axolotl limbs. This approach provides an in silico methodology to identify proteins that are not detected by experimental methods such as proteomics but are potentially important to blastema formation. We found that the TFs, c-Myc and SP1 and their target genes could potentially play a central role in limb regeneration. Systems biology has the potential to map out numerous other pathways that are crucial to blastema formation in regeneration-competent limbs, to compare these to the pathways that characterize regeneration-deficient limbs and finally, to identify stem cell markers in
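The "most highly connected TF" statistic is simply the out-degree of a node in the TF→target network. A toy sketch of that computation on a hypothetical edge list (not the paper's network):

```python
from collections import Counter

# Toy TF -> target edge list (illustrative; gene symbols are examples only).
edges = [
    ("c-Myc", "FN1"), ("c-Myc", "KLF4"), ("c-Myc", "LIN28A"),
    ("SP1", "FN1"), ("SP1", "TGFB1"),
    ("TP53", "MDM2"),
    ("HNF4A", "APOA1"),
]

# Out-degree: number of distinct regulatory edges leaving each TF.
out_degree = Counter(tf for tf, _ in edges)
hub, n_targets = out_degree.most_common(1)[0]
print(hub, n_targets)
```

In the study's full network the same count identifies c-Myc (71 targets) as the hub.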

  1. MARS approach for global sensitivity analysis of differential equation models with applications to dynamics of influenza infection.

    PubMed

    Lee, Yeonok; Wu, Hulin

    2012-01-01

    Differential equation models are widely used for the study of natural phenomena in many fields. The study usually involves unknown factors such as initial conditions and/or parameters. It is important to investigate the impact of unknown factors (parameters and initial conditions) on model outputs in order to better understand the system the model represents. Apportioning the uncertainty (variation) of output variables of a model according to the input factors is referred to as sensitivity analysis. In this paper, we focus on the global sensitivity analysis of ordinary differential equation (ODE) models over a time period using the multivariate adaptive regression spline (MARS) as a meta model based on the concept of the variance of conditional expectation (VCE). We suggest evaluating the VCE analytically using the MARS model structure of univariate tensor-product functions, which is more computationally efficient. Our simulation studies show that the MARS model approach performs very well and helps to significantly reduce the computational cost. We present an application example of sensitivity analysis of ODE models for influenza infection to further illustrate the usefulness of the proposed method.
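The VCE idea itself can be sketched without MARS: estimate Var(E[y|xi])/Var(y) for each input by binning. This is a generic Monte Carlo approximation on a toy model output, not the paper's analytic MARS-based evaluation.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Toy "ODE output": strongly driven by x1, weakly by x2; x3 is inert.
x = rng.random((n, 3))
y = (np.sin(2 * np.pi * x[:, 0]) + 0.3 * x[:, 1]
     + 0.05 * rng.standard_normal(n))

def first_order_index(xi, y, bins=50):
    """Sobol-style index Var(E[y|xi]) / Var(y), VCE estimated by binning."""
    edges = np.quantile(xi, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(xi, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    vce = np.average((cond_means - y.mean()) ** 2, weights=counts)
    return vce / y.var()

S = [first_order_index(x[:, i], y) for i in range(3)]
print([round(s, 3) for s in S])
```

The paper's contribution is replacing this brute-force conditional averaging with an analytic VCE computed from the fitted MARS basis functions, which avoids the large sample sizes binning requires.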

  2. Hand function evaluation: a factor analysis study.

    PubMed

    Jarus, T; Poremba, R

    1993-05-01

    The purpose of this study was to investigate hand function evaluations. Factor analysis with varimax rotation was used to assess the fundamental characteristics of the items included in the Jebsen Hand Function Test and the Smith Hand Function Evaluation. The study sample consisted of 144 subjects without disabilities and 22 subjects with Colles fracture. Results suggest a four factor solution: Factor I--pinch movement; Factor II--grasp; Factor III--target accuracy; and Factor IV--activities of daily living. These categories differentiated the subjects without Colles fracture from the subjects with Colles fracture. A hand function evaluation consisting of these four factors would be useful. Such an evaluation that can be used for current clinical purposes is provided.
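Factor analysis with varimax rotation, as used here, can be sketched on synthetic item scores (the two-factor "pinch"/"grasp" structure below is a constructed illustration, not the study's test items):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(5)
n = 400

# Synthetic item scores: items 0-2 load on "pinch", items 3-5 on "grasp".
pinch = rng.standard_normal(n)
grasp = rng.standard_normal(n)
noise = lambda: 0.3 * rng.standard_normal(n)
X = np.column_stack([pinch + noise(), pinch + noise(), pinch + noise(),
                     grasp + noise(), grasp + noise(), grasp + noise()])

# Varimax rotation drives each item toward one dominant factor loading,
# which is what makes the factors interpretable as "pinch", "grasp", etc.
fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0).fit(X)
loadings = fa.components_.T          # items x factors, rotated
print(np.round(loadings, 2))
```

Reading the rotated loading matrix column by column is how factor labels such as "pinch movement" or "grasp" are assigned.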

  3. What factors influence healthy aging? A person-centered approach among older adults in Taiwan.

    PubMed

    Liu, Li-Fan; Su, Pei-Fang

    2017-05-01

    The present study aimed to identify the health profiles of older adults by using latent class analysis to investigate health heterogeneity and to determine what factors predicted healthy aging among an oldest-old sample cohort that was followed up for 14 years in Taiwan. Data were drawn from five waves (carried out in 1993, 1996, 1999, 2003 and 2007) of the Taiwan Longitudinal Study on Aging to examine the changes in health heterogeneity in a nationally representative oldest-old cohort of Taiwanese. Overall, data from a total of 11 145 observations of 3155 older adults were considered. The influential factors predicting health changes were analyzed by using a generalized estimating equation. The results showed that four health profiles were identified among the aging population observed in the Taiwan Longitudinal Study on Aging. With increasing age, the combined effects of the physical functioning, cognitive and emotional health, and comorbidities of older adults significantly impact their health changes. Apart from health deteriorating with age and sex disparities, educational and economic status, health behaviors, and social participation at the individual level were found to be the robust factors in predicting healthy aging. In considering what factors impact healthy aging, we suggest that a person-centered approach would be useful and critical for policy makers to understand the compositions of health profiles and the influencing factors in view of a life-course perspective. Based on the factors identified as influencing healthy aging at the individual level, it is imperative from a policy-making perspective to maximize opportunities for healthy aging. Geriatr Gerontol Int 2017; 17: 697-707. © 2016 Japan Geriatrics Society.

  4. Factors Influencing Implementation of OHSAS 18001 in Indian Construction Organizations: Interpretive Structural Modeling Approach

    PubMed Central

    Rajaprasad, Sunku Venkata Siva; Chalapathi, Pasupulati Venkata

    2015-01-01

    Background Construction activity has made considerable breakthroughs in the past two decades on the back of increases in development activities, government policies, and public demand. At the same time, occupational health and safety issues have become a major concern to construction organizations. The unsatisfactory safety performance of the construction industry has always been highlighted, since the safety management system is a neglected area and is not implemented systematically in Indian construction organizations. Due to a lack of enforcement of the applicable legislation, most construction organizations are forced to opt for the implementation of Occupational Health and Safety Assessment Series (OHSAS) 18001 to improve safety performance. Methods In order to better understand factors influencing the implementation of OHSAS 18001, an interpretive structural modeling approach has been applied and the factors have been classified using matrice d'impacts croisés-multiplication appliquée à un classement (MICMAC) analysis. The study proposes the underlying theoretical framework to identify factors and to help management of Indian construction organizations to understand the interaction among factors influencing the implementation of OHSAS 18001. Results Safety culture, continual improvement, morale of employees, and safety training have been identified as dependent variables. Safety performance, sustainable construction, and conducive working environment have been identified as linkage variables. Management commitment and safety policy have been identified as the driver variables. Conclusion Management commitment has the maximum driving power and the most influential factor is safety policy, which states clearly the commitment of top management towards occupational safety and health. PMID:26929828
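The ISM/MICMAC mechanics reduce to a reachability matrix (transitive closure of the direct-influence relation) whose row and column sums give driving and dependence power. A small sketch with a hypothetical five-factor influence matrix (the real study uses a larger factor set):

```python
import numpy as np

# Hypothetical direct-influence matrix among 5 factors
# (1 = row factor directly influences column factor).
factors = ["mgmt commitment", "safety policy", "training",
           "safety culture", "safety performance"]
A = np.array([
    [0, 1, 1, 1, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 0, 0],
])

# Reachability matrix: transitive closure of (A + I), as in ISM.
R = (A + np.eye(len(A), dtype=int)) > 0
for _ in range(len(A)):
    R = (R.astype(int) @ R.astype(int)) > 0

# MICMAC: driving power = row sums, dependence power = column sums.
driving = R.sum(axis=1)
dependence = R.sum(axis=0)
for f, d, dep in zip(factors, driving, dependence):
    print(f"{f:18s} driving={d} dependence={dep}")
```

Plotting each factor by (driving, dependence) then yields the familiar driver / linkage / dependent / autonomous quadrants; in this toy matrix, management commitment has the maximum driving power, matching the study's conclusion.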

  5. Air-to-air combat analysis - Review of differential-gaming approaches

    NASA Technical Reports Server (NTRS)

    Ardema, M. D.

    1981-01-01

    The problem of evaluating the combat performance of fighter/attack aircraft is discussed, and the mathematical nature of the problem is examined. The following approaches to air combat analysis are reviewed: (1) differential-turning differential game and (2) coplanar differential game. Selected numerical examples of these approaches are presented. The relative advantages and disadvantages of each are analyzed, and it is concluded that air combat analysis is an extremely difficult mathematical problem and that no one method of approach is best for all purposes. The paper concludes with a discussion of how the two approaches might be used in a complementary manner.

  6. Multivariate analysis of prognostic factors in synovial sarcoma.

    PubMed

    Koh, Kyoung Hwan; Cho, Eun Yoon; Kim, Dong Wook; Seo, Sung Wook

    2009-11-01

    Many studies have described the diversity of synovial sarcoma in terms of its biological characteristics and clinical features. Moreover, much effort has been expended on the identification of prognostic factors because of the unpredictable behavior of synovial sarcomas. However, with the exception of tumor size, published results have been inconsistent. We attempted to identify independent risk factors using survival analysis. Forty-one consecutive patients with synovial sarcoma were prospectively followed from January 1997 to March 2008. Overall and progression-free survival for age, sex, tumor size, tumor location, metastasis at presentation, histologic subtype, chemotherapy, radiation therapy, and resection margin were analyzed, and standard multivariate Cox proportional hazard regression analysis was used to evaluate potential prognostic factors. Tumor size (>5 cm), nonlimb-based tumors, metastasis at presentation, and a monophasic subtype were associated with poorer overall survival. Multivariate analysis showed metastasis at presentation and monophasic tumor subtype affected overall survival. For progression-free survival, monophasic subtype was found to be the only prognostic factor. The study confirmed that histologic subtype is the single most important independent prognostic factor of synovial sarcoma regardless of tumor stage.
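Survival analyses like this one are built on estimated survival curves. A pure-numpy Kaplan-Meier (product-limit) sketch, with hypothetical follow-up data rather than the study's cohort:

```python
import numpy as np

def kaplan_meier(time, event):
    """Product-limit survival estimate; event=1 for death/progression,
    0 for censored. Returns {event time: S(t)}."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    surv, S = {}, 1.0
    for t in np.unique(time[event == 1]):
        at_risk = int(np.sum(time >= t))
        deaths = int(np.sum((time == t) & (event == 1)))
        S *= 1 - deaths / at_risk
        surv[float(t)] = S
    return surv

# Hypothetical follow-up times in months; 0 in `events` means censored.
months = [6, 12, 12, 20, 27, 34, 40, 55]
events = [1, 1, 0, 1, 1, 0, 1, 0]
curve = kaplan_meier(months, events)
print({t: round(s, 3) for t, s in curve.items()})
```

The multivariate step in the abstract (Cox proportional hazards) then models how covariates such as subtype shift the hazard underlying such curves.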

  7. A GIS-based approach for comparative analysis of potential fire risk assessment

    NASA Astrophysics Data System (ADS)

    Sun, Ying; Hu, Lieqiu; Liu, Huiping

    2007-06-01

    Urban fires are one of the most important sources of property loss and human casualty, and it is therefore necessary to assess potential fire risk with consideration of urban community safety. Two evaluation models are proposed, both integrated with GIS. One is a single-factor model concerning the accessibility of fire passages; the other is a grey clustering approach based on a multifactor system. In the latter model, fourteen factors are introduced and divided into four categories: security management, evacuation facility, construction resistance and fire fighting capability. A case study on the campus of Beijing Normal University is presented to describe the potential risk assessment models in detail. A comparative analysis of the two models is carried out to validate their accuracy; the results are approximately consistent with each other. Moreover, modeling with GIS promotes the efficiency of the potential risk assessment.

  8. Expression, Purification, and Analysis of Unknown Translation Factors from "Escherichia Coli": A Synthesis Approach

    ERIC Educational Resources Information Center

    Walter, Justin D.; Littlefield, Peter; Delbecq, Scott; Prody, Gerry; Spiegel, P. Clint

    2010-01-01

    New approaches are currently being developed to expose biochemistry and molecular biology undergraduates to a more interactive learning environment. Here, we propose a unique project-based laboratory module, which incorporates exposure to biophysical chemistry approaches to address problems in protein chemistry. Each of the experiments described…

  9. An Integrated Approach to Life Cycle Analysis

    NASA Technical Reports Server (NTRS)

    Chytka, T. M.; Brown, R. W.; Shih, A. T.; Reeves, J. D.; Dempsey, J. A.

    2006-01-01

    Life Cycle Analysis (LCA) is the evaluation of the impacts that design decisions have on a system and provides a framework for identifying and evaluating design benefits and burdens associated with the life cycles of space transportation systems from a "cradle-to-grave" approach. Sometimes called life cycle assessment, life cycle approach, or "cradle to grave analysis", it represents a rapidly emerging family of tools and techniques designed to be a decision support methodology and aid in the development of sustainable systems. The implementation of a Life Cycle Analysis can vary and may take many forms; from global system-level uncertainty-centered analysis to the assessment of individualized discriminatory metrics. This paper will focus on a proven LCA methodology developed by the Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center to quantify and assess key LCA discriminatory metrics, in particular affordability, reliability, maintainability, and operability. This paper will address issues inherent in Life Cycle Analysis including direct impacts, such as system development cost and crew safety, as well as indirect impacts, which often take the form of coupled metrics (i.e., the cost of system unreliability). Since LCA deals with the analysis of space vehicle system conceptual designs, it is imperative to stress that the goal of LCA is not to arrive at the answer but, rather, to provide important inputs to a broader strategic planning process, allowing the managers to make risk-informed decisions, and increase the likelihood of meeting mission success criteria.

  10. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
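The weight-perturbation step of such an analysis can be sketched in miniature: perturb the criteria weights, renormalize, and track how stable the resulting ranking is. The scores and weights below are hypothetical stand-ins for standardized criterion layers, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(11)

# Standardized criterion scores for 4 map units (rows) x 3 criteria (cols).
scores = np.array([
    [0.9, 0.2, 0.4],
    [0.5, 0.8, 0.3],
    [0.2, 0.4, 0.9],
    [0.6, 0.6, 0.6],
])
base_weights = np.array([0.5, 0.3, 0.2])   # e.g. AHP-derived

def top_ranked(weights):
    """Index of the map unit with the highest weighted-sum susceptibility."""
    return int(np.argmax(scores @ weights))

# Monte Carlo: perturb weights, renormalize, count who ranks first.
n_sim = 10_000
wins = np.zeros(len(scores))
for _ in range(n_sim):
    w = np.maximum(base_weights + 0.1 * rng.standard_normal(3), 1e-6)
    wins[top_ranked(w / w.sum())] += 1

print((wins / n_sim).round(3))   # share of simulations each unit ranks first
```

Units whose first-place share is spread across several candidates are exactly the cells where the susceptibility map is most sensitive to the criteria weights.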

  11. A Flight Evaluation of the Factors which Influence the Selection of Landing Approach Speeds

    NASA Technical Reports Server (NTRS)

    Drinkwater, Fred J., III; Cooper, George E.

    1958-01-01

    The factors which influence the selection of landing approach speeds are discussed from the pilot's point of view. Concepts were developed and data were obtained during a landing approach flight investigation of a large number of jet airplane configurations which included straight-wing, swept-wing, and delta-wing airplanes as well as several applications of boundary-layer control. Since the fundamental limitation to further reductions in approach speed on most configurations appeared to be associated with the reduction in the pilot's ability to control flight path angle and airspeed, this problem forms the basis of the report. A simplified equation is presented showing the basic parameters which govern the flight path angle and airspeed changes, and pilot control techniques are discussed in relation to this equation. Attention is given to several independent aerodynamic characteristics which do not affect the flight path angle or airspeed directly but which determine to a large extent the effort and attention required of the pilot in controlling these factors during the approach. These include stall characteristics, stability about all axes, and changes in trim due to thrust adjustments. The report considers the relationship between piloting technique and all of the factors previously mentioned. A piloting technique which was found to be highly desirable for control of high-performance airplanes is described and the pilot's attitudes toward low-speed flight which bear heavily on the selection of landing approach speeds under operational conditions are discussed.

  12. Landslides distribution analysis and role of triggering factors in the Foglia river basin (Central Italy)

    NASA Astrophysics Data System (ADS)

    Baioni, Davide; Gallerini, Giuliano; Sgavetti, Maria

    2013-04-01

    The present work focuses on the distribution of landslides in the Foglia river basin (northern Marche-Romagna), using a heuristic approach supported by GIS tools for statistical analysis of spatial data. The study area is located on the Adriatic side of the northern Apennines, at the boundary marking the transition between the Marche and Emilia-Romagna regions. The Foglia river basin extends from the Apennines to the Adriatic Sea with a NE-SE trend, occupying an area of about 708 km2. The purpose of this study is to investigate relationships between factors related to the territory, which were taken into account and divided into classes, and landslides. For this aim, the landslide distribution was studied using a GIS approach, superimposing each previously created thematic map on the surveyed landslides. Furthermore, we tried to isolate the most recurrent classes, to detect whether, under the same conditions, one parameter affects the outcome more than the others, so as to recognize direct cause-and-effect relationships. Finally, an analysis was conducted by applying the CF (Certainty Factor) uncertainty model. In the Foglia river basin, a total of 2821 landslides were surveyed, occupying a total area of 155 km2, corresponding to 22% of the areal extent of the entire basin. The results of the analysis highlighted the importance and role of the individual factors that led to the development of the landslides analyzed. Moreover, this methodology may be applied at any order of magnitude and scale, since it does not require a significant commitment of economic or human resources.
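The Certainty Factor model compares the landslide probability within a factor class against the prior over the whole study area. A sketch using one common formulation of CF (the specific formula and the class statistics below are assumptions for illustration):

```python
def certainty_factor(pp_a, pp_s):
    """CF for one factor class: pp_a is the conditional landslide
    probability within the class, pp_s the prior over the study area.
    Ranges from -1 (class disfavors landslides) to +1 (class favors)."""
    if pp_a >= pp_s:
        return (pp_a - pp_s) / (pp_a * (1 - pp_s))
    return (pp_a - pp_s) / (pp_s * (1 - pp_a))

# Prior: 22% of the basin area is in landslide, as reported above.
prior = 0.22

# Hypothetical landslide fractions within three slope classes.
for label, pp_a in [("gentle slopes", 0.08), ("moderate slopes", 0.25),
                    ("steep slopes", 0.45)]:
    print(f"{label:16s} CF = {certainty_factor(pp_a, prior):+.3f}")
```

Positive CF values flag classes where landslides are over-represented relative to the basin-wide prior; summing CF layers is how a susceptibility map is assembled in this framework.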

  13. Factor Analysis for Clustered Observations.

    ERIC Educational Resources Information Center

    Longford, N. T.; Muthen, B. O.

    1992-01-01

    A two-level model for factor analysis is defined, and formulas for a scoring algorithm for this model are derived. A simple noniterative method based on decomposition of total sums of the squares and cross-products is discussed and illustrated with simulated data and data from the Second International Mathematics Study. (SLD)
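The "decomposition of total sums of squares and cross-products" mentioned here separates between-cluster from within-cluster variation. A one-variable sketch on simulated two-level data (schools and students are illustrative labels):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy clustered data: 20 clusters ("schools"), 30 observations each.
n_clusters, n_per = 20, 30
cluster_effect = rng.standard_normal(n_clusters)           # level-2 variation
scores = (np.repeat(cluster_effect, n_per)
          + rng.standard_normal(n_clusters * n_per))       # level-1 variation
groups = np.repeat(np.arange(n_clusters), n_per)

grand_mean = scores.mean()
sst = np.sum((scores - grand_mean) ** 2)                   # total SS

# Decompose into between-cluster and within-cluster sums of squares.
cluster_means = np.array([scores[groups == g].mean()
                          for g in range(n_clusters)])
ssb = n_per * np.sum((cluster_means - grand_mean) ** 2)
ssw = sum(np.sum((scores[groups == g] - cluster_means[g]) ** 2)
          for g in range(n_clusters))

print(np.isclose(sst, ssb + ssw))   # the decomposition is exact
```

In the two-level factor model, the analogous between and within cross-product matrices are factor-analyzed separately, which is what distinguishes level-1 from level-2 factors.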

  14. Analysis of Salmonella sp bacterial contamination on Vannamei Shrimp using binary logit model approach

    NASA Astrophysics Data System (ADS)

    Oktaviana, P. P.; Fithriasari, K.

    2018-04-01

    Most Indonesian citizens consume vannamei shrimp as food, and vannamei shrimp is also one of Indonesia's mainstay export commodities. Vannamei shrimp in ponds and markets can be contaminated by Salmonella sp bacteria, which endangers human health. Salmonella sp bacterial contamination of vannamei shrimp can be affected by many factors. This study is intended to identify the factors that supposedly influence Salmonella sp bacterial contamination of vannamei shrimp. The researchers used the testing result of Salmonella sp bacterial contamination of vannamei shrimp as the response variable. This response variable has two categories: 0 = the test result indicates no Salmonella sp on the vannamei shrimp; 1 = the test result indicates Salmonella sp on the vannamei shrimp. There are four factors that supposedly influence Salmonella sp bacterial contamination of vannamei shrimp: the testing result of Salmonella sp bacterial contamination of farmer hand swabs; the subdistrict of the vannamei shrimp ponds; the fish processing unit supplied; and the pond area in hectares. These four factors were used as predictor variables. The analysis used is the binary logit model approach, in accordance with the response variable having two categories. The analysis results indicate that the predictor variables which significantly affect Salmonella sp bacterial contamination of vannamei shrimp are the testing result of Salmonella sp bacterial contamination of farmer hand swabs and the subdistrict of the vannamei shrimp ponds.
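A binary logit model with a mix of binary, categorical, and continuous predictors, as described here, can be sketched with the statsmodels formula interface on synthetic records (the variable names, subdistrict labels, and effect sizes are illustrative assumptions, not the study's data):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n = 400

# Synthetic pond records (illustrative only).
df = pd.DataFrame({
    "hand_swab": rng.integers(0, 2, n),            # 1 = swab test positive
    "subdistrict": rng.choice(["A", "B", "C"], n),
    "pond_area": rng.uniform(0.5, 5.0, n),         # hectares
})
lin = -2.0 + 1.8 * df.hand_swab + (df.subdistrict == "C") * 1.0
df["salmonella"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

# Binary logit: Salmonella detection vs pond-level predictors.
model = smf.logit("salmonella ~ hand_swab + C(subdistrict) + pond_area",
                  data=df).fit(disp=0)
print(np.exp(model.params).round(2))    # odds ratios
```

The Wald p-values in `model.summary()` are what would single out the hand-swab result and subdistrict as the significant predictors, mirroring the abstract's conclusion.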

  15. The Common Factors Discrimination Model: An Integrated Approach to Counselor Supervision

    ERIC Educational Resources Information Center

    Crunk, A. Elizabeth; Barden, Sejal M.

    2017-01-01

    Numerous models of clinical supervision have been developed; however, there is little empirical support indicating that any one model is superior. Therefore, common factors approaches to supervision integrate essential components that are shared among counseling and supervision models. The purpose of this paper is to present an innovative model of…

  16. A Mixed-Methods Approach to Demotivating Factors among Iranian EFL Learners

    ERIC Educational Resources Information Center

    Ghonsooly, Behzad; Hassanzadeh, Tahereh; Samavarchi, Laila; Hamedi, Seyyedeh Mina

    2017-01-01

    This study used a mixed-methods approach to investigate Iranian EFL learners' attitudes towards demotivating factors which may hinder their success in a language learning course. In the quantitative phase, a sample of 337 undergraduate students from universities in Mashhad, Yazd and Gonabad completed a 34-item questionnaire. They also completed…

  17. [The Common Risk Factor Approach - An Integrated Population- and Evidence-Based Approach for Reducing Social Inequalities in Oral Health].

    PubMed

    Heilmann, A; Sheiham, A; Watt, R G; Jordan, R A

    2016-10-01

    Worldwide, non-communicable diseases, including dental caries and periodontal diseases, remain a major public health problem. Moreover, there is a social gradient in health across society that runs from the top to the bottom in a linear, stepwise fashion. Health-promoting behaviours become more difficult to sustain further down the social ladder. Oral health inequalities also exist in Germany. Earlier explanations of social inequalities have mainly focused on individual lifestyle factors, ignoring the broader social determinants of health and disease. Until recently, the dominant approaches to general health promotion focused on actions to reduce specific diseases, separating oral health from general health. An alternative approach is the common risk factor approach (CRFA), in which risk factors common to a number of major chronic diseases, including diseases of the mouth and teeth, are tackled. The CRFA focuses on the common underlying determinants of health to improve the overall health of populations, thereby reducing social inequalities. The main implication of the CRFA for oral health policies is to work in partnership with a range of other sectors and disciplines. Oral health issues need to be integrated with recommendations to promote general health. Improvements in oral health and a reduction in oral health inequalities are more likely when working in partnership across sectors and disciplines using strategies that focus upstream on the underlying determinants of oral diseases. © Georg Thieme Verlag KG Stuttgart · New York.

  18. Using factor analysis to identify neuromuscular synergies during treadmill walking

    NASA Technical Reports Server (NTRS)

    Merkle, L. A.; Layne, C. S.; Bloomberg, J. J.; Zhang, J. J.

    1998-01-01

    Neuroscientists are often interested in grouping variables to facilitate understanding of a particular phenomenon. Factor analysis is a powerful statistical technique that groups variables into conceptually meaningful clusters, but it remains underutilized by neuroscience researchers, presumably because of its complicated concepts and procedures. This paper illustrates an application of factor analysis to identify coordinated patterns of whole-body muscle activation during treadmill walking. Ten male subjects walked on a treadmill (6.4 km/h) for 20 s, during which surface electromyographic (EMG) activity was obtained from the left-side sternocleidomastoid, neck extensors, and erector spinae, and the right-side biceps femoris, rectus femoris, tibialis anterior, and medial gastrocnemius. Factor analysis revealed that 65% of the variance of the seven muscles sampled aligned with two orthogonal factors, labeled 'transition control' and 'loading'. These two factors describe coordinated patterns of muscular activity across body segments that would not be evident from evaluating individual muscle patterns. The results show that factor analysis can be used effectively to explore relationships among muscle patterns across all body segments to increase understanding of the complex coordination necessary for smooth and efficient locomotion. We encourage neuroscientists to consider using factor analysis to identify coordinated patterns of neuromuscular activation that would be obscured by more traditional EMG analyses.
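    This kind of synergy extraction can be sketched with scikit-learn's `FactorAnalysis` on simulated EMG envelopes; the two-synergy generative model and the noise level are assumptions, not the study's procedure:

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # Simulated EMG envelopes (assumed): 7 muscles driven by 2 latent synergies
    rng = np.random.default_rng(1)
    t = np.linspace(0, 2 * np.pi, 500)
    synergies = np.column_stack([np.sin(t) ** 2, np.cos(t) ** 2])  # 500 x 2
    loadings = rng.uniform(0, 1, (2, 7))                            # 2 x 7
    emg = synergies @ loadings + 0.05 * rng.standard_normal((500, 7))

    # Extract two factors; components_ gives each muscle's loading on each factor
    fa = FactorAnalysis(n_components=2, random_state=0)
    scores = fa.fit_transform(emg)
    print(fa.components_.shape)  # (2, 7)
    ```

    Inspecting which muscles load heavily on the same factor is what lets one label a factor as a coordinated pattern such as 'loading'.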

  19. Scalable non-negative matrix tri-factorization.

    PubMed

    Čopar, Andrej; Žitnik, Marinka; Zupan, Blaž

    2017-01-01

    Matrix factorization is a well-established pattern discovery tool that has seen numerous applications in biomedical data analytics, such as gene expression co-clustering, patient stratification, and gene-disease association mining. Matrix factorization learns a latent data model that takes a data matrix and transforms it into a latent feature space enabling generalization, noise removal and feature discovery. However, factorization algorithms are numerically intensive, and hence there is a pressing challenge to scale current algorithms to work with large datasets. Our focus in this paper is matrix tri-factorization, a popular method that is not limited by the assumption of standard matrix factorization that data reside in one latent space. Matrix tri-factorization solves this by inferring a separate latent space for each dimension of a data matrix, and a latent mapping of interactions between the inferred spaces, making the approach particularly suitable for biomedical data mining. We developed a block-wise approach for latent factor learning in matrix tri-factorization. The approach partitions a data matrix into disjoint submatrices that are treated independently and fed into a parallel factorization system. An appealing property of the proposed approach is its mathematical equivalence with serial matrix tri-factorization. In a study on large biomedical datasets we show that our approach scales well on multi-processor and multi-GPU architectures. On a four-GPU system we demonstrate that our approach can be more than 100 times faster than its single-processor counterpart. A general approach for scaling non-negative matrix tri-factorization is proposed. The approach is especially useful for parallel matrix factorization implemented in a multi-GPU environment. We expect the new approach will be useful in emerging procedures for latent factor analysis, notably for data integration, where many large data matrices need to be collectively factorized.
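    A toy serial version of non-negative matrix tri-factorization, X ≈ U S Vᵀ, can be sketched with standard multiplicative updates; this is an illustrative sketch of the underlying factorization, not the authors' block-wise parallel system:

    ```python
    import numpy as np

    def nmtf(X, k1, k2, n_iter=200, eps=1e-9, seed=0):
        """Non-negative tri-factorization X ~ U @ S @ V.T via multiplicative
        updates that decrease the squared Frobenius reconstruction error."""
        rng = np.random.default_rng(seed)
        m, n = X.shape
        U = rng.random((m, k1))
        S = rng.random((k1, k2))
        V = rng.random((n, k2))
        for _ in range(n_iter):
            U *= (X @ V @ S.T) / (U @ (S @ (V.T @ V) @ S.T) + eps)
            V *= (X.T @ U @ S) / (V @ (S.T @ (U.T @ U) @ S) + eps)
            S *= (U.T @ X @ V) / ((U.T @ U) @ S @ (V.T @ V) + eps)
        return U, S, V

    rng = np.random.default_rng(42)
    X = rng.random((30, 20))
    U, S, V = nmtf(X, 4, 3)
    err = np.linalg.norm(X - U @ S @ V.T) / np.linalg.norm(X)
    print(round(err, 3))  # relative reconstruction error
    ```

    The separate row space U and column space V are what distinguish tri-factorization from the two-factor model X ≈ WH; S maps interactions between the two inferred spaces.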

  20. A Systematic Review of Methodology: Time Series Regression Analysis for Environmental Factors and Infectious Diseases

    PubMed Central

    Imai, Chisato; Hashizume, Masahiro

    2015-01-01

    Background: Time series analysis is suitable for investigations of relatively direct and short-term effects of exposures on outcomes. In environmental epidemiology studies, this method has been one of the standard approaches to assess impacts of environmental factors on acute non-infectious diseases (e.g. cardiovascular deaths), conventionally with generalized linear or additive models (GLM and GAM). However, the same analysis practices are often observed with infectious diseases despite substantial differences from non-infectious diseases that may result in analytical challenges. Methods: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic review was conducted to elucidate important issues in assessing the associations between environmental factors and infectious diseases using time series analysis with GLM and GAM. Published studies on the associations between weather factors and malaria, cholera, dengue, and influenza were targeted. Findings: Our review raised issues regarding the estimation of susceptible populations and exposure lag times, the adequacy of seasonal adjustments, the presence of strong autocorrelations, and the lack of a smaller observation time unit of outcomes (i.e. daily data). These concerns may be attributable to features specific to infectious diseases, such as transmission among individuals and complicated causal mechanisms. Conclusion: The consequence of not taking adequate measures to address these issues is distortion of the appropriate risk quantification of exposure factors. Future studies should pay careful attention to details and examine alternative models or methods that improve studies using time series regression analysis for environmental determinants of infectious diseases. PMID:25859149
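    The GLM-based time series regression discussed here can be sketched as follows; the weekly counts, the 2-week exposure lag, and the harmonic seasonal terms are illustrative assumptions, not a reviewed study's model:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Illustrative weekly data (assumed): disease counts vs lagged temperature
    rng = np.random.default_rng(3)
    weeks = np.arange(156)  # three years of weekly observations
    temp = 20 + 8 * np.sin(2 * np.pi * weeks / 52) + rng.normal(0, 1, 156)
    counts = rng.poisson(np.exp(1.5 + 0.03 * temp))

    df = pd.DataFrame({"counts": counts, "temp": temp, "week": weeks})
    df["temp_lag2"] = df["temp"].shift(2)  # 2-week exposure lag
    # crude seasonal adjustment with annual harmonic terms
    df["sin52"] = np.sin(2 * np.pi * df["week"] / 52)
    df["cos52"] = np.cos(2 * np.pi * df["week"] / 52)
    df = df.dropna()

    X = sm.add_constant(df[["temp_lag2", "sin52", "cos52"]])
    fit = sm.GLM(df["counts"], X, family=sm.families.Poisson()).fit()
    print(fit.params["temp_lag2"])
    ```

    The review's concerns map directly onto this sketch: the lag length, the adequacy of the harmonic seasonal adjustment, and residual autocorrelation all need explicit checking for infectious disease outcomes.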

  1. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis.

    PubMed

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was not a key success factor for situational awareness. We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped in interpreting complex, mixed data on small-N cases and resulted in concrete and practically relevant findings.

  2. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaye, R.D.; Henriksen, K.; Jones, R.

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis database is also included.

  3. Factors Contributing to Changes in a Deep Approach to Learning in Different Learning Environments

    ERIC Educational Resources Information Center

    Postareff, Liisa; Parpala, Anna; Lindblom-Ylänne, Sari

    2015-01-01

    The study explored factors explaining changes in a deep approach to learning. The data consisted of interviews with 12 students from four Bachelor-level courses representing different disciplines. We analysed and compared descriptions of students whose deep approach either increased, decreased or remained relatively unchanged during their courses.…

  4. Derringer desirability and kinetic plot LC-column comparison approach for MS-compatible lipopeptide analysis.

    PubMed

    D'Hondt, Matthias; Verbeke, Frederick; Stalmans, Sofie; Gevaert, Bert; Wynendaele, Evelien; De Spiegeleer, Bart

    2014-06-01

    Lipopeptides are currently re-emerging as an interesting subgroup in the peptide research field, having historical applications as antibacterial and antifungal agents and new potential applications as antiviral, antitumor, immune-modulating and cell-penetrating compounds. However, due to their specific structure, chromatographic analysis often requires special buffer systems or the use of trifluoroacetic acid, limiting mass spectrometry detection. Therefore, we used a traditional aqueous/acetonitrile based gradient system, containing 0.1% (m/v) formic acid, to separate four pharmaceutically relevant lipopeptides (polymyxin B1, caspofungin, daptomycin and gramicidin A1), which were selected based upon hierarchical cluster analysis (HCA) and principal component analysis (PCA). In total, the performance of four different C18 columns, including one UPLC column, was evaluated using two parallel approaches. First, a Derringer desirability function was used, whereby six single and multiple chromatographic response values were rescaled into one overall D-value per column. Using this approach, the YMC Pack Pro C18 column was ranked as the best column for general MS-compatible lipopeptide separation. Secondly, the kinetic plot approach was used to compare the different columns at different flow rate ranges. As the optimal kinetic column performance is obtained at its maximal pressure, the length elongation factor λ (Pmax/Pexp) was used to transform the obtained experimental data (retention times and peak capacities) and construct kinetic performance limit (KPL) curves, allowing a direct visual and unbiased comparison of the selected columns, whereby the YMC Triart C18 UPLC and ACE C18 columns performed best. Finally, differences in column performance and the (dis)advantages of both approaches are discussed.

  5. Microscopic saw mark analysis: an empirical approach.

    PubMed

    Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

    2015-01-01

    Microscopic saw mark analysis is a well-published and generally accepted qualitative analytical method. However, little research has focused on identifying and mitigating potential sources of error associated with the method. The present study proposes the use of classification trees and random forest classifiers as an optimal, statistically sound approach to mitigating the potential for observer variability and outcome error in microscopic saw mark analysis. The statistical model was applied to 58 experimental saw marks created with four types of saws. The saw marks were made in fresh human femurs obtained through anatomical gift and were analyzed using a Keyence digital microscope. The statistical approach weighted the variables based on discriminatory value and produced decision trees with an associated outcome error rate of 8.62-17.82%. © 2014 American Academy of Forensic Sciences.
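    A random forest classifier of the kind the study proposes can be sketched with scikit-learn; the synthetic features below stand in for the saw-mark measurements, which are not described in this record:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    # Synthetic stand-in (assumed) for 58 saw marks from four saw types
    X, y = make_classification(n_samples=58, n_features=6, n_informative=4,
                               n_classes=4, n_clusters_per_class=1,
                               random_state=0)

    # Out-of-bag score gives an internal error estimate without a holdout set
    clf = RandomForestClassifier(n_estimators=500, oob_score=True,
                                 random_state=0)
    clf.fit(X, y)
    print(1 - clf.oob_score_)        # out-of-bag error rate
    print(clf.feature_importances_)  # discriminatory weight of each variable
    ```

    The feature importances correspond to the study's weighting of variables by discriminatory value, and the out-of-bag error plays the role of the reported outcome error rate.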

  6. Analysis of yield and oil from a series of canola breeding trials. Part II. Exploring variety by environment interaction using factor analysis.

    PubMed

    Cullis, B R; Smith, A B; Beeck, C P; Cowling, W A

    2010-11-01

    Exploring and exploiting variety by environment (V × E) interaction is one of the major challenges facing plant breeders. In paper I of this series, we presented an approach to modelling V × E interaction in the analysis of complex multi-environment trials using factor analytic models. In this paper, we develop a range of statistical tools which explore V × E interaction in this context. These tools include graphical displays such as heat-maps of genetic correlation matrices as well as so-called E-scaled uniplots that are a more informative alternative to the classical biplot for large plant breeding multi-environment trials. We also present a new approach to prediction for multi-environment trials that include pedigree information. This approach allows meaningful selection indices to be formed either for potential new varieties or potential parents.

  7. The Analysis of the Contribution of Human Factors to the In-Flight Loss of Control Accidents

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2012-01-01

    In-flight loss of control (LOC) is currently the leading cause of fatal accidents based on various commercial aircraft accident statistics. As the Next Generation Air Transportation System (NextGen) emerges, new contributing factors leading to LOC are anticipated. The NASA Aviation Safety Program (AvSP), along with other aviation agencies and communities are actively developing safety products to mitigate the LOC risk. This paper discusses the approach used to construct a generic integrated LOC accident framework (LOCAF) model based on a detailed review of LOC accidents over the past two decades. The LOCAF model is comprised of causal factors from the domain of human factors, aircraft system component failures, and atmospheric environment. The multiple interdependent causal factors are expressed in an Object-Oriented Bayesian belief network. In addition to predicting the likelihood of LOC accident occurrence, the system-level integrated LOCAF model is able to evaluate the impact of new safety technology products developed in AvSP. This provides valuable information to decision makers in strategizing NASA's aviation safety technology portfolio. The focus of this paper is on the analysis of human causal factors in the model, including the contributions from flight crew and maintenance workers. The Human Factors Analysis and Classification System (HFACS) taxonomy was used to develop human related causal factors. The preliminary results from the baseline LOCAF model are also presented.

  8. A Factor Analysis of the BSRI and the PAQ.

    ERIC Educational Resources Information Center

    Edwards, Teresa A.; And Others

    Factor analysis of the Bem Sex Role Inventory (BSRI) and the Personality Attributes Questionnaire (PAQ) was undertaken to study the independence of the masculine and feminine scales within each instrument. Both instruments were administered to undergraduate education majors. Analysis of primary first and second order factors of the BSRI indicated…

  9. Technical factors that affect anastomotic integrity following esophagectomy: systematic review and meta-analysis.

    PubMed

    Markar, Sheraz R; Arya, Shobhit; Karthikesalingam, Alan; Hanna, George B

    2013-12-01

    Due to the significant contribution of anastomotic leak, with its disastrous consequences to patient morbidity and mortality, multiple parameters have been proposed and individually meta-analyzed for the formation of the ideal esophagogastric anastomosis following cancer resection. The purpose of this pooled analysis was to examine the main technical parameters that impact on anastomotic integrity. Medline, Embase, trial registries, and conference proceedings were searched. Technical factors evaluated included hand-sewn versus stapled esophagogastric anastomosis (EGA), cervical versus thoracic EGA, minimally invasive versus open esophagectomy, anterior versus posterior route of reconstruction and ischemic conditioning of the gastric conduit. The outcome of interest was the incidence of anastomotic leak, for which pooled odds ratios were calculated for each technical factor. No significant difference in the incidence of anastomotic leak was demonstrated for the following technical factors: hand-sewn versus stapled EGA, minimally invasive versus open esophagectomy, anterior versus posterior route of reconstruction and ischemic conditioning of the gastric conduit. Four randomized, controlled trials comprising 298 patients were included that compared cervical and thoracic EGA. Anastomotic leak was seen more commonly in the cervical group (13.64 %) than in the thoracic group (2.96 %). Pooled analysis demonstrated a significantly increased incidence of anastomotic leak in the cervical group (pooled odds ratio = 4.73; 95 % CI 1.61-13.9; P = 0.005). A tailored surgical approach to the patient's physiology and esophageal cancer stage is the most important factor that influences anastomotic integrity after esophagectomy.
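    The pooled odds ratio behind such a meta-analysis can be sketched with an inverse-variance fixed-effect model; the 2×2 counts below are hypothetical illustrations, not the four included trials' data:

    ```python
    import numpy as np

    # Hypothetical per-trial 2x2 counts (assumed):
    # (a, b, c, d) = cervical leak, cervical no leak, thoracic leak, thoracic no leak
    trials = [(5, 30, 1, 34), (4, 33, 1, 36), (6, 34, 2, 38), (5, 35, 1, 39)]

    log_or, weights = [], []
    for a, b, c, d in trials:
        lo = np.log((a * d) / (b * c))     # log odds ratio for this trial
        var = 1 / a + 1 / b + 1 / c + 1 / d  # Woolf variance of the log OR
        log_or.append(lo)
        weights.append(1 / var)

    pooled = np.average(log_or, weights=weights)  # inverse-variance pooling
    se = 1 / np.sqrt(sum(weights))
    ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
    print(np.exp(pooled), ci)  # pooled OR with 95% CI
    ```

    A pooled OR whose confidence interval excludes 1, as reported here for cervical versus thoracic anastomosis, indicates a significant difference in leak incidence.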

  10. Analysis of Factors Influencing Building Refurbishment Project Performance

    NASA Astrophysics Data System (ADS)

    Ishak, Nurfadzillah; Aswad Ibrahim, Fazdliel; Azizi Azizan, Muhammad

    2018-03-01

    Presently, the refurbishment approach has become favourable as it creates opportunities to incorporate sustainable value with other building improvements. This approach needs to be implemented given the overwhelming ratio of existing buildings to new construction, which can also contribute to environmental problems. Refurbishment principles aim to minimize environmental impact and to upgrade the performance of an existing building to meet new requirements. In theory, a building project's performance has a direct bearing on its potential for project success. In refurbishment projects, however, the criteria for measurement become wider, because such projects are complex and multi-dimensional, encompassing many factors that reflect the nature of the works. This aim could therefore be achieved by examining the direct empirical relationship between critical success factors (CSFs) and complexity factors (CFs) in managing the project in relation to delivering success in project performance. The research findings are expected to form the basis of future research to establish an appropriate framework that provides information on managing refurbishment building projects and enhances project management competency for a better-built environment.

  11. Establishing Factor Validity Using Variable Reduction in Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Hofmann, Rich

    1995-01-01

    Using a 21-statement attitude-type instrument, an iterative procedure for improving confirmatory model fit is demonstrated within the context of the EQS program of P. M. Bentler and maximum likelihood factor analysis. Each iteration systematically eliminates the poorest fitting statement as identified by a variable fit index. (SLD)

  12. Hypertension in Black and Other Populations: Environmental Factors and Approaches to Management

    PubMed Central

    Hosten, Adrian O.

    1980-01-01

    Hypertension is a major health problem for industrialized as well as developing countries, especially those with sizeable black populations. The author analyzes various aspects of hypertension in black and other populations with emphasis on contributing factors and therapeutic approaches. PMID:7365811

  13. Working conditions, socioeconomic factors and low birth weight: path analysis.

    PubMed

    Mahmoodi, Zohreh; Karimlou, Masoud; Sajjadi, Homeira; Dejman, Masoumeh; Vameghi, Meroe; Dolatian, Mahrokh

    2013-09-01

    In recent years, with socioeconomic changes in society, the presence of women in the workplace has become inevitable. Differences in working conditions, especially for pregnant women, have adverse consequences such as low birth weight. This study was conducted with the aim of modelling the relationship between working conditions, socioeconomic factors, and birth weight. The study used a case-control design. The control group consisted of 500 women with normal-weight babies, and the case group of 250 women with low-weight babies, from selected hospitals in Tehran. Data were collected using a researcher-made questionnaire to determine the lifestyle during pregnancy of mothers of low-birth-weight babies, using a social determinants of health approach. This questionnaire investigated women's occupational lifestyle in terms of working conditions, activities, and job satisfaction. Data were analyzed with SPSS-16 and Lisrel-8.8 software using statistical path analysis. The final path model fitted well (CFI = 1, RMSEA = 0.00) and showed that, among direct paths, working conditions (β = -0.032), among indirect paths, household income (β = -0.42), and in the overall effect, an unemployed spouse (β = -0.1828) had the greatest effects on low birth weight. Negative coefficients indicate a decreasing effect on birth weight. Based on the path analysis model, working conditions and socioeconomic status directly and indirectly influence birth weight. Thus, in addition to treatment and health care (the biological aspect), special attention must also be paid to mothers' socioeconomic factors.
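    The decomposition into direct and indirect effects that path analysis provides can be sketched with standardized regressions; the three-variable toy model and its path coefficients are assumptions, not the study's fitted model:

    ```python
    import numpy as np

    # Toy path model (assumed): income -> working conditions -> birth weight,
    # plus a direct income -> birth weight path
    rng = np.random.default_rng(11)
    n = 750
    income = rng.normal(size=n)
    work = 0.5 * income + rng.normal(scale=0.8, size=n)
    weight = 0.3 * work + 0.1 * income + rng.normal(scale=0.9, size=n)

    def std_beta(y, *xs):
        """Standardized regression coefficients of y on xs (path coefficients)."""
        X = np.column_stack([np.ones(len(y))]
                            + [(x - x.mean()) / x.std() for x in xs])
        yz = (y - y.mean()) / y.std()
        return np.linalg.lstsq(X, yz, rcond=None)[0][1:]

    b_wy, b_iy = std_beta(weight, work, income)  # direct paths into birth weight
    (b_iw,) = std_beta(work, income)             # income -> working conditions
    indirect = b_iw * b_wy                       # indirect effect of income
    print(round(b_wy, 2), round(indirect, 2))
    ```

    Multiplying coefficients along a path gives the indirect effect, which is how a variable like household income can influence birth weight without a large direct path of its own.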

  14. Working Conditions, Socioeconomic Factors and Low Birth Weight: Path Analysis

    PubMed Central

    Mahmoodi, Zohreh; Karimlou, Masoud; Sajjadi, Homeira; Dejman, Masoumeh; Vameghi, Meroe; Dolatian, Mahrokh

    2013-01-01

    Background In recent years, with socioeconomic changes in society, the presence of women in the workplace has become inevitable. Differences in working conditions, especially for pregnant women, have adverse consequences such as low birth weight. Objectives This study was conducted with the aim of modelling the relationship between working conditions, socioeconomic factors, and birth weight. Patients and Methods The study used a case-control design. The control group consisted of 500 women with normal-weight babies, and the case group of 250 women with low-weight babies, from selected hospitals in Tehran. Data were collected using a researcher-made questionnaire to determine the lifestyle during pregnancy of mothers of low-birth-weight babies, using a social determinants of health approach. This questionnaire investigated women’s occupational lifestyle in terms of working conditions, activities, and job satisfaction. Data were analyzed with SPSS-16 and Lisrel-8.8 software using statistical path analysis. Results The final path model fitted well (CFI = 1, RMSEA = 0.00) and showed that, among direct paths, working conditions (β = -0.032), among indirect paths, household income (β = -0.42), and in the overall effect, an unemployed spouse (β = -0.1828) had the greatest effects on low birth weight. Negative coefficients indicate a decreasing effect on birth weight. Conclusions Based on the path analysis model, working conditions and socioeconomic status directly and indirectly influence birth weight. Thus, in addition to treatment and health care (the biological aspect), special attention must also be paid to mothers’ socioeconomic factors. PMID:24616796

  15. Specifying an implementation framework for Veterans Affairs antimicrobial stewardship programmes: using a factor analysis approach.

    PubMed

    Chou, Ann F; Graber, Christopher J; Zhang, Yue; Jones, Makoto; Goetz, Matthew Bidwell; Madaras-Kelly, Karl; Samore, Matthew; Glassman, Peter A

    2018-06-04

    Inappropriate antibiotic use poses a serious threat to patient safety. Antimicrobial stewardship programmes (ASPs) may optimize antimicrobial use and improve patient outcomes, but their implementation remains an organizational challenge. Using the Promoting Action on Research Implementation in Health Services (PARiHS) framework, this study aimed to identify organizational factors that may facilitate ASP design, development and implementation. Among 130 Veterans Affairs facilities that offered acute care, we classified organizational variables supporting antimicrobial stewardship activities into three PARiHS domains: evidence to encompass sources of knowledge; contexts to translate evidence into practice; and facilitation to enhance the implementation process. We conducted a series of exploratory factor analyses to identify conceptually linked factor scales. Cronbach's alphas were calculated. Variables with large uniqueness values were left as single factors. We identified 32 factors, including six constructs derived from factor analyses under the three PARiHS domains. In the evidence domain, four factors described guidelines and clinical pathways. The context domain was broken into three main categories: (i) receptive context (15 factors describing resources, affiliations/networks, formalized policies/practices, decision-making, receptiveness to change); (ii) team functioning (1 factor); and (iii) evaluation/feedback (5 factors). Within facilitation, two factors described facilitator roles and tasks and five captured skills and training. We mapped survey data onto PARiHS domains to identify factors that may be adapted to facilitate ASP uptake. Our model encompasses mostly mutable factors whose relationships with performance outcomes may be explored to optimize antimicrobial use. Our framework also provides an analytical model for determining whether leveraging existing organizational processes can potentially optimize ASP performance.
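    Cronbach's alpha, used here to assess the internal consistency of the factor scales, can be computed directly; the simulated five-item scale below is an illustrative assumption, not the survey data:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents x n_items) array."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)    # variance of scale totals
        return k / (k - 1) * (1 - item_var / total_var)

    # Simulated survey scale (assumed): 5 items driven by one latent construct
    rng = np.random.default_rng(7)
    latent = rng.normal(size=(120, 1))
    items = latent + 0.5 * rng.normal(size=(120, 5))
    print(round(cronbach_alpha(items), 2))
    ```

    Scales with alpha above a conventional threshold (often 0.7) are typically retained as coherent factors, which is the role alpha plays in the framework described above.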

  16. Factors affecting job satisfaction in nurse faculty: a meta-analysis.

    PubMed

    Gormley, Denise K

    2003-04-01

    Evidence in the literature suggests job satisfaction can make a difference in keeping qualified workers on the job, but little research has been conducted focusing specifically on nursing faculty. Several studies have examined nurse faculty satisfaction in relationship to one or two influencing factors. These factors include professional autonomy, leader role expectations, organizational climate, perceived role conflict and role ambiguity, leadership behaviors, and organizational characteristics. This meta-analysis attempts to synthesize the various studies conducted on job satisfaction in nursing faculty and analyze which influencing factors have the greatest effect. The procedure used for this meta-analysis consisted of reviewing studies to identify factors influencing job satisfaction, research questions, sample size reported, instruments used for measurement of job satisfaction and influencing factors, and results of statistical analysis.

  17. The Recoverability of P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  18. Charmless B_{(s)}→ VV decays in factorization-assisted topological-amplitude approach

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Zhang, Qi-An; Li, Ying; Lü, Cai-Dian

    2017-05-01

    Within the factorization-assisted topological-amplitude approach, we studied the 33 charmless B_{(s)} → VV decays, where V stands for a light vector meson. According to the flavor flows, the amplitude of each process can be decomposed into eight different topologies. In contrast to the conventional flavor diagrammatic approach, we further factorize each topological amplitude into the decay constant, form factors and unknown universal parameters. By a χ² fit to 46 experimental observables, we extracted 10 theoretical parameters with a χ² per degree of freedom of around 2. Using the fitted parameters, we calculated the branching fractions, polarization fractions, CP asymmetries and relative phases between the polarization amplitudes of each decay mode. The decay channels dominated by tree diagrams have large branching fractions and large longitudinal polarization fractions. The branching fractions and longitudinal polarization fractions of color-suppressed decays are smaller. Current experimental data showing large transverse polarization fractions in the penguin-dominated decay channels can be explained by a single transverse amplitude of the penguin annihilation diagram. Our predictions for the not yet measured channels can be tested in the ongoing LHCb experiment and the future Belle-II experiment.
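    The fit quality quoted above, χ² per degree of freedom around 2, follows the standard definition χ² = Σ((obs − pred)/σ)², divided by the number of observables minus the number of fitted parameters. The sketch below uses invented observables, predictions and uncertainties.

```python
# Sketch: chi-squared per degree of freedom for a fit.
# chi2 = sum(((obs - pred) / sigma)^2); dof = N_obs - N_params.
# All numbers here are illustrative, not the paper's data.

def chi2_per_dof(observed, predicted, sigma, n_params):
    chi2 = sum(((o - p) / s) ** 2
               for o, p, s in zip(observed, predicted, sigma))
    return chi2 / (len(observed) - n_params)

obs   = [1.0, 2.1, 2.9, 4.2]
pred  = [1.1, 2.0, 3.0, 4.0]
sigma = [0.1, 0.1, 0.1, 0.1]
print(round(chi2_per_dof(obs, pred, sigma, n_params=2), 2))  # → 3.5
```

    A value near 1 indicates the model describes the data within the quoted uncertainties; values around 2, as in the paper, suggest mild tension or underestimated errors.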

  19. Comparison of posterior retroperitoneal and transabdominal lateral approaches in robotic adrenalectomy: an analysis of 200 cases.

    PubMed

    Kahramangil, Bora; Berber, Eren

    2018-04-01

    Although numerous studies have been published on robotic adrenalectomy (RA) in the literature, none has compared the posterior retroperitoneal (PR) and transabdominal lateral (TL) approaches. The aim of this study was to compare the outcomes of robotic PR and TL adrenalectomy. This is a retrospective analysis of a prospectively maintained database. Between September 2008 and January 2017, perioperative outcomes of patients undergoing RA through the PR and TL approaches were recorded into an IRB-approved database. Clinical and perioperative parameters were compared using Student's t test, the Wilcoxon rank-sum test, and the χ² test. Multivariate regression analysis was performed to determine factors associated with total operative time. 188 patients underwent 200 RAs. 110 patients were operated on through the TL approach and 78 patients through the PR approach. Overall, the conversion rate to open surgery was 2.5% and 90-day morbidity was 4.8%. The perioperative outcomes of the TL and PR approaches were similar regarding estimated blood loss, rate of conversion to open surgery, length of hospital stay, and 90-day morbidity. The PR approach resulted in a shorter mean ± SD total operative time (136.3 ± 38.7 vs. 154.6 ± 48.4 min; p = 0.005) and a lower visual analog scale pain score on postoperative day 1 (4.3 ± 2.5 vs. 5.4 ± 2.4; p = 0.001). After excluding tumors larger than 6 cm operated on through the TL approach, the difference in operative times persisted (136.3 ± 38.7 vs. 153.7 ± 45.7 min; p = 0.009). On multivariate regression analysis, increasing BMI and the TL approach were associated with longer total operative time. This study shows that the robotic PR and TL approaches are equally safe and efficacious. With experience, shorter operative time and less postoperative pain can be achieved with the PR technique. This supports the preferential utilization of the PR approach in high-volume centers with sufficient experience.

  20. Factors Associated with Fatal Occupational Accidents among Mexican Workers: A National Analysis

    PubMed Central

    Gonzalez-Delgado, Mery; Gómez-Dantés, Héctor; Fernández-Niño, Julián Alfredo; Robles, Eduardo; Borja, Víctor H.; Aguilar, Miriam

    2015-01-01

    Objective To identify the factors associated with fatal occupational injuries in Mexico in 2012 among workers affiliated with the Mexican Social Security Institute. Methods Analysis of secondary data using information from the National Occupational Risk Information System, with the consequence of the occupational injury (fatal versus non-fatal) as the response variable. The analysis included 406,222 non-fatal and 1,140 fatal injuries from 2012. The factors associated with the lethality of the injury were identified using a logistic regression model with the Firth approach. Results Being male (OR=5.86; CI95%: 4.22-8.14), age (OR=1.04; CI95%: 1.03-1.06), employed in the position for 1 to 10 years (versus less than 1 year) (OR=1.37; CI95%: 1.15-1.63), working as a facilities or machine operator or assembler (OR=3.28; CI95%: 2.12-5.07) and being a worker without qualifications (OR=1.96; CI95%: 1.18-3.24) (versus an office worker) were associated with fatality in the event of an injury. Additionally, companies classified as maximum risk (OR=1.90; CI95%: 1.38-2.62), workplace conditions (OR=7.15; CI95%: 3.63-14.10) and factors related to the work environment (OR=9.18; CI95%: 4.36-19.33) were identified as risk factors for fatality in the event of an occupational injury. Conclusions Fatality in the event of an occupational injury is associated with factors related to sociodemographics (age, sex and occupation), the work environment and workplace conditions. Worker protection policies should be created for groups with a higher risk of fatal occupational injuries in Mexico. PMID:25790063
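    Odds ratios and their 95% confidence intervals, as reported above, are conventionally obtained by exponentiating a logistic regression coefficient and its Wald interval. The coefficient and standard error below are hypothetical, and this sketch is not the Firth-penalized fit itself, just the final transformation step.

```python
import math

# Sketch: odds ratio and Wald 95% CI from a logistic regression
# coefficient b and its standard error se. Values are hypothetical.

def odds_ratio_ci(b, se, z=1.96):
    """Return (OR, lower, upper) by exponentiating b and b ± z*se."""
    return math.exp(b), math.exp(b - z * se), math.exp(b + z * se)

or_, lo, hi = odds_ratio_ci(b=0.5, se=0.1)
print(f"OR={or_:.2f}; CI95%: {lo:.2f}-{hi:.2f}")  # OR=1.65; CI95%: 1.36-2.01
```

    The Firth approach used in the paper penalizes the likelihood to reduce small-sample bias (important with only 1,140 fatal events among 400,000+ records), but the coefficient-to-OR conversion is the same.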

  1. An independent confirmatory factor analysis of the Wechsler Intelligence Scale for Children-fourth Edition (WISC-IV) integrated: what do the process approach subtests measure?

    PubMed

    Benson, Nicholas; Hulac, David M; Bernstein, Joshua D

    2013-09-01

    The Wechsler intelligence scale for children--fourth edition (WISC-IV) Integrated contains the WISC-IV core and supplemental subtests along with process approach subtests designed to facilitate a process-oriented approach to score interpretation. The purpose of this study was to examine the extent to which WISC-IV Integrated subtests measure the constructs they are purported to measure. In addition to examining the measurement and scoring model provided in the manual, this study also tested hypotheses regarding Cattell-Horn-Carroll abilities that might be measured along with other substantive questions regarding the factor structure of the WISC-IV Integrated and the nature of abilities measured by process approach subtests. Results provide insight regarding the constructs measured by these subtests. Many subtests appear to be good to excellent measures of psychometric g (i.e., the general factor presumed to cause the positive correlation of mental tasks). Other abilities measured by subtests are described. For some subtests, the majority of variance is not accounted for by theoretical constructs included in the scoring model. Modifications made to remove demands such as memory recall and verbal expression were found to reduce construct-irrelevant variance. The WISC-IV Integrated subtests appear to measure similar constructs across ages 6-16, although strict factorial invariance was not supported.

  2. Optimization of tribological performance of SiC embedded composite coating via Taguchi analysis approach

    NASA Astrophysics Data System (ADS)

    Maleque, M. A.; Bello, K. A.; Adebisi, A. A.; Akma, N.

    2017-03-01

    The tungsten inert gas (TIG) torch is one of the most recently adopted heat sources for surface modification of engineering parts, giving results similar to those of the more expensive high-power laser technique. In this study, a ceramic-based embedded composite coating has been produced from precoated silicon carbide (SiC) powders on an AISI 4340 low alloy steel substrate using the TIG welding torch process. A design of experiments based on the Taguchi approach has been adopted to optimize the TIG cladding process parameters. The L9 orthogonal array and the signal-to-noise ratio were used to study the effect of TIG welding parameters such as arc current, travelling speed, welding voltage and argon flow rate on the tribological response (wear rate, surface roughness and wear track width). The objective of the study was to identify the optimal design parameters that significantly minimize each of the surface quality characteristics. The analysis of the experimental results revealed that the argon flow rate was the most influential factor contributing to minimum wear and surface roughness of the modified coating surface. On the other hand, the key factor in reducing the wear scar is the welding voltage. Finally, the convenient and economical Taguchi approach used in this study was efficient in finding optimal factor settings for obtaining minimum wear rate, wear scar and surface roughness responses in TIG-coated surfaces.
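    Taguchi analysis scores each run with a signal-to-noise ratio; for smaller-is-better responses such as wear rate and surface roughness, S/N = −10·log10(mean(y²)), and the factor level with the highest average S/N is preferred. The replicate values below are invented for illustration.

```python
import math

# Smaller-is-better signal-to-noise ratio used in Taguchi analysis:
# S/N = -10 * log10(mean(y_i^2)). All replicate data are invented.

def sn_smaller_is_better(ys):
    return -10 * math.log10(sum(y * y for y in ys) / len(ys))

# Average S/N over the runs at each level of one factor (say, voltage);
# the level with the highest mean S/N is the most robust setting.
runs_by_level = {
    1: [[2.0, 2.1]],   # level 1 runs: wear-rate replicates
    2: [[1.2, 1.3]],
    3: [[3.0, 2.9]],
}
level_sn = {lv: sum(sn_smaller_is_better(r) for r in rs) / len(rs)
            for lv, rs in runs_by_level.items()}
best = max(level_sn, key=level_sn.get)
print(best)  # → 2 (smallest responses give the highest S/N)
```

    In a real L9 study each of the nine runs contributes to three level-averages per factor, and the analysis is repeated for each response (wear rate, roughness, track width).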

  3. Bootstrap Confidence Intervals for Ordinary Least Squares Factor Loadings and Correlations in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong

    2010-01-01

    This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile…
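    The percentile bootstrap examined in this article can be sketched with a simpler statistic; a full EFA with rotation would exceed a short example, so the code below resamples cases and forms a percentile interval for a Pearson correlation on synthetic data. The same recipe applies to rotated loadings: resample rows, re-estimate, and take empirical percentiles.

```python
import random

# Percentile bootstrap CI for a Pearson correlation (synthetic data).

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in ys)
    return sxy / (sxx * syy) ** 0.5

def percentile_ci(xs, ys, b=2000, alpha=0.05, seed=1):
    rng = random.Random(seed)
    n, stats = len(xs), []
    for _ in range(b):
        idx = [rng.randrange(n) for _ in range(n)]
        bx, by = [xs[i] for i in idx], [ys[i] for i in idx]
        if len(set(bx)) > 1 and len(set(by)) > 1:  # skip degenerate resamples
            stats.append(pearson(bx, by))
    stats.sort()
    m = len(stats)
    return stats[int(m * alpha / 2)], stats[int(m * (1 - alpha / 2)) - 1]

x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [1.1, 2.3, 2.8, 4.2, 4.9, 6.3, 6.8, 8.1]
lo, hi = percentile_ci(x, y)
print(round(lo, 3), round(pearson(x, y), 3), round(hi, 3))
```

    The bias-corrected and bias-corrected accelerated variants studied in the article adjust which percentiles are read off, rather than changing the resampling itself.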

  4. Donor retention in health care in Iran: a factor analysis

    PubMed Central

    Aghababa, Sara; Nasiripour, Amir Ashkan; Maleki, Mohammadreza; Gohari, Mahmoodreza

    2017-01-01

    Background: Long-term financial support is essential for the survival of a charitable organization. Health charities need to identify the factors influencing donor retention. Methods: In the present study, the items of a questionnaire were derived from both a literature review and semi-structured interviews related to donor retention. Using purposive sampling, 300 academic and executive practitioners were selected. After follow-up, a total of 243 usable questionnaires were available for factor analysis. The questionnaire was validated based on face and content validity, and reliability was assessed through Cronbach’s α-coefficient. Results: Exploratory factor analysis extracted 2 factors for retention: a donor factor (variance = 33.841%; Cronbach’s α-coefficient = 90.2) and a charity factor (variance = 29.038%; Cronbach’s α-coefficient = 82.8). Subsequently, confirmatory factor analysis was applied and supported an overall reasonable fit. Conclusions: This study found that repeated monetary donations are supplied to charitable organizations when both factors for retention (the donor factor and the charity factor) are taken into consideration. This model could provide a perspective for making donations and charitable giving sustainable. PMID:28955663

  5. The Factor Structure of the English Language Development Assessment: A Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Kuriakose, Anju

    2011-01-01

    This study investigated the internal factor structure of the English language development Assessment (ELDA) using confirmatory factor analysis. ELDA is an English language proficiency test developed by a consortium of multiple states and is used to identify and reclassify English language learners in kindergarten to grade 12. Scores on item…

  6. Influential Observations in Principal Factor Analysis.

    ERIC Educational Resources Information Center

    Tanaka, Yutaka; Odaka, Yoshimasa

    1989-01-01

    A method is proposed for detecting influential observations in iterative principal factor analysis. Theoretical influence functions are derived for two components of the common variance decomposition. The major mathematical tool is the influence function derived by Tanaka (1988). (SLD)

  7. Analysis of In Vivo Chromatin and Protein Interactions of Arabidopsis Transcript Elongation Factors.

    PubMed

    Pfab, Alexander; Antosz, Wojciech; Holzinger, Philipp; Bruckmann, Astrid; Griesenbeck, Joachim; Grasser, Klaus D

    2017-01-01

    A central step to elucidate the function of proteins commonly comprises the analysis of their molecular interactions in vivo. For nuclear regulatory proteins this involves determining protein-protein interactions as well as mapping of chromatin binding sites. Here, we present two protocols to identify protein-protein and chromatin interactions of transcript elongation factors (TEFs) in Arabidopsis. The first protocol (Subheading 3.1) describes protein affinity-purification coupled to mass spectrometry (AP-MS) that utilizes suspension cultured cells as experimental system. This approach provides an unbiased view of proteins interacting with epitope-tagged TEFs. The second protocol (Subheading 3.2) depicts details about a chromatin immunoprecipitation (ChIP) procedure to characterize genomic binding sites of TEFs. These methods should be valuable tools for the analysis of a broad variety of nuclear proteins.

  8. A Polyglot Approach to Bioinformatics Data Integration: A Phylogenetic Analysis of HIV-1

    PubMed Central

    Reisman, Steven; Hatzopoulos, Thomas; Läufer, Konstantin; Thiruvathukal, George K.; Putonti, Catherine

    2016-01-01

    As sequencing technologies continue to drop in price and increase in throughput, new challenges emerge for the management and accessibility of genomic sequence data. We have developed a pipeline for facilitating the storage, retrieval, and subsequent analysis of molecular data, integrating both sequence and metadata. Taking a polyglot approach involving multiple languages, libraries, and persistence mechanisms, sequence data can be aggregated from publicly available and local repositories. Data are exposed in the form of a RESTful web service, formatted for easy querying, and retrieved for downstream analyses. As a proof of concept, we have developed a resource for annotated HIV-1 sequences. Phylogenetic analyses were conducted for >6,000 HIV-1 sequences, revealing that spatial and temporal factors influence the evolution of the individual genes uniquely. Nevertheless, signatures of origin can still be extrapolated despite increased globalization. The approach developed here can easily be customized for any species of interest. PMID:26819543

  9. Application of factor analysis to the water quality in reservoirs

    NASA Astrophysics Data System (ADS)

    Silva, Eliana Costa e.; Lopes, Isabel Cristina; Correia, Aldina; Gonçalves, A. Manuela

    2017-06-01

    In this work we present a Factor Analysis of chemical and environmental variables of the water column and hydro-morphological features of several Portuguese reservoirs. The objective is to reduce the initial number of variables while keeping their common characteristics. Using the Factor Analysis, the environmental variables measured in the epilimnion and in the hypolimnion, together with the hydromorphological characteristics of the dams, were reduced from 63 variables to only 13 factors, which explained a total of 83.348% of the variance in the original data. After performing rotation using the Varimax method, the relations between the factors and the original variables became clearer and easier to interpret, which provided a Factor Analysis model for these environmental variables using 13 varifactors: water quality and distance to the source, hypolimnion chemical composition, sulfite-reducing bacteria and nutrients, coliforms and faecal streptococci, reservoir depth, temperature, and location, among other factors.
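    The variance-explained bookkeeping behind such a reduction (here, 13 factors covering 83.348% of the variance in 63 variables) rests on the eigenvalues of the correlation matrix. The eigenvalues below are invented and merely illustrate the usual retention arithmetic.

```python
# Sketch: factor-retention bookkeeping from the eigenvalues of a
# correlation matrix. For p standardized variables the eigenvalues
# sum to p, and each factor explains eigenvalue/p of the variance.
# The eigenvalues below are invented.

def retention_summary(eigenvalues):
    retained = [e for e in eigenvalues if e > 1.0]  # Kaiser criterion
    explained = 100.0 * sum(retained) / sum(eigenvalues)
    return len(retained), explained

eigs = [2.8, 1.6, 1.2, 0.6, 0.4, 0.3, 0.1]  # p = 7 variables
k, pct = retention_summary(eigs)
print(k, round(pct, 1))  # → 3 80.0
```

    The Kaiser criterion (retain factors with eigenvalue > 1) is one common rule; scree plots or parallel analysis are frequent alternatives, and rotation such as Varimax then redistributes the retained variance across factors without changing its total.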

  10. Bayesian inference of the number of factors in gene-expression analysis: application to human virus challenge studies

    PubMed Central

    2010-01-01

    Background Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Results Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Conclusions Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data. PMID:21062443

  11. Bayesian inference of the number of factors in gene-expression analysis: application to human virus challenge studies.

    PubMed

    Chen, Bo; Chen, Minhua; Paisley, John; Zaas, Aimee; Woods, Christopher; Ginsburg, Geoffrey S; Hero, Alfred; Lucas, Joseph; Dunson, David; Carin, Lawrence

    2010-11-09

    Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data.

  12. Path analysis and multi-criteria decision making: an approach for multivariate model selection and analysis in health.

    PubMed

    Vasconcelos, A G; Almeida, R M; Nobre, F F

    2001-08-01

    This paper introduces an approach that includes non-quantitative factors for the selection and assessment of multivariate complex models in health. A goodness-of-fit based methodology combined with a fuzzy multi-criteria decision-making approach is proposed for model selection. Models were obtained using the Path Analysis (PA) methodology in order to explain the interrelationship between health determinants and the post-neonatal component of infant mortality in 59 municipalities of Brazil in the year 1991. Socioeconomic and demographic factors were used as exogenous variables, and environmental, health service and agglomeration factors as endogenous variables. Five PA models were developed and accepted by statistical criteria of goodness-of-fit. These models were then submitted to a group of experts, seeking to characterize their preferences according to predefined criteria evaluating model relevance and plausibility. Fuzzy set techniques were used to rank the alternative models according to the number of times a model was superior to ("dominated") the others. The best-ranked model explained over 90% of the variation in the endogenous variables, and showed the favorable influences of income and education levels on post-neonatal mortality. It also showed the unfavorable effect on mortality of fast population growth, through precarious dwelling conditions and decreased access to sanitation. It was possible to aggregate expert opinions in model evaluation. The proposed procedure for model selection allowed the inclusion of subjective information in a clear and systematic manner.

  13. Transcriptional Regulatory Network Analysis of MYB Transcription Factor Family Genes in Rice.

    PubMed

    Smita, Shuchi; Katiyar, Amit; Chinnusamy, Viswanathan; Pandey, Dev M; Bansal, Kailash C

    2015-01-01

    MYB transcription factors (TFs) form one of the largest TF families and regulate defense responses to various stresses, hormone signaling, and many metabolic and developmental processes in plants. Understanding the regulatory hierarchies of gene expression networks in response to developmental and environmental cues is a major challenge due to the complex interactions between genetic elements. Correlation analyses are useful for unraveling co-regulated gene pairs governing biological processes and for identifying new candidate hub genes in these complex processes. High-throughput expression profiling data are highly useful for the construction of co-expression networks. In the present study, we utilized transcriptome data for comprehensive regulatory network studies of MYB TFs by "top-down" and "guide-gene" approaches. More than 50% of OsMYBs were strongly correlated under 50 experimental conditions with 51 hub genes via the "top-down" approach. Further, clusters were identified using Markov Clustering (MCL). To maximize the clustering performance, parameter evaluation of the MCL inflation score (I) was performed in terms of enriched GO categories by measuring the F-score. Comparison of the co-expressed clusters with clades from the phylogenetic analysis signifies their evolutionarily conserved co-regulatory role. We utilized a compendium of known interactions and biological roles together with Gene Ontology enrichment analysis to hypothesize functions of co-expressed OsMYBs. In addition, transcriptional regulatory network analysis by the "guide-gene" approach revealed 40 putative targets of 26 OsMYB TF hubs with high correlation values, utilizing 815 microarray datasets. The enrichment of MYB-binding cis-elements in the promoter regions of the putative targets, their functional co-occurrence, and their nuclear localization support our findings. Specifically, the enrichment of MYB-binding regions involved in drought inducibility implies a regulatory role in the drought response in rice.
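    Co-expression approaches such as the "top-down" analysis above start from pairwise correlations across experimental conditions. The sketch below builds a thresholded co-expression edge list from a toy expression matrix (gene names and values are invented); a graph like this is the kind of input a Markov Clustering step would then partition.

```python
from itertools import combinations

# Toy co-expression step: correlate gene expression profiles across
# conditions and keep strongly correlated pairs. Data are invented.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in ys)
    return sxy / (sxx * syy) ** 0.5

expr = {
    "MYB_A": [1.0, 2.0, 3.0, 4.0],
    "MYB_B": [1.1, 2.1, 2.9, 4.2],   # tracks MYB_A
    "MYB_C": [4.0, 3.0, 2.0, 1.0],   # anti-correlated with both
}
edges = [(g1, g2) for g1, g2 in combinations(expr, 2)
         if abs(pearson(expr[g1], expr[g2])) > 0.9]
print(edges)  # all three pairs pass the |r| > 0.9 cut
```

    Real pipelines compute this over hundreds of samples, often use rank (Spearman) correlation for robustness, and feed the resulting adjacency matrix to MCL with the inflation parameter tuned as described in the abstract.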

  14. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis

    PubMed Central

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Introduction Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed if the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. Materials and Methods We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. Results We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was no key success factor for situational awareness. Conclusions We suggest to consider these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped interpreting complex, mixed data on small-N cases and resulted in concrete and practically relevant findings. PMID:27182731

  15. A powerful and flexible approach to the analysis of RNA sequence count data

    PubMed Central

    Zhou, Yi-Hui; Xia, Kai; Wright, Fred A.

    2011-01-01

    Motivation: A number of penalization and shrinkage approaches have been proposed for the analysis of microarray gene expression data. Similar techniques are now routinely applied to RNA sequence transcriptional count data, although the value of such shrinkage has not been conclusively established. If penalization is desired, the explicit modeling of mean–variance relationships provides a flexible testing regimen that ‘borrows’ information across genes, while easily incorporating design effects and additional covariates. Results: We describe BBSeq, which incorporates two approaches: (i) a simple beta-binomial generalized linear model, which has not been extensively tested for RNA-Seq data and (ii) an extension of an expression mean–variance modeling approach to RNA-Seq data, involving modeling of the overdispersion as a function of the mean. Our approaches are flexible, allowing for general handling of discrete experimental factors and continuous covariates. We report comparisons with other alternate methods to handle RNA-Seq data. Although penalized methods have advantages for very small sample sizes, the beta-binomial generalized linear model, combined with simple outlier detection and testing approaches, appears to have favorable characteristics in power and flexibility. Availability: An R package containing examples and sample datasets is available at http://www.bios.unc.edu/research/genomic_software/BBSeq Contact: yzhou@bios.unc.edu; fwright@bios.unc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21810900

  16. A powerful and flexible approach to the analysis of RNA sequence count data.

    PubMed

    Zhou, Yi-Hui; Xia, Kai; Wright, Fred A

    2011-10-01

    A number of penalization and shrinkage approaches have been proposed for the analysis of microarray gene expression data. Similar techniques are now routinely applied to RNA sequence transcriptional count data, although the value of such shrinkage has not been conclusively established. If penalization is desired, the explicit modeling of mean-variance relationships provides a flexible testing regimen that 'borrows' information across genes, while easily incorporating design effects and additional covariates. We describe BBSeq, which incorporates two approaches: (i) a simple beta-binomial generalized linear model, which has not been extensively tested for RNA-Seq data and (ii) an extension of an expression mean-variance modeling approach to RNA-Seq data, involving modeling of the overdispersion as a function of the mean. Our approaches are flexible, allowing for general handling of discrete experimental factors and continuous covariates. We report comparisons with other alternate methods to handle RNA-Seq data. Although penalized methods have advantages for very small sample sizes, the beta-binomial generalized linear model, combined with simple outlier detection and testing approaches, appears to have favorable characteristics in power and flexibility. An R package containing examples and sample datasets is available at http://www.bios.unc.edu/research/genomic_software/BBSeq. Contact: yzhou@bios.unc.edu; fwright@bios.unc.edu. Supplementary data are available at Bioinformatics online.
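    The beta-binomial model behind BBSeq lets the variance exceed the binomial: with success probability π and overdispersion ρ, Var = nπ(1−π)(1+(n−1)ρ), which reduces to the binomial variance at ρ = 0. The sketch below implements the probability mass function with log-gamma terms; the parameter values are illustrative, and this is not BBSeq's code.

```python
import math

# Beta-binomial pmf via log-gamma, parameterized by mean probability pi
# and overdispersion rho (alpha = pi*(1-rho)/rho, beta = (1-pi)*(1-rho)/rho).
# Illustrative only; BBSeq embeds this in a generalized linear model.

def log_beta(a, b):
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def betabinom_pmf(k, n, pi, rho):
    a = pi * (1 - rho) / rho
    b = (1 - pi) * (1 - rho) / rho
    log_p = (math.lgamma(n + 1) - math.lgamma(k + 1)
             - math.lgamma(n - k + 1)
             + log_beta(k + a, n - k + b) - log_beta(a, b))
    return math.exp(log_p)

n, pi, rho = 20, 0.3, 0.1
total = sum(betabinom_pmf(k, n, pi, rho) for k in range(n + 1))
var = n * pi * (1 - pi) * (1 + (n - 1) * rho)
print(round(total, 6), round(var, 3))  # pmf sums to 1; variance 12.18
```

    Modeling ρ as a function of the mean, as in BBSeq's second approach, lets the test borrow mean-variance information across genes instead of estimating overdispersion gene by gene.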

  17. Pressure and protective factors influencing nursing students' self-esteem: A content analysis study.

    PubMed

    Valizadeh, Leila; Zamanzadeh, Vahid; Gargari, Rahim Badri; Ghahramanian, Akram; Tabrizi, Faranak Jabbarzadeh; Keogh, Brian

    2016-01-01

    A review of the literature shows that self-esteem in nursing students ranges from normal to low. It is hypothesized that different contextual factors could affect levels of self-esteem. The main aim of this study was to explore these factors from the viewpoint of Iranian nursing students using a qualitative approach. A qualitative content analysis study conducted at a Faculty of Nursing and Midwifery in 2014. Fourteen student nurses and two qualified nurses were interviewed using semi-structured interviews, and the analysis was applied at various depths of interpretation. Two main themes were extracted: "pressure factors", with the subthemes low self-efficacy, sense of triviality, ineffective instructor-student interaction, and low self-confidence; and "protective factors", with the subthemes knowledge acquisition, mirror of valuability, professional autonomy, religious beliefs, and choosing the nursing field out of interest. Results showed that these themes interact with each other like a seesaw: as pressure factors decrease, the effect of protective factors on self-esteem increases. Nurse educators should not only try to improve students' skills and knowledge, but should also enhance the protective factors and decrease the pressure factors by strengthening nursing students' sense of being important, using participatory teaching methods, and considering student feedback; attempts to improve facilities at the clinics are also recommended. Copyright © 2015. Published by Elsevier Ltd.

  18. Pathway Analysis in Attention Deficit Hyperactivity Disorder: An Ensemble Approach

    PubMed Central

    Mooney, Michael A.; McWeeney, Shannon K.; Faraone, Stephen V.; Hinney, Anke; Hebebrand, Johannes; Nigg, Joel T.; Wilmot, Beth

    2016-01-01

    Despite a wealth of evidence for the role of genetics in attention deficit hyperactivity disorder (ADHD), specific and definitive genetic mechanisms have not been identified. Pathway analyses, a subset of gene-set analyses, extend the knowledge gained from genome-wide association studies (GWAS) by providing functional context for genetic associations. However, there are numerous methods for association testing of gene sets and no real consensus regarding the best approach. The present study applied six pathway analysis methods to identify pathways associated with ADHD in two GWAS datasets from the Psychiatric Genomics Consortium. Methods that utilize genotypes to model pathway-level effects identified more replicable pathway associations than methods using summary statistics. In addition, pathways implicated by more than one method were significantly more likely to replicate. A number of brain-relevant pathways, such as RhoA signaling, glycosaminoglycan biosynthesis, fibroblast growth factor receptor activity, and pathways containing potassium channel genes, were nominally significant by multiple methods in both datasets. These results support previous hypotheses about the role of regulation of neurotransmitter release, neurite outgrowth and axon guidance in contributing to the ADHD phenotype and suggest the value of cross-method convergence in evaluating pathway analysis results. PMID:27004716

  19. Designing medical technology for resilience: integrating health economics and human factors approaches.

    PubMed

    Borsci, Simone; Uchegbu, Ijeoma; Buckle, Peter; Ni, Zhifang; Walne, Simon; Hanna, George B

    2018-01-01

    The slow adoption of innovation into healthcare calls into question the manner of evidence generation for medical technology. This paper identifies potential reasons for this including a lack of attention to human factors, poor evaluation of economic benefits, lack of understanding of the existing healthcare system and a failure to recognise the need to generate resilient products. Areas covered: Recognising a cross-disciplinary need to enhance evidence generation early in a technology's life cycle, the present paper proposes a new approach that integrates human factors and health economic evaluation as part of a wider systems approach to the design of technology. This approach (Human and Economic Resilience Design for Medical Technology, or HERD MedTech) supports early stages of product development and is based on the recent experiences of the National Institute for Health Research London Diagnostic Evidence Co-operative in the UK. Expert commentary: HERD MedTech i) proposes a shift from design for usability to design for resilience, ii) aspires to reduce the need for service adaptation to technological constraints, iii) ensures value of innovation at the time of product development, and iv) aims to stimulate discussion around the integration of pre- and post-market methods of assessment of medical technology.

  20. Definition of the thermographic regions of interest in cycling by using a factor analysis

    NASA Astrophysics Data System (ADS)

    Priego Quesada, Jose Ignacio; Lucas-Cuevas, Angel Gabriel; Salvador Palmer, Rosario; Pérez-Soriano, Pedro; Cibrián Ortiz de Anda, Rosa M.a.

    2016-03-01

    Research in exercise physiology using infrared thermography has increased in recent years. However, the definition of the Regions of Interest (ROIs) varies strongly between studies. Therefore, the aim of this study was to use a factor analysis approach to define highly correlated groups of thermographic ROIs during a cycling test. Factor analyses were performed based on the moment of measurement and on the variation of skin temperatures as a result of the cycling exercise. 19 male participants cycled for 45 min at 50% of their individual peak power output with a cadence of 90 rpm. Infrared thermography was used to measure skin temperatures in sixteen ROIs of the trunk and lower limbs at three moments: before, immediately after and 10 min after the cycling test. Factor analyses were used to identify groups of ROIs based on the skin absolute temperatures at each moment of measurement as well as on skin temperature variations between moments. All the factor analyses performed for each moment and skin temperature variation explained more than 80% of the variance. Different groups of ROIs were obtained when the analysis was based on the moment of measurement or on the effect of exercise on the skin temperature. Furthermore, some ROIs were grouped in the same way in both analyses (e.g. the ROIs of the trunk), whereas other regions (legs and their joints) were grouped differently in each analysis. Differences between groups of ROIs are related to their tissue composition, muscular activity and capacity for sweating. In conclusion, the resultant groups of ROIs were coherent and could help researchers define ROIs in future thermal studies.
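    The grouping step can be mimicked with a toy calculation: simulate two latent temperature sources, check how much variance two factors would explain, and group ROIs by correlation. This is an illustrative sketch only; the ROI counts, noise level, and correlation threshold are assumptions, not the study's actual data or method:

```python
import numpy as np

rng = np.random.default_rng(0)
# 19 simulated participants x 6 ROIs driven by two latent temperature sources
trunk_src = rng.normal(size=(19, 1))
leg_src = rng.normal(size=(19, 1))
temps = np.hstack([
    trunk_src + 0.1 * rng.normal(size=(19, 3)),  # three "trunk" ROIs
    leg_src + 0.1 * rng.normal(size=(19, 3)),    # three "leg" ROIs
])

R = np.corrcoef(temps, rowvar=False)             # ROI x ROI correlation matrix
eigvals = np.linalg.eigvalsh(R)
explained = np.sort(eigvals)[-2:].sum() / eigvals.sum()  # two-factor variance share

# Greedy grouping: an ROI joins the first group whose seed ROI it correlates with
groups, seeds = {}, []
for roi in range(R.shape[1]):
    for g, seed in enumerate(seeds):
        if R[roi, seed] > 0.8:
            groups[roi] = g
            break
    else:
        seeds.append(roi)
        groups[roi] = len(seeds) - 1
```

    With the strong simulated structure, the two-factor solution explains well over 80% of the variance and the trunk and leg ROIs fall into separate groups, mirroring the paper's finding for trunk regions.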

  1. Multi-factor challenge/response approach for remote biometric authentication

    NASA Astrophysics Data System (ADS)

    Al-Assam, Hisham; Jassim, Sabah A.

    2011-06-01

    Although biometric authentication is perceived to be more reliable than traditional authentication schemes, it becomes vulnerable to many attacks when it comes to remote authentication over open networks and raises serious privacy concerns. This paper proposes a biometric-based challenge-response approach to be used for remote authentication between two parties A and B over open networks. In the proposed approach, a remote authenticator system B (e.g. a bank) challenges its client A, who wants to authenticate himself/herself to the system, by sending a one-time public random challenge. The client A responds by employing the random challenge along with secret information obtained from a password and a token to produce a one-time cancellable representation of his freshly captured biometric sample. The one-time biometric representation, which is based on multiple factors, is then sent back to B for matching. Here, we argue that eavesdropping of the one-time random challenge and/or the resulting one-time biometric representation does not compromise the security of the system, and no information about the original biometric data is leaked. In addition to securing biometric templates, the proposed protocol offers a practical solution to the replay attack on biometric systems. Moreover, we propose a new scheme for generating password-based pseudo-random numbers/permutations to be used as a building block in the proposed approach. The proposed scheme is also designed to provide protection against repudiation. We illustrate the viability and effectiveness of the proposed approach with experimental results based on two biometric modalities: fingerprint and face biometrics.
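    A rough sketch of the challenge-binding idea: the response is keyed by secrets derived from the password and token and changes with every challenge, so a replayed response fails. This is a generic HMAC illustration, not the authors' actual cancellable-biometric transform; the feature bytes, token material, and key-derivation parameters are placeholders:

```python
import hashlib
import hmac
import os

def one_time_response(biometric_bytes, password, token_secret, challenge):
    """Bind a fresh biometric sample to a one-time challenge.
    PBKDF2 parameters and the feature encoding are illustrative choices."""
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), token_secret, 100_000)
    return hmac.new(key, challenge + biometric_bytes, hashlib.sha256).digest()

token = b"secret-token-material"      # stand-in for the client's token secret
challenge = os.urandom(16)            # B's one-time public random challenge
resp1 = one_time_response(b"fingerprint-features", "pw", token, challenge)
# A different challenge yields an unrelated response, defeating replay
resp2 = one_time_response(b"fingerprint-features", "pw", token, os.urandom(16))
```

    Unlike the paper's scheme, this digest is not matchable against a noisy re-capture of the biometric; the published approach uses a cancellable representation precisely so that B can still perform fuzzy matching.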

  2. Psychometric Evaluation of the Student Authorship Questionnaire: A Confirmatory Factor Analysis Approach

    ERIC Educational Resources Information Center

    Ballantine, Joan; Guo, Xin; Larres, Patricia

    2015-01-01

    This research provides new insights into the measurement of students' authorial identity and its potential for minimising the incidence of unintentional plagiarism by providing evidence about the psychometric properties of the Student Authorship Questionnaire (SAQ). Exploratory and confirmatory factor analyses (EFA and CFA) are employed to…

  3. Comparative analysis of risk-based cleanup levels and associated remediation costs using linearized multistage model (cancer slope factor) vs. threshold approach (reference dose) for three chlorinated alkenes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, L.J.; Mihalich, J.P.

    1995-12-31

    The chlorinated alkenes 1,1-dichloroethene (1,1-DCE), tetrachloroethene (PCE), and trichloroethene (TCE) are common environmental contaminants found in soil and groundwater at hazardous waste sites. Recent assessment of data from epidemiology and mechanistic studies indicates that although exposure to 1,1-DCE, PCE, and TCE causes tumor formation in rodents, it is unlikely that these chemicals are carcinogenic to humans. Nevertheless, many state and federal agencies continue to regulate these compounds as carcinogens through the use of the linearized multistage model and resulting cancer slope factor (CSF). The available data indicate that 1,1-DCE, PCE, and TCE should be assessed using a threshold (i.e., reference dose [RfD]) approach rather than a CSF. This paper summarizes the available metabolic, toxicologic, and epidemiologic data that question the use of the linearized multistage model (and CSF) for extrapolation from rodents to humans. A comparative analysis of potential risk-based cleanup goals (RBGs) for these three compounds in soil is presented for a hazardous waste site. Goals were calculated using the USEPA CSFs and using a threshold (i.e., RfD) approach. Costs associated with remediation activities required to meet each set of these cleanup goals are presented and compared.
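    The two families of cleanup goals can be contrasted with back-of-the-envelope USEPA-style soil ingestion formulas. Every parameter value below is an illustrative default, and the CSF and RfD are hypothetical placeholders rather than regulatory values for any of the three alkenes:

```python
# Hedged sketch of risk-based cleanup goals (RBGs) for the soil ingestion pathway
TR, HQ = 1e-6, 1.0        # target cancer risk; target hazard quotient
BW = 70.0                 # body weight, kg
IR = 1e-4                 # soil ingestion rate, kg/day (100 mg/day)
EF, ED = 350.0, 30.0      # exposure frequency (day/yr) and duration (yr)
AT_c = 70.0 * 365.0       # averaging time for carcinogens, days (lifetime)
AT_nc = ED * 365.0        # averaging time for noncarcinogens, days
CSF = 0.05                # hypothetical cancer slope factor, (mg/kg-day)^-1
RfD = 5e-4                # hypothetical reference dose, mg/kg-day

# Cleanup goal from the linearized multistage (CSF) approach, mg/kg soil
rbg_cancer = (TR * BW * AT_c) / (CSF * IR * EF * ED)
# Cleanup goal from the threshold (RfD) approach, mg/kg soil
rbg_noncancer = (HQ * RfD * BW * AT_nc) / (IR * EF * ED)
```

    With these made-up inputs the threshold-based goal is roughly an order of magnitude less stringent than the CSF-based goal, which is the kind of gap that drives the remediation cost comparison in the paper.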

  4. Risk Factors of Mortality from All Asbestos-Related Diseases: A Competing Risk Analysis.

    PubMed

    Abós-Herràndiz, Rafael; Rodriguez-Blanco, Teresa; Garcia-Allas, Isabel; Rosell-Murphy, Isabel-Magdalena; Albertí-Casas, Constança; Tarrés, Josep; Krier-Günther, Illona; Martinez-Artés, Xavier; Orriols, Ramon; Grimau-Malet, Isidre; Canela-Soler, Jaume

    2017-01-01

    The mortality from all malignant and nonmalignant asbestos-related diseases remains unknown. The authors assessed the incidence and risk factors for all asbestos-related deaths. The sample included 544 patients from an asbestos-exposed community in the area of Barcelona (Spain), between Jan 1, 1970, and Dec 31, 2006. Competing risk regression through a subdistribution hazard analysis was used to estimate risk factors for the outcomes. Asbestos-related deaths were observed in 167 (30.7%) patients and 57.5% of these deaths were caused by some type of mesothelioma. The incidence rate after diagnosis was 3,600 per 100,000 person-years. In 7.5% of patients death was non-asbestos-related, while pleural and peritoneal mesothelioma were identified in 87 (16.0%) and 18 (3.3%) patients, respectively. Age, sex, household exposure, cumulative nonmalignant asbestos-related disease, and single malignant pathology were identified as risk factors for asbestos-related death. These findings suggest the need to develop a preventive approach to the community and to improve the clinical follow-up process of these patients.
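    In the absence of censoring, the quantity behind a competing-risk analysis reduces to a simple cumulative incidence per cause; the sketch below illustrates that reduction with invented data (the study itself used subdistribution hazard regression, which also handles censoring and covariates):

```python
def cumulative_incidence(times, causes, cause, horizon):
    """Fraction of subjects dead of `cause` by `horizon`.
    No-censoring simplification: None means alive at end of follow-up."""
    n = len(times)
    events = sum(1 for t, c in zip(times, causes) if c == cause and t <= horizon)
    return events / n

# Invented cohort: time to event (years) and cause of death (None = alive)
times = [1, 2, 3, 4, 5, 6]
causes = ["meso", "other", "meso", None, "meso", None]
```

    Treating "other" deaths as competing events rather than censoring them is the key point: a subject who dies of another cause can never later die of mesothelioma, so naive survival methods overestimate the incidence.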

  5. Dual-Tracer PET Using Generalized Factor Analysis of Dynamic Sequences

    PubMed Central

    Fakhri, Georges El; Trott, Cathryn M.; Sitek, Arkadiusz; Bonab, Ali; Alpert, Nathaniel M.

    2013-01-01

    Purpose: With single-photon emission computed tomography, simultaneous imaging of two physiological processes relies on discrimination of the energy of the emitted gamma rays, whereas the application of dual-tracer imaging to positron emission tomography (PET) imaging has been limited by the characteristic 511-keV emissions. Procedures: To address this limitation, we developed a novel approach based on generalized factor analysis of dynamic sequences (GFADS) that exploits spatio-temporal differences between radiotracers and applied it to near-simultaneous imaging of 2-deoxy-2-[18F]fluoro-D-glucose (FDG) (brain metabolism) and 11C-raclopride (D2) with simulated human data and experimental rhesus monkey data. We show theoretically and verify by simulation and measurement that GFADS can separate FDG and raclopride measurements that are made nearly simultaneously. Results: The theoretical development shows that GFADS can decompose the studies at several levels: (1) It decomposes the FDG and raclopride studies so that they can be analyzed as though they were obtained separately. (2) If additional physiologic/anatomic constraints can be imposed, further decomposition is possible. (3) For the example of raclopride, specific and nonspecific binding can be determined on a pixel-by-pixel basis. We found good agreement between the estimated GFADS factors and the simulated ground truth time activity curves (TACs), and between the GFADS factor images and the corresponding ground truth activity distributions, with errors less than 7.3±1.3 %. Biases in estimation of specific D2 binding and relative metabolism activity were within 5.9±3.6 % compared to the ground truth values. We also evaluated our approach in simultaneous dual-isotope brain PET studies in a rhesus monkey and obtained accuracy better than 6 % in a mid-striatal volume for striatal activity estimation. Conclusions: Dynamic image sequences acquired following near-simultaneous injection of two PET radiopharmaceuticals
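    The spirit of the approach, separating a dynamic sequence into factor curves and coefficient images, can be illustrated with a plain nonnegative matrix factorization via Lee-Seung multiplicative updates. This is a generic stand-in rather than the published GFADS algorithm, and the two time-activity curves below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 50)
# Two synthetic time-activity curves: an uptake-like and a washout-like shape
F_true = np.vstack([1.0 - np.exp(-5.0 * t), np.exp(-3.0 * t)])  # 2 x 50 frames
C_true = rng.random((100, 2))                                   # 100 pixels x 2
X = C_true @ F_true                                             # dynamic sequence

# Lee-Seung multiplicative updates for X ~ C @ F with C, F >= 0
C = rng.random((100, 2))
F = rng.random((2, 50))
for _ in range(1000):
    C *= (X @ F.T) / (C @ F @ F.T + 1e-9)
    F *= (C.T @ X) / (C.T @ C @ F + 1e-9)

rel_err = np.linalg.norm(X - C @ F) / np.linalg.norm(X)
```

    The recovered rows of F play the role of tracer-specific TACs and the columns of C the role of factor images; GFADS adds the constraints needed to make that decomposition physiologically unique.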

  6. Approaches to acceptable risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whipple, C

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  7. Cluster analysis: a new approach for identification of underlying risk factors for coronary artery disease in essential hypertensive patients.

    PubMed

    Guo, Qi; Lu, Xiaoni; Gao, Ya; Zhang, Jingjing; Yan, Bin; Su, Dan; Song, Anqi; Zhao, Xi; Wang, Gang

    2017-03-07

    Grading of essential hypertension according to blood pressure (BP) level may not adequately reflect clinical heterogeneity of hypertensive patients. This study was carried out to explore clinical phenotypes in essential hypertensive patients using cluster analysis. This study recruited 513 hypertensive patients and evaluated BP variations with ambulatory blood pressure monitoring. Four distinct hypertension groups were identified using cluster analysis: (1) younger male smokers with relatively high BP had the most severe carotid plaque thickness but no coronary artery disease (CAD); (2) older women with relatively low diastolic BP had more diabetes; (3) non-smokers with a low systolic BP level had neither diabetes nor CAD; (4) hypertensive patients with BP reverse dipping were most likely to have CAD but had least severe carotid plaque thickness. In binary logistic analysis, reverse dipping was significantly associated with prevalence of CAD. Cluster analysis was shown to be a feasible approach for investigating the heterogeneity of essential hypertension in clinical studies. BP reverse dipping might be valuable for prediction of CAD in hypertensive patients when compared with carotid plaque thickness. However, large-scale prospective trials with more information of plaque morphology are necessary to further compare the predictive power between BP dipping pattern and carotid plaque.
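    The clustering itself can be sketched with a plain k-means (Lloyd's algorithm) on standardized patient features. The feature names and the two synthetic groups below are hypothetical, and the abstract does not state which clustering algorithm the authors used:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: assign to nearest center, recompute means."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# Hypothetical standardized features: [systolic BP, age, nocturnal dipping ratio]
rng = np.random.default_rng(2)
younger_high_bp = rng.normal([1.5, -1.0, 0.0], 0.2, size=(40, 3))
older_low_dbp = rng.normal([-1.0, 1.5, 0.0], 0.2, size=(40, 3))
X = np.vstack([younger_high_bp, older_low_dbp])
labels, centers = kmeans(X, 2)
```

    Standardizing the features first matters: clinical variables with large raw ranges (e.g. systolic BP in mmHg) would otherwise dominate the Euclidean distances.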

  8. Cluster analysis: a new approach for identification of underlying risk factors for coronary artery disease in essential hypertensive patients

    PubMed Central

    Guo, Qi; Lu, Xiaoni; Gao, Ya; Zhang, Jingjing; Yan, Bin; Su, Dan; Song, Anqi; Zhao, Xi; Wang, Gang

    2017-01-01

    Grading of essential hypertension according to blood pressure (BP) level may not adequately reflect clinical heterogeneity of hypertensive patients. This study was carried out to explore clinical phenotypes in essential hypertensive patients using cluster analysis. This study recruited 513 hypertensive patients and evaluated BP variations with ambulatory blood pressure monitoring. Four distinct hypertension groups were identified using cluster analysis: (1) younger male smokers with relatively high BP had the most severe carotid plaque thickness but no coronary artery disease (CAD); (2) older women with relatively low diastolic BP had more diabetes; (3) non-smokers with a low systolic BP level had neither diabetes nor CAD; (4) hypertensive patients with BP reverse dipping were most likely to have CAD but had least severe carotid plaque thickness. In binary logistic analysis, reverse dipping was significantly associated with prevalence of CAD. Cluster analysis was shown to be a feasible approach for investigating the heterogeneity of essential hypertension in clinical studies. BP reverse dipping might be valuable for prediction of CAD in hypertensive patients when compared with carotid plaque thickness. However, large-scale prospective trials with more information of plaque morphology are necessary to further compare the predictive power between BP dipping pattern and carotid plaque. PMID:28266630

  9. Risk factors and prediction of very short term versus short/intermediate term post-stroke mortality: a data mining approach.

    PubMed

    Easton, Jonathan F; Stephens, Christopher R; Angelova, Maia

    2014-11-01

    Data mining and knowledge discovery as an approach to examining medical data can limit some of the inherent bias in the hypothesis assumptions that can be found in traditional clinical data analysis. In this paper we illustrate the benefits of a data mining inspired approach to statistically analysing a bespoke data set, the academic multicentre randomised controlled trial, U.K. Glucose Insulin in Stroke Trial (GIST-UK), with a view to discovering new insights distinct from the original hypotheses of the trial. We consider post-stroke mortality prediction as a function of days since stroke onset, showing that the time scales that best characterise changes in mortality risk are most naturally defined by examination of the mortality curve. We show that certain risk factors differentiate between very short term and intermediate term mortality. In particular, we show that age is highly relevant for intermediate term risk but not for very short or short term mortality. We suggest that this is due to the concept of frailty. Other risk factors are highlighted across a range of variable types including socio-demographics, past medical histories and admission medication. Using the most statistically significant risk factors we build predictive classification models for very short term and short/intermediate term mortality. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
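    The first step described above, reading time scales off the mortality curve, amounts to computing the cumulative proportion of deaths by each day since onset. A stdlib-only sketch with invented survival times (the trial data themselves are not reproduced here):

```python
def mortality_curve(days_to_death, horizon):
    """Cumulative proportion dead by each day since stroke onset.
    None marks patients who survived past the horizon."""
    n = len(days_to_death)
    return [sum(1 for d in days_to_death if d is not None and d <= day) / n
            for day in range(horizon + 1)]

# Invented days-to-death for ten patients
deaths = [1, 1, 2, 5, 9, 30, None, None, None, None]
curve = mortality_curve(deaths, 30)
```

    Plateaus and steep segments in such a curve are what suggest natural cut points between "very short term" and "short/intermediate term" mortality windows.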

  10. Extractive waste management: A risk analysis approach.

    PubMed

    Mehta, Neha; Dino, Giovanna Antonella; Ajmone-Marsan, Franco; Lasagna, Manuela; Romè, Chiara; De Luca, Domenico Antonio

    2018-05-01

    Abandoned mine sites continue to present serious environmental hazards because the heavy metals associated with extractive waste are continuously released into the environment, where they threaten human life and the environment. Remediating and securing extractive waste are complex, lengthy and costly processes. Thus, in most European countries, a site is considered for intervention when it poses a risk to human health and the surrounding environment. As a consequence, risk analysis presents a viable decisional approach towards the management of extractive waste. To evaluate the effects posed by extractive waste to human health and groundwater, a risk analysis approach was used for an abandoned nickel extraction site in Campello Monti in North Italy. This site is located in the Southern Italian Alps. The area consists of large and voluminous mafic rocks intruded by mantle peridotite. The mining activities in this area have generated extractive waste. A risk analysis of the site was performed using Risk Based Corrective Action (RBCA) guidelines, considering the properties of the extractive waste and of water as the environmental matrices. The results showed the presence of carcinogenic risk due to arsenic and risks to groundwater due to nickel. The results of the risk analysis form a basic understanding of the current situation at the site, which is affected by extractive waste. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. A replication of a factor analysis of motivations for trapping

    USGS Publications Warehouse

    Schroeder, Susan; Fulton, David C.

    2015-01-01

    Using a 2013 sample of Minnesota trappers, we employed confirmatory factor analysis to replicate an exploratory factor analysis of trapping motivations conducted by Daigle, Muth, Zwick, and Glass (1998). We employed the same 25 items used by Daigle et al. and tested the same five-factor structure using a recent sample of Minnesota trappers. We also compared motivations in our sample to those reported by Daigle et al.

  12. On the Relations among Regular, Equal Unique Variances, and Image Factor Analysis Models.

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Bentler, Peter M.

    2000-01-01

    Investigated the conditions under which the matrix of factor loadings from the factor analysis model with equal unique variances will give a good approximation to the matrix of factor loadings from the regular factor analysis model. Extends the results to the image factor analysis model. Discusses implications for practice. (SLD)

  13. An Efficient Soft Set-Based Approach for Conflict Analysis

    PubMed Central

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, and so on. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handle conflict situations, based on some ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach reduces computational time by up to 3.9% compared to rough set theory. PMID:26928627
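    The co-occurrence idea can be sketched as a plain boolean soft set: each agent maps to the set of parameters it supports, and both agreement (co-occurrence) and disagreement (symmetric difference) fall out of cheap set operations, with no decision rules to induce. The agents and parameters below are toy placeholders, not the Indonesian Parliament data:

```python
# Toy soft set: each agent maps to the parameters (stances) it supports
soft_set = {
    "A1": {"p1", "p2"},
    "A2": {"p1", "p3"},
    "A3": {"p2", "p4"},
    "A4": {"p1", "p2"},
}

def cooccurrence(soft_set, pa, pb):
    """Number of agents that support both parameters."""
    return sum(1 for params in soft_set.values() if {pa, pb} <= params)

def conflict(soft_set, a, b):
    """Parameters on which two agents disagree (symmetric difference)."""
    return soft_set[a] ^ soft_set[b]
```

    Because each query is a linear scan over set memberships rather than rule induction, this style of analysis is where the reported computational-time advantage plausibly comes from.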

  14. An Efficient Soft Set-Based Approach for Conflict Analysis.

    PubMed

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, and so on. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handle conflict situations, based on some ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach reduces computational time by up to 3.9% compared to rough set theory.

  15. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale

    PubMed Central

    Sediyama, Cristina Y. N.; Moura, Ricardo; Garcia, Marina S.; da Silva, Antonio G.; Soraggi, Carolina; Neves, Fernando S.; Albuquerque, Maicon R.; Whiteside, Setephen P.; Malloy-Diniz, Leandro F.

    2017-01-01

    Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency; (b) lack of premeditation; (c) lack of perseverance; (d) sensation seeking. In the present study, 384 participants (278 women and 106 men) recruited from schools, universities, leisure centers and workplaces completed the UPPS scale. An exploratory factor analysis was performed using Varimax factor rotation and Kaiser normalization, and we also conducted two confirmatory analyses to test the independence of the UPPS components found in previous analyses. Results: Mean UPPS total scores decreased with age, and the youngest participants (below 30 years) scored significantly higher than the groups over 30 years. No gender difference was found. Cronbach's alpha results indicated satisfactory values for all subscales, with similarly high values across subscales, although confirmatory factor analysis indexes indicated a poor model fit. The results of the two exploratory factor analyses were satisfactory. Conclusion: Our results showed that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS. PMID:28484414
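    The Varimax rotation used in the exploratory analysis can be sketched with the standard SVD-based iteration. This is a generic implementation applied to a small synthetic loading matrix, not the study's UPPS loadings; note that an orthogonal rotation leaves each item's communality unchanged:

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-8):
    """SVD-based Varimax: find an orthogonal rotation R maximizing the
    variance of squared loadings within each factor column."""
    p, k = loadings.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag((L ** 2).sum(axis=0)))
        )
        R = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):
            break
        d = d_new
    return loadings @ R

# Small synthetic 4-item, 2-factor loading matrix (illustrative only)
A = np.array([[0.8, 0.3], [0.7, 0.4], [0.2, 0.9], [0.3, 0.8]])
A_rot = varimax(A)
```

    After rotation, each item tends to load strongly on one factor and weakly on the others, which is what makes subscale assignments like the four UPPS factors interpretable.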

  16. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale.

    PubMed

    Sediyama, Cristina Y N; Moura, Ricardo; Garcia, Marina S; da Silva, Antonio G; Soraggi, Carolina; Neves, Fernando S; Albuquerque, Maicon R; Whiteside, Setephen P; Malloy-Diniz, Leandro F

    2017-01-01

    Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency; (b) lack of premeditation; (c) lack of perseverance; (d) sensation seeking. In the present study, 384 participants (278 women and 106 men) recruited from schools, universities, leisure centers and workplaces completed the UPPS scale. An exploratory factor analysis was performed using Varimax factor rotation and Kaiser normalization, and we also conducted two confirmatory analyses to test the independence of the UPPS components found in previous analyses. Results: Mean UPPS total scores decreased with age, and the youngest participants (below 30 years) scored significantly higher than the groups over 30 years. No gender difference was found. Cronbach's alpha results indicated satisfactory values for all subscales, with similarly high values across subscales, although confirmatory factor analysis indexes indicated a poor model fit. The results of the two exploratory factor analyses were satisfactory. Conclusion: Our results showed that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS.

  17. Understanding key factors affecting electronic medical record implementation: a sociotechnical approach.

    PubMed

    Cucciniello, Maria; Lapsley, Irvine; Nasi, Greta; Pagliari, Claudia

    2015-07-17

    Recent health care policies have supported the adoption of Information and Communication Technologies (ICT) but examples of failed ICT projects in this sector have highlighted the need for a greater understanding of the processes used to implement such innovations in complex organizations. This study examined the interaction of sociological and technological factors in the implementation of an Electronic Medical Record (EMR) system by a major national hospital. It aimed to obtain insights for managers planning such projects in the future and to examine the usefulness of Actor Network Theory (ANT) as a research tool in this context. Case study using documentary analysis, interviews and observations. Qualitative thematic analysis drawing on ANT. Qualitative analyses revealed a complex network of interactions between organizational stakeholders and technology that helped to shape the system and influence its acceptance and adoption. The EMR clearly emerged as a central 'actor' within this network. The results illustrate how important it is to plan innovative and complex information systems with reference to (i) the expressed needs and involvement of different actors, starting from the initial introductory phase; (ii) promoting commitment to the system and adopting a participative approach; (iii) defining and resourcing new roles within the organization capable of supporting and sustaining the change and (iv) assessing system impacts in order to mobilize the network around a common goal. The paper highlights the organizational, cultural, technological, and financial considerations that should be taken into account when planning strategies for the implementation of EMR systems in hospital settings. It also demonstrates how ANT may be usefully deployed in evaluating such projects.

  18. A Human Factors Approach to Bridging Systems and Introducing New Technologies

    NASA Technical Reports Server (NTRS)

    Kanki, Barbara G.

    2011-01-01

    The application of human factors in aviation has grown to cover a wide range of disciplines and methods capable of assessing human-systems integration at many levels. For example, at the individual level, pilot workload may be studied while at the team level, coordinated workload distribution may be the focal point. At the organizational level, the way in which individuals and teams are supported by training and standards, policies and procedures may introduce additional, relevant topics. A consideration of human factors at each level contributes to our understanding of successes and failures in pilot performance, but this system, focused on the flight deck alone, is only one part of the airspace system. In the FAA's NextGen plan to overhaul the National Airspace System (NAS), new capabilities will enhance flightdeck systems (pilots), flight operations centers (dispatchers) and air traffic control systems (controllers and air traffic managers). At a minimum, the current roles and responsibilities of these three systems are likely to change. Since increased automation will be central to many of the enhancements, the role of automation is also likely to change. Using NextGen examples, a human factors approach for bridging complex airspace systems will be the main focus of this presentation. It is still crucial to consider the human factors within each system, but the successful implementation of new technologies in the NAS requires an understanding of the collaborations that occur when these systems intersect. This human factors approach to studying collaborative systems begins with detailed task descriptions within each system to establish a baseline of the current operations. The collaborative content and context are delineated through the review of regulatory and advisory materials, letters of agreement, policies, procedures and documented practices. Field observations and interviews also help to fill out the picture. Key collaborative functions across systems

  19. Unifying Approach to Analytical Chemistry and Chemical Analysis: Problem-Oriented Role of Chemical Analysis.

    ERIC Educational Resources Information Center

    Pardue, Harry L.; Woo, Jannie

    1984-01-01

    Proposes an approach to teaching analytical chemistry and chemical analysis in which a problem to be resolved is the focus of a course. Indicates that this problem-oriented approach is intended to complement detailed discussions of fundamental and applied aspects of chemical determinations and not replace such discussions. (JN)

  20. Theoretical and methodological approaches in discourse analysis.

    PubMed

    Stevenson, Chris

    2004-01-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  2. Multiple Statistical Models Based Analysis of Causative Factors and Loess Landslides in Tianshui City, China

    NASA Astrophysics Data System (ADS)

    Su, Xing; Meng, Xingmin; Ye, Weilin; Wu, Weijiang; Liu, Xingrong; Wei, Wanhong

    2018-03-01

    Tianshui City is one of the mountainous cities threatened by severe geo-hazards in Gansu Province, China. Statistical probability models have been widely used in analyzing and evaluating geo-hazards such as landslides. In this research, three approaches (the Certainty Factor Method, the Weight of Evidence Method, and the Information Quantity Method) were adopted to quantitatively analyze the relationship between the causative factors and the landslides. The source data used in this study include the SRTM DEM and local geological maps at a scale of 1:200,000. Twelve causative factors (i.e., altitude, slope, aspect, curvature, plan curvature, profile curvature, roughness, relief amplitude, distance to rivers, distance to faults, distance to roads, and stratum lithology) were selected for correlation analysis after thorough investigation of the geological conditions and historical landslides. The results indicate that the outcomes of the three models are fairly consistent.
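
    The first of the three models lends itself to a short sketch. The following is an illustrative implementation of the standard Certainty Factor formula; the class and prior landslide densities are made-up values, not the study's data.

```python
def certainty_factor(ppa, pps):
    """Certainty Factor for one class of a causative factor.

    ppa: landslide density within the class (conditional probability);
    pps: prior landslide density over the whole study area.
    Returns a value in [-1, 1]; positive favours landslide occurrence.
    """
    if ppa == pps:
        return 0.0
    if ppa > pps:
        return (ppa - pps) / (ppa * (1.0 - pps))
    return (ppa - pps) / (pps * (1.0 - ppa))

# Hypothetical densities: a steep-slope class with twice the prior density,
# and a flat class with far fewer landslides than the prior.
cf_steep = certainty_factor(0.20, 0.10)  # positive: favours landslides
cf_flat = certainty_factor(0.02, 0.10)   # negative: disfavours landslides
```

    Summing such per-factor values cell by cell yields the susceptibility score that is then compared against the historical landslide inventory.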

  3. Beyond Repair: Conversation Analysis as an Approach to SLA

    ERIC Educational Resources Information Center

    Kasper, Gabriele

    2006-01-01

    As one of several approaches to SLA as social practice, Conversation Analysis (CA) has the capacity to examine in detail how opportunities for L2 learning arise in different interactional activities. Its particular strength, and one that distinguishes it from other social practice approaches, is its consistent focus on the orientations and…

  4. A qualitative, interprofessional analysis of barriers to and facilitators of implementation of the Department of Veterans Affairs' Clostridium difficile prevention bundle using a human factors engineering approach.

    PubMed

    Yanke, Eric; Moriarty, Helene; Carayon, Pascale; Safdar, Nasia

    2018-03-01

    Clostridium difficile infection (CDI) is increasingly prevalent, severe, and costly. Adherence to infection prevention practices remains suboptimal. More effective strategies to implement guidelines and evidence are needed. Interprofessional focus groups consisting of physicians, resident physicians, nurses, and health technicians were conducted for a quality improvement project evaluating adherence to the Department of Veterans Affairs' (VA) nationally mandated C difficile prevention bundle. Qualitative analysis with a visual matrix display identified barrier and facilitator themes guided by the Systems Engineering Initiative for Patient Safety model, a human factors engineering approach. Several themes, encompassing both barriers and facilitators to bundle adherence, emerged. Rapid turnaround time of C difficile polymerase chain reaction testing was a facilitator of timely diagnosis. Too few, poorly located, and cluttered sinks were barriers to appropriate hand hygiene. Patient care workload and the time-consuming process of contact isolation precautions were also barriers to adherence. Multiple work system components serve as barriers to and facilitators of adherence to the VA CDI prevention bundle among an interprofessional group of health care workers. Organizational factors appear to significantly influence bundle adherence. Interprofessional perspectives are needed to identify barriers to and facilitators of bundle implementation, which is a necessary first step to address adherence to bundled infection prevention practices. Published by Elsevier Inc.

  5. A Factor Analysis of Learning Data and Selected Ability Test Scores

    ERIC Educational Resources Information Center

    Jones, Dorothy L.

    1976-01-01

    A verbal concept-learning task permitting the externalizing and quantifying of learning behavior and 16 ability tests were administered to female graduate students. Data were analyzed by alpha factor analysis and incomplete image analysis. Six alpha factors and 12 image factors were extracted and orthogonally rotated. Four areas of cognitive…

  6. Comparison of approaches for mobile document image analysis using server supported smartphones

    NASA Astrophysics Data System (ADS)

    Ozarslan, Suleyman; Eren, P. Erhan

    2014-03-01

    With the recent advances in mobile technologies, new capabilities are emerging, such as mobile document image analysis. However, mobile phones are still less powerful than servers, and they have some resource limitations. One approach to overcome these limitations is performing resource-intensive processes of the application on remote servers. In mobile document image analysis, the most resource-consuming process is the Optical Character Recognition (OCR) process, which is used to extract text from images captured by mobile phones. In this study, our goal is to compare the in-phone and the remote-server processing approaches for mobile document image analysis in order to explore their trade-offs. For the in-phone approach, all processes required for mobile document image analysis run on the mobile phone. On the other hand, in the remote-server approach, the core OCR process runs on the remote server and the other processes run on the mobile phone. Results of the experiments show that the remote-server approach is considerably faster than the in-phone approach in terms of OCR time, but adds extra delays such as network delay. Since compression and downscaling of images significantly reduce file sizes and extra delays, the remote-server approach overall outperforms the in-phone approach in terms of the selected speed and correct-recognition metrics, if the gain in OCR time compensates for the extra delays. According to the results of the experiments, using the most preferable settings, the remote-server approach performs better than the in-phone approach in terms of speed and acceptable correct-recognition metrics.

  7. Biological risk factors for suicidal behaviors: a meta-analysis

    PubMed Central

    Chang, B P; Franklin, J C; Ribeiro, J D; Fox, K R; Bentley, K H; Kleiman, E M; Nock, M K

    2016-01-01

    Prior studies have proposed a wide range of potential biological risk factors for future suicidal behaviors. Although strong evidence exists for biological correlates of suicidal behaviors, it remains unclear if these correlates are also risk factors for suicidal behaviors. We performed a meta-analysis to integrate the existing literature on biological risk factors for suicidal behaviors and to determine their statistical significance. We conducted a systematic search of PubMed, PsycInfo and Google Scholar for studies that used a biological factor to predict either suicide attempt or death by suicide. Inclusion criteria included studies with at least one longitudinal analysis using a biological factor to predict either of these outcomes in any population through 2015. From an initial screen of 2541 studies we identified 94 cases. Random effects models were used for both meta-analyses and meta-regression. The combined effect of biological factors produced statistically significant but relatively weak prediction of suicide attempts (weighted mean odds ratio (wOR)=1.41; CI: 1.09–1.81) and suicide death (wOR=1.28; CI: 1.13–1.45). After accounting for publication bias, prediction was nonsignificant for both suicide attempts and suicide death. Only two factors remained significant after accounting for publication bias—cytokines (wOR=2.87; CI: 1.40–5.93) and low levels of fish oil nutrients (wOR=1.09; CI: 1.01–1.19). Our meta-analysis revealed that currently known biological factors are weak predictors of future suicidal behaviors. This conclusion should be interpreted within the context of the limitations of the existing literature, including long follow-up intervals and a lack of tests of interactions with other risk factors. Future studies addressing these limitations may more effectively test for potential biological risk factors. PMID:27622931

  8. Risk analysis of sterile production plants: a new and simple, workable approach.

    PubMed

    Gapp, Guenther; Holzknecht, Peter

    2011-01-01

    A sterile active ingredient plant and a sterile finished dosage filling plant both comprise very complex production processes and systems. The sterility of the final product cannot be assured solely by sterility testing, in-process controls, environmental monitoring of cleanrooms, and media fill validations. Drawing on more than 15 years' experience, the authors created, 4 years ago, a new but very simple approach to the risk analysis of sterile plants. This approach is not a failure mode and effects analysis and therefore differs from the PDA Technical Report 44, Quality Risk Management for Aseptic Processes, of 2008. The principle involves specific questions, defined in the risk analysis questionnaire in advance, to be answered by an expert team. If the questionnaire item is dealt with appropriately, the answer is assigned a low-risk number (1); if very weak or deficient, it is assigned a high-risk number (5). In addition to the numbers, colors from green (not problematic) through orange to red (very problematic) are attributed to make the results more striking. Because the individual units of each production plant have a defined and different impact on the overall sterility of the final product, different risk emphasis factors have to be taken into account (impact factor 1, 3, or 5). In a well-run cleanroom, the cleanroom operators have a lower impact than other units with regard to the contamination risk. The resulting number of the analyzed production plant and the diagram of the assessment subsequently offer very important and valuable information about a) the risk of microbiological contamination (sterility/endotoxins) of the product, and b) the compliance status of the production plant and the risk of failing lots, as well as probable observations in upcoming regulatory agency audits. Both items above are highly important for the safety of the patient. It is also an ideal tool to identify deficient or weak systems requiring improvement and upgrade.
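
    The scoring arithmetic described above can be sketched in a few lines. The item names, the impact factors assigned to each unit, and the colour thresholds below are illustrative assumptions, not values from the published questionnaire.

```python
def risk_color(risk_number):
    """Map a 1-5 risk number to a traffic-light colour (assumed thresholds)."""
    if risk_number <= 2:
        return "green"
    if risk_number <= 3:
        return "orange"
    return "red"

def plant_risk(items):
    """items: list of (risk_number, impact_factor) pairs, with risk_number
    in 1..5 and impact_factor in {1, 3, 5}. Returns the impact-weighted
    total risk for the plant."""
    return sum(risk * impact for risk, impact in items)

# Hypothetical assessment of three units of a sterile plant:
assessment = [
    (2, 1),  # cleanroom operators: low risk, low impact
    (1, 5),  # sterilisation step: well controlled, but high impact
    (4, 5),  # media fill programme: deficient, high impact
]
total = plant_risk(assessment)                      # 2*1 + 1*5 + 4*5 = 27
colors = [risk_color(risk) for risk, _ in assessment]
```

    Weighting by impact factor is what lets a deficient high-impact unit dominate the plant total even when most items score well.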

  9. Connecting the Dots: State Health Department Approaches to Addressing Shared Risk and Protective Factors Across Multiple Forms of Violence.

    PubMed

    Wilkins, Natalie; Myers, Lindsey; Kuehl, Tomei; Bauman, Alice; Hertz, Marci

    Violence takes many forms, including intimate partner violence, sexual violence, child abuse and neglect, bullying, suicidal behavior, and elder abuse and neglect. These forms of violence are interconnected and often share the same root causes. They can also co-occur in families and communities and can happen at the same time or at different stages of life. Often, due to a variety of factors, separate, "siloed" approaches are used to address each form of violence. However, understanding and implementing approaches that prevent and address the overlapping root causes of violence (risk factors) and promote factors that increase the resilience of people and communities (protective factors) can help practitioners more effectively and efficiently use limited resources to prevent multiple forms of violence and save lives. This article presents approaches used by 2 state health departments, the Maryland Department of Health and Mental Hygiene and the Colorado Department of Public Health and Environment, to integrate a shared risk and protective factor approach into their violence prevention work and identifies key lessons learned that may serve to inform crosscutting violence prevention efforts in other states.

  10. Human Factors Vehicle Displacement Analysis: Engineering In Motion

    NASA Technical Reports Server (NTRS)

    Atencio, Laura Ashley; Reynolds, David; Robertson, Clay

    2010-01-01

    While positioned on the launch pad at the Kennedy Space Center, tall stacked launch vehicles are exposed to the natural environment. Varying directional winds and vortex shedding cause the vehicle to sway in an oscillating motion. The Human Factors team recognizes that vehicle sway may hinder ground crew operations, impact the ground system designs, and ultimately affect launch availability. The objective of this study is to physically simulate the predicted oscillation envelopes identified by analysis and to conduct a Human Factors Analysis to assess the ability to carry out essential Upper Stage (US) ground operator tasks based on predicted vehicle motion.

  11. Ambulatory Antibiotic Stewardship through a Human Factors Engineering Approach: A Systematic Review.

    PubMed

    Keller, Sara C; Tamma, Pranita D; Cosgrove, Sara E; Miller, Melissa A; Sateia, Heather; Szymczak, Julie; Gurses, Ayse P; Linder, Jeffrey A

    2018-01-01

    In the United States, most antibiotics are prescribed in ambulatory settings. Human factors engineering, which explores interactions between people and the place where they work, has successfully improved quality of care. However, human factors engineering models have not been explored to frame what is known about ambulatory antibiotic stewardship (AS) interventions and barriers and facilitators to their implementation. We conducted a systematic review and searched OVID MEDLINE, Embase, Scopus, Web of Science, and CINAHL to identify controlled interventions and qualitative studies of ambulatory AS and determine whether and how they incorporated principles from a human factors engineering model, the Systems Engineering Initiative for Patient Safety 2.0 model. This model describes how a work system (ambulatory clinic) contributes to a process (antibiotic prescribing) that leads to outcomes. The work system consists of 5 components (tools and technology, organization, person, tasks, and environment) within an external environment. Of 1,288 abstracts initially identified, 42 quantitative studies and 17 qualitative studies met inclusion criteria. Effective interventions focused on tools and technology (eg, clinical decision support and point-of-care testing), the person (eg, clinician education), organization (eg, audit and feedback and academic detailing), tasks (eg, delayed antibiotic prescribing), the environment (eg, commitment posters), and the external environment (eg, media campaigns). Studies have not focused on clinic-wide approaches to AS. A human factors engineering approach suggests that investigating the role of the clinic's processes or physical layout, or the role of external pressures, in antibiotic prescribing may be a promising way to improve ambulatory AS. © Copyright 2018 by the American Board of Family Medicine.

  12. Factor analysis shows association between family activity environment and children's health behaviour.

    PubMed

    Hendrie, Gilly A; Coveney, John; Cox, David N

    2011-12-01

    To characterise the family activity environment in a questionnaire format, assess the questionnaire's reliability and describe its predictive ability by examining the relationships between the family activity environment and children's health behaviours - physical activity, screen time and fruit and vegetable intake. This paper describes the creation of a tool, based on previously validated scales, adapted from the food domain. Data are from 106 children and their parents (Adelaide, South Australia). Factor analysis was used to characterise factors within the family activity environment. Pearson product-moment correlations between the family environment and child outcomes, controlling for demographic variation, were examined. Three factors described the family activity environment - parental activity involvement, opportunity for role modelling and parental support for physical activity - and explained 37.6% of the variance. Controlling for demographic factors, the scale was significantly correlated with children's health behaviour - physical activity (r=0.27), screen time (r=-0.24) and fruit and vegetable intake (r=0.34). The family activity environment questionnaire shows high internal consistency and moderate predictive ability. This study has built on previous research by taking a more comprehensive approach to measuring the family activity environment. This research suggests the family activity environment should be considered in family-based health promotion interventions. © 2011 The Authors. ANZJPH © 2011 Public Health Association of Australia.

  13. Normalization of RNA-seq data using factor analysis of control genes or samples

    PubMed Central

    Risso, Davide; Ngai, John; Speed, Terence P.; Dudoit, Sandrine

    2015-01-01

    Normalization of RNA-seq data has proven essential to ensure accurate inference of expression levels. Here we show that usual normalization approaches mostly account for sequencing depth and fail to correct for library preparation and other more-complex unwanted effects. We evaluate the performance of the External RNA Control Consortium (ERCC) spike-in controls and investigate the possibility of using them directly for normalization. We show that the spike-ins are not reliable enough to be used in standard global-scaling or regression-based normalization procedures. We propose a normalization strategy, remove unwanted variation (RUV), that adjusts for nuisance technical effects by performing factor analysis on suitable sets of control genes (e.g., ERCC spike-ins) or samples (e.g., replicate libraries). Our approach leads to more-accurate estimates of expression fold-changes and tests of differential expression compared to state-of-the-art normalization methods. In particular, RUV promises to be valuable for large collaborative projects involving multiple labs, technicians, and/or platforms. PMID:25150836
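
    A minimal sketch of the RUV idea on simulated data: estimate the unwanted factors from negative-control genes, then regress them out of the full matrix. Plain SVD stands in for the paper's factor-analysis step, and the simulated batch effect, matrix sizes, and choice of k=1 are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_genes, n_controls, k = 8, 50, 10, 1

# Simulated log-expression: a library-preparation (batch) effect
# contaminates every gene, including the negative controls.
batch = rng.normal(size=(n_samples, 1))
alpha = rng.normal(size=(1, n_genes))
Y = rng.normal(size=(n_samples, n_genes)) + batch @ alpha
controls = np.arange(n_controls)  # indices of the negative-control genes

# Estimate k unwanted-variation factors W from the control genes only.
U, s, Vt = np.linalg.svd(Y[:, controls], full_matrices=False)
W = U[:, :k] * s[:k]

# Regress the unwanted factors out of the full matrix.
coef, *_ = np.linalg.lstsq(W, Y, rcond=None)
Y_norm = Y - W @ coef
```

    Because the controls are assumed unaffected by the biology of interest, whatever structure they share is treated as technical and removed from all genes.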

  14. Bayesian Factor Analysis When Only a Sample Covariance Matrix Is Available

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Arav, Marina

    2006-01-01

    In traditional factor analysis, the variance-covariance matrix or the correlation matrix has often been a form of inputting data. In contrast, in Bayesian factor analysis, the entire data set is typically required to compute the posterior estimates, such as Bayes factor loadings and Bayes unique variances. We propose a simple method for computing…

  15. Using Separable Nonnegative Matrix Factorization Techniques for the Analysis of Time-Resolved Raman Spectra

    NASA Astrophysics Data System (ADS)

    Luce, R.; Hildebrandt, P.; Kuhlmann, U.; Liesen, J.

    2016-09-01

    The key challenge of time-resolved Raman spectroscopy is the identification of the constituent species and the analysis of the kinetics of the underlying reaction network. In this work we present an integral approach that allows for determining both the component spectra and the rate constants simultaneously from a series of vibrational spectra. It is based on an algorithm for non-negative matrix factorization which is applied to the experimental data set following a few pre-processing steps. As a prerequisite for physically unambiguous solutions, each component spectrum must include one vibrational band that does not significantly interfere with vibrational bands of other species. The approach is applied to synthetic "experimental" spectra derived from model systems comprising a set of species with component spectra differing with respect to their degree of spectral interferences and signal-to-noise ratios. In each case, the species involved are connected via monomolecular reaction pathways. The potential and limitations of the approach for recovering the respective rate constants and component spectra are discussed.

  16. Using Separable Nonnegative Matrix Factorization Techniques for the Analysis of Time-Resolved Raman Spectra.

    PubMed

    Luce, Robert; Hildebrandt, Peter; Kuhlmann, Uwe; Liesen, Jörg

    2016-09-01

    The key challenge of time-resolved Raman spectroscopy is the identification of the constituent species and the analysis of the kinetics of the underlying reaction network. In this work we present an integral approach that allows for determining both the component spectra and the rate constants simultaneously from a series of vibrational spectra. It is based on an algorithm for nonnegative matrix factorization that is applied to the experimental data set following a few pre-processing steps. As a prerequisite for physically unambiguous solutions, each component spectrum must include one vibrational band that does not significantly interfere with the vibrational bands of other species. The approach is applied to synthetic "experimental" spectra derived from model systems comprising a set of species with component spectra differing with respect to their degree of spectral interferences and signal-to-noise ratios. In each case, the species involved are connected via monomolecular reaction pathways. The potential and limitations of the approach for recovering the respective rate constants and component spectra are discussed. © The Author(s) 2016.
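
    The decomposition described in these two records can be illustrated on synthetic data. The sketch below uses plain multiplicative-update NMF (Lee-Seung) as a simplified stand-in for the separable-NMF algorithm of the paper; the two Gaussian "component spectra", the first-order kinetics, and the noise level are all made-up values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic component spectra, each with a marker band that does not
# overlap the other's, mixed by first-order (monomolecular) kinetics.
wavenumber = np.arange(100)
S_true = np.array([np.exp(-((wavenumber - 20) ** 2) / 20.0),
                   np.exp(-((wavenumber - 70) ** 2) / 20.0)])
t = np.linspace(0.0, 1.0, 40)
C_true = np.column_stack([np.exp(-3.0 * t), 1.0 - np.exp(-3.0 * t)])
V = C_true @ S_true + 0.01 * rng.random((40, 100))  # time series of spectra

# Multiplicative-update NMF: V ~ W @ H with W, H non-negative.
k = 2
W = rng.random((40, k)) + 0.1   # concentration profiles over time
H = rng.random((k, 100)) + 0.1  # component spectra
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-9)

rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

    The rows of H recover the component spectra and the columns of W the concentration profiles, from which rate constants could then be fitted; the non-overlapping marker bands are what make the factorization essentially unique.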

  17. Face recognition using an enhanced independent component analysis approach.

    PubMed

    Kwak, Keun-Chang; Pedrycz, Witold

    2007-03-01

    This paper is concerned with an enhanced independent component analysis (ICA) and its application to face recognition. Typically, face representations obtained by ICA involve unsupervised learning and high-order statistics. In this paper, we develop an enhancement of the generic ICA by augmenting this method by the Fisher linear discriminant analysis (LDA); hence, its abbreviation, FICA. The FICA is systematically developed and presented along with its underlying architecture. A comparative analysis explores four distance metrics, as well as classification with support vector machines (SVMs). We demonstrate that the FICA approach leads to the formation of well-separated classes in low-dimension subspace and is endowed with a great deal of insensitivity to large variation in illumination and facial expression. The comprehensive experiments are completed for the facial-recognition technology (FERET) face database; a comparative analysis demonstrates that FICA comes with improved classification rates when compared with some other conventional approaches such as eigenface, fisherface, and the ICA itself.
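
    The two-stage structure of FICA (an unsupervised ICA basis followed by a supervised LDA projection) can be sketched with scikit-learn. The synthetic classification data below is a toy stand-in for face images, and all sizes are arbitrary assumptions; this is not the paper's implementation.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Toy stand-in for face data: 200 "images" with 50 features, 2 identities.
X, y = make_classification(n_samples=200, n_features=50, n_informative=10,
                           n_classes=2, random_state=0)

# Stage 1: unsupervised ICA representation (high-order statistics).
ica = FastICA(n_components=10, random_state=0, max_iter=1000)
X_ica = ica.fit_transform(X)

# Stage 2: Fisher LDA on the ICA coefficients for class separation.
lda = LinearDiscriminantAnalysis()
X_fica = lda.fit_transform(X_ica, y)  # the discriminative "FICA" subspace
train_acc = lda.score(X_ica, y)
```

    With two classes, LDA yields a one-dimensional discriminative subspace; classification could then use a distance metric or an SVM, as explored in the paper.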

  18. Biomechanical approaches to identify and quantify injury mechanisms and risk factors in women's artistic gymnastics.

    PubMed

    Bradshaw, Elizabeth J; Hume, Patria A

    2012-09-01

    Targeted injury prevention strategies, based on biomechanical analyses, have the potential to help reduce the incidence and severity of gymnastics injuries. This review outlines the potential benefits of biomechanics research to contribute to injury prevention strategies for women's artistic gymnastics by identification of mechanisms of injury and quantification of the effects of injury risk factors. One hundred and twenty-three articles were retained for review after searching electronic databases using key words, including 'gymnastic', 'biomech*', and 'inj*', and delimiting by language and relevance to the paper aim. Impact load can be measured biomechanically by the use of instrumented equipment (e.g. beatboard), instrumentation on the gymnast (accelerometers), or by landings on force plates. We need further information on injury mechanisms and risk factors in gymnastics and practical methods of monitoring training loads. We have not yet shown, beyond a theoretical approach, how biomechanical analysis of gymnastics can help reduce injury risk through injury prevention interventions. Given the high magnitude of impact load, both acute and accumulative, coaches should monitor impact loads per training session, taking into consideration training quality and quantity such as the control of rotation and the height from which the landings are executed.

  19. Application of Factor Analysis on the Financial Ratios of Indian Cement Industry and Validation of the Results by Cluster Analysis

    NASA Astrophysics Data System (ADS)

    De, Anupam; Bandyopadhyay, Gautam; Chakraborty, B. N.

    2010-10-01

    Financial ratio analysis is an important and commonly used tool in analyzing the financial health of a firm. Quite a large number of financial ratios, which can be categorized in different groups, are used for this analysis. However, to reduce the number of ratios used for financial analysis and to regroup them on the basis of empirical evidence, the Factor Analysis technique has been used successfully by different researchers during the last three decades. In this study, Factor Analysis has been applied to the audited financial data of Indian cement companies over a period of 10 years. The sample companies are listed on the Indian stock exchanges (BSE and NSE). Factor Analysis, conducted over 44 variables (financial ratios) grouped in 7 categories, resulted in 11 underlying categories (factors). Each factor is named in an appropriate manner considering the factor loads and constituent variables (ratios). Representative ratios are identified for each such factor. To validate the results of the Factor Analysis and to reach a final conclusion regarding the representative ratios, Cluster Analysis was performed.
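
    The factor-extraction-then-clustering workflow can be sketched with scikit-learn. The data below is synthetic (a few latent factors driving correlated "ratios"); the real study used 44 audited ratios of listed Indian cement companies, and all sizes here are assumptions.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic stand-in: n_firms firm-years of correlated financial ratios
# generated from a small number of latent factors plus noise.
n_firms, n_ratios, n_factors = 30, 12, 3
loadings = rng.normal(size=(n_factors, n_ratios))
scores = rng.normal(size=(n_firms, n_factors))
ratios = scores @ loadings + 0.1 * rng.normal(size=(n_firms, n_ratios))

# Step 1: extract latent factors underlying the ratio groups.
fa = FactorAnalysis(n_components=n_factors, random_state=0)
factor_scores = fa.fit_transform(ratios)

# Step 2: cluster the firms on their factor scores to cross-check the
# factor structure, mirroring the validation-by-cluster-analysis step.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(factor_scores)
```

    Inspecting `fa.components_` (the loadings) is what would let each factor be named after its dominant constituent ratios.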

  20. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    PubMed

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present real challenges in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases, acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors to correctly identify pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e
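
    Two of the combination rules named in the abstract can be sketched directly. The `additive` function below is the generic sum-of-p-values rule with its Central Limit Theorem approximation, shown as a simplified illustration of one building block of the bi-level framework (not the authors' full method); the p-values are made up.

```python
import numpy as np
from scipy.stats import norm

def stouffer(pvals):
    """Stouffer's method: combine one-sided p-values via summed z-scores."""
    z = norm.isf(np.asarray(pvals, dtype=float))
    return float(norm.sf(z.sum() / np.sqrt(len(pvals))))

def additive(pvals):
    """Additive method: under H0 the sum of n independent uniform p-values
    is approximately Normal(n/2, n/12) by the Central Limit Theorem."""
    p = np.asarray(pvals, dtype=float)
    n = len(p)
    return float(norm.cdf((p.sum() - n / 2.0) / np.sqrt(n / 12.0)))

# Three concordant studies plus one discordant study (illustrative values):
pvals = [0.01, 0.02, 0.03, 0.80]
p_stouffer = stouffer(pvals)
p_additive = additive(pvals)
```

    Because the additive statistic depends on the sum of bounded uniforms rather than unbounded z-scores, a single extreme study shifts it less, which is the intuition behind the outlier-robustness claim.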

  1. Cleared for the visual approach: Human factor problems in air carrier operations

    NASA Technical Reports Server (NTRS)

    Monan, W. P.

    1983-01-01

    In the study described herein, a set of 353 ASRS reports of unique aviation occurrences significantly involving visual approaches was examined to identify hazards and pitfalls embedded in the visual approach procedure and to consider operational practices that might help avoid future mishaps. Analysis of the report set identified nine aspects of the visual approach procedure that appeared to be predisposing conditions for inducing or exacerbating the effects of operational errors by flight crew members or controllers. Predisposing conditions, errors, and operational consequences of the errors are discussed. In summary, operational policies that might mitigate the problems are examined.

  2. Two Experiments to Approach the Boltzmann Factor: Chemical Reaction and Viscous Flow

    ERIC Educational Resources Information Center

    Fazio, Claudio; Battaglia, Onofrio R.; Guastella, Ivan

    2012-01-01

    In this paper we discuss a pedagogical approach aimed at pointing out the role played by the Boltzmann factor in describing phenomena usually perceived as regulated by different mechanisms of functioning. Experimental results regarding some aspects of a chemical reaction and of the viscous flow of some liquids are analysed and described in terms…

  3. The Effect of Differentiation Approach Developed on Creativity of Gifted Students: Cognitive and Affective Factors

    ERIC Educational Resources Information Center

    Altintas, Esra; Özdemir, Ahmet S.

    2015-01-01

    The aim of the study is to develop a differentiation approach for the mathematics education of gifted middle school students and to determine the effect of the differentiation approach on creative thinking skills of gifted students based on both cognitive and affective factors. In this context, the answer to the following question was searched:…

  4. Deep Learning with Hierarchical Convolutional Factor Analysis

    PubMed Central

    Chen, Bo; Polatkan, Gungor; Sapiro, Guillermo; Blei, David; Dunson, David; Carin, Lawrence

    2013-01-01

    Unsupervised multi-layered (“deep”) models are considered for general data, with a particular focus on imagery. The model is represented using a hierarchical convolutional factor-analysis construction, with sparse factor loadings and scores. The computation of layer-dependent model parameters is implemented within a Bayesian setting, employing a Gibbs sampler and variational Bayesian (VB) analysis, that explicitly exploit the convolutional nature of the expansion. In order to address large-scale and streaming data, an online version of VB is also developed. The number of basis functions or dictionary elements at each layer is inferred from the data, based on a beta-Bernoulli implementation of the Indian buffet process. Example results are presented for several image-processing applications, with comparisons to related models in the literature. PMID:23787342

  5. Factor Analysis by Generalized Least Squares.

    ERIC Educational Resources Information Center

    Joreskog, Karl G.; Goldberger, Arthur S.

    Aitkin's generalized least squares (GLS) principle, with the inverse of the observed variance-covariance matrix as a weight matrix, is applied to estimate the factor analysis model in the exploratory (unrestricted) case. It is shown that the GLS estimates are scale free and asymptotically efficient. The estimates are computed by a rapidly…

  6. Bayesian approach for counting experiment statistics applied to a neutrino point source analysis

    NASA Astrophysics Data System (ADS)

    Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.

    2013-12-01

    In this paper we present a model independent analysis method following Bayesian statistics to analyse data from a generic counting experiment and apply it to the search for neutrinos from point sources. We discuss a test statistic defined following a Bayesian framework that will be used in the search for a signal. If no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate. As such, we have direct access to any signal upper limit. The upper limit derivation directly compares with a frequentist approach and is robust in the case of low-counting observations. Furthermore, it also allows accounting for previous upper limits obtained by other analyses via the concept of prior information, without the need for an ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we have applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and we have obtained a flux upper limit, which is in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube, using the same data set.
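
The posterior upper-limit construction described here can be illustrated with a minimal sketch: a single-bin Poisson counting experiment with known background, a flat prior on the signal rate, and the credible upper limit read off the normalized posterior by grid summation. The function name and all numbers are illustrative, not taken from the IceCube analysis.

```python
import math

def poisson_upper_limit(n_obs, b, cl=0.90, s_max=50.0, steps=20000):
    """Credible upper limit on a Poisson signal rate s, with known
    background b and a flat prior on s >= 0 (illustrative parameters)."""
    ds = s_max / steps
    grid = [i * ds for i in range(steps + 1)]
    # unnormalised posterior: Poisson likelihood for n_obs counts x flat prior
    post = [(s + b) ** n_obs * math.exp(-(s + b)) for s in grid]
    total = sum(post)
    acc = 0.0
    for s, p in zip(grid, post):
        acc += p
        if acc >= cl * total:   # smallest s_up with P(s <= s_up | n_obs) >= cl
            return s
    return s_max

limit = poisson_upper_limit(n_obs=5, b=3.2)
```

Because the full posterior is available, any credible level can be read off the same grid, and a previous experiment's result could enter simply by replacing the flat prior.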

  7. Exploratory Factor Analysis of a Force Concept Inventory Data Set

    ERIC Educational Resources Information Center

    Scott, Terry F.; Schumayer, Daniel; Gray, Andrew R.

    2012-01-01

    We perform a factor analysis on a "Force Concept Inventory" (FCI) data set collected from 2109 respondents. We address two questions: the appearance of conceptual coherence in student responses to the FCI and some consequences of this factor analysis on the teaching of Newtonian mechanics. We will highlight the apparent conflation of Newton's…

  8. Closed-loop, pilot/vehicle analysis of the approach and landing task

    NASA Technical Reports Server (NTRS)

    Anderson, M. R.; Schmidt, D. K.

    1986-01-01

    In the case of approach and landing, it is universally accepted that the pilot uses more than one vehicle response, or output, to close his control loops. Therefore, to model this task, a multi-loop analysis technique is required. The analysis problem has been in obtaining reasonable analytic estimates of the describing functions representing the pilot's loop compensation. Once these pilot describing functions are obtained, appropriate performance and workload metrics must then be developed for the landing task. The optimal control approach provides a powerful technique for obtaining the necessary describing functions, once the appropriate task objective is defined in terms of a quadratic objective function. An approach is presented through the use of a simple, reasonable objective function and model-based metrics to evaluate loop performance and pilot workload. The results of an analysis of the LAHOS (Landing and Approach of Higher Order Systems) study performed by R.E. Smith are also presented.

  9. What to do When Scalar Invariance Fails: The Extended Alignment Method for Multi-Group Factor Analysis Comparison of Latent Means Across Many Groups.

    PubMed

    Marsh, Herbert W; Guo, Jiesi; Parker, Philip D; Nagengast, Benjamin; Asparouhov, Tihomir; Muthén, Bengt; Dicke, Theresa

    2017-01-12

    Scalar invariance is an unachievable ideal that in practice can only be approximated; often using potentially questionable approaches such as partial invariance based on a stepwise selection of parameter estimates with large modification indices. Study 1 demonstrates an extension of the power and flexibility of the alignment approach for comparing latent factor means in large-scale studies (30 OECD countries, 8 factors, 44 items, N = 249,840), for which scalar invariance is typically not supported in the traditional confirmatory factor analysis approach to measurement invariance (CFA-MI). Importantly, we introduce an alignment-within-CFA (AwC) approach, transforming alignment from a largely exploratory tool into a confirmatory tool, and enabling analyses that previously have not been possible with alignment (testing the invariance of uniquenesses and factor variances/covariances; multiple-group MIMIC models; contrasts on latent means) and structural equation models more generally. Specifically, it also allowed a comparison of gender differences in a 30-country MIMIC AwC (i.e., a SEM with gender as a covariate) and a 60-group AwC CFA (i.e., 30 countries × 2 genders) analysis. Study 2, a simulation study following up issues raised in Study 1, showed that latent means were more accurately estimated with alignment than with the scalar CFA-MI, and particularly with partial invariance scalar models based on the heavily criticized stepwise selection strategy. In summary, alignment augmented by AwC provides applied researchers from diverse disciplines considerable flexibility to address substantively important issues when the traditional CFA-MI scalar model does not fit the data. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. Applicability of action planning and coping planning to dental flossing among Norwegian adults: a confirmatory factor analysis approach.

    PubMed

    Astrøm, Anne Nordrehaug

    2008-06-01

    Using a prospective design and a representative sample of 25-yr-old Norwegians, this study hypothesized that action planning and coping planning will add to the prediction of flossing at 4 wk of follow-up over and above the effect of intention and previous flossing. This study tested the validity of a proposed 3-factor structure of the measurement model of intention, action planning, and coping planning, and its invariance across gender. A survey was conducted in three Norwegian counties, and 1,509 out of 8,000 randomly selected individuals completed questionnaires assessing the constructs of action planning and coping planning related to daily flossing. A random subsample of 500 participants was followed up at 4 wk with a telephone interview to assess flossing. Confirmatory factor analysis (CFA) confirmed the proposed 3-factor model after respecification. Although the chi-square test was statistically significant [χ² = 58.501, degrees of freedom (d.f.) = 17], complementary fit indices were satisfactory [goodness-of-fit index (GFI) = 0.99, root mean squared error of approximation (RMSEA) = 0.04]. Multigroup CFA provided evidence of complete invariance of the measurement model across gender. After controlling for previous flossing, intention (beta = 0.08) and action planning (beta = 0.11) emerged as independent predictors of subsequent flossing, accounting for 2.3% of its variance. The factorial validity of intention, action planning, and coping planning, and the validity of action planning in predicting flossing prospectively, were confirmed by the present study.

  11. The motion commotion: Human factors in transportation

    NASA Technical Reports Server (NTRS)

    Millar, A. E., Jr. (Editor); Rosen, R. L. (Editor); Gibson, J. D. (Editor); Crum, R. G. (Editor)

    1972-01-01

    The program for a systems approach to the problem of incorporating human factors in designing transportation systems is summarized. The importance of the human side of transportation is discussed along with the three major factors related to maintaining a mobile and quality life. These factors are (1) people, as individuals and groups, (2) society as a whole, and (3) the natural environment and man-made environs. The problems and bottlenecks are presented along with approaches to their solutions through systems analysis. Specific recommendations essential to achieving improved mobility within environmental constraints are presented.

  12. Pediatric differentiated thyroid carcinoma in stage I: risk factor analysis for disease free survival

    PubMed Central

    2009-01-01

    Background To examine the outcomes and risk factors in pediatric differentiated thyroid carcinoma (DTC) patients who were defined as TNM stage I, because some patients develop disease recurrence but the treatment strategy for such stage I pediatric patients is still controversial. Methods We reviewed 57 consecutive TNM stage I patients (15 years or less) with DTC (46 papillary and 11 follicular) who underwent initial treatment at Ito Hospital between 1962 and 2004 (7 males and 50 females; mean age: 13.1 years; mean follow-up: 17.4 years). Clinicopathological results were evaluated in all patients. Multivariate analysis was performed to reveal the risk factors for disease-free survival (DFS) in these 57 patients. Results Extrathyroid extension and clinical lymphadenopathy at diagnosis were found in 7 and 12 patients, respectively. Subtotal/total thyroidectomy was performed in 23 patients, modified neck dissection in 38, and radioactive iodine therapy in 10. Pathological node metastasis was confirmed in 37 patients (64.9%). Fifteen patients (26.3%) exhibited local recurrence and 3 of them also developed metachronous lung metastasis. Ten of these 15 achieved disease-free status after further treatment, and no patients died of the disease. In multivariate analysis, male gender (p = 0.017), advanced tumor (T3, 4a) stage (p = 0.029), and clinical lymphadenopathy (p = 0.006) were risk factors for DFS in stage I pediatric patients. Conclusion Male gender, tumor stage, and lymphadenopathy are risk factors for DFS in stage I pediatric DTC patients. Aggressive treatment (total thyroidectomy, node dissection, and RI therapy) is considered appropriate for patients with risk factors, whereas a conservative or stepwise approach may be acceptable for other patients. PMID:19723317

  13. Analysis of Factors Influencing Creative Personality of Elementary School Students

    ERIC Educational Resources Information Center

    Park, Jongman; Kim, Minkee; Jang, Shinho

    2017-01-01

    This quantitative research examined factors that affect elementary students' creativity and how those factors correlate. Aiming to identify significant factors that affect creativity and to clarify the relationship between these factors by path analysis, this research was designed to be a stepping stone for creativity enhancement studies. Data…

  14. Associated Υ+γ production at the LHC in the kt-factorization approach

    NASA Astrophysics Data System (ADS)

    Baranov, S. P.

    2010-09-01

    In the framework of the kt-factorization approach, the photon-associated production of Υ mesons at the present-day LHC conditions is studied. The differential cross sections and polarization parameters are calculated in the “helicity” and Collins-Soper systems. Special attention is paid to the effect of experimental cuts that can dramatically change the visible lepton angular distributions.

  15. Associated ϒ + γ production at the LHC in the kt-factorization approach

    NASA Astrophysics Data System (ADS)

    Baranov, S. P.

    2011-05-01

    In the framework of the kt-factorization approach, the photon-associated production of ϒ mesons at the present-day LHC conditions is studied. The differential cross sections and polarization parameters are calculated in the 'helicity' and Collins-Soper systems. Special attention is paid to the effect of experimental cuts that can dramatically change the visible lepton angular distributions.

  16. Impact of different dietary approaches on glycemic control and cardiovascular risk factors in patients with type 2 diabetes: a protocol for a systematic review and network meta-analysis.

    PubMed

    Schwingshackl, Lukas; Chaimani, Anna; Hoffmann, Georg; Schwedhelm, Carolina; Boeing, Heiner

    2017-03-20

    Dietary advice is one of the cornerstones in the management of type 2 diabetes mellitus. The American Diabetes Association recommended a hypocaloric diet for overweight or obese adults with type 2 diabetes in order to induce weight loss. However, there is limited evidence on the optimal approaches to control hyperglycemia in type 2 diabetes patients. The aim of the present study is to assess the comparative efficacy of different dietary approaches on glycemic control and blood lipids in patients with type 2 diabetes mellitus in a systematic review including a standard pairwise and network meta-analysis of randomized trials. We will conduct searches in the Cochrane Central Register of Controlled Trials (CENTRAL) on the Cochrane Library, PubMed (from 1966), and Google Scholar. Citations, abstracts, and relevant papers will be screened for eligibility by two reviewers independently. Randomized controlled trials (with a control group or randomized trials with at least two intervention groups) will be included if they meet the following criteria: (1) include type 2 diabetes mellitus, (2) include patients aged ≥18 years, (3) include a dietary intervention (different types of diets, e.g., Mediterranean dietary pattern, low-carbohydrate diet, low-fat diet, vegetarian diet, high-protein diet; either hypo-caloric, iso-caloric, or ad libitum), (4) minimum intervention period of 12 weeks. For each outcome measure of interest, random effects pairwise and network meta-analyses will be performed in order to determine the pooled relative effect of each intervention relative to every other intervention in terms of the post-intervention values (or mean differences between the changes from baseline value scores). Subgroup analyses are planned for study length, sample size, age, and sex. This systematic review will synthesize the available evidence on the comparative efficacy of different dietary approaches in the management of glycosylated hemoglobin (primary outcome), fasting glucose

  17. National Trends of Simple Prostatectomy for Benign Prostatic Hyperplasia With an Analysis of Risk Factors for Adverse Perioperative Outcomes.

    PubMed

    Pariser, Joseph J; Pearce, Shane M; Patel, Sanjay G; Bales, Gregory T

    2015-10-01

    To examine the national trends of simple prostatectomy (SP) for benign prostatic hyperplasia (BPH) focusing on perioperative outcomes and risk factors for complications. The National Inpatient Sample (2002-2012) was utilized to identify patients with BPH undergoing SP. Analysis included demographics, hospital details, associated procedures, and operative approach (open, robotic, or laparoscopic). Outcomes included complications, length of stay, charges, and mortality. Multivariate logistic regression was used to determine the risk factors for perioperative complications. Linear regression was used to assess the trends in the national annual utilization of SP. The study population included 35,171 patients. Median length of stay was 4 days (interquartile range 3-6). Cystolithotomy was performed concurrently in 6041 patients (17%). The overall complication rate was 28%, with bleeding occurring most commonly. In total, 148 (0.4%) patients experienced in-hospital mortality. On multivariate analysis, older age, black race, and overall comorbidity were associated with greater risk of complications, while the use of a minimally invasive approach and concurrent cystolithotomy had a decreased risk. Over the study period, the national use of simple prostatectomy decreased, on average, by 145 cases per year (P = .002). By 2012, 135/2580 procedures (5%) were performed using a minimally invasive approach. The nationwide utilization of SP for BPH has decreased. Bleeding complications are common, but perioperative mortality is low. Patients who are older, of black race, or who have multiple comorbidities are at higher risk of complications. Minimally invasive approaches, which are becoming increasingly utilized, may reduce perioperative morbidity. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. The same analysis approach: Practical protection against the pitfalls of novel neuroimaging analysis methods.

    PubMed

    Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan

    2017-12-27

    Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven, for example, by machine learning. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest applying the analysis method used for the experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in the main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail, and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.
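
A minimal sketch of the "same analysis" idea, assuming a toy nearest-centroid decoder in place of a real neuroimaging pipeline: the identical decoding routine is run once on data with a genuine class effect and once on simulated null data (shuffled labels). Above-chance accuracy on the null data would flag a confound in the procedure itself; all names and numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def decode_accuracy(X, y):
    """Half-split nearest-centroid decoding accuracy (the one analysis
    that is applied unchanged to both real and null data)."""
    n = len(y) // 2
    Xtr, ytr, Xte, yte = X[:n], y[:n], X[n:], y[n:]
    c0 = Xtr[ytr == 0].mean(axis=0)
    c1 = Xtr[ytr == 1].mean(axis=0)
    pred = (np.linalg.norm(Xte - c1, axis=1)
            < np.linalg.norm(Xte - c0, axis=1)).astype(int)
    return float((pred == yte).mean())

# simulated data with a genuine class effect
X = rng.normal(0.0, 1.0, (200, 20))
y = rng.integers(0, 2, 200)
X[y == 1] += 0.8

acc_real = decode_accuracy(X, y)                   # well above chance
acc_null = decode_accuracy(X, rng.permutation(y))  # should hover near 0.5
```

The same routine, unchanged, is what gets pointed at design variables, simulated confounds, and control data in the full approach.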

  19. Factors associated with a primary surgical approach for sinonasal squamous cell carcinoma.

    PubMed

    Cracchiolo, Jennifer R; Patel, Krupa; Migliacci, Jocelyn C; Morris, Luc T; Ganly, Ian; Roman, Benjamin R; McBride, Sean M; Tabar, Viviane S; Cohen, Marc A

    2018-03-01

    Primary surgery is the preferred treatment of T1-T4a sinonasal squamous cell carcinoma (SNSCC). Patients with SNSCC in the National Cancer Data Base (NCDB) were analyzed. Factors that contributed to selecting primary surgical treatment were examined. Overall survival (OS) in surgical patients was analyzed. Four thousand seven hundred seventy patients with SNSCC were included. In T1-T4a tumors, lymph node metastases, maxillary sinus location, and treatment at high-volume centers were associated with selecting primary surgery. When primary surgery was utilized, tumor factors and positive margin status were associated with worse OS. Adjuvant therapy improved OS in positive margin resection and advanced T stage cases. Tumor and non-tumor factors are associated with selecting surgery for the treatment of SNSCC. When surgery is selected, tumor factors drive OS. Negative margin resection should be the goal of a primary surgical approach. When a positive margin resection ensues, adjuvant therapy may improve OS. © 2017 Wiley Periodicals, Inc.

  20. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Appendix D to Part 172—Rail Risk Analysis Factors. A. This … safety and security risk analyses required by § 172.820. The risk analysis to be performed may be…

  1. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Appendix D to Part 172—Rail Risk Analysis Factors. A. This … safety and security risk analyses required by § 172.820. The risk analysis to be performed may be…

  2. A propensity score approach to correction for bias due to population stratification using genetic and non-genetic factors.

    PubMed

    Zhao, Huaqing; Rebbeck, Timothy R; Mitra, Nandita

    2009-12-01

    Confounding due to population stratification (PS) arises when differences in both allele and disease frequencies exist in a population of mixed racial/ethnic subpopulations. Genomic control, structured association, principal components analysis (PCA), and multidimensional scaling (MDS) approaches have been proposed to address this bias using genetic markers. However, confounding due to PS can also be due to non-genetic factors. Propensity scores are widely used to address confounding in observational studies but have not been adapted to deal with PS in genetic association studies. We propose a genomic propensity score (GPS) approach to correct for bias due to PS that considers both genetic and non-genetic factors. We compare the GPS method with PCA and MDS using simulation studies. Our results show that GPS can adequately adjust and consistently correct for bias due to PS. Under no/mild, moderate, and severe PS, GPS yielded estimates with bias close to 0 (mean = -0.0044, standard error = 0.0087). Under moderate or severe PS, the GPS method consistently outperforms the PCA method in terms of bias, coverage probability (CP), and type I error. Under moderate PS, the GPS method consistently outperforms the MDS method in terms of CP. PCA maintains relatively high power compared to both MDS and GPS methods under the simulated situations. GPS and MDS are comparable in terms of statistical properties such as bias, type I error, and power. The GPS method provides a novel and robust tool for obtaining less-biased estimates of genetic associations that can consider both genetic and non-genetic factors. 2009 Wiley-Liss, Inc.
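
As an illustration of the general propensity-score idea (not the authors' GPS implementation), a membership propensity can be estimated by logistic regression on genetic and non-genetic covariates together. The sketch below fits one by plain gradient ascent on simulated data; all variable names and numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# hypothetical covariates: a genetic-marker summary and a non-genetic factor
X = rng.normal(0.0, 1.0, (n, 2))
true_w = np.array([1.0, -0.7])
group = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X @ true_w)))).astype(float)

# logistic regression by gradient ascent on the mean log-likelihood:
# the fitted probability is the propensity of subpopulation membership
Xd = np.hstack([np.ones((n, 1)), X])   # prepend an intercept column
w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(Xd @ w)))
    w += 0.1 * Xd.T @ (group - p) / n  # average score (gradient) step

propensity = 1.0 / (1.0 + np.exp(-(Xd @ w)))
```

The estimated propensities could then be used for stratification or as an adjustment covariate in the downstream association model, which is the role the GPS plays in the record above.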

  3. Analysis of cytokine release assay data using machine learning approaches.

    PubMed

    Xiong, Feiyu; Janko, Marco; Walker, Mindi; Makropoulos, Dorie; Weinstock, Daniel; Kam, Moshe; Hrebien, Leonid

    2014-10-01

    The possible onset of Cytokine Release Syndrome (CRS) is an important consideration in the development of monoclonal antibody (mAb) therapeutics. In this study, several machine learning approaches are used to analyze CRS data. The analyzed data come from a human blood in vitro assay which was used to assess the potential of mAb-based therapeutics to produce cytokine release similar to that induced by Anti-CD28 superagonistic (Anti-CD28 SA) mAbs. The data contain 7 mAbs and two negative controls, a total of 423 samples coming from 44 donors. Three machine learning approaches were applied in combination to observations obtained from that assay, namely (i) Hierarchical Cluster Analysis (HCA); (ii) Principal Component Analysis (PCA) followed by K-means clustering; and (iii) Decision Tree Classification (DTC). All three approaches were able to identify the treatment that caused the most severe cytokine response. HCA was able to provide information about the expected number of clusters in the data. PCA coupled with K-means clustering allowed classification of treatments sample by sample and visualization of treatment clusters. DTC models showed the relative importance of various cytokines such as IFN-γ, TNF-α and IL-10 to CRS. The use of these approaches in tandem provides better selection of parameters for one method based on outcomes from another, and an overall improved analysis of the data through complementary approaches. Moreover, the DTC analysis also showed that IL-17 may be correlated with CRS reactions, although this correlation has not yet been corroborated in the literature. Copyright © 2014 Elsevier B.V. All rights reserved.
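
A compact sketch of the second step in that pipeline, PCA followed by K-means, on simulated stand-in data (the real assay matrix is not reproduced here); both steps are written out in NumPy so the mechanics are visible.

```python
import numpy as np

rng = np.random.default_rng(0)
# toy stand-in for the assay matrix: samples x cytokine readouts,
# two well-separated response groups
X = np.vstack([rng.normal(0.0, 1.0, (30, 6)),
               rng.normal(4.0, 1.0, (30, 6))])

# PCA via SVD of the centred data
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T              # coordinates on the first two PCs

# plain k-means (k = 2) on the PC scores, deterministically initialised
centres = np.array([scores[0], scores[-1]])
for _ in range(50):
    d = np.linalg.norm(scores[:, None, :] - centres[None, :, :], axis=2)
    labels = d.argmin(axis=1)       # assign each sample to nearest centre
    centres = np.array([scores[labels == k].mean(axis=0) for k in range(2)])
```

On data this cleanly separated, the two recovered clusters coincide with the two simulated response groups, which is the sample-by-sample classification role PCA + K-means plays in the record.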

  4. Approach to proliferation risk assessment based on multiple objective analysis framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrianov, A.; Kuptsov, I.

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Second, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Third, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  5. Quantitative Analysis of Critical Factors for the Climate Impact of Landfill Mining.

    PubMed

    Laner, David; Cencic, Oliver; Svensson, Niclas; Krook, Joakim

    2016-07-05

    Landfill mining has been proposed as an innovative strategy to mitigate environmental risks associated with landfills, to recover secondary raw materials and energy from the deposited waste, and to enable high-valued land uses at the site. The present study quantitatively assesses the importance of specific factors and conditions for the net contribution of landfill mining to global warming using a novel, set-based modeling approach and provides policy recommendations for facilitating the development of projects contributing to global warming mitigation. Building on life-cycle assessment, scenario modeling and sensitivity analysis methods are used to identify critical factors for the climate impact of landfill mining. The net contributions to global warming of the scenarios range from -1550 (saving) to 640 (burden) kg CO2e per Mg of excavated waste. Nearly 90% of the results' total variation can be explained by changes in four factors, namely the landfill gas management in the reference case (i.e., alternative to mining the landfill), the background energy system, the composition of the excavated waste, and the applied waste-to-energy technology. Based on the analyses, circumstances under which landfill mining should be prioritized or not are identified and sensitive parameters for the climate impact assessment of landfill mining are highlighted.

  6. Connecting the Dots: State Health Department Approaches to Addressing Shared Risk and Protective Factors Across Multiple Forms of Violence

    PubMed Central

    Wilkins, Natalie; Myers, Lindsey; Kuehl, Tomei; Bauman, Alice; Hertz, Marci

    2018-01-01

    Violence takes many forms, including intimate partner violence, sexual violence, child abuse and neglect, bullying, suicidal behavior, and elder abuse and neglect. These forms of violence are interconnected and often share the same root causes. They can also co-occur together in families and communities and can happen at the same time or at different stages of life. Often, due to a variety of factors, separate, “siloed” approaches are used to address each form of violence. However, understanding and implementing approaches that prevent and address the overlapping root causes of violence (risk factors) and promote factors that increase the resilience of people and communities (protective factors) can help practitioners more effectively and efficiently use limited resources to prevent multiple forms of violence and save lives. This article presents approaches used by 2 state health departments, the Maryland Department of Health and Mental Hygiene and the Colorado Department of Public Health and Environment, to integrate a shared risk and protective factor approach into their violence prevention work and identifies key lessons learned that may serve to inform crosscutting violence prevention efforts in other states. PMID:29189502

  7. Intelligent data analysis to interpret major risk factors for diabetic patients with and without ischemic stroke in a small population

    PubMed Central

    Gürgen, Fikret; Gürgen, Nurgül

    2003-01-01

    This study proposes an intelligent data analysis approach to investigate and interpret the distinctive factors of diabetes mellitus patients with and without ischemic (non-embolic type) stroke in a small population. The database consists of a total of 16 features collected from 44 diabetic patients. Features include age, gender, duration of diabetes, cholesterol, high density lipoprotein, triglyceride levels, neuropathy, nephropathy, retinopathy, peripheral vascular disease, myocardial infarction rate, glucose level, medication and blood pressure. Metric and non-metric features are distinguished. First, the mean and covariance of the data are estimated and the correlated components are observed. Second, major components are extracted by principal component analysis. Finally, as common examples of local and global classification approaches, a k-nearest neighbor classifier and a high-degree polynomial classifier such as a multilayer perceptron are employed for classification, both with all components and with the major components only. Macrovascular changes emerged as the principal distinctive factors of ischemic stroke in diabetes mellitus. Microvascular changes were generally ineffective discriminators. Recommendations were made according to the rules of evidence-based medicine. Briefly, this case study, based on a small population, supports theories of stroke in diabetes mellitus patients and also concludes that the use of intelligent data analysis improves personalized preventive intervention. PMID:12685939

  8. Confirmatory factor analysis using Microsoft Excel.

    PubMed

    Miles, Jeremy N V

    2005-11-01

    This article presents a method for using Microsoft (MS) Excel for confirmatory factor analysis (CFA). CFA is often seen as an impenetrable technique, and thus, when it is taught, there is frequently little explanation of the mechanisms or underlying calculations. The aim of this article is to demonstrate that this is not the case; it is relatively straightforward to produce a spreadsheet in MS Excel that can carry out simple CFA. It is possible, with few or no programming skills, to effectively program a CFA analysis and, thus, to gain insight into the workings of the procedure.
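
The core computation such a spreadsheet performs is small enough to show directly: build the model-implied covariance Σ = ΛΦΛ′ + Θ and evaluate the ML discrepancy against a sample covariance. The one-factor numbers below are hypothetical (shown here in Python rather than Excel formulas); with S set equal to Σ the discrepancy is exactly zero.

```python
import numpy as np

# hypothetical one-factor CFA with three indicators
lam = np.array([[0.8], [0.7], [0.6]])   # factor loadings (Lambda)
phi = np.array([[1.0]])                 # factor variance (Phi)
theta = np.diag([0.36, 0.51, 0.64])     # unique variances (Theta)

# model-implied covariance matrix: Sigma = Lambda Phi Lambda' + Theta
sigma = lam @ phi @ lam.T + theta

# ML discrepancy F = ln|Sigma| - ln|S| + tr(S Sigma^-1) - p,
# here evaluated against S = Sigma, so F = 0 (a perfectly fitting model)
S = sigma.copy()
_, logdet_sigma = np.linalg.slogdet(sigma)
_, logdet_S = np.linalg.slogdet(S)
F = logdet_sigma - logdet_S + np.trace(S @ np.linalg.inv(sigma)) - 3
```

In a spreadsheet, the same cells (Σ and F) are what a solver minimizes over the free loadings and variances, which is the procedure the article builds in MS Excel.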

  9. Adversarial risk analysis with incomplete information: a level-k approach.

    PubMed

    Rothschild, Casey; McLay, Laura; Guikema, Seth

    2012-07-01

    This article proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but have bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including settings with asynchronous play and partial but incomplete revelation of early moves. Its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple defend-attack model in which the defender's countermeasures are revealed with a probability less than one to the attacker before he decides on how or whether to attack. © 2011 Society for Risk Analysis.
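
A toy rendering of level-k reasoning in a defend-attack setting (payoff matrix and structure hypothetical, not from the article): a level-0 attacker targets uniformly at random, and each higher level best-responds to the level below.

```python
import numpy as np

# hypothetical defend-attack losses: rows = defender countermeasure,
# cols = attacker target; entries are losses to the defender
L = np.array([[1.0, 4.0],
              [3.0, 0.5]])

def level_k_defence(L, k):
    """Defender's level-k countermeasure: a level-0 attacker picks
    targets uniformly; each higher level best-responds to the one below."""
    p = np.full(L.shape[1], 1.0 / L.shape[1])       # level-0 attacker
    for _ in range(k - 1):
        d = int((L @ p).argmin())                   # defender best response
        p = np.eye(L.shape[1])[int(L[d].argmax())]  # attacker best response
    return int((L @ p).argmin())                    # level-k defender choice
```

In this toy matrix the level-1 defender hedges against a naive attacker, while the level-2 defender anticipates the attacker's best response and switches countermeasures, illustrating how bounded-rationality depth changes the recommended defence.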

  10. Multivariate geometry as an approach to algal community analysis

    USGS Publications Warehouse

    Allen, T.F.H.; Skagen, S.

    1973-01-01

    Multivariate analyses are put in the context of more usual approaches to phycological investigations. The intuitive common sense involved in methods of ordination, classification, and discrimination is emphasised by simple geometric accounts which avoid jargon and matrix algebra. Warnings are given that artifacts result from technique abuses by the naive or over-enthusiastic. An analysis of a simple periphyton data set is presented as an example of the approach. Suggestions are made as to situations in phycological investigations where the techniques could be appropriate. The discipline is reprimanded for its neglect of the multivariate approach.

  11. An Exploratory Exercise in Taguchi Analysis of Design Parameters: Application to a Shuttle-to-space Station Automated Approach Control System

    NASA Technical Reports Server (NTRS)

    Deal, Don E.

    1991-01-01

    The chief goals of the summer project have been twofold: first, for my host group and myself to learn as much of the working details of Taguchi analysis as possible in the time allotted, and, second, to apply the methodology to a design problem with the intention of establishing a preliminary set of near-optimal (in the sense of producing a desired response) design parameter values from among a large number of candidate factor combinations. The selected problem is concerned with determining design factor settings for an automated approach program which is to have the capability of guiding the Shuttle into the docking port of the Space Station under controlled conditions so as to meet and/or optimize certain target criteria. The candidate design parameters under study were glide path (i.e., approach) angle, path intercept and approach gains, and minimum impulse bit mode (a parameter which defines how Shuttle jets shall be fired). Several performance criteria were of concern: terminal relative velocity at the instant the two spacecraft are mated; docking offset; number of Shuttle jet firings in certain specified directions (of interest due to possible plume impingement on the Station's solar arrays); and total RCS (a measure of the energy expended in performing the approach/docking maneuver). In the material discussed here, we have focused on a single performance criterion: total RCS. An analysis of the possibility of employing a multiobjective function composed of a weighted sum of the various individual criteria has been undertaken but is, at this writing, incomplete. Results from the Taguchi statistical analysis indicate that only three of the original four posited factors are significant in affecting RCS response. A comparison of model simulation output (via Monte Carlo) with predictions based on estimated factor effects inferred through the Taguchi experiment array data suggested acceptable or close agreement between the two except at the predicted optimum

  12. Factors which influence necropsy requests: a psychological approach.

    PubMed Central

    Start, R. D.; Hector-Taylor, M. J.; Cotton, D. W.; Startup, M.; Parsons, M. A.; Kennedy, A.

    1992-01-01

    AIMS: To determine which factors influence a clinician's decision to request a necropsy. METHODS: Patient age, confidence in premortem diagnosis, relatives' attitudes, and conditions of necropsy practice were combined factorially (two levels each) in separate medical and surgical questionnaires based on clinical case histories. The interactions between the factors were measured by a repeated measures factorial analysis of variance for each of the two clinical groups. The influence of the clinician's interest in necropsies on these interactions was also examined by a similar method. RESULTS: Necropsies were more likely to be requested on young patients, when diagnostic confidence was low, and when relatives' attitudes were favourable. Conditions of necropsy practice did not affect the likelihood of a request and there was no apparent overall difference in necropsy requests between the two groups of clinicians. The "patient age" and "relatives" factors had less influence on the decision of the surgical group to request necropsy. This was attributed to the opportunity to "see for themselves" at operation and was supported by the finding that surgeons were very likely to request necropsies in the absence of surgical intervention. Clinicians from both groups with a high pre-existing interest in the necropsy were consistently more likely to request necropsies. CONCLUSIONS: The "case history" based questionnaires successfully measured the relative influence of multiple factors in relation to the decision of clinicians to request a necropsy. These findings suggest that any attempt to reverse the decline in necropsy rates should focus on changing the clinician's perception of the value of the modern necropsy. PMID:1556237

  13. Noise-band factor analysis of cancer Fourier transform infrared evanescent-wave fiber optical (FTIR-FEW) spectra

    NASA Astrophysics Data System (ADS)

    Sukuta, Sydney; Bruch, Reinhard F.

    2002-05-01

    The goal of this study is to test the feasibility of using noise factor/eigenvector bands as general clinical analytical tools for diagnoses. We developed a new technique, Noise Band Factor Cluster Analysis (NBFCA), to diagnose benign tumors via their Fourier transform IR fiber optic evanescent wave spectral data for the first time. Mid-IR spectra of normal human skin tissue and of benign and melanoma tumors were analyzed using this new diagnostic technique. Our results are not in full agreement with pathological classifications; hence, our approach could complement or improve these traditional classification schemes. Moreover, NBFCA makes it much easier to delineate class boundaries, and the method therefore provides results with much higher certainty.

  14. Microbial genome analysis: the COG approach.

    PubMed

    Galperin, Michael Y; Kristensen, David M; Makarova, Kira S; Wolf, Yuri I; Koonin, Eugene V

    2017-09-14

    For the past 20 years, the Clusters of Orthologous Genes (COG) database has been a popular tool for microbial genome annotation and comparative genomics. Initially created for the purpose of evolutionary classification of protein families, the COGs have been used, apart from straightforward functional annotation of sequenced genomes, for such tasks as (i) unification of genome annotation in groups of related organisms; (ii) identification of missing and/or undetected genes in complete microbial genomes; (iii) analysis of genomic neighborhoods, in many cases allowing prediction of novel functional systems; (iv) analysis of metabolic pathways and prediction of alternative forms of enzymes; (v) comparison of organisms by COG functional categories; and (vi) prioritization of targets for structural and functional characterization. Here we review the principles of the COG approach and discuss its key advantages and drawbacks in microbial genome analysis. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.

  15. Risk factors of chronic periodontitis on healing response: a multilevel modelling analysis.

    PubMed

    Song, J; Zhao, H; Pan, C; Li, C; Liu, J; Pan, Y

    2017-09-15

    Chronic periodontitis is a multifactorial polygenetic disease with an increasing number of associated factors that have been identified over recent decades. Longitudinal epidemiologic studies have demonstrated that these risk factors are related to the progression of the disease. A traditional multivariate regression model was used to find risk factors associated with chronic periodontitis; however, standard statistical procedures demand independence of individual observations. Multilevel modelling (MLM) data analysis has been widely used in recent years; it respects the hierarchical structure of the data, decomposes the error terms into different levels, and provides a new analytic method and framework for solving this problem. The purpose of our study is to investigate the relationship between clinical periodontal indices and the risk factors of chronic periodontitis through MLM analysis and to identify high-risk individuals in the clinical setting. Fifty-four patients with moderate to severe periodontitis were included. They were treated by means of non-surgical periodontal therapy and then made follow-up visits regularly at 3, 6, and 12 months after therapy. Each patient answered a questionnaire survey and underwent measurement of clinical periodontal parameters. Compared with baseline, probing depth (PD) and clinical attachment loss (CAL) improved significantly after non-surgical periodontal therapy with regular follow-up visits at 3, 6, and 12 months after therapy. The null model and variance component models with no independent variables included were initially obtained to investigate the variance of the PD and CAL reductions across all three levels, and they showed a statistically significant difference (P < 0.001), thus establishing that MLM data analysis was necessary. Site-level had effects on PD and CAL reduction; those variables could explain 77-78% of PD reduction and 70-80% of CAL reduction at 3, 6, and 12 months. Other levels only
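The variance-decomposition step behind such a null model can be illustrated with a naive plug-in calculation of the intraclass correlation (ICC): the share of total variance that lies between groups rather than within them. The grouped values below are hypothetical PD reductions by patient; real MLM software estimates these components by (restricted) maximum likelihood rather than this simple formula.

```python
# Naive ICC sketch: between-group variance / (between + within).
# Hypothetical PD reductions, three measurement sites per patient.
groups = [[1.9, 2.1, 2.0],   # patient 1
          [1.0, 1.2, 1.1],   # patient 2
          [2.9, 3.1, 3.0]]   # patient 3

n_total = sum(len(g) for g in groups)
grand_mean = sum(x for g in groups for x in g) / n_total
group_means = [sum(g) / len(g) for g in groups]

# plug-in estimates (population variances, not REML)
between = sum((m - grand_mean) ** 2 for m in group_means) / len(groups)
within = sum(sum((x - m) ** 2 for x in g) / len(g)
             for g, m in zip(groups, group_means)) / len(groups)

icc = between / (between + within)
```

Here almost all variation sits between patients, so the ICC is close to 1, which is exactly the situation in which single-level regression is inappropriate and MLM is needed.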

  16. Classification and identification of molecules through factor analysis method based on terahertz spectroscopy

    NASA Astrophysics Data System (ADS)

    Huang, Jianglou; Liu, Jinsong; Wang, Kejia; Yang, Zhengang; Liu, Xiaming

    2018-06-01

    By means of a factor analysis approach, a method of molecule classification is built from the measured terahertz absorption spectra of the molecules. A data matrix is obtained by sampling the absorption spectra at different frequency points and is then decomposed into the product of two matrices: a weight matrix and a characteristic matrix. Applying K-means clustering to the weight matrix classifies the molecules. A group of samples (spirobenzopyran, indole, styrene derivatives, and inorganic salts) was prepared and measured with a terahertz time-domain spectrometer. These samples were classified with 75% accuracy relative to classification based directly on their molecular formulas.
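A minimal sketch of the classification step: once factor analysis has reduced each absorption spectrum to a row of the weight matrix, K-means clustering groups those rows. The two-factor weight values below are hypothetical, and this plain K-means stands in for whatever implementation the authors used.

```python
# K-means on rows of a (hypothetical) weight matrix, pure Python.
import math
import random

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)        # initialize at k data points
    labels = [0] * len(points)
    for _ in range(iters):
        # assign each point to its nearest centre
        labels = [min(range(k), key=lambda c: dist(p, centers[c]))
                  for p in points]
        # recompute each centre as the mean of its assigned points
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = [sum(x) / len(members)
                              for x in zip(*members)]
    return labels

# Hypothetical two-factor weights for six samples, two chemical classes
weights = [[0.90, 0.10], [0.80, 0.20], [0.85, 0.15],   # class A
           [0.10, 0.90], [0.20, 0.80], [0.15, 0.85]]   # class B
labels = kmeans(weights, k=2)
```

With this well-separated toy data, the two recovered clusters coincide with the two classes regardless of the random initialization.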

  17. Personalized translational epilepsy research - Novel approaches and future perspectives: Part I: Clinical and network analysis approaches.

    PubMed

    Rosenow, Felix; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Bauer, Sebastian

    2017-11-01

    Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. Part I includes the clinical phenotyping and diagnostic methods, EEG network-analysis, biomarkers, and personalized treatment approaches. In Part II, experimental and translational approaches will be discussed (Bauer et al., 2017) [1]. Copyright © 2017 Elsevier Inc.

  18. Testing all six person-oriented principles in dynamic factor analysis.

    PubMed

    Molenaar, Peter C M

    2010-05-01

    All six person-oriented principles identified by Sterba and Bauer's Keynote Article can be tested by means of dynamic factor analysis in its current form. In particular, it is shown how complex interactions and interindividual differences/intraindividual change can be tested in this way. In addition, the necessity to use single-subject methods in the analysis of developmental processes is emphasized, and attention is drawn to the possibility to optimally treat developmental psychopathology by means of new computational techniques that can be integrated with dynamic factor analysis.

  19. Translation and Validation of the Nomophobia Questionnaire in the Italian Language: Exploratory Factor Analysis.

    PubMed

    Adawi, Mohammad; Bragazzi, Nicola Luigi; Argumosa-Villar, Lidia; Boada-Grau, Joan; Vigil-Colet, Andreu; Yildirim, Caglar; Del Puente, Giovanni; Watad, Abdulla

    2018-01-22

    Nomophobia, which is a neologism derived from the combination of "no mobile," "phone," and "phobia," is considered to be a modern situational phobia and indicates a fear of feeling disconnected. No psychometric scales are available in Italian for investigating such a construct. We therefore planned a translation and validation study of the Nomophobia Questionnaire (NMP-Q), which is an instrument developed by Yildirim and Correia. Subjects were recruited via an online survey using a snowball approach. The NMP-Q was translated from English into Italian using a classical "backwards and forwards" procedure. In order to explore the underlying factor structure of the translated questionnaire, an exploratory factor analysis was carried out. A principal component analysis approach with varimax rotation was performed. Multivariate regression analyses were computed to shed light on the psychological predictors of nomophobia. A sample of 403 subjects volunteered to take part in the study. The average age of participants was 27.91 years (standard deviation 8.63) and the sample was comprised of 160 males (160/403, 39.7%) and 243 females (243/403, 60.3%). Forty-five subjects spent less than 1 hour on their mobile phone per day (45/403, 11.2%), 94 spent between 1 and 2 hours (94/403, 23.3%), 69 spent between 2 and 3 hours (69/403, 17.1%), 58 spent between 3 and 4 hours (58/403, 14.4%), 48 spent between 4 and 5 hours (48/403, 11.9%), 29 spent between 5 and 7 hours (29/403, 7.2%), 36 spent between 7 and 9 hours (36/403, 8.9%), and 24 spent more than 10 hours (24/403, 6.0%). The eigenvalues and scree plot supported a 3-factorial nature of the translated questionnaire. The NMP-Q showed an overall Cronbach alpha coefficient of 0.95 (0.94, 0.89, and 0.88 for the three factors). The first factor explained up to 23.32% of the total variance, while the second and third factors explained up to 23.91% and 18.67% of the variance, respectively. The total NMP-Q score correlated with the number

  20. Translation and Validation of the Nomophobia Questionnaire in the Italian Language: Exploratory Factor Analysis

    PubMed Central

    Argumosa-Villar, Lidia; Boada-Grau, Joan; Vigil-Colet, Andreu; Yildirim, Caglar; Del Puente, Giovanni; Watad, Abdulla

    2018-01-01

    Background Nomophobia, which is a neologism derived from the combination of “no mobile,” “phone,” and “phobia” is considered to be a modern situational phobia and indicates a fear of feeling disconnected. Objective No psychometric scales are available in Italian for investigating such a construct. We therefore planned a translation and validation study of the Nomophobia Questionnaire (NMP-Q), which is an instrument developed by Yildirim and Correia. Subjects were recruited via an online survey using a snowball approach. Methods The NMP-Q was translated from English into Italian using a classical “backwards and forwards” procedure. In order to explore the underlying factor structure of the translated questionnaire, an exploratory factor analysis was carried out. A principal component analysis approach with varimax rotation was performed. Multivariate regression analyses were computed to shed light on the psychological predictors of nomophobia. Results A sample of 403 subjects volunteered to take part in the study. The average age of participants was 27.91 years (standard deviation 8.63) and the sample was comprised of 160 males (160/403, 39.7%) and 243 females (243/403, 60.3%). Forty-five subjects spent less than 1 hour on their mobile phone per day (45/403, 11.2%), 94 spent between 1 and 2 hours (94/403, 23.3%), 69 spent between 2 and 3 hours (69/403, 17.1%), 58 spent between 3 and 4 hours (58/403, 14.4%), 48 spent between 4 and 5 hours (48/403, 11.9%), 29 spent between 5 and 7 hours (29/403, 7.2%), 36 spent between 7 and 9 hours (36/403, 8.9%), and 24 spent more than 10 hours (24/403, 6.0%). The eigenvalues and scree plot supported a 3-factorial nature of the translated questionnaire. The NMP-Q showed an overall Cronbach alpha coefficient of 0.95 (0.94, 0.89, and 0.88 for the three factors). The first factor explained up to 23.32% of the total variance, while the second and third factors explained up to 23.91% and 18.67% of the variance

  1. Analysis of risk factors for T. brucei rhodesiense sleeping sickness within villages in south-east Uganda

    PubMed Central

    Zoller, Thomas; Fèvre, Eric M; Welburn, Susan C; Odiit, Martin; Coleman, Paul G

    2008-01-01

    Background Sleeping sickness (HAT) caused by T.b. rhodesiense is a major veterinary and human public health problem in Uganda. Previous studies have investigated spatial risk factors for T.b. rhodesiense at large geographic scales, but none have properly investigated such risk factors at small scales, i.e. within affected villages. In the present work, we use a case-control methodology to analyse both behavioural and spatial risk factors for HAT in an endemic area. Methods The present study investigates behavioural and occupational risk factors for infection with HAT within villages using a questionnaire-based case-control study conducted in 17 villages endemic for HAT in SE Uganda, and spatial risk factors in 4 high risk villages. For the spatial analysis, the location of homesteads with one or more cases of HAT up to three years prior to the beginning of the study was compared to all non-case homesteads. Analysing spatial associations with respect to irregularly shaped geographical objects required the development of a new approach to geographical analysis in combination with a logistic regression model. Results The study was able to identify, among other behavioural risk factors, having a family member with a history of HAT (p = 0.001) as well as proximity of a homestead to a nearby wetland area (p < 0.001) as strong risk factors for infection. The novel method of analysing complex spatial interactions used in the study can be applied to a range of other diseases. Conclusion Spatial risk factors for HAT are maintained across geographical scales; this consistency is useful in the design of decision support tools for intervention and prevention of the disease. Familial aggregation of cases was confirmed for T. b. rhodesiense HAT in the study and probably results from shared behavioural and spatial risk factors among members of a household. PMID:18590541

  2. Improved scatter correction with factor analysis for planar and SPECT imaging

    NASA Astrophysics Data System (ADS)

    Knoll, Peter; Rahmim, Arman; Gültekin, Selma; Šámal, Martin; Ljungberg, Michael; Mirzaei, Siroos; Segars, Paul; Szczupak, Boguslaw

    2017-09-01

    Quantitative nuclear medicine imaging is an increasingly important frontier. In order to achieve quantitative imaging, various interactions of photons with matter have to be modeled and compensated. Although correction for photon attenuation has been addressed by including x-ray CT scans, correction for Compton scatter remains an open issue. The inclusion of scattered photons within the energy window used for planar or SPECT data acquisition decreases the contrast of the image. While a number of methods for scatter correction have been proposed in the past, in this work we propose and assess a novel, user-independent framework applying factor analysis (FA). Extensive Monte Carlo simulations for planar and tomographic imaging were performed using the SIMIND software. Furthermore, planar acquisition of two Petri dishes filled with 99mTc solutions and a Jaszczak phantom study (Data Spectrum Corporation, Durham, NC, USA) using a dual head gamma camera were performed. In order to use FA for scatter correction, we subdivided the applied energy window into a number of sub-windows, serving as input data. FA results in two factor images (photo-peak, scatter) and two corresponding factor curves (energy spectra). Planar and tomographic Jaszczak phantom gamma camera measurements were recorded. The tomographic data (simulations and measurements) were processed for each angular position, resulting in a photo-peak and a scatter data set. The reconstructed transaxial slices of the Jaszczak phantom were quantified using an ImageJ plugin. The data obtained by FA showed good agreement with the energy spectra, photo-peak, and scatter images obtained in all Monte Carlo simulated data sets. For comparison, the standard dual-energy window (DEW) approach was additionally applied for scatter correction. FA, in comparison with the DEW method, results in significant improvements in image accuracy for both planar and tomographic data sets. FA can be used as a user

  3. Preliminary Evaluation of BIM-based Approaches for Schedule Delay Analysis

    NASA Astrophysics Data System (ADS)

    Chou, Hui-Yu; Yang, Jyh-Bin

    2017-10-01

    The problem of schedule delay commonly occurs in construction projects. The quality of delay analysis depends on the availability of schedule-related information and delay evidence; more information used in delay analysis usually produces more accurate and fair analytical results. How to use innovative techniques to improve the quality of schedule delay analysis results has received much attention recently. As the Building Information Modeling (BIM) technique has developed rapidly, approaches using BIM and 4D simulation techniques have been proposed and implemented, with obvious benefits achieved especially in identifying and solving construction sequence problems in advance of construction. This study performs an intensive literature review to discuss the problems encountered in schedule delay analysis and the possibility of developing a BIM-based approach for schedule delay analysis. This study believes that most of the identified problems can be dealt with by the BIM technique. The research results could serve as a foundation for developing new approaches for resolving schedule delay disputes.

  4. Exploratory factor analysis of borderline personality disorder criteria in hospitalized adolescents.

    PubMed

    Becker, Daniel F; McGlashan, Thomas H; Grilo, Carlos M

    2006-01-01

    The authors examined the factor structure of borderline personality disorder (BPD) in hospitalized adolescents and also sought to add to the theoretical and clinical understanding of any homogeneous components by determining whether they may be related to specific forms of Axis I pathology. Subjects were 123 adolescent inpatients, who were reliably assessed with structured diagnostic interviews for Diagnostic and Statistical Manual of Mental Disorders, Revised Third Edition Axes I and II disorders. Exploratory factor analysis identified BPD components, and logistic regression analyses tested whether these components were predictive of specific Axis I disorders. Factor analysis revealed a 4-factor solution that accounted for 67.0% of the variance. Factor 1 ("suicidal threats or gestures" and "emptiness or boredom") predicted depressive disorders and alcohol use disorders. Factor 2 ("affective instability," "uncontrolled anger," and "identity disturbance") predicted anxiety disorders and oppositional defiant disorder. Factor 3 ("unstable relationships" and "abandonment fears") predicted only anxiety disorders. Factor 4 ("impulsiveness" and "identity disturbance") predicted conduct disorder and substance use disorders. Exploratory factor analysis of BPD criteria in adolescent inpatients revealed 4 BPD factors that appear to differ from those reported for similar studies of adults. The factors represent components of self-negation, irritability, poorly modulated relationships, and impulsivity--each of which is associated with characteristic Axis I pathology. These findings shed light on the nature of BPD in adolescents and may also have implications for treatment.

  5. Exploratory factor analysis of the Oral Health Impact Profile.

    PubMed

    John, M T; Reissmann, D R; Feuerstahler, L; Waller, N; Baba, K; Larsson, P; Celebić, A; Szabo, G; Rener-Sitar, K

    2014-09-01

    Although oral health-related quality of life (OHRQoL) as measured by the Oral Health Impact Profile (OHIP) is thought to be multidimensional, the nature of these dimensions is not known. The aim of this report was to explore the dimensionality of the OHIP using the Dimensions of OHRQoL (DOQ) Project, an international study of general population subjects and prosthodontic patients. Using the project's Learning Sample (n = 5173), we conducted an exploratory factor analysis on the 46 OHIP items not specifically referring to dentures for 5146 subjects with sufficiently complete data. The first eigenvalue (27·0) of the polychoric correlation matrix was more than ten times larger than the second eigenvalue (2·6), suggesting the presence of a dominant, higher-order general factor. Follow-up analyses with Horn's parallel analysis revealed a viable second-order, four-factor solution. An oblique rotation of this solution revealed four highly correlated factors that we named Oral Function, Oro-facial Pain, Oro-facial Appearance and Psychosocial Impact. These four dimensions and the strong general factor are two viable hypotheses for the factor structure of the OHIP. © 2014 John Wiley & Sons Ltd.
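The dominant-factor check described above, comparing the first eigenvalue of the correlation matrix with the second, can be sketched with power iteration and deflation in a few lines; the small correlation matrix below is hypothetical, not OHIP data.

```python
# First two eigenvalues of a correlation matrix via power iteration
# with deflation, pure Python. The 3x3 matrix is hypothetical.
import math

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v)))
            for i in range(len(M))]

def power_iteration(M, iters=500):
    v = [1.0] * len(M)
    for _ in range(iters):
        w = matvec(M, v)
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Rayleigh quotient of the converged vector gives the eigenvalue
    Mv = matvec(M, v)
    return sum(vi * mi for vi, mi in zip(v, Mv)), v

def deflate(M, lam, v):
    # subtract lam * v v' to expose the next-largest eigenvalue
    return [[M[i][j] - lam * v[i] * v[j] for j in range(len(M))]
            for i in range(len(M))]

R = [[1.00, 0.70, 0.60],
     [0.70, 1.00, 0.65],
     [0.60, 0.65, 1.00]]
l1, v1 = power_iteration(R)
l2, _ = power_iteration(deflate(R, l1, v1))
ratio = l1 / l2   # a large ratio suggests one dominant general factor
```

For this matrix the first eigenvalue (about 2.3) is several times the second (about 0.4), the same qualitative pattern the OHIP analysis reports on a much larger scale.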

  6. Factor analysis of an instrument to measure the impact of disease on daily life.

    PubMed

    Pedrosa, Rafaela Batista Dos Santos; Rodrigues, Roberta Cunha Matheus; Padilha, Kátia Melissa; Gallani, Maria Cecília Bueno Jayme; Alexandre, Neusa Maria Costa

    2016-01-01

    To verify the structure of factors of an instrument to measure the Heart Valve Disease Impact on Daily Life (IDCV) when applied to coronary artery disease patients. The study included 153 coronary artery disease patients undergoing outpatient follow-up care. The IDCV structure of factors was initially assessed by means of confirmatory factor analysis and, subsequently, by exploratory factor analysis. The Varimax rotation method was used to estimate the main components of analysis, with eigenvalues greater than one for extraction of factors and factor loadings greater than 0.40 for selection of items. Internal consistency was estimated using Cronbach's alpha coefficient. Confirmatory factor analysis did not confirm the original structure of factors of the IDCV. Exploratory factor analysis showed three dimensions, which together explained 78% of the measurement variance. Future studies with expanded case selection are necessary to confirm the IDCV's new structure of factors.

  7. A risk-based approach to management of leachables utilizing statistical analysis of extractables.

    PubMed

    Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M

    2015-04-01

    To incorporate quality by design concepts into the management of leachables, an emphasis is often put on understanding the extractable profile for the materials of construction for manufacturing disposables, container-closure, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine if quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile from an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle.

  8. Analysis of eighty-four commercial aviation incidents - Implications for a resource management approach to crew training

    NASA Technical Reports Server (NTRS)

    Murphy, M. R.

    1980-01-01

    A resource management approach to aircrew performance is defined and utilized in structuring an analysis of 84 exemplary incidents from the NASA Aviation Safety Reporting System. The distribution of enabling, associated (evolutionary), and recovery factors between and within five analytic categories suggests that resource management training be concentrated on: (1) interpersonal communications, with air traffic control information of major concern; (2) task management, mainly setting priorities and appropriately allocating tasks under varying workload levels; and (3) planning, coordination, and decision making concerned with preventing and recovering from potentially unsafe situations in certain aircraft maneuvers.

  9. Factors Influencing Students' Adoption of E-Learning: A Structural Equation Modeling Approach

    ERIC Educational Resources Information Center

    Tarhini, Ali; Masa'deh, Ra'ed; Al-Busaidi, Kamla Ali; Mohammed, Ashraf Bany; Maqableh, Mahmoud

    2017-01-01

    Purpose: This research aims to examine the factors that may hinder or enable the adoption of e-learning systems by university students. Design/methodology/approach: A conceptual framework was developed through extending the unified theory of acceptance and use of technology (performance expectancy, effort expectancy, hedonic motivation, habit,…

  10. Discovering transcription factor binding sites in highly repetitive regions of genomes with multi-read analysis of ChIP-Seq data.

    PubMed

    Chung, Dongjun; Kuan, Pei Fen; Li, Bo; Sanalkumar, Rajendran; Liang, Kun; Bresnick, Emery H; Dewey, Colin; Keleş, Sündüz

    2011-07-01

    Chromatin immunoprecipitation followed by high-throughput sequencing (ChIP-seq) is rapidly replacing chromatin immunoprecipitation combined with genome-wide tiling array analysis (ChIP-chip) as the preferred approach for mapping transcription-factor binding sites and chromatin modifications. The state of the art for analyzing ChIP-seq data relies on using only reads that map uniquely to a relevant reference genome (uni-reads). This can lead to the omission of up to 30% of alignable reads. We describe a general approach for utilizing reads that map to multiple locations on the reference genome (multi-reads). Our approach is based on allocating multi-reads as fractional counts using a weighted alignment scheme. Using human STAT1 and mouse GATA1 ChIP-seq datasets, we illustrate that incorporation of multi-reads significantly increases sequencing depths, leads to detection of novel peaks that are not otherwise identifiable with uni-reads, and improves detection of peaks in mappable regions. We investigate various genome-wide characteristics of peaks detected only by utilization of multi-reads via computational experiments. Overall, peaks from multi-read analysis have similar characteristics to peaks that are identified by uni-reads except that the majority of them reside in segmental duplications. We further validate a number of GATA1 multi-read only peaks by independent quantitative real-time ChIP analysis and identify novel target genes of GATA1. These computational and experimental results establish that multi-reads can be of critical importance for studying transcription factor binding in highly repetitive regions of genomes with ChIP-seq experiments.
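One simple way to picture the fractional allocation is to split each multi-read across its candidate loci in proportion to some weight, here the uni-read coverage already observed at each locus. This is only a toy weighting scheme for illustration; the paper's actual weighted-alignment model is more elaborate, and the loci and counts below are hypothetical.

```python
# Toy fractional allocation of one multi-read across candidate loci,
# weighted by the uni-read coverage already seen there (hypothetical data).

uni_coverage = {"locusA": 30, "locusB": 10}   # uni-reads mapped per locus

def allocate(candidate_loci, coverage):
    total = sum(coverage[l] for l in candidate_loci)
    if total == 0:
        # no local information: split the read evenly
        return {l: 1.0 / len(candidate_loci) for l in candidate_loci}
    # otherwise, fractional counts proportional to local coverage
    return {l: coverage[l] / total for l in candidate_loci}

frac = allocate(["locusA", "locusB"], uni_coverage)
# locusA receives 0.75 of the read, locusB receives 0.25
```

The fractional counts always sum to one per read, so total sequencing depth is preserved while repetitive regions still receive signal.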

  11. A generalized nonlinear model-based mixed multinomial logit approach for crash data analysis.

    PubMed

    Zeng, Ziqiang; Zhu, Wenbo; Ke, Ruimin; Ash, John; Wang, Yinhai; Xu, Jiuping; Xu, Xinxin

    2017-02-01

    The mixed multinomial logit (MNL) approach, which can account for unobserved heterogeneity, is a promising unordered model that has been employed in analyzing the effect of factors contributing to crash severity. However, its basic assumption of using a linear function to explore the relationship between the probability of crash severity and its contributing factors can be violated in reality. This paper develops a generalized nonlinear model-based mixed MNL approach which is capable of capturing non-monotonic relationships by developing nonlinear predictors for the contributing factors in the context of unobserved heterogeneity. The crash data on seven Interstate freeways in Washington between January 2011 and December 2014 are collected to develop the nonlinear predictors in the model. Thirteen contributing factors in terms of traffic characteristics, roadway geometric characteristics, and weather conditions are identified to have significant mixed (fixed or random) effects on the crash density in three crash severity levels: fatal, injury, and property damage only. The proposed model is compared with the standard mixed MNL model. The comparison results suggest a slight superiority of the new approach in terms of model fit measured by the Akaike Information Criterion (12.06 percent decrease) and Bayesian Information Criterion (9.11 percent decrease). The predicted crash densities for all three levels of crash severities of the new approach are also closer (on average) to the observations than the ones predicted by the standard mixed MNL model. Finally, the significance and impacts of the contributing factors are analyzed. Copyright © 2016 Elsevier Ltd. All rights reserved.
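For readers unfamiliar with the model family, here is a minimal sketch of a multinomial logit with a nonlinear (here, quadratic) predictor. The severity labels mirror the paper's three levels, but all coefficients are invented; the authors' generalized nonlinear predictors and random-effects structure are not reproduced.

```python
import math

# Toy sketch of an MNL with a quadratic term in one contributing factor.
# Coefficients are hypothetical; severity probabilities come from a softmax
# over the per-category utilities.

def mnl_probs(x, params):
    """params: per-category (b0, b1, b2) for utility b0 + b1*x + b2*x**2."""
    utils = [b0 + b1 * x + b2 * x * x for (b0, b1, b2) in params]
    m = max(utils)
    exps = [math.exp(u - m) for u in utils]  # numerically stable softmax
    z = sum(exps)
    return [e / z for e in exps]

# Categories: fatal, injury, property damage only (hypothetical coefficients)
p = mnl_probs(1.5, [(-2.0, 0.3, 0.05), (0.0, 0.2, -0.01), (1.0, 0.0, 0.0)])
```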

  12. Pair production of J/ψ mesons in the kt-factorization approach

    NASA Astrophysics Data System (ADS)

    Baranov, S. P.

    2011-09-01

In the framework of the kt-factorization approach, we consider the production of J/ψ pairs under LHC conditions. We give predictions for the differential cross sections and discuss the source and size of the theoretical uncertainties. We also present a comparison with the collinear parton model, showing a dramatic difference in the J/ψ transverse momentum spectrum and the J/ψ-J/ψ azimuthal correlations. Finally, we give predictions for the polarization observables in the helicity and Collins-Soper frames.
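For context, the generic kt-factorization master formula convolutes unintegrated parton distributions F with an off-shell partonic cross section (schematic textbook form, not a formula quoted from this paper):

```latex
\sigma = \int dx_1\, dx_2\, d^2k_{T1}\, d^2k_{T2}\;
  \mathcal{F}(x_1, k_{T1}^2, \mu^2)\,
  \mathcal{F}(x_2, k_{T2}^2, \mu^2)\,
  \hat{\sigma}(x_1, x_2, k_{T1}, k_{T2}; \mu^2)
```

Unlike the collinear parton model, the initial partons carry nonzero transverse momenta kT, which is what drives the differences in the J/ψ pT spectrum and azimuthal correlations mentioned above.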

  13. A human factors analysis of EVA time requirements

    NASA Technical Reports Server (NTRS)

    Pate, D. W.

    1996-01-01

Human Factors Engineering (HFE), also known as Ergonomics, is a discipline whose goal is to engineer a safer, more efficient interface between humans and machines. HFE makes use of a wide range of tools and techniques to fulfill this goal. One of these tools is known as motion and time study, a technique used to develop time standards for given tasks. A human factors motion and time study was initiated with the goal of developing a database of EVA task times and a method of utilizing the database to predict how long an ExtraVehicular Activity (EVA) should take. Initial development relied on the EVA activities performed during the STS-61 mission (Hubble repair). The first step of the analysis was to become familiar with EVAs and with the previous studies and documents produced on EVAs. After reviewing these documents, an initial set of task primitives and task time modifiers was developed. Videotaped footage of STS-61 EVAs was analyzed using these primitives and task time modifiers. Data for two entire EVA missions and portions of several others, each with two EVA astronauts, were collected for analysis. Feedback from the analysis of the data will be used to further refine the primitives and task time modifiers used. Analysis of variance techniques for categorical data will be used to determine which factors may, individually or by interactions, affect the primitive times and how much of an effect they have.
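The database-plus-modifiers idea can be illustrated with a toy sketch: a task's predicted duration is a primitive's standard time scaled by multiplicative task-time modifiers. All primitive names, times, and modifier values below are invented for illustration; they are not from the NASA database.

```python
# Hypothetical sketch of the prediction scheme: standard primitive times
# scaled by multiplicative modifiers, summed over the tasks in an EVA plan.

PRIMITIVE_TIMES = {"translate": 30.0, "bolt_remove": 45.0}   # seconds (invented)
MODIFIERS = {"gloved": 1.4, "restrained": 0.9, "night_pass": 1.2}  # invented

def predict_task_time(primitive, modifiers):
    t = PRIMITIVE_TIMES[primitive]
    for m in modifiers:
        t *= MODIFIERS[m]  # each modifier scales the standard time
    return t

eva_plan = [("translate", ["gloved"]),
            ("bolt_remove", ["gloved", "night_pass"])]
total = sum(predict_task_time(p, mods) for p, mods in eva_plan)
```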

  14. Dynamic Proteomic Analysis of Pancreatic Mesenchyme Reveals Novel Factors That Enhance Human Embryonic Stem Cell to Pancreatic Cell Differentiation.

    PubMed

    Russ, Holger A; Landsman, Limor; Moss, Christopher L; Higdon, Roger; Greer, Renee L; Kaihara, Kelly; Salamon, Randy; Kolker, Eugene; Hebrok, Matthias

    2016-01-01

    Current approaches in human embryonic stem cell (hESC) to pancreatic beta cell differentiation have largely been based on knowledge gained from developmental studies of the epithelial pancreas, while the potential roles of other supporting tissue compartments have not been fully explored. One such tissue is the pancreatic mesenchyme that supports epithelial organogenesis throughout embryogenesis. We hypothesized that detailed characterization of the pancreatic mesenchyme might result in the identification of novel factors not used in current differentiation protocols. Supplementing existing hESC differentiation conditions with such factors might create a more comprehensive simulation of normal development in cell culture. To validate our hypothesis, we took advantage of a novel transgenic mouse model to isolate the pancreatic mesenchyme at distinct embryonic and postnatal stages for subsequent proteomic analysis. Refined sample preparation and analysis conditions across four embryonic and prenatal time points resulted in the identification of 21,498 peptides with high-confidence mapping to 1,502 proteins. Expression analysis of pancreata confirmed the presence of three potentially important factors in cell differentiation: Galectin-1 (LGALS1), Neuroplastin (NPTN), and the Laminin α-2 subunit (LAMA2). Two of the three factors (LGALS1 and LAMA2) increased expression of pancreatic progenitor transcript levels in a published hESC to beta cell differentiation protocol. In addition, LAMA2 partially blocks cell culture induced beta cell dedifferentiation. Summarily, we provide evidence that proteomic analysis of supporting tissues such as the pancreatic mesenchyme allows for the identification of potentially important factors guiding hESC to pancreas differentiation.

  15. Dynamic Proteomic Analysis of Pancreatic Mesenchyme Reveals Novel Factors That Enhance Human Embryonic Stem Cell to Pancreatic Cell Differentiation

    PubMed Central

    Russ, Holger A.; Landsman, Limor; Moss, Christopher L.; Higdon, Roger; Greer, Renee L.; Kaihara, Kelly; Salamon, Randy; Kolker, Eugene; Hebrok, Matthias

    2016-01-01

    Current approaches in human embryonic stem cell (hESC) to pancreatic beta cell differentiation have largely been based on knowledge gained from developmental studies of the epithelial pancreas, while the potential roles of other supporting tissue compartments have not been fully explored. One such tissue is the pancreatic mesenchyme that supports epithelial organogenesis throughout embryogenesis. We hypothesized that detailed characterization of the pancreatic mesenchyme might result in the identification of novel factors not used in current differentiation protocols. Supplementing existing hESC differentiation conditions with such factors might create a more comprehensive simulation of normal development in cell culture. To validate our hypothesis, we took advantage of a novel transgenic mouse model to isolate the pancreatic mesenchyme at distinct embryonic and postnatal stages for subsequent proteomic analysis. Refined sample preparation and analysis conditions across four embryonic and prenatal time points resulted in the identification of 21,498 peptides with high-confidence mapping to 1,502 proteins. Expression analysis of pancreata confirmed the presence of three potentially important factors in cell differentiation: Galectin-1 (LGALS1), Neuroplastin (NPTN), and the Laminin α-2 subunit (LAMA2). Two of the three factors (LGALS1 and LAMA2) increased expression of pancreatic progenitor transcript levels in a published hESC to beta cell differentiation protocol. In addition, LAMA2 partially blocks cell culture induced beta cell dedifferentiation. Summarily, we provide evidence that proteomic analysis of supporting tissues such as the pancreatic mesenchyme allows for the identification of potentially important factors guiding hESC to pancreas differentiation. PMID:26681951

  16. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    NASA Astrophysics Data System (ADS)

    Tabibzadeh, Maryam

According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) to accident causation. By contrast, results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method to ascertain well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and the NPT its crew conducted is discussed. The risk analysis methodology in this dissertation consists of three different approaches whose integration constitutes the overall methodology. The first approach is the comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew. This analysis contributes to identifying the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches in the previous step, as the main contributors of negative pressure test

  17. Factor Analysis of Drawings: Application to College Student Models of the Greenhouse Effect

    ERIC Educational Resources Information Center

    Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel

    2015-01-01

    Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance,…

  18. Confirmatory factor analysis applied to the Force Concept Inventory

    NASA Astrophysics Data System (ADS)

    Eaton, Philip; Willoughby, Shannon D.

    2018-06-01

In 1995, Huffman and Heller used exploratory factor analysis to call into question the factors of the Force Concept Inventory (FCI). Since then, several papers have been published examining the factors of the FCI on larger sets of student responses, and interpretable factors were extracted as a result. However, none of these proposed factor models has been verified against independent data sets to confirm that it is not unique to its original sample. This paper seeks to confirm the factor models proposed by Scott et al. in 2012 and Hestenes et al. in 1992, as well as another expert model proposed within this study, through the use of confirmatory factor analysis (CFA) and a sample of 20 822 postinstruction student responses to the FCI. Upon application of CFA using the full sample, all three models were found to fit the data with acceptable global fit statistics. However, when CFA was performed using these models on smaller sample sizes, the models proposed by Scott et al. and Eaton and Willoughby were found to be far more stable than the model proposed by Hestenes et al. The goodness of fit of these models to the data suggests that the FCI can be scored on factors that are not unique to a single class. These scores could then be used to comment on how instruction methods affect the performance of students along a single factor, and more in-depth analyses of curriculum changes may be possible as a result.
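As a small worked example of a CFA global fit statistic, the standard RMSEA formula can be computed from a model chi-square, its degrees of freedom, and the sample size. The chi-square and degrees of freedom below are invented; only the sample size echoes the study.

```python
import math

# RMSEA (root mean square error of approximation), a standard CFA global
# fit statistic: sqrt(max(chi2 - df, 0) / (df * (n - 1))).
# Values below 0.05-0.08 are conventionally read as acceptable fit.

def rmsea(chi2, df, n):
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

fit = rmsea(chi2=540.0, df=300, n=20822)   # hypothetical chi2 and df
```

Note how the large sample size drives RMSEA down for a fixed chi-square excess, which is one reason fit conclusions can shift when the same models are evaluated on smaller samples.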

  19. [Prevalence of Risk Factors of Non-Communicable Disease in Kyrgyzstan: Assessment using WHO STEPS Approach].

    PubMed

    A, T A; Makhmutkhodzhaev, S A; Kydyralieva, R B; Altymysheva, A T; Dzhakipova, R S; Zhorupbekova, K S; Ryskulova, S T; Knyazeva, V G; Kaliev, M T; Dzhumagulova, A S

    2016-12-01

Assessment of the prevalence of risk factors for non-communicable diseases (NCD) based on the WHO "STEPS" approach was conducted in Kyrgyzstan. Results of this study demonstrated a high prevalence of NCD risk factors: 94.2% of subjects aged 24-64 years had risk factors. Prevalence of elevated blood pressure was 48.7%, smoking 25.7%, hypercholesterolemia 23.6%, excessive alcohol consumption 31.4%, physical inactivity 11.4%, obesity 23.1%, elevated glucose level 4.5%, diabetes 8.8%, and inadequate intake of fruits and vegetables 74%. The data obtained will make it possible to draft effective preventive measures to combat NCD risk factors at the national level.

  20. Toward Reflective Judgment in Exploratory Factor Analysis Decisions: Determining the Extraction Method and Number of Factors To Retain.

    ERIC Educational Resources Information Center

    Knight, Jennifer L.

    This paper considers some decisions that must be made by the researcher conducting an exploratory factor analysis. The primary purpose is to aid the researcher in making informed decisions during the factor analysis instead of relying on defaults in statistical programs or traditions of previous researchers. Three decision areas are addressed.…

  1. Evaluation of approaches to monitor Staphylococcus aureus virulence factor expression during human disease.

    PubMed

    Rozemeijer, Wouter; Fink, Pamela; Rojas, Eduardo; Jones, C Hal; Pavliakova, Danka; Giardina, Peter; Murphy, Ellen; Liberator, Paul; Jiang, Qin; Girgenti, Douglas; Peters, Remco P H; Savelkoul, Paul H M; Jansen, Kathrin U; Anderson, Annaliesa S; Kluytmans, Jan

    2015-01-01

    Staphylococcus aureus is a versatile pathogen of medical significance, using multiple virulence factors to cause disease. A prophylactic S. aureus 4-antigen (SA4Ag) vaccine comprising capsular polysaccharide (types 5 and 8) conjugates, clumping factor A (ClfA) and manganese transporter C (MntC) is under development. This study was designed to characterize S. aureus isolates recovered from infected patients and also to investigate approaches for examining expression of S. aureus vaccine candidates and the host response during human infection. Confirmation of antigen expression in different disease states is important to support the inclusion of these antigens in a prophylactic vaccine. Hospitalized patients with diagnosed S. aureus wound (27) or bloodstream (24) infections were enrolled. Invasive and nasal carriage S. aureus isolates were recovered and characterized for genotypic diversity. S. aureus antigen expression was evaluated directly by real-time, quantitative, reverse-transcriptase PCR (qRT-PCR) analysis and indirectly by serology using a competitive Luminex immunoassay. Study isolates were genotypically diverse and all had the genes encoding the antigens present in the SA4Ag vaccine. S. aureus nasal carriage was detected in 55% of patients, and in those subjects 64% of the carriage isolates matched the invasive strain. In swab samples with detectable S. aureus triosephosphate isomerase housekeeping gene expression, RNA transcripts encoding the S. aureus virulence factors ClfA, MntC, and capsule polysaccharide were detected by qRT-PCR. Antigen expression was indirectly confirmed by increases in antibody titer during the course of infection from acute to convalescent phase. Demonstration of bacterial transcript expression together with immunological response to the SA4Ag antigens in a clinically relevant patient population provides support for inclusion of these antigens in a prophylactic vaccine.

  2. An integrated phenomic approach to multivariate allelic association

    PubMed Central

    Medland, Sarah Elizabeth; Neale, Michael Churton

    2010-01-01

The increased feasibility of genome-wide association has resulted in association becoming the primary method used to localize genetic variants that cause phenotypic variation. Much attention has been focused on the vast multiple testing problems arising from analyzing large numbers of single nucleotide polymorphisms. However, the inflation of experiment-wise type I error rates through testing numerous phenotypes has received less attention. Multivariate analyses can be used to detect both pleiotropic effects that influence a latent common factor, and monotropic effects that operate at a variable-specific level, whilst controlling for non-independence between phenotypes. In this study, we present a maximum likelihood approach, which combines both latent and variable-specific tests and which may be used with either individual or family data. Simulation results indicate that in the presence of factor-level association, the combined multivariate (CMV) analysis approach performs well with a minimal loss of power as compared with a univariate analysis of a factor or sum score (SS). As the deviation between the pattern of allelic effects and the factor loadings increases, the power of univariate analyses of both factor and SSs decreases dramatically, whereas the power of the CMV approach is maintained. We show the utility of the approach by examining the association between dopamine receptor D2 TaqIA and the initiation of marijuana, tranquilizers and stimulants in data from the Add Health Study. Perl scripts that take ped and dat files as input and produce Mx scripts and data for running the CMV approach can be downloaded from www.vipbg.vcu.edu/~sarahme/WriteMx. PMID:19707246

  3. What Are the Risk Factors for Dislocation of Hip Bipolar Hemiarthroplasty Through the Anterolateral Approach? A Nested Case-control Study.

    PubMed

    Li, Lianhua; Ren, Jixin; Liu, Jia; Wang, Hao; Sang, Qinghua; Liu, Zhi; Sun, Tiansheng

    2016-12-01

Hip dislocation after treatment of a femoral neck fracture with a hemiarthroplasty remains an important problem in the treatment of hip fractures, but the associations between patient factors and surgical factors, and how these factors contribute to dislocation in patients who have undergone bipolar hemiarthroplasty through an anterolateral approach for femoral neck fracture, currently are only poorly characterized. We evaluated patients with bipolar hemiarthroplasty dislocation after surgery for femoral neck fracture treated through an anterolateral approach and asked: (1) What are the frequency, characteristics, and risk factors of bipolar hemiarthroplasty dislocations? (2) What are the frequency, characteristics, and risk factors of bipolar hemiarthroplasty dissociations? A review of hospital records for patients who underwent bipolar hip hemiarthroplasty for femoral neck fracture at one hospital between July 2004 and August 2014 was conducted. During that time, 1428 patients were admitted with a diagnosis of femoral neck fracture; 508 of these patients underwent bipolar hip hemiarthroplasty, of whom 61 died and 23 were lost to followup during the first year, leaving 424 (83%) available for analysis. The remainder of the patients during that time were treated with internal fixation (512), unipolar hip arthroplasty (17), or THA (391). For each patient with dislocation, we selected five control patients from the cohort according to sex, age (± 3 years), and year of entry in the study to eliminate some confounding factors. We recorded patient characteristics regarding demographics, medical comorbidities, Katz score, American Society of Anesthesiologists score, Mini-Mental State Examination (MMSE) score, and anesthesia type. Medical comorbidities included diabetes, chronic pulmonary disease, heart disease, neuromuscular diseases, and dementia. Univariate analyses were used to search for possible risk factors. Conditional logistic regression analyses on dislocation

  4. Otoplasty: A graduated approach.

    PubMed

    Foda, H M

    1999-01-01

Numerous otoplastic techniques have been described for the correction of protruding ears. Technique selection in otoplasty should be done only after careful analysis of the abnormal anatomy responsible for the protruding ear deformity. A graduated surgical approach is presented that is designed to address all factors contributing to the presenting auricular deformity. The approach starts with the more conservative cartilage-sparing suturing techniques, then proceeds to incorporate other, more aggressive cartilage-weakening maneuvers. Applying this approach resulted in better long-term results with less postoperative lateralization than that encountered when using the cartilage-sparing techniques alone.

  5. Physics Metacognition Inventory Part Ii: Confirmatory Factor Analysis and Rasch Analysis

    ERIC Educational Resources Information Center

    Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John

    2015-01-01

    The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition,…

  6. Criteria for Comparing Domain Analysis Approaches Version 01.00.00

    DTIC Science & Technology

    1991-12-01

[OCR-garbled excerpt from the report's table of contents and overview. Recoverable content: figures depicting a "Down-Bottom-Up" Domain Analysis Process (1990 version) and FODA's Domain Analysis Process; a comparison question about FODA, which uses the Design Approach for Real-Time Systems (DARTS) design method (Gomaa 1984); and a note that the FODA report illustrates its process using a window management example.]

  7. Performance-Shaping Factors Affecting Older Adults' Hospital-to-Home Transition Success: A Systems Approach.

    PubMed

    Werner, Nicole E; Tong, Michelle; Borkenhagen, Amy; Holden, Richard J

    2018-01-03

Facilitating older adults' successful hospital-to-home transitions remains a persistent challenge. To address this challenge, we applied a systems lens to identify and understand the performance-shaping factors (PSFs) related to older adults' hospital-to-home transition success. This study was a secondary analysis of semi-structured interviews from older adults (N = 31) recently discharged from a hospital and their informal caregivers (N = 13). We used a Human Factors Engineering approach to guide qualitative thematic analysis to develop four themes concerning the system conditions shaping hospital-to-home transition success. The four themes concerning PSFs were: (a) the hospital-to-home transition was a complex multiphase process: the process unfolded over several months and required substantial, persistent investment/effort; (b) there were unmet needs for specialized tools: information and resources provided at hospital discharge were not aligned with requirements for transition success; (c) alignment of self-care routines with transition needs: pre-hospitalization routines could be supportive/disruptive and could deteriorate/be re-established; and (d) changing levels of work demand and capacity during the transition: demand often exceeded capacity, leading to work overload. Our findings highlight that the transition is not an episodic event, but rather a longitudinal process extending beyond the days just after hospital discharge. Transition interventions to improve older adults' hospital-to-home transitions need to account for this complex multiphase process. Future interventions must be developed to support older adults and informal caregivers in navigating the establishment and re-establishment of routines and managing work demands and capacity during the transition process. © The Author(s) 2018. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved.

  8. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    PubMed Central

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  9. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

Oil and gas exploration and production usually relies on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainty that may corrupt the evaluation of parameters. Quantifying these uncertainties is a major issue, as it supports decisions that have important social and commercial implications. Residual moveout analysis, an important step in seismic data processing, is usually performed with a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.
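The Bayesian idea can be sketched generically (this is not the authors' algorithm): treat the moveout parameter as a random variable, sample its posterior with a random-walk Metropolis step under a Gaussian misfit, and summarize the uncertainty from the samples. The data values, step size, and flat prior are all invented for illustration.

```python
import math
import random

# Generic random-walk Metropolis sketch for one scalar parameter theta
# with a flat prior and Gaussian data misfit.

def log_post(theta, data, sigma=1.0):
    # flat prior; Gaussian misfit between observations and model value theta
    return -sum((d - theta) ** 2 for d in data) / (2 * sigma ** 2)

def metropolis(data, n_iter=5000, step=0.5, seed=1):
    rng = random.Random(seed)
    theta, lp = 0.0, log_post(0.0, data)
    samples = []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, step)        # random-walk proposal
        lp_prop = log_post(prop, data)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples

samples = metropolis([1.8, 2.1, 2.0, 1.9], seed=1)
mean = sum(samples[1000:]) / len(samples[1000:])   # discard burn-in
```

The spread of the post-burn-in samples (not just the mean) is the uncertainty estimate a deterministic pick would miss.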

  10. Generation Y, wine and alcohol. A semantic differential approach to consumption analysis in Tuscany.

    PubMed

    Marinelli, Nicola; Fabbrizzi, Sara; Alampi Sottini, Veronica; Sacchelli, Sandro; Bernetti, Iacopo; Menghini, Silvio

    2014-04-01

The aim of the study is the elicitation of the consumer's semantic perception of different alcoholic beverages in order to provide information for the definition of communication strategies for both the private sector (specifically the wine industry) and the public decision maker. Such information can be seen as the basis of a wider social marketing construct aimed at the promotion of responsible drinking among young consumers. The semantic differential approach was used in this study. The data collection was based on a survey of 430 consumers between 18 and 35 years old in Tuscany, Italy. The database was organized in a three-way structure, indexing the data in a multiway matrix. The data were processed using Multiple Factor Analysis (MFA). Moreover, homogeneous clusters of consumers were identified using a Hierarchical Clustering on Principal Components (HCPC) approach. The results of the study highlight that beer and spirits are mainly perceived as "Young", "Social", "Euphoric", "Happy", "Appealing" and "Trendy" beverages, while wine is associated mostly with terms such as "Pleasure", "Quality" and "Comfortable". Furthermore, the cluster analysis allowed for the identification of three groups of individuals with different approaches to alcohol drinking. The results of the study supply a useful information framework for the elaboration of specific communication strategies that, based on the drinking habits of young consumers and their perception of different beverages, can use a language that is very close to the consumer typologies. Such information can be helpful for both private and public communication strategies. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
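As a rough stand-in for the MFA-plus-clustering pipeline, the sketch below extracts a first principal component by power iteration from toy beverage ratings and then makes a crude two-way split on the component scores. Real MFA weights variable groups and HCPC builds a full dendrogram; both are simplified away here, and the ratings are invented.

```python
# Toy sketch: first principal component via power iteration on centered
# data, then a crude split on the sign of the component scores.

def first_pc(rows, iters=100):
    n, dim = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(dim)]
    X = [[r[j] - means[j] for j in range(dim)] for r in rows]
    v = [1.0] * dim
    for _ in range(iters):
        s = [sum(x[j] * v[j] for j in range(dim)) for x in X]          # X v
        w = [sum(s[i] * X[i][j] for i in range(n)) for j in range(dim)]  # X^T X v
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    scores = [sum(x[j] * v[j] for j in range(dim)) for x in X]
    return v, scores

# Invented ratings on three scales (e.g. "young", "trendy", "quality")
# for four beverages: two youthful-image drinks, two quality-image drinks.
ratings = [[9, 8, 3], [8, 9, 2], [2, 3, 9], [3, 2, 8]]
v, scores = first_pc(ratings)
clusters = [0 if s > 0 else 1 for s in scores]  # crude 2-way split on PC1
```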

  11. Connectivism in Postsecondary Online Courses: An Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Hogg, Nanette; Lomicky, Carol S.

    2012-01-01

    This study explores 465 postsecondary students' experiences in online classes through the lens of connectivism. Downes' 4 properties of connectivism (diversity, autonomy, interactivity, and openness) were used as the study design. An exploratory factor analysis was performed. This study found a 4-factor solution. Subjects indicated that autonomy…

  12. Sparse multivariate factor analysis regression models and its applications to integrative genomics analysis.

    PubMed

    Zhou, Yan; Wang, Pei; Wang, Xianlong; Zhu, Ji; Song, Peter X-K

    2017-01-01

The multivariate regression model is a useful tool to explore complex associations between two kinds of molecular markers, which enables the understanding of the biological pathways underlying disease etiology. For a set of correlated response variables, accounting for such dependency can increase statistical power. Motivated by integrative genomic data analyses, we propose a new methodology, the sparse multivariate factor analysis regression model (smFARM), in which correlations of the response variables are assumed to follow a factor analysis model with latent factors. The proposed method allows us not only to address the challenge that the number of association parameters is larger than the sample size, but also to adjust for unobserved genetic and/or nongenetic factors that potentially conceal the underlying response-predictor associations. The proposed smFARM is implemented by the EM algorithm and the blockwise coordinate descent algorithm. The proposed methodology is evaluated and compared to existing methods through extensive simulation studies. Our results show that accounting for latent factors through the proposed smFARM can improve the sensitivity of signal detection and the accuracy of sparse association map estimation. We illustrate smFARM with two integrative genomics analysis examples, a breast cancer dataset and an ovarian cancer dataset, to assess the relationship between DNA copy numbers and gene expression arrays and to understand genetic regulatory patterns relevant to the disease. We identify two trans-hub regions: one in cytoband 17q12, whose amplification influences the RNA expression levels of important breast cancer genes, and the other in cytoband 9q21.32-33, which is associated with chemoresistance in ovarian cancer. © 2016 WILEY PERIODICALS, INC.
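smFARM itself combines an EM algorithm with blockwise coordinate descent; as a generic illustration of the sparse ingredient only, the sketch below runs lasso-style coordinate descent with the soft-thresholding operator that produces the exact zeros in a sparse association map. The design matrix and responses are invented, and the latent-factor machinery is omitted.

```python
# Generic lasso-style coordinate descent (NOT the smFARM code): each
# coordinate update applies the soft-thresholding operator, which sets
# small coefficients exactly to zero.

def soft_threshold(z, lam):
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=100):
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding feature j
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                 for i in range(n)]
            zj = sum(X[i][j] * r[i] for i in range(n)) / n
            norm_j = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft_threshold(zj, lam) / norm_j
    return beta

# Invented data: feature 0 carries real signal, feature 1 is noise
X = [[1, 0], [0, 1], [1, 0], [0, 1]]
y = [2.0, 0.1, 2.0, -0.1]
beta = lasso_cd(X, y, lam=0.2)
```

The weak feature is zeroed out exactly, while the strong one survives with a shrunken coefficient; that is the mechanism behind a "sparse association map".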

  13. Factor analysis of the Hamilton Depression Rating Scale in Parkinson's disease.

    PubMed

    Broen, M P G; Moonen, A J H; Kuijf, M L; Dujardin, K; Marsh, L; Richard, I H; Starkstein, S E; Martinez-Martin, P; Leentjens, A F G

    2015-02-01

    Several studies have validated the Hamilton Depression Rating Scale (HAMD) in patients with Parkinson's disease (PD), and reported adequate reliability and construct validity. However, the factorial validity of the HAMD has not yet been investigated. The aim of our analysis was to explore the factor structure of the HAMD in a large sample of PD patients. A principal component analysis of the 17-item HAMD was performed on data of 341 PD patients, available from a previous cross sectional study on anxiety. An eigenvalue ≥1 was used to determine the number of factors. Factor loadings ≥0.4 in combination with oblique rotations were used to identify which variables made up the factors. Kaiser-Meyer-Olkin measure (KMO), Cronbach's alpha, Bartlett's test, communality, percentage of non-redundant residuals and the component correlation matrix were computed to assess factor validity. KMO verified the sample's adequacy for factor analysis and Cronbach's alpha indicated a good internal consistency of the total scale. Six factors had eigenvalues ≥1 and together explained 59.19% of the variance. The number of items per factor varied from 1 to 6. Inter-item correlations within each component were low. There was a high percentage of non-redundant residuals and low communality. This analysis demonstrates that the factorial validity of the HAMD in PD is unsatisfactory. This implies that the scale is not appropriate for studying specific symptom domains of depression based on factorial structure in a PD population. Copyright © 2014 Elsevier Ltd. All rights reserved.
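
As a generic illustration of the extraction procedure described above (principal components of the item correlation matrix, eigenvalue ≥ 1 retention, loadings ≥ 0.4 treated as salient), here is a minimal sketch; the thresholds come from the abstract, while the data, item count, and factor structure below are invented for illustration:

```python
import numpy as np

def kaiser_pca(X, eig_cut=1.0, load_cut=0.4):
    """Principal-component extraction with the Kaiser criterion.

    X: (n_subjects, n_items) score matrix.
    Returns the number of retained components, their unrotated
    loadings, the fraction of variance they explain, and a boolean
    mask of salient loadings (|loading| >= load_cut)."""
    R = np.corrcoef(X, rowvar=False)              # item correlation matrix
    eigvals, eigvecs = np.linalg.eigh(R)          # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    k = int(np.sum(eigvals >= eig_cut))           # Kaiser: eigenvalue >= 1
    loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])
    explained = eigvals[:k].sum() / eigvals.sum()
    return k, loadings, explained, np.abs(loadings) >= load_cut

# synthetic 6-item scale driven by two latent factors (n = 341, as in the study)
rng = np.random.default_rng(0)
F = rng.normal(size=(341, 2))
W = np.array([[.8, 0], [.7, 0], [.6, 0], [0, .8], [0, .7], [0, .6]])
X = F @ W.T + 0.5 * rng.normal(size=(341, 6))
k, loadings, explained, salient = kaiser_pca(X)
```

An oblique rotation (the study used oblique rotations before interpreting loadings) would normally be applied to the retained loadings; it is omitted here for brevity.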

  14. Peptidomics: the integrated approach of MS, hyphenated techniques and bioinformatics for neuropeptide analysis.

    PubMed

    Boonen, Kurt; Landuyt, Bart; Baggerman, Geert; Husson, Steven J; Huybrechts, Jurgen; Schoofs, Liliane

    2008-02-01

    MS is currently one of the most important analytical techniques in biological and medical research. ESI and MALDI launched the field of MS into biology. The performance of mass spectrometers increased tremendously over the past decades. Other technological advances increased the analytical power of biological MS even more. First, the advent of the genome projects allowed an automated analysis of mass spectrometric data. Second, improved separation techniques, like nanoscale HPLC, are essential for MS analysis of biomolecules. The recent progress in bioinformatics is the third factor that accelerated the biochemical analysis of macromolecules. The first part of this review will introduce the basics of these techniques. The field that integrates all these techniques to identify endogenous peptides is called peptidomics and will be discussed in the last section. This integrated approach aims at identifying all the peptides present in a cell, organ or organism (the peptidome). Today, peptidomics is used by several fields of research. Special emphasis will be given to the identification of neuropeptides, a class of short proteins that fulfil several important intercellular signalling functions in every animal. MS imaging techniques and biomarker discovery will also be discussed briefly.

  15. Coherent dynamic structure factors of strongly coupled plasmas: A generalized hydrodynamic approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Di; Hu, GuangYue; Gong, Tao

    2016-05-15

    A generalized hydrodynamic fluctuation model is proposed to simplify the calculation of the dynamic structure factor S(ω, k) of non-ideal plasmas using the fluctuation-dissipation theorem. In this model, the kinetic and correlation effects are both included in the hydrodynamic coefficients, which are treated as functions of the coupling strength (Γ) and the collision parameter (kλ_ei), where λ_ei is the electron-ion mean free path. A particle-particle particle-mesh molecular dynamics simulation code is also developed to simulate the dynamic structure factors, which are used to benchmark the calculation of our model. A good agreement between the two approaches confirms the reliability of our model.

  16. Blurring the Inputs: A Natural Language Approach to Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Thompson, Richard A.; Johnston, Christopher O.

    2007-01-01

    To document model parameter uncertainties and to automate sensitivity analyses for numerical simulation codes, a natural-language-based method to specify tolerances has been developed. With this new method, uncertainties are expressed in a natural manner, i.e., as one would on an engineering drawing, namely, 5.25 +/- 0.01. This approach is robust and readily adapted to various application domains because it does not rely on parsing the particular structure of input file formats. Instead, tolerances of a standard format are added to existing fields within an input file. As a demonstration of the power of this simple, natural language approach, a Monte Carlo sensitivity analysis is performed for three disparate simulation codes: fluid dynamics (LAURA), radiation (HARA), and ablation (FIAT). Effort required to harness each code for sensitivity analysis was recorded to demonstrate the generality and flexibility of this new approach.
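
The tolerance notation described above is easy to act on mechanically; a toy sketch (the field name, file format, and regex are illustrative assumptions, not the actual LAURA/HARA/FIAT input syntax) that turns each `value +/- tol` field into one Monte Carlo draw:

```python
import random
import re

# matches fields of the form "5.25 +/- 0.01"
TOL = re.compile(r'(-?\d+(?:\.\d+)?)\s*\+/-\s*(\d+(?:\.\d+)?)')

def blur(text, rng):
    """Replace every 'value +/- tol' field with a uniform draw from
    [value - tol, value + tol], leaving the rest of the line untouched."""
    def draw(match):
        value, tol = float(match.group(1)), float(match.group(2))
        return repr(rng.uniform(value - tol, value + tol))
    return TOL.sub(draw, text)

rng = random.Random(42)
# hypothetical input-file line; one draw per Monte Carlo sample
line = "wall_temperature = 5.25 +/- 0.01"
sample = blur(line, rng)
```

Running `blur` once per iteration over the whole input file yields one perturbed input deck per Monte Carlo run, without any knowledge of the file's structure.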

  17. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    ERIC Educational Resources Information Center

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we…

  18. Analysis of human factors effects on the safety of transporting radioactive waste materials: Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abkowitz, M.D.; Abkowitz, S.B.; Lepofsky, M.

    1989-04-01

    This report examines the extent of human factors effects on the safety of transporting radioactive waste materials. It is seen principally as a scoping effort, to establish whether there is a need for DOE to undertake a more formal approach to studying human factors in radioactive waste transport, and if so, logical directions for that program to follow. Human factors effects are evaluated on driving and loading/transfer operations only. Particular emphasis is placed on the driving function, examining the relationship between human error and safety as it relates to the impairment of driver performance. Although multi-modal in focus, the widespread availability of data and previous literature on truck operations resulted in a primary study focus on the trucking mode from the standpoint of policy development. In addition to the analysis of human factors accident statistics, the report provides relevant background material on several policies that have been instituted or are under consideration, directed at improving human reliability in the transport sector. On the basis of reported findings, preliminary policy areas are identified. 71 refs., 26 figs., 5 tabs.

  19. Bridging the gap between biologic, individual, and macroenvironmental factors in cancer: a multilevel approach.

    PubMed

    Lynch, Shannon M; Rebbeck, Timothy R

    2013-04-01

    To address the complex nature of cancer occurrence and outcomes, approaches have been developed to simultaneously assess the role of two or more etiologic agents within hierarchical levels including the: (i) macroenvironment level (e.g., health care policy, neighborhood, or family structure); (ii) individual level (e.g., behaviors, carcinogenic exposures, socioeconomic factors, and psychologic responses); and (iii) biologic level (e.g., cellular biomarkers and inherited susceptibility variants). Prior multilevel approaches tend to focus on social and environmental hypotheses, and are thus limited in their ability to integrate biologic factors into a multilevel framework. This limited integration may be related to the limited translation of research findings into the clinic. We propose a "Multi-level Biologic and Social Integrative Construct" (MBASIC) to integrate macroenvironment and individual factors with biology. The goal of this framework is to help researchers identify relationships among factors that may be involved in the multifactorial, complex nature of cancer etiology, to aid in appropriate study design, to guide the development of statistical or mechanistic models to study these relationships, and to position the results of these studies for improved intervention, translation, and implementation. MBASIC allows researchers from diverse fields to develop hypotheses of interest under a common conceptual framework, to guide transdisciplinary collaborations, and to optimize the value of multilevel studies for clinical and public health activities.

  20. Quantitative analysis of intrinsic and extrinsic factors in the aggregation mechanism of Alzheimer-associated Aβ-peptide

    NASA Astrophysics Data System (ADS)

    Meisl, Georg; Yang, Xiaoting; Frohm, Birgitta; Knowles, Tuomas P. J.; Linse, Sara

    2016-01-01

    Disease related mutations and environmental factors are key determinants of the aggregation mechanism of the amyloid-β peptide implicated in Alzheimer's disease. Here we present an approach to investigate these factors through acquisition of highly reproducible data and global kinetic analysis to determine the mechanistic influence of intrinsic and extrinsic factors on the Aβ aggregation network. This allows us to translate the shift in macroscopic aggregation behaviour into effects on the individual underlying microscopic steps. We apply this work-flow to the disease-associated Aβ42-A2V variant, and to a variation in pH as examples of an intrinsic and an extrinsic perturbation. In both cases, our data reveal a shift towards a mechanism in which a larger fraction of the reactive flux goes via a pathway that generates potentially toxic oligomeric species in a fibril-catalyzed reaction. This is in agreement with the finding that Aβ42-A2V leads to early-onset Alzheimer’s disease and enhances neurotoxicity.

  1. A Disciplined Architectural Approach to Scaling Data Analysis for Massive, Scientific Data

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Braverman, A. J.; Cinquini, L.; Turmon, M.; Lee, H.; Law, E.

    2014-12-01

    Data collections across remote sensing and ground-based instruments in astronomy, Earth science, and planetary science are outpacing scientists' ability to analyze them. Furthermore, the distribution, structure, and heterogeneity of the measurements themselves pose challenges that limit the scalability of data analysis using traditional approaches. Methods for developing science data processing pipelines, distribution of scientific datasets, and performing analysis will require innovative approaches that integrate cyber-infrastructure, algorithms, and data into more systematic approaches that can more efficiently compute and reduce data, particularly distributed data. This requires the integration of computer science, machine learning, statistics and domain expertise to identify scalable architectures for data analysis. The size of data returned from Earth Science observing satellites and the magnitude of data from climate model output is predicted to grow into the tens of petabytes, challenging current data analysis paradigms. This same kind of growth is present in astronomy and planetary science data. One of the major challenges in data science and related disciplines is defining new approaches to scaling systems and analysis in order to increase scientific productivity and yield. Specific needs include: 1) identification of optimized system architectures for analyzing massive, distributed data sets; 2) algorithms for systematic analysis of massive data sets in distributed environments; and 3) the development of software infrastructures that are capable of performing massive, distributed data analysis across a comprehensive data science framework. NASA/JPL has begun an initiative in data science to address these challenges. Our goal is to evaluate how scientific productivity can be improved through optimized architectural topologies that identify how to deploy and manage the access, distribution, computation, and reduction of massive, distributed data, while

  2. New approach to optimize near-infrared spectra with design of experiments and determination of milk compounds as influence factors for changing milk over time.

    PubMed

    De Benedictis, Lorenzo; Huck, Christian

    2016-12-01

    The optimization of near-infrared spectroscopic parameters was realized via design of experiments. With this new approach, objectivity can be integrated into conventional, rather subjective approaches. The investigated factors are layer thickness, number of scans and temperature during measurement. Response variables in the full factorial design consisted of absorption intensity, signal-to-noise ratio and reproducibility of the spectra. Optimized factorial combinations have been found to be 0.5 mm layer thickness, 64 scans and 25°C ambient temperature for liquid milk measurements. Qualitative analysis of milk indicated a strong correlation of environmental factors, as well as the feeding of cattle, with respect to the change in milk composition. This was illustrated with the aid of near-infrared spectroscopy and the previously optimized parameters by detection of altered fatty acids in milk, especially by the fatty acid content (number of carboxylic functions) and the fatty acid length. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Cafeteria factors that influence milk-drinking behaviors of elementary school children: grounded theory approach.

    PubMed

    Connors, P; Bednar, C; Klammer, S

    2001-01-01

    This study was conducted to identify factors that influenced milk-drinking behaviors of elementary school children in North Texas. Ten focus groups with a total of 41 children aged 6 to 11 years were conducted using a grounded theory approach. Based on the principles of Social Learning Theory, milk preferences and health beliefs were identified as personal factors that influenced drinking. Cafeteria rules, milk flavor, product packaging, modeling by adults, and shared experiences were environmental factors. The data suggest that school cafeterias can capitalize on their unique position to offer milk-drinking opportunities that children can share to combine nutrition education with sensory experience.

  4. Pig immune response to general stimulus and to porcine reproductive and respiratory syndrome virus infection: a meta-analysis approach

    PubMed Central

    2013-01-01

    Background The availability of gene expression data that corresponds to pig immune response challenges provides compelling material for the understanding of the host immune system. Meta-analysis offers the opportunity to confirm and expand our knowledge by combining and studying at one time a vast set of independent studies, creating large datasets with increased statistical power. In this study, we performed two meta-analyses of porcine transcriptomic data: i) one scrutinizing the global immune response to different challenges, and ii) one determining the specific response to Porcine Reproductive and Respiratory Syndrome Virus (PRRSV) infection. To gain an in-depth knowledge of the pig response to PRRSV infection, we used an original approach comparing and eliminating the common genes from both meta-analyses in order to identify genes and pathways specifically involved in the PRRSV immune response. The software Pointillist was used to cope with the highly disparate data, circumventing the biases generated by the specific responses linked to single studies. Next, we used the Ingenuity Pathways Analysis (IPA) software to survey the canonical pathways, biological functions and transcription factors found to be significantly involved in the pig immune response. We used 779 chips corresponding to 29 datasets for the pig global immune response and 279 chips obtained from 6 datasets for the pig response to PRRSV infection, respectively. Results The pig global immune response analysis showed interconnected canonical pathways involved in the regulation of translation and mitochondrial energy metabolism. Biological functions revealed in this meta-analysis were centred around translation regulation, which included protein synthesis, RNA post-transcriptional gene expression and cellular growth and proliferation. Furthermore, the oxidative phosphorylation and mitochondria dysfunctions, associated with stress signalling, were highly regulated. 
Transcription factors such as MYCN, MYC and

  5. Establishing Evidence for Internal Structure Using Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Watson, Joshua C.

    2017-01-01

    Exploratory factor analysis (EFA) is a data reduction technique used to condense data into smaller sets of summary variables by identifying underlying factors potentially accounting for patterns of collinearity among said variables. Using an illustrative example, the 5 general steps of EFA are described with best practices for decision making…

  6. Stochastic Frontier Approach and Data Envelopment Analysis to Total Factor Productivity and Efficiency Measurement of Bangladeshi Rice

    PubMed Central

    Hossain, Md. Kamrul; Kamil, Anton Abdulbasah; Baten, Md. Azizul; Mustafa, Adli

    2012-01-01

    The objective of this paper is to apply the Translog Stochastic Frontier production model (SFA) and Data Envelopment Analysis (DEA) to estimate efficiencies over time and the Total Factor Productivity (TFP) growth rate for Bangladeshi rice crops (Aus, Aman and Boro), using the most recent data available, covering the period 1989–2008. Results indicate that technical efficiency was observed to be higher for Boro among the three types of rice, but the overall technical efficiency of rice production was found to be around 50%. Although positive changes exist in TFP for the sample analyzed, the average growth rate of TFP for rice production was estimated at almost the same levels for both Translog SFA with half-normal distribution and DEA. Estimated TFP from SFA is forecasted with an ARIMA (2, 0, 0) model. An ARIMA (1, 0, 0) model is used to forecast the TFP of Aman from the DEA estimation. PMID:23077500

  7. Dynamic Assessment: An Approach Toward Reducing Test Bias.

    ERIC Educational Resources Information Center

    Carlson, Jerry S.; Wiedl, Karl Heinz

    Through dynamic testing (the notion that tailored testing can be extended to the use of a learning-oriented approach to assessment), analyses were made of how motivational, personality, and cognitive style factors interact with assessment approaches to yield performance data. Testing procedures involving simple feedback, elaborated feedback, and…

  8. Conversation Analysis--A Discourse Approach to Teaching Oral English Skills

    ERIC Educational Resources Information Center

    Wu, Yan

    2013-01-01

    This paper explores a pedagogical approach to teaching oral English: Conversation Analysis. First, features of spoken language are described in comparison to written language. Second, Conversation Analysis theory is elaborated in terms of adjacency pairs, turn-taking, repairs, sequences, openings and closings, and feedback. Third, under the…

  9. Work System Assessment to Facilitate the Dissemination of a Quality Improvement Program for Optimizing Blood Culture Use: A Case Study Using a Human Factors Engineering Approach.

    PubMed

    Xie, Anping; Woods-Hill, Charlotte Z; King, Anne F; Enos-Graves, Heather; Ascenzi, Judy; Gurses, Ayse P; Klaus, Sybil A; Fackler, James C; Milstone, Aaron M

    2017-11-20

    Work system assessments can facilitate successful implementation of quality improvement programs. Using a human factors engineering approach, we conducted a work system assessment to facilitate the dissemination of a quality improvement program for optimizing blood culture use in pediatric intensive care units at 2 hospitals. Semistructured face-to-face interviews were conducted with clinicians from Johns Hopkins All Children's Hospital and University of Virginia Medical Center. Interview data were analyzed using qualitative content analysis. Blood culture-ordering practices are influenced by various work system factors, including people, tasks, tools and technologies, the physical environment, organizational conditions, and the external environment. A clinical decision-support tool could facilitate implementation by (1) standardizing blood culture-ordering practices, (2) ensuring that prescribing clinicians review the patient's condition before ordering a blood culture, (3) facilitating critical thinking, and (4) empowering nurses to communicate with physicians and advocate for adherence to blood culture-ordering guidelines. The success of interventions for optimizing blood culture use relies heavily on the local context. A work system analysis using a human factors engineering approach can identify key areas to be addressed for the successful dissemination of quality improvement interventions. © The Author 2017. Published by Oxford University Press on behalf of The Journal of the Pediatric Infectious Diseases Society. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. What causes childhood stunting among children of San Vicente, Guatemala: Employing complementary, system-analysis approaches.

    PubMed

    Voth-Gaeddert, Lee E; Stoker, Matthew; Cornell, Devin; Oerther, Daniel B

    2018-04-01

    Guatemala has the sixth worst stunting rate, with 48% of children under five years of age classified as stunted according to World Health Organization standards. This study utilizes two different yet complementary system-analysis approaches to analyze correlations among environmental and demographic variables, environmental enteric dysfunction (EED), and child height-for-age (stunting metric) in Guatemala. Two descriptive models constructed around applicable environmental and demographic factors on child height-for-age and EED were analyzed using Network Analysis (NA) and Structural Equation Modeling (SEM). Data from two populations of children between the ages of three months and five years were used. The first population (n = 2103) was drawn from the Food for Peace Baseline Survey conducted by the US Agency for International Development (USAID) in 2012, and the second population (n = 372) was drawn from an independent survey conducted by the San Vicente Health Center in 2016. The results from the NA of the height-for-age model confirmed pathogen exposure, nutrition, and prenatal health as important, and the results from the NA of the EED model confirmed water source, water treatment, and type of sanitation as important. The results from the SEM of the height-for-age model confirmed a statistically significant correlation between child height-for-age and child-mother interaction (-0.092, p = 0.076), while the SEM of the EED model confirmed a statistically significant correlation between EED and type of water treatment (-0.115, p = 0.013). Our approach supports important efforts to understand the complex set of factors associated with child stunting among communities sharing similarities with San Vicente. Copyright © 2018 Elsevier GmbH. All rights reserved.

  11. Boundary formulations for sensitivity analysis without matrix derivatives

    NASA Technical Reports Server (NTRS)

    Kane, J. H.; Guru Prasad, K.

    1993-01-01

    A new hybrid approach to continuum structural shape sensitivity analysis employing boundary element analysis (BEA) is presented. The approach uses iterative reanalysis to obviate the need to factor perturbed matrices in the determination of surface displacement and traction sensitivities via a univariate perturbation/finite difference (UPFD) step. The UPFD approach makes it possible to immediately reuse existing subroutines for computation of BEA matrix coefficients in the design sensitivity analysis process. The reanalysis technique computes economical response of univariately perturbed models without factoring perturbed matrices. The approach provides substantial computational economy without the burden of a large-scale reprogramming effort.
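
The UPFD step described above amounts to re-solving the system once per perturbed design variable and differencing the responses; a minimal dense-matrix sketch, in which the 2×2 `assemble` model is an arbitrary stand-in for a BEA system (a real implementation would reuse matrix coefficients and avoid refactoring perturbed matrices, as the abstract notes):

```python
import numpy as np

def upfd_sensitivities(assemble, solve, x, h=1e-6):
    """Univariate perturbation / finite-difference response sensitivities.

    assemble(x) -> (A, b): builds the system for design variables x.
    solve(A, b) -> u:      the (re)analysis step.
    Returns du/dx_j, one column per design variable (forward differences)."""
    u0 = solve(*assemble(x))
    S = np.empty((u0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h                         # univariate perturbation
        S[:, j] = (solve(*assemble(xp)) - u0) / h
    return S

# stand-in model: the system matrix depends linearly on two design variables
assemble = lambda x: (np.array([[2.0 + x[0], 0.5],
                                [0.5, 3.0 + x[1]]]),
                      np.array([1.0, 2.0]))
S = upfd_sensitivities(assemble, np.linalg.solve, np.array([0.1, 0.2]))
```

Because `solve` is passed in as a callable, an iterative reanalysis that reuses the unperturbed factorization could be swapped in without changing the differencing logic.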

  12. Analysis of the influencing factors of global energy interconnection development

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; He, Yongxiu; Ge, Sifan; Liu, Lin

    2018-04-01

    Against the background of building a global energy interconnection and achieving green, low-carbon development, this paper considers the new round of energy restructuring and the trend of energy technology change. Based on the present situation of global and Chinese energy interconnection development, an index system of the factors influencing global energy interconnection development was established. Subjective and objective weights of the influencing factors were obtained separately by the analytic network process and by the entropy method, and were then combined by additive integration to give comprehensive weights and a ranking of the factors' influence.
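
The entropy method mentioned here assigns objective indicator weights from data dispersion alone: indicators whose values vary more across alternatives carry more information and get larger weights. A minimal sketch on an invented score matrix (three alternatives × three benefit-type indicators; the figures are illustrative only):

```python
import numpy as np

def entropy_weights(X):
    """Objective indicator weights by the entropy method.

    X: (m alternatives, n indicators) matrix of positive, benefit-type
    scores.  Lower entropy (more dispersion across alternatives) gives
    a higher weight; the weights sum to 1."""
    P = X / X.sum(axis=0)                          # column-wise proportions
    m = X.shape[0]
    E = -(P * np.log(P)).sum(axis=0) / np.log(m)   # entropy per indicator, in [0, 1]
    d = 1.0 - E                                    # degree of divergence
    return d / d.sum()

# illustrative scores: column 0 is highly dispersed, column 2 nearly uniform
scores = np.array([[0.9, 0.40, 0.42],
                   [0.1, 0.35, 0.40],
                   [0.5, 0.25, 0.38]])
w = entropy_weights(scores)
```

The dispersed first indicator dominates the weight vector, while the nearly uniform third indicator receives almost no weight, which is the behaviour the method is designed to produce.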

  13. Principal coordinate analysis assisted chromatographic analysis of bacterial cell wall collection: A robust classification approach.

    PubMed

    Kumar, Keshav; Cava, Felipe

    2018-04-10

    In the present work, principal coordinate analysis (PCoA) is introduced to develop a robust model to classify chromatographic data sets of peptidoglycan samples. PCoA captures the heterogeneity present in the data sets by using the dissimilarity matrix as input. Thus, in principle, it can capture even the subtle differences in bacterial peptidoglycan composition and can provide a more robust and fast approach for classifying a bacterial collection and identifying novel cell wall targets for further biological and clinical studies. The utility of the proposed approach is successfully demonstrated by analysing two different kinds of bacterial collections. The first set comprised peptidoglycan samples belonging to different subclasses of Alphaproteobacteria, whereas the second set, which is relatively more intricate for chemometric analysis, consists of different wild-type Vibrio cholerae strains and mutants having subtle differences in their peptidoglycan composition. The present work proposes a useful approach that can classify chromatographic data sets of peptidoglycan samples having subtle differences, and suggests that PCoA can be a method of choice in any data analysis workflow. Copyright © 2018 Elsevier Inc. All rights reserved.
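
As a generic sketch of the classical PCoA step the abstract relies on (Torgerson/Gower double centering of the squared dissimilarity matrix followed by eigendecomposition), assuming nothing about the chromatographic data itself; the toy dissimilarity matrix below is invented:

```python
import numpy as np

def pcoa(D, k=2):
    """Classical principal coordinate analysis (Torgerson/Gower).

    D: (n, n) symmetric dissimilarity matrix.  Returns the first k
    coordinate axes, obtained by eigendecomposition of the
    double-centered matrix B = -0.5 * J D^2 J."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n           # centering matrix
    B = -0.5 * J @ (D ** 2) @ J
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1]             # largest eigenvalues first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    pos = np.clip(eigvals[:k], 0.0, None)         # guard tiny negative eigenvalues
    return eigvecs[:, :k] * np.sqrt(pos)

# toy dissimilarities: two tight pairs, far apart from each other
D = np.array([[0., 1., 9., 9.],
              [1., 0., 9., 9.],
              [9., 9., 0., 1.],
              [9., 9., 1., 0.]])
Y = pcoa(D)
```

The first coordinate axis separates the two pairs, which is exactly the kind of group structure the classification step then exploits.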

  14. A Study on Factors Affecting Navy Officers’ Decisions to Pursue Funded Graduate Education: A Qualitative Approach

    DTIC Science & Technology

    2017-06-01

    Approved for public release; distribution is unlimited. Master's thesis, Naval Postgraduate School, Monterey, California, June 2017.

  15. Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia

    1996-01-01

    The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.

  16. Examining Evolving Performance on the Force Concept Inventory Using Factor Analysis

    ERIC Educational Resources Information Center

    Semak, M. R.; Dietz, R. D.; Pearson, R. H.; Willis, C. W.

    2017-01-01

    The application of factor analysis to the "Force Concept Inventory" (FCI) has proven to be problematic. Some studies have suggested that factor analysis of test results serves as a helpful tool in assessing the recognition of Newtonian concepts by students. Other work has produced at best ambiguous results. For the FCI administered as a…

  17. Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR

    ERIC Educational Resources Information Center

    Baglin, James

    2014-01-01

    Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many…

  18. Edmonton obesity staging system among pediatric patients: a validation and obesogenic risk factor analysis.

    PubMed

    Grammatikopoulou, M G; Chourdakis, M; Gkiouras, K; Roumeli, P; Poulimeneas, D; Apostolidou, E; Chountalas, I; Tirodimos, I; Filippou, O; Papadakou-Lagogianni, S; Dardavessis, T

    2018-01-08

    The Edmonton Obesity Staging System for Pediatrics (EOSS-P) is a useful tool, delineating different obesity severity tiers associated with distinct treatment barriers. The aim of the study was to apply the EOSS-P on a Greek pediatric cohort and assess risk factors associated with each stage, compared to normal weight controls. A total of 361 children (2-14 years old), outpatients of an Athenian hospital, participated in this case-control study by forming two groups: the obese (n = 203) and the normoweight controls (n = 158). Anthropometry, blood pressure, blood and biochemical markers, comorbidities and obesogenic lifestyle parameters were recorded and the EOSS-P was applied. Validation of EOSS-P stages was conducted by juxtaposing them with IOTF-defined weight status. Obesogenic risk factors' analysis was conducted by constructing gender-and-age-adjusted (GA) and multivariate logistic models. The majority of obese children were stratified at stage 1 (46.0%), 17.0% were on stage 0, and 37.0% on stage 2. The validation analysis revealed that EOSS-P stages greater than 0 were associated with diastolic blood pressure and levels of glucose, cholesterol, LDL and ALT. Reduced obesity odds were observed among children playing outdoors and increased odds for every screen time hour, both in the GA and in the multivariate analyses (all P < 0.05). Although participation in sports > 2 times/week was associated with reduced obesity odds in the GA analysis (OR = 0.57, 95% CI = 0.33-0.98, P linear = 0.047), it lost its significance in the multivariate analysis (P linear = 0.145). Analogous results were recorded in the analyses of the abovementioned physical activity risk factors for the EOSS-P stages. Linear relationships were observed for fast-food consumption and IOTF-defined obesity and higher than 0 EOSS-P stages. Parental obesity status was associated with all EOSS-P stages and IOTF-defined obesity status. Few outpatients were healthy obese (stage 0), while

  19. Factor analysis of the contextual fine motor questionnaire in children.

    PubMed

    Lin, Chin-Kai; Meng, Ling-Fu; Yu, Ya-Wen; Chen, Che-Kuo; Li, Kuan-Hua

    2014-02-01

    Most studies treat fine motor skill as a single subscale within a developmental test, so further factor analysis of fine motor skills has rarely been conducted. In fact, fine motor skill has been treated as a multi-dimensional domain from both clinical and theoretical perspectives, so knowing its factors would be valuable. The aim of this study was to analyze the internal consistency and factor validity of the Contextual Fine Motor Questionnaire (CFMQ). Based on ecological observation and the literature, the CFMQ was developed to include 5 subscales: Pen Control, Tool Use During Handicraft Activities, the Use of Dining Utensils, Connecting and Separating during Dressing and Undressing, and Opening Containers. The main purpose of this study was to establish the factorial validity of the CFMQ through factor analysis. Among 1208 questionnaires, 904 were successfully completed. Data from the children's CFMQ submitted by primary care providers were analyzed, covering 485 females (53.6%) and 419 males (46.4%) from grades 1 to 5, ranging in age from 82 to 167 months (M = 113.9, SD = 16.3). Cronbach's alpha was used to measure internal consistency, and exploratory factor analysis was applied to test the five-factor structure of the CFMQ. Results showed that Cronbach's alpha coefficients for the 5 CFMQ subscales ranged from .77 to .92, and all item-total correlations with the corresponding subscales were larger than .4 except for one item. The factor loadings of almost all items on their assigned factor were larger than .5, with 3 exceptions. Five factors emerged, together explaining 62.59% of the variance in the CFMQ. In conclusion, the remaining 24 items in the 5 subscales of the CFMQ had appropriate internal consistency, test-retest reliability and construct validity. Copyright © 2013 Elsevier Ltd. All rights reserved.
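
    The internal-consistency statistic used above, Cronbach's alpha, has a short closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal numpy sketch on synthetic item scores (not the CFMQ data):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the sum score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Synthetic subscale: 4 items driven by one latent trait plus noise
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
scores = latent + 0.5 * rng.normal(size=(200, 4))
print(round(cronbach_alpha(scores), 2))
```

Items that all track the same latent trait, as in this toy example, push alpha toward 1; values in the .77-.92 range reported for the CFMQ subscales indicate good internal consistency.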

  20. Violence in public transportation: an approach based on spatial analysis.

    PubMed

    Sousa, Daiane Castro Bittencourt de; Pitombo, Cira Souza; Rocha, Samille Santos; Salgueiro, Ana Rita; Delgado, Juan Pedro Moreno

    2017-12-11

    To carry out a spatial analysis of the occurrence of acts of violence (specifically robberies) in public transportation, identifying the regions of greatest incidence using geostatistics, and possible causes with the aid of a multicriteria analysis in a Geographic Information System. The unit of analysis is the traffic analysis zone of the Origem-Destino survey carried out in Salvador, state of Bahia, in 2013. The robberies recorded by the Department of Public Security of Bahia in 2013 were geolocated, made compatible with the limits of the traffic analysis zones and then associated with the respective centroids. After determining the regions with the highest probability of robbery, we carried out a geographic analysis of the possible causes in the region with the highest robbery potential, considering the factors examined in a multicriteria analysis within a Geographic Information System environment. These two steps allowed us to identify the areas with the highest probability of robbery occurrence in public transportation. In addition, the three most vulnerable road sections (Estrada da Liberdade, Rua Pero Vaz, and Avenida General San Martin) were identified within these areas. In these sections, the factors that most contribute to the potential for robbery on buses are: F1 - proximity to places that facilitate escape, F3 - heavy movement of people, and F2 - absence of policing, respectively. Indicator kriging (a geostatistical estimation method) can be used to construct a spatial probability surface, which can be a useful tool for the implementation of public policies. The multicriteria analysis in the Geographic Information System environment allowed us to understand the spatial factors related to the phenomenon under analysis.

  1. Hearing Children's Voices through a Conversation Analysis Approach

    ERIC Educational Resources Information Center

    Bateman, Amanda

    2017-01-01

    This article introduces the methodological approach of conversation analysis (CA) and demonstrates its usefulness in presenting more authentic documentation and analysis of children's voices. Grounded in ethnomethodology, CA has recently gained interest in the area of early childhood studies due to the affordances it holds for gaining access to…

  2. Global analysis of approaches for deriving total water storage changes from GRACE satellites and implications for groundwater storage change estimation

    NASA Astrophysics Data System (ADS)

    Long, D.; Scanlon, B. R.; Longuevergne, L.; Chen, X.

    2015-12-01

    Increasing interest in use of GRACE satellites and a variety of new products to monitor changes in total water storage (TWS) underscores the need to assess the reliability of output from different products. The objective of this study was to assess skills and uncertainties of different approaches for processing GRACE data to restore signal losses caused by spatial filtering based on analysis of 1°×1° grid scale data and basin scale data in 60 river basins globally. Results indicate that scaling factors from six land surface models (LSMs), including four models from GLDAS-1 (Noah 2.7, Mosaic, VIC, and CLM 2.0), CLM 4.0, and WGHM, are similar over most humid, sub-humid, and high-latitude regions but can differ by up to 100% over arid and semi-arid basins and areas with intensive irrigation. Large differences in TWS anomalies from three processing approaches (scaling factor, additive, and multiplicative corrections) were found in arid and semi-arid regions, areas with intensive irrigation, and relatively small basins (e.g., ≤ 200,000 km2). Furthermore, TWS anomaly products from gridded data with CLM4.0 scaling factors and the additive correction approach more closely agree with WGHM output than the multiplicative correction approach. Estimation of groundwater storage changes using GRACE satellites requires caution in selecting an appropriate approach for restoring TWS changes. A priori ground-based data used in forward modeling can provide a powerful tool for explaining the distribution of signal gains or losses caused by low-pass filtering in specific regions of interest and should be very useful for more reliable estimation of groundwater storage changes using GRACE satellites.
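
    The scaling-factor correction discussed above works by applying the same spatial filter to a model's TWS series and finding the gain k that best restores the filtered series to the unfiltered one; k is then applied to the filtered GRACE series. A minimal least-squares sketch on a synthetic signal (hypothetical data, not GRACE output):

```python
import numpy as np

def scaling_factor(model_true, model_filtered):
    """Least-squares gain k minimizing ||model_true - k*model_filtered||^2.
    In practice k is later multiplied onto the filtered GRACE TWS series."""
    return np.dot(model_filtered, model_true) / np.dot(model_filtered, model_filtered)

# Toy example: suppose filtering damps the seasonal signal to 60% of its amplitude
t = np.linspace(0, 4 * np.pi, 100)
true_tws = 10 * np.sin(t)          # unfiltered model TWS anomaly (cm)
filtered_tws = 0.6 * true_tws      # same series after low-pass filtering
print(round(scaling_factor(true_tws, filtered_tws), 3))  # -> 1.667
```

The recovered k = 1/0.6 exactly undoes the damping here; the abstract's point is that in arid, irrigated, or small basins different models imply very different k, so the restored amplitude is model-dependent.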

  3. Evaluating WAIS-IV structure through a different psychometric lens: structural causal model discovery as an alternative to confirmatory factor analysis.

    PubMed

    van Dijk, Marjolein J A M; Claassen, Tom; Suwartono, Christiany; van der Veld, William M; van der Heijden, Paul T; Hendriks, Marc P H

    Since the publication of the WAIS-IV in the U.S. in 2008, efforts have been made to explore its structural validity by applying factor analysis to various samples. This study aims to achieve a more fine-grained understanding of the structure of the Dutch-language version of the WAIS-IV (WAIS-IV-NL) by applying an alternative analysis based on causal modeling in addition to confirmatory factor analysis (CFA). The Bayesian Constraint-based Causal Discovery (BCCD) algorithm learns underlying network structures directly from data and assesses more complex structures than is possible with factor analysis. WAIS-IV-NL profiles of two clinical samples of 202 patients (i.e. patients with temporal lobe epilepsy and a mixed psychiatric outpatient group) were analyzed and contrasted with a matched control group (N = 202) selected from the Dutch standardization sample of the WAIS-IV-NL to investigate internal structure by means of CFA and BCCD. With CFA, the four-factor structure as proposed by Wechsler demonstrates acceptable fit in all three subsamples. However, BCCD revealed three consistent clusters (verbal comprehension, visual processing, and processing speed) in all three subsamples. The combination of Arithmetic and Digit Span as a coherent working memory factor could not be verified, and Matrix Reasoning appeared to be isolated. Thus, BCCD exposed some discrepancies from the proposed four-factor structure. Furthermore, these results fit the CHC theory of intelligence more closely. Consistent clustering patterns indicate that these results are robust. The structural causal discovery approach may be helpful in better interpreting existing tests, developing new tests, and supporting diagnostic decision-making.

  4. Assessing suicide risk among callers to crisis hotlines: a confirmatory factor analysis.

    PubMed

    Witte, Tracy K; Gould, Madelyn S; Munfakh, Jimmie Lou Harris; Kleinman, Marjorie; Joiner, Thomas E; Kalafat, John

    2010-09-01

    Our goal was to investigate the factor structure of a risk assessment tool utilized by suicide hotlines and to determine the predictive validity of the obtained factors in predicting subsequent suicidal behavior. We conducted an Exploratory Factor Analysis (EFA), an EFA in a Confirmatory Factor Analysis (EFA/CFA) framework, and a CFA on independent subsamples derived from a total sample of 1,085. Similar to previous studies, we found consistent evidence for a two-factor solution, with one factor representing a more pernicious form of suicide risk (i.e., Resolved Plans and Preparations; RPP) and one factor representing milder suicidal ideation (i.e., Suicidal Desire and Ideation; SDI). The RPP factor trended toward being more predictive of suicidal ideation at follow-up than the SDI factor. (c) 2010 Wiley Periodicals, Inc.

  5. Structured plant metabolomics for the simultaneous exploration of multiple factors.

    PubMed

    Vasilev, Nikolay; Boccard, Julien; Lang, Gerhard; Grömping, Ulrike; Fischer, Rainer; Goepfert, Simon; Rudaz, Serge; Schillberg, Stefan

    2016-11-17

    Multiple factors act simultaneously on plants to establish complex interaction networks involving nutrients, elicitors and metabolites. Metabolomics offers a better understanding of complex biological systems, but evaluating the simultaneous impact of different parameters on metabolic pathways that have many components is a challenging task. We therefore developed a novel approach that combines experimental design, untargeted metabolic profiling based on multiple chromatography systems and ionization modes, and multiblock data analysis, facilitating the systematic analysis of metabolic changes in plants caused by different factors acting at the same time. Using this method, target geraniol compounds produced in transgenic tobacco cell cultures were grouped into clusters based on their response to different factors. We hypothesized that our novel approach may provide more robust data for process optimization in plant cell cultures producing any target secondary metabolite, based on the simultaneous exploration of multiple factors rather than varying one factor each time. The suitability of our approach was verified by confirming several previously reported examples of elicitor-metabolite crosstalk. However, unravelling all factor-metabolite networks remains challenging because it requires the identification of all biochemically significant metabolites in the metabolomics dataset.

  6. Investigation of the Predictive Power of Academic Achievement, Learning Approaches and Self-Regulatory Learning Skills on University Entrance Exam Scores Using Path Analysis

    ERIC Educational Resources Information Center

    Ilhan-Beyaztas, Dilek; Göçer-Sahin, Sakine

    2018-01-01

    A good analysis of the success factors in the university entrance exam, which is an important step for academic careers of students, is believed to help them manage this process. Properties such as self-regulation and learning approaches adopted by students undoubtedly influence their academic achievement as well as their success in university…

  7. Examination of fungi in domestic interiors by using factor analysis: Correlations and associations with home factors. [Cladosporium, Alternaria, Epicoccum, Aureobasidium, Aspergillus; Penicillium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, H.J.; Rotnitzky, A.; Spengler, J.D.

    1992-01-01

    Factor analysis was utilized to investigate correlations among airborne microorganisms collected with Andersen samplers from homes in Topeka, Kans., during the winter of 1987 to 1988. The factors derived were used to relate microbial concentrations to categorical, questionnaire-derived descriptions of housing conditions. This approach successfully identified groups of common aboveground decay fungi including Cladosporium, Alternaria, Epicoccum, and Aureobasidium spp. The common soil fungi Aspergillus and Penicillium spp. were also separated as a group. These previously known ecological groupings were confirmed with air sampling data by a quantitative evaluation technique. The aboveground decay fungi sampled indoors in winter were present at relatively high concentrations in homes with gas stoves for cooking, suggesting a possible association between these fungi and increased humidity from the combustion process. Elevated concentrations of the soil fungi were significantly associated with the dirt-floor, crawl-space type of basement. Elevated concentrations of water-requiring fungi, such as Fusarium spp., were shown to be associated with water collection in domestic interiors. Also, elevated mean concentrations of the group of fungi including Cladosporium, Epicoccum, Aureobasidium, and yeast spp. were found to be associated with symptoms reported on a health questionnaire. This finding was consistent with the authors' previous study of associations between respiratory health and airborne microorganisms by univariate logistic regression analysis.

  8. Development and Factor Analysis of the Protective Factors Index: A Report Card Section Related to the Work of School Counselors

    ERIC Educational Resources Information Center

    Bass, Gwen; Lee, Ji Hee; Wells, Craig; Carey, John C.; Lee, Sangmin

    2015-01-01

    The scale development and exploratory and confirmatory factor analyses of the Protective Factor Index (PFI) is described. The PFI is a 13-item component of elementary students' report cards that replaces typical items associated with student behavior. The PFI is based on the Construct-Based Approach (CBA) to school counseling, which proposes that…

  9. Factor analysis of serogroups botanica and aurisina of Leptospira biflexa.

    PubMed

    Cinco, M

    1977-11-01

    Factor analysis was performed on serovars of the Botanica and Aurisina serogroups of Leptospira biflexa. The results show the arrangement of the main serovar- and serogroup-specific factors, as well as the antigens shared with serovars of heterologous serogroups.

  10. Risk factors for baclofen pump infection in children: a multivariate analysis.

    PubMed

    Spader, Heather S; Bollo, Robert J; Bowers, Christian A; Riva-Cambrin, Jay

    2016-06-01

    OBJECTIVE Intrathecal baclofen infusion systems to manage severe spasticity and dystonia are associated with higher infection rates in children than in adults. Factors unique to this population, such as poor nutrition and physical limitations for pump placement, have been hypothesized as the reasons for this disparity. The authors assessed potential risk factors for infection in a multivariate analysis. METHODS Patients who underwent implantation of a programmable pump and intrathecal catheter for baclofen infusion at a single center between January 1, 2000, and March 1, 2012, were identified in this retrospective cohort study. The primary end point was infection. Potential risk factors investigated included preoperative (i.e., demographics, body mass index [BMI], gastrostomy tube, tracheostomy, previous spinal fusion), intraoperative (i.e., surgeon, antibiotics, pump size, catheter location), and postoperative (i.e., wound dehiscence, CSF leak, and number of revisions) factors. Univariate analysis was performed, and a multivariate logistic regression model was created to identify independent risk factors for infection. RESULTS A total of 254 patients were evaluated. The overall infection rate was 9.8%. Univariate analysis identified young age, shorter height, lower weight, dehiscence, CSF leak, and number of revisions within 6 months of pump placement as significantly associated with infection. Multivariate analysis identified young age, dehiscence, and number of revisions as independent risk factors for infection. CONCLUSIONS Young age, wound dehiscence, and number of revisions were independent risk factors for infection in this pediatric cohort. A low BMI and the presence of either a gastrostomy or tracheostomy were not associated with infection and may not be contraindications for this procedure.

  11. Intelligent Systems Approaches to Product Sound Quality Analysis

    NASA Astrophysics Data System (ADS)

    Pietila, Glenn M.

    As a product market becomes more competitive, consumers become more discriminating in the way they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation, using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate consumer preference as it relates to a product's sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool that is used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated in an objective manner without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is Multiple Linear Regression (MLR), although more recently non-linear Artificial Neural Network (ANN) approaches have been gaining popularity. This dissertation reviews the publicly available literature and presents additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired comparison approach to sound quality analysis. This research proposes a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on the sound pairings that are most interesting and is expected to address some of the restrictions required by the Bradley-Terry model. It will also provide a more amenable framework for an intelligent systems approach.
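
    The Bradley-Terry model mentioned above assigns each item a positive "worth" p_i such that P(i beats j) = p_i / (p_i + p_j), fitted from paired-comparison counts. A minimal sketch of the classical iterative (MM) fit on hypothetical jury data; the win counts are made up for illustration:

```python
import numpy as np

def bradley_terry(wins, n_iter=200):
    """wins[i, j] = number of times item i was preferred over item j.
    Returns worth parameters p with P(i beats j) = p[i] / (p[i] + p[j])."""
    n = wins.shape[0]
    comparisons = wins + wins.T            # n_ij: total pairings of i and j
    total_wins = wins.sum(axis=1)
    p = np.ones(n)
    for _ in range(n_iter):
        denom = np.zeros(n)
        for i in range(n):
            for j in range(n):
                if i != j and comparisons[i, j] > 0:
                    denom[i] += comparisons[i, j] / (p[i] + p[j])
        p = total_wins / denom             # MM update (Zermelo's algorithm)
        p /= p.sum()                       # fix the arbitrary scale
    return p

# Hypothetical jury preferences among three product sounds
wins = np.array([[0, 8, 9],
                 [2, 0, 7],
                 [1, 3, 0]])
p = bradley_terry(wins)
print(np.round(p, 3))
```

The fitted worths induce a preference ranking over all sounds from pairwise judgments, which is the step the adaptive jury framework seeks to make more sample-efficient.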

  12. A global sensitivity analysis approach for morphogenesis models.

    PubMed

    Boas, Sonja E M; Navarro Jimenez, Maria I; Merks, Roeland M H; Blom, Joke G

    2015-11-21

    Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such 'black-box' models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all 'black-box' models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
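
    Variance-based global sensitivity analysis of the kind described above is commonly summarized by first-order Sobol indices. A minimal Monte Carlo "pick-freeze" sketch (a generic estimator, not the workflow's actual implementation), tested on a linear model where the indices are known analytically:

```python
import numpy as np

def first_order_sobol(f, d, n=100_000, rng=None):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices
    for f on d independent U(0,1) inputs: S_i = Cov(f(A), f(AB_i)) / Var(f(A)),
    where AB_i takes column i from sample A and the rest from sample B."""
    rng = rng or np.random.default_rng(0)
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA = f(A)
    var = yA.var()
    S = np.empty(d)
    for i in range(d):
        ABi = B.copy()
        ABi[:, i] = A[:, i]                # freeze input i from A
        S[i] = np.cov(yA, f(ABi))[0, 1] / var
    return S

# Linear test model: analytically S_i = a_i^2 / sum(a^2)
a = np.array([4.0, 2.0, 1.0])
S = first_order_sobol(lambda X: X @ a, d=3)
print(np.round(S, 2))
```

A large gap between an input's first-order index and its total-order index (not computed here) signals the parameter interactions that, as the abstract notes, complicate experimental verification of single-parameter effects.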

  13. A factor analysis approach to examining relationships among ovarian steroid concentrations, gonadotrophin concentrations and menstrual cycle length characteristics in healthy, cycling women

    PubMed Central

    Barrett, E.S.; Thune, I.; Lipson, S.F.; Furberg, A.-S.; Ellison, P.T.

    2013-01-01

    STUDY QUESTION How are ovarian steroid concentrations, gonadotrophins and menstrual cycle characteristics inter-related within normal menstrual cycles? SUMMARY ANSWER Within cycles, measures of estradiol production are highly related to one another, as are measures of progesterone production; however, the two hormones also show some independence from one another, and measures of cycle length and gonadotrophin concentrations show even greater independence, indicating minimal integration within cycles. WHAT IS KNOWN ALREADY The menstrual cycle is typically conceptualized as a cohesive unit, with hormone levels, follicular development and ovulation all closely inter-related within a single cycle. Empirical support for this idea is limited, however, and to our knowledge, no analysis has examined the relationships among all of these components simultaneously. STUDY DESIGN, SIZE, DURATION A total of 206 healthy, cycling Norwegian women participated in a prospective cohort study (EBBA-I) over the duration of a single menstrual cycle. Of these, 192 contributed hormonal and cycle data to the current analysis. PARTICIPANTS/MATERIALS, SETTING, METHODS Subjects provided daily saliva samples throughout the menstrual cycle from which estradiol and progesterone concentrations were measured. FSH and LH concentrations were measured in serum samples from three points in the same menstrual cycle and cycle length characteristics were calculated based on hormonal data and menstrual records. A factor analysis was conducted to examine the underlying relationships among 22 variables derived from the hormonal data and menstrual cycle characteristics. MAIN RESULTS AND THE ROLE OF CHANCE Six rotated factors emerged, explaining 80% of the variance in the data. Of these, factors representing estradiol and progesterone concentrations accounted for 37 and 13% of the variance, respectively. 
There was some association between measures of estradiol and progesterone production within cycles; however

  14. Exploring the relationships between epistemic beliefs about medicine and approaches to learning medicine: a structural equation modeling analysis.

    PubMed

    Chiu, Yen-Lin; Liang, Jyh-Chong; Hou, Cheng-Yen; Tsai, Chin-Chung

    2016-07-18

    Students' epistemic beliefs may vary in different domains; therefore, it may be beneficial for medical educators to better understand medical students' epistemic beliefs regarding medicine. Understanding how medical students are aware of medical knowledge and how they learn medicine is a critical issue of medical education. The main purposes of this study were to investigate medical students' epistemic beliefs relating to medical knowledge, and to examine their relationships with students' approaches to learning medicine. A total of 340 undergraduate medical students from 9 medical colleges in Taiwan were surveyed with the Medical-Specific Epistemic Beliefs (MSEB) questionnaire (i.e., multi-source, uncertainty, development, justification) and the Approach to Learning Medicine (ALM) questionnaire (i.e., surface motive, surface strategy, deep motive, and deep strategy). By employing the structural equation modeling technique, the confirmatory factor analysis and path analysis were conducted to validate the questionnaires and explore the structural relations between these two constructs. It was indicated that medical students with multi-source beliefs who were suspicious of medical knowledge transmitted from authorities were less likely to possess a surface motive and deep strategies. Students with beliefs regarding uncertain medical knowledge tended to utilize flexible approaches, that is, they were inclined to possess a surface motive but adopt deep strategies. Students with beliefs relating to justifying medical knowledge were more likely to have mixed motives (both surface and deep motives) and mixed strategies (both surface and deep strategies). However, epistemic beliefs regarding development did not have significant relations with approaches to learning. Unexpectedly, it was found that medical students with sophisticated epistemic beliefs (e.g., suspecting knowledge from medical experts) did not necessarily engage in deep approaches to learning medicine

  15. Approaches of researches in medical geography in Poland and Ukraine

    NASA Astrophysics Data System (ADS)

    Pantylej, Wiktoria

    2008-01-01

    This paper presents a historical review of medical geography in the world, in Poland and in Ukraine. There are different approaches in medical geography: according to the research subject (ecological and economic approaches) and according to current research concerns (approaches concerning sexuality, the age of the population and, accordingly, the accessibility of health care services to the population). In the author's view, the most promising approaches in medical geography in Poland and Ukraine are as follows: - integrative - dedicated to the health status of the population in connection with quality and level of life; - mathematical-statistical - concerned with synthetic indexes of the health status of populations and the factors influencing it, and with the economic value of the health and life of the population; - socioeconomic - the analysis of the influence of socioeconomic factors (such as wealth, unemployment rate, working conditions and others) on public health; - ecological - research dedicated to the analysis of environmental impact on the public health status of the population; - demographic - the analysis of demographic factors shaping public health status; - social-psychological - the health culture of the population, perception of one's own health/morbidity and the health care systems existing in different countries.

  16. Historical Evolution of Old-Age Mortality and New Approaches to Mortality Forecasting

    PubMed Central

    Gavrilov, Leonid A.; Gavrilova, Natalia S.; Krut'ko, Vyacheslav N.

    2017-01-01

    Knowledge of future mortality levels and trends is important for actuarial practice but poses a challenge to actuaries and demographers. The Lee-Carter method, currently used for mortality forecasting, is based on the assumption that the historical evolution of mortality at all age groups is driven by one factor only. This approach cannot capture the additive manner of mortality decline observed before the 1960s. To overcome the limitation of the one-factor model of mortality and to determine the true number of factors underlying mortality changes over time, we suggest a new approach to mortality analysis and forecasting based on the method of latent variable analysis. The basic assumption of this approach is that most variation in mortality rates over time is a manifestation of a small number of latent variables, variation in which gives rise to the observed mortality patterns. To extract major components of mortality variation, we apply factor analysis to mortality changes in developed countries over the period of 1900–2014. Factor analysis of time series of age-specific death rates in 12 developed countries (data taken from the Human Mortality Database) identified two factors capable of explaining almost 94 to 99 percent of the variance in the temporal changes of adult death rates at ages 25 to 85 years. Analysis of these two factors reveals that the first factor is a “young-age” or background factor with high factor loadings at ages 30 to 45 years. The second factor can be called an “old-age” or senescent factor because of high factor loadings at ages 65 to 85 years. It was found that the senescent factor was relatively stable in the past but now is rapidly declining for both men and women. The decline of the senescent factor is faster for men, although in most countries, it started almost 30 years later. Factor analysis of time series of age-specific death rates conducted for the oldest-old ages (65 to 100 years) found two factors explaining variation
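
    The latent-variable extraction described above amounts to taking the leading eigenvectors of the correlation matrix of age-specific rate changes and scaling them into factor loadings. A minimal sketch on synthetic data (two latent factors driving six hypothetical "age groups", not Human Mortality Database data):

```python
import numpy as np

def principal_factors(X, n_factors=2):
    """Loadings and explained-variance shares of the leading principal
    factors of the correlation matrix of X (rows = years, cols = age groups)."""
    R = np.corrcoef(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(R)           # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_factors]  # take the largest ones
    loadings = eigvecs[:, order] * np.sqrt(eigvals[order])
    explained = eigvals[order] / eigvals.sum()
    return loadings, explained

# Synthetic data: two independent latent factors plus noise
rng = np.random.default_rng(1)
f1, f2 = rng.normal(size=(2, 300))
X = np.column_stack([f1, f1, f1, f2, f2, f2]) + 0.3 * rng.normal(size=(300, 6))
loadings, explained = principal_factors(X)
print(np.round(explained.sum(), 2))  # most variance captured by two factors
```

When two factors reproduce most of the correlation structure, as here, the high-loading variables on each factor identify the "young-age" and "old-age" groupings the abstract describes.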

  17. Ultrasound-enhanced bioscouring of greige cotton: regression analysis of process factors

    USDA-ARS?s Scientific Manuscript database

    Process factors of enzyme concentration, time, power and frequency were investigated for ultrasound-enhanced bioscouring of greige cotton. A fractional factorial experimental design and subsequent regression analysis of the process factors were employed to determine the significance of each factor a...

  18. Non-planar one-loop Parke-Taylor factors in the CHY approach for quadratic propagators

    NASA Astrophysics Data System (ADS)

    Ahmadiniaz, Naser; Gomez, Humberto; Lopez-Arcos, Cristhiam

    2018-05-01

    In this work we have studied the Kleiss-Kuijf relations for the recently introduced one-loop Parke-Taylor factors in the CHY approach, which reproduce quadratic Feynman propagators. By doing this, we were able to identify the non-planar one-loop Parke-Taylor factors. To check that these new factors can in fact describe non-planar amplitudes, we applied them to the bi-adjoint Φ³ theory. As a byproduct, we found a new type of graphs that we call non-planar CHY-graphs. These graphs encode all the information for the subleading order at one loop, and there is no equivalent of them in the Feynman formalism.

  19. Data on master regulators and transcription factor binding sites found by upstream analysis of multi-omics data on methotrexate resistance of colon cancer.

    PubMed

    Kel, Alexander E.

    2017-02-01

    Computational analysis of master regulators through the search for transcription factor binding sites, followed by analysis of the signal transduction networks of a cell, is a new approach to causal analysis of multi-omics data. This paper contains results of an analysis of multi-omics data that include transcriptomics, proteomics and epigenomics data for a methotrexate (MTX)-resistant colon cancer cell line. The data were used to analyze mechanisms of resistance and to predict potential drug targets and promising compounds for reverting the MTX resistance of these cancer cells. We present all results of the analysis, including the lists of identified transcription factors and their binding sites in the genome and the list of predicted master regulators - potential drug targets. These data were generated in the study recently published in the article "Multi-omics "Upstream Analysis" of regulatory genomic regions helps identifying targets against methotrexate resistance of colon cancer" (Kel et al., 2016) [4]. These data are of interest for researchers in the field of multi-omics data analysis and for biologists interested in the identification of novel drug targets against MTX resistance.

  20. Semiotic Approach to the Analysis of Children's Drawings

    ERIC Educational Resources Information Center

    Turkcan, Burcin

    2013-01-01

    Semiotics, which is used for the analysis of a number of communication languages, helps describe the specific operational rules by determining the sub-systems included in the field it examines. Considering that art is a communication language, this approach could be used in analyzing children's products in art education. The present study aiming…

  1. Considering a Cost Analysis Project? A Planned Approach

    ERIC Educational Resources Information Center

    Parish, Mina; Teetor, Travis

    2006-01-01

    As resources become more constrained in the library community, many organizations are finding that they need to have a better understanding of their costs. To this end, this article will present one approach to conducting a cost analysis (including questions to ask yourself, project team makeup, organizational support, and data organization). We…

  2. Targeted quantitative analysis of Streptococcus pyogenes virulence factors by multiple reaction monitoring.

    PubMed

    Lange, Vinzenz; Malmström, Johan A; Didion, John; King, Nichole L; Johansson, Björn P; Schäfer, Juliane; Rameseder, Jonathan; Wong, Chee-Hong; Deutsch, Eric W; Brusniak, Mi-Youn; Bühlmann, Peter; Björck, Lars; Domon, Bruno; Aebersold, Ruedi

    2008-08-01

    In many studies, particularly in the field of systems biology, it is essential that identical protein sets are precisely quantified in multiple samples, such as those representing differentially perturbed cell states. The high degree of reproducibility required for such experiments has not been achieved by classical mass spectrometry-based proteomics methods. In this study we describe the implementation of a targeted quantitative approach by which predetermined protein sets are first identified and subsequently reliably quantified at high sensitivity in multiple samples. This approach consists of three steps. First, the proteome is extensively mapped out by multidimensional fractionation and tandem mass spectrometry, and the data generated are assembled in the PeptideAtlas database. Second, based on this proteome map, peptides uniquely identifying the proteins of interest (proteotypic peptides) are selected, and multiple reaction monitoring (MRM) transitions are established and validated by MS2 spectrum acquisition. This process of peptide selection, transition selection, and validation is supported by a suite of software tools, TIQAM (Targeted Identification for Quantitative Analysis by MRM), described in this study. Third, the selected target protein set is quantified in multiple samples by MRM. Applying this approach, we were able to reliably quantify low-abundance virulence factors from cultures of the human pathogen Streptococcus pyogenes exposed to increasing amounts of plasma. The resulting quantitative protein patterns enabled us to clearly define the subset of virulence proteins that is regulated upon plasma exposure.

  3. Confirmatory Factor Analysis of the WISC-III with Child Psychiatric Inpatients.

    ERIC Educational Resources Information Center

    Tupa, David J.; Wright, Margaret O'Dougherty; Fristad, Mary A.

    1997-01-01

    Factor models of the Wechsler Intelligence Scale for Children-Third Edition (WISC-III) for one, two, three, and four factors were tested using confirmatory factor analysis with a sample of 177 child psychiatric inpatients. The four-factor model proposed in the WISC-III manual provided the best fit to the data. (SLD)

  4. The Meaning of Higher-Order Factors in Reflective-Measurement Models

    ERIC Educational Resources Information Center

    Eid, Michael; Koch, Tobias

    2014-01-01

    Higher-order factor analysis is a widely used approach for analyzing the structure of a multidimensional test. Whenever first-order factors are correlated researchers are tempted to apply a higher-order factor model. But is this reasonable? What do the higher-order factors measure? What is their meaning? Willoughby, Holochwost, Blanton, and Blair…

  5. Factors Leading to Success in Diversified Occupation: A Livelihood Analysis in India

    ERIC Educational Resources Information Center

    Saha, Biswarup; Bahal, Ram

    2015-01-01

    Purpose: Livelihood diversification is a sound alternative for higher economic growth and its success or failure is conditioned by the interplay of a multitude of factors. The study of the profile of the farmers in which they operate is important to highlight the factors leading to success in diversified livelihoods. Design/Methodology/Approach: A…

  6. Emotional experiences and motivating factors associated with fingerprint analysis.

    PubMed

    Charlton, David; Fraser-Mackenzie, Peter A F; Dror, Itiel E

    2010-03-01

    In this study, we investigated the emotional and motivational factors involved in fingerprint analysis in day-to-day routine case work and in significant and harrowing criminal investigations. Thematic analysis was performed on interviews with 13 experienced fingerprint examiners from a variety of law enforcement agencies. The data revealed factors relating to job satisfaction and the use of skill. Individual satisfaction related to catching criminals was observed; this was most notable in solving high profile, serious, or long-running cases. There were positive emotional effects associated with matching fingerprints and apparent fear of making errors. Finally, we found evidence for a need of cognitive closure in fingerprint examiner decision-making.

  7. Image-derived input function with factor analysis and a-priori information.

    PubMed

    Simončič, Urban; Zanotti-Fregonara, Paolo

    2015-02-01

    Quantitative PET studies often require the cumbersome and invasive procedure of arterial cannulation to measure the input function. This study sought to minimize the number of necessary blood samples by developing a factor-analysis-based image-derived input function (IDIF) methodology for dynamic PET brain studies. IDIF estimation was performed as follows: (a) carotid and background regions were segmented manually on an early PET time frame; (b) blood-weighted and tissue-weighted time-activity curves (TACs) were extracted with factor analysis; (c) factor analysis results were denoised and scaled using the voxels with the highest blood signal; (d) using population data and one blood sample at 40 min, the whole-blood TAC was estimated from postprocessed factor analysis results; and (e) the parent concentration was finally estimated by correcting the whole-blood curve with measured radiometabolite concentrations. The methodology was tested using data from 10 healthy individuals imaged with [(11)C](R)-rolipram. The accuracy of IDIFs was assessed against full arterial sampling by comparing the areas under the curve of the input functions and by calculating the total distribution volume (VT). The shape of the image-derived whole-blood TAC matched the reference arterial curves well, and the whole-blood areas under the curve were accurately estimated (mean error 1.0±4.3%). The relative Logan-V(T) error was -4.1±6.4%. Compartmental modeling and spectral analysis gave less accurate V(T) results compared with Logan analysis. A factor-analysis-based IDIF for [(11)C](R)-rolipram brain PET studies that relies on a single blood sample and population data can be used for accurate quantification of Logan-V(T) values.

  8. Factors that Affect Poverty Areas in North Sumatera Using Discriminant Analysis

    NASA Astrophysics Data System (ADS)

    Nasution, D. H.; Bangun, P.; Sitepu, H. R.

    2018-04-01

    In Indonesia, and especially in North Sumatera, poverty is one of the fundamental problems on which both central and local governments focus. Although the poverty rate has decreased, many people remain poor. Poverty covers several aspects such as education, health, demographics, and also structural and cultural factors. This research discusses several factors, namely population density, unemployment rate, GDP per capita (ADHK), GDP per capita (ADHB), economic growth, and life expectancy, that affect poverty in Indonesia. To determine the factors that most influence and best differentiate the poverty levels of the regencies/cities of North Sumatra, the discriminant analysis method was used. Discriminant analysis is a multivariate technique used to classify data into groups based on the dependent and independent variables. Using discriminant analysis, it is evident that the factor affecting poverty is the unemployment rate.
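    As a sketch of the technique described above, the two-group case of discriminant analysis (Fisher's linear discriminant) can be written in a few lines. The data below are synthetic and the indicator names are placeholders, not the study's actual dataset:

    ```python
    import numpy as np

    # Hypothetical example: classify regions as "poor" (1) / "not poor" (0)
    # from two standardized indicators (e.g. unemployment rate, GDP per capita).
    rng = np.random.default_rng(0)
    poor     = rng.normal([ 1.0, -1.0], 0.5, size=(30, 2))   # class 1
    not_poor = rng.normal([-1.0,  1.0], 0.5, size=(30, 2))   # class 0

    # Fisher's discriminant direction: w = Sw^-1 (mu1 - mu0), where Sw is the
    # pooled within-class scatter matrix.
    mu1, mu0 = poor.mean(axis=0), not_poor.mean(axis=0)
    Sw = np.cov(poor, rowvar=False) + np.cov(not_poor, rowvar=False)
    w = np.linalg.solve(Sw, mu1 - mu0)

    # Classify by projecting onto w and thresholding at the midpoint of the
    # projected class means.
    threshold = w @ (mu1 + mu0) / 2
    predict = lambda X: (X @ w > threshold).astype(int)

    accuracy = (np.concatenate([predict(poor), 1 - predict(not_poor)]) == 1).mean()
    print(f"training accuracy: {accuracy:.2f}")
    ```

    The relative magnitudes of the discriminant weights in w indicate which indicator contributes most to separating the groups, which is the sense in which the study singles out the unemployment rate.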

  9. A joint probability approach for coincidental flood frequency analysis at ungauged basin confluences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Cheng

    2016-03-12

    A reliable and accurate flood frequency analysis at the confluence of streams is of importance. Given that long-term peak flow observations are often unavailable at tributary confluences, at a practical level, this paper presents a joint probability approach (JPA) to address the coincidental flood frequency analysis at the ungauged confluence of two streams based on the flow rate data from the upstream tributaries. One case study is performed for comparison against several traditional approaches, including the plotting-position formula, the univariate flood frequency analysis, and the National Flood Frequency Program developed by the US Geological Survey. It shows that the results generated by the JPA approach agree well with the floods estimated by the plotting-position and univariate flood frequency analyses based on the observation data.
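    The joint-probability idea can be illustrated with a small Monte Carlo experiment; the distributions and parameters below are invented for illustration and are not taken from the paper's case study:

    ```python
    import math
    import random

    random.seed(42)

    # Hypothetical sketch: simulate annual-maximum flows on two independent
    # tributaries (Gumbel-distributed, made-up parameters), combine them at
    # the confluence, and read off a design quantile empirically.
    def gumbel_sample(loc, scale):
        u = random.random()
        return loc - scale * math.log(-math.log(u))  # inverse Gumbel CDF

    N = 100_000
    confluence = sorted(
        gumbel_sample(200.0, 40.0) + gumbel_sample(120.0, 25.0)  # m^3/s
        for _ in range(N)
    )

    # Empirical 100-year flood: the flow exceeded with annual probability 1/100.
    q100 = confluence[int(N * (1 - 1 / 100))]
    print(f"estimated 100-year confluence flow: {q100:.0f} m^3/s")
    ```

    A real JPA must also account for dependence between the tributary flows (for instance via a joint distribution or copula); the independence assumption here is only to keep the sketch short.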

  10. Using Student-Centred Learning Environments to Stimulate Deep Approaches to Learning: Factors Encouraging or Discouraging Their Effectiveness

    ERIC Educational Resources Information Center

    Baeten, Marlies; Kyndt, Eva; Struyven, Katrien; Dochy, Filip

    2010-01-01

    This review outlines encouraging and discouraging factors in stimulating the adoption of deep approaches to learning in student-centred learning environments. Both encouraging and discouraging factors can be situated in the context of the learning environment, in students' perceptions of that context and in characteristics of the students…

  11. A Bayesian Approach to the Overlap Analysis of Epidemiologically Linked Traits.

    PubMed

    Asimit, Jennifer L; Panoutsopoulou, Kalliope; Wheeler, Eleanor; Berndt, Sonja I; Cordell, Heather J; Morris, Andrew P; Zeggini, Eleftheria; Barroso, Inês

    2015-12-01

    Diseases often co-occur in individuals more often than expected by chance, which may be explained by shared underlying genetic etiology. A common approach to genetic overlap analyses is to use summary genome-wide association study data to identify single-nucleotide polymorphisms (SNPs) that are associated with multiple traits at a selected P-value threshold. However, P-values do not account for differences in power, whereas Bayes factors (BFs) do, and they may be approximated using summary statistics. We use simulation studies to compare the power of frequentist and Bayesian approaches to overlap analyses, and to decide on appropriate thresholds for comparison between the two methods. It is empirically illustrated that BFs have the advantage over P-values of a decreasing type I error rate as study size increases for single-disease associations. Consequently, overlap analysis of traits from different-sized studies encounters issues in fair P-value threshold selection, whereas BFs are adjusted automatically. Extensive simulations show that Bayesian overlap analyses tend to have higher power than those that assess association strength with P-values, particularly in low-power scenarios. Calibration tables between BFs and P-values are provided for a range of sample sizes, as well as an approximation approach for sample sizes that are not in the calibration table. Although P-values are sometimes thought more intuitive, these tables assist in removing the opaqueness of Bayesian thresholds and may also be used in the selection of a BF threshold to meet a certain type I error rate. An application of our methods is used to identify variants associated with both obesity and osteoarthritis. © 2015 The Authors. Genetic Epidemiology published by Wiley Periodicals, Inc.
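    One way BFs "may be approximated using summary statistics", as the abstract notes, is in the style of Wakefield's approximate Bayes factor. A minimal sketch, with made-up summary statistics and an assumed N(0, W) prior on the true effect size:

    ```python
    import math

    def abf_assoc(beta_hat, se, prior_sd=0.2):
        """Approximate Bayes factor for association vs. no association,
        computed from summary statistics (effect estimate and its standard
        error).  Under H1, beta_hat ~ N(0, V + W); under H0, beta_hat ~ N(0, V),
        with V = se^2 and W the prior variance of the effect size."""
        V, W = se**2, prior_sd**2
        z = beta_hat / se
        # Ratio of the two normal densities evaluated at beta_hat:
        return math.sqrt(V / (V + W)) * math.exp(z**2 * W / (2 * (V + W)))

    # Hypothetical summary statistics for one SNP in two different-sized
    # studies: the same z-score (same P-value) in each.
    for se in (0.10, 0.02):           # larger study -> smaller standard error
        beta = 3.0 * se               # z = 3 in both studies
        print(f"se={se:.2f}  BF10={abf_assoc(beta, se):.1f}")
    ```

    The output illustrates the abstract's point: at a fixed z-score the BF depends on the standard error, so the evidence threshold is automatically adjusted for study size, whereas a fixed P-value threshold is not.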

  12. Violence in public transportation: an approach based on spatial analysis

    PubMed Central

    de Sousa, Daiane Castro Bittencourt; Pitombo, Cira Souza; Rocha, Samille Santos; Salgueiro, Ana Rita; Delgado, Juan Pedro Moreno

    2017-01-01

    OBJECTIVE: To carry out a spatial analysis of the occurrence of acts of violence (specifically robberies) in public transportation, identifying the regions of greater incidence using geostatistics, and possible causes with the aid of a multicriteria analysis in a Geographic Information System. METHODS: The unit of analysis is the traffic analysis zone of the survey named Origem-Destino, carried out in Salvador, state of Bahia, in 2013. The robberies recorded by the Department of Public Security of Bahia in 2013 were located and made compatible with the limits of the traffic analysis zones and, later, associated with the respective centroids. After determining the regions with the highest probability of robbery, we carried out a geographic analysis of the possible causes in the region with the highest robbery potential, considering the factors analyzed using a multicriteria analysis in a Geographic Information System environment. RESULTS: The execution of the two steps of this study allowed us to identify areas corresponding to the greater probability of occurrence of robberies in public transportation. In addition, the three most vulnerable road sections (Estrada da Liberdade, Rua Pero Vaz, and Avenida General San Martin) were identified in these areas. In these sections, the factors that most contribute to the potential for robbery on buses are: F1 - proximity to places that facilitate escape, F3 - great movement of persons, and F2 - absence of policing, respectively. CONCLUSIONS: Indicator kriging (a geostatistical estimation method) can be used to construct a spatial probability surface, which can be a useful tool for the implementation of public policies. The multicriteria analysis in the Geographic Information System environment allowed us to understand the spatial factors related to the phenomenon under analysis. PMID:29236883

  13. Analysis of risk factors for central venous catheter-related complications: a prospective observational study in pediatric patients with bone sarcomas.

    PubMed

    Abate, Massimo Eraldo; Sánchez, Olga Escobosa; Boschi, Rita; Raspanti, Cinzia; Loro, Loretta; Affinito, Domenico; Cesari, Marilena; Paioli, Anna; Palmerini, Emanuela; Ferrari, Stefano

    2014-01-01

    The incidence of central venous catheter (CVC)-related complications reported in pediatric sarcoma patients is not established as reports in available literature are limited. The analysis of risk factors is part of the strategy to reduce the incidence of CVC complications. The objective of this study was to determine the incidence of CVC complications in children with bone sarcomas and if defined clinical variables represent a risk factor. During an 8-year period, 155 pediatric patients with bone sarcomas were prospectively followed up for CVC complications. Incidence and correlation with clinical features including gender, age, body mass index, histology, disease stage, and use of thromboprophylaxis with low-molecular-weight heparin were analyzed. Thirty-three CVC complications were recorded among 42 687 CVC-days (0.77 per 1000 CVC-days). No correlation between the specific clinical variables and the CVC complications was found. A high incidence of CVC-related sepsis secondary to gram-negative bacteria was observed. The analysis of CVC complications and their potential risk factors in this sizable and relatively homogeneous pediatric population with bone sarcomas has led to the implementation of a multimodal approach by doctors and nurses to reduce the incidence and morbidity of the CVC-related infections, particularly those related to gram-negative bacteria. As a result of this joint medical and nursing study, a multimodal approach that included equipping faucets with water filters, the reeducation of doctors and nurses, and the systematic review of CVC protocol was implemented.

  14. What Is Rotating in Exploratory Factor Analysis?

    ERIC Educational Resources Information Center

    Osborne, Jason W.

    2015-01-01

    Exploratory factor analysis (EFA) is one of the most commonly reported quantitative methodologies in the social sciences, yet much of the detail regarding what happens during an EFA remains unclear. The goal of this brief technical note is to explore what "rotation" is, what exactly is rotating, and why we use rotation when performing…
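    For concreteness: in varimax rotation it is an orthogonal rotation matrix applied to the loading matrix, not the data, that changes. A minimal textbook-style implementation (with toy loadings that are not from the article) might look like:

    ```python
    import numpy as np

    def varimax(loadings, tol=1e-8, max_iter=500):
        """Varimax rotation of a factor-loading matrix (variables x factors).
        Returns the rotated loadings; a minimal textbook sketch."""
        L = np.asarray(loadings, dtype=float)
        n, k = L.shape
        R = np.eye(k)                      # accumulated rotation matrix
        var = 0.0
        for _ in range(max_iter):
            LR = L @ R
            # SVD step of Kaiser's varimax criterion
            u, s, vt = np.linalg.svd(
                L.T @ (LR**3 - LR @ np.diag((LR**2).sum(axis=0)) / n)
            )
            R = u @ vt
            new_var = s.sum()
            if new_var - var < tol:        # criterion stopped improving
                break
            var = new_var
        return L @ R

    # Toy unrotated loadings: four variables, two factors, mixed loadings.
    L = np.array([[0.7, 0.3], [0.8, 0.2], [0.3, 0.7], [0.2, 0.8]])
    rotated = varimax(L)
    print(np.round(rotated, 2))
    ```

    After rotation each variable tends to load mostly on a single factor ("simple structure"), while the communalities (row sums of squared loadings) are unchanged, because an orthogonal rotation preserves row norms.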

  15. Comprehensive Case Analysis on Participatory Approaches, from Nexus Perspectives

    NASA Astrophysics Data System (ADS)

    Masuhara, N.; Baba, K.

    2014-12-01

    According to the Messages from the Bonn2011 Conference, local communities should be involved fully and effectively in the planning and implementation processes related to the water, energy and food nexus, for local ownership and commitment. Participatory approaches such as deliberative polling and "joint fact-finding" have been applied to resolve various environmental disputes; however, the drivers and barriers in such processes have not necessarily been analyzed comprehensively, especially in Japan. Our research aims to explore solutions for conflicts in the context of the water-energy-food nexus in local communities. To achieve this, we clarify the drivers and barriers of each approach applied so far in water, energy and food policy, focusing on how scientific facts are dealt with. Our primary hypothesis is that multi-issue solutions through policy integration will be more effective for conflicts in the context of the water-energy-food nexus than single-issue solutions within each policy. One of the key factors in formulating effective solutions is integrating "scientific fact (expert knowledge)" and "local knowledge". Given this primary hypothesis, we assume more specifically that it is effective for consensus building to provide early opportunities to resolve disagreements over "framing", so that stakeholders can indicate to experts the points on which scientific facts are needed and experts can reach a common understanding of those facts. To verify these hypotheses, we are developing a database of cases in which such participatory approaches have been applied to resolve environmental disputes, based on a literature survey of journal articles and public documents on Japanese cases. The database is still under construction, but conditions of framing and of providing scientific information appear to be important driving factors for problem solving and consensus building.

  16. Logistic Regression and Path Analysis Method to Analyze Factors influencing Students’ Achievement

    NASA Astrophysics Data System (ADS)

    Noeryanti, N.; Suryowati, K.; Setyawan, Y.; Aulia, R. R.

    2018-04-01

    Students' academic achievement cannot be separated from the influence of two sets of factors, namely internal and external factors. The internal factors (factors within the student) consist of intelligence (X1), health (X2), interest (X3), and motivation (X4). The external factors consist of family environment (X5), school environment (X6), and society environment (X7). The subjects of this research are eighth-grade students of the 2016/2017 school year at SMPN 1 Jiwan Madiun, sampled using simple random sampling. Primary data were obtained by distributing questionnaires. The method used in this study is binary logistic regression analysis, which aims to identify the internal and external factors that affect students' achievement and their trends. Path analysis was used to determine the factors that influence students' achievement directly, indirectly, or totally. Based on the results of binary logistic regression, the variables that affect students' achievement are interest and motivation. Based on the results obtained by path analysis, the factors that have a direct impact on students' achievement are students' interest (59%) and students' motivation (27%), while the factors that have an indirect influence on students' achievement are family environment (97%) and school environment (37%).
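    A minimal sketch of the binary logistic regression step, with synthetic data and made-up "true" weights standing in for the questionnaire variables, might look like:

    ```python
    import math
    import random

    random.seed(1)

    def sigmoid(t):
        return 1.0 / (1.0 + math.exp(-t))

    # Hypothetical data: pass/fail achievement as a function of two
    # standardized predictors (say, interest and motivation).
    data = []
    for _ in range(500):
        x1, x2 = random.gauss(0, 1), random.gauss(0, 1)
        p = sigmoid(1.5 * x1 + 0.8 * x2)          # made-up true model
        data.append((x1, x2, 1 if random.random() < p else 0))

    # Fit by batch gradient descent on the log-likelihood.
    w1 = w2 = b = 0.0
    lr = 0.1
    for _ in range(2000):
        g1 = g2 = gb = 0.0
        for x1, x2, y in data:
            err = sigmoid(w1 * x1 + w2 * x2 + b) - y
            g1 += err * x1; g2 += err * x2; gb += err
        n = len(data)
        w1 -= lr * g1 / n; w2 -= lr * g2 / n; b -= lr * gb / n

    print(f"fitted weights: w1={w1:.2f}, w2={w2:.2f}, b={b:.2f}")
    ```

    The fitted coefficients (or their odds ratios exp(w)) express the "trends" the abstract mentions; the path-analysis step would additionally decompose direct and indirect effects across the X1..X7 variables, which this sketch does not attempt.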

  17. Rethinking vulnerability analysis and governance with emphasis on a participatory approach.

    PubMed

    Rossignol, Nicolas; Delvenne, Pierre; Turcanu, Catrinel

    2015-01-01

    This article draws on vulnerability analysis as it emerged as a complement to classical risk analysis, and it aims at exploring its ability for nurturing risk and vulnerability governance actions. An analysis of the literature on vulnerability analysis allows us to formulate a three-fold critique: first, vulnerability analysis has been treated separately in the natural and the technological hazards fields. This separation prevents vulnerability from unleashing the full range of its potential, as it constrains appraisals into artificial categories and thus already closes down the outcomes of the analysis. Second, vulnerability analysis focused on assessment tools that are mainly quantitative, whereas qualitative appraisal is a key to assessing vulnerability in a comprehensive way and to informing policy making. Third, a systematic literature review of case studies reporting on participatory approaches to vulnerability analysis allows us to argue that participation has been important to address the above, but it remains too closed down in its approach and would benefit from embracing a more open, encompassing perspective. Therefore, we suggest rethinking vulnerability analysis as one part of a dynamic process between opening-up and closing-down strategies, in order to support a vulnerability governance framework. © 2014 Society for Risk Analysis.

  18. Awareness levels about breast cancer risk factors, early warning signs, and screening and therapeutic approaches among Iranian adult women: a large population based study using latent class analysis.

    PubMed

    Tazhibi, Mahdi; Feizi, Awat

    2014-01-01

    Breast cancer (BC) continues to be a major cause of morbidity and mortality among women throughout the world and in Iran. Lack of awareness and of early detection programs in developing countries is a main reason for the escalating mortality. The present research was conducted to assess Iranian women's level of knowledge about breast cancer risk factors, early warning signs, and therapeutic and screening approaches, and its correlated determinants. In a cross-sectional study, 2250 women were investigated using a self-administered questionnaire about risk factors, early warning signs, and therapeutic and screening approaches of BC, before participating in a community-based screening and public educational program at an institute of cancer research in Isfahan, Iran, in 2012. Latent class regression, as a comprehensive statistical method, was used for evaluating the level of knowledge and its correlated determinants. Only 33.2%, 31.9%, 26.7%, and 35.8% of study participants had high awareness levels about screening approaches, risk factors, early warning signs, and therapeutic modalities of breast cancer, respectively, and the majority had poor to moderate knowledge levels. The most effective predictors of a high level of awareness were higher educational qualifications, attendance at screening and public educational programs, personal problem, and family history of BC, respectively. The results of the current study indicated that the levels of awareness among the study population about key elements of BC are low. These findings reinforce the continuing need for more BC education through public and professional programs intended to raise awareness among younger, single women and those with low educational attainment and without a family history.

  19. Confirmatory Factor Analysis of the Delirium Rating Scale Revised-98 (DRS-R98).

    PubMed

    Thurber, Steven; Kishi, Yasuhiro; Trzepacz, Paula T; Franco, Jose G; Meagher, David J; Lee, Yanghyun; Kim, Jeong-Lan; Furlanetto, Leticia M; Negreiros, Daniel; Huang, Ming-Chyi; Chen, Chun-Hsin; Kean, Jacob; Leonard, Maeve

    2015-01-01

    Principal components analysis applied to the Delirium Rating Scale-Revised-98 contributes to understanding the delirium construct. Using a multisite pooled international delirium database, the authors applied confirmatory factor analysis to Delirium Rating Scale-Revised-98 scores from 859 adult patients evaluated by delirium experts (delirium, N=516; nondelirium, N=343). Confirmatory factor analysis found all diagnostic features and core symptoms (cognitive, language, thought process, sleep-wake cycle, motor retardation), except motor agitation, loaded onto factor 1. Motor agitation loaded onto factor 2 with noncore symptoms (delusions, affective lability, and perceptual disturbances). Factor 1 loading supports delirium as a single construct, but when accompanied by psychosis, motor agitation's role may not be solely as a circadian activity indicator.

  20. A Pocock Approach to Sequential Meta-Analysis of Clinical Trials

    ERIC Educational Resources Information Center

    Shuster, Jonathan J.; Neu, Josef

    2013-01-01

    Three recent papers have provided sequential methods for meta-analysis of two-treatment randomized clinical trials. This paper provides an alternate approach that has three desirable features. First, when carried out prospectively (i.e., we only have the results up to the time of our current analysis), we do not require knowledge of the…

  1. A Comparative Study of Data Envelopment Analysis and Other Approaches to Efficiency Evaluation and Estimation.

    DTIC Science & Technology

    1982-11-01

    A Comparative Study of Data Envelopment Analysis and Other Approaches to Efficiency Evaluation and Estimation, by A. Charnes, W.W. Cooper, and H.D. Sherman; Center for Cybernetic Studies, University of Texas at Austin. Draws on a School of Business study, 1981, entitled "Measurement of Hospital Efficiency: A Comparative Analysis of Data Envelopment Analysis and Other Approaches for…

  2. Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L; Mandelli, Diego; Zhegang Ma

    2014-11-01

    As part of the Light Water Reactor Sustainability Program (LWRS) [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim to improve the economics and reliability, and sustain the safety, of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a "station blackout" (SBO), wherein offsite and onsite power are lost, thereby challenging plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.

  3. Factor Analysis of the Aberrant Behavior Checklist in Individuals with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Brinkley, Jason; Nations, Laura; Abramson, Ruth K.; Hall, Alicia; Wright, Harry H.; Gabriels, Robin; Gilbert, John R.; Pericak-Vance, Margaret A. O.; Cuccaro, Michael L.

    2007-01-01

    Exploratory factor analysis (varimax and promax rotations) of the aberrant behavior checklist-community version (ABC) in 275 individuals with Autism spectrum disorder (ASD) identified four- and five-factor solutions which accounted for greater than 70% of the variance. Confirmatory factor analysis (Lisrel 8.7) revealed indices of moderate fit for…

  4. Flow-graph approach for optical analysis of planar structures.

    PubMed

    Minkov, D

    1994-11-20

    The flow-graph approach (FGA) is applied to optical analysis of isotropic stratified planar structures (ISPS's) at inclined light incidence. Conditions for the presence of coherent and noncoherent light interaction within ISPS's are determined. Examples of the use of FGA for calculation of the transmission and the reflection of two-layer ISPS's for different types of light interaction are given. The advantages of the use of FGA for optical analysis of ISPS's are discussed.

  5. A factor analysis of landscape pattern and structure metrics

    Treesearch

    Kurt H. Riitters; R.V. O' Neill; C.T. Hunsaker; James D. Wickham; D.H. Yankee; S.P. Timmins; K.B. Jones; B.L. Jackson

    1995-01-01

    Fifty-five metrics of landscape pattern and structure were calculated for 85 maps of land use and land cover. A multivariate factor analysis was used to identify the common axes (or dimensions) of pattern and structure which were measured by a reduced set of 26 metrics. The first six factors explained about 87% of the variation in the 26 landscape metrics. These...

  6. Critical factors for the return-to-work process among people with affective disorders: Voices from two vocational approaches.

    PubMed

    Porter, Susann; Lexén, Annika; Johansson, Suzanne; Bejerholm, Ulrika

    2018-05-22

    Depression is among the major causes of disability, with a negative impact on both daily life and work performance. Whilst depression is the primary cause of sick leave and unemployment in today's workplace, there is a lack of knowledge of the needs of individuals with depression regarding their return-to-work (RTW) process. OBJECTIVE: To explore which factors are of critical importance in the RTW process for people suffering from depression who are also unemployed, and to explore the impact of two vocational approaches on service users' experiences. METHODS: The study included participants in two vocational rehabilitation approaches: Individual Enabling and Support (IES) and Traditional Vocational Rehabilitation (TVR). Qualitative methods were applied to explore critical factors in the RTW process. Individuals with affective disorders, including depression and bipolar disorder, were included. RESULTS: Three themes emerged as critical factors: experiencing hope and power; professionals' positive attitudes, beliefs, and behaviours; and employing a holistic perspective that integrates health and vocational services. CONCLUSION: This study has demonstrated critical factors in the return-to-work process as experienced by persons with depression. Experiencing hope and power and meeting professionals who believe "you can work" and who use a person-centred, holistic service approach are factors necessary for gaining a real job. In particular, professionals in TVR need to embrace this understanding, since their services were not experienced as including these elements.

  7. Analysis of the regulation of viral transcription.

    PubMed

    Gloss, Bernd; Kalantari, Mina; Bernard, Hans-Ulrich

    2005-01-01

    Despite the small genomes and number of genes of papillomaviruses, regulation of their transcription is very complex and governed by numerous transcription factors, cis-responsive elements, and epigenetic phenomena. This chapter describes the strategies of how one can approach a systematic analysis of these factors, elements, and mechanisms. From the numerous different techniques useful for studying transcription, we describe in detail three selected protocols of approaches that have been relevant in shaping our knowledge of human papillomavirus transcription. These are DNAse I protection ("footprinting") for location of transcription-factor binding sites, electrophoretic mobility shifts ("gelshifts") for analysis of bound transcription factors, and bisulfite sequencing for analysis of DNA methylation as a prerequisite for epigenetic transcriptional regulation.

  8. Factors Associated with Sexual Behavior among Adolescents: A Multivariate Analysis.

    ERIC Educational Resources Information Center

    Harvey, S. Marie; Spigner, Clarence

    1995-01-01

    A self-administered survey examining multiple factors associated with engaging in sexual intercourse was completed by 1,026 high school students in a classroom setting. Findings suggest that effective interventions to address teenage pregnancy need to utilize a multifaceted approach to the prevention of high-risk behaviors. (JPS)

  9. Phenotypic factor analysis of psychopathology reveals a new body-related transdiagnostic factor.

    PubMed

    Pezzoli, Patrizia; Antfolk, Jan; Santtila, Pekka

    2017-01-01

    Comorbidity challenges the notion of mental disorders as discrete categories. An increasing body of literature shows that symptoms cut across traditional diagnostic boundaries and interact in shaping the latent structure of psychopathology. Using exploratory and confirmatory factor analysis, we reveal the latent sources of covariation among nine measures of psychopathological functioning in a population-based sample of 13,024 Finnish twins and their siblings. By implementing unidimensional, multidimensional, second-order, and bifactor models, we illustrate the relationships between observed variables, specific, and general latent factors. We also provide the first investigation to date of measurement invariance of the bifactor model of psychopathology across gender and age groups. Our main result is the identification of a distinct "Body" factor, alongside the previously identified Internalizing and Externalizing factors. We also report relevant cross-disorder associations, especially between body-related psychopathology and trait anger, as well as substantial sex and age differences in observed and latent means. The findings expand the meta-structure of psychopathology, with implications for empirical and clinical practice, and demonstrate shared mechanisms underlying attitudes towards nutrition, self-image, sexuality and anger, with gender- and age-specific features.

  10. Bayes factor design analysis: Planning for compelling evidence.

    PubMed

    Schönbrodt, Felix D; Wagenmakers, Eric-Jan

    2018-02-01

    A sizeable literature exists on the use of frequentist power analysis in the null-hypothesis significance testing (NHST) paradigm to facilitate the design of informative experiments. In contrast, there is almost no literature that discusses the design of experiments when Bayes factors (BFs) are used as a measure of evidence. Here we explore Bayes Factor Design Analysis (BFDA) as a useful tool to design studies for maximum efficiency and informativeness. We elaborate on three possible BF designs, (a) a fixed-n design, (b) an open-ended Sequential Bayes Factor (SBF) design, where researchers can test after each participant and can stop data collection whenever there is strong evidence for either H1 or H0, and (c) a modified SBF design that defines a maximal sample size where data collection is stopped regardless of the current state of evidence. We demonstrate how the properties of each design (i.e., expected strength of evidence, expected sample size, expected probability of misleading evidence, expected probability of weak evidence) can be evaluated using Monte Carlo simulations and equip researchers with the necessary information to compute their own Bayesian design analyses.
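
    The abstract does not fix a particular test, so as a hedged illustration, the Monte Carlo evaluation of an open-ended SBF design can be sketched for a simple binomial problem, where the Bayes factor for H1 (uniform prior on the rate) against H0 (rate = 0.5) has a closed form. The thresholds, sample-size bounds, and data-generating rate below are all assumptions chosen for illustration:

    ```python
    import math
    import random

    def bf10_binomial(k, n):
        """BF10 for k successes in n trials: H1 (uniform prior on the rate p)
        vs H0 (p = 0.5). Under a uniform prior the marginal likelihood of k
        is 1/(n+1); under H0 it is C(n, k) * 0.5**n."""
        return (1.0 / (n + 1)) / (math.comb(n, k) * 0.5 ** n)

    def simulate_sbf(true_p, threshold=10.0, n_min=10, n_max=500,
                     n_sims=2000, seed=1):
        """Monte Carlo evaluation of a Sequential Bayes Factor design:
        test after each observation, stop when BF10 > threshold (evidence
        for H1), BF10 < 1/threshold (evidence for H0), or n_max is hit."""
        rng = random.Random(seed)
        sample_sizes, h1_hits, h0_hits = [], 0, 0
        for _ in range(n_sims):
            k = 0
            for n in range(1, n_max + 1):
                k += rng.random() < true_p
                if n >= n_min:
                    bf = bf10_binomial(k, n)
                    if bf > threshold or bf < 1.0 / threshold:
                        break
            sample_sizes.append(n)
            h1_hits += bf > threshold
            h0_hits += bf < 1.0 / threshold   # misleading when true_p != 0.5
        return {
            "expected_n": sum(sample_sizes) / n_sims,
            "p_evidence_h1": h1_hits / n_sims,
            "p_evidence_h0": h0_hits / n_sims,
        }

    res = simulate_sbf(true_p=0.7)
    print(res)
    ```

    The same loop, run with `true_p=0.5`, estimates the design's probability of misleading evidence under the null.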

  11. Application of the Bootstrap Methods in Factor Analysis.

    ERIC Educational Resources Information Center

    Ichikawa, Masanori; Konishi, Sadanori

    1995-01-01

    A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal theory maximum likelihood factor analysis when the distributional assumption was satisfied or unsatisfied. Problems arising with the use of bootstrap methods are highlighted. (SLD)
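
    As a rough sketch of what such a bootstrap experiment involves: the snippet below bootstraps standard errors of one-factor loadings, using a principal-component-style eigendecomposition of the correlation matrix as a stand-in for full maximum likelihood estimation. The data-generating model and all constants are invented for illustration:

    ```python
    import numpy as np

    def first_factor_loadings(data):
        """One-factor extraction: loadings are the leading eigenvector of the
        correlation matrix scaled by the square root of its eigenvalue."""
        corr = np.corrcoef(data, rowvar=False)
        vals, vecs = np.linalg.eigh(corr)        # eigenvalues in ascending order
        v = vecs[:, -1] * np.sqrt(vals[-1])
        return v if v.sum() >= 0 else -v         # resolve sign indeterminacy

    rng = np.random.default_rng(0)
    n, p = 200, 5
    factor = rng.normal(size=(n, 1))
    data = 0.7 * factor + rng.normal(scale=0.6, size=(n, p))  # one-factor model

    loadings = first_factor_loadings(data)

    # Nonparametric bootstrap: resample rows with replacement, re-extract.
    boot = np.array([
        first_factor_loadings(data[rng.integers(0, n, size=n)])
        for _ in range(500)
    ])
    se = boot.std(axis=0)   # bootstrap standard errors of the loadings
    print(loadings.round(2), se.round(3))
    ```

    The distributional-assumption question the study raises can be probed by swapping the normal generator for a heavier-tailed one and re-running the same loop.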

  12. Petri net-based approach to modeling and analysis of selected aspects of the molecular regulation of angiogenesis

    PubMed Central

    Formanowicz, Dorota; Radom, Marcin; Zawierucha, Piotr; Formanowicz, Piotr

    2017-01-01

    The functioning of both normal and pathological tissues depends on an adequate supply of oxygen through the blood vessels. A process called angiogenesis, in which new endothelial cells and smooth muscles interact with each other, forming new blood vessels either from the existing ones or from a primary vascular plexus, is particularly important and interesting, due to new therapeutic possibilities it offers. This is a multi-step and very complex process, so an accurate understanding of the underlying mechanisms is a significant task, especially in recent years, with the constantly increasing amount of new data that must be taken into account. A systems approach is necessary for these studies because it is not sufficient to analyze the properties of the building blocks separately and an analysis of the whole network of interactions is essential. This approach is based on building a mathematical model of the system, while the model is expressed in the formal language of a mathematical theory. Recently, the theory of Petri nets was shown to be especially promising for the modeling and analysis of biological phenomena. This analysis, based mainly on t-invariants, has led to a particularly important finding that a direct link (close connection) exists among transforming growth factor β1 (TGF-β1), endothelial nitric oxide synthase (eNOS), nitric oxide (NO), and hypoxia-inducible factor 1, molecules that play crucial roles during angiogenesis. We have shown that TGF-β1 may participate in the inhibition of angiogenesis through the upregulation of eNOS expression, which is responsible for catalyzing NO production. The results obtained in the previous studies, concerning the effects of NO on angiogenesis, have not been conclusive, and therefore, our study may contribute to a better understanding of this phenomenon. PMID:28253310
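
    In Petri net analysis, a t-invariant is a nonnegative, nonzero integer vector x with C·x = 0, where C is the incidence matrix: firing each transition t exactly x[t] times reproduces the initial marking. A minimal check on a toy three-place cycle (an invented example, not the angiogenesis model from the paper) looks like this:

    ```python
    import numpy as np

    # Incidence matrix C of a toy Petri net (rows: places, columns: transitions).
    # C[p, t] = (tokens produced in p by firing t) - (tokens consumed from p).
    C = np.array([
        [-1,  0,  1],   # p1: consumed by t1, produced by t3
        [ 1, -1,  0],   # p2: produced by t1, consumed by t2
        [ 0,  1, -1],   # p3: produced by t2, consumed by t3
    ])

    def is_t_invariant(C, x):
        """True if x is a t-invariant: nonnegative, nonzero, and C @ x = 0,
        so firing each transition x[t] times reproduces the marking."""
        x = np.asarray(x)
        return bool((x >= 0).all() and x.any() and (C @ x == 0).all())

    print(is_t_invariant(C, [1, 1, 1]))   # the cycle t1 -> t2 -> t3
    print(is_t_invariant(C, [1, 0, 0]))
    ```

    Enumerating all minimal t-invariants of a large model requires a dedicated algorithm (e.g., Fourier-Motzkin style elimination), but verifying a candidate is just this matrix product.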

  14. Exploratory Factor Analysis with Small Sample Sizes

    ERIC Educational Resources Information Center

    de Winter, J. C. F.; Dodou, D.; Wieringa, P. A.

    2009-01-01

    Exploratory factor analysis (EFA) is generally regarded as a technique for large sample sizes ("N"), with N = 50 as a reasonable absolute minimum. This study offers a comprehensive overview of the conditions in which EFA can yield good quality results for "N" below 50. Simulations were carried out to estimate the minimum required "N" for different…

  15. Human Factors Assessment: The Passive Final Approach Spacing Tool (pFAST) Operational Evaluation

    NASA Technical Reports Server (NTRS)

    Lee, Katharine K.; Sanford, Beverly D.

    1998-01-01

    Automation to assist air traffic controllers in the current terminal and en route air traffic environments is being developed at Ames Research Center in conjunction with the Federal Aviation Administration. This automation, known collectively as the Center-TRACON Automation System (CTAS), provides decision-making assistance to air traffic controllers through computer-generated advisories. One of the CTAS tools developed specifically to assist terminal area air traffic controllers is the Passive Final Approach Spacing Tool (pFAST). An operational evaluation of pFAST was conducted at the Dallas/Ft. Worth, Texas, Terminal Radar Approach Control (TRACON) facility. Human factors data collected during the test describe the impact of the automation upon the air traffic controller in terms of perceived workload and acceptance. Results showed that controller self-reported workload was not significantly increased or reduced by the pFAST automation; rather, controllers reported that the levels of workload remained primarily the same. Controller coordination and communication data were analyzed, and significant differences in the nature of controller coordination were found. Controller acceptance ratings indicated that pFAST was acceptable. This report describes the human factors data and results from the 1996 Operational Field Evaluation of Passive FAST.

  16. An imprecise probability approach for squeal instability analysis based on evidence theory

    NASA Astrophysics Data System (ADS)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-01-01

    An imprecise probability approach based on evidence theory is proposed for squeal instability analysis of uncertain disc brakes in this paper. First, the squeal instability of the finite element (FE) model of a disc brake is investigated and its dominant unstable eigenvalue is detected by running two typical numerical simulations, i.e., complex eigenvalue analysis (CEA) and transient dynamical analysis. Next, the uncertainty mainly caused by contact and friction is taken into account and some key parameters of the brake are described as uncertain parameters. All these uncertain parameters typically involve imprecise data, such as incomplete or conflicting information. Finally, a squeal instability analysis model considering imprecise uncertainty is established by integrating evidence theory, Taylor expansion, subinterval analysis and a surrogate model. In the proposed analysis model, the uncertain parameters with imprecise data are treated as evidence variables, and the belief measure and plausibility measure are employed to evaluate system squeal instability. The effectiveness of the proposed approach is demonstrated by numerical examples, and some interesting observations and conclusions are summarized from the analyses and discussions. The proposed approach is generally limited to squeal problems with a modest number of investigated parameters. It can be considered as a potential method for squeal instability analysis, which will act as the first step to reduce squeal noise of uncertain brakes with imprecise information.
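
    The belief and plausibility measures mentioned here come from Dempster-Shafer evidence theory: a basic probability assignment (mass function) distributes mass over focal subsets of a frame of discernment, and Bel(A) sums the mass committed to subsets of A (a lower bound on the evidence for A), while Pl(A) sums the mass of sets overlapping A (an upper bound). A minimal sketch, with invented masses that are not from the brake model:

    ```python
    # Frame of discernment: {stable, unstable}. Mass on the whole frame
    # represents ignorance; these focal sets and masses are illustrative only.
    mass = {
        frozenset({"stable"}): 0.5,
        frozenset({"unstable"}): 0.2,
        frozenset({"stable", "unstable"}): 0.3,
    }

    def belief(A, mass):
        """Bel(A): total mass committed to subsets of A."""
        A = frozenset(A)
        return sum(m for B, m in mass.items() if B <= A)

    def plausibility(A, mass):
        """Pl(A): total mass of focal sets intersecting A."""
        A = frozenset(A)
        return sum(m for B, m in mass.items() if B & A)

    bel = belief({"stable"}, mass)
    pl = plausibility({"stable"}, mass)
    print(bel, pl)   # the probability of "stable" lies in [Bel, Pl]
    ```

    The interval [Bel, Pl] is what distinguishes this imprecise treatment from a single precise probability: conflicting or incomplete evidence widens the interval rather than forcing a point estimate.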

  17. A Gaussian Approximation Approach for Value of Information Analysis.

    PubMed

    Jalal, Hawre; Alarid-Escudero, Fernando

    2018-02-01

    Most decisions are associated with uncertainty. Value of information (VOI) analysis quantifies the opportunity loss associated with choosing a suboptimal intervention based on current imperfect information. VOI can inform the value of collecting additional information, resource allocation, research prioritization, and future research designs. However, in practice, VOI remains underused due to many conceptual and computational challenges associated with its application. Expected value of sample information (EVSI) is rooted in Bayesian statistical decision theory and measures the value of information from a finite sample. The past few years have witnessed a dramatic growth in computationally efficient methods to calculate EVSI, including metamodeling. However, little research has been done to simplify the experimental data collection step inherent to all EVSI computations, especially for correlated model parameters. This article proposes a general Gaussian approximation (GA) of the traditional Bayesian updating approach based on the original work by Raiffa and Schlaifer to compute EVSI. The proposed approach uses a single probabilistic sensitivity analysis (PSA) data set and involves 2 steps: 1) a linear metamodel step to compute the EVSI on the preposterior distributions and 2) a GA step to compute the preposterior distribution of the parameters of interest. The proposed approach is efficient and can be applied for a wide range of data collection designs involving multiple non-Gaussian parameters and unbalanced study designs. Our approach is particularly useful when the parameters of an economic evaluation are correlated or interact.
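
    The full GA/EVSI procedure is beyond a short sketch, but a simpler VOI quantity computable from the same kind of PSA data set illustrates the mechanics: the expected value of perfect information, EVPI = E[max_d NB(d, θ)] − max_d E[NB(d, θ)], i.e., the expected net benefit of always choosing the best intervention per parameter draw, minus the best achievable with current information. The PSA sample below is invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_draws = 10_000

    # Hypothetical PSA sample: net monetary benefit of two interventions
    # under parameter uncertainty (all numbers are invented).
    nb_a = rng.normal(loc=1000.0, scale=400.0, size=n_draws)
    nb_b = rng.normal(loc=1100.0, scale=400.0, size=n_draws)
    nb = np.column_stack([nb_a, nb_b])

    # EVPI = E[max over decisions] - max over decisions of E[NB].
    evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()
    print(round(evpi, 1))
    ```

    EVSI replaces the "perfect information" inner maximization with posterior expectations given a finite future sample, which is where the article's metamodel and Gaussian-approximation steps come in.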

  18. Space Human Factors Engineering Gap Analysis Project Final Report

    NASA Technical Reports Server (NTRS)

    Hudy, Cynthia; Woolford, Barbara

    2006-01-01

    Humans perform critical functions throughout each phase of every space mission, beginning with the mission concept and continuing to post-mission analysis (Life Sciences Division, 1996). Space missions present humans with many challenges - the microgravity environment, relative isolation, and inherent dangers of the mission all present unique issues. As mission duration and distance from Earth increases, in-flight crew autonomy will increase along with increased complexity. As efforts for exploring the moon and Mars advance, there is a need for space human factors research and technology development to play a significant role in both on-orbit human-system interaction, as well as the development of mission requirements and needs before and after the mission. As part of the Space Human Factors Engineering (SHFE) Project within the Human Research Program (HRP), a six-month Gap Analysis Project (GAP) was funded to identify any human factors research gaps or knowledge needs. The overall aim of the project was to review the current state of human factors topic areas and requirements to determine what data, processes, or tools are needed to aid in the planning and development of future exploration missions, and also to prioritize proposals for future research and technology development.

  19. A novel statistical approach for identification of the master regulator transcription factor.

    PubMed

    Sikdar, Sinjini; Datta, Susmita

    2017-02-02

    Transcription factors are known to play key roles in carcinogenesis and therefore, are gaining popularity as potential therapeutic targets in drug development. A 'master regulator' transcription factor often appears to control most of the regulatory activities of the other transcription factors and the associated genes. This 'master regulator' transcription factor is at the top of the hierarchy of the transcriptomic regulation. Therefore, it is important to identify and target the master regulator transcription factor for proper understanding of the associated disease process and identifying the best therapeutic option. We present a novel two-step computational approach for identification of master regulator transcription factor in a genome. At the first step of our method we test whether there exists any master regulator transcription factor in the system. We evaluate the concordance of two ranked lists of transcription factors using a statistical measure. In case the concordance measure is statistically significant, we conclude that there is a master regulator. At the second step, our method identifies the master regulator transcription factor, if there exists one. In the simulation scenario, our method performs reasonably well in validating the existence of a master regulator when the number of subjects in each treatment group is reasonably large. In application to two real datasets, our method ensures the existence of master regulators and identifies biologically meaningful master regulators. R code implementing our method on a sample test dataset is available at http://www.somnathdatta.org/software. We have developed a screening method of identifying the 'master regulator' transcription factor using only gene expression data. Understanding the regulatory structure and finding the master regulator help narrow the search space for identifying biomarkers for complex diseases such as cancer. In addition to identifying the master regulator our
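
    The abstract does not specify which concordance statistic is used, but Kendall's rank correlation with a permutation null is one standard way to test whether two ranked lists of transcription factors agree more than chance would allow. A minimal sketch with invented TF names and ranks:

    ```python
    import itertools
    import random

    def kendall_tau(rank_a, rank_b):
        """Kendall rank correlation between two rankings of the same items
        (each ranking maps item -> rank)."""
        items = list(rank_a)
        concordant = discordant = 0
        for x, y in itertools.combinations(items, 2):
            s = (rank_a[x] - rank_a[y]) * (rank_b[x] - rank_b[y])
            concordant += s > 0
            discordant += s < 0
        n_pairs = len(items) * (len(items) - 1) // 2
        return (concordant - discordant) / n_pairs

    # Two hypothetical rankings of six transcription factors.
    tfs = ["TF1", "TF2", "TF3", "TF4", "TF5", "TF6"]
    rank_a = {tf: i for i, tf in enumerate(tfs)}
    rank_b = {"TF1": 0, "TF2": 2, "TF3": 1, "TF4": 3, "TF5": 5, "TF6": 4}

    tau = kendall_tau(rank_a, rank_b)

    # Permutation test: how often does a random ranking concord this strongly?
    rng = random.Random(0)
    n_perm = 10_000
    count = 0
    for _ in range(n_perm):
        perm = tfs[:]
        rng.shuffle(perm)
        count += kendall_tau(rank_a, {tf: i for i, tf in enumerate(perm)}) >= tau
    p_value = count / n_perm
    print(round(tau, 3), p_value)
    ```

    A significant concordance would, in the spirit of the paper's first step, support the existence of a master regulator before proceeding to identify it.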

  20. Assessing Suicide Risk Among Callers to Crisis Hotlines: A Confirmatory Factor Analysis

    PubMed Central

    Witte, Tracy K.; Gould, Madelyn S.; Munfakh, Jimmie Lou Harris; Kleinman, Marjorie; Joiner, Thomas E.; Kalafat, John

    2012-01-01

    Our goal was to investigate the factor structure of a risk assessment tool utilized by suicide hotlines and to determine the predictive validity of the obtained factors in predicting subsequent suicidal behavior. 1,085 suicidal callers to crisis hotlines were divided into three sub-samples, which allowed us to conduct an independent Exploratory Factor Analysis (EFA), EFA in a Confirmatory Factor Analysis (EFA/CFA) framework, and CFA. Similar to previous factor analytic studies (Beck et al., 1997; Holden & DeLisle, 2005; Joiner, Rudd, & Rajab, 1997; Witte et al., 2006), we found consistent evidence for a two-factor solution, with one factor representing a more pernicious form of suicide risk (i.e., Resolved Plans and Preparations) and one factor representing more mild suicidal ideation (i.e., Suicidal Desire and Ideation). Using structural equation modeling techniques, we found preliminary evidence that the Resolved Plans and Preparations factor trended toward being more predictive of suicidal ideation than the Suicidal Desire and Ideation factor. This factor analytic study is the first longitudinal study of the obtained factors. PMID:20578186