Sample records for factor analysis applied

  1. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    ERIC Educational Resources Information Center

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  2. Confirmatory factor analysis applied to the Force Concept Inventory

    NASA Astrophysics Data System (ADS)

    Eaton, Philip; Willoughby, Shannon D.

    2018-06-01

    In 1995, Huffman and Heller used exploratory factor analysis to call into question the factors of the Force Concept Inventory (FCI). Since then, several papers have been published examining the factors of the FCI on larger sets of student responses, and understandable factors were extracted as a result. However, none of these proposed factor models has been verified against independent data sets to show that it is not unique to its original sample. This paper seeks to confirm the factor models proposed by Scott et al. in 2012 and Hestenes et al. in 1992, as well as another expert model proposed within this study, through the use of confirmatory factor analysis (CFA) and a sample of 20,822 postinstruction student responses to the FCI. Upon application of CFA using the full sample, all three models were found to fit the data with acceptable global fit statistics. However, when CFA was performed using these models on smaller sample sizes, the models proposed by Scott et al. and Eaton and Willoughby were found to be far more stable than the model proposed by Hestenes et al. The goodness of fit of these models to the data suggests that the FCI can be scored on factors that are not unique to a single class. These scores could then be used to comment on how instruction methods affect the performance of students along a single factor, and more in-depth analyses of curriculum changes may be possible as a result.

  3. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis.

    PubMed

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health-threatening events. We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was not a key success factor for situational awareness. We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped in interpreting complex, mixed data on small-N cases and resulted in concrete and practically relevant findings.
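    As an illustration of the crisp-set QCA step described above, the sketch below builds a simple truth table from binary condition and outcome codings. The condition names, codings, and systems are invented for illustration and are not taken from the study.

    ```python
    # Minimal sketch of the truth-table step in crisp-set QCA
    # (hypothetical conditions and outcomes, not the study's data).
    import pandas as pd

    # Each row is one syndromic surveillance system, coded 0/1 on candidate
    # success conditions and on the outcome (timely detection).
    cases = pd.DataFrame(
        {
            "nonclinical_data": [1, 1, 0, 0, 1],
            "automated_system": [1, 0, 1, 0, 1],
            "multiple_syndromes": [1, 1, 1, 0, 0],
            "timely_detection": [1, 1, 0, 0, 1],
        },
        index=["sys_A", "sys_B", "sys_C", "sys_D", "sys_E"],
    )

    conditions = ["nonclinical_data", "automated_system", "multiple_syndromes"]

    # Truth table: one row per observed configuration of conditions, with the
    # number of cases and the consistency of the outcome for that configuration.
    truth_table = (
        cases.groupby(conditions)["timely_detection"]
        .agg(n_cases="size", consistency="mean")
        .reset_index()
    )
    print(truth_table)
    ```

    In a full csQCA the truth table would then be logically minimised (e.g., Quine-McCluskey) to obtain the configurations sufficient for the outcome; the sketch stops at the tabulation step.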

  4. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis

    PubMed Central

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Introduction: Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. Materials and Methods: We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health-threatening events. Results: We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was not a key success factor for situational awareness. Conclusions: We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped in interpreting complex, mixed data on small-N cases and resulted in concrete and practically relevant findings. PMID:27182731

  5. Bayesian Exploratory Factor Analysis

    PubMed Central

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  6. Human factors analysis and classification system applied to civil aircraft accidents in India.

    PubMed

    Gaur, Deepak

    2005-05-01

    The Human Factors Analysis and Classification System (HFACS) has gained wide acceptance as a tool to classify human factors in aircraft accidents and incidents. This study on the application of HFACS to civil aircraft accident reports at the Directorate General of Civil Aviation (DGCA), India, was conducted to ascertain the practicability of applying HFACS to existing investigation reports and to analyze the trends in human factor causes of civil aircraft accidents. Accident investigation reports held at DGCA, New Delhi, for the period 1990-99 were scrutinized. In all, 83 accidents occurred during this period, of which 48 accident reports were evaluated in this study. One or more human factors contributed to 37 of the 48 (77.1%) accidents. The commonest unsafe act was 'skill-based errors,' followed by 'decision errors.' Violations of laid-down rules were contributory in 16 cases (33.3%). 'Preconditions for unsafe acts' were seen in 23 of the 48 cases (47.9%). A fairly large number (52.1%) had 'organizational influences' contributing to the accident. These results are in consonance with larger studies of accidents in the U.S. Navy and general aviation. Such a high percentage of 'organizational influences' has not been reported in other studies. This is a healthy sign for Indian civil aviation, provided effective remedial action is undertaken.

  7. Factor analysis and cluster analysis applied to assess the water quality of middle and lower Han River in Central China

    NASA Astrophysics Data System (ADS)

    Kuo, Yi-Ming; Liu, Wen-Wen

    2015-04-01

    The Han River basin is one of the most important industrial and grain production bases in central China. Many factories and towns have been established along the river, with large farmlands located nearby. In the last few decades, the water quality of the Han River, particularly in the middle and lower reaches, has gradually declined. Agricultural nonpoint pollution and municipal and industrial point pollution significantly degrade the water quality of the Han River. Factor analysis can be applied to reduce the dimensionality of a data set consisting of a large number of inter-related variables, and cluster analysis can classify the samples according to their similar characteristics. In this study, factor analysis is used to identify major pollution indicators, and cluster analysis is employed to classify the samples based on the sample locations and hydrochemical variables. Water samples were collected from 12 sites, from Xiangyang City (middle Han River) to Wuhan City (lower Han River). Correlations among 25 hydrochemical variables are statistically examined. The important pollutants are determined by factor analysis. A three-factor model is determined and explains over 85% of the total river water quality variation. Factor 1, including SS, Chl-a, TN and TP, can be considered as nonpoint source pollution. Factor 2, including Cl⁻, Br⁻, SO₄²⁻, Ca²⁺, Mg²⁺, K⁺, Fe²⁺ and PO₄³⁻, can be treated as industrial pollution. Factor 3, including F⁻ and NO₃⁻, reflects the influence of groundwater or the self-purification capability of the river water. The various land uses along the Han River correlate well with the pollution types. In addition, the results showed that the water quality deteriorated gradually from the middle to the lower Han River. Some tributaries have been seriously polluted and significantly influence the mainstream water quality of the Han River. Finally, the results showed that the nonpoint pollution and the point…
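    A minimal sketch of the two-step workflow described above (factor analysis with varimax rotation followed by clustering of the factor scores) is shown below. The data, sample counts, and variable counts are simulated placeholders, not the Han River measurements.

    ```python
    # Sketch: varimax-rotated factor analysis of hydrochemical variables,
    # then clustering of the samples on their factor scores. Data are random.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    X = rng.normal(size=(12 * 24, 25))   # e.g. 12 sites x 24 sampling dates, 25 variables

    X_std = StandardScaler().fit_transform(X)

    fa = FactorAnalysis(n_components=3, rotation="varimax")
    scores = fa.fit_transform(X_std)      # factor scores per sample
    loadings = fa.components_.T           # variables x factors

    clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
    print(loadings.shape, np.bincount(clusters))
    ```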

  8. Analysis and Design of Power Factor Pre-Regulator Based on a Symmetrical Charge Pump Circuit Applied to Electronic Ballast

    NASA Astrophysics Data System (ADS)

    Lazcano Olea, Miguel; Ramos Astudillo, Reynaldo; Sanhueza Robles, René; Rodriguez Rubke, Leopoldo; Ruiz-Caballero, Domingo Antonio

    This paper presents the analysis and design of a power factor pre-regulator based on a symmetrical charge pump circuit applied to electronic ballast. The operation stages of the circuit are analyzed and its main design equations are obtained. Simulation and experimental results are presented in order to show the design methodology feasibility.

  9. A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research

    ERIC Educational Resources Information Center

    Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.

    2014-01-01

    Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
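    The bootstrap factor analysis idea summarised above can be sketched in a few lines: resample cases with replacement, refit the factor model, and summarise the spread of the loadings. The sketch below uses simulated data and scikit-learn rather than the SPSS syntax the primer describes; note that a full analysis must also align factor order and sign across bootstrap replicates (e.g., by Procrustes rotation), which is omitted here.

    ```python
    # Sketch of bootstrap exploratory factor analysis on simulated data.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.utils import resample

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 10))        # 300 respondents, 10 items (placeholder data)
    n_boot, n_factors = 500, 2

    boot_loadings = np.empty((n_boot, X.shape[1], n_factors))
    for b in range(n_boot):
        Xb = resample(X, replace=True, random_state=b)        # resample cases
        fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(Xb)
        boot_loadings[b] = fa.components_.T                   # items x factors

    # Percentile confidence intervals for each loading
    # (no factor alignment is done here; see note above).
    ci_low, ci_high = np.percentile(boot_loadings, [2.5, 97.5], axis=0)
    print(ci_low.shape, ci_high.shape)
    ```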

  10. Text mining factor analysis (TFA) in green tea patent data

    NASA Astrophysics Data System (ADS)

    Rahmawati, Sela; Suprijadi, Jadi; Zulhanif

    2017-03-01

    Factor analysis has become one of the most widely used multivariate statistical procedures in applied research across a multitude of domains. There are two main types of analysis based on factor analysis: Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA). Both EFA and CFA aim to model the relationships among a group of observed indicators and a latent variable, but they differ fundamentally in the a priori restrictions placed on the factor model. In this study, the method is applied to patent data from the green tea technology sector to trace the development of green tea technology worldwide. Patent analysis is useful in identifying future technological trends in a specific field of technology. The patent database was obtained from the European Patent Organization (EPO). The CFA model is applied to nominal data obtained from a presence-absence matrix; for nominal data, the CFA is based on a tetrachoric correlation matrix. The EFA model is applied to the titles from the dominant technology sector, which are first pre-processed using text mining.
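    As a rough illustration of the tetrachoric step mentioned above, the sketch below builds an approximate tetrachoric correlation matrix from a binary presence-absence term matrix using the classical cosine-pi approximation, then inspects its eigenvalues. The term matrix is simulated, and the approximation is a stand-in for a full maximum-likelihood tetrachoric estimate.

    ```python
    # Sketch: approximate tetrachoric correlations for a binary term matrix.
    import numpy as np

    def tetrachoric_approx(x, y):
        """Cosine-pi approximation to the tetrachoric correlation of two 0/1 vectors."""
        a = np.sum((x == 1) & (y == 1))
        b = np.sum((x == 1) & (y == 0))
        c = np.sum((x == 0) & (y == 1))
        d = np.sum((x == 0) & (y == 0))
        if b * c == 0:                       # crude guard for empty cells
            return 1.0 if a * d > 0 else 0.0
        return np.cos(np.pi / (1.0 + np.sqrt((a * d) / (b * c))))

    rng = np.random.default_rng(2)
    terms = rng.integers(0, 2, size=(200, 8))       # 200 patents x 8 index terms (simulated)

    k = terms.shape[1]
    R = np.eye(k)
    for i in range(k):
        for j in range(i + 1, k):
            R[i, j] = R[j, i] = tetrachoric_approx(terms[:, i], terms[:, j])

    eigvals = np.linalg.eigvalsh(R)[::-1]           # first look at dimensionality
    print(np.round(eigvals, 2))
    ```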

  11. Confirmatory factor analysis of different versions of the Body Shape Questionnaire applied to Brazilian university students.

    PubMed

    da Silva, Wanderson Roberto; Dias, Juliana Chioda Ribeiro; Maroco, João; Campos, Juliana Alvares Duarte Bonini

    2014-09-01

    This study aimed at evaluating the validity, reliability, and factorial invariance of the complete (34-item) and shortened (8-item and 16-item) versions of the Body Shape Questionnaire (BSQ) when applied to Brazilian university students. A total of 739 female students with a mean age of 20.44 (standard deviation = 2.45) years participated. Confirmatory factor analysis was conducted to verify the degree to which the one-factor structure satisfies the proposal for the BSQ's expected structure. Two items of the 34-item version were excluded because they had factor weights (λ) < .40. All models had adequate convergent validity (average variance extracted = .43-.58; composite reliability = .85-.97) and internal consistency (α = .85-.97). The 8-item B version was considered the best shortened BSQ version (Akaike information criterion = 84.07, Bayes information criterion = 157.75, Browne-Cudeck criterion = 84.46), with strong invariance for independent samples (Δχ²λ(7) = 5.06, Δχ²Cov(8) = 5.11, Δχ²Res(16) = 19.30). Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Factors affecting construction performance: exploratory factor analysis

    NASA Astrophysics Data System (ADS)

    Soewin, E.; Chinda, T.

    2018-04-01

    The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors with a total of 57 associated items. The hypothesized factors and their associated items were used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) applied to the collected data yielded 10 factors, comprising the 57 items, affecting construction performance. The 10 key performance factors are: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multidimensional performance evaluation framework for effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technological aspects. It is important to build a multidimensional performance evaluation framework that includes all key factors affecting the construction performance of a company, so that management can plan an effective performance development programme matched to the mission and vision of the company.

  13. Exploratory Bi-factor Analysis: The Oblique Case.

    PubMed

    Jennrich, Robert I; Bentler, Peter M

    2012-07-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford (Psychometrika 2:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler (Psychometrika 76:537-549, 2011) introduced an exploratory form of bi-factor analysis that does not require one to provide an explicit bi-factor structure a priori. They use exploratory factor analysis and a bi-factor rotation criterion designed to produce a rotated loading matrix that has an approximate bi-factor structure. Among other things, this can be used as an aid in finding an explicit bi-factor structure for use in a confirmatory bi-factor analysis. They considered only orthogonal rotation. The purpose of this paper is to consider oblique rotation and to compare it to orthogonal rotation. Because there are many more oblique rotations of an initial loading matrix than orthogonal rotations, one expects the oblique results to approximate a bi-factor structure better than orthogonal rotations, and this is indeed the case. A surprising result arises when oblique bi-factor rotation methods are applied to ideal data.

  14. Item Factor Analysis: Current Approaches and Future Directions

    ERIC Educational Resources Information Center

    Wirth, R. J.; Edwards, Michael C.

    2007-01-01

    The rationale underlying factor analysis applies to continuous and categorical variables alike; however, the models and estimation methods for continuous (i.e., interval or ratio scale) data are not appropriate for item-level data that are categorical in nature. The authors provide a targeted review and synthesis of the item factor analysis (IFA)…

  15. Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  16. Hierarchical Factoring Based On Image Analysis And Orthoblique Rotations.

    PubMed

    Stankov, L

    1979-07-01

    The procedure for hierarchical factoring suggested by Schmid and Leiman (1957) is applied within the framework of image analysis and orthoblique rotational procedures. It is shown that this approach necessarily leads to correlated higher order factors. Also, one can obtain a smaller number of factors than produced by typical hierarchical procedures.
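    The Schmid and Leiman (1957) transformation referred to above can be sketched directly: first-order loadings are multiplied by the second-order loadings to give general-factor loadings, and by the square roots of the second-order uniquenesses to give residualized group-factor loadings. The sketch below shows the classical single-general-factor case with made-up numbers; the paper's image-analysis and orthoblique variant differs in how the first-order solution is obtained.

    ```python
    # Sketch of the classical Schmid-Leiman transformation (illustrative numbers).
    import numpy as np

    # First-order (oblique) pattern matrix: 6 variables x 2 group factors.
    F1 = np.array([
        [0.70, 0.05],
        [0.65, 0.10],
        [0.60, 0.00],
        [0.05, 0.75],
        [0.10, 0.70],
        [0.00, 0.65],
    ])

    # Second-order loadings of the 2 group factors on one general factor,
    # obtained by factoring the factor correlation matrix.
    g = np.array([0.60, 0.55])

    general = F1 @ g                          # loadings on the general factor
    group = F1 * np.sqrt(1.0 - g**2)          # residualized group-factor loadings

    print(np.round(np.column_stack([general, group]), 2))
    ```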

  17. Factor Analysis by Generalized Least Squares.

    ERIC Educational Resources Information Center

    Joreskog, Karl G.; Goldberger, Arthur S.

    Aitken's generalized least squares (GLS) principle, with the inverse of the observed variance-covariance matrix as a weight matrix, is applied to estimate the factor analysis model in the exploratory (unrestricted) case. It is shown that the GLS estimates are scale free and asymptotically efficient. The estimates are computed by a rapidly…
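    For reference, the GLS discrepancy function referred to above is commonly written as below, where S is the observed covariance matrix and Σ(θ) is the model-implied covariance; this is the standard textbook form, and the paper's own notation may differ.

    ```latex
    F_{\mathrm{GLS}}(\theta)
      = \tfrac{1}{2}\,\operatorname{tr}\!\left\{\bigl[(S - \Sigma(\theta))\,S^{-1}\bigr]^{2}\right\}
      = \tfrac{1}{2}\,\operatorname{tr}\!\left\{\bigl[I - \Sigma(\theta)\,S^{-1}\bigr]^{2}\right\},
      \qquad \Sigma(\theta) = \Lambda\Lambda^{\top} + \Psi
    ```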

  18. Confirmatory Factor Analysis of the Delirium Rating Scale Revised-98 (DRS-R98).

    PubMed

    Thurber, Steven; Kishi, Yasuhiro; Trzepacz, Paula T; Franco, Jose G; Meagher, David J; Lee, Yanghyun; Kim, Jeong-Lan; Furlanetto, Leticia M; Negreiros, Daniel; Huang, Ming-Chyi; Chen, Chun-Hsin; Kean, Jacob; Leonard, Maeve

    2015-01-01

    Principal components analysis applied to the Delirium Rating Scale-Revised-98 contributes to understanding the delirium construct. Using a multisite pooled international delirium database, the authors applied confirmatory factor analysis to Delirium Rating Scale-Revised-98 scores from 859 adult patients evaluated by delirium experts (delirium, N=516; nondelirium, N=343). Confirmatory factor analysis found all diagnostic features and core symptoms (cognitive, language, thought process, sleep-wake cycle, motor retardation), except motor agitation, loaded onto factor 1. Motor agitation loaded onto factor 2 with noncore symptoms (delusions, affective lability, and perceptual disturbances). Factor 1 loading supports delirium as a single construct, but when accompanied by psychosis, motor agitation's role may not be solely as a circadian activity indicator.

  19. Impact of "JOBM": ISI Impact Factor Places the "Journal of Organizational Behavior Management" Third in Applied Psychology

    ERIC Educational Resources Information Center

    Hantula, Donald A.

    2006-01-01

    The ISI Impact Factor for "JOBM" is 1.793, placing it third in the JCR rankings for journals in applied psychology with a sharply accelerating linear trend over the past 5 years. This article reviews the Impact Factor and raises questions regarding its reliability and validity and then considers a citation analysis of "JOBM" in light of the…

  20. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
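    For context, the original Beta Factor method mentioned above can be sketched in a few lines: a fraction β of the total failure probability of a redundant train is assumed to be a common cause failure that disables all trains at once. The numbers below are illustrative placeholders, not values from the report.

    ```python
    # Sketch of the classical beta-factor common cause failure model
    # (illustrative numbers, not the report's data).
    beta = 0.1            # assumed fraction of failures that are common cause
    q_total = 1e-3        # assumed total failure probability per train

    q_ccf = beta * q_total            # common cause contribution
    q_ind = (1.0 - beta) * q_total    # independent contribution

    # A one-out-of-three system fails only if all three trains fail:
    # either all three fail independently, or a common cause failure takes out all.
    q_system = q_ind**3 + q_ccf
    print(f"1-out-of-3 failure probability with CCF: {q_system:.2e}")
    print(f"independent-only (no CCF) estimate:      {q_total**3:.2e}")
    ```

    The comparison of the two printed values illustrates why ignoring common cause failures, or modelling them crudely, can change redundant-system estimates by several orders of magnitude.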

  1. Applying parallel factor analysis and Tucker-3 methods on sensory and instrumental data to establish preference maps: case study on sweet corn varieties.

    PubMed

    Gere, Attila; Losó, Viktor; Györey, Annamária; Kovács, Sándor; Huzsvai, László; Nábrádi, András; Kókai, Zoltán; Sipos, László

    2014-12-01

    Traditional internal and external preference mapping methods are based on principal component analysis (PCA). However, parallel factor analysis (PARAFAC) and Tucker-3 methods could be a better choice. To evaluate the methods, preference maps of sweet corn varieties will be introduced. A preference map of eight sweet corn varieties was established using PARAFAC and Tucker-3 methods. Instrumental data were also integrated into the maps. The triplot created by the PARAFAC model explains better how odour is separated from texture or appearance, and how some varieties are separated from others. Internal and external preference maps were created using parallel factor analysis (PARAFAC) and Tucker-3 models employing both sensory (trained panel and consumers) and instrumental parameters simultaneously. Triplots of the applied three-way models have a competitive advantage compared to the traditional biplots of the PCA-based external preference maps. The solution of PARAFAC and Tucker-3 is very similar regarding the interpretation of the first and third factors. The main difference is due to the second factor as it differentiated the attributes better. Consumers who prefer 'super sweet' varieties (they place great emphasis especially on taste) are much younger and have significantly higher incomes, and buy sweet corn products rarely (once a month). Consumers who consume sweet corn products mainly because of their texture and appearance are significantly older and include a higher ratio of men. © 2014 Society of Chemical Industry.
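    A minimal sketch of fitting PARAFAC and Tucker-3 models to a three-way sensory array (products x attributes x assessors) with the tensorly library is given below. The array is random, the mode sizes are placeholders, and keyword names may differ slightly between tensorly versions.

    ```python
    # Sketch: PARAFAC and Tucker-3 decompositions of a random three-way array.
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac, tucker

    rng = np.random.default_rng(3)
    X = tl.tensor(rng.normal(size=(8, 12, 20)))     # 8 varieties, 12 attributes, 20 assessors

    weights, cp_factors = parafac(X, rank=3)        # PARAFAC: one factor matrix per mode
    core, tucker_factors = tucker(X, rank=[3, 3, 3])  # Tucker-3: core array + factor matrices

    print([f.shape for f in cp_factors])            # [(8, 3), (12, 3), (20, 3)]
    print(core.shape)                               # (3, 3, 3)
    ```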

  2. Physics Metacognition Inventory Part II: Confirmatory factor analysis and Rasch analysis

    NASA Astrophysics Data System (ADS)

    Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John

    2015-11-01

    The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition, planning, monitoring, evaluation, debugging, and information management. The college students' scores on the inventory were found to be reliable and related to students' physics motivation and physics grade. However, the results of the exploratory factor analysis indicated that the questionnaire could be revised to improve its construct validity. The goal of this study was to revise the questionnaire and establish its construct validity through a confirmatory factor analysis. In addition, a Rasch analysis was applied to the data to better understand the psychometric properties of the inventory and to further evaluate the construct validity. Results indicated that the final, revised inventory is a valid, reliable, and efficient tool for assessing student metacognition for physics problem solving.

  3. Validating the European Health Literacy Survey Questionnaire in people with type 2 diabetes: Latent trait analyses applying multidimensional Rasch modelling and confirmatory factor analysis.

    PubMed

    Finbråten, Hanne Søberg; Pettersen, Kjell Sverre; Wilde-Larsson, Bodil; Nordström, Gun; Trollvik, Anne; Guttersrud, Øystein

    2017-11-01

    To validate the European Health Literacy Survey Questionnaire (HLS-EU-Q47) in people with type 2 diabetes mellitus. The HLS-EU-Q47 latent variable is outlined in a framework with four cognitive domains integrated in three health domains, implying 12 theoretically defined subscales. Valid and reliable health literacy measures are crucial to effectively adapt health communication and education to individuals and groups of patients. Cross-sectional study applying confirmatory latent trait analyses. Using a paper-and-pencil self-administered approach, 388 adults responded in March 2015. The data were analysed using the Rasch methodology and confirmatory factor analysis. Response violation (response dependency) and trait violation (multidimensionality) of local independence were identified. Fitting the "multidimensional random coefficients multinomial logit" model, 1-, 3- and 12-dimensional Rasch models were applied and compared. Poor model fit and differential item functioning were present in some items, and several subscales suffered from poor targeting and low reliability. Despite multidimensional data, we did not observe any unordered response categories. Interpreting the domains as distinct but related latent dimensions, the data fit a 12-dimensional Rasch model and a 12-factor confirmatory factor model best. Therefore, the analyses did not support the estimation of one overall "health literacy score." To support the plausibility of claims based on the HLS-EU score(s), we suggest: removing the health care aspect to reduce the magnitude of multidimensionality; rejecting redundant items to avoid response dependency; adding "harder" items and applying a six-point rating scale to improve subscale targeting and reliability; and revising items to improve model fit and avoid bias owing to person factors. © 2017 John Wiley & Sons Ltd.
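    For reference, the dichotomous Rasch model that the multidimensional and polytomous extensions above build on is commonly written as below, with θ_v the latent trait (here, a health literacy dimension) of person v and b_i the difficulty of item i; this is the standard textbook form, not notation taken from the paper.

    ```latex
    P\left(X_{vi} = 1 \mid \theta_v, b_i\right)
      \;=\; \frac{\exp(\theta_v - b_i)}{1 + \exp(\theta_v - b_i)}
    ```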

  4. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    NASA Technical Reports Server (NTRS)

    Motley, Albert E., III

    2000-01-01

    One of the key elements to successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  5. Concept analysis of culture applied to nursing.

    PubMed

    Marzilli, Colleen

    2014-01-01

    Culture is an important concept, especially when applied to nursing. A concept analysis of culture is essential to understanding the meaning of the word. This article applies Rodgers' (2000) concept analysis template and provides a definition of the word culture as it applies to nursing practice. This article supplies examples of the concept of culture to aid the reader in understanding its application to nursing and includes a case study demonstrating components of culture that must be respected and included when providing health care.

  6. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  7. A comparison study on detection of key geochemical variables and factors through three different types of factor analysis

    NASA Astrophysics Data System (ADS)

    Hoseinzade, Zohre; Mokhtari, Ahmad Reza

    2017-10-01

    Large numbers of variables have been measured to explain different phenomena. Factor analysis has widely been used in order to reduce the dimension of datasets. Additionally, the technique has been employed to highlight underlying factors hidden in a complex system. As geochemical studies benefit from multivariate assays, application of this method is widespread in geochemistry. However, the conventional protocols for implementing factor analysis have some drawbacks in spite of their advantages. In the present study, a geochemical dataset including 804 soil samples, collected from a mining area in central Iran in order to search for MVT-type Pb-Zn deposits, was considered to outline geochemical analysis through various fractal methods. Routine factor analysis, sequential factor analysis, and staged factor analysis were applied to the dataset after opening the data with an additive logratio (alr) transformation to extract the mineralization factor. A comparison between these methods indicated that sequential factor analysis more clearly revealed the MVT paragenesis elements in surface samples, with nearly 50% of the variation in F1. In addition, staged factor analysis gave acceptable results while being easy to apply: it could detect mineralization-related elements, with larger factor loadings assigned to these elements, resulting in a more clearly expressed mineralization factor.

  8. The crowding factor method applied to parafoveal vision

    PubMed Central

    Ghahghaei, Saeideh; Walker, Laura

    2016-01-01

    Crowding increases with eccentricity and is most readily observed in the periphery. During natural, active vision, however, central vision plays an important role. Measures of critical distance to estimate crowding are difficult in central vision, as these distances are small. Any overlap of flankers with the target may create an overlay masking confound. The crowding factor method avoids this issue by simultaneously modulating target size and flanker distance and using a ratio to compare crowded to uncrowded conditions. This method was developed and applied in the periphery (Petrov & Meleshkevich, 2011b). In this work, we apply the method to characterize crowding in parafoveal vision (<3.5 visual degrees) with spatial uncertainty. We find that eccentricity and hemifield have less impact on crowding than in the periphery, yet radial/tangential asymmetries are clearly preserved. There are considerable idiosyncratic differences observed between participants. The crowding factor method provides a powerful tool for examining crowding in central and peripheral vision, which will be useful in future studies that seek to understand visual processing under natural, active viewing conditions. PMID:27690170

  9. Image analysis applied to luminescence microscopy

    NASA Astrophysics Data System (ADS)

    Maire, Eric; Lelievre-Berna, Eddy; Fafeur, Veronique; Vandenbunder, Bernard

    1998-04-01

    We have developed a novel approach to study luminescent light emission during the migration of living cells by low-light imaging techniques. The equipment consists of an anti-vibration table with a hole for a direct output under the frame of an inverted microscope. The image is directly captured by an ultra-low-light-level photon-counting camera equipped with an image intensifier coupled by an optical fiber to a CCD sensor. This installation is dedicated to measuring, in a dynamic manner, the effect of SF/HGF (Scatter Factor/Hepatocyte Growth Factor) both on activation of gene promoter elements and on cell motility. Epithelial cells were stably transfected with promoter elements containing Ets transcription factor-binding sites driving a luciferase reporter gene. Luminescent light emitted by individual cells was measured by image analysis. Images of luminescent spots were acquired with a high-aperture objective and time exposures of 10-30 min in photon-counting mode. The sensitivity of the camera was adjusted to a high value, which required the use of a segmentation algorithm dedicated to eliminating the background noise. Hence, image segmentation and treatments by mathematical morphology were particularly indicated under these experimental conditions. In order to estimate the orientation of cells during their migration, we used a dedicated skeleton algorithm applied to the oblong spots of variable intensity emitted by the cells. Kinetic changes of luminescent sources, distance, and speed of migration were recorded and then correlated with cellular morphological changes for each spot. Our results highlight the usefulness of mathematical morphology to quantify kinetic changes in luminescence microscopy.

  10. B. F. Skinner's contributions to applied behavior analysis

    PubMed Central

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew from his science for application, his descriptions of possible applications, and his own applications to nonhuman and human behavior. Second, we found that he explicitly or implicitly addressed all seven dimensions of applied behavior analysis. These contributions and dimensions notwithstanding, he neither incorporated the field's scientific (e.g., analytic) and social dimensions (e.g., applied) into any program of published research such that he was its originator, nor did he systematically integrate, advance, and promote the dimensions so as to have been its founder. As the founder of behavior analysis, however, he was the father of applied behavior analysis. PMID:22478444

  11. Factor Analysis via Components Analysis

    ERIC Educational Resources Information Center

    Bentler, Peter M.; de Leeuw, Jan

    2011-01-01

    When the factor analysis model holds, component loadings are linear combinations of factor loadings, and vice versa. This interrelation permits us to define new optimization criteria and estimation methods for exploratory factor analysis. Although this article is primarily conceptual in nature, an illustrative example and a small simulation show…

  12. Applied Cognitive Task Analysis (ACTA) Methodology

    DTIC Science & Technology

    1997-11-01

    …experience-based cognitive skills. The primary goal of this project was to develop streamlined methods of Cognitive Task Analysis that would fill this need… We have made important progress in this direction. We have developed streamlined methods of Cognitive Task Analysis. Our evaluation study indicates… developed a CD-based stand-alone instructional package, which will make the Applied Cognitive Task Analysis (ACTA) tools widely accessible. A survey of the…

  13. UK Parents' Beliefs about Applied Behaviour Analysis as an Approach to Autism Education

    ERIC Educational Resources Information Center

    Denne, Louise D.; Hastings, Richard P.; Hughes, J. Carl

    2017-01-01

    Research into factors underlying the dissemination of evidence-based practice is limited within the field of Applied Behaviour Analysis (ABA). This is pertinent, particularly in the UK where national policies and guidelines do not reflect the emerging ABA evidence base, or policies and practices elsewhere. Theories of evidence-based practice in…

  14. Risk analysis of the thermal sterilization process. Analysis of factors affecting the thermal resistance of microorganisms.

    PubMed

    Akterian, S G; Fernandez, P S; Hendrickx, M E; Tobback, P P; Periago, P M; Martinez, A

    1999-03-01

    A risk analysis was applied to experimental heat resistance data. This analysis is an approach for processing experimental thermobacteriological data in order to study the variability of D and z values of target microorganisms depending on the deviations range of environmental factors, to determine the critical factors and to specify their critical tolerance. This analysis is based on sets of sensitivity functions applied to a specific case of experimental data related to the thermoresistance of Clostridium sporogenes and Bacillus stearothermophilus spores. The effect of the following factors was analyzed: the type of target microorganism; nature of the heating substrate; pH, temperature; type of acid employed and NaCl concentration. The type of target microorganism to be inactivated, the nature of the substrate (reference or real food) and the heating temperature were identified as critical factors, determining about 90% of the alteration of the microbiological risk. The effect of the type of acid used for the acidification of products and the concentration of NaCl can be assumed to be negligible factors for the purposes of engineering calculations. The critical non-uniformity in temperature during thermobacteriological studies was set as 0.5% and the critical tolerances of pH value and NaCl concentration were 5%. These results are related to a specific case study, for that reason their direct generalization is not correct.

  15. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.

  16. Analysis of the interaction between experimental and applied behavior analysis.

    PubMed

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J

    2014-01-01

    To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis. © Society for the Experimental Analysis of Behavior.
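    The coauthorship-network step described above can be sketched with networkx: build an author-author graph from article author lists and compare the centrality of dual authors with that of the rest. The journal names are from the abstract, but the articles and authors below are invented for illustration.

    ```python
    # Sketch: coauthorship network and centrality of "dual" authors (invented data).
    import networkx as nx
    from itertools import combinations

    articles = {
        "JABA": [["Smith", "Lee"], ["Lee", "Khan", "Ortiz"]],
        "JEAB": [["Lee", "Cohen"], ["Cohen", "Park"]],
    }

    G = nx.Graph()
    authors_by_journal = {j: set() for j in articles}
    for journal, papers in articles.items():
        for authors in papers:
            authors_by_journal[journal].update(authors)
            G.add_edges_from(combinations(authors, 2))   # coauthorship edges

    dual_authors = authors_by_journal["JABA"] & authors_by_journal["JEAB"]
    centrality = nx.degree_centrality(G)
    print(dual_authors, {a: round(centrality[a], 2) for a in dual_authors})
    ```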

  17. Donor retention in health care in Iran: a factor analysis

    PubMed Central

    Aghababa, Sara; Nasiripour, Amir Ashkan; Maleki, Mohammadreza; Gohari, Mahmoodreza

    2017-01-01

    Background: Long-term financial support is essential for the survival of a charitable organization. Health charities need to identify the factors influencing donor retention. Methods: In the present study, the items of a questionnaire were derived from both a literature review and semi-structured interviews related to donor retention. Using purposive sampling, 300 academic and executive practitioners were selected. After follow-up, a total of 243 usable questionnaires were available for factor analysis. The questionnaire was validated on the basis of face and content validity, and reliability was assessed through Cronbach's α-coefficient. Results: Exploratory factor analysis extracted two retention factors: a donor factor (variance = 33.841%; Cronbach's α-coefficient = 90.2) and a charity factor (variance = 29.038%; Cronbach's α-coefficient = 82.8). Subsequently, confirmatory factor analysis was applied and supported an overall reasonable fit. Conclusions: In this study, it was found that repeated monetary donations are supplied to charitable organizations when both aspects of retention (the donor factor and the charity factor) are taken into consideration. This model could provide a perspective for making sustainable donations and charitable giving. PMID:28955663
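    Since the abstract reports reliability as Cronbach's α, a small self-contained sketch of the computation is given below; the item responses are simulated, not the survey data.

    ```python
    # Sketch: Cronbach's alpha from a respondents-by-items score matrix.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: respondents x items matrix of scores."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    rng = np.random.default_rng(4)
    latent = rng.normal(size=(243, 1))                    # 243 simulated respondents
    items = latent + 0.8 * rng.normal(size=(243, 6))      # 6 items loading on one factor
    print(round(cronbach_alpha(items), 3))
    ```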

  18. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  19. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  20. Metropolis-Hastings Robbins-Monro Algorithm for Confirmatory Item Factor Analysis

    ERIC Educational Resources Information Center

    Cai, Li

    2010-01-01

    Item factor analysis (IFA), already well established in educational measurement, is increasingly applied to psychological measurement in research settings. However, high-dimensional confirmatory IFA remains a numerical challenge. The current research extends the Metropolis-Hastings Robbins-Monro (MH-RM) algorithm, initially proposed for…

  1. Factor Analysis and Counseling Research

    ERIC Educational Resources Information Center

    Weiss, David J.

    1970-01-01

    Topics discussed include factor analysis versus cluster analysis, analysis of Q correlation matrices, ipsativity and factor analysis, and tests for the significance of a correlation matrix prior to application of factor analytic techniques. Techniques for factor extraction discussed include principal components, canonical factor analysis, alpha…

  2. Risky Business: Factor Analysis of Survey Data – Assessing the Probability of Incorrect Dimensionalisation

    PubMed Central

    van der Eijk, Cees; Rose, Jonathan

    2015-01-01

    This paper undertakes a systematic assessment of the extent to which factor analysis recovers the correct number of latent dimensions (factors) when applied to ordered-categorical survey items (so-called Likert items). We simulate 2400 data sets of uni-dimensional Likert items that vary systematically over a range of conditions such as the underlying population distribution, the number of items, the level of random error, and characteristics of items and item-sets. Each of these datasets is factor analysed in a variety of ways that are frequently used in the extant literature, or that are recommended in current methodological texts. These include exploratory factor retention heuristics such as Kaiser's criterion, Parallel Analysis and a non-graphical scree test, and (for exploratory and confirmatory analyses) evaluations of model fit. These analyses are conducted on the basis of Pearson and polychoric correlations. We find that, irrespective of the particular mode of analysis, factor analysis applied to ordered-categorical survey data very often leads to over-dimensionalisation. The magnitude of this risk depends on the specific way in which factor analysis is conducted, the number of items, the properties of the set of items, and the underlying population distribution. The paper concludes with a discussion of the consequences of over-dimensionalisation, and a brief mention of alternative modes of analysis that are much less prone to such problems. PMID:25789992
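    One of the retention heuristics evaluated above, Horn's Parallel Analysis, is easy to sketch: retain the factors whose observed eigenvalues exceed those obtained from random data of the same dimensions. The sketch below works on simulated continuous data with Pearson correlations; the paper's point is that results degrade when such heuristics are applied to ordered-categorical items.

    ```python
    # Sketch of Horn's Parallel Analysis on simulated continuous data.
    import numpy as np

    def parallel_analysis(X: np.ndarray, n_sims: int = 200, quantile: float = 95) -> int:
        n, p = X.shape
        obs_eig = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
        rng = np.random.default_rng(0)
        sim_eig = np.empty((n_sims, p))
        for s in range(n_sims):
            R = np.corrcoef(rng.normal(size=(n, p)), rowvar=False)
            sim_eig[s] = np.linalg.eigvalsh(R)[::-1]
        threshold = np.percentile(sim_eig, quantile, axis=0)
        return int(np.sum(obs_eig > threshold))   # number of factors to retain

    rng = np.random.default_rng(5)
    factor = rng.normal(size=(500, 1))
    X = factor + rng.normal(size=(500, 6))        # six items driven by one latent dimension
    print(parallel_analysis(X))                    # typically 1 for these data
    ```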

  3. Defining applied behavior analysis: An historical analogy

    PubMed Central

    Deitz, Samuel M.

    1982-01-01

    This article examines two criteria for a definition of applied behavior analysis. The criteria are derived from a 19th century attempt to establish medicine as a scientific field. The first criterion, experimental determinism, specifies the methodological boundaries of an experimental science. The second criterion, philosophic doubt, clarifies the tentative nature of facts and theories derived from those facts. Practices which will advance the science of behavior are commented upon within each criterion. To conclude, the problems of a 19th century form of empiricism in medicine are related to current practices in applied behavior analysis. PMID:22478557

  4. Factor analysis of an instrument to measure the impact of disease on daily life.

    PubMed

    Pedrosa, Rafaela Batista Dos Santos; Rodrigues, Roberta Cunha Matheus; Padilha, Kátia Melissa; Gallani, Maria Cecília Bueno Jayme; Alexandre, Neusa Maria Costa

    2016-01-01

    To verify the structure of factors of an instrument to measure the Heart Valve Disease Impact on Daily Life (IDCV) when applied to coronary artery disease patients. The study included 153 coronary artery disease patients undergoing outpatient follow-up care. The IDCV structure of factors was initially assessed by means of confirmatory factor analysis and, subsequently, by exploratory factor analysis. The Varimax rotation method was used to estimate the main components of analysis, with eigenvalues greater than one for extraction of factors and factor loadings greater than 0.40 for selection of items. Internal consistency was estimated using Cronbach's alpha coefficient. Confirmatory factor analysis did not confirm the original structure of factors of the IDCV. Exploratory factor analysis showed three dimensions, which together explained 78% of the measurement variance. Future studies with expanded case selection are necessary to confirm the IDCV's new structure of factors.

  5. Lessons from NASA Applied Sciences Program: Success Factors in Applying Earth Science in Decision Making

    NASA Astrophysics Data System (ADS)

    Friedl, L. A.; Cox, L.

    2008-12-01

    The NASA Applied Sciences Program collaborates with organizations to discover and demonstrate applications of NASA Earth science research and technology to decision making. The desired outcome is for public and private organizations to use NASA Earth science products in innovative applications for sustained, operational uses that enhance their decisions. In addition, the program channels end-user feedback back to Earth science to improve products and to inform demands for research. The Program thus serves as a bridge between Earth science research and technology and the applied organizations and end-users with management, policy, and business responsibilities. Since 2002, the Applied Sciences Program has sponsored over 115 applications-oriented projects to apply Earth observations and model products to decision-making activities. Projects have spanned numerous topics: agriculture, air quality, water resources, disasters, public health, aviation, etc. The projects have involved government agencies, private companies, universities, non-governmental organizations, and foreign entities in multiple types of teaming arrangements. The paper will examine this set of applications projects and present specific examples of the successful use of Earth science in decision making. The paper will discuss scientific, organizational, and management factors that contribute to or impede the integration of Earth science research in policy and management. The paper will also present new methods the Applied Sciences Program plans to implement to improve linkages between science and end users.

  6. Application of Factor Analysis on the Financial Ratios of Indian Cement Industry and Validation of the Results by Cluster Analysis

    NASA Astrophysics Data System (ADS)

    De, Anupam; Bandyopadhyay, Gautam; Chakraborty, B. N.

    2010-10-01

    Financial ratio analysis is an important and commonly used tool for analyzing the financial health of a firm. Quite a large number of financial ratios, which can be categorized into different groups, are used for this analysis. However, to reduce the number of ratios used for financial analysis and to regroup them on the basis of empirical evidence, the Factor Analysis technique has been used successfully by different researchers during the last three decades. In this study, Factor Analysis has been applied to the audited financial data of Indian cement companies over a period of 10 years. The sample companies are listed on the stock exchanges of India (BSE and NSE). Factor Analysis, conducted over 44 variables (financial ratios) grouped into 7 categories, resulted in 11 underlying categories (factors). Each factor is named in an appropriate manner considering the factor loads and constituent variables (ratios). Representative ratios are identified for each such factor. To validate the results of the Factor Analysis and to reach a final conclusion regarding the representative ratios, Cluster Analysis was performed.

  7. Combination of preservation factors applied to minimal processing of foods.

    PubMed

    Tapia de Daza, M S; Alzamora, S M; Chanes, J W

    1996-07-01

    Innovative technologies for producing minimally processed (MP) foods that apply the concept of combination of preservation factors are addressed in this article with special emphasis on a new combined approach that has been successfully applied in several Latin American countries for MP high-moisture fruit products (HMFP). HMFP can be regarded as a different approach to the commercially available and widely accepted MP concept for fruits and vegetables (even if developed for the same purpose of obtaining freshlike high-quality products with an extended shelf life) that is better adapted to Latin American countries in terms of independence of the chill chain and the use of simple and energy-efficient technologies. The continuous refrigeration hurdle associated with MP refrigerated fruits is not included in the preservation system of HMFP because a different combination of hurdles must be overcome to enhance the shelf stability of nonrespiring vegetable tissues while preserving freshlike character. Guidelines to obtain safe and high-quality MP fruit products are proposed. Other products preserved by combined factors technology are also discussed, as well as some other classical and new preservation factors whose application could enhance the quality of HMFP.

  8. Old and New Ideas for Data Screening and Assumption Testing for Exploratory and Confirmatory Factor Analysis

    PubMed Central

    Flora, David B.; LaBrish, Cathy; Chalmers, R. Philip

    2011-01-01

    We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables, and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables. PMID:22403561

  9. The impact of applied behavior analysis on diverse areas of research.

    PubMed Central

    Kazdin, A E

    1975-01-01

    The impact of applied behavior analysis on various disciplines and areas of research was assessed through two major analyses. First, the relationship of applied behavior analysis to the general area of "behavior modification" was evaluated by examining the citation characteristics of journal articles in JABA and three other behavior-modification journals. Second, the penetration of applied behavior analysis into diverse areas and disciplines, including behavior modification, psychiatry, clinical psychology, education, special education, retardation, speech and hearing, counselling, and law enforcement and correction was assessed. Twenty-five journals representing diverse research areas were evaluated from 1968 to 1974 to assess the extent to which operant techniques were applied for therapeutic, rehabilitative, and educative purposes and the degree to which methodological desiderata of applied behavior analysis were met. The analyses revealed diverse publication outlets for applied behavior analysis in various disciplines. PMID:1184488

  10. Applied behavior analysis: New directions from the laboratory

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1983-01-01

    Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior, which occurs as a by-product of contingencies of reinforcement, is discussed. Possible difficulties in the treatment and management of induced behaviors are considered. Next, the correlation-based law of effect and the implications of relative reinforcement are explored in terms of applied examples. Relative rate of reinforcement is then extended to the literature dealing with concurrent operants. Concurrent operant models may describe human behavior of applied importance, and several techniques for modification of problem behavior are suggested. As a final concern, the paper discusses several new paradigms. While the practical importance of these models is not clear at the moment, it may be that new practical advantages will soon arise. Thus, it is argued that basic research continues to be of theoretical and practical importance to applied behavior analysis. PMID:22478574

  11. Topological data analysis (TDA) applied to reveal pedogenetic principles of European topsoil system.

    PubMed

    Savic, Aleksandar; Toth, Gergely; Duponchel, Ludovic

    2017-05-15

    Recent developments in applied mathematics are bringing new tools that are capable of synthesizing knowledge in various disciplines and of finding hidden relationships between variables. One such technique is topological data analysis (TDA), a fusion of classical exploration techniques, such as principal component analysis (PCA), with a topological point of view applied to the clustering of results. Various phenomena have already received new interpretations thanks to TDA, from the proper choice of sport teams to cancer treatments. For the first time, this technique has been applied in soil science to show the interaction between physical and chemical soil attributes and the main soil-forming factors, such as climate and land use. The topsoil data set of the Land Use/Land Cover Area Frame survey (LUCAS) was used as a comprehensive database that consists of approximately 20,000 samples, each described by 12 physical and chemical parameters. After the application of TDA, the results obtained were cross-checked against known grouping parameters, including five types of land cover, nine types of climate and the organic carbon content of soil. Some of the grouping characteristics observed using standard approaches were confirmed by TDA (e.g., organic carbon content), but novel, subtle relationships (e.g., the magnitude of the anthropogenic effect in soil formation) were discovered as well. The importance of this finding is that TDA is a unique mathematical technique capable of extracting complex relations hidden in soil science data sets, giving the opportunity to see the influence of physicochemical, biotic and abiotic factors on topsoil formation through fresh eyes. Copyright © 2017 Elsevier B.V. All rights reserved.
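
    A minimal Mapper-style sketch of the TDA idea is given below, built only from standard scikit-learn pieces (a PCA lens, overlapping intervals, and clustering within each interval). The data, cover settings, and clustering choice are all stand-ins; published TDA work typically uses dedicated Mapper implementations rather than this simplification.

```python
# Mapper-style sketch: project samples onto a 1-D lens, cover the lens with
# overlapping intervals, cluster within each interval, and connect clusters
# that share samples. Data and parameters are illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 12))                       # stand-in for 12 topsoil parameters

lens = PCA(n_components=1).fit_transform(X).ravel()  # 1-D "lens" / filter function
n_intervals, overlap = 8, 0.3
lo, hi = lens.min(), lens.max()
width = (hi - lo) / n_intervals

nodes = {}
for i in range(n_intervals):
    a = lo + i * width - overlap * width
    b = lo + (i + 1) * width + overlap * width
    idx = np.where((lens >= a) & (lens <= b))[0]
    if idx.size < 5:                                 # skip nearly empty cover elements
        continue
    labels = AgglomerativeClustering(n_clusters=2).fit_predict(X[idx])
    for lab in np.unique(labels):
        nodes[(i, lab)] = set(idx[labels == lab])

# Connect clusters from different cover elements that share samples.
node_ids = list(nodes)
edges = {(u, v) for k, u in enumerate(node_ids)
         for v in node_ids[k + 1:] if nodes[u] & nodes[v]}
print(f"Mapper-style graph: {len(nodes)} nodes, {len(edges)} edges")
```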

  12. Can Link Analysis Be Applied to Identify Behavioral Patterns in Train Recorder Data?

    PubMed

    Strathie, Ailsa; Walker, Guy H

    2016-03-01

    A proof-of-concept analysis was conducted to establish whether link analysis could be applied to data from on-train recorders to detect patterns of behavior that could act as leading indicators of potential safety issues. On-train data recorders capture data about driving behavior on thousands of routine journeys every day and constitute a largely untapped source of data that could offer insights into human behavior. Data from 17 journeys undertaken by six drivers on the same route over a 16-hr period were analyzed using link analysis, and four key metrics were examined: number of links, network density, diameter, and sociometric status. The results established that link analysis can be usefully applied to data captured from on-vehicle recorders. The four metrics revealed key differences in normal driver behavior. These differences have promising construct validity as leading indicators. Link analysis is one method that could be usefully applied to exploit data routinely gathered by on-vehicle data recorders. It facilitates a proactive approach to safety based on leading indicators, offers a clearer understanding of what constitutes normal driving behavior, and identifies trends at the interface of people and systems, which is currently a key area of strategic risk. These research findings have direct applications in the field of transport data monitoring. They offer a means of automatically detecting patterns in driver behavior that could act as leading indicators of problems during operation and that could be used in the proactive monitoring of driver competence, risk management, and even infrastructure design. © 2015, Human Factors and Ergonomics Society.
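
    The four metrics named above can be computed directly with networkx once recorder events are turned into a transition graph. The toy event sequence and the simple degree-based sociometric-status formula below are illustrative assumptions, not the study's coding scheme.

```python
# Hedged sketch of the four link-analysis metrics (links, density, diameter,
# sociometric status) on a toy sequence of recorded driver actions.
import networkx as nx

events = ["throttle", "brake", "horn", "brake", "throttle", "door", "brake", "throttle"]

G = nx.DiGraph()
for a, b in zip(events, events[1:]):                 # link = observed transition a -> b
    G.add_edge(a, b)

n = G.number_of_nodes()
num_links = G.number_of_edges()
density = nx.density(G)
diameter = nx.diameter(G.to_undirected())            # assumes the graph is connected
sociometric = {v: (G.in_degree(v) + G.out_degree(v)) / (n - 1) for v in G}

print(num_links, round(density, 2), diameter, sociometric)
```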

  13. Using Horn's Parallel Analysis Method in Exploratory Factor Analysis for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Çokluk, Ömay; Koçak, Duygu

    2016-01-01

    In this study, the number of factors obtained from parallel analysis, a method used for determining the number of factors in exploratory factor analysis, was compared to that of the factors obtained from eigenvalue and scree plot--two traditional methods for determining the number of factors--in terms of consistency. Parallel analysis is based on…
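
    A compact sketch of Horn's parallel analysis is shown below: eigenvalues of the observed correlation matrix are retained only while they exceed the corresponding eigenvalues obtained from random, uncorrelated data. The simulated data, number of iterations, and use of mean (rather than percentile) random eigenvalues are assumptions for illustration.

```python
# Minimal sketch of Horn's parallel analysis on simulated data.
import numpy as np

def parallel_analysis(X, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs_eig = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
    rand_eig = np.zeros((n_iter, p))
    for i in range(n_iter):
        R = rng.normal(size=(n, p))                  # uncorrelated reference data
        rand_eig[i] = np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]
    threshold = rand_eig.mean(axis=0)
    return int(np.sum(obs_eig > threshold)), obs_eig, threshold

rng = np.random.default_rng(3)
latent = rng.normal(size=(300, 2))                   # two true factors
X = latent @ rng.normal(size=(2, 8)) + 0.7 * rng.normal(size=(300, 8))
k, obs, thr = parallel_analysis(X)
print("retained factors:", k)                        # usually 2 for this simulation
```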

  14. Applying Cognitive Work Analysis to Time Critical Targeting Functionality

    DTIC Science & Technology

    2004-10-01

    [DTIC indexing fragments; the abstract itself is not recoverable. Keywords: Cognitive Task Analysis (CTA), Cognitive Work Analysis (CWA), Human Factors, Graphical User Interface (GUI), Heuristic Evaluation; references a MITRE briefing (January 2000) and a Dynamic Battle Management functional architecture. A surviving fragment indicates that the document draws no clear distinction between CWA and CTA and refers to both accordingly.]

  15. Can small field diode correction factors be applied universally?

    PubMed

    Liu, Paul Z Y; Suchowerska, Natalka; McKenzie, David R

    2014-09-01

    Diode detectors are commonly used in dosimetry but have been reported to over-respond in small fields. Diode correction factors have been reported in the literature. The purpose of this study is to determine whether correction factors for a given diode type can be universally applied over a range of irradiation conditions, including beams of different qualities. A mathematical relation for diode over-response as a function of the field size was developed using previously published experimental data in which diodes were compared to an air-core scintillation dosimeter. Correction factors calculated from the mathematical relation were then compared with those available in the literature. The mathematical relation established between diode over-response and the field size was found to predict the measured diode correction factors for fields between 5 and 30 mm in width. The average deviation between measured and predicted over-response was 0.32% for IBA SFD and PTW Type E diodes. Diode over-response was found not to be strongly dependent on the type of linac, the method of collimation or the measurement depth. The mathematical relation was found to agree with published diode correction factors derived from Monte Carlo simulations and measurements, indicating that correction factors are robust in their transportability between different radiation beams. Copyright © 2014. Published by Elsevier Ireland Ltd.
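
    The abstract does not give the functional form of the published relation, so the sketch below fits a hypothetical exponential model of over-response versus field width to made-up readings, simply to illustrate how such a relation can be fitted and inverted into correction factors.

```python
# Illustrative only: hypothetical functional form and hypothetical data, fitted
# with scipy.curve_fit; not the relation derived in the study.
import numpy as np
from scipy.optimize import curve_fit

field_mm = np.array([5, 7.5, 10, 15, 20, 30], dtype=float)       # field widths (mm)
over_resp = np.array([1.045, 1.030, 1.020, 1.010, 1.004, 1.000]) # hypothetical ratios

def model(s, a, b):
    # over-response tends to 1 for large fields and rises as the field shrinks
    return 1.0 + a * np.exp(-b * s)

popt, _ = curve_fit(model, field_mm, over_resp, p0=(0.1, 0.1))
correction = 1.0 / model(field_mm, *popt)            # correction factor = 1 / over-response
print(np.round(correction, 3))
```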

  16. Determining the Number of Factors in P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still question of how these methods perform in within-subjects P-technique factor analysis. A…

  17. [Mathematic analysis of risk factors influence on occupational respiratory diseases development].

    PubMed

    Budkar', L N; Bugaeva, I V; Obukhova, T Iu; Tereshina, L G; Karpova, E A; Shmonina, O G

    2010-01-01

    The analysis covered 1348 case histories of workers exposed to industrial dust in the Urals region. It applied mathematical methods from survival theory together with correlation analysis. The authors studied the influence of various factors (dust concentration, connective tissue dysplasia, and smoking habits) on the time taken for dust-induced diseases to appear. The findings are that occupational diseases develop reliably faster with higher ambient dust concentrations and with connective tissue dysplasia syndrome. Smoking does not alter the time course of pneumoconiosis development, but it reliably accelerates the development of occupational dust bronchitis.

  18. Multiple factors explain injury risk in adolescent elite athletes: Applying a biopsychosocial perspective.

    PubMed

    von Rosen, P; Frohm, A; Kottorp, A; Fridén, C; Heijne, A

    2017-12-01

    Many risk factors for injury are presented in the literature; however, few of them are consistent, and the majority are associated with adult rather than adolescent elite athletes. The aim was to identify risk factors for injury in adolescent elite athletes by applying a biopsychosocial approach. A total of 496 adolescent elite athletes (age range 15-19), participating in 16 different sports, were monitored repeatedly over 52 weeks using a valid questionnaire about injuries, training exposure, sleep, stress, nutrition, and competence-based self-esteem. Univariate and multiple Cox regression analyses were used to calculate hazard ratios (HR) for risk factors for the first reported injury. The main finding was that an increase in training load and training intensity combined with a decrease in sleep volume resulted in a higher risk for injury compared to no change in these variables (HR 2.25, 95% CI, 1.46-3.45, P<.01), which was the strongest risk factor identified. In addition, an increase of one point in competence-based self-esteem increased the hazard for injury by a factor of 1.02 (HR 95% CI, 1.00-1.04, P=.01). Based on the multiple Cox regression analysis, an athlete with the identified risk factors (Risk Index, competence-based self-esteem) and an average competence-based self-esteem score had more than a threefold increased risk for injury (HR 3.35) compared to an athlete with low competence-based self-esteem and no change in sleep or training volume. Our findings confirm injury occurrence as a result of multiple risk factors interacting in complex ways. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
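
    A hedged sketch of this kind of Cox regression on simulated data is shown below (the lifelines package is assumed to be available; the variable names, effect sizes, and censoring scheme are illustrative, not the study's).

```python
# Simulated Cox proportional-hazards example in the spirit of the analysis above.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 400
risk_index = rng.integers(0, 2, n)                   # 1 = load up, intensity up, sleep down
self_esteem = rng.normal(30, 5, n)                   # competence-based self-esteem score

# Simulate weeks to first injury with a higher hazard for the risk profile.
baseline = rng.exponential(scale=40, size=n)
weeks = baseline / np.exp(0.8 * risk_index + 0.02 * (self_esteem - 30))
injured = (weeks <= 52).astype(int)
weeks = np.minimum(weeks, 52)                        # administrative censoring at 52 weeks

df = pd.DataFrame({"weeks": weeks, "injured": injured,
                   "risk_index": risk_index, "self_esteem": self_esteem})
cph = CoxPHFitter().fit(df, duration_col="weeks", event_col="injured")
print(cph.hazard_ratios_)                            # HRs analogous to those reported above
```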

  19. Determinant Factors in Applying Picture Archiving and Communication Systems (PACS) in Healthcare.

    PubMed

    Abdekhoda, Mohammadhiwa; Salih, Kawa Mirza

    2017-01-01

    Meaningful use of picture archiving and communication systems (PACS) can change the workflow for accessing digital images, lead to faster turnaround time, reduce tests and examinations, and increase patient throughput. This study was carried out to identify determinant factors that affect the adoption of PACS by physicians. This was a cross-sectional study in which 190 physicians working in a teaching hospital affiliated with Tehran University of Medical Sciences were randomly selected. Physicians' perceptions concerning the adoption of PACS were assessed with the conceptual path model of the Unified Theory of Acceptance and Use of Technology (UTAUT). Collected data were analyzed with regression analysis. Structural equation modeling was applied to test the final model that was developed. The results show that the UTAUT model can explain about 61 percent of the variance in the adoption of PACS (R² = 0.61). The findings also showed that performance expectancy, effort expectancy, social influences, and behavior intention have a direct and significant effect on the adoption of PACS. However, facility condition was shown to have no significant effect on physicians' behavior intentions. Implementation of new technology such as PACS in the healthcare sector is unavoidable. Our study clearly identified significant and nonsignificant factors that may affect the adoption of PACS. Also, this study acknowledged that physicians' perception is a key factor in managing the implementation of PACS optimally, and this fact should be considered by healthcare managers and policy makers.
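
    The authors used structural equation modeling; as a simplified stand-in, the sketch below regresses a simulated behavioural-intention score on the four UTAUT constructs with statsmodels. All column names, coefficients, and data are assumptions for illustration only.

```python
# Simplified UTAUT-style regression sketch (not the authors' SEM).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 190
df = pd.DataFrame({
    "performance_expectancy": rng.normal(size=n),
    "effort_expectancy": rng.normal(size=n),
    "social_influence": rng.normal(size=n),
    "facilitating_conditions": rng.normal(size=n),
})
df["behavioural_intention"] = (0.5 * df["performance_expectancy"]
                               + 0.3 * df["effort_expectancy"]
                               + 0.2 * df["social_influence"]
                               + rng.normal(scale=0.7, size=n))

X = sm.add_constant(df.drop(columns="behavioural_intention"))
fit = sm.OLS(df["behavioural_intention"], X).fit()
print(fit.rsquared)                                  # analogous to the R^2 = 0.61 reported
print(fit.params)
```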

  20. Determinant Factors in Applying Picture Archiving and Communication Systems (PACS) in Healthcare

    PubMed Central

    Abdekhoda, Mohammadhiwa; Salih, Kawa Mirza

    2017-01-01

    Objectives Meaningful use of picture archiving and communication systems (PACS) can change the workflow for accessing digital images, lead to faster turnaround time, reduce tests and examinations, and increase patient throughput. This study was carried out to identify determinant factors that affect the adoption of PACS by physicians. Methods This was a cross-sectional study in which 190 physicians working in a teaching hospital affiliated with Tehran University of Medical Sciences were randomly selected. Physicians’ perceptions concerning the adoption of PACS were assessed with the conceptual path model of the Unified Theory of Acceptance and Use of Technology (UTAUT). Collected data were analyzed with regression analysis. Structural equation modeling was applied to test the final model that was developed. Results The results show that the UTAUT model can explain about 61 percent of the variance in the adoption of PACS (R² = 0.61). The findings also showed that performance expectancy, effort expectancy, social influences, and behavior intention have a direct and significant effect on the adoption of PACS. However, facility condition was shown to have no significant effect on physicians’ behavior intentions. Conclusions Implementation of new technology such as PACS in the healthcare sector is unavoidable. Our study clearly identified significant and nonsignificant factors that may affect the adoption of PACS. Also, this study acknowledged that physicians’ perception is a key factor in managing the implementation of PACS optimally, and this fact should be considered by healthcare managers and policy makers. PMID:28855856

  1. Critical incident technique analysis applied to perianesthetic cardiac arrests at a university teaching hospital.

    PubMed

    Hofmeister, Erik H; Reed, Rachel A; Barletta, Michele; Shepard, Molly; Quandt, Jane

    2018-05-01

    To apply the critical incident technique (CIT) methodology to a series of perianesthetic cardiac arrest events at a university teaching hospital to describe the factors that contributed to cardiac arrest. CIT qualitative analysis of a case series. A group of 16 dogs and cats that suffered a perioperative cardiac arrest between November 2013 and November 2016. If an arrest occurred, the event was discussed among the anesthesiologists. The discussion included a description of the case, a description of the sequence of events leading up to the arrest and a discussion of what could have been done to affect the outcome. A written description of the case and the event including animal signalment and a timeline of events was provided by the supervising anesthesiologist following discussion among the anesthesiologists. Only dogs or cats were included. After the data collection period, information from the medical record was collected. A qualitative document analysis was performed on the summaries provided about each case by the supervising anesthesiologist, the medical record and any supporting documents. Each case was then classified into one or more of the following: animal, human, equipment, drug and procedural factors for cardiac arrest. The most common factor was animal (n=14), followed by human (n=12), procedural (n=4), drugs (n=1) and equipment (n=1). The majority (n=11) of animals had multiple factors identified. Cardiac arrests during anesthesia at a referral teaching hospital were primarily a result of animal and human factors. Arrests because of procedural, drug and equipment factors were uncommon. Most animals experienced more than one factor and two animals arrested after a change in recumbency. Future work should focus on root cause analysis and interventions designed to minimize all factors, particularly human ones. Copyright © 2018 Association of Veterinary Anaesthetists and American College of Veterinary Anesthesia and Analgesia. Published by Elsevier Ltd

  2. The Significance of Regional Analysis in Applied Geography.

    ERIC Educational Resources Information Center

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  3. Animal Research in the "Journal of Applied Behavior Analysis"

    ERIC Educational Resources Information Center

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  4. Chemical factor analysis of skin cancer FTIR-FEW spectroscopic data

    NASA Astrophysics Data System (ADS)

    Bruch, Reinhard F.; Sukuta, Sydney

    2002-03-01

    Chemical Factor Analysis (CFA) algorithms were applied to transform complex Fourier transform infrared fiber-optic evanescent wave (FTIR-FEW) spectra of normal and malignant skin tissue into factor spaces for analysis and classification. The factor-space approach classified melanoma beyond prior pathological classifications, relating specific biochemical alterations to health states in cluster diagrams, allowing diagnosis with more biochemical specificity, resolving biochemical component spectra, and employing health-state eigenvector angular configurations as disease-state sensors. This study demonstrated that a wealth of new information can be obtained from in vivo FTIR-FEW spectral tissue data without extensive a priori information or clinically invasive procedures. In particular, we employed a variety of methods used in CFA to select the rank of spectroscopic data sets of normal, benign and cancerous skin tissue. We used the Malinowski indicator function (IND), significance levels and F-tests to rank our data matrices. Normal skin tissue, melanoma and benign tumors were modeled by four, two and seven principal abstract factors, respectively. We also showed that the spectrum of the first eigenvalue was equivalent to the mean spectrum. The graphical depiction of angular disparities between the first abstract factors can be adopted as a new way to characterize and diagnose melanoma cancer.
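
    The rank-selection step can be illustrated with Malinowski's indicator function, commonly given for an r x c data matrix as IND(n) = RE(n)/(c - n)^2 with RE(n) = sqrt(sum of the discarded eigenvalues / (r(c - n))). The sketch below applies it to a simulated low-rank matrix rather than FTIR-FEW spectra; the formula as stated is a common textbook form, not a quotation from this paper.

```python
# Rank estimation with Malinowski's indicator function (IND) on simulated data.
import numpy as np

rng = np.random.default_rng(6)
r, c, true_rank = 60, 20, 3
D = rng.normal(size=(r, true_rank)) @ rng.normal(size=(true_rank, c))
D += 0.05 * rng.normal(size=(r, c))                  # measurement noise

eig = np.sort(np.linalg.eigvalsh(D.T @ D))[::-1]     # eigenvalues, largest first
ind = []
for n in range(1, c):                                # n = number of factors kept
    re_n = np.sqrt(eig[n:].sum() / (r * (c - n)))    # real error RE(n)
    ind.append(re_n / (c - n) ** 2)                  # indicator function IND(n)
print("estimated rank:", int(np.argmin(ind)) + 1)    # usually recovers true_rank = 3
```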

  5. Integrative Analysis of Transcription Factor Combinatorial Interactions Using a Bayesian Tensor Factorization Approach

    PubMed Central

    Ye, Yusen; Gao, Lin; Zhang, Shihua

    2017-01-01

    Transcription factors play a key role in the transcriptional regulation of genes and in the determination of cellular identity through combinatorial interactions. However, current studies of combinatorial regulation are limited by the lack of experimental data collected in the same cellular environment and by extensive data noise. Here, we adopt a Bayesian CANDECOMP/PARAFAC (CP) factorization approach (BCPF) to integrate multiple datasets in a network paradigm for determining precise TF interaction landscapes. In our first application, we apply BCPF to integrate three networks, built from diverse datasets of multiple cell lines from ENCODE, to predict a global and precise TF interaction network. This network gives 38 novel TF interactions with distinct biological functions. In our second application, we apply BCPF to seven cell-type TF regulatory networks and predict seven cell-lineage TF interaction networks, respectively. By further exploring their dynamics and modularity, we find that cell lineage-specific hub TFs participate in cell type- or lineage-specific regulation by interacting with non-specific TFs. Furthermore, we illustrate the biological function of hub TFs by taking those of the cancer lineage and the blood lineage as examples. Taken together, our integrative analysis reveals a more precise and extensive description of human TF combinatorial interactions. PMID:29033978
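
    As a non-Bayesian stand-in for the BCPF idea, the sketch below runs a classical CANDECOMP/PARAFAC decomposition of a small synthetic TF x TF x network tensor with tensorly. The tensor contents, rank, and library choice are assumptions, not the authors' implementation.

```python
# Classical CP/PARAFAC decomposition of a synthetic 3-way tensor (a simplified,
# non-Bayesian stand-in for BCPF).
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(7)
rank = 3
A = rng.random((40, rank))        # TF mode
B = rng.random((40, rank))        # partner-TF mode
C = rng.random((5, rank))         # network / cell-line mode
tensor = np.einsum('ir,jr,kr->ijk', A, B, C) + 0.01 * rng.random((40, 40, 5))

weights, factors = parafac(tl.tensor(tensor), rank=rank, n_iter_max=200)
print([f.shape for f in factors])                    # [(40, 3), (40, 3), (5, 3)]
```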

  6. Integrative Analysis of Transcription Factor Combinatorial Interactions Using a Bayesian Tensor Factorization Approach.

    PubMed

    Ye, Yusen; Gao, Lin; Zhang, Shihua

    2017-01-01

    Transcription factors play a key role in the transcriptional regulation of genes and in the determination of cellular identity through combinatorial interactions. However, current studies of combinatorial regulation are limited by the lack of experimental data collected in the same cellular environment and by extensive data noise. Here, we adopt a Bayesian CANDECOMP/PARAFAC (CP) factorization approach (BCPF) to integrate multiple datasets in a network paradigm for determining precise TF interaction landscapes. In our first application, we apply BCPF to integrate three networks, built from diverse datasets of multiple cell lines from ENCODE, to predict a global and precise TF interaction network. This network gives 38 novel TF interactions with distinct biological functions. In our second application, we apply BCPF to seven cell-type TF regulatory networks and predict seven cell-lineage TF interaction networks, respectively. By further exploring their dynamics and modularity, we find that cell lineage-specific hub TFs participate in cell type- or lineage-specific regulation by interacting with non-specific TFs. Furthermore, we illustrate the biological function of hub TFs by taking those of the cancer lineage and the blood lineage as examples. Taken together, our integrative analysis reveals a more precise and extensive description of human TF combinatorial interactions.

  7. Animal research in the Journal of Applied Behavior Analysis.

    PubMed

    Edwards, Timothy L; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say-do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by a chimpanzee, and unsafe trailer entry by horses) in ways that benefited the animals and the people in charge of them, and 1 described the use of trained rats that performed a service to humans (land-mine detection). We suggest that each of these general research areas merits further attention and that the Journal of Applied Behavior Analysis is an appropriate outlet for some of these publications.

  8. Enhancing the Effectiveness of Significant Event Analysis: Exploring Personal Impact and Applying Systems Thinking in Primary Care

    PubMed Central

    McNaughton, Elaine; Bruce, David; Holly, Deirdre; Forrest, Eleanor; Macleod, Marion; Kennedy, Susan; Power, Ailsa; Toppin, Denis; Black, Irene; Pooley, Janet; Taylor, Audrey; Swanson, Vivien; Kelly, Moya; Ferguson, Julie; Stirling, Suzanne; Wakeling, Judy; Inglis, Angela; McKay, John; Sargeant, Joan

    2016-01-01

    Introduction: Significant event analysis (SEA) is well established in many primary care settings but can be poorly implemented. Reasons include the emotional impact on clinicians and limited knowledge of systems thinking in establishing why events happen and formulating improvements. To enhance SEA effectiveness, we developed and tested “guiding tools” based on human factors principles. Methods: Mixed-methods development of guiding tools (Personal Booklet—to help with emotional demands and apply a human factors analysis at the individual level; Desk Pad—to guide a team-based systems analysis; and a written Report Format) by a multiprofessional “expert” group and testing with Scottish primary care practitioners who submitted completed enhanced SEA reports. Evaluation data were collected through questionnaire, telephone interviews, and thematic analysis of SEA reports. Results: Overall, 149/240 care practitioners tested the guiding tools and submitted completed SEA reports (62.1%). Reported understanding of how to undertake SEA improved postintervention (P < .001), while most agreed that the Personal Booklet was practical (88/123, 71.5%) and relevant to dealing with related emotions (93/123, 75.6%). The Desk Pad tool helped focus the SEA on systems issues (85/123, 69.1%), while most found the Report Format clear (94/123, 76.4%) and would recommend it (88/123, 71.5%). Most SEA reports adopted a systems approach to analyses (125/149, 83.9%), care improvement (74/149, 49.7%), or planned actions (42/149, 28.2%). Discussion: Applying human factors principles to SEA potentially enables care teams to gain a systems-based understanding of why things go wrong, which may help with related emotional demands and with more effective learning and improvement. PMID:27583996

  9. Enhancing the Effectiveness of Significant Event Analysis: Exploring Personal Impact and Applying Systems Thinking in Primary Care.

    PubMed

    Bowie, Paul; McNaughton, Elaine; Bruce, David; Holly, Deirdre; Forrest, Eleanor; Macleod, Marion; Kennedy, Susan; Power, Ailsa; Toppin, Denis; Black, Irene; Pooley, Janet; Taylor, Audrey; Swanson, Vivien; Kelly, Moya; Ferguson, Julie; Stirling, Suzanne; Wakeling, Judy; Inglis, Angela; McKay, John; Sargeant, Joan

    2016-01-01

    Significant event analysis (SEA) is well established in many primary care settings but can be poorly implemented. Reasons include the emotional impact on clinicians and limited knowledge of systems thinking in establishing why events happen and formulating improvements. To enhance SEA effectiveness, we developed and tested "guiding tools" based on human factors principles. Mixed-methods development of guiding tools (Personal Booklet-to help with emotional demands and apply a human factors analysis at the individual level; Desk Pad-to guide a team-based systems analysis; and a written Report Format) by a multiprofessional "expert" group and testing with Scottish primary care practitioners who submitted completed enhanced SEA reports. Evaluation data were collected through questionnaire, telephone interviews, and thematic analysis of SEA reports. Overall, 149/240 care practitioners tested the guiding tools and submitted completed SEA reports (62.1%). Reported understanding of how to undertake SEA improved postintervention (P < .001), while most agreed that the Personal Booklet was practical (88/123, 71.5%) and relevant to dealing with related emotions (93/123, 75.6%). The Desk Pad tool helped focus the SEA on systems issues (85/123, 69.1%), while most found the Report Format clear (94/123, 76.4%) and would recommend it (88/123, 71.5%). Most SEA reports adopted a systems approach to analyses (125/149, 83.9%), care improvement (74/149, 49.7%), or planned actions (42/149, 28.2%). Applying human factors principles to SEA potentially enables care teams to gain a systems-based understanding of why things go wrong, which may help with related emotional demands and with more effective learning and improvement.

  10. SEPARABLE FACTOR ANALYSIS WITH APPLICATIONS TO MORTALITY DATA

    PubMed Central

    Fosdick, Bailey K.; Hoff, Peter D.

    2014-01-01

    Human mortality data sets can be expressed as multiway data arrays, the dimensions of which correspond to categories by which mortality rates are reported, such as age, sex, country and year. Regression models for such data typically assume an independent error distribution or an error model that allows for dependence along at most one or two dimensions of the data array. However, failing to account for other dependencies can lead to inefficient estimates of regression parameters, inaccurate standard errors and poor predictions. An alternative to assuming independent errors is to allow for dependence along each dimension of the array using a separable covariance model. However, the number of parameters in this model increases rapidly with the dimensions of the array and, for many arrays, maximum likelihood estimates of the covariance parameters do not exist. In this paper, we propose a submodel of the separable covariance model that estimates the covariance matrix for each dimension as having factor analytic structure. This model can be viewed as an extension of factor analysis to array-valued data, as it uses a factor model to estimate the covariance along each dimension of the array. We discuss properties of this model as they relate to ordinary factor analysis, describe maximum likelihood and Bayesian estimation methods, and provide a likelihood ratio testing procedure for selecting the factor model ranks. We apply this methodology to the analysis of data from the Human Mortality Database, and show in a cross-validation experiment how it outperforms simpler methods. Additionally, we use this model to impute mortality rates for countries that have no mortality data for several years. Unlike other approaches, our methodology is able to estimate similarities between the mortality rates of countries, time periods and sexes, and use this information to assist with the imputations. PMID:25489353

  11. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    ERIC Educational Resources Information Center

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  12. The effects of common risk factors on stock returns: A detrended cross-correlation analysis

    NASA Astrophysics Data System (ADS)

    Ruan, Qingsong; Yang, Bingchan

    2017-10-01

    In this paper, we investigate the cross-correlations between the Fama and French three factors and the returns of American industries on the basis of a cross-correlation statistical test and multifractal detrended cross-correlation analysis (MF-DCCA). Qualitatively, we find that the return series of the Fama and French three factors and of American industries were, overall, significantly cross-correlated according to the statistical test. Quantitatively, we find that the cross-correlations between the three factors and the returns of American industries were strongly multifractal, and applying MF-DCCA we also investigate the cross-correlations of industry returns and residuals. We find that multifractality exists in both industry returns and residuals. From the correlation coefficients we can verify that factors other than the Fama and French three factors influence industry returns.
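
    A compact sketch of the MF-DCCA computation is given below, following the usual detrended-covariance recipe with only forward segmentation and illustrative scales and q values; it is not the authors' code, and the input series are random rather than factor or industry returns.

```python
# Minimal MF-DCCA sketch: profiles, segment-wise polynomial detrending,
# q-order fluctuation functions, and the generalized exponent h_xy(q).
import numpy as np

def mfdcca(x, y, scales, q_list, order=1):
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    N = len(X)
    h = {}
    for q in q_list:
        F = []
        for s in scales:
            n_seg = N // s
            f2 = []
            for v in range(n_seg):
                idx = np.arange(v * s, (v + 1) * s)
                t = np.arange(s)
                # local polynomial trends of the two profiles
                px = np.polyval(np.polyfit(t, X[idx], order), t)
                py = np.polyval(np.polyfit(t, Y[idx], order), t)
                f2.append(np.mean((X[idx] - px) * (Y[idx] - py)))
            f2 = np.abs(np.array(f2))
            F.append(np.mean(f2 ** (q / 2)) ** (1 / q) if q != 0 else
                     np.exp(0.5 * np.mean(np.log(f2))))
        # generalized cross-correlation exponent: slope of log F_q(s) vs log s
        h[q] = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return h

rng = np.random.default_rng(8)
x, y = rng.normal(size=2000), rng.normal(size=2000)
print(mfdcca(x, y, scales=[16, 32, 64, 128, 256], q_list=[-2, 2]))
# Noticeable variation of h_xy with q indicates multifractal cross-correlation.
```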

  13. Progressive-Ratio Schedules and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  14. Extension Procedures for Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Nagy, Gabriel; Brunner, Martin; Lüdtke, Oliver; Greiff, Samuel

    2017-01-01

    We present factor extension procedures for confirmatory factor analysis that provide estimates of the relations of common and unique factors with external variables that do not undergo factor analysis. We present identification strategies that build upon restrictions of the pattern of correlations between unique factors and external variables. The…

  15. TRICARE Applied Behavior Analysis (ABA) Benefit

    PubMed Central

    Maglione, Margaret; Kadiyala, Srikanth; Kress, Amii; Hastings, Jaime L.; O'Hanlon, Claire E.

    2017-01-01

    This study compared the Applied Behavior Analysis (ABA) benefit provided by TRICARE as an early intervention for autism spectrum disorder with similar benefits in Medicaid and commercial health insurance plans. The sponsor, the Office of the Under Secretary of Defense for Personnel and Readiness, was particularly interested in how a proposed TRICARE reimbursement rate decrease from $125 per hour to $68 per hour for ABA services performed by a Board Certified Behavior Analyst compared with reimbursement rates (defined as third-party payment to the service provider) in Medicaid and commercial health insurance plans. Information on ABA coverage in state Medicaid programs was collected from Medicaid state waiver databases; subsequently, Medicaid provider reimbursement data were collected from state Medicaid fee schedules. Applied Behavior Analysis provider reimbursement in the commercial health insurance system was estimated using Truven Health MarketScan® data. A weighted mean U.S. reimbursement rate was calculated for several services using cross-state information on the number of children diagnosed with autism spectrum disorder. Locations of potential provider shortages were also identified. Medicaid and commercial insurance reimbursement rates varied considerably across the United States. This project concluded that the proposed $68-per-hour reimbursement rate for services provided by a board certified analyst was more than 25 percent below the U.S. mean. PMID:28845348

  16. Randomization Procedures Applied to Analysis of Ballistic Data

    DTIC Science & Technology

    1991-06-01

    [DTIC report-documentation fragments; the abstract itself is not recoverable. Keywords: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics. Surviving abstract fragment: "Any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on this data." Report identifiers: AD-A238 389, Technical Report BRL-TR-3245, Malcolm S. Taylor and Barry A. Bodt, June 1991.]

  17. 18 CFR 401.96 - Factors to be applied in fixing penalty amount.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    18 CFR 401.96 (Conservation of Power and Water Resources, revised as of 2011-04-01): Factors to be applied in fixing penalty amount. Delaware River Basin Commission Administrative Manual, Rules of Practice and Procedure, Penalties and Settlements...

  18. Heisenberg principle applied to the analysis of speckle interferometry fringes

    NASA Astrophysics Data System (ADS)

    Sciammarella, C. A.; Sciammarella, F. M.

    2003-11-01

    Optical techniques that are used to measure displacements utilize a carrier. When a load is applied the displacement field modulates the carrier. The accuracy of the information that can be recovered from the modulated carrier is limited by a number of factors. In this paper, these factors are analyzed and conclusions concerning the limitations in information recovery are illustrated with examples taken from experimental data.

  19. Generalized five-dimensional dynamic and spectral factor analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    El Fakhri, Georges; Sitek, Arkadiusz; Zimmerman, Robert E.

    2006-04-15

    We have generalized the spectral factor analysis and the factor analysis of dynamic sequences (FADS) in SPECT imaging to a five-dimensional general factor analysis model (5D-GFA), where the five dimensions are the three spatial dimensions, photon energy, and time. The generalized model yields a significant advantage in terms of the ratio of the number of equations to that of unknowns in the factor analysis problem in dynamic SPECT studies. We solved the 5D model using a least-squares approach. In addition to the traditional non-negativity constraints, we constrained the solution using a priori knowledge of both time and energy, assuming that primary factors (spectra) are Gaussian-shaped with full-width at half-maximum equal to the gamma camera energy resolution. 5D-GFA was validated in a simultaneous pre-/post-synaptic dual isotope dynamic phantom study where 99mTc and 123I activities were used to model early Parkinson disease studies. 5D-GFA was also applied to simultaneous perfusion/dopamine transporter (DAT) dynamic SPECT in rhesus monkeys. In the striatal phantom, 5D-GFA yielded significantly more accurate and precise estimates of both primary 99mTc (bias = 6.4% ± 4.3%) and 123I (-1.7% ± 6.9%) time activity curves (TAC) compared to conventional FADS (biases = 15.5% ± 10.6% in 99mTc and 8.3% ± 12.7% in 123I, p<0.05). Our technique was also validated in two primate dynamic dual isotope perfusion/DAT transporter studies. Biases of 99mTc-HMPAO and 123I-DAT activity estimates with respect to estimates obtained in the presence of only one radionuclide (sequential imaging) were significantly lower with 5D-GFA (9.4% ± 4.3% for 99mTc-HMPAO and 8.7% ± 4.1% for 123I-DAT) compared to biases greater than 15% for volumes of interest (VOI) over the reconstructed volumes (p<0.05). 5D-GFA is a novel and promising approach in dynamic SPECT imaging that can also be used in other modalities. It allows accurate and

  20. Multiplication factor versus regression analysis in stature estimation from hand and foot dimensions.

    PubMed

    Krishan, Kewal; Kanchan, Tanuj; Sharma, Abhilasha

    2012-05-01

    Estimation of stature is an important step in the identification of human remains in forensic examinations. The present study aims to compare the reliability and accuracy of stature estimation, and to demonstrate the variability between estimated and actual stature, using the multiplication factor and regression analysis methods. The study is based on a sample of 246 subjects (123 males and 123 females) from North India aged between 17 and 20 years. Four anthropometric measurements (hand length, hand breadth, foot length and foot breadth), taken on the left side in each subject, were included in the study. Stature was measured using standard anthropometric techniques. Multiplication factors were calculated and linear regression models were derived for estimation of stature from hand and foot dimensions. The derived multiplication factors and regression formulae were applied to the hand and foot measurements in the study sample. The stature estimated from the multiplication factors and from regression analysis was compared with the actual stature to find the error in estimated stature. The results indicate that the range of error in stature estimation by the regression analysis method is smaller than that of the multiplication factor method, confirming that regression analysis is better than the multiplication factor method for stature estimation. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
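
    The comparison can be mimicked on simulated data: a single multiplication factor (mean stature divided by mean foot length) is applied alongside a fitted linear regression, and the two error ranges are compared. All numbers below are invented for illustration and are not the study's measurements.

```python
# Multiplication factor versus simple linear regression on simulated stature data.
import numpy as np

rng = np.random.default_rng(9)
n = 246
foot_length = rng.normal(24.5, 1.5, n)                       # cm
stature = 70.0 + 4.0 * foot_length + rng.normal(0, 3.0, n)   # cm

mf = stature.mean() / foot_length.mean()                     # multiplication factor
est_mf = mf * foot_length

slope, intercept = np.polyfit(foot_length, stature, 1)       # simple linear regression
est_reg = intercept + slope * foot_length

err_mf, err_reg = stature - est_mf, stature - est_reg
print("MF error range: ", round(err_mf.min(), 1), "to", round(err_mf.max(), 1))
print("Reg error range:", round(err_reg.min(), 1), "to", round(err_reg.max(), 1))
# The regression errors typically span a narrower range, as the study reports.
```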

  1. Factor analysis of the contextual fine motor questionnaire in children.

    PubMed

    Lin, Chin-Kai; Meng, Ling-Fu; Yu, Ya-Wen; Chen, Che-Kuo; Li, Kuan-Hua

    2014-02-01

    Most studies treat fine motor skills as a single subscale within a developmental test; hence, further factor analysis of fine motor skills has not been conducted. In fact, fine motor function has been treated as a multi-dimensional domain from both clinical and theoretical perspectives, and therefore knowing its factors would be valuable. The aim of this study is to analyze the internal consistency and factor validity of the Contextual Fine Motor Questionnaire (CFMQ). Based on ecological observation and the literature, the CFMQ was developed and includes 5 subscales: Pen Control, Tool Use During Handicraft Activities, the Use of Dining Utensils, Connecting and Separating during Dressing and Undressing, and Opening Containers. The main purpose of this study is to establish the factorial validity of the CFMQ through factor analysis. Among 1208 questionnaires, 904 were successfully completed. Data from the children's CFMQ submitted by primary care providers were analyzed, covering 485 females (53.6%) and 419 males (46.4%) from grades 1 to 5, ranging in age from 82 to 167 months (M=113.9, SD=16.3). Cronbach's alpha was used to measure internal consistency, and exploratory factor analysis was applied to test the five-factor structure of the CFMQ. Results showed that Cronbach's alpha coefficients for the 5 subscales ranged from .77 to .92, and all item-total correlations with the corresponding subscales were larger than .4, except for one item. The factor loadings of almost all items on their assigned factor were larger than .5, with three exceptions. There were five factors, explaining a total of 62.59% of the variance in the CFMQ. In conclusion, the remaining 24 items in the 5 subscales of the CFMQ had appropriate internal consistency, test-retest reliability and construct validity. Copyright © 2013 Elsevier Ltd. All rights reserved.
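
    The internal-consistency step uses the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The sketch below computes it for one simulated 5-item subscale, not the CFMQ items themselves.

```python
# Cronbach's alpha for one simulated subscale (respondents x items).
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(10)
ability = rng.normal(size=300)
subscale = np.clip(np.round(3 + ability[:, None] + rng.normal(0, 0.8, (300, 5))), 1, 5)
print(round(cronbach_alpha(subscale), 2))            # comparable to the .77-.92 range above
```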

  2. A New, More Powerful Approach to Multitrait-Multimethod Analyses: An Application of Second-Order Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Marsh, Herbert W.; Hocevar, Dennis

    The advantages of applying confirmatory factor analysis (CFA) to multitrait-multimethod (MTMM) data are widely recognized. However, because CFA as traditionally applied to MTMM data incorporates single indicators of each scale (i.e., each trait/method combination), important weaknesses are the failure to: (1) correct appropriately for measurement…

  3. A practical guide to propensity score analysis for applied clinical research.

    PubMed

    Lee, Jaehoon; Little, Todd D

    2017-11-01

    Observational studies are often the only viable options in many clinical settings, especially when it is unethical or infeasible to randomly assign participants to different treatment régimes. In such cases, propensity score (PS) analysis can be applied to account for possible selection bias and thereby address questions of causal inference. Many PS methods exist, yet few guidelines are available to aid applied researchers in their conduct and evaluation of a PS analysis. In this article we give an overview of available techniques for PS estimation and application, balance diagnostics, treatment effect estimation, and sensitivity assessment, as well as recent advances. We also offer a tutorial that can be used to emulate the steps of PS analysis. Our goal is to provide information that will bring PS analysis within the reach of applied clinical researchers and practitioners. Copyright © 2017 Elsevier Ltd. All rights reserved.
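
    One common PS workflow, inverse-probability-of-treatment weighting, is sketched below on simulated data. The article reviews several alternatives (matching, stratification, covariate adjustment); this is not its tutorial code, and the data-generating model is an assumption.

```python
# Propensity-score sketch: estimate PS with logistic regression, then use
# inverse-probability-of-treatment weighting (IPTW) to recover the treatment effect.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n = 2000
x1, x2 = rng.normal(size=n), rng.normal(size=n)              # confounders
p_treat = 1 / (1 + np.exp(-(0.8 * x1 - 0.5 * x2)))           # treatment depends on confounders
treated = rng.binomial(1, p_treat)
outcome = 2.0 * treated + 1.5 * x1 + 1.0 * x2 + rng.normal(size=n)   # true effect = 2.0

X = np.column_stack([x1, x2])
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
w = np.where(treated == 1, 1 / ps, 1 / (1 - ps))             # IPTW weights

ate = (np.average(outcome[treated == 1], weights=w[treated == 1])
       - np.average(outcome[treated == 0], weights=w[treated == 0]))
naive = outcome[treated == 1].mean() - outcome[treated == 0].mean()
print(f"naive difference = {naive:.2f}, IPTW estimate = {ate:.2f} (true effect 2.0)")
```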

  4. Nontidal Loading Applied in VLBI Geodetic Analysis

    NASA Astrophysics Data System (ADS)

    MacMillan, D. S.

    2015-12-01

    We investigate the application of nontidal atmospheric pressure, hydrology, and ocean loading series in the analysis of VLBI data. The annual amplitude of the VLBI scale variation is reduced to less than 0.1 ppb, a result of the annual components of the vertical loading series. VLBI site vertical scatter and baseline length scatter are reduced when these loading models are applied. We operate nontidal loading services for hydrology loading (GLDAS model), atmospheric pressure loading (NCEP), and nontidal ocean loading (JPL ECCO model). As an alternative validation, we compare these loading series with corresponding series generated by other analysis centers.

  5. Factors associated with participation frequency and satisfaction among people applying for a housing adaptation grant.

    PubMed

    Thordardottir, Björg; Ekstam, Lisa; Chiatti, Carlos; Fänge, Agneta Malmgren

    2016-09-01

    People applying for a housing adaptation (HA) grant are at great risk of participation restrictions due to declining capacity and environmental barriers. To investigate the association of person-, environment-, and activity-related factors with participation frequency and satisfaction among people applying for a housing adaptation grant. Baseline cross-sectional data were collected during home visits (n = 128). The association between person-, environment-, and activity-related factors and participation frequency and satisfaction was analysed using logistic regressions. The main result is that frequency of participation outside the home is strongly associated with dependence in activities of daily living (ADL) and cognitive impairments, while satisfaction with participation outside the home is strongly associated with self-reported health. Moreover, aspects of usability in the home were associated with frequency of participation outside the home and satisfaction with participation in the home and outside the home alone. Dependence in ADL, cognitive impairments, self-rated health, and aspects of usability are important factors contributing to participation frequency and satisfaction among people applying for a housing adaptation grant, particularly outside the home. Our findings indicate that more attention should be directed towards activity-related factors to facilitate participation among HA applicants, inside and outside the home.

  6. A Transfer Learning Approach for Applying Matrix Factorization to Small ITS Datasets

    ERIC Educational Resources Information Center

    Voß, Lydia; Schatten, Carlotta; Mazziotti, Claudia; Schmidt-Thieme, Lars

    2015-01-01

    Machine Learning methods for Performance Prediction in Intelligent Tutoring Systems (ITS) have proven their efficacy; specific methods, e.g. Matrix Factorization (MF), however suffer from the lack of available information about new tasks or new students. In this paper we show how this problem could be solved by applying Transfer Learning (TL),…

  7. Applied spectrophotometry: analysis of a biochemical mixture.

    PubMed

    Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is, however, a significant difference between determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) and determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience and also to fill a hole that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three-week laboratory experience designed so that students learn to: connect laboratory practice with theory, apply the Beer-Lambert-Bouguer Law to biochemical analyses, demonstrate the utility and limitations of example quantitative colorimetric assays, demonstrate the utility and limitations of UV analyses for biomolecules, develop strategies for analysis of a solution of unknown biomolecular composition, use digital micropipettors to make accurate and precise measurements, and apply graphing software. Copyright © 2013 Wiley Periodicals, Inc.
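
    A worked example of the underlying calculation: for a mixture, the absorbance at each wavelength is the path length times the sum over components of molar absorptivity times concentration, so the concentrations follow from a small least-squares solve. The absorptivities and concentrations below are illustrative, not assay constants.

```python
# Two-component mixture analysis with the Beer-Lambert-Bouguer law:
# A(lambda) = l * sum_i eps_i(lambda) * c_i, solved for the c_i.
import numpy as np

# rows: wavelengths, columns: components (molar absorptivity, L mol^-1 cm^-1)
eps = np.array([[15000.0,  3000.0],
                [ 2000.0, 11000.0]])
path_cm = 1.0
true_c = np.array([2.0e-5, 5.0e-5])                  # mol/L
A = path_cm * eps @ true_c                           # simulated absorbance readings

c_est, *_ = np.linalg.lstsq(path_cm * eps, A, rcond=None)
print(c_est)                                         # recovers ~[2e-5, 5e-5]
```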

  8. Applied Drama and the Higher Education Learning Spaces: A Reflective Analysis

    ERIC Educational Resources Information Center

    Moyo, Cletus

    2015-01-01

    This paper explores Applied Drama as a teaching approach in Higher Education learning spaces. The exploration takes a reflective analysis approach by first examining the impact that Applied Drama has had on my career as a Lecturer/Educator/Teacher working in Higher Education environments. My engagement with Applied Drama practice and theory is…

  9. Global analysis of bacterial transcription factors to predict cellular target processes.

    PubMed

    Doerks, Tobias; Andrade, Miguel A; Lathe, Warren; von Mering, Christian; Bork, Peer

    2004-03-01

    Whole-genome sequences are now available for >100 bacterial species, giving unprecedented power to comparative genomics approaches. We have applied genome-context methods to predict target processes that are regulated by transcription factors (TFs). Of 128 orthologous groups of proteins annotated as TFs, to date, 36 are functionally uncharacterized; in our analysis we predict a probable cellular target process or biochemical pathway for half of these functionally uncharacterized TFs.

  10. Human Factors Process Task Analysis: Liquid Oxygen Pump Acceptance Test Procedure at the Advanced Technology Development Center

    NASA Technical Reports Server (NTRS)

    Diorio, Kimberly A.; Voska, Ned (Technical Monitor)

    2002-01-01

    This viewgraph presentation provides information on Human Factors Process Failure Modes and Effects Analysis (HF PFMEA). HF PFMEA includes the following 10 steps: Describe mission; Define System; Identify human-machine; List human actions; Identify potential errors; Identify factors that affect error; Determine likelihood of error; Determine potential effects of errors; Evaluate risk; Generate solutions (manage error). The presentation also describes how this analysis was applied to a liquid oxygen pump acceptance test.

  11. Pathway-based factor analysis of gene expression data produces highly heritable phenotypes that associate with age.

    PubMed

    Anand Brown, Andrew; Ding, Zhihao; Viñuela, Ana; Glass, Dan; Parts, Leopold; Spector, Tim; Winn, John; Durbin, Richard

    2015-03-09

    Statistical factor analysis methods have previously been used to remove noise components from high-dimensional data prior to genetic association mapping and, in a guided fashion, to summarize biologically relevant sources of variation. Here, we show how the derived factors summarizing pathway expression can be used to analyze the relationships between expression, heritability, and aging. We used skin gene expression data from 647 twins from the MuTHER Consortium and applied factor analysis to concisely summarize patterns of gene expression to remove broad confounding influences and to produce concise pathway-level phenotypes. We derived 930 "pathway phenotypes" that summarized patterns of variation across 186 KEGG pathways (five phenotypes per pathway). We identified 69 significant associations of age with phenotype from 57 distinct KEGG pathways at a stringent Bonferroni threshold (P < 5.38 × 10⁻⁵). These phenotypes are more heritable (h² = 0.32) than gene expression levels. On average, expression levels of 16% of genes within these pathways are associated with age. Several significant pathways relate to metabolizing sugars and fatty acids; others relate to insulin signaling. We have demonstrated that factor analysis methods combined with biological knowledge can produce more reliable phenotypes with less stochastic noise than the individual gene expression levels, which increases our power to discover biologically relevant associations. These phenotypes could also be applied to discover associations with other environmental factors. Copyright © 2015 Brown et al.
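
    The basic idea can be sketched with standard tools: factor-analyze the expression of one pathway's genes to obtain a few pathway phenotypes, then test each phenotype for association with age. The simulated data and settings below are assumptions and do not reproduce the MuTHER analysis.

```python
# Pathway-phenotype sketch: factor scores from one gene set, correlated with age.
import numpy as np
from scipy import stats
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(12)
n_subjects, n_genes = 647, 40
age = rng.uniform(20, 80, n_subjects)
signal = 0.02 * (age - age.mean())                           # shared age-related component
expr = signal[:, None] * rng.normal(1, 0.3, n_genes) + rng.normal(size=(n_subjects, n_genes))

phenos = FactorAnalysis(n_components=5, random_state=0).fit_transform(expr)
for j in range(phenos.shape[1]):
    r, p = stats.pearsonr(phenos[:, j], age)
    print(f"pathway phenotype {j}: r = {r:+.2f}, p = {p:.1e}")
```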

  12. Pathway-Based Factor Analysis of Gene Expression Data Produces Highly Heritable Phenotypes That Associate with Age

    PubMed Central

    Anand Brown, Andrew; Ding, Zhihao; Viñuela, Ana; Glass, Dan; Parts, Leopold; Spector, Tim; Winn, John; Durbin, Richard

    2015-01-01

    Statistical factor analysis methods have previously been used to remove noise components from high-dimensional data prior to genetic association mapping and, in a guided fashion, to summarize biologically relevant sources of variation. Here, we show how the derived factors summarizing pathway expression can be used to analyze the relationships between expression, heritability, and aging. We used skin gene expression data from 647 twins from the MuTHER Consortium and applied factor analysis to concisely summarize patterns of gene expression to remove broad confounding influences and to produce concise pathway-level phenotypes. We derived 930 “pathway phenotypes” that summarized patterns of variation across 186 KEGG pathways (five phenotypes per pathway). We identified 69 significant associations of age with phenotype from 57 distinct KEGG pathways at a stringent Bonferroni threshold (P < 5.38 × 10⁻⁵). These phenotypes are more heritable (h² = 0.32) than gene expression levels. On average, expression levels of 16% of genes within these pathways are associated with age. Several significant pathways relate to metabolizing sugars and fatty acids; others relate to insulin signaling. We have demonstrated that factor analysis methods combined with biological knowledge can produce more reliable phenotypes with less stochastic noise than the individual gene expression levels, which increases our power to discover biologically relevant associations. These phenotypes could also be applied to discover associations with other environmental factors. PMID:25758824

  13. Applied Behavior Analysis: Current Myths in Public Education

    ERIC Educational Resources Information Center

    Fielding, Cheryl; Lowdermilk, John; Lanier, Lauren L.; Fannin, Abigail G.; Schkade, Jennifer L.; Rose, Chad A.; Simpson, Cynthia G.

    2013-01-01

    The effective use of behavior management strategies and related policies continues to be a debated issue in public education. Despite overwhelming evidence espousing the benefits of the implementation of procedures derived from principles based on the science of applied behavior analysis (ABA), educators often indicate many common misconceptions…

  14. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2003-01-01

    TD64, the Applied Fluid Dynamics Analysis Group, is one of several groups with high-fidelity fluids design and analysis expertise in the Space Transportation Directorate at Marshall Space Flight Center (MSFC). TD64 assists personnel working on other programs. The group participates in projects in the following areas: turbomachinery activities, nozzle activities, combustion devices, and the Columbia accident investigation.

  15. Can Nursing Students Practice What Is Preached? Factors Impacting Graduating Nurses' Abilities and Achievement to Apply Evidence-Based Practices.

    PubMed

    Blackman, Ian R; Giles, Tracey M

    2017-04-01

    To meet national Australian nursing registration requisites, nurses must satisfy competency requirements for evidence-based practices (EBPs). A hypothetical model was formulated to explore factors that influenced Australian nursing students' ability and achievement to understand and employ EBPs related to health care provision. A nonexperimental, descriptive survey method was used to identify self-reported EBP efficacy estimates of 375 completing undergraduate nursing students. Factors influencing participants' self-rated EBP abilities were validated by Rasch analysis and then modeled using the partial least squares analysis (PLS Path) program. Graduating nursing students' ability to understand and apply EBPs for clinical improvement can be directly and indirectly predicted by eight variables, including their understanding of the analysis, critique, and synthesis of clinically based nursing research, their ability to communicate research to others, and whether they had actually witnessed other staff delivering EBP. This model accounts for 41% of the variance in the nursing students' self-rated EBP efficacy scores. Previous exposure to EBP studies facilitates participants' confidence with EBP, particularly with concurrent clinical EBP experiences. © 2017 Sigma Theta Tau International.

  16. Do review articles boost journal impact factors? A longitudinal analysis for five pharmacology journals.

    PubMed

    Amiri, Marjan; Michel, Martin C

    2018-06-21

    The impact factor is a frequently applied tool in research output analytics. Based on five consecutive publication years for each of five pharmacology journals, we analyzed the extent to which review articles yield more impact factor-relevant citations than original articles. Our analysis shows that review articles are quoted about twice as often as original articles published in the same year in the same journal. We conclude that inclusion of review articles does not substantially affect the impact factor of a journal unless they account for considerably more than 10% of all published articles.
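
    A back-of-the-envelope illustration of the conclusion above, with purely hypothetical numbers: if reviews are cited roughly twice as often as original articles and make up 10% of the content, the journal-level mean citation rate (the impact factor) rises by only about 10%.

```python
# Hypothetical worked example, not data from the study.
n_articles = 100          # citable items in the two-year window
share_reviews = 0.10      # reviews as a fraction of all articles
c_original = 2.0          # mean IF-relevant citations per original article
c_review = 2 * c_original # reviews quoted about twice as often

n_rev = share_reviews * n_articles
n_orig = n_articles - n_rev
impact_factor = (n_orig * c_original + n_rev * c_review) / n_articles
baseline = c_original     # impact factor if the journal published no reviews

print(f"IF with 10% reviews: {impact_factor:.2f}")                 # 2.20
print(f"IF without reviews:  {baseline:.2f}")                      # 2.00
print(f"Relative increase:   {(impact_factor / baseline - 1):.0%}")  # 10%
```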

  17. A Confirmatory Factor Analysis of the Student Evidence-Based Practice Questionnaire (S-EBPQ) in an Australian sample.

    PubMed

    Beccaria, Lisa; Beccaria, Gavin; McCosker, Catherine

    2018-03-01

    It is crucial that nursing students develop skills and confidence in using Evidence-Based Practice principles early in their education. This should be assessed with valid tools; however, to date, few measures have been developed and applied to the student population. To examine the structural validity of the Student Evidence-Based Practice Questionnaire (S-EBPQ) with an Australian online nursing student cohort, a cross-sectional study design was used to assess construct validity. Three hundred and forty-five undergraduate nursing students from an Australian regional university were recruited across two semesters. Confirmatory Factor Analysis was used to examine the structural validity and resulted in a good-fitting model based on a revised 20-item tool. The S-EBPQ tool remains a psychometrically robust measure of evidence-based practice use, attitudes, and knowledge and skills and can be applied in an online Australian student context. The findings of this study provided further evidence of the reliability and four-factor structure of the S-EBPQ. Opportunities for further refinement of the tool may result in improvements in structural validity. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  19. Analysis of significant factors for dengue fever incidence prediction.

    PubMed

    Siriyasatien, Padet; Phumee, Atchara; Ongruk, Phatsavee; Jampachaisri, Katechan; Kesorn, Kraisak

    2016-04-16

    Many popular dengue forecasting techniques have been used by several researchers to extrapolate dengue incidence rates, including the K-H model, support vector machines (SVM), and artificial neural networks (ANN). The time series analysis methodology, particularly ARIMA and SARIMA, has been increasingly applied to the field of epidemiological research for dengue fever, dengue hemorrhagic fever, and other infectious diseases. The main drawback of these methods is that they do not consider other variables that are associated with the dependent variable. Additionally, new factors correlated to the disease are needed to enhance the prediction accuracy of the model when it is applied to areas of similar climates, where weather factors such as temperature, total rainfall, and humidity are not substantially different. Such drawbacks may consequently lower the predictive power for the outbreak. The predictive power of the forecasting model, assessed by Akaike's information criterion (AIC), Bayesian information criterion (BIC), and the mean absolute percentage error (MAPE), is improved by including the new parameters for dengue outbreak prediction. This study's selected model outperforms all three other competing models with the lowest AIC, the lowest BIC, and a small MAPE value. The exclusive use of climate factors from similar locations decreases a model's prediction power. The multivariate Poisson regression, however, effectively forecasts even when climate variables are slightly different. Female mosquitoes and seasons were strongly correlated with dengue cases. Therefore, the dengue incidence trends provided by this model will assist the optimization of dengue prevention. The present work demonstrates the important roles of female mosquito infection rates from the previous season and climate factors (represented as seasons) in dengue outbreaks. Incorporating these two factors in the model significantly improves the predictive power of dengue hemorrhagic fever forecasting.
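
    A sketch of the kind of multivariate Poisson regression comparison described above, using statsmodels on synthetic monthly data; all variable names and coefficients are hypothetical stand-ins, not the study's data or exact model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 120                                            # ten years of monthly data (hypothetical)
df = pd.DataFrame({
    "rainfall": rng.gamma(2.0, 50.0, n),
    "temperature": rng.normal(28, 2, n),
    "humidity": rng.normal(75, 5, n),
    "female_infection_rate": rng.beta(2, 20, n),   # previous-season mosquito infection rate
    "season": rng.integers(0, 4, n),               # season coded numerically for brevity
})
lam = np.exp(0.5 + 3.0 * df["female_infection_rate"] + 0.1 * df["season"])
df["cases"] = rng.poisson(lam)                     # synthetic monthly case counts

climate_only = sm.GLM(df["cases"],
                      sm.add_constant(df[["rainfall", "temperature", "humidity"]]),
                      family=sm.families.Poisson()).fit()
augmented = sm.GLM(df["cases"],
                   sm.add_constant(df[["rainfall", "temperature", "humidity",
                                       "female_infection_rate", "season"]]),
                   family=sm.families.Poisson()).fit()

def mape(y, yhat):
    # mean absolute percentage error, guarding against zero counts
    return np.mean(np.abs((y - yhat) / np.maximum(y, 1))) * 100

for name, model in [("climate only", climate_only), ("with new factors", augmented)]:
    print(name, "AIC:", round(model.aic, 1),
          "MAPE:", round(mape(df["cases"], model.fittedvalues), 1))
```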

  20. Exploratory Bi-Factor Analysis: The Oblique Case

    ERIC Educational Resources Information Center

    Jennrich, Robert I.; Bentler, Peter M.

    2012-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford ("Psychometrika" 47:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler ("Psychometrika" 76:537-549, 2011) introduced an exploratory form of bi-factor…

  1. Tropospheric Delay Raytracing Applied in VLBI Analysis

    NASA Astrophysics Data System (ADS)

    MacMillan, D. S.; Eriksson, D.; Gipson, J. M.

    2013-12-01

    Tropospheric delay modeling error continues to be one of the largest sources of error in VLBI analysis. For standard operational solutions, we use the VMF1 elevation-dependent mapping functions derived from ECMWF data. These mapping functions assume that tropospheric delay at a site is azimuthally symmetric. As this assumption does not reflect reality, we have determined the raytrace delay along the signal path through the troposphere for each VLBI quasar observation. We determined the troposphere refractivity fields from the pressure, temperature, specific humidity and geopotential height fields of the NASA GSFC GEOS-5 numerical weather model. We discuss results from analysis of the CONT11 R&D and the weekly operational R1+R4 experiment sessions. When applied in VLBI analysis, baseline length repeatabilities were better for 66-72% of baselines with raytraced delays than with VMF1 mapping functions. Vertical repeatabilities were better for 65% of sites.

  2. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2002-01-01

    This viewgraph report presents an overview of activities and accomplishments of NASA's Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group. Expertise in this group focuses on high-fidelity fluids design and analysis with application to space shuttle propulsion and next generation launch technologies. Topics covered include: computational fluid dynamics research and goals, turbomachinery research and activities, nozzle research and activities, combustion devices, engine systems, MDA development and CFD process improvements.

  3. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    PubMed

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015. The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists.

  4. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  5. Research in progress in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  6. Applying causal mediation analysis to personality disorder research.

    PubMed

    Walters, Glenn D

    2018-01-01

    This article is designed to address fundamental issues in the application of causal mediation analysis to research on personality disorders. Causal mediation analysis is used to identify mechanisms of effect by testing variables as putative links between the independent and dependent variables. As such, it would appear to have relevance to personality disorder research. It is argued that proper implementation of causal mediation analysis requires that investigators take several factors into account. These factors are discussed under 5 headings: variable selection, model specification, significance evaluation, effect size estimation, and sensitivity testing. First, care must be taken when selecting the independent, dependent, mediator, and control variables for a mediation analysis. Some variables make better mediators than others and all variables should be based on reasonably reliable indicators. Second, the mediation model needs to be properly specified. This requires that the data for the analysis be prospectively or historically ordered and possess proper causal direction. Third, it is imperative that the significance of the identified pathways be established, preferably with a nonparametric bootstrap resampling approach. Fourth, effect size estimates should be computed or competing pathways compared. Finally, investigators employing the mediation method are advised to perform a sensitivity analysis. Additional topics covered in this article include parallel and serial multiple mediation designs, moderation, and the relationship between mediation and moderation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
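
    A compact sketch of the recommended nonparametric bootstrap test of an indirect effect, using ordinary least squares in statsmodels; X, M, and Y are hypothetical stand-ins for the independent, mediator, and dependent variables.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 300
X = rng.normal(size=n)                        # independent variable (placeholder)
M = 0.5 * X + rng.normal(size=n)              # putative mediator
Y = 0.4 * M + 0.1 * X + rng.normal(size=n)    # dependent variable

def indirect_effect(x, m, y):
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                          # X -> M path
    b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]    # M -> Y path, X controlled
    return a * b

boot = []
idx = np.arange(n)
for _ in range(2000):                          # nonparametric bootstrap resampling
    s = rng.choice(idx, size=n, replace=True)
    boot.append(indirect_effect(X[s], M[s], Y[s]))

lo, hi = np.percentile(boot, [2.5, 97.5])      # 95% percentile CI for the indirect effect a*b
print(f"indirect effect = {indirect_effect(X, M, Y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```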

  7. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2004-01-01

    This paper presents viewgraphs on NASA Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group Activities. The topics include: 1) Status of programs at MSFC; 2) Fluid Mechanics at MSFC; 3) Relevant Fluid Dynamics Activities at MSFC; and 4) Shuttle Return to Flight.

  8. Analysis of concrete beams using applied element method

    NASA Astrophysics Data System (ADS)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a displacement-based method of structural analysis. Some of its features are similar to those of the Finite Element Method (FEM). In AEM, the structure is analysed by dividing it into several elements, as in FEM, but the elements are connected by springs instead of nodes. In this paper, the background to AEM is discussed and the necessary equations are derived. To illustrate the application of AEM, it has been used to analyse a plain concrete beam with fixed supports. The analysis is limited to 2-dimensional structures. It was found that the number of springs does not have much influence on the results. AEM could predict deflections and reactions with a reasonable degree of accuracy.

  9. Defeat and entrapment: more than meets the eye? Applying network analysis to estimate dimensions of highly correlated constructs.

    PubMed

    Forkmann, Thomas; Teismann, Tobias; Stenzel, Jana-Sophie; Glaesmer, Heide; de Beurs, Derek

    2018-01-25

    Defeat and entrapment have been shown to be of central relevance to the development of different disorders. However, it remains unclear whether they represent two distinct constructs or one overall latent variable. One reason for this lack of clarity is that traditional factor analytic techniques have trouble estimating the right number of clusters in highly correlated data. In this study, we applied a novel approach based on network analysis that can deal with correlated data to establish whether defeat and entrapment are best thought of as one or multiple constructs. Exploratory graph analysis was used to estimate the number of dimensions within the 32 items that make up the defeat and entrapment scales in two samples: an online community sample of 480 participants, and a clinical sample of 147 inpatients admitted to a psychiatric hospital after a suicide attempt or severe suicidal crisis. Confirmatory factor analysis (CFA) was used to test whether the proposed structure fits the data. In both samples, bootstrapped exploratory graph analysis suggested that the defeat and entrapment items belonged to different dimensions. Within the entrapment items, two separate dimensions were detected, labelled internal and external entrapment. Defeat appeared to be multifaceted only in the online sample. When comparing the CFA outcomes of the one-, two-, three- and four-factor models, the one-factor model was preferred. Defeat and entrapment can be viewed as distinct, yet highly associated, constructs. Thus, although replication is needed, the results are in line with theories differentiating between these two constructs.

  10. Factor Analysis of Intern Effectiveness

    ERIC Educational Resources Information Center

    Womack, Sid T.; Hannah, Shellie Louise; Bell, Columbus David

    2012-01-01

    Four factors in teaching intern effectiveness, as measured by a Praxis III-similar instrument, were found among observational data of teaching interns during the 2010 spring semester. Those factors were lesson planning, teacher/student reflection, fairness & safe environment, and professionalism/efficacy. This factor analysis was as much of a…

  11. Confirmatory factor analysis for two questionnaires of caregiving in eating disorders

    PubMed Central

    Hibbs, Rebecca; Rhind, Charlotte; Sallis, Hannah; Goddard, Elizabeth; Raenker, Simone; Ayton, Agnes; Bamford, Bryony; Arcelus, Jon; Boughton, Nicky; Connan, Frances; Goss, Ken; Lazlo, Bert; Morgan, John; Moore, Kim; Robertson, David; Schreiber-Kounine, Christa; Sharma, Sonu; Whitehead, Linette; Lacey, Hubert; Schmidt, Ulrike; Treasure, Janet

    2014-01-01

    Objective: Caring for someone diagnosed with an eating disorder (ED) is associated with a high level of burden and psychological distress which can inadvertently contribute to the maintenance of the illness. The Eating Disorders Symptom Impact Scale (EDSIS) and Accommodation and Enabling Scale for Eating Disorders (AESED) are self-report scales to assess elements of caregiving theorised to contribute to the maintenance of an ED. Further validation and confirmation of the factor structures for these scales are necessary for rigorous evaluation of complex interventions which target these modifiable elements of caregiving. Method: EDSIS and AESED data from 268 carers of people with anorexia nervosa (AN), recruited from consecutive admissions to 15 UK inpatient or day patient hospital units, were subjected to confirmatory factor analysis to test model fit by applying the existing factor structures: (a) four-factor structure for the EDSIS and (b) five-factor structure for the AESED. Results: Confirmatory factor analytic results support the existing four-factor and five-factor structures for the EDSIS and the AESED, respectively. Discussion: The present findings provide further validation of the EDSIS and the AESED as tools to assess modifiable elements of caregiving for someone with an ED. PMID:25750785

  12. International publication trends in the Journal of Applied Behavior Analysis: 2000-2014.

    PubMed

    Martin, Neil T; Nosik, Melissa R; Carr, James E

    2016-06-01

    Dymond, Clarke, Dunlap, and Steiner's (2000) analysis of international publication trends in the Journal of Applied Behavior Analysis (JABA) from 1970 to 1999 revealed low numbers of publications from outside North America, leading the authors to express concern about the lack of international involvement in applied behavior analysis. They suggested that a future review would be necessary to evaluate any changes in international authorship in the journal. As a follow-up, we analyzed non-U.S. publication trends in the most recent 15 years of JABA and found similar results. We discuss potential reasons for the relative paucity of international authors and suggest potential strategies for increasing non-U.S. contributions to the advancement of behavior analysis. © 2015 Society for the Experimental Analysis of Behavior.

  13. A meta-analysis of factors affecting trust in human-robot interaction.

    PubMed

    Hancock, Peter A; Billings, Deborah R; Schaefer, Kristin E; Chen, Jessie Y C; de Visser, Ewart J; Parasuraman, Raja

    2011-10-01

    We evaluate and quantify the effects of human, robot, and environmental factors on perceived trust in human-robot interaction (HRI). To date, reviews of trust in HRI have been qualitative or descriptive. Our quantitative review provides a fundamental empirical foundation to advance both theory and practice. Meta-analytic methods were applied to the available literature on trust and HRI. A total of 29 empirical studies were collected, of which 10 met the selection criteria for correlational analysis and 11 for experimental analysis. These studies provided 69 correlational and 47 experimental effect sizes. The overall correlational effect size for trust was r = +0.26, with an experimental effect size of d = +0.71. The effects of human, robot, and environmental characteristics were examined, with particular evaluation of the robot dimensions of performance and attribute-based factors. The robot performance and attributes were the largest contributors to the development of trust in HRI. Environmental factors played only a moderate role. Factors related to the robot itself, specifically its performance, had the greatest current association with trust, and environmental factors were moderately associated. There was little evidence for effects of human-related factors. The findings provide quantitative estimates of human, robot, and environmental factors influencing HRI trust. Specifically, the current summary provides effect size estimates that are useful in establishing design and training guidelines with reference to robot-related factors of HRI trust. Furthermore, results indicate that improper trust calibration may be mitigated by the manipulation of robot design. However, many future research needs are identified.
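
    A small sketch of how a pooled correlational effect size such as r = +0.26 can be computed, assuming hypothetical per-study correlations and sample sizes and a simple fixed-effect Fisher-z pooling; the meta-analytic model actually used by the authors may differ.

```python
import numpy as np

# Hypothetical study-level correlations between a factor and trust, with sample sizes.
r = np.array([0.10, 0.35, 0.22, 0.40, 0.18])
n = np.array([40, 60, 55, 30, 80])

z = np.arctanh(r)                  # Fisher r-to-z transform
w = n - 3                          # inverse of Var(z) = 1/(n-3) under a fixed-effect model
z_bar = np.sum(w * z) / np.sum(w)  # inverse-variance weighted mean in the z metric
se = np.sqrt(1 / np.sum(w))

r_bar = np.tanh(z_bar)             # back-transform to the correlation metric
ci = np.tanh([z_bar - 1.96 * se, z_bar + 1.96 * se])
print(f"pooled r = {r_bar:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```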

  14. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Wang, Tee-See; Griffin, Lisa; Turner, James E. (Technical Monitor)

    2001-01-01

    This document is a presentation graphic which reviews the activities of the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center (i.e., Code TD64). The work of this group focused on supporting the space transportation programs. The group's work centers on Computational Fluid Dynamics tool development, which is driven by hardware design needs. The major applications for the design and analysis tools are: turbines, pumps, propulsion-to-airframe integration, and combustion devices.

  15. Robust Bayesian Factor Analysis

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Yuan, Ke-Hai

    2003-01-01

    Bayesian factor analysis (BFA) assumes the normal distribution of the current sample conditional on the parameters. Practical data in social and behavioral sciences typically have significant skewness and kurtosis. If the normality assumption is not attainable, the posterior analysis will be inaccurate, although the BFA depends less on the current…

  16. Tropospheric delay ray tracing applied in VLBI analysis

    NASA Astrophysics Data System (ADS)

    Eriksson, David; MacMillan, D. S.; Gipson, John M.

    2014-12-01

    Tropospheric delay modeling error continues to be one of the largest sources of error in VLBI (very long baseline interferometry) analysis. For standard operational solutions, we use the VMF1 elevation-dependent mapping functions derived from European Centre for Medium-Range Weather Forecasts data. These mapping functions assume that tropospheric delay at a site is azimuthally symmetric. As this assumption is not true, we have instead determined the ray trace delay along the signal path through the troposphere for each VLBI quasar observation. We determined the troposphere refractivity fields from the pressure, temperature, specific humidity, and geopotential height fields of the NASA Goddard Space Flight Center Goddard Earth Observing System version 5 numerical weather model. When applied in VLBI analysis, baseline length repeatabilities were improved compared with using the VMF1 mapping function model for 72% of the baselines and site vertical repeatabilities were better for 11 of 13 sites during the 2 week CONT11 observing period in September 2011. When applied to a larger data set (2011-2013), we see a similar improvement in baseline length and also in site position repeatabilities for about two thirds of the stations in each of the site topocentric components.

  17. Positive Behavior Support and Applied Behavior Analysis

    PubMed Central

    Johnston, J.M; Foxx, Richard M; Jacobson, John W; Green, Gina; Mulick, James A

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We also consider the features of PBS that have facilitated its broad dissemination and how ABA might benefit from emulating certain practices of the PBS movement. PMID:22478452

  18. A computational intelligent approach to multi-factor analysis of violent crime information system

    NASA Astrophysics Data System (ADS)

    Liu, Hongbo; Yang, Chao; Zhang, Meng; McLoone, Seán; Sun, Yeqing

    2017-02-01

    Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. Relationships between pairs of factors have also been extensively studied, including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour and as such there is a need to have a greater level of insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). The identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.

  19. Psychosocial risk and protective factors for depression in the dialysis population: a systematic review and meta-regression analysis.

    PubMed

    Chan, Ramony; Steel, Zachary; Brooks, Robert; Heung, Tracy; Erlich, Jonathan; Chow, Josephine; Suranyi, Michael

    2011-11-01

    Research into the association between psychosocial factors and depression in End-Stage Renal Disease (ESRD) has expanded considerably in recent years, identifying a range of factors that may act as important risk and protective factors of depression for this population. The present study provides the first systematic review and meta-analysis of this body of research. Published studies reporting associations between any psychosocial factor and depression were identified and retrieved from Medline, Embase, and PsycINFO, by applying optimised search strategies. Mean effect sizes were calculated for the associations across five psychosocial constructs (social support, personality attributes, cognitive appraisal, coping process, stress/stressor). Multiple hierarchical meta-regression analysis was applied to examine the moderating effects of methodological and substantive factors on the strength of the observed associations. 57 studies covering 58 independent samples with 5956 participants were identified, resulting in 246 effect sizes of the association between a range of psychosocial factors and depression. The overall mean effect size (Pearson's correlation coefficient) of the association between psychosocial factors and depression was 0.36. The effect sizes between the five psychosocial constructs and depression ranged from medium (0.27) to large (0.46), with personality attributes (0.46) and cognitive appraisal (0.46) having the largest effect sizes. In the meta-regression analyses, identified demographic (gender, age, location of study) and treatment (type of dialysis) characteristics moderated the strength of the associations with depression. The current analysis documents a moderate to large association between the presence of psychosocial risk factors and depression in ESRD. Copyright © 2011. Published by Elsevier Inc. All rights reserved.

  20. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  1. An Error Analysis for the Finite Element Method Applied to Convection Diffusion Problems.

    DTIC Science & Technology

    1981-03-01

    Technical Note BN-962, "An Error Analysis for the Finite Element Method Applied to Convection Diffusion Problems," by I. Babuška and W. G. Szymczak, Maryland Univ., College Park, Inst. for Physical Science, March 1981 (DTIC accession AD-A098 895).

  2. Troposphere Delay Raytracing Applied in VLBI Analysis

    NASA Astrophysics Data System (ADS)

    Eriksson, David; MacMillan, Daniel; Gipson, John

    2014-12-01

    Tropospheric delay modeling error is one of the largest sources of error in VLBI analysis. For standard operational solutions, we use the VMF1 elevation-dependent mapping functions derived from European Centre for Medium-Range Weather Forecasts (ECMWF) data. These mapping functions assume that tropospheric delay at a site is azimuthally symmetric. As this assumption does not reflect reality, we have instead determined the raytrace delay along the signal path through the three-dimensional troposphere refractivity field for each VLBI quasar observation. We calculated the troposphere refractivity fields from the pressure, temperature, specific humidity, and geopotential height fields of the NASA GSFC GEOS-5 numerical weather model. We discuss results using raytrace delay in the analysis of the CONT11 R&D sessions. When applied in VLBI analysis, baseline length repeatabilities were better for 70% of baselines with raytraced delays than with VMF1 mapping functions. Vertical repeatabilities were better for 2/3 of all stations. The reference frame scale bias error was 0.02 ppb for raytracing versus 0.08 ppb and 0.06 ppb for VMF1 and NMF, respectively.

  3. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
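
    A toy illustration of the probabilistic idea, not the SPACE model itself: uncertainty in a few hypothetical inputs is propagated through a simplified power-capability expression by Monte Carlo sampling, and the resulting spread is reported instead of a single deterministic value.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical input uncertainties for a simplified solar-array power model.
solar_flux = rng.normal(1361.0, 5.0, n)       # W/m^2
array_area = rng.normal(375.0, 2.0, n)        # m^2
efficiency = rng.normal(0.14, 0.005, n)       # cell efficiency
degradation = rng.uniform(0.90, 0.98, n)      # degradation factor

power_kw = solar_flux * array_area * efficiency * degradation / 1000.0

p5, p95 = np.percentile(power_kw, [5, 95])
print(f"mean capability {power_kw.mean():.1f} kW, 90% interval [{p5:.1f}, {p95:.1f}] kW")
```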

  4. Graph theory applied to noise and vibration control in statistical energy analysis models.

    PubMed

    Guasch, Oriol; Cortés, Lluís

    2009-06-01

    A fundamental aspect of noise and vibration control in statistical energy analysis (SEA) models consists in first identifying and then reducing the energy flow paths between subsystems. In this work, it is proposed to make use of some results from graph theory to address both issues. On the one hand, linear and path algebras applied to adjacency matrices of SEA graphs are used to determine the existence of any order paths between subsystems, counting and labeling them, finding extremal paths, or determining the power flow contributions from groups of paths. On the other hand, a strategy is presented that makes use of graph cut algorithms to reduce the energy flow from a source subsystem to a receiver one, modifying as few internal and coupling loss factors as possible.
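
    A brief sketch of the adjacency-matrix bookkeeping described above, on a hypothetical five-subsystem SEA connectivity graph: powers of the adjacency matrix count the transmission walks of a given order between a source and a receiver subsystem (simple paths can be enumerated with a graph library if repeated subsystems must be excluded).

```python
import numpy as np

# Hypothetical SEA connectivity graph: A[i, j] = 1 if subsystems i and j are directly coupled.
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 0, 1],
    [0, 1, 0, 0, 1],
    [0, 0, 1, 1, 0],
])

source, receiver = 0, 4
for order in range(1, 5):
    # (A^n)[source, receiver] counts transmission walks of order n between the two subsystems.
    walks = np.linalg.matrix_power(A, order)[source, receiver]
    print(f"order {order}: {walks} walk(s) from subsystem {source} to subsystem {receiver}")
```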

  5. The Infinitesimal Jackknife with Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Preacher, Kristopher J.; Jennrich, Robert I.

    2012-01-01

    The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both…

  6. Factor Retention in Exploratory Factor Analysis: A Comparison of Alternative Methods.

    ERIC Educational Resources Information Center

    Mumford, Karen R.; Ferron, John M.; Hines, Constance V.; Hogarty, Kristine Y.; Kromrey, Jeffery D.

    This study compared the effectiveness of 10 methods of determining the number of factors to retain in exploratory common factor analysis. The 10 methods included the Kaiser rule and a modified Kaiser criterion, 3 variations of parallel analysis, 4 regression-based variations of the scree procedure, and the minimum average partial procedure. The…
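
    One of the retention rules compared in such studies, Horn's parallel analysis, is easy to sketch with NumPy (the data below are random placeholders with a planted two-factor structure): observed eigenvalues are retained while they exceed the mean eigenvalues obtained from random data of the same size.

```python
import numpy as np

def parallel_analysis(data, n_sims=100, seed=0):
    """Horn's parallel analysis: retain factors whose observed correlation-matrix
    eigenvalues exceed the mean eigenvalues from random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.zeros((n_sims, p))
    for i in range(n_sims):
        sim = rng.normal(size=(n, p))                       # uncorrelated comparison data
        rand[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
    threshold = rand.mean(axis=0)
    n_keep = 0
    for o, t in zip(obs, threshold):                        # stop at the first non-exceeding eigenvalue
        if o > t:
            n_keep += 1
        else:
            break
    return n_keep

# Placeholder data with an underlying two-factor structure.
rng = np.random.default_rng(1)
scores = rng.normal(size=(300, 2))
loadings = rng.normal(size=(2, 10))
X = scores @ loadings + rng.normal(size=(300, 10))
print("factors to retain:", parallel_analysis(X))
```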

  7. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    PubMed

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principle component analysis, with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules extraction, rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.

  8. Beyond Time Out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2010-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  9. Beyond Time out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2012-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  10. Conversation Analysis and Applied Linguistics.

    ERIC Educational Resources Information Center

    Schegloff, Emanuel A.; Koshik, Irene; Jacoby, Sally; Olsher, David

    2002-01-01

    Offers bibliographical guidance on several major areas of conversation-analytic work--turn-taking, repair, and word selection--and indicates past or potential points of contact with applied linguistics. Also discusses areas of applied linguistic work. (Author/VWL)

  11. Application of factor analysis of infrared spectra for quantitative determination of beta-tricalcium phosphate in calcium hydroxylapatite.

    PubMed

    Arsenyev, P A; Trezvov, V V; Saratovskaya, N V

    1997-01-01

    This work presents a method that allows the phase composition of calcium hydroxylapatite to be determined from its infrared spectrum. The method uses factor analysis of the spectral data of a calibration set of samples to determine the minimal number of factors required to reproduce the spectra within experimental error. Multiple linear regression is applied to establish a correlation between the factor scores of the calibration standards and their properties. The regression equations can be used to predict the property value of an unknown sample. The regression model was built for determination of beta-tricalcium phosphate content in hydroxylapatite, and the statistical quality of the model was estimated. Applying factor analysis to the spectral data increases the accuracy of beta-tricalcium phosphate determination and extends the range of determination towards lower concentrations. Reproducibility of results is retained.
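
    A rough sketch of the factor-scores-plus-regression workflow the abstract describes, with synthetic spectra standing in for the calibration set: principal-component scores of the IR spectra are regressed on the known beta-tricalcium phosphate content, and the fitted model predicts an unknown sample.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
n_std, n_points = 20, 400                        # calibration standards x spectral points (synthetic)
beta_tcp = rng.uniform(0, 15, n_std)             # known beta-TCP content, % w/w
band = np.exp(-0.5 * ((np.arange(n_points) - 250) / 10.0) ** 2)   # synthetic absorption band
spectra = np.outer(beta_tcp, band) + rng.normal(scale=0.05, size=(n_std, n_points))

# Factor scores of the calibration spectra feed a multiple linear regression.
model = make_pipeline(PCA(n_components=3), LinearRegression())
model.fit(spectra, beta_tcp)

unknown = 7.5 * band + rng.normal(scale=0.05, size=n_points)
pred = model.predict(unknown[None, :])[0]
print(f"predicted beta-TCP content: {pred:.1f}% w/w")
```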

  12. Multicriteria decision analysis applied to Glen Canyon Dam

    USGS Publications Warehouse

    Flug, M.; Seitz, H.L.H.; Scott, J.F.

    2000-01-01

    Conflicts in water resources exist because river-reservoir systems are managed to optimize traditional benefits (e.g., hydropower and flood control), which are historically quantified in economic terms, whereas natural and environmental resources, including in-stream and riparian resources, are more difficult or impossible to quantify in economic terms. Multicriteria decision analysis provides a quantitative approach to evaluate resources subject to river basin management alternatives. This objective quantification method includes inputs from special interest groups, the general public, and concerned individuals, as well as professionals for each resource considered in a trade-off analysis. Multicriteria decision analysis is applied to resources and flow alternatives presented in the environmental impact statement for Glen Canyon Dam on the Colorado River. A numeric rating and priority-weighting scheme is used to evaluate 29 specific natural resource attributes, grouped into seven main resource objectives, for nine flow alternatives enumerated in the environmental impact statement.
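
    A stripped-down numeric rating and priority-weighting scheme of the kind described above; the alternatives, objectives, ratings, and weights are illustrative only, not those of the Glen Canyon Dam environmental impact statement.

```python
import numpy as np

alternatives = ["low steady flow", "moderate fluctuating flow", "high fluctuating flow"]
objectives = ["hydropower", "sediment", "fish habitat", "riparian vegetation", "recreation"]

# Ratings of each alternative on each objective (rows x columns), on a 1-10 scale; illustrative.
ratings = np.array([
    [3, 8, 9, 7, 6],
    [6, 6, 6, 6, 7],
    [9, 3, 2, 4, 5],
])
weights = np.array([0.25, 0.20, 0.25, 0.15, 0.15])   # stakeholder priority weights, summing to 1

scores = ratings @ weights                           # weighted-sum score per alternative
for alt, s in sorted(zip(alternatives, scores), key=lambda t: -t[1]):
    print(f"{alt:28s} {s:.2f}")
```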

  13. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  14. Automated SEM Modal Analysis Applied to the Diogenites

    NASA Technical Reports Server (NTRS)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  15. Different approaches in Partial Least Squares and Artificial Neural Network models applied for the analysis of a ternary mixture of Amlodipine, Valsartan and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Darwish, Hany W.; Hassan, Said A.; Salem, Maissa Y.; El-Zeany, Badr A.

    2014-03-01

    Different chemometric models were applied for the quantitative analysis of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in ternary mixture, namely, Partial Least Squares (PLS) as traditional chemometric model and Artificial Neural Networks (ANN) as advanced model. PLS and ANN were applied with and without variable selection procedure (Genetic Algorithm GA) and data compression procedure (Principal Component Analysis PCA). The chemometric methods applied are PLS-1, GA-PLS, ANN, GA-ANN and PCA-ANN. The methods were used for the quantitative analysis of the drugs in raw materials and pharmaceutical dosage form via handling the UV spectral data. A 3-factor 5-level experimental design was established resulting in 25 mixtures containing different ratios of the drugs. Fifteen mixtures were used as a calibration set and the other ten mixtures were used as validation set to validate the prediction ability of the suggested methods. The validity of the proposed methods was assessed using the standard addition technique.
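
    A condensed sketch of the two modelling families compared in the abstract, applied to synthetic three-component UV spectra with scikit-learn; the mixture design, spectra, and model settings are placeholders rather than the study's calibration data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n_mix, n_wavelengths = 25, 200
C = rng.uniform(2, 10, size=(n_mix, 3))                    # concentrations of the three drugs
pure = rng.random((3, n_wavelengths))                      # synthetic pure-component spectra
A = C @ pure + rng.normal(scale=0.01, size=(n_mix, n_wavelengths))  # additive mixture spectra

A_cal, A_val, C_cal, C_val = train_test_split(A, C, test_size=10, random_state=0)

pls = PLSRegression(n_components=3).fit(A_cal, C_cal)                         # traditional model
ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,                   # "advanced" model
                   random_state=0).fit(A_cal, C_cal)

for name, model in [("PLS", pls), ("ANN", ann)]:
    rmsep = np.sqrt(np.mean((model.predict(A_val) - C_val) ** 2))
    print(f"{name} RMSEP: {rmsep:.3f}")
```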

  16. Analysis of risk factors for central venous port failure in cancer patients

    PubMed Central

    Hsieh, Ching-Chuan; Weng, Hsu-Huei; Huang, Wen-Shih; Wang, Wen-Ke; Kao, Chiung-Lun; Lu, Ming-Shian; Wang, Chia-Siu

    2009-01-01

    AIM: To analyze the risk factors for central port failure in cancer patients administered chemotherapy, using univariate and multivariate analyses. METHODS: A total of 1348 totally implantable venous access devices (TIVADs) were implanted into 1280 cancer patients in this cohort study. A Cox proportional hazard model was applied to analyze risk factors for failure of TIVADs. Log-rank test was used to compare actuarial survival rates. Infection, thrombosis, and surgical complication rates (χ2 test or Fisher’s exact test) were compared in relation to the risk factors. RESULTS: Increasing age, male gender and open-ended catheter use were significant risk factors reducing survival of TIVADs as determined by univariate and multivariate analyses. Hematogenous malignancy decreased the survival time of TIVADs; this reduction was not statistically significant by univariate analysis [hazard ratio (HR) = 1.336, 95% CI: 0.966-1.849, P = 0.080)]. However, it became a significant risk factor by multivariate analysis (HR = 1.499, 95% CI: 1.079-2.083, P = 0.016) when correlated with variables of age, sex and catheter type. Close-ended (Groshong) catheters had a lower thrombosis rate than open-ended catheters (2.5% vs 5%, P = 0.015). Hematogenous malignancy had higher infection rates than solid malignancy (10.5% vs 2.5%, P < 0.001). CONCLUSION: Increasing age, male gender, open-ended catheters and hematogenous malignancy were risk factors for TIVAD failure. Close-ended catheters had lower thrombosis rates and hematogenous malignancy had higher infection rates. PMID:19787834
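
    A minimal sketch of a Cox proportional hazards analysis of the kind reported above, assuming the lifelines package and a synthetic data frame with device survival times, a failure indicator, and the examined risk factors (all values are placeholders).

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(8)
n = 500
df = pd.DataFrame({
    "age": rng.integers(20, 85, n),
    "male": rng.integers(0, 2, n),
    "open_ended_catheter": rng.integers(0, 2, n),
    "hematogenous_malignancy": rng.integers(0, 2, n),
})
# Synthetic survival times loosely tied to the covariates; censoring is random here.
risk = 0.02 * df["age"] + 0.3 * df["male"] + 0.5 * df["open_ended_catheter"]
df["duration_days"] = rng.exponential(365 / np.exp(risk - risk.mean()))
df["failed"] = rng.integers(0, 2, n)           # 1 = port failure, 0 = censored

cph = CoxPHFitter()
cph.fit(df, duration_col="duration_days", event_col="failed")
cph.print_summary()                            # hazard ratios and 95% CIs for each factor
```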

  17. Analysis of Brick Masonry Wall using Applied Element Method

    NASA Astrophysics Data System (ADS)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a versatile tool for structural analysis. Analysis is done by discretising the structure, as in the Finite Element Method (FEM), but in AEM the elements are connected by a set of normal and shear springs instead of nodes. AEM is extensively used for the analysis of brittle materials. A brick masonry wall can be effectively analysed in the frame of AEM, since the composite nature of masonry is easily modelled using springs: the brick springs and mortar springs are assumed to be connected in series (see the sketch below). The brick masonry wall is analysed and the failure load is determined for different loading cases. The results were used to find the best aspect ratio of brick for strengthening a brick masonry wall.
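
    The series combination of brick and mortar springs mentioned above reduces to elementary spring algebra; a tiny sketch with illustrative material and geometry values (the stiffness expression k = E·d·t/a follows the usual AEM convention, with d the spring spacing, t the thickness, and a the represented length; all numbers are assumptions).

```python
# Normal spring stiffness of one connecting spring: k = E * d * t / a (illustrative values).
E_brick, E_mortar = 5.0e9, 1.0e9           # Young's moduli, Pa
d, t = 0.01, 0.10                          # spring spacing and wall thickness, m
a_brick, a_mortar = 0.105, 0.010           # half brick length and mortar joint thickness, m

k_brick = E_brick * d * t / a_brick
k_mortar = E_mortar * d * t / a_mortar

# Brick and mortar springs act in series across a joint: 1/k_eq = 1/k_brick + 1/k_mortar.
k_eq = 1.0 / (1.0 / k_brick + 1.0 / k_mortar)
print(f"k_brick = {k_brick:.3e} N/m, k_mortar = {k_mortar:.3e} N/m, k_eq = {k_eq:.3e} N/m")
```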

  18. Usage of K-cluster and factor analysis for grouping and evaluation the quality of olive oil in accordance with physico-chemical parameters

    NASA Astrophysics Data System (ADS)

    Milev, M.; Nikolova, Kr.; Ivanova, Ir.; Dobreva, M.

    2015-11-01

    Twenty-five olive oils, differing in origin and method of extraction, were studied with respect to 17 physico-chemical parameters: the color parameters a and b, lightness, fluorescence peaks, the pigments chlorophyll and β-carotene, and fatty-acid content. The goals of the study were: to conduct correlation analysis to find the inner relations between the studied indices; to apply factor analysis by the method of Principal Components (PCA) in order to reduce the large number of variables to a few factors that are most important for distinguishing the different types of olive oil; and to use K-means clustering to compare and group the tested olive oils based on their similarity. The inner relations between the studied indices were found by correlation analysis, and a factor analysis using PCA was performed on the resulting correlation matrix. The studied indices were thereby reduced to four factors, which explained 79.3% of the total variation. The first factor unified the color parameters, β-carotene and the fluorescence peak related to oxidation products at about 520 nm. The second was determined mainly by the chlorophyll content and its related fluorescence peak at about 670 nm. The third and fourth factors were determined by the fatty-acid content of the samples: the third unified the fatty acids that allow olive oil to be distinguished from other plant oils (oleic, linoleic and stearic acids), while the fourth included fatty acids with much lower content in the studied samples. K-means cluster analysis requires the number of clusters to be specified in advance; the variant K = 3 was used because there were three types of olive oil. The first cluster unified all salad and pomace olive oils, and the second unified the extra virgin oils taken as controls from producers and bought from the trade network. The third cluster unified samples from
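
    A schematic version of this chemometric workflow, with a random placeholder matrix in place of the measured 25 x 17 table: standardize the parameters, extract principal-component factors, and group the samples with K-means using K = 3.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(11)
X = rng.normal(size=(25, 17))                  # 25 olive oils x 17 physico-chemical parameters

X_std = StandardScaler().fit_transform(X)      # standardize so PCA works on the correlation structure
pca = PCA(n_components=4).fit(X_std)
scores = pca.transform(X_std)

print("cumulative explained variance:", np.round(pca.explained_variance_ratio_.cumsum(), 3))

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print("cluster sizes:", np.bincount(labels))   # K = 3 groups of samples
```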

  19. Wavelet analysis applied to the IRAS cirrus

    NASA Technical Reports Server (NTRS)

    Langer, William D.; Wilson, Robert W.; Anderson, Charles H.

    1994-01-01

    The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.

  20. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain.

    PubMed

    Frolov, Alexander A; Húsek, Dušan; Polyakov, Pavel Yu

    2016-03-01

    A common task in large data set analysis is searching for an appropriate data representation in a space of fewer dimensions. One of the most efficient methods to solve this task is factor analysis. In this paper, we compare seven methods for Boolean factor analysis (BFA) in solving the so-called bars problem (BP), which is a BFA benchmark. The performance of the methods is evaluated by means of information gain. Study of the results obtained in solving BPs of different levels of complexity has allowed us to reveal the strengths and weaknesses of these methods. It is shown that the Likelihood maximization Attractor Neural Network with Increasing Activity (LANNIA) is the most efficient BFA method for solving the BP in many cases. The efficacy of the LANNIA method is also shown when applied to real data from the Kyoto Encyclopedia of Genes and Genomes database, which contains full genome sequencing for 1368 organisms, and to the text data set R52 (from Reuters 21578), typically used for label categorization.

  1. Phylogenetic Factor Analysis.

    PubMed

    Tolkoff, Max R; Alfaro, Michael E; Baele, Guy; Lemey, Philippe; Suchard, Marc A

    2018-05-01

    Phylogenetic comparative methods explore the relationships between quantitative traits adjusting for shared evolutionary history. This adjustment often occurs through a Brownian diffusion process along the branches of the phylogeny that generates model residuals or the traits themselves. For high-dimensional traits, inferring all pair-wise correlations within the multivariate diffusion is limiting. To circumvent this problem, we propose phylogenetic factor analysis (PFA) that assumes a small unknown number of independent evolutionary factors arise along the phylogeny and these factors generate clusters of dependent traits. Set in a Bayesian framework, PFA provides measures of uncertainty on the factor number and groupings, combines both continuous and discrete traits, integrates over missing measurements and incorporates phylogenetic uncertainty with the help of molecular sequences. We develop Gibbs samplers based on dynamic programming to estimate the PFA posterior distribution, over 3-fold faster than for multivariate diffusion and a further order-of-magnitude more efficiently in the presence of latent traits. We further propose a novel marginal likelihood estimator for previously impractical models with discrete data and find that PFA also provides a better fit than multivariate diffusion in evolutionary questions in columbine flower development, placental reproduction transitions and triggerfish fin morphometry.

  2. Hybrid PV/diesel solar power system design using multi-level factor analysis optimization

    NASA Astrophysics Data System (ADS)

    Drake, Joshua P.

    Solar power systems represent a large area of interest across a spectrum of organizations at a global level. It was determined that a clear understanding of current state-of-the-art software and design methods, as well as optimization methods, could be used to improve the design methodology. Solar power design literature was researched for an in-depth understanding of solar power system design methods and algorithms. Multiple software packages for the design and optimization of solar power systems were analyzed for a critical understanding of their design workflow. In addition, several methods of optimization were studied, including brute force, Pareto analysis, Monte Carlo, linear and nonlinear programming, and multi-way factor analysis. Factor analysis was selected as the most efficient optimization method for engineering design as it applied to solar power system design. The solar power design algorithms, software workflow analysis, and factor analysis optimization were combined to develop a solar power system design optimization software package called FireDrake. This software was used for the design of multiple solar power systems in conjunction with an energy audit case study performed in seven Tibetan refugee camps located in Mainpat, India. A report of solar power system designs for the camps, as well as a proposed schedule for future installations, was generated. It was determined that there are several improvements that could be made to the state of the art in modern solar power system design, though the complexity of current applications is significant.

  3. Opportunities for Applied Behavior Analysis in the Total Quality Movement.

    ERIC Educational Resources Information Center

    Redmon, William K.

    1992-01-01

    This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…

  4. Replace-approximation method for ambiguous solutions in factor analysis of ultrasonic hepatic perfusion

    NASA Astrophysics Data System (ADS)

    Zhang, Ji; Ding, Mingyue; Yuchi, Ming; Hou, Wenguang; Ye, Huashan; Qiu, Wu

    2010-03-01

    Factor analysis is an efficient technique for the analysis of dynamic structures in medical image sequences and has recently been used in contrast-enhanced ultrasound (CEUS) of hepatic perfusion. Time-intensity curves (TICs) extracted by factor analysis can provide much more diagnostic information for radiologists and improve the diagnostic rate of focal liver lesions (FLLs). However, one of the major drawbacks of factor analysis of dynamic structures (FADS) is the nonuniqueness of the result when only the non-negativity criterion is used. In this paper, we propose a new replace-approximation method, based on apex-seeking, for ambiguous FADS solutions. Because different structures partially overlap, factor curves are assumed to be approximately replaceable by curves existing in the medical image sequences; finding the optimal curves is therefore the key point of the technique. No matter how many structures are assumed, our method always starts to seek apexes from a one-dimensional space onto which the original high-dimensional data are mapped. By finding two stable apexes in one-dimensional space, the method can ascertain the third one, and the process can be continued until all structures are found. This technique was tested on two blood-perfusion phantoms and compared to two variants of the apex-seeking method. The results showed that the technique outperformed both variants in comparisons of region-of-interest measurements from the phantom data. It can be applied to the estimation of TICs derived from CEUS images and the separation of different physiological regions in hepatic perfusion.

  5. Tissue Microarray Analysis Applied to Bone Diagenesis

    PubMed Central

    Mello, Rafael Barrios; Silva, Maria Regina Regis; Alves, Maria Teresa Seixas; Evison, Martin Paul; Guimarães, Marco Aurelio; Francisco, Rafaella Arrabaca; Astolphi, Rafael Dias; Iwamura, Edna Sadayo Miazato

    2017-01-01

    Taphonomic processes affecting bone post mortem are important in forensic, archaeological and palaeontological investigations. In this study, the application of tissue microarray (TMA) analysis to a sample of femoral bone specimens from 20 exhumed individuals of known period of burial and age at death is described. TMA allows multiplexing of subsamples, permitting standardized comparative analysis of adjacent sections in 3-D and of representative cross-sections of a large number of specimens. Standard hematoxylin and eosin, periodic acid-Schiff and silver methenamine, and picrosirius red staining, and CD31 and CD34 immunohistochemistry were applied to TMA sections. Osteocyte and osteocyte lacuna counts, percent bone matrix loss, and fungal spheroid element counts could be measured and collagen fibre bundles observed in all specimens. Decalcification with 7% nitric acid proceeded more rapidly than with 0.5 M EDTA and may offer better preservation of histological and cellular structure. No endothelial cells could be detected using CD31 and CD34 immunohistochemistry. Correlation between osteocytes per lacuna and age at death may reflect reported age-related responses to microdamage. Methodological limitations and caveats, and results of the TMA analysis of post mortem diagenesis in bone are discussed, and implications for DNA survival and recovery considered. PMID:28051148

  6. A single factor underlies the metabolic syndrome: a confirmatory factor analysis.

    PubMed

    Pladevall, Manel; Singal, Bonita; Williams, L Keoki; Brotons, Carlos; Guyer, Heidi; Sadurni, Josep; Falces, Carles; Serrano-Rios, Manuel; Gabriel, Rafael; Shaw, Jonathan E; Zimmet, Paul Z; Haffner, Steven

    2006-01-01

    Confirmatory factor analysis (CFA) was used to test the hypothesis that the components of the metabolic syndrome are manifestations of a single common factor. Three different datasets were used to test and validate the model. The Spanish and Mauritian studies included 207 men and 203 women and 1,411 men and 1,650 women, respectively. A third analytical dataset including 847 men was obtained from a previously published CFA of a U.S. population. The one-factor model included the metabolic syndrome core components (central obesity, insulin resistance, blood pressure, and lipid measurements). We also tested an expanded one-factor model that included uric acid and leptin levels. Finally, we used CFA to compare the goodness of fit of one-factor models with the fit of two previously published four-factor models. The simplest one-factor model showed the best goodness-of-fit indexes (comparative fit index 1, root mean-square error of approximation 0.00). Comparisons of one-factor with four-factor models in the three datasets favored the one-factor model structure. The selection of variables to represent the different metabolic syndrome components and model specification explained why previous exploratory and confirmatory factor analysis, respectively, failed to identify a single factor for the metabolic syndrome. These analyses support the current clinical definition of the metabolic syndrome, as well as the existence of a single factor that links all of the core components.
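
    The following minimal Python sketch mimics the single-common-factor idea on simulated data: five indicator variables are generated from one latent factor and a one-factor maximum-likelihood factor model is fitted with scikit-learn. It is only a lightweight stand-in for the CFA described above (which would normally use dedicated SEM software), and the loadings, noise level, and sample size are invented.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        n = 1000
        latent = rng.normal(size=n)                       # the single common factor
        loadings = np.array([0.8, 0.7, 0.6, 0.5, 0.6])    # hypothetical indicator loadings
        X = np.outer(latent, loadings) + rng.normal(scale=0.6, size=(n, 5))

        fa = FactorAnalysis(n_components=1).fit(X)
        print(np.round(fa.components_, 2))                # recovered loadings (up to sign)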

  7. Anthropometric data reduction using confirmatory factor analysis.

    PubMed

    Rohani, Jafri Mohd; Olusegun, Akanbi Gabriel; Rani, Mat Rebi Abdul

    2014-01-01

    The unavailability of anthropometric data, especially in developing countries, has remained a limiting factor in the design of learning facilities with sufficient ergonomic consideration. Attempts to use anthropometric data from developed countries have led to the provision of school facilities unfit for the users. The purpose of this paper is to use factor analysis to investigate the suitability of the collected anthropometric data as a database for school design in Nigerian tertiary institutions. Anthropometric data were collected from 288 male students, aged between 18 and 25 years, in a Federal Polytechnic in the North-West of Nigeria. Nine vertical anthropometric dimensions related to heights were collected using conventional equipment. Exploratory factor analysis was used to categorize the variables into a model consisting of two factors. Thereafter, confirmatory factor analysis was used to investigate the fit of the data to the proposed model. A just-identified model, made of two factors, each with three variables, was developed. The variables within the model accounted for 81% of the total variation of the entire data. The model was found to demonstrate adequate validity and reliability, and various measuring indices were used to verify that the model fits the data properly. The final model reveals that stature height and eye height sitting were the most stable variables for designs related to standing and sitting constructs. The study has shown the application of factor analysis in anthropometric data analysis and highlighted the relevance of these statistical tools for investigating variability in anthropometric data from diverse populations, an approach that has not been widely used in previous anthropometric studies. The collected data are therefore suitable for use when designing for Nigerian students.

  8. Cement Leakage in Percutaneous Vertebral Augmentation for Osteoporotic Vertebral Compression Fractures: Analysis of Risk Factors.

    PubMed

    Xie, Weixing; Jin, Daxiang; Ma, Hui; Ding, Jinyong; Xu, Jixi; Zhang, Shuncong; Liang, De

    2016-05-01

    The risk factors for cement leakage were retrospectively reviewed in 192 patients who underwent percutaneous vertebral augmentation (PVA), in order to discuss the factors related to cement leakage in the PVA procedure for the treatment of osteoporotic vertebral compression fractures. PVA is widely applied for the treatment of osteoporotic vertebral fractures, and cement leakage is a major complication of this procedure; the risk factors for cement leakage remain controversial. A retrospective review of 192 patients who underwent PVA was conducted. The following data were recorded: age, sex, bone density, number of fractured vertebrae before surgery, number of treated vertebrae, severity of the treated vertebrae, operative approach, volume of injected bone cement, preoperative vertebral compression ratio, preoperative local kyphosis angle, intraosseous clefts, preoperative vertebral cortical bone defect, and ratio and type of cement leakage. To study the correlation between each factor and the cement leakage ratio, bivariate regression analysis was employed for univariate analysis, and multivariate linear regression analysis was employed for multivariate analysis. The study included 192 patients (282 treated vertebrae), and cement leakage occurred in 100 vertebrae (35.46%). Vertebrae with preoperative cortical bone defects generally exhibited a higher cement leakage ratio, and the leakage was typically type C; vertebrae with intact cortical bone before the procedure tended to experience type S leakage. Univariate analysis showed that patient age, bone density, number of fractured vertebrae before surgery, and vertebral cortical bone were associated with the cement leakage ratio (P<0.05). Multivariate analysis showed that the main factors influencing bone cement leakage are bone density and vertebral cortical bone defect, with standardized partial regression coefficients of -0.085 and 0.144, respectively. High bone density and vertebral cortical bone defect are
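
    The sketch below illustrates the analysis strategy described in the record, a standardized multivariate linear model fitted after univariate screening, on simulated data; the variable names, effect sizes, and sample are hypothetical and not taken from the study.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 282
        df = pd.DataFrame({
            "bone_density": rng.normal(0, 1, n),
            "cortical_defect": rng.integers(0, 2, n),
            "age": rng.normal(70, 8, n),
        })
        df["leakage_ratio"] = (0.15 * df["cortical_defect"]
                               - 0.08 * df["bone_density"]
                               + rng.normal(0, 0.1, n))

        # Standardize all variables so the fitted coefficients are standardized
        # partial regression coefficients, comparable across predictors.
        z = (df - df.mean()) / df.std()
        X = sm.add_constant(z[["bone_density", "cortical_defect", "age"]])
        fit = sm.OLS(z["leakage_ratio"], X).fit()
        print(fit.summary())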

  9. Effects of measurement errors on psychometric measurements in ergonomics studies: Implications for correlations, ANOVA, linear regression, factor analysis, and linear discriminant analysis.

    PubMed

    Liu, Yan; Salvendy, Gavriel

    2009-05-01

    This paper aims to demonstrate the effects of measurement errors on psychometric measurements in ergonomics studies. A variety of sources can cause random measurement errors in ergonomics studies, and these errors can distort virtually every statistic computed and lead investigators to erroneous conclusions. The effects of measurement errors on the five most widely used statistical analysis tools have been discussed and illustrated: correlation; ANOVA; linear regression; factor analysis; linear discriminant analysis. It has been shown that measurement errors can greatly attenuate correlations between variables, reduce the statistical power of ANOVA, distort (overestimate, underestimate or even change the sign of) regression coefficients, underrate the explanatory contributions of the most important factors in factor analysis, and depreciate the significance of the discriminant function and the discrimination abilities of individual variables in discriminant analysis. The discussion is restricted to subjective scales and survey methods and their reliability estimates. Other methods applied in ergonomics research, such as physical and electrophysiological measurements and chemical and biomedical analysis methods, also have issues of measurement errors, but they are beyond the scope of this paper. As there has been increasing interest in the development and testing of theories in ergonomics research, it has become very important for ergonomics researchers to understand the effects of measurement errors on their experimental results, which the authors believe is critical to research progress in theory development and cumulative knowledge in the ergonomics field.
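
    A minimal simulation of the attenuation effect described above, assuming two true scores correlated at 0.6 and observed with reliabilities 0.7 and 0.8: the observed correlation shrinks toward r_true * sqrt(rel_x * rel_y), the classical correction-for-attenuation relation. All numbers are illustrative.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        true_x = rng.normal(size=n)
        true_y = 0.6 * true_x + np.sqrt(1 - 0.6 ** 2) * rng.normal(size=n)

        rel_x, rel_y = 0.7, 0.8                       # assumed reliabilities
        obs_x = np.sqrt(rel_x) * true_x + np.sqrt(1 - rel_x) * rng.normal(size=n)
        obs_y = np.sqrt(rel_y) * true_y + np.sqrt(1 - rel_y) * rng.normal(size=n)

        r_true = np.corrcoef(true_x, true_y)[0, 1]
        r_obs = np.corrcoef(obs_x, obs_y)[0, 1]
        print(r_true, r_obs, r_true * np.sqrt(rel_x * rel_y))   # ~0.60, ~0.45, ~0.45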

  10. Human Factors in Financial Trading: An Analysis of Trading Incidents.

    PubMed

    Leaver, Meghan; Reader, Tom W

    2016-09-01

    This study tests the reliability of a system (FINANS) to collect and analyze incident reports in the financial trading domain and is guided by a human factors taxonomy used to describe error in the trading domain. Research indicates the utility of applying human factors theory to understand error in finance, yet empirical research is lacking. We report on the development of the first system for capturing and analyzing human factors-related issues in operational trading incidents. In the first study, 20 incidents are analyzed by an expert user group against a referent standard to establish the reliability of FINANS. In the second study, 750 incidents are analyzed using distribution, mean, pathway, and associative analysis to describe the data. Kappa scores indicate that categories within FINANS can be reliably used to identify and extract data on human factors-related problems underlying trading incidents. Approximately 1% of trades (n = 750) lead to an incident. Slip/lapse (61%), situation awareness (51%), and teamwork (40%) were found to be the most common problems underlying incidents. For the most serious incidents, problems in situation awareness and teamwork were most common. We show that (a) experts in the trading domain can reliably and accurately code human factors in incidents, (b) 1% of trades incur error, and (c) poor teamwork skills and situation awareness underpin the most critical incidents. This research provides data crucial for ameliorating risk within financial trading organizations, with implications for regulation and policy. © 2016, Human Factors and Ergonomics Society.

  11. Comparisons of Exploratory and Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Daniel, Larry G.

    Historically, most researchers conducting factor analysis have used exploratory methods. However, more recently, confirmatory factor analytic methods have been developed that can directly test theory either during factor rotation using "best fit" rotation methods or during factor extraction, as with the LISREL computer programs developed…

  12. The Five-Factor Model personality traits in schizophrenia: A meta-analysis.

    PubMed

    Ohi, Kazutaka; Shimada, Takamitsu; Nitta, Yusuke; Kihara, Hiroaki; Okubo, Hiroaki; Uehara, Takashi; Kawasaki, Yasuhiro

    2016-06-30

    Personality is one of the important factors in the pathogenesis of schizophrenia because it affects patients' symptoms, cognition and social functioning. Several studies have reported specific personality traits in patients with schizophrenia compared with healthy subjects; however, the results were inconsistent among studies. The NEO Five-Factor Inventory (NEO-FFI) measures five personality traits: Neuroticism (N), Extraversion (E), Openness (O), Agreeableness (A) and Conscientiousness (C). Here, we performed a meta-analysis of these personality traits assessed by the NEO-FFI in 460 patients with schizophrenia and 486 healthy subjects from the published literature and investigated possible associations between schizophrenia and these traits. There was no publication bias for any trait. Because we found evidence of significant heterogeneity in all traits among the studies, we applied a random-effects model to perform the meta-analysis. Patients with schizophrenia showed a higher score for N and lower scores for E, O, A and C compared with healthy subjects. The effect sizes of these personality traits ranged from moderate to large. These differences were not affected by possible moderator factors, such as gender distribution and mean age in each study, except for a gender effect on A. These findings suggest that patients with schizophrenia have a different personality profile compared with healthy subjects. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
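
    The following sketch shows generic DerSimonian-Laird random-effects pooling of effect sizes, the approach named in the record; the per-study effect sizes and variances are made up for illustration and do not reproduce the NEO-FFI results.

        import numpy as np

        y = np.array([0.80, 0.55, 1.10, 0.40, 0.95])    # per-study effect sizes (e.g. Hedges g)
        v = np.array([0.05, 0.08, 0.04, 0.10, 0.06])    # per-study variances

        w = 1.0 / v                                     # fixed-effect weights
        y_fe = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - y_fe) ** 2)                 # heterogeneity statistic
        k = len(y)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

        w_re = 1.0 / (v + tau2)                         # random-effects weights
        y_re = np.sum(w_re * y) / np.sum(w_re)
        se_re = np.sqrt(1.0 / np.sum(w_re))
        print(f"tau^2={tau2:.3f}  pooled={y_re:.3f}  "
              f"95% CI=({y_re - 1.96 * se_re:.3f}, {y_re + 1.96 * se_re:.3f})")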

  13. Analyse Factorielle d'une Batterie de Tests de Comprehension Orale et Ecrite (Factor Analysis of a Battery of Tests of Listening and Reading Comprehension). Melanges Pedagogiques, 1971.

    ERIC Educational Resources Information Center

    Lonchamp, F.

    This is a presentation of the results of a factor analysis of a battery of tests intended to measure listening and reading comprehension in English as a second language. The analysis sought to answer the following questions: (1) whether the factor analysis method yields results when applied to tests which are not specifically designed for this…

  14. Biomechanical factors associated with mandibular cantilevers: analysis with three-dimensional finite element models.

    PubMed

    Gonda, Tomoya; Yasuda, Daiisa; Ikebe, Kazunori; Maeda, Yoshinobu

    2014-01-01

    Although the risks of using a cantilever to treat missing teeth have been described, the mechanisms remain unclear. This study aimed to reveal these mechanisms from a biomechanical perspective. The effects of various implant sites, number of implants, and superstructural connections on stress distribution in the marginal bone were analyzed with three-dimensional finite element models based on mandibular computed tomography data. Forces from the masseter, temporalis, and internal pterygoid were applied as vectors. Two three-dimensional finite element models were created with the edentulous mandible showing severe and relatively modest residual ridge resorption. Cantilevers of the premolar and molar were simulated in the superstructures in the models. The following conditions were also included as factors in the models to investigate changes: poor bone quality, shortened dental arch, posterior occlusion, lateral occlusion, double force of the masseter, and short implant. Multiple linear regression analysis with a forced-entry method was performed with stress values as the objective variable and the factors as the explanatory variable. When bone mass was high, stress around the implant caused by differences in implantation sites was reduced. When bone mass was low, the presence of a cantilever was a possible risk factor. The stress around the implant increased significantly if bone quality was poor or if increased force (eg, bruxism) was applied. The addition of a cantilever to the superstructure increased stress around implants. When large muscle forces were applied to a superstructure with cantilevers or if bone quality was poor, stress around the implants increased.

  15. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    ERIC Educational Resources Information Center

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  16. Bootstrap Standard Error Estimates in Dynamic Factor Analysis

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Browne, Michael W.

    2010-01-01

    Dynamic factor analysis summarizes changes in scores on a battery of manifest variables over repeated measurements in terms of a time series in a substantially smaller number of latent factors. Algebraic formulae for standard errors of parameter estimates are more difficult to obtain than in the usual intersubject factor analysis because of the…

  17. Evaluating WAIS-IV structure through a different psychometric lens: structural causal model discovery as an alternative to confirmatory factor analysis.

    PubMed

    van Dijk, Marjolein J A M; Claassen, Tom; Suwartono, Christiany; van der Veld, William M; van der Heijden, Paul T; Hendriks, Marc P H

    Since the publication of the WAIS-IV in the U.S. in 2008, efforts have been made to explore the structural validity by applying factor analysis to various samples. This study aims to achieve a more fine-grained understanding of the structure of the Dutch language version of the WAIS-IV (WAIS-IV-NL) by applying an alternative analysis based on causal modeling in addition to confirmatory factor analysis (CFA). The Bayesian Constraint-based Causal Discovery (BCCD) algorithm learns underlying network structures directly from data and assesses more complex structures than is possible with factor analysis. WAIS-IV-NL profiles of two clinical samples of 202 patients (i.e. patients with temporal lobe epilepsy and a mixed psychiatric outpatient group) were analyzed and contrasted with a matched control group (N = 202) selected from the Dutch standardization sample of the WAIS-IV-NL to investigate internal structure by means of CFA and BCCD. With CFA, the four-factor structure as proposed by Wechsler demonstrates acceptable fit in all three subsamples. However, BCCD revealed three consistent clusters (verbal comprehension, visual processing, and processing speed) in all three subsamples. The combination of Arithmetic and Digit Span as a coherent working memory factor could not be verified, and Matrix Reasoning appeared to be isolated. With BCCD, some discrepancies from the proposed four-factor structure are exemplified. Furthermore, these results fit CHC theory of intelligence more clearly. Consistent clustering patterns indicate these results are robust. The structural causal discovery approach may be helpful in better interpreting existing tests, the development of new tests, and aid in diagnostic instruments.

  18. Medical University admission test: a confirmatory factor analysis of the results.

    PubMed

    Luschin-Ebengreuth, Marion; Dimai, Hans P; Ithaler, Daniel; Neges, Heide M; Reibnegger, Gilbert

    2016-05-01

    The Graz Admission Test has been applied since the academic year 2006/2007. The validity of the Test was demonstrated by a significant improvement of study success and a significant reduction of dropout rate. The purpose of this study was a detailed analysis of the internal correlation structure of the various components of the Graz Admission Test. In particular, the question investigated was whether or not the various test parts constitute a suitable construct which might be designated as "Basic Knowledge in Natural Science." This study is an observational investigation, analyzing the results of the Graz Admission Test for the study of human medicine and dentistry. A total of 4741 applicants were included in the analysis. Principal component factor analysis (PCFA) as well as techniques from structural equation modeling, specifically confirmatory factor analysis (CFA), were employed to detect potential underlying latent variables governing the behavior of the measured variables. PCFA showed good clustering of the science test parts, including also text comprehension. A putative latent variable "Basic Knowledge in Natural Science," investigated by CFA, was indeed shown to govern the response behavior of the applicants in biology, chemistry, physics, and mathematics as well as text comprehension. The analysis of the correlation structure of the various test parts confirmed that the science test parts together with text comprehension constitute a satisfactory instrument for measuring a latent construct variable "Basic Knowledge in Natural Science." The present results suggest the fundamental importance of basic science knowledge for results obtained in the framework of the admission process for medical universities.

  19. Receiver function analysis applied to refraction survey data

    NASA Astrophysics Data System (ADS)

    Subaru, T.; Kyosuke, O.; Hitoshi, M.

    2008-12-01

    For the estimation of the thickness of the oceanic crust or petrophysical investigation of subsurface material, refraction or reflection seismic exploration is one of the methods frequently practiced. These surveys use four-component seismometers (x, y, z components of acceleration plus pressure), but analyses tend to use only the compressional wave or the vertical component. Hence, the shear wave or the lateral components of the seismograms are needed for a more precise estimate of the thickness of the oceanic crust. The receiver function at a site can be used to estimate the depth of velocity interfaces from teleseismic signals, including shear waves. Receiver function analysis uses both the vertical and horizontal components of seismograms and deconvolves the horizontal with the vertical to estimate the spectral difference of P-to-S converted waves arriving after the direct P wave. Once the phase information of the receiver function is obtained, one can estimate the depth of the velocity interface. This analysis has the advantage of estimating the depth of velocity interfaces, including the Mohorovicic discontinuity, using two components of seismograms when P-to-S converted waves are generated at the interface. This study presents results of a preliminary investigation using synthetic seismograms. First, we used three geological models, composed of a single sediment layer, a crust layer, and a sloped Moho, respectively, with underground sources; the receiver function estimated the depth and shape of the Moho interface precisely for all three models. Second, we applied the method to synthetic refraction survey data generated not by earthquakes but by artificial sources at the ground or sea surface. Compressional seismic waves propagate under the velocity interface and radiate converted shear waves at this and the other deep underground layer interfaces. However, the receiver function analysis applied to the
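
    A simplified water-level frequency-domain deconvolution of a horizontal component by the vertical component, the core operation of receiver-function analysis described above, is sketched below on synthetic spike traces; the sampling interval, water level, and arrival times are illustrative only.

        import numpy as np

        dt, n = 0.01, 2048
        vertical = np.zeros(n)
        vertical[100] = 1.0                                         # direct P arrival
        horizontal = np.zeros(n)
        horizontal[100] = 0.2                                       # direct P on horizontal
        horizontal[300] = 0.5                                       # P-to-S conversion, 2 s later

        V = np.fft.rfft(vertical)
        H = np.fft.rfft(horizontal)
        denom = np.abs(V) ** 2
        water = 0.01 * denom.max()                                  # water-level stabilization
        rf = np.fft.irfft(H * np.conj(V) / np.maximum(denom, water), n)

        lag = np.argmax(rf[50:]) + 50                               # skip the zero-lag spike
        print("estimated P-to-S delay:", lag * dt, "s")             # ~2.0 s for this synthetic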

  20. How Factor Analysis Can Be Used in Classification.

    ERIC Educational Resources Information Center

    Harman, Harry H.

    This is a methodological study that suggests a taxometric technique for objective classification of yeasts. It makes use of the minres method of factor analysis and groups strains of yeast according to their factor profiles. The similarities are judged in the higher-dimensional space determined by the factor analysis, but otherwise rely on the…

  1. System for corrosion monitoring in pipeline applying fuzzy logic mathematics

    NASA Astrophysics Data System (ADS)

    Kuzyakov, O. N.; Kolosova, A. L.; Andreeva, M. A.

    2018-05-01

    A list of factors influencing corrosion rate on the external side of underground pipeline is determined. Principles of constructing a corrosion monitoring system are described; the system performance algorithm and program are elaborated. A comparative analysis of methods for calculating corrosion rate is undertaken. Fuzzy logic mathematics is applied to reduce calculations while considering a wider range of corrosion factors.
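
    In the spirit of the record, the fragment below evaluates a single fuzzy rule for corrosion rate using triangular membership functions written directly in Python; the input factors, membership thresholds, and the rule itself are hypothetical and not taken from the described system.

        import numpy as np

        def trimf(x, a, b, c):
            """Triangular membership function with feet at a and c, peak at b."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        resistivity = 12.0                           # ohm*m, measured soil resistivity
        moisture = 0.28                              # volumetric soil moisture fraction

        low_res = trimf(resistivity, 0.0, 5.0, 20.0)       # low resistivity -> more corrosive
        high_moist = trimf(moisture, 0.15, 0.35, 0.50)

        # Rule: IF resistivity is low AND moisture is high THEN corrosion rate is high
        high_corrosion = min(low_res, high_moist)
        print(f"degree 'corrosion rate is high' = {high_corrosion:.2f}")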

  2. Edmonton obesity staging system among pediatric patients: a validation and obesogenic risk factor analysis.

    PubMed

    Grammatikopoulou, M G; Chourdakis, M; Gkiouras, K; Roumeli, P; Poulimeneas, D; Apostolidou, E; Chountalas, I; Tirodimos, I; Filippou, O; Papadakou-Lagogianni, S; Dardavessis, T

    2018-01-08

    The Edmonton Obesity Staging System for Pediatrics (EOSS-P) is a useful tool, delineating different obesity severity tiers associated with distinct treatment barriers. The aim of the study was to apply the EOSS-P on a Greek pediatric cohort and assess risk factors associated with each stage, compared to normal weight controls. A total of 361 children (2-14 years old), outpatients of an Athenian hospital, participated in this case-control study by forming two groups: the obese (n = 203) and the normoweight controls (n = 158). Anthropometry, blood pressure, blood and biochemical markers, comorbidities and obesogenic lifestyle parameters were recorded and the EOSS-P was applied. Validation of EOSS-P stages was conducted by juxtaposing them with IOTF-defined weight status. Obesogenic risk factors' analysis was conducted by constructing gender-and-age-adjusted (GA) and multivariate logistic models. The majority of obese children were stratified at stage 1 (46.0%), 17.0% were on stage 0, and 37.0% on stage 2. The validation analysis revealed that EOSS-P stages greater than 0 were associated with diastolic blood pressure and levels of glucose, cholesterol, LDL and ALT. Reduced obesity odds were observed among children playing outdoors and increased odds for every screen time hour, both in the GA and in the multivariate analyses (all P < 0.05). Although participation in sports > 2 times/week was associated with reduced obesity odds in the GA analysis (OR = 0.57, 95% CI = 0.33-0.98, P linear = 0.047), it lost its significance in the multivariate analysis (P linear = 0.145). Analogous results were recorded in the analyses of the abovementioned physical activity risk factors for the EOSS-P stages. Linear relationships were observed for fast-food consumption and IOTF-defined obesity and higher than 0 EOSS-P stages. Parental obesity status was associated with all EOSS-P stages and IOTF-defined obesity status. Few outpatients were healthy obese (stage 0), while
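
    The adjusted odds-ratio analysis summarized above can be sketched with a logistic regression in statsmodels, as below; the data are synthetic and the predictor names (screen_time_h, outdoor_play) are hypothetical stand-ins for the study's obesogenic lifestyle variables.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(5)
        n = 361
        df = pd.DataFrame({
            "age": rng.uniform(2, 14, n),
            "male": rng.integers(0, 2, n),
            "screen_time_h": rng.gamma(2.0, 1.0, n),
            "outdoor_play": rng.integers(0, 2, n),
        })
        logit = -1.0 + 0.3 * df["screen_time_h"] - 0.6 * df["outdoor_play"]
        df["obese"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        fit = smf.logit("obese ~ age + male + screen_time_h + outdoor_play", data=df).fit(disp=False)
        print(np.exp(fit.params))        # age-and-gender-adjusted odds ratios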

  3. Bayesian approach for counting experiment statistics applied to a neutrino point source analysis

    NASA Astrophysics Data System (ADS)

    Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.

    2013-12-01

    In this paper we present a model-independent analysis method following Bayesian statistics to analyse data from a generic counting experiment and apply it to the search for neutrinos from point sources. We discuss a test statistic defined following a Bayesian framework that will be used in the search for a signal. In case no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate. As such, we have direct access to any signal upper limit. The upper limit derivation directly compares with a frequentist approach and is robust in the case of low-counting observations. Furthermore, it also allows accounting for previous upper limits obtained by other analyses via the concept of prior information, without the need for ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we have applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and we have obtained a flux upper limit, which is in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube, using the same data set.
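
    A hedged numerical illustration of the counting-experiment setting (not the paper's exact construction): with observed counts n, a known expected background b, and a flat prior on the signal rate s >= 0, the posterior is proportional to (b + s)^n exp(-(b + s)), from which a 90% credible upper limit follows directly. The counts and background below are invented.

        import numpy as np

        n_obs, b = 5, 3.2                                # observed counts, expected background
        s = np.linspace(0, 30, 30_001)
        log_post = n_obs * np.log(b + s) - (b + s)       # log posterior up to a constant (flat prior)
        post = np.exp(log_post - log_post.max())
        post /= post.sum()                               # normalize on the grid

        cdf = np.cumsum(post)
        upper_90 = s[np.searchsorted(cdf, 0.90)]
        print(f"90% credible upper limit on the signal rate: {upper_90:.2f} events")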

  4. Applying Factor Analysis Combined with Kriging and Information Entropy Theory for Mapping and Evaluating the Stability of Groundwater Quality Variation in Taiwan

    PubMed Central

    Shyu, Guey-Shin; Cheng, Bai-You; Chiang, Chi-Ting; Yao, Pei-Hsuan; Chang, Tsun-Kuo

    2011-01-01

    In Taiwan many factors, whether geological parent materials, human activities, and climate change, can affect the groundwater quality and its stability. This work combines factor analysis and kriging with information entropy theory to interpret the stability of groundwater quality variation in Taiwan between 2005 and 2007. Groundwater quality demonstrated apparent differences between the northern and southern areas of Taiwan when divided by the Wu River. Approximately 52% of the monitoring wells in southern Taiwan suffered from progressing seawater intrusion, causing unstable groundwater quality. Industrial and livestock wastewaters also polluted 59.6% of the monitoring wells, resulting in elevated EC and TOC concentrations in the groundwater. In northern Taiwan, domestic wastewaters polluted city groundwater, resulting in higher NH3-N concentration and groundwater quality instability was apparent among 10.3% of the monitoring wells. The method proposed in this study for analyzing groundwater quality inspects common stability factors, identifies potential areas influenced by common factors, and assists in elevating and reinforcing information in support of an overall groundwater management strategy. PMID:21695030

  5. Analysis of algae growth mechanism and water bloom prediction under the effect of multi-affecting factor.

    PubMed

    Wang, Li; Wang, Xiaoyi; Jin, Xuebo; Xu, Jiping; Zhang, Huiyan; Yu, Jiabin; Sun, Qian; Gao, Chong; Wang, Lingbin

    2017-03-01

    Current methods describe the formation process of algae inaccurately and predict water blooms with low precision. In this paper, the chemical mechanism of algae growth is analyzed, and a correlation analysis of chlorophyll-a and algal density is conducted using chemical measurements. Taking into account the influence of multiple factors on algae growth and water blooms, a comprehensive prediction method combining multivariate time series and intelligent models is put forward. Firstly, through the process of photosynthesis, the main factors that affect the reproduction of the algae are analyzed. A compensation prediction method for multivariate time series analysis based on a neural network and a Support Vector Machine is put forward, combined with Kernel Principal Component Analysis for dimension reduction of the bloom-influencing factors. Then, a Genetic Algorithm is applied to improve the generalization ability of the BP network and the Least Squares Support Vector Machine. Experimental results show that this method can better compensate the multivariate time series prediction model and is an effective way to improve the description accuracy of algae growth and the prediction precision of water blooms.
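
    A rough Python sketch of the kind of pipeline the record outlines, kernel-based dimension reduction of bloom-influencing factors followed by a support vector regressor; the genetic-algorithm tuning and BP-network compensation are omitted, and the data, feature count, and hyperparameters are invented.

        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.svm import SVR
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        X = rng.normal(size=(300, 8))                 # e.g. temperature, TN, TP, pH, DO, ...
        y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=300)  # chlorophyll-a proxy

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = make_pipeline(KernelPCA(n_components=4, kernel="rbf"), SVR(C=10.0))
        model.fit(X_tr, y_tr)
        print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))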

  6. Non-linear principal component analysis applied to Lorenz models and to North Atlantic SLP

    NASA Astrophysics Data System (ADS)

    Russo, A.; Trigo, R. M.

    2003-04-01

    A non-linear generalisation of Principal Component Analysis (PCA), denoted Non-Linear Principal Component Analysis (NLPCA), is introduced and applied to the analysis of three data sets. NLPCA allows for the detection and characterisation of low-dimensional non-linear structure in multivariate data sets. The method is implemented using a 5-layer feed-forward neural network introduced originally in the chemical engineering literature (Kramer, 1991). The method is described and details of its implementation are addressed. NLPCA is first applied to a data set sampled from the Lorenz (1963) attractor. It is found that the NLPCA approximations are more representative of the data than the corresponding PCA approximations. The same methodology was applied to the less well-known Lorenz (1984) attractor; however, the results obtained were not as good as those attained with the famous 'Butterfly' attractor. Further work with this model is underway to assess whether NLPCA techniques can be more representative of the data characteristics than the corresponding PCA approximations. The application of NLPCA to relatively 'simple' dynamical systems, such as those proposed by Lorenz, is well understood; its application to a large climatic data set is much more challenging. Here, we have applied NLPCA to the sea level pressure (SLP) field for the entire North Atlantic area, and the results show a slight increment in the explained variance. Finally, directions for future work are presented.
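
    As a rough stand-in for the 5-layer autoassociative network described above, the sketch below trains a bottleneck multilayer perceptron to reconstruct a noisy circle, a toy analogue of low-dimensional non-linear structure; the layer sizes, activation, iteration budget, and data are assumptions and the Kramer architecture is only approximated.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(3)
        theta = rng.uniform(0, 2 * np.pi, 1000)
        X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(1000, 2))

        # Encoder-bottleneck-decoder: the single-node middle layer plays the role
        # of one non-linear principal component.
        ae = MLPRegressor(hidden_layer_sizes=(16, 1, 16), activation="tanh",
                          max_iter=5000, random_state=0)
        ae.fit(X, X)                                        # reconstruct inputs from themselves
        print("reconstruction R^2:", round(ae.score(X, X), 3))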

  7. A Review of CEFA Software: Comprehensive Exploratory Factor Analysis Program

    ERIC Educational Resources Information Center

    Lee, Soon-Mook

    2010-01-01

    CEFA 3.02(Browne, Cudeck, Tateneni, & Mels, 2008) is a factor analysis computer program designed to perform exploratory factor analysis. It provides the main properties that are needed for exploratory factor analysis, namely a variety of factoring methods employing eight different discrepancy functions to be minimized to yield initial…

  8. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of safety within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and/or classified in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factor Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, their impact on human error events, and to predict the Human Error Probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.
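
    The commonly cited HEART arithmetic can be sketched in a few lines: a generic task's nominal error probability is scaled by each assessed error-producing condition (EPC) according to its maximum multiplier and the assessed proportion of affect. The nominal probability, multipliers, and proportions below are invented and do not come from the NASA data.

        # Hedged sketch of the usual HEART calculation with made-up numbers.
        nominal_hep = 0.003                       # hypothetical generic task unreliability
        epcs = [                                  # (max multiplier, assessed proportion of affect)
            (11.0, 0.4),                          # e.g. unfamiliarity with the situation
            (3.0, 0.6),                           # e.g. time pressure
        ]

        hep = nominal_hep
        for multiplier, proportion in epcs:
            hep *= (multiplier - 1.0) * proportion + 1.0

        print(f"assessed HEP = {hep:.4f}")        # 0.003 * 5.0 * 2.2 = 0.033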

  9. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of safety within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and/or classified in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factor Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, their impact on human error events, and to predict the Human Error Probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  10. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of quality within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and/or classified in order to manage human error. This presentation will provide a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factor Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, their impact on human error events, and to predict the Human Error Probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  11. Applying Authentic Data Analysis in Learning Earth Atmosphere

    NASA Astrophysics Data System (ADS)

    Johan, H.; Suhandi, A.; Samsudin, A.; Wulan, A. R.

    2017-09-01

    The aim of this research was to develop earth science learning material, especially on the earth's atmosphere, supported by science research and authentic data analysis to enhance reasoning. Various earth and space science phenomena require reasoning. This research used an experimental method with a one-group pretest-posttest design, and 23 pre-service physics teachers participated. An essay test was conducted to collect data on reasoning ability and was analyzed quantitatively; an observation sheet was used to capture phenomena during the learning process. The results showed that students' reasoning ability improved from unidentified and no reasoning to evidence-based reasoning and inductive/deductive rule-based reasoning. Authentic data were analyzed using the Grid Analysis Display System (GrADS). Visualization from GrADS helped students correlate the concepts and brought real conditions of nature into classroom activity, and it also helped students reason about phenomena related to earth and space science concepts. It can be concluded that applying authentic data analysis in the learning process can help enhance students' reasoning. This study is expected to help lecturers bring results of geoscience research into the learning process and facilitate students' understanding of concepts.

  12. A Case Study in the Misrepresentation of Applied Behavior Analysis in Autism: The Gernsbacher Lectures

    PubMed Central

    Morris, Edward K

    2009-01-01

    I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life. (Tolstoy, 1894) This article presents a case study in the misrepresentation of applied behavior analysis for autism based on Morton Ann Gernsbacher's presentation of a lecture titled “The Science of Autism: Beyond the Myths and Misconceptions.” Her misrepresentations involve the characterization of applied behavior analysis, descriptions of practice guidelines, reviews of the treatment literature, presentations of the clinical trials research, and conclusions about those trials (e.g., children's improvements are due to development, not applied behavior analysis). The article also reviews applied behavior analysis' professional endorsements and research support, and addresses issues in professional conduct. It ends by noting the deleterious effects that misrepresenting any research on autism (e.g., biological, developmental, behavioral) have on our understanding and treating it in a transdisciplinary context. PMID:22478522

  13. Reachability Analysis Applied to Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Holzinger, M.; Scheeres, D.

    Several existing and emerging applications of Space Situational Awareness (SSA) relate directly to spacecraft Rendezvous, Proximity Operations, and Docking (RPOD) and Formation/Cluster Flight (FCF). When multiple Resident Space Objects (RSOs) are in the vicinity of one another with appreciable periods between observations, correlating new RSO tracks to previously known objects becomes a non-trivial problem. A particularly difficult sub-problem arises when long breaks in observations are coupled with continuous, low-thrust maneuvers. Reachability theory, directly related to optimal control theory, can compute contiguous reachability sets for known or estimated control authority and can support such RSO search and correlation efforts in both ground and on-board settings. Reachability analysis can also directly estimate the minimum control authority of a given RSO. For RPOD and FCF applications, emerging mission concepts such as fractionation drastically increase the system complexity of on-board autonomous fault management systems. Reachability theory, as applied to SSA in RPOD and FCF applications, can involve correlation of nearby RSO observations, control authority estimation, and sensor track re-acquisition. Additional uses of reachability analysis are formation reconfiguration, worst-case passive safety, and propulsion failure modes such as a "stuck" thruster. Existing reachability theory is applied to RPOD and FCF regimes. An optimal control policy is developed to maximize the reachability set, and optimal control law discontinuities (switching) are examined. The Clohessy-Wiltshire linearized equations of motion are normalized to accentuate relative control authority for spacecraft propulsion systems at both Low Earth Orbit (LEO) and Geostationary Earth Orbit (GEO). Several examples with traditional and low-thrust propulsion systems in LEO and GEO are explored to illustrate the effects of relative control authority on the time-varying reachability set surface. Both
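
    The relative dynamics underlying this reachability analysis are the Clohessy-Wiltshire equations; the sketch below propagates them under a constant low-thrust acceleration with SciPy, using an approximate LEO mean motion and a made-up thrust level purely for illustration.

        import numpy as np
        from scipy.integrate import solve_ivp

        n = 0.00113                                  # rad/s, approximate LEO mean motion
        a_thrust = np.array([0.0, 1e-6, 0.0])        # km/s^2, hypothetical control acceleration

        def cw(t, s):
            """Clohessy-Wiltshire relative motion with a constant control acceleration."""
            x, y, z, vx, vy, vz = s
            ax = 3 * n**2 * x + 2 * n * vy + a_thrust[0]
            ay = -2 * n * vx + a_thrust[1]
            az = -n**2 * z + a_thrust[2]
            return [vx, vy, vz, ax, ay, az]

        sol = solve_ivp(cw, (0, 3600), [0, 0, 0, 0, 0, 0], rtol=1e-9, atol=1e-12)
        print("relative position after 1 h (km):", sol.y[:3, -1])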

  14. Improved scatter correction with factor analysis for planar and SPECT imaging

    NASA Astrophysics Data System (ADS)

    Knoll, Peter; Rahmim, Arman; Gültekin, Selma; Šámal, Martin; Ljungberg, Michael; Mirzaei, Siroos; Segars, Paul; Szczupak, Boguslaw

    2017-09-01

    Quantitative nuclear medicine imaging is an increasingly important frontier. In order to achieve quantitative imaging, various interactions of photons with matter have to be modeled and compensated. Although correction for photon attenuation has been addressed by including x-ray CT scans (accurate), correction for Compton scatter remains an open issue. The inclusion of scattered photons within the energy window used for planar or SPECT data acquisition decreases the contrast of the image. While a number of methods for scatter correction have been proposed in the past, in this work, we propose and assess a novel, user-independent framework applying factor analysis (FA). Extensive Monte Carlo simulations for planar and tomographic imaging were performed using the SIMIND software. Furthermore, planar acquisition of two Petri dishes filled with 99mTc solutions and a Jaszczak phantom study (Data Spectrum Corporation, Durham, NC, USA) using a dual head gamma camera were performed. In order to use FA for scatter correction, we subdivided the applied energy window into a number of sub-windows, serving as input data. FA results in two factor images (photo-peak, scatter) and two corresponding factor curves (energy spectra). Planar and tomographic Jaszczak phantom gamma camera measurements were recorded. The tomographic data (simulations and measurements) were processed for each angular position resulting in a photo-peak and a scatter data set. The reconstructed transaxial slices of the Jaszczak phantom were quantified using an ImageJ plugin. The data obtained by FA showed good agreement with the energy spectra, photo-peak, and scatter images obtained in all Monte Carlo simulated data sets. For comparison, the standard dual-energy window (DEW) approach was additionally applied for scatter correction. FA in comparison with the DEW method results in significant improvements in image accuracy for both planar and tomographic data sets. FA can be used as a user

  15. The correlation analysis of tumor necrosis factor-alpha-308G/A polymorphism and venous thromboembolism risk: A meta-analysis.

    PubMed

    Gao, Quangen; Zhang, Peijin; Wang, Wei; Ma, He; Tong, Yue; Zhang, Jing; Lu, Zhaojun

    2016-10-01

    Venous thromboembolism is a common complex disorder, resulting from gene-gene and gene-environment interactions. Tumor necrosis factor-alpha is a proinflammatory cytokine that has been implicated in venous thromboembolism risk. A promoter 308G/A polymorphism in the tumor necrosis factor-alpha gene has been suggested to modulate the risk for venous thromboembolism; however, the published findings remain inconsistent. In this study, we conducted a meta-analysis of all available data regarding this issue. Eligible studies were identified through searches of the Pubmed, EBSCO Medline, Web of Science, and China National Knowledge Infrastructure (CNKI, Chinese) databases up to June 2014. Pooled odds ratios (ORs) with 95% confidence intervals were used to estimate the strength of the genetic association under the random-effects or fixed-effects model. A total of 10 studies involving 1999 venous thromboembolism cases and 2166 controls were included in this meta-analysis to evaluate the association between the tumor necrosis factor-alpha-308G/A polymorphism and venous thromboembolism risk. Overall, no significantly increased risk of venous thromboembolism was observed in any comparison model when all studies were pooled into the meta-analysis. However, in stratified analyses by ethnicity, there was a pronounced association with venous thromboembolism risk among West Asians in three genetic models (A vs. G: OR = 1.82, 95%CI = 1.13-2.94; GA vs. GG: OR = 1.82, 95%CI = 1.08-3.06; AA/GA vs. GG: OR = 1.88, 95%CI = 1.12-3.16). When stratifying by source of controls, no significant result was detected in any genetic model. This meta-analysis demonstrates that the tumor necrosis factor-alpha-308G/A polymorphism may contribute to susceptibility to venous thromboembolism among West Asians. Studies are needed to ascertain these findings in larger samples and different racial groups. © The Author(s) 2015.

  16. Conceptualizing and Measuring Weekend versus Weekday Alcohol Use: Item Response Theory and Confirmatory Factor Analysis

    PubMed Central

    Handren, Lindsay; Crano, William D.

    2018-01-01

    Culturally, people tend to abstain from alcohol intake during the weekdays and wait to consume in greater frequency and quantity during the weekends. The current research sought to empirically justify the days representing weekday versus weekend alcohol consumption. In study 1 (N = 419), item response theory was applied to a two-parameter (difficulty and discrimination) model that evaluated the days of drinking (frequency) during the typical 7-day week. Item characteristic curves were most similar for Monday, Tuesday, and Wednesday (prototypical weekday) and for Friday and Saturday (prototypical weekend). Thursday and Sunday, however, exhibited item characteristics that bordered the properties of weekday and weekend consumption. In study 2 (N = 403), confirmatory factor analysis was applied to test six hypothesized measurement structures representing drinks per day (quantity) during the typical week. The measurement model producing the strongest fit indices was a correlated two-factor structure involving separate weekday and weekend factors that permitted Thursday and Sunday to double load on both dimensions. The proper conceptualization and accurate measurement of the days demarcating the normative boundaries of “dry” weekdays and “wet” weekends are imperative to inform research and prevention efforts targeting temporal alcohol intake patterns. PMID:27488456
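
    A toy version of the two-parameter logistic (2PL) item characteristic curves compared in study 1 is sketched below; the discrimination (a) and difficulty (b) values per day are illustrative only and are not the fitted estimates from the record.

        import numpy as np

        def icc(theta, a, b):
            """2PL item characteristic curve: P(drinking on this day | theta)."""
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        theta = np.linspace(-3, 3, 7)                           # latent drinking propensity
        items = {"Tuesday": (1.8, 1.5), "Thursday": (1.2, 0.6), "Saturday": (1.5, -0.8)}
        for day, (a, b) in items.items():
            print(day, np.round(icc(theta, a, b), 2))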

  17. Conceptualizing and Measuring Weekend versus Weekday Alcohol Use: Item Response Theory and Confirmatory Factor Analysis.

    PubMed

    Lac, Andrew; Handren, Lindsay; Crano, William D

    2016-10-01

    Culturally, people tend to abstain from alcohol intake during the weekdays and wait to consume in greater frequency and quantity during the weekends. The current research sought to empirically justify the days representing weekday versus weekend alcohol consumption. In study 1 (N = 419), item response theory was applied to a two-parameter (difficulty and discrimination) model that evaluated the days of drinking (frequency) during the typical 7-day week. Item characteristic curves were most similar for Monday, Tuesday, and Wednesday (prototypical weekday) and for Friday and Saturday (prototypical weekend). Thursday and Sunday, however, exhibited item characteristics that bordered the properties of weekday and weekend consumption. In study 2 (N = 403), confirmatory factor analysis was applied to test six hypothesized measurement structures representing drinks per day (quantity) during the typical week. The measurement model producing the strongest fit indices was a correlated two-factor structure involving separate weekday and weekend factors that permitted Thursday and Sunday to double load on both dimensions. The proper conceptualization and accurate measurement of the days demarcating the normative boundaries of "dry" weekdays and "wet" weekends are imperative to inform research and prevention efforts targeting temporal alcohol intake patterns.

  18. Contextual factors in maternal and newborn health evaluation: a protocol applied in Nigeria, India and Ethiopia.

    PubMed

    Sabot, Kate; Marchant, Tanya; Spicer, Neil; Berhanu, Della; Gautham, Meenakshi; Umar, Nasir; Schellenberg, Joanna

    2018-01-01

    Understanding the context of a health programme is important in interpreting evaluation findings and in considering the external validity for other settings. Public health researchers can be imprecise and inconsistent in their usage of the word "context" and its application to their work. This paper presents an approach to defining context, to capturing relevant contextual information and to using such information to help interpret findings from the perspective of a research group evaluating the effect of diverse innovations on coverage of evidence-based, life-saving interventions for maternal and newborn health in Ethiopia, Nigeria, and India. We define "context" as the background environment or setting of any program, and "contextual factors" as those elements of context that could affect implementation of a programme. Through a structured, consultative process, contextual factors were identified while trying to strike a balance between comprehensiveness and feasibility. Thematic areas included demographics and socio-economics, epidemiological profile, health systems and service uptake, infrastructure, education, environment, politics, policy and governance. We outline an approach for capturing and using contextual factors while maximizing use of existing data. Methods include desk reviews, secondary data extraction and key informant interviews. Outputs include databases of contextual factors and summaries of existing maternal and newborn health policies and their implementation. Use of contextual data will be qualitative in nature and may assist in interpreting findings in both quantitative and qualitative aspects of programme evaluation. Applying this approach was more resource intensive than expected, in part because routinely available information was not consistently available across settings and more primary data collection was required than anticipated. Data was used only minimally, partly due to a lack of evaluation results that needed further explanation

  19. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    PubMed

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present real challenges in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases: acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors (sorin@wayne.edu). Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved.
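
    For reference, the classical p-value combination methods the record compares against (Fisher's and Stouffer's) can be computed as below with SciPy; the per-study p-values are made up, and the authors' bi-level additive/CLT framework itself is not reproduced here.

        import numpy as np
        from scipy import stats

        p = np.array([0.04, 0.20, 0.01, 0.60])      # hypothetical per-study p-values

        fisher_stat = -2 * np.sum(np.log(p))        # Fisher's combined test statistic
        fisher_p = stats.chi2.sf(fisher_stat, df=2 * len(p))

        z = stats.norm.isf(p)                       # one-sided z-scores
        stouffer_p = stats.norm.sf(np.sum(z) / np.sqrt(len(p)))

        print(f"Fisher combined p = {fisher_p:.4f}, Stouffer combined p = {stouffer_p:.4f}")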

  20. Proceedings of the 5th Symposium on applied surface analysis

    NASA Astrophysics Data System (ADS)

    Grant, J. T.

    1984-04-01

    The 5th Symposium on Applied Surface Analysis was held at the University of Dayton, 8-10 June 1983. This Symposium was held to meet a need, namely to show the transition between basic surface science research and applications of this research to areas of Department of Defense interest. Areas receiving special attention at this Symposium were chemical bonding and reactions at metal-semiconductors interfaces, surface analysis and the tribological processes of ion implanted materials, microbeam analysis and laser ionization of sputtered neutrals. Other topics discussed included adsorption, adhesion, corrosion, wear and thin films. Approximately 110 scientists active in the field of surface analysis participated in the Symposium. Four scientists presented invited papers at the Symposium. There were 29 contributed presentations. The proceedings of the Symposium are being published in a special issue of the journal, Applications of Surface Science, by North-Holland Publishing Company.

  1. Factors affecting the appreciation generated through applying human factors/ergonomics (HFE) principles to systems of work.

    PubMed

    So, R H Y; Lam, S T

    2014-01-01

    This retrospective study examined the levels of appreciation (applause) given by clients to Human Factors/Ergonomics (HFE) specialists after they had modified systems of work. Thirteen non-academic projects were chosen in which the HFE interventions changed the way workers work at their workplaces. The companies involved range from multi-national corporations and military organizations with thousands of employees to small trading companies with fewer than 10 employees. In 5 cases the HFE recommendations were fully adopted and well appreciated; in 4 they were largely ignored and not appreciated, with partial adoption and some appreciation in the other 4 cases. Three factors that predict appreciation were identified: (i) alignment between the benefits HFE can provide and the project's key performance indices; (ii) awareness of HFE among the client's senior management; and (iii) a team organization appropriate for applying HFE recommendations. Having an HFE specialist on the client's side can greatly increase levels of appreciation, but the lack of such a specialist will not affect levels of appreciation. A clear contractual requirement for HFE intervention does not promote appreciation significantly, but its absence can greatly reduce levels of appreciation. These relationships are discussed using Kano's model of quality, and means to generate greater appreciation of the benefits of HFE are discussed. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  2. Landslide distribution analysis and role of triggering factors in the Foglia river basin (Central Italy)

    NASA Astrophysics Data System (ADS)

    Baioni, Davide; Gallerini, Giuliano; Sgavetti, Maria

    2013-04-01

    The present work focuses on the distribution of landslides in the Foglia river basin area (northern Marche-Romagna), using a heuristic approach supported by GIS tools for statistical analysis of spatial data. The study area is located on the Adriatic side of the northern Apennines, at the boundary that marks the transition between the Marche and Emilia-Romagna regions. The Foglia river basin extends from the Apennines to the Adriatic sea with a NE-SE trend, occupying an area of about 708 km2. The purpose of this study is to investigate possible relationships between landslides and territorial factors, which were taken into account and divided into classes. To this aim, the landslide distribution was studied using a GIS approach, superimposing each previously created thematic map on the surveyed landslides. Furthermore, we tried to isolate the most recurrent classes, to detect whether, under the same conditions, one parameter affects landsliding more than the others, so as to recognize direct cause-and-effect relationships. Finally, an analysis was conducted by applying the Certainty Factor (CF) uncertainty model. A total of 2821 landslides were surveyed in the Foglia river basin, occupying a total area of 155 km2 and corresponding to 22% of the areal extent of the entire basin. The results of the analysis highlighted the importance and role of the individual factors that led to the development of the landslides analyzed. Moreover, this methodology may be applied at any order of magnitude and scale, as it does not require a significant commitment of economic or human resources.

  3. Dimensional Model for Estimating Factors influencing Childhood Obesity: Path Analysis Based Modeling

    PubMed Central

    Kheirollahpour, Maryam; Shohaimi, Shamarina

    2014-01-01

    The main objective of this study is to identify and develop a comprehensive model which estimates and evaluates the overall relations among the factors that lead to weight gain in children by using structural equation modeling. The proposed models in this study explore the connection among the socioeconomic status of the family, parental feeding practice, and physical activity. Six structural models were tested to identify the direct and indirect relationships among the socioeconomic status, parental feeding practice, general level of physical activity, and weight status of children. Finally, a comprehensive model was devised to show how these factors relate to each other as well as to the body mass index (BMI) of the children simultaneously. Concerning the methodology of the current study, confirmatory factor analysis (CFA) was applied to reveal the hidden (secondary) effect of socioeconomic factors on feeding practice and ultimately on the weight status of the children and also to determine the degree of model fit. The comprehensive structural model tested in this study suggested that there are significant direct and indirect relationships among the variables of interest. Moreover, the results suggest that parental feeding practice and physical activity are mediators in the structural model. PMID:25097878

  4. Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation

    ERIC Educational Resources Information Center

    Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom

    2014-01-01

    Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…

  5. Conversation after Right Hemisphere Brain Damage: Motivations for Applying Conversation Analysis

    ERIC Educational Resources Information Center

    Barnes, Scott; Armstrong, Elizabeth

    2010-01-01

    Despite the well documented pragmatic deficits that can arise subsequent to Right Hemisphere Brain Damage (RHBD), few researchers have directly studied everyday conversations involving people with RHBD. In recent years, researchers have begun applying Conversation Analysis (CA) to the everyday talk of people with aphasia. This research programme…

  6. The "Human Factor" in Pure and in Applied Mathematics. Systems Everywhere: Their Impact on Mathematics Education.

    ERIC Educational Resources Information Center

    Fischer, Roland

    1992-01-01

    Discusses the impact that the relationship between people and mathematics could have on the development of pure and applied mathematics. Argues for (1) a growing interest in philosophy, history and sociology of science; (2) new models in educational and psychological research; and (3) a growing awareness of the human factor in technology,…

  7. Job compensable factors and factor weights derived from job analysis data.

    PubMed

    Chi, Chia-Fen; Chang, Tin-Chang; Hsia, Ping-Ling; Song, Jen-Chieh

    2007-06-01

    Government data on 1,039 job titles in Taiwan were analyzed to assess possible relationships between job attributes and compensation. For each job title, 79 specific variables in six major classes (required education and experience, aptitude, interest, work temperament, physical demands, task environment) were coded to derive the statistical predictors of wage for managers, professionals, technical, clerical, service, farm, craft, operatives, and other workers. Of the 79 variables, only 23 significantly related to pay rate were subjected to a factor and multiple regression analysis for predicting monthly wages. Given the heterogeneous nature of collected job titles, a 4-factor solution (occupational knowledge and skills, human relations skills, work schedule hardships, physical hardships) explaining 43.8% of the total variance but predicting only 23.7% of the monthly pay rate was derived. On the other hand, multiple regression with 9 job analysis items (required education, professional training, professional certificate, professional experience, coordinating, leadership and directing, demand on hearing, proportion of shift working indoors, outdoors and others, rotating shift) better predicted pay and explained 32.5% of the variance. A direct comparison of factors and subfactors of job evaluation plans indicated mental effort and responsibility (accountability) had not been measured with the current job analysis data. Cross-validation of job evaluation factors and ratings with the wage rates is required to calibrate both.

  8. Using BMDP and SPSS for a Q factor analysis.

    PubMed

    Tanner, B A; Koning, S M

    1980-12-01

    While Euclidean distances and Q factor analysis may sometimes be preferred to correlation coefficients and cluster analysis for developing a typology, commercially available software does not always facilitate their use. Commands are provided for using BMDP and SPSS in a Q factor analysis with Euclidean distances.
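
    As a companion to the BMDP/SPSS commands the record refers to, here is a hedged Python sketch of the same idea on made-up profile data: persons are factored rather than tests, starting from a person-by-person Euclidean distance matrix that is double-centered and eigendecomposed (the principal-coordinates route). It is an illustration of the technique, not the authors' command files.

    ```python
    # Q-type factoring of persons from Euclidean distances (hypothetical data).
    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    rng = np.random.default_rng(0)
    X = rng.normal(size=(30, 8))                  # 30 persons x 8 test scores (invented)

    D = squareform(pdist(X, metric="euclidean"))  # person-by-person distance matrix
    J = np.eye(30) - np.ones((30, 30)) / 30       # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                   # double-centered similarity matrix

    eigval, eigvec = np.linalg.eigh(B)
    order = np.argsort(eigval)[::-1]              # largest eigenvalues first
    scores = eigvec[:, order[:2]] * np.sqrt(eigval[order[:2]])  # two "person factors"
    print(scores[:5])                             # coordinates of the first five persons
    ```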

  9. 76 FR 36543 - Draft Guidance for Industry and Food and Drug Administration Staff: Applying Human Factors and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-22

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2011-D-0469] Draft Guidance for Industry and Food and Drug Administration Staff: Applying Human Factors and Usability Engineering To Optimize Medical Device Design; Availability AGENCY: Food and Drug Administration, HHS. ACTION...

  10. Likelihood-Based Confidence Intervals in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Oort, Frans J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…

  11. PBIS May Not Qualify as Classical Applied Behavior Analysis. So What?

    PubMed

    Critchfield, Thomas S

    2015-05-01

    Some disagreement exists over whether Positive Behavior Interventions and Supports (PBIS) embodies the features of Applied Behavior Analysis (ABA) as described in a classic 1968 paper by Baer, Wolf, and Risley. When it comes to disseminating interventions at a societal level, a more compelling issue is whether ABA should become more like PBIS.

  12. Nonlinear Analysis of Squeeze Film Dampers Applied to Gas Turbine Helicopter Engines.

    DTIC Science & Technology

    1980-11-01

    calculate the stability (complex roots) of a multi-level gas turbine with aerodynamic excitation. This program has been applied to the space shuttle...such phenomena as oil film whirl. This paper develops an analysis technique incorporating modal analysis and fast Fourier transform techniques to...USING A SQUEEZE FILM BEARING, by M. A. Simpson, Research Engineer, and L. E. Barrett, Research Assistant Professor, Department of Mechanical and Aerospace

  13. Rainfall or parameter uncertainty? The power of sensitivity analysis on grouped factors

    NASA Astrophysics Data System (ADS)

    Nossent, Jiri; Pereira, Fernando; Bauwens, Willy

    2017-04-01

    Hydrological models are typically used to study and represent (a part of) the hydrological cycle. In general, the output of these models mostly depends on their input rainfall and parameter values. Both the model parameters and the input precipitation, however, are characterized by uncertainties and therefore lead to uncertainty in the model output. Sensitivity analysis (SA) makes it possible to assess and compare the importance of the different factors for this output uncertainty. To this end, the rainfall uncertainty can be incorporated in the SA by representing it as a probabilistic multiplier. Such a multiplier can be defined for the entire time series, or several of these factors can be determined for every recorded rainfall pulse or for hydrologically independent storm events. As a consequence, the number of parameters included in the SA that relate to the rainfall uncertainty can be (much) lower or (much) higher than the number of model parameters. Although such analyses can yield interesting results, it remains challenging to determine which type of uncertainty will affect the model output most, owing to the different weights the two types have within the SA. In this study, we apply the variance-based Sobol' sensitivity analysis method to two different hydrological simulators (NAM and HyMod) for four diverse watersheds. Besides the different number of model parameters (NAM: 11 parameters; HyMod: 5 parameters), the setup of our combined sensitivity and uncertainty analysis is also varied by defining a variety of scenarios with diverse numbers of rainfall multipliers. To overcome the issue of the different number of factors and, thus, the different weights of the two types of uncertainty, we build on one of the advantageous properties of the Sobol' SA, i.e. treating grouped parameters as a single parameter. The latter results in a setup with a single factor for each uncertainty type and allows for a straightforward comparison of their importance. In general, the results show a clear
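
    The grouping device described above can be sketched with the SALib package (assumed installed, and assumed to support the `groups` field as in recent releases). The toy model, variable names and bounds below are invented stand-ins for NAM/HyMod; the point is only that all rainfall multipliers are scored as one Sobol' factor and all model parameters as another.

    ```python
    # Hedged sketch: Sobol' indices with grouped factors via SALib (toy model).
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 5,
        "names": ["par1", "par2", "par3", "mult1", "mult2"],
        "bounds": [[0, 1]] * 5,
        "groups": ["parameters", "parameters", "parameters", "rainfall", "rainfall"],
    }

    X = saltelli.sample(problem, 1024)                 # samples vary whole groups together
    Y = X[:, 0] + 2 * X[:, 1] * X[:, 2] + 0.5 * (X[:, 3] + X[:, 4])  # toy "simulator"
    Si = sobol.analyze(problem, Y)
    # Total-order index per group (assuming groups are reported in order of appearance).
    print(dict(zip(["parameters", "rainfall"], Si["ST"])))
    ```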

  14. What School Psychologists Need to Know about Factor Analysis

    ERIC Educational Resources Information Center

    McGill, Ryan J.; Dombrowski, Stefan C.

    2017-01-01

    Factor analysis is a versatile class of psychometric techniques used by researchers to provide insight into the psychological dimensions (factors) that may account for the relationships among variables in a given dataset. The primary goal of a factor analysis is to determine a more parsimonious set of variables (i.e., fewer than the number of…

  15. School-Wide PBIS: Extending the Impact of Applied Behavior Analysis. Why is This Important to Behavior Analysts?

    PubMed

    Putnam, Robert F; Kincaid, Donald

    2015-05-01

    Horner and Sugai (2015) recently wrote a manuscript providing an overview of school-wide positive behavioral interventions and supports (PBIS) and why it is an example of applied behavior analysis at the scale of social importance. This paper will describe why school-wide PBIS is important to behavior analysts, how it helps promote applied behavior analysis in schools and other organizations, and how behavior analysts can use this framework to assist them in the promotion and implementation of applied behavior analysis at both at the school and organizational level, as well as, the classroom and individual level.

  16. Q-Type Factor Analysis of Healthy Aged Men.

    ERIC Educational Resources Information Center

    Kleban, Morton H.

    Q-type factor analysis was used to re-analyze baseline data collected in 1957, on 47 men aged 65-91. Q-type analysis is the use of factor methods to study persons rather than tests. Although 550 variables were originally studied involving psychiatry, medicine, cerebral metabolism and chemistry, personality, audiometry, dichotic and diotic memory,…

  17. Classical linear-control analysis applied to business-cycle dynamics and stability

    NASA Technical Reports Server (NTRS)

    Wingrove, R. C.

    1983-01-01

    Linear control analysis is applied as an aid in understanding the fluctuations of business cycles in the past, and to examine monetary policies that might improve stabilization. The analysis shows how different policies change the frequency and damping of the economic system dynamics, and how they modify the amplitude of the fluctuations that are caused by random disturbances. Examples are used to show how policy feedbacks and policy lags can be incorporated, and how different monetary strategies for stabilization can be analytically compared. Representative numerical results are used to illustrate the main points.
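
    A minimal sketch of the kind of reasoning summarized above, using a generic second-order system rather than the paper's economic model: a proportional policy feedback changes the natural frequency and damping of the closed loop, and hence how strongly the response to a disturbance oscillates. All numbers are illustrative.

    ```python
    # Generic second-order system with proportional policy feedback u = -k*x:
    # closed loop is x'' + 2*zeta*wn*x' + (wn^2 + k)*x = d.
    import numpy as np
    from scipy import signal

    wn, zeta = 1.0, 0.1                   # open-loop natural frequency and damping

    for k in [0.0, 0.5, 2.0]:             # candidate policy feedback gains
        wn_cl = np.sqrt(wn**2 + k)        # closed-loop natural frequency
        zeta_cl = zeta * wn / wn_cl       # closed-loop damping ratio
        # Unit-DC-gain form, so only the shape of the oscillation changes.
        sys = signal.TransferFunction([wn_cl**2], [1.0, 2 * zeta_cl * wn_cl, wn_cl**2])
        t, y = signal.step(sys)
        print(f"k={k}: wn={wn_cl:.2f}, zeta={zeta_cl:.2f}, peak={y.max():.2f}")
    ```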

  18. Mixture Factor Analysis for Approximating a Nonnormally Distributed Continuous Latent Factor with Continuous and Dichotomous Observed Variables

    ERIC Educational Resources Information Center

    Wall, Melanie M.; Guo, Jia; Amemiya, Yasuo

    2012-01-01

    Mixture factor analysis is examined as a means of flexibly estimating nonnormally distributed continuous latent factors in the presence of both continuous and dichotomous observed variables. A simulation study compares mixture factor analysis with normal maximum likelihood (ML) latent factor modeling. Different results emerge for continuous versus…

  19. Lovaas Model of Applied Behavior Analysis. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2010

    2010-01-01

    The "Lovaas Model of Applied Behavior Analysis" is a type of behavioral therapy that initially focuses on discrete trials: brief periods of one-on-one instruction, during which a teacher cues a behavior, prompts the appropriate response, and provides reinforcement to the child. Children in the program receive an average of 35 to 40 hours…

  20. Graphical and Numerical Descriptive Analysis: Exploratory Tools Applied to Vietnamese Data

    ERIC Educational Resources Information Center

    Haughton, Dominique; Phong, Nguyen

    2004-01-01

    This case study covers several exploratory data analysis ideas, the histogram and boxplot, kernel density estimates, the recently introduced bagplot--a two-dimensional extension of the boxplot--as well as the violin plot, which combines a boxplot with a density shape plot. We apply these ideas and demonstrate how to interpret the output from these…

  1. Common factor analysis versus principal component analysis: choice for symptom cluster research.

    PubMed

    Kim, Hee-Ju

    2008-03-01

    The purpose of this paper is to examine differences between two factor analytical methods and their relevance for symptom cluster research: common factor analysis (CFA) versus principal component analysis (PCA). Literature was critically reviewed to elucidate the differences between CFA and PCA. A secondary analysis (N = 84) was utilized to show the actual result differences from the two methods. CFA analyzes only the reliable common variance of data, while PCA analyzes all the variance of data. An underlying hypothetical process or construct is involved in CFA but not in PCA. PCA tends to increase factor loadings especially in a study with a small number of variables and/or low estimated communality. Thus, PCA is not appropriate for examining the structure of data. If the study purpose is to explain correlations among variables and to examine the structure of the data (this is usual for most cases in symptom cluster research), CFA provides a more accurate result. If the purpose of a study is to summarize data with a smaller number of variables, PCA is the choice. PCA can also be used as an initial step in CFA because it provides information regarding the maximum number and nature of factors. In using factor analysis for symptom cluster research, several issues need to be considered, including subjectivity of solution, sample size, symptom selection, and level of measure.
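
    The CFA-versus-PCA contrast above can be illustrated with a hedged scikit-learn sketch on simulated symptom data (this is not the secondary dataset analysed in the paper): `FactorAnalysis` models only the shared variance, whereas PCA decomposes total variance, so PCA loadings tend to come out larger when unique variance is substantial.

    ```python
    # Common factor analysis vs. PCA loadings on simulated data (illustrative).
    import numpy as np
    from sklearn.decomposition import PCA, FactorAnalysis

    rng = np.random.default_rng(1)
    n = 300
    common = rng.normal(size=(n, 1))                      # one latent "symptom cluster"
    true_loadings = np.array([[0.7, 0.6, 0.5, 0.4]])
    X = common @ true_loadings + rng.normal(scale=0.8, size=(n, 4))  # plus unique variance

    Xs = (X - X.mean(0)) / X.std(0)                       # standardize variables
    pca = PCA(n_components=1).fit(Xs)
    fa = FactorAnalysis(n_components=1).fit(Xs)

    pca_loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
    print("PCA loadings:", np.round(pca_loadings.ravel(), 2))
    print("FA  loadings:", np.round(fa.components_.ravel(), 2))
    ```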

  2. Applying Rasch model analysis in the development of the cantonese tone identification test (CANTIT).

    PubMed

    Lee, Kathy Y S; Lam, Joffee H S; Chan, Kit T Y; van Hasselt, Charles Andrew; Tong, Michael C F

    2017-01-01

    This study applies Rasch analysis to evaluate the internal structure of a lexical tone perception test known as the Cantonese Tone Identification Test (CANTIT). A 75-item pool (CANTIT-75) with pictures and sound tracks was developed. Respondents were required to make a four-alternative forced choice on each item. A short version of 30 items (CANTIT-30) was developed based on fit statistics, difficulty estimates, and content evaluation. Internal structure was evaluated by fit statistics and Rasch Factor Analysis (RFA). In total, 200 children with normal hearing and 141 children with hearing impairment were recruited. For CANTIT-75, all infit and 97% of outfit values were < 2.0. RFA revealed that 40.1% of the total variance was explained by the Rasch measure. The first residual component explained 2.5% of the total variance, with an eigenvalue of 3.1. For CANTIT-30, all infit and outfit values were < 2.0. The Rasch measure explained 38.8% of the total variance, and the first residual component explained 3.9% of the total variance, with an eigenvalue of 1.9. The Rasch model provides excellent guidance for the development of short forms. Both CANTIT-75 and CANTIT-30 possess a satisfactory internal structure as construct validity evidence in measuring the lexical tone identification ability of Cantonese speakers.

  3. Derived Basic Ability Factors: A Factor Analysis Replication Study.

    ERIC Educational Resources Information Center

    Lee, Mickey M.; Lee, Lynda Newby

    The purpose of this study was to replicate the study conducted by Potter, Sagraves, and McDonald to determine whether their recommended analysis could separate criterion variables into similar factors that were stable from year to year and from school to school. The replication samples consisted of all students attending Louisiana State University…

  4. Logistic regression analysis of risk factors for postoperative recurrence of spinal tumors and analysis of prognostic factors.

    PubMed

    Zhang, Shanyong; Yang, Lili; Peng, Chuangang; Wu, Minfei

    2018-02-01

    The aim of the present study was to investigate the risk factors for postoperative recurrence of spinal tumors by logistic regression analysis and analysis of prognostic factors. In total, 77 male and 48 female patients with spinal tumor were selected in our hospital from January, 2010 to December, 2015 and divided into the benign (n=76) and malignant groups (n=49). All the patients underwent microsurgical resection of spinal tumors and were reviewed regularly 3 months after operation. The McCormick grading system was used to evaluate the postoperative spinal cord function. Data were subjected to statistical analysis. Of the 125 cases, 63 cases showed improvement after operation, 50 cases were stable, and deterioration was found in 12 cases. The improvement rate of patients with cervical spine tumor, which reached 56.3%, was the highest. Fifty-two cases of sensory disturbance, 34 cases of pain, 30 cases of inability to exercise, 26 cases of ataxia, and 12 cases of sphincter disorders were found after operation. Seventy-two cases (57.6%) underwent total resection, 18 cases (14.4%) received subtotal resection, 23 cases (18.4%) received partial resection, and 12 cases (9.6%) were only treated with biopsy/decompression. Postoperative recurrence was found in 57 cases (45.6%). The mean recurrence time of patients in the malignant group was 27.49±6.09 months, and the mean recurrence time of patients in the benign group was 40.62±4.34. The results were significantly different (P<0.001). Recurrence was found in 18 cases of the benign group and 39 cases of the malignant group, and results were significantly different (P<0.001). Tumor recurrence was shorter in patients with a higher McCormick grade (P<0.001). Recurrence was found in 13 patients with resection and all the patients with partial resection or biopsy/decompression. The results were significantly different (P<0.001). Logistic regression analysis of total resection-related factors showed that total resection
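
    To make the analysis concrete, here is a minimal, hypothetical sketch of the kind of logistic regression reported above, using statsmodels on invented data; the column names (total_resection, malignant, mccormick_grade) are illustrative stand-ins for the study's variables, not its dataset.

    ```python
    # Logistic regression of recurrence on candidate risk factors (invented data).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 125
    df = pd.DataFrame({
        "total_resection": rng.integers(0, 2, n),
        "malignant": rng.integers(0, 2, n),
        "mccormick_grade": rng.integers(1, 5, n),
    })
    # Simulate recurrence with some assumed effects so the fit has structure.
    logit = -1.0 + 1.2 * df["malignant"] - 1.5 * df["total_resection"] + 0.4 * df["mccormick_grade"]
    df["recurrence"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = sm.add_constant(df[["total_resection", "malignant", "mccormick_grade"]])
    result = sm.Logit(df["recurrence"], X).fit(disp=False)
    print(np.exp(result.params))   # odds ratios for each candidate factor
    ```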

  5. Logistic regression analysis of risk factors for postoperative recurrence of spinal tumors and analysis of prognostic factors

    PubMed Central

    Zhang, Shanyong; Yang, Lili; Peng, Chuangang; Wu, Minfei

    2018-01-01

    The aim of the present study was to investigate the risk factors for postoperative recurrence of spinal tumors by logistic regression analysis and analysis of prognostic factors. In total, 77 male and 48 female patients with spinal tumor were selected in our hospital from January, 2010 to December, 2015 and divided into the benign (n=76) and malignant groups (n=49). All the patients underwent microsurgical resection of spinal tumors and were reviewed regularly 3 months after operation. The McCormick grading system was used to evaluate the postoperative spinal cord function. Data were subjected to statistical analysis. Of the 125 cases, 63 cases showed improvement after operation, 50 cases were stable, and deterioration was found in 12 cases. The improvement rate of patients with cervical spine tumor, which reached 56.3%, was the highest. Fifty-two cases of sensory disturbance, 34 cases of pain, 30 cases of inability to exercise, 26 cases of ataxia, and 12 cases of sphincter disorders were found after operation. Seventy-two cases (57.6%) underwent total resection, 18 cases (14.4%) received subtotal resection, 23 cases (18.4%) received partial resection, and 12 cases (9.6%) were only treated with biopsy/decompression. Postoperative recurrence was found in 57 cases (45.6%). The mean recurrence time of patients in the malignant group was 27.49±6.09 months, and the mean recurrence time of patients in the benign group was 40.62±4.34. The results were significantly different (P<0.001). Recurrence was found in 18 cases of the benign group and 39 cases of the malignant group, and results were significantly different (P<0.001). Tumor recurrence was shorter in patients with a higher McCormick grade (P<0.001). Recurrence was found in 13 patients with resection and all the patients with partial resection or biopsy/decompression. The results were significantly different (P<0.001). Logistic regression analysis of total resection-related factors showed that total resection

  6. On the Likelihood Ratio Test for the Number of Factors in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Bentler, Peter M.; Yuan, Ke-Hai

    2007-01-01

    In the exploratory factor analysis, when the number of factors exceeds the true number of factors, the likelihood ratio test statistic no longer follows the chi-square distribution due to a problem of rank deficiency and nonidentifiability of model parameters. As a result, decisions regarding the number of factors may be incorrect. Several…

  7. A Brief History of the Philosophical Foundations of Exploratory Factor Analysis.

    ERIC Educational Resources Information Center

    Mulaik, Stanley A.

    1987-01-01

    Exploratory factor analysis derives its key ideas from many sources, including Aristotle, Francis Bacon, Descartes, Pearson and Yule, and Kant. The conclusions of exploratory factor analysis are never complete without subsequent confirmatory factor analysis. (Author/GDC)

  8. The combined use of dynamic factor analysis and wavelet analysis to evaluate latent factors controlling complex groundwater level fluctuations in a riverside alluvial aquifer

    NASA Astrophysics Data System (ADS)

    Oh, Yun-Yeong; Yun, Seong-Taek; Yu, Soonyoung; Hamm, Se-Yeong

    2017-12-01

    To identify and quantitatively evaluate complex latent factors controlling groundwater level (GWL) fluctuations in a riverside alluvial aquifer influenced by barrage construction, we developed the combined use of dynamic factor analysis (DFA) and wavelet analysis (WA). Time series data of GWL, river water level and precipitation were collected for 3 years (July 2012 to June 2015) from an alluvial aquifer underneath an agricultural area of the Nakdong river basin, South Korea. Based on the wavelet coefficients of the final approximation, the GWL data was clustered into three groups (WCG1 to WCG3). Two dynamic factors (DFs) were then extracted using DFA for each group; thus, six major factors were extracted. Next, the time-frequency variability of the extracted DFs was examined using multiresolution cross-correlation analysis (MRCCA) with the following steps: 1) major driving forces and their scales in GWL fluctuations were identified by comparing maximum correlation coefficients (rmax) between DFs and the GWL time series and 2) the results were supplemented using the wavelet transformed coherence (WTC) analysis between DFs and the hydrological time series. Finally, relative contributions of six major DFs to the GWL fluctuations could be quantitatively assessed by calculating the effective dynamic efficiency (Def). The characteristics and relevant process of the identified six DFs are: 1) WCG1DF4,1 as an indicative of seasonal agricultural pumping (scales = 64-128 days; rmax = 0.68-0.89; Def ≤ 23.1%); 2) WCG1DF4,4 representing the cycle of regional groundwater recharge (scales = 64-128 days; rmax = 0.98-1.00; Def ≤ 11.1%); 3) WCG2DF4,1 indicating the complex interaction between the episodes of precipitation and direct runoff (scales = 2-8 days; rmax = 0.82-0.91; Def ≤ 35.3%) and seasonal GW-RW interaction (scales = 64-128 days; rmax = 0.76-0.91; Def ≤ 14.2%); 4) WCG2DF4,4 reflecting the complex effects of seasonal pervasive pumping and the local recharge
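
    A hedged sketch of the wavelet (multiresolution) step only, not the dynamic factor analysis or the coherence and efficiency calculations: a synthetic groundwater-level-like series is decomposed with PyWavelets (assumed installed) and the variance carried by each scale band is inspected, which is the kind of information used above to cluster the GWL series.

    ```python
    # Multiresolution decomposition of a synthetic daily series (illustrative).
    import numpy as np
    import pywt

    t = np.arange(3 * 365)                                  # ~3 years of daily data
    gwl = (np.sin(2 * np.pi * t / 365)                      # seasonal recharge cycle
           + 0.3 * np.sin(2 * np.pi * t / 7)                # short pumping-like cycle
           + 0.2 * np.random.default_rng(3).normal(size=t.size))

    coeffs = pywt.wavedec(gwl, "db4", level=5)              # approximation + 5 detail bands
    labels = ["A5"] + [f"D{k}" for k in range(5, 0, -1)]
    for name, c in zip(labels, coeffs):
        print(name, "variance:", round(float(np.var(c)), 3))
    ```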

  9. Contextual Factors for Establishing Nursing Regulation in Iran: A Qualitative Content Analysis

    PubMed Central

    Nejatian, Ahmad; Joulaei, Hassan

    2018-01-01

    ABSTRACT Background: Professional regulation is one of the strategies governments use to protect the public's rights. Nursing practice is no exception; hence, it is regulated to protect the public against the adverse effects of nursing services. Although modern nursing in Iran started about 100 years ago, documents show that there was no regulation mechanism for nursing in Iran until 2016. Hence, this study was conducted to illuminate the contextual factors affecting the nursing regulation process in Iran. Methods: To explore the contextual elements behind the late establishment of nursing registration, an important part of nursing regulation, we applied directed qualitative content analysis. For this purpose, all the historical events and related materials, including articles published in scientific journals, gray literature, statements, news articles, and interviews in the period of 2006-2016, were reviewed, analyzed by an expert panel and categorized in predetermined groups. Results: The pooled analysis showed four contributing elements that affected the emergence of nursing regulation in Iran: 1) cultural determinants, 2) structural determinants, 3) situational determinants, and 4) international or exogenous determinants. Conclusion: Nursing regulation is an important health policy issue in Iran which needs to be facilitated by contextual factors. These factors are complicated and country-specific. Political willingness should be accompanied by nursing association willingness to establish and improve nursing regulation. Further research is recommended to explore the actors, process, and content of nursing regulation policy in Iran. PMID:29607341

  10. Potential barriers to the application of multi-factor portfolio analysis in public hospitals: evidence from a pilot study in the Netherlands.

    PubMed

    Pavlova, Milena; Tsiachristas, Apostolos; Vermaeten, Gerhard; Groot, Wim

    2009-01-01

    Portfolio analysis is a business management tool that can assist health care managers to develop new organizational strategies. The application of portfolio analysis to US hospital settings has been frequently reported. In Europe however, the application of this technique has received little attention, especially concerning public hospitals. Therefore, this paper examines the peculiarities of portfolio analysis and its applicability to the strategic management of European public hospitals. The analysis is based on a pilot application of a multi-factor portfolio analysis in a Dutch university hospital. The nature of portfolio analysis and the steps in a multi-factor portfolio analysis are reviewed along with the characteristics of the research setting. Based on these data, a multi-factor portfolio model is developed and operationalized. The portfolio model is applied in a pilot investigation to analyze the market attractiveness and hospital strengths with regard to the provision of three orthopedic services: knee surgery, hip surgery, and arthroscopy. The pilot portfolio analysis is discussed to draw conclusions about potential barriers to the overall adoption of portfolio analysis in the management of a public hospital. Copyright (c) 2008 John Wiley & Sons, Ltd.

  11. Δg: The new aromaticity index based on g-factor calculation applied for polycyclic benzene rings

    NASA Astrophysics Data System (ADS)

    Ucun, Fatih; Tokatlı, Ahmet

    2015-02-01

    In this work, the aromaticity of polycyclic benzene rings was evaluated by calculating the g-factor for a hydrogen placed perpendicularly at the geometrical center of the related ring plane at a distance of 1.2 Å. The results were compared with other commonly used aromaticity indices, such as HOMA, NICSs, PDI, FLU, MCI and CTED, and were generally found to be in agreement with them. It is therefore proposed that the calculation of the average g-factor, Δg, can be applied to study the aromaticity of polycyclic benzene rings, without any restriction on the number of benzene rings, as a new magnetic-based aromaticity index.

  12. A systematic review of the relationship factor between women and health professionals within the multivariant analysis of maternal satisfaction.

    PubMed

    Macpherson, Ignacio; Roqué-Sánchez, María V; Legget Bn, Finola O; Fuertes, Ferran; Segarra, Ignacio

    2016-10-01

    Personalised support provided to women by health professionals is one of the prime factors in attaining women's satisfaction during pregnancy and childbirth. However, the multifactorial nature of 'satisfaction' makes it difficult to assess. Statistical multivariate analysis may be an effective technique to obtain in-depth quantitative evidence of the importance of this factor and its interaction with the other factors involved. This technique allows us to estimate the importance of overall satisfaction in its context and to suggest actions for healthcare services. A systematic review was conducted of studies that quantitatively measure the personal relationship between women and healthcare professionals (gynecologists, obstetricians, nurses, midwives, etc.) regarding maternity care satisfaction. The literature search focused on studies carried out between 1970 and 2014 that used multivariate analyses and included the woman-caregiver relationship as a factor in their analysis. Twenty-four studies which applied various multivariate analysis tools to different periods of maternity care (antenatal, perinatal, post partum) were selected. The studies included discrete scale scores and questionnaires from women with low-risk pregnancies. The "personal relationship" factor appeared under various names: care received, personalised treatment, professional support, amongst others. The most common multivariate techniques used to assess the percentage of variance explained and the odds ratio of each factor were principal component analysis and logistic regression. The data, variables and factor analysis suggest that continuous, personalised care provided by the usual midwife and delivered within a family or a specialised setting generates the highest level of satisfaction. In addition, these factors foster the woman's psychological and physiological recovery, often surpassing clinical action (e.g. medicalization and hospital organization) and/or physiological determinants (e.g. pain, pathologies, etc

  13. Hand function evaluation: a factor analysis study.

    PubMed

    Jarus, T; Poremba, R

    1993-05-01

    The purpose of this study was to investigate hand function evaluations. Factor analysis with varimax rotation was used to assess the fundamental characteristics of the items included in the Jebsen Hand Function Test and the Smith Hand Function Evaluation. The study sample consisted of 144 subjects without disabilities and 22 subjects with Colles fracture. Results suggest a four factor solution: Factor I--pinch movement; Factor II--grasp; Factor III--target accuracy; and Factor IV--activities of daily living. These categories differentiated the subjects without Colles fracture from the subjects with Colles fracture. A hand function evaluation consisting of these four factors would be useful. Such an evaluation that can be used for current clinical purposes is provided.

  14. Factors that influence full-time MPH Students' willingness in China: would You apply again for an MPH graduate degree if you had another opportunity?

    PubMed

    Wang, Nan; Jia, Jinzhong; Wu, Ke; Wang, Yuanyuan; Zhang, Chi; Cao, Wei; Duan, Liping; Wang, Zhifeng

    2017-02-14

    Current and emerging challenges to public health in the 21st century are vastly different from those faced in previous centuries, and the shortage of health personnel and their low level of educational qualifications has hindered the development of Chinese public health services. To address this need, the Ministry of Education initiated a full-time Master of Public Health (MPH) graduate programme in 2009. This study aimed to evaluate the level of graduate students' satisfaction with full-time MPH education in China, and whether they would apply again for an MPH graduate degree if they had another opportunity to do so, as well as to identify the factors influencing their decision-making process. An anonymous, web-based survey questionnaire containing 61 items was distributed to 702 MPH students in 35 universities or colleges. The questions covered the categories of student admission, training goals, lecture courses, practical training, research activities and mentorship. Levels of satisfaction were compared between MPH students who would choose MPH again as their graduate degree if they had another opportunity to do so and those who would not. Key influencing factors of training satisfaction were identified using logistic regression models. A total of 65.10% of the participants would apply again for MPH education if they had another opportunity to do so. The factors influencing students' willingness included their university type, the time since admission and their initial willingness. In addition, the four common factors (admissions & lecture courses, research activities & mentorship, practical training and training goals) emerging from factor analysis were all significantly positively correlated with student willingness (p < 0.001). Most MPH students surveyed were highly satisfied with their MPH education and, although they advocated for improvements and reforms in some aspects, they would still choose MPH as their

  15. Independent component analysis-based algorithm for automatic identification of Raman spectra applied to artistic pigments and pigment mixtures.

    PubMed

    González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio

    2015-03-01

    A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
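
    The PCA/ICA unmixing step described above can be sketched with scikit-learn on synthetic two-component "spectra"; this is only an illustration of the decomposition, not the authors' full pipeline or their reliability factor, and all peak positions and mixing proportions are invented.

    ```python
    # PCA to gauge dimensionality, then FastICA to unmix synthetic mixture spectra.
    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    x = np.linspace(0, 2000, 1000)                               # Raman-shift-like axis (a.u.)

    def peak(center, width):
        return np.exp(-0.5 * ((x - center) / width) ** 2)

    s1 = peak(450, 15) + 0.6 * peak(1080, 20)                    # pure "pigment" 1
    s2 = peak(250, 10) + 0.8 * peak(1450, 25)                    # pure "pigment" 2

    A = np.array([[0.7, 0.3], [0.4, 0.6], [0.2, 0.8]])           # mixing proportions
    mixtures = A @ np.vstack([s1, s2])                           # 3 measured mixture spectra

    evr = PCA().fit(mixtures.T).explained_variance_ratio_
    print("PCA variance ratios:", np.round(evr, 3))              # ~2 non-negligible components

    ica = FastICA(n_components=2, random_state=0)
    recovered = ica.fit_transform(mixtures.T).T                  # estimated source spectra
    print(recovered.shape)                                       # (2, 1000)
    ```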

  16. Elder Abuse by Adult Children: An Applied Ecological Framework for Understanding Contextual Risk Factors and the Intergenerational Character of Quality of Life.

    ERIC Educational Resources Information Center

    Schiamberg, Lawrence B.; Gans, Daphna

    2000-01-01

    Using an applied ecological model, this study focuses on contextual risk factors of elder abuse. Five levels of environment were used to interpret existing research on risk factors. Configuration of risk factors provides a framework for understanding the intergenerational character of quality of life for older adults, developing recommendations…

  17. [Analysis of citations and national and international impact factor of Farmacia Hospitalaria (2001-2005)].

    PubMed

    Aleixandre-Benavent, R; González Alcaide, G; Miguel-Dasit, A; González de Dios, J; de Granda Orive, J I; Valderrama Zurián, J C

    2007-01-01

    The objective of this study is to analyse the citation patterns and impact and immediacy indicators of the Farmacia Hospitalaria journal during the period 2001-2005. An analysis of citations chosen from 101 Spanish health science journals was carried out in order to determine the citing and cited journals and the national and international impact and immediacy indicators. A methodology similar to that used by Thomson ISI in the Science Citation Index (SCI) and Journal Citation Reports (JCR) was applied. Farmacia Hospitalaria made 1,370 citations to 316 different journals. The percentage of self-citations was 9%. The national impact factor increased from 0.178 points in 2001 to 0.663 points in 2005, while the international impact factor increased from 0.178 to 0.806 over the same period. The analysis of citation patterns demonstrates the multidisciplinary nature of Farmacia Hospitalaria and a significant growth in the impact indicators over recent years. These indicators are higher than those of some other pharmacy journals included in Journal Citation Reports. Self-citation was not excessive and was similar to that of other journals.
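
    For readers unfamiliar with the calculation being reproduced, here is a tiny worked example (with invented numbers, not the study's counts) of the JCR-style impact factor: citations received in year Y to items published in years Y-1 and Y-2, divided by the citable items published in those two years.

    ```python
    # Worked impact-factor arithmetic on invented counts.
    def impact_factor(cites_prev1, cites_prev2, items_prev1, items_prev2):
        return (cites_prev1 + cites_prev2) / (items_prev1 + items_prev2)

    # e.g. a hypothetical 2005 figure built from citations made by the source journals
    print(round(impact_factor(cites_prev1=40, cites_prev2=53, items_prev1=70, items_prev2=70), 3))
    ```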

  18. Improvement of Quench Factor Analysis in Phase and Hardness Prediction of a Quenched Steel

    NASA Astrophysics Data System (ADS)

    Kianezhad, M.; Sajjadi, S. A.

    2013-05-01

    The accurate prediction of alloy properties produced by heat treatment has been considered by many researchers. The advantages of such predictions are a reduction of test trials and material consumption, as well as time and energy savings. One of the most important methods to predict hardness in quenched steel parts is Quench Factor Analysis (QFA). Classical QFA is based on the Johnson-Mehl-Avrami-Kolmogorov (JMAK) equation. In this study, a modified form of the QFA based on the work by Rometsch et al. is compared with the classical QFA, and both are applied to the prediction of hardness in steels. For this purpose, samples of CK60 steel were utilized as raw material. They were austenitized at 1103 K (830 °C). After quenching in different environments, they were cut and their hardness was determined. In addition, the hardness values of the samples were fitted using the classical and modified equations for the quench factor analysis and the results were compared. The results showed a significant improvement in the fitted hardness values and proved the higher efficiency of the new method.
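
    As a rough illustration of the classical quench-factor bookkeeping only, the sketch below accumulates incremental quench factors from a cooling curve and maps the total to a hardness fraction. The critical-time function c_t, the cooling curve, the transformation range and the 99.5% criterion are all assumed placeholders, not the calibrated CK60 values or the modified (Rometsch-type) formulation compared in the paper.

    ```python
    # Classical quench-factor accumulation with placeholder inputs (illustrative).
    import numpy as np

    def c_t(temp_c):
        # Hypothetical "C-curve": shortest critical time near an assumed nose at ~550 C.
        return 1.0 + 0.002 * (temp_c - 550.0) ** 2          # seconds

    # An invented cooling curve: temperature (C) sampled every 0.5 s.
    times = np.arange(0.0, 60.0, 0.5)
    temps = 830.0 * np.exp(-times / 20.0) + 20.0

    dt = np.diff(times)
    t_mid = 0.5 * (temps[1:] + temps[:-1])
    mask = (t_mid > 200.0) & (t_mid < 800.0)                # assumed transformation range
    Q = np.sum(dt[mask] / c_t(t_mid[mask]))                 # accumulated quench factor

    k1 = np.log(0.995)                                      # conventional 99.5% criterion
    hardness_fraction = np.exp(k1 * Q)                      # fraction of maximum hardness
    print(round(Q, 2), round(hardness_fraction, 3))
    ```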

  19. Web impact factor: a bibliometric criterion applied to medical informatics societies' web sites.

    PubMed

    Soualmia, Lina Fatima; Darmoni, Stéfan Jacques; Le Duff, Franck; Douyere, Magaly; Thelwall, Maurice

    2002-01-01

    Several methods are available to evaluate and compare medical journals. The most popular is the journal Impact Factor, derived from averaging counts of citations to articles. Ingwersen adapted this method to assess the attractiveness of Web sites, defining the external Web Impact Factor (WIF) to be the number of external pages containing a link to a given Web site. This paper applies the WIF to 43 medical informatics societies' Web sites using advanced search engine queries to obtain the necessary link counts. The WIF was compared to the number of publications available in the Medline bibliographic database in medical informatics in these 43 countries. Between these two metrics, the observed Pearson correlation was 0.952 (p < 0.01) and the Spearman rank correlation was 0.548 (p < 0.01), showing in both cases a positive and strongly significant correlation. The WIF of a medical informatics society's Web site is statistically related to national productivity, and discrepancies can be used to indicate countries where there are either weak medical informatics associations, or ones that do not make optimal use of the Web.
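
    The comparison step can be sketched in a few lines with SciPy on invented counts (the real link counts came from search-engine queries): external-inlink WIF values per society site against Medline publication counts, with the same two correlation measures reported above.

    ```python
    # Pearson and Spearman correlation of WIF vs. publication counts (invented data).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    medline_pubs = rng.integers(5, 2000, size=43)                     # publications per country
    wif = (0.5 * medline_pubs + rng.normal(0, 100, size=43)).clip(0)  # external inlink counts

    r, p_r = stats.pearsonr(wif, medline_pubs)
    rho, p_rho = stats.spearmanr(wif, medline_pubs)
    print(f"Pearson r={r:.2f} (p={p_r:.3g}), Spearman rho={rho:.2f} (p={p_rho:.3g})")
    ```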

  20. Configurations of Common Childhood Psychosocial Risk Factors

    ERIC Educational Resources Information Center

    Copeland, William; Shanahan, Lilly; Costello, E. Jane; Angold, Adrian

    2009-01-01

    Background: Co-occurrence of psychosocial risk factors is commonplace, but little is known about psychiatrically-predictive configurations of psychosocial risk factors. Methods: Latent class analysis (LCA) was applied to 17 putative psychosocial risk factors in a representative population sample of 920 children ages 9 to 17. The resultant class…

  1. Bayesian inference of the number of factors in gene-expression analysis: application to human virus challenge studies

    PubMed Central

    2010-01-01

    Background Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Results Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Conclusions Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data. PMID:21062443

  2. Bayesian inference of the number of factors in gene-expression analysis: application to human virus challenge studies.

    PubMed

    Chen, Bo; Chen, Minhua; Paisley, John; Zaas, Aimee; Woods, Christopher; Ginsburg, Geoffrey S; Hero, Alfred; Lucas, Joseph; Dunson, David; Carin, Lawrence

    2010-11-09

    Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data.

  3. System Analysis Applied to Autonomy: Application to High-Altitude Long-Endurance Remotely Operated Aircraft

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.

    2006-01-01

    Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.

  4. Spatial Bayesian Latent Factor Regression Modeling of Coordinate-based Meta-analysis Data

    PubMed Central

    Montagna, Silvia; Wager, Tor; Barrett, Lisa Feldman; Johnson, Timothy D.; Nichols, Thomas E.

    2017-01-01

    Summary Now over 20 years old, functional MRI (fMRI) has a large and growing literature that is best synthesised with meta-analytic tools. As most authors do not share image data, only the peak activation coordinates (foci) reported in the paper are available for Coordinate-Based Meta-Analysis (CBMA). Neuroimaging meta-analysis is used to 1) identify areas of consistent activation; and 2) build a predictive model of task type or cognitive process for new studies (reverse inference). To simultaneously address these aims, we propose a Bayesian point process hierarchical model for CBMA. We model the foci from each study as a doubly stochastic Poisson process, where the study-specific log intensity function is characterised as a linear combination of a high-dimensional basis set. A sparse representation of the intensities is guaranteed through latent factor modeling of the basis coefficients. Within our framework, it is also possible to account for the effect of study-level covariates (meta-regression), significantly expanding the capabilities of the current neuroimaging meta-analysis methods available. We apply our methodology to synthetic data and neuroimaging meta-analysis datasets. PMID:28498564

  5. Spatial Bayesian latent factor regression modeling of coordinate-based meta-analysis data.

    PubMed

    Montagna, Silvia; Wager, Tor; Barrett, Lisa Feldman; Johnson, Timothy D; Nichols, Thomas E

    2018-03-01

    Now over 20 years old, functional MRI (fMRI) has a large and growing literature that is best synthesised with meta-analytic tools. As most authors do not share image data, only the peak activation coordinates (foci) reported in the article are available for Coordinate-Based Meta-Analysis (CBMA). Neuroimaging meta-analysis is used to (i) identify areas of consistent activation; and (ii) build a predictive model of task type or cognitive process for new studies (reverse inference). To simultaneously address these aims, we propose a Bayesian point process hierarchical model for CBMA. We model the foci from each study as a doubly stochastic Poisson process, where the study-specific log intensity function is characterized as a linear combination of a high-dimensional basis set. A sparse representation of the intensities is guaranteed through latent factor modeling of the basis coefficients. Within our framework, it is also possible to account for the effect of study-level covariates (meta-regression), significantly expanding the capabilities of the current neuroimaging meta-analysis methods available. We apply our methodology to synthetic data and neuroimaging meta-analysis datasets. © 2017, The International Biometric Society.

  6. Multivariate analysis of prognostic factors in synovial sarcoma.

    PubMed

    Koh, Kyoung Hwan; Cho, Eun Yoon; Kim, Dong Wook; Seo, Sung Wook

    2009-11-01

    Many studies have described the diversity of synovial sarcoma in terms of its biological characteristics and clinical features. Moreover, much effort has been expended on the identification of prognostic factors because of the unpredictable behavior of synovial sarcomas. However, with the exception of tumor size, published results have been inconsistent. We attempted to identify independent risk factors using survival analysis. Forty-one consecutive patients with synovial sarcoma were prospectively followed from January 1997 to March 2008. Overall and progression-free survival for age, sex, tumor size, tumor location, metastasis at presentation, histologic subtype, chemotherapy, radiation therapy, and resection margin were analyzed, and standard multivariate Cox proportional hazard regression analysis was used to evaluate potential prognostic factors. Tumor size (>5 cm), nonlimb-based tumors, metastasis at presentation, and a monophasic subtype were associated with poorer overall survival. Multivariate analysis showed that metastasis at presentation and monophasic tumor subtype affected overall survival. For progression-free survival, the monophasic subtype was found to be the only prognostic factor. The study confirmed that histologic subtype is the single most important independent prognostic factor of synovial sarcoma regardless of tumor stage.
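
    A minimal sketch of the multivariate Cox proportional hazards step, using the lifelines package (assumed installed) on hypothetical data whose column names merely mirror covariates mentioned above; it is not the study dataset and the simulated coefficients carry no clinical meaning.

    ```python
    # Cox proportional hazards on invented survival data (illustrative).
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(5)
    n = 41
    df = pd.DataFrame({
        "time_months": rng.exponential(36, n).round(1),
        "event": rng.integers(0, 2, n),                 # progression/death observed
        "tumor_gt_5cm": rng.integers(0, 2, n),
        "monophasic": rng.integers(0, 2, n),
        "metastasis_at_presentation": rng.integers(0, 2, n),
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time_months", event_col="event")
    print(cph.summary[["exp(coef)", "p"]])              # hazard ratios and p-values
    ```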

  7. Applied cognitive task analysis (ACTA): a practitioner's toolkit for understanding cognitive task demands.

    PubMed

    Militello, L G; Hutton, R J

    1998-11-01

    Cognitive task analysis (CTA) is a set of methods for identifying cognitive skills, or mental demands, needed to perform a task proficiently. The product of the task analysis can be used to inform the design of interfaces and training systems. However, CTA is resource intensive and has previously been of limited use to design practitioners. A streamlined method of CTA, Applied Cognitive Task Analysis (ACTA), is presented in this paper. ACTA consists of three interview methods that help the practitioner to extract information about the cognitive demands and skills required for a task. ACTA also allows the practitioner to represent this information in a format that will translate more directly into applied products, such as improved training scenarios or interface recommendations. This paper will describe the three methods, an evaluation study conducted to assess the usability and usefulness of the methods, and some directions for future research for making cognitive task analysis accessible to practitioners. ACTA techniques were found to be easy to use, flexible, and to provide clear output. The information and training materials developed based on ACTA interviews were found to be accurate and important for training purposes.

  8. Applied Behavior Analysis and the Imprisoned Adult Felon Project 1: The Cellblock Token Economy.

    ERIC Educational Resources Information Center

    Milan, Michael A.; And Others

    This report provides a technical-level analysis, discussion, and summary of five experiments in applied behavior analysis. Experiment 1 examined the token economy as a basis for motivating inmate behavior; Experiment 2 examined the relationship between magnitude of token reinforcement and level of inmate performance; Experiment 3 introduced a…

  9. Pressure and protective factors influencing nursing students' self-esteem: A content analysis study.

    PubMed

    Valizadeh, Leila; Zamanzadeh, Vahid; Gargari, Rahim Badri; Ghahramanian, Akram; Tabrizi, Faranak Jabbarzadeh; Keogh, Brian

    2016-01-01

    A review of the literature shows that self-esteem in nursing students ranges from normal to low. It is hypothesized that different contextual factors could affect levels of self-esteem. The main aim of this study was to explore these factors from the viewpoint of Iranian nursing students using a qualitative approach. A qualitative content analysis study. Faculty of Nursing and Midwifery, 2014. Fourteen student nurses and two qualified nurses. The analysis was applied at various depths of interpretation. Semi-structured interviews were used to collect the data. Fourteen student nurses and two qualified nurses were interviewed. Two main themes were extracted in this study: "pressure factors", with the subthemes low self-efficacy, sense of triviality, ineffective instructor-student interaction and low self-confidence; and "protective factors", with the subthemes knowledge acquisition, mirror of valuability, professional autonomy, religious beliefs, and choosing the nursing field with interest. Results showed that these themes interact with each other like a seesaw: as pressure factors decrease, the effect of protective factors on self-esteem increases. Nurse educators should try not only to improve students' skills and knowledge but also to enhance the protective factors and decrease the pressure factors by reinforcing nursing students' feeling of being important, using participatory teaching methods, and considering students' feedback; attempts to improve the facilities at the clinics are also recommended. Copyright © 2015. Published by Elsevier Ltd.

  10. Factor Analysis for Clustered Observations.

    ERIC Educational Resources Information Center

    Longford, N. T.; Muthen, B. O.

    1992-01-01

    A two-level model for factor analysis is defined, and formulas for a scoring algorithm for this model are derived. A simple noniterative method based on decomposition of the total sums of squares and cross-products is discussed and illustrated with simulated data and data from the Second International Mathematics Study. (SLD)
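
    As a rough illustration of the noniterative approach described above (not the authors' algorithm; all data and column names below are hypothetical), the total sums-of-squares-and-cross-products matrix of clustered data can be split exactly into between-cluster and within-cluster parts, which could then be factor-analyzed separately:

```python
import numpy as np
import pandas as pd

def decompose_sscp(df, group, cols):
    """Split the total SSCP matrix into between- and within-cluster parts."""
    X = df[cols].to_numpy(dtype=float)
    grand_mean = X.mean(axis=0)
    total = (X - grand_mean).T @ (X - grand_mean)
    between = np.zeros_like(total)
    within = np.zeros_like(total)
    for _, g in df.groupby(group):
        G = g[cols].to_numpy(dtype=float)
        m = G.mean(axis=0)
        within += (G - m).T @ (G - m)
        between += len(G) * np.outer(m - grand_mean, m - grand_mean)
    return total, between, within

# Hypothetical clustered data: 200 students (rows) nested in 10 schools.
rng = np.random.default_rng(0)
students = pd.DataFrame(rng.normal(size=(200, 3)), columns=["v1", "v2", "v3"])
students["school"] = rng.integers(0, 10, size=200)

total, between, within = decompose_sscp(students, "school", ["v1", "v2", "v3"])
assert np.allclose(total, between + within)  # the decomposition is exact
```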

  11. Treatment of Autism in Young Children: Behavioral Intervention and Applied Behavior Analysis.

    ERIC Educational Resources Information Center

    Jensen, Vanessa K.; Sinclair, Leslie V.

    2002-01-01

    This article discusses the etiology and scope of autism in young children, screening and diagnosis, intervention options, and the use of applied behavior analysis. Supporting evidence of the efficacy of intensive behavioral intervention is cited, and variations in treatments and techniques are reviewed. Barriers to effective services are also…

  12. Toward a Technology of Derived Stimulus Relations: An Analysis of Articles Published in the "Journal of Applied Behavior Analysis," 1992-2009

    ERIC Educational Resources Information Center

    Rehfeldt, Ruth Anne

    2011-01-01

    Every article on stimulus equivalence or derived stimulus relations published in the "Journal of Applied Behavior Analysis" was evaluated in terms of characteristics that are relevant to the development of applied technologies: the type of participants, settings, procedure (automated vs. tabletop), stimuli, and stimulus sensory modality; types of…

  13. Sequential analysis applied to clinical trials in dentistry: a systematic review.

    PubMed

    Bogowicz, P; Flores-Mir, C; Major, P W; Heo, G

    2008-01-01

    Clinical trials employ sequential analysis for the ethical and economic benefits it brings. In dentistry, as in other fields, resources are scarce and efforts are made to ensure that patients are treated ethically. The objective of this systematic review was to characterise the use of sequential analysis for clinical trials in dentistry. We searched various databases from 1900 through to January 2008. Articles were selected for review if they were clinical trials in the field of dentistry that had applied some form of sequential analysis. Selection was carried out independently by two of the authors. We included 18 trials from various specialties, which involved many different interventions. We conclude that sequential analysis seems to be underused in this field but that there are sufficient methodological resources in place for future applications. Evidence-Based Dentistry (2008) 9, 55-62. doi:10.1038/sj.ebd.6400587.
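
    For readers unfamiliar with sequential designs, the sketch below illustrates one classical ingredient, Wald's sequential probability ratio test for a binary outcome; the hypotheses, error rates, and data stream are invented for illustration and do not correspond to any trial in the review.

```python
import math

def wald_sprt(outcomes, p0=0.4, p1=0.6, alpha=0.05, beta=0.2):
    """Wald's sequential probability ratio test for a binary outcome.

    Returns ('accept H1' or 'accept H0', observations used), or
    ('continue', n) if the stream ends before a boundary is crossed.
    """
    upper = math.log((1 - beta) / alpha)   # crossing above -> accept H1 (p = p1)
    lower = math.log(beta / (1 - alpha))   # crossing below -> accept H0 (p = p0)
    llr, n = 0.0, 0
    for x in outcomes:                     # x is 1 (success) or 0 (failure)
        n += 1
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue", n

# Hypothetical stream of patient-level treatment successes/failures.
print(wald_sprt([1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1]))
```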

  14. Using factor analysis to identify neuromuscular synergies during treadmill walking

    NASA Technical Reports Server (NTRS)

    Merkle, L. A.; Layne, C. S.; Bloomberg, J. J.; Zhang, J. J.

    1998-01-01

    Neuroscientists are often interested in grouping variables to facilitate understanding of a particular phenomenon. Factor analysis is a powerful statistical technique that groups variables into conceptually meaningful clusters, but remains underutilized by neuroscience researchers presumably due to its complicated concepts and procedures. This paper illustrates an application of factor analysis to identify coordinated patterns of whole-body muscle activation during treadmill walking. Ten male subjects walked on a treadmill (6.4 km/h) for 20 s during which surface electromyographic (EMG) activity was obtained from the left side sternocleidomastoid, neck extensors, erector spinae, and right side biceps femoris, rectus femoris, tibialis anterior, and medial gastrocnemius. Factor analysis revealed that 65% of the variance of the seven muscles sampled aligned with two orthogonal factors, labeled 'transition control' and 'loading'. These two factors describe coordinated patterns of muscular activity across body segments that would not be evident by evaluating individual muscle patterns. The results show that factor analysis can be effectively used to explore relationships among muscle patterns across all body segments to increase understanding of the complex coordination necessary for smooth and efficient locomotion. We encourage neuroscientists to consider using factor analysis to identify coordinated patterns of neuromuscular activation that would be obscured using more traditional EMG analyses.
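
    A minimal sketch of this kind of workflow is shown below, assuming the third-party factor_analyzer package and a synthetic stand-in for the EMG envelope matrix; it extracts two varimax-rotated factors and reports loadings and cumulative variance explained, mirroring the general procedure rather than the authors' exact pipeline.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # third-party package

# Hypothetical matrix: rows = time-normalized samples of the gait cycle,
# columns = EMG envelopes of the seven muscles sampled in the study.
muscles = ["SCM", "NeckExt", "ErectorSpinae", "BicepsFem",
           "RectusFem", "TibialisAnt", "MedGastroc"]
rng = np.random.default_rng(1)
emg = pd.DataFrame(rng.random((120, 7)), columns=muscles)

fa = FactorAnalyzer(n_factors=2, rotation="varimax")
fa.fit(emg)

loadings = pd.DataFrame(fa.loadings_, index=muscles, columns=["F1", "F2"])
_, _, cumulative = fa.get_factor_variance()
print(loadings.round(2))
print(f"cumulative variance explained: {cumulative[-1]:.0%}")
```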

  15. AASC Recommendations for the Education of an Applied Climatologist

    NASA Astrophysics Data System (ADS)

    Nielsen-Gammon, J. W.; Stooksbury, D.; Akyuz, A.; Dupigny-Giroux, L.; Hubbard, K. G.; Timofeyeva, M. M.

    2011-12-01

    The American Association of State Climatologists (AASC) has developed curricular recommendations for the education of future applied and service climatologists. The AASC was founded in 1976. Membership of the AASC includes state climatologists and others who work in state climate offices; climate researchers in academia and educators; applied climatologists in NOAA and other federal agencies; and the private sector. The AASC is the only professional organization dedicated solely to the growth and development of applied and service climatology. The purpose of the recommendations is to offer a framework for existing and developing academic climatology programs. These recommendations are intended to serve as a road map and to help distinguish the educational needs for future applied climatologists from those of operational meteorologists or other scientists and practitioners. While the home department of climatology students may differ from one program to the next, the most essential factor is that students can demonstrate a breadth and depth of understanding in the knowledge and tools needed to be an applied climatologist. Because the training of an applied climatologist requires significant depth and breadth, the Masters degree is recommended as the minimum level of education needed. This presentation will highlight the AASC recommendations. These include a strong foundation in: - climatology (instrumentation and data collection, climate dynamics, physical climatology, synoptic and regional climatology, applied climatology, climate models, etc.) - basic natural sciences and mathematics including calculus, physics, chemistry, and biology/ecology - fundamental atmospheric sciences (atmospheric dynamics, atmospheric thermodynamics, atmospheric radiation, and weather analysis/synoptic meteorology) and - data analysis and spatial analysis (descriptive statistics, statistical methods, multivariate statistics, geostatistics, GIS, etc.). The recommendations also include a

  16. Application of homomorphic signal processing to stress wave factor analysis

    NASA Technical Reports Server (NTRS)

    Karagulle, H.; Williams, J. H., Jr.; Lee, S. S.

    1985-01-01

    The stress wave factor (SWF) signal, which is the output of an ultrasonic testing system where the transmitting and receiving transducers are coupled to the same face of the test structure, is analyzed in the frequency domain. The SWF signal generated in an isotropic elastic plate is modelled as the superposition of successive reflections. The reflection which is generated by the stress waves which travel p times as a longitudinal (P) wave and s times as a shear (S) wave through the plate while reflecting back and forth between the bottom and top faces of the plate is designated as the reflection with p, s. Short-time portions of the SWF signal are considered for obtaining spectral information on individual reflections. If the significant reflections are not overlapped, the short-time Fourier analysis is used. A summary of the relevant points of homomorphic signal processing, which is also called cepstrum analysis, is given. Homomorphic signal processing is applied to short-time SWF signals to obtain estimates of the log spectra of individual reflections for cases in which the reflections are overlapped. Two typical SWF signals generated in aluminum plates (overlapping and non-overlapping reflections) are analyzed.
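
    The core homomorphic step, computing a real cepstrum so that an overlapping echo shows up as a peak at its delay (quefrency), can be sketched in a few lines; the synthetic pulse-plus-echo signal and sampling rate below are assumptions, not the paper's SWF data.

```python
import numpy as np

def real_cepstrum(x):
    """Real cepstrum: inverse FFT of the log magnitude spectrum."""
    spectrum = np.fft.fft(x)
    return np.fft.ifft(np.log(np.abs(spectrum) + 1e-12)).real

# Synthetic stand-in for a short-time SWF segment: a tone burst plus a
# delayed, attenuated echo; overlapping reflections separate in the cepstrum.
fs = 1_000_000                       # assumed 1 MHz sampling rate
t = np.arange(1024) / fs
burst = np.exp(-((t - 5e-5) ** 2) / (2 * (5e-6) ** 2)) * np.sin(2 * np.pi * 2e5 * t)
echo_delay = 200                     # samples
signal = burst.copy()
signal[echo_delay:] += 0.5 * burst[:-echo_delay]

ceps = real_cepstrum(signal)
print("echo quefrency (samples):", np.argmax(ceps[50:400]) + 50)
```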

  17. A Factor Analysis of the BSRI and the PAQ.

    ERIC Educational Resources Information Center

    Edwards, Teresa A.; And Others

    Factor analysis of the Bem Sex Role Inventory (BSRI) and the Personality Attributes Questionnaire (PAQ) was undertaken to study the independence of the masculine and feminine scales within each instrument. Both instruments were administered to undergraduate education majors. Analysis of primary first and second order factors of the BSRI indicated…

  18. Applying psychological theories to evidence-based clinical practice: identifying factors predictive of placing preventive fissure sealants.

    PubMed

    Bonetti, Debbie; Johnston, Marie; Clarkson, Jan E; Grimshaw, Jeremy; Pitts, Nigel B; Eccles, Martin; Steen, Nick; Thomas, Ruth; Maclennan, Graeme; Glidewell, Liz; Walker, Anne

    2010-04-08

    Psychological models are used to understand and predict behaviour in a wide range of settings, but have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. This study explored the usefulness of a range of models to predict an evidence-based behaviour -- the placing of fissure sealants. Measures were collected by postal questionnaire from a random sample of general dental practitioners (GDPs) in Scotland. Outcomes were behavioural simulation (scenario decision-making), and behavioural intention. Predictor variables were from the Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model, and knowledge (a non-theoretical construct). Multiple regression analysis was used to examine the predictive value of each theoretical model individually. Significant constructs from all theories were then entered into a 'cross theory' stepwise regression analysis to investigate their combined predictive value. Behavioural simulation - theory level variance explained was: TPB 31%; SCT 29%; II 7%; OLT 30%. Neither CS-SRM nor stage explained significant variance. In the cross theory analysis, habit (OLT), timeline acute (CS-SRM), and outcome expectancy (SCT) entered the equation, together explaining 38% of the variance. Behavioural intention - theory level variance explained was: TPB 30%; SCT 24%; OLT 58%, CS-SRM 27%. GDPs in the action stage had significantly higher intention to place fissure sealants. In the cross theory analysis, habit (OLT) and attitude (TPB) entered the equation, together explaining 68% of the variance in intention. The study provides evidence that psychological models can be useful in understanding and predicting clinical behaviour. Taking a theory-based approach enables the creation of a replicable methodology for identifying factors that may predict clinical behaviour
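
    The kind of theory-level and cross-theory regression described here can be sketched as follows; the constructs, sample size, and simulated relationships are placeholders, and the forward "cross-theory" step is reduced to a single illustrative combination.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated survey of dentists; constructs and effect sizes are invented.
rng = np.random.default_rng(2)
n = 130
survey = pd.DataFrame({
    "attitude": rng.normal(size=n),              # TPB construct
    "habit": rng.normal(size=n),                 # OLT construct
    "outcome_expectancy": rng.normal(size=n),    # SCT construct
})
survey["intention"] = (0.5 * survey["habit"] + 0.3 * survey["attitude"]
                       + rng.normal(scale=0.8, size=n))

def r_squared(predictors):
    X = sm.add_constant(survey[predictors])
    return sm.OLS(survey["intention"], X).fit().rsquared

# Theory-level variance explained, then one illustrative cross-theory step.
for theory, preds in {"TPB": ["attitude"], "OLT": ["habit"],
                      "SCT": ["outcome_expectancy"]}.items():
    print(theory, round(r_squared(preds), 2))
print("cross-theory (habit + attitude):", round(r_squared(["habit", "attitude"]), 2))
```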

  19. Establishing Factor Validity Using Variable Reduction in Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Hofmann, Rich

    1995-01-01

    Using a 21-statement attitude-type instrument, an iterative procedure for improving confirmatory model fit is demonstrated within the context of the EQS program of P. M. Bentler and maximum likelihood factor analysis. Each iteration systematically eliminates the poorest fitting statement as identified by a variable fit index. (SLD)

  20. A study in the founding of applied behavior analysis through its publications.

    PubMed

    Morris, Edward K; Altus, Deborah E; Smith, Nathaniel G

    2013-01-01

    This article reports a study of the founding of applied behavior analysis through its publications. Our methods included (a) hand searches of sources (e.g., journals, reference lists), (b) search terms (i.e., early, applied, behavioral, research, literature), (c) inclusion criteria (e.g., the field's applied dimension), and (d) challenges to their face and content validity. Our results were 36 articles published between 1959 and 1967 that we organized into 4 groups: 12 in 3 programs of research and 24 others. Our discussion addresses (a) limitations in our method (e.g., the completeness of our search), (b) challenges to the validity of our methods and results (e.g., convergent validity), and (c) priority claims about the field's founding. We conclude that the claims are irresolvable because identification of the founding publications depends significantly on methods and because the field's founding was an evolutionary process. We close with suggestions for future research.

  1. A Study in the Founding of Applied Behavior Analysis Through Its Publications

    PubMed Central

    Morris, Edward K.; Altus, Deborah E.; Smith, Nathaniel G.

    2013-01-01

    This article reports a study of the founding of applied behavior analysis through its publications. Our methods included (a) hand searches of sources (e.g., journals, reference lists), (b) search terms (i.e., early, applied, behavioral, research, literature), (c) inclusion criteria (e.g., the field's applied dimension), and (d) challenges to their face and content validity. Our results were 36 articles published between 1959 and 1967 that we organized into 4 groups: 12 in 3 programs of research and 24 others. Our discussion addresses (a) limitations in our method (e.g., the completeness of our search), (b) challenges to the validity of our methods and results (e.g., convergent validity), and (c) priority claims about the field's founding. We conclude that the claims are irresolvable because identification of the founding publications depends significantly on methods and because the field's founding was an evolutionary process. We close with suggestions for future research. PMID:25729133

  2. Applied Behavior Analysis in the Treatment of Severe Psychiatric Disorders: A Bibliography.

    ERIC Educational Resources Information Center

    Scotti, Joseph R.; And Others

    Clinical research in the area of severe psychiatric disorders constituted the major focus for the discipline of applied behavior analysis during the early 1960s. Recently, however, there appears to be a notable lack of a behavioral focus within many inpatient psychiatric settings and a relative dearth of published behavioral treatment studies with…

  3. Factors affecting job satisfaction in nurse faculty: a meta-analysis.

    PubMed

    Gormley, Denise K

    2003-04-01

    Evidence in the literature suggests job satisfaction can make a difference in keeping qualified workers on the job, but little research has been conducted focusing specifically on nursing faculty. Several studies have examined nurse faculty satisfaction in relationship to one or two influencing factors. These factors include professional autonomy, leader role expectations, organizational climate, perceived role conflict and role ambiguity, leadership behaviors, and organizational characteristics. This meta-analysis attempts to synthesize the various studies conducted on job satisfaction in nursing faculty and analyze which influencing factors have the greatest effect. The procedure used for this meta-analysis consisted of reviewing studies to identify factors influencing job satisfaction, research questions, sample size reported, instruments used for measurement of job satisfaction and influencing factors, and results of statistical analysis.

  4. The Recoverability of P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  5. Recurrence quantification analysis applied to spatiotemporal pattern analysis in high-density mapping of human atrial fibrillation.

    PubMed

    Zeemering, Stef; Bonizzi, Pietro; Maesen, Bart; Peeters, Ralf; Schotten, Ulrich

    2015-01-01

    Spatiotemporal complexity of atrial fibrillation (AF) patterns is often quantified by annotated intracardiac contact mapping. We introduce a new approach that applies recurrence plot (RP) construction followed by recurrence quantification analysis (RQA) to epicardial atrial electrograms, recorded with a high-density grid of electrodes. In 32 patients with no history of AF (aAF, n=11), paroxysmal AF (PAF, n=12) and persistent AF (persAF, n=9), RPs were constructed using a phase space electrogram embedding dimension equal to the estimated AF cycle length. Spatial information was incorporated by 1) averaging the recurrence over all electrodes, and 2) by applying principal component analysis (PCA) to the matrix of embedded electrograms and selecting the first principal component as a representation of spatial diversity. Standard RQA parameters were computed on the constructed RPs and correlated to the number of fibrillation waves per AF cycle (NW). Averaged RP RQA parameters showed no correlation with NW. Correlations improved when applying PCA, with maximum correlation achieved between RP threshold and NW (RR1%, r=0.68, p < 0.001) and RP determinism (DET, r=-0.64, p < 0.001). All studied RQA parameters based on the PCA RP were able to discriminate between persAF and aAF/PAF (DET persAF 0.40 ± 0.11 vs. 0.59 ± 0.14/0.62 ± 0.16, p < 0.01). RP construction and RQA combined with PCA provide a quick and reliable tool to visualize dynamical behaviour and to assess the complexity of contact mapping patterns in AF.
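
    A bare-bones version of the analysis pipeline (delay embedding, recurrence plot thresholding at a fixed recurrence rate, determinism from diagonal lines, and PCA across electrodes) might look like the sketch below; the electrogram data, grid size, and embedding dimension are synthetic assumptions.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.decomposition import PCA

def embed(x, dim, tau=1):
    """Time-delay embedding of a 1-D signal."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def rqa(x, dim, rr_target=0.01, min_line=2):
    """Recurrence rate and determinism from a thresholded recurrence plot."""
    emb = embed(x, dim)
    dist = cdist(emb, emb)
    rp = dist <= np.quantile(dist, rr_target)   # threshold at ~1% recurrence
    np.fill_diagonal(rp, False)                 # drop the line of identity
    rr = rp.mean()
    # Determinism: share of recurrent points on diagonal lines >= min_line.
    n = rp.shape[0]
    on_lines = 0
    for k in range(-(n - 1), n):
        if k == 0:
            continue
        run = 0
        for v in np.append(np.diagonal(rp, offset=k), False):
            if v:
                run += 1
            else:
                if run >= min_line:
                    on_lines += run
                run = 0
    return rr, on_lines / max(rp.sum(), 1)

# Hypothetical high-density grid: 16 channels x 1500 samples of electrograms.
rng = np.random.default_rng(3)
electrograms = rng.normal(size=(16, 1500))
# Spatial diversity via the first principal component across electrodes.
first_pc = PCA(n_components=1).fit_transform(electrograms.T).ravel()
print(rqa(first_pc, dim=40))   # embedding dim ~ assumed AF cycle length
```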

  6. Bootstrap Confidence Intervals for Ordinary Least Squares Factor Loadings and Correlations in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong

    2010-01-01

    This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile…
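
    A percentile-bootstrap sketch for rotated loadings is given below, assuming the third-party factor_analyzer package and synthetic item data; note that a real application would also align factor order and sign across resamples, which is omitted here.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # third-party package

def bootstrap_loading_ci(data, n_factors=2, n_boot=200, alpha=0.05, seed=0):
    """Percentile bootstrap intervals for varimax-rotated factor loadings.

    Note: factor order/sign alignment across resamples is omitted here.
    """
    rng = np.random.default_rng(seed)
    boots = []
    for _ in range(n_boot):
        sample = data.sample(len(data), replace=True,
                             random_state=int(rng.integers(1 << 31)))
        fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax")
        fa.fit(sample)
        boots.append(fa.loadings_)
    boots = np.stack(boots)
    lower = np.percentile(boots, 100 * alpha / 2, axis=0)
    upper = np.percentile(boots, 100 * (1 - alpha / 2), axis=0)
    return lower, upper

# Hypothetical item-level data: 300 respondents, 6 items.
rng = np.random.default_rng(4)
items = pd.DataFrame(rng.normal(size=(300, 6)),
                     columns=[f"item{i}" for i in range(1, 7)])
lo, hi = bootstrap_loading_ci(items)
print(np.round(lo, 2), np.round(hi, 2), sep="\n")
```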

  7. Applied statistical training to strengthen analysis and health research capacity in Rwanda.

    PubMed

    Thomson, Dana R; Semakula, Muhammed; Hirschhorn, Lisa R; Murray, Megan; Ndahindwa, Vedaste; Manzi, Anatole; Mukabutera, Assumpta; Karema, Corine; Condo, Jeanine; Hedt-Gauthier, Bethany

    2016-09-29

    To guide efficient investment of limited health resources in sub-Saharan Africa, local researchers need to be involved in, and guide, health system and policy research. While extensive survey and census data are available to health researchers and program officers in resource-limited countries, local involvement and leadership in research is limited due to inadequate experience, lack of dedicated research time and weak interagency connections, among other challenges. Many research-strengthening initiatives host prolonged fellowships out-of-country, yet their approaches have not been evaluated for effectiveness in involvement and development of local leadership in research. We developed, implemented and evaluated a multi-month, deliverable-driven, survey analysis training based in Rwanda to strengthen skills of five local research leaders, 15 statisticians, and a PhD candidate. Research leaders applied with a specific research question relevant to country challenges and committed to leading an analysis to publication. Statisticians with prerequisite statistical training and experience with statistical software applied to participate in class-based trainings and complete an assigned analysis. Both statisticians and research leaders were provided ongoing in-country mentoring for analysis and manuscript writing. Participants reported a high level of skill, knowledge and collaborator development from class-based trainings and out-of-class mentorship that were sustained 1 year later. Five of six manuscripts were authored by multi-institution teams and submitted to international peer-reviewed scientific journals, and three-quarters of the participants mentored others in survey data analysis or conducted an additional survey analysis in the year following the training. Our model was effective in utilizing existing survey data and strengthening skills among full-time working professionals without disrupting ongoing work commitments and using few resources. Critical to our

  8. The Factor Structure of the English Language Development Assessment: A Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Kuriakose, Anju

    2011-01-01

    This study investigated the internal factor structure of the English language development Assessment (ELDA) using confirmatory factor analysis. ELDA is an English language proficiency test developed by a consortium of multiple states and is used to identify and reclassify English language learners in kindergarten to grade 12. Scores on item…

  9. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  10. Influential Observations in Principal Factor Analysis.

    ERIC Educational Resources Information Center

    Tanaka, Yutaka; Odaka, Yoshimasa

    1989-01-01

    A method is proposed for detecting influential observations in iterative principal factor analysis. Theoretical influence functions are derived for two components of the common variance decomposition. The major mathematical tool is the influence function derived by Tanaka (1988). (SLD)

  11. An independent component analysis confounding factor correction framework for identifying broad impact expression quantitative trait loci

    PubMed Central

    Ju, Jin Hyun; Crystal, Ronald G.

    2017-01-01

    Genome-wide expression Quantitative Trait Loci (eQTL) studies in humans have provided numerous insights into the genetics of both gene expression and complex diseases. While the majority of eQTL identified in genome-wide analyses impact a single gene, eQTL that impact many genes are particularly valuable for network modeling and disease analysis. To enable the identification of such broad impact eQTL, we introduce CONFETI: Confounding Factor Estimation Through Independent component analysis. CONFETI is designed to address two conflicting issues when searching for broad impact eQTL: the need to account for non-genetic confounding factors that can lower the power of the analysis or produce broad impact eQTL false positives, and the tendency of methods that account for confounding factors to model broad impact eQTL as non-genetic variation. The key advance of the CONFETI framework is the use of Independent Component Analysis (ICA) to identify variation likely caused by broad impact eQTL when constructing the sample covariance matrix used for the random effect in a mixed model. We show that CONFETI has better performance than other mixed model confounding factor methods when considering broad impact eQTL recovery from synthetic data. We also used the CONFETI framework and these same confounding factor methods to identify eQTL that replicate between matched twin pair datasets in the Multiple Tissue Human Expression Resource (MuTHER), the Depression Genes Networks study (DGN), the Netherlands Study of Depression and Anxiety (NESDA), and multiple tissue types in the Genotype-Tissue Expression (GTEx) consortium. These analyses identified both cis-eQTL and trans-eQTL impacting individual genes, and CONFETI had better or comparable performance to other mixed model confounding factor analysis methods when identifying such eQTL. In these analyses, we were able to identify and replicate a few broad impact eQTL although the overall number was small even when applying CONFETI. In

  12. An independent component analysis confounding factor correction framework for identifying broad impact expression quantitative trait loci.

    PubMed

    Ju, Jin Hyun; Shenoy, Sushila A; Crystal, Ronald G; Mezey, Jason G

    2017-05-01

    Genome-wide expression Quantitative Trait Loci (eQTL) studies in humans have provided numerous insights into the genetics of both gene expression and complex diseases. While the majority of eQTL identified in genome-wide analyses impact a single gene, eQTL that impact many genes are particularly valuable for network modeling and disease analysis. To enable the identification of such broad impact eQTL, we introduce CONFETI: Confounding Factor Estimation Through Independent component analysis. CONFETI is designed to address two conflicting issues when searching for broad impact eQTL: the need to account for non-genetic confounding factors that can lower the power of the analysis or produce broad impact eQTL false positives, and the tendency of methods that account for confounding factors to model broad impact eQTL as non-genetic variation. The key advance of the CONFETI framework is the use of Independent Component Analysis (ICA) to identify variation likely caused by broad impact eQTL when constructing the sample covariance matrix used for the random effect in a mixed model. We show that CONFETI has better performance than other mixed model confounding factor methods when considering broad impact eQTL recovery from synthetic data. We also used the CONFETI framework and these same confounding factor methods to identify eQTL that replicate between matched twin pair datasets in the Multiple Tissue Human Expression Resource (MuTHER), the Depression Genes Networks study (DGN), the Netherlands Study of Depression and Anxiety (NESDA), and multiple tissue types in the Genotype-Tissue Expression (GTEx) consortium. These analyses identified both cis-eQTL and trans-eQTL impacting individual genes, and CONFETI had better or comparable performance to other mixed model confounding factor analysis methods when identifying such eQTL. In these analyses, we were able to identify and replicate a few broad impact eQTL although the overall number was small even when applying CONFETI. In
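
    The central idea, decomposing expression into independent components and building a sample covariance from the components treated as confounders, can be caricatured as below. This is not the CONFETI implementation: the component-selection heuristic, data dimensions, and number of components are invented for illustration.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

# Hypothetical expression matrix: 200 samples x 1000 genes.
rng = np.random.default_rng(5)
expression = rng.normal(size=(200, 1000))

# Decompose expression into independent components (putative confounders
# plus variation that may reflect broad-impact genetic effects).
ica = FastICA(n_components=20, random_state=0, max_iter=1000)
sources = ica.fit_transform(expression)          # samples x components

# Invented selection heuristic (CONFETI's actual criterion differs): keep the
# least kurtotic components as "confounders" and leave the most kurtotic ones
# out as candidate broad-impact genetic signal.
keep = np.argsort(kurtosis(sources, axis=0))[:15]
confounders = sources[:, keep]

# Sample covariance used as the random-effect (kinship-like) matrix in a
# mixed model for eQTL mapping.
sample_cov = confounders @ confounders.T / confounders.shape[1]
print(sample_cov.shape)   # (200, 200)
```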

  13. Recurrent-neural-network-based Boolean factor analysis and its application to word clustering.

    PubMed

    Frolov, Alexander A; Husek, Dusan; Polyakov, Pavel Yu

    2009-07-01

    The objective of this paper is to introduce a neural-network-based algorithm for word clustering as an extension of the neural-network-based Boolean factor analysis algorithm (Frolov, 2007). It is shown that this extended algorithm supports even the more complex model of signals that are supposed to be related to textual documents. It is hypothesized that every topic in textual data is characterized by a set of words which coherently appear in documents dedicated to a given topic. The appearance of each word in a document is coded by the activity of a particular neuron. In accordance with the Hebbian learning rule implemented in the network, sets of coherently appearing words (treated as factors) create tightly connected groups of neurons, hence, revealing them as attractors of the network dynamics. The found factors are eliminated from the network memory by the Hebbian unlearning rule facilitating the search of other factors. Topics related to the found sets of words can be identified based on the words' semantics. To make the method complete, a special technique based on a Bayesian procedure has been developed for the following purposes: first, to provide a complete description of factors in terms of component probability, and second, to enhance the accuracy of classification of signals to determine whether they contain the factor. Since it is assumed that every word may possibly contribute to several topics, the proposed method might be related to the method of fuzzy clustering. In this paper, we show that the results of Boolean factor analysis and fuzzy clustering are not contradictory, but complementary. To demonstrate the capabilities of this approach, the method is applied to two types of textual data on neural networks in two different languages. The obtained topics and corresponding words are at a good level of agreement despite the fact that identical topics in Russian and English conferences contain different sets of keywords.

  14. Automated speech analysis applied to laryngeal disease categorization.

    PubMed

    Gelzinis, A; Verikas, A; Bacauskiene, M

    2008-07-01

    The long-term goal of the work is a decision support system for diagnostics of laryngeal diseases. Colour images of vocal folds, a voice signal, and questionnaire data are the information sources to be used in the analysis. This paper is concerned with automated analysis of a voice signal applied to screening of laryngeal diseases. The effectiveness of 11 different feature sets in classification of voice recordings of the sustained phonation of the vowel sound /a/ into a healthy and two pathological classes, diffuse and nodular, is investigated. A k-NN classifier, SVM, and a committee built using various aggregation options are used for the classification. The study was made using the mixed gender database containing 312 voice recordings. The correct classification rate of 84.6% was achieved when using an SVM committee consisting of four members. The pitch and amplitude perturbation measures, cepstral energy features, autocorrelation features as well as linear prediction cosine transform coefficients were amongst the feature sets providing the best performance. In the case of two class classification, using recordings from 79 subjects representing the pathological and 69 the healthy class, the correct classification rate of 95.5% was obtained from a five member committee. Again the pitch and amplitude perturbation measures provided the best performance.
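
    A committee of SVMs with simple majority-vote aggregation, one of several aggregation options one could try, can be put together with scikit-learn as in the sketch below; the acoustic feature matrix and class labels are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical feature matrix: 312 recordings x 30 acoustic features
# (e.g., perturbation measures, cepstral energies); 3 classes.
rng = np.random.default_rng(6)
X = rng.normal(size=(312, 30))
y = rng.integers(0, 3, size=312)    # 0 = healthy, 1 = diffuse, 2 = nodular

# A small committee of SVMs with different kernels/regularization, combined
# by majority vote (one of several possible aggregation options).
members = [
    (f"svm{i}", make_pipeline(StandardScaler(), SVC(kernel=k, C=c)))
    for i, (k, c) in enumerate([("rbf", 1.0), ("rbf", 10.0),
                                ("linear", 1.0), ("poly", 1.0)])
]
committee = VotingClassifier(estimators=members, voting="hard")
print(cross_val_score(committee, X, y, cv=5).mean())
```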

  15. Quantification method analysis of the relationship between occupant injury and environmental factors in traffic accidents.

    PubMed

    Ju, Yong Han; Sohn, So Young

    2011-01-01

    Injury analysis following a vehicle crash is one of the most important research areas. However, most injury analyses have focused on one-dimensional injury variables, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), one at a time, in relation to various traffic accident factors. Such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between the categorical injury phenomena, such as the injury scale, injury position, and injury type, and the various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices, such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, air bag deployment, and higher speed are associated with severe rather than minor injury to the thigh, ankle, and leg in terms of dislocation, abrasion, or laceration. Copyright © 2010 Elsevier Ltd. All rights reserved.
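
    Quantification method II relates a categorical outcome to categorical predictors through category weights; a rough stand-in, shown below, is to dummy-code the condition factors and fit a linear discriminant analysis, whose coefficients play a loosely analogous role. All variable names and data are invented, not NASS CDS fields.

```python
import numpy as np
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical crash records: categorical condition factors and an
# injury-severity outcome (invented stand-ins for NASS CDS variables).
rng = np.random.default_rng(7)
n = 500
crashes = pd.DataFrame({
    "speed_band": rng.choice(["low", "mid", "high"], n),
    "collision_dir": rng.choice(["frontal", "side", "rear"], n),
    "airbag": rng.choice(["deployed", "not_deployed"], n),
    "severity": rng.choice(["minor", "severe"], n),
})

# Dummy-code the categorical predictors (one reference level dropped each).
X = pd.get_dummies(crashes[["speed_band", "collision_dir", "airbag"]], drop_first=True)
lda = LinearDiscriminantAnalysis().fit(X, crashes["severity"])

# Category coefficients on the discriminant axis, loosely analogous to the
# category weights produced by quantification method II.
print(pd.Series(lda.coef_[0], index=X.columns).round(3))
```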

  16. Confirmatory Factor Analysis of the Malay Version Comprehensive Feeding Practices Questionnaire Tested among Mothers of Primary School Children in Malaysia

    PubMed Central

    Shohaimi, Shamarina; Yoke Wei, Wong; Mohd Shariff, Zalilah

    2014-01-01

    Comprehensive feeding practices questionnaire (CFPQ) is an instrument specifically developed to evaluate parental feeding practices. It has been confirmed among children in America and applied to populations in France, Norway, and New Zealand. In order to extend the application of CFPQ, we conducted a factor structure validation of the translated version of CFPQ (CFPQ-M) using confirmatory factor analysis among mothers of primary school children (N = 397) in Malaysia. Several items were modified for cultural adaptation. Of 49 items, 39 items with loading factors >0.40 were retained in the final model. The confirmatory factor analysis revealed that the final model (twelve-factor model with 39 items and 2 error covariances) displayed the best fit for our sample (Chi-square = 1147; df = 634; P < 0.05; CFI = 0.900; RMSEA = 0.045; SRMR = 0.0058). The instrument with some modifications was confirmed among mothers of school children in Malaysia. The present study extends the usability of the CFPQ and enables researchers and parents to better understand the relationships between parental feeding practices and related problems such as childhood obesity. PMID:25538958

  17. Confirmatory factor analysis of the Malay version comprehensive feeding practices questionnaire tested among mothers of primary school children in Malaysia.

    PubMed

    Shohaimi, Shamarina; Wei, Wong Yoke; Shariff, Zalilah Mohd

    2014-01-01

    Comprehensive feeding practices questionnaire (CFPQ) is an instrument specifically developed to evaluate parental feeding practices. It has been confirmed among children in America and applied to populations in France, Norway, and New Zealand. In order to extend the application of CFPQ, we conducted a factor structure validation of the translated version of CFPQ (CFPQ-M) using confirmatory factor analysis among mothers of primary school children (N = 397) in Malaysia. Several items were modified for cultural adaptation. Of 49 items, 39 items with loading factors >0.40 were retained in the final model. The confirmatory factor analysis revealed that the final model (twelve-factor model with 39 items and 2 error covariances) displayed the best fit for our sample (Chi-square = 1147; df = 634; P < 0.05; CFI = 0.900; RMSEA = 0.045; SRMR = 0.0058). The instrument with some modifications was confirmed among mothers of school children in Malaysia. The present study extends the usability of the CFPQ and enables researchers and parents to better understand the relationships between parental feeding practices and related problems such as childhood obesity.
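
    A toy confirmatory factor analysis with global fit statistics can be run in Python with the third-party semopy package (assumed available; lavaan in R is the more common choice). The two-factor model below is deliberately tiny and is not the CFPQ-M structure.

```python
import numpy as np
import pandas as pd
import semopy  # third-party SEM package (assumed available)

# Synthetic data for a tiny two-factor toy model (not the CFPQ-M items).
rng = np.random.default_rng(11)
n = 400
f1, f2 = rng.normal(size=n), rng.normal(size=n)
data = pd.DataFrame({name: f + rng.normal(scale=0.5, size=n)
                     for name, f in [("x1", f1), ("x2", f1), ("x3", f1),
                                     ("x4", f2), ("x5", f2), ("x6", f2)]})

# lavaan-style model description: two correlated latent factors.
desc = """
F1 =~ x1 + x2 + x3
F2 =~ x4 + x5 + x6
"""
model = semopy.Model(desc)
model.fit(data)
print(semopy.calc_stats(model).T)   # global fit indices (chi2, CFI, RMSEA, ...)
```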

  18. Application of factor analysis to the water quality in reservoirs

    NASA Astrophysics Data System (ADS)

    Silva, Eliana Costa e.; Lopes, Isabel Cristina; Correia, Aldina; Gonçalves, A. Manuela

    2017-06-01

    In this work we present a Factor Analysis of chemical and environmental variables of the water column and hydro-morphological features of several Portuguese reservoirs. The objective is to reduce the initial number of variables, keeping their common characteristics. Using the Factor Analysis, the environmental variables measured in the epilimnion and in the hypolimnion, together with the hydromorphological characteristics of the dams were reduced from 63 variables to only 13 factors, which explained a total of 83.348% of the variance in the original data. After performing rotation using the Varimax method, the relations between the factors and the original variables got clearer and more explainable, which provided a Factor Analysis model for these environmental variables using 13 varifactors: Water quality and distance to the source, Hypolimnion chemical composition, Sulfite-reducing bacteria and nutrients, Coliforms and faecal streptococci, Reservoir depth, Temperature, Location, among other factors.
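
    The general recipe (choose the number of factors, rotate with varimax, and report cumulative variance explained) is sketched below using the third-party factor_analyzer package; the data matrix and variable count are synthetic stand-ins for the reservoir measurements.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # third-party package

# Synthetic stand-in for the reservoir data: 200 observations x 20 variables
# (the paper reduced 63 variables to 13 varimax-rotated factors).
rng = np.random.default_rng(8)
water = pd.DataFrame(rng.normal(size=(200, 20)),
                     columns=[f"var{i}" for i in range(1, 21)])

# Kaiser criterion: retain factors with eigenvalue > 1, then rotate (varimax).
fa0 = FactorAnalyzer(rotation=None)
fa0.fit(water)
eigenvalues, _ = fa0.get_eigenvalues()
n_factors = int((eigenvalues > 1).sum())

fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax")
fa.fit(water)
_, _, cumulative = fa.get_factor_variance()
print(f"{n_factors} varifactors explain {cumulative[-1]:.1%} of the variance")
```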

  19. Nurses’ experiences of perceived support and their contributing factors: A qualitative content analysis

    PubMed Central

    Sodeify, Roghieh; Vanaki, Zohreh; Mohammadi, Eesa

    2013-01-01

    Background: Following professional standards is the main concern of all managers in organizations. The functions of nurses are essential for both productivity and improving health organizations. In human resources management, supporting the nursing profession is of ultimate importance. However, nurses’ experiences of perceived support, which are affected by various factors in the workplace, have not been clearly explained yet. Thus, this study aimed to explain nurses’ experiences of perceived support and their contributing factors. Materials and Methods: This is a qualitative study in which 12 nurses were selected through purposive sampling among nurses in university hospitals affiliated to University of Medical Sciences, Urmia, Iran, during 2011-2012. Data collection was conducted through in-depth interviews with semi-structured questions. All interviews were first recorded and then transcribed. Finally, data were analyzed through conventional content analysis. Results: The four main themes indicated that nurses experienced their workplace as non-supportive. Themes such as poor organizational climate, low social dignity, poor work conditions, and managers’ ignorance of individual and professional values were considered as inhibitory factors to support. Conclusion: Nursing managers can promote nurses’ positive support perceptions by recognizing inhibitory factors, applying fair solutions, and taking advantage of the positive consequences, including high efficacy, self-esteem, and organizational commitment, to promote the quality of care. PMID:23983753

  20. Says Who?: Students Apply Their Critical-Analysis Skills to Fight Town Hall

    ERIC Educational Resources Information Center

    Trimarchi, Ruth

    2002-01-01

    For some time the author looked for a tool to let students apply what they are learning about critical analysis in the science classroom to a relevant life experience. The opportunity occurred when a proposal to use environmentally friendly cleaning products in town buildings appeared on the local town meeting agenda. Using a copy of the proposal…

  1. Applied Behavior Analysis in Autism Spectrum Disorders: Recent Developments, Strengths, and Pitfalls

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Rieske, Robert; Tureck, Kimberly; Matson, Michael L.

    2012-01-01

    Autism has become one of the most heavily researched topics in the field of mental health and education. While genetics has been the most studied of all topics, applied behavior analysis (ABA) has also received a great deal of attention, and has arguably yielded the most promising results of any research area to date. The current paper provides a…

  2. Applying a social network analysis (SNA) approach to understanding radiologists' performance in reading mammograms

    NASA Astrophysics Data System (ADS)

    Tavakoli Taba, Seyedamir; Hossain, Liaquat; Heard, Robert; Brennan, Patrick; Lee, Warwick; Lewis, Sarah

    2017-03-01

    Rationale and objectives: Observer performance has been widely studied through examining the characteristics of individuals. Applying a systems perspective to understanding the system's output requires a study of the interactions between observers. This research explains a mixed methods approach to applying a social network analysis (SNA), together with a more traditional approach of examining personal/individual characteristics, in understanding observer performance in mammography. Materials and Methods: Using social networks theories and measures in order to understand observer performance, we designed a social networks survey instrument for collecting personal and network data about observers involved in mammography performance studies. We present the results of a study by our group where 31 Australian breast radiologists originally reviewed 60 mammographic cases (comprising 20 abnormal and 40 normal cases) and then completed an online questionnaire about their social networks and personal characteristics. A jackknife free response operating characteristic (JAFROC) method was used to measure performance of radiologists. JAFROC was tested against various personal and network measures to verify the theoretical model. Results: The results from this study suggest a strong association between social networks and observer performance for Australian radiologists. Network factors accounted for 48% of variance in observer performance, in comparison to 15.5% for the personal characteristics for this study group. Conclusion: This study suggests a strong new direction for research into improving observer performance. Future studies in observer performance should consider social networks' influence as part of their research paradigm, with equal or greater vigour than traditional constructs of personal characteristics.

  3. Aggregation factor analysis for protein formulation by a systematic approach using FTIR, SEC and design of experiments techniques.

    PubMed

    Feng, Yan Wen; Ooishi, Ayako; Honda, Shinya

    2012-01-05

    A simple systematic approach using Fourier transform infrared (FTIR) spectroscopy, size exclusion chromatography (SEC) and design of experiments (DOE) techniques was applied to the analysis of aggregation factors for protein formulations in stress and accelerated testings. FTIR and SEC were used to evaluate protein conformational and storage stabilities, respectively. DOE was used to determine the suitable formulation and to analyze both the main effect of single factors and the interaction effect of combined factors on aggregation. Our results indicated that (i) analysis at a low protein concentration is not always applicable to high concentration formulations; (ii) an investigation of interaction effects of combined factors as well as main effects of single factors is effective for improving conformational stability of proteins; (iii) with the exception of pH, the results of stress testing with regard to aggregation factors would be available for suitable formulation instead of performing time-consuming accelerated testing; (iv) a suitable pH condition should not be determined in stress testing but in accelerated testing, because of inconsistent effects of pH on conformational and storage stabilities. In summary, we propose a three-step strategy, using FTIR, SEC and DOE techniques, to effectively analyze the aggregation factors and perform a rapid screening for suitable conditions of protein formulation. Copyright © 2011 Elsevier B.V. All rights reserved.
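
    The DOE part of such an analysis, estimating main effects of single factors and two-way interaction effects from a full-factorial stress design, can be sketched with statsmodels as below; the factors, levels, and response are invented placeholders rather than the study's formulation variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical full-factorial stress design: three formulation factors with
# triplicates, and an aggregation response (e.g., % monomer loss by SEC).
rng = np.random.default_rng(9)
design = pd.DataFrame([(ph, salt, conc)
                       for ph in (5.0, 6.5, 8.0)
                       for salt in (0, 150)
                       for conc in (1, 50)
                       for _ in range(3)],
                      columns=["pH", "salt_mM", "conc_mg_ml"])
design["aggregation"] = (0.5 * design["pH"] + 0.01 * design["conc_mg_ml"]
                         + 0.02 * design["pH"] * design["conc_mg_ml"]
                         + rng.normal(scale=0.3, size=len(design)))

# Main effects of single factors plus a two-way interaction effect.
model = smf.ols("aggregation ~ C(pH) * C(conc_mg_ml) + C(salt_mM)",
                data=design).fit()
print(anova_lm(model, typ=2))
```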

  4. Applied Behavior Analysis, Autism, and Occupational Therapy: A Search for Understanding.

    PubMed

    Welch, Christie D; Polatajko, H J

    2016-01-01

    Occupational therapists strive to be mindful, competent practitioners and continuously look for ways to improve practice. Applied behavior analysis (ABA) has strong evidence of effectiveness in helping people with autism achieve goals, yet it does not seem to be implemented in occupational therapy practice. To better understand whether ABA could be an evidence-based option to expand occupational therapy practice, the authors conducted an iterative, multiphase investigation of relevant literature. Findings suggest that occupational therapists apply developmental and sensory approaches to autism treatment. The occupational therapy literature does not reflect any use of ABA despite its strong evidence base. Occupational therapists may currently avoid using ABA principles because of a perception that ABA is not client centered. ABA principles and occupational therapy are compatible, and the two could work synergistically. Copyright © 2016 by the American Occupational Therapy Association, Inc.

  5. General business theory applied to the physician's practice.

    PubMed

    Shaw, D V

    2002-01-01

    In the pursuit of clinical excellence in today's competitive medical market place, practice managers--clinical or non-clinical--can lose sight of standard management and business principles that are key to success. Also, at times individuals are hesitant to identify a physician practice as a 'business,' preferring to see it as a social good. Still, it is a business--perhaps dealing with a product that is a social good, but still, a business. And, as such, benefits can be derived from a review of business management theory. This article provides a brief review of such theory and also illustrates how to apply this theory to the physician's practice. Key factors in building a successful business will be discussed and applied to the clinical practice, such as resource maximization, rate of return and product mix synergy. Some tools to assist the reader in analyzing their practice will also be provided, such as the RVU Analysis and the Ratio of Service Analysis.

  6. Using confirmatory factor analysis to explore associated factors of intimate partner violence in a sample of Chinese rural women: a cross-sectional study

    PubMed Central

    Cerulli, Catherine; Wittink, Marsha N.; Caine, Eric D.; Qiu, Peiyuan

    2018-01-01

    Objectives To estimate the prevalence of intimate partner violence (IPV) among a sample of rural Chinese women and to explore associated factors. Design Cross-sectional study. Setting Rural areas of Guangyuan City, Sichuan, China. Participants We recruited 1501 women, aged 16 years and older, who had been living locally for at least 2 years and reported being married or in a relationship during the past 12 months. They were among a sample of 1898 potential participants from our larger parent study on the prevalence of depressive-distress symptoms. Methods Participants completed demographic and social economic measures, the Short Form of the Revised Conflict Tactics Scale and the Duke Social Support Index. We applied χ2 test, analysis of variance and confirmatory factor analysis for analysis. Results The overall prevalence of IPV in the past 12 months was 29.05%; the prevalence of physical, psychological and sexual violence was 7.66%, 26.58% and 3.20%, respectively. The overall prevalence was highest among women aged 16–29 years, and was more common among those without a high school diploma and who saw their family’s financial status as very poor or stagnant. Women who were not victims of IPV had higher levels of social support. Confirmatory factor analysis showed that the total effects of social support on physical, psychological and sexual violence were −0.12, –0.35 and −0.12, respectively. The indirect effects of objective economic status on physical, psychological and sexual violence were −0.047, –0.014 and −0.047, respectively, but the total effect was not significant. The indirect effect of education on psychological violence was −0.056. Conclusion IPV is common in rural Guangyuan. Our data are comparable with the findings from north-west of China. Social support is an important protective factor. Future work is needed to develop, test and later disseminate potential IPV interventions, with a focus on building actual and perceived supportive

  7. Applied Enzymology.

    ERIC Educational Resources Information Center

    Manoharan, Asha; Dreisbach, Joseph H.

    1988-01-01

    Describes some examples of chemical and industrial applications of enzymes. Includes a background, a discussion of structure and reactivity, enzymes as therapeutic agents, enzyme replacement, enzymes used in diagnosis, industrial applications of enzymes, and immobilizing enzymes. Concludes that applied enzymology is an important factor in…

  8. System Sensitivity Analysis Applied to the Conceptual Design of a Dual-Fuel Rocket SSTO

    NASA Technical Reports Server (NTRS)

    Olds, John R.

    1994-01-01

    This paper reports the results of initial efforts to apply the System Sensitivity Analysis (SSA) optimization method to the conceptual design of a single-stage-to-orbit (SSTO) launch vehicle. SSA is an efficient, calculus-based MDO technique for generating sensitivity derivatives in a highly multidisciplinary design environment. The method has been successfully applied to conceptual aircraft design and has been proven to have advantages over traditional direct optimization methods. The method is applied to the optimization of an advanced, piloted SSTO design similar to vehicles currently being analyzed by NASA as possible replacements for the Space Shuttle. Powered by a derivative of the Russian RD-701 rocket engine, the vehicle employs a combination of hydrocarbon, hydrogen, and oxygen propellants. Three primary disciplines are included in the design - propulsion, performance, and weights & sizing. A complete, converged vehicle analysis depends on the use of three standalone conceptual analysis computer codes. Efforts to minimize vehicle dry (empty) weight are reported in this paper. The problem consists of six system-level design variables and one system-level constraint. Using SSA in a 'manual' fashion to generate gradient information, six system-level iterations were performed from each of two different starting points. The results showed a good pattern of convergence for both starting points. A discussion of the advantages and disadvantages of the method, possible areas of improvement, and future work is included.
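
    SSA propagates discipline-level derivatives through the coupled system; the toy below shows only the simplest ingredient, central-difference sensitivity derivatives of a converged system output with respect to system-level design variables. The "vehicle analysis" function is a made-up placeholder, not the paper's propulsion, performance, or weights codes.

```python
import numpy as np

def converged_dry_weight(design):
    """Placeholder for a converged multidisciplinary vehicle analysis.

    `design` stands in for system-level variables (e.g., mixture ratio,
    thrust-to-weight); the return value stands in for vehicle dry weight.
    """
    x = np.asarray(design, dtype=float)
    return 100.0 + 5.0 * x[0] ** 2 + 3.0 * x[0] * x[1] + 2.0 * np.sin(x[1])

def sensitivity_derivatives(f, design, step=1e-4):
    """Central-difference derivatives of f w.r.t. each design variable."""
    design = np.asarray(design, dtype=float)
    grads = np.empty_like(design)
    for i in range(design.size):
        up, down = design.copy(), design.copy()
        up[i] += step
        down[i] -= step
        grads[i] = (f(up) - f(down)) / (2 * step)
    return grads

x0 = np.array([1.2, 0.8])
print(sensitivity_derivatives(converged_dry_weight, x0))
```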

  9. Quantitative Analysis of Critical Factors for the Climate Impact of Landfill Mining.

    PubMed

    Laner, David; Cencic, Oliver; Svensson, Niclas; Krook, Joakim

    2016-07-05

    Landfill mining has been proposed as an innovative strategy to mitigate environmental risks associated with landfills, to recover secondary raw materials and energy from the deposited waste, and to enable high-valued land uses at the site. The present study quantitatively assesses the importance of specific factors and conditions for the net contribution of landfill mining to global warming using a novel, set-based modeling approach and provides policy recommendations for facilitating the development of projects contributing to global warming mitigation. Building on life-cycle assessment, scenario modeling and sensitivity analysis methods are used to identify critical factors for the climate impact of landfill mining. The net contributions to global warming of the scenarios range from -1550 (saving) to 640 (burden) kg CO2e per Mg of excavated waste. Nearly 90% of the results' total variation can be explained by changes in four factors, namely the landfill gas management in the reference case (i.e., alternative to mining the landfill), the background energy system, the composition of the excavated waste, and the applied waste-to-energy technology. Based on the analyses, circumstances under which landfill mining should be prioritized or not are identified and sensitive parameters for the climate impact assessment of landfill mining are highlighted.

  10. Toward a technology of derived stimulus relations: an analysis of articles published in the journal of applied behavior analysis, 1992-2009.

    PubMed

    Rehfeldt, Ruth Anne

    2011-01-01

    Every article on stimulus equivalence or derived stimulus relations published in the Journal of Applied Behavior Analysis was evaluated in terms of characteristics that are relevant to the development of applied technologies: the type of participants, settings, procedure (automated vs. tabletop), stimuli, and stimulus sensory modality; types of relations targeted and emergent skills demonstrated by participants; and presence versus absence of evaluation of generalization and maintenance. In most respects, published reports suggested the possibility of applied technologies but left the difficult work of technology development to future investigations, suggestions for which are provided.

  11. Using Applied Behaviour Analysis as Standard Practice in a UK Special Needs School

    ERIC Educational Resources Information Center

    Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan

    2015-01-01

    This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…

  12. Applying human factors principles to alert design increases efficiency and reduces prescribing errors in a scenario-based simulation

    PubMed Central

    Russ, Alissa L; Zillich, Alan J; Melton, Brittany L; Russell, Scott A; Chen, Siying; Spina, Jeffrey R; Weiner, Michael; Johnson, Elizabette G; Daggy, Joanne K; McManus, M Sue; Hawsey, Jason M; Puleo, Anthony G; Doebbeling, Bradley N; Saleem, Jason J

    2014-01-01

    Objective To apply human factors engineering principles to improve alert interface design. We hypothesized that incorporating human factors principles into alerts would improve usability, reduce workload for prescribers, and reduce prescribing errors. Materials and methods We performed a scenario-based simulation study using a counterbalanced, crossover design with 20 Veterans Affairs prescribers to compare original versus redesigned alerts. We redesigned drug–allergy, drug–drug interaction, and drug–disease alerts based upon human factors principles. We assessed usability (learnability of redesign, efficiency, satisfaction, and usability errors), perceived workload, and prescribing errors. Results Although prescribers received no training on the design changes, prescribers were able to resolve redesigned alerts more efficiently (median (IQR): 56 (47) s) compared to the original alerts (85 (71) s; p=0.015). In addition, prescribers rated redesigned alerts significantly higher than original alerts across several dimensions of satisfaction. Redesigned alerts led to a modest but significant reduction in workload (p=0.042) and significantly reduced the number of prescribing errors per prescriber (median (range): 2 (1–5) compared to original alerts: 4 (1–7); p=0.024). Discussion Aspects of the redesigned alerts that likely contributed to better prescribing include design modifications that reduced usability-related errors, providing clinical data closer to the point of decision, and displaying alert text in a tabular format. Displaying alert text in a tabular format may help prescribers extract information quickly and thereby increase responsiveness to alerts. Conclusions This simulation study provides evidence that applying human factors design principles to medication alerts can improve usability and prescribing outcomes. PMID:24668841

  13. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support

    PubMed Central

    Anderson, Cynthia M.; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  14. Applying circular economy innovation theory in business process modeling and analysis

    NASA Astrophysics Data System (ADS)

    Popa, V.; Popa, L.

    2017-08-01

    The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis using circular economy innovative theory as a source for business knowledge management. The last part of the paper presents the authors’ proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy, this paper provides new ideas for clustering their concepts.

  15. A replication of a factor analysis of motivations for trapping

    USGS Publications Warehouse

    Schroeder, Susan; Fulton, David C.

    2015-01-01

    Using a 2013 sample of Minnesota trappers, we employed confirmatory factor analysis to replicate an exploratory factor analysis of trapping motivations conducted by Daigle, Muth, Zwick, and Glass (1998). We employed the same 25 items used by Daigle et al. and tested the same five-factor structure using a recent sample of Minnesota trappers. We also compared motivations in our sample to those reported by Daigle et al.

  16. On the Relations among Regular, Equal Unique Variances, and Image Factor Analysis Models.

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Bentler, Peter M.

    2000-01-01

    Investigated the conditions under which the matrix of factor loadings from the factor analysis model with equal unique variances will give a good approximation to the matrix of factor loadings from the regular factor analysis model. Extends the results to the image factor analysis model. Discusses implications for practice. (SLD)

  17. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical and the other an applied ethical analysis technique. The two techniques are used to analyse an ethically challenging situation involving ICT raised in a recent media article, in order to demonstrate their ability to develop the ethical analysis skills of…

  18. [Statistical analysis of articles in "Chinese journal of applied physiology" from 1999 to 2008].

    PubMed

    Du, Fei; Fang, Tao; Ge, Xue-ming; Jin, Peng; Zhang, Xiao-hong; Sun, Jin-li

    2010-05-01

    To evaluate the academic level and influence of "Chinese Journal of Applied Physiology" through a statistical analysis of the fund-sponsored articles published in the past ten years. The articles of "Chinese Journal of Applied Physiology" from 1999 to 2008 were investigated. The number and percentage of fund-sponsored articles, the funding organizations, and the authors' regions were quantitatively analyzed using bibliometric methods. The number of fund-sponsored articles increased steadily. The proportion of funding from local governments increased significantly in the latter five years. Most of the articles came from institutes located in Beijing, Zhejiang, and Tianjin. "Chinese Journal of Applied Physiology" has a good academic level and social influence.

  19. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale

    PubMed Central

    Sediyama, Cristina Y. N.; Moura, Ricardo; Garcia, Marina S.; da Silva, Antonio G.; Soraggi, Carolina; Neves, Fernando S.; Albuquerque, Maicon R.; Whiteside, Setephen P.; Malloy-Diniz, Leandro F.

    2017-01-01

    Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency, (b) lack of premeditation, (c) lack of perseverance, and (d) sensation seeking. In the present study, 384 participants (278 women and 106 men) recruited from schools, universities, leisure centers and workplaces completed the UPPS scale. An exploratory factor analysis was performed using Varimax factor rotation and Kaiser normalization, and we also conducted two confirmatory analyses to test the independence of the UPPS components found in the previous analysis. Results: Mean UPPS total scores decreased with age; the youngest participants (below 30 years) scored significantly higher than the groups over 30 years. No gender difference was found. Cronbach’s alpha values were satisfactory for all subscales, with similarly high values across subscales, whereas the confirmatory factor analysis indexes indicated a poor model fit. The results of the two exploratory factor analyses were satisfactory. Conclusion: Our results showed that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS. PMID:28484414

  20. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale.

    PubMed

    Sediyama, Cristina Y N; Moura, Ricardo; Garcia, Marina S; da Silva, Antonio G; Soraggi, Carolina; Neves, Fernando S; Albuquerque, Maicon R; Whiteside, Setephen P; Malloy-Diniz, Leandro F

    2017-01-01

    Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency, (b) lack of premeditation, (c) lack of perseverance, and (d) sensation seeking. In the present study, 384 participants (278 women and 106 men) recruited from schools, universities, leisure centers and workplaces completed the UPPS scale. An exploratory factor analysis was performed using Varimax factor rotation and Kaiser normalization, and we also conducted two confirmatory analyses to test the independence of the UPPS components found in the previous analysis. Results: Mean UPPS total scores decreased with age; the youngest participants (below 30 years) scored significantly higher than the groups over 30 years. No gender difference was found. Cronbach's alpha values were satisfactory for all subscales, with similarly high values across subscales, whereas the confirmatory factor analysis indexes indicated a poor model fit. The results of the two exploratory factor analyses were satisfactory. Conclusion: Our results showed that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS.
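
    The exploratory step named in this record (varimax rotation with Kaiser normalization over 40 items, four factors) maps onto standard tooling. The sketch below is a non-authoritative illustration that assumes the third-party Python package factor_analyzer and substitutes a synthetic item matrix for the UPPS data.

```python
# Minimal sketch of an exploratory factor analysis with varimax rotation,
# assuming the third-party `factor_analyzer` package (pip install factor_analyzer).
# The 40-item response matrix here is synthetic, standing in for UPPS data.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(0)
n_respondents, n_items = 384, 40
items = pd.DataFrame(
    rng.integers(1, 5, size=(n_respondents, n_items)),   # Likert-style responses 1-4
    columns=[f"item_{i+1}" for i in range(n_items)],
)

# Extract four factors (urgency, lack of premeditation, lack of perseverance,
# sensation seeking) and rotate orthogonally with varimax.
fa = FactorAnalyzer(n_factors=4, rotation="varimax")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns)
print(loadings.round(2))                 # item-by-factor loading matrix
print(fa.get_factor_variance()[1])       # proportion of variance per factor
```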

  1. Analysis of risk factors for T. brucei rhodesiense sleeping sickness within villages in south-east Uganda

    PubMed Central

    Zoller, Thomas; Fèvre, Eric M; Welburn, Susan C; Odiit, Martin; Coleman, Paul G

    2008-01-01

    Background Sleeping sickness (HAT) caused by T.b. rhodesiense is a major veterinary and human public health problem in Uganda. Previous studies have investigated spatial risk factors for T.b. rhodesiense at large geographic scales, but none have properly investigated such risk factors at small scales, i.e. within affected villages. In the present work, we use a case-control methodology to analyse both behavioural and spatial risk factors for HAT in an endemic area. Methods The present study investigates behavioural and occupational risk factors for infection with HAT within villages using a questionnaire-based case-control study conducted in 17 villages endemic for HAT in SE Uganda, and spatial risk factors in 4 high-risk villages. For the spatial analysis, the location of homesteads with one or more cases of HAT up to three years prior to the beginning of the study was compared to all non-case homesteads. Analysing spatial associations with respect to irregularly shaped geographical objects required the development of a new approach to geographical analysis in combination with a logistic regression model. Results The study was able to identify, among other behavioural risk factors, having a family member with a history of HAT (p = 0.001) as well as proximity of a homestead to a nearby wetland area (p < 0.001) as strong risk factors for infection. The novel method of analysing complex spatial interactions used in the study can be applied to a range of other diseases. Conclusion Spatial risk factors for HAT are maintained across geographical scales; this consistency is useful in the design of decision support tools for intervention and prevention of the disease. Familial aggregation of cases was confirmed for T. b. rhodesiense HAT in the study and probably results from shared behavioural and spatial risk factors among members of a household. PMID:18590541

  2. Two-step relaxation mode analysis with multiple evolution times applied to all-atom molecular dynamics protein simulation.

    PubMed

    Karasawa, N; Mitsutake, A; Takano, H

    2017-12-01

    Proteins implement their functionalities when folded into specific three-dimensional structures, and their functions are related to the protein structures and dynamics. Previously, we applied a relaxation mode analysis (RMA) method to protein systems; this method approximately estimates the slow relaxation modes and times via simulation and enables investigation of the dynamic properties underlying the protein structural fluctuations. Recently, two-step RMA with multiple evolution times has been proposed and applied to a slightly complex homopolymer system, i.e., a single [n]polycatenane. This method can be applied to more complex heteropolymer systems, i.e., protein systems, to estimate the relaxation modes and times more accurately. In two-step RMA, we first perform RMA and obtain rough estimates of the relaxation modes and times. Then, we apply RMA with multiple evolution times to a small number of the slowest relaxation modes obtained in the previous calculation. Herein, we apply this method to the results of principal component analysis (PCA). First, PCA is applied to a 2-μs molecular dynamics simulation of hen egg-white lysozyme in aqueous solution. Then, the two-step RMA method with multiple evolution times is applied to the obtained principal components. The slow relaxation modes and corresponding relaxation times for the principal components are much improved by the second RMA.

  3. Two-step relaxation mode analysis with multiple evolution times applied to all-atom molecular dynamics protein simulation

    NASA Astrophysics Data System (ADS)

    Karasawa, N.; Mitsutake, A.; Takano, H.

    2017-12-01

    Proteins implement their functionalities when folded into specific three-dimensional structures, and their functions are related to the protein structures and dynamics. Previously, we applied a relaxation mode analysis (RMA) method to protein systems; this method approximately estimates the slow relaxation modes and times via simulation and enables investigation of the dynamic properties underlying the protein structural fluctuations. Recently, two-step RMA with multiple evolution times has been proposed and applied to a slightly complex homopolymer system, i.e., a single [n]polycatenane. This method can be applied to more complex heteropolymer systems, i.e., protein systems, to estimate the relaxation modes and times more accurately. In two-step RMA, we first perform RMA and obtain rough estimates of the relaxation modes and times. Then, we apply RMA with multiple evolution times to a small number of the slowest relaxation modes obtained in the previous calculation. Herein, we apply this method to the results of principal component analysis (PCA). First, PCA is applied to a 2-μs molecular dynamics simulation of hen egg-white lysozyme in aqueous solution. Then, the two-step RMA method with multiple evolution times is applied to the obtained principal components. The slow relaxation modes and corresponding relaxation times for the principal components are much improved by the second RMA.
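
    The PCA stage that the two-step RMA builds on can be sketched in a few lines; the snippet below reproduces only that PCA step on a synthetic coordinate trajectory (frame count, atom count, and variable names are illustrative assumptions), not the RMA refinement itself.

```python
# Minimal sketch of the PCA step applied to a molecular dynamics trajectory:
# diagonalize the covariance of the (mean-removed) Cartesian coordinates.
# The trajectory here is synthetic random data standing in for real MD frames.
import numpy as np

rng = np.random.default_rng(1)
n_frames, n_atoms = 5000, 100                     # assumed sizes, for illustration only
traj = rng.normal(size=(n_frames, 3 * n_atoms))   # frames x (x, y, z per atom)

# Remove the average structure, then build the coordinate covariance matrix.
fluct = traj - traj.mean(axis=0)
cov = fluct.T @ fluct / (n_frames - 1)

# Principal components: eigenvectors of the covariance, sorted by variance.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project the trajectory onto the largest-variance components; these projected
# time series are what a subsequent RMA step would refine.
pc_timeseries = fluct @ eigvecs[:, :10]
print(eigvals[:10] / eigvals.sum())               # fraction of variance per component
```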

  4. A Factor Analysis of Learning Data and Selected Ability Test Scores

    ERIC Educational Resources Information Center

    Jones, Dorothy L.

    1976-01-01

    A verbal concept-learning task permitting the externalizing and quantifying of learning behavior and 16 ability tests were administered to female graduate students. Data were analyzed by alpha factor analysis and incomplete image analysis. Six alpha factors and 12 image factors were extracted and orthogonally rotated. Four areas of cognitive…

  5. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that, based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  6. Bayesian switching factor analysis for estimating time-varying functional connectivity in fMRI.

    PubMed

    Taghia, Jalil; Ryali, Srikanth; Chen, Tianwen; Supekar, Kaustubh; Cai, Weidong; Menon, Vinod

    2017-07-15

    There is growing interest in understanding the dynamical properties of functional interactions between distributed brain regions. However, robust estimation of temporal dynamics from functional magnetic resonance imaging (fMRI) data remains challenging due to limitations in extant multivariate methods for modeling time-varying functional interactions between multiple brain areas. Here, we develop a Bayesian generative model for fMRI time-series within the framework of hidden Markov models (HMMs). The model is a dynamic variant of the static factor analysis model (Ghahramani and Beal, 2000). We refer to this model as Bayesian switching factor analysis (BSFA) as it integrates factor analysis into a generative HMM in a unified Bayesian framework. In BSFA, brain dynamic functional networks are represented by latent states which are learnt from the data. Crucially, BSFA is a generative model which estimates the temporal evolution of brain states and transition probabilities between states as a function of time. An attractive feature of BSFA is the automatic determination of the number of latent states via Bayesian model selection arising from penalization of excessively complex models. Key features of BSFA are validated using extensive simulations on carefully designed synthetic data. We further validate BSFA using fingerprint analysis of multisession resting-state fMRI data from the Human Connectome Project (HCP). Our results show that modeling temporal dependencies in the generative model of BSFA results in improved fingerprinting of individual participants. Finally, we apply BSFA to elucidate the dynamic functional organization of the salience, central-executive, and default mode networks, three core neurocognitive systems with a central role in cognitive and affective information processing (Menon, 2011). Across two HCP sessions, we demonstrate a high level of dynamic interactions between these networks and determine that the salience network has the highest temporal

  7. Biological risk factors for suicidal behaviors: a meta-analysis

    PubMed Central

    Chang, B P; Franklin, J C; Ribeiro, J D; Fox, K R; Bentley, K H; Kleiman, E M; Nock, M K

    2016-01-01

    Prior studies have proposed a wide range of potential biological risk factors for future suicidal behaviors. Although strong evidence exists for biological correlates of suicidal behaviors, it remains unclear if these correlates are also risk factors for suicidal behaviors. We performed a meta-analysis to integrate the existing literature on biological risk factors for suicidal behaviors and to determine their statistical significance. We conducted a systematic search of PubMed, PsycInfo and Google Scholar for studies that used a biological factor to predict either suicide attempt or death by suicide. Inclusion criteria included studies with at least one longitudinal analysis using a biological factor to predict either of these outcomes in any population through 2015. From an initial screen of 2541 studies we identified 94 cases. Random effects models were used for both meta-analyses and meta-regression. The combined effect of biological factors produced statistically significant but relatively weak prediction of suicide attempts (weighted mean odds ratio (wOR)=1.41; CI: 1.09–1.81) and suicide death (wOR=1.28; CI: 1.13–1.45). After accounting for publication bias, prediction was nonsignificant for both suicide attempts and suicide death. Only two factors remained significant after accounting for publication bias—cytokines (wOR=2.87; CI: 1.40–5.93) and low levels of fish oil nutrients (wOR=1.09; CI: 1.01–1.19). Our meta-analysis revealed that currently known biological factors are weak predictors of future suicidal behaviors. This conclusion should be interpreted within the context of the limitations of the existing literature, including long follow-up intervals and a lack of tests of interactions with other risk factors. Future studies addressing these limitations may more effectively test for potential biological risk factors. PMID:27622931
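
    The random-effects pooling behind the weighted mean odds ratios reported above can be illustrated with the DerSimonian-Laird estimator. The sketch below uses made-up study effects and variances and is a generic demonstration of the approach, not the authors' analysis.

```python
# Minimal sketch of a DerSimonian-Laird random-effects meta-analysis of
# log odds ratios. The effect sizes and variances below are made up.
import numpy as np

log_or = np.array([0.45, 0.10, 0.30, -0.05, 0.60])   # per-study log odds ratios
var = np.array([0.04, 0.09, 0.06, 0.12, 0.08])        # per-study sampling variances

# Fixed-effect weights and heterogeneity statistic Q.
w = 1.0 / var
pooled_fe = np.sum(w * log_or) / np.sum(w)
Q = np.sum(w * (log_or - pooled_fe) ** 2)
k = len(log_or)

# Between-study variance tau^2 (DerSimonian-Laird estimator, truncated at zero).
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Random-effects weights, pooled estimate, and 95% confidence interval.
w_re = 1.0 / (var + tau2)
pooled_re = np.sum(w_re * log_or) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
ci = (pooled_re - 1.96 * se, pooled_re + 1.96 * se)

print("weighted mean OR:", np.exp(pooled_re))
print("95% CI:", tuple(np.exp(c) for c in ci))
```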

  8. Applying of Decision Tree Analysis to Risk Factors Associated with Pressure Ulcers in Long-Term Care Facilities.

    PubMed

    Moon, Mikyung; Lee, Soo-Kyoung

    2017-01-01

    The purpose of this study was to use decision tree analysis to explore the factors associated with pressure ulcers (PUs) among elderly people admitted to Korean long-term care facilities. The data were extracted from the 2014 National Inpatient Sample (NIS) data of the Health Insurance Review and Assessment Service (HIRA). A MapReduce-based program was implemented to join and filter 5 tables of the NIS. The outcome predicted by the decision tree model was the prevalence of PUs as defined by the Korean Standard Classification of Disease-7 (KCD-7; code L89*). Using R 3.3.1, a decision tree was generated with the finalized 15,856 cases and 830 variables. The decision tree displayed 15 subgroups with 8 variables showing 0.804 accuracy, 0.820 sensitivity, and 0.787 specificity. The most significant primary predictor of PUs was length of stay less than 0.5 day. Other predictors were the presence of an infectious wound dressing, followed by having diagnoses numbering less than 3.5 and the presence of a simple dressing. Among diagnoses, "injuries to the hip and thigh" was the top predictor, ranking 5th overall. Total hospital cost exceeding 2,200,000 Korean won (US $2,000) rounded out the top 7. These results support previous studies that showed length of stay, comorbidity, and total hospital cost were associated with PUs. Moreover, wound dressings were commonly used to treat PUs. They also show that machine learning, such as a decision tree, could effectively predict PUs using big data.
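
    For readers who want to reproduce this style of analysis, a minimal decision-tree sketch is shown below. It uses scikit-learn on synthetic data (the original work used R on NIS insurance records); the predictor columns and sizes are illustrative assumptions, and the same accuracy, sensitivity, and specificity metrics are reported.

```python
# Minimal sketch of a decision-tree model for a binary outcome (pressure ulcer
# yes/no), evaluated by accuracy, sensitivity, and specificity. The data are
# synthetic; the original study used R on Korean NIS insurance records.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(2)
n = 15000
X = np.column_stack([
    rng.exponential(10, n),        # length of stay (days)
    rng.integers(0, 2, n),         # infectious wound dressing present (0/1)
    rng.integers(1, 10, n),        # number of diagnoses
    rng.exponential(1.5e6, n),     # total hospital cost (Korean won)
])
y = (rng.random(n) < 0.1).astype(int)   # ~10% prevalence, random labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=100, random_state=0)
tree.fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, tree.predict(X_te)).ravel()
print("accuracy:", (tp + tn) / (tp + tn + fp + fn))
print("sensitivity:", tp / (tp + fn) if (tp + fn) else float("nan"))
print("specificity:", tn / (tn + fp))
```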

  9. Human Factors Vehicle Displacement Analysis: Engineering In Motion

    NASA Technical Reports Server (NTRS)

    Atencio, Laura Ashley; Reynolds, David; Robertson, Clay

    2010-01-01

    While positioned on the launch pad at the Kennedy Space Center, tall stacked launch vehicles are exposed to the natural environment. Varying directional winds and vortex shedding cause the vehicle to sway in an oscillating motion. The Human Factors team recognizes that vehicle sway may hinder ground crew operations, impact ground system designs, and ultimately affect launch availability. The objective of this study is to physically simulate the predicted oscillation envelopes identified by analysis and to conduct a Human Factors Analysis to assess the ability to carry out essential Upper Stage (US) ground operator tasks based on predicted vehicle motion.

  10. Quantitative analysis of intrinsic and extrinsic factors in the aggregation mechanism of Alzheimer-associated Aβ-peptide

    NASA Astrophysics Data System (ADS)

    Meisl, Georg; Yang, Xiaoting; Frohm, Birgitta; Knowles, Tuomas P. J.; Linse, Sara

    2016-01-01

    Disease related mutations and environmental factors are key determinants of the aggregation mechanism of the amyloid-β peptide implicated in Alzheimer's disease. Here we present an approach to investigate these factors through acquisition of highly reproducible data and global kinetic analysis to determine the mechanistic influence of intrinsic and extrinsic factors on the Aβ aggregation network. This allows us to translate the shift in macroscopic aggregation behaviour into effects on the individual underlying microscopic steps. We apply this work-flow to the disease-associated Aβ42-A2V variant, and to a variation in pH as examples of an intrinsic and an extrinsic perturbation. In both cases, our data reveal a shift towards a mechanism in which a larger fraction of the reactive flux goes via a pathway that generates potentially toxic oligomeric species in a fibril-catalyzed reaction. This is in agreement with the finding that Aβ42-A2V leads to early-onset Alzheimer’s disease and enhances neurotoxicity.

  11. Bayesian Factor Analysis When Only a Sample Covariance Matrix Is Available

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Arav, Marina

    2006-01-01

    In traditional factor analysis, the variance-covariance matrix or the correlation matrix has often been a form of inputting data. In contrast, in Bayesian factor analysis, the entire data set is typically required to compute the posterior estimates, such as Bayes factor loadings and Bayes unique variances. We propose a simple method for computing…

  12. Evaluation of Parallel Analysis Methods for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.

    2010-01-01

    Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…
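
    Parallel analysis retains components whose observed eigenvalues exceed those obtained from random data of the same dimensions, under either the mean-eigenvalue or the 95th-percentile criterion. A minimal PA-PCA sketch on synthetic data follows; the retention counts use the common simplification of counting all exceedances rather than the strictly sequential rule.

```python
# Minimal sketch of parallel analysis with principal components (PA-PCA):
# compare eigenvalues of the observed correlation matrix against eigenvalues
# obtained from many pure-noise data sets of the same size. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n_obs, n_vars, n_sims = 500, 12, 1000
data = rng.normal(size=(n_obs, n_vars))          # stand-in for real responses

obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

# Eigenvalues of correlation matrices of random data with the same shape.
rand_eig = np.empty((n_sims, n_vars))
for s in range(n_sims):
    noise = rng.normal(size=(n_obs, n_vars))
    rand_eig[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False)))[::-1]

mean_crit = rand_eig.mean(axis=0)                # mean-eigenvalue criterion
p95_crit = np.percentile(rand_eig, 95, axis=0)   # 95th-percentile criterion

print("factors retained (mean rule):", int(np.sum(obs_eig > mean_crit)))
print("factors retained (95th percentile rule):", int(np.sum(obs_eig > p95_crit)))
```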

  13. Ongoing Analysis of Rocket Based Combined Cycle Engines by the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Ruf, Joseph; Holt, James B.; Canabal, Francisco

    1999-01-01

    This paper presents the status of analyses on three Rocket Based Combined Cycle configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes code for ejector mode fluid dynamics. The Draco engine analysis is a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.

  14. Incidence of cardiovascular events and associated risk factors in kidney transplant patients: a competing risks survival analysis.

    PubMed

    Seoane-Pillado, María Teresa; Pita-Fernández, Salvador; Valdés-Cañedo, Francisco; Seijo-Bestilleiro, Rocio; Pértega-Díaz, Sonia; Fernández-Rivera, Constantino; Alonso-Hernández, Ángel; González-Martín, Cristina; Balboa-Barreiro, Vanesa

    2017-03-07

    The high prevalence of cardiovascular risk factors among the renal transplant population accounts for increased mortality. The aim of this study is to determine the incidence of cardiovascular events and factors associated with cardiovascular events in these patients. An observational ambispective follow-up study of renal transplant recipients (n = 2029) in the health district of A Coruña (Spain) during the period 1981-2011 was completed. Competing risk survival analysis methods were applied to estimate the cumulative incidence of developing cardiovascular events over time and to identify which characteristics were associated with the risk of these events. Post-transplant cardiovascular events are defined as the presence of myocardial infarction, invasive coronary artery therapy, cerebral vascular events, new-onset angina, congestive heart failure, rhythm disturbances, peripheral vascular disease and cardiovascular disease and death. The cause of death was identified through the medical history and death certificate using ICD9 (390-459, except: 427.5, 435, 446, 459.0). The mean age of patients at the time of transplantation was 47.0 ± 14.2 years; 62% were male. 16.5% had suffered some cardiovascular disease prior to transplantation and 9.7% had suffered a cardiovascular event. The mean follow-up period for the patients with cardiovascular event was 3.5 ± 4.3 years. Applying competing risk methodology, it was observed that the accumulated incidence of the event was 5.0% one year after transplantation, 8.1% after five years, and 11.9% after ten years. After applying multivariate models, the variables with an independent effect for predicting cardiovascular events are: male sex, age of recipient, previous cardiovascular disorders, pre-transplant smoking and post-transplant diabetes. This study makes it possible to determine in kidney transplant patients, taking into account competitive events, the incidence of post-transplant cardiovascular events and
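
    Under competing risks, the cumulative incidence of cardiovascular events is estimated by weighting event-specific hazards by all-cause event-free survival (the Aalen-Johansen construction) rather than by one minus a Kaplan-Meier curve. The sketch below illustrates that estimator on made-up follow-up data; it is not the study's analysis, and the event coding is an assumption for illustration.

```python
# Minimal sketch of a nonparametric cumulative incidence function (CIF) under
# competing risks (Aalen-Johansen): at each event time, CIF for the event of
# interest increases by S(t-) * d_k / n_at_risk, with S the all-cause
# Kaplan-Meier survival. Follow-up data below are made up.
import numpy as np

rng = np.random.default_rng(4)
n = 2000
time = rng.exponential(8.0, n)                       # years of follow-up
# 0 = censored, 1 = cardiovascular event, 2 = competing event (e.g., death)
event = rng.choice([0, 1, 2], size=n, p=[0.6, 0.15, 0.25])

order = np.argsort(time)
time, event = time[order], event[order]

surv = 1.0          # all-cause event-free survival just before the current time
cif = 0.0           # cumulative incidence of the event of interest (type 1)
cif_at = {}
at_risk = n
for t, e in zip(time, event):
    if e == 1:
        cif += surv * (1.0 / at_risk)                # event-specific increment
    if e != 0:
        surv *= 1.0 - 1.0 / at_risk                  # all-cause KM update
    at_risk -= 1
    cif_at[t] = cif

for horizon in (1, 5, 10):
    reached = [v for t, v in cif_at.items() if t <= horizon]
    print(f"CIF of CV events at {horizon} years:", reached[-1] if reached else 0.0)
```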

  15. Deep Learning with Hierarchical Convolutional Factor Analysis

    PubMed Central

    Chen, Bo; Polatkan, Gungor; Sapiro, Guillermo; Blei, David; Dunson, David; Carin, Lawrence

    2013-01-01

    Unsupervised multi-layered (“deep”) models are considered for general data, with a particular focus on imagery. The model is represented using a hierarchical convolutional factor-analysis construction, with sparse factor loadings and scores. The computation of layer-dependent model parameters is implemented within a Bayesian setting, employing a Gibbs sampler and variational Bayesian (VB) analysis, that explicitly exploit the convolutional nature of the expansion. In order to address large-scale and streaming data, an online version of VB is also developed. The number of basis functions or dictionary elements at each layer is inferred from the data, based on a beta-Bernoulli implementation of the Indian buffet process. Example results are presented for several image-processing applications, with comparisons to related models in the literature. PMID:23787342

  16. The Meaning of Higher-Order Factors in Reflective-Measurement Models

    ERIC Educational Resources Information Center

    Eid, Michael; Koch, Tobias

    2014-01-01

    Higher-order factor analysis is a widely used approach for analyzing the structure of a multidimensional test. Whenever first-order factors are correlated researchers are tempted to apply a higher-order factor model. But is this reasonable? What do the higher-order factors measure? What is their meaning? Willoughby, Holochwost, Blanton, and Blair…

  17. Electronic health record analysis via deep poisson factor models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henao, Ricardo; Lu, James T.; Lucas, Joseph E.

    Electronic Health Record (EHR) phenotyping utilizes patient data captured through normal medical practice, to identify features that may represent computational medical phenotypes. These features may be used to identify at-risk patients and improve prediction of patient morbidity and mortality. We present a novel deep multi-modality architecture for EHR analysis (applicable to joint analysis of multiple forms of EHR data), based on Poisson Factor Analysis (PFA) modules. Each modality, composed of observed counts, is represented as a Poisson distribution, parameterized in terms of hidden binary units. Information from different modalities is shared via a deep hierarchy of common hidden units. Activation of these binary units occurs with probability characterized as Bernoulli-Poisson link functions, instead of more traditional logistic link functions. In addition, we demonstrate that PFA modules can be adapted to discriminative modalities. To compute model parameters, we derive efficient Markov Chain Monte Carlo (MCMC) inference that scales efficiently, with significant computational gains when compared to related models based on logistic link functions. To explore the utility of these models, we apply them to a subset of patients from the Duke-Durham patient cohort. We identified a cohort of over 12,000 patients with Type 2 Diabetes Mellitus (T2DM) based on diagnosis codes and laboratory tests out of our patient population of over 240,000. Examining the common hidden units uniting the PFA modules, we identify patient features that represent medical concepts. Experiments indicate that our learned features are better able to predict mortality and morbidity than clinical features identified previously in a large-scale clinical trial.

  18. Electronic health record analysis via deep poisson factor models

    DOE PAGES

    Henao, Ricardo; Lu, James T.; Lucas, Joseph E.; ...

    2016-01-01

    Electronic Health Record (EHR) phenotyping utilizes patient data captured through normal medical practice, to identify features that may represent computational medical phenotypes. These features may be used to identify at-risk patients and improve prediction of patient morbidity and mortality. We present a novel deep multi-modality architecture for EHR analysis (applicable to joint analysis of multiple forms of EHR data), based on Poisson Factor Analysis (PFA) modules. Each modality, composed of observed counts, is represented as a Poisson distribution, parameterized in terms of hidden binary units. Information from different modalities is shared via a deep hierarchy of common hidden units. Activation of these binary units occurs with probability characterized as Bernoulli-Poisson link functions, instead of more traditional logistic link functions. In addition, we demonstrate that PFA modules can be adapted to discriminative modalities. To compute model parameters, we derive efficient Markov Chain Monte Carlo (MCMC) inference that scales efficiently, with significant computational gains when compared to related models based on logistic link functions. To explore the utility of these models, we apply them to a subset of patients from the Duke-Durham patient cohort. We identified a cohort of over 12,000 patients with Type 2 Diabetes Mellitus (T2DM) based on diagnosis codes and laboratory tests out of our patient population of over 240,000. Examining the common hidden units uniting the PFA modules, we identify patient features that represent medical concepts. Experiments indicate that our learned features are better able to predict mortality and morbidity than clinical features identified previously in a large-scale clinical trial.

  19. Exploratory Factor Analysis of a Force Concept Inventory Data Set

    ERIC Educational Resources Information Center

    Scott, Terry F.; Schumayer, Daniel; Gray, Andrew R.

    2012-01-01

    We perform a factor analysis on a "Force Concept Inventory" (FCI) data set collected from 2109 respondents. We address two questions: the appearance of conceptual coherence in student responses to the FCI and some consequences of this factor analysis on the teaching of Newtonian mechanics. We will highlight the apparent conflation of Newton's…

  20. Automated processing of first-pass radionuclide angiocardiography by factor analysis of dynamic structures.

    PubMed

    Cavailloles, F; Bazin, J P; Capderou, A; Valette, H; Herbert, J L; Di Paola, R

    1987-05-01

    A method for automatic processing of cardiac first-pass radionuclide studies is presented. This technique, factor analysis of dynamic structures (FADS), provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. FADS has been applied to 76 studies. A description of factor patterns obtained in various pathological categories is presented. FADS provides easy diagnosis of shunts and tricuspid insufficiency. Quantitative information derived from the factors (cardiac output and mean transit time) was compared to that obtained by the region of interest method. Using FADS, a higher correlation with cardiac catheterization was found for cardiac output calculation. Thus, compared to the ROI method, FADS presents obvious advantages: a good separation of overlapping cardiac chambers is obtained; this operator-independent method provides more objective and reproducible results. A number of parameters of the cardio-pulmonary function can be assessed by first-pass radionuclide angiocardiography (RNA) [1,2]. Usually, they are calculated using time-activity curves (TAC) from regions of interest (ROI) drawn on the cardiac chambers and the lungs. This method has two main drawbacks: (1) the lack of inter- and intra-observer reproducibility; (2) the problem of crosstalk which affects the evaluation of the cardio-pulmonary performance. The crosstalk on planar imaging is due to anatomical superimposition of the cardiac chambers and lungs. The activity measured in any ROI is the sum of the activity in several organs and 'decontamination' of the TAC cannot easily be performed using the ROI method [3]. Factor analysis of dynamic structures (FADS) [4,5] can solve the two problems mentioned above. It provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. The resulting factors are estimates of the time evolution of the activity in each

  1. [Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].

    PubMed

    Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina

    2012-09-01

    The validation of a measurement tool in mental health is a complex process that usually starts by estimating reliability and later addresses validity. Factor analysis is a way to determine the number of dimensions, domains or factors of a measuring tool, generally related to the construct validity of the scale. The analysis may be exploratory or confirmatory, and it helps in selecting the items with the best performance. For an acceptable factor analysis, it is necessary to follow certain steps and recommendations, conduct appropriate statistical tests, and rely on a proper sample of participants. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  2. Confirmatory Factor Analysis and Differential Relationships of the Two Subdomains of Negative Symptoms in Chronically Ill Psychotic Patients

    PubMed Central

    Stiekema, Annemarie P. M.; Liemburg, Edith J.; van der Meer, Lisette; Castelein, Stynke; Stewart, Roy; van Weeghel, Jaap; Aleman, André; Bruggeman, Richard

    2016-01-01

    Research suggests a two-factor structure for negative symptoms in patients with psychotic disorders: social amotivation (SA) and expressive deficits (ED). Applying this two-factor structure in clinical settings may provide valuable information with regard to outcomes and help target treatments. We aimed to investigate 1) whether the factor structure is also supported in chronically ill patients with a psychotic disorder and 2) what the relationship is between these factors and functioning (overall functioning and living situation), depressive symptoms and quality of life. A total of 1157 patients with a psychotic disorder and a duration of illness of 5 years or more were included in the analysis (data selected from the Pharmacotherapy Monitoring Outcome Survey; PHAMOUS). A confirmatory factor analysis was performed using items of the Positive and Negative Syndrome Scale that were previously identified to reflect negative symptoms (N1-4, N6, G5, G7, G13, G16). Subsequently, regression analysis was performed on outcomes. The results confirmed the distinction between SA (N2, N4, G16) and ED (N1, N3, N6, G5, G7, G13) in chronically ill patients. Both factors were related to worse overall functioning as measured with the Health of the Nation Outcome Scales, and ED was uniquely associated with residential living status. Higher scores for SA were associated with more depressive symptoms and worse quality of life. Thus, SA is most strongly related to the level of social-emotional functioning, while ED is more related to living situation and thereby indicative of the level of everyday functioning. This subdivision may be useful for research purposes and be a valuable additional tool in clinical practice and treatment development. PMID:26895203

  3. Chemical information obtained from Auger depth profiles by means of advanced factor analysis (MLCFA)

    NASA Astrophysics Data System (ADS)

    De Volder, P.; Hoogewijs, R.; De Gryse, R.; Fiermans, L.; Vennik, J.

    1993-01-01

    The advanced multivariate statistical technique "maximum likelihood common factor analysis (MLCFA)" is shown to be superior to "principal component analysis (PCA)" for decomposing overlapping peaks into their individual component spectra of which neither the number of components nor the peak shape of the component spectra is known. An examination of the maximum resolving power of both techniques, MLCFA and PCA, by means of artificially created series of multicomponent spectra confirms this finding unambiguously. Substantial progress in the use of AES as a chemical-analysis technique is accomplished through the implementation of MLCFA. Chemical information from Auger depth profiles is extracted by investigating the variation of the line shape of the Auger signal as a function of the changing chemical state of the element. In particular, MLCFA combined with Auger depth profiling has been applied to problems related to steelcord-rubber tyre adhesion. MLCFA allows one to elucidate the precise nature of the interfacial layer of reaction products between natural rubber vulcanized on a thin brass layer. This study reveals many interesting chemical aspects of the oxi-sulfidation of brass undetectable with classical AES.

  4. Analysis of Factors Influencing Creative Personality of Elementary School Students

    ERIC Educational Resources Information Center

    Park, Jongman; Kim, Minkee; Jang, Shinho

    2017-01-01

    This quantitative research examined factors that affect elementary students' creativity and how those factors correlate. Aiming to identify significant factors that affect creativity and to clarify the relationship between these factors by path analysis, this research was designed to be a stepping stone for creativity enhancement studies. Data…

  5. Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.

    PubMed

    Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin

    2017-08-16

    The objective consensus methodology has recently been applied to consensus finding in several studies of medical decision-making among clinical experts or in guidelines. The main advantages of this method are an automated analysis and comparison of the treatment algorithms of the participating centers, which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for the successful collection of decision trees and summarizes important aspects at each point of the analysis.

  6. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Appendix D to Part 172—Rail Risk Analysis Factors (49 CFR, Transportation; 2011-10-01). This appendix addresses the safety and security risk analyses required by § 172.820. The risk analysis to be performed may be…

  7. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Appendix D to Part 172—Rail Risk Analysis Factors (49 CFR, Transportation; 2010-10-01). This appendix addresses the safety and security risk analyses required by § 172.820. The risk analysis to be performed may be…

  8. Multi-Omics Factor Analysis-a framework for unsupervised integration of multi-omics data sets.

    PubMed

    Argelaguet, Ricard; Velten, Britta; Arnol, Damien; Dietrich, Sascha; Zenz, Thorsten; Marioni, John C; Buettner, Florian; Huber, Wolfgang; Stegle, Oliver

    2018-06-20

    Multi-omics studies promise the improved characterization of biological processes across molecular layers. However, methods for the unsupervised integration of the resulting heterogeneous data sets are lacking. We present Multi-Omics Factor Analysis (MOFA), a computational method for discovering the principal sources of variation in multi-omics data sets. MOFA infers a set of (hidden) factors that capture biological and technical sources of variability. It disentangles axes of heterogeneity that are shared across multiple modalities and those specific to individual data modalities. The learnt factors enable a variety of downstream analyses, including identification of sample subgroups, data imputation and the detection of outlier samples. We applied MOFA to a cohort of 200 patient samples of chronic lymphocytic leukaemia, profiled for somatic mutations, RNA expression, DNA methylation and ex vivo drug responses. MOFA identified major dimensions of disease heterogeneity, including immunoglobulin heavy-chain variable region status, trisomy of chromosome 12 and previously underappreciated drivers, such as response to oxidative stress. In a second application, we used MOFA to analyse single-cell multi-omics data, identifying coordinated transcriptional and epigenetic changes along cell differentiation. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.

  9. Examining evolving performance on the Force Concept Inventory using factor analysis

    NASA Astrophysics Data System (ADS)

    Semak, M. R.; Dietz, R. D.; Pearson, R. H.; Willis, C. W.

    2017-06-01

    The application of factor analysis to the Force Concept Inventory (FCI) has proven to be problematic. Some studies have suggested that factor analysis of test results serves as a helpful tool in assessing the recognition of Newtonian concepts by students. Other work has produced at best ambiguous results. For the FCI administered as a pre- and post-test, we see factor analysis as a tool by which the changes in conceptual associations made by our students may be gauged given the evolution of their response patterns. This analysis allows us to identify and track conceptual linkages, affording us insight as to how our students have matured due to instruction. We report on our analysis of 427 pre- and post-tests. The factor models for the pre- and post-tests are explored and compared along with the methodology by which these models were fit to the data. The post-test factor pattern is more aligned with an expert's interpretation of the questions' content, as it allows for a more readily identifiable relationship between factors and physical concepts. We discuss this evolution in the context of approaching the characteristics of an expert with force concepts. Also, we find that certain test items do not significantly contribute to the pre- or post-test factor models and attempt explanations as to why this is so. This may suggest that such questions may not be effective in probing the conceptual understanding of our students.

  10. Newly Graduated Nurses' Competence and Individual and Organizational Factors: A Multivariate Analysis.

    PubMed

    Numminen, Olivia; Leino-Kilpi, Helena; Isoaho, Hannu; Meretoja, Riitta

    2015-09-01

    To study the relationships between newly graduated nurses' (NGNs') perceptions of their professional competence and individual and organizational work-related factors. A multivariate, quantitative, descriptive, correlational design was applied. Data collection took place in November 2012 with a national convenience sample of 318 NGNs representing all main healthcare settings in Finland. Five instruments measured NGNs' perceptions of their professional competence, occupational commitment, empowerment, practice environment, and its ethical climate, with additional questions on turnover intentions, job satisfaction, and demographics. Descriptive statistics summarized the demographic data, and inferential multivariate path analysis modeling estimated the relationships between the variables. The strongest relationship was found between professional competence and empowerment, competence explaining 20% of the variance of empowerment. The explanatory power of competence regarding practice environment, ethical climate of the work unit, and occupational commitment, and competence's associations with turnover intentions, job satisfaction, and age, were statistically significant but considerably weaker. Higher competence and satisfaction with quality of care were associated with more positive perceptions of practice environment and its ethical climate as well as higher empowerment and occupational commitment. Apart from its association with empowerment, competence seems to be a rather independent factor in relation to the measured work-related factors. Further exploration would deepen the knowledge of this relationship, providing support for planning educational and developmental programs. Research on other individual and organizational factors is warranted to shed light on factors associated with professional competence in providing high-quality and safe care as well as retaining new nurses in the workforce. The study sheds light on the strength and direction of

  11. A critique of the usefulness of inferential statistics in applied behavior analysis

    PubMed Central

    Hopkins, B. L.; Cole, Brian L.; Mason, Tina L.

    1998-01-01

    Researchers continue to recommend that applied behavior analysts use inferential statistics in making decisions about effects of independent variables on dependent variables. In many other approaches to behavioral science, inferential statistics are the primary means for deciding the importance of effects. Several possible uses of inferential statistics are considered. Rather than being an objective means for making decisions about effects, as is often claimed, inferential statistics are shown to be subjective. It is argued that the use of inferential statistics adds nothing to the complex and admittedly subjective nonstatistical methods that are often employed in applied behavior analysis. Attacks on inferential statistics that are being made, perhaps with increasing frequency, by those who are not behavior analysts, are discussed. These attackers are calling for banning the use of inferential statistics in research publications and commonly recommend that behavioral scientists should switch to using statistics aimed at interval estimation or the method of confidence intervals. Interval estimation is shown to be contrary to the fundamental assumption of behavior analysis that only individuals behave. It is recommended that authors who wish to publish the results of inferential statistics be asked to justify them as a means for helping us to identify any ways in which they may be useful. PMID:22478304

  12. Confirmatory factor analysis using Microsoft Excel.

    PubMed

    Miles, Jeremy N V

    2005-11-01

    This article presents a method for using Microsoft (MS) Excel for confirmatory factor analysis (CFA). CFA is often seen as an impenetrable technique, and thus, when it is taught, there is frequently little explanation of the mechanisms or underlying calculations. The aim of this article is to demonstrate that this is not the case; it is relatively straightforward to produce a spreadsheet in MS Excel that can carry out simple CFA. It is possible, with few or no programming skills, to effectively program a CFA analysis and, thus, to gain insight into the workings of the procedure.
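
    The calculation such a spreadsheet carries out is small: choose loadings and unique variances so that the model-implied covariance matches the sample covariance under the maximum-likelihood discrepancy. A minimal one-factor illustration with scipy on simulated data is sketched below; it mirrors the mechanics the article implements in Excel but is not the article's workbook, and all variable names are illustrative.

```python
# Minimal sketch of a one-factor confirmatory factor analysis fitted by
# minimizing the maximum-likelihood discrepancy
#   F(lambda, psi) = log|Sigma| + tr(S Sigma^-1) - log|S| - p,
# where Sigma = lambda lambda' + diag(psi). Data are simulated, not from the article.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
p, n = 6, 400
true_lam = np.array([0.8, 0.7, 0.6, 0.75, 0.65, 0.5])
factor = rng.normal(size=(n, 1))
X = factor @ true_lam[None, :] + rng.normal(scale=0.6, size=(n, p))
S = np.cov(X, rowvar=False)

def discrepancy(theta):
    lam, psi = theta[:p], np.exp(theta[p:])          # exp keeps unique variances positive
    sigma = np.outer(lam, lam) + np.diag(psi)
    logdet_sigma = np.linalg.slogdet(sigma)[1]
    return logdet_sigma + np.trace(S @ np.linalg.inv(sigma)) - np.linalg.slogdet(S)[1] - p

theta0 = np.concatenate([np.full(p, 0.5), np.log(np.full(p, 0.5))])
res = minimize(discrepancy, theta0, method="L-BFGS-B")
lam_hat, psi_hat = res.x[:p], np.exp(res.x[p:])
print("estimated loadings:", np.round(lam_hat, 2))
print("estimated unique variances:", np.round(psi_hat, 2))
```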

  13. A Time Series Analysis: Weather Factors, Human Migration and Malaria Cases in Endemic Area of Purworejo, Indonesia, 2005–2014

    PubMed Central

    REJEKI, Dwi Sarwani Sri; NURHAYATI, Nunung; AJI, Budi; MURHANDARWATI, E. Elsa Herdiana; KUSNANTO, Hari

    2018-01-01

    Background: Climatic and weather factors become important determinants of vector-borne diseases transmission like malaria. This study aimed to prove relationships between weather factors with considering human migration and previous case findings and malaria cases in endemic areas in Purworejo during 2005–2014. Methods: This study employed ecological time series analysis by using monthly data. The independent variables were the maximum temperature, minimum temperature, maximum humidity, minimum humidity, precipitation, human migration, and previous malaria cases, while the dependent variable was positive malaria cases. Three models of count data regression analysis i.e. Poisson model, quasi-Poisson model, and negative binomial model were applied to measure the relationship. The least Akaike Information Criteria (AIC) value was also performed to find the best model. Negative binomial regression analysis was considered as the best model. Results: The model showed that humidity (lag 2), precipitation (lag 3), precipitation (lag 12), migration (lag1) and previous malaria cases (lag 12) had a significant relationship with malaria cases. Conclusion: Weather, migration and previous malaria cases factors need to be considered as prominent indicators for the increase of malaria case projection. PMID:29900134
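
    The count-model comparison described above (Poisson versus negative binomial on lagged covariates, selected by AIC) can be sketched with statsmodels GLMs. The data, lag choices, and variable names below are synthetic and illustrative, and in this GLM form the negative binomial dispersion is held fixed rather than estimated as in the study.

```python
# Minimal sketch comparing Poisson and negative binomial regressions of monthly
# malaria counts on lagged covariates, using statsmodels GLMs on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
months = 120
df = pd.DataFrame({
    "cases": rng.poisson(30, months),
    "humidity": rng.uniform(60, 95, months),
    "precipitation": rng.gamma(2.0, 50.0, months),
    "migration": rng.poisson(100, months),
})

# Lagged covariates, e.g. humidity at lag 2, precipitation at lag 3, migration at lag 1.
df["humidity_lag2"] = df["humidity"].shift(2)
df["precip_lag3"] = df["precipitation"].shift(3)
df["migration_lag1"] = df["migration"].shift(1)
df = df.dropna()

X = sm.add_constant(df[["humidity_lag2", "precip_lag3", "migration_lag1"]])
y = df["cases"]

poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial()).fit()

# Model selection by AIC, as in the study (lower is better).
print("Poisson AIC:", round(poisson_fit.aic, 1))
print("Negative binomial AIC:", round(negbin_fit.aic, 1))
print(negbin_fit.summary().tables[1])
```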

  14. Applying Intelligent Algorithms to Automate the Identification of Error Factors.

    PubMed

    Jin, Haizhe; Qu, Qingxing; Munechika, Masahiko; Sano, Masataka; Kajihara, Chisato; Duffy, Vincent G; Chen, Han

    2018-05-03

    Medical errors are the manifestation of defects occurring in medical processes. Extracting and identifying defects as medical error factors from these processes is an effective approach to preventing medical errors. However, it is a difficult and time-consuming task and requires an analyst with a professional medical background. A method is therefore needed to extract medical error factors while reducing the difficulty of extraction. In this research, a systematic methodology to extract and identify error factors in the medical administration process was proposed. The design of the error report, extraction of the error factors, and identification of the error factors were analyzed. Based on 624 medical error cases across four medical institutes in Japan and China, 19 error-related items and their levels were extracted. These items were then closely related to 12 error factors. The relational model between the error-related items and error factors was established based on a genetic algorithm (GA)-back-propagation neural network (BPNN) model. Additionally, compared to BPNN, partial least squares regression, and support vector regression, GA-BPNN exhibited a higher overall prediction accuracy, being able to promptly identify the error factors from the error-related items. The combination of "error-related items, their different levels, and the GA-BPNN model" was proposed as an error-factor identification technology, which could automatically identify medical error factors.

  15. Factor and Rasch analysis of the Fonseca anamnestic index for the diagnosis of myogenous temporomandibular disorder.

    PubMed

    Rodrigues-Bigaton, Delaine; de Castro, Ester M; Pires, Paulo F

    Rasch analysis has been used in recent studies to test the psychometric properties of a questionnaire. The conditions for use of the Rasch model are one-dimensionality (assessed via prior factor analysis) and local independence (the probability of getting a particular item right or wrong should not be conditioned upon success or failure in another). To evaluate the dimensionality and the psychometric properties of the Fonseca anamnestic index (FAI), such as the fit of the data to the model, the degree of difficulty of the items, and the ability to respond in patients with myogenous temporomandibular disorder (TMD). The sample consisted of 94 women with myogenous TMD, diagnosed by the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD), who answered the FAI. For the factor analysis, we applied the Kaiser-Meyer-Olkin test, Bartlett's sphericity, Spearman's correlation, and the determinant of the correlation matrix. For extraction of the factors/dimensions, an eigenvalue >1.0 was used, followed by oblique oblimin rotation. The Rasch analysis was conducted on the dimension that showed the highest proportion of variance explained. Adequate sample "n" and FAI multidimensionality were observed. Dimension 1 (primary) consisted of items 1, 2, 3, 6, and 7. All items of dimension 1 showed adequate fit to the model, being observed according to the degree of difficulty (from most difficult to easiest), respectively, items 2, 1, 3, 6, and 7. The FAI presented multidimensionality with its main dimension consisting of five reliable items with adequate fit to the composition of its structure. Copyright © 2017 Associação Brasileira de Pesquisa e Pós-Graduação em Fisioterapia. Publicado por Elsevier Editora Ltda. All rights reserved.
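
    The adequacy checks and extraction rules named above (Kaiser-Meyer-Olkin, Bartlett's sphericity, eigenvalue > 1, oblique oblimin rotation) are available in standard tooling. The sketch below assumes the third-party Python package factor_analyzer and synthetic item responses standing in for the FAI data; it is an illustration of the workflow, not the authors' analysis.

```python
# Minimal sketch of the pre-checks and extraction described above, assuming the
# third-party `factor_analyzer` package: KMO sampling adequacy, Bartlett's test
# of sphericity, eigenvalue > 1 extraction, and oblique (oblimin) rotation.
# Item responses are synthetic stand-ins for the 10-item FAI.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

rng = np.random.default_rng(7)
items = pd.DataFrame(rng.integers(0, 3, size=(94, 10)),
                     columns=[f"fai_{i+1}" for i in range(10)])

chi2, p_value = calculate_bartlett_sphericity(items)
kmo_per_item, kmo_total = calculate_kmo(items)
print("Bartlett chi2, p:", round(chi2, 1), round(p_value, 3))
print("overall KMO:", round(kmo_total, 2))

# Retain factors with eigenvalue > 1, then refit with oblimin rotation.
fa0 = FactorAnalyzer(rotation=None)
fa0.fit(items)
eigenvalues, _ = fa0.get_eigenvalues()
n_factors = int(np.sum(eigenvalues > 1.0))

fa = FactorAnalyzer(n_factors=n_factors, rotation="oblimin")
fa.fit(items)
print(pd.DataFrame(fa.loadings_, index=items.columns).round(2))
```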

  16. Using Separable Nonnegative Matrix Factorization Techniques for the Analysis of Time-Resolved Raman Spectra

    NASA Astrophysics Data System (ADS)

    Luce, R.; Hildebrandt, P.; Kuhlmann, U.; Liesen, J.

    2016-09-01

    The key challenge of time-resolved Raman spectroscopy is the identification of the constituent species and the analysis of the kinetics of the underlying reaction network. In this work we present an integral approach that allows for determining both the component spectra and the rate constants simultaneously from a series of vibrational spectra. It is based on an algorithm for non-negative matrix factorization which is applied to the experimental data set following a few pre-processing steps. As a prerequisite for physically unambiguous solutions, each component spectrum must include one vibrational band that does not significantly interfere with vibrational bands of other species. The approach is applied to synthetic "experimental" spectra derived from model systems comprising a set of species with component spectra differing with respect to their degree of spectral interferences and signal-to-noise ratios. In each case, the species involved are connected via monomolecular reaction pathways. The potential and limitations of the approach for recovering the respective rate constants and component spectra are discussed.

  17. Using Separable Nonnegative Matrix Factorization Techniques for the Analysis of Time-Resolved Raman Spectra.

    PubMed

    Luce, Robert; Hildebrandt, Peter; Kuhlmann, Uwe; Liesen, Jörg

    2016-09-01

    The key challenge of time-resolved Raman spectroscopy is the identification of the constituent species and the analysis of the kinetics of the underlying reaction network. In this work we present an integral approach that allows for determining both the component spectra and the rate constants simultaneously from a series of vibrational spectra. It is based on an algorithm for nonnegative matrix factorization that is applied to the experimental data set following a few pre-processing steps. As a prerequisite for physically unambiguous solutions, each component spectrum must include one vibrational band that does not significantly interfere with the vibrational bands of other species. The approach is applied to synthetic "experimental" spectra derived from model systems comprising a set of species with component spectra differing with respect to their degree of spectral interferences and signal-to-noise ratios. In each case, the species involved are connected via monomolecular reaction pathways. The potential and limitations of the approach for recovering the respective rate constants and component spectra are discussed. © The Author(s) 2016.
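
    At its core the approach factorizes the time-by-wavenumber data matrix into nonnegative concentration profiles and component spectra. The sketch below is a generic illustration with scikit-learn's NMF on synthetic spectra; it omits the pre-processing and the subsequent kinetic fitting of rate constants described in the paper.

```python
# Minimal sketch of nonnegative matrix factorization applied to a stack of
# time-resolved spectra: D (time x wavenumber) ~ C (time x k) @ S (k x wavenumber),
# with C the concentration profiles and S the component spectra. Data are
# synthetic, built from two Gaussian bands and first-order kinetics A -> B.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(8)
n_times, n_wavenumbers, k = 60, 400, 2
t = np.linspace(0, 5, n_times)[:, None]
wn = np.arange(n_wavenumbers)

# Two synthetic component spectra and their time-dependent concentrations.
spectra = np.vstack([np.exp(-((wn - 120) / 15.0) ** 2),
                     np.exp(-((wn - 280) / 20.0) ** 2)])
conc = np.hstack([np.exp(-t), 1.0 - np.exp(-t)])
D = conc @ spectra + 0.01 * rng.random((n_times, n_wavenumbers))

model = NMF(n_components=k, init="nndsvd", max_iter=1000)
C_hat = model.fit_transform(D)        # estimated concentration profiles
S_hat = model.components_             # estimated component spectra
print("reconstruction error:", round(model.reconstruction_err_, 3))
```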

  18. Applying human factors principles to alert design increases efficiency and reduces prescribing errors in a scenario-based simulation.

    PubMed

    Russ, Alissa L; Zillich, Alan J; Melton, Brittany L; Russell, Scott A; Chen, Siying; Spina, Jeffrey R; Weiner, Michael; Johnson, Elizabette G; Daggy, Joanne K; McManus, M Sue; Hawsey, Jason M; Puleo, Anthony G; Doebbeling, Bradley N; Saleem, Jason J

    2014-10-01

    To apply human factors engineering principles to improve alert interface design. We hypothesized that incorporating human factors principles into alerts would improve usability, reduce workload for prescribers, and reduce prescribing errors. We performed a scenario-based simulation study using a counterbalanced, crossover design with 20 Veterans Affairs prescribers to compare original versus redesigned alerts. We redesigned drug-allergy, drug-drug interaction, and drug-disease alerts based upon human factors principles. We assessed usability (learnability of redesign, efficiency, satisfaction, and usability errors), perceived workload, and prescribing errors. Although prescribers received no training on the design changes, prescribers were able to resolve redesigned alerts more efficiently (median (IQR): 56 (47) s) compared to the original alerts (85 (71) s; p=0.015). In addition, prescribers rated redesigned alerts significantly higher than original alerts across several dimensions of satisfaction. Redesigned alerts led to a modest but significant reduction in workload (p=0.042) and significantly reduced the number of prescribing errors per prescriber (median (range): 2 (1-5) compared to original alerts: 4 (1-7); p=0.024). Aspects of the redesigned alerts that likely contributed to better prescribing include design modifications that reduced usability-related errors, providing clinical data closer to the point of decision, and displaying alert text in a tabular format. Displaying alert text in a tabular format may help prescribers extract information quickly and thereby increase responsiveness to alerts. This simulation study provides evidence that applying human factors design principles to medication alerts can improve usability and prescribing outcomes. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  19. Quantification of applied dose in irradiated citrus fruits by DNA Comet Assay together with image analysis.

    PubMed

    Cetinkaya, Nurcan; Ercin, Demet; Özvatan, Sümer; Erel, Yakup

    2016-02-01

    The experiments were conducted to quantify the applied dose for quarantine control in irradiated citrus fruits. Citrus fruits were exposed to doses of 0.1 to 1.5 kGy and analyzed by DNA Comet Assay. Observed comets were evaluated by image analysis. The tail length, tail moment and tail DNA% of comets were used for the interpretation of comets. Irradiated citrus fruits showed tails separated from the head of the comet with increasing applied doses from 0.1 to 1.5 kGy. The mean tail length and mean tail moment% levels of irradiated citrus fruits at all doses were significantly different (p < 0.01) from the control, even for the lowest dose of 0.1 kGy. Thus, DNA Comet Assay may be a practical quarantine control method for irradiated citrus fruits, since it has been possible to estimate applied doses as low as 0.1 kGy when it is combined with image analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. An Annotated Bibliography of Articles in the "Journal of Speech and Language Pathology-Applied Behavior Analysis"

    ERIC Educational Resources Information Center

    Esch, Barbara E.; Forbes, Heather J.

    2017-01-01

    The open-source "Journal of Speech and Language Pathology-Applied Behavior Analysis" ("JSLP-ABA") was published online from 2006 to 2010. We present an annotated bibliography of 80 articles published in the now-defunct journal with the aim of representing its scholarly content to readers of "The Analysis of Verbal…

  1. Testing all six person-oriented principles in dynamic factor analysis.

    PubMed

    Molenaar, Peter C M

    2010-05-01

    All six person-oriented principles identified by Sterba and Bauer's Keynote Article can be tested by means of dynamic factor analysis in its current form. In particular, it is shown how complex interactions and interindividual differences/intraindividual change can be tested in this way. In addition, the necessity of using single-subject methods in the analysis of developmental processes is emphasized, and attention is drawn to the possibility of optimally treating developmental psychopathology by means of new computational techniques that can be integrated with dynamic factor analysis.

  2. Exploratory factor analysis of borderline personality disorder criteria in hospitalized adolescents.

    PubMed

    Becker, Daniel F; McGlashan, Thomas H; Grilo, Carlos M

    2006-01-01

    The authors examined the factor structure of borderline personality disorder (BPD) in hospitalized adolescents and also sought to add to the theoretical and clinical understanding of any homogeneous components by determining whether they may be related to specific forms of Axis I pathology. Subjects were 123 adolescent inpatients, who were reliably assessed with structured diagnostic interviews for Diagnostic and Statistical Manual of Mental Disorders, Revised Third Edition Axes I and II disorders. Exploratory factor analysis identified BPD components, and logistic regression analyses tested whether these components were predictive of specific Axis I disorders. Factor analysis revealed a 4-factor solution that accounted for 67.0% of the variance. Factor 1 ("suicidal threats or gestures" and "emptiness or boredom") predicted depressive disorders and alcohol use disorders. Factor 2 ("affective instability," "uncontrolled anger," and "identity disturbance") predicted anxiety disorders and oppositional defiant disorder. Factor 3 ("unstable relationships" and "abandonment fears") predicted only anxiety disorders. Factor 4 ("impulsiveness" and "identity disturbance") predicted conduct disorder and substance use disorders. Exploratory factor analysis of BPD criteria in adolescent inpatients revealed 4 BPD factors that appear to differ from those reported for similar studies of adults. The factors represent components of self-negation, irritability, poorly modulated relationships, and impulsivity--each of which is associated with characteristic Axis I pathology. These findings shed light on the nature of BPD in adolescents and may also have implications for treatment.

  3. Exploratory factor analysis of the Oral Health Impact Profile.

    PubMed

    John, M T; Reissmann, D R; Feuerstahler, L; Waller, N; Baba, K; Larsson, P; Celebić, A; Szabo, G; Rener-Sitar, K

    2014-09-01

    Although oral health-related quality of life (OHRQoL) as measured by the Oral Health Impact Profile (OHIP) is thought to be multidimensional, the nature of these dimensions is not known. The aim of this report was to explore the dimensionality of the OHIP using the Dimensions of OHRQoL (DOQ) Project, an international study of general population subjects and prosthodontic patients. Using the project's Learning Sample (n = 5173), we conducted an exploratory factor analysis on the 46 OHIP items not specifically referring to dentures for 5146 subjects with sufficiently complete data. The first eigenvalue (27.0) of the polychoric correlation matrix was more than ten times larger than the second eigenvalue (2.6), suggesting the presence of a dominant, higher-order general factor. Follow-up analyses with Horn's parallel analysis revealed a viable second-order, four-factor solution. An oblique rotation of this solution revealed four highly correlated factors that we named Oral Function, Oro-facial Pain, Oro-facial Appearance and Psychosocial Impact. These four dimensions and the strong general factor are two viable hypotheses for the factor structure of the OHIP. © 2014 John Wiley & Sons Ltd.
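
    The retention decision described above, Horn's parallel analysis, can be sketched as follows. This is a simplified illustration using Pearson rather than polychoric correlations and randomly generated comparison data, so it does not reproduce the study's procedure.

    ```python
    import numpy as np

    def parallel_analysis(data, n_iter=100, quantile=95, seed=0):
        """Horn's parallel analysis: retain components whose observed eigenvalues
        exceed the chosen percentile of eigenvalues from random data of the same
        shape. (Simplification: Pearson correlations, not polychoric.)"""
        rng = np.random.default_rng(seed)
        n, p = data.shape
        observed = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
        random_eigs = np.empty((n_iter, p))
        for i in range(n_iter):
            sim = rng.normal(size=(n, p))
            random_eigs[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
        threshold = np.percentile(random_eigs, quantile, axis=0)
        return int(np.sum(observed > threshold))

    # Example with simulated responses (the study used 5146 subjects x 46 items).
    X = np.random.default_rng(2).normal(size=(500, 46))
    print("factors to retain:", parallel_analysis(X))
    ```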

  4. Factors influencing crime rates: an econometric analysis approach

    NASA Astrophysics Data System (ADS)

    Bothos, John M. A.; Thomopoulos, Stelios C. A.

    2016-05-01

    The scope of the present study is to research the dynamics that determine the commission of crimes in US society. Our study is part of a model we are developing to understand urban crime dynamics and to enhance citizens' "perception of security" in large urban environments. The main targets of our research are to highlight the dependence of crime rates on certain social and economic factors and on basic elements of state anticrime policies. In conducting our research, we use as guides previous relevant studies that performed similar quantitative analyses of the dependence of crime on certain social and economic factors using statistics and econometric modelling. Our first approach consists of conceptual state space dynamic cross-sectional econometric models that incorporate a feedback loop describing crime as a feedback process. In order to define the model variables dynamically, we use statistical analysis of crime records and of records about social and economic conditions and policing characteristics (like police force and policing results - crime arrests) to determine their influence as independent variables on crime, the dependent variable of our model. The econometric models we apply in this first approach are an exponential log-linear model and a logit model. In a second approach, we try to study the evolution of violent crime through time in the US, independently as an autonomous social phenomenon, using autoregressive and moving average time-series econometric models. Our findings show that there are certain social and economic characteristics that affect the formation of crime rates in the US, either positively or negatively. Furthermore, the results of our time-series econometric modelling show that violent crime, viewed solely and independently as a social phenomenon, correlates with previous years' crime rates and depends on the conditions of the social and economic environment during previous years.
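
    For the time-series part of such an analysis, an autoregressive moving-average model of an annual crime-rate series might look like the statsmodels sketch below. The series is fabricated and the (1, 1, 1) model order is an arbitrary assumption, not the authors' specification.

    ```python
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    # Fabricated annual violent-crime rates (per 100,000 population), 35 years.
    rng = np.random.default_rng(3)
    rates = 500 + np.cumsum(rng.normal(0.0, 15.0, size=35))

    # ARMA-type model on the differenced series; order (1, 1, 1) is illustrative only.
    res = ARIMA(rates, order=(1, 1, 1)).fit()
    print(res.summary())
    print(res.forecast(steps=3))   # projected rates for the next three years
    ```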

  5. Factoring local sequence composition in motif significance analysis.

    PubMed

    Ng, Patrick; Keich, Uri

    2008-01-01

    We recently introduced a biologically realistic and reliable significance analysis of the output of a popular class of motif finders. In this paper we further improve our significance analysis by incorporating local base composition information. Relying on realistic biological data simulation, as well as on FDR analysis applied to real data, we show that our method is significantly better than the increasingly popular practice of using the normal approximation to estimate the significance of a finder's output. Finally we turn to leveraging our reliable significance analysis to improve the actual motif finding task. Specifically, endowing a variant of the Gibbs Sampler with our improved significance analysis we demonstrate that de novo finders can perform better than has been perceived. Significantly, our new variant outperforms all the finders reviewed in a recently published comprehensive analysis of the Harbison genome-wide binding location data. Interestingly, many of these finders incorporate additional information such as nucleosome positioning and the significance of binding data.

  6. A human factors analysis of EVA time requirements

    NASA Technical Reports Server (NTRS)

    Pate, D. W.

    1996-01-01

    Human Factors Engineering (HFE), also known as Ergonomics, is a discipline whose goal is to engineer a safer, more efficient interface between humans and machines. HFE makes use of a wide range of tools and techniques to fulfill this goal. One of these tools is known as motion and time study, a technique used to develop time standards for given tasks. A human factors motion and time study was initiated with the goal of developing a database of EVA task times and a method of utilizing the database to predict how long an ExtraVehicular Activity (EVA) should take. Initial development relied on the EVA activities performed during the STS-61 mission (Hubble repair). The first step of the analysis was to become familiar with EVAs and with the previous studies and documents produced on EVAs. After reviewing these documents, an initial set of task primitives and task time modifiers was developed. Videotaped footage of STS-61 EVAs was analyzed using these primitives and task time modifiers. Data for two entire EVA missions and portions of several others, each with two EVA astronauts, were collected for analysis. Feedback from the analysis of the data will be used to further refine the primitives and task time modifiers used. Analysis of variance techniques for categorical data will be used to determine which factors may, individually or through interactions, affect the primitive times and how much of an effect they have.

  7. Tensor-Dictionary Learning with Deep Kruskal-Factor Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Andrew J.; Pu, Yunchen; Sun, Yannan

    We introduce new dictionary learning methods for tensor-variate data of any order. We represent each data item as a sum of Kruskal decomposed dictionary atoms within the framework of beta-process factor analysis (BPFA). Our model is nonparametric and can infer the tensor-rank of each dictionary atom. This Kruskal-Factor Analysis (KFA) is a natural generalization of BPFA. We also extend KFA to a deep convolutional setting and develop online learning methods. We test our approach on image processing and classification tasks, achieving state-of-the-art results for 2D & 3D inpainting and Caltech 101. The experiments also show that atom-rank impacts both overcompleteness and sparsity.

  8. Factor Analysis of Drawings: Application to College Student Models of the Greenhouse Effect

    ERIC Educational Resources Information Center

    Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel

    2015-01-01

    Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance,…

  9. The health of Australian veterans of the 1991 Gulf War: factor analysis of self-reported symptoms.

    PubMed

    Forbes, A B; McKenzie, D P; Mackinnon, A J; Kelsall, H L; McFarlane, A C; Ikin, J F; Glass, D C; Sim, M R

    2004-12-01

    A recent report showed that Australian veterans of the 1991 Gulf War displayed a greater prevalence of a multitude of self-reported symptoms than a randomly sampled comparison group of military personnel who were eligible for deployment but were not deployed to the Gulf. To investigate whether the pattern, rather than frequency, of symptom reporting in these Australian Gulf War veterans differed from that of the comparison group personnel. Factor analysis was used to determine whether the co-occurrence of 62 symptoms in 1322 male Gulf War veterans can be explained by a number of underlying dimensions, called factors. The methodology was also applied to 1459 male comparison group subjects and the factor solutions of the two groups were compared. For the Gulf War veterans, a three factor solution displayed replicability and construct validity. The three factors were labelled as psycho-physiological distress, somatic distress, and arthro-neuromuscular distress, and were broadly similar to those described in previous studies of Gulf War veterans. A concordant three factor solution was also found for the comparison group subjects, with strong convergence of the factor loadings and factor scores across the two groups being displayed. Results did not display evidence of a unique pattern of self-reported symptoms among Gulf War veterans. Results also indicated that the differences between the groups lie in the degrees of expression of the three underlying factors, consistent with the well documented evidence of increased self-reported symptom prevalence in Gulf War veterans.

  10. Meta-analysis reveals that seed-applied neonicotinoids and pyrethroids have similar negative effects on abundance of arthropod natural enemies

    PubMed Central

    Tooker, John F.

    2016-01-01

    Background Seed-applied neonicotinoids are widely used in agriculture, yet their effects on non-target species remain incompletely understood. One important group of non-target species is arthropod natural enemies (predators and parasitoids), which contribute considerably to suppression of crop pests. We hypothesized that seed-applied neonicotinoids reduce natural-enemy abundance, but not as strongly as alternative insecticide options such as soil- and foliar-applied pyrethroids. Furthermore we hypothesized that seed-applied neonicotinoids affect natural enemies through a combination of toxin exposure and prey scarcity. Methods To test our hypotheses, we compiled datasets comprising observations from randomized field studies in North America and Europe that compared natural-enemy abundance in plots that were planted with seed-applied neonicotinoids to control plots that were either (1) managed without insecticides (20 studies, 56 site-years, 607 observations) or (2) managed with pyrethroid insecticides (eight studies, 15 site-years, 384 observations). Using the effect size Hedge’s d as the response variable, we used meta-regression to estimate the overall effect of seed-applied neonicotinoids on natural-enemy abundance and to test the influence of potential moderating factors. Results Seed-applied neonicotinoids reduced the abundance of arthropod natural enemies compared to untreated controls (d = −0.30 ± 0.10 [95% confidence interval]), and as predicted under toxin exposure this effect was stronger for insect than for non-insect taxa (QM = 8.70, df = 1, P = 0.003). Moreover, seed-applied neonicotinoids affected the abundance of arthropod natural enemies similarly to soil- or foliar-applied pyrethroids (d = 0.16 ± 0.42 or −0.02 ± 0.12; with or without one outlying study). Effect sizes were surprisingly consistent across both datasets (I2 = 2.7% for no-insecticide controls; I2 = 0% for pyrethroid controls), suggesting little moderating influence of

  11. Meta-analysis reveals that seed-applied neonicotinoids and pyrethroids have similar negative effects on abundance of arthropod natural enemies.

    PubMed

    Douglas, Margaret R; Tooker, John F

    2016-01-01

    Seed-applied neonicotinoids are widely used in agriculture, yet their effects on non-target species remain incompletely understood. One important group of non-target species is arthropod natural enemies (predators and parasitoids), which contribute considerably to suppression of crop pests. We hypothesized that seed-applied neonicotinoids reduce natural-enemy abundance, but not as strongly as alternative insecticide options such as soil- and foliar-applied pyrethroids. Furthermore we hypothesized that seed-applied neonicotinoids affect natural enemies through a combination of toxin exposure and prey scarcity. To test our hypotheses, we compiled datasets comprising observations from randomized field studies in North America and Europe that compared natural-enemy abundance in plots that were planted with seed-applied neonicotinoids to control plots that were either (1) managed without insecticides (20 studies, 56 site-years, 607 observations) or (2) managed with pyrethroid insecticides (eight studies, 15 site-years, 384 observations). Using the effect size Hedge's d as the response variable, we used meta-regression to estimate the overall effect of seed-applied neonicotinoids on natural-enemy abundance and to test the influence of potential moderating factors. Seed-applied neonicotinoids reduced the abundance of arthropod natural enemies compared to untreated controls (d = -0.30 ± 0.10 [95% confidence interval]), and as predicted under toxin exposure this effect was stronger for insect than for non-insect taxa (QM = 8.70, df = 1, P = 0.003). Moreover, seed-applied neonicotinoids affected the abundance of arthropod natural enemies similarly to soil- or foliar-applied pyrethroids (d = 0.16 ± 0.42 or -0.02 ± 0.12; with or without one outlying study). Effect sizes were surprisingly consistent across both datasets (I2 = 2.7% for no-insecticide controls; I2 = 0% for pyrethroid controls), suggesting little moderating influence of crop species
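
    A compact sketch of the effect-size and pooling calculations named above: a standardized mean difference with the Hedges small-sample correction, combined under a DerSimonian-Laird random-effects model with an I² heterogeneity estimate. The formulas follow standard meta-analytic practice and the example inputs are fabricated; the paper's meta-regression moderators are not reproduced.

    ```python
    import numpy as np

    def hedges_d(m1, sd1, n1, m2, sd2, n2):
        """Standardized mean difference with the small-sample (Hedges) correction."""
        sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
        d = (m1 - m2) / sp
        j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)
        var_d = (n1 + n2) / (n1 * n2) + d**2 / (2.0 * (n1 + n2))
        return j * d, j**2 * var_d

    def random_effects_pool(effects, variances):
        """DerSimonian-Laird random-effects pooling with an I^2 heterogeneity estimate."""
        effects, variances = np.asarray(effects), np.asarray(variances)
        w = 1.0 / variances
        fixed = np.sum(w * effects) / np.sum(w)
        q = np.sum(w * (effects - fixed)**2)
        dof = len(effects) - 1
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - dof) / c)
        w_star = 1.0 / (variances + tau2)
        pooled = np.sum(w_star * effects) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        i2 = 100.0 * max(0.0, (q - dof) / q) if q > 0 else 0.0
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

    # Fabricated study summaries (treated vs control natural-enemy counts).
    d1, v1 = hedges_d(9.1, 3.2, 30, 11.0, 3.5, 30)
    d2, v2 = hedges_d(7.8, 2.9, 24, 8.9, 3.1, 26)
    print(random_effects_pool([d1, d2], [v1, v2]))
    ```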

  12. Sensitivity Analysis of the USLE Soil Erodibility Factor to Its Determining Parameters

    NASA Astrophysics Data System (ADS)

    Mitova, Milena; Rousseva, Svetla

    2014-05-01

    Soil erosion is recognized as one of the most serious soil threats worldwide. Soil erosion prediction is the first step in soil conservation planning. The Universal Soil Loss Equation (USLE) is one of the most widely used models for soil erosion predictions. One of the five USLE predictors is the soil erodibility factor (K-factor), which evaluates the impact of soil characteristics on soil erosion rates. The soil erodibility nomograph defines the K-factor depending on soil characteristics such as particle size distribution (fractions finer than 0.002 mm and from 0.1 to 0.002 mm), organic matter content, soil structure and soil profile water permeability. Identifying the soil characteristics that most influence the K-factor would give an opportunity to control soil loss through erosion by controlling the parameters that reduce the K-factor value. The aim of the report is to present the results of an analysis of the relative weight of these soil characteristics in the K-factor values. The relative impact of the soil characteristics on the K-factor was studied through a series of statistical analyses of data from the geographic database for soil erosion risk assessments in Bulgaria. The degree of correlation between K-factor values and the parameters that determine it was studied by correlation analysis. The sensitivity of the K-factor was determined by varying each parameter across its range between minimum and maximum possible values while keeping the other parameters at their average values. A normalizing transformation of the data sets was applied because of the different dimensions and orders of variation of the values of the various parameters. The results show that the content of particles finer than 0.002 mm has the most significant relative impact on the soil erodibility, followed by the content of particles with size from 0.1 mm to 0.002 mm, the class of the water permeability of the soil profile, the content of organic matter and the aggregation class. The
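
    The one-at-a-time sensitivity screening described above might be sketched as follows. The K-factor function uses the commonly cited Wischmeier nomograph approximation in US customary units, which is an assumption on my part; the study's own nomograph implementation, parameter ranges, and normalizing transformation are not reproduced.

    ```python
    import numpy as np

    def usle_k(m_silt_vfs, clay, om, s, p):
        """Approximate USLE K-factor from the Wischmeier nomograph equation
        (US customary units; an assumed formula, not taken from the abstract).
        m_silt_vfs: % silt + very fine sand, clay: % clay, om: % organic matter,
        s: structure class (1-4), p: permeability class (1-6)."""
        M = m_silt_vfs * (100.0 - clay)
        return (2.1e-4 * M**1.14 * (12.0 - om) + 3.25 * (s - 2) + 2.5 * (p - 3)) / 100.0

    # One-at-a-time sensitivity: sweep each parameter over an assumed range
    # while holding the others at their mean values.
    ranges = {"m_silt_vfs": (5, 70), "clay": (5, 60), "om": (0.5, 4), "s": (1, 4), "p": (1, 6)}
    means = {k: float(np.mean(v)) for k, v in ranges.items()}
    for name, (lo, hi) in ranges.items():
        ks = [usle_k(**{**means, name: x}) for x in np.linspace(lo, hi, 25)]
        print(f"{name:>10}: K varies over {max(ks) - min(ks):.3f}")
    ```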

  13. Toward Reflective Judgment in Exploratory Factor Analysis Decisions: Determining the Extraction Method and Number of Factors To Retain.

    ERIC Educational Resources Information Center

    Knight, Jennifer L.

    This paper considers some decisions that must be made by the researcher conducting an exploratory factor analysis. The primary purpose is to aid the researcher in making informed decisions during the factor analysis instead of relying on defaults in statistical programs or traditions of previous researchers. Three decision areas are addressed.…

  14. Applied behaviour analysis and standard treatment in intellectual disability: 2-year outcomes.

    PubMed

    Hassiotis, Angela; Canagasabey, Anton; Robotham, Daniel; Marston, Louise; Romeo, Renee; King, Michael

    2011-06-01

    Applied behaviour analysis by a specialist team plus standard treatment for adults with intellectual disability displaying challenging behaviour was reported to be clinically effective and cost-effective after 6 months. In a 2-year follow-up of the same trial cohort, participants receiving the specialist intervention had significantly lower total and subdomain Aberrant Behavior Checklist scores than those receiving usual care alone. After adjustment for baseline covariates there was no significant difference in costs between the trial arms.

  15. Meta-analysis in applied ecology.

    PubMed

    Stewart, Gavin

    2010-02-23

    This overview examines research synthesis in applied ecology and conservation. Vote counting and pooling unweighted averages are widespread despite the superiority of syntheses based on weighted combination of effects. Such analyses allow exploration of methodological uncertainty in addition to consistency of effects across species, space and time, but exploring heterogeneity remains controversial. Meta-analyses are required to generalize in ecology, and to inform evidence-based decision-making, but the more sophisticated statistical techniques and registers of research used in other disciplines must be employed in ecology to fully realize their benefits.

  16. Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling

    NASA Astrophysics Data System (ADS)

    Wada, Yoshihisa; Tsuji, Hiroshi

    In order to analyze the success/failure factors in offshore software development services by structural equation modeling, this paper proposes to combine two approaches: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former serves to generate and verify hypotheses about factors and causalities. The latter serves to verify factors introduced by theory and to build the model without heuristics. Applying the combined approaches to questionnaire responses from skilled project managers, this paper found that the vendor property has higher causality for success than the software and project properties.

  17. Introduction: Conversation Analysis in Applied Linguistics

    ERIC Educational Resources Information Center

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  18. Applying reliability analysis to design electric power systems for More-electric aircraft

    NASA Astrophysics Data System (ADS)

    Zhang, Baozhu

    The More-Electric Aircraft (MEA) is a type of aircraft that replaces conventional hydraulic and pneumatic systems with electrically powered components. These changes have significantly challenged aircraft electric power system design. This thesis investigates how reliability analysis can be applied to automatically generate system topologies for the MEA electric power system. We first use the traditional method of reliability block diagrams to analyze the reliability level of different system topologies. We next propose a new methodology in which system topologies, constrained by a specified reliability level, are automatically generated. The path-set method is used for analysis. Finally, we interface these sets of system topologies with control synthesis tools to automatically create correct-by-construction control logic for the electric power system.
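
    A minimal sketch of the reliability-block-diagram arithmetic underlying such topology comparisons: components in series multiply their reliabilities, while redundant parallel components combine their failure probabilities. The component values and the two-generator example topology are hypothetical, not taken from the thesis.

    ```python
    import numpy as np

    def series_reliability(r):
        # All components in the path must work.
        return float(np.prod(r))

    def parallel_reliability(r):
        # At least one of the redundant components must work.
        return 1.0 - float(np.prod(1.0 - np.asarray(r)))

    # Hypothetical power path: two redundant generators feeding a bus,
    # then a contactor and a load connection in series.
    generators = parallel_reliability([0.995, 0.995])
    path = series_reliability([generators, 0.999, 0.998])
    print(f"path reliability: {path:.6f}")
    ```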

  19. SIAST Retention Study. Factors Affecting Retention of First-Year Students in a Canadian Technical Institute of Applied Science and Technology.

    ERIC Educational Resources Information Center

    Sarkar, Gerlinde

    All first-year students enrolled in diploma and certificate programs in the Saskatchewan Institute of Applied Science and Technology (SIAST) were surveyed to determine factors that influence student persistence. A questionnaire was mailed to 2,822 students in October 1991; 1,557 completed questionnaires were received and analyzed. A follow-up…

  20. Physics Metacognition Inventory Part II: Confirmatory Factor Analysis and Rasch Analysis

    ERIC Educational Resources Information Center

    Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John

    2015-01-01

    The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition,…

  1. Revealing unobserved factors underlying cortical activity with a rectified latent variable model applied to neural population recordings.

    PubMed

    Whiteway, Matthew R; Butts, Daniel A

    2017-03-01

    The activity of sensory cortical neurons is not only driven by external stimuli but also shaped by other sources of input to the cortex. Unlike external stimuli, these other sources of input are challenging to experimentally control, or even observe, and as a result contribute to variability of neural responses to sensory stimuli. However, such sources of input are likely not "noise" and may play an integral role in sensory cortex function. Here we introduce the rectified latent variable model (RLVM) in order to identify these sources of input using simultaneously recorded cortical neuron populations. The RLVM is novel in that it employs nonnegative (rectified) latent variables and is much less restrictive in the mathematical constraints on solutions because of the use of an autoencoder neural network to initialize model parameters. We show that the RLVM outperforms principal component analysis, factor analysis, and independent component analysis, using simulated data across a range of conditions. We then apply this model to two-photon imaging of hundreds of simultaneously recorded neurons in mouse primary somatosensory cortex during a tactile discrimination task. Across many experiments, the RLVM identifies latent variables related to both the tactile stimulation as well as nonstimulus aspects of the behavioral task, with a majority of activity explained by the latter. These results suggest that properly identifying such latent variables is necessary for a full understanding of sensory cortical function and demonstrate novel methods for leveraging large population recordings to this end. NEW & NOTEWORTHY The rapid development of neural recording technologies presents new opportunities for understanding patterns of activity across neural populations. Here we show how a latent variable model with appropriate nonlinear form can be used to identify sources of input to a neural population and infer their time courses. Furthermore, we demonstrate how these sources are

  2. Clinical efficacy analysis of Ahmed glaucoma valve implantation in neovascular glaucoma and influencing factors

    PubMed Central

    He, Ye; Tian, Ying; Song, Weitao; Su, Ting; Jiang, Haibo; Xia, Xiaobo

    2017-01-01

    This study aimed to evaluate the efficacy of Ahmed glaucoma valve (AGV) implantation in treating neovascular glaucoma (NVG) and to analyze the factors influencing the surgical success rate. This is a retrospective review of 40 eyes of 40 NVG patients who underwent AGV implantation at Xiangya Hospital of Central South University, China, between January 2014 and December 2016. Pre- and postoperative intraocular pressure (IOP), visual acuity, surgical success rate, medications, and complications were observed. Surgical success criteria were defined as IOP ≤21 and >6 mm Hg with or without additional medications. Kaplan–Meier survival curves and multivariate Cox regression analysis were used to examine success rates and risk factors for surgical outcomes. The mean follow-up period was 8.88 ± 3.12 months (range: 3–17). IOP declined at each postoperative visit, and the decline was statistically significant (P < .001). An average of 3.55 ± 0.86 drugs was applied preoperatively, while an average of 0.64 ± 0.90 drugs was used postoperatively, a statistically significant difference (P < .05). The complete surgical success rates at 3, 6, and 12 months after the operation were 85%, 75%, and 65%, respectively. Meanwhile, the qualified success rates at 3, 6, and 12 months after the operation were 85%, 80%, and 77.5%, respectively. The multivariate Cox regression analysis showed that age (hazard ratios: 3.717 and 7.246; 95% confidence intervals: 1.149–12.048 and 1.349–38.461; P = .028 and .021) was an influencing factor for both the complete and the qualified success rates among all NVG patients. Gender, previous operation history, primary disease, and preoperative IOP were not found to be significant. AGV implantation is an effective and safe surgical method to treat NVG. Age is an important factor influencing the surgical success rate. PMID:29049253

  3. Applied Behavior Analysis Programs for Autism: Sibling Psychosocial Adjustment during and Following Intervention Use

    ERIC Educational Resources Information Center

    Cebula, Katie R.

    2012-01-01

    Psychosocial adjustment in siblings of children with autism whose families were using a home-based, applied behavior analysis (ABA) program was compared to that of siblings in families who were not using any intensive autism intervention. Data gathered from parents, siblings and teachers indicated that siblings in ABA families experienced neither…

  4. Applied Behaviour Analysis: Does Intervention Intensity Relate to Family Stressors and Maternal Well-Being?

    ERIC Educational Resources Information Center

    Schwichtenberg, A.; Poehlmann, J.

    2007-01-01

    Background: Interventions based on applied behaviour analysis (ABA) are commonly recommended for children with an autism spectrum disorder (ASD); however, few studies address how this intervention model impacts families. The intense requirements that ABA programmes place on children and families are often cited as a critique of the programme,…

  5. Connectivism in Postsecondary Online Courses: An Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Hogg, Nanette; Lomicky, Carol S.

    2012-01-01

    This study explores 465 postsecondary students' experiences in online classes through the lens of connectivism. Downes' 4 properties of connectivism (diversity, autonomy, interactivity, and openness) were used as the study design. An exploratory factor analysis was performed. This study found a 4-factor solution. Subjects indicated that autonomy…

  6. Sparse multivariate factor analysis regression models and its applications to integrative genomics analysis.

    PubMed

    Zhou, Yan; Wang, Pei; Wang, Xianlong; Zhu, Ji; Song, Peter X-K

    2017-01-01

    The multivariate regression model is a useful tool to explore complex associations between two kinds of molecular markers, which enables the understanding of the biological pathways underlying disease etiology. For a set of correlated response variables, accounting for such dependency can increase statistical power. Motivated by integrative genomic data analyses, we propose a new methodology, the sparse multivariate factor analysis regression model (smFARM), in which correlations of response variables are assumed to follow a factor analysis model with latent factors. This proposed method allows us not only to address the challenge that the number of association parameters is larger than the sample size, but also to adjust for unobserved genetic and/or nongenetic factors that potentially conceal the underlying response-predictor associations. The proposed smFARM is implemented by the EM algorithm and the blockwise coordinate descent algorithm. The proposed methodology is evaluated and compared to existing methods through extensive simulation studies. Our results show that accounting for latent factors through the proposed smFARM can improve the sensitivity of signal detection and the accuracy of sparse association map estimation. We illustrate smFARM with two integrative genomics analysis examples, a breast cancer dataset and an ovarian cancer dataset, to assess the relationship between DNA copy numbers and gene expression arrays and to understand genetic regulatory patterns relevant to the disease. We identify two trans-hub regions: one in cytoband 17q12, whose amplification influences the RNA expression levels of important breast cancer genes, and the other in cytoband 9q21.32-33, which is associated with chemoresistance in ovarian cancer. © 2016 WILEY PERIODICALS, INC.

  7. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are keys to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performances comparable to those of methods used in single-cell growth studies. Compared with qualitative observations, this method of quantitative epistasis enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
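
    One common way to quantify such interactions is a multiplicative-model epistasis score, sketched below. This is a generic definition given for illustration; the function name, example values, and the absence of any significance testing are assumptions, not the authors' statistical pipeline.

    ```python
    def epistasis_score(w_ab, w_a, w_b, w_wt=1.0):
        """Multiplicative-model epistasis: deviation of the double perturbation
        from the product of the single-perturbation effects. Inputs are trait
        means (e.g. body length) for the double, singles, and wild type."""
        fa, fb, fab = w_a / w_wt, w_b / w_wt, w_ab / w_wt
        return fab - fa * fb

    # Hypothetical normalized body-length measurements; a negative score
    # indicates an aggravating (synthetic) interaction.
    print(epistasis_score(w_ab=0.62, w_a=0.90, w_b=0.80))
    ```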

  8. Factor analysis of the Hamilton Depression Rating Scale in Parkinson's disease.

    PubMed

    Broen, M P G; Moonen, A J H; Kuijf, M L; Dujardin, K; Marsh, L; Richard, I H; Starkstein, S E; Martinez-Martin, P; Leentjens, A F G

    2015-02-01

    Several studies have validated the Hamilton Depression Rating Scale (HAMD) in patients with Parkinson's disease (PD), and reported adequate reliability and construct validity. However, the factorial validity of the HAMD has not yet been investigated. The aim of our analysis was to explore the factor structure of the HAMD in a large sample of PD patients. A principal component analysis of the 17-item HAMD was performed on data of 341 PD patients, available from a previous cross sectional study on anxiety. An eigenvalue ≥1 was used to determine the number of factors. Factor loadings ≥0.4 in combination with oblique rotations were used to identify which variables made up the factors. Kaiser-Meyer-Olkin measure (KMO), Cronbach's alpha, Bartlett's test, communality, percentage of non-redundant residuals and the component correlation matrix were computed to assess factor validity. KMO verified the sample's adequacy for factor analysis and Cronbach's alpha indicated a good internal consistency of the total scale. Six factors had eigenvalues ≥1 and together explained 59.19% of the variance. The number of items per factor varied from 1 to 6. Inter-item correlations within each component were low. There was a high percentage of non-redundant residuals and low communality. This analysis demonstrates that the factorial validity of the HAMD in PD is unsatisfactory. This implies that the scale is not appropriate for studying specific symptom domains of depression based on factorial structure in a PD population. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Multiscale permutation entropy analysis of electrocardiogram

    NASA Astrophysics Data System (ADS)

    Liu, Tiebing; Yao, Wenpo; Wu, Min; Shi, Zhaorong; Wang, Jun; Ning, Xinbao

    2017-04-01

    To make a comprehensive nonlinear analysis of ECG, multiscale permutation entropy (MPE) was applied to ECG characteristics extraction. Three kinds of ECG from the PhysioNet database, from congestive heart failure (CHF) patients, healthy young subjects and healthy elderly subjects, are used in this paper. We set the embedding dimension to 4, adjust the scale factor from 2 to 100 with a step size of 2, and compare MPE with multiscale entropy (MSE). As the scale factor increases, the MPE complexity of the three ECG signals first decreases and then increases. When the scale factor is between 10 and 32, the complexities of the three ECG signals show the biggest difference: the entropy of the elderly is on average 0.146 lower than that of the CHF patients and 0.025 higher than that of the healthy young, in line with normal physiological characteristics. Test results showed that MPE can be effectively applied in ECG nonlinear analysis and can effectively distinguish different ECG signals.
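
    The MPE computation described above, coarse-graining followed by permutation entropy with embedding dimension 4 over scale factors 2 to 100, can be sketched as follows. The implementation is a generic reading of the method, not the authors' code, and the input signal is synthetic.

    ```python
    import numpy as np
    from math import factorial

    def coarse_grain(x, scale):
        """Average consecutive non-overlapping windows of length `scale`."""
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).mean(axis=1)

    def permutation_entropy(x, m=4, delay=1):
        """Normalized permutation entropy of the ordinal patterns of length m."""
        counts = {}
        n = len(x) - (m - 1) * delay
        for i in range(n):
            pattern = tuple(np.argsort(x[i:i + m * delay:delay]))
            counts[pattern] = counts.get(pattern, 0) + 1
        p = np.array(list(counts.values()), dtype=float) / n
        return float(-np.sum(p * np.log(p)) / np.log(factorial(m)))

    def mpe(x, m=4, scales=range(2, 101, 2)):
        x = np.asarray(x, dtype=float)
        return [permutation_entropy(coarse_grain(x, s), m=m) for s in scales]

    # Example on a synthetic series standing in for an ECG-derived signal.
    signal = np.random.default_rng(4).normal(size=20000)
    print(mpe(signal)[:5])
    ```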

  10. Evaluating voice characteristics of first-year acting students in Israel: factor analysis.

    PubMed

    Amir, Ofer; Primov-Fever, Adi; Kushnir, Tami; Kandelshine-Waldman, Osnat; Wolf, Michael

    2013-01-01

    Acting students require diverse, high-quality, and high-intensity vocal performance from early stages of their training. Demanding vocal activities, before developing the appropriate vocal skills, put them at high risk of developing vocal problems. A retrospective analysis of voice characteristics of first-year acting students using several voice evaluation tools. A total of 79 first-year acting students (55 women and 24 men) were assigned into two study groups: laryngeal findings (LFs) and no laryngeal findings, based on stroboscopic findings. Their voice characteristics were evaluated using acoustic analysis, aerodynamic examination, perceptual scales, and self-report questionnaires. Results obtained from each set of measures were examined using a factor analysis approach. Significant differences between the two groups were found for a single fundamental frequency (F(0))-Regularity factor; a single Grade, Roughness, Breathiness, Asthenia, Strain perceptual factor; and the three self-evaluation factors. Gender differences were found for two acoustic analysis factors, which were based on F(0) and its derivatives, namely an aerodynamic factor that represents expiratory volume measurements and a single self-evaluation factor that represents the tendency to seek therapy. Approximately 50% of the first-year acting students had LFs. These students differed from their peers in the control group in a single acoustic analysis factor, as well as perceptual and self-report factors. No group differences, however, were found for the aerodynamic factors. Early laryngeal examination and voice evaluation of future professional voice users could provide a valuable individual baseline, to which later examinations could be compared, and assist in providing personally tailored treatment. Copyright © 2013 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  11. Inference of sigma factor controlled networks by using numerical modeling applied to microarray time series data of the germinating prokaryote.

    PubMed

    Strakova, Eva; Zikova, Alice; Vohradsky, Jiri

    2014-01-01

    A computational model of gene expression was applied to a novel test set of microarray time series measurements to reveal regulatory interactions between transcriptional regulators represented by 45 sigma factors and the genes expressed during germination of a prokaryote Streptomyces coelicolor. Using microarrays, the first 5.5 h of the process was recorded in 13 time points, which provided a database of gene expression time series on genome-wide scale. The computational modeling of the kinetic relations between the sigma factors, individual genes and genes clustered according to the similarity of their expression kinetics identified kinetically plausible sigma factor-controlled networks. Using genome sequence annotations, functional groups of genes that were predominantly controlled by specific sigma factors were identified. Using external binding data complementing the modeling approach, specific genes involved in the control of the studied process were identified and their function suggested.

  12. Risk factors for re-bleeding of aneurysmal subarachnoid hemorrhage: meta-analysis of observational studies.

    PubMed

    Alfotih, Gobran Taha Ahmed; Li, FangCheng; Xu, XinKe; Zhang, ShangYi

    2014-01-01

    The mortality of re-bleeding following aneurysmal subarachnoid hemorrhage is high, and surviving patients often have poor clinical condition and worse outcomes than patients with a single bleed. In this study, we performed an updated systematic review and meta-analysis to determine the most common risk factors for re-bleeding in this patient population, with the goal of providing neurologists, neurosurgeons, and neuro-interventionalists with a simple and fast method to evaluate the re-bleeding risk for aneurysmal subarachnoid hemorrhage. We conducted a thorough meta-analysis of the risk factors associated with re-bleeding or re-rupture of intracranial aneurysms in cases published between 2000 and 2013. Pooled mean difference was calculated for the continuous variable (age), and pooled odds ratio (OR) was calculated for categorical factors. If heterogeneity was significant (p<0.05), a random-effects model was applied; otherwise, a fixed-effects model was used. Testing for pooled effects and statistical significance for each potential risk factor was analyzed using Review Manager software. Our literature search identified 174 articles. Of these, only seven retrospective studies met the inclusion criteria. These seven studies comprised 2470 patients, 283 of whom had aneurysmal re-bleeding, resulting in a weighted average re-bleeding rate of 11.3% (95% confidence interval [CI]: 10.1-12.6). In this population, sex (OR 1.46; 95% CI: 1.11-1.92), high systolic blood pressure [SBP] (OR 2.52; 95% CI: 1.40-4.53), aneurysm size (OR 3.00; 95% CI: 2.06-4.37), clinical condition (Hunt & Hess) (OR 4.94; 95% CI: 2.29-10.68), and Fisher grade (OR 2.29; 95% CI: 1.45-3.61) were statistically significant risk factors for re-bleeding. Sex, high SBP, high Fisher grade, aneurysm size larger than 10 mm, and poor clinical condition were independent risk factors for aneurysmal re-bleeding. The importance of early aneurysm intervention and careful consideration of patient risk factors should be

  13. Prediction of beef carcass and meat quality traits from factors characterising the rearing management system applied during the whole life of heifers.

    PubMed

    Soulat, J; Picard, B; Léger, S; Monteils, V

    2018-06-01

    In this study, four prediction models were developed by logistic regression using individual data from 96 heifers. Carcass and sensory rectus abdominis quality clusters were identified and then predicted using the rearing-factor data. The models obtained from rearing factors applied during the fattening period were compared with those characterising the heifers' whole life. The highest prediction power for the carcass and meat quality clusters was obtained from the models considering the whole life, with success rates of 62.8% and 54.9%, respectively. Rearing factors applied during both the pre-weaning and fattening periods influenced carcass and meat quality. According to the models, carcass traits were improved when the heifer's mother was older at first calving, calves ingested concentrates during pasture preceding weaning, and heifers were slaughtered older. Meat traits were improved by the genetics of the heifers' parents (i.e., calving ease and early muscularity) and when heifers were slaughtered older. Management of carcass and meat quality traits is possible at different periods of the heifers' life. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Three-dimensional computer-aided human factors engineering analysis of a grafting robot.

    PubMed

    Chiu, Y C; Chen, S; Wu, G J; Lin, Y H

    2012-07-01

    The objective of this research was to conduct a human factors engineering analysis of a grafting robot design using computer-aided 3D simulation technology. A prototype tubing-type grafting robot for fruits and vegetables was the subject of a series of case studies. To facilitate the incorporation of human models into the operating environment of the grafting robot, I-DEAS graphic software was applied to establish individual models of the grafting robot in line with Jack ergonomic analysis. Six human models (95th percentile, 50th percentile, and 5th percentile by height for both males and females) were employed to simulate the operating conditions and working postures in a real operating environment. The lower back and upper limb stresses of the operators were analyzed using the lower back analysis (LBA) and rapid upper limb assessment (RULA) functions in Jack. The experimental results showed that if a leg space is introduced under the robot, the operator can sit closer to the robot, which reduces the operator's level of lower back and upper limb stress. The proper environmental layout for Taiwanese operators, for minimum levels of lower back and upper limb stress, is to set the grafting operation 23.2 cm away from the operator at a height of 85 cm and with 45 cm between the rootstock and scion units.

  15. Establishing Evidence for Internal Structure Using Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Watson, Joshua C.

    2017-01-01

    Exploratory factor analysis (EFA) is a data reduction technique used to condense data into smaller sets of summary variables by identifying underlying factors potentially accounting for patterns of collinearity among said variables. Using an illustrative example, the 5 general steps of EFA are described with best practices for decision making…

  16. An Evaluation on Factors Influencing Decision making for Malaysia Disaster Management: The Confirmatory Factor Analysis Approach

    NASA Astrophysics Data System (ADS)

    Zubir, S. N. A.; Thiruchelvam, S.; Mustapha, K. N. M.; Che Muda, Z.; Ghazali, A.; Hakimie, H.

    2017-12-01

    For the past few years, natural disasters have been the subject of debate in disaster management, especially flood disasters. Each year, natural disasters result in significant loss of life, destruction of homes and public infrastructure, and economic hardship. Hence, effective and efficient flood disaster management would ensure that life-saving efforts are not futile. The aim of this article is to examine the relationship between approach, decision maker, influence factor, result, and ethics in decision making for flood disaster management in Malaysia. The key elements of decision making in disaster management were identified from the literature. Questionnaire surveys were administered among lead agencies on the East Coast of Malaysia in the states of Kelantan and Pahang. A total of 307 valid responses were obtained for further analysis. Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were carried out to analyse the measurement model involved in the study. The CFA for the second-order reflective and first-order reflective measurement models indicates that approach, decision maker, influence factor, result, and ethics have a significant and direct effect on decision making during a disaster. The results from this study showed that decision making during a disaster is an important element of disaster management and necessitates successful collaborative decision making. The measurement model was accepted for further analysis using Structural Equation Modeling (SEM) and can be assessed in future research.

  17. Phase quantification by X-ray photoemission valence band analysis applied to mixed phase TiO2 powders

    NASA Astrophysics Data System (ADS)

    Breeson, Andrew C.; Sankar, Gopinathan; Goh, Gregory K. L.; Palgrave, Robert G.

    2017-11-01

    A method of quantitative phase analysis using valence band X-ray photoelectron spectra is presented and applied to the analysis of TiO2 anatase-rutile mixtures. The valence band spectra of pure TiO2 polymorphs were measured, and these spectral shapes used to fit valence band spectra from mixed phase samples. Given the surface sensitive nature of the technique, this yields a surface phase fraction. Mixed phase samples were prepared from high and low surface area anatase and rutile powders. In the samples studied here, the surface phase fraction of anatase was found to be linearly correlated with photocatalytic activity of the mixed phase samples, even for samples with very different anatase and rutile surface areas. We apply this method to determine the surface phase fraction of P25 powder. This method may be applied to other systems where a surface phase fraction is an important characteristic.
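
    The fitting step, expressing a mixed-phase valence-band spectrum as a non-negative linear combination of pure-phase reference spectra, can be sketched with scipy's non-negative least squares. The Gaussian "reference spectra" below are placeholders for measured anatase and rutile references, not real data.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Placeholder reference valence-band spectra on a common binding-energy grid.
    energy = np.linspace(0.0, 12.0, 600)
    anatase_ref = np.exp(-(energy - 5.0)**2 / 2.0) + 0.6 * np.exp(-(energy - 7.5)**2 / 1.5)
    rutile_ref = np.exp(-(energy - 4.5)**2 / 1.8) + 0.8 * np.exp(-(energy - 8.0)**2 / 2.2)

    # A "mixed-phase" spectrum: 70% anatase + 30% rutile plus a little noise.
    mixed = 0.7 * anatase_ref + 0.3 * rutile_ref
    mixed = mixed + 0.01 * np.random.default_rng(5).normal(size=mixed.size)

    A = np.column_stack([anatase_ref, rutile_ref])
    coeffs, residual = nnls(A, mixed)          # non-negative least-squares fit
    surface_fractions = coeffs / coeffs.sum()  # normalized surface phase fractions
    print(f"anatase: {surface_fractions[0]:.2f}, rutile: {surface_fractions[1]:.2f}")
    ```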

  18. Applying to Higher Education: Information Sources and Choice Factors

    ERIC Educational Resources Information Center

    Simoes, Claudia; Soares, Ana Maria

    2010-01-01

    Higher education institutions are facing increasingly complex challenges, which demand a deeper understanding of the sources prospective students use when applying to a higher education institution. This research centres on students' decision-making process for higher education institutions, focusing on the pre-purchase period, and, in particular,…

  19. Stability Analysis of Finite Difference Schemes for Hyperbolic Systems, and Problems in Applied and Computational Linear Algebra.

    DTIC Science & Technology

    FINITE DIFFERENCE THEORY, *LINEAR ALGEBRA, APPLIED MATHEMATICS, APPROXIMATION(MATHEMATICS), BOUNDARY VALUE PROBLEMS, COMPUTATIONS, HYPERBOLAS, MATHEMATICAL MODELS, NUMERICAL ANALYSIS, PARTIAL DIFFERENTIAL EQUATIONS, STABILITY.

  20. Systems thinking applied to safety during manual handling tasks in the transport and storage industry.

    PubMed

    Goode, Natassia; Salmon, Paul M; Lenné, Michael G; Hillard, Peter

    2014-07-01

    Injuries resulting from manual handling tasks represent an on-going problem for the transport and storage industry. This article describes an application of a systems theory-based approach, Rasmussen's (1997, Safety Science 27, 183) risk management framework, to the analysis of the factors influencing safety during manual handling activities in a freight handling organisation. Observations of manual handling activities, cognitive decision method interviews with workers (n=27) and interviews with managers (n=35) were used to gather information about three manual handling activities. Hierarchical task analysis and thematic analysis were used to identify potential risk factors and performance shaping factors across the levels of Rasmussen's framework. These different data sources were then integrated using Rasmussen's Accimap technique to provide an overall analysis of the factors influencing safety during manual handling activities in this context. The findings demonstrate how a systems theory-based approach can be applied to this domain, and suggest that policy-orientated, rather than worker-orientated, changes are required to prevent future manual handling injuries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Analysis of the influencing factors of global energy interconnection development

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; He, Yongxiu; Ge, Sifan; Liu, Lin

    2018-04-01

    Against the background of building the global energy interconnection and achieving green and low-carbon development, this paper considers the new round of energy restructuring and the trend of energy technology change and, based on the present situation of global and Chinese energy interconnection development, establishes an index system of the factors influencing global energy interconnection development. Subjective and objective weight analyses of the factors affecting the development of the global energy interconnection were conducted separately by network-level analysis and the entropy method, and the weights were combined by additive integration, which gives the comprehensive weight of each influencing factor and a ranking of their influence.
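
    A sketch of the objective (entropy-method) weighting and the additive integration with subjective weights mentioned above. The indicator matrix, the subjective weights, and the mixing coefficient alpha are all hypothetical, and the network-level analysis that would produce the subjective weights is not reproduced.

    ```python
    import numpy as np

    def entropy_weights(X):
        """Objective weights by the entropy method: indicators with more dispersion
        across alternatives receive larger weights. X is (alternatives x indicators),
        assumed positive and larger-is-better."""
        P = X / X.sum(axis=0)
        n = X.shape[0]
        e = -np.sum(np.where(P > 0, P * np.log(P), 0.0), axis=0) / np.log(n)
        d = 1.0 - e
        return d / d.sum()

    # Hypothetical indicator matrix for 6 influencing factors scored over 8 scenarios,
    # and hypothetical subjective weights (e.g. from a network-level analysis).
    rng = np.random.default_rng(6)
    X = rng.uniform(1.0, 10.0, size=(8, 6))
    w_subjective = np.array([0.25, 0.20, 0.15, 0.15, 0.15, 0.10])

    w_objective = entropy_weights(X)
    alpha = 0.5                                   # mixing coefficient (assumed)
    w_comprehensive = alpha * w_subjective + (1 - alpha) * w_objective
    print(np.argsort(w_comprehensive)[::-1])      # ranking of influencing factors
    ```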

  2. Different spectrophotometric methods applied for the analysis of binary mixture of flucloxacillin and amoxicillin: A comparative study.

    PubMed

    Attia, Khalid A M; Nassar, Mohammed W I; El-Zeiny, Mohamed B; Serag, Ahmed

    2016-05-15

    Three different spectrophotometric methods were applied for the quantitative analysis of flucloxacillin and amoxicillin in their binary mixture, namely, ratio subtraction, absorbance subtraction and amplitude modulation. A comparative study was done listing the advantages and the disadvantages of each method. All the methods were validated according to the ICH guidelines and the obtained accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory prepared mixtures and assessed by applying the standard addition technique. So, they can be used for the routine analysis of flucloxacillin and amoxicillin in their binary mixtures. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. [Analysis of dietary pattern and diabetes mellitus influencing factors identified by classification tree model in adults of Fujian].

    PubMed

    Yu, F L; Ye, Y; Yan, Y S

    2017-05-10

    Objective: To find out the dietary patterns and explore the relationship between environmental factors (especially dietary patterns) and diabetes mellitus in the adults of Fujian. Methods: A multi-stage sampling method was used to survey residents aged ≥18 years by questionnaire, physical examination and laboratory detection in 10 disease surveillance points in Fujian. Factor analysis was used to identify the dietary patterns, logistic regression models were applied to analyze the relationship between dietary patterns and diabetes mellitus, and a classification tree model was adopted to identify the influencing factors for diabetes mellitus. Results: There were four dietary patterns in the population: meat, plant, high-quality protein, and fried food and beverages patterns. The logistic analysis showed that the plant pattern, which had higher factor loadings for fresh fruit-vegetables and cereal-tubers, was a protective factor against diabetes mellitus. The risk of diabetes mellitus in participants at the T2 and T3 levels of the factor score was 0.727 (95%CI: 0.561-0.943) times and 0.736 (95%CI: 0.573-0.944) times that of those whose factor score was in the lowest quartile, respectively. Thirteen influencing factors and eleven high-risk groups for diabetes mellitus were identified by the classification tree model. The influencing factors were dyslipidemia, age, family history of diabetes, hypertension, physical activity, career, sex, sedentary time, abdominal adiposity, BMI, marital status, sleep time and high-quality protein pattern. Conclusion: There is a close association between dietary patterns and diabetes mellitus. It is necessary to promote a healthy and reasonable diet, strengthen the monitoring and control of blood lipids, blood pressure and body weight, and encourage a good lifestyle for the prevention and control of diabetes mellitus.

  4. Examining Evolving Performance on the Force Concept Inventory Using Factor Analysis

    ERIC Educational Resources Information Center

    Semak, M. R.; Dietz, R. D.; Pearson, R. H.; Willis, C. W

    2017-01-01

    The application of factor analysis to the "Force Concept Inventory" (FCI) has proven to be problematic. Some studies have suggested that factor analysis of test results serves as a helpful tool in assessing the recognition of Newtonian concepts by students. Other work has produced at best ambiguous results. For the FCI administered as a…

  5. A Markov Chain Monte Carlo Approach to Confirmatory Item Factor Analysis

    ERIC Educational Resources Information Center

    Edwards, Michael C.

    2010-01-01

    Item factor analysis has a rich tradition in both the structural equation modeling and item response theory frameworks. The goal of this paper is to demonstrate a novel combination of various Markov chain Monte Carlo (MCMC) estimation routines to estimate parameters of a wide variety of confirmatory item factor analysis models. Further, I show…

  6. Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR

    ERIC Educational Resources Information Center

    Baglin, James

    2014-01-01

    Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many…

  7. Orthogonal higher order structure and confirmatory factor analysis of the French Wechsler Adult Intelligence Scale (WAIS-III).

    PubMed

    Golay, Philippe; Lecerf, Thierry

    2011-03-01

    According to the most widely accepted Cattell-Horn-Carroll (CHC) model of intelligence measurement, each subtest score of the Wechsler Intelligence Scale for Adults (3rd ed.; WAIS-III) should reflect both 1st- and 2nd-order factors (i.e., 4 or 5 broad abilities and 1 general factor). To disentangle the contribution of each factor, we applied a Schmid-Leiman orthogonalization transformation (SLT) to the standardization data published in the French technical manual for the WAIS-III. Results showed that the general factor accounted for 63% of the common variance and that the specific contributions of the 1st-order factors were weak (4.7%-15.9%). We also addressed this issue by using confirmatory factor analysis. Results indicated that the bifactor model (with 1st-order group and general factors) better fit the data than did the traditional higher order structure. Models based on the CHC framework were also tested. Results indicated that a higher order CHC model showed a better fit than did the classical 4-factor model; however, the WAIS bifactor structure was the most adequate. We recommend that users do not discount the Full Scale IQ when interpreting the index scores of the WAIS-III because the general factor accounts for the bulk of the common variance in the French WAIS-III. The 4 index scores cannot be considered to reflect only broad ability because they include a strong contribution of the general factor.
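
    A Schmid-Leiman transformation of the kind applied here can be expressed compactly: given the first-order loadings and the loadings of those factors on a general factor, the orthogonalized solution splits each subtest's variance into a general part and residualized group parts. The sketch below is a generic implementation of that standard formulation, not the authors' software; variable names are illustrative.

```python
import numpy as np

def schmid_leiman(first_order, second_order):
    """Schmid-Leiman orthogonalization.
    first_order : (p, k) loadings of p subtests on k first-order factors.
    second_order: (k,)   loadings of those factors on the general factor.
    Returns the subtests' loadings on the general factor and the
    residualized (orthogonalized) group-factor loadings."""
    L1 = np.asarray(first_order, dtype=float)
    g = np.asarray(second_order, dtype=float)
    general_loadings = L1 @ g                       # general-factor loadings
    group_loadings = L1 * np.sqrt(1.0 - g ** 2)     # residualized group loadings
    return general_loadings, group_loadings
```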

  8. Risk Factors of Falls in Community-Dwelling Older Adults: Logistic Regression Tree Analysis

    ERIC Educational Resources Information Center

    Yamashita, Takashi; Noe, Douglas A.; Bailer, A. John

    2012-01-01

    Purpose of the Study: A novel logistic regression tree-based method was applied to identify fall risk factors and possible interaction effects of those risk factors. Design and Methods: A nationally representative sample of American older adults aged 65 years and older (N = 9,592) in the Health and Retirement Study 2004 and 2006 modules was used.…

  9. Risk factors for Salmonella spp in Portuguese breeding pigs using a multilevel analysis.

    PubMed

    Correia-Gomes, C; Mendonça, D; Vieira-Pinto, M; Niza-Ribeiro, J

    2013-02-01

    Salmonella is the second most frequent cause of foodborne illness in the European Union (EU), so the EU enforced legislation to achieve a reduction in Salmonella prevalence in the swine sector. To set the reduction target, each country carried out a baseline survey to estimate Salmonella prevalence. The aim of our study was to identify risk factors for the presence of Salmonella in breeding pigs based on the data of the Baseline Study for Salmonella in Breeding Pigs in Portugal. In total, 1670 pen fecal samples from 167 herds were tested by culture and 170 samples tested positive. Along with the collection of the samples, a survey was administered to collect information about herd management and potential risk factors. Multilevel analysis was applied to the data using generalized linear mixed models with a logit link function. The outcome variable was the presence/absence of Salmonella in the pen fecal samples. The first level was assigned to the pen fecal samples and the second level to the herds. The results showed significant associations (p<0.05) between Salmonella occurrence and the following factors: maternity pens versus mating pens (OR=0.39, 95%CI: 0.24-0.63), feed from an external or mixed source versus a home source (OR=2.81, 95%CI: 1.19-6.61), more than 10 animals per pen versus 10 animals per pen (OR=2.02, 95%CI: 1.19-3.43), North Region versus Alentejo Region (OR=3.86, 95%CI: 1.08-13.75), rodent control (OR=0.23, 95%CI: 0.090-0.59), more than 90% of boars homebred or no boars versus more than 90% of boars from an external source (OR=0.54, 95%CI: 0.30-0.97), semen from another herd versus semen from insemination centers (OR=4.47, 95%CI: 1.38-14.43) and herds with a size of 170 or more sows (OR=1.82, 95%CI: 1.04-3.19). This study offers very relevant information for both the Portuguese veterinary authorities and the pig farmers currently developing control programmes for Salmonella. This is the first study providing evidence for semen and boars source as risk factors for
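
    As a hedged sketch of the model class described above (not necessarily the authors' exact specification), a two-level random-intercept logistic model for pen i nested in herd j can be written as:

```latex
\operatorname{logit}\bigl(\Pr(Y_{ij}=1)\bigr)
  = \beta_0 + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij} + u_j,
\qquad u_j \sim \mathcal{N}\!\left(0,\,\sigma_u^{2}\right),
```

    where Y_ij indicates Salmonella detection in pen i of herd j, x_ij collects the pen- and herd-level covariates, and u_j is the herd-level random effect; the reported odds ratios correspond to exp(beta) for each covariate.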

  10. System Analysis Applied to Autonomy: Application to Human-Rated Lunar/Mars Landers

    NASA Technical Reports Server (NTRS)

    Young, Larry A.

    2006-01-01

    System analysis is an essential technical discipline for the modern design of spacecraft and their associated missions. Specifically, system analysis is a powerful aid in identifying and prioritizing the required technologies needed for mission and/or vehicle development efforts. Maturation of intelligent systems technologies, and their incorporation into spacecraft systems, are dictating the development of new analysis tools, and incorporation of such tools into existing system analysis methodologies, in order to fully capture the trade-offs of autonomy on vehicle and mission success. A "system analysis of autonomy" methodology will be outlined and applied to a set of notional human-rated lunar/Mars lander missions toward answering these questions: 1. what is the optimum level of vehicle autonomy and intelligence required? and 2. what are the specific attributes of an autonomous system implementation essential for a given surface lander mission/application in order to maximize mission success? Future human-rated lunar/Mars landers, though nominally under the control of their crew, will, nonetheless, be highly automated systems. These automated systems will range from mission/flight control functions, to vehicle health monitoring and prognostication, to life-support and other "housekeeping" functions. The optimum degree of autonomy afforded to these spacecraft systems/functions has profound implications from an exploration system architecture standpoint.

  11. Understanding clinician attitudes towards implementation of guided self-help cognitive behaviour therapy for those who hear distressing voices: using factor analysis to test normalisation process theory.

    PubMed

    Hazell, Cassie M; Strauss, Clara; Hayward, Mark; Cavanagh, Kate

    2017-07-24

    The Normalisation Process Theory (NPT) has been used to understand the implementation of physical health care interventions. The current study aims to apply the NPT model to a secondary mental health context, and test the model using exploratory factor analysis. This study will consider the implementation of a brief cognitive behaviour therapy for psychosis (CBTp) intervention. Mental health clinicians were asked to complete a NPT-based questionnaire on the implementation of a brief CBTp intervention. All clinicians had experience of either working with the target client group or were able to deliver psychological therapies. In total, 201 clinicians completed the questionnaire. The results of the exploratory factor analysis found partial support for the NPT model, as three of the NPT factors were extracted: (1) coherence, (2) cognitive participation, and (3) reflexive monitoring. We did not find support for the fourth NPT factor (collective action). All scales showed strong internal consistency. Secondary analysis of these factors showed clinicians to generally support the implementation of the brief CBTp intervention. This study provides strong evidence for the validity of the three NPT factors extracted. Further research is needed to determine whether participants' level of seniority moderates factor extraction, whether this factor structure can be generalised to other healthcare settings, and whether pre-implementation attitudes predict actual implementation outcomes.

  12. School-wide PBIS: An Example of Applied Behavior Analysis Implemented at a Scale of Social Importance.

    PubMed

    Horner, Robert H; Sugai, George

    2015-05-01

    School-wide Positive Behavioral Interventions and Supports (PBIS) is an example of applied behavior analysis implemented at a scale of social importance. In this paper, PBIS is defined and the contributions of behavior analysis in shaping both the content and implementation of PBIS are reviewed. Specific lessons learned from implementation of PBIS over the past 20 years are summarized.

  13. A Technique of Two-Stage Clustering Applied to Environmental and Civil Engineering and Related Methods of Citation Analysis.

    ERIC Educational Resources Information Center

    Miyamoto, S.; Nakayama, K.

    1983-01-01

    A method of two-stage clustering of literature based on citation frequency is applied to 5,065 articles from 57 journals in environmental and civil engineering. Results of related methods of citation analysis (hierarchical graph, clustering of journals, multidimensional scaling) applied to the same set of articles are compared. Ten references are…

  14. Space operations and the human factor

    NASA Astrophysics Data System (ADS)

    Brody, Adam R.

    1993-10-01

    Although space flight does not put the public at high risk, billions of dollars in hardware are destroyed and the space program halted when an accident occurs. Researchers are therefore applying human-factors techniques similar to those used in the aircraft industry, albeit at a greatly reduced level, to the spacecraft environment. The intent is to reduce the likelihood of catastrophic failure. To increase safety and efficiency, space human factors researchers have simulated spacecraft docking and extravehicular activity rescue. Engineers have also studied EVA suit mobility and aids. Other basic human-factors issues that have been applied to the space environment include anthropometry, biomechanics, and ergonomics. Workstation design, workload, and task analysis currently receive much attention, as do habitability and other aspects of confined environments. Much work also focuses on individual payloads, as each presents its own complexities.

  15. Assessing suicide risk among callers to crisis hotlines: a confirmatory factor analysis.

    PubMed

    Witte, Tracy K; Gould, Madelyn S; Munfakh, Jimmie Lou Harris; Kleinman, Marjorie; Joiner, Thomas E; Kalafat, John

    2010-09-01

    Our goal was to investigate the factor structure of a risk assessment tool utilized by suicide hotlines and to determine the predictive validity of the obtained factors in predicting subsequent suicidal behavior. We conducted an Exploratory Factor Analysis (EFA), an EFA in a Confirmatory Factor Analysis (EFA/CFA) framework, and a CFA on independent subsamples derived from a total sample of 1,085. Similar to previous studies, we found consistent evidence for a two-factor solution, with one factor representing a more pernicious form of suicide risk (i.e., Resolved Plans and Preparations; RPP) and one factor representing milder suicidal ideation (i.e., Suicidal Desire and Ideation; SDI). The RPP factor trended toward being more predictive of suicidal ideation at follow-up than the SDI factor. (c) 2010 Wiley Periodicals, Inc.

  16. Management factors affecting ammonia volatilization from land-applied cattle slurry in the Mid-Atlantic USA.

    PubMed

    Thompson, R B; Meisinger, J J

    2002-01-01

    Ammonia (NH3) volatilization commonly causes a substantial loss of crop-available N from surface-applied cattle slurry. Field studies were conducted with small wind tunnels to assess the effect of management factors on NH3 volatilization. Two studies compared NH3 volatilization from grass sward and bare soil. The average total NH3 loss was 1.5 times greater from slurry applied to grass sward. Two studies examined the effect of slurry dry matter (DM) content on NH3 loss under hot, summer conditions in Maryland, USA. Slurry DM contents were between 54 and 134 g kg(-1). Dry matter content did not affect total NH3 loss, but did influence the time course of NH3 loss. Higher DM content slurries had relatively higher rates of NH3 volatilization during the first 12 to 24 h, but lower rates thereafter. Under the hot conditions, the higher DM content slurries appeared to dry and crust more rapidly causing smaller rates of NH3 volatilization after 12 to 24 h, which offset the earlier positive effects of DM content on NH3 volatilization. Three studies compared immediate incorporation with different tillage implements. Total NH3 loss from unincorporated slurry was 45% of applied slurry NH4+-N, while losses following immediate incorporation with a moldboard plow, tandem-disk harrow, or chisel plow were, respectively, 0 to 3, 2 to 8, and 8 to 12%. These ground cover and DM content data can be used to improve predictions of NH3 loss under specific farming conditions. The immediate incorporation data demonstrate management practices that can reduce NH3 volatilization, which can improve slurry N utilization in crop-forage production.

  17. Factor analysis of serogroups botanica and aurisina of Leptospira biflexa.

    PubMed

    Cinco, M

    1977-11-01

    Factor analysis is performed on serovars of the Botanica and Aurisina serogroups of Leptospira biflexa. The results show the arrangement of the main serovar- and serogroup-specific factors, as well as the antigens shared with serovars of heterologous serogroups.

  18. Risk factors for baclofen pump infection in children: a multivariate analysis.

    PubMed

    Spader, Heather S; Bollo, Robert J; Bowers, Christian A; Riva-Cambrin, Jay

    2016-06-01

    OBJECTIVE Intrathecal baclofen infusion systems to manage severe spasticity and dystonia are associated with higher infection rates in children than in adults. Factors unique to this population, such as poor nutrition and physical limitations for pump placement, have been hypothesized as the reasons for this disparity. The authors assessed potential risk factors for infection in a multivariate analysis. METHODS Patients who underwent implantation of a programmable pump and intrathecal catheter for baclofen infusion at a single center between January 1, 2000, and March 1, 2012, were identified in this retrospective cohort study. The primary end point was infection. Potential risk factors investigated included preoperative (i.e., demographics, body mass index [BMI], gastrostomy tube, tracheostomy, previous spinal fusion), intraoperative (i.e., surgeon, antibiotics, pump size, catheter location), and postoperative (i.e., wound dehiscence, CSF leak, and number of revisions) factors. Univariate analysis was performed, and a multivariate logistic regression model was created to identify independent risk factors for infection. RESULTS A total of 254 patients were evaluated. The overall infection rate was 9.8%. Univariate analysis identified young age, shorter height, lower weight, dehiscence, CSF leak, and number of revisions within 6 months of pump placement as significantly associated with infection. Multivariate analysis identified young age, dehiscence, and number of revisions as independent risk factors for infection. CONCLUSIONS Young age, wound dehiscence, and number of revisions were independent risk factors for infection in this pediatric cohort. A low BMI and the presence of either a gastrostomy or tracheostomy were not associated with infection and may not be contraindications for this procedure.

  19. The application of Global Sensitivity Analysis to quantify the dominant input factors for hydraulic model simulations

    NASA Astrophysics Data System (ADS)

    Savage, James; Pianosi, Francesca; Bates, Paul; Freer, Jim; Wagener, Thorsten

    2015-04-01

    Predicting flood inundation extents using hydraulic models is subject to a number of critical uncertainties. For a specific event, these uncertainties are known to have a large influence on model outputs and any subsequent analyses made by risk managers. Hydraulic modellers often approach such problems by applying uncertainty analysis techniques such as the Generalised Likelihood Uncertainty Estimation (GLUE) methodology. However, these methods do not allow one to attribute which source of uncertainty has the most influence on the various model outputs that inform flood risk decision making. Another issue facing modellers is the amount of computational resource that is available to spend on modelling flood inundations that are 'fit for purpose' to the modelling objectives. Therefore a balance needs to be struck between computation time, realism and spatial resolution, and effectively characterising the uncertainty spread of predictions (for example from boundary conditions and model parameterisations). However, it is not fully understood how much of an impact each factor has on model performance, for example how much influence changing the spatial resolution of a model has on inundation predictions in comparison to other uncertainties inherent in the modelling process. Furthermore, when resampling fine scale topographic data in the form of a Digital Elevation Model (DEM) to coarser resolutions, there are a number of possible coarser DEMs that can be produced. Deciding which DEM is then chosen to represent the surface elevations in the model could also influence model performance. In this study we model a flood event using the hydraulic model LISFLOOD-FP and apply Sobol' Sensitivity Analysis to estimate which input factor, among the uncertainty in model boundary conditions, uncertain model parameters, the spatial resolution of the DEM and the choice of resampled DEM, have the most influence on a range of model outputs. These outputs include whole domain maximum
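
    Sobol' Sensitivity Analysis of the kind described above is commonly run with a sampling design plus variance decomposition over the model outputs. The sketch below is a generic illustration using the SALib package with a cheap analytic stand-in for the hydraulic model; the factor names, bounds and toy output are assumptions and do not reproduce the LISFLOOD-FP setup of this study.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hypothetical stand-in for a hydraulic-model run; the study itself evaluates
# LISFLOOD-FP, which is far too expensive to inline here.
problem = {
    "num_vars": 4,
    "names": ["channel_n", "floodplain_n", "inflow_scale", "dem_error"],
    "bounds": [[0.02, 0.06], [0.03, 0.10], [0.8, 1.2], [-0.5, 0.5]],
}

def toy_flood_output(x):
    channel_n, floodplain_n, inflow_scale, dem_error = x
    return inflow_scale / (channel_n + floodplain_n) + 0.1 * dem_error

X = saltelli.sample(problem, 1024)                     # Saltelli sampling design
Y = np.array([toy_flood_output(row) for row in X])     # model evaluations
Si = sobol.analyze(problem, Y)                         # Sobol' indices
print(Si["S1"])   # first-order: influence of each factor alone
print(Si["ST"])   # total-order: influence including interactions
```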

  20. Metabolic syndrome in menopause and associated factors: a meta-analysis.

    PubMed

    Pu, D; Tan, R; Yu, Q; Wu, J

    2017-12-01

    Metabolic syndrome (MetS) is a cluster of risk factors for cardiovascular disease and diabetes. Menopause is associated with an increased risk for MetS. The purpose of this meta-analysis is to better understand the relationship between MetS and menopause. MEDLINE and EMBASE were searched for all the associated articles on (1) MetS components in postmenopausal women vs. premenopausal women, (2) comparison of MetS incidence between surgical menopause and natural menopause, and (3) the effect of hormone therapy (HT) with 17β-estradiol (E2) compared to conjugated equine estrogen (CEE) on MetS components among postmenopausal women. The meta-analysis was performed with Review Manager 5.3 software. All comparable indicators were significantly unfavorably changed in postmenopausal women compared to premenopausal women except for high density lipoprotein cholesterol. Women who underwent surgical menopause suffered a 1.51-fold higher risk for MetS compared to those with natural menopause. HT with E2 provided more benefits for triglyceride levels and diastolic blood pressure, while CEE showed a better effect on both high and low density lipoprotein cholesterol levels. Menopause adversely affects nearly all components of MetS, and surgical menopause may lead to a higher incidence of MetS compared to natural menopause. HT with various preparations may have different effects on MetS components. These results may clarify the management of menopause-related MetS in clinical practice.
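
    Pooled estimates for comparisons like these are commonly obtained with an inverse-variance random-effects model. The sketch below is a generic DerSimonian-Laird implementation shown for illustration only; it is not the authors' actual Review Manager computation, and the function name and inputs are assumptions.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling of per-study effect sizes (e.g., mean
    differences or log odds ratios) using the DerSimonian-Laird estimator
    of between-study variance."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                    # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)             # Cochran's Q
    k = len(y)
    C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / C)             # between-study variance
    w_star = 1.0 / (v + tau2)                      # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2
```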

  1. An implicit LU scheme for the Euler equations applied to arbitrary cascades. [new method of factoring]

    NASA Technical Reports Server (NTRS)

    Buratynski, E. K.; Caughey, D. A.

    1984-01-01

    An implicit scheme for solving the Euler equations is derived and demonstrated. The alternating-direction implicit (ADI) technique is modified, using two implicit-operator factors corresponding to lower-block-diagonal (L) or upper-block-diagonal (U) algebraic systems which can be easily inverted. The resulting LU scheme is implemented in finite-volume mode and applied to 2D subsonic and transonic cascade flows with differing degrees of geometric complexity. The results are presented graphically and found to be in good agreement with those of other numerical and analytical approaches. The LU method is also 2.0-3.4 times faster than ADI, suggesting its value in calculating 3D problems.

  2. Analysis on Influence Factors of Adaptive Filter Acting on ANC

    NASA Astrophysics Data System (ADS)

    Zhang, Xiuqun; Zou, Liang; Ni, Guangkui; Wang, Xiaojun; Han, Tao; Zhao, Quanfu

    Noise problems have become more and more serious in recent years, and adaptive filter theory as applied in ANC [1] (active noise control) has attracted increasing attention. In this article, the basic principle and algorithm of the adaptive filter are examined, and the factors that influence its convergence rate and noise reduction are simulated.
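
    A minimal sketch of the classic least-mean-squares (LMS) update that underlies many ANC adaptive filters is given below, assuming a reference-microphone signal and an error-microphone signal; the step size mu is the main knob controlling the convergence rate discussed here. The function name and parameters are illustrative, not taken from the article.

```python
import numpy as np

def lms_filter(reference, desired, n_taps=32, mu=0.01):
    """Classic least-mean-squares (LMS) adaptive filter as used in ANC.
    reference: reference-microphone signal (noise source)
    desired:   error-microphone signal to be cancelled
    mu:        step size, trading convergence rate against stability."""
    x_full = np.asarray(reference, dtype=float)
    d = np.asarray(desired, dtype=float)
    w = np.zeros(n_taps)
    e = np.zeros(len(d))
    for n in range(n_taps, len(d)):
        x = x_full[n - n_taps:n][::-1]   # most recent samples first
        y = w @ x                        # filter output (anti-noise estimate)
        e[n] = d[n] - y                  # residual noise
        w += 2.0 * mu * e[n] * x         # LMS weight update
    return w, e
```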

  3. Analysis of Spatial Pattern and Influencing Factors of E-Commerce

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Chen, J.; Zhang, S.

    2017-09-01

    This paper aims to study the relationship between e-commerce development and geographical characteristics using data on e-commerce, economy, Internet, express delivery and population from 2011 to 2015. Moran's I model and the GWR model are applied to analyze the spatial pattern of e-commerce and its influencing factors. There is a growth trend of e-commerce from west to east, and e-commerce development shows clear space-time clustering, especially around the Yangtze River delta. The comprehensive factors calculated through PCA are described as fundamental social productivity, resident living standard and population sex structure. The first two factors have a positive correlation with e-commerce, and the intensity of the effect increases yearly. However, the influence of population sex structure on e-commerce development is not significant. Our results suggest that the clustering of e-commerce has a downward trend and that the impact of the driving factors on e-commerce differs markedly from year to year across space.
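
    Global Moran's I, the clustering statistic used above, can be computed directly from a vector of regional values and a spatial weights matrix; the short sketch below is a generic illustration under that assumption, not the authors' GWR workflow.

```python
import numpy as np

def morans_i(values, W):
    """Global Moran's I for regional values and a spatial weights matrix W
    (zero diagonal). Values near +1 indicate spatial clustering; values
    near 0 indicate spatial randomness."""
    x = np.asarray(values, dtype=float)
    z = x - x.mean()
    n = len(x)
    s0 = W.sum()
    return (n / s0) * (z @ W @ z) / (z @ z)
```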

  4. 34 CFR 477.4 - What regulations apply?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., DEPARTMENT OF EDUCATION STATE PROGRAM ANALYSIS ASSISTANCE AND POLICY STUDIES PROGRAM General § 477.4 What regulations apply? The following regulations apply to the State Program Analysis Assistance and Policy Studies...

  5. 34 CFR 477.4 - What regulations apply?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., DEPARTMENT OF EDUCATION STATE PROGRAM ANALYSIS ASSISTANCE AND POLICY STUDIES PROGRAM General § 477.4 What regulations apply? The following regulations apply to the State Program Analysis Assistance and Policy Studies...

  6. Quantifying the importance of spatial resolution and other factors through global sensitivity analysis of a flood inundation model

    NASA Astrophysics Data System (ADS)

    Thomas Steven Savage, James; Pianosi, Francesca; Bates, Paul; Freer, Jim; Wagener, Thorsten

    2016-11-01

    Where high-resolution topographic data are available, modelers are faced with the decision of whether it is better to spend computational resource on resolving topography at finer resolutions or on running more simulations to account for various uncertain input factors (e.g., model parameters). In this paper we apply global sensitivity analysis to explore how influential the choice of spatial resolution is when compared to uncertainties in the Manning's friction coefficient parameters, the inflow hydrograph, and those stemming from the coarsening of topographic data used to produce Digital Elevation Models (DEMs). We apply the hydraulic model LISFLOOD-FP to produce several temporally and spatially variable model outputs that represent different aspects of flood inundation processes, including flood extent, water depth, and time of inundation. We find that the most influential input factor for flood extent predictions changes during the flood event, starting with the inflow hydrograph during the rising limb before switching to the channel friction parameter during peak flood inundation, and finally to the floodplain friction parameter during the drying phase of the flood event. Spatial resolution and uncertainty introduced by resampling topographic data to coarser resolutions are much more important for water depth predictions, which are also sensitive to different input factors spatially and temporally. Our findings indicate that the sensitivity of LISFLOOD-FP predictions is more complex than previously thought. Consequently, the input factors that modelers should prioritize will differ depending on the model output assessed, and the location and time of when and where this output is most relevant.

  7. Job demands and job strain as risk factors for employee wellbeing in elderly care: an instrumental-variables analysis.

    PubMed

    Elovainio, Marko; Heponiemi, Tarja; Kuusio, Hannamaria; Jokela, Markus; Aalto, Anna-Mari; Pekkarinen, Laura; Noro, Anja; Finne-Soveri, Harriet; Kivimäki, Mika; Sinervo, Timo

    2015-02-01

    The association between psychosocial work environment and employee wellbeing has repeatedly been shown. However, as environmental evaluations have typically been self-reported, the observed associations may be attributable to reporting bias. Applying instrumental-variable regression, we used staffing level (the ratio of staff to residents) as an unconfounded instrument for self-reported job demands and job strain to predict various indicators of wellbeing (perceived stress, psychological distress and sleeping problems) among 1525 registered nurses, practical nurses and nursing assistants working in elderly care wards. In ordinary regression, higher self-reported job demands and job strain were associated with increased risk of perceived stress, psychological distress and sleeping problems. The effect estimates for the associations of these psychosocial factors with perceived stress and psychological distress were greater, but less precisely estimated, in an instrumental-variables analysis which took into account only the variation in self-reported job demands and job strain that was explained by staffing level. No association between psychosocial factors and sleeping problems was observed with the instrumental-variable analysis. These results support a causal interpretation of high self-reported job demands and job strain being risk factors for employee wellbeing. © The Author 2014. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
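
    Instrumental-variable regression of the kind described here is often estimated by two-stage least squares. The sketch below is a deliberately minimal version with a single endogenous regressor and a single instrument (mirroring staffing level instrumenting self-reported job strain); it is not the authors' full model, and the function name is hypothetical.

```python
import numpy as np

def two_stage_least_squares(y, x_endog, z_instrument):
    """Minimal 2SLS with one endogenous regressor (e.g., self-reported job
    strain) and one instrument (e.g., staffing level); all inputs are 1-D
    arrays of equal length."""
    n = len(y)
    Z = np.column_stack([np.ones(n), z_instrument])
    # Stage 1: project the endogenous regressor on the instrument
    gamma, *_ = np.linalg.lstsq(Z, x_endog, rcond=None)
    x_hat = Z @ gamma
    # Stage 2: regress the outcome on the fitted values
    X_hat = np.column_stack([np.ones(n), x_hat])
    beta, *_ = np.linalg.lstsq(X_hat, y, rcond=None)
    return beta  # [intercept, causal effect estimate]
```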

  8. Six Sigma methods applied to cryogenic coolers assembly line

    NASA Astrophysics Data System (ADS)

    Ventre, Jean-Marc; Germain-Lacour, Michel; Martin, Jean-Yves; Cauquil, Jean-Marc; Benschop, Tonny; Griot, René

    2009-05-01

    Six Sigma methods have been applied to the manufacturing process of a rotary Stirling cooler: RM2. The name of the project is NoVa, as the main goal of the Six Sigma approach is to reduce variability (No Variability). The project has been based on the DMAIC guideline, following five stages: Define, Measure, Analyse, Improve, Control. The objective has been set on the rate of coolers passing the performance test at the first attempt, with a goal value of 95%. A team has been gathered involving people and skills acting on the RM2 manufacturing line. Measurement System Analysis (MSA) has been applied to the test bench, and results after the R&R gage study show that measurement is one of the root causes of variability in the RM2 process. Two more root causes have been identified by the team after process mapping analysis: regenerator filling factor and cleaning procedure. Causes of measurement variability have been identified and eradicated, as shown by new results from the R&R gage. Experimental results show that the regenerator filling factor impacts process variability and affects yield. An improved process has been set up after a new calibration process for the test bench, a new filling procedure for the regenerator and an additional cleaning stage were implemented. The objective of 95% of coolers passing the performance test at the first attempt has been reached and maintained for a significant period. The RM2 manufacturing process is now managed according to Statistical Process Control based on control charts. Improvements in process capability have enabled the introduction of a sample testing procedure before delivery.

  9. Factor analysis of sources of information on organ donation and transplantation in journalism students.

    PubMed

    Martínez-Alarcón, L; Ríos, A; Ramis, G; López-Navas, A; Febrero, B; Ramírez, P; Parrilla, P

    2013-01-01

    Journalists and the information they disseminate are essential to promote health and organ donation and transplantation (ODT). The attitude of journalism students toward ODT could influence public opinion and help promote this treatment option. The aim of this study was to determine the media through which journalism students receive information on ODT and to analyze the association between the sources of information and psychosocial variables. We surveyed journalism students (n = 129) recruited in compulsory classes. A validated psychosocial questionnaire (self-administered, anonymous) about ODT was used. Student t test and χ(2) test were applied. Questionnaire completion rate was 98% (n = 126). The medium with the greatest incidence on students was television (TV), followed by press and magazines/books. In the factor analysis to determine the impact of the information by its source, the first factor was talks with friends and family; the second was shared by hoardings/publicity posters, health professionals, and college/school; and the third was TV and radio. In the factor analysis between information sources and psychosocial variables, the associations were between information about organ donation transmitted by friends and family and having spoken about ODT with them; by TV, radio, and hoardings and not having spoken in the family; and by TV/radio and the father's and mother's opinion about ODT. The medium with the greatest incidence on students is TV, and the medium with the greatest impact on broadcasting information was conversations with friends, family, and health professionals. This could be useful for society, because they should be provided with clear and concise information. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. A Framework for Batched and GPU-Resident Factorization Algorithms Applied to Block Householder Transformations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Tingzing Tim; Tomov, Stanimire Z; Luszczek, Piotr R

    As modern hardware keeps evolving, an increasingly effective approach to developing energy efficient and high-performance solvers is to design them to work on many small size and independent problems. Many applications already need this functionality, especially for GPUs, which are currently known to be about four to five times more energy efficient than multicore CPUs. We describe the development of one-sided factorizations that work for a set of small dense matrices in parallel, and we illustrate our techniques on the QR factorization based on Householder transformations. We refer to this mode of operation as a batched factorization. Our approach is based on representing the algorithms as a sequence of batched BLAS routines for GPU-only execution. This is in contrast to the hybrid CPU-GPU algorithms that rely heavily on using the multicore CPU for specific parts of the workload. But for a system to benefit fully from the GPU's significantly higher energy efficiency, avoiding the use of the multicore CPU must be a primary design goal, so the system can rely more heavily on the more efficient GPU. Additionally, this will result in the removal of the costly CPU-to-GPU communication. Furthermore, we do not use a single symmetric multiprocessor (on the GPU) to factorize a single problem at a time. We illustrate how our performance analysis, and the use of profiling and tracing tools, guided the development and optimization of our batched factorization to achieve up to a 2-fold speedup and a 3-fold energy efficiency improvement compared to our highly optimized batched CPU implementations based on the MKL library (when using two sockets of Intel Sandy Bridge CPUs). Compared to a batched QR factorization featured in the CUBLAS library for GPUs, we achieved up to 5x speedup on the K40 GPU.
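
    The batched mode of operation described above amounts to factorizing many independent small matrices rather than one large one. The sketch below only illustrates that pattern on the CPU with NumPy's Householder-based QR; it is not the paper's GPU implementation, which maps the same loop onto batched BLAS kernels.

```python
import numpy as np

def batched_qr(batch):
    """Factor a batch of small dense matrices independently.
    batch: array of shape (batch_size, m, n). On a GPU the same pattern would
    be dispatched as batched BLAS kernels; here np.linalg.qr is applied
    matrix by matrix as a simple CPU reference."""
    Qs = np.empty_like(batch)
    Rs = np.empty((batch.shape[0], batch.shape[2], batch.shape[2]))
    for i, A in enumerate(batch):
        Q, R = np.linalg.qr(A, mode="reduced")
        Qs[i], Rs[i] = Q, R
    return Qs, Rs

# Example: 1,000 independent 32x16 problems
A = np.random.rand(1000, 32, 16)
Q, R = batched_qr(A)
assert np.allclose(Q @ R, A)   # each factorization reproduces its matrix
```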

  11. FACTOR 9.2: A Comprehensive Program for Fitting Exploratory and Semiconfirmatory Factor Analysis and IRT Models

    ERIC Educational Resources Information Center

    Lorenzo-Seva, Urbano; Ferrando, Pere J.

    2013-01-01

    FACTOR 9.2 was developed for three reasons. First, exploratory factor analysis (FA) is still an active field of research although most recent developments have not been incorporated into available programs. Second, there is now renewed interest in semiconfirmatory (SC) solutions as suitable approaches to the complex structures are commonly found…

  12. Ultrasound-enhanced bioscouring of greige cotton: regression analysis of process factors

    USDA-ARS?s Scientific Manuscript database

    Process factors of enzyme concentration, time, power and frequency were investigated for ultrasound-enhanced bioscouring of greige cotton. A fractional factorial experimental design and subsequent regression analysis of the process factors were employed to determine the significance of each factor a...

  13. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
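
    The dependency-tree idea, learning how a KPI depends on lower-level process and QoS metrics with a tree model, can be illustrated with a generic regression tree; the data, metric names and KPI below are purely hypothetical stand-ins and are not components of the framework itself.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

# Hypothetical monitoring data: three process/QoS metrics and a KPI
# (e.g., order fulfilment time); column names are illustrative only.
rng = np.random.default_rng(0)
metrics = rng.random((500, 3))   # [approval_time, service_latency, queue_length]
kpi = 2.0 * metrics[:, 0] + 5.0 * metrics[:, 2] + rng.normal(0, 0.1, 500)

tree = DecisionTreeRegressor(max_depth=3).fit(metrics, kpi)
print(export_text(tree, feature_names=["approval_time", "service_latency", "queue_length"]))
print(tree.feature_importances_)   # ranks the influential factors
```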

  14. Confirmatory Factor Analysis of the WISC-III with Child Psychiatric Inpatients.

    ERIC Educational Resources Information Center

    Tupa, David J.; Wright, Margaret O'Dougherty; Fristad, Mary A.

    1997-01-01

    Factor models of the Wechsler Intelligence Scale for Children-Third Edition (WISC-III) for one, two, three, and four factors were tested using confirmatory factor analysis with a sample of 177 child psychiatric inpatients. The four-factor model proposed in the WISC-III manual provided the best fit to the data. (SLD)

  15. Confirmatory Factor Analysis of the Finnish Job Content Questionnaire (JCQ) in 590 Professional Musicians.

    PubMed

    Vastamäki, Heidi; Vastamäki, Martti; Laimi, Katri; Saltychev, Michail

    2017-07-01

    Poorly functioning work environments may lead to dissatisfaction for the employees and financial loss for the employers. The Job Content Questionnaire (JCQ) was designed to measure social and psychological characteristics of work environments. To investigate the factor construct of the Finnish 14-item version of the JCQ when applied to professional orchestra musicians. In a cross-sectional survey, the questionnaire was sent by mail to 1550 orchestra musicians and students; 630 responses were received. Full data were available for 590 respondents (response rate 38%). The questionnaire also contained questions on demographics, job satisfaction, health status, health behaviors, and intensity of playing music. Confirmatory factor analysis of the 2-factor model of the JCQ was conducted. Of the 5 JCQ items in the "job demand" construct, "conflicting demands" (question 5) explained most of the total variance in this construct (79%), demonstrating an almost perfect correlation of 0.63. In the "job control" construct, "opinions influential" (question 10) demonstrated a perfect correlation index of 0.84, and the items "little decision freedom" (question 14) and "allows own decisions" (question 6) showed substantial correlations of 0.77 and 0.65. The 2-factor model of the Finnish 14-item version of the JCQ proposed in this study fitted the observed data well. The "conflicting demands," "opinions influential," "little decision freedom," and "allows own decisions" items demonstrated the strongest correlations with the latent factors, suggesting that in a population similar to the one studied, these items in particular should be taken into account when interpreting responses.

  16. Emotional experiences and motivating factors associated with fingerprint analysis.

    PubMed

    Charlton, David; Fraser-Mackenzie, Peter A F; Dror, Itiel E

    2010-03-01

    In this study, we investigated the emotional and motivational factors involved in fingerprint analysis in day-to-day routine case work and in significant and harrowing criminal investigations. Thematic analysis was performed on interviews with 13 experienced fingerprint examiners from a variety of law enforcement agencies. The data revealed factors relating to job satisfaction and the use of skill. Individual satisfaction related to catching criminals was observed; this was most notable in solving high profile, serious, or long-running cases. There were positive emotional effects associated with matching fingerprints and apparent fear of making errors. Finally, we found evidence for a need of cognitive closure in fingerprint examiner decision-making.

  17. Image-derived input function with factor analysis and a-priori information.

    PubMed

    Simončič, Urban; Zanotti-Fregonara, Paolo

    2015-02-01

    Quantitative PET studies often require the cumbersome and invasive procedure of arterial cannulation to measure the input function. This study sought to minimize the number of necessary blood samples by developing a factor-analysis-based image-derived input function (IDIF) methodology for dynamic PET brain studies. IDIF estimation was performed as follows: (a) carotid and background regions were segmented manually on an early PET time frame; (b) blood-weighted and tissue-weighted time-activity curves (TACs) were extracted with factor analysis; (c) factor analysis results were denoised and scaled using the voxels with the highest blood signal; (d) using population data and one blood sample at 40 min, whole-blood TAC was estimated from postprocessed factor analysis results; and (e) the parent concentration was finally estimated by correcting the whole-blood curve with measured radiometabolite concentrations. The methodology was tested using data from 10 healthy individuals imaged with [(11)C](R)-rolipram. The accuracy of IDIFs was assessed against full arterial sampling by comparing the area under the curve of the input functions and by calculating the total distribution volume (VT). The shape of the image-derived whole-blood TAC matched the reference arterial curves well, and the whole-blood area under the curves were accurately estimated (mean error 1.0±4.3%). The relative Logan-V(T) error was -4.1±6.4%. Compartmental modeling and spectral analysis gave less accurate V(T) results compared with Logan. A factor-analysis-based IDIF for [(11)C](R)-rolipram brain PET studies that relies on a single blood sample and population data can be used for accurate quantification of Logan-V(T) values.

  18. Testing of technology readiness index model based on exploratory factor analysis approach

    NASA Astrophysics Data System (ADS)

    Ariani, AF; Napitupulu, D.; Jati, RK; Kadar, JA; Syafrullah, M.

    2018-04-01

    SMEs' readiness to use ICT will determine their adoption of ICT in the future. This study aims to evaluate a technology readiness model for applying technology in SMEs. The model is tested to determine whether the TRI model is relevant for measuring ICT adoption, especially for SMEs in Indonesia. The research method used in this paper is a survey of a group of SMEs in South Tangerang. The survey measures readiness to adopt ICT based on four variables: Optimism, Innovativeness, Discomfort, and Insecurity. Each variable contains several indicators to ensure the variable is measured thoroughly. The data collected through the survey are analysed using the factor analysis method with the help of SPSS software. The results of this study show discrepancies in the TRI model for some indicators and variables. This may be because SME owners' knowledge is not homogeneous regarding either the technology they use or the type of their business.

  19. Factors that Affect Poverty Areas in North Sumatera Using Discriminant Analysis

    NASA Astrophysics Data System (ADS)

    Nasution, D. H.; Bangun, P.; Sitepu, H. R.

    2018-04-01

    In Indonesia, especially North Sumatera, poverty is one of the fundamental problems on which both central and local government focus. Although the poverty rate has decreased, many people remain poor. Poverty covers several aspects such as education, health, demographics, and also structural and cultural factors. This research discusses several factors, such as population density, unemployment rate, GDP per capita ADHK, GDP per capita ADHB, economic growth and life expectancy, that affect poverty in Indonesia. To determine the factors that most influence and differentiate the poverty levels of the regencies/cities of North Sumatra, the discriminant analysis method was used. Discriminant analysis is a multivariate analysis technique used to classify data into groups based on the dependent variable and independent variables. Using discriminant analysis, it is evident that the factor affecting poverty is the unemployment rate.
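
    Linear discriminant analysis of this kind can be run with standard tooling. The sketch below uses scikit-learn on randomly generated placeholder data (the real study uses regency/city indicators such as population density and the unemployment rate), so the values, labels and the 33-area sample size are assumptions for illustration only.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical data for 33 regencies/cities: six columns standing in for the
# candidate factors (density, unemployment, GDP per capita ADHK/ADHB, growth,
# life expectancy); the poverty labels are illustrative only.
rng = np.random.default_rng(1)
X = rng.random((33, 6))
y = (X[:, 1] > X[:, 1].mean()).astype(int)   # toy "poor" / "non-poor" labels

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.coef_)        # larger |coefficient| -> stronger discriminating factor
print(lda.score(X, y))  # classification accuracy on the training data
```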

  20. What Is Rotating in Exploratory Factor Analysis?

    ERIC Educational Resources Information Center

    Osborne, Jason W.

    2015-01-01

    Exploratory factor analysis (EFA) is one of the most commonly reported quantitative methodologies in the social sciences, yet much of the detail regarding what happens during an EFA remains unclear. The goal of this brief technical note is to explore what "rotation" is, what exactly is rotating, and why we use rotation when performing…
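
    What actually rotates is the set of factor axes: an orthogonal rotation multiplies the loading matrix by a rotation matrix without changing the model-implied correlations. The sketch below is a generic varimax implementation included for illustration; it is not tied to this note, and the default gamma and tolerance values are assumptions.

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Varimax rotation: find an orthogonal rotation matrix R that simplifies
    the loading pattern; the factor axes rotate, the fit does not change."""
    p, k = loadings.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag(np.sum(L ** 2, axis=0)))
        )
        R = u @ vt                       # updated rotation matrix
        d_new = np.sum(s)
        if d_new < d * (1 + tol):        # stop when the criterion stalls
            break
        d = d_new
    return loadings @ R                  # rotated loadings
```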

  1. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.

  2. Sparse non-negative matrix factorizations via alternating non-negativity-constrained least squares for microarray data analysis.

    PubMed

    Kim, Hyunsoo; Park, Haesun

    2007-06-15

    Many practical pattern recognition problems require non-negativity constraints. For example, pixels in digital images and chemical concentrations in bioinformatics are non-negative. Sparse non-negative matrix factorizations (NMFs) are useful when the degree of sparseness in the non-negative basis matrix or the non-negative coefficient matrix in an NMF needs to be controlled in approximating high-dimensional data in a lower dimensional space. In this article, we introduce a novel formulation of sparse NMF and show how the new formulation leads to a convergent sparse NMF algorithm via alternating non-negativity-constrained least squares. We apply our sparse NMF algorithm to cancer-class discovery and gene expression data analysis and offer biological analysis of the results obtained. Our experimental results illustrate that the proposed sparse NMF algorithm often achieves better clustering performance with shorter computing time compared to other existing NMF algorithms. The software is available as supplementary material.
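
    As a rough, hedged sketch of the alternating non-negativity-constrained least squares idea (without the sparsity penalties the article adds), the code below factorizes a non-negative matrix by solving NNLS subproblems for H and W in turn; the function name, iteration count and initialization are assumptions, and a production implementation would solve the subproblems block-wise rather than column by column.

```python
import numpy as np
from scipy.optimize import nnls

def nmf_anls(A, k, n_iter=50):
    """Plain NMF (A ~ W @ H, all entries non-negative) by alternating
    non-negativity-constrained least squares."""
    m, n = A.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, k))
    H = np.zeros((k, n))
    for _ in range(n_iter):
        for j in range(n):                 # solve for each column of H, W fixed
            H[:, j], _ = nnls(W, A[:, j])
        for i in range(m):                 # solve for each row of W, H fixed
            W[i, :], _ = nnls(H.T, A[i, :])
    return W, H
```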

  3. Logistic Regression and Path Analysis Method to Analyze Factors influencing Students’ Achievement

    NASA Astrophysics Data System (ADS)

    Noeryanti, N.; Suryowati, K.; Setyawan, Y.; Aulia, R. R.

    2018-04-01

    Students' academic achievement cannot be separated from the influence of two sets of factors, namely internal and external factors. The internal factors (factors within the student) consist of intelligence (X1), health (X2), interest (X3), and motivation (X4). The external factors consist of the family environment (X5), school environment (X6), and society environment (X7). The objects of this research are eighth grade students of the school year 2016/2017 at SMPN 1 Jiwan Madiun, sampled by using simple random sampling. Primary data were obtained by distributing questionnaires. The method used in this study is binary logistic regression analysis, which aims to identify the internal and external factors that affect students' achievement and their trends. Path analysis was used to determine the factors that influence students' achievement directly, indirectly or in total. Based on the results of the binary logistic regression, the variables that affect students' achievement are interest and motivation. Based on the results obtained by path analysis, the factors that have a direct impact on students' achievement are students' interest (59%) and students' motivation (27%), while the factors that have indirect influences on students' achievement are the family environment (97%) and the school environment (37%).

  4. [Effect factors analysis of knee function recovery after distal femoral fracture operation].

    PubMed

    Bei, Chaoyong; Wang, Ruiying; Tang, Jicun; Li, Qiang

    2009-09-01

    To investigate the factors affecting knee function recovery after operation for distal femoral fractures. From January 2001 to May 2007, 92 cases of distal femoral fracture were treated. There were 50 males and 42 females, aged 20-77 years (average 46.7 years). Fracture was caused by traffic accident in 48 cases, by falling from height in 26 cases, by bruise in 12 cases and by tumble in 6 cases. According to Müller's fracture classification, there were 29 cases of type A, 12 cases of type B and 51 cases of type C. According to the American Society of Anesthesiologists (ASA) classification, there were 21 cases of grade I, 39 cases of grade II, 24 cases of grade III, and 8 cases of grade IV. The time from injury to operation was 4 hours to 24 days with an average of 7 days. An anatomical plate was used in 43 cases, a retrograde interlocking intramedullary nail in 37 cases, and bone screws, bolts and internal fixation with Kirschner pins in 12 cases. After operation, the HSS knee function score was used to evaluate efficacy. Ten factors potentially related to knee function recovery after operation for distal femoral fracture were included in the statistical analysis: age, sex, preoperative ASA classification, time from injury to surgery, fracture type, treatment, reduction quality, functional exercise after operation, whether or not CPM functional training was performed, and postoperative complications. Wounds healed by first intention in 88 cases, and infection occurred in 4 cases. All patients were followed up for 16-32 months with an average of 23.1 months. Clinical union of the fracture was achieved within 3-7 months after operation. Extensor device adhesions with a range of motion of <80 degrees occurred in 29 cases, traumatic arthritis in 25 cases, postoperative fracture displacement in 6 cases, mild knee varus or valgus in 7 cases and implant loosening in 6 cases. According to the HSS knee function score, the results were excellent in 52 cases, good in 15 cases, fair in 10 cases and poor in 15 cases with

  5. Nuclear factor-kappaB bioluminescence imaging-guided transcriptomic analysis for the assessment of host-biomaterial interaction in vivo.

    PubMed

    Hsiang, Chien-Yun; Chen, Yueh-Sheng; Ho, Tin-Yun

    2009-06-01

    Establishment of a comprehensive platform for the assessment of host-biomaterial interaction in vivo is an important issue. Nuclear factor-kappaB (NF-kappaB) is an inducible transcription factor that is activated by numerous stimuli. Therefore, NF-kappaB-dependent luminescent signal in transgenic mice carrying the luciferase genes was used as the guide to monitor the biomaterials-affected organs, and transcriptomic analysis was further applied to evaluate the complex host responses in affected organs in this study. In vivo imaging showed that genipin-cross-linked gelatin conduit (GGC) implantation evoked the strong NF-kappaB activity at 6h in the implanted region, and transcriptomic analysis showed that the expressions of interleukin-6 (IL-6), IL-24, and IL-1 family were up-regulated. A strong luminescent signal was observed in spleen on 14 d, suggesting that GGC implantation might elicit the biological events in spleen. Transcriptomic analysis of spleen showed that 13 Kyoto Encyclopedia of Genes and Genomes pathways belonging to cell cycles, immune responses, and metabolism were significantly altered by GGC implants. Connectivity Map analysis suggested that the gene signatures of GGC were similar to those of compounds that affect lipid or glucose metabolism. GeneSetTest analysis further showed that host responses to GGC implants might be related to diseases states, especially the metabolic and cardiovascular diseases. In conclusion, our data provided a concept of molecular imaging-guided transcriptomic platform for the evaluation and the prediction of host-biomaterial interaction in vivo.

  6. Factors Associated With Care Workers' Intention to Leave Employment in Nursing Homes: A Secondary Data Analysis of the Swiss Nursing Homes Human Resources Project.

    PubMed

    Gaudenz, Clergia; De Geest, Sabina; Schwendimann, René; Zúñiga, Franziska

    2017-07-01

    The emerging care personnel shortage in Swiss nursing homes is aggravated by high turnover rates. As intention to leave is a predictor of turnover, awareness of its associated factors is essential. This study applied a secondary data analysis to evaluate the prevalence and variability of 3,984 nursing home care workers' intention to leave. Work environment factors and care worker outcomes were tested via multiple regression analysis. Although 56% of care workers reported intention to leave, prevalences varied widely between facilities. Overall, intention to leave showed strong inverse relationships with supportive leadership and affective organizational commitment and weaker positive relationships with stress due to workload, emotional exhaustion, and care worker health problems. The strong direct relationship of nursing home care workers' intention to leave with affective organizational commitment and perceptions of leadership quality suggest that multilevel interventions to improve these factors might reduce intention to leave.

  7. Multigroup confirmatory factor analysis and structural invariance with age of the Behavior Rating Inventory of Executive Function (BRIEF)--French version.

    PubMed

    Fournet, Nathalie; Roulin, Jean-Luc; Monnier, Catherine; Atzeni, Thierry; Cosnefroy, Olivier; Le Gall, Didier; Roy, Arnaud

    2015-01-01

    The parent and teacher forms of the French version of the Behavioral Rating Inventory of Executive Function (BRIEF) were used to evaluate executive function in everyday life in a large sample of healthy children (N = 951) aged between 5 and 18. Several psychometric methods were applied, with a view to providing clinicians with tools for score interpretation. The parent and teacher forms of the BRIEF were acceptably reliable. Demographic variables (such as age and gender) were found to influence the BRIEF scores. Confirmatory factor analysis was then used to test five competing models of the BRIEF's latent structure. Two of these models (a three-factor model and a two-factor model, both based on a nine-scale structure) had a good fit. However, structural invariance with age was only obtained with the two-factor model. The French version of the BRIEF provides a useful measure of everyday executive function and can be recommended for use in clinical research and practice.
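
    A minimal sketch of the kind of confirmatory factor analysis and model comparison described above could look as follows in Python, assuming the semopy package; the scale names and the two-factor structure here are illustrative placeholders, not the BRIEF's actual specification:

      import pandas as pd
      from semopy import Model, calc_stats

      # Hypothetical data: one column per BRIEF scale score (assumed file layout).
      data = pd.read_csv("brief_scores.csv")

      # Two-factor model in lavaan-style syntax (illustrative scale names).
      two_factor = """
      Behavioral_Regulation =~ inhibit + shift + emotional_control
      Metacognition =~ initiate + working_memory + plan_organize + monitor
      Behavioral_Regulation ~~ Metacognition
      """

      model = Model(two_factor)
      model.fit(data)
      print(calc_stats(model))  # global fit indices (CFI, RMSEA, ...)

    Competing models (e.g., a three-factor structure) would be specified the same way and compared on their fit indices; testing invariance across age groups requires multigroup models with equality constraints, which may call for additional tooling.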

  8. Applied Counterfactual Reasoning

    NASA Astrophysics Data System (ADS)

    Hendrickson, Noel

    This chapter addresses two goals: The development of a structured method to aid intelligence and security analysts in assessing counterfactuals, and forming a structured method to educate (future) analysts in counterfactual reasoning. In order to pursue these objectives, I offer here an analysis of the purposes, problems, parts, and principles of applied counterfactual reasoning. In particular, the ways in which antecedent scenarios are selected and the ways in which scenarios are developed constitute essential (albeit often neglected) aspects of counterfactual reasoning. Both must be addressed to apply counterfactual reasoning effectively. Naturally, further issues remain, but these should serve as a useful point of departure. They are the beginning of a path to more rigorous and relevant counterfactual reasoning in intelligence analysis and counterterrorism.

  9. Factor Analysis of the Aberrant Behavior Checklist in Individuals with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Brinkley, Jason; Nations, Laura; Abramson, Ruth K.; Hall, Alicia; Wright, Harry H.; Gabriels, Robin; Gilbert, John R.; Pericak-Vance, Margaret A. O.; Cuccaro, Michael L.

    2007-01-01

    Exploratory factor analysis (varimax and promax rotations) of the aberrant behavior checklist-community version (ABC) in 275 individuals with Autism spectrum disorder (ASD) identified four- and five-factor solutions which accounted for greater than 70% of the variance. Confirmatory factor analysis (Lisrel 8.7) revealed indices of moderate fit for…
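
    For readers unfamiliar with the procedure, a minimal sketch of an exploratory factor analysis with promax (or varimax) rotation, assuming the Python factor_analyzer package; the file name and the choice of five factors are placeholders, not the ABC data:

      import pandas as pd
      from factor_analyzer import FactorAnalyzer

      # Hypothetical data frame: one column per ABC item, one row per respondent.
      items = pd.read_csv("abc_items.csv")  # assumed file layout

      # Oblique (promax) solution with five factors; use rotation="varimax" for orthogonal.
      efa = FactorAnalyzer(n_factors=5, rotation="promax")
      efa.fit(items)

      loadings = pd.DataFrame(efa.loadings_, index=items.columns)
      variance, proportion, cumulative = efa.get_factor_variance()
      print(loadings.round(2))
      print("cumulative variance explained:", cumulative.round(2))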

  10. A Cost-Benefit Analysis Applied to Example Proposals for Army Training and Education Research

    DTIC Science & Technology

    2008-02-01

    with individuals who have autism (Marckel, Neef, and Ferreri, 2006): Language and communication are major areas of concern for children with autism ...communicative interactions of children with autism and enable them to exercise control over their environments (e.g., by making requests)... Fields...preliminary analysis of teaching improvisation with the Picture Exchange Communication System to children with autism. Journal of Applied Behavior

  11. Independent Component Analysis applied to Ground-based observations

    NASA Astrophysics Data System (ADS)

    Martins-Filho, Walter; Griffith, Caitlin; Pearson, Kyle; Waldmann, Ingo; Alvarez-Candal, Alvaro; Zellem, Robert Thomas

    2018-01-01

    Transit measurements of Jovian-sized exoplanetary atmospheres allow one to study the composition of exoplanets, largely independent of the planet’s temperature profile. However, measurements of hot-Jupiter transits must achieve a level of accuracy in the flux sufficient to determine the spectral modulation of the exoplanetary atmosphere. To accomplish this level of precision, we need to separate systematic errors, and, for ground-based measurements, the effects of Earth’s atmosphere, from the signal due to the exoplanet, which is several orders of magnitude smaller. The effects of the terrestrial atmosphere and some of the time-dependent systematic errors of ground-based transit measurements are treated mainly by dividing the host star by a reference star at each wavelength and time step of the transit. Recently, Independent Component Analysis (ICA) has been used to remove systematic effects from the raw data of space-based observations (Waldmann, 2014, 2012; Morello et al., 2016, 2015). ICA is a statistical method born from the ideas of blind-source separation studies, which can be used to de-trend several independent source signals of a data set (Hyvarinen and Oja, 2000). This technique requires no additional prior knowledge of the data set. In addition, it has the advantage of requiring no reference star. Here we apply ICA to ground-based photometry of the exoplanet XO-2b recorded by the 61” Kuiper Telescope and compare the results of the ICA to those of a previous analysis by Zellem et al. (2015), which does not use ICA. We also simulate the effects of various conditions (concerning the systematic errors, the noise, and the stability of the object on the detector) to determine the conditions under which ICA can be used with high precision to extract the light curve from exoplanetary photometry measurements.
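
    A minimal sketch of the de-trending idea, assuming scikit-learn's FastICA and a hypothetical matrix of raw light curves; the file name and the interpretation of the components are placeholders, and the authors' actual pipeline is more elaborate:

      import numpy as np
      from sklearn.decomposition import FastICA

      # Hypothetical input: rows are time steps, columns are raw time series
      # (target-star flux plus, e.g., airmass, seeing, detector-position proxies).
      X = np.loadtxt("xo2b_photometry.txt")   # assumed file layout

      ica = FastICA(n_components=X.shape[1], random_state=0, max_iter=1000)
      S = ica.fit_transform(X)      # estimated independent source signals
      A = ica.mixing_               # mixing matrix

      # The transit signal is then identified among the components (e.g., by its
      # box-like shape); the remaining components model the systematics and can
      # be subtracted or fit jointly with the transit model.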

  12. A factor analysis of landscape pattern and structure metrics

    Treesearch

    Kurt H. Riitters; R.V. O' Neill; C.T. Hunsaker; James D. Wickham; D.H. Yankee; S.P. Timmins; K.B. Jones; B.L. Jackson

    1995-01-01

    Fifty-five metrics of landscape pattern and structure were calculated for 85 maps of land use and land cover. A multivariate factor analysis was used to identify the common axes (or dimensions) of pattern and structure which were measured by a reduced set of 26 metrics. The first six factors explained about 87% of the variation in the 26 landscape metrics. These...

  13. Measuring psychosocial environments using individual responses: an application of multilevel factor analysis to examining students in schools.

    PubMed

    Dunn, Erin C; Masyn, Katherine E; Jones, Stephanie M; Subramanian, S V; Koenen, Karestan C

    2015-07-01

    Interest in understanding how psychosocial environments shape youth outcomes has grown considerably. School environments are of particular interest to prevention scientists as many prevention interventions are school-based. Therefore, effective conceptualization and operationalization of the school environment is critical. This paper presents an illustration of an emerging analytic method called multilevel factor analysis (MLFA) that provides an alternative strategy to conceptualize, measure, and model environments. MLFA decomposes the total sample variance-covariance matrix for variables measured at the individual level into within-cluster (e.g., student level) and between-cluster (e.g., school level) matrices and simultaneously models potentially distinct latent factor structures at each level. Using data from 79,362 students from 126 schools in the National Longitudinal Study of Adolescent to Adult Health (formerly known as the National Longitudinal Study of Adolescent Health), we use MLFA to show how 20 items capturing student self-reported behaviors and emotions provide information about both students (within level) and their school environment (between level). We identified four latent factors at the within level: (1) school adjustment, (2) externalizing problems, (3) internalizing problems, and (4) self-esteem. Three factors were identified at the between level: (1) collective school adjustment, (2) psychosocial environment, and (3) collective self-esteem. The finding of different and substantively distinct latent factor structures at each level emphasizes the need for prevention theory and practice to separately consider and measure constructs at each level of analysis. The MLFA method can be applied to other nested relationships, such as youth in neighborhoods, and extended to a multilevel structural equation model to better understand associations between environments and individual outcomes and therefore how to best implement preventive interventions.
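
    The decomposition at the heart of MLFA can be written schematically (notation chosen here for illustration; LaTeX):

      \Sigma_{\mathrm{total}} = \Sigma_{W} + \Sigma_{B},
      \qquad
      \Sigma_{W} = \Lambda_{W}\Psi_{W}\Lambda_{W}^{\top} + \Theta_{W},
      \qquad
      \Sigma_{B} = \Lambda_{B}\Psi_{B}\Lambda_{B}^{\top} + \Theta_{B},

    where the W and B subscripts index the within-cluster (student) and between-cluster (school) levels, the Lambdas are level-specific factor loading matrices, the Psis factor covariance matrices, and the Thetas residual covariance matrices; the two levels may carry different numbers of factors, as in the four-within/three-between solution reported above.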

  14. Phenotypic factor analysis of psychopathology reveals a new body-related transdiagnostic factor.

    PubMed

    Pezzoli, Patrizia; Antfolk, Jan; Santtila, Pekka

    2017-01-01

    Comorbidity challenges the notion of mental disorders as discrete categories. An increasing body of literature shows that symptoms cut across traditional diagnostic boundaries and interact in shaping the latent structure of psychopathology. Using exploratory and confirmatory factor analysis, we reveal the latent sources of covariation among nine measures of psychopathological functioning in a population-based sample of 13024 Finnish twins and their siblings. By implementing unidimensional, multidimensional, second-order, and bifactor models, we illustrate the relationships between observed variables, specific, and general latent factors. We also provide the first investigation to date of measurement invariance of the bifactor model of psychopathology across gender and age groups. Our main result is the identification of a distinct "Body" factor, alongside the previously identified Internalizing and Externalizing factors. We also report relevant cross-disorder associations, especially between body-related psychopathology and trait anger, as well as substantial sex and age differences in observed and latent means. The findings expand the meta-structure of psychopathology, with implications for empirical and clinical practice, and demonstrate shared mechanisms underlying attitudes towards nutrition, self-image, sexuality and anger, with gender- and age-specific features.

  15. Bayes factor design analysis: Planning for compelling evidence.

    PubMed

    Schönbrodt, Felix D; Wagenmakers, Eric-Jan

    2018-02-01

    A sizeable literature exists on the use of frequentist power analysis in the null-hypothesis significance testing (NHST) paradigm to facilitate the design of informative experiments. In contrast, there is almost no literature that discusses the design of experiments when Bayes factors (BFs) are used as a measure of evidence. Here we explore Bayes Factor Design Analysis (BFDA) as a useful tool to design studies for maximum efficiency and informativeness. We elaborate on three possible BF designs, (a) a fixed-n design, (b) an open-ended Sequential Bayes Factor (SBF) design, where researchers can test after each participant and can stop data collection whenever there is strong evidence for either H1 or H0, and (c) a modified SBF design that defines a maximal sample size where data collection is stopped regardless of the current state of evidence. We demonstrate how the properties of each design (i.e., expected strength of evidence, expected sample size, expected probability of misleading evidence, expected probability of weak evidence) can be evaluated using Monte Carlo simulations and equip researchers with the necessary information to compute their own Bayesian design analyses.
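
    A minimal Monte Carlo sketch of an open-ended SBF design in Python; for tractability it uses an analytic Bayes factor for a normal mean with known variance and a normal prior under H1, rather than the default JZS t-test Bayes factor, and all thresholds, effect sizes, and sample limits are illustrative assumptions:

      import numpy as np

      def bf10(x, sigma=1.0, tau=1.0):
          """BF10 for H0: mu = 0 vs H1: mu ~ N(0, tau^2), known sigma."""
          n, xbar = len(x), np.mean(x)
          v0 = sigma**2 / n               # sampling variance of the mean under H0
          v1 = v0 + tau**2                # prior-predictive variance under H1
          # Ratio of normal densities of the observed mean under H1 and H0.
          return np.sqrt(v0 / v1) * np.exp(0.5 * xbar**2 * (1/v0 - 1/v1))

      def sbf_trial(true_mu, n_min=10, n_max=200, bound=10.0, rng=None):
          """One simulated open-ended SBF run; returns (decision, sample size)."""
          rng = rng or np.random.default_rng()
          x = list(rng.normal(true_mu, 1.0, size=n_min))
          while len(x) < n_max:
              bf = bf10(np.array(x))
              if bf >= bound:
                  return "H1", len(x)
              if bf <= 1.0 / bound:
                  return "H0", len(x)
              x.append(rng.normal(true_mu, 1.0))
          return "inconclusive", len(x)

      rng = np.random.default_rng(1)
      runs = [sbf_trial(true_mu=0.4, rng=rng) for _ in range(2000)]
      decisions = [d for d, _ in runs]
      print("P(stop for H1):", decisions.count("H1") / len(runs))
      print("P(stop for H0):", decisions.count("H0") / len(runs))
      print("expected N:", np.mean([n for _, n in runs]))

    Repeating the simulation under true_mu = 0 gives the expected rate of misleading evidence for the null scenario, which is exactly the kind of design property the authors evaluate.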

  16. Application of the Bootstrap Methods in Factor Analysis.

    ERIC Educational Resources Information Center

    Ichikawa, Masanori; Konishi, Sadanori

    1995-01-01

    A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal theory maximum likelihood factor analysis when the distributional assumption was satisfied or unsatisfied. Problems arising with the use of bootstrap methods are highlighted. (SLD)
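
    A minimal sketch of a nonparametric bootstrap of factor loadings, assuming the Python factor_analyzer package and a generic data matrix X (rows = observations); note that a real analysis must also align factor order and sign across bootstrap replicates before summarizing, which is omitted here:

      import numpy as np
      from factor_analyzer import FactorAnalyzer

      def bootstrap_loadings(X, n_factors=2, n_boot=200, seed=0):
          """Nonparametric bootstrap standard errors of ML factor loadings (illustrative)."""
          rng = np.random.default_rng(seed)
          n = X.shape[0]
          boots = []
          for _ in range(n_boot):
              sample = X[rng.integers(0, n, size=n)]          # resample rows with replacement
              fa = FactorAnalyzer(n_factors=n_factors, method="ml", rotation="varimax")
              fa.fit(sample)
              boots.append(fa.loadings_)
          return np.array(boots).std(axis=0)                  # bootstrap SE per loading

      # Placeholder data with a weak two-factor structure.
      rng = np.random.default_rng(0)
      f = rng.normal(size=(300, 2))
      X = f @ rng.normal(size=(2, 6)) + 0.5 * rng.normal(size=(300, 6))
      print(bootstrap_loadings(X).round(3))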

  17. Exploratory Factor Analysis with Small Sample Sizes

    ERIC Educational Resources Information Center

    de Winter, J. C. F.; Dodou, D.; Wieringa, P. A.

    2009-01-01

    Exploratory factor analysis (EFA) is generally regarded as a technique for large sample sizes ("N"), with N = 50 as a reasonable absolute minimum. This study offers a comprehensive overview of the conditions in which EFA can yield good quality results for "N" below 50. Simulations were carried out to estimate the minimum required "N" for different…

  18. Space Human Factors Engineering Gap Analysis Project Final Report

    NASA Technical Reports Server (NTRS)

    Hudy, Cynthia; Woolford, Barbara

    2006-01-01

    Humans perform critical functions throughout each phase of every space mission, beginning with the mission concept and continuing to post-mission analysis (Life Sciences Division, 1996). Space missions present humans with many challenges - the microgravity environment, relative isolation, and inherent dangers of the mission all present unique issues. As mission duration and distance from Earth increases, in-flight crew autonomy will increase along with increased complexity. As efforts for exploring the moon and Mars advance, there is a need for space human factors research and technology development to play a significant role in both on-orbit human-system interaction, as well as the development of mission requirements and needs before and after the mission. As part of the Space Human Factors Engineering (SHFE) Project within the Human Research Program (HRP), a six-month Gap Analysis Project (GAP) was funded to identify any human factors research gaps or knowledge needs. The overall aim of the project was to review the current state of human factors topic areas and requirements to determine what data, processes, or tools are needed to aid in the planning and development of future exploration missions, and also to prioritize proposals for future research and technology development.

  19. Assessing Suicide Risk Among Callers to Crisis Hotlines: A Confirmatory Factor Analysis

    PubMed Central

    Witte, Tracy K.; Gould, Madelyn S.; Munfakh, Jimmie Lou Harris; Kleinman, Marjorie; Joiner, Thomas E.; Kalafat, John

    2012-01-01

    Our goal was to investigate the factor structure of a risk assessment tool utilized by suicide hotlines and to determine the predictive validity of the obtained factors in predicting subsequent suicidal behavior. 1,085 suicidal callers to crisis hotlines were divided into three sub-samples, which allowed us to conduct an independent Exploratory Factor Analysis (EFA), EFA in a Confirmatory Factor Analysis (EFA/CFA) framework, and CFA. Similar to previous factor analytic studies (Beck et al., 1997; Holden & DeLisle, 2005; Joiner, Rudd, & Rajab, 1997; Witte et al., 2006), we found consistent evidence for a two-factor solution, with one factor representing a more pernicious form of suicide risk (i.e., Resolved Plans and Preparations) and one factor representing more mild suicidal ideation (i.e., Suicidal Desire and Ideation). Using structural equation modeling techniques, we found preliminary evidence that the Resolved Plans and Preparations factor trended toward being more predictive of suicidal ideation than the Suicidal Desire and Ideation factor. This factor analytic study is the first longitudinal study of the obtained factors. PMID:20578186

  20. Factor analysis of some socio-economic and demographic variables for Bangladesh.

    PubMed

    Islam, S M

    1986-01-01

    The author carries out an exploratory factor analysis of some socioeconomic and demographic variables for Bangladesh using the classical or common factor approach with the varimax rotation method. The socioeconomic and demographic indicators used in this study include literacy, rate of growth, female employment, economic development, urbanization, population density, childlessness, sex ratio, proportion of women ever married, and fertility. The 18 administrative districts of Bangladesh constitute the unit of analysis. 3 common factors--modernization, fertility, and social progress--are identified in this study to explain the correlations among the set of selected socioeconomic and demographic variables.

  1. Using Module Analysis for Multiple Choice Responses: A New Method Applied to Force Concept Inventory Data

    ERIC Educational Resources Information Center

    Brewe, Eric; Bruun, Jesper; Bearden, Ian G.

    2016-01-01

    We describe "Module Analysis for Multiple Choice Responses" (MAMCR), a new methodology for carrying out network analysis on responses to multiple choice assessments. This method is used to identify modules of non-normative responses which can then be interpreted as an alternative to factor analysis. MAMCR allows us to identify conceptual…

  2. Identifying items to assess methodological quality in physical therapy trials: a factor analysis.

    PubMed

    Armijo-Olivo, Susan; Cummings, Greta G; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

    2014-09-01

    Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). A methodological research design was used, and an EFA was performed. Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items.

  3. Bayesian Factor Analysis as a Variable Selection Problem: Alternative Priors and Consequences

    PubMed Central

    Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric

    2016-01-01

    Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, a Bayesian structural equation modeling (BSEM) approach (Muthén & Asparouhov, 2012) has been proposed as a way to explore the presence of cross-loadings in CFA models. We show that the issue of determining factor loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov’s approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike and slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set (Byrne, 2012; Pettegrew & Wolf, 1982) is used to demonstrate our approach. PMID:27314566

  4. Framework Design and Influencing Factor Analysis of a Water Environmental Functional Zone-Based Effluent Trading System.

    PubMed

    Chen, Lei; Han, Zhaoxing; Li, Shuang; Shen, Zhenyao

    2016-10-01

    The efficacy of traditional effluent trading systems is questionable due to their neglect of seasonal hydrological variation and the creation of upstream hot spots within a watershed. Moreover, few studies have been conducted to distinguish the impacts of each influencing factor on effluent trading system outputs. In this study, a water environmental functional zone-based effluent trading system framework was configured and a comprehensive analysis of its influencing factors was conducted. This proposed water environmental functional zone-based effluent trading system was then applied for the control of chemical oxygen demand in the Beiyun River watershed, Beijing, China. Optimal trading results highlighted the integration of water quality constraints and different hydrological seasons, especially for downstream dischargers. The optimal trading of each discharger, in terms of pollutant reduction load and abatement cost, is greatly influenced by environmental and political factors such as background water quality, the location of river assessment points, and tradable discharge permits. In addition, the initial permit allowance has little influence on the market as a whole but does impact the individual discharger. These results provide information that is critical to understanding the impact of policy design on the functionality of an effluent trading system.

  5. Framework Design and Influencing Factor Analysis of a Water Environmental Functional Zone-Based Effluent Trading System

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Han, Zhaoxing; Li, Shuang; Shen, Zhenyao

    2016-10-01

    The efficacy of traditional effluent trading systems is questionable due to their neglect of seasonal hydrological variation and the creation of upstream hot spots within a watershed. Moreover, few studies have been conducted to distinguish the impacts of each influencing factor on effluent trading system outputs. In this study, a water environmental functional zone-based effluent trading system framework was configured and a comprehensive analysis of its influencing factors was conducted. This proposed water environmental functional zone-based effluent trading system was then applied for the control of chemical oxygen demand in the Beiyun River watershed, Beijing, China. Optimal trading results highlighted the integration of water quality constraints and different hydrological seasons, especially for downstream dischargers. The optimal trading of each discharger, in terms of pollutant reduction load and abatement cost, is greatly influenced by environmental and political factors such as background water quality, the location of river assessment points, and tradable discharge permits. In addition, the initial permit allowance has little influence on the market as a whole but does impact the individual discharger. These results provide information that is critical to understanding the impact of policy design on the functionality of an effluent trading system.

  6. Confirmatory Factor Analysis of the Cancer Locus of Control Scale.

    ERIC Educational Resources Information Center

    Henderson, Jessica W.; Donatelle, Rebecca J.; Acock, Alan C.

    2002-01-01

    Conducted a confirmatory factor analysis of the Cancer Locus of Control scale (M. Watson and others, 1990), administered to 543 women with a history of breast cancer. Results support a three-factor model of the scale and support use of the scale to assess control dimensions. (SLD)

  7. Analysis of Social Cohesion in Health Data by Factor Analysis Method: The Ghanaian Perspective

    ERIC Educational Resources Information Center

    Saeed, Bashiru I. I.; Xicang, Zhao; Musah, A. A. I.; Abdul-Aziz, A. R.; Yawson, Alfred; Karim, Azumah

    2013-01-01

    We investigated the study of the overall social cohesion of Ghanaians. In this study, we considered the paramount interest of the involvement of Ghanaians in their communities, their views of other people and institutions, and their level of interest in both local and national politics. The factor analysis method was employed for analysis using R…

  8. Evidence Regarding the Internal Structure: Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Lewis, Todd F.

    2017-01-01

    American Educational Research Association (AERA) standards stipulate that researchers show evidence of the internal structure of instruments. Confirmatory factor analysis (CFA) is one structural equation modeling procedure designed to assess construct validity of assessments that has broad applicability for counselors interested in instrument…

  9. Writing, Literacy, and Applied Linguistics.

    ERIC Educational Resources Information Center

    Leki, Ilona

    2000-01-01

    Discusses writing and literacy in the domain of applied linguistics. Focus is on needs analysis for literacy acquisition; second language learner identity; longitudinal studies as extensions of identity work; and applied linguistics contributions to second language literacy research. (Author/VWL)

  10. Implementation of MCA Method for Identification of Factors for Conceptual Cost Estimation of Residential Buildings

    NASA Astrophysics Data System (ADS)

    Juszczyk, Michał; Leśniak, Agnieszka; Zima, Krzysztof

    2013-06-01

    Conceptual cost estimation is important for construction projects. Either underestimation or overestimation of the cost of raising a building may lead to the failure of a project. In this paper, the authors present the application of a multicriteria comparative analysis (MCA) to select factors influencing the cost of raising residential buildings. The aim of the analysis is to indicate key factors useful for conceptual cost estimation in the early design stage. Key factors are investigated on the basis of elementary information about the function, form, and structure of the building, and the primary assumptions about the technological and organizational solutions applied in the construction process. These factors are treated as variables of a model whose aim is to make conceptual cost estimation fast and satisfactorily accurate. The analysis included three steps: preliminary research, choice of a set of potential variables, and reduction of this set to select the final set of variables. Multicriteria comparative analysis is applied to solve the problem. The analysis made it possible to select a group of factors, defined well enough at the conceptual stage of the design process, to be used as the describing variables of the model.

  11. Factor Structure of the Student-Teacher Relationship Scale for Norwegian School-Age Children Explored with Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Drugli, May Britt; Hjemdal, Odin

    2013-01-01

    The validity of the Student-Teacher Relationship Scale (STRS) was examined in a national sample of 863 Norwegian schoolchildren in grades 1-7 (aged 6-13). The original factor structure of the STRS was tested by confirmatory factor analysis (CFA). The CFA results did not support the original three-factor structure of the STRS. Subsequent CFA of the…

  12. Remote sensing applied to agriculture: Basic principles, methodology, and applications

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Mendonca, F. J.

    1981-01-01

    The general principles of remote sensing techniques as applied to agriculture and the methods of data analysis are described. The theoretical spectral responses of crops; the reflectance, transmittance, and absorptance of plants; the interactions of plants and soils with reflected energy; leaf morphology; and the factors that affect the reflectance of vegetation cover are discussed. The methodologies of visual and computer-aided analyses of LANDSAT data are presented. Finally, a case study in which infrared film was used to detect crop anomalies is described, along with other data applications.

  13. The School Counselor Leadership Survey: Instrument Development and Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Young, Anita; Bryan, Julia

    2015-01-01

    This study examined the factor structure of the School Counselor Leadership Survey (SCLS). Survey development was a threefold process that resulted in a 39-item survey of 801 school counselors and school counselor supervisors. The exploratory factor analysis indicated a five-factor structure that revealed five key dimensions of school counselor…

  14. The fundamental parameter method applied to X-ray fluorescence analysis with synchrotron radiation

    NASA Astrophysics Data System (ADS)

    Pantenburg, F. J.; Beier, T.; Hennrich, F.; Mommsen, H.

    1992-05-01

    Quantitative X-ray fluorescence analysis applying the fundamental parameter method is usually restricted to monochromatic excitation sources. It is shown here that such analyses can also be performed with a white synchrotron radiation spectrum. To determine absolute elemental concentration values, it is necessary to know the spectral distribution of this spectrum. A newly designed and tested experimental setup, which uses the synchrotron radiation emitted from electrons in a bending magnet of ELSA (the electron stretcher accelerator of the University of Bonn), is presented. The determination of the exciting spectrum, described by the given electron beam parameters, is limited by uncertainties in the vertical electron beam size and divergence. We describe a method which allows us to determine the relative and absolute spectral distributions needed for accurate analysis. First test measurements of different alloys and standards of known composition demonstrate that it is possible to determine exact concentration values in bulk and trace element analysis.

  15. The factor structures and correlates of PTSD in post-conflict Timor-Leste: an analysis of the Harvard Trauma Questionnaire.

    PubMed

    Tay, Alvin Kuowei; Mohsin, Mohammed; Rees, Susan; Steel, Zachary; Tam, Natalino; Soares, Zelia; Baker, Jessica; Silove, Derrick

    2017-05-22

    Post-traumatic stress disorder (PTSD) is the most widely assessed form of mental distress in cross-cultural studies conducted amongst populations exposed to mass conflict and displacement. Nevertheless, there have been longstanding concerns about the universality of PTSD as a diagnostic category when applied across cultures. One approach to examining this question is to assess whether the same factor structure can be identified in culturally diverse populations as has been described in Western societies. We examine this issue based on an analysis of the Harvard Trauma Questionnaire (HTQ) completed by a large community sample in conflict-affected Timor-Leste. Culturally adapted measures were applied to assess exposure to conflict-related traumatic events (TEs), ongoing adversities, symptoms of PTSD and psychological distress, and functional impairment amongst a large population sample (n = 2964, response rate: 82.4%) in post-conflict Timor-Leste. Confirmatory factor analyses of the ICD-10, ICD-11, DSM-IV, four-factor Emotional Numbing, and five-factor Dysphoric-Arousal PTSD structures found considerable support for all these models. Based on these classifications, concurrent validity was indicated by logistic regression analyses which showed that being a woman, trauma exposure, ongoing adversity, severe distress, and functional impairment were all associated with PTSD. Although symptom prevalence estimates varied widely based on different classifications, our study found a general agreement in PTSD assignments across contemporary diagnostic systems in a large conflict-affected population in Timor-Leste. Further studies are needed, however, to establish the construct and concurrent validity of PTSD in other cultures.

  16. Determinants of Standard Errors of MLEs in Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Cheng, Ying; Zhang, Wei

    2010-01-01

    This paper studies changes of standard errors (SE) of the normal-distribution-based maximum likelihood estimates (MLE) for confirmatory factor models as model parameters vary. Using logical analysis, simplified formulas and numerical verification, monotonic relationships between SEs and factor loadings as well as unique variances are found.…

  17. Communication Network Analysis Methods.

    ERIC Educational Resources Information Center

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  18. Human Factors Research in Anesthesia Patient Safety

    PubMed Central

    Weinger, Matthew B.; Slagle, Jason

    2002-01-01

    Patient safety has become a major public concern. Human factors research in other high-risk fields has demonstrated how rigorous study of factors that affect job performance can lead to improved outcome and reduced errors after evidence-based redesign of tasks or systems. These techniques have increasingly been applied to the anesthesia work environment. This paper describes data obtained recently using task analysis and workload assessment during actual patient care and the use of cognitive task analysis to study clinical decision making. A novel concept of “non-routine events” is introduced and pilot data are presented. The results support the assertion that human factors research can make important contributions to patient safety. Information technologies play a key role in these efforts.

  19. Human factors research in anesthesia patient safety.

    PubMed Central

    Weinger, M. B.; Slagle, J.

    2001-01-01

    Patient safety has become a major public concern. Human factors research in other high-risk fields has demonstrated how rigorous study of factors that affect job performance can lead to improved outcome and reduced errors after evidence-based redesign of tasks or systems. These techniques have increasingly been applied to the anesthesia work environment. This paper describes data obtained recently using task analysis and workload assessment during actual patient care and the use of cognitive task analysis to study clinical decision making. A novel concept of "non-routine events" is introduced and pilot data are presented. The results support the assertion that human factors research can make important contributions to patient safety. Information technologies play a key role in these efforts. PMID:11825287

  20. Independent Pre-Transplant Recipient Cancer Risk Factors after Kidney Transplantation and the Utility of G-Chart Analysis for Clinical Process Control.

    PubMed

    Schrem, Harald; Schneider, Valentin; Kurok, Marlene; Goldis, Alon; Dreier, Maren; Kaltenborn, Alexander; Gwinner, Wilfried; Barthold, Marc; Liebeneiner, Jan; Winny, Markus; Klempnauer, Jürgen; Kleine, Moritz

    2016-01-01

    The aim of this study is to identify independent pre-transplant cancer risk factors after kidney transplantation and to assess the utility of G-chart analysis for clinical process control. This may contribute to the improvement of cancer surveillance processes in individual transplant centers. 1655 patients after kidney transplantation at our institution with a total of 9,425 person-years of follow-up were compared retrospectively to the general German population using site-specific standardized-incidence-ratios (SIRs) of observed malignancies. Risk-adjusted multivariable Cox regression was used to identify independent pre-transplant cancer risk factors. G-chart analysis was applied to determine relevant differences in the frequency of cancer occurrences. Cancer incidence rates were almost three times higher as compared to the matched general population (SIR = 2.75; 95%-CI: 2.33-3.21). Significantly increased SIRs were observed for renal cell carcinoma (SIR = 22.46), post-transplant lymphoproliferative disorder (SIR = 8.36), prostate cancer (SIR = 2.22), bladder cancer (SIR = 3.24), thyroid cancer (SIR = 10.13) and melanoma (SIR = 3.08). Independent pre-transplant risk factors for cancer-free survival were age <52.3 years (p = 0.007, Hazard ratio (HR): 0.82), age >62.6 years (p = 0.001, HR: 1.29), polycystic kidney disease other than autosomal dominant polycystic kidney disease (ADPKD) (p = 0.001, HR: 0.68), high body mass index in kg/m2 (p<0.001, HR: 1.04), ADPKD (p = 0.008, HR: 1.26) and diabetic nephropathy (p = 0.004, HR = 1.51). G-chart analysis identified relevant changes in the detection rates of cancer during aftercare with no significant relation to identified risk factors for cancer-free survival (p<0.05). Risk-adapted cancer surveillance combined with prospective G-chart analysis likely improves cancer surveillance schemes by adapting processes to identified risk factors and by using G-chart alarm signals to trigger Kaizen events and audits for root
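
    As an illustration of the G-chart idea (not the authors' implementation), a minimal sketch in Python: the monitored quantity is assumed here to be the number of patients seen between successive cancer detections, and the limits follow the usual geometric-distribution construction.

      import numpy as np

      def g_chart_limits(counts_between_events):
          """Center line and 3-sigma limits for a g-chart (geometric counts)."""
          x = np.asarray(counts_between_events, dtype=float)
          center = x.mean()                       # estimate of (1 - p) / p
          sd = np.sqrt(center * (center + 1.0))   # geometric standard deviation
          ucl = center + 3.0 * sd
          lcl = max(0.0, center - 3.0 * sd)
          return center, lcl, ucl

      # Hypothetical aftercare data: patients seen between successive cancer detections.
      gaps = [14, 9, 22, 31, 7, 18, 40, 5, 12, 26]
      center, lcl, ucl = g_chart_limits(gaps)
      signals = [i for i, g in enumerate(gaps) if g > ucl or g < lcl]
      print(center, lcl, ucl, signals)

    A point above the UCL indicates an unusually long detection-free run (possible under-detection), whereas a run of points well below the center line flags a cluster of detections worth auditing, which is the kind of alarm signal the authors propose to trigger Kaizen events.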

  1. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    NASA Astrophysics Data System (ADS)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking technique: one philosophical, the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skills targeted include being able to recognise ethical challenges and formulate coherent responses, distancing oneself from subjective judgements, developing ethical literacy, identifying stakeholders, and communicating the ethical decisions made, to name a few.

  2. Understanding Older Adults' Perceptions of Internet Use: An Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Zheng, Robert; Spears, Jeffrey; Luptak, Marilyn; Wilby, Frances

    2015-01-01

    The current study examined factors related to older adults' perceptions of Internet use. Three hundred ninety five older adults participated in the study. The factor analysis revealed four factors perceived by older adults as critical to their Internet use: social connection, self-efficacy, the need to seek financial information, and the need to…

  3. Factor analysis and multiple regression between topography and precipitation on Jeju Island, Korea

    NASA Astrophysics Data System (ADS)

    Um, Myoung-Jin; Yun, Hyeseon; Jeong, Chang-Sam; Heo, Jun-Haeng

    2011-11-01

    In this study, new factors that influence precipitation were extracted from geographic variables using factor analysis, which allows for an accurate estimation of orographic precipitation. Correlation analysis was also used to examine the relationship between nine topographic variables from digital elevation models (DEMs) and the precipitation on Jeju Island. In addition, a spatial analysis was performed in order to verify the validity of the regression model. From the results of the correlation analysis, it was found that all of the topographic variables had a positive correlation with the precipitation. The relations between the variables also changed in accordance with a change in the precipitation duration. However, upon examining the correlation matrix, no significant relationship between the latitude and the aspect was found. According to the factor analysis, eight topographic variables (latitude being the exception) were found to have a direct influence on the precipitation. Three factors were then extracted from the eight topographic variables. By directly comparing the multiple regression model with the factors (model 1) to the multiple regression model with the topographic variables (model 3), it was found that model 1 did not violate the limits of statistical significance and multicollinearity. As such, model 1 was considered to be appropriate for estimating the precipitation when taking into account the topography. In the study of model 1, the multiple regression model using factor analysis was found to be the best method for estimating the orographic precipitation on Jeju Island.
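
    A minimal sketch of the two-stage procedure (factor extraction followed by regression on the factor scores), in Python with the factor_analyzer and scikit-learn packages; the file name, variable names, and number of factors are placeholders, not the study's actual data:

      import pandas as pd
      from factor_analyzer import FactorAnalyzer
      from sklearn.linear_model import LinearRegression

      # Hypothetical table: one row per station/grid cell, topographic variables
      # from the DEM plus observed precipitation (assumed file layout).
      df = pd.read_csv("jeju_topography.csv")
      topo_cols = ["elevation", "slope", "aspect", "distance_to_coast",
                   "relief", "curvature", "longitude", "openness"]

      fa = FactorAnalyzer(n_factors=3, rotation="varimax")
      fa.fit(df[topo_cols])
      scores = fa.transform(df[topo_cols])             # factor scores (model 1 inputs)

      reg = LinearRegression().fit(scores, df["precipitation"])
      print(reg.coef_, reg.intercept_, reg.score(scores, df["precipitation"]))

    Regressing on the factor scores rather than the raw topographic variables is what lets model 1 avoid the multicollinearity that affects a regression on the correlated DEM variables.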

  4. Exploring Peer Relationships, Friendships and Group Work Dynamics in Higher Education: Applying Social Network Analysis

    ERIC Educational Resources Information Center

    Mamas, Christoforos

    2018-01-01

    This study primarily applied social network analysis (SNA) to explore the relationship between friendships, peer social interactions and group work dynamics within a higher education undergraduate programme in England. A critical case study design was adopted so as to allow for an in-depth exploration of the students' voice. In doing so, the views…

  5. Confirmatory factor analysis of the Child Oral Health Impact Profile (Korean version).

    PubMed

    Cho, Young Il; Lee, Soonmook; Patton, Lauren L; Kim, Hae-Young

    2016-04-01

    Empirical support for the factor structure of the Child Oral Health Impact Profile (COHIP) has not been fully established. The purposes of this study were to evaluate the factor structure of the Korean version of the COHIP (COHIP-K) empirically using confirmatory factor analysis (CFA) based on the theoretical framework and then to assess whether any of the factors in the structure could be grouped into a simpler single second-order factor. Data were collected through self-reported COHIP-K responses from a representative community sample of 2,236 Korean children, 8-15 yr of age. Because a large inter-factor correlation of 0.92 was estimated in the original five-factor structure, the two strongly correlated factors were combined into one factor, resulting in a four-factor structure. The revised four-factor model showed a reasonable fit with appropriate inter-factor correlations. Additionally, the second-order model with four sub-factors was reasonable with sufficient fit and showed equal fit to the revised four-factor model. A cross-validation procedure confirmed the appropriateness of the findings. Our analysis empirically supported a four-factor structure of COHIP-K, a summarized second-order model, and the use of an integrated summary COHIP score. © 2016 Eur J Oral Sci.

  6. Determinants of job stress in chemical process industry: A factor analysis approach.

    PubMed

    Menon, Balagopal G; Praveensal, C J; Madhu, G

    2015-01-01

    Job stress is one of the active domains in industrial safety research. Job stress can result in accidents and health-related issues for workers in chemical process industries. Hence, it is important to measure the level of job stress in workers so that it can be mitigated and worker safety problems in these industries avoided. The objective of this study is to determine the job stress factors in the chemical process industry in Kerala state, India. This study also aims to propose a comprehensive model and an instrument framework for measuring job stress levels in the chemical process industries in Kerala, India. The data were collected through a questionnaire survey conducted in chemical process industries in Kerala. The data from 1,197 completed surveys were subjected to principal component and confirmatory factor analysis to develop the job stress factor structure. The factor analysis revealed 8 factors that influence job stress in process industries. It was also found that job stress in employees is most influenced by role ambiguity and least by the work environment. The study developed an instrument framework for measuring job stress utilizing exploratory factor analysis and structural equation modeling.

  7. Point pattern analysis applied to flood and landslide damage events in Switzerland (1972-2009)

    NASA Astrophysics Data System (ADS)

    Barbería, Laura; Schulte, Lothar; Carvalho, Filipe; Peña, Juan Carlos

    2017-04-01

    Damage caused by meteorological and hydrological extreme events depends on many factors, not only on hazard but also on exposure and vulnerability. In order to reach a better understanding of the relation between these complex factors, their spatial pattern, and the underlying processes, the spatial dependency between values of damage recorded at sites at different distances can be investigated by point pattern analysis. For the Swiss flood and landslide damage database (1972-2009), the first steps of a point pattern analysis have been carried out. The most severe events were selected (severe, very severe, and catastrophic according to the GEES classification, a total of 784 damage points) and Ripley's K-test and L-test, among others, were performed. For this purpose, R's spatstat library was used. The results confirm that the damage points present a statistically significant clustered pattern, which could be connected to the prevalence of damage near watercourses and also to the rainfall distribution of each event, together with other factors. On the other hand, bivariate analysis shows there is no segregated pattern depending on process type: flood/debris flow vs. landslide. This close relation points to a coupling between slope and fluvial processes, connectivity between small- and middle-size catchments, and the influence of the spatial distribution of precipitation, temperature (snowmelt and snow line), and other predisposing factors such as soil moisture, land cover, and environmental conditions. Therefore, further studies will investigate the relationship between the spatial pattern and one or more covariates, such as elevation, distance from the watercourse, or land use. The final goal will be to fit a regression model to the data, so that the adjusted model predicts the intensity of the point process as a function of the above-mentioned covariates.
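
    A self-contained sketch of an uncorrected Ripley's K estimate in Python; the study itself used R's spatstat, which additionally applies edge corrections and Monte Carlo simulation envelopes, omitted here for brevity:

      import numpy as np

      def ripley_k(points, radii, area):
          """Naive Ripley's K estimate (no edge correction); points is an (n, 2) array."""
          pts = np.asarray(points, dtype=float)
          n = len(pts)
          d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
          np.fill_diagonal(d, np.inf)                  # exclude self-pairs
          lam = n / area                               # intensity estimate
          return np.array([(d < r).sum() / (n * lam) for r in radii])

      # Under complete spatial randomness K(r) ~= pi * r^2; values above that
      # indicate clustering of damage points at scale r.
      rng = np.random.default_rng(0)
      pts = rng.uniform(0, 100, size=(200, 2))
      radii = np.array([2, 5, 10, 20])
      print(ripley_k(pts, radii, area=100 * 100))
      print(np.pi * radii ** 2)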

  8. Dual-Tracer PET Using Generalized Factor Analysis of Dynamic Sequences

    PubMed Central

    Fakhri, Georges El; Trott, Cathryn M.; Sitek, Arkadiusz; Bonab, Ali; Alpert, Nathaniel M.

    2013-01-01

    Purpose: With single-photon emission computed tomography, simultaneous imaging of two physiological processes relies on discrimination of the energy of the emitted gamma rays, whereas the application of dual-tracer imaging to positron emission tomography (PET) imaging has been limited by the characteristic 511-keV emissions. Procedures: To address this limitation, we developed a novel approach based on generalized factor analysis of dynamic sequences (GFADS) that exploits spatio-temporal differences between radiotracers and applied it to near-simultaneous imaging of 2-deoxy-2-[18F]fluoro-D-glucose (FDG) (brain metabolism) and 11C-raclopride (D2) with simulated human data and experimental rhesus monkey data. We show theoretically and verify by simulation and measurement that GFADS can separate FDG and raclopride measurements that are made nearly simultaneously. Results: The theoretical development shows that GFADS can decompose the studies at several levels: (1) It decomposes the FDG and raclopride study so that they can be analyzed as though they were obtained separately. (2) If additional physiologic/anatomic constraints can be imposed, further decomposition is possible. (3) For the example of raclopride, specific and nonspecific binding can be determined on a pixel-by-pixel basis. We found good agreement between the estimated GFADS factors and the simulated ground truth time activity curves (TACs), and between the GFADS factor images and the corresponding ground truth activity distributions with errors less than 7.3±1.3 %. Biases in estimation of specific D2 binding and relative metabolism activity were within 5.9±3.6 % compared to the ground truth values. We also evaluated our approach in simultaneous dual-isotope brain PET studies in a rhesus monkey and obtained accuracy of better than 6 % in a mid-striatal volume, for striatal activity estimation. Conclusions: Dynamic image sequences acquired following near-simultaneous injection of two PET radiopharmaceuticals

  9. An Empirical Analysis of Factors Affecting Honors Program Completion Rates

    ERIC Educational Resources Information Center

    Savage, Hallie; Raehsler, Rod D.; Fiedor, Joseph

    2014-01-01

    One of the most important issues in any educational environment is identifying factors that promote academic success. A plethora of research on such factors exists across most academic fields, involving a wide range of student demographics, and the definition of student success varies across the range of studies published. The analysis in this…

  10. Factors of Spatial Visualization: An Analysis of the PSVT:R

    ERIC Educational Resources Information Center

    Ernst, Jeremy V.; Willams, Thomas O.; Clark, Aaron C.; Kelly, Daniel P.

    2017-01-01

    The Purdue Spatial Visualization Test: Visualization of Rotations (PSVT:R) is among the most commonly used measurement instruments to assess spatial ability among engineering students. Previous analysis that explores the factor structure of the PSVT:R indicates a single-factor measure of the instrument. With this as a basis, this research seeks to…

  11. Escalation research: Providing new frontiers for applying behavior analysis to organizational behavior

    PubMed Central

    Goltz, Sonia M.

    2000-01-01

    Decision fiascoes such as escalation of commitment, the tendency of decision makers to “throw good money after bad,” can have serious consequences for organizations and are therefore of great interest in applied research. This paper discusses the use of behavior analysis in organizational behavior research on escalation. Among the most significant aspects of behavior-analytic research on escalation is that it has indicated that both the patterns of outcomes that decision makers have experienced for past decisions and the patterns of responses that they make are critical for understanding escalation. This research has also stimulated the refinement of methods by researchers to better assess decision making and the role reinforcement plays in it. Finally, behavior-analytic escalation research has not only indicated the utility of reinforcement principles for predicting more complex human behavior but has also suggested some additional areas for future exploration of decision making using behavior analysis. PMID:22478347

  12. The risk factors for avian influenza on poultry farms: a meta-analysis.

    PubMed

    Wang, Youming; Li, Peng; Wu, Yangli; Sun, Xiangdong; Yu, Kangzhen; Yu, Chuanhua; Qin, Aijian

    2014-11-01

    Avian influenza is a severe threat to both humans and poultry, but so far no systematic review on the identification and evaluation of the risk factors for avian influenza infection has been published. The objective of this meta-analysis is to provide evidence for decision-making and further research on AI prevention by identifying the risk factors associated with AI infection on poultry farms. The results from 15 selected studies on risk factors for AI infections on poultry farms were analyzed quantitatively by meta-analysis. An open water source (OR=2.89), infections on nearby farms (OR=4.54), other livestock (OR=1.90), and disinfection of the farm (OR=0.54) had a significant association with AI infection on poultry farms. The subgroup analysis results indicate that different risk factors exist for AI infections on different types of farms. The main risk factors for AI infection on poultry farms are environmental conditions (open water source, infections on nearby farms), keeping other livestock on the same farm, and lack of disinfection of the farm. Copyright © 2014 Elsevier B.V. All rights reserved.
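
    For illustration, a minimal sketch of inverse-variance pooling of odds ratios on the log scale (fixed-effect model) in Python; the numbers are placeholders, not the studies analyzed above:

      import numpy as np

      def pool_fixed_effect(or_values, ci_lower, ci_upper):
          """Inverse-variance fixed-effect pooling of odds ratios via log OR."""
          log_or = np.log(or_values)
          # Approximate SE from a 95% CI reported on the OR scale.
          se = (np.log(ci_upper) - np.log(ci_lower)) / (2 * 1.96)
          w = 1.0 / se**2
          pooled = np.sum(w * log_or) / np.sum(w)
          pooled_se = np.sqrt(1.0 / np.sum(w))
          ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
          return np.exp(pooled), ci

      # Placeholder ORs and 95% CIs from three hypothetical studies of one factor.
      print(pool_fixed_effect([2.1, 3.4, 2.8], [1.2, 1.9, 1.1], [3.7, 6.1, 7.0]))

    A random-effects model would add an estimate of the between-study variance (e.g., DerSimonian-Laird tau-squared) to each study's variance before weighting, and heterogeneity statistics and the subgroup analyses by farm type would accompany the pooled estimates in practice.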

  13. Identifying Items to Assess Methodological Quality in Physical Therapy Trials: A Factor Analysis

    PubMed Central

    Cummings, Greta G.; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

    2014-01-01

    Background: Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. Objective: The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). Design: A methodological research design was used, and an EFA was performed. Methods: Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Results: Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Limitation: Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. Conclusions: To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor

  14. A neuro-data envelopment analysis approach for optimization of uncorrelated multiple response problems with smaller the better type controllable factors

    NASA Astrophysics Data System (ADS)

    Bashiri, Mahdi; Farshbaf-Geranmayeh, Amir; Mogouie, Hamed

    2013-11-01

    In this paper, a new method is proposed to optimize a multi-response optimization problem based on the Taguchi method for processes where the controllable factors are smaller-the-better (STB)-type variables and the analyst desires an optimal solution with a smaller amount of the controllable factors. In such processes, the overall output quality of the product should be maximized while the usage of the process inputs, the controllable factors, should be minimized. Since not all possible combinations of factor levels are considered in the Taguchi method, the response values of the unpracticed treatments are estimated using an artificial neural network (ANN). The neural network is tuned by a central composite design (CCD) and a genetic algorithm (GA). Data envelopment analysis (DEA) is then applied to determine the efficiency of each treatment. Although the core of DEA is its philosophy of maximizing outputs versus minimizing inputs, this important issue has been neglected in previous similar studies on multi-response problems. Finally, the most efficient treatment is determined using the maximin weight model approach. The performance of the proposed method is verified in a plastic molding process. Moreover, a sensitivity analysis is carried out with an efficiency-estimator neural network. The results show the efficiency of the proposed approach.
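
    A minimal sketch of the DEA step, assuming an input-oriented CCR (constant returns to scale) envelopment model solved with SciPy's linprog; treatments are the decision-making units, controllable-factor levels act as inputs and estimated responses as outputs, and all numbers are placeholders (the paper's exact DEA formulation and the maximin weight step are not reproduced here):

      import numpy as np
      from scipy.optimize import linprog

      def ccr_efficiency(X, Y, j0):
          """Input-oriented CCR efficiency of DMU j0. X: (n, m) inputs, Y: (n, s) outputs."""
          n, m = X.shape
          s = Y.shape[1]
          # Decision variables: [theta, lambda_1, ..., lambda_n]
          c = np.r_[1.0, np.zeros(n)]
          # Inputs:  sum_j lambda_j * x_ij - theta * x_i,j0 <= 0
          A_in = np.c_[-X[j0], X.T]
          b_in = np.zeros(m)
          # Outputs: -sum_j lambda_j * y_rj <= -y_r,j0
          A_out = np.c_[np.zeros(s), -Y.T]
          b_out = -Y[j0]
          res = linprog(c, A_ub=np.r_[A_in, A_out], b_ub=np.r_[b_in, b_out],
                        bounds=[(0, None)] * (n + 1), method="highs")
          return res.fun     # efficiency score in (0, 1]

      # Placeholder treatments: 2 controllable factors (inputs), 2 responses (outputs).
      X = np.array([[2.0, 3.0], [1.5, 2.5], [3.0, 1.0], [2.5, 2.0]])
      Y = np.array([[10.0, 7.0], [9.0, 8.0], [8.0, 9.0], [11.0, 6.0]])
      print([round(ccr_efficiency(X, Y, j), 3) for j in range(len(X))])

    Treating the smaller-the-better controllable factors as DEA inputs is consistent with the philosophy stressed above: a treatment is efficient only if no convex combination of other treatments achieves at least the same responses with proportionally smaller factor usage.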

  15. [Introduction to Exploratory Factor Analysis (EFA)].

    PubMed

    Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón

    2012-03-01

    Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. To present, in a clear and concise manner, the main applications of this technique; to determine the basic requirements for its use, providing a step-by-step description of its methodology; and to establish the elements that must be taken into account during its preparation in order not to arrive at erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology to achieve factor derivation, global adjustment evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  16. Investigation on Constrained Matrix Factorization for Hyperspectral Image Analysis

    DTIC Science & Technology

    2005-07-25

    Keywords: matrix factorization; nonnegative matrix factorization; linear mixture model; unsupervised linear unmixing; hyperspectral imagery. The spatial resolution permits different materials to be present in the area covered by a single pixel. Under the linear mixture model, a pixel reflectance r is considered the linear mixture of m1, m2, …, mP, i.e., r = Mα + n (1), where n is included to account for
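
    As a rough illustration of this linear mixture idea with nonnegativity constraints, the sketch below applies scikit-learn's NMF to a synthetic cube. The data, number of endmembers, and noise level are all invented, and plain NMF enforces only nonnegativity, not the sum-to-one abundance constraint that a fully constrained unmixing method would add.

      # Nonnegative matrix factorization on a synthetic hyperspectral cube:
      # pixels x bands ≈ abundances x endmember spectra (r = M*alpha + n per pixel).
      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(0)
      n_pixels, n_bands, n_endmembers = 400, 50, 3
      M_true = rng.random((n_endmembers, n_bands))               # endmember spectra (rows)
      A_true = rng.dirichlet(np.ones(n_endmembers), n_pixels)    # abundances per pixel
      R = A_true @ M_true + 0.01 * rng.random((n_pixels, n_bands))

      nmf = NMF(n_components=n_endmembers, init="nndsvda", max_iter=500)
      A_hat = nmf.fit_transform(R)       # estimated abundances (nonnegative)
      M_hat = nmf.components_            # estimated endmember spectra (nonnegative)
      print(A_hat.shape, M_hat.shape, f"reconstruction error = {nmf.reconstruction_err_:.3f}")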

  17. Confirmatory Factor Analysis of the Minnesota Nicotine Withdrawal Scale

    PubMed Central

    Toll, Benjamin A.; O’Malley, Stephanie S.; McKee, Sherry A.; Salovey, Peter; Krishnan-Sarin, Suchitra

    2008-01-01

    The authors examined the factor structure of the Minnesota Nicotine Withdrawal Scale (MNWS) using confirmatory factor analysis in clinical research samples of smokers trying to quit (n = 723). Three confirmatory factor analytic models, based on previous research, were tested with each of the 3 study samples at multiple points in time. A unidimensional model including all 8 MNWS items was found to be the best explanation of the data. This model produced fair to good internal consistency estimates. Additionally, these data revealed that craving should be included in the total score of the MNWS. Factor scores derived from this single-factor, 8-item model showed that increases in withdrawal were associated with poor smoking outcome for 2 of the clinical studies. Confirmatory factor analyses of change scores showed that the MNWS symptoms cohere as a syndrome over time. Future investigators should report a total score using all of the items from the MNWS. PMID:17563141
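
    A unidimensional CFA of the kind described (one latent withdrawal factor, eight indicators) could be specified, for example, with the semopy package, which uses lavaan-style model syntax. The sketch below is an assumption-laden illustration: the file name and item labels w1–w8 are placeholders, not the actual MNWS items or the authors' code.

      # Sketch of a one-factor CFA: a single latent withdrawal factor, eight indicators.
      # Assumes the third-party `semopy` package; mnws_items.csv and w1..w8 are placeholders.
      import pandas as pd
      import semopy

      data = pd.read_csv("mnws_items.csv")   # hypothetical file with columns w1..w8

      model_desc = """
      withdrawal =~ w1 + w2 + w3 + w4 + w5 + w6 + w7 + w8
      """

      model = semopy.Model(model_desc)
      model.fit(data)
      print(model.inspect())            # loadings and variances
      print(semopy.calc_stats(model))   # CFI, RMSEA, and other global fit indices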

  18. Methods and analysis of factors impact on the efficiency of the photovoltaic generation

    NASA Astrophysics Data System (ADS)

    Tianze, Li; Xia, Zhang; Chuan, Jiang; Luan, Hou

    2011-02-01

    First, the paper describes two important breakthroughs in the application of solar energy that occurred in the 1950s. In the 21st century, the development of solar photovoltaic power generation is expected to have the following characteristics: continued high growth of the industry, significantly reduced solar cell costs, large-scale high-tech development of photovoltaic industries, breakthroughs in thin-film cell technology, and rapid development of building-integrated and grid-connected photovoltaics. The paper then presents a theoretical analysis of the operating principles of solar cells. On this basis, we study the conversion efficiency of solar cells, identify the factors that affect the efficiency of photovoltaic generation, address the technical problems limiting conversion efficiency through the development of new technology, and open up new ways to improve solar cell conversion efficiency. Finally, drawing on practical experience, the paper proposes policies and legislation to encourage the use of renewable energy, development strategies, basic applied research, etc.

  19. Meta-analysis of the Brief Psychiatric Rating Scale Factor Structure

    ERIC Educational Resources Information Center

    Shafer, Alan

    2005-01-01

    A meta-analysis (N=17,620; k=26) of factor analyses of the Brief Psychiatric Rating Scale (BPRS) was conducted. Analysis of the 12 items from Overall et al.'s (J. E. Overall, L. E. Hollister, & P. Pichot, 1974) 4 subscales found support for these 4 subscales. Analysis of all 18 BPRS items found 4 components similar to those of Overall et al. In a…

  20. Validation of the Weight Concerns Scale Applied to Brazilian University Students.

    PubMed

    Dias, Juliana Chioda Ribeiro; da Silva, Wanderson Roberto; Maroco, João; Campos, Juliana Alvares Duarte Bonini

    2015-06-01

    The aim of this study was to evaluate the validity and reliability of the Portuguese version of the Weight Concerns Scale (WCS) when applied to Brazilian university students. The scale was completed by 1084 university students from Brazilian public education institutions. A confirmatory factor analysis was conducted. The stability of the model in independent samples was assessed through multigroup analysis, and the invariance was estimated. Convergent, concurrent, divergent, and criterion validities as well as internal consistency were estimated. Results indicated that the one-factor model presented an adequate fit to the sample and values of convergent validity. The concurrent validity with the Body Shape Questionnaire and divergent validity with the Maslach Burnout Inventory for Students were adequate. Internal consistency was adequate, and the factorial structure was invariant in independent subsamples. The results present a simple and short instrument capable of precisely and accurately assessing concerns with weight among Brazilian university students. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. A Human Factors Analysis of EVA Time Requirements

    NASA Technical Reports Server (NTRS)

    Pate, Dennis W.

    1997-01-01

    Human Factors Engineering (HFE) is a discipline whose goal is to engineer a safer, more efficient interface between humans and machines. HFE makes use of a wide range of tools and techniques to fulfill this goal. One of these tools is known as motion and time study, a technique used to develop time standards for given tasks. During the summer of 1995, a human factors motion and time study was initiated with the goals of developing a database of EVA task times and developing a method of utilizing the database to predict how long an EVA should take. Initial development relied on the EVA activities performed during the STS-61 (Hubble) mission. The first step of the study was to become familiar with EVAs, previous task-time studies, and documents produced on EVAs. After reviewing these documents, an initial set of task primitives and task-time modifiers was developed. Data were collected from videotaped footage of two entire STS-61 EVA missions and portions of several others, each with two EVA astronauts. Feedback from the analysis of the data was used to further refine the primitives and modifiers used. The project was continued during the summer of 1996, during which data on human errors were also collected and analyzed. Additional data from the STS-71 mission were also collected. Analysis of variance techniques for categorical data were used to determine which factors may affect the primitive times and how much of an effect they have. Probability distributions for the various tasks were also generated. Further analysis of the modifiers and interactions is planned.

  2. Detecting Outliers in Factor Analysis Using the Forward Search Algorithm

    ERIC Educational Resources Information Center

    Mavridis, Dimitris; Moustaki, Irini

    2008-01-01

    In this article we extend and implement the forward search algorithm for identifying atypical subjects/observations in factor analysis models. The forward search has been mainly developed for detecting aberrant observations in regression models (Atkinson, 1994) and in multivariate methods such as cluster and discriminant analysis (Atkinson, Riani,…

  3. Using fault tree analysis to identify contributing factors to engulfment in flowing grain in on-farm grain bins.

    PubMed

    Kingman, D M; Field, W E

    2005-11-01

    Findings reported by researchers at Illinois State University and Purdue University indicated that since 1980, an average of eight individuals per year have become engulfed and died in farm grain bins in the U.S. and Canada and that all these deaths are significant because they are believed to be preventable. During a recent effort to develop intervention strategies and recommendations for an ASAE farm grain bin safety standard, fault tree analysis (FTA) was utilized to identify contributing factors to engulfments in grain stored in on-farm grain bins. FTA diagrams provided a spatial perspective of the circumstances that occurred prior to engulfment incidents, a perspective never before presented in other hazard analyses. The FTA also demonstrated relationships and interrelationships of the contributing factors. FTA is a useful tool that should be applied more often in agricultural incident investigations to assist in the more complete understanding of the problem studied.

  4. Quantitative Analysis of Guanine Nucleotide Exchange Factors (GEFs) as Enzymes

    PubMed Central

    Randazzo, Paul A; Jian, Xiaoying; Chen, Pei-Wen; Zhai, Peng; Soubias, Olivier; Northup, John K

    2014-01-01

    The proteins that possess guanine nucleotide exchange factor (GEF) activity, which include about 800 G protein coupled receptors (GPCRs) [1], 15 Arf GEFs [2], 81 Rho GEFs [3], 8 Ras GEFs [4], and others for other families of GTPases [5], catalyze the exchange of GTP for GDP on all regulatory guanine nucleotide binding proteins. Despite their importance as catalysts, relatively few exchange factors (we are aware of only eight for ras superfamily members) have been rigorously characterized kinetically [5–13]. In some cases, kinetic analysis has been simplistic, leading to erroneous conclusions about mechanism (as discussed in a recent review [14]). In this paper, we compare two approaches for determining the kinetic properties of exchange factors: (i) examining individual equilibria, and (ii) analyzing the exchange factors as enzymes. Each approach, when thoughtfully used [14,15], provides important mechanistic information about the exchange factors. The analysis as enzymes is described in further detail. With the focus on the production of the biologically relevant guanine nucleotide binding protein complexed with GTP (G•GTP), we believe it is conceptually simpler to connect the kinetic properties to cellular effects. Further, the experiments are often more tractable than those used to analyze the equilibrium system and, therefore, more widely accessible to scientists interested in the function of exchange factors. PMID:25332840

  5. Item response theory analysis applied to the Spanish version of the Personal Outcomes Scale.

    PubMed

    Guàrdia-Olmos, J; Carbó-Carreté, M; Peró-Cebollero, M; Giné, C

    2017-11-01

    The study of measurements of quality of life (QoL) is one of the great challenges of modern psychology and psychometric approaches. This issue takes on greater importance when examining QoL in populations that were historically treated on the basis of their deficiencies; recently, the focus has shifted to what each person values and desires in their life, as in the case of people with intellectual disability (ID). Many studies of QoL scales applied in this area have attempted to improve the validity and reliability of their components by incorporating various sources of information to achieve consistency in the data obtained. The Spanish adaptation of the Personal Outcomes Scale (POS) has shown excellent psychometric attributes, and its administration draws on three sources of information: self-assessment, practitioner and family. Studying the congruence or incongruence of the observed distributions of each item across sources is therefore essential to ensure a correct interpretation of the measure. The aim of this paper was to analyse the observed distribution of items and dimensions from the three Spanish POS information sources cited earlier, using item response theory. We studied a sample of 529 people with ID and their respective practitioners and family members, and in each case we analysed items and factors using Samejima's model for polytomous ordinal scales. The results indicated a substantial number of items with differential effects across sources and, in some cases, significant differences in the distribution of items, factors and sources of information. From this analysis, we conclude that administering the POS with three sources of information is adequate overall, but a correct interpretation of the results requires considering additional information, including some specific items in specific dimensions; unless these considerations are taken into account, the overall ratings could be biased.
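
    Samejima's graded response model, referred to above, expresses the probability of each ordinal response category as the difference between neighbouring cumulative logistic curves. The short numpy sketch below implements those category probabilities directly; the discrimination and threshold parameters are made-up values for illustration, not estimates from the POS data.

      # Samejima graded response model: probability of each ordinal category for a
      # polytomous item, given latent trait theta. Parameters are illustrative only.
      import numpy as np

      def grm_category_probs(theta, a, b):
          """theta: latent trait value(s); a: discrimination; b: ordered thresholds."""
          theta = np.atleast_1d(theta)[:, None]
          # cumulative probabilities P(X >= k) for k = 1..K-1, padded with 1 and 0
          p_star = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(b))))
          upper = np.concatenate([np.ones((theta.shape[0], 1)), p_star], axis=1)
          lower = np.concatenate([p_star, np.zeros((theta.shape[0], 1))], axis=1)
          return upper - lower           # P(X = k) for k = 0..K-1

      # Example: a 4-category item with discrimination 1.5 and thresholds -1, 0, 1.2
      probs = grm_category_probs(theta=[-2, 0, 2], a=1.5, b=[-1.0, 0.0, 1.2])
      print(probs.round(3), probs.sum(axis=1))   # each row sums to 1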

  6. The contribution of psychological factors to recovery after mild traumatic brain injury: is cluster analysis a useful approach?

    PubMed

    Snell, Deborah L; Surgenor, Lois J; Hay-Smith, E Jean C; Williman, Jonathan; Siegert, Richard J

    2015-01-01

    Outcomes after mild traumatic brain injury (MTBI) vary, with slow or incomplete recovery for a significant minority. This study examines whether groups of cases with shared psychological factors but different injury outcomes could be identified using cluster analysis. This prospective observational study followed 147 adults presenting to a hospital-based emergency department or concussion services in Christchurch, New Zealand, and examined associations between baseline demographic, clinical and psychological variables (distress, injury beliefs and symptom burden) and outcome 6 months later. A two-step approach to cluster analysis was applied (Ward's method to identify clusters, K-means to refine results). Three meaningful clusters emerged (high-adapters, medium-adapters, low-adapters). Baseline cluster-group membership was significantly associated with outcomes over time. High-adapters appeared recovered by 6 weeks and medium-adapters showed improvements by 6 months. The low-adapters continued to endorse many symptoms, negative recovery expectations and distress, being at significant risk of poor outcome more than 6 months after injury (OR (good outcome) = 0.12; CI = 0.03-0.53; p < 0.01). Cluster analysis supported the notion that groups could be identified early post-injury based on psychological factors, with group membership associated with differing outcomes over time. Implications for clinical care providers regarding therapy targets and cases that may benefit from different intensities of intervention are discussed.
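
    The two-step clustering approach described (Ward's hierarchical method to form clusters, then K-means to refine the assignments) can be sketched roughly as follows. The feature matrix is a random placeholder for standardized psychological scores, not the study data.

      # Two-step cluster analysis sketch: Ward's hierarchical clustering to obtain
      # initial clusters, then K-means (seeded with the Ward centroids) to refine them.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)
      X = rng.normal(size=(147, 5))                 # placeholder for standardized scores
      X = StandardScaler().fit_transform(X)

      Z = linkage(X, method="ward")                 # step 1: Ward's method
      ward_labels = fcluster(Z, t=3, criterion="maxclust")
      centroids = np.vstack([X[ward_labels == k].mean(axis=0) for k in (1, 2, 3)])

      kmeans = KMeans(n_clusters=3, init=centroids, n_init=1, random_state=0)  # step 2
      labels = kmeans.fit_predict(X)                # refined cluster membership
      print(np.bincount(labels))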

  7. Limit Cycle Analysis Applied to the Oscillations of Decelerating Blunt-Body Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Schoenenberger, Mark; Queen, Eric M.

    2008-01-01

    Many blunt-body entry vehicles have nonlinear dynamic stability characteristics that produce self-limiting oscillations in flight. Several different test techniques can be used to extract dynamic aerodynamic coefficients to predict this oscillatory behavior for planetary entry mission design and analysis. Most of these test techniques impose boundary conditions that alter the oscillatory behavior from that seen in flight. Three sets of test conditions, representing three commonly used test techniques, are presented to highlight these effects. Analytical solutions to the constant-coefficient planar equations-of-motion for each case are developed to show how the same blunt body behaves differently depending on the imposed test conditions. The energy equation is applied to further illustrate the governing dynamics. Then, the mean value theorem is applied to the energy rate equation to find the effective damping for an example blunt body with nonlinear, self-limiting dynamic characteristics. This approach is used to predict constant-energy oscillatory behavior and the equilibrium oscillation amplitudes for the various test conditions. These predictions are verified with planar simulations. The analysis presented provides an overview of dynamic stability test techniques and illustrates the effects of dynamic stability, static aerodynamics and test conditions on observed dynamic motions. It is proposed that these effects may be leveraged to develop new test techniques and refine test matrices in future tests to better define the nonlinear functional forms of blunt body dynamic stability curves.
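
    Self-limiting oscillations of the kind analysed above arise when the effective damping changes sign with amplitude. The toy planar model below (arbitrary coefficients, not the paper's blunt-body aerodynamics) integrates such an equation with scipy and shows the pitch motion settling onto a constant-amplitude limit cycle.

      # Toy planar pitch model with amplitude-dependent damping: negative damping at
      # small angles, positive at large angles, producing a self-limiting limit cycle.
      # Coefficients are arbitrary illustrations, not blunt-body aerodynamic data.
      import numpy as np
      from scipy.integrate import solve_ivp

      k  = 25.0     # static restoring stiffness (per unit inertia, 1/s^2)
      c1 = -0.5     # linear damping coefficient (negative -> dynamically unstable)
      c3 = 40.0     # cubic term that limits the oscillation amplitude

      def pitch(t, y):
          alpha, alpha_dot = y
          return [alpha_dot, -(c1 + c3 * alpha**2) * alpha_dot - k * alpha]

      sol = solve_ivp(pitch, (0.0, 60.0), [0.01, 0.0], max_step=0.01)
      amplitude = np.abs(sol.y[0][sol.t > 40.0]).max()   # amplitude after transients decay
      print(f"limit-cycle amplitude ≈ {np.degrees(amplitude):.1f} deg")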

  8. Pharmaceutical strategic purchasing requirements in Iran: Price interventions and the related effective factors.

    PubMed

    Bastani, Peivand; Dinarvand, Rasoul; SamadBeik, Mahnaz; Pourmohammadi, Kimia

    2016-01-01

    Pharmaceutical access for the poor is an essential factor in developing countries that can be improved through strategic purchasing. This study was conducted to identify the elements affecting price in order to enable insurance organizations to put strategic purchasing into practice. This was a qualitative study conducted through content analysis with an inductive approach, applying a five-stage framework analysis (familiarization, identifying a thematic framework, indexing, mapping, and interpretation). Data analysis started immediately after transcribing each interview, using ATLAS.ti. Data were saturated after 32 semi-structured interviews with experts. These key informants were selected purposefully and through snowball sampling. Findings showed four main themes of pharmaceutical strategic purchasing requirements in Iran: essential and structural factors, international factors, economic factors, and legal factors. In total, 14 related sub-themes were extracted as the main effective variables. Paying adequate attention to these four themes and 14 sub-themes affecting price can enable health system policy-makers of developing countries like Iran to make better decisions on the strategic purchasing of drugs by the main insurers, in order to improve access and health in the country.

  9. Students' motivation to study dentistry in Malaysia: an analysis using confirmatory factor analysis.

    PubMed

    Musa, Muhd Firdaus Che; Bernabé, Eduardo; Gallagher, Jennifer E

    2015-06-12

    Malaysia has experienced a significant expansion of dental schools over the past decade. Research into students' motivation may inform recruitment and retention of the future dental workforce. The objectives of this study were to explore students' motivation to study dentistry and whether that motivation varied by students' and school characteristics. All 530 final-year students in 11 dental schools (6 public and 5 private) in Malaysia were invited to participate at the end of 2013. The self-administered questionnaire, developed at King's College London, collected information on students' motivation to study dentistry and demographic background. Responses on students' motivation were collected using five-point ordinal scales. Confirmatory factor analysis (CFA) was used to evaluate the underlying structure of students' motivation to study dentistry. Multivariate analysis of variance (MANOVA) was used to compare factor scores for overall motivation and sub-domains by students' and school characteristics. Three hundred and fifty-six final-year students in eight schools (all public and two private) participated in the survey, representing an 83% response rate for these schools and 67% of all final-year students nationally. The majority of participants were 24 years old (47%), female (70%), Malay (56%) and from middle-income families (41%) and public schools (78%). CFA supported a model with five first-order factors (professional job, healthcare and people, academic, careers advising and family and friends) which were linked to a single second-order factor representing overall students' motivation. Academic factors and healthcare and people had the highest standardized factor loadings (0.90 and 0.71, respectively), suggesting they were the main motivation to study dentistry. MANOVA showed that students from private schools had higher scores for healthcare and people than those in public schools whereas Malay students had lower scores for family and friends than those

  10. Evaluation of standardized and applied variables in predicting treatment outcomes of polytrauma patients.

    PubMed

    Aksamija, Goran; Mulabdic, Adi; Rasic, Ismar; Muhovic, Samir; Gavric, Igor

    2011-01-01

    Polytrauma is defined as an injury affecting at least two different organ systems or body regions, at least one of which is life-threatening. Given the multilevel model of care for polytrauma patients within KCUS, weaknesses in the management of this category of patients are inevitable. The aims were to determine the dynamics of existing procedures in the treatment of polytrauma patients on admission to KCUS and, based on statistical analysis of the applied variables, to identify and define the factors that influence the final outcome of treatment and their mutual relationships, which may help eliminate flaws in the approach to the problem. The study was based on 263 polytrauma patients. Parametric and non-parametric statistical methods were used. Basic statistics were calculated and, building on these parameters, multicorrelation analysis, image analysis, discriminant analysis and multifactorial analysis were used to achieve the research objectives. From the universe of variables for this study we selected a sample of n = 25 variables, of which the first two were modular; the others (n = 23) belong to the common measurement space and are defined in this paper as the system of variables describing methods, procedures and assessments of polytrauma patients. After the multicorrelation analysis, and since the image analysis gave reliable measurement results, we proceeded to the analysis of eigenvalues, that is, to defining the factors that provide information about how the existing model addresses the problem and about its correlation with treatment outcome. The study singled out the essential factors that determine the current organizational model of care, which may affect treatment and lead to better outcomes for polytrauma patients. The analysis showed the maximum correlative relationships between these practices and contributed to the development of guidelines defined by the isolated factors.

  11. Error analysis for intrinsic quality factor measurement in superconducting radio frequency resonators

    DOE PAGES

    Melnychuk, O.; Grassellino, A.; Romanenko, A.

    2014-12-19

    In this paper, we discuss error analysis for intrinsic quality factor (Q₀) and accelerating gradient (Eacc) measurements in superconducting radio frequency (SRF) resonators. The analysis is applicable for cavity performance tests that are routinely performed at SRF facilities worldwide. We review the sources of uncertainties along with the assumptions on their correlations and present uncertainty calculations with a more complete procedure for treatment of correlations than in previous publications [T. Powers, in Proceedings of the 12th Workshop on RF Superconductivity, SuP02 (Elsevier, 2005), pp. 24–27]. Applying this approach to cavity data collected at the Vertical Test Stand facility at Fermilab, we estimated total uncertainty for both Q₀ and Eacc to be at the level of approximately 4% for input coupler coupling parameter β₁ in the [0.5, 2.5] range. Above 2.5 (below 0.5) Q₀ uncertainty increases (decreases) with β₁ whereas Eacc uncertainty, in contrast with results in Powers [in Proceedings of the 12th Workshop on RF Superconductivity, SuP02 (Elsevier, 2005), pp. 24–27], is independent of β₁. Overall, our estimated Q₀ uncertainty is approximately half as large as that in Powers [in Proceedings of the 12th Workshop on RF Superconductivity, SuP02 (Elsevier, 2005), pp. 24–27].

  12. A Multivariate Methodological Workflow for the Analysis of FTIR Chemical Mapping Applied on Historic Paint Stratigraphies

    PubMed Central

    Sciutto, Giorgia; Oliveri, Paolo; Catelli, Emilio; Bonacini, Irene

    2017-01-01

    In the field of applied research in heritage science, the use of multivariate approaches is still quite limited, and the chemometric results obtained are often underinterpreted. Within this scenario, the present paper aims to disseminate the use of suitable multivariate methodologies and proposes a procedural workflow, applied to a representative group of case studies of considerable importance for conservation purposes, as a guideline for the processing and interpretation of FTIR data. Initially, principal component analysis (PCA) is performed and the score values are converted into chemical maps. Subsequently, the brushing approach is applied, demonstrating its usefulness for a deep understanding of the relationships between the multivariate map and the PC score space, as well as for the identification of the spectral bands mainly involved in the definition of each area localised within the score maps. PMID:29333162
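
    The first step of the proposed workflow (PCA on the mapped spectra, then converting the score values back into chemical maps) might look roughly like this in Python. The cube dimensions and data are invented placeholders, not the FTIR maps from the case studies.

      # PCA score maps from an FTIR chemical map: unfold the (rows x cols x bands)
      # cube into a 2-D matrix, run PCA, and fold each score vector back into an image.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(2)
      rows, cols, bands = 64, 64, 300
      cube = rng.random((rows, cols, bands))          # placeholder FTIR mapping data

      spectra = cube.reshape(-1, bands)               # one spectrum per mapped pixel
      spectra = spectra - spectra.mean(axis=0)        # mean-centre before PCA

      pca = PCA(n_components=3)
      scores = pca.fit_transform(spectra)             # pixel scores on each PC

      score_maps = scores.reshape(rows, cols, -1)     # PC1..PC3 chemical maps
      print(score_maps.shape, pca.explained_variance_ratio_.round(3))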

  13. An Analysis of Selected Factors Influencing Enrollment Patterns.

    ERIC Educational Resources Information Center

    Heck, James

    This report presents an analysis of factors influencing enrollment patterns at Lake City Community College (LCCC; Florida) and recommends ways to increase enrollments at the college. Section I reviews the methods of collecting data for the report, which included interviews with key college personnel, an examination of social indicators such as…

  14. Dynamic Factor Analysis Models with Time-Varying Parameters

    ERIC Educational Resources Information Center

    Chow, Sy-Miin; Zu, Jiyun; Shifren, Kim; Zhang, Guangjian

    2011-01-01

    Dynamic factor analysis models with time-varying parameters offer a valuable tool for evaluating multivariate time series data with time-varying dynamics and/or measurement properties. We use the Dynamic Model of Activation proposed by Zautra and colleagues (Zautra, Potter, & Reich, 1997) as a motivating example to construct a dynamic factor…

  15. Dynamic factor analysis of long-term growth trends of the intertidal seagrass Thalassia hemprichii in southern Taiwan

    NASA Astrophysics Data System (ADS)

    Kuo, Yi-Ming; Lin, Hsing-Juh

    2010-01-01

    We examined the environmental factors most responsible for the 8-year temporal dynamics of the intertidal seagrass Thalassia hemprichii in southern Taiwan. Dynamic factor analysis (DFA), a dimension-reduction technique, was applied to identify common trends in a multivariate time series and the relationships between this series and interacting environmental variables. The results of the dynamic factor models (DFMs) showed that the leaf growth rate of the seagrass was mainly influenced by salinity (Sal), tidal range (TR), turbidity (K), and a common trend representing unexplained variability in the observed time series. Sal was the primary variable explaining the temporal dynamics of the leaf growth rate, compared to TR and K. K and TR had larger influences on the leaf growth rate in low- than in high-elevation beds. In addition to K, TR, and Sal, UV-B radiation (UV-B), sediment depth (SD), and a common trend accounted for long-term temporal variations of the above-ground biomass. Thus, K, TR, Sal, UV-B, and SD are the predominant environmental variables describing temporal growth variations of the intertidal seagrass T. hemprichii in southern Taiwan. In addition to environmental variables, human activities may be contributing negative impacts on the seagrass beds; this human interference may have been responsible for the unexplained common trend in the DFMs. Because the DFA successfully handled complicated ecological and environmental data in this study, the important environmental variables and the impacts of human activities along the coast should be taken into account when managing a coastal environment for the conservation of intertidal seagrass beds.
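
    A dynamic factor model of the general kind used in DFA (a small number of common latent trends plus explanatory covariates for a multivariate time series) can be estimated, for example, with statsmodels. The series names, covariates, and data below are synthetic placeholders, not the seagrass observations.

      # Dynamic factor model sketch: one common latent trend for several observed
      # time series, with exogenous environmental covariates. Data are synthetic.
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.statespace.dynamic_factor import DynamicFactor

      rng = np.random.default_rng(3)
      n = 96                                            # e.g. 8 years of monthly data
      trend = np.cumsum(rng.normal(scale=0.3, size=n))  # latent common trend
      endog = pd.DataFrame({f"bed_{i}": trend + rng.normal(scale=0.5, size=n)
                            for i in range(4)})         # observed growth series
      exog = pd.DataFrame({"salinity": rng.normal(size=n),
                           "turbidity": rng.normal(size=n)})

      mod = DynamicFactor(endog, k_factors=1, factor_order=1, exog=exog)
      res = mod.fit(disp=False)
      print(res.summary())
      print(res.smoothed_state[0][:5])   # estimated common trend (first state)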

  16. What are applied ethics?

    PubMed

    Allhoff, Fritz

    2011-03-01

    This paper explores the relationships that various applied ethics bear to each other, both in particular disciplines and more generally. The introductory section lays out the challenge of coming up with such an account and, drawing a parallel with the philosophy of science, offers that applied ethics may either be unified or disunified. The second section develops one simple account through which applied ethics are unified, vis-à-vis ethical theory. However, this is not taken to be a satisfying answer, for reasons explained. In the third section, specific applied ethics are explored: biomedical ethics; business ethics; environmental ethics; and neuroethics. These are chosen not to be comprehensive, but rather for their traditions or other illustrative purposes. The final section draws together the results of the preceding analysis and defends a disunity conception of applied ethics.

  17. Factors and competitiveness analysis in rare earth mining, new methodology: case study from Brazil.

    PubMed

    Silva, Gustavo A; Petter, Carlos O; Albuquerque, Nelson R

    2018-03-01

    Rare earths are increasingly being applied in high-tech industries, such as green energy (e.g. wind power), hybrid cars, electric cars, high-performance permanent magnets, superconductors, luminophores and many other industrial sectors involved in modern technologies. Given that China dominates this market and imposes restrictions on production and exports whenever opportunities arise, it is becoming more and more challenging to develop business ventures in this sector. Several initiatives were taken to prospect new resources and develop the production chain, including the mining of these mineral assets around the world, but factors of uncertainty, including current low prices, increased the challenge of transforming the current resources into deposits or productive mines. Thus, analyzing the competitiveness of advanced projects becomes indispensable. This work introduces a new methodology of competitiveness analysis in which selected variables are treated as main factors that can strongly undermine the feasibility of a rare earth element (REE) mining enterprise. With this methodology, which is practical and reproducible, it was possible to verify some real-world observations, such as the fact that the Lynas Mount Weld CLD (AUS) project is resilient to the uncertainties of the RE sector, while the Molycorp project is facing major financial difficulties (under judicial reorganization). It was also possible to verify that the Araxá project of CBMM is one of the most competitive in Brazil. Thus, we contribute to the existing literature by providing a new methodology for competitiveness analysis in rare earth mining.

  18. The Applied Behavior Analysis Research Paradigm and Single-Subject Designs in Adapted Physical Activity Research.

    PubMed

    Haegele, Justin A; Hodge, Samuel Russell

    2015-10-01

    There are basic philosophical and paradigmatic assumptions that guide scholarly research endeavors, including the methods used and the types of questions asked. Through this article, kinesiology faculty and students with interests in adapted physical activity are encouraged to understand the basic assumptions of applied behavior analysis (ABA) methodology for conducting, analyzing, and presenting research of high quality in this paradigm. The purposes of this viewpoint paper are to present information fundamental to understanding the assumptions undergirding research methodology in ABA, describe key aspects of single-subject research designs, and discuss common research designs and data-analysis strategies used in single-subject studies.

  19. Recovery of Weak Factor Loadings When Adding the Mean Structure in Confirmatory Factor Analysis: A Simulation Study

    PubMed Central

    Ximénez, Carmen

    2016-01-01

    This article extends previous research on the recovery of weak factor loadings in confirmatory factor analysis (CFA) by exploring the effects of adding the mean structure. This issue has not been examined in previous research. This study is based on the framework of Yung and Bentler (1999) and aims to examine the conditions that affect the recovery of weak factor loadings when the model includes the mean structure, compared to analyzing the covariance structure alone. A simulation study was conducted in which several constraints were defined for one-, two-, and three-factor models. Results show that adding the mean structure improves the recovery of weak factor loadings and reduces the asymptotic variances for the factor loadings, particularly for the models with a smaller number of factors and a small sample size. Therefore, under certain circumstances, modeling the means should be seriously considered for covariance models containing weak factor loadings. PMID:26779071

  20. Affective Outcomes of Schooling: Full-Information Item Factor Analysis of a Student Questionnaire.

    ERIC Educational Resources Information Center

    Muraki, Eiji; Engelhard, George, Jr.

    Recent developments in dichotomous factor analysis based on multidimensional item response models (Bock and Aitkin, 1981; Muthen, 1978) provide an effective method for exploring the dimensionality of questionnaire items. Implemented in the TESTFACT program, this "full information" item factor analysis accounts not only for the pairwise joint…

  1. Influence of applied quantity of sunscreen products on the sun protection factor--a multicenter study organized by the DGK Task Force Sun Protection.

    PubMed

    Bimczok, R; Gers-Barlag, H; Mundt, C; Klette, E; Bielfeldt, S; Rudolph, T; Pflucker, F; Heinrich, U; Tronnier, H; Johncock, W; Klebon, B; Westenfelder, H; Flosser-Muller, H; Jenni, K; Kockott, D; Lademann, J; Herzog, B; Rohr, M

    2007-01-01

    It is often debated whether the protection against solar-induced erythema under real conditions depends on the amount of sunscreen applied; it is believed that when too little is applied, a lower sun protection than indicated on the label will result. The aim of this study was to quantify this effect. In this multicenter study, the influence of three different application amounts (0.5, 1.0, 2.0 mg/cm²) of three commercial sunscreen products was investigated in three reliable test centers according to the test protocol of The International Sun Protection Factor Test Method. The main result was a linear dependence of the SPF on the quantity applied. Taking volunteer-specific variations into account, an exponential dependence of the confidence interval of the in vivo SPF on the amount applied was found; the highest amount applied (2.0 mg/cm²) was linked to the lowest confidence intervals. Thus, from the point of view of producing reliable and reproducible in vivo results under laboratory conditions, the recommendation of this multicenter study is an application quantity of 2.0 mg/cm².

  2. A Study of Algorithms for Covariance Structure Analysis with Specific Comparisons Using Factor Analysis.

    ERIC Educational Resources Information Center

    Lee, S. Y.; Jennrich, R. I.

    1979-01-01

    A variety of algorithms for analyzing covariance structures are considered. Additionally, two methods of estimation, maximum likelihood and weighted least squares, are considered. Comparisons are made between these algorithms and factor analysis. (Author/JKS)

  3. Analysis of multi-layered films. [determining dye densities by applying a regression analysis to the spectral response of the composite transparency

    NASA Technical Reports Server (NTRS)

    Scarpace, F. L.; Voss, A. W.

    1973-01-01

    Dye densities of multi-layered films are determined by applying a regression analysis to the spectral response of the composite transparency. The amount of dye in each layer is determined by fitting the sum of the individual dye layer densities to the measured dye densities. From this, dye content constants are calculated. Methods of calculating equivalent exposures are discussed. Equivalent exposures are a constant amount of energy over a limited band-width that will give the same dye content constants as the real incident energy. Methods of using these equivalent exposures for analysis of photographic data are presented.
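
    The regression idea described here (fitting a weighted sum of the individual dye-layer density spectra to the measured composite density) reduces to a linear least-squares problem. The numpy sketch below uses invented Gaussian-shaped unit density spectra purely for illustration.

      # Fit the measured composite dye density as a weighted sum of three individual
      # dye-layer density spectra. The spectra here are synthetic placeholders.
      import numpy as np

      wavelengths = np.linspace(400, 700, 61)                 # nm
      def band(center, width):                                # crude unit density shape
          return np.exp(-((wavelengths - center) / width) ** 2)

      D = np.column_stack([band(450, 40), band(550, 45), band(650, 50)])  # unit densities
      true_amounts = np.array([0.8, 1.2, 0.5])                # dye content constants
      measured = D @ true_amounts + 0.01 * np.random.default_rng(4).normal(size=len(wavelengths))

      amounts, *_ = np.linalg.lstsq(D, measured, rcond=None)  # regression fit
      print(amounts.round(3))                                 # recovered dye amounts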

  4. An Analysis of Factors that Influence Enlistment Decisions in the U.S. Army

    DTIC Science & Technology

    1998-03-01

    Naval Postgraduate School thesis (Monterey, California) by Oh, Young Yeol: An Analysis of Factors that Influence Enlistment Decisions in the U.S. Army. The purpose of this thesis is to analyze factors that influence decisions to enlist in the U.S. Army. This thesis uses 1997 New Recruit

  5. Teacher's Corner: Examining Identification Issues in Factor Analysis

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Marcoulides, George A.

    2006-01-01

    One hundred years have passed since the birth of factor analysis, during which time there have been some major developments and extensions to the methodology. Unfortunately, one issue where the widespread accumulation of knowledge has been rather slow concerns identification. This article provides a didactic discussion of the topic in an attempt…

  6. Modular Open-Source Software for Item Factor Analysis

    ERIC Educational Resources Information Center

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  7. Perception on obesity among university students: A case study using factor analysis

    NASA Astrophysics Data System (ADS)

    Hassan, Suriani; Rahman, Nur Amira Abdol; Ghazali, Khadizah; Ismail, Norlita; Budin, Kamsia

    2014-07-01

    The purpose of this study was to examine university students' perceptions of obesity and to compare differences in mean factor scores based on demographic factors. Data were collected randomly using questionnaires, and 321 university students participated in this study. Descriptive statistics, factor analysis, normality tests, independent t-tests, one-way ANOVA and non-parametric tests were used. The factor analysis retrieved three new factors, namely health impact, physical appearance impact and personal factors. The study found that Science students had higher awareness and perceptions than Arts students on Factor 1, the health impact of overweight and obesity. Students whose family background included obesity problems had higher awareness and perceptions on Factor 1 than students whose family background did not. The study also found that students whose fathers had only primary school education had the lowest awareness and perceptions on Factor 2, the physical appearance impact of overweight and obesity, compared with students whose fathers had higher academic levels.

  8. Study of risk factors for gastric cancer by populational databases analysis

    PubMed Central

    Ferrari, Fangio; Reis, Marco Antonio Moura

    2013-01-01

    AIM: To study the association between the incidence of gastric cancer and populational exposure to risk/protective factors through an analysis of international databases. METHODS: Open-access global databases concerning the incidence of gastric cancer and its risk/protective factors were identified through an extensive search on the Web. As its distribution was neither normal nor symmetric, the cancer incidence of each country was categorized according to ranges of percentile distribution. The association of each risk/protective factor with exposure was measured between the extreme ranges of the incidence of gastric cancer (under the 25th percentile and above the 75th percentile) by the use of the Mann-Whitney test, considering a significance level of 0.05. RESULTS: A variable amount of data omission was observed among all of the factors under study. A weak or nonexistent correlation between the incidence of gastric cancer and the study variables was shown by a visual analysis of scatterplot dispersion. In contrast, an analysis of categorized incidence revealed that the countries with the highest human development index (HDI) values had the highest rates of obesity in males and the highest consumption of alcohol, tobacco, fruits, vegetables and meat, which were associated with higher incidences of gastric cancer. There was no significant difference for the risk factors of obesity in females and fish consumption. CONCLUSION: Higher HDI values, coupled with a higher prevalence of male obesity and a higher per capita consumption of alcohol, tobacco, fruits, vegetables and meat, are associated with a higher incidence of gastric cancer based on an analysis of populational global data. PMID:24409066
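
    The comparison described (countries in the bottom versus top quartiles of incidence, compared on an exposure with the Mann-Whitney test) can be sketched as follows. The data frame is a made-up stand-in for the open-access global databases, not the study data.

      # Compare a risk-factor exposure between countries in the bottom and top
      # quartiles of gastric cancer incidence with a Mann-Whitney U test.
      import numpy as np
      import pandas as pd
      from scipy.stats import mannwhitneyu

      rng = np.random.default_rng(5)
      df = pd.DataFrame({"incidence": rng.gamma(2.0, 4.0, 120),
                         "alcohol_per_capita": rng.gamma(3.0, 2.0, 120)})

      low  = df[df.incidence <= df.incidence.quantile(0.25)]   # under 25th percentile
      high = df[df.incidence >= df.incidence.quantile(0.75)]   # above 75th percentile

      stat, p = mannwhitneyu(low.alcohol_per_capita, high.alcohol_per_capita,
                             alternative="two-sided")
      print(f"U = {stat:.1f}, p = {p:.3f}")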

  9. An Evaluation of the Effects of Variable Sampling on Component, Image, and Factor Analysis.

    ERIC Educational Resources Information Center

    Velicer, Wayne F.; Fava, Joseph L.

    1987-01-01

    Principal component analysis, image component analysis, and maximum likelihood factor analysis were compared to assess the effects of variable sampling. Results with respect to degree of saturation and average number of variables per factor were clear and dramatic. Differential effects on boundary cases and nonconvergence problems were also found.…

  10. In Dreams Begin Responsibility: Why and How to Measure the Quality of Graduate Training in Applied Behavior Analysis.

    PubMed

    Critchfield, Thomas S

    2015-10-01

    Although no one knows just how effective graduate training may be in creating effective practitioners of applied behavior analysis, there are plenty of logical and historical reasons to think that not all practitioners are equally competent. I detail some of those reasons and explain why practitioner effectiveness may be a more pressing worry now than in the past. Because ineffective practitioners harm the profession, rigorous mechanisms are needed for evaluating graduate training programs in terms of the field effectiveness of their practitioners. Accountability of this nature, while difficult to arrange, would make applied behavior analysis nearly unique among professions, would complement existing quality control processes, and would help to protect the positive reputation and vigorous consumer demand that the profession currently enjoys.

  11. A factor analysis of the SSQ (Speech, Spatial, and Qualities of Hearing Scale).

    PubMed

    Akeroyd, Michael A; Guy, Fiona H; Harrison, Dawn L; Suller, Sharon L

    2014-02-01

    The speech, spatial, and qualities of hearing questionnaire (SSQ) is a self-report test of auditory disability. The 49 items ask how well a listener would do in many complex listening situations illustrative of real life. The scores on the items are often combined into the three main sections or into 10 pragmatic subscales. We report here a factor analysis of the SSQ that we conducted to further investigate its statistical properties and to determine its structure. Statistical factor analysis of questionnaire data, using parallel analysis to determine the number of factors to retain, oblique rotation of factors, and a bootstrap method to estimate the confidence intervals. 1220 people who have attended MRC IHR over the last decade. We found three clear factors, essentially corresponding to the three main sections of the SSQ. They are termed "speech understanding", "spatial perception", and "clarity, separation, and identification". Thirty-five of the SSQ questions were included in the three factors. There was partial evidence for a fourth factor, "effort and concentration", representing two more questions. These results aid in the interpretation and application of the SSQ and indicate potential methods for generating average scores.
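
    Parallel analysis, used above to decide how many factors to retain, compares the observed eigenvalues with eigenvalues obtained from random data of the same size. A compact numpy version is sketched below; the response matrix is a random placeholder with the same shape as the SSQ data (1220 respondents by 49 items), and real item responses would be substituted.

      # Horn's parallel analysis: retain factors whose observed eigenvalues exceed the
      # chosen percentile of eigenvalues obtained from random data of the same shape.
      import numpy as np

      def parallel_analysis(data, n_iter=200, percentile=95, seed=0):
          rng = np.random.default_rng(seed)
          n, p = data.shape
          obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
          rand_eig = np.empty((n_iter, p))
          for i in range(n_iter):
              r = rng.normal(size=(n, p))
              rand_eig[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
          threshold = np.percentile(rand_eig, percentile, axis=0)
          return int(np.sum(obs_eig > threshold)), obs_eig, threshold

      rng = np.random.default_rng(6)
      responses = rng.normal(size=(1220, 49))       # placeholder for the 49 SSQ items
      k, obs, thr = parallel_analysis(responses)
      print(f"factors to retain: {k}")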

  12. Psychological Factors and Conditioned Pain Modulation: A Meta-Analysis.

    PubMed

    Nahman-Averbuch, Hadas; Nir, Rony-Reuven; Sprecher, Elliot; Yarnitsky, David

    2016-06-01

    Conditioned pain modulation (CPM) responses may be affected by psychological factors such as anxiety, depression, and pain catastrophizing; however, most studies on CPM do not address these relations as their primary outcome. The aim of this meta-analysis was to analyze the findings regarding the associations between CPM responses and psychological factors in both pain-free individuals and pain patients. After a comprehensive PubMed search, 37 articles were found to be suitable for inclusion. Analyses used DerSimonian and Laird's random-effects model on Fisher's z-transforms of correlations; potential publication bias was tested using funnel plots and Egger's regression test for funnel plot asymmetry. Six meta-analyses were performed examining the correlations between anxiety, depression, and pain catastrophizing, and CPM responses in healthy individuals and pain patients. No significant correlations between CPM responses and any of the examined psychological factors were found. However, a secondary analysis, comparing modality-specific CPM responses and psychological factors in healthy individuals, revealed the following: (1) pressure-based CPM responses were correlated with anxiety (grand mean correlation in original units r=-0.1087; 95% confidence limits, -0.1752 to -0.0411); (2) heat-based CPM was correlated with depression (r=0.2443; 95% confidence limits, 0.0150 to 0.4492); and (3) electrical-based CPM was correlated with pain catastrophizing levels (r=-0.1501; 95% confidence limits, -0.2403 to -0.0574). Certain psychological factors seem to be associated with modality-specific CPM responses in healthy individuals. This potentially supports the notion that CPM paradigms evoked by different stimulation modalities represent different underlying mechanisms.
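
    The meta-analytic model named above (DerSimonian and Laird's random-effects model applied to Fisher z-transformed correlations) can be written out in a few lines of numpy. The correlations and sample sizes below are invented for illustration, not the reviewed studies' values.

      # Random-effects meta-analysis (DerSimonian-Laird) of correlations after
      # Fisher z-transformation. The r values and sample sizes are illustrative only.
      import numpy as np

      r = np.array([-0.15, -0.05, 0.02, -0.20, -0.10])   # study correlations (made up)
      n = np.array([60, 120, 45, 80, 150])               # study sample sizes (made up)

      z = np.arctanh(r)                 # Fisher z-transform
      v = 1.0 / (n - 3)                 # within-study variance of z
      w = 1.0 / v                       # fixed-effect weights

      # DerSimonian-Laird estimate of between-study variance tau^2
      z_fixed = np.sum(w * z) / np.sum(w)
      Q = np.sum(w * (z - z_fixed) ** 2)
      c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
      tau2 = max(0.0, (Q - (len(r) - 1)) / c)

      w_star = 1.0 / (v + tau2)         # random-effects weights
      z_re = np.sum(w_star * z) / np.sum(w_star)
      se = np.sqrt(1.0 / np.sum(w_star))
      lo, hi = np.tanh(z_re - 1.96 * se), np.tanh(z_re + 1.96 * se)
      print(f"pooled r = {np.tanh(z_re):.3f}  (95% CI {lo:.3f} to {hi:.3f})")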

  13. Exploring factors that influence work analysis data: A meta-analysis of design choices, purposes, and organizational context.

    PubMed

    DuVernet, Amy M; Dierdorff, Erich C; Wilson, Mark A

    2015-09-01

    Work analysis is fundamental to designing effective human resource systems. The current investigation extends previous research by identifying the differential effects of common design decisions, purposes, and organizational contexts on the data generated by work analyses. The effects of 19 distinct factors that span choices of descriptor, collection method, rating scale, and data source, as well as project purpose and organizational features, are explored. Meta-analytic results cumulated from 205 articles indicate that many of these variables hold significant consequences for work analysis data. Factors pertaining to descriptor choice, collection method, rating scale, and the purpose for conducting the work analysis each showed strong associations with work analysis data. The source of the work analysis information and organizational context in which it was conducted displayed fewer relationships. Findings can be used to inform choices work analysts make about methodology and postcollection evaluations of work analysis information. (c) 2015 APA, all rights reserved.

  14. A Confirmatory Factor Analysis of the Structure of Abbreviated Math Anxiety Scale

    PubMed Central

    Farrokhi, Farahman

    2011-01-01

    Objective The aim of this study is to explore the confirmatory factor analysis results of the Persian adaptation of Abbreviated Math Anxiety Scale (AMAS), proposed by Hopko, Mahadevan, Bare & Hunt. Method The validity and reliability assessments of the scale were performed on 298 college students chosen randomly from Tabriz University in Iran. The confirmatory factor analysis (CFA) was carried out to determine the factor structures of the Persian version of AMAS. Results As expected, the two-factor solution provided a better fit to the data than a single factor. Moreover, multi-group analyses showed that this two-factor structure was invariant across sex. Hence, AMAS provides an equally valid measure for use among college students. Conclusions Brief AMAS demonstrates adequate reliability and validity. The AMAS scores can be used to compare symptoms of math anxiety between male and female students. The study both expands and adds support to the existing body of math anxiety literature. PMID:22952521

  15. Management of extramedullary plasmacytoma: Role of radiotherapy and prognostic factor analysis in 55 patients.

    PubMed

    Wen, Ge; Wang, Weihu; Zhang, Yujing; Niu, Shaoqing; Li, Qiwen; Li, Yexiong

    2017-10-01

    To investigate potential prognostic factors affecting patient outcomes and to evaluate the optimal methods and effects of radiotherapy (RT) in the management of extramedullary plasmacytoma (EMP). Data from 55 patients with EMP between November 1999 and August 2015 were collected. The median age was 51 (range, 22-77) years. The median tumor size was 3.5 (range, 1.0-15.0) cm. The median applied dose was 50.0 (range, 30.0-70.0) Gy. Thirty-nine patients (70.9%) presented with disease in the head or neck region. Twelve patients received RT alone, 9 received surgery (S) alone, 3 received chemotherapy (CT) alone, and 3 patients did not receive any treatment. Combination therapies were applied in 28 patients. The median follow-up duration was 56 months. The 5-year local recurrence-free survival (LRFS), multiple myeloma-free survival (MMFS), progression-free survival (PFS) and overall survival (OS) rates were 79.8%, 78.6%, 65.2% and 76.0%, respectively. Univariate analysis revealed that RT was a favourable factor for all examined endpoints. Furthermore, head and neck EMPs were associated with superior LRFS, MMFS and PFS. Tumor size <4 cm was associated with superior MMFS, PFS and OS; serum M protein negativity was associated with superior MMFS and PFS; age ≥50 years and local recurrence were associated with poor MMFS. The dose ≥45 Gy group exhibited superior 5-year LRFS, MMFS and PFS rates (94.7%, 94.4%, 90.0%, respectively), while the corresponding values for the dose <45 Gy group were 62.5% (P=0.008), 53.3% (P=0.036) and 41.7% (P<0.001). Involved-site RT of at least 45 Gy should be considered for EMP. Furthermore, patients with head and neck EMP, tumor size <4 cm, age <50 years and serum M protein negativity had better outcomes.

  16. Analysis of risk factors in the development of retinopathy of prematurity.

    PubMed

    Knezević, Sanja; Stojanović, Nadezda; Oros, Ana; Savić, Dragana; Simović, Aleksandra; Knezević, Jasmina

    2011-01-01

    Retinopathy of prematurity (ROP) is a multifactorial disease that occurs most frequently in very small and very sick preterm infants, and it has been identified as the major cause of childhood blindness. The aim of this study was to evaluate ROP incidence and risk factors associated with varying degrees of illness. The study was conducted at the Centre for Neonatology, Paediatric Clinic of the Clinical Centre Kragujevac, Serbia, in the period from June 2006 to December 2008. Ophthalmologic screening was performed in all children with body weight lower than 2000 g or gestational age lower than 36 weeks. We analyzed eighteen postnatal and six perinatal risk factors and the group correlations for each of the risk factors. Out of 317 children that were screened, 56 (17.7%) developed a mild form of ROP, while 68 (21.5%) developed a severe form. Univariate analysis revealed a large number of statistically significant risk factors for the development of ROP, especially the severe form. Multivariate logistical analysis further separated two independent risk factors: small birth weight (p = 0.001) and damage of central nervous system (p = 0.01). Independent risk factors for transition from mild to severe forms of ROP were identified as: small birth weight (p = 0.05) and perinatal risk factors (p = 0.02). Small birth weight and central nervous system damage were risk factors for the development of ROP, perinatal risk factors were identified as significant for transition from mild to severe form of ROP.

  17. Cervical spine mobilisation forces applied by physiotherapy students.

    PubMed

    Snodgrass, Suzanne J; Rivett, Darren A; Robertson, Val J; Stojanovski, Elizabeth

    2010-06-01

    Postero-anterior (PA) mobilisation is commonly used in cervical spine treatment and included in physiotherapy curricula. The manual forces that students apply while learning cervical mobilisation are not known. Quantifying these forces informs the development of strategies for learning to apply cervical mobilisation effectively and safely. This study describes the mechanical properties of cervical PA mobilisation techniques applied by students, and investigates factors associated with force application. Physiotherapy students (n=120) mobilised one of 32 asymptomatic subjects. Students applied Grades I to IV central and unilateral PA mobilisation to C2 and C7 of one asymptomatic subject. Manual forces were measured in three directions using an instrumented treatment table. Spinal stiffness of mobilised subjects was measured at C2 and C7 using a device that applied a standard oscillating force while measuring this force and its concurrent displacement. Analysis of variance was used to determine differences between techniques and grades, intraclass correlation coefficients (ICC) were used to calculate the inter- and intrastudent repeatability of forces, and linear regression was used to determine the associations between applied forces and characteristics of students and mobilised subjects. Mobilisation forces increased from Grades I to IV (highest mean peak force, Grade IV C7 central PA technique: 63.7N). Interstudent reliability was poor [ICC(2,1)=0.23, 95% confidence interval (CI) 0.14 to 0.43], but intrastudent repeatability of forces was somewhat better (0.83, 95% CI 0.81 to 0.86). Higher applied force was associated with greater C7 stiffness, increased frequency of thumb pain, male gender of the student or mobilised subject, and a student being earlier in their learning process. Lower forces were associated with greater C2 stiffness. This study describes the cervical mobilisation forces applied by students, and the characteristics of the student and mobilised

  18. The Analysis of the Impact of Individual Weighting Factor on Individual Scores

    ERIC Educational Resources Information Center

    Kilic, Gulsen Bagci; Cakan, Mehtap

    2006-01-01

    In this study, category-based self and peer assessment were applied twice in a semester in an Elementary Science Teaching Methods course in order to assess individual contributions of group members to group projects as well as to analyze the impact of Individual Weighting Factors (IWF) on individual scores and individual grades. IWF were…

  19. Moment analysis method as applied to the 2S --> 2P transition in cryogenic alkali metal/rare gas matrices.

    PubMed

    Terrill Vosbein, Heidi A; Boatz, Jerry A; Kenney, John W

    2005-12-22

    The moment analysis method (MA) has been tested for the case of 2S --> 2P ([core]ns1 --> [core]np1) transitions of alkali metal atoms (M) doped into cryogenic rare gas (Rg) matrices using theoretically validated simulations. Theoretical/computational M/Rg system models are constructed with precisely defined parameters that closely mimic known M/Rg systems. Monte Carlo (MC) techniques are then employed to generate simulated absorption and magnetic circular dichroism (MCD) spectra of the 2S --> 2P M/Rg transition to which the MA method can be applied with the goal of seeing how effective the MA method is in re-extracting the M/Rg system parameters from these known simulated systems. The MA method is summarized in general, and an assessment is made of the use of the MA method in the rigid shift approximation typically used to evaluate M/Rg systems. The MC-MCD simulation technique is summarized, and validating evidence is presented. The simulation results and the assumptions used in applying MA to M/Rg systems are evaluated. The simulation results on Na/Ar demonstrate that the MA method does successfully re-extract the 2P spin-orbit coupling constant and Landé g-factor values initially used to build the simulations. However, assigning physical significance to the cubic and noncubic Jahn-Teller (JT) vibrational mode parameters in cryogenic M/Rg systems is not supported.

  20. Population-Based Questionnaire Survey on Health Effects of Aircraft Noise on Residents Living around U.S. Airfields in the RYUKYUS—PART II: AN Analysis of the Discriminant Score and the Factor Score

    NASA Astrophysics Data System (ADS)

    HIRAMATSU, K.; MATSUI, T.; MIYAKITA, T.; ITO, A.; TOKUYAMA, T.; OSADA, Y.; YAMAMOTO, T.

    2002-02-01

    Discriminant function values for psychosomatic disorder and neurosis are calculated using the 12 scale scores of the Todai Health Index, a general health questionnaire, obtained in the survey done around the Kadena and Futenma U.S. airfields in Okinawa, Japan. The total number of answers available for the analysis is 6301. Factor analysis is applied to the 12 scale scores by means of the principal factor method, and Oblimin rotation is used because the extracted factors are considered likely to correlate with each other to a greater or lesser extent. The logistic regression analysis is performed with the independent variables of discriminant function (DF) values and factor scores and with the dependent variables of Ldn, age (six levels), sex, occupation (four categories) and the interaction of age and sex. Results indicate that the odds ratios of the DF values regarding psychosomatic disorder and of the somatic factor score show a clear dose-response relationship. The odds ratios of the DF value of neurosis and of the mental factor score increase in the area where noise exposure is very intense.
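
    Odds ratios of the kind reported above come from exponentiating logistic-regression coefficients. The statsmodels sketch below is schematic only: the variable names, their roles as outcome and predictors, and the synthetic data are placeholders chosen for illustration, not the survey's actual model specification.

      # Logistic regression sketch: exponentiated coefficients give the odds ratios
      # (e.g. per category of noise exposure). The data frame here is synthetic.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      df = pd.DataFrame({"ldn_category": rng.integers(0, 5, 1000),   # noise exposure level
                         "age_group": rng.integers(0, 6, 1000),
                         "sex": rng.integers(0, 2, 1000)})
      # synthetic binary outcome whose log-odds rise with noise exposure
      logit_p = -1.0 + 0.3 * df.ldn_category + 0.1 * df.sex
      df["psychosomatic"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

      X = sm.add_constant(df[["ldn_category", "age_group", "sex"]])
      res = sm.Logit(df["psychosomatic"], X).fit(disp=False)
      odds_ratios = pd.DataFrame({"OR": np.exp(res.params),
                                  "2.5%": np.exp(res.conf_int()[0]),
                                  "97.5%": np.exp(res.conf_int()[1])})
      print(odds_ratios.round(2))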