Bayesian Factor Analysis as a Variable Selection Problem: Alternative Priors and Consequences
Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric
2016-01-01
Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, a Bayesian structural equation modeling (BSEM) approach (Muthén & Asparouhov, 2012) has been proposed as a way to explore the presence of cross-loadings in CFA models. We show that the issue of determining factor loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov’s approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike and slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set (Byrne, 2012; Pettegrew & Wolf, 1982) is used to demonstrate our approach. PMID:27314566
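In variable-selection terms, the contrast between the two priors on a cross-loading can be sketched as follows (a schematic under common formulations; the hyperparameter names are illustrative, and exact settings follow the cited papers):

```latex
% BSEM-RP: ridge-type, informative small-variance normal prior
\lambda_{jk} \sim \mathcal{N}(0,\, \sigma_0^2), \qquad \sigma_0^2 \ \text{small (e.g., } 0.01\text{)}

% BSEM-SSP: spike-and-slab mixture with a binary inclusion indicator
\lambda_{jk} \mid \gamma_{jk} \sim (1-\gamma_{jk})\,\mathcal{N}(0,\, \sigma_{\mathrm{spike}}^2)
  + \gamma_{jk}\,\mathcal{N}(0,\, \sigma_{\mathrm{slab}}^2),
\qquad \gamma_{jk} \sim \mathrm{Bernoulli}(\pi)
```

The posterior inclusion probability of each indicator then signals whether a cross-loading is needed, which is what makes the spike-and-slab formulation a one-stage selection procedure.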
Bayesian Exploratory Factor Analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517
ERIC Educational Resources Information Center
Raykov, Tenko; Little, Todd D.
1999-01-01
Describes a method for evaluating results of Procrustean rotation to a target factor pattern matrix in exploratory factor analysis. The approach, based on the bootstrap method, yields empirical approximations of the sampling distributions of: (1) differences between target elements and rotated factor pattern matrices; and (2) the overall…
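A minimal sketch of such a bootstrap scheme (function and variable names are hypothetical; scikit-learn's factor extraction and SciPy's orthogonal Procrustes solver stand in for whatever specific choices the authors made):

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes
from sklearn.decomposition import FactorAnalysis

def bootstrap_procrustes(X, target, n_factors, n_boot=500, seed=0):
    """Empirical sampling distribution of element-wise differences between
    a target pattern matrix and Procrustes-rotated factor loadings."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    diffs = []
    for _ in range(n_boot):
        boot = X[rng.integers(0, n, size=n)]                 # resample cases
        L = FactorAnalysis(n_components=n_factors).fit(boot).components_.T
        R, _ = orthogonal_procrustes(L, target)              # rotate toward target
        diffs.append(L @ R - target)
    return np.stack(diffs)   # percentiles give element-wise confidence bounds
```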
Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling
NASA Astrophysics Data System (ADS)
Wada, Yoshihisa; Tsuji, Hiroshi
To analyze the success/failure factors in offshore software development services by structural equation modeling, this paper proposes to follow two approaches together: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former serves to generate and verify hypotheses for finding factors and causalities. The latter serves to verify factors introduced by theory, building the model without heuristics. Applying the combined approaches to questionnaire responses from skilled project managers, this paper found that vendor properties have a stronger causal effect on success than software properties and project properties.
Boundary formulations for sensitivity analysis without matrix derivatives
NASA Technical Reports Server (NTRS)
Kane, J. H.; Guru Prasad, K.
1993-01-01
A new hybrid approach to continuum structural shape sensitivity analysis employing boundary element analysis (BEA) is presented. The approach uses iterative reanalysis to obviate the need to factor perturbed matrices in the determination of surface displacement and traction sensitivities via a univariate perturbation/finite difference (UPFD) step. The UPFD approach makes it possible to immediately reuse existing subroutines for computation of BEA matrix coefficients in the design sensitivity analysis process. The reanalysis technique computes economical response of univariately perturbed models without factoring perturbed matrices. The approach provides substantial computational economy without the burden of a large-scale reprogramming effort.
Determination of apparent coupling factors for adhesive bonded acrylic plates using SEAL approach
NASA Astrophysics Data System (ADS)
Pankaj, Achuthan. C.; Shivaprasad, M. V.; Murigendrappa, S. M.
2018-04-01
Apparent coupling loss factors (CLF) and velocity responses have been computed for two lap-joined, adhesive-bonded plates using finite element analysis and an experimental statistical energy analysis (SEA)-like approach. A finite element model of the plates was created using the ANSYS software. The statistical energy parameters were computed using the velocity responses obtained from a harmonic forced-excitation analysis. Experiments were carried out for two different cases of adhesive-bonded joints, and the results were compared with the apparent coupling factors and velocity responses obtained from the finite element analysis. The results signify the importance of modeling adhesive-bonded joints when computing apparent coupling factors, and subsequently energies and velocity responses, using an SEA-like approach.
Bayesian structural equation modeling: a more flexible representation of substantive theory.
Muthén, Bengt; Asparouhov, Tihomir
2012-09-01
This article proposes a new approach to factor analysis and structural equation modeling using Bayesian analysis. The new approach replaces parameter specifications of exact zeros with approximate zeros based on informative, small-variance priors. It is argued that this produces an analysis that better reflects substantive theories. The proposed Bayesian approach is particularly beneficial in applications where parameters are added to a conventional model such that a nonidentified model is obtained if maximum-likelihood estimation is applied. This approach is useful for measurement aspects of latent variable modeling, such as with confirmatory factor analysis, and the measurement part of structural equation modeling. Two application areas are studied, cross-loadings and residual correlations in confirmatory factor analysis. An example using a full structural equation model is also presented, showing an efficient way to find model misspecification. The approach encompasses 3 elements: model testing using posterior predictive checking, model estimation, and model modification. Monte Carlo simulations and real data are analyzed using Mplus. The real-data analyses use data from Holzinger and Swineford's (1939) classic mental abilities study, Big Five personality factor data from a British survey, and science achievement data from the National Educational Longitudinal Study of 1988.
Adjusted Analyses in Studies Addressing Therapy and Harm: Users' Guides to the Medical Literature.
Agoritsas, Thomas; Merglen, Arnaud; Shah, Nilay D; O'Donnell, Martin; Guyatt, Gordon H
2017-02-21
Observational studies almost always have bias because prognostic factors are unequally distributed between patients exposed or not exposed to an intervention. The standard approach to dealing with this problem is adjusted or stratified analysis. Its principle is to use measurement of risk factors to create prognostically homogeneous groups and to combine effect estimates across groups. The purpose of this Users' Guide is to introduce readers to fundamental concepts underlying adjustment as a way of dealing with prognostic imbalance and to the basic principles and relative trustworthiness of various adjustment strategies. One alternative to the standard approach is propensity analysis, in which groups are matched according to the likelihood of membership in exposed or unexposed groups. Propensity methods can deal with multiple prognostic factors, even if there are relatively few patients having outcome events. However, propensity methods do not address other limitations of traditional adjustment: investigators may not have measured all relevant prognostic factors (or not accurately), and unknown factors may bias the results. A second approach, instrumental variable analysis, relies on identifying a variable associated with the likelihood of receiving the intervention but not associated with any prognostic factor or with the outcome (other than through the intervention); this could mimic randomization. However, as with assumptions of other adjustment approaches, it is never certain if an instrumental variable analysis eliminates bias. Although all these approaches can reduce the risk of bias in observational studies, none replace the balance of both known and unknown prognostic factors offered by randomization.
Shaik, Shaffi Ahamed; Almarzuqi, Ahmed; Almogheer, Rakan; Alharbi, Omar; Jalal, Abdulaziz; Alorainy, Majed
2017-08-17
To assess the learning approaches of 1st-, 2nd-, and 3rd-year medical students using the revised two-factor study process questionnaire, and to assess the reliability and validity of the questionnaire. This cross-sectional study was conducted at the College of Medicine, Riyadh, Saudi Arabia in 2014. The revised two-factor study process questionnaire (R-SPQ-2F) was completed by 610 medical students of both genders from the foundation (first year), central nervous system (second year), and medicine and surgery (third year) courses. The study process was evaluated by computing mean scores of the two study approaches (deep and surface) using Student's t-test and one-way analysis of variance. The internal consistency and construct validity of the questionnaire were assessed using Cronbach's α and factor analysis. The mean score for the deep approach was significantly higher than for the surface approach among participants (t(770) = 7.83, p < 0.001) across the four courses. Mean deep-approach scores were significantly higher among participants with a higher grade point average (F(2,768) = 13.31, p = 0.001) and among participants reporting more study hours (F(2,768) = 20.08, p = 0.001). A Cronbach's α of 0.70 indicates good internal consistency of the questionnaire. Factor analysis confirmed the two factors (deep and surface approaches) of the R-SPQ-2F. The deep approach to learning was the primary approach among 1st-, 2nd- and 3rd-year King Saud University medical students. This study confirms the reliability and validity of the revised two-factor study process questionnaire. Medical educators could use the results of such studies to make required changes in the curriculum.
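For reference, the internal-consistency statistic reported here follows the standard formula α = k/(k−1)·(1 − Σs²ᵢ/s²_total). A small illustrative implementation (not the authors' code):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total score
    return k / (k - 1) * (1 - item_var / total_var)
```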
Analysis of the regulation of viral transcription.
Gloss, Bernd; Kalantari, Mina; Bernard, Hans-Ulrich
2005-01-01
Despite the small genomes and number of genes of papillomaviruses, regulation of their transcription is very complex and governed by numerous transcription factors, cis-responsive elements, and epigenetic phenomena. This chapter describes strategies for approaching a systematic analysis of these factors, elements, and mechanisms. From the numerous techniques useful for studying transcription, we describe in detail three selected protocols that have been relevant in shaping our knowledge of human papillomavirus transcription. These are DNase I protection ("footprinting") for locating transcription-factor binding sites, electrophoretic mobility shifts ("gelshifts") for analysis of bound transcription factors, and bisulfite sequencing for analysis of DNA methylation as a prerequisite for epigenetic transcriptional regulation.
A Comparison of Component and Factor Patterns: A Monte Carlo Approach.
ERIC Educational Resources Information Center
Velicer, Wayne F.; And Others
1982-01-01
Factor analysis, image analysis, and principal component analysis are compared with respect to the factor patterns they would produce under various conditions. The general conclusion that is reached is that the three methods produce results that are equivalent. (Author/JKS)
NASA Astrophysics Data System (ADS)
Razavi, S.; Gupta, H. V.
2014-12-01
Sensitivity analysis (SA) is an important paradigm in the context of Earth System model development and application, and provides a powerful tool that serves several essential functions in modelling practice, including 1) Uncertainty Apportionment - attribution of total uncertainty to different uncertainty sources, 2) Assessment of Similarity - diagnostic testing and evaluation of similarities between the functioning of the model and the real system, 3) Factor and Model Reduction - identification of non-influential factors and/or insensitive components of model structure, and 4) Factor Interdependence - investigation of the nature and strength of interactions between the factors, and the degree to which factors intensify, cancel, or compensate for the effects of each other. A variety of sensitivity analysis approaches have been proposed, each of which formally characterizes a different "intuitive" understanding of what is meant by the "sensitivity" of one or more model responses to its dependent factors (such as model parameters or forcings). These approaches are based on different philosophies and theoretical definitions of sensitivity, and range from simple local derivatives and one-factor-at-a-time procedures to rigorous variance-based (Sobol-type) approaches. In general, each approach focuses on, and identifies, different features and properties of the model response and may therefore lead to different (even conflicting) conclusions about the underlying sensitivity. This presentation revisits the theoretical basis for sensitivity analysis, and critically evaluates existing approaches so as to demonstrate their flaws and shortcomings. With this background, we discuss several important properties of response surfaces that are associated with the understanding and interpretation of sensitivity. Finally, a new approach towards global sensitivity assessment is developed that is consistent with important properties of Earth System model response surfaces.
Evaluation of Parallel Analysis Methods for Determining the Number of Factors
ERIC Educational Resources Information Center
Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.
2010-01-01
Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…
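The logic of parallel analysis condenses to a few lines: compare observed eigenvalues against the mean or 95th percentile of eigenvalues from random data of the same dimensions. A PA-PCA sketch (the PA-PAF variant would instead factor a reduced correlation matrix with communality estimates on the diagonal):

```python
import numpy as np

def parallel_analysis_pca(data, n_sims=1000, percentile=95, seed=0):
    """Horn's parallel analysis with PCA eigenvalues (PA-PCA): retain
    leading components whose observed eigenvalues exceed the criterion
    from random normal data of the same size."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    observed = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    sims = np.empty((n_sims, p))
    for i in range(n_sims):
        noise = rng.standard_normal((n, p))
        sims[i] = np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False))[::-1]
    criterion = np.percentile(sims, percentile, axis=0)  # or sims.mean(axis=0)
    retained = 0
    for obs, crit in zip(observed, criterion):   # stop at first failure
        if obs <= crit:
            break
        retained += 1
    return retained
```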
ERIC Educational Resources Information Center
Dimitrov, Dimiter M.
2017-01-01
This article offers an approach to examining differential item functioning (DIF) under its item response theory (IRT) treatment in the framework of confirmatory factor analysis (CFA). The approach is based on integrating IRT- and CFA-based testing of DIF and using bias-corrected bootstrap confidence intervals with a syntax code in Mplus.
Modelling and analysis of FMS productivity variables by ISM, SEM and GTMA approach
NASA Astrophysics Data System (ADS)
Jain, Vineet; Raj, Tilak
2014-09-01
Productivity has often been cited as a key factor in flexible manufacturing system (FMS) performance, and actions to increase it are said to improve profitability and the wage-earning capacity of employees. Improving productivity is seen as a key issue for the long-term survival and success of a manufacturing system. The purpose of this paper is to model and analyze the productivity variables of FMS. The study combines several approaches, viz. interpretive structural modelling (ISM), structural equation modelling (SEM), the graph theory and matrix approach (GTMA), and a cross-sectional survey of manufacturing firms in India. ISM was used to develop a model of the productivity variables, which was then analyzed. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) are powerful statistical techniques; CFA is carried out via SEM. EFA was applied to extract the factors in FMS using the Statistical Package for the Social Sciences (SPSS 20) software, and these factors were confirmed by CFA using the Analysis of Moment Structures (AMOS 20) software. Twenty productivity variables were identified from the literature, and four factors involved in FMS productivity were extracted: people, quality, machine and flexibility. SEM in AMOS 20 was used to estimate the first-order four-factor structure. GTMA is a multiple attribute decision making (MADM) methodology used to quantify the intensity of productivity variables in an organization. An FMS productivity index is proposed to quantify the factors that affect FMS performance.
NASA Astrophysics Data System (ADS)
Zumpano, Veronica; Balteanu, Dan; Mazzorana, Bruno; Micu, Mihai
2014-05-01
It is increasingly important to provide stakeholders with tools that enable them to better understand the state of the environment in which they live and which they manage, and to help them make decisions that minimize the consequences of hydro-meteorological hazards. Very often, however, quantitative studies, especially for large areas, are difficult to perform, because the numerous data required for the analysis are often unavailable. In addition, it has been shown that in scenario analysis deterministic approaches are often unable to detect some features of the system, revealing unexpected behaviors and resulting in the underestimation or omission of some impact factors. Here we present preliminary results obtained by applying Formative Scenario Analysis, a possible solution for landslide risk analysis in cases where the required data, even if they exist, are not available. This expert-based approach integrates intuitions and qualitative evaluations of impact factors with a quantitative analysis of the relations between these factors: a group of experts with different but pertinent expertise determines, by a rating procedure, quantitative relations between the factors, and scenarios describing a certain state of the system are then obtained through mathematical operations. The approach is applied to Buzau County (Romania), an area belonging to the Curvature Romanian Carpathians and Subcarpathians, a region strongly affected by environmental hazards. The region has previously experienced numerous episodes of severe hydro-meteorological events that caused considerable damage (1975, 2005, 2006). In this application we refer only to one type of landslide, which can be described as shallow to medium-seated with a (mainly) translational movement ranging from slide to flow. The material involved can be soil, debris or a mixture of both; in the Romanian literature these movements have been described as alunecare curgatoare. The Formative Scenario Analysis approach is applied to each component of risk (H, V and A), and the acquired states are then combined to obtain a series of alternative risk scenarios. The approach is structured in two main sections, corresponding to a level of influence of conditioning factors and a response; the latter contains the results of the formative scenario approach trained with the conditioning factors of the first level. These factors are divided into two subsets representing two levels of influence: k=1 comprises the global factors, while k=2 contains the local factors. To include uncertainty estimation in the analysis, the knowledge-representation method of type-1 fuzzy sets is introduced, so that decisions made by experts on certain events are expressed as triangular fuzzy numbers.
Kumar, Keshav
2017-11-01
Multivariate curve resolution alternating least squares (MCR-ALS) analysis is the most commonly used curve resolution technique. The MCR-ALS model is fitted using the alternating least squares (ALS) algorithm, which needs initialisation of either the contribution profiles or the spectral profiles of each factor. The contribution profiles can be initialised using evolving factor analysis; in principle, however, this approach requires that the data belong to a sequential process. The initialisation of the spectral profiles is usually carried out using a pure-variable approach such as the SIMPLISMA algorithm, which demands that each factor have pure variables in the data set. Despite these limitations, the existing approaches have been quite successful for initiating MCR-ALS analysis. The present work, however, proposes an alternative approach for initialising the spectral variables by generating random variables within the limits spanned by the maxima and minima of each spectral variable of the data set. The proposed approach requires neither pure variables for each component of the multicomponent system nor a concentration direction that follows a sequential process. The approach is successfully validated using excitation-emission matrix fluorescence data sets acquired for certain fluorophores with significant spectral overlap. The calculated contribution and spectral profiles of these fluorophores correlate well with the experimental results. In summary, the present work proposes an alternative way to initiate MCR-ALS analysis.
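A compact sketch of the proposed initialisation together with a bare-bones ALS loop (illustrative only; a real MCR-ALS implementation would use proper non-negative least squares and convergence checks):

```python
import numpy as np

def random_spectral_init(D, n_factors, seed=0):
    """Draw each spectral variable uniformly between its observed minimum
    and maximum, as proposed above, instead of using pure-variable
    methods such as SIMPLISMA."""
    rng = np.random.default_rng(seed)
    lo, hi = D.min(axis=0), D.max(axis=0)        # per-variable limits
    return rng.uniform(lo, hi, size=(n_factors, D.shape[1]))

def mcr_als(D, S, n_iter=200):
    """Alternate least-squares updates for contributions C and spectra S
    so that D (samples x variables) ~ C @ S; clipping is a crude stand-in
    for a non-negativity constraint."""
    for _ in range(n_iter):
        C = np.clip(D @ np.linalg.pinv(S), 0.0, None)
        S = np.clip(np.linalg.pinv(C) @ D, 0.0, None)
    return C, S
```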
Eastwood, John Graeme; Jalaludin, Bin Badrudin; Kemp, Lynn Ann; Phung, Hai Ngoc
2014-01-01
We have previously reported in this journal on an ecological study of perinatal depressive symptoms in South Western Sydney. In that article, we briefly reported on a factor analysis that was utilized to identify empirical indicators for analysis. In this article, we report on the mixed method approach that was used to identify those latent variables. Social epidemiology has been slow to embrace a latent variable approach to the study of social, political, economic, and cultural structures and mechanisms, partly for philosophical reasons. Critical realist ontology and epistemology have been advocated as an appropriate methodological approach to both theory building and theory testing in the health sciences. We describe here an emergent mixed method approach that uses qualitative methods to identify latent constructs followed by factor analysis using empirical indicators chosen to measure identified qualitative codes. Comparative analysis of the findings is reported together with a limited description of realist approaches to abstract reasoning.
Item Factor Analysis: Current Approaches and Future Directions
ERIC Educational Resources Information Center
Wirth, R. J.; Edwards, Michael C.
2007-01-01
The rationale underlying factor analysis applies to continuous and categorical variables alike; however, the models and estimation methods for continuous (i.e., interval or ratio scale) data are not appropriate for item-level data that are categorical in nature. The authors provide a targeted review and synthesis of the item factor analysis (IFA)…
Stabilization and robustness of non-linear unity-feedback system - Factorization approach
NASA Technical Reports Server (NTRS)
Desoer, C. A.; Kabuli, M. G.
1988-01-01
The paper is a self-contained discussion of a right factorization approach in the stability analysis of the nonlinear continuous-time or discrete-time, time-invariant or time-varying, well-posed unity-feedback system S1(P, C). It is shown that a well-posed stable feedback system S1(P, C) implies that P and C have right factorizations. In the case where C is stable, P has a normalized right-coprime factorization. The factorization approach is used in stabilization and simultaneous stabilization results.
ERIC Educational Resources Information Center
Marsh, Herbert W.; Nagengast, Benjamin; Morin, Alexandre J. S.
2013-01-01
This substantive-methodological synergy applies evolving approaches to factor analysis to substantively important developmental issues of how five-factor-approach (FFA) personality measures vary with gender, age, and their interaction. Confirmatory factor analyses (CFAs) conducted at the item level often do not support a priori FFA structures, due…
ERIC Educational Resources Information Center
Rodrigues Jr., Jose Florencio; Lehmann, Angela Valeria Levay; Fleith, Denise De Souza
2005-01-01
Building on previous studies centred on the interaction between adviser and advisee in masters thesis projects, in which a qualitative approach was used, the present study uses factor analysis to identify the factors that determine either a successful or unsuccessful outcome for the masters thesis project. There were five factors relating to the…
Hierarchical Factoring Based On Image Analysis And Orthoblique Rotations.
Stankov, L
1979-07-01
The procedure for hierarchical factoring suggested by Schmid and Leiman (1957) is applied within the framework of image analysis and orthoblique rotational procedures. It is shown that this approach necessarily leads to correlated higher order factors. Also, one can obtain a smaller number of factors than produced by typical hierarchical procedures.
A Markov Chain Monte Carlo Approach to Confirmatory Item Factor Analysis
ERIC Educational Resources Information Center
Edwards, Michael C.
2010-01-01
Item factor analysis has a rich tradition in both the structural equation modeling and item response theory frameworks. The goal of this paper is to demonstrate a novel combination of various Markov chain Monte Carlo (MCMC) estimation routines to estimate parameters of a wide variety of confirmatory item factor analysis models. Further, I show…
Developing Multidimensional Likert Scales Using Item Factor Analysis: The Case of Four-Point Items
ERIC Educational Resources Information Center
Asún, Rodrigo A.; Rdz-Navarro, Karina; Alvarado, Jesús M.
2016-01-01
This study compares the performance of two approaches in analysing four-point Likert rating scales with a factorial model: the classical factor analysis (FA) and the item factor analysis (IFA). For FA, maximum likelihood and weighted least squares estimations using Pearson correlation matrices among items are compared. For IFA, diagonally weighted…
The Dispositions for Culturally Responsive Pedagogy Scale
ERIC Educational Resources Information Center
Whitaker, Manya C.; Valtierra, Kristina Marie
2018-01-01
Purpose: The purpose of this study is to develop and validate the dispositions for culturally responsive pedagogy scale (DCRPS). Design/methodology/approach: Scale development consisted of a six-step process including item development, expert review, exploratory factor analysis, factor interpretation, confirmatory factor analysis and convergent…
Historical Evolution of Old-Age Mortality and New Approaches to Mortality Forecasting
Gavrilov, Leonid A.; Gavrilova, Natalia S.; Krut'ko, Vyacheslav N.
2017-01-01
Knowledge of future mortality levels and trends is important for actuarial practice but poses a challenge to actuaries and demographers. The Lee-Carter method, currently used for mortality forecasting, is based on the assumption that the historical evolution of mortality at all age groups is driven by one factor only. This approach cannot capture an additive manner of mortality decline observed before the 1960s. To overcome the limitation of the one-factor model of mortality and to determine the true number of factors underlying mortality changes over time, we suggest a new approach to mortality analysis and forecasting based on the method of latent variable analysis. The basic assumption of this approach is that most variation in mortality rates over time is a manifestation of a small number of latent variables, variation in which gives rise to the observed mortality patterns. To extract major components of mortality variation, we apply factor analysis to mortality changes in developed countries over the period of 1900–2014. Factor analysis of time series of age-specific death rates in 12 developed countries (data taken from the Human Mortality Database) identified two factors capable of explaining almost 94 to 99 percent of the variance in the temporal changes of adult death rates at ages 25 to 85 years. Analysis of these two factors reveals that the first factor is a “young-age” or background factor with high factor loadings at ages 30 to 45 years. The second factor can be called an “old-age” or senescent factor because of high factor loadings at ages 65 to 85 years. It was found that the senescent factor was relatively stable in the past but now is rapidly declining for both men and women. The decline of the senescent factor is faster for men, although in most countries, it started almost 30 years later. Factor analysis of time series of age-specific death rates conducted for the oldest-old ages (65 to 100 years) found two factors explaining variation of mortality at extremely old ages in the United States. The first factor is comparable to the senescent factor found for adult mortality. The second factor, however, is specific to extreme old ages (96 to 100 years) and shows peaks in 1960 and 2000. Although mortality below 90 to 95 years shows a steady decline with time driven by the senescent factor, mortality of centenarians does not decline and remains relatively stable. The approach suggested in this paper has several advantages. First, it is able to determine the total number of independent factors affecting mortality changes over time. Second, this approach allows researchers to determine the time interval in which underlying factors remain stable or undergo rapid changes. Most methods of mortality projections are not able to identify the best base period for mortality projections, attempting to use the longest-possible time period instead. We observe that the senescent factor of mortality continues to decline, and this decline does not demonstrate any indications of slowing down. At the same time, mortality of centenarians does not decline and remains stable. The lack of mortality decline at extremely old ages may diminish anticipated longevity gains in the future. PMID:29170765
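The core computation here is ordinary factor analysis applied to a years-by-ages matrix of death rates. A rough sketch of that step (data layout and preprocessing are assumptions; the authors' extraction and rotation details may differ):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

def mortality_factor_loadings(rates, n_factors=2):
    """rates: (n_years, n_ages) age-specific death rates, e.g. ages 25-85
    from the Human Mortality Database; years are cases, ages are variables."""
    X = np.log(rates)                              # log death rates
    X = (X - X.mean(axis=0)) / X.std(axis=0)       # standardise each age series
    fa = FactorAnalysis(n_components=n_factors).fit(X)
    return fa.components_.T                        # (n_ages, n_factors) loadings

# High loadings at ages 30-45 on one factor (the "young-age"/background
# factor) and at ages 65-85 on the other (the "old-age"/senescent factor)
# would reproduce the two-factor pattern described above.
```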
Experienced quality factors: qualitative evaluation approach to audiovisual quality
NASA Astrophysics Data System (ADS)
Jumisko-Pyykkö, Satu; Häkkinen, Jukka; Nyman, Göte
2007-02-01
Subjective evaluation is used to identify impairment factors of multimedia quality. The final quality is often formulated via quantitative experiments, but this approach has its constraints, as subjects' quality interpretations, experiences and quality evaluation criteria are disregarded. To identify these quality evaluation factors, this study examined qualitatively the criteria participants used to evaluate audiovisual video quality. A semi-structured interview was conducted with 60 participants after a subjective audiovisual quality evaluation experiment. The assessment compared several relatively low audio-video bitrate ratios with five different television contents on a mobile device. In the analysis, methodological triangulation (grounded theory, Bayesian networks and correspondence analysis) was applied to approach the qualitative quality. The results showed that the most important evaluation criteria were factors of visual quality, contents, factors of audio quality, usefulness-followability, and audiovisual interaction. Several relations between the quality factors and similarities between the contents were identified. As a methodological recommendation, content- and usage-related factors need to be further examined to improve quality evaluation experiments.
Euerby, Adam; Burns, Catherine M
2014-03-01
Increasingly, people work in socially networked environments. With growing adoption of enterprise social network technologies, supporting effective social community is becoming an important factor in organizational success. Relatively few human factors methods have been applied to social connection in communities. Although team methods provide a contribution, they do not suit design for communities. Wenger's community of practice concept, combined with cognitive work analysis, provided one way of designing for community. We used a cognitive work analysis approach modified with principles for supporting communities of practice to generate a new website design. Over several months, the community using the site was studied to examine their degree of social connectedness and communication levels. Social network analysis and communications analysis, conducted at three different intervals, showed increases in connections between people and between people and organizations, as well as increased communication following the launch of the new design. In this work, we suggest that human factors approaches can be effective in social environments, when applied considering social community principles. This work has implications for the development of new human factors methods as well as the design of interfaces for sociotechnical systems that have community building requirements.
NASA Astrophysics Data System (ADS)
Wang, S.; Huang, G. H.; Veawab, A.
2013-03-01
This study proposes a sequential factorial analysis (SFA) approach for supporting regional air quality management under uncertainty. SFA is capable not only of examining the interactive effects of input parameters, but also of analyzing the effects of constraints. When there are too many factors involved in practical applications, SFA has the advantage of conducting a sequence of factorial analyses for characterizing the effects of factors in a systematic manner. The factor-screening strategy employed in SFA is effective in greatly reducing the computational effort. The proposed SFA approach is applied to a regional air quality management problem for demonstrating its applicability. The results indicate that the effects of factors are evaluated quantitatively, which can help decision makers identify the key factors that have significant influence on system performance and explore the valuable information that may be veiled beneath their interrelationships.
A computational intelligent approach to multi-factor analysis of violent crime information system
NASA Astrophysics Data System (ADS)
Liu, Hongbo; Yang, Chao; Zhang, Meng; McLoone, Seán; Sun, Yeqing
2017-02-01
Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. The relationship between bi-factors has also been extensively studied including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour and as such there is a need to have a greater level of insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). Identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
Carvalho, Carolina Abreu de; Fonsêca, Poliana Cristina de Almeida; Nobre, Luciana Neri; Priore, Silvia Eloiza; Franceschini, Sylvia do Carmo Castro
2016-01-01
The objective of this study is to provide guidance for identifying dietary patterns using the a posteriori approach, and to analyze the methodological aspects of the studies conducted in Brazil that identified the dietary patterns of children. Articles were selected from the Latin American and Caribbean Literature on Health Sciences, Scientific Electronic Library Online and PubMed databases. The key words were: Dietary pattern; Food pattern; Principal Components Analysis; Factor analysis; Cluster analysis; Reduced rank regression. We included studies that identified dietary patterns of children using the a posteriori approach. Seven studies published between 2007 and 2014 were selected, six of which were cross-sectional and one a cohort study. Five studies used a food frequency questionnaire for dietary assessment; one used a 24-hour dietary recall and another a food list. The exploratory method used in most publications was principal components factor analysis, followed by cluster analysis. The sample sizes of the studies ranged from 232 to 4,231, the values of the Kaiser-Meyer-Olkin test from 0.524 to 0.873, and Cronbach's alpha from 0.51 to 0.69. Few Brazilian studies have identified dietary patterns of children using the a posteriori approach, and principal components factor analysis was the technique most often used.
ERIC Educational Resources Information Center
Lorenzo-Seva, Urbano; Ferrando, Pere J.
2013-01-01
FACTOR 9.2 was developed for three reasons. First, exploratory factor analysis (FA) is still an active field of research, although most recent developments have not been incorporated into available programs. Second, there is now renewed interest in semiconfirmatory (SC) solutions as suitable approaches to the complex structures commonly found…
Herskind, Carsten; Talbot, Christopher J.; Kerns, Sarah L.; Veldwijk, Marlon R.; Rosenstein, Barry S.; West, Catharine M. L.
2016-01-01
Adverse reactions in normal tissue after radiotherapy (RT) limit the dose that can be given to tumour cells. Since 80% of individual variation in clinical response is estimated to be caused by patient-related factors, identifying these factors might allow prediction of patients with increased risk of developing severe reactions. While inactivation of cell renewal is considered a major cause of toxicity in early-reacting normal tissues, complex interactions involving multiple cell types, cytokines, and hypoxia seem important for late reactions. Here, we review ‘omics’ approaches such as screening of genetic polymorphisms or gene expression analysis, and assess the potential of epigenetic factors, posttranslational modification, signal transduction, and metabolism. Furthermore, functional assays have suggested possible associations with clinical risk of adverse reaction. Pathway analysis incorporating different ‘omics’ approaches may be more efficient in identifying critical pathways than pathway analysis based on single ‘omics’ data sets. Integrating these pathways with functional assays may be powerful in identifying multiple subgroups of RT patients characterized by different mechanisms. Thus ‘omics’ and functional approaches may synergize if they are integrated into radiogenomics ‘systems biology’ to facilitate the goal of individualised radiotherapy. PMID:26944314
Golino, Hudson F.; Epskamp, Sacha
2017-01-01
The estimation of the correct number of dimensions is a long-standing problem in psychometrics. Several methods have been proposed, such as parallel analysis (PA), Kaiser-Guttman’s eigenvalue-greater-than-one rule, the multiple average partial procedure (MAP), maximum-likelihood approaches that use fit indexes such as the BIC and EBIC, and the less used and studied approach called very simple structure (VSS). In the present paper a new approach to estimating the number of dimensions is introduced and compared via simulation to the traditional techniques noted above. The approach proposed in the current paper is called exploratory graph analysis (EGA), since it is based on the graphical lasso with the regularization parameter specified using EBIC. The number of dimensions is verified using walktrap, a random-walk algorithm used to identify communities in networks. In total, 32,000 data sets were simulated to fit known factor structures, with the data sets varying across different criteria: number of factors (2 and 4), number of items (5 and 10), sample size (100, 500, 1000 and 5000) and correlation between factors (orthogonal, .20, .50 and .70), resulting in 64 different conditions. For each condition, 500 data sets were simulated using lavaan. The results show that EGA performs comparably to parallel analysis, EBIC, eBIC and the Kaiser-Guttman rule in a number of situations, especially when the number of factors was two. However, EGA was the only technique able to correctly estimate the number of dimensions in the four-factor structure when the correlation between factors was .70, showing an accuracy of 100% for a sample size of 5,000 observations. Finally, EGA was used to estimate the number of factors in a real dataset, in order to compare its performance with the other six techniques tested in the simulation study. PMID:28594839
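A rough Python rendering of the EGA pipeline described above (a sketch, not the authors' implementation: scikit-learn's cross-validated graphical lasso substitutes for EBIC-based tuning, and python-igraph supplies the walktrap algorithm):

```python
import numpy as np
import igraph
from sklearn.covariance import GraphicalLassoCV

def ega_dimensions(X):
    """Estimate a sparse Gaussian graphical model over the items, then
    count walktrap communities as the estimated number of dimensions."""
    precision = GraphicalLassoCV().fit(X).precision_
    d = np.sqrt(np.diag(precision))
    pcor = -precision / np.outer(d, d)             # partial correlations
    np.fill_diagonal(pcor, 0.0)
    graph = igraph.Graph.Weighted_Adjacency(np.abs(pcor).tolist(),
                                            mode="undirected", attr="weight")
    clusters = graph.community_walktrap(weights="weight").as_clustering()
    return len(clusters)
```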
Analysis of case-only studies accounting for genotyping error.
Cheng, K F
2007-03-01
The case-only design provides one approach to assess possible interactions between genetic and environmental factors. It has been shown that if these factors are conditionally independent, then a case-only analysis is not only valid but also very efficient. However, a drawback of the case-only approach is that its conclusions may be biased by genotyping errors. In this paper, our main aim is to propose a method for analysis of case-only studies when these errors occur. We show that the bias can be adjusted through the use of internal validation data, which are obtained by genotyping some sampled individuals twice. Our analysis is based on a simple and yet highly efficient conditional likelihood approach. Simulation studies considered in this paper confirm that the new method has acceptable performance under genotyping errors.
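For context, the baseline estimator that genotyping error biases is simple: under gene-environment independence, the G-E odds ratio among cases alone estimates the multiplicative interaction. A sketch of that baseline (the paper's correction via internal validation data and conditional likelihood is not reproduced here):

```python
import numpy as np

def case_only_interaction_or(genotype, exposure):
    """genotype, exposure: 0/1 arrays for cases only. Returns the G-E
    odds ratio among cases, which estimates the multiplicative G x E
    interaction when G and E are independent in the source population
    (and genotypes are measured without error)."""
    g, e = np.asarray(genotype), np.asarray(exposure)
    a = np.sum((g == 1) & (e == 1))   # G+, E+
    b = np.sum((g == 1) & (e == 0))   # G+, E-
    c = np.sum((g == 0) & (e == 1))   # G-, E+
    d = np.sum((g == 0) & (e == 0))   # G-, E-
    return (a * d) / (b * c)
```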
NASA Astrophysics Data System (ADS)
Zubir, S. N. A.; Thiruchelvam, S.; Mustapha, K. N. M.; Che Muda, Z.; Ghazali, A.; Hakimie, H.
2017-12-01
For the past few years, natural disasters have been the subject of debate in disaster management, especially flood disasters. Each year, natural disasters result in significant loss of life, destruction of homes and public infrastructure, and economic hardship. Hence, effective and efficient flood disaster management would ensure that life-saving efforts are not futile. The aim of this article is to examine the relationship of approach, decision maker, influence factor, result, and ethic to decision making for flood disaster management in Malaysia. The key elements of decision making in disaster management were identified from the literature. Questionnaire surveys were administered among lead agencies on the East Coast of Malaysia in the states of Kelantan and Pahang. A total of 307 valid responses were obtained for further analysis. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) were carried out to analyse the measurement model involved in the study. The CFA for the second-order reflective and first-order reflective measurement model indicates that approach, decision maker, influence factor, result, and ethic have a significant and direct effect on decision making during disasters. The results show that decision making during disasters is an important element of disaster management and necessitates successful collaborative decision making. The measurement model is accepted to proceed with further analysis, known as structural equation modeling (SEM), and can be assessed in future research.
Integrating host, natural enemy, and other processes in population models of the pine sawfly
A. A. Sharov
1991-01-01
Explanation of population dynamics is one of the main problems in population ecology. There are two main approaches to the explanation: the factor approach and the dynamic approach. According to the first, an explanation is obtained when the effect of various environmental factors on population density is revealed. Such analysis is performed using well developed...
USDA-ARS?s Scientific Manuscript database
Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...
Structured plant metabolomics for the simultaneous exploration of multiple factors.
Vasilev, Nikolay; Boccard, Julien; Lang, Gerhard; Grömping, Ulrike; Fischer, Rainer; Goepfert, Simon; Rudaz, Serge; Schillberg, Stefan
2016-11-17
Multiple factors act simultaneously on plants to establish complex interaction networks involving nutrients, elicitors and metabolites. Metabolomics offers a better understanding of complex biological systems, but evaluating the simultaneous impact of different parameters on metabolic pathways that have many components is a challenging task. We therefore developed a novel approach that combines experimental design, untargeted metabolic profiling based on multiple chromatography systems and ionization modes, and multiblock data analysis, facilitating the systematic analysis of metabolic changes in plants caused by different factors acting at the same time. Using this method, target geraniol compounds produced in transgenic tobacco cell cultures were grouped into clusters based on their response to different factors. We hypothesized that our novel approach may provide more robust data for process optimization in plant cell cultures producing any target secondary metabolite, based on the simultaneous exploration of multiple factors rather than varying one factor each time. The suitability of our approach was verified by confirming several previously reported examples of elicitor-metabolite crosstalk. However, unravelling all factor-metabolite networks remains challenging because it requires the identification of all biochemically significant metabolites in the metabolomics dataset.
Bayesian Structural Equation Modeling: A More Flexible Representation of Substantive Theory
ERIC Educational Resources Information Center
Muthen, Bengt; Asparouhov, Tihomir
2012-01-01
This article proposes a new approach to factor analysis and structural equation modeling using Bayesian analysis. The new approach replaces parameter specifications of exact zeros with approximate zeros based on informative, small-variance priors. It is argued that this produces an analysis that better reflects substantive theories. The proposed…
Approaches of researches in medical geography in Poland and Ukraine
NASA Astrophysics Data System (ADS)
Pantylej, Wiktoria
2008-01-01
This paper presents a historical review of medical geography in the world, in Poland and in Ukraine. There are different approaches in medical geography: by research subject (ecological and economic approaches) and by current research concerns (approaches concerning sexuality, the age of the population and, accordingly, the accessibility of health care services to the population). In the author's view, the most promising approaches in medical geography in Poland and Ukraine are as follows: - integrative - dedicated to the health status of the population in connection with the quality and level of life; - mathematical-statistical - connected with the problem of synthetic indexes of the health status of populations and the factors influencing it, and with the problem of the economic value of health and life of the population; - social-economic - the analysis of the influence of socioeconomic factors (such as wealth measures, rate of unemployment, work conditions and others) on public health; - ecological - connected with research dedicated to the analysis of environmental impact on the public health status of the population; - demographical - the analysis of demographical factors forming public health status; - social-psychological - the health culture of the population, perception of one's own health/morbidity and the health care systems existing in different countries.
Augustin, Regina; Lichtenthaler, Stefan F.; Greeff, Michael; Hansen, Jens; Wurst, Wolfgang; Trümbach, Dietrich
2011-01-01
The molecular mechanisms and genetic risk factors underlying Alzheimer's disease (AD) pathogenesis are only partly understood. To identify new factors, which may contribute to AD, different approaches are taken including proteomics, genetics, and functional genomics. Here, we used a bioinformatics approach and found that distinct AD-related genes share modules of transcription factor binding sites, suggesting a transcriptional coregulation. To detect additional coregulated genes, which may potentially contribute to AD, we established a new bioinformatics workflow with known multivariate methods like support vector machines, biclustering, and predicted transcription factor binding site modules by using in silico analysis and over 400 expression arrays from human and mouse. Two significant modules are composed of three transcription factor families: CTCF, SP1F, and EGRF/ZBPF, which are conserved between human and mouse APP promoter sequences. The specific combination of in silico promoter and multivariate analysis can identify regulation mechanisms of genes involved in multifactorial diseases. PMID:21559189
Sociotechnical attributes of safe and unsafe work systems.
Kleiner, Brian M; Hettinger, Lawrence J; DeJoy, David M; Huang, Yuang-Hsiang; Love, Peter E D
2015-01-01
Theoretical and practical approaches to safety based on sociotechnical systems principles place heavy emphasis on the intersections between social-organisational and technical-work process factors. Within this perspective, work system design emphasises factors such as the joint optimisation of social and technical processes, a focus on reliable human-system performance and safety metrics as design and analysis criteria, the maintenance of a realistic and consistent set of safety objectives and policies, and regular access to the expertise and input of workers. We discuss three current approaches to the analysis and design of complex sociotechnical systems: human-systems integration, macroergonomics and safety climate. Each approach emphasises key sociotechnical systems themes, and each prescribes a more holistic perspective on work systems than do traditional theories and methods. We contrast these perspectives with historical precedents such as system safety and traditional human factors and ergonomics, and describe potential future directions for their application in research and practice. The identification of factors that can reliably distinguish between safe and unsafe work systems is an important concern for ergonomists and other safety professionals. This paper presents a variety of sociotechnical systems perspectives on intersections between social--organisational and technology--work process factors as they impact work system analysis, design and operation.
Beregovykh, V V; Spitskiy, O R
2014-01-01
A risk-based approach is used to examine the impact of different factors on the quality of medicinal products in technology transfer. A general diagram is offered for carrying out risk analysis in technology transfer from pharmaceutical development to production. When transferring technology to full-scale commercial production, it is necessary to investigate and simulate the application of the production process beforehand under the new, real conditions. The manufacturing process is the core factor for risk analysis, having the greatest impact on the quality attributes of a medicinal product. Further important factors relate to the materials and products to be handled and to manufacturing environmental conditions such as premises, equipment and personnel. The use of the risk-based approach is shown in the design of a multipurpose production facility for medicinal products, where the quantitative risk analysis tool RAMM (Risk Analysis and Mitigation Matrix) was applied.
Cook, David A; Castillo, Richmond M; Gas, Becca; Artino, Anthony R
2017-10-01
Measurement of motivation and cognitive load has potential value in health professions education. Our objective was to evaluate the validity of scores from Dweck's Implicit Theories of Intelligence Scale (ITIS), Elliot's Achievement Goal Questionnaire-Revised (AGQ-R) and Leppink's cognitive load index (CLI). This was a validity study evaluating internal structure using reliability and factor analysis, and relationships with other variables using the multitrait-multimethod matrix. Two hundred and thirty-two secondary school students participated in a medical simulation-based training activity at an academic medical center. Pre-activity ITIS (implicit theory [mindset] domains: incremental, entity) and AGQ-R (achievement goal domains: mastery-approach, mastery-avoidance, performance-approach, performance-avoidance), post-activity CLI (cognitive load domains: intrinsic, extrinsic, germane) and task persistence (self-directed repetitions on a laparoscopic surgery task) were measured. Internal consistency reliability (Cronbach's alpha) was > 0.70 for all domain scores except AGQ-R performance-avoidance (alpha 0.68) and CLI extrinsic load (alpha 0.64). Confirmatory factor analysis of ITIS and CLI scores demonstrated acceptable model fit. Confirmatory factor analysis of AGQ-R scores demonstrated borderline fit, and exploratory factor analysis suggested a three-domain model for achievement goals (mastery-approach, performance and avoidance). Correlations among scores from conceptually-related domains generally aligned with expectations, as follows: ITIS incremental and entity, r = -0.52; AGQ-R mastery-avoidance and performance-avoidance, r = 0.71; mastery-approach and performance-approach, r = 0.55; performance-approach and performance-avoidance, r = 0.43; mastery-approach and mastery-avoidance, r = 0.36; CLI germane and extrinsic, r = -0.35; ITIS incremental and AGQ-R mastery-approach, r = 0.34; ITIS incremental and CLI germane, r = 0.44; AGQ-R mastery-approach and CLI germane, r = 0.48 (all p < 0.001). We found no correlation between the number of task repetitions (i.e. persistence) and mastery-approach scores, r = -0.01. ITIS and CLI scores had appropriate internal structures and relationships with other variables. AGQ-R scores fit a three-factor (not four-factor) model that collapsed avoidance into one domain, although relationships of other variables with the original four domain scores generally aligned with expectations. Mastery goals are positively correlated with germane cognitive load. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
Analysis of time domain reflectometry data from LTPP seasonal monitoring program test sections
DOT National Transportation Integrated Search
1996-07-01
This report documents an approach for designing an Advanced Traffic Management System (ATMS) from a human factors perspective. In doing so, a user-centered, top-down system analysis was conducted. Methodologi...
Using Refined Regression Analysis To Assess The Ecological Services Of Restored Wetlands
A hierarchical approach to regression analysis of wetland water treatment was conducted to determine which factors are the most appropriate for characterizing wetlands of differing structure and function. We used this approach in an effort to identify the types and characteristi...
Scalable non-negative matrix tri-factorization.
Čopar, Andrej; Žitnik, Marinka; Zupan, Blaž
2017-01-01
Matrix factorization is a well-established pattern discovery tool that has seen numerous applications in biomedical data analytics, such as gene expression co-clustering, patient stratification, and gene-disease association mining. Matrix factorization learns a latent data model that takes a data matrix and transforms it into a latent feature space enabling generalization, noise removal and feature discovery. However, factorization algorithms are numerically intensive, and hence there is a pressing challenge to scale current algorithms to work with large datasets. Our focus in this paper is matrix tri-factorization, a popular method that is not limited by the assumption of standard matrix factorization about data residing in one latent space. Matrix tri-factorization solves this by inferring a separate latent space for each dimension in a data matrix, and a latent mapping of interactions between the inferred spaces, making the approach particularly suitable for biomedical data mining. We developed a block-wise approach for latent factor learning in matrix tri-factorization. The approach partitions a data matrix into disjoint submatrices that are treated independently and fed into a parallel factorization system. An appealing property of the proposed approach is its mathematical equivalence with serial matrix tri-factorization. In a study on large biomedical datasets we show that our approach scales well on multi-processor and multi-GPU architectures. On a four-GPU system we demonstrate that our approach can be more than 100-times faster than its single-processor counterpart. A general approach for scaling non-negative matrix tri-factorization is proposed. The approach is especially useful for parallel matrix factorization implemented in a multi-GPU environment. We expect the new approach will be useful in emerging procedures for latent factor analysis, notably for data integration, where many large data matrices need to be collectively factorized.
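For readers wanting a concrete picture of the objective being parallelized, here is a minimal serial sketch of non-negative matrix tri-factorization using standard multiplicative updates; it is our illustration, not the authors' block-wise implementation, and the rank choices and iteration count are arbitrary.

```python
import numpy as np

def nmtf(X, k1, k2, n_iter=300, eps=1e-9, seed=0):
    """Non-negative tri-factorization X ~ U @ S @ V.T via
    multiplicative updates minimizing the Frobenius error."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    U = rng.random((n, k1))
    S = rng.random((k1, k2))
    V = rng.random((m, k2))
    for _ in range(n_iter):
        U *= (X @ V @ S.T) / (U @ S @ V.T @ V @ S.T + eps)
        S *= (U.T @ X @ V) / (U.T @ U @ S @ V.T @ V + eps)
        V *= (X.T @ U @ S) / (V @ S.T @ U.T @ U @ S + eps)
    return U, S, V

rng = np.random.default_rng(1)
X = rng.random((60, 40))                     # non-negative data matrix
U, S, V = nmtf(X, 5, 4)
print(round(np.linalg.norm(X - U @ S @ V.T) / np.linalg.norm(X), 3))
```

The block-wise scheme described in the abstract partitions X into disjoint submatrices and runs updates of this kind on each block in parallel, with results mathematically matching the serial computation.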
Cheng, Feon W; Gao, Xiang; Bao, Le; Mitchell, Diane C; Wood, Craig; Sliwinski, Martin J; Smiciklas-Wright, Helen; Still, Christopher D; Rolston, David D K; Jensen, Gordon L
2017-07-01
To examine the risk factors for developing functional decline and make probabilistic predictions by using a tree-based method that allows higher-order polynomials and interactions of the risk factors. The conditional inference tree analysis, a data mining approach, was used to construct a risk stratification algorithm for developing functional limitation based on BMI and other potential risk factors for disability in 1,951 older adults without functional limitations at baseline (baseline age 73.1 ± 4.2 y). We also analyzed the data with multivariate stepwise logistic regression and compared the two approaches (e.g., cross-validation). Over a mean of 9.2 ± 1.7 years of follow-up, 221 individuals developed functional limitation. Higher BMI, age, and comorbidity were consistently identified as significant risk factors for functional decline using both methods. Based on these factors, individuals were stratified into four risk groups via the conditional inference tree analysis. Compared to the low-risk group, all other groups had a significantly higher risk of developing functional limitation. The odds ratio comparing the two extreme categories was 9.09 (95% confidence interval: 4.68, 17.6). Higher BMI, age, and comorbid disease were consistently identified as significant risk factors for functional decline among older individuals across all approaches and analyses. © 2017 The Obesity Society.
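Conditional inference trees are usually fitted with R's partykit; as a rough stand-in, the sketch below grows an ordinary CART-style tree on synthetic data shaped like the study (BMI, age and comorbidity predicting functional limitation). All numbers and the scikit-learn substitution are our assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)
n = 1951
bmi = rng.normal(27, 4, n)
age = rng.normal(73, 4, n)
comorbidity = rng.poisson(1.5, n)
# synthetic outcome: risk increases with BMI, age and comorbidity
logit = -10 + 0.12 * bmi + 0.05 * age + 0.4 * comorbidity
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([bmi, age, comorbidity])
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=100).fit(X, y)
print(export_text(tree, feature_names=["BMI", "age", "comorbidity"]))

# each leaf is a risk stratum; report its observed event rate
leaves = tree.apply(X)
for node in np.unique(leaves):
    print(f"leaf {node}: risk = {y[leaves == node].mean():.3f}")
```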
Network Analysis: A Novel Approach to Understand Suicidal Behaviour
de Beurs, Derek
2017-01-01
Although suicide is a major public health issue worldwide, we understand little of the onset and development of suicidal behaviour. Suicidal behaviour is argued to be the end result of the complex interaction between psychological, social and biological factors. Epidemiological studies have identified a range of risk factors for suicidal behaviour, but we do not yet understand how their interaction increases the risk for suicidal behaviour. A new approach called network analysis can help us better understand this process, as it allows us to visualize and quantify the complex associations between many different symptoms or risk factors. A network analysis of data containing information on suicidal patients can help us understand how risk factors interact and how their interaction is related to suicidal thoughts and behaviour. A network perspective has been successfully applied to the field of depression and psychosis, but not yet to the field of suicidology. In this theoretical article, I will introduce the concept of network analysis to the field of suicide prevention, and offer directions for future applications and studies.
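As an illustration of the idea (not taken from the article), the sketch below builds a toy symptom network from correlations and asks which node is most strongly connected; real network psychometrics would typically use regularized partial correlations (e.g., graphical lasso) rather than thresholded raw correlations, and all names and cutoffs here are invented.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
symptoms = ["hopelessness", "insomnia", "burdensomeness",
            "agitation", "suicidal_ideation"]
data = rng.normal(size=(500, 5))
data[:, 4] += 0.6 * data[:, 0] + 0.4 * data[:, 2]  # ideation driven by two factors

R = np.corrcoef(data, rowvar=False)
G = nx.Graph()
for i in range(5):
    for j in range(i + 1, 5):
        if abs(R[i, j]) > 0.2:                     # drop weak edges
            G.add_edge(symptoms[i], symptoms[j], weight=abs(R[i, j]))

# strength centrality: which symptom is most connected to the rest?
print(dict(G.degree(weight="weight")))
```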
Contributions to the Underlying Bivariate Normal Method for Factor Analyzing Ordinal Data
ERIC Educational Resources Information Center
Xi, Nuo; Browne, Michael W.
2014-01-01
A promising "underlying bivariate normal" approach was proposed by Jöreskog and Moustaki for use in the factor analysis of ordinal data. This was a limited information approach that involved the maximization of a composite likelihood function. Its advantage over full-information maximum likelihood was that very much less computation was…
Britton, Gary I.; Davey, Graham C. L.
2017-01-01
Emerging evidence suggests that many of the clinical constructs used to help understand and explain obsessive-compulsive (OC) symptoms, and negative mood, may be causally interrelated. One approach to understanding this interrelatedness is a motivational systems approach. This approach suggests that rather than considering clinical constructs and negative affect as separable entities, they are all features of an integrated threat management system, and as such are highly coordinated and interdependent. The aim of the present study was to examine whether clinical constructs related to OC symptoms and negative mood are best treated as separable or, alternatively, whether these clinical constructs and negative mood are best seen as indicators of an underlying superordinate variable, as would be predicted by a motivational systems approach. A sample of 370 student participants completed measures of mood and the clinical constructs of inflated responsibility, intolerance of uncertainty, not just right experiences, and checking stop rules. An exploratory factor analysis suggested two plausible factor structures: one where all construct items and negative mood items loaded onto one underlying superordinate variable, and a second comprising five factors, where each item loaded onto a factor representative of what the item was originally intended to measure. A confirmatory factor analysis showed that the five-factor model was preferable to the one-factor model, suggesting the four constructs and negative mood are best conceptualized as separate variables. Given that the predictions of a motivational systems approach were not supported in the current study, other possible explanations for the causal interrelatedness between clinical constructs and negative mood are discussed. PMID:28959224
NASA Astrophysics Data System (ADS)
Zheng, Lanqin; Dong, Yan; Huang, Ronghuai; Chang, Chun-Yen; Bhagat, Kaushal Kumar
2018-01-01
The purpose of this study was to examine the relations between primary school students' conceptions of, approaches to, and self-efficacy in learning science in Mainland China. A total of 1049 primary school students from Mainland China participated in this study. Three instruments were adapted to measure students' conceptions of learning science, approaches to learning science, and self-efficacy. Exploratory and confirmatory factor analyses were adopted to validate the three instruments. Path analysis was employed to understand the relationships between conceptions of learning science, approaches to learning science, and self-efficacy. The findings indicated that students' lower-level conceptions of learning science positively influenced their surface approaches to learning science. Higher-level conceptions of learning science had a positive influence on deep approaches and a negative influence on surface approaches to learning science. Furthermore, self-efficacy was also a hierarchical construct and could be divided into lower and higher levels. Only students' deep approaches to learning science had a positive influence on their lower- and higher-level self-efficacy in learning science. The results are discussed in the context of the implications for teachers and future studies.
Marital dissolution: an economic analysis.
Hunter, K A
1984-01-01
A longitudinal analysis of factors affecting marital dissolution in the United States is presented using data from the Coleman-Rossi Retrospective Life History. Factors considered include labor force participation of both spouses, wage growth, size of family unit, age at marriage, and educational status. The study is based on the economic analysis approach developed by Gary S. Becker and others.
Data analysis strategies for reducing the influence of the bias in cross-cultural research.
Sindik, Josko
2012-03-01
In cross-cultural research, researchers have to adjust the constructs and associated measurement instruments that have been developed in one culture and then imported for use in another culture. Importing concepts from other cultures is often simply reduced to language adjustment of the content in the items of the measurement instruments that define a certain (psychological) construct. In the context of cross-cultural research, test bias can be defined as a generic term for all nuisance factors that threaten the validity of cross-cultural comparisons. Bias can be an indicator that instrument scores based on the same items measure different traits and characteristics across different cultural groups. To reduce construct, method and item bias, the researcher can consider these strategies: (1) simply comparing average results on certain measuring instruments; (2) comparing only the reliability of certain dimensions of the measurement instruments, applied to the "target" and "source" samples of participants, i.e. from different cultures; (3) comparing the "framed" factor structure (fixed number of factors) of the measurement instruments, applied to the samples from the "target" and "source" cultures, using an explorative factor analysis strategy on separate samples; (4) comparing the complete constructs ("unframed" factor analysis, i.e. an unlimited number of factors) in relation to their best psychometric properties and their interpretability (best suited to certain cultures, applying an explorative factor analysis strategy); or (5) checking the similarity of the constructs in the samples from different cultures (using a structural equation modeling approach). Each approach has its advantages and disadvantages, and these are discussed.
NASA Astrophysics Data System (ADS)
Kamano, Hiroyuki
2018-05-01
We give an overview of our recent efforts to extract electromagnetic transition form factors for N^* and Δ^* baryon resonances through a global analysis of the single-pion electroproductions off the proton within the ANL-Osaka dynamical coupled-channels approach. Preliminary results for the extracted form factors associated with Δ(1232)3/2^+ and the Roper resonance are presented, with emphasis on the complex-valued nature of the transition form factors defined by poles.
ERIC Educational Resources Information Center
Ferrando, Pere J.
2008-01-01
This paper develops results and procedures for obtaining linear composites of factor scores that maximize: (a) test information, and (b) validity with respect to external variables in the multiple factor analysis (FA) model. I treat FA as a multidimensional item response theory model, and use Ackerman's multidimensional information approach based…
[Approach to the Development of Mind and Persona].
Sawaguchi, Toshiko
2018-01-01
To help health specialists working in the regional health field access medical specialists, the possibility of utilizing a voice approach for dissociative identity disorder (DID) patients as a health assessment for medical access (HAMA) was investigated. The first step was to investigate whether the plural personae in a single DID patient can be discriminated by voice analysis. Voices of DID patients, including those with different personae, were extracted from YouTube and analysed using the software PRAAT with respect to basic frequency, oral factors, chin factors and tongue factors. In addition, RAKUGO storyteller voices, produced artificially and dramatically, were analysed in the same manner. Quantitative and qualitative analyses were carried out, and a nested logistic regression and a nested generalized linear model were developed. The voices from different personae in one DID patient could be visually and easily distinguished using the basic frequency curve, cluster analysis and factor analysis. In the canonical analysis, only Roy's maximum root was <0.01. In the nested generalized linear model, the model using a standard deviation (SD) indicator fit best, and some other possibilities are shown here. In DID patients, a short transition time among plural personae could signal a risky situation such as suicide. Thus, if the voice approach can show the time threshold of changes between the different personae, it would be useful as an access assessment in the form of a simple HAMA.
We advocate an approach to reduce the anticipated increase in stormwater runoff from conventional development by demonstrating a low-impact development that incorporates hydrologic factors into an expanded land suitability analysis. This methodology was applied to a 3 hectare exp...
Zhang, Yan; Zhong, Ming
2013-01-01
Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, which is impacted by different geological and hydrological factors. In order to deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach in which the analytic hierarchy process and fuzzy comprehensive evaluation are integrated together. Firstly, the risk factors of groundwater contamination are identified by the sources-pathway-receptor-consequence method, and a corresponding index system of risk assessment based on the DRASTIC model is established. Due to the complexity of the transitions between the possible pollution risks and the uncertainties of factors, the analytic hierarchy process is applied to determine the weight of each factor, and fuzzy set theory is adopted to calculate the membership degrees of each factor. Finally, a case study is presented to illustrate and test this methodology. It is concluded that the proposed approach integrates the advantages of both the analytic hierarchy process and fuzzy comprehensive evaluation, which provides a more flexible and reliable way to deal with the linguistic uncertainty and mechanism uncertainty in groundwater contamination without losing important information. PMID:24453883
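A minimal sketch of the two computational steps, under assumed numbers: analytic hierarchy process weights from a Saaty pairwise-comparison matrix (with a consistency check), followed by a fuzzy comprehensive evaluation that combines those weights with assumed membership degrees in three risk grades. The factors and all matrix entries are hypothetical.

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical DRASTIC-style factors
# (depth to water, recharge, aquifer media); entries are Saaty ratios.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP weights: normalized principal eigenvector of A.
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()

# Consistency ratio (random index RI = 0.58 for a 3x3 matrix).
lam = np.max(np.real(vals))
CI = (lam - 3) / (3 - 1)
print("weights:", w.round(3), "CR:", round(CI / 0.58, 3))  # CR < 0.1 is acceptable

# Fuzzy comprehensive evaluation: assumed membership of each factor
# in the risk grades (low, medium, high), e.g. from expert scoring.
Rm = np.array([[0.6, 0.3, 0.1],
               [0.2, 0.5, 0.3],
               [0.1, 0.4, 0.5]])
b = w @ Rm                      # weighted fuzzy evaluation vector
print("risk grade memberships:", b.round(3))  # argmax gives the final grade
```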
Sociotechnical attributes of safe and unsafe work systems
Kleiner, Brian M.; Hettinger, Lawrence J.; DeJoy, David M.; Huang, Yuang-Hsiang; Love, Peter E.D.
2015-01-01
Theoretical and practical approaches to safety based on sociotechnical systems principles place heavy emphasis on the intersections between social–organisational and technical–work process factors. Within this perspective, work system design emphasises factors such as the joint optimisation of social and technical processes, a focus on reliable human–system performance and safety metrics as design and analysis criteria, the maintenance of a realistic and consistent set of safety objectives and policies, and regular access to the expertise and input of workers. We discuss three current approaches to the analysis and design of complex sociotechnical systems: human–systems integration, macroergonomics and safety climate. Each approach emphasises key sociotechnical systems themes, and each prescribes a more holistic perspective on work systems than do traditional theories and methods. We contrast these perspectives with historical precedents such as system safety and traditional human factors and ergonomics, and describe potential future directions for their application in research and practice. Practitioner Summary: The identification of factors that can reliably distinguish between safe and unsafe work systems is an important concern for ergonomists and other safety professionals. This paper presents a variety of sociotechnical systems perspectives on intersections between social–organisational and technology–work process factors as they impact work system analysis, design and operation. PMID:25909756
Azadeh, Ali; Salehi, Vahid; Mirzayi, Mahsa
2016-12-01
Resilience engineering (RE) is a new paradigm that can control incidents and reduce their consequences. Integrated RE includes four new factors-self-organization, teamwork, redundancy, and fault-tolerance-in addition to conventional RE factors. This study aimed to evaluate the impacts of these four factors on RE and determine the most efficient factor in an uncertain environment. The required data were collected through a questionnaire in a petrochemical plant in June 2013. The questionnaire was completed by 115 respondents including 37 managers and 78 operators. Fuzzy data envelopment analysis was used at different α-cuts in order to calculate the impact of each factor. Analysis of variance was employed to compare the efficiency score means of the four above-mentioned factors. The results showed that as α approached 0 and the system became fuzzier (α = 0.3 and α = 0.1), teamwork played a significant role and had the highest impact on the resilient system. In contrast, as α approached 1 and the fuzzy system went toward a certain mode (α = 0.9 and α = 1), redundancy had a vital role in the selected resilient system. Therefore, redundancy and teamwork were the most efficient factors. The approach developed in this study could be used for identifying the most important factors in such environments. The results of this study may help managers to have better understanding of weak and strong points in such industries.
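For orientation, here is a crisp input-oriented CCR efficiency model in multiplier form, solved as a linear program; the study's fuzzy DEA would re-solve a program of this kind at each α-cut with interval-valued data. The toy inputs and outputs below are invented.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (multiplier form).
    X: (m inputs x n DMUs), Y: (s outputs x n DMUs)."""
    m, n = X.shape
    s = Y.shape[0]
    # variables: [u (s), v (m)]; maximize u'y_o  ->  minimize -u'y_o
    c = np.concatenate([-Y[:, o], np.zeros(m)])
    # feasibility: u'y_j - v'x_j <= 0 for every DMU j
    A_ub = np.hstack([Y.T, -X.T])
    b_ub = np.zeros(n)
    # normalization: v'x_o = 1
    A_eq = np.concatenate([np.zeros(s), X[:, o]])[None, :]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# toy data: 2 inputs, 1 output, 5 DMUs (e.g., aggregated questionnaire scores)
X = np.array([[4., 7., 8., 4., 2.],
              [3., 3., 1., 2., 4.]])
Y = np.array([[1., 1., 1., 1., 1.]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(5)])
```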
A Factor Analytic and Regression Approach to Functional Age: Potential Effects of Race.
ERIC Educational Resources Information Center
Colquitt, Alan L.; And Others
Factor analysis and multiple regression are two major approaches used to look at functional age, which takes account of the extensive variation in the rate of physiological and psychological maturation throughout life. To examine the role of racial or cultural influences on the measurement of functional age, a battery of 12 tests concentrating on…
ERIC Educational Resources Information Center
Wang, Lijuan; Ha, Amy S.
2012-01-01
This study aims to examine the factors influencing pre-service Physical Education (PE) teachers' perception of a specific constructivist approach--Teaching Games for Understanding (TGfU) in Hong Kong. By adopting a qualitative approach, 20 pre-service PE teachers were recruited for individual semi-structured interviews. Deductive data analysis was…
ERIC Educational Resources Information Center
Green, Samuel B.; Levy, Roy; Thompson, Marilyn S.; Lu, Min; Lo, Wen-Juo
2012-01-01
A number of psychometricians have argued for the use of parallel analysis to determine the number of factors. However, parallel analysis must be viewed at best as a heuristic approach rather than a mathematically rigorous one. The authors suggest a revision to parallel analysis that could improve its accuracy. A Monte Carlo study is conducted to…
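The classical procedure being revised can be sketched as follows (Horn's parallel analysis, in our simplified illustration): retain factors whose observed correlation-matrix eigenvalues exceed a chosen percentile of eigenvalues obtained from random data of the same dimensions. The synthetic data and settings are assumptions.

```python
import numpy as np

def parallel_analysis(data, n_sims=500, quantile=95, seed=0):
    """Horn's parallel analysis: retain factors whose observed
    eigenvalues exceed those from random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.empty((n_sims, p))
    for i in range(n_sims):
        sim = rng.normal(size=(n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False))[::-1]
    thresh = np.percentile(rand, quantile, axis=0)
    return int(np.sum(obs > thresh)), obs, thresh

# synthetic 8-item data with a known 2-factor structure
rng = np.random.default_rng(1)
F = rng.normal(size=(300, 2))
L = rng.uniform(0.5, 0.9, size=(2, 8))
data = F @ L + rng.normal(scale=0.6, size=(300, 8))
k, obs, thresh = parallel_analysis(data)
print("factors retained:", k)
```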
A phasor approach analysis of multiphoton FLIM measurements of three-dimensional cell culture models
NASA Astrophysics Data System (ADS)
Lakner, P. H.; Möller, Y.; Olayioye, M. A.; Brucker, S. Y.; Schenke-Layland, K.; Monaghan, M. G.
2016-03-01
Fluorescence lifetime imaging microscopy (FLIM) is a useful approach to obtain information regarding the endogenous fluorophores present in biological samples. The concise evaluation of FLIM data requires the use of robust mathematical algorithms. In this study, we developed a user-friendly phasor approach for analyzing FLIM data and applied this method on three-dimensional (3D) Caco-2 models of polarized epithelial luminal cysts in a supporting extracellular matrix environment. These Caco-2 based models were treated with epidermal growth factor (EGF), to stimulate proliferation in order to determine if FLIM could detect such a change in cell behavior. Autofluorescence from nicotinamide adenine dinucleotide (phosphate) (NAD(P)H) in luminal Caco-2 cysts was stimulated by 2-photon laser excitation. Using a phasor approach, the lifetimes of involved fluorophores and their contribution were calculated with fewer initial assumptions when compared to multiexponential decay fitting. The phasor approach simplified FLIM data analysis, making it an interesting tool for non-experts in numerical data analysis. We observed that an increased proliferation stimulated by EGF led to a significant shift in fluorescence lifetime and a significant alteration of the phasor data shape. Our data demonstrates that multiphoton FLIM analysis with the phasor approach is a suitable method for the non-invasive analysis of 3D in vitro cell culture models qualifying this method for monitoring basic cellular features and the effect of external factors.
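A minimal sketch of the phasor transform applied to a single decay histogram, with an assumed bin width and laser repetition rate; for a single-exponential decay the coordinates should fall near the universal semicircle, g = 1/(1+(ωτ)²), s = ωτ/(1+(ωτ)²). All parameter values are illustrative.

```python
import numpy as np

def phasor(decay, dt, f=0.080):
    """Phasor coordinates (g, s) of a fluorescence decay histogram.
    dt: bin width in ns; f: modulation frequency in GHz (80 MHz here)."""
    t = (np.arange(decay.size) + 0.5) * dt
    w = 2 * np.pi * f
    g = np.sum(decay * np.cos(w * t)) / np.sum(decay)
    s = np.sum(decay * np.sin(w * t)) / np.sum(decay)
    return g, s

# single-exponential decay with tau = 2.5 ns over 256 bins of 0.05 ns
dt, tau = 0.05, 2.5
t = (np.arange(256) + 0.5) * dt
g, s = phasor(np.exp(-t / tau), dt)
print(round(g, 3), round(s, 3))   # close to the semicircle values for w*tau
```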
Harrison, Jay M; Howard, Delia; Malven, Marianne; Halls, Steven C; Culler, Angela H; Harrigan, George G; Wolfinger, Russell D
2013-07-03
Compositional studies on genetically modified (GM) and non-GM crops have consistently demonstrated that their respective levels of key nutrients and antinutrients are remarkably similar and that other factors such as germplasm and environment contribute more to compositional variability than transgenic breeding. We propose that graphical and statistical approaches that can provide meaningful evaluations of the relative impact of different factors to compositional variability may offer advantages over traditional frequentist testing. A case study on the novel application of principal variance component analysis (PVCA) in a compositional assessment of herbicide-tolerant GM cotton is presented. Results of the traditional analysis of variance approach confirmed the compositional equivalence of the GM and non-GM cotton. The multivariate approach of PVCA provided further information on the impact of location and germplasm on compositional variability relative to GM.
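PVCA attributes principal-component variance to design factors; the sketch below is a simplified fixed-effects analogue (eigenvalue-weighted one-way ANOVA effect sizes) rather than the REML mixed-model variance components usually used, and the germplasm/location data are synthetic.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 120
germplasm = rng.integers(0, 3, n)       # 3 hypothetical lines
location = rng.integers(0, 4, n)        # 4 hypothetical sites
# synthetic 10-analyte composition matrix with a location effect
X = rng.normal(size=(n, 10)) + location[:, None] * 0.8

# principal components of the standardized data
Xs = (X - X.mean(0)) / X.std(0)
U, svals, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = U * svals
evr = svals**2 / np.sum(svals**2)       # explained variance ratios

def eta_squared(y, g):
    """One-way ANOVA effect size: share of variance explained by grouping g."""
    grand = y.mean()
    ss_between = sum(len(y[g == k]) * (y[g == k].mean() - grand)**2
                     for k in np.unique(g))
    return ss_between / ((y - grand)**2).sum()

out = {}
for name, g in [("germplasm", germplasm), ("location", location)]:
    # weight each PC's effect size by that PC's share of total variance
    out[name] = sum(evr[j] * eta_squared(scores[:, j], g) for j in range(5))
print(pd.Series(out).round(3))          # location should dominate here
```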
The Meaning of Higher-Order Factors in Reflective-Measurement Models
ERIC Educational Resources Information Center
Eid, Michael; Koch, Tobias
2014-01-01
Higher-order factor analysis is a widely used approach for analyzing the structure of a multidimensional test. Whenever first-order factors are correlated researchers are tempted to apply a higher-order factor model. But is this reasonable? What do the higher-order factors measure? What is their meaning? Willoughby, Holochwost, Blanton, and Blair…
Exploring NASA Human Spaceflight and Pioneering Scenarios
NASA Technical Reports Server (NTRS)
Zapata, Edgar; Wilhite, Alan
2015-01-01
The life cycle cost analysis of space exploration scenarios is explored via a merger of (1) scenario planning, which separates context, and (2) modeling and analysis of specific content. Numerous scenarios are presented, leading to cross-cutting recommendations addressing life cycle costs, productivity, and approaches applicable to any scenario. The approaches address technical and non-technical factors.
Factor selection for service quality evaluation: a hospital case study.
Ameryoun, Ahmad; Najafi, Seyedvahid; Nejati-Zarnaqi, Bayram; Khalilifar, Seyed Omid; Ajam, Mahdi; Ansarimoghadam, Ahmad
2017-02-13
Purpose: The purpose of this paper is to develop a systematic approach to predict service quality dimensions' influence on service quality using a novel analysis based on data envelopment analysis and SERVQUAL. Design/methodology/approach: To assess hospital service quality in Tehran, the expectations and perceptions of those who received the services were evaluated using SERVQUAL. The hospital service quality dimensions were found by exploratory factor analysis (EFA). To compare customer expectation and perception, a perceived service quality index (PSQI) was measured using a new method based on common weights. A novel sensitivity approach was used to test each service quality factor's impact on the PSQI. Findings: A new service quality dimension named "trust in services" was found using EFA, which is not an original SERVQUAL factor. The approach was applied to assess the hospital's service quality. Since the PSQI value was 0.76, it showed that improvements are needed to meet customer expectations. The results showed the order in which the factors affect the PSQI: "trust in services" has the strongest influence on the PSQI, followed by "tangibles," "assurance," "empathy," and "responsiveness," respectively. Practical implications: This work gives managers insight into service quality by following a systematic method; i.e., measuring perceived service quality from the customer viewpoint and service factors' impact on customer perception. Originality/value: The procedure helps managers to select the service quality dimensions which need improvement and predict their effects on customer perception.
The motion commotion: Human factors in transportation
NASA Technical Reports Server (NTRS)
Millar, A. E., Jr. (Editor); Rosen, R. L. (Editor); Gibson, J. D. (Editor); Crum, R. G. (Editor)
1972-01-01
The program for a systems approach to the problem of incorporating human factors in designing transportation systems is summarized. The importance of the human side of transportation is discussed along with the three major factors related to maintaining a mobile and quality life. These factors are (1) people, as individuals and groups, (2) society as a whole, and (3) the natural environment and man-made environs. The problems and bottlenecks are presented along with approaches to their solutions through systems analysis. Specific recommendations essential to achieving improved mobility within environmental constraints are presented.
Statistical analysis of microgravity experiment performance using the degrees of success scale
NASA Technical Reports Server (NTRS)
Upshaw, Bernadette; Liou, Ying-Hsin Andrew; Morilak, Daniel P.
1994-01-01
This paper describes an approach to identify factors that significantly influence microgravity experiment performance. Investigators developed the 'degrees of success' scale to provide a numerical representation of success. A degree of success was assigned to 293 microgravity experiments. Experiment information including the degree of success rankings and factors for analysis was compiled into a database. Through an analysis of variance, nine significant factors in microgravity experiment performance were identified. The frequencies of these factors are presented along with the average degree of success at each level. A preliminary discussion of the relationship between the significant factors and the degree of success is presented.
ERIC Educational Resources Information Center
Zhou, Wenxia; Sun, Jianmin; Guan, Yanjun; Li, Yuhui; Pan, Jingzhou
2013-01-01
The current research aimed to develop a multidimensional measure on the criteria of career success in a Chinese context. Items on the criteria of career success were obtained using a qualitative approach among 30 Chinese employees; exploratory factor analysis was conducted to select items and determine the factor structure among a new sample of…
ERIC Educational Resources Information Center
Rouder, Jeffrey N.; Morey, Richard D.; Province, Jordan M.
2013-01-01
Psi phenomena, such as mental telepathy, precognition, and clairvoyance, have garnered much recent attention. We reassess the evidence for psi effects from Storm, Tressoldi, and Di Risio's (2010) meta-analysis. Our analysis differs from Storm et al.'s in that we rely on Bayes factors, a Bayesian approach for stating the evidence from data for…
Return on Investment Analysis for the Almond Board of California
2004-06-01
The general approach for the analysis is first to identify relevant factors concerning consumer behavior using exploratory factor analysis (EFA) and ... That completed the intermediate stage of the conceptual model below, referring to the latent drivers of consumer behavior that affect the almond ... consumer behavior remains a challenge that will have to be continuously addressed by the ABC management. Finally, to improve the methodology for
Multilevel Factor Analysis by Model Segregation: New Applications for Robust Test Statistics
ERIC Educational Resources Information Center
Schweig, Jonathan
2014-01-01
Measures of classroom environments have become central to policy efforts that assess school and teacher quality. This has sparked a wide interest in using multilevel factor analysis to test measurement hypotheses about classroom-level variables. One approach partitions the total covariance matrix and tests models separately on the…
The Structure of Character Strengths: Variable- and Person-Centered Approaches
Najderska, Małgorzata; Cieciuch, Jan
2018-01-01
This article examines the structure of character strengths (Peterson and Seligman, 2004) following both variable-centered and person-centered approaches. We used the International Personality Item Pool-Values in Action (IPIP-VIA) questionnaire. The IPIP-VIA measures 24 character strengths and consists of 213 direct and reversed items. The present study was conducted in a heterogeneous group of N = 908 Poles (aged 18–78, M = 28.58). It was part of a validation project of a Polish version of the IPIP-VIA questionnaire. The variable-centered approach was used to examine the structure of character strengths on both the scale and item levels. The scale-level results indicated a four-factor structure that can be interpreted based on four of the five personality traits from the Big Five theory (excluding neuroticism). The item-level analysis suggested a slightly different and limited set of character strengths (17 not 24). After conducting a second-order analysis, a four-factor structure emerged, and three of the factors could be interpreted as being consistent with the scale-level factors. Three character strength profiles were found using the person-centered approach. Two of them were consistent with alpha and beta personality metatraits. The structure of character strengths can be described by using categories from the Five Factor Model of personality and metatraits. They form factors similar to some personality traits and occur in similar constellations as metatraits. The main contributions of this paper are: (1) the validation of IPIP-VIA conducted in variable-centered approach in a new research group (Poles) using a different measurement instrument; (2) introducing the person-centered approach to the study of the structure of character strengths. PMID:29515482
Goode, Natassia; Salmon, Paul M; Lenné, Michael G; Hillard, Peter
2014-07-01
Injuries resulting from manual handling tasks represent an on-going problem for the transport and storage industry. This article describes an application of a systems theory-based approach, Rasmussen's (1997, Safety Science 27, 183) risk management framework, to the analysis of the factors influencing safety during manual handling activities in a freight handling organisation. Observations of manual handling activities, critical decision method interviews with workers (n=27) and interviews with managers (n=35) were used to gather information about three manual handling activities. Hierarchical task analysis and thematic analysis were used to identify potential risk factors and performance shaping factors across the levels of Rasmussen's framework. These different data sources were then integrated using Rasmussen's Accimap technique to provide an overall analysis of the factors influencing safety during manual handling activities in this context. The findings demonstrate how a systems theory-based approach can be applied to this domain, and suggest that policy-orientated, rather than worker-orientated, changes are required to prevent future manual handling injuries. Copyright © 2013 Elsevier Ltd. All rights reserved.
Toumi, Héla; Boumaiza, Moncef; Millet, Maurice; Radetski, Claudemir Marcos; Camara, Baba Issa; Felten, Vincent; Masfaraud, Jean-François; Férard, Jean-François
2018-04-19
We studied the combined acute effect (i.e., after 48 h) of deltamethrin (a pyrethroid insecticide) and malathion (an organophosphate insecticide) on Daphnia magna. Two approaches were used to examine the potential interaction effects of eight mixtures of deltamethrin and malathion: (i) calculation of the mixture toxicity index (MTI) and safety factor index (SFI) and (ii) response surface methodology coupled with an isobole-based statistical model (using a generalized linear model). According to the calculation of MTI and SFI, one tested mixture was found to be additive while the two other tested mixtures were found to be non-additive (MTI) or antagonistic (SFI); these differences between index responses are due only to differences in the terminology related to the two indexes. Through the response surface approach and isobologram analysis, we concluded that there is a significant antagonistic effect of the binary mixtures of deltamethrin and malathion on D. magna immobilization after 48 h of exposure. The index approaches and the response surface approach with isobologram analysis are complementary. Calculation of the mixture toxicity index and safety factor index identifies the type of interaction for each tested mixture individually, while the response surface approach with isobologram analysis integrates all the data, providing a global conclusion about the type of interactive effect. Only the response surface approach and isobologram analysis allowed statistical assessment of the ecotoxicological interaction. Nevertheless, we recommend the use of both approaches (i) to identify the combined effects of contaminants and (ii) to improve risk assessment and environmental management.
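A compact way to see the isobole-based GLM test: on synthetic dose-response data, a significant negative dose-by-dose interaction on the logit scale indicates antagonism. The statsmodels sketch below uses invented doses scaled to hypothetical EC50s; none of the coefficients reflect the study's results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
# hypothetical 48-h immobilization outcomes for deltamethrin (d) and
# malathion (m) mixtures; doses expressed as fractions of single-compound EC50s
d = np.repeat([0, 0.25, 0.5, 1.0], 60)
m = np.tile(np.repeat([0, 0.25, 0.5, 1.0], 15), 4)
# antagonistic truth: negative interaction on the logit scale
eta = -2 + 2.2 * d + 2.0 * m - 1.5 * d * m
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))
df = pd.DataFrame({"immobilized": y, "d": d, "m": m})

# a significant d:m term departs from additivity:
# negative -> antagonism, positive -> synergy
fit = smf.glm("immobilized ~ d + m + d:m", data=df,
              family=sm.families.Binomial()).fit()
print(fit.summary().tables[1])
```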
NASA Astrophysics Data System (ADS)
Razavi, S.; Gupta, H. V.
2015-12-01
Earth and environmental systems models (EESMs) are continually growing in complexity and dimensionality with continuous advances in understanding and computing power. This complexity and dimensionality are manifested in the many different factors in EESMs (model parameters, forcings, boundary conditions, etc.) that must be identified. Sensitivity analysis (SA) provides an essential means for characterizing the role and importance of such factors in producing the model responses. However, conventional approaches to SA suffer from (1) an ambiguous characterization of sensitivity and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we present a new and general sensitivity analysis framework (called VARS), based on an analogy to 'variogram analysis', that provides an intuitive and comprehensive characterization of sensitivity across the full spectrum of scales in the factor space. We prove, theoretically, that the Morris (derivative-based) and Sobol (variance-based) methods and their extensions are limiting cases of VARS, and that their SA indices can be computed as by-products of the VARS framework. We also present a practical strategy for the application of VARS to real-world problems, called STAR-VARS, including a new sampling strategy called "star-based sampling". Our results across several case studies show the STAR-VARS approach to provide reliable and stable assessments of "global" sensitivity across the full range of scales in the factor space, while being at least 1-2 orders of magnitude more efficient than the benchmark Morris and Sobol approaches.
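To make the variogram idea concrete, the sketch below computes, for each factor, the mean squared response difference at a single small lag along transects through random star centers. This is a much-reduced caricature of VARS (no multi-scale integration, no IVARS indices), with an invented toy model.

```python
import numpy as np

def directional_variogram(f, n_stars=50, n_steps=10, dim=3, seed=0):
    """Crude VARS-style sensitivity: mean directional variogram gamma_i(h)
    per factor at lag h = 1/n_steps, averaged over random star centers."""
    rng = np.random.default_rng(seed)
    gamma = np.zeros(dim)
    for _ in range(n_stars):
        c = rng.random(dim)
        for i in range(dim):
            pts = np.tile(c, (n_steps + 1, 1))
            pts[:, i] = np.linspace(0, 1, n_steps + 1)  # transect along factor i
            y = np.array([f(p) for p in pts])
            gamma[i] += 0.5 * np.mean(np.diff(y) ** 2)
    return gamma / n_stars

# toy model: x0 matters most, x2 least
f = lambda x: 5 * x[0]**2 + 2 * np.sin(np.pi * x[1]) + 0.5 * x[2]
print(directional_variogram(f).round(4))
```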
Factor analysis of some socio-economic and demographic variables for Bangladesh.
Islam, S M
1986-01-01
The author carries out an exploratory factor analysis of some socioeconomic and demographic variables for Bangladesh using the classical or common factor approach with the varimax rotation method. The socioeconomic and demographic indicators used in this study include literacy, rate of growth, female employment, economic development, urbanization, population density, childlessness, sex ratio, proportion of women ever married, and fertility. The 18 administrative districts of Bangladesh constitute the unit of analysis. 3 common factors--modernization, fertility, and social progress--are identified in this study to explain the correlations among the set of selected socioeconomic and demographic variables.
"Revisiting" the South Pacific Approaches to Learning: A Confirmatory Factor Analysis Study
ERIC Educational Resources Information Center
Phan, Huy P.; Deo, Bisun
2008-01-01
There has been substantial research evidence concerning the learning approaches of students in Western and non-Western contexts. Nonetheless, it has been a decade since research in the South Pacific was conducted on the learning approaches of tertiary students. The present research examined the learning approaches of Fijian and other Pacific…
An integrated phenomic approach to multivariate allelic association
Medland, Sarah Elizabeth; Neale, Michael Churton
2010-01-01
The increased feasibility of genome-wide association has resulted in association becoming the primary method used to localize genetic variants that cause phenotypic variation. Much attention has been focused on the vast multiple testing problems arising from analyzing large numbers of single nucleotide polymorphisms. However, the inflation of experiment-wise type I error rates through testing numerous phenotypes has received less attention. Multivariate analyses can be used to detect both pleiotropic effects that influence a latent common factor, and monotropic effects that operate at variable-specific levels, whilst controlling for non-independence between phenotypes. In this study, we present a maximum likelihood approach, which combines both latent and variable-specific tests and which may be used with either individual or family data. Simulation results indicate that in the presence of factor-level association, the combined multivariate (CMV) analysis approach performs well with a minimal loss of power as compared with a univariate analysis of a factor or sum score (SS). As the deviation between the pattern of allelic effects and the factor loadings increases, the power of univariate analyses of both factor and SSs decreases dramatically, whereas the power of the CMV approach is maintained. We show the utility of the approach by examining the association between dopamine receptor D2 TaqIA and the initiation of marijuana, tranquilizers and stimulants in data from the Add Health Study. Perl scripts that take ped and dat files as input and produce Mx scripts and data for running the CMV approach can be downloaded from www.vipbg.vcu.edu/~sarahme/WriteMx. PMID:19707246
Risk Factors for Central Serous Chorioretinopathy: Multivariate Approach in a Case-Control Study.
Chatziralli, Irini; Kabanarou, Stamatina A; Parikakis, Efstratios; Chatzirallis, Alexandros; Xirou, Tina; Mitropoulos, Panagiotis
2017-07-01
The purpose of this prospective study was to investigate the potential risk factors associated independently with central serous retinopathy (CSR) in a Greek population, using a multivariate approach. Participants in the study were 183 consecutive patients diagnosed with CSR and 183 controls, matched for age. All participants underwent complete ophthalmological examination, and information regarding their sociodemographic, clinical, medical and ophthalmological history was recorded, so as to assess potential risk factors for CSR. Univariate and multivariate analyses were performed. Univariate analysis showed that male sex, high educational status, high income, alcohol consumption, smoking, hypertension, coronary heart disease, obstructive sleep apnea, autoimmune disorders, H. pylori infection, type A personality and stress, steroid use, pregnancy and hyperopia were associated with CSR, while myopia was found to protect from CSR. In multivariate analysis, alcohol consumption, hypertension, coronary heart disease and autoimmune disorders lost their significance, while the remaining factors were all independently associated with CSR. It is important to take into account the various risk factors for CSR, so as to define vulnerable groups and to shed light on the pathogenesis of the disease.
Balliu, Brunilda; Tsonaka, Roula; Boehringer, Stefan; Houwing-Duistermaat, Jeanine
2015-03-01
Integrative omics, the joint analysis of outcome and multiple types of omics data, such as genomics, epigenomics, and transcriptomics data, constitute a promising approach for powerful and biologically relevant association studies. These studies often employ a case-control design, and often include nonomics covariates, such as age and gender, that may modify the underlying omics risk factors. An open question is how to best integrate multiple omics and nonomics information to maximize statistical power in case-control studies that ascertain individuals based on the phenotype. Recent work on integrative omics have used prospective approaches, modeling case-control status conditional on omics, and nonomics risk factors. Compared to univariate approaches, jointly analyzing multiple risk factors with a prospective approach increases power in nonascertained cohorts. However, these prospective approaches often lose power in case-control studies. In this article, we propose a novel statistical method for integrating multiple omics and nonomics factors in case-control association studies. Our method is based on a retrospective likelihood function that models the joint distribution of omics and nonomics factors conditional on case-control status. The new method provides accurate control of Type I error rate and has increased efficiency over prospective approaches in both simulated and real data. © 2015 Wiley Periodicals, Inc.
He, M; Wang, H L; Yan, J Y; Xu, S W; Chen, W; Wang, J
2018-05-01
Objective: To compare the efficiency of the transhepatic hilar approach and the conventional approach for the surgical treatment of Bismuth type Ⅲ and Ⅳ hilar cholangiocarcinoma. Methods: There were 42 consecutive patients with hilar cholangiocarcinoma of Bismuth type Ⅲ and Ⅳ who underwent surgical treatment at the Department of Biliary-Pancreatic Surgery, Ren Ji Hospital, School of Medicine, Shanghai Jiao Tong University from January 2008 to December 2013. The transhepatic hilar approach was used in 19 patients and the conventional approach was performed in 23 patients. There were no differences in clinical parameters between the two groups (all P>0.05). The t-test was used to analyze the measurement data, and the χ² test was used to analyze the count data. Kaplan-Meier analysis was used to analyze the survival period. Multivariate Cox regression analysis was used to analyze the prognostic factors. Results: Among the 19 patients who underwent the transhepatic hilar approach, 3 patients changed the operative planning after being reevaluated by exposing the hepatic hilus. Intraoperative blood loss was 300 (250-400) ml in the transhepatic hilar approach group, significantly less than in the conventional approach group, 800 (450-1,300) ml (t=4.276, P=0.001); meanwhile, the R0 resection rate was significantly higher in the transhepatic hilar approach group than in the conventional approach group (89.4% vs. 52.2%; χ²=6.773, P=0.009), and the 3-year and 5-year cumulative survival rates were better in the transhepatic hilar approach group than in the conventional approach group (63.2% vs. 47.8%, 26.3% vs. 0; χ²=66.363, 127.185, P=0.000). On univariate analysis, transhepatic hilar approach, intraoperative blood loss, intraoperative blood transfusion, R0 resection and lymph node metastasis were significant risk factors for patient survival (all P<0.05). On multivariate analysis, use of the transhepatic hilar approach, intraoperative blood loss, R0 resection and lymph node metastasis were significant independent risk factors for patient survival (all P<0.05). Conclusion: The transhepatic hilar approach is the preferred technique for the surgical treatment of hilar cholangiocarcinoma because it can improve the accuracy of surgical planning, the safety of the operation, the R0 resection rate and the survival rate compared with the conventional approach.
NASA Astrophysics Data System (ADS)
Bashiri, Mahdi; Farshbaf-Geranmayeh, Amir; Mogouie, Hamed
2013-11-01
In this paper, a new method is proposed to solve a multi-response optimization problem based on the Taguchi method for processes in which the controllable factors are smaller-the-better (STB)-type variables and the analyst desires to find an optimal solution with a smaller amount of the controllable factors. In such processes, the overall output quality of the product should be maximized while the usage of the process inputs, the controllable factors, should be minimized. Since not all possible combinations of factor levels are considered in the Taguchi method, the response values of the possible unpracticed treatments are estimated using an artificial neural network (ANN). The neural network is tuned by central composite design (CCD) and the genetic algorithm (GA). Then data envelopment analysis (DEA) is applied to determine the efficiency of each treatment. Although the core philosophy of DEA is the maximization of outputs versus the minimization of inputs, this issue has been neglected in previous similar studies of multi-response problems. Finally, the most efficient treatment is determined using the maximin weight model approach. The performance of the proposed method is verified in a plastic molding process. Moreover, a sensitivity analysis was conducted using an efficiency-estimator neural network. The results show the efficiency of the proposed approach.
ADAPTION OF NONSTANDARD PIPING COMPONENTS INTO PRESENT DAY SEISMIC CODES
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. T. Clark; M. J. Russell; R. E. Spears
2009-07-01
With spiraling energy demand and flat energy supply, there is a need to extend the life of older nuclear reactors. This sometimes requires that existing systems be evaluated to present day seismic codes. Older reactors built in the 1960s and early 1970s often used fabricated piping components that were code compliant during their initial construction time period, but are outside the standard parameters of present-day piping codes. There are several approaches available to the analyst in evaluating these non-standard components to modern codes. The simplest approach is to use the flexibility factors and stress indices for similar standard components with the assumption that the non-standard component’s flexibility factors and stress indices will be very similar. This approach can require significant engineering judgment. A more rational approach available in Section III of the ASME Boiler and Pressure Vessel Code, which is the subject of this paper, involves calculation of flexibility factors using finite element analysis of the non-standard component. Such analysis allows modeling of geometric and material nonlinearities. Flexibility factors based on these analyses are sensitive to the load magnitudes used in their calculation, load magnitudes that need to be consistent with those produced by the linear system analyses where the flexibility factors are applied. This can lead to iteration, since the magnitude of the loads produced by the linear system analysis depend on the magnitude of the flexibility factors. After the loading applied to the nonstandard component finite element model has been matched to loads produced by the associated linear system model, the component finite element model can then be used to evaluate the performance of the component under the loads with the nonlinear analysis provisions of the Code, should the load levels lead to calculated stresses in excess of Allowable stresses. This paper details the application of component-level finite element modeling to account for geometric and material nonlinear component behavior in a linear elastic piping system model. Note that this technique can be applied to the analysis of B31 piping systems.
Cullis, B R; Smith, A B; Beeck, C P; Cowling, W A
2010-11-01
Exploring and exploiting variety by environment (V × E) interaction is one of the major challenges facing plant breeders. In paper I of this series, we presented an approach to modelling V × E interaction in the analysis of complex multi-environment trials using factor analytic models. In this paper, we develop a range of statistical tools which explore V × E interaction in this context. These tools include graphical displays such as heat-maps of genetic correlation matrices as well as so-called E-scaled uniplots that are a more informative alternative to the classical biplot for large plant breeding multi-environment trials. We also present a new approach to prediction for multi-environment trials that include pedigree information. This approach allows meaningful selection indices to be formed either for potential new varieties or potential parents.
A new framework for comprehensive, robust, and efficient global sensitivity analysis: 2. Application
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin V.
2016-01-01
Based on the theoretical framework for sensitivity analysis called "Variogram Analysis of Response Surfaces" (VARS), developed in the companion paper, we develop and implement a practical "star-based" sampling strategy (called STAR-VARS), for the application of VARS to real-world problems. We also develop a bootstrap approach to provide confidence level estimates for the VARS sensitivity metrics and to evaluate the reliability of inferred factor rankings. The effectiveness, efficiency, and robustness of STAR-VARS are demonstrated via two real-data hydrological case studies (a 5-parameter conceptual rainfall-runoff model and a 45-parameter land surface scheme hydrology model), and a comparison with the "derivative-based" Morris and "variance-based" Sobol approaches are provided. Our results show that STAR-VARS provides reliable and stable assessments of "global" sensitivity across the full range of scales in the factor space, while being 1-2 orders of magnitude more efficient than the Morris or Sobol approaches.
NASA Astrophysics Data System (ADS)
Hussain, Nur Farahin Mee; Zahid, Zalina
2014-12-01
Nowadays, in the job market, graduates are expected not only to perform well academically but also to be excellent in soft skills. Problem-Based Learning (PBL) has a number of distinct advantages as a learning method, as it can deliver graduates who will be highly prized by industry. This study attempts to determine the satisfaction level of engineering students with the PBL approach and to evaluate its determinant factors. Structural Equation Modeling (SEM) was used to investigate how the factors of Good Teaching Scale, Clear Goals, Student Assessment and Levels of Workload affected student satisfaction with the PBL approach.
[Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].
Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina
2012-09-01
The validation of a measurement tool in mental health is a complex process that usually starts with estimating reliability and then addresses validity. Factor analysis is a way to determine the number of dimensions, domains or factors of a measuring tool, generally related to the construct validity of the scale. The analysis can be exploratory or confirmatory, and it helps in the selection of the items with the best performance. For an acceptable factor analysis, it is necessary to follow some steps and recommendations, conduct some statistical tests, and rely on a proper sample of participants. Copyright © 2012 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.
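The steps described (adequacy checks, then exploratory factoring with rotation) can be scripted; the sketch below assumes the third-party factor_analyzer package and synthetic 9-item data with a known 3-factor structure. All names and thresholds in the comments are conventions, not prescriptions from the article.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (calculate_kmo,
                                             calculate_bartlett_sphericity)

# synthetic scale: 9 items loading on 3 factors
rng = np.random.default_rng(0)
F = rng.normal(size=(400, 3))
load = np.zeros((3, 9))
load[0, :3] = 0.8
load[1, 3:6] = 0.8
load[2, 6:] = 0.8
items = pd.DataFrame(F @ load + rng.normal(scale=0.5, size=(400, 9)),
                     columns=[f"item{i+1}" for i in range(9)])

# sampling adequacy before factoring (want Bartlett p < .05, KMO > .6)
chi2, p = calculate_bartlett_sphericity(items)
kmo_per_item, kmo_total = calculate_kmo(items)
print(f"Bartlett p={p:.3g}, KMO={kmo_total:.2f}")

# exploratory factor analysis with varimax rotation
fa = FactorAnalyzer(n_factors=3, rotation="varimax")
fa.fit(items)
print(pd.DataFrame(fa.loadings_, index=items.columns).round(2))
```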
Fountas, Grigorios; Sarwar, Md Tawfiq; Anastasopoulos, Panagiotis Ch; Blatt, Alan; Majka, Kevin
2018-04-01
Traditional accident analysis typically explores non-time-varying (stationary) factors that affect accident occurrence on roadway segments. However, the impact of time-varying (dynamic) factors is not thoroughly investigated. This paper seeks to simultaneously identify pre-crash stationary and dynamic factors of accident occurrence, while accounting for unobserved heterogeneity. Using highly disaggregate information for the potential dynamic factors, and aggregate data for the traditional stationary elements, a dynamic binary random parameters (mixed) logit framework is employed. With this approach, the dynamic nature of weather-related, and driving- and pavement-condition information is jointly investigated with traditional roadway geometric and traffic characteristics. To additionally account for the combined effect of the dynamic and stationary factors on accident occurrence, the developed random parameters logit framework allows for possible correlations among the random parameters. The analysis is based on crash and non-crash observations between 2011 and 2013, drawn from urban and rural highway segments in the state of Washington. The findings show that the proposed methodological framework can account for both stationary and dynamic factors affecting accident occurrence probabilities, for panel effects, for unobserved heterogeneity through the use of random parameters, and for possible correlation among the latter. The comparative evaluation among the correlated grouped random parameters and the uncorrelated random parameters logit models, and their fixed parameters logit counterpart, demonstrates the potential of random parameters modeling in general, and the benefits of the correlated grouped random parameters approach specifically, in terms of statistical fit and explanatory power. Published by Elsevier Ltd.
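The modeling idea can be seen in miniature with a simulated-maximum-likelihood binary logit having one normally distributed random coefficient; the paper's correlated grouped random parameters would extend this with a Cholesky factor across several coefficients. The data and variable names here are invented.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, R = 2000, 200                      # observations, simulation draws

# synthetic crash/no-crash data: one fixed and one random factor
x_fix = rng.normal(size=n)            # e.g., standardized traffic volume
x_rnd = rng.normal(size=n)            # e.g., precipitation intensity
true_b = rng.normal(0.8, 0.5, size=n) # coefficient varies across segments
p = 1 / (1 + np.exp(-(-1.0 + 0.5 * x_fix + true_b * x_rnd)))
y = rng.binomial(1, p)

draws = rng.normal(size=(R, 1))       # Halton draws would be used in practice

def neg_sim_loglik(theta):
    a, b_fix, mu, log_sd = theta
    b_r = mu + np.exp(log_sd) * draws                 # (R, 1) random coefficient
    eta = a + b_fix * x_fix + b_r * x_rnd             # (R, n) by broadcasting
    pr = 1 / (1 + np.exp(-np.clip(eta, -35, 35)))
    lik = np.where(y == 1, pr, 1 - pr).mean(axis=0)   # simulated P(y_i)
    return -np.sum(np.log(lik + 1e-300))

res = minimize(neg_sim_loglik, x0=[0., 0., 0., -1.], method="BFGS")
a, b_fix, mu, log_sd = res.x
print(res.x.round(3), "estimated sd:", round(np.exp(log_sd), 3))
```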
ERIC Educational Resources Information Center
Marsh, Herbert W.; Hocevar, Dennis
The advantages of applying confirmatory factor analysis (CFA) to multitrait-multimethod (MTMM) data are widely recognized. However, because CFA as traditionally applied to MTMM data incorporates single indicators of each scale (i.e., each trait/method combination), important weaknesses are the failure to: (1) correct appropriately for measurement…
ERIC Educational Resources Information Center
Perry, John L.; Nicholls, Adam R.; Clough, Peter J.; Crust, Lee
2015-01-01
Despite the limitations of overgeneralizing cutoff values for confirmatory factor analysis (CFA; e.g., Marsh, Hau, & Wen, 2004), they are still often employed as golden rules for assessing factorial validity in sport and exercise psychology. The purpose of this study was to investigate the appropriateness of using the CFA approach with these…
ERIC Educational Resources Information Center
Costello, Rebecca; Welch, S. A.
2014-01-01
This article describes a qualitative approach in understanding factors that are evident in effective online class communities. Instructors and students in the same class were asked about their perceptions regarding what constitutes an effective online experience. The analysis was done using both Herzberg's (1962, 1965) motivator-hygiene factors…
NASA Astrophysics Data System (ADS)
Azadeh, A.; Foroozan, H.; Ashjari, B.; Motevali Haghighi, S.; Yazdanparast, R.; Saberi, M.; Torki Nejad, M.
2017-10-01
Information systems (ISs) and information technologies (ITs) play a critical role in large, complex gas corporations. Many factors, such as human, organisational and environmental factors, affect ISs in an organisation. Therefore, investigating IS success is considered a complex problem. Also, because of the competitive business environment and the high amount of information flow in organisations, new issues like resilient ISs and successful customer relationship management (CRM) have emerged. A resilient IS will provide sustainable delivery of information to internal and external customers. This paper presents an integrated approach to enhance and optimise the performance of each component of a large IS based on CRM and resilience engineering (RE) in a gas company. The enhancement of performance can help ISs to perform business tasks efficiently. The data are collected from standard questionnaires. They are then analysed by data envelopment analysis, selecting the optimal mathematical programming approach. The selected model is validated and verified by the principal component analysis method. Finally, CRM and RE factors are identified as influential factors through sensitivity analysis for this particular case study. To the best of our knowledge, this is the first study of performance assessment and optimisation of a large IS by combined RE and CRM.
Parkinson's Disease Subtypes in the Oxford Parkinson Disease Centre (OPDC) Discovery Cohort.
Lawton, Michael; Baig, Fahd; Rolinski, Michal; Ruffman, Claudio; Nithi, Kannan; May, Margaret T; Ben-Shlomo, Yoav; Hu, Michele T M
2015-01-01
Within Parkinson's disease there is a spectrum of clinical features at presentation which may represent sub-types of the disease. However, there is no widely accepted consensus on how best to group patients. We used a data-driven approach to unravel any heterogeneity in the Parkinson's phenotype in a well-characterised, population-based incidence cohort. 769 consecutive patients, with mean disease duration of 1.3 years, were assessed using a broad range of motor, cognitive and non-motor metrics. Multiple imputation was carried out using the chained equations approach to deal with missing data. We used an exploratory and then a confirmatory factor analysis to determine suitable domains to include within our cluster analysis. K-means cluster analysis of the factor scores and all the variables not loading into a factor was used to determine phenotypic subgroups. Our factor analysis found three important factors that were characterised by: psychological well-being features; non-tremor motor features, such as posture and rigidity; and cognitive features. Our subsequent five-cluster model identified groups characterised by (1) mild motor and non-motor disease (25.4%), (2) poor posture and cognition (23.3%), (3) severe tremor (20.8%), (4) poor psychological well-being, RBD and sleep (18.9%), and (5) severe motor and non-motor disease with poor psychological well-being (11.7%). Our approach identified several Parkinson's phenotypic sub-groups driven by largely dopaminergic-resistant features (RBD, impaired cognition and posture, poor psychological well-being) that, in addition to dopaminergic-responsive motor features, may be important for studying the aetiology, progression, and medication response of early Parkinson's.
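A minimal sketch of the analysis pipeline this abstract describes: dimension reduction by factor analysis followed by k-means clustering on the factor scores. Simulated data stand in for the cohort's clinical metrics, and the number of metrics is a placeholder.

```python
# Sketch: factor analysis to reduce clinical metrics to domain scores,
# then k-means on the factor scores. Data and dimensions are invented.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = rng.normal(size=(769, 12))            # 769 patients, 12 clinical metrics

Z = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=3, random_state=0)   # three domains
scores = fa.fit_transform(Z)                           # factor scores

km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(scores)
print(np.bincount(km.labels_))            # cluster sizes
```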
Junior College Faculty Job Satisfaction.
ERIC Educational Resources Information Center
Frankel, Joanne
Some of the research done to date concerning job satisfaction of junior college faculty is reviewed in this "Brief." Part I of the "Brief" describes four frameworks that have been applied to the analysis of job satisfaction: the traditional approach, the two-factor approach, the need hierarchy, and the cognitive dissonance approach. Part II…
Dynamic Assessment: An Approach Toward Reducing Test Bias.
ERIC Educational Resources Information Center
Carlson, Jerry S.; Wiedl, Karl Heinz
Through dynamic testing (the notion that tailored testing can be extended to the use of a learning-oriented approach to assessment), analyses were made of how motivational, personality, and cognitive style factors interact with assessment approaches to yield performance data. Testing procedures involving simple feedback, elaborated feedback, and…
Marshall, Brendan M; Moran, Kieran A
2015-12-01
Previous studies investigating the biomechanical factors associated with maximal countermovement jump height have typically used cross-sectional data. An alternative but less common approach is to use pre-to-posttraining change data, where the relationship between an improvement in jump height and a change in a factor is examined more directly. Our study compared the findings of these approaches. Such an evaluation is necessary because cross-sectional studies are currently a primary source of information for coaches when examining what factors to train to enhance performance. The countermovement jump of 44 males was analyzed before and after an 8-week training intervention. Correlations with jump height were calculated using both cross-sectional (pretraining data only) and pre-to-posttraining change data. Eight factors identified in the cross-sectional analysis were not significantly correlated with a change in jump height in the pre-to-post analysis. Additionally, only 6 of 11 factors identified in the pre-to-post analysis were identified in the cross-sectional analysis. These findings imply that (a) not all factors identified in a cross-sectional analysis may be critical to jump height improvement and (b) cross-sectional analyses alone may not provide an insight into all of the potential factors to train to enhance jump height. Coaches must be aware of these limitations when examining cross-sectional studies to identify factors to train to enhance jump ability. Additional findings highlight that although exercises prescribed to improve jump height should aim to enhance concentric power production at all joints, a particular emphasis on enhancing hip joint peak power may be warranted.
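The contrast the authors draw can be made concrete in a few lines of code: the same factor can correlate with jump height cross-sectionally yet show a different relation in pre-to-post change scores. The simulated numbers below are illustrative only.

```python
# Sketch contrasting the two correlation approaches: cross-sectional
# (pretraining factor vs. pretraining jump height) and pre-to-post change
# (change in factor vs. change in jump height). Simulated data only.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 44
factor_pre = rng.normal(size=n)
factor_post = factor_pre + rng.normal(scale=0.5, size=n)
jump_pre = 0.4 * factor_pre + rng.normal(scale=1.0, size=n)
jump_post = (jump_pre + 0.6 * (factor_post - factor_pre)
             + rng.normal(scale=0.5, size=n))

r_cross, p_cross = pearsonr(factor_pre, jump_pre)
r_change, p_change = pearsonr(factor_post - factor_pre,
                              jump_post - jump_pre)
print(f"cross-sectional r={r_cross:.2f} (p={p_cross:.3f}); "
      f"change r={r_change:.2f} (p={p_change:.3f})")
```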
NASA Astrophysics Data System (ADS)
Sun, Bingxiang; Jiang, Jiuchun; Zheng, Fangdan; Zhao, Wei; Liaw, Bor Yann; Ruan, Haijun; Han, Zhiqiang; Zhang, Weige
2015-05-01
State of health (SOH) estimation is critical to the battery management system to ensure the safety and reliability of EV battery operation. Here, we used a unique hybrid approach to enable complex SOH estimations. The approach hybridizes the Delphi method, known for its simplicity and effectiveness in applying weighting factors for complicated decision-making, with grey relational grade analysis (GRGA) for multi-factor optimization. Six critical factors were considered for SOH estimation: peak power at 30% state-of-charge (SOC); capacity; the voltage drop at 30% SOC with a C/3 pulse; the temperature rise at the end of discharge and at the end of charge at 1C, respectively; and the open circuit voltage at the end of charge after a 1-h rest. The weighting of these factors for SOH estimation was scored by the 'experts' in the Delphi method, indicating the influencing power of each factor on SOH. The parameters for these factors expressing the battery state variations are optimized by GRGA. Eight battery cells were used to illustrate the principle and methodology to estimate the SOH by this hybrid approach, and the results were compared with those based on capacity and power capability. The contrast among different SOH estimations is discussed.
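A minimal sketch of the grey relational grade computation assumed to underlie the GRGA step, with invented factor values and Delphi-style weights; the authors' exact normalization and weighting scheme are not reproduced here.

```python
# Sketch of grey relational grade analysis (GRGA) for combining several
# normalized health indicators into one SOH-like score per cell. The six
# factor columns and the weights are illustrative assumptions.
import numpy as np

def grey_relational_grade(X, weights, rho=0.5):
    """X: rows = cells, columns = factors, scaled so larger = healthier."""
    X = (X - X.min(0)) / (X.max(0) - X.min(0))        # normalize to [0, 1]
    ref = X.max(0)                                    # ideal reference series
    delta = np.abs(X - ref)
    # Grey relational coefficient: (d_min + rho*d_max) / (d + rho*d_max)
    coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coeff @ weights                            # weighted grade per cell

rng = np.random.default_rng(3)
factors = rng.random((8, 6))                 # 8 cells, 6 factors as above
w = np.array([0.25, 0.25, 0.15, 0.10, 0.10, 0.15])   # Delphi-style weights
print(grey_relational_grade(factors, w))
```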
Revuelta Menéndez, Javier; Ximénez Gómez, Carmen
2012-11-01
The application of mean and covariance structure analysis with quantitative data is increasing. However, latent means analysis with qualitative data is not as widespread. This article summarizes the procedures to conduct an analysis of latent means of dichotomous data from an item response theory approach. We illustrate the implementation of these procedures in an empirical example referring to the organizational context, where a multi-group analysis was conducted to compare the latent means of three employee groups in two factors measuring personal preferences and the perceived degree of rewards from the organization. Results show that higher personal motivations are associated with higher perceived importance of the organization, and that these perceptions differ across groups, so that higher-level employees have a lower level of personal and perceived motivation. The article shows how to estimate the factor means and the factor correlation from dichotomous data, and how to assess goodness of fit. Lastly, we provide the Mplus syntax code to facilitate latent means analyses for applied researchers.
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Quantitative Analysis of Guanine Nucleotide Exchange Factors (GEFs) as Enzymes
Randazzo, Paul A; Jian, Xiaoying; Chen, Pei-Wen; Zhai, Peng; Soubias, Olivier; Northup, John K
2014-01-01
The proteins that possess guanine nucleotide exchange factor (GEF) activity, which include about 800 G protein-coupled receptors (GPCRs), 15 Arf GEFs, 81 Rho GEFs, 8 Ras GEFs, and others for other families of GTPases, catalyze the exchange of GTP for GDP on all regulatory guanine nucleotide binding proteins. Despite their importance as catalysts, relatively few exchange factors (we are aware of only eight for Ras superfamily members) have been rigorously characterized kinetically. In some cases, kinetic analysis has been simplistic, leading to erroneous conclusions about mechanism (as discussed in a recent review). In this paper, we compare two approaches for determining the kinetic properties of exchange factors: (i) examining individual equilibria, and (ii) analyzing the exchange factors as enzymes. Each approach, when thoughtfully used, provides important mechanistic information about the exchange factors. The analysis as enzymes is described in further detail. With the focus on the production of the biologically relevant guanine nucleotide binding protein complexed with GTP (G•GTP), we believe it is conceptually simpler to connect the kinetic properties to cellular effects. Further, the experiments are often more tractable than those used to analyze the equilibrium system and, therefore, more widely accessible to scientists interested in the function of exchange factors. PMID:25332840
A contrarian view of the five-factor approach to personality description.
Block, J
1995-03-01
The 5-factor approach (FFA) to personality description has been represented as a comprehensive and compelling rubric for assessment. In this article, various misgivings about the FFA are delineated. The algorithmic method of factor analysis may not provide dimensions that are incisive. The "discovery" of the five factors may be influenced by unrecognized constraints on the variable sets analyzed. Lexical analyses are based on questionable conceptual and methodological assumptions, and have achieved uncertain results. The questionnaire version of the FFA has not demonstrated the special merits and sufficiencies of the five factors settled upon. Serious uncertainties have arisen in regard to the claimed 5-factor structure and the substantive meanings of the factors. Some implications of these problems are drawn.
Task Decomposition in Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald Laurids; Joe, Jeffrey Clark
2014-06-01
In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.
Using cognitive work analysis to explore activity allocation within military domains.
Jenkins, D P; Stanton, N A; Salmon, P M; Walker, G H; Young, M S
2008-06-01
Cognitive work analysis (CWA) is frequently advocated as an approach for the analysis of complex socio-technical systems. Much of the current CWA literature within the military domain pays particular attention to its initial phases: work domain analysis and contextual task analysis. By comparison, the analysis of social and organisational constraints receives much less attention. Through the study of a helicopter mission planning system software tool, this paper describes an approach for investigating the constraints affecting the distribution of work. The paper uses this model to evaluate the potential benefits of the social and organisational analysis phase within a military context. The analysis shows that, through its focus on constraints, the approach provides a unique description of the factors influencing the social organisation within a complex domain. This approach appears to be compatible with existing approaches and serves as a validation of more established social analysis techniques. As part of the ergonomic design of mission planning systems, the social organisation and cooperation analysis phase of CWA provides a constraint-based description informing allocation of function between key actor groups. This approach is useful because it poses questions related to the transfer of information and optimum working practices.
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin; Haghnegahdar, Amin
2016-04-01
Global sensitivity analysis (GSA) is a systems theoretic approach to characterizing the overall (average) sensitivity of one or more model responses across the factor space, by attributing the variability of those responses to different controlling (but uncertain) factors (e.g., model parameters, forcings, and boundary and initial conditions). GSA can be very helpful to improve the credibility and utility of Earth and Environmental System Models (EESMs), as these models are continually growing in complexity and dimensionality with continuous advances in understanding and computing power. However, conventional approaches to GSA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we identify several important sensitivity-related characteristics of response surfaces that must be considered when investigating and interpreting the "global sensitivity" of a model response (e.g., a metric of model performance) to its parameters/factors. Accordingly, we present a new and general sensitivity and uncertainty analysis framework, Variogram Analysis of Response Surfaces (VARS), based on an analogy to "variogram analysis", that characterizes a comprehensive spectrum of information on sensitivity. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices are contained within the VARS framework. We also present a practical strategy for the application of VARS to real-world problems, called STAR-VARS, including a new sampling strategy, called "star-based sampling". Our results across several case studies show the STAR-VARS approach to provide reliable and stable assessments of "global" sensitivity, while being at least 1-2 orders of magnitude more efficient than the benchmark Morris and Sobol approaches.
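The core variogram idea can be sketched in a few lines: for each factor, a directional variogram of the model response across perturbation scales h summarizes sensitivity at that scale. This toy example illustrates only the underlying quantity; it is not the published STAR-VARS algorithm or its sampling strategy.

```python
# Toy illustration of directional variograms as sensitivity measures:
# gamma_d(h) = 0.5 * E[(y(x + h*e_d) - y(x))^2] for factor d and scale h.
import numpy as np

def response(x):                       # toy model with unequal sensitivities
    return np.sin(3 * x[..., 0]) + 0.3 * x[..., 1] ** 2

def directional_variogram(dim, h, n=20000, rng=np.random.default_rng(4)):
    x = rng.random((n, 2))             # base points in the unit square
    xh = x.copy()
    xh[:, dim] = np.clip(xh[:, dim] + h, 0, 1)
    return 0.5 * np.mean((response(xh) - response(x)) ** 2)

for h in (0.05, 0.1, 0.3):
    g = [directional_variogram(d, h) for d in (0, 1)]
    print(f"h={h}: gamma_x1={g[0]:.4f}, gamma_x2={g[1]:.4f}")
```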
Erhart, M; Hagquist, C; Auquier, P; Rajmil, L; Power, M; Ravens-Sieberer, U
2010-07-01
This study compares item reduction analysis based on classical test theory (maximizing Cronbach's alpha; approach A) with analysis based on Rasch partial credit model item fit (approach B), as applied to children and adolescents' health-related quality of life (HRQoL) items. The reliability and structural, cross-cultural and known-group validity of the measures were examined. Within the European KIDSCREEN project, 3019 children and adolescents (8-18 years) from seven European countries answered 19 HRQoL items of the Physical Well-being dimension of a preliminary KIDSCREEN instrument. The Cronbach's alpha and corrected item-total correlation (approach A) were compared with infit mean squares and the Q-index item fit derived according to a partial credit model (approach B). Cross-cultural differential item functioning (DIF; ordinal logistic regression approach), structural validity (confirmatory factor analysis and residual correlation) and relative validity (RV) for socio-demographic and health-related factors were calculated for approaches (A) and (B). Approach (A) led to the retention of 13 items, compared with 11 items with approach (B). The item overlap was 69% for (A) and 78% for (B). The correlation coefficient of the summated ratings was 0.93. The Cronbach's alpha was similar for both versions [0.86 (A); 0.85 (B)]. Both approaches selected some items that are not strictly unidimensional and items displaying DIF. RV ratios favoured (A) with regard to socio-demographic aspects. Approach (B) was superior in RV with regard to health-related aspects. Both types of item reduction analysis should be accompanied by additional analyses. Neither of the two approaches was universally superior with regard to cultural, structural and known-group validity. However, the results support the usability of the Rasch method for developing new HRQoL measures for children and adolescents.
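A short sketch of the approach-A statistics named above (Cronbach's alpha and corrected item-total correlations), using simulated ordinal responses in place of the KIDSCREEN items.

```python
# Cronbach's alpha and corrected item-total correlations for an item
# bank; simulated 1-5 responses stand in for the 19 physical well-being
# items answered by 3019 respondents.
import numpy as np

def cronbach_alpha(items):
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(0, ddof=1).sum()
                          / items.sum(1).var(ddof=1))

def corrected_item_total(items):
    total = items.sum(1)
    # Correlate each item with the total score excluding that item.
    return np.array([np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                     for j in range(items.shape[1])])

rng = np.random.default_rng(5)
latent = rng.normal(size=(3019, 1))
items = np.round(np.clip(3 + latent + rng.normal(size=(3019, 19)), 1, 5))

print("alpha:", round(cronbach_alpha(items), 3))
print("weakest item r:", corrected_item_total(items).min().round(3))
```

In an approach-A item reduction, items with the lowest corrected item-total correlations are candidates for removal, and alpha is recomputed after each drop.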
Analysis of older driver safety interventions: a human factors taxonomic approach
DOT National Transportation Integrated Search
1999-03-01
The careful application of human factors design principles and guidelines is integral to the development of safe, efficient and usable Intelligent Transportation Systems (ITS). One segment of the driving population that may significantly benefit ...
USDA-ARS?s Scientific Manuscript database
Risk factors for obesity and weight gain are typically evaluated individually while "adjusting for" the influence of other confounding factors, and few studies, if any, have created risk profiles by clustering risk factors. We identified subgroups of postmenopausal women homogeneous in their cluster...
Behavioral Dimensions in One-Year-Olds and Dimensional Stability in Infancy.
ERIC Educational Resources Information Center
Hagekull, Berit; And Others
1980-01-01
The dimensional structure of infants' behavioral repertoire was shown to be highly stable over 3 to 15 months of age. Factor analysis of parent questionnaire data produced seven factors named Intensity/Activity, Regularity, Approach-Withdrawal, Sensory Sensitivity, Attentiveness, Manageability and Sensitivity to New Food. An eighth factor,…
Contemporary militant extremism: a linguistic approach to scale development.
Stankov, Lazar; Higgins, Derrick; Saucier, Gerard; Knežević, Goran
2010-06-01
In this article, the authors describe procedures used in the development of a new scale of militant extremist mindset. A 2-step approach consisted of (a) linguistic analysis of the texts produced by known terrorist organizations and selection of statements from these texts that reflect the mindset of those belonging to these organizations and (b) analyses of the structural properties of the scales based on 132 selected statements. Factor analysis of militant extremist statements with participants (N = 452) from Australia, Serbia, and the United States produced 3 dimensions: (a) justification and advocacy of violence (War factor), (b) violence in the name of God (God factor), and (c) blaming Western nations for the problems in the world today (West factor). We also report the distributions of scores for the 3 subscales, mean differences among the 3 national samples, and correlations with a measure of dogmatism (M. Rokeach, 1956).
ERIC Educational Resources Information Center
Jurs, Stephen; And Others
The scree test and its linear regression technique are reviewed, and results of its use in factor analysis and Delphi data sets are described. The scree test was originally a visual approach for making judgments about eigenvalues, which considered the relationships of the eigenvalues to one another as well as their actual values. The graph that is…
Marsh, Herbert W; Guo, Jiesi; Parker, Philip D; Nagengast, Benjamin; Asparouhov, Tihomir; Muthén, Bengt; Dicke, Theresa
2017-01-12
Scalar invariance is an unachievable ideal that in practice can only be approximated; often using potentially questionable approaches such as partial invariance based on a stepwise selection of parameter estimates with large modification indices. Study 1 demonstrates an extension of the power and flexibility of the alignment approach for comparing latent factor means in large-scale studies (30 OECD countries, 8 factors, 44 items, N = 249,840), for which scalar invariance is typically not supported in the traditional confirmatory factor analysis approach to measurement invariance (CFA-MI). Importantly, we introduce an alignment-within-CFA (AwC) approach, transforming alignment from a largely exploratory tool into a confirmatory tool, and enabling analyses that previously have not been possible with alignment (testing the invariance of uniquenesses and factor variances/covariances; multiple-group MIMIC models; contrasts on latent means) and structural equation models more generally. Specifically, it also allowed a comparison of gender differences in a 30-country MIMIC AwC (i.e., a SEM with gender as a covariate) and a 60-group AwC CFA (i.e., 30 countries × 2 genders) analysis. Study 2, a simulation study following up issues raised in Study 1, showed that latent means were more accurately estimated with alignment than with the scalar CFA-MI, and particularly with partial invariance scalar models based on the heavily criticized stepwise selection strategy. In summary, alignment augmented by AwC provides applied researchers from diverse disciplines considerable flexibility to address substantively important issues when the traditional CFA-MI scalar model does not fit the data. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
ERIC Educational Resources Information Center
Gong, Yue; Beck, Joseph E.; Heffernan, Neil T.
2011-01-01
Student modeling is a fundamental concept applicable to a variety of intelligent tutoring systems (ITS). However, there is not a lot of practical guidance on how to construct and train such models. This paper compares two approaches for student modeling, Knowledge Tracing (KT) and Performance Factors Analysis (PFA), by evaluating their predictive…
Silberstein, Lev; Goncalves, Kevin A; Kharchenko, Peter V; Turcotte, Raphael; Kfoury, Youmna; Mercier, Francois; Baryawno, Ninib; Severe, Nicolas; Bachand, Jacqueline; Spencer, Joel A; Papazian, Ani; Lee, Dongjun; Chitteti, Brahmananda Reddy; Srour, Edward F; Hoggatt, Jonathan; Tate, Tiffany; Lo Celso, Cristina; Ono, Noriaki; Nutt, Stephen; Heino, Jyrki; Sipilä, Kalle; Shioda, Toshihiro; Osawa, Masatake; Lin, Charles P; Hu, Guo-Fu; Scadden, David T
2016-10-06
Physiological stem cell function is regulated by secreted factors produced by niche cells. In this study, we describe an unbiased approach based on the differential single-cell gene expression analysis of mesenchymal osteolineage cells close to, and further removed from, hematopoietic stem/progenitor cells (HSPCs) to identify candidate niche factors. Mesenchymal cells displayed distinct molecular profiles based on their relative location. We functionally examined, among the genes that were preferentially expressed in proximal cells, three secreted or cell-surface molecules not previously connected to HSPC biology (the secreted RNase angiogenin, the cytokine IL18, and the adhesion molecule Embigin) and discovered that all of these factors are HSPC quiescence regulators. Therefore, our proximity-based differential single-cell approach reveals molecular heterogeneity within niche cells and can be used to identify novel extrinsic stem/progenitor cell regulators. Similar approaches could also be applied to other stem cell/niche pairs to advance the understanding of microenvironmental regulation of stem cell function. Copyright © 2016 Elsevier Inc. All rights reserved.
Patel, Nitin R; Ankolekar, Suresh
2007-11-30
Classical approaches to clinical trial design ignore economic factors that determine economic viability of a new drug. We address the choice of sample size in Phase III trials as a decision theory problem using a hybrid approach that takes a Bayesian view from the perspective of a drug company and a classical Neyman-Pearson view from the perspective of regulatory authorities. We incorporate relevant economic factors in the analysis to determine the optimal sample size to maximize the expected profit for the company. We extend the analysis to account for risk by using a 'satisficing' objective function that maximizes the chance of meeting a management-specified target level of profit. We extend the models for single drugs to a portfolio of clinical trials and optimize the sample sizes to maximize the expected profit subject to budget constraints. Further, we address the portfolio risk and optimize the sample sizes to maximize the probability of achieving a given target of expected profit.
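The decision-theoretic idea can be illustrated with a toy calculation: expected profit as a function of sample size equals the probability that the classical test succeeds times the market value, minus the trial cost. All monetary figures and the effect size below are invented, and the simple normal-approximation power formula is an assumption, not the authors' model.

```python
# Toy sketch: choose the Phase III per-arm sample size n that maximizes
# expected profit = P(success) * market value - trial cost, where success
# is a one-sided test at the regulator's level alpha.
import numpy as np
from scipy.stats import norm

effect, sigma = 0.25, 1.0          # assumed true effect and SD
value, cost_per_patient = 500e6, 50e3
alpha = 0.025                      # one-sided significance level

def expected_profit(n_per_arm):
    se = sigma * np.sqrt(2 / n_per_arm)
    power = 1 - norm.cdf(norm.ppf(1 - alpha) - effect / se)
    return power * value - 2 * n_per_arm * cost_per_patient

ns = np.arange(50, 2001, 10)
profits = np.array([expected_profit(n) for n in ns])
print("optimal n/arm:", ns[profits.argmax()],
      "expected profit ($M):", round(profits.max() / 1e6, 1))
```

The 'satisficing' variant described in the abstract would replace the objective with the probability that profit exceeds a management-specified target.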
A human factors methodology for real-time support applications
NASA Technical Reports Server (NTRS)
Murphy, E. D.; Vanbalen, P. M.; Mitchell, C. M.
1983-01-01
A general approach to the human factors (HF) analysis of new or existing projects at NASA/Goddard is delineated. Because the methodology evolved from HF evaluations of the Mission Planning Terminal (MPT) and the Earth Radiation Budget Satellite Mission Operations Room (ERBS MOR), it is directed specifically to the HF analysis of real-time support applications. Major topics included for discussion are the process of establishing a working relationship between the Human Factors Group (HFG) and the project, orientation of HF analysts to the project, human factors analysis and review, and coordination with major cycles of system development. Sub-topics include specific areas for analysis and appropriate HF tools. Management support functions are outlined. References provide a guide to sources of further information.
Feng, Yan Wen; Ooishi, Ayako; Honda, Shinya
2012-01-05
A simple systematic approach using Fourier transform infrared (FTIR) spectroscopy, size exclusion chromatography (SEC) and design of experiments (DOE) techniques was applied to the analysis of aggregation factors for protein formulations in stress and accelerated testing. FTIR and SEC were used to evaluate protein conformational and storage stabilities, respectively. DOE was used to determine the suitable formulation and to analyze both the main effect of single factors and the interaction effect of combined factors on aggregation. Our results indicated that (i) analysis at a low protein concentration is not always applicable to high concentration formulations; (ii) an investigation of interaction effects of combined factors as well as main effects of single factors is effective for improving conformational stability of proteins; (iii) with the exception of pH, the results of stress testing with regard to aggregation factors could inform a suitable formulation without performing time-consuming accelerated testing; (iv) a suitable pH condition should not be determined in stress testing but in accelerated testing, because of inconsistent effects of pH on conformational and storage stabilities. In summary, we propose a three-step strategy, using FTIR, SEC and DOE techniques, to effectively analyze the aggregation factors and perform a rapid screening for suitable conditions of protein formulation. Copyright © 2011 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Walter, Justin D.; Littlefield, Peter; Delbecq, Scott; Prody, Gerry; Spiegel, P. Clint
2010-01-01
New approaches are currently being developed to expose biochemistry and molecular biology undergraduates to a more interactive learning environment. Here, we propose a unique project-based laboratory module, which incorporates exposure to biophysical chemistry approaches to address problems in protein chemistry. Each of the experiments described…
ERIC Educational Resources Information Center
Chen, Junjun; Brown, Gavin T. L.
2016-01-01
This study surveyed 1064 Chinese school teachers' approaches to teaching and conceptions of assessment, and examined their inter-relationship using confirmatory factor analysis and structural equation modeling. Three approaches to teaching (i.e. Knowledge Transmission, Student-Focused, and Examination Preparation) and six conceptions of assessment…
A discriminant function approach to ecological site classification in northern New England
James M. Fincher; Marie-Louise Smith
1994-01-01
Describes one approach to ecologically based classification of upland forest community types of the White and Green Mountain physiographic regions. The classification approach is based on an intensive statistical analysis of the relationship between the communities and soil-site factors. Discriminant functions useful in distinguishing between types based on soil-site...
PTSD's underlying symptom dimensions and relations with behavioral inhibition and activation.
Contractor, Ateka A; Elhai, Jon D; Ractliffe, Kendra C; Forbes, David
2013-10-01
Reinforcement sensitivity theory (RST) stipulates that individuals have a behavioral activation system (BAS) guiding approach (rewarding) behaviors (Gray, 1971, 1981), and behavioral inhibition system (BIS) guiding conflict resolution between approach and avoidance (punishment) behaviors (Gray & McNaughton, 2000). Posttraumatic stress disorder (PTSD) severity overall relates to both BIS (e.g., Myers, VanMeenen, & Servatius, 2012; Pickett, Bardeen, & Orcutt, 2011) and BAS (Pickett et al., 2011). Using a more refined approach, we assessed specific relations between PTSD's latent factors (Simms, Watson, & Doebbeling, 2002) and observed variables measuring BIS and BAS using 308 adult, trauma-exposed primary care patients. Confirmatory factor analysis and Wald chi-square tests demonstrated a significantly greater association with BIS severity compared to BAS severity for PTSD's dysphoria, avoidance, and re-experiencing factors. Further, PTSD's avoidance factor significantly mediated relations between BIS/BAS severity and PTSD's dysphoria factor. Copyright © 2013 Elsevier Ltd. All rights reserved.
Busico, Gianluigi; Cuoco, Emilio; Kazakis, Nerantzis; Colombani, Nicolò; Mastrocicco, Micòl; Tedesco, Dario; Voudouris, Konstantinos
2018-03-01
Shallow aquifers are the most accessible reservoirs of potable groundwater; nevertheless, they are also prone to various sources of pollution and it is usually difficult to distinguish between human and natural sources at the watershed scale. The area chosen for this study (the Campania Plain) is characterized by high spatial heterogeneities both in geochemical features and in hydraulic properties. Groundwater mineralization is driven by many processes such as geothermal activity, weathering of volcanic products and intense human activities. In such a landscape, multivariate statistical analysis has been used to differentiate among the main hydrochemical processes occurring in the area, using three different approaches of factor analysis: (i) major elements, (ii) trace elements, (iii) both major and trace elements. The elaboration of the factor analysis approaches has revealed seven distinct hydrogeochemical processes: (i) salinization (Cl⁻, Na⁺); (ii) carbonate rock dissolution; (iii) anthropogenic inputs (NO₃⁻, SO₄²⁻, U, V); (iv) reducing conditions (Fe²⁺, Mn²⁺); (v) heavy metal contamination (Cr and Ni); (vi) geothermal fluid influence (Li⁺); and (vii) volcanic product contribution (As, Rb). Results from this study highlight the need to separately apply factor analysis when a large data set of trace elements is available. In fact, the impact of geothermal fluids in the shallow aquifer was identified from the application of the factor analysis using only trace elements. This study also reveals that the factor analysis of major and trace elements can differentiate between anthropogenic and geogenic sources of pollution in intensively exploited aquifers. Copyright © 2017 Elsevier Ltd. All rights reserved.
A risk-factor analysis of medical litigation judgments related to fall injuries in Korea.
Kim, Insook; Won, Seonae; Lee, Mijin; Lee, Won
2018-01-01
The aim of this study was to identify risk factors through an analysis of seven medical malpractice judgments related to fall injuries. The risk factors were analysed using a framework that approaches falls from a systems perspective and comprises people, organisational and environmental factors, each consisting of subfactors. The risk factors found in each of the seven judgments were aggregated into one framework. The risk factors related to patients (i.e. the people factor) were age, pain, related disease, activities and functional status, urination state, cognitive function impairment, past history of falls, blood transfusion, sleep endoscopy state and uncooperative attitude. The risk factors related to the medical staff and caregivers (also the people factor) were observation negligence, no fall prevention activities and negligence in managing the high-risk group for falls. Organisational risk factors were a lack of workforce, a lack of training, neglecting the management of the high-risk group, neglecting the management of caregivers and the absence of a fall prevention procedure. Regarding the environment, the risk factors were found to be the emergency room, chairs without a backrest and the examination table. Identifying risk factors is essential for preventing fall accidents, since falls are preventable patient-safety incidents. Falls do not happen as a result of a single risk factor. Therefore, a systems approach is effective for identifying risk factors, especially organisational and environmental ones.
One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald Laurids Boring, PhD
2014-09-01
In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.
NASA Technical Reports Server (NTRS)
Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.
1993-01-01
Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
ERIC Educational Resources Information Center
Manna, Venessa F.; Yoo, Hanwook
2015-01-01
This study examined the heterogeneity in the English-as-a-second-language (ESL) test population by modeling the relationship between test-taker background characteristics and test performance as measured by the "TOEFL iBT"® using a confirmatory factor analysis (CFA) with covariate approach. The background characteristics studied…
An approach to market analysis for lighter than air transportation of freight
NASA Technical Reports Server (NTRS)
Roberts, P. O.; Marcus, H. S.; Pollock, J. H.
1975-01-01
An approach is presented to marketing analysis for lighter than air vehicles in a commercial freight market. After a discussion of key characteristics of supply and demand factors, a three-phase approach to marketing analysis is described. The existing transportation systems are quantitatively defined and possible roles for lighter than air vehicles within this framework are postulated. The marketing analysis views the situation from the perspective of both the shipper and the carrier. A demand for freight service is assumed and the resulting supply characteristics are determined. Then, these supply characteristics are used to establish the demand for competing modes. The process is then iterated to arrive at the market solution.
Penel, Nicolas; Le Cesne, Axel; Bonvalot, Sylvie; Giraud, Antoine; Bompas, Emmanuelle; Rios, Maria; Salas, Sébastien; Isambert, Nicolas; Boudou-Rouquette, Pascaline; Honore, Charles; Italiano, Antoine; Ray-Coquard, Isabelle; Piperno-Neumann, Sophie; Gouin, François; Bertucci, François; Ryckewaert, Thomas; Kurtz, Jean-Emmanuel; Ducimetiere, Françoise; Coindre, Jean-Michel; Blay, Jean-Yves
2017-09-01
The outcome of desmoid-type fibromatosis (DTF) is unpredictable. Currently, a wait-and-see approach tends to replace large en bloc resection as the first therapeutic approach. Nevertheless, there are no validated factors to guide the treatment choice. We conducted a prospective study of 771 confirmed cases of DTF. We analysed event-free survival (EFS) based on the occurrence of relapse after surgery, progressive disease during the wait-and-see approach, or change in therapeutic strategy. Identification of prognostic factors was performed using classical methods (log-rank test and Cox model). Overall, the 2-year EFS was 56%; this value did not differ between patients undergoing an operation and those managed by the wait-and-see approach (53% versus 58%, p = 0.415). In univariate analysis, two prognostic factors significantly influenced the outcome: the nature of diagnostic sampling (p = 0.466) and primary location (p = 0.0001). The 2-year EFS was only 32% after open biopsy. The 2-year EFS was 66% for favourable locations (abdominal wall, intra-abdominal, breast, digestive viscera and lower limb) and 41% for unfavourable locations. Among patients with favourable locations, the 2-year EFS was similar in patients treated by both surgery (70%) and the wait-and-see approach (63%; p = 0.413). Among patients with unfavourable locations, the 2-year EFS was significantly enhanced in patients initially managed with the wait-and-see approach (52%) compared with those who underwent initial surgery (25%; p = 0.001). The location of DTF is a major prognostic factor for EFS. If these findings are confirmed by independent analysis, personalised management of DTF must consider this easily obtained parameter. Copyright © 2017 Elsevier Ltd. All rights reserved.
Kharroubi, Adel; Gargouri, Dorra; Baati, Houda; Azri, Chafai
2012-06-01
Concentrations of selected heavy metals (Cd, Pb, Zn, Cu, Mn, and Fe) in surface sediments from 66 sites in both northern and eastern Mediterranean Sea-Boughrara lagoon exchange areas (southeastern Tunisia) were studied in order to understand current metal contamination due to the urbanization and economic development of several nearby coastal regions of the Gulf of Gabès. Multiple approaches were applied for the sediment quality assessment. These approaches were based on GIS coupled with chemometric methods (enrichment factors, geoaccumulation index, principal component analysis, and cluster analysis). Enrichment factors and principal component analysis revealed two distinct groups of metals. The first group corresponded to Fe and Mn derived from natural sources, and the second group contained Cd, Pb, Zn, and Cu originating from man-made sources. For these latter metals, cluster analysis showed two distinct distributions in the selected areas. They were attributed to temporal and spatial variations of contaminant source inputs. The geoaccumulation index (Igeo) values showed that only Cd, Pb, and Cu can be considered moderate to extreme pollutants in the studied sediments.
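A minimal sketch of the two indices named above. The enrichment factor normalizes a metal to Fe against a background ratio, and the geoaccumulation index is Igeo = log2(Cn / (1.5 * Bn)). The background values used below are generic placeholders, not the study's reference values.

```python
# Enrichment factor (EF) and geoaccumulation index (Igeo) for sediment
# metals. Background concentrations (mg/kg) are assumed crustal values
# for illustration only.
import numpy as np

background = {"Cd": 0.3, "Pb": 20.0, "Fe": 47200.0}

def enrichment_factor(c_metal, c_fe, metal):
    # EF = (C_metal / C_Fe)_sample / (C_metal / C_Fe)_background
    return (c_metal / c_fe) / (background[metal] / background["Fe"])

def igeo(c_metal, metal):
    # Igeo = log2(Cn / (1.5 * Bn)); the 1.5 buffers natural variability.
    return np.log2(c_metal / (1.5 * background[metal]))

print("EF(Cd):", round(enrichment_factor(2.1, 35000.0, "Cd"), 2))
print("Igeo(Pb):", round(igeo(95.0, "Pb"), 2))
```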
ERIC Educational Resources Information Center
Zeiders, Katharine H.; Roosa, Mark W.; Knight, George P.; Gonzales, Nancy A.
2013-01-01
Although Mexican American adolescents experience multiple risk factors in their daily lives, most research examines the influences of risk factors on adjustment independently, ignoring the additive and interactive effects of multiple risk factors. Guided by a person-centered perspective and utilizing latent profile analysis, this study identified…
Identifying influential factors of business process performance using dependency analysis
NASA Astrophysics Data System (ADS)
Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank
2011-02-01
We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
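A minimal sketch of the dependency-tree idea, assuming a regression tree learned over invented process and QoS metrics; reading feature importances off the fitted tree gives a first cut at the influential factors, in the spirit of the drill-down analysis described above.

```python
# Learn a regression tree explaining a KPI from lower-level process and
# QoS metrics, then rank the metrics by importance. All names and data
# are invented for illustration.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(6)
n = 1000
qos = rng.exponential(1.0, size=(n, 3))        # e.g. service response times
proc = rng.normal(size=(n, 2))                 # e.g. queue length, retries
X = np.hstack([qos, proc])
kpi = 2.0 * qos[:, 0] + 0.5 * proc[:, 0] + rng.normal(scale=0.2, size=n)

tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, kpi)
names = ["rt_service_A", "rt_service_B", "rt_service_C",
         "queue_len", "retries"]
for name, imp in sorted(zip(names, tree.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.2f}")
```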
A novel approach to the analysis of squeezed-film air damping in microelectromechanical systems
NASA Astrophysics Data System (ADS)
Yang, Weilin; Li, Hongxia; Chatterjee, Aveek N.; Elfadel, Ibrahim (Abe) M.; Ocak, Ilker Ender; Zhang, TieJun
2017-01-01
Squeezed-film damping (SFD) is a phenomenon that significantly affects the performance of micro-electro-mechanical systems (MEMS). The total damping force in MEMS mainly includes the viscous damping force and the elastic damping force. The quality factor (Q factor) is usually used to evaluate the damping in MEMS. In this work, we measure the Q factor of a resonator through experiments over a wide range of pressure levels. In fact, experimental characterizations of MEMS have some limitations because it is difficult to conduct experiments at very high vacuum and also hard to differentiate the damping mechanisms from the overall Q factor measurements. On the other hand, classical theoretical analysis of SFD is restricted to strong assumptions and simple geometries. In this paper, a novel numerical approach, which is based on lattice Boltzmann simulations, is proposed to investigate SFD in MEMS. Our method considers the dynamics of squeezed air flow as well as fluid-solid interactions in MEMS. It is demonstrated that the Q factor can be directly predicted by numerical simulation, and our simulation results agree well with experimental data. Factors that influence SFD, such as pressure, oscillating amplitude, and driving frequency, are investigated separately. Furthermore, viscous damping and elastic damping forces are quantitatively compared based on comprehensive simulation. The proposed numerical approach as well as experimental characterization enables us to reveal the insightful physics of squeezed-film air damping in MEMS.
Test-retest reliability of the underlying latent factor structure of alcohol subjective response.
Lutz, Joseph A; Childs, Emma
2017-04-01
Alcohol subjective experiences are multi-dimensional and demonstrate wide inter-individual variability. Recent efforts have sought to establish a clearer understanding of subjective alcohol responses by identifying core constructs derived from multiple measurement instruments. The aim of this study was to evaluate the temporal stability of this approach to conceptualizing alcohol subjective experiences across successive alcohol administrations in the same individuals. Healthy moderate alcohol drinkers (n = 104) completed six experimental sessions each, three with alcohol (0.8 g/kg), and three with a non-alcoholic control beverage. Participants reported subjective mood and drug effects using standardized questionnaires before and at repeated times after beverage consumption. We explored the underlying latent structure of subjective responses for all alcohol administrations using exploratory factor analysis and then tested measurement invariance over the three successive administrations using multi-group confirmatory factor analyses. Exploratory factor analyses on responses to alcohol across all administrations yielded four factors representing "Positive mood," "Sedation," "Stimulation/Euphoria," and "Drug effects and Urges." A confirmatory factor analysis on the separate administrations indicated acceptable configural and metric invariance and moderate scalar invariance. In this study, we demonstrate temporal stability of the underlying constructs of subjective alcohol responses derived from factor analysis. These findings strengthen the utility of this approach to conceptualizing subjective alcohol responses especially for use in prospective and longitudinal alcohol challenge studies relating subjective response to alcohol use disorder risk.
Gender and education impact on brain aging: a general cognitive factor approach.
Proust-Lima, Cécile; Amieva, Hélène; Letenneur, Luc; Orgogozo, Jean-Marc; Jacqmin-Gadda, Hélène; Dartigues, Jean-François
2008-09-01
In cognitive aging research, the study of a general cognitive factor has been shown to have a substantial explanatory power over the study of isolated tests. The authors aimed at differentiating the impact of gender and education on global cognitive change with age from their differential impact on 4 psychometric tests using a new latent process approach, which intermediates between a single-factor longitudinal model for sum scores and an item-response theory approach for longitudinal data. The analysis was conducted on a sample of 2,228 subjects from PAQUID, a population-based cohort of older adults followed for 13 years with repeated measures of cognition. Adjusted for vascular factors, the analysis confirmed that women performed better in tests involving verbal components, while men performed better in tests involving visuospatial skills. In addition, the model suggested that women had a slightly steeper global cognitive decline with oldest age than men, even after excluding incident dementia or death. Subjects with higher education exhibited a better mean score for the 4 tests, but this difference tended to attenuate with age for tests involving a speed component. (c) 2008 APA, all rights reserved
Azadeh, Ali; Sheikhalishahi, Mohammad
2015-06-01
A unique framework for performance optimization of generation companies (GENCOs) based on health, safety, environment, and ergonomics (HSEE) indicators is presented. To rank this sector of industry, a combination of data envelopment analysis (DEA), principal component analysis (PCA), and the Taguchi method is used for all branches of GENCOs. These methods are applied in an integrated manner to measure GENCO performance. The preferred model among DEA, PCA, and Taguchi is selected based on sensitivity analysis and maximum correlation between rankings. To achieve the stated objectives, noise is introduced into the input data. The results show that Taguchi outperforms the other methods. Moreover, a comprehensive experiment is carried out to identify the most influential factor for ranking GENCOs. The approach developed in this study could be used for continuous assessment and improvement of GENCO performance in supplying energy with respect to HSEE factors. The results of such studies would help managers to gain a better understanding of weak and strong points in terms of HSEE factors.
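As one concrete building block, the sketch below computes input-oriented CCR efficiency scores, the standard DEA formulation, by linear programming. The input/output data are illustrative stand-ins for the HSEE indicators, not the study's data.

```python
# Input-oriented CCR DEA: for each unit o, minimize theta subject to
# X @ lam <= theta * x_o and Y @ lam >= y_o with lam >= 0.
import numpy as np
from scipy.optimize import linprog

# Columns are decision-making units (DMUs): 2 inputs, 1 output each.
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0],
              [4.0, 2.0], [2.0, 4.0]]).T          # inputs,  shape (m, n)
Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]]).T  # outputs, shape (s, n)
m, n = X.shape
s = Y.shape[0]

def ccr_efficiency(o):
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1); c[0] = 1.0                    # minimize theta
    A_ub = np.zeros((m + s, n + 1))
    A_ub[:m, 0] = -X[:, o]; A_ub[:m, 1:] = X           # X lam - theta x_o <= 0
    A_ub[m:, 1:] = -Y                                  # -Y lam <= -y_o
    b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```

A score of 1.0 marks a unit on the efficient frontier; lower scores indicate how far its inputs could be proportionally reduced.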
Temperature effects on the strainrange partitioning approach for creep-fatigue analysis
NASA Technical Reports Server (NTRS)
Halford, G. R.; Hirschberg, M. H.; Manson, S. S.
1972-01-01
Examination is made of the influence of temperature on the strainrange partitioning approach to creep-fatigue. Results for Cr-Mo steel and Type 316 stainless steel show the four partitioned strainrange-life relationships to be temperature insensitive to within a factor of two on cyclic life. Monotonic creep and tensile ductilities were also found to be temperature insensitive to within a factor of two. The approach provides bounds on cyclic life that can be readily established for any type of inelastic strain cycle. Continuous strain cycling results obtained over a broad range of high temperatures and frequencies are in excellent agreement with bounds provided by the approach. The observed transition from one bound to the other is also in good agreement with the approach.
Fong, Ted C T; Ho, Rainbow T H
2015-01-01
The aim of this study was to reexamine the dimensionality of the widely used 9-item Utrecht Work Engagement Scale using the maximum likelihood (ML) approach and Bayesian structural equation modeling (BSEM) approach. Three measurement models (1-factor, 3-factor, and bi-factor models) were evaluated in two split samples of 1,112 health-care workers using confirmatory factor analysis and BSEM, which specified small-variance informative priors for cross-loadings and residual covariances. Model fit and comparisons were evaluated by posterior predictive p-value (PPP), deviance information criterion, and Bayesian information criterion (BIC). None of the three ML-based models showed an adequate fit to the data. The use of informative priors for cross-loadings did not improve the PPP for the models. The 1-factor BSEM model with approximately zero residual covariances displayed a good fit (PPP>0.10) to both samples and a substantially lower BIC than its 3-factor and bi-factor counterparts. The BSEM results demonstrate empirical support for the 1-factor model as a parsimonious and reasonable representation of work engagement.
Probabilistic Aeroelastic Analysis of Turbomachinery Components
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.
2004-01-01
A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.
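A toy Monte Carlo sketch of the probabilistic workflow described here: propagate uncertain design variables through a response function, estimate the response PDF, and rank sensitivity factors. The flutter-margin formula below is invented for illustration and is not the PFA, FPI, or NESSUS methodology.

```python
# Monte Carlo uncertainty propagation: sample uncertain variables, push
# them through a (toy) response model, then estimate the PDF and rank
# variables by rank correlation with the response.
import numpy as np
from scipy.stats import spearmanr, gaussian_kde

rng = np.random.default_rng(7)
n = 20_000
stiffness = rng.normal(1.00, 0.05, n)      # uncertain design variables
damping = rng.normal(0.02, 0.004, n)
dyn_pressure = rng.normal(1.00, 0.10, n)

# Invented flutter-margin response; positive = stable.
margin = stiffness * (1 + 10 * damping) - 0.8 * dyn_pressure

pdf = gaussian_kde(margin)                 # kernel estimate of the PDF
print("P(margin < 0) =", (margin < 0).mean())
print("density at margin = 0:", pdf(0.0)[0])
for name, x in [("stiffness", stiffness), ("damping", damping),
                ("dyn_pressure", dyn_pressure)]:
    print(name, "sensitivity:", round(spearmanr(x, margin)[0], 2))
```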
A practical guide to environmental association analysis in landscape genomics.
Rellstab, Christian; Gugerli, Felix; Eckert, Andrew J; Hancock, Angela M; Holderegger, Rolf
2015-09-01
Landscape genomics is an emerging research field that aims to identify the environmental factors that shape adaptive genetic variation and the gene variants that drive local adaptation. Its development has been facilitated by next-generation sequencing, which allows for screening thousands to millions of single nucleotide polymorphisms in many individuals and populations at reasonable costs. In parallel, data sets describing environmental factors have greatly improved and increasingly become publicly accessible. Accordingly, numerous analytical methods for environmental association studies have been developed. Environmental association analysis identifies genetic variants associated with particular environmental factors and has the potential to uncover adaptive patterns that are not discovered by traditional tests for the detection of outlier loci based on population genetic differentiation. We review methods for conducting environmental association analysis including categorical tests, logistic regressions, matrix correlations, general linear models and mixed effects models. We discuss the advantages and disadvantages of different approaches, provide a list of dedicated software packages and their specific properties, and stress the importance of incorporating neutral genetic structure in the analysis. We also touch on additional important aspects such as sampling design, environmental data preparation, pooled and reduced-representation sequencing, candidate-gene approaches, linearity of allele-environment associations and the combination of environmental association analyses with traditional outlier detection tests. We conclude by summarizing expected future directions in the field, such as the extension of statistical approaches, environmental association analysis for ecological gene annotation, and the need for replication and post hoc validation studies. © 2015 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Effati, Meysam; Thill, Jean-Claude; Shabani, Shahin
2015-04-01
The contention of this paper is that many social science research problems are too "wicked" to be suitably studied using conventional statistical and regression-based methods of data analysis. This paper argues that an integrated geospatial approach based on methods of machine learning is well suited to this purpose. Recognizing the intrinsic wickedness of traffic safety issues, such approach is used to unravel the complexity of traffic crash severity on highway corridors as an example of such problems. The support vector machine (SVM) and coactive neuro-fuzzy inference system (CANFIS) algorithms are tested as inferential engines to predict crash severity and uncover spatial and non-spatial factors that systematically relate to crash severity, while a sensitivity analysis is conducted to determine the relative influence of crash severity factors. Different specifications of the two methods are implemented, trained, and evaluated against crash events recorded over a 4-year period on a regional highway corridor in Northern Iran. Overall, the SVM model outperforms CANFIS by a notable margin. The combined use of spatial analysis and artificial intelligence is effective at identifying leading factors of crash severity, while explicitly accounting for spatial dependence and spatial heterogeneity effects. Thanks to the demonstrated effectiveness of a sensitivity analysis, this approach produces comprehensive results that are consistent with existing traffic safety theories and supports the prioritization of effective safety measures that are geographically targeted and behaviorally sound on regional highway corridors.
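A minimal sketch of the SVM side of such a study, with simulated data and invented feature names: fit a classifier for crash severity, then use permutation importance as one simple form of sensitivity analysis. The paper's CANFIS model and its actual sensitivity procedure are not reproduced here.

```python
# SVM severity classifier plus permutation-importance sensitivity
# ranking. Data and feature names are invented for illustration.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(8)
n = 1500
X = rng.normal(size=(n, 4))           # e.g. speed, curvature, AADT, lighting
severity = (X[:, 0] + 0.5 * X[:, 1]
            + rng.normal(scale=0.8, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, severity, random_state=0)
model = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))

imp = permutation_importance(model, X_te, y_te, n_repeats=20,
                             random_state=0)
for name, val in zip(["speed", "curvature", "AADT", "lighting"],
                     imp.importances_mean):
    print(f"{name}: {val:.3f}")
```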
ERIC Educational Resources Information Center
Ilhan-Beyaztas, Dilek; Göçer-Sahin, Sakine
2018-01-01
A good analysis of the success factors in the university entrance exam, which is an important step for academic careers of students, is believed to help them manage this process. Properties such as self-regulation and learning approaches adopted by students undoubtedly influence their academic achievement as well as their success in university…
Top-down and bottom-up definitions of human failure events in human reliability analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald Laurids
2014-10-01
In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down, defined as a subset of the PRA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.
Core and peripheral criteria of video game addiction in the game addiction scale for adolescents.
Brunborg, Geir Scott; Hanss, Daniel; Mentzoni, Rune Aune; Pallesen, Ståle
2015-05-01
Assessment of video game addiction often involves measurement of peripheral criteria that indicate high engagement with games, and core criteria that indicate problematic use of games. A survey of the Norwegian population aged 16-74 years (N=10,081, response rate 43.6%) was carried out in 2013, which included the Gaming Addiction Scale for Adolescents (GAS). Confirmatory factor analysis showed that a two-factor structure, which separated peripheral criteria from core criteria, fitted the data better (CFI=0.963; RMSEA=0.058) compared to the original one-factor solution where all items are determined to load only on one factor (CFI=0.905, RMSEA=0.089). This was also found when we analyzed men aged ≤33 years, men aged >33 years, women aged ≤33 years, and women aged >33 years separately. This indicates that the GAS measures both engagement and problems related to video games. Multi-group measurement invariance testing showed that the factor structure was valid in all four groups (configural invariance) for the two-factor structure but not for the one-factor structure. A novel approach to categorization of problem gamers and addicted gamers where only the core criteria items are used (the CORE 4 approach) was compared to the approach where all items are included (the GAS 7 approach). The current results suggest that the CORE 4 approach might be more appropriate for classification of problem gamers and addicted gamers compared to the GAS 7 approach.
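For readers unfamiliar with the fit indices quoted above, the following minimal sketch shows how RMSEA and CFI are computed from model and baseline chi-square statistics; the chi-square inputs are hypothetical, not values from the GAS analysis.

```python
# Minimal sketch of the fit indices reported above (RMSEA, CFI), computed from
# model and baseline chi-square statistics. Input values are hypothetical.
def rmsea(chi2, df, n):
    # Root mean square error of approximation for a model fit to n cases.
    return (max(chi2 - df, 0.0) / (df * (n - 1))) ** 0.5

def cfi(chi2_m, df_m, chi2_b, df_b):
    # Comparative fit index: model noncentrality relative to the baseline model.
    d_m = max(chi2_m - df_m, 0.0)
    d_b = max(chi2_b - df_b, d_m)
    return 1.0 - d_m / d_b if d_b > 0 else 1.0

n = 10081
print(rmsea(chi2=450.0, df=13, n=n))                       # hypothetical two-factor model
print(cfi(chi2_m=450.0, df_m=13, chi2_b=12000.0, df_b=21)) # hypothetical baseline
```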
ERIC Educational Resources Information Center
National Association of College and University Business Officers, Washington, DC.
Cost behavior analysis, a costing process that can assist managers in estimating how certain institutional costs change in response to volume, policy, and environmental factors, is described. The five steps of this approach are examined, and the application of cost behavior analysis at four college-level settings is documented. The institutions…
Advances in environmental and occupational disorders in 2012.
Peden, David B; Bush, Robert K
2013-03-01
The year 2012 produced a number of advances in our understanding of the effect of environmental factors on allergic diseases, identification of new allergens, immune mechanisms in host defense, factors involved in asthma severity, and therapeutic approaches. This review focuses on the articles published in the Journal in 2012 that enhance our knowledge base of environmental and occupational disorders. Identification of novel allergens can improve diagnostics, risk factor analysis can aid preventative approaches, and studies of genetic-environmental interactions and immune mechanisms will lead to better therapeutics. Copyright © 2013 American Academy of Allergy, Asthma & Immunology. Published by Mosby, Inc. All rights reserved.
Robertson, Dale M.; Saad, D.A.; Heisey, D.M.
2006-01-01
Various approaches are used to subdivide large areas into regions containing streams that have similar reference or background water quality and that respond similarly to different factors. For many applications, such as establishing reference conditions, it is preferable to use physical characteristics that are not affected by human activities to delineate these regions. However, most approaches, such as ecoregion classifications, rely on land use to delineate regions or have difficulties compensating for the effects of land use. Land use not only directly affects water quality, but it is often correlated with the factors used to define the regions. In this article, we describe modifications to SPARTA (spatial regression-tree analysis), a relatively new approach applied to water-quality and environmental characteristic data to delineate zones with similar factors affecting water quality. In this modified approach, land-use-adjusted (residualized) water quality and environmental characteristics are computed for each site. Regression-tree analysis is applied to the residualized data to determine the most statistically important environmental characteristics describing the distribution of a specific water-quality constituent. Geographic information for small basins throughout the study area is then used to subdivide the area into relatively homogeneous environmental water-quality zones. For each zone, commonly used approaches are subsequently used to define its reference water quality and how its water quality responds to changes in land use. SPARTA is used to delineate zones of similar reference concentrations of total phosphorus and suspended sediment throughout the upper Midwestern part of the United States. © 2006 Springer Science+Business Media, Inc.
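The residualize-then-partition logic of the modified SPARTA can be sketched as follows. This is a simplified illustration on simulated data, with a single land-use covariate and generic environmental characteristics; it is not the authors' implementation.

```python
# Sketch of the residualization step described above: remove a land-use signal
# from a water-quality constituent, then grow a regression tree on the
# residuals using environmental characteristics. Data are simulated.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
n = 400
land_use = rng.uniform(0, 1, size=(n, 1))          # e.g., fraction agriculture
env = rng.normal(size=(n, 3))                      # e.g., soil, slope, climate
tp = 0.8 * land_use[:, 0] + 0.5 * env[:, 0] + rng.normal(0, 0.1, n)  # total P

# Step 1: residualize water quality on land use.
resid = tp - LinearRegression().fit(land_use, tp).predict(land_use)

# Step 2: regression tree on residuals; leaves approximate homogeneous zones.
tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=30).fit(env, resid)
zones = tree.apply(env)                            # leaf index per site
print("number of candidate zones:", len(set(zones)))
```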
Dretzke, Janine; Ensor, Joie; Bayliss, Sue; Hodgkinson, James; Lordkipanidzé, Marie; Riley, Richard D; Fitzmaurice, David; Moore, David
2014-12-03
Prognostic factors are associated with the risk of future health outcomes in individuals with a particular health condition. The prognostic ability of such factors is increasingly being assessed in both primary research and systematic reviews. Systematic review methodology in this area is continuing to evolve, reflected in variable approaches to key methodological aspects. The aim of this article was to (i) explore and compare the methodology of systematic reviews of prognostic factors undertaken for the same clinical question, (ii) discuss implications for review findings, and (iii) present recommendations on what might be considered 'good practice' approaches. The sample comprised eight systematic reviews addressing the same clinical question, namely whether 'aspirin resistance' (a potential prognostic factor) has prognostic utility relative to future vascular events in patients on aspirin therapy for secondary prevention. A detailed comparison of methods around study identification, study selection, quality assessment, approaches to analysis, and reporting of findings was undertaken and the implications discussed. These were summarised into key considerations that may be transferable to future systematic reviews of prognostic factors. Across systematic reviews addressing the same clinical question, there were considerable differences in the numbers of studies identified and overlap between included studies, which could only partially be explained by different study eligibility criteria. Incomplete reporting and differences in terminology within primary studies hampered the study identification and selection process across reviews. Quality assessment was highly variable and only one systematic review considered a checklist for studies of prognostic questions. There was inconsistency between reviews in approaches towards analysis, synthesis, addressing heterogeneity and reporting of results. Different methodological approaches may ultimately affect the findings and interpretation of systematic reviews of prognostic research, with implications for clinical decision-making.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoon, Hyunjin; Ansong, Charles; McDermott, Jason E.
Background: Systemic bacterial infections are highly regulated and complex processes that are orchestrated by numerous virulence factors. Genes that are coordinately controlled by the set of regulators required for systemic infection are potentially required for pathogenicity. Results: In this study we present a systems biology approach in which sample-matched multi-omic measurements of fourteen virulence-essential regulator mutants were coupled with computational network analysis to efficiently identify Salmonella virulence factors. Immunoblot experiments verified network-predicted virulence factors and a subset was determined to be secreted into the host cytoplasm, suggesting that they are virulence factors directly interacting with host cellular components. Two of these, SrfN and PagK2, were required for full mouse virulence and were shown to be translocated independent of either of the type III secretion systems in Salmonella or the type III injectisome-related flagellar mechanism. Conclusions: Integrating multi-omic datasets from Salmonella mutants lacking virulence regulators not only identified novel virulence factors but also defined a new class of translocated effectors involved in pathogenesis. The success of this strategy at discovery of known and novel virulence factors suggests that the approach may have applicability for other bacterial pathogens.
ERIC Educational Resources Information Center
Bass, Gwen; Lee, Ji Hee; Wells, Craig; Carey, John C.; Lee, Sangmin
2015-01-01
The scale development and exploratory and confirmatory factor analyses of the Protective Factor Index (PFI) is described. The PFI is a 13-item component of elementary students' report cards that replaces typical items associated with student behavior. The PFI is based on the Construct-Based Approach (CBA) to school counseling, which proposes that…
NASA Technical Reports Server (NTRS)
Stanturf, J. A.; Heimbuch, D. G.
1980-01-01
A refinement to the matrix approach to environmental impact assessment is to use landscape units in place of separate environmental elements in the analysis. Landscape units can be delineated by integrating remotely sensed data and available single-factor data. A remote sensing approach to landscape stratification is described and the conditions under which it is superior to other approaches that require single-factor maps are indicated. Flowcharts show the steps necessary to develop classification criteria, delineate units and a map legend, and use the landscape units in impact assessment. Application of the approach to assessing impacts of a transmission line in Montana is presented to illustrate the method.
The integration of Human Factors (HF) in the SAR process training course text
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryan, T.G.
1995-03-01
This text provides the technical basis for a two-day course on human factors (HF), as applied to the Safety Analysis Report (SAR) process. The overall objective of this text and course is to provide the participant with a working knowledge of human factors-related requirements, suggestions for doing a human safety analysis applying a graded approach, and an ability to demonstrate, using the results of the human safety analysis, that human factors elements as defined by DOE (human factors engineering, procedures, training, oversight, staffing, qualifications) can support, wherever necessary, nuclear safety commitments in the SAR. More specifically, the objectives of the text and course are: (1) To provide the SAR preparer with general guidelines for doing HF within the context of a graded approach for the SAR; (2) To sensitize DOE facility managers and staff, safety analysts and SAR preparers, independent reviewers, and DOE reviewers and regulators, to DOE Order 5480.23 requirements for HF in the SAR; (3) To provide managers, analysts, reviewers and regulators with a working knowledge of HF concepts and techniques within the context of a graded approach for the SAR, and (4) To provide SAR managers and DOE reviewers and regulators with general guidelines for monitoring and coordinating the work of preparers of HF inputs throughout the SAR process, and for making decisions regarding the safety relevance of HF inputs to the SAR. As a ready reference for implementing the human factors requirements of DOE Order 5480.22 and DOE Standard 3009-94, this course text and accompanying two-day course are intended for all persons who are involved in the SAR.
Dimensional profiles of male to female gender identity disorder: an exploratory research.
Fisher, Alessandra D; Bandini, Elisa; Ricca, Valdo; Ferruccio, Naika; Corona, Giovanni; Meriggiola, Maria C; Jannini, Emmanuele A; Manieri, Chiara; Ristori, Jiska; Forti, Gianni; Mannucci, Edoardo; Maggi, Mario
2010-07-01
Male-to-Female Gender Identity Disorder (MtF GID) is a complex phenomenon that could be better evaluated by using a dimensional approach. To explore the aggregation of clinical manifestations of MtF GID in order to identify meaningful variables describing the heterogeneity of the disorder. A consecutive series of 80 MtF GID subjects (mean age 37 +/- 10.3 years), referred to the Interdepartmental Center for Assistance Gender Identity Disorder of Florence and to other Italian centers from July 2008 to June 2009, was studied. Diagnosis was based on formal psychiatric classification criteria. Factor analysis was performed. Several socio-demographic and clinical parameters were investigated. Patients were asked to complete the Bem Sex Role Inventory (BSRI, a self-rating scale to evaluate gender role) and Symptom Checklist-90 Revised (SCL-90-R, a self-rating scale to measure psychological state). Factor analysis identified two dimensional factors: Factor 1 was associated with sexual orientation, and Factor 2 related to behavioral and psychological correlates of early GID development. No correlation was observed between the two factors. A positive correlation between Factor 2 and feminine BSRI score was found, along with a negative correlation between Factor 2 and undifferentiated BSRI score. Moreover, a significant association between SCL-90-R Phobic subscale score and Factor 2 was observed. A variety of other socio-demographic parameters and clinical features were associated with both factors. Behavioral and psychological correlates of Factor 1 (sexual orientation) and Factor 2 (gender identity) do not constitute the framework of two separate clinical entities, but instead represent two dimensions of the complex MtF GID structure, which can be variably intertwined in the same subject. By using factor analysis, we offer a new approach capable of delineating a psychopathological and clinical profile of MtF GID patients.
Chigerwe, Munashe; Ilkiw, Jan E; Boudreaux, Karen A
2011-01-01
The objectives of the present study were to evaluate first-, second-, third-, and fourth-year veterinary medical students' approaches to studying and learning as well as the factors within the curriculum that may influence these approaches. A questionnaire consisting of the short version of the Approaches and Study Skills Inventory for Students (ASSIST) was completed by 405 students, and it included questions relating to conceptions about learning, approaches to studying, and preferences for different types of courses and teaching. Descriptive statistics, factor analysis, Cronbach's alpha analysis, and log-linear analysis were performed on the data. Deep, strategic, and surface learning approaches emerged. There were a few differences between our findings and those presented in previous studies in terms of the correlation of the subscale monitoring effectiveness, which showed loading with both the deep and strategic learning approaches. In addition, the subscale alertness to assessment demands showed correlation with the surface learning approach. The perception of high workloads, the use of previous test files as a method for studying, and examinations that are based only on material provided in lecture notes were positively associated with the surface learning approach. Focusing on improving specific teaching and assessment methods that enhance deep learning is anticipated to enhance students' positive learning experience. These teaching methods include instructors who encourage students to be critical thinkers, the integration of course material in other disciplines, courses that encourage thinking and reading about the learning material, and books and articles that challenge students while providing explanations beyond lecture material.
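As a concrete illustration of one analysis named above, here is a minimal sketch of Cronbach's alpha on simulated Likert-type responses; the sample size echoes the study's n = 405, but the data and item count are invented.

```python
# Minimal sketch of Cronbach's alpha, one of the analyses reported above.
# `items` is an (n_respondents x n_items) array of Likert responses (simulated).
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(2)
latent = rng.normal(size=(405, 1))                       # shared trait
items = latent + rng.normal(scale=0.8, size=(405, 6))    # 6 correlated items
print(f"alpha = {cronbach_alpha(items):.2f}")
```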
ERIC Educational Resources Information Center
Agbetsiafa, Douglas
2010-01-01
This paper explores the factors that affect students' evaluation of economic instruction using a sample of 1300 completed rating instruments at a comprehensive four-year mid-western public university. The study uses factor analysis to determine the validity and reliability of the evaluation instrument in assessing instructor or course…
Factors Leading to Success in Diversified Occupation: A Livelihood Analysis in India
ERIC Educational Resources Information Center
Saha, Biswarup; Bahal, Ram
2015-01-01
Purpose: Livelihood diversification is a sound alternative for higher economic growth and its success or failure is conditioned by the interplay of a multitude of factors. The study of the profile of the farmers in which they operate is important to highlight the factors leading to success in diversified livelihoods. Design/Methodology/Approach: A…
Salutogenic factors for mental health promotion in work settings and organizations.
Graeser, Silke
2011-12-01
As companies and organizations become increasingly aware of mental health conditions in work settings, the salutogenic perspective provides a promising approach to identifying supportive factors and resources of organizations to promote mental health. Based on the sense of coherence (SOC), usually treated as an individual personality trait concept, an organization-based SOC scale was developed to identify potential salutogenic factors of a university as an organization and workplace. Using results from two samples of employees (n = 362, n = 204), factors associated with the organization-based SOC were evaluated. Statistical analysis yielded significant correlations between mental health and the setting-based SOC; factor analysis of the SOC yielded three factors: comprehensibility, manageability, and meaningfulness. Significant results of bivariate and multivariate analyses underscore, for an organization-based SOC, the importance of participation and comprehensibility at the organizational level, social cohesion and social climate at the social level, and recognition at the individual level. Potential approaches for the further development of workplace health promotion interventions based on salutogenic factors and resources at the individual, social, and organizational levels are elaborated, and the transcultural dimensions of these factors are discussed.
Van Liew, Charles; Santoro, Maya S; Edwards, Larissa; Kang, Jeremy; Cronan, Terry A
2016-01-01
The Ways of Coping Questionnaire (WCQ) is a widely used measure of coping processes. Despite its use in a variety of populations, there has been concern about the stability and structure of the WCQ across different populations. This study examines the factor structure of the WCQ in a large sample of individuals diagnosed with fibromyalgia. The participants were 501 adults (478 women) who were part of a larger intervention study. Participants completed the WCQ at their 6-month assessment. Foundational factoring approaches were performed on the data (i.e., maximum likelihood factoring [MLF], iterative principal factoring [IPF], principal axis factoring (PAF), and principal components factoring [PCF]) with oblique oblimin rotation. Various criteria were evaluated to determine the number of factors to be extracted, including Kaiser's rule, Scree plot visual analysis, 5 and 10% unique variance explained, 70 and 80% communal variance explained, and Horn's parallel analysis (PA). It was concluded that the 4-factor PAF solution was the preferable solution, based on PA extraction and the fact that this solution minimizes nonvocality and multivocality. The present study highlights the need for more research focused on defining the limits of the WCQ and the degree to which population-specific and context-specific subscale adjustments are needed.
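Horn's parallel analysis, the extraction criterion the authors favored, can be sketched as follows: retain only factors whose observed eigenvalues exceed the corresponding eigenvalues obtained from random data of the same dimensions. The data here are simulated, and the percentile threshold and simulation count are illustrative choices.

```python
# Sketch of Horn's parallel analysis (PA) on a simulated item-response matrix.
import numpy as np

def parallel_analysis(data, n_sims=200, quantile=95, seed=0):
    rng = np.random.default_rng(seed)
    n, p = data.shape
    # Observed eigenvalues of the item correlation matrix, descending.
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    sims = np.empty((n_sims, p))
    for i in range(n_sims):
        rand = rng.normal(size=(n, p))            # uncorrelated random data
        sims[i] = np.linalg.eigvalsh(np.corrcoef(rand, rowvar=False))[::-1]
    threshold = np.percentile(sims, quantile, axis=0)
    return int(np.sum(obs > threshold))           # number of factors to retain

rng = np.random.default_rng(3)
loadings = rng.normal(size=(4, 20))               # 4 latent factors, 20 items
scores = rng.normal(size=(501, 4))                # sample size echoes the study
data = scores @ loadings + rng.normal(scale=2.0, size=(501, 20))
print("factors to retain:", parallel_analysis(data))
```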
[Lake eutrophication modeling in considering climatic factors change: a review].
Su, Jie-Qiong; Wang, Xuan; Yang, Zhi-Feng
2012-11-01
Climatic factors are considered key factors affecting the trophic status and its dynamics in most lakes. Against the background of global climate change, incorporating variations in climatic factors into lake eutrophication models could provide solid technical support for analyzing the trophic evolution of lakes and for decision-making in lake environment management. This paper analyzed the effects of climatic factors such as air temperature, precipitation, sunlight, and atmospheric conditions on lake eutrophication, and summarized research on lake eutrophication modeling that accounts for changes in climatic factors, including modeling based on statistical analysis, ecological dynamic analysis, system analysis, and intelligent algorithms. Prospective approaches to improve the accuracy of lake eutrophication modeling under changing climatic factors were put forward: 1) strengthen the analysis of the mechanisms by which changes in climatic factors affect lake trophic status; 2) identify appropriate simulation models to generate scenarios at proper temporal and spatial scales and resolutions; and 3) integrate climatic factor change simulation, hydrodynamic models, ecological simulation, and intelligent algorithms into a general modeling system to achieve accurate prediction of lake eutrophication under climate change.
Fighting for Intelligence: A Brief Overview of the Academic Work of John L. Horn
McArdle, John J.; Hofer, Scott M.
2015-01-01
John L. Horn (1928–2006) was a pioneer in multivariate thinking and the application of multivariate methods to research on intelligence and personality. His key works on individual differences in the methodological areas of factor analysis and the substantive areas of cognition are reviewed here. John was also our mentor, teacher, colleague, and friend. We overview John Horn’s main contributions to the field of intelligence by highlighting 3 issues about his methods of factor analysis and 3 of his substantive debates about intelligence. We first focus on Horn’s methodological demonstrations describing (a) the many uses of simulated random variables in exploratory factor analysis; (b) the exploratory uses of confirmatory factor analysis; and (c) the key differences between states, traits, and trait-changes. On a substantive basis, John believed that there were important individual differences among people in terms of cognition and personality. These sentiments led to his intellectual battles about (d) Spearman’s g theory of a unitary intelligence, (e) Guilford’s multifaceted model of intelligence, and (f) the Schaie and Baltes approach to defining the lack of decline of intelligence earlier in the life span. We conclude with a summary of John Horn’s unique approaches to dealing with common issues. PMID:26246642
IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. W. Parry; J.A Forester; V.N. Dang
2013-09-01
This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System) that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), and the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.
Single-Molecule Studies of Actin Assembly and Disassembly Factors
Smith, Benjamin A.; Gelles, Jeff; Goode, Bruce L.
2014-01-01
The actin cytoskeleton is very dynamic and highly regulated by multiple associated proteins in vivo. Understanding how this system of proteins functions in the processes of actin network assembly and disassembly requires methods to dissect the mechanisms of activity of individual factors and of multiple factors acting in concert. The advent of single-filament and single-molecule fluorescence imaging methods has provided a powerful new approach to discovering actin-regulatory activities and obtaining direct, quantitative insights into the pathways of molecular interactions that regulate actin network architecture and dynamics. Here we describe techniques for acquisition and analysis of single-molecule data, applied to the novel challenges of studying the filament assembly and disassembly activities of actin-associated proteins in vitro. We discuss the advantages of single-molecule analysis in directly visualizing the order of molecular events, measuring the kinetic rates of filament binding and dissociation, and studying the coordination among multiple factors. The methods described here complement traditional biochemical approaches in elucidating actin-regulatory mechanisms in reconstituted filamentous networks. PMID:24630103
ERIC Educational Resources Information Center
Macpherson, Allan; Jayawarna, Dilani
2007-01-01
Purpose: This study aims to investigate the influence of a range of contingent factors that moderate the approaches to training in manufacturing SMEs. Design/methodology/approach: The study is based on a regression analysis of data from a survey of 198 manufacturing SMEs. Findings: The findings suggest that there will be times when formal training…
DOT National Transportation Integrated Search
2016-02-01
In this study, a computational approach for conducting durability analysis of bridges using detailed finite element models is developed. The underlying approach adopted is based on the hypothesis that the two main factors affecting the life of a brid...
Contrasting Conceptions of Intelligence and their Educational Implications. Technical Report No. 14.
ERIC Educational Resources Information Center
Sternberg, Robert J.
The componential conception of intelligence is summarized and contrasted with the psychometric conception. A brief history of concepts of intelligence is presented, beginning with Galton's anthropometric approach and Binet's more educationally relevant approach. Spearman's, and later Thurstone's, contributions to factor analysis promoted a…
Structural analysis and design of multivariable control systems: An algebraic approach
NASA Technical Reports Server (NTRS)
Tsay, Yih Tsong; Shieh, Leang-San; Barnett, Stephen
1988-01-01
The application of algebraic system theory to the design of controllers for multivariable (MV) systems is explored analytically using an approach based on state-space representations and matrix-fraction descriptions. Chapters are devoted to characteristic lambda matrices and canonical descriptions of MIMO systems; spectral analysis, divisors, and spectral factors of nonsingular lambda matrices; feedback control of MV systems; and structural decomposition theories and their application to MV control systems.
Low-Cost Propellant Launch to LEO from a Tethered Balloon - Economic and Thermal Analysis
NASA Technical Reports Server (NTRS)
Wilcox, Brian H.; Schneider, Evan G.; Vaughan, David A.; Hall, Jeffrey L.
2010-01-01
This paper provides new analysis of the economics of low-cost propellant launch coupled with dry hardware re-use, and of the thermal control of the liquid hydrogen once on-orbit. One conclusion is that this approach enables an overall reduction in the cost per mission by as much as a factor of five compared to current approaches for human exploration of the moon, Mars, and near-Earth asteroids.
NASA Astrophysics Data System (ADS)
Tisdell, C. C.
2017-08-01
Solution methods to exact differential equations via integrating factors have a rich history dating back to Euler (1740) and the ideas enjoy applications to thermodynamics and electromagnetism. Recently, Azevedo and Valentino presented an analysis of the generalized Bernoulli equation, constructing a general solution by linearizing the problem through a substitution. The purpose of this note is to present an alternative approach using 'exact methods', illustrating that a substitution and linearization of the problem is unnecessary. The ideas may be seen as forming a complementary and arguably simpler approach to that of Azevedo and Valentino, with the potential to be assimilated and adapted to pedagogical needs of those learning and teaching exact differential equations in schools, colleges, universities and polytechnics. We illustrate how to apply the ideas through an analysis of the Gompertz equation, which is of interest in biomathematical models of tumour growth.
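A minimal worked sketch of the exact-equation route for the Gompertz equation follows, assuming constants a, b with b ≠ 0 and a - b ln y > 0; the notation is ours, not necessarily Tisdell's.

```latex
% Hedged worked sketch: solving the Gompertz equation by an integrating factor
% that renders the equation exact, with no linearizing substitution.
\begin{align*}
  &\text{Gompertz equation:}\quad
    \frac{dy}{dt} = y\,(a - b\ln y), \qquad y>0,\; b\neq 0.\\[4pt]
  &\text{Write as}\quad y\,(a - b\ln y)\,dt - dy = 0,
    \quad\text{and multiply by}\quad
    \mu(y) = \frac{1}{y\,(a - b\ln y)}:\\[4pt]
  &\qquad dt - \frac{dy}{y\,(a - b\ln y)} = 0
    \quad\text{(exact, since }\partial_y M = \partial_t N = 0\text{)}.\\[4pt]
  &\text{A potential function is}\quad
    F(t,y) = t + \tfrac{1}{b}\ln\!\left(a - b\ln y\right),\\[4pt]
  &\text{so } F(t,y)=C \text{ gives}\quad
    y(t) = \exp\!\left(\frac{a - A\,e^{-bt}}{b}\right),
    \qquad A = e^{bC} \text{ a constant.}
\end{align*}
```

The integrating factor renders the equation exact directly, with no linearizing substitution, which is the pedagogical point of the note.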
Baddeley, Michelle
2010-01-27
Typically, modern economics has steered away from the analysis of sociological and psychological factors and has focused on narrow behavioural assumptions in which expectations are formed on the basis of mathematical algorithms. Blending together ideas from the social and behavioural sciences, this paper argues that the behavioural approach adopted in most economic analysis, in its neglect of sociological and psychological forces and its simplistically dichotomous categorization of behaviour as either rational or not rational, is too narrow and stark. Behaviour may reflect an interaction of cognitive and emotional factors and this can be captured more effectively using an approach that focuses on the interplay of different decision-making systems. In understanding the mechanisms affecting economic and financial decision-making, an interdisciplinary approach is needed which incorporates ideas from a range of disciplines including sociology, economic psychology, evolutionary biology and neuroeconomics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whipple, C
Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.
NASA Astrophysics Data System (ADS)
Medina, Tait Runnfeldt
The increasing global reach of survey research provides sociologists with new opportunities to pursue theory building and refinement through comparative analysis. However, comparison across a broad array of diverse contexts introduces methodological complexities related to the development of constructs (i.e., measurement modeling) that if not adequately recognized and properly addressed undermine the quality of research findings and cast doubt on the validity of substantive conclusions. The motivation for this dissertation arises from a concern that the availability of cross-national survey data has outpaced sociologists' ability to appropriately analyze and draw meaningful conclusions from such data. I examine the implicit assumptions and detail the limitations of three commonly used measurement models in cross-national analysis---summative scale, pooled factor model, and multiple-group factor model with measurement invariance. Using the orienting lens of the double tension I argue that a new approach to measurement modeling that incorporates important cross-national differences into the measurement process is needed. Two such measurement models---multiple-group factor model with partial measurement invariance (Byrne, Shavelson and Muthen 1989) and the alignment method (Asparouhov and Muthen 2014; Muthen and Asparouhov 2014)---are discussed in detail and illustrated using a sociologically relevant substantive example. I demonstrate that the former approach is vulnerable to an identification problem that arbitrarily impacts substantive conclusions. I conclude that the alignment method is built on model assumptions that are consistent with theoretical understandings of cross-national comparability and provides an approach to measurement modeling and construct development that is uniquely suited for cross-national research. The dissertation makes three major contributions: First, it provides theoretical justification for a new cross-national measurement model and explicates a link between theoretical conceptions of cross-national comparability and a statistical method. Second, it provides a clear and detailed discussion of model identification in multiple-group confirmatory factor analysis that is missing from the literature. This discussion sets the stage for the introduction of the identification problem within multiple-group confirmatory factor analysis with partial measurement invariance and the alternative approach to model identification employed by the alignment method. Third, it offers the first pedagogical presentation of the alignment method using a sociologically relevant example.
Brytek-Matera, Anna; Rogoza, Radosław
2015-03-01
In Poland, appropriate means to assess body image are relatively limited. The aim of the study was to evaluate the psychometric properties of the Polish version of the Multidimensional Body-Self Relations Questionnaire (MBSRQ). To do so, a sample of 341 females ranging in age from 18 to 35 years (M = 23.09; SD = 3.14) participated in the present study. Because the confirmatory factor analysis of the original nine-factor model did not fit the data well (RMSEA = 0.06; CFI = 0.75), an exploratory approach was employed. Based on parallel analysis and the minimum average partial criterion, an eight-factor structure of the Polish version of the MBSRQ was distinguished. Exploratory factor analysis revealed a factorial structure similar to the original version. The proposed model was tested using an exploratory structural equation modelling approach, which resulted in good fit (RMSEA = 0.04; CFI = 0.91). In the present study, internal reliability assessed by McDonald's ω coefficient ranged from 0.66 to 0.91. In conclusion, the Polish version of the MBSRQ is a useful measure of the attitudinal component of body image.
Multivariate Statistical Analysis of MSL APXS Bulk Geochemical Data
NASA Astrophysics Data System (ADS)
Hamilton, V. E.; Edwards, C. S.; Thompson, L. M.; Schmidt, M. E.
2014-12-01
We apply cluster and factor analyses to bulk chemical data of 130 soil and rock samples measured by the Alpha Particle X-ray Spectrometer (APXS) on the Mars Science Laboratory (MSL) rover Curiosity through sol 650. Multivariate approaches such as principal components analysis (PCA), cluster analysis, and factor analysis complement more traditional approaches (e.g., Harker diagrams), with the advantage of simultaneously examining the relationships between multiple variables for large numbers of samples. Principal components analysis has been applied with success to APXS, Pancam, and Mössbauer data from the Mars Exploration Rovers. Factor analysis and cluster analysis have been applied with success to thermal infrared (TIR) spectral data of Mars. Cluster analyses group the input data by similarity, where there are a number of different methods for defining similarity (hierarchical, density, distribution, etc.). For example, without any assumptions about the chemical contributions of surface dust, preliminary hierarchical and K-means cluster analyses clearly distinguish the physically adjacent rock targets Windjana and Stephen as being distinctly different from lithologies observed prior to Curiosity's arrival at The Kimberley. In addition, they are separated from each other, consistent with chemical trends observed in variation diagrams but without requiring assumptions about chemical relationships. We will discuss the variation in cluster analysis results as a function of clustering method and pre-processing (e.g., log transformation, correction for dust cover) and implications for interpreting chemical data. Factor analysis shares some similarities with PCA, and examines the variability among observed components of a dataset so as to reveal variations attributable to unobserved components. Factor analysis has been used to extract the TIR spectra of components that are typically observed in mixtures and only rarely in isolation; there is the potential for similar results with data from APXS. These techniques offer new ways to understand the chemical relationships between the materials interrogated by Curiosity, and potentially their relation to materials observed by APXS instruments on other landed missions.
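The clustering workflow described above can be sketched generically: standardize the bulk-chemistry table, reduce its dimensionality, and partition targets with K-means. Oxide names and values below are placeholders, not actual APXS data, and the preprocessing choices are assumptions.

```python
# Illustrative sketch of the multivariate workflow described above: standardize
# bulk-chemistry rows, reduce with PCA, and group targets with K-means.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
oxides = ["SiO2", "Al2O3", "FeOT", "MgO", "CaO", "Na2O", "K2O", "SO3"]
X = rng.normal(size=(130, len(oxides)))     # 130 targets x 8 oxides (simulated)

X_std = StandardScaler().fit_transform(X)   # z-score each oxide
scores = PCA(n_components=3).fit_transform(X_std)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print("targets per cluster:", np.bincount(labels))
```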
College Enrollment Motivation: A Theoretical Marketing Approach.
ERIC Educational Resources Information Center
Pomazal, Richard J.
1980-01-01
Personal beliefs and opinions regarding enrolling at university were obtained from 147 residents to test ability of a consumer/marketing theory of behavioral intention to account for factors related to college enrollment motivation. Analysis of the perceived quality of education revealed factors that were different from enrollment motivational…
Algama, Manjula; Tasker, Edward; Williams, Caitlin; Parslow, Adam C; Bryson-Richardson, Robert J; Keith, Jonathan M
2017-03-27
Computational identification of non-coding RNAs (ncRNAs) is a challenging problem. We describe a genome-wide analysis using Bayesian segmentation to identify intronic elements highly conserved between three evolutionarily distant vertebrate species: human, mouse and zebrafish. We investigate the extent to which these elements include ncRNAs (or conserved domains of ncRNAs) and regulatory sequences. We identified 655 deeply conserved intronic sequences in a genome-wide analysis. We also performed a pathway-focussed analysis on genes involved in muscle development, detecting 27 intronic elements, of which 22 were not detected in the genome-wide analysis. At least 87% of the genome-wide and 70% of the pathway-focussed elements have existing annotations indicative of conserved RNA secondary structure. The expression of 26 of the pathway-focused elements was examined using RT-PCR, providing confirmation that they include expressed ncRNAs. Consistent with previous studies, these elements are significantly over-represented in the introns of transcription factors. This study demonstrates a novel, highly effective, Bayesian approach to identifying conserved non-coding sequences. Our results complement previous findings that these sequences are enriched in transcription factors. However, in contrast to previous studies which suggest the majority of conserved sequences are regulatory factor binding sites, the majority of conserved sequences identified using our approach contain evidence of conserved RNA secondary structures, and our laboratory results suggest most are expressed. Functional roles at DNA and RNA levels are not mutually exclusive, and many of our elements possess evidence of both. Moreover, ncRNAs play roles in transcriptional and post-transcriptional regulation, and this may contribute to the over-representation of these elements in introns of transcription factors. We attribute the higher sensitivity of the pathway-focussed analysis compared to the genome-wide analysis to improved alignment quality, suggesting that enhanced genomic alignments may reveal many more conserved intronic sequences.
Global analysis of bacterial transcription factors to predict cellular target processes.
Doerks, Tobias; Andrade, Miguel A; Lathe, Warren; von Mering, Christian; Bork, Peer
2004-03-01
Whole-genome sequences are now available for >100 bacterial species, giving unprecedented power to comparative genomics approaches. We have applied genome-context methods to predict target processes that are regulated by transcription factors (TFs). Of 128 orthologous groups of proteins annotated as TFs, to date, 36 are functionally uncharacterized; in our analysis we predict a probable cellular target process or biochemical pathway for half of these functionally uncharacterized TFs.
ERIC Educational Resources Information Center
Mittal, Surabhi; Mehar, Mamta
2016-01-01
Purpose: The paper analyzes factors that affect the likelihood of adoption of different agriculture-related information sources by farmers. Design/Methodology/Approach: The paper links the theoretical understanding of the existing multiple sources of information that farmers use, with the empirical model to analyze the factors that affect the…
Cross-Cultural Validation of the Five-Factor Structure of Social Goals: A Filipino Investigation
ERIC Educational Resources Information Center
King, Ronnel B.; Watkins, David A.
2012-01-01
The aim of the present study was to test the cross-cultural validity of the five-factor structure of social goals that Dowson and McInerney proposed. Using both between-network and within-network approaches to construct validation, 1,147 Filipino high school students participated in the study. Confirmatory factor analysis indicated that the…
Social Anxiety among Chinese People.
Fan, Qianqian; Chang, Weining C
2015-01-01
The experience of social anxiety has largely been investigated among Western populations; much less is known about social anxiety in other cultures. Unlike the Western culture, the Chinese emphasize interdependence and harmony with social others. In addition, it is unclear whether Western-constructed instruments adequately capture culturally conditioned conceptualizations and manifestations of social anxiety that might be specific to the Chinese. The present study employed a sequence of qualitative and quantitative approaches to examine the assessment of social anxiety among the Chinese people. Interviews and focus group discussions with Chinese participants revealed facets of the experience of social anxiety among the Chinese that are not present in existing Western measures. Factor analysis was employed to examine the factor structure of the resulting, more comprehensive scale. This approach revealed an "other concerned anxiety" factor that appears to be specific to the Chinese. Subsequent analysis found that the new factor, other concerned anxiety, functioned in the same way as other social anxiety factors in its association with risk factors of social anxiety, such as attachment, parenting, behavioral inhibition/activation, and attitude toward group. The implications of these findings for a more culturally sensitive assessment tool of social anxiety among the Chinese were discussed.
ERIC Educational Resources Information Center
Axelrod, Saul
1987-01-01
Emerging approaches for dealing with inappropriate behaviors of the disabled involve conducting a functional or structural behavior analysis to isolate the factors responsible for the aberrant behavior and implementing corrective procedures (often alternatives to punishment) relevant to the function of the inappropriate behavior. (Author/DB)
Buciński, Adam; Marszałł, Michał Piotr; Krysiński, Jerzy; Lemieszek, Andrzej; Załuski, Jerzy
2010-07-01
Hodgkin's lymphoma is one of the most curable malignancies, and most patients achieve a lasting complete remission. In this study, artificial neural network (ANN) analysis was shown to identify significant prognostic factors for 5-year recurrence after lymphoma treatment. Data from 114 patients treated for Hodgkin's disease were available for evaluation and comparison. A total of 31 variables were subjected to ANN analysis. The ANN approach, as an advanced multivariate data processing method, was shown to provide objective prognostic data. Some of these prognostic factors are consistent or even identical to the factors evaluated earlier by other statistical methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Yuanshun; Baek, Seung H.; Garcia-Diza, Alberto
2012-01-01
This paper designs a comprehensive approach based on the engineering machine/system concept, to model, analyze, and assess the level of CO2 exchange between the atmosphere and terrestrial ecosystems, which is an important factor in understanding changes in global climate. The focus of this article is on spatial patterns and on the correlation between levels of CO2 fluxes and a variety of influencing factors in eco-environments. The engineering/machine concept used is a system protocol that includes the sequential activities of design, test, observe, and model. This concept is applied to explicitly include various influencing factors and interactions associated with CO2 fluxes. To formulate effective models of a large and complex climate system, this article introduces a modeling technique that will be referred to as Stochastic Filtering Analysis of Variance (SF-ANOVA). The CO2 flux data observed from some sites of AmeriFlux are used to illustrate and validate the analysis, prediction and globalization capabilities of the proposed engineering approach and the SF-ANOVA technology. The SF-ANOVA modeling approach was compared to stepwise regression, ridge regression, and neural networks. The comparison indicated that the proposed approach is a valid and effective tool with similar accuracy and less complexity than the other procedures.
Summerfield, Taryn L.; Yu, Lianbo; Gulati, Parul; Zhang, Jie; Huang, Kun; Romero, Roberto; Kniss, Douglas A.
2011-01-01
A majority of the studies examining the molecular regulation of human labor have been conducted using single gene approaches. While the technology to produce multi-dimensional datasets is readily available, the means for facile analysis of such data are limited. The objective of this study was to develop a systems approach to infer regulatory mechanisms governing global gene expression in cytokine-challenged cells in vitro, and to apply these methods to predict gene regulatory networks (GRNs) in intrauterine tissues during term parturition. To this end, microarray analysis was applied to human amnion mesenchymal cells (AMCs) stimulated with interleukin-1β, and differentially expressed transcripts were subjected to hierarchical clustering, temporal expression profiling, and motif enrichment analysis, from which a GRN was constructed. These methods were then applied to fetal membrane specimens collected in the absence or presence of spontaneous term labor. Analysis of cytokine-responsive genes in AMCs revealed a sterile immune response signature, with promoters enriched in response elements for several inflammation-associated transcription factors. In comparison to the fetal membrane dataset, there were 34 genes commonly upregulated, many of which were part of an acute inflammation gene expression signature. Binding motifs for nuclear factor-κB were prominent in the gene interaction and regulatory networks for both datasets; however, we found little evidence to support the utilization of pathogen-associated molecular pattern (PAMP) signaling. The tissue specimens were also enriched for transcripts governed by hypoxia-inducible factor. The approach presented here provides an uncomplicated means to infer global relationships among gene clusters involved in cellular responses to labor-associated signals. PMID:21655103
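The hierarchical clustering step in this pipeline might look like the following sketch, applied to a simulated gene-by-time-point expression matrix; the linkage method, metric, and cluster count are illustrative choices, not the authors'.

```python
# Sketch of hierarchical clustering of differentially expressed transcripts,
# as described above, using scipy's average linkage on simulated data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(5)
expr = rng.normal(size=(200, 6))        # 200 differentially expressed genes,
                                        # 6 time points after IL-1beta (simulated)
Z = linkage(expr, method="average", metric="correlation")
clusters = fcluster(Z, t=5, criterion="maxclust")   # cut tree into 5 clusters
print("genes per cluster:", np.bincount(clusters)[1:])
```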
Qualitative research methods in renal medicine: an introduction.
Bristowe, Katherine; Selman, Lucy; Murtagh, Fliss E M
2015-09-01
Qualitative methodologies are becoming increasingly widely used in health research. However, within some specialties, including renal medicine, qualitative approaches remain under-represented in the high-impact factor journals. Qualitative research can be undertaken: (i) as a stand-alone research method, addressing specific research questions; (ii) as part of a mixed methods approach alongside quantitative approaches or (iii) embedded in clinical trials, or during the development of complex interventions. The aim of this paper is to introduce qualitative research, including the rationale for choosing qualitative approaches, and guidance for ensuring quality when undertaking and reporting qualitative research. In addition, we introduce types of qualitative data (observation, interviews and focus groups) as well as some of the most commonly encountered methodological approaches (case studies, ethnography, phenomenology, grounded theory, thematic analysis, framework analysis and content analysis). © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
An enhanced performance through agent-based secure approach for mobile ad hoc networks
NASA Astrophysics Data System (ADS)
Bisen, Dhananjay; Sharma, Sanjeev
2018-01-01
This paper proposes an agent-based secure enhanced performance approach (AB-SEP) for mobile ad hoc networks. In this approach, agent nodes are selected using optimal node reliability as a factor. This factor is calculated on the basis of node performance features such as degree difference, normalised distance value, energy level, mobility, and optimal hello interval of the node. After selection of agent nodes, a procedure of malicious behaviour detection is performed using a fuzzy-based secure architecture (FBSA). To evaluate the performance of the proposed approach, comparative analysis is done with conventional schemes using performance parameters such as packet delivery ratio, throughput, total packet forwarding, network overhead, end-to-end delay and percentage of malicious detection.
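A purely hypothetical sketch of how such a reliability factor could combine the named node features is shown below; the weights, normalization, and inversion choices are our assumptions, not the paper's formula.

```python
# Hypothetical sketch of a node reliability score for agent selection.
# Weights and normalization are illustrative assumptions only.
def node_reliability(degree_diff, norm_distance, energy, mobility, hello_interval,
                     weights=(0.2, 0.2, 0.3, 0.2, 0.1)):
    # All inputs assumed pre-normalized to [0, 1]; lower mobility, distance, and
    # degree difference are better, so those features are inverted.
    features = (1 - degree_diff, 1 - norm_distance, energy, 1 - mobility,
                hello_interval)
    return sum(w * f for w, f in zip(weights, features))

score = node_reliability(degree_diff=0.1, norm_distance=0.3, energy=0.8,
                         mobility=0.2, hello_interval=0.6)
print(f"reliability = {score:.2f}")   # agents = nodes above a chosen threshold
```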
RECENT APPLICATIONS OF SOURCE APPORTIONMENT METHODS AND RELATED NEEDS
Traditional receptor modeling studies have utilized factor analysis (like principal component analysis, PCA) and/or Chemical Mass Balance (CMB) to assess source influences. The limitation of these approaches is that PCA is qualitative and CMB requires the input of source pr...
Lee, Yeonok; Wu, Hulin
2012-01-01
Differential equation models are widely used for the study of natural phenomena in many fields. The study usually involves unknown factors such as initial conditions and/or parameters. It is important to investigate the impact of unknown factors (parameters and initial conditions) on model outputs in order to better understand the system the model represents. Apportioning the uncertainty (variation) of output variables of a model according to the input factors is referred to as sensitivity analysis. In this paper, we focus on the global sensitivity analysis of ordinary differential equation (ODE) models over a time period using the multivariate adaptive regression spline (MARS) as a meta model based on the concept of the variance of conditional expectation (VCE). We suggest evaluating the VCE analytically using the MARS model structure of univariate tensor-product functions, which is more computationally efficient. Our simulation studies show that the MARS model approach performs very well and helps to significantly reduce the computational cost. We present an application example of sensitivity analysis of ODE models for influenza infection to further illustrate the usefulness of the proposed method.
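The VCE idea can be sketched numerically as follows, with a gradient-boosted tree standing in for the MARS meta-model (an assumption; the paper evaluates the VCE analytically from the MARS basis) and a simple binning estimator of Var_x E[Y | x]:

```python
# Sketch of variance of conditional expectation (VCE) for each input factor,
# estimated by binning Monte Carlo samples of a surrogate model's output.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(6)
X = rng.uniform(size=(2000, 3))                       # 3 uncertain ODE parameters
y = np.sin(6 * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.1, 2000)

meta = GradientBoostingRegressor().fit(X, y)          # cheap surrogate of the ODE
y_hat = meta.predict(X)

def vce(x_factor, y_vals, n_bins=20):
    # Var over bins of the within-bin mean output: Var_x E[Y | x].
    bins = np.digitize(x_factor, np.linspace(0, 1, n_bins + 1)[1:-1])
    cond_means = [y_vals[bins == b].mean() for b in range(n_bins)]
    return np.var(cond_means)

for j in range(3):
    print(f"factor {j}: VCE / Var(Y) = {vce(X[:, j], y_hat) / y_hat.var():.2f}")
```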
Life-table methods for detecting age-risk factor interactions in long-term follow-up studies.
Logue, E E; Wing, S
1986-01-01
Methodological investigation has suggested that age-risk factor interactions should be more evident in age of experience life tables than in follow-up time tables due to the mixing of ages of experience over follow-up time in groups defined by age at initial examination. To illustrate the two approaches, age modification of the effect of total cholesterol on ischemic heart disease mortality in two long-term follow-up studies was investigated. Follow-up time life table analysis of 116 deaths over 20 years in one study was more consistent with a uniform relative risk due to cholesterol, while age of experience life table analysis was more consistent with a monotonic negative age interaction. In a second follow-up study (160 deaths over 24 years), there was no evidence of a monotonic negative age-cholesterol interaction by either method. It was concluded that age-specific life table analysis should be used when age-risk factor interactions are considered, but that both approaches yield almost identical results in absence of age interaction. The identification of the more appropriate life-table analysis should be ultimately guided by the nature of the age or time phenomena of scientific interest.
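The contrast between the two tabulations can be sketched with a crude person-years table on simulated cohort data; the rates, bands, and the attribution of all person-years to a single band are simplifying assumptions.

```python
# Sketch contrasting the two life-table tabulations discussed above: deaths and
# person-years counted by follow-up interval versus by attained age
# ("age of experience"). Cohort data are simulated; person-years are crudely
# attributed to a single band per subject for brevity.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 2000
age_entry = rng.uniform(40, 60, n)                 # age at initial examination
follow = rng.uniform(0, 20, n)                     # years of observed follow-up
event = rng.random(n) < 0.08                       # ischemic heart disease death

df = pd.DataFrame({
    "fu_band": pd.cut(follow, bins=[0, 5, 10, 15, 20]),
    "age_band": pd.cut(age_entry + follow, bins=[40, 50, 60, 70, 80]),
    "event": event,
    "pyears": follow,
})
for key in ("fu_band", "age_band"):
    tab = df.groupby(key, observed=True).agg(deaths=("event", "sum"),
                                             pyears=("pyears", "sum"))
    tab["rate_per_1000"] = 1000 * tab["deaths"] / tab["pyears"]
    print(tab, "\n")
```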
A new framework for comprehensive, robust, and efficient global sensitivity analysis: 1. Theory
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin V.
2016-01-01
Computer simulation models are continually growing in complexity with increasingly more factors to be identified. Sensitivity Analysis (SA) provides an essential means for understanding the role and importance of these factors in producing model responses. However, conventional approaches to SA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we present a new and general sensitivity analysis framework (called VARS), based on an analogy to "variogram analysis," that provides an intuitive and comprehensive characterization of sensitivity across the full spectrum of scales in the factor space. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices can be computed as by-products of the VARS framework. Synthetic functions that resemble actual model response surfaces are used to illustrate the concepts, and show VARS to be as much as two orders of magnitude more computationally efficient than the state-of-the-art Sobol approach. In a companion paper, we propose a practical implementation strategy, and demonstrate the effectiveness, efficiency, and reliability (robustness) of the VARS framework on real-data case studies.
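The central quantity in a variogram-based analysis is the directional variogram of the response surface, gamma_i(h) = 0.5 E[(y(x + h e_i) - y(x))^2]; integrating it across scales h yields the VARS indices. The sketch below estimates gamma_i(h) by Monte Carlo for a toy two-factor function; it illustrates only the variogram computation, not the authors' full VARS framework.

```python
import numpy as np

rng = np.random.default_rng(1)

def response(x):
    # Hypothetical response surface with two factors on [0, 1]^2.
    return x[:, 0] ** 2 + np.sin(5 * x[:, 1])

def directional_variogram(i, h, n=5000, dim=2):
    # gamma_i(h) = 0.5 * E[(y(x + h*e_i) - y(x))^2], x uniform on the unit cube.
    X = rng.uniform(0, 1 - h, size=(n, dim))
    Xh = X.copy()
    Xh[:, i] += h
    return 0.5 * np.mean((response(Xh) - response(X)) ** 2)

for h in (0.05, 0.1, 0.2, 0.4):
    print(h, directional_variogram(0, h), directional_variogram(1, h))
```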
AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku
2014-05-27
The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as being strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators. The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting classifier which averaged the results of multinomial logistic regression and voting feature intervals classifiers. Of 42 final model risk factors, discharge disposition, discretized age, and indicators of anemia were the most significant. This model achieved a c-statistic of 86.8%. The proposed three-step analytical approach enhanced predictive model performance for CHF readmissions. It could potentially be leveraged to improve predictive model performance in other areas of clinical medicine.
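A schematic of the modeling skeleton (not the authors' code) can be assembled with scikit-learn: drop variables missing in more than half of the records, impute the remainder, and average two classifiers by soft voting. The outcome column, the imputation choice shown, and the second base classifier are assumptions; in particular, the paper's voting-feature-intervals classifier has no scikit-learn counterpart, so a naive Bayes stand-in is used.

```python
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import VotingClassifier
from sklearn.pipeline import make_pipeline

def fit_readmission_model(df: pd.DataFrame, outcome: str = "readmit30"):
    # Outcome name is a hypothetical placeholder for the CHF readmission flag.
    y = df[outcome]
    X = df.drop(columns=[outcome])
    X = X.loc[:, X.isna().mean() <= 0.5]   # pre-processing rule from the abstract
    clf = VotingClassifier(
        estimators=[("mlr", LogisticRegression(max_iter=1000)),
                    ("nb", GaussianNB())],  # stand-in for voting feature intervals
        voting="soft",                      # averages predicted probabilities
    )
    return make_pipeline(SimpleImputer(strategy="mean"), clf).fit(X, y)
```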
Instream-Flow Analysis for the Luquillo Experimental Forest, Puerto Rico: Methods and Analysis
F.N. Scatena; S.L. Johnson
2001-01-01
This study develops two habitat-based approaches for evaluating instream-flow requirements within the Luquillo Experimental Forest in northeastern Puerto Rico. The analysis is restricted to instream-flow requirements in upland streams dominated by the common communities of freshwater decapods. In headwater streams, pool volume was the most consistent factor...
Fracture Analyses of Cracked Delta Eye Plates in Ship Towing
NASA Astrophysics Data System (ADS)
Huang, Xiangbing; Huang, Xingling; Sun, Jizheng
2018-01-01
Based on fracture mechanics, a safety analysis approach is proposed for cracked delta eye plates in ship towing. The static analysis model is presented for the delta eye plate in service, and the fracture criterion is introduced on the basis of the stress intensity factor, which is estimated with the domain integral method. Subsequently, three-dimensional finite element analyses are carried out to obtain the effective stress intensity factors, and a case is studied to demonstrate the validity of the approach. The results show that the classical strength theory is not applicable for evaluating the cracked plate, whereas fracture mechanics solves the problem very well, and that the load level a delta eye plate can carry decreases markedly when it is damaged.
ERIC Educational Resources Information Center
Mostertman, L. J.
Because of the uncertainty related to water resources development projects, and because of the multitude of factors influencing their performance, the systems analysis approach is often used as an instrument in the planning and design process. The approach will also yield good results in the programming of the maintenance and management of the…
A Practical Tutorial on Modified Condition/Decision Coverage
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Veerhusen, Dan S.; Chilenski, John J.; Rierson, Leanna K.
2001-01-01
This tutorial provides a practical approach to assessing modified condition/decision coverage (MC/DC) for aviation software products that must comply with regulatory guidance for DO-178B level A software. The tutorial's approach to MC/DC is a 5-step process that allows a certification authority or verification analyst to evaluate MC/DC claims without the aid of a coverage tool. In addition to the MC/DC approach, the tutorial addresses factors to consider in selecting and qualifying a structural coverage analysis tool, tips for reviewing life cycle data related to MC/DC, and pitfalls common to structural coverage analysis.
Threshold resummation S factor in QCD: The case of unequal masses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solovtsova, O. P., E-mail: olsol@theor.jinr.r; Chernichenko, Yu. D., E-mail: chern@gstu.gomel.b
A new relativistic Coulomb-like threshold resummation S factor in quantum chromodynamics is obtained. The analysis in question is performed within the quantum-field-theory quasipotential approach formulated in the relativistic configuration representation for the case of interaction between two relativistic particles that have unequal masses.
Olympic Education as a Factor of Socialization of Preschoolers
ERIC Educational Resources Information Center
Varfolomeeva, Zoya S.; Surinov, Ilya A.
2016-01-01
The purpose of this study is theoretical substantiation and experimental confirmation of importance of the Olympic education as a socialization factor of the preschoolers. To address the study issues, theoretical methods of analysis, generalization and systematization as well as personal and activity approaches were applied. The older preschoolers…
Drug Taking Beliefs of Australian Adolescents: A Pilot Study
ERIC Educational Resources Information Center
Skrzypiec, Grace; Owens, Laurence
2013-01-01
In this study adolescents offered their insights and perspectives of factors associated with adolescent illicit drug taking intentions. The factors explored were identified using a cross-disciplinary approach involving the Theory of Planned Behavior (TPB) and criminological theories, and these formed the framework for data analysis. Interviews…
Developing hazelnut tissue culture medium free of ion confounding
USDA-ARS's Scientific Manuscript database
The general approach for tissue culture medium optimization is to use salts as factors in experimental design and analysis. However, using salts as factors leads to ion confounding, making it difficult to detect the effects of individual ions on particular growth responses. This study focused on tes...
ERIC Educational Resources Information Center
Ballantine, Joan; Guo, Xin; Larres, Patricia
2015-01-01
This research provides new insights into the measurement of students' authorial identity and its potential for minimising the incidence of unintentional plagiarism by providing evidence about the psychometric properties of the Student Authorship Questionnaire (SAQ). Exploratory and confirmatory factor analyses (EFA and CFA) are employed to…
Non-Linear Modeling of Growth Prerequisites in a Finnish Polytechnic Institution of Higher Education
ERIC Educational Resources Information Center
Nokelainen, Petri; Ruohotie, Pekka
2009-01-01
Purpose: This study aims to examine the factors of growth-oriented atmosphere in a Finnish polytechnic institution of higher education with categorical exploratory factor analysis, multidimensional scaling and Bayesian unsupervised model-based visualization. Design/methodology/approach: This study was designed to examine employee perceptions of…
Factors Associated with Sexual Behavior among Adolescents: A Multivariate Analysis.
ERIC Educational Resources Information Center
Harvey, S. Marie; Spigner, Clarence
1995-01-01
A self-administered survey examining multiple factors associated with engaging in sexual intercourse was completed by 1,026 high school students in a classroom setting. Findings suggest that effective interventions to address teenage pregnancy need to utilize a multifaceted approach to the prevention of high-risk behaviors. (JPS)
Factors Influencing Teachers' Engagement in Informal Learning Activities
ERIC Educational Resources Information Center
Lohman, Margaret C.
2006-01-01
Purpose: The purpose of this study is to examine factors influencing the engagement of public school teachers in informal learning activities. Design/methodology/approach: This study used a survey research design. Findings: Analysis of the data found that teachers rely to a greater degree on interactive than on independent informal learning…
Organisational Factors and Teachers' Professional Development in Dutch Secondary Schools
ERIC Educational Resources Information Center
Evers, Arnoud T.; van der Heijden, Beatrice I. J. M.; Kreijns, Karel; Gerrichhauzen, John T. G.
2011-01-01
Purpose: The purpose of this paper is to report on a study that investigates the relationship between organisational factors, Teachers' Professional Development (TPD) and occupational expertise. Design/methodology/approach: A survey was administered among 152 Dutch teachers in secondary education. Findings: Analysis of the data revealed that of…
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann
1988-01-01
Several Laboratory software development projects that followed nonstandard development processes, which were hybrids of incremental development and prototyping, are being studied. Factors in the project environment leading to the decision to use a nonstandard development process and affecting its success are analyzed. A simple characterization of project environment based on this analysis is proposed, together with software development approaches which have been found effective for each category. These approaches include both documentation and review requirements.
Gómez, Eduardo J.
2017-01-01
Background: This article conducts a comparative national and subnational government analysis of the political, economic, and ideational constructivist contextual factors facilitating the adoption of obesity and diabetes policy. Methods: We adopt a nested analytical approach to policy analysis, which combines cross-national statistical analysis with subnational case study comparisons to examine theoretical prepositions and discover alternative contextual factors; this was combined with an ideational constructivist approach to policy-making. Results: Contrary to the existing literature, we found that with the exception of cross-national statistical differences in access to healthcare infrastructural resources, the growing burden of obesity and diabetes, rising healthcare costs and increased citizens’ knowledge had no predictive affect on the adoption of obesity and diabetes policy. We then turned to a subnational comparative analysis of the states of Mississippi in the United States and Rio Grande do Norte in Brazil to further assess the importance of infrastructural resources, at two units of analysis: the state governments versus rural municipal governments. Qualitative evidence suggests that differences in subnational healthcare infrastructural resources were insufficient for explaining policy reform processes, highlighting instead other potentially important factors, such as state-civil societal relationships and policy diffusion in Mississippi, federal policy intervention in Rio Grande do Norte, and politicians’ social construction of obesity and the resulting differences in policy roles assigned to the central government. Conclusion: We conclude by underscoring the complexity of subnational policy responses to obesity and diabetes, the importance of combining resource and constructivist analysis for better understanding the context of policy reform, while underscoring the potential lessons that the United States can learn from Brazil. PMID:29179290
Tensor-Dictionary Learning with Deep Kruskal-Factor Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, Andrew J.; Pu, Yunchen; Sun, Yannan
We introduce new dictionary learning methods for tensor-variate data of any order. We represent each data item as a sum of Kruskal decomposed dictionary atoms within the framework of beta-process factor analysis (BPFA). Our model is nonparametric and can infer the tensor-rank of each dictionary atom. This Kruskal-Factor Analysis (KFA) is a natural generalization of BPFA. We also extend KFA to a deep convolutional setting and develop online learning methods. We test our approach on image processing and classification tasks, achieving state-of-the-art results for 2D & 3D inpainting and Caltech 101. The experiments also show that atom-rank impacts both overcompleteness and sparsity.
Modeling of gold production in Malaysia
NASA Astrophysics Data System (ADS)
Muda, Nora; Ainuddeen, Nasihah Rasyiqah; Ismail, Hamizun; Umor, Mohd Rozi
2013-04-01
This study was conducted to identify the main factors that contribute to gold production and hence determine the factors that affect the development of the mining industry in Malaysia. An econometric approach was used, performing cointegration analysis among the factors to determine the existence of a long-term relationship between gold prices, the number of gold mines, the number of workers in gold mines and gold production. The study continued with Granger analysis to determine the relationship between the factors and gold production. The results show that there is a long-term relationship between price, gold production and the number of employees. Granger causality analysis shows only a one-way relationship of gold production in Malaysia with the number of employees and the number of gold mines in Malaysia.
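Both steps of this econometric workflow are available in statsmodels; the sketch below runs an Engle-Granger cointegration test and a Granger causality test on two of the series. The file name and column names are hypothetical placeholders for the Malaysian data.

```python
import pandas as pd
from statsmodels.tsa.stattools import coint, grangercausalitytests

# Hypothetical file with yearly series: gold_price, gold_production, n_employees
df = pd.read_csv("gold_malaysia.csv")

# Long-run (cointegrating) relationship between gold price and production
t_stat, p_value, _ = coint(df["gold_price"], df["gold_production"])
print("cointegration p-value:", p_value)

# Does the number of employees Granger-cause gold production?
# (tests whether the second column helps predict the first)
grangercausalitytests(df[["gold_production", "n_employees"]], maxlag=2)
```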
Azadeh, A; Motevali Haghighi, S; Gaeini, Z; Shabanpour, N
2016-07-01
This study presents an integrated approach for analyzing the impact of macro-ergonomics factors in the healthcare supply chain (HCSC) using data envelopment analysis (DEA). The case of this study is the supply chain (SC) of a real hospital; healthcare standards and macro-ergonomics factors are thus modeled by a mathematical programming approach. Over 28 subsidiary SC divisions with parallel missions and objectives are evaluated by analyzing inputs and outputs through DEA. Each division in this HCSC is considered a decision-making unit (DMU). This approach can analyze the impact of macro-ergonomics factors on supply chain management (SCM) in the healthcare sector, and it ranks the relative performance efficiencies of each HCSC division. Using the proposed method, the most influential macro-ergonomics factor on the HCSC is identified as the "teamwork" issue. This study would also help managers to identify areas of weakness in their SCM system and set an improvement target plan for the related SCM system in the healthcare industry. To the best of our knowledge, this is the first study of macro-ergonomics optimization of an HCSC. Copyright © 2016 Elsevier Ltd and The Ergonomics Society. All rights reserved.
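For readers unfamiliar with DEA, the sketch below solves the input-oriented CCR efficiency problem in its envelopment form: minimize theta such that a convex cone of the observed DMUs uses no more than theta times DMU k's inputs while matching its outputs. The tiny input/output matrices are toy stand-ins for the hospital's SC divisions, not the study's data.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])  # inputs, DMUs x m
Y = np.array([[1.0], [1.0], [1.0], [1.0]])                      # outputs, DMUs x s

def ccr_efficiency(k):
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]          # decision vars: [theta, lambda_1..lambda_n]
    # inputs:  sum_j lambda_j * X[j,i] - theta * X[k,i] <= 0
    A_in = np.c_[-X[k], X.T]
    # outputs: -sum_j lambda_j * Y[j,r] <= -Y[k,r]
    A_out = np.c_[np.zeros(s), -Y.T]
    res = linprog(c, A_ub=np.r_[A_in, A_out],
                  b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                        # efficiency score theta* (<= 1 is efficient at 1)

for k in range(len(X)):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```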
Azadeh, Ali; Sheikhalishahi, Mohammad
2014-01-01
Background A unique framework for performance optimization of generation companies (GENCOs) based on health, safety, environment, and ergonomics (HSEE) indicators is presented. Methods To rank this sector of industry, a combination of data envelopment analysis (DEA), principal component analysis (PCA), and Taguchi methods is used for all branches of GENCOs. These methods are applied in an integrated manner to measure GENCO performance. The preferred model among DEA, PCA, and Taguchi is selected based on sensitivity analysis and maximum correlation between rankings. To achieve the stated objectives, noise is introduced into the input data. Results The results show that Taguchi outperforms the other methods. Moreover, a comprehensive experiment is carried out to identify the most influential factor for ranking GENCOs. Conclusion The approach developed in this study could be used for continuous assessment and improvement of GENCOs' performance in supplying energy with respect to HSEE factors. The results of such studies would help managers to gain a better understanding of weak and strong points in terms of HSEE factors. PMID:26106505
Haddad, Mark; Waqas, Ahmed; Sukhera, Ahmed Bashir; Tarar, Asad Zaman
2017-07-27
Depression is a common mental health problem and a leading contributor to the global burden of disease. The attitudes and beliefs of the public and of health professionals influence social acceptance and affect the esteem and help-seeking of people experiencing mental health problems. The attitudes of clinicians are particularly relevant to their role in accurately recognising and providing appropriate support and management of depression. This study examines the characteristics of the revised depression attitude questionnaire (R-DAQ) with doctors working in healthcare settings in Lahore, Pakistan. A cross-sectional survey was conducted in 2015 using the R-DAQ. A convenience sample of 700 medical practitioners based in six hospitals in Lahore was approached to participate in the survey. The R-DAQ structure was examined using Parallel Analysis from polychoric correlations. Unweighted least squares analysis (ULSA) was used for factor extraction. Model fit was estimated using goodness-of-fit indices and the root mean square of standardized residuals (RMSR), and internal consistency reliability for the overall scale and subscales was assessed using reliability estimates based on Mislevy and Bock (BILOG 3: Item Analysis and Test Scoring with Binary Logistic Models. Mooresville: Scientific Software) and McDonald's Omega statistic. Findings using this approach were compared with principal axis factor analysis based on a Pearson correlation matrix. 601 (86%) of the doctors approached consented to participate in the study. Exploratory factor analysis of R-DAQ scale responses demonstrated the same 3-factor structure as in the UK development study, though analyses indicated removal of 7 of the 22 items because of weak loading or poor model fit. The 3-factor solution accounted for 49.8% of the common variance. Scale reliability and internal consistency were adequate: total scale standardised alpha was 0.694; subscale reliability for professional confidence was 0.732, therapeutic optimism/pessimism was 0.638, and generalist perspective was 0.769. The R-DAQ was developed with a predominantly UK-based sample of health professionals. This study indicates that the scale functions adequately and provides a valid measure of depression attitudes for medical practitioners in Pakistan, with the same factor structure as in the scale development sample. However, optimal scale function necessitated removal of several items, with a 15-item scale enabling the most parsimonious factor solution for this population.
NASA Technical Reports Server (NTRS)
Yeh, Hsien-Yang
1988-01-01
The theory of anisotropic elasticity was used to evaluate the anisotropic stress concentration factors of a composite laminated plate containing a small circular hole. This advanced composite was used to manufacture the X-29A forward-swept wing. It was found that, for composite materials, the anisotropic stress concentration factor is no longer a constant, and that the locations of the maximum tangential stress points can shift with changes in fiber orientation with respect to the loading axis. The analysis showed that, through the lamination process, the stress concentration factor can be reduced drastically, and therefore the structural performance can be improved. Both the mixture rule approach and the constant strain approach were used to calculate the stress concentration factor at room temperature. The results predicted by the mixture rule approach deviated from the experimental data by about twenty percent, whereas the results predicted by the constant strain approach matched the test data very well. This shows the importance of the in-plane shear effect in evaluating the stress concentration factor for the X-29A composite plate.
Scherer, Ronny; Nilsen, Trude; Jansen, Malte
2016-01-01
Students' perceptions of instructional quality are among the most important criteria for evaluating teaching effectiveness. The present study evaluates different latent variable modeling approaches (confirmatory factor analysis, exploratory structural equation modeling, and bifactor modeling), which are used to describe these individual perceptions with respect to their factor structure, measurement invariance, and the relations to selected educational outcomes (achievement, self-concept, and motivation in mathematics). On the basis of the Programme for International Student Assessment (PISA) 2012 large-scale data sets of Australia, Canada, and the USA (N = 26,746 students), we find support for the distinction between three factors of individual students' perceptions and full measurement invariance across countries for all modeling approaches. In this regard, bifactor exploratory structural equation modeling outperformed alternative approaches with respect to model fit. Our findings reveal significant relations to the educational outcomes. This study synthesizes different modeling approaches of individual students' perceptions of instructional quality and provides insights into the nature of these perceptions from an individual differences perspective. Implications for the measurement and modeling of individually perceived instructional quality are discussed.
Nonlinear bulging factor based on R-curve data
NASA Technical Reports Server (NTRS)
Jeong, David Y.; Tong, Pin
1994-01-01
In this paper, a nonlinear bulging factor is derived using a strain energy approach combined with dimensional analysis. The functional form of the bulging factor contains an empirical constant that is determined using R-curve data from unstiffened flat and curved panel tests. The determination of this empirical constant is based on the assumption that the R-curve is the same for both flat and curved panels.
A new technique for ordering asymmetrical three-dimensional data sets in ecology.
Pavoine, Sandrine; Blondel, Jacques; Baguette, Michel; Chessel, Daniel
2007-02-01
The aim of this paper is to tackle the problem that arises from asymmetrical data cubes formed by two crossed factors fixed by the experimenter (factor A and factor B, e.g., sites and dates) and a factor which is not controlled for (the species). The entries of this cube are densities in species. We approach this kind of data by the comparison of patterns, that is to say by analyzing first the effect of factor B on the species-factor A pattern, and second the effect of factor A on the species-factor B pattern. The analysis of patterns instead of individual responses requires a correspondence analysis. We use a method we call Foucart's correspondence analysis to coordinate the correspondence analyses of several independent matrices of species x factor A (respectively B) type, corresponding to each modality of factor B (respectively A). Such coordination makes it possible to evaluate the effect of factor B (respectively A) on the species-factor A (respectively B) pattern. The results obtained by such a procedure are much more insightful than those resulting from a classical single correspondence analysis applied to the global matrix that is obtained by simply unrolling the data cube, juxtaposing for example the individual species x factor A matrices through modalities of factor B. This is because a single global correspondence analysis combines three effects of factors in a way that cannot be determined from factorial maps (factor A, factor B, and factor A x factor B interaction) whereas the applications of Foucart's correspondence analysis clearly discriminate two different issues. Using two data sets, we illustrate that this technique proves to be particularly powerful in the analyses of ecological convergence which include several distinct data sets and in the analyses of spatiotemporal variations of species distributions.
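The building block that Foucart's method coordinates is an ordinary correspondence analysis of one species x factor-A table per modality of factor B. A minimal CA via the SVD of standardized residuals is sketched below on a toy count table; the coordination step across the slices of the cube is not reproduced.

```python
import numpy as np

def correspondence_analysis(N):
    P = N / N.sum()                                     # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)                 # row / column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * sv) / np.sqrt(r)[:, None]         # principal row coordinates
    col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]      # principal column coordinates
    return row_coords, col_coords, sv ** 2              # coords and inertia per axis

# One slice of the data cube: rows = species, columns = sites (toy counts)
N = np.array([[10.0, 2, 0], [3, 8, 1], [0, 4, 12]])
rows, cols, inertia = correspondence_analysis(N)
print(inertia / inertia.sum())                          # share of inertia per axis
```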
NASA Astrophysics Data System (ADS)
Yuhendar, A. H.; Wusqa, U.; Kartiko, R. D.; Raya, N. R.; Misbahudin
2016-05-01
A large-scale landslide occurred in Margamukti village, Pangalengan, Bandung Regency, West Java Province, Indonesia. The landslide damaged a geothermal gas pipeline along 300 m in the Wayang Windu Geothermal Field. Based on field observation, the landslide occurred as a rotational sliding movement. Laboratory analyses were conducted to obtain the characteristics of the soil. Based on the condition of the landslide in this area, the Factor of Safety can be simulated by a soil mechanics approach. Factor of safety analysis based on soil cohesion and internal friction angle was conducted using manual sensitivity analysis for back analysis. The analysis found that the soil cohesion in the critical condition (FS < 1) is 6.01 kPa, which is smaller than the cohesion of the undisturbed slope soil sample. Water from rainfall is the most important instability factor in the research area, because it decreases cohesion in the soil and increases weight and pore water pressure in the granular media.
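As a rough illustration of the back analysis (the paper's slope model and geometry are not given here), one can sweep cohesion in the standard infinite-slope factor-of-safety expression until FS falls below 1. The unit weight, depth, slope angle and friction angle below are assumed values, not the site's.

```python
import numpy as np

# Assumed slope parameters: unit weight (kN/m^3), depth (m), slope and friction angles
gamma, z, beta, phi, u = 18.0, 5.0, np.radians(30), np.radians(20), 0.0

def factor_of_safety(c):
    # FS = [c + (gamma*z*cos^2(beta) - u) * tan(phi)] / [gamma*z*sin(beta)*cos(beta)]
    resisting = c + (gamma * z * np.cos(beta) ** 2 - u) * np.tan(phi)
    driving = gamma * z * np.sin(beta) * np.cos(beta)
    return resisting / driving

for c in np.arange(20.0, 0.0, -0.5):     # cohesion in kPa, decreasing
    if factor_of_safety(c) < 1.0:
        print(f"critical cohesion ~ {c:.1f} kPa")
        break
```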
Lee, A H; Yau, K K
2001-01-01
To identify factors associated with hospital length of stay (LOS) and to model variations in LOS within Diagnosis Related Groups (DRGs). A proportional hazards frailty modelling approach is proposed that accounts for patient transfers and the inherent correlation of patients clustered within hospitals. The investigation is based on patient discharge data extracted for a group of obstetrical DRGs. Application of the frailty approach has highlighted several significant factors after adjustment for patient casemix and random hospital effects. In particular, patients admitted for childbirth with private medical insurance coverage have higher risk of prolonged hospitalization compared to public patients. The determination of pertinent factors provides important information to hospital management and clinicians in assessing the risk of prolonged hospitalization. The analysis also enables the comparison of inter-hospital variations across adjacent DRGs.
Analyzing Response Times in Tests with Rank Correlation Approaches
ERIC Educational Resources Information Center
Ranger, Jochen; Kuhn, Jorg-Tobias
2013-01-01
It is common practice to log-transform response times before analyzing them with standard factor analytical methods. However, sometimes the log-transformation is not capable of linearizing the relation between the response times and the latent traits. Therefore, a more general approach to response time analysis is proposed in the current…
A Typology of Adult Literacy Instructional Approaches
ERIC Educational Resources Information Center
Beder, Hal; Lipnevich, Anastasiya; Robinson-Geller, Perrine
2007-01-01
This study addresses the primary question, "What instructional approaches typify adult literacy education in the United States?" as well as several secondary questions. To address the primary question, a survey was developed and responses were received from 598 adult literacy teachers in 12 states. When the data were subjected to factor analysis,…
ERIC Educational Resources Information Center
Mills, Rosemary S. L.; Hastings, Paul D.; Helm, Jonathan; Serbin, Lisa A.; Etezadi, Jamshid; Stack, Dale M.; Schwartzman, Alex E.; Li, Hai Hong
2012-01-01
This study evaluated a comprehensive model of factors associated with internalizing problems (IP) in early childhood, hypothesizing direct, mediated, and moderated pathways linking child temperamental inhibition, maternal overcontrol and rejection, and contextual stressors to IP. In a novel approach, three samples were integrated to form a large…
A Cognitive Component Analysis Approach for Developing Game-Based Spatial Learning Tools
ERIC Educational Resources Information Center
Hung, Pi-Hsia; Hwang, Gwo-Jen; Lee, Yueh-Hsun; Su, I-Hsiang
2012-01-01
Spatial ability has been recognized as one of the most important factors affecting the mathematical performance of students. Previous studies on spatial learning have mainly focused on developing strategies to shorten the problem-solving time of learners for very specific learning tasks. Such an approach usually has limited effects on improving…
The Semantic Distance Task: Quantifying Semantic Distance with Semantic Network Path Length
ERIC Educational Resources Information Center
Kenett, Yoed N.; Levi, Effi; Anaki, David; Faust, Miriam
2017-01-01
Semantic distance is a determining factor in cognitive processes, such as semantic priming, operating upon semantic memory. The main computational approach to compute semantic distance is through latent semantic analysis (LSA). However, objections have been raised against this approach, mainly in its failure at predicting semantic priming. We…
Robles, A; Ruano, M V; Ribes, J; Seco, A; Ferrer, J
2014-04-01
The results of a global sensitivity analysis of a filtration model for submerged anaerobic MBRs (AnMBRs) are assessed in this paper. This study aimed to (1) identify the less- (or non-) influential factors of the model in order to facilitate model calibration and (2) validate the modelling approach (i.e. to determine the need for each of the proposed factors to be included in the model). The sensitivity analysis was conducted using a revised version of the Morris screening method. The dynamic simulations were conducted using long-term data obtained from an AnMBR plant fitted with industrial-scale hollow-fibre membranes. Of the 14 factors in the model, six were identified as influential, i.e. those calibrated using off-line protocols. A dynamic calibration (based on optimisation algorithms) of these influential factors was conducted. The resulting estimated model factors accurately predicted membrane performance. Copyright © 2014 Elsevier Ltd. All rights reserved.
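The Morris method ranks factors by the mean absolute value (mu*) and standard deviation (sigma) of their elementary effects; factors with small mu* are candidates for fixing during calibration, which is how non-influential factors are screened out above. The sketch below implements a simplified one-at-a-time variant (not the revised trajectory-based screening used in the paper) on a placeholder function.

```python
import numpy as np

rng = np.random.default_rng(2)

def model(x):
    # Hypothetical stand-in for the AnMBR filtration model.
    return x[0] ** 2 + 2 * x[1] + 0.1 * x[2] * x[1]

def morris_screening(k=3, r=50, delta=0.1):
    ee = np.zeros((r, k))
    for t in range(r):
        x = rng.uniform(0, 1 - delta, k)
        for i in range(k):
            xp = x.copy()
            xp[i] += delta
            ee[t, i] = (model(xp) - model(x)) / delta   # elementary effect of factor i
    return np.abs(ee).mean(axis=0), ee.std(axis=0)      # mu*, sigma

mu_star, sigma = morris_screening()
print("mu*  :", mu_star)   # small mu* -> candidate non-influential factor
print("sigma:", sigma)     # large sigma -> interactions or nonlinearity
```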
A fuel-based approach to estimating motor vehicle exhaust emissions
NASA Astrophysics Data System (ADS)
Singer, Brett Craig
Motor vehicles contribute significantly to air pollution problems; accurate motor vehicle emission inventories are therefore essential to air quality planning. Current travel-based inventory models use emission factors measured from potentially biased vehicle samples and predict fleet-average emissions which are often inconsistent with on-road measurements. This thesis presents a fuel-based inventory approach which uses emission factors derived from remote sensing or tunnel-based measurements of on-road vehicles. Vehicle activity is quantified by statewide monthly fuel sales data resolved to the air basin level. Development of the fuel-based approach includes (1) a method for estimating cold start emission factors, (2) an analysis showing that fuel-normalized emission factors are consistent over a range of positive vehicle loads and that most fuel use occurs during loaded-mode driving, (3) scaling factors relating infrared hydrocarbon measurements to total exhaust volatile organic compound (VOC) concentrations, and (4) an analysis showing that economic factors should be considered when selecting on-road sampling sites. The fuel-based approach was applied to estimate carbon monoxide (CO) emissions from warmed-up vehicles in the Los Angeles area in 1991, and CO and VOC exhaust emissions for Los Angeles in 1997. The fuel-based CO estimate for 1991 was higher by a factor of 2.3 +/- 0.5 than emissions predicted by California's MVEI 7F model. Fuel-based inventory estimates for 1997 were higher than those of California's updated MVEI 7G model by factors of 2.4 +/- 0.2 for CO and 3.5 +/- 0.6 for VOC. Fuel-based estimates indicate a 20% decrease in the mass of CO emitted, despite an 8% increase in fuel use between 1991 and 1997; official inventory models predict a 50% decrease in CO mass emissions during the same period. Cold start CO and VOC emission factors derived from parking garage measurements were lower than those predicted by the MVEI 7G model. Current inventories in California appear to understate total exhaust CO and VOC emissions, while overstating the importance of cold start emissions. The fuel-based approach yields robust, independent, and accurate estimates of on-road vehicle emissions. Fuel-based estimates should be used to validate or adjust official vehicle emission inventories before society embarks on new, more costly air pollution control programs.
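The core accounting identity of the fuel-based approach is simple enough to show in a few lines: basin-level emissions are the product of a fleet-average emission factor, in grams of pollutant per kilogram of fuel burned, and the mass of fuel sold, with cold starts added as an increment. All numbers below are illustrative assumptions, not the thesis's measured values.

```python
# All values are assumed for illustration only.
FUEL_SOLD_KG = 1.2e9          # monthly fuel sales for one air basin
EF_CO_G_PER_KG = 60.0         # fleet-average CO emission factor from on-road data
COLD_START_FRACTION = 0.1     # extra CO attributed to cold starts

running_co_tonnes = FUEL_SOLD_KG * EF_CO_G_PER_KG / 1e6   # g -> tonnes
total_co_tonnes = running_co_tonnes * (1 + COLD_START_FRACTION)
print(f"running exhaust CO: {running_co_tonnes:,.0f} t/month")
print(f"with cold starts:   {total_co_tonnes:,.0f} t/month")
```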
Butler, Rebecca A.
2014-01-01
Stroke aphasia is a multidimensional disorder in which patient profiles reflect variation along multiple behavioural continua. We present a novel approach to separating the principal aspects of chronic aphasic performance and isolating their neural bases. Principal components analysis was used to extract core factors underlying performance of 31 participants with chronic stroke aphasia on a large, detailed battery of behavioural assessments. The rotated principal components analysis revealed three key factors, which we labelled as phonology, semantic and executive/cognition on the basis of the common elements in the tests that loaded most strongly on each component. The phonology factor explained the most variance, followed by the semantic factor and then the executive-cognition factor. The use of principal components analysis rendered participants’ scores on these three factors orthogonal and therefore ideal for use as simultaneous continuous predictors in a voxel-based correlational methodology analysis of high resolution structural scans. Phonological processing ability was uniquely related to left posterior perisylvian regions including Heschl’s gyrus, posterior middle and superior temporal gyri and superior temporal sulcus, as well as the white matter underlying the posterior superior temporal gyrus. The semantic factor was uniquely related to left anterior middle temporal gyrus and the underlying temporal stem. The executive-cognition factor was not correlated selectively with the structural integrity of any particular region, as might be expected in light of the widely-distributed and multi-functional nature of the regions that support executive functions. The identified phonological and semantic areas align well with those highlighted by other methodologies such as functional neuroimaging and neurostimulation. The use of principal components analysis allowed us to characterize the neural bases of participants’ behavioural performance more robustly and selectively than the use of raw assessment scores or diagnostic classifications because principal components analysis extracts statistically unique, orthogonal behavioural components of interest. As such, in addition to improving our understanding of lesion–symptom mapping in stroke aphasia, the same approach could be used to clarify brain–behaviour relationships in other neurological disorders. PMID:25348632
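The behavioural step can be sketched with scikit-learn: standardize the battery scores, extract a small number of components, and carry the orthogonal component scores forward as simultaneous predictors. The file and column names are hypothetical, the factor labels are applied loosely, and the rotation step reported in the study is omitted for brevity.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

battery = pd.read_csv("aphasia_battery.csv")   # hypothetical: one column per test
Z = StandardScaler().fit_transform(battery)

pca = PCA(n_components=3).fit(Z)
scores = pca.transform(Z)                      # orthogonal factor scores per patient
loadings = pd.DataFrame(pca.components_.T,
                        index=battery.columns,
                        columns=["phonology", "semantics", "executive"])
print(loadings.round(2))                       # inspect which tests load where
# `scores` can then enter a voxelwise regression as uncorrelated predictors.
```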
Pariser, Joseph J; Pearce, Shane M; Patel, Sanjay G; Bales, Gregory T
2015-10-01
To examine the national trends of simple prostatectomy (SP) for benign prostatic hyperplasia (BPH) focusing on perioperative outcomes and risk factors for complications. The National Inpatient Sample (2002-2012) was utilized to identify patients with BPH undergoing SP. Analysis included demographics, hospital details, associated procedures, and operative approach (open, robotic, or laparoscopic). Outcomes included complications, length of stay, charges, and mortality. Multivariate logistic regression was used to determine the risk factors for perioperative complications. Linear regression was used to assess the trends in the national annual utilization of SP. The study population included 35,171 patients. Median length of stay was 4 days (interquartile range 3-6). Cystolithotomy was performed concurrently in 6041 patients (17%). The overall complication rate was 28%, with bleeding occurring most commonly. In total, 148 (0.4%) patients experienced in-hospital mortality. On multivariate analysis, older age, black race, and overall comorbidity were associated with greater risk of complications while the use of a minimally invasive approach and concurrent cystolithotomy had a decreased risk. Over the study period, the national use of simple prostatectomy decreased, on average, by 145 cases per year (P = .002). By 2012, 135/2580 procedures (5%) were performed using a minimally invasive approach. The nationwide utilization of SP for BPH has decreased. Bleeding complications are common, but perioperative mortality is low. Patients who are older, black race, or have multiple comorbidities are at higher risk of complications. Minimally invasive approaches, which are becoming increasingly utilized, may reduce perioperative morbidity. Copyright © 2015 Elsevier Inc. All rights reserved.
Henriques, Justin J; Louis, Garrick E
2011-01-01
Capacity Factor Analysis is a decision support system for selection of appropriate technologies for municipal sanitation services in developing communities. Developing communities are those that lack the capability to provide adequate access to one or more essential services, such as water and sanitation, to their residents. This research developed two elements of Capacity Factor Analysis: a capacity factor based classification for technologies using requirements analysis, and a matching policy for choosing technology options. First, requirements analysis is used to develop a ranking for drinking water supply and greywater reuse technologies. Second, using the Capacity Factor Analysis approach, a matching policy is developed to guide decision makers in selecting the appropriate drinking water supply or greywater reuse technology option for their community. Finally, a scenario-based informal hypothesis test is developed to assist in qualitative model validation through case study. Capacity Factor Analysis is then applied in Cimahi Indonesia as a form of validation. The completed Capacity Factor Analysis model will allow developing communities to select drinking water supply and greywater reuse systems that are safe, affordable, able to be built and managed by the community using local resources, and are amenable to expansion as the community's management capacity increases. Copyright © 2010 Elsevier Ltd. All rights reserved.
Review of U.S. Army Unmanned Aerial Systems Accident Reports: Analysis of Human Error Contributions
2018-03-20
USAARL Report No. 2018-08. By Kathryn A. Introduction: The success of unmanned aerial systems (UAS) operations relies upon a variety of factors, including, but not limited to
Journey to vaccination: a protocol for a multinational qualitative study
Wheelock, Ana; Miraldo, Marisa; Parand, Anam; Vincent, Charles; Sevdalis, Nick
2014-01-01
Introduction In the past two decades, childhood vaccination coverage has increased dramatically, averting an estimated 2–3 million deaths per year. Adult vaccination coverage, however, remains inconsistently recorded and substandard. Although structural barriers are known to limit coverage, social and psychological factors can also affect vaccine uptake. Previous qualitative studies have explored beliefs, attitudes and preferences associated with seasonal influenza (flu) vaccination uptake, yet little research has investigated how participants’ context and experiences influence their vaccination decision-making process over time. This paper aims to provide a detailed account of a mixed methods approach designed to understand the wider constellation of social and psychological factors likely to influence adult vaccination decisions, as well as the context in which these decisions take place, in the USA, the UK, France, India, China and Brazil. Methods and analysis We employ a combination of qualitative interviewing approaches to reach a comprehensive understanding of the factors influencing vaccination decisions, specifically seasonal flu and tetanus. To elicit these factors, we developed the journey to vaccination, a new qualitative approach anchored on the heuristics and biases tradition and the customer journey mapping approach. A purposive sampling strategy is used to select participants who represent a range of key sociodemographic characteristics. Thematic analysis will be used to analyse the data. Typical journeys to vaccination will be proposed. Ethics and dissemination Vaccination uptake is significantly influenced by social and psychological factors, some of which are under-reported and poorly understood. This research will provide a deeper understanding of the barriers and drivers to adult vaccination. Our findings will be published in relevant peer-reviewed journals and presented at academic conferences. They will also be presented as practical recommendations at policy and industry meetings and healthcare professionals’ forums. This research was approved by relevant local ethics committees. PMID:24486678
On a Modern Philosophy of Evaluating Scientific Publications
NASA Astrophysics Data System (ADS)
Guz, A. N.; Rushchitsky, J. J.; Chernyshenko, I. S.
2005-10-01
Current approaches to the citation analysis of scientific publications are outlined. Science Citation Index, Impact Factor, Immediacy Index, and the selection procedure for Essential Science Indicators—a relatively new citation analysis tool—are described. This new citation evaluation tool has not yet been discussed adequately by mechanicians.
Multivariate analysis: A statistical approach for computations
NASA Astrophysics Data System (ADS)
Michu, Sachin; Kaushik, Vandana
2014-10-01
Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, and, more recently, in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks, such as DDoS attacks and network scanning.
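A minimal sketch of the correlation-matrix idea for traffic anomaly detection: estimate a baseline correlation matrix over traffic features from clean data, then flag observation windows whose correlation matrix drifts too far from it. Feature semantics, window sizes and the threshold below are assumptions for illustration.

```python
import numpy as np

def corr_distance(window, baseline_corr):
    # Frobenius distance between the window's correlation matrix and the baseline.
    return np.linalg.norm(np.corrcoef(window, rowvar=False) - baseline_corr)

rng = np.random.default_rng(3)
baseline = rng.normal(size=(500, 4))            # features: pkts, bytes, flows, syns
baseline_corr = np.corrcoef(baseline, rowvar=False)

normal = rng.normal(size=(100, 4))
attack = rng.normal(size=(100, 4))
attack[:, 3] += 5 * attack[:, 0]                # SYN count suddenly tracks packet count

threshold = 1.0                                 # assumed, would be tuned on clean data
for name, w in [("normal", normal), ("attack", attack)]:
    d = corr_distance(w, baseline_corr)
    print(name, round(d, 2), "ANOMALY" if d > threshold else "ok")
```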
What drives continuous improvement project success in healthcare?
Stelson, Paul; Hille, Joshua; Eseonu, Chinweike; Doolen, Toni
2017-02-13
Purpose The purpose of this paper is to present findings from a study of factors that affect continuous improvement (CI) project success in hospitals. Design/methodology/approach Quantitative regression analysis was performed on Likert-scale survey responses. Qualitative thematic analysis was performed on open-ended survey responses and written reports on CI projects. Findings The paper identifies managerial and employee factors that affect project success. These factors include managerial support, communication, and affective commitment. Affective commitment is the extent to which employees perceive the change as being needed or necessary. Practical implications The results highlight how managerial decisions and approaches to communication, before, during, and after CI projects, affect project success. The results also show that success depends on the way employees perceive proposed changes. This suggests the need for a more individualized approach to CI, lean, and broader change initiatives. Originality/value This research is the first to apply project success and sustainability theory to CI projects, beyond Kaizen events, in healthcare environments. The research is particularly important at a time when healthcare organizations are required to make rapid changes with limited resources as they work toward outcome-based assessment and reimbursement rules.
ERIC Educational Resources Information Center
Mohd Daud, Norzaidi; Zakaria, Halimi
2017-01-01
Purpose: The purpose of this paper is to investigate the impact of antecedent factors on collaborative technologies usage among academic researchers in Malaysian research universities. Design/methodology/approach: Data analysis was conducted on data collected from 156 academic researchers from five Malaysian research universities. This study…
ERIC Educational Resources Information Center
Tisdell, C. C.
2017-01-01
Solution methods to exact differential equations via integrating factors have a rich history dating back to Euler (1740) and the ideas enjoy applications to thermodynamics and electromagnetism. Recently, Azevedo and Valentino presented an analysis of the generalized Bernoulli equation, constructing a general solution by linearizing the problem…
ERIC Educational Resources Information Center
Rosique-Blasco, Mario; Madrid-Guijarro, Antonia; García-Pérez-de-Lema, Domingo
2016-01-01
Purpose: The purpose of this paper is to explore how entrepreneurial skills (such as creativity, proactivity and risk tolerance) and socio-cultural factors (such as role model and businessman image) affect secondary education students' propensity towards entrepreneurial options in their future careers. Design/methodology/approach: A sample of…
Personal and Contextual Factors Related to Internalizing Problems during Adolescence
ERIC Educational Resources Information Center
Oliva, Alfredo; Parra, Águeda; Reina, M. Carmen
2014-01-01
Background: Over the past decades, ample empirical evidence has been collected about the factors linked to internalizing problems during adolescence. However, there is a lack of research using holistic approaches for the joint analysis of a series of contextual and personal variables considered to be related to internalizing problems.…
Job Satisfaction: Factor Analysis of Greek Primary School Principals' Perceptions
ERIC Educational Resources Information Center
Saiti, Anna; Fassoulis, Konstantinos
2012-01-01
Purpose: The purpose of this paper is to investigate the factors that affect the level of job satisfaction that school principals experience and, based on the findings, to suggest policies or techniques for improving it. Design/methodology/approach: Questionnaires were administered to 180 primary school heads in 13 prefectures--one from each of…
Man-machine analysis of translation and work tasks of Skylab films
NASA Technical Reports Server (NTRS)
Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.
1979-01-01
An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.
Social Anxiety among Chinese People
Fan, Qianqian; Chang, Weining C.
2015-01-01
The experience of social anxiety has largely been investigated among Western populations; much less is known about social anxiety in other cultures. Unlike the Western culture, the Chinese emphasize interdependence and harmony with social others. In addition, it is unclear whether Western-constructed instruments adequately capture culturally conditioned conceptualizations and manifestations of social anxiety that might be specific to the Chinese. The present study employed a sequence of qualitative and quantitative approaches to examine the assessment of social anxiety among the Chinese people. Interviews and focus group discussions with Chinese participants revealed that some items capturing the experience of social anxiety among the Chinese are not present in existing Western measures. Factor analysis was employed to examine the factor structure of the more comprehensive scale. This approach revealed an “other concerned anxiety” factor that appears to be specific to the Chinese. Subsequent analysis found that the new factor—other concerned anxiety—functioned in the same way as the other social anxiety factors in its association with risk factors of social anxiety, such as attachment, parenting, behavioral inhibition/activation, and attitude toward group. The implications of these findings for a more culturally sensitive assessment tool of social anxiety among the Chinese were discussed. PMID:26380367
Chauhan, Rinki; Ravi, Janani; Datta, Pratik; Chen, Tianlong; Schnappinger, Dirk; Bassler, Kevin E.; Balázsi, Gábor; Gennaro, Maria Laura
2016-01-01
Accessory sigma factors, which reprogram RNA polymerase to transcribe specific gene sets, activate bacterial adaptive responses to noxious environments. Here we reconstruct the complete sigma factor regulatory network of the human pathogen Mycobacterium tuberculosis by an integrated approach. The approach combines identification of direct regulatory interactions between M. tuberculosis sigma factors in an E. coli model system, validation of selected links in M. tuberculosis, and extensive literature review. The resulting network comprises 41 direct interactions among all 13 sigma factors. Analysis of network topology reveals (i) a three-tiered hierarchy initiating at master regulators, (ii) high connectivity and (iii) distinct communities containing multiple sigma factors. These topological features are likely associated with multi-layer signal processing and specialized stress responses involving multiple sigma factors. Moreover, the identification of overrepresented network motifs, such as autoregulation and coregulation of sigma and anti-sigma factor pairs, provides structural information that is relevant for studies of network dynamics. PMID:27029515
Tobin, David L; Banker, Judith D; Weisberg, Laura; Bowers, Wayne
2007-12-01
Although several studies have shown that eating disorders clinicians do not generally use treatment manuals, findings regarding what they do use have typically been vague, or closely linked to a particular theoretical approach. Our goal was to identify what eating disorder clinicians do with their patients in a more theoretically neutral context. We also sought to describe an empirically defined approach to psychotherapeutic practice as defined by clinicians via factor analysis. A survey developed for this study was administered to 265 clinicians recruited online and at regional and international meetings for eating disorders professionals. Only 6% of respondents reported they adhered closely to treatment manuals and 98% of the respondents indicated they used both behavioral and dynamically informed interventions. Factor analysis of clinicians' use of 32 therapeutic strategies suggested seven dimensions: Psychodynamic Interventions, Coping Skills Training, Family History, CBT, Contracts, Therapist Disclosure, and Patient Feelings. The findings of this study suggest that most clinicians use a wide array of eating disorder treatment interventions drawn from empirically supported treatments, such as CBT-BN, and from treatments that have no randomized controlled trial support. Factor analysis suggested theoretically linked dimensions of treatment, but also dimensions that are common across models. (c) 2007 by Wiley Periodicals, Inc.
Hansen, Hans; Weber, Reinhard
2009-02-01
An evaluation of tonal components in noise using a semantic differential approach yields several perceptual and connotative factors. This study investigates the effect of culture on these factors with the aid of equivalent listening tests carried out in Japan (n=20), France (n=23), and Germany (n=20). The data's equivalence level is determined by a bias analysis. This analysis gives insight in the cross-cultural validity of the scales used for sound character determination. Three factors were extracted by factor analysis in all cultural subsamples: pleasant, metallic, and power. By employing appropriate target rotations of the factor spaces, the rotated factors were compared and they yield high similarities between the different cultural subsamples. To check cross-cultural differences in means, an item bias analysis was conducted. The a priori assumption of unbiased scales is rejected; the differences obtained are partially linked to bias effects. Acoustical sound descriptors were additionally tested for the semantic dimensions. The high agreement in judgments between the different cultural subsamples contrast the moderate success of the signal parameters to describe the dimensions.
[PROGNOSTIC MODELS IN MODERN MANAGEMENT OF VULVAR CANCER].
Tsvetkov, Ch; Gorchev, G; Tomov, S; Nikolova, M; Genchev, G
2016-01-01
The aim of the research was to evaluate and analyse prognosis and prognostic factors in patients with squamous cell vulvar carcinoma after primary surgery, with an individual approach applied during the course of treatment. In the period between January 2000 and July 2010, 113 patients with squamous cell carcinoma of the vulva were diagnosed and operated on at the Gynecologic Oncology Clinic of Medical University, Pleven. All the patients were monitored at the same clinic. An individual approach was applied to each patient and, whenever possible, more conservative operative techniques were used. The probable clinicopathological characteristics influencing overall survival and recurrence-free survival were analyzed. Univariate statistical analysis and Cox regression analysis were performed to evaluate the characteristics that were statistically significant for overall survival and survival without recurrence. A multivariate logistic regression analysis (Forward Wald procedure) was applied to evaluate the combined influence of the significant factors. In the multivariate analysis, the synergic effect of the independent prognostic factors on both kinds of survival was also evaluated. Approaching each patient individually, we applied the following operative techniques: 1. Deep total radical vulvectomy with separate incisions for lymph dissection (LD) or without dissection--68 (60.18%) patients. 2. En-bloc vulvectomy with bilateral LD without vulva reconstruction--10 (8.85%). 3. Modified radical vulvectomy (hemivulvectomy, partial vulvectomy)--25 (22.02%). 4. Wide local excision--3 (2.65%). 5. Simple (total/partial) vulvectomy--5 (4.43%) patients. 6. En-bloc resection with reconstruction--2 (1.77%). After a thorough analysis of overall survival and recurrence-free survival, we concluded that recurrence occurrence and FIGO clinical stage were independent prognostic factors for overall survival, and that the independent prognostic factors for recurrence-free survival were: metastatic inguinal nodes (unilateral or bilateral), tumor size (above or below 3 cm) and lymphovascular space invasion. On the basis of these results we created two prognostic models: 1. A prognostic model of overall survival. 2. A prognostic model of survival without recurrence. Following the surgical staging of the disease, we were able to gather and analyse important clinicopathological indexes, which gave us the opportunity to form prognostic groups for overall survival and recurrence-free survival.
Shmool, Jessie L C; Kubzansky, Laura D; Newman, Ogonnaya Dotson; Spengler, John; Shepard, Peggy; Clougherty, Jane E
2014-11-06
Recent toxicological and epidemiological evidence suggests that chronic psychosocial stress may modify pollution effects on health. Thus, there is increasing interest in refined methods for assessing and incorporating non-chemical exposures, including social stressors, into environmental health research, towards identifying whether and how psychosocial stress interacts with chemical exposures to influence health and health disparities. We present a flexible, GIS-based approach for examining spatial patterns within and among a range of social stressors, and their spatial relationships with air pollution, across New York City, towards understanding their combined effects on health. We identified a wide suite of administrative indicators of community-level social stressors (2008-2010), and applied simultaneous autoregressive models and factor analysis to characterize spatial correlations among social stressors, and between social stressors and air pollutants, using New York City Community Air Survey (NYCCAS) data (2008-2009). Finally, we provide an exploratory ecologic analysis evaluating possible modification of the relationship between nitrogen dioxide (NO2) and childhood asthma Emergency Department (ED) visit rates by social stressors, to demonstrate how the methods used to assess stressor exposure (and/or consequent psychosocial stress) may alter model results. Administrative indicators of a range of social stressors (e.g., high crime rate, residential crowding rate) were not consistently correlated (rho = -0.44 to 0.89), nor were they consistently correlated with indicators of socioeconomic position (rho = -0.54 to 0.89). Factor analysis using 26 stressor indicators suggested geographically distinct patterns of social stressors, characterized by three factors: violent crime and physical disorder, crowding and poor access to resources, and noise disruption and property crimes. In an exploratory ecologic analysis, these factors were differentially associated with area-average NO2 and childhood asthma ED visits. For example, only the 'violent crime and disorder' factor was significantly associated with asthma ED visits, and only the 'crowding and resource access' factor modified the association between area-level NO2 and asthma ED visits. This spatial approach enabled quantification of complex spatial patterning and confounding between chemical and non-chemical exposures, and can inform study design for epidemiological studies of separate and combined effects of multiple urban exposures.
Fatigue Crack Growth Rate and Stress-Intensity Factor Corrections for Out-of-Plane Crack Growth
NASA Technical Reports Server (NTRS)
Forth, Scott C.; Herman, Dave J.; James, Mark A.
2003-01-01
Fatigue crack growth rate testing is performed by automated data collection systems that assume straight crack growth in the plane of symmetry and use standard polynomial solutions to compute crack length and stress-intensity factors from compliance or potential drop measurements. Visual measurements used to correct the collected data typically include only the horizontal crack length, which for cracks that propagate out-of-plane, under-estimates the crack growth rates and over-estimates the stress-intensity factors. The authors have devised an approach for correcting both the crack growth rates and stress-intensity factors based on two-dimensional mixed mode-I/II finite element analysis (FEA). The approach is used to correct out-of-plane data for 7050-T7451 and 2025-T6 aluminum alloys. Results indicate the correction process works well for high DeltaK levels but fails to capture the mixed-mode effects at DeltaK levels approaching threshold (da/dN approximately 10(exp -10) meter/cycle).
NASA Astrophysics Data System (ADS)
Sumantari, Y. D.; Slamet, I.; Sugiyanto
2017-06-01
Semiparametric regression is a statistical analysis method that combines parametric and nonparametric regression. There are various approaches in nonparametric regression, one of which is the spline. Central Java is one of the most densely populated provinces in Indonesia. Population density in this province can be modeled by semiparametric regression because it involves both parametric and nonparametric components. Therefore, the purpose of this paper is to determine the factors that influence population density in Central Java using the semiparametric spline regression model. The results show that the factors influencing population density in Central Java are the number of active Family Planning (FP) participants and the district minimum wage.
Gray, Andrea; Maguire, Timothy; Schloss, Rene; Yarmush, Martin L
2015-01-01
Induction of therapeutic mesenchymal stromal cell (MSC) function is dependent upon activating factors present in diseased or injured tissue microenvironments. These functions include modulation of macrophage phenotype via secreted molecules including prostaglandin E2 (PGE2). Many approaches aim to optimize MSC-based therapies, including preconditioning using soluble factors and cell immobilization in biomaterials. However, optimization of MSC function is usually inefficient as only a few factors are manipulated in parallel. We utilized fractional factorial design of experiments to screen a panel of 6 molecules (lipopolysaccharide [LPS], polyinosinic-polycytidylic acid [poly(I:C)], interleukin [IL]-6, IL-1β, interferon [IFN]-β, and IFN-γ), individually and in combinations, for the upregulation of MSC PGE2 secretion and attenuation of macrophage secretion of tumor necrosis factor (TNF)-α, a pro-inflammatory molecule, by activated-MSC conditioned medium (CM). We used multivariable linear regression (MLR) and analysis of covariance to determine differences in functions of optimal factors on monolayer MSCs and alginate-encapsulated MSCs (eMSCs). The screen revealed that LPS and IL-1β potently activated monolayer MSCs to enhance PGE2 production and attenuate macrophage TNF-α. Activation by LPS and IL-1β together synergistically increased MSC PGE2, but did not synergistically reduce macrophage TNF-α. MLR and covariate analysis revealed that macrophage TNF-α was strongly dependent on the MSC activation factor, PGE2 level, and macrophage donor but not MSC culture format (monolayer versus encapsulated).
The results demonstrate the feasibility and utility of using statistical approaches for higher throughput cell analysis. This approach can be extended to develop activation schemes to maximize MSC and MSC-biomaterial functions prior to transplantation to improve MSC therapies.
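As an editorial illustration of the screening strategy this abstract describes, the following is a minimal Python sketch of a two-level fractional factorial design analyzed with a main-effects regression. The factor labels, design generators, and response model are hypothetical stand-ins, not the study's data or design.

```python
import itertools
import numpy as np

# Full 2^4 design in coded units (-1/+1) for base factors A-D, then
# generate E = ABC and F = BCD to obtain a resolution-IV 2^(6-2) fraction:
# 16 runs screen 6 factors instead of the 64 runs of a full design.
base = np.array(list(itertools.product([-1, 1], repeat=4)))
A, B, C, D = base.T
E, F = A * B * C, B * C * D
design = np.column_stack([A, B, C, D, E, F])   # 16 runs x 6 factors

rng = np.random.default_rng(0)
# Hypothetical response: PGE2 secretion driven mainly by A and D
# (standing in for LPS and IL-1beta), plus an interaction and noise.
pge2 = 10 + 3 * A + 2 * D + 1.5 * A * D + rng.normal(0, 0.5, 16)

# Main-effects regression on the coded design matrix.
X = np.column_stack([np.ones(16), design])
coef, *_ = np.linalg.lstsq(X, pge2, rcond=None)
for name, b in zip(["intercept", "A", "B", "C", "D", "E", "F"], coef):
    print(f"{name:9s} effect estimate: {b:+.2f}")
```

Because the fraction has resolution IV, main effects are not aliased with each other, though two-factor interactions (such as the A*D term above) remain confounded among themselves.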
Water conservation behavior in Australia.
Dolnicar, Sara; Hurlimann, Anna; Grün, Bettina
2012-08-30
Ensuring a nation's long-term water supply requires the use of both supply-side approaches, such as water augmentation through water recycling, and demand-side approaches, such as water conservation. Conservation behavior can only be increased if the key drivers of such behavior are understood. The aim of this study is to reveal the main drivers from a comprehensive pool of hypothesized factors. An empirical study was conducted with 3094 Australians. Data were analyzed using multivariate linear regression analysis and decision trees to determine which factors best predict self-reported water conservation behavior. Two key factors emerge: a high level of pro-environmental behavior and proactively seeking out information about water. A number of less influential factors are also revealed. Public communication strategy implications are derived.
Chen, Gang; Adleman, Nancy E.; Saad, Ziad S.; Leibenluft, Ellen; Cox, Robert W.
2014-01-01
All neuroimaging packages can handle group analysis with t-tests or general linear modeling (GLM). However, they are quite hamstrung when there are multiple within-subject factors or when quantitative covariates are involved in the presence of a within-subject factor. In addition, sphericity is typically assumed for the variance–covariance structure when there are more than two levels in a within-subject factor. To overcome such limitations in the traditional AN(C)OVA and GLM, we adopt a multivariate modeling (MVM) approach to analyzing neuroimaging data at the group level with the following advantages: a) there is no limit on the number of factors as long as sample sizes are deemed appropriate; b) quantitative covariates can be analyzed together with within-subject factors; c) when a within-subject factor is involved, three testing methodologies are provided: traditional univariate testing (UVT) with sphericity assumption (UVT-UC) and with correction when the assumption is violated (UVT-SC), and within-subject multivariate testing (MVT-WS); d) to correct for sphericity violation at the voxel level, we propose a hybrid testing (HT) approach that achieves equal or higher power by combining traditional sphericity correction methods (Greenhouse–Geisser and Huynh–Feldt) with MVT-WS. PMID:24954281
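To make the sphericity-correction idea concrete, here is a minimal sketch of the Greenhouse–Geisser epsilon for a single within-subject factor, computed on simulated data. This is only the basic statistic the abstract refers to; the paper's voxel-wise hybrid testing is far more involved.

```python
import numpy as np

def gg_epsilon(data):
    """Greenhouse-Geisser epsilon for an (n_subjects, k_levels) array:
    epsilon = tr(M)^2 / ((k-1) * tr(M @ M)), with M = H S H' for the
    sample covariance S and orthonormal contrasts H spanning the
    within-subject difference space. Epsilon = 1 under sphericity."""
    n, k = data.shape
    S = np.cov(data, rowvar=False)          # k x k condition covariance
    # Orthonormal Helmert contrasts (each row sums to zero).
    H = np.zeros((k - 1, k))
    for i in range(1, k):
        H[i - 1, :i], H[i - 1, i] = 1.0 / i, -1.0
        H[i - 1] /= np.linalg.norm(H[i - 1])
    M = H @ S @ H.T
    return np.trace(M) ** 2 / ((k - 1) * np.sum(M * M))

rng = np.random.default_rng(1)
# 20 subjects x 4 levels with a shared subject effect (correlated levels).
y = rng.normal(size=(20, 4)) + rng.normal(size=(20, 1))
print(f"GG epsilon: {gg_epsilon(y):.3f}  (1.0 = sphericity holds)")
```

In practice the epsilon is used to deflate the ANOVA degrees of freedom before computing the p-value, which is the "correction" step the abstract mentions.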
Gene expression profiling--Opening the black box of plant ecosystem responses to global change
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leakey, A.D.B.; Ainsworth, E.A.; Bernard, S.M.
The use of genomic techniques to address ecological questions is emerging as the field of genomic ecology. Experimentation under environmentally realistic conditions to investigate the molecular response of plants to meaningful changes in growth conditions and ecological interactions is the defining feature of genomic ecology. Since the impacts of global change factors on plant performance are mediated by direct effects at the molecular, biochemical and physiological scales, gene expression analysis promises important advances in understanding factors that have previously been consigned to the 'black box' of unknown mechanism. Various tools and approaches are available for assessing gene expression in model and non-model species as part of global change biology studies. Each approach has its own unique advantages and constraints. A first generation of genomic ecology studies in managed ecosystems and mesocosms has provided a testbed for the approach and has begun to reveal how the experimental design and data analysis of gene expression studies can be tailored for use in an ecological context.
NASA Astrophysics Data System (ADS)
Luce, R.; Hildebrandt, P.; Kuhlmann, U.; Liesen, J.
2016-09-01
The key challenge of time-resolved Raman spectroscopy is the identification of the constituent species and the analysis of the kinetics of the underlying reaction network. In this work we present an integral approach that allows for determining both the component spectra and the rate constants simultaneously from a series of vibrational spectra. It is based on an algorithm for non-negative matrix factorization which is applied to the experimental data set following a few pre-processing steps. As a prerequisite for physically unambiguous solutions, each component spectrum must include one vibrational band that does not significantly interfere with vibrational bands of other species. The approach is applied to synthetic "experimental" spectra derived from model systems comprising a set of species with component spectra differing with respect to their degree of spectral interferences and signal-to-noise ratios. In each case, the species involved are connected via monomolecular reaction pathways. The potential and limitations of the approach for recovering the respective rate constants and component spectra are discussed.
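The core idea, factorizing a spectra-by-time matrix into non-negative component spectra and concentration profiles, can be sketched with scikit-learn's NMF on synthetic data. The band positions, kinetics, and noise level below are invented for illustration; the paper's specific algorithm and pre-processing steps are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
wavenumber = np.linspace(0, 1, 400)

def band(center, width):
    # Gaussian vibrational band on the normalized wavenumber axis.
    return np.exp(-0.5 * ((wavenumber - center) / width) ** 2)

# Two hypothetical component spectra; each has one band that does not
# interfere with the other species (the paper's uniqueness prerequisite).
comp = np.vstack([band(0.25, 0.02) + 0.4 * band(0.6, 0.05),
                  band(0.80, 0.02) + 0.4 * band(0.6, 0.05)])
# Monomolecular reaction A -> B: concentration profiles at 30 time points.
t = np.linspace(0, 5, 30)
conc = np.vstack([np.exp(-t), 1 - np.exp(-t)]).T           # 30 x 2
spectra = conc @ comp + rng.normal(0, 0.01, (30, 400))      # observed series
spectra = np.clip(spectra, 0, None)                         # non-negativity

nmf = NMF(n_components=2, init="nndsvd", max_iter=2000)
C = nmf.fit_transform(spectra)   # estimated concentration profiles (30 x 2)
S = nmf.components_              # estimated component spectra      (2 x 400)
print("reconstruction error:", round(nmf.reconstruction_err_, 4))
```

Fitting an exponential to the recovered concentration columns would then give the rate constant, which is the "simultaneous" step the abstract highlights.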
Dimensionality Assessment of Ordered Polytomous Items with Parallel Analysis
ERIC Educational Resources Information Center
Timmerman, Marieke E.; Lorenzo-Seva, Urbano
2011-01-01
Parallel analysis (PA) is an often-recommended approach for assessment of the dimensionality of a variable set. PA is known in different variants, which may yield different dimensionality indications. In this article, the authors considered the most appropriate PA procedure to assess the number of common factors underlying ordered polytomously…
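For reference, a minimal sketch of Horn's parallel analysis using Pearson correlations follows; the article's focus, PA variants for ordered polytomous items (e.g., based on polychoric correlations), would replace the correlation step but leave the eigenvalue-comparison logic intact.

```python
import numpy as np

def parallel_analysis(data, n_sim=200, quantile=95, seed=0):
    """Retain components whose observed correlation eigenvalues exceed
    the chosen percentile of eigenvalues from random normal data of the
    same shape (one common PA variant; others exist)."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    sim_eig = np.empty((n_sim, p))
    for s in range(n_sim):
        sim = rng.normal(size=(n, p))
        sim_eig[s] = np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False))[::-1]
    threshold = np.percentile(sim_eig, quantile, axis=0)
    return int(np.sum(obs_eig > threshold))

# Illustrative data: 8 variables generated from 2 latent factors.
rng = np.random.default_rng(3)
F = rng.normal(size=(300, 2))
X = F @ rng.normal(size=(2, 8)) + rng.normal(size=(300, 8))
print("suggested number of factors:", parallel_analysis(X))
```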
A Meta-Analysis of the Predictors of Cyberbullying Perpetration and Victimization
ERIC Educational Resources Information Center
Guo, Siying
2016-01-01
Previous studies have investigated various aspects of cyberbullying. Using meta-analytic approaches, this study primarily aimed to determine the target factors predicting individuals' perpetration and victimization in cyberbullying. A meta-analysis of 77 studies containing 418 primary effect sizes was conducted to examine the relative magnitude…
Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM]
ERIC Educational Resources Information Center
Warner, Rebecca M.
2007-01-01
This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…
Exploring Incomplete Rating Designs with Mokken Scale Analysis
ERIC Educational Resources Information Center
Wind, Stefanie A.; Patil, Yogendra J.
2018-01-01
Recent research has explored the use of models adapted from Mokken scale analysis as a nonparametric approach to evaluating rating quality in educational performance assessments. A potential limiting factor to the widespread use of these techniques is the requirement for complete data, as practical constraints in operational assessment systems…
Corriveau, H; Arsenault, A B; Dutil, E; Lepage, Y
1992-01-01
An evaluation based on the Bobath approach to treatment has previously been developed and partially validated. The purpose of the present study was to verify the content validity of this evaluation using a statistical approach known as principal components analysis. Thirty-eight hemiplegic subjects participated in the study. Scores on each of six parameters (sensorium, active movements, muscle tone, reflex activity, postural reactions, and pain) were evaluated on three occasions across a 2-month period. Each analysis produced three factors that contained 70% of the variation in the data set. The first component mainly reflected variations in mobility, the second mainly variations in muscle tone, and the third mainly variations in sensorium and pain. The results of this exploratory analysis highlight the fact that some of the parameters are not only important but also interrelated. These results seem to partially support the conceptual framework substantiating the Bobath approach to treatment.
NASA Astrophysics Data System (ADS)
Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai
2017-08-01
Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks the conventional practice is to decode motor intentions by using scalp EEG. However, scalp EEG only reveals certain limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We have proposed a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG to an unsupervised factor analysis to make it into a supervised learning method. Main Results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, that is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis on the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and thus it may lead to new strategies for BCI-based neurorehabilitation.
ERIC Educational Resources Information Center
Binder, Martin; Coad, Alex
2011-01-01
There is an ambiguity in Amartya Sen's capability approach as to what constitutes an individual's resources, conversion factors and valuable functionings. What we here call the "circularity problem" points to the fact that all three concepts seem to be mutually endogenous and interdependent. To econometrically account for this…
ERIC Educational Resources Information Center
Riese, Hanne
2011-01-01
This article argues that participative approaches, such as those found in enterprise or entrepreneurship education, allow several factors to influence learning activity. The "Mini-enterprise" (Young Enterprise) approach is one where students set up and run their own business during a school year. This article is based on the analysis of…
Application of Grey Relational Analysis to Decision-Making during Product Development
ERIC Educational Resources Information Center
Hsiao, Shih-Wen; Lin, Hsin-Hung; Ko, Ya-Chuan
2017-01-01
A multi-attribute decision-making (MADM) approach was proposed in this study as a prediction method that differs from the conventional production and design methods for a product. When a client has different dimensional requirements, this approach can quickly provide a company with design decisions for each product. The production factors of a…
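Grey relational analysis, the MADM technique named in the title above, is simple enough to sketch in a few lines. The decision matrix, equal attribute weights, and larger-is-better normalization below are illustrative assumptions, not the study's data.

```python
import numpy as np

def grey_relational_grades(X, zeta=0.5):
    """Grey relational analysis: rank alternatives (rows) against the
    ideal reference sequence after min-max normalization. zeta is the
    conventional distinguishing coefficient (0.5)."""
    # Normalize each attribute (column) to [0, 1], larger-is-better.
    Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    reference = Xn.max(axis=0)               # ideal alternative (all ones)
    delta = np.abs(Xn - reference)           # deviation sequences
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + zeta * dmax) / (delta + zeta * dmax)  # GR coefficients
    return coeff.mean(axis=1)                # equal-weight GR grades

# Hypothetical decision matrix: 4 design alternatives x 3 attributes.
X = np.array([[2.0, 7.0, 0.3],
              [3.5, 6.0, 0.5],
              [2.8, 8.0, 0.4],
              [3.0, 5.5, 0.6]])
grades = grey_relational_grades(X)
print("GR grades:", np.round(grades, 3))
print("ranking (best first):", np.argsort(grades)[::-1])
```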
USDA-ARS?s Scientific Manuscript database
The use of nitrogen stable isotopes for estimation of animal trophic position has become an indispensable approach in food web ecology. Compound-specific isotope analysis of amino acids is a new approach for estimating trophic position that may overcome key issues associated with nitrogen stable iso...
ERIC Educational Resources Information Center
Greenwood, Charles R., Ed.; And Others
This monograph contains five papers that discuss an eco-behavioral approach to psychology, special education, and applied behavior analysis. The papers point out the advantages of assessing ecological factors (such as natural stimuli and special education procedures) in a quantitative fashion and in a temporal relationship with student behavior.…
Comparative study of two approaches to model the offshore fish cages
NASA Astrophysics Data System (ADS)
Zhao, Yun-peng; Wang, Xin-xin; Decew, Jud; Tsukrov, Igor; Bai, Xiao-dong; Bi, Chun-wei
2015-06-01
The goal of this paper is to provide a comparative analysis of two commonly used approaches to discretizing offshore fish cages: the lumped-mass approach (LMA) and the finite element analysis (FEA) technique. Two case studies are chosen to compare the predictions of LMA- and FEA-based numerical modeling techniques. In both case studies, we consider several loading conditions consisting of different uniform currents and monochromatic waves, and investigate the motion of the cage, its deformation, and the resultant tension in the mooring lines. Both models' predictions are sufficiently close to the experimental data, although for the first experiment the DUT-FlexSim predictions are slightly more accurate than those provided by Aqua-FE™. According to the comparisons, both models can be successfully applied to the design and analysis of offshore fish cages provided that an appropriate safety factor is chosen.
Pomp, E R; Van Stralen, K J; Le Cessie, S; Vandenbroucke, J P; Rosendaal, F R; Doggen, C J M
2010-07-01
We discuss the analytic and practical considerations in a large case-control study that had two control groups, the first consisting of partners of patients and the second obtained by random digit dialling (RDD). As an example of the evaluation of a general lifestyle factor, we present body mass index (BMI). Both control groups had lower BMIs than the patients. The distribution in the partner controls was closer to that of the patients, likely due to similar lifestyles. A statistical approach was used to pool the results of both analyses, wherein partners were analyzed with a matched analysis, while RDD controls were analyzed without matching. Even with a matched analysis, the odds ratio with partner controls remained closer to unity than with RDD controls, probably due to unmeasured confounders in the comparison with the random controls as well as intermediary factors. However, when studying injuries as a risk factor, the odds ratio remained higher with partner control subjects than with RDD control subjects, even after taking the matching into account. Finally, we used factor V Leiden as an example of a genetic risk factor. The frequencies of factor V Leiden were identical in both control groups, indicating that for the analysis of this genetic risk factor the two control groups could be combined in a single unmatched analysis. In conclusion, the effect measures with the two control groups were in the same direction and of the same order of magnitude. Moreover, it was not always the same control group that produced the higher or lower estimates, and a matched analysis did not remedy the differences. Our experience with the intricacies of dealing with two control groups may be useful to others when considering an optimal research design or the best statistical approach.
NASA Astrophysics Data System (ADS)
Tanty, Kiranbala; Mukharjee, Bibhuti Bhusan; Das, Sudhanshu Shekhar
2018-06-01
The present study investigates the effect of replacing the coarse fraction of natural aggregates with recycled concrete aggregates on the properties of hot mix asphalt (HMA) using a general factorial design approach. Two factors are considered: recycled coarse aggregate percentage [RCA (%)] and bitumen content percentage [BC (%)]. Tests have been carried out on HMA-type bituminous concrete prepared with varying RCA (%) and BC (%). Analysis of variance has been performed on the experimental data to determine the effect of the chosen factors on parameters such as stability, flow, air voids, voids in mineral aggregate, voids filled with bitumen, and bulk density. The study shows that RCA (%) and BC (%) have significant effects on the selected responses, as the p value is less than the chosen significance level. In addition, the outcomes of the statistical analysis indicate that the interaction between the factors has significant effects on the voids in mineral aggregate and the bulk density of the bituminous concrete.
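A two-factor ANOVA with interaction of the kind described can be sketched as follows with statsmodels. The factor levels, replicate counts, and response model are illustrative, not the study's experimental data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(4)
# Hypothetical general factorial design: RCA% at 3 levels, BC% at 3 levels,
# 3 replicate specimens per cell; response = Marshall stability (kN).
rca = np.repeat([0, 25, 50], 9)
bc = np.tile(np.repeat([4.5, 5.0, 5.5], 3), 3)
stability = (12 - 0.02 * rca + 0.8 * (bc - 5)
             - 0.01 * rca * (bc - 5) + rng.normal(0, 0.3, 27))
df = pd.DataFrame({"RCA": rca, "BC": bc, "stability": stability})

# Two-way ANOVA with interaction, both factors treated as categorical.
model = smf.ols("stability ~ C(RCA) * C(BC)", data=df).fit()
print(anova_lm(model, typ=2))  # F and p for main effects and interaction
```

The p value of the C(RCA):C(BC) row is the interaction test the abstract refers to; repeating the fit for each response (flow, air voids, and so on) reproduces the structure of the reported analysis.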
Gürgen, Fikret; Gürgen, Nurgül
2003-01-01
This study proposes an intelligent data analysis approach to investigate and interpret the distinctive factors of diabetes mellitus patients with and without ischemic (non-embolic type) stroke in a small population. The database consists of a total of 16 features collected from 44 diabetic patients. Features include age, gender, duration of diabetes, cholesterol, high density lipoprotein, triglyceride levels, neuropathy, nephropathy, retinopathy, peripheral vascular disease, myocardial infarction rate, glucose level, medication and blood pressure. Metric and non-metric features are distinguished. First, the mean and covariance of the data are estimated and the correlated components are observed. Second, major components are extracted by principal component analysis. Finally, as common examples of local and global classification approaches, a k-nearest neighbor classifier and a high-degree polynomial classifier (a multilayer perceptron) are employed for classification, both with all components and with the major components only. Macrovascular changes emerged as the principal distinctive factors of ischemic stroke in diabetes mellitus. Microvascular changes were generally ineffective discriminators. Recommendations were made according to the rules of evidence-based medicine. Briefly, this case study, based on a small population, supports theories of stroke in diabetes mellitus patients and also concludes that the use of intelligent data analysis improves personalized preventive intervention. PMID:12685939
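The "major components" pipeline described here, PCA followed by a local classifier, can be sketched with scikit-learn on simulated stand-in data. The feature shift, component count, and neighbor count are illustrative choices, not the study's settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# Stand-in for the 16-feature, 44-patient data set: the stroke class is
# shifted along a few "macrovascular" features, the rest are noise.
X = rng.normal(size=(44, 16))
y = np.r_[np.zeros(22), np.ones(22)].astype(int)
X[y == 1, :3] += 1.0

# Local (k-NN) classifier on the leading principal components.
clf = make_pipeline(StandardScaler(),
                    PCA(n_components=4),
                    KNeighborsClassifier(n_neighbors=3))
scores = cross_val_score(clf, X, y, cv=4)
print("CV accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```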
NASA Technical Reports Server (NTRS)
Deal, Don E.
1991-01-01
The chief goals of the summer project have been twofold - first, for my host group and myself to learn as much of the working details of Taguchi analysis as possible in the time allotted, and, second, to apply the methodology to a design problem with the intention of establishing a preliminary set of near-optimal (in the sense of producing a desired response) design parameter values from among a large number of candidate factor combinations. The selected problem is concerned with determining design factor settings for an automated approach program which is to have the capability of guiding the Shuttle into the docking port of the Space Station under controlled conditions so as to meet and/or optimize certain target criteria. The candidate design parameters under study were glide path (i.e., approach) angle, path intercept and approach gains, and minimum impulse bit mode (a parameter which defines how Shuttle jets shall be fired). Several performance criteria were of concern: terminal relative velocity at the instant the two spacecraft are mated; docking offset; number of Shuttle jet firings in certain specified directions (of interest due to possible plume impingement on the Station's solar arrays); and total RCS (a measure of the energy expended in performing the approach/docking maneuver). In the material discussed here, we have focused on a single performance criterion - total RCS. An analysis of the possibility of employing a multiobjective function composed of a weighted sum of the various individual criteria has been undertaken but is, at this writing, incomplete. Results from the Taguchi statistical analysis indicate that only three of the original four posited factors are significant in affecting RCS response. A comparison of model simulation output (via Monte Carlo) with predictions based on estimated factor effects inferred from the Taguchi experiment array data suggested acceptable or close agreement between the two, except at the predicted optimum point, where a difference outside a rule-of-thumb bound was observed. We have concluded that there is most likely an interaction effect not provided for in the original orthogonal array selected as the basis for our experimental design. However, we feel that the data indicate that this interaction is a mild one and that inclusion of its effect will not alter the location of the optimum.
Richardson, Miles
2017-04-01
In ergonomics there is often a need to identify and predict the separate effects of multiple factors on performance. A cost-effective fractional factorial approach to understanding the relationship between task characteristics and task performance is presented. The method has been shown to provide sufficient independent variability to reveal and predict the effects of task characteristics on performance in two domains. The five steps outlined are: selection of performance measure, task characteristic identification, task design for user trials, data collection, regression model development and task characteristic analysis. The approach can be used for furthering knowledge of task performance, theoretical understanding, experimental control and prediction of task performance. Practitioner Summary: A cost-effective method to identify and predict the separate effects of multiple factors on performance is presented. The five steps allow a better understanding of task factors during the design process.
Determinants of job stress in chemical process industry: A factor analysis approach.
Menon, Balagopal G; Praveensal, C J; Madhu, G
2015-01-01
Job stress is an active research domain in industrial safety research. Job stress can result in accidents and health-related issues for workers in chemical process industries; hence, it is important to measure workers' job stress levels so as to mitigate them and avoid safety-related problems in these industries. The objective of this study is to determine the job stress factors in the chemical process industry in Kerala state, India. The study also proposes a comprehensive model and an instrument framework for measuring job stress levels in the chemical process industries in Kerala. Data were collected through a questionnaire survey conducted in chemical process industries in Kerala. The data from 1197 completed surveys were subjected to principal component and confirmatory factor analysis to develop the job stress factor structure. The factor analysis revealed 8 factors that influence job stress in process industries. It was also found that employees' job stress is influenced most by role ambiguity and least by the work environment. The study developed an instrument framework for measuring job stress utilizing exploratory factor analysis and structural equation modeling.
Factors Influencing Cecal Intubation Time during Retrograde Approach Single-Balloon Enteroscopy
Chen, Peng-Jen; Shih, Yu-Lueng; Huang, Hsin-Hung; Hsieh, Tsai-Yuan
2014-01-01
Background and Aim. The predisposing factors for prolonged cecal intubation time (CIT) during colonoscopy have been well identified. However, the factors influencing CIT during retrograde single-balloon enteroscopy (SBE) have not been addressed. The aim of this study was to determine the factors influencing CIT during retrograde SBE. Methods. We investigated patients who underwent retrograde SBE at a medical center from January 2011 to March 2014. The medical charts and SBE reports were reviewed, and the patients' characteristics and procedure-associated data were recorded. These data were analyzed with univariate analysis as well as multivariate logistic regression analysis to identify possible predisposing factors. Results. We enrolled 66 patients in this study. The median CIT was 17.4 minutes. Univariate analysis showed no statistically significant effect of age, sex, BMI, or history of abdominal surgery; only bowel preparation was significant (P = 0.021). Multivariate logistic regression analysis showed that inadequate bowel preparation (odds ratio 30.2, 95% confidence interval 4.63–196.54; P < 0.001) was the independent predisposing factor for prolonged CIT during retrograde SBE. Conclusions. For experienced endoscopists, inadequate bowel preparation was the independent predisposing factor for prolonged CIT during retrograde SBE. PMID:25505904
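A multivariate logistic regression of this kind can be sketched as follows with statsmodels; the predictors, coefficients, and simulated outcome are illustrative, not the study's patient data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 66
# Hypothetical candidate predictors mirroring the study's list.
age = rng.normal(60, 12, n)
bmi = rng.normal(24, 3, n)
surgery = rng.integers(0, 2, n)        # prior abdominal surgery (0/1)
poor_prep = rng.integers(0, 2, n)      # inadequate bowel preparation (0/1)
# Simulate prolonged CIT driven mainly by poor preparation.
logit_p = -2 + 2.5 * poor_prep + 0.01 * (age - 60)
prolonged = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(np.column_stack([age, bmi, surgery, poor_prep]))
fit = sm.Logit(prolonged, X).fit(disp=0)
# Exponentiated coefficients are odds ratios (intercept skipped).
print("odds ratios (age, BMI, surgery, poor prep):",
      np.round(np.exp(fit.params[1:]), 2))
```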
Atighechian, Golrokh; Maleki, Mohammadreza; Aryankhesal, Aidin; Jahangiri, Katayoun
2016-01-01
Introduction: Oil spill in fresh water can affect ecological processes and, accordingly, can influence human health. Iran, having 58.8% of the world oil reserves, is highly vulnerable to water contamination by oil products. Aim: The aim of this study was to determine the environmental factors affecting the management of an oil spill into one of the rivers in Iran using PESTLE analysis. Material and methods: This was a qualitative case study conducted in 2015 on an oil spill incident in Iran and its roots, from a disaster management approach. Semi-structured interviews were conducted for data collection. Seventy managers and staff members responsible for or involved in oil spill incident management were recruited to the study. A qualitative content analysis approach was employed for the data analysis, and document analysis was used to collect additional information. Results: The findings indicate that different factors affected the management of the oil spill into one of the central rivers and, consequently, the management of drinking water resources. Using this analysis, managers can plan for such events and develop scenarios to achieve better performance in future events. PMID:27698608
Relative Velocity as a Metric for Probability of Collision Calculations
NASA Technical Reports Server (NTRS)
Frigm, Ryan Clayton; Rohrbaugh, Dave
2008-01-01
Collision risk assessment metrics, such as the probability of collision calculation, are based largely on assumptions about the interaction of two objects during their close approach. Specifically, the approach to probabilistic risk assessment can be performed more easily if the relative trajectories of the two close approach objects are assumed to be linear during the encounter. It is shown in this analysis that one factor in determining linearity is the relative velocity of the two encountering bodies, in that the assumption of linearity breaks down at low relative approach velocities. The first part of this analysis is the determination of the relative velocity threshold below which the assumption of linearity becomes invalid. The second part is a statistical study of conjunction interactions between representative asset spacecraft and the associated debris field environment to determine the likelihood of encountering a low relative velocity close approach. This analysis is performed for both the LEO and GEO orbit regimes. Both parts comment on the resulting effects on collision risk assessment operations.
NASA Astrophysics Data System (ADS)
Le Duy, Nguyen; Heidbüchel, Ingo; Meyer, Hanno; Merz, Bruno; Apel, Heiko
2018-02-01
This study analyzes the influence of local and regional climatic factors on the stable isotopic composition of rainfall in the Vietnamese Mekong Delta (VMD) as part of the Asian monsoon region. It is based on 1.5 years of weekly rainfall samples. In the first step, the isotopic composition of the samples is analyzed by local meteoric water lines (LMWLs) and single-factor linear correlations. Additionally, the contribution of several regional and local factors is quantified by multiple linear regression (MLR) of all possible factor combinations and by relative importance analysis. This approach is novel for the interpretation of isotopic records and enables an objective quantification of the explained variance in isotopic records for individual factors. In this study, the local factors are extracted from local climate records, while the regional factors are derived from atmospheric backward trajectories of water particles. The regional factors, i.e., precipitation, temperature, relative humidity and the length of backward trajectories, are combined with equivalent local climatic parameters to explain the response variables δ18O, δ2H, and d-excess of precipitation at the station of measurement. The results indicate that (i) MLR can better explain the isotopic variation in precipitation (R2 = 0.8) compared to single-factor linear regression (R2 = 0.3); (ii) the isotopic variation in precipitation is controlled dominantly by regional moisture regimes (~70%) compared to local climatic conditions (~30%); (iii) the most important climatic parameter during the rainy season is the precipitation amount along the trajectories of air mass movement; (iv) the influence of local precipitation amount and temperature is not significant during the rainy season, unlike the regional precipitation amount effect; (v) secondary fractionation processes (e.g., sub-cloud evaporation) can be identified through the d-excess and take place mainly in the dry season, either locally for δ18O and δ2H, or along the air mass trajectories for d-excess. The analysis shows that regional and local factors vary in importance over the seasons and that the source regions and transport pathways, and particularly the climatic conditions along the pathways, have a large influence on the isotopic composition of rainfall. Although the general results have been reported qualitatively in previous studies (proving the validity of the approach), the proposed method provides quantitative estimates of the controlling factors, both for the whole data set and for distinct seasons. Therefore, it is argued that the approach constitutes an advancement in the statistical analysis of isotopic records in rainfall that can supplement or precede more complex studies utilizing atmospheric models. Due to its relative simplicity, the method can be easily transferred to other regions, or extended with other factors. The results illustrate that the interpretation of the isotopic composition of precipitation as a recorder of local climatic conditions, as for example performed for paleorecords of water isotopes, may not be adequate in the southern part of the Indochinese Peninsula, and likely neither in other regions affected by monsoon processes. However, the presented approach could open a pathway towards better and seasonally differentiated reconstruction of paleoclimates based on isotopic records.
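The "MLR over all factor combinations plus relative importance" idea can be sketched with an LMG-style decomposition of R² over all predictor subsets, one common relative-importance metric; the study's exact method may differ, and the data here are simulated.

```python
import numpy as np
from itertools import combinations
from math import comb

def r2(X, y, idx):
    # R^2 of an OLS fit using the predictor columns in idx (empty -> 0).
    if not idx:
        return 0.0
    Xs = np.column_stack([np.ones(len(y)), X[:, idx]])
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

def lmg(X, y):
    """Each predictor's R^2 gain averaged over all orderings; the shares
    sum to the full-model R^2."""
    p = X.shape[1]
    share = np.zeros(p)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        for size in range(p):
            for S in combinations(others, size):
                gain = r2(X, y, list(S) + [j]) - r2(X, y, list(S))
                share[j] += gain / (p * comb(p - 1, size))
    return share

rng = np.random.default_rng(7)
n = 80  # roughly 1.5 years of weekly samples
# Hypothetical factors: trajectory rainfall, local rainfall, temperature,
# humidity; d18O dominated by the regional amount effect in this simulation.
X = rng.normal(size=(n, 4))
d18O = -8 - 1.5 * X[:, 0] - 0.4 * X[:, 1] + rng.normal(0, 0.5, n)
shares = lmg(X, d18O)
print("R^2 shares:", np.round(shares, 3), "sum =", round(shares.sum(), 3))
```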
Do not blame the driver: a systems analysis of the causes of road freight crashes.
Newnam, Sharon; Goode, Natassia
2015-03-01
Although many have advocated a systems approach in road transportation, this view has not meaningfully penetrated road safety research, practice or policy. In this study, a systems theory-based approach, Rasmussen's (1997) risk management framework and the associated Accimap technique, is applied to the analysis of road freight transportation crashes. Twenty-seven highway crash investigation reports were downloaded from the National Transport Safety Bureau website. Thematic analysis was used to identify the complex system of contributory factors, and relationships, identified within the reports. The Accimap technique was then used to represent the linkages and dependencies within and across system levels in the road freight transportation industry and to identify common factors and interactions across multiple crashes. The results demonstrate how a systems approach can increase knowledge in this safety critical domain, while the findings can be used to guide prevention efforts and the development of system-based investigation processes for the heavy vehicle industry. A research agenda for developing an investigation technique to better support the application of the Accimap technique by practitioners in the road freight transportation industry is proposed.
Cross-Population Joint Analysis of eQTLs: Fine Mapping and Functional Annotation
Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger
2015-01-01
Mapping expression quantitative trait loci (eQTLs) has been shown to be a powerful tool to uncover the genetic underpinnings of many complex traits at the molecular level. In this paper, we present an integrative analysis approach that leverages eQTL data collected from multiple population groups. In particular, our approach effectively identifies multiple independent cis-eQTL signals that are consistent across populations, accounting for population heterogeneity in allele frequencies and linkage disequilibrium patterns. Furthermore, by integrating genomic annotations, our analysis framework enables high-resolution functional analysis of eQTLs. We applied our statistical approach to analyze the GEUVADIS data consisting of samples from five population groups. From this analysis, we concluded that: i) joint analysis across population groups greatly improves the power of eQTL discovery and the resolution of fine mapping of causal eQTLs; ii) many genes harbor multiple independent eQTLs in their cis regions; iii) genetic variants that disrupt transcription factor binding are significantly enriched in eQTLs (p-value = 4.93 × 10^-22). PMID:25906321
Zhang, T; Yang, M; Xiao, X; Feng, Z; Li, C; Zhou, Z; Ren, Q; Li, X
2014-03-01
Many infectious diseases exhibit repetitive or regular behaviour over time. Time-domain approaches, such as the seasonal autoregressive integrated moving average model, are often utilized to examine the cyclical behaviour of such diseases. The limitations of time-domain approaches include over-differencing and over-fitting; furthermore, the use of these approaches is inappropriate when the assumption of linearity may not hold. In this study, we implemented a simple and efficient procedure based on the fast Fourier transformation (FFT) approach to evaluate the epidemic dynamics of scarlet fever incidence (2004-2010) in China. This method demonstrated good internal and external validity and overcame some shortcomings of time-domain approaches. The procedure also elucidated the cycling behaviour in terms of environmental factors. We concluded that, under appropriate circumstances of data structure, spectral analysis based on the FFT approach may be applicable for the study of oscillating diseases.
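A minimal sketch of the FFT-based spectral analysis described, applied to a simulated monthly incidence series, follows; the series and its periodicities are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)
# Hypothetical 7-year monthly incidence series with an annual cycle and a
# weaker semi-annual component plus noise.
t = np.arange(84)
incidence = (5 + 2.0 * np.sin(2 * np.pi * t / 12)
               + 0.8 * np.sin(2 * np.pi * t / 6)
               + rng.normal(0, 0.4, 84))

x = incidence - incidence.mean()        # remove the zero-frequency peak
power = np.abs(np.fft.rfft(x)) ** 2     # periodogram via the real FFT
freq = np.fft.rfftfreq(len(x), d=1.0)   # cycles per month

# Report the two dominant periodicities.
for k in np.argsort(power)[::-1][:2]:
    print(f"period ~ {1 / freq[k]:.1f} months (power {power[k]:.0f})")
```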
Identifying Risk and Protective Factors in Recidivist Juvenile Offenders: A Decision Tree Approach
Ortega-Campos, Elena; García-García, Juan; Gil-Fenoy, Maria José; Zaldívar-Basurto, Flor
2016-01-01
Research on juvenile justice aims to identify profiles of risk and protective factors in juvenile offenders. This paper presents a study of profiles of risk factors that influence young offenders toward committing sanctionable antisocial behavior (S-ASB). Decision tree analysis is used as a multivariate approach to the phenomenon of repeated sanctionable antisocial behavior in juvenile offenders in Spain. The study sample was made up of the set of juveniles who were charged in a court case in the Juvenile Court of Almeria (Spain); recidivism was studied over a period of two years from baseline. The object of study is presented through the implementation of a decision tree. Two profiles of risk and protective factors are found. Risk factors associated with higher rates of recidivism are antisocial peers, age at baseline S-ASB, problems in school and criminality in family members. PMID:27611313
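A decision tree of the kind used in this study can be sketched with scikit-learn; the indicators, prevalences, and tree settings below are illustrative, not the study's data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(9)
n = 500
# Hypothetical risk indicators for juvenile offenders.
antisocial_peers = rng.integers(0, 2, n)
school_problems = rng.integers(0, 2, n)
family_crime = rng.integers(0, 2, n)
age_baseline = rng.integers(14, 18, n)
# Simulated recidivism probability increases with the risk factors.
risk = (0.15 + 0.30 * antisocial_peers
              + 0.15 * school_problems
              + 0.10 * family_crime)
recidivist = (rng.random(n) < risk).astype(int)

X = np.column_stack([antisocial_peers, school_problems,
                     family_crime, age_baseline])
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=30,
                              random_state=0).fit(X, recidivist)
# Print the learned splits, i.e., the risk "profiles".
print(export_text(tree, feature_names=["antisocial_peers",
      "school_problems", "family_crime", "age_baseline"]))
```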
ERIC Educational Resources Information Center
McQueen, Robert J.; Janson, Annick
2016-01-01
Purpose: This paper aims to examine factors which influence how tacit knowledge is built and applied by client-facing consultants. Design/methodology/approach: Qualitative methods (interviews, thematic analysis) were used to gather and analyse data from 15 consultants in an agricultural extension context. Findings: Twenty-six factors about how…
A Pedagogical Approach to the Boltzmann Factor through Experiments and Simulations
ERIC Educational Resources Information Center
Battaglia, O. R.; Bonura, A.; Sperandeo-Mineo, R. M.
2009-01-01
The Boltzmann factor is the basis of a huge amount of thermodynamic and statistical physics, both classical and quantum. It governs the behaviour of all systems in nature that are exchanging energy with their environment. To understand why the expression has this specific form involves a deep mathematical analysis, whose flow of logic is hard to…
ERIC Educational Resources Information Center
Konold, Timothy R.; Glutting, Joseph J.
2008-01-01
This study employed a correlated trait-correlated method application of confirmatory factor analysis to disentangle trait and method variance from measures of attention-deficit/hyperactivity disorder obtained at the college level. The two trait factors were "Diagnostic and Statistical Manual of Mental Disorders-Fourth Edition" ("DSM-IV")…
Proteomic profiling of early degenerative retina of RCS rats.
Zhu, Zhi-Hong; Fu, Yan; Weng, Chuan-Huang; Zhao, Cong-Jian; Yin, Zheng-Qin
2017-01-01
To identify the underlying cellular and molecular changes in retinitis pigmentosa (RP), we used label-free quantification-based proteomics analysis, which, with the advantages of being more economical and procedurally simpler, has been used with increasing frequency in modern biological research. Dystrophic RCS rats, the first laboratory animal model for the study of RP, show a pathological course similar to that of humans with the disease. We therefore employed a comparative proteomics approach for in-depth proteome profiling of retinas from dystrophic RCS rats and non-dystrophic congenic controls through Linear Trap Quadrupole-Orbitrap MS/MS to identify significant differentially expressed proteins (DEPs). Bioinformatics analyses, including Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway annotation and upstream regulatory analysis, were then performed on these retinal proteins. Finally, a Western blotting experiment was carried out to verify the difference in the abundance of the transcription factor E2F1. In total, we identified 2375 protein groups from the retinal protein samples of RCS rats and non-dystrophic congenic controls, of which 434 significantly DEPs were selected by Student's t-test. Based on the results of the bioinformatics analysis, we identified mitochondrial dysfunction and the transcription factor E2F1 as key initiation factors in the early retinal degenerative process, showing that both substantially contribute to the disease etiology of RP. The results provide a new potential therapeutic approach for this retinal degenerative disease.
Scherer, Ronny; Nilsen, Trude; Jansen, Malte
2016-01-01
Students' perceptions of instructional quality are among the most important criteria for evaluating teaching effectiveness. The present study evaluates different latent variable modeling approaches (confirmatory factor analysis, exploratory structural equation modeling, and bifactor modeling), which are used to describe these individual perceptions with respect to their factor structure, measurement invariance, and the relations to selected educational outcomes (achievement, self-concept, and motivation in mathematics). On the basis of the Programme for International Student Assessment (PISA) 2012 large-scale data sets of Australia, Canada, and the USA (N = 26,746 students), we find support for the distinction between three factors of individual students' perceptions and full measurement invariance across countries for all modeling approaches. In this regard, bifactor exploratory structural equation modeling outperformed alternative approaches with respect to model fit. Our findings reveal significant relations to the educational outcomes. This study synthesizes different modeling approaches of individual students' perceptions of instructional quality and provides insights into the nature of these perceptions from an individual differences perspective. Implications for the measurement and modeling of individually perceived instructional quality are discussed. PMID:26903917
NASA Astrophysics Data System (ADS)
Zeng, Yajun; Skibniewski, Miroslaw J.
2013-08-01
Enterprise resource planning (ERP) system implementations are often characterised by large capital outlay, long implementation duration, and high risk of failure. In order to avoid ERP implementation failure and realise the benefits of the system, sound risk management is key. This paper proposes a probabilistic risk assessment approach for ERP system implementation projects based on fault tree analysis, which models the relationship between ERP system components and specific risk factors. Unlike traditional risk management approaches, which have mostly focused on meeting project budget and schedule objectives, the proposed approach addresses the risks that may cause ERP system usage failure. The approach can be used to identify the root causes of ERP system implementation usage failure and to quantify the impact of critical component failures or critical risk events in the implementation process.
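To illustrate the arithmetic behind fault tree quantification: for independent basic events, an AND gate multiplies probabilities and an OR gate combines them as 1 - prod(1 - p). The event names and probabilities in this sketch are hypothetical, not taken from the paper.

```python
from functools import reduce

def p_and(*ps):
    # AND gate: all basic events must occur (independence assumed).
    return reduce(lambda a, b: a * b, ps, 1.0)

def p_or(*ps):
    # OR gate: at least one basic event occurs (independence assumed).
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), ps, 1.0)

# Hypothetical basic-event probabilities for an ERP implementation.
p_bad_data = 0.05      # flawed master data migration
p_no_training = 0.10   # insufficient end-user training
p_misconfig = 0.03     # module misconfiguration
p_vendor_gap = 0.02    # unresolved functionality gap

# Top event: usage failure if the data layer fails, OR users cannot
# operate the system (training gap AND misconfiguration), OR a vendor gap.
p_usage_failure = p_or(p_bad_data,
                       p_and(p_no_training, p_misconfig),
                       p_vendor_gap)
print(f"P(ERP usage failure) = {p_usage_failure:.4f}")
```

Replacing a gate's inputs lets one quantify the impact of individual component failures on the top event, which is the sensitivity use the abstract describes.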
NASA Astrophysics Data System (ADS)
Long, D.; Scanlon, B. R.; Longuevergne, L.; Chen, X.
2015-12-01
Increasing interest in use of GRACE satellites and a variety of new products to monitor changes in total water storage (TWS) underscores the need to assess the reliability of output from different products. The objective of this study was to assess skills and uncertainties of different approaches for processing GRACE data to restore signal losses caused by spatial filtering based on analysis of 1°×1° grid scale data and basin scale data in 60 river basins globally. Results indicate that scaling factors from six land surface models (LSMs), including four models from GLDAS-1 (Noah 2.7, Mosaic, VIC, and CLM 2.0), CLM 4.0, and WGHM, are similar over most humid, sub-humid, and high-latitude regions but can differ by up to 100% over arid and semi-arid basins and areas with intensive irrigation. Large differences in TWS anomalies from three processing approaches (scaling factor, additive, and multiplicative corrections) were found in arid and semi-arid regions, areas with intensive irrigation, and relatively small basins (e.g., ≤ 200,000 km2). Furthermore, TWS anomaly products from gridded data with CLM4.0 scaling factors and the additive correction approach more closely agree with WGHM output than the multiplicative correction approach. Estimation of groundwater storage changes using GRACE satellites requires caution in selecting an appropriate approach for restoring TWS changes. A priori ground-based data used in forward modeling can provide a powerful tool for explaining the distribution of signal gains or losses caused by low-pass filtering in specific regions of interest and should be very useful for more reliable estimation of groundwater storage changes using GRACE satellites.
Objective determination of image end-members in spectral mixture analysis of AVIRIS data
NASA Technical Reports Server (NTRS)
Tompkins, Stefanie; Mustard, John F.; Pieters, Carle M.; Forsyth, Donald W.
1993-01-01
Spectral mixture analysis has been shown to be a powerful, multifaceted tool for analysis of multi- and hyper-spectral data. Applications to AVIRIS data have ranged from mapping soils and bedrock to ecosystem studies. During the first phase of the approach, a set of end-members is selected from an image cube (image end-members) that best accounts for its spectral variance within a constrained, linear least squares mixing model. These image end-members are usually selected using a priori knowledge and successive trial-and-error solutions to refine the total number and physical location of the end-members. However, in many situations a more objective method of determining these essential components is desired. We approach the problem of image end-member determination objectively by using the inherent variance of the data. Unlike purely statistical methods such as factor analysis, this approach derives solutions that conform to a physically realistic model.
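The constrained linear mixing model at the heart of this approach can be sketched as non-negative least-squares unmixing of a single pixel; the end-member spectra and fractions below are synthetic, and the simple renormalization stands in for a full sum-to-one constraint.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(10)
bands = 50
# Three hypothetical end-member spectra (rows), e.g. soil, vegetation,
# shade: smooth, increasing reflectance curves scaled to [0, 1].
E = np.abs(rng.normal(0.3, 0.1, (3, bands))).cumsum(axis=1)
E /= E.max(axis=1, keepdims=True)

true_frac = np.array([0.6, 0.3, 0.1])
pixel = true_frac @ E + rng.normal(0, 0.005, bands)   # observed spectrum

# Constrained unmixing: non-negative abundances via NNLS, then
# renormalized to sum to one (a simple two-step approximation).
frac, resid = nnls(E.T, pixel)
frac /= frac.sum()
print("estimated fractions:", np.round(frac, 3))
```

In the full image-based workflow, the residual over all pixels is what drives the search for the number and identity of the image end-members.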
Taube-Schiff, M; El Morr, C; Counsell, A; Mehak, Adrienne; Gollan, J
2018-05-01
WHAT IS KNOWN ON THE SUBJECT?: The psychometrics of the CUB measure have been tested within an inpatient psychiatric setting. Results show that the CUB has two factors that reflect patients' approach and avoidance of dimensions of the treatment milieu, and that an increase in approach and a decrease in avoidance are correlated with discharge. No empirical research has examined the validity of the CUB in a day hospital programme. WHAT THIS ARTICLE ADDS TO EXISTING KNOWLEDGE?: This study was the first to address the validity of this questionnaire within a psychiatric day hospital setting. This now allows other mental health service providers to use this questionnaire following administration of patient engagement interventions (such as behavioural activation), which are routinely used within this type of setting. WHAT ARE THE IMPLICATIONS FOR PRACTICE?: Our results can enable healthcare providers to employ an effective and psychometrically validated tool in a day hospital setting to measure treatment outcomes and provide reflections of patients' approach and avoidance behaviours. Introduction: We evaluated the Checklist of Unit Behaviours (CUB) questionnaire in a novel mental health setting: a day hospital within a large acute care general hospital. No empirical evidence exists as yet on the validity of this measure in this type of treatment setting. The CUB measures two factors, avoidance and approach, of the patients' engagement with the treatment milieu within the previous 24 hr. Aim: A confirmatory factor analysis (CFA) was conducted to validate the CUB's original two-factor structure in an outpatient day programme. Methods: Psychiatric outpatients (n = 163) completed the CUB daily while participating in a day hospital programme in Toronto, Canada. Results: A CFA was used to confirm the CUB factors but resulted in a poorly fitting model for our sample, χ2(103) = 278.59, p < .001, CFI = 0.80, RMSEA = 0.10, SRMR = 0.10. Questions 5, 8 and 10 had higher loadings on a third factor revealed through exploratory factor analysis. We believe this factor, "Group Engagement," reflects group-related issues. Discussion: The CUB was a practical and useful tool in our psychiatric day hospital setting at a large acute care general hospital. Implications for practice: Our analysis identified group engagement as a critical variable in day programmes, as patients have autonomy regarding staying in or leaving the programme.
Improved FTA Methodology and Application to Subsea Pipeline Reliability Design
Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan
2014-01-01
An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form. PMID:24667681
NASA Astrophysics Data System (ADS)
Phuong, Vu Hung
2018-03-01
This research applies a Data Envelopment Analysis (DEA) approach to analyze Total Factor Productivity (TFP) and efficiency changes in the Vietnamese coal mining industry from 2007 to 2013. The TFP of Vietnamese coal mining companies decreased due to slow technological progress and stagnant efficiency. The decline of technical efficiency in many enterprises indicates that the coal mining industry has large potential to increase productivity through technical efficiency improvement. Enhancing human resource training and investment in technology and research & development could help the industry improve efficiency and productivity.
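For readers unfamiliar with DEA, the efficiency score of each decision-making unit (here, a coal mining company) is obtained from a small linear program. The sketch below implements the standard input-oriented CCR model with scipy; the data matrices and the mining interpretation are invented for illustration, not taken from the study.

```python
# Minimal input-oriented CCR DEA sketch using scipy.optimize.linprog.
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Returns theta per DMU."""
    n, m = X.shape
    _, s = Y.shape
    scores = np.empty(n)
    for o in range(n):
        # Decision variables: [theta, lambda_1 .. lambda_n]; minimize theta.
        c = np.zeros(1 + n)
        c[0] = 1.0
        # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([b_in, b_out]),
                      bounds=[(0, None)] * (1 + n))
        scores[o] = res.x[0]
    return scores

# Toy example: 4 mines, inputs = (labour, capital), output = coal tonnage.
X = np.array([[100., 20.], [120., 30.], [80., 25.], [150., 40.]])
Y = np.array([[500.], [520.], [450.], [600.]])
print(dea_ccr_input(X, Y))  # efficiency scores in (0, 1]
```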
Evaluation Criteria for Micro-CAI: A Psychometric Approach
Wallace, Douglas; Slichter, Mark; Bolwell, Christine
1985-01-01
The increased use of microcomputer-based instructional programs has resulted in a greater need for third-party evaluation of the software. This in turn has prompted the development of micro-CAI evaluation tools. The present project sought to develop a prototype instrument to assess the impact of CAI program presentation characteristics on students. Data analysis and scale construction were conducted using standard item reliability analyses and factor analytic techniques. Adequate subscale reliabilities and factor structures were found, suggesting that a psychometric approach to CAI evaluation may possess some merit. Efforts to assess the utility of the resultant instrument are currently underway.
Nfonsam, Landry E.; Cano, Carlos; Mudge, Joann; Schilkey, Faye D.; Curtiss, Jennifer
2012-01-01
Tissue-specific transcription factors are thought to cooperate with signaling pathways to promote patterned tissue specification, in part by co-regulating transcription. The Drosophila melanogaster Pax6 homolog Eyeless forms a complex, incompletely understood regulatory network with the Hedgehog, Decapentaplegic and Notch signaling pathways to control eye-specific gene expression. We report a combinatorial approach, including mRNAseq and microarray analyses, to identify targets co-regulated by Eyeless and Hedgehog, Decapentaplegic or Notch. Multiple analyses suggest that the transcriptomes resulting from co-misexpression of Eyeless+signaling factors provide a more complete picture of eye development compared to previous efforts involving Eyeless alone: (1) Principal components analysis and two-way hierarchical clustering revealed that the Eyeless+signaling factor transcriptomes are closer to the eye control transcriptome than when Eyeless is misexpressed alone; (2) more genes are upregulated at least three-fold in response to Eyeless+signaling factors compared to Eyeless alone; (3) based on gene ontology analysis, the genes upregulated in response to Eyeless+signaling factors had a greater diversity of functions compared to Eyeless alone. Through a secondary screen that utilized RNA interference, we show that the predicted gene CG4721 has a role in eye development. CG4721 encodes a neprilysin family metalloprotease that is highly up-regulated in response to Eyeless+Notch, confirming the validity of our approach. Given the similarity between D. melanogaster and vertebrate eye development, the large number of novel genes identified as potential targets of Eyeless+signaling factors will provide novel insights into our understanding of eye development in D. melanogaster and humans. PMID:22952997
Random vibration analysis of space flight hardware using NASTRAN
NASA Technical Reports Server (NTRS)
Thampi, S. K.; Vidyasagar, S. N.
1990-01-01
During liftoff and ascent flight phases, the Space Transportation System (STS) and payloads are exposed to the random acoustic environment produced by engine exhaust plumes and aerodynamic disturbances. The analysis of payloads for randomly fluctuating loads is usually carried out using the Miles' relationship. This approximation technique computes an equivalent load factor as a function of the natural frequency of the structure, the power spectral density of the excitation, and the magnification factor at resonance. Due to the assumptions inherent in Miles' equation, random load factors are often over-estimated by this approach. In such cases, the estimates can be refined using alternate techniques such as time domain simulations or frequency domain spectral analysis. Described here is the use of NASTRAN to compute more realistic random load factors through spectral analysis. The procedure is illustrated using Spacelab Life Sciences (SLS-1) payloads and certain unique features of this problem are described. The solutions are compared with Miles' results in order to establish trends of over- or under-prediction.
NASA Astrophysics Data System (ADS)
Oktaviana, P. P.; Fithriasari, K.
2018-04-01
Most Indonesian citizens consume vannamei shrimp as food, and vannamei shrimp is also one of Indonesia's mainstay export commodities. Vannamei shrimp in ponds and markets can be contaminated by Salmonella sp. bacteria, which endanger human health. Salmonella sp. contamination of vannamei shrimp may be affected by many factors, and this study was intended to identify the factors that presumably influence it. The researchers used the test result for Salmonella sp. contamination of vannamei shrimp as the response variable, with two categories: 0 = the test indicates no Salmonella sp. on the shrimp; 1 = the test indicates Salmonella sp. on the shrimp. Four factors presumed to influence Salmonella sp. contamination were used as predictor variables: the test result for Salmonella sp. contamination of the farmer's hand swab, the subdistrict of the vannamei shrimp ponds, the fish processing unit supplied, and the pond area in hectares. A binary logit model approach was used, in keeping with the two-category response variable. The analysis indicates that the predictor variables that significantly affect Salmonella sp. contamination of vannamei shrimp are the hand-swab test result and the subdistrict of the ponds.
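The binary logit analysis described here can be reproduced in outline with statsmodels. The sketch below uses hypothetical column names for the four predictors and simulated data; it is not the study's dataset or code.

```python
# Minimal binary logit sketch with statsmodels; data and names are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "salmonella": rng.integers(0, 2, n),        # 0/1 test result on shrimp
    "hand_swab": rng.integers(0, 2, n),         # 0/1 test result on hand swab
    "subdistrict": rng.choice(["A", "B", "C"], n),
    "processing_unit": rng.choice(["U1", "U2"], n),
    "pond_area_ha": rng.uniform(0.5, 5.0, n),
})

# Categorical predictors are expanded to dummies by the formula interface.
model = smf.logit(
    "salmonella ~ hand_swab + C(subdistrict) + C(processing_unit) + pond_area_ha",
    data=df,
).fit()
print(model.summary())       # Wald z-tests flag the significant factors
print(np.exp(model.params))  # odds ratios
```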
Kilner, T
2004-01-01
Methods: Data generated by a Delphi study investigating the desirable attributes of ambulance technicians, paramedics, and clinical supervisors were subjected to factor analysis to explore inter-relations between the variables, or desirable attributes. Variables that loaded onto any factor at a correlation level of >0.3 were included in the analysis. Results: Three factors emerged in each of the occupational groups. For the ambulance technician these factors may be described as: core professional skills, individual and collaborative approaches to health and safety, and the management of self and clinical situations. For the paramedic the themes are: core professional skills, management of self and clinical situations, and approaches to health and safety. For the clinical supervisor there is again a theme described as core professional skills, with a further two themes described as role model and lifelong learning. Conclusions: The profile of desirable attributes emerging from this study is remarkably similar to the generic benchmark statements for health care programmes outlined by the Quality Assurance Agency for Higher Education. A case is emerging for a revision of the curriculum currently used for the education and training of ambulance staff, one better suited to a consumer-led health service and reflecting the broader professional base seen in programmes associated with other healthcare professions. This study has suggested outline content and module structure for the education of the technician, paramedic, and clinical supervisor, based on empirical evidence. PMID:15107389
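The loading-threshold step used above (retaining variables with loadings > 0.3 on any factor) is straightforward to express in code. A minimal sketch follows, using scikit-learn's FactorAnalysis on simulated data; the paper does not specify its software, so this is only one possible implementation.

```python
# Fit a three-factor model and keep items loading above 0.3 on any factor.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 12))       # 150 respondents x 12 attribute items

fa = FactorAnalysis(n_components=3, random_state=0).fit(X)
loadings = fa.components_.T          # shape (n_items, n_factors)

for item in range(loadings.shape[0]):
    kept = [f for f in range(3) if abs(loadings[item, f]) > 0.3]
    if kept:
        print(f"item {item}: loads on factor(s) {kept}")
```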
Restructuring the rotor analysis program C-60
NASA Technical Reports Server (NTRS)
1985-01-01
The continuing evolution of the rotary wing industry demands increasing analytical capabilities. To keep up with this demand, software must be structured to accommodate change. The approach discussed for meeting this demand is to restructure an existing analysis. The motivational factors, basic principles, application techniques, and practical lessons from experience with this restructuring effort are reviewed.
ERIC Educational Resources Information Center
Tchumtchoua, Sylvie; Dey, Dipak K.
2012-01-01
This paper proposes a semiparametric Bayesian framework for the analysis of associations among multivariate longitudinal categorical variables in high-dimensional data settings. This type of data is frequent, especially in the social and behavioral sciences. A semiparametric hierarchical factor analysis model is developed in which the…
This research paper uses case analysis methods to understand why participants engage in this innovative approach to public participation in scientific research, and what they hope it will mean for their community. The research questions that guide this analysis are: 1) what factor...
ERIC Educational Resources Information Center
Crossley, Michael
2010-01-01
The article argues that greater attention should be paid to contextual factors in educational research and international development cooperation. The analysis draws upon principles that underpin socio-cultural approaches to comparative education, a critical analysis of the political economy of contemporary educational research, and recent research…
Social Judgment Analysis: Methodology for Improving Interpersonal Communication and Understanding.
ERIC Educational Resources Information Center
Rohrbaugh, John; Harmon, Joel
Research has found the Social Judgment Analysis (SJA) approach, with its focus on judgment policy and cognitive feedback, to be a significant factor in developing group member agreement and improving member performance. A controlled experiment was designed to assess the relative quality of the judgment making process provided by SJA.…
Multifractal Properties of Process Control Variables
NASA Astrophysics Data System (ADS)
Domański, Paweł D.
2017-06-01
A control system is an inevitable element of any industrial installation, and its quality significantly affects overall process performance. Assessing whether a control system needs improvement requires relevant and constructive measures. Various methods exist, such as time-domain measures, Minimum Variance, Gaussian and non-Gaussian statistical factors, and fractal and entropy indexes. The majority of approaches use time series of control variables and are able to cover many phenomena. But process complexities and human interventions cause effects that are hardly visible to standard measures. It is shown that the signals originating from industrial installations have multifractal properties, and such an analysis may extend the standard approach to further observations. The work is based on industrial and simulation data. The analysis delivers additional insight into the properties of the control system and the process, and helps to discover internal dependencies and human factors that are otherwise hardly detectable.
David, Helena Maria Scherlowski Leal; Caufield, Catherine
2005-01-01
This exploratory study aimed to investigate factors related to the use of illicit and licit drugs and workplace violence in a group of women from popular classes in the city of Rio de Janeiro. A descriptive and analytic quantitative approach was used, as well as a qualitative approach through in-depth interviews with women who had suffered or were suffering workplace violence, using the collective subject discourse analysis methodology. The results showed sociodemographic and work situations that can be considered possible risk factors for drug consumption and workplace violence. The qualitative analysis shows how this group perceives the phenomena of drug use and workplace violence, expanding comprehension of these issues and providing conceptual and methodological elements for additional studies on this subject.
Wind Tunnel Strain-Gage Balance Calibration Data Analysis Using a Weighted Least Squares Approach
NASA Technical Reports Server (NTRS)
Ulbrich, N.; Volden, T.
2017-01-01
A new approach is presented that uses a weighted least squares fit to analyze wind tunnel strain-gage balance calibration data. The weighted least squares fit is specifically designed to increase the influence of single-component loadings during the regression analysis. The weighted least squares fit also reduces the impact of calibration load schedule asymmetries on the predicted primary sensitivities of the balance gages. A weighting factor between zero and one is assigned to each calibration data point that depends on a simple count of its intentionally loaded load components or gages. The greater the number of a data point's intentionally loaded load components or gages is, the smaller its weighting factor becomes. The proposed approach is applicable to both the Iterative and Non-Iterative Methods that are used for the analysis of strain-gage balance calibration data in the aerospace testing community. The Iterative Method uses a reasonable estimate of the tare corrected load set as input for the determination of the weighting factors. The Non-Iterative Method, on the other hand, uses gage output differences relative to the natural zeros as input for the determination of the weighting factors. Machine calibration data of a six-component force balance is used to illustrate benefits of the proposed weighted least squares fit. In addition, a detailed derivation of the PRESS residuals associated with a weighted least squares fit is given in the appendices of the paper as this information could not be found in the literature. These PRESS residuals may be needed to evaluate the predictive capabilities of the final regression models that result from a weighted least squares fit of the balance calibration data.
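The core of the method, down-weighting data points by the number of intentionally loaded components, reduces to an ordinary weighted least squares fit. The sketch below assumes the illustrative rule w = 1/k; the paper states only that the weight decreases as that count grows, and the simulated calibration data are stand-ins.

```python
# Weighted least squares calibration sketch; w = 1/k is an assumed rule.
import numpy as np

rng = np.random.default_rng(2)
n = 60
loads = rng.uniform(-1, 1, size=(n, 2))            # two load components
X = np.column_stack([np.ones(n), loads])           # intercept + linear terms
gage_out = X @ np.array([0.1, 5.0, -2.0]) + rng.normal(0, 0.05, n)

k = (np.abs(loads) > 1e-9).sum(axis=1)             # count of loaded components
w = 1.0 / np.maximum(k, 1)                         # single-component points dominate

# Solve the weighted normal equations via a sqrt(w)-scaled least squares fit.
sw = np.sqrt(w)
coef, *_ = np.linalg.lstsq(X * sw[:, None], gage_out * sw, rcond=None)
print(coef)  # weighted estimates of intercept and primary sensitivities
```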
Examining evolving performance on the Force Concept Inventory using factor analysis
NASA Astrophysics Data System (ADS)
Semak, M. R.; Dietz, R. D.; Pearson, R. H.; Willis, C. W.
2017-06-01
The application of factor analysis to the Force Concept Inventory (FCI) has proven to be problematic. Some studies have suggested that factor analysis of test results serves as a helpful tool in assessing the recognition of Newtonian concepts by students. Other work has produced at best ambiguous results. For the FCI administered as a pre- and post-test, we see factor analysis as a tool by which the changes in conceptual associations made by our students may be gauged given the evolution of their response patterns. This analysis allows us to identify and track conceptual linkages, affording us insight into how our students have matured due to instruction. We report on our analysis of 427 pre- and post-tests. The factor models for the pre- and post-tests are explored and compared, along with the methodology by which these models were fit to the data. The post-test factor pattern is more aligned with an expert's interpretation of the questions' content, as it allows for a more readily identifiable relationship between factors and physical concepts. We discuss this evolution in the context of approaching the characteristics of an expert with force concepts. Also, we find that certain test items do not significantly contribute to the pre- or post-test factor models, and we offer possible explanations as to why this is so; such questions may not be effective in probing the conceptual understanding of our students.
Muller, Markus K; Wrann, Simon; Widmer, Jeannette; Klasen, Jennifer; Weber, Markus; Hahnloser, Dieter
2016-09-01
The surgical treatment for perforated peptic ulcers can be safely performed laparoscopically. The aim of the study was to define simple predictive factors for conversion and septic complications. This retrospective case-control study analyzed patients treated with either laparoscopic surgery or laparotomy for perforated peptic ulcers. A total of 71 patients were analyzed. Laparoscopically operated patients had a shorter hospital stay (13.7 vs. 15.1 days). In an intention-to-treat analysis, patients with conversion to open surgery (analyzed as a subgroup of the laparoscopic approach group) showed no prolonged hospital stay (15.3 days) compared to patients with a primary open approach. Complication and mortality rates were not different between the groups. The statistical analysis identified four intraoperative risk factors for conversion: Mannheim peritonitis index (MPI) > 21 (p = 0.02), generalized peritonitis (p = 0.04), adhesions, and perforations located in a region other than the duodenal anterior wall. We found seven predictive factors for septic complications: age > 70 (p = 0.02), cardiopulmonary disease (p = 0.04), ASA > 3 (p = 0.002), CRP > 100 (p = 0.005), duration of symptoms > 24 h (p = 0.02), MPI > 21 (p = 0.008), and generalized peritonitis (p = 0.02). Our data suggest that a primary laparoscopic approach has no disadvantages. Factors necessitating conversion emerged during the procedure, precluding preoperative selection; factors suggesting imminent septic complications, by contrast, can be assessed preoperatively. An assessment of the proposed parameters may help optimize the management of possible septic complications.
Ye, Hanhui; Yuan, Jinjin; Wang, Zhengwu; Huang, Aiqiong; Liu, Xiaolong; Han, Xiao; Chen, Yahong
2016-01-01
Human immunodeficiency virus causes a severe disease in humans, acquired immune deficiency syndrome (AIDS). Studies on the interaction between host genetic factors and the virus have revealed dozens of genes that impact diverse processes in the AIDS disease. To identify more genetic factors related to AIDS, a canonical correlation analysis was used to determine the correlation between AIDS restriction and metabolic pathway gene expression. The results show that HIV-1 post-entry cellular viral cofactors from AIDS restriction genes are coexpressed in human transcriptome microarray datasets. Further, the purine metabolism pathway comprises novel host factors that are coexpressed with AIDS restriction genes. Canonical correlation analysis of expression is a reliable approach to exploring the mechanism underlying AIDS.
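A canonical correlation analysis between two expression blocks, as used here, can be sketched with scikit-learn. The data below are simulated and the block labels are illustrative; this is not the study's pipeline.

```python
# Canonical correlation between two simulated gene-expression blocks.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(3)
n = 100
X = rng.normal(size=(n, 8))   # e.g., AIDS restriction gene expression
Y = rng.normal(size=(n, 6))   # e.g., purine metabolism pathway expression

cca = CCA(n_components=2).fit(X, Y)
Xc, Yc = cca.transform(X, Y)
for i in range(2):
    r = np.corrcoef(Xc[:, i], Yc[:, i])[0, 1]
    print(f"canonical correlation {i + 1}: {r:.3f}")
```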
Freeman, Hani D.; Brosnan, Sarah F.; Hopper, Lydia M.; Lambeth, Susan P.; Schapiro, Steven J.; Gosling, Samuel D.
2013-01-01
One effective method for measuring personality in primates is to use personality trait ratings to distill the experience of people familiar with the individual animals. Previous rating instruments were created using either top-down or bottom-up approaches. Top-down approaches, which essentially adapt instruments originally designed for use with another species, can unfortunately lead to the inclusion of traits irrelevant to chimpanzees or fail to include all relevant aspects of chimpanzee personality. Conversely, because bottom-up approaches derive traits specifically for chimpanzees, their unique items may impede comparisons with findings in other studies and other species. To address the limitations of each approach, we developed a new personality rating scale using a combined top-down/bottom-up design. Seventeen raters rated 99 chimpanzees on the new 41-item scale, with all but one item being rated reliably. Principal components analysis, using both varimax and direct oblimin rotations, identified six broad factors. Strong evidence was found for five of the factors (Reactivity/Undependability, Dominance, Openness, Extraversion, and Agreeableness). A sixth factor (Methodical) was offered provisionally until more data are collected. We validated the factors against behavioral data collected independently on the chimpanzees. The five factors demonstrated good evidence for convergent and predictive validity, thereby underscoring the robustness of the factors. Our combined top-down/bottom-up approach provides the most extensive data to date to support the universal existence of these five personality factors in chimpanzees. This framework, which facilitates cross-species comparisons, can also play a vital role in understanding the evolution of personality and can assist with husbandry and welfare efforts. PMID:23733359
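The varimax rotation mentioned above can be implemented with the standard SVD-based algorithm. The sketch below is not the authors' code; it simply shows how an unrotated loading matrix is rotated toward simple structure, with randomly generated loadings standing in for real ones.

```python
# Standard SVD-based varimax rotation of a loading matrix.
import numpy as np

def varimax(Phi, gamma=1.0, max_iter=100, tol=1e-6):
    p, k = Phi.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        Lam = Phi @ R
        u, s, vt = np.linalg.svd(
            Phi.T @ (Lam ** 3 - (gamma / p) * Lam @ np.diag((Lam ** 2).sum(axis=0)))
        )
        R = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):   # stop when the criterion plateaus
            break
        d = d_new
    return Phi @ R

rng = np.random.default_rng(4)
unrotated = rng.normal(size=(41, 6))   # 41 items x 6 components, illustrative
rotated = varimax(unrotated)
print(np.round(rotated[:5], 2))        # loadings concentrate on fewer factors
```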
Otoplasty: A graduated approach.
Foda, H M
1999-01-01
Numerous otoplastic techniques have been described for the correction of protruding ears. Technique selection in otoplasty should be done only after careful analysis of the abnormal anatomy responsible for the protruding ear deformity. A graduated surgical approach is presented that is designed to address all factors contributing to the presenting auricular deformity. The approach starts with the more conservative cartilage-sparing suturing techniques, then proceeds to incorporate other, more aggressive cartilage-weakening maneuvers. Applying this approach resulted in better long-term results, with less postoperative lateralization than encountered when using the cartilage-sparing techniques alone.
Transient risk factors for acute traumatic hand injuries: a case‐crossover study in Hong Kong
Chow, C Y; Lee, H; Lau, J; Yu, I T S
2007-01-01
Objectives To identify the remediable transient risk factors of occupational hand injuries in Hong Kong in order to guide the development of prevention strategies. Methods The case‐crossover study design was adopted. Study subjects were workers with acute hand injuries presenting to the government Occupational Medicine Unit for compensation claims within 90 days from the date of injury. Detailed information on exposures to specific transient factors during the 60 minutes prior to the occurrence of the injury, during the same time interval on the day prior to the injury, as well as the usual exposure during the past work‐month was obtained through telephone interviews. Both matched‐pair interval approach and usual frequency approach were adopted to assess the associations between transient exposures in the workplace and the short‐term risk of sustaining a hand injury. Results A total of 196 injured workers were interviewed. The results of the matched‐pair interval analysis matched well with the results obtained using the usual frequency analysis. Seven significant transient risk factors were identified: using malfunctioning equipment/materials, using a different work method, performing an unusual work task, working overtime, feeling ill, being distracted and rushing, with odds ratios ranging from 10.5 to 26.0 in the matched‐pair interval analysis and relative risks ranging between 8.0 and 28.3 with the usual frequency analysis. Wearing gloves was found to have an insignificant protective effect on the occurrence of hand injury in both analyses. Conclusions Using the case‐crossover study design for acute occupational hand injuries, seven transient risk factors that were mostly modifiable were identified. It is suggested that workers and their employers should increase their awareness of these risk factors, and efforts should be made to avoid exposures to these factors by means of engineering and administrative controls supplemented by safety education and training. PMID:16973734
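The matched-pair interval analysis used here reduces to the discordant-pair odds ratio: exposure present in the hazard interval but not the control interval, versus the reverse. A minimal sketch with invented counts follows; the study's actual counts are not reproduced.

```python
# Matched-pair (case-crossover) odds ratio from discordant pairs.
import math

n10 = 42   # exposed in the 60 min before injury, not in the control interval
n01 = 4    # exposed in the control interval only

or_hat = n10 / n01
# Wald 95% CI on the log odds ratio for paired binary data
se = math.sqrt(1 / n10 + 1 / n01)
lo = math.exp(math.log(or_hat) - 1.96 * se)
hi = math.exp(math.log(or_hat) + 1.96 * se)
print(f"OR = {or_hat:.1f}, 95% CI ({lo:.1f}, {hi:.1f})")
```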
Power flow analysis of two coupled plates with arbitrary characteristics
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1990-01-01
In the last progress report (Feb. 1988), some results were presented for a parametric analysis of the vibrational power flow between two coupled plate structures using the mobility power flow approach. The results reported then were for changes in the structural parameters of the two plates, but with the two plates identical in their structural characteristics. Herein, this limitation is removed. The vibrational power input and output are evaluated for different values of the structural damping loss factor for the source and receiver plates. In performing this parametric analysis, the source plate characteristics are kept constant. The purpose of this parametric analysis is to determine the most critical parameters that influence the flow of vibrational power from the source plate to the receiver plate. In the case of the structural damping parametric analysis, the influence of changes in the source plate damping is also investigated. The results obtained from the mobility power flow approach are compared to results obtained using a statistical energy analysis (SEA) approach. The significance of the power flow results is discussed, together with a comparison between the SEA results and the mobility power flow results. Furthermore, the benefits derived from using the mobility power flow approach are examined.
ERIC Educational Resources Information Center
Veas, Alejandro; Gilar, Raquel; Miñano, Pablo; Castejón, Juan Luis
2017-01-01
The present study, based on the construct comparability approach, performs a comparative analysis of general points average for seven courses, using exploratory factor analysis (EFA) and the Partial Credit model (PCM) with a sample of 1398 student subjects (M = 12.5, SD = 0.67) from 8 schools in the province of Alicante (Spain). EFA confirmed a…
Image analysis by integration of disparate information
NASA Technical Reports Server (NTRS)
Lemoigne, Jacqueline
1993-01-01
Image analysis often starts with some preliminary segmentation which provides a representation of the scene needed for further interpretation. Segmentation can be performed in several ways, which are categorized as pixel-based, edge-based, and region-based. Each of these approaches is affected differently by various factors, and the final result may be improved by integrating several or all of these methods, thus taking advantage of their complementary nature. In this paper, we propose an approach that integrates pixel-based and edge-based results by utilizing an iterative relaxation technique. This approach has been implemented on a massively parallel computer and tested on some remotely sensed imagery from the Landsat-Thematic Mapper (TM) sensor.
Iacono, Teresa; Tracy, Jane; Keating, Jenny; Brown, Ted
2009-01-01
The Interaction with Disabled Persons scale (IDP) has been used in research into baseline attitudes and to evaluate whether a shift in attitudes towards people with developmental disabilities has occurred following some form of intervention. This research has been conducted on the assumption that the IDP measures attitudes as a multidimensional construct and has good internal consistency. Such assumptions about the IDP appear flawed, particularly in light of failures to replicate its underlying factor structure. The aim of this study was to evaluate the construct validity and dimensionality of the IDP. This study used a prospective survey approach. Participants were recruited from first and second year undergraduate university students enrolled in health sciences, occupational therapy, physiotherapy, community and emergency health, nursing, and combined degrees of nursing and midwifery, and health sciences and social work at a large Australian university (n=373). Students completed the IDP, a 20-item self-report scale of attitudes towards people with disabilities. The IDP data were analysed using a combination of factor analysis (Classical Test Theory approach) and Rasch analysis (Item Response Theory approach). The results indicated that the original IDP 6-factor solution was not supported. Instead, one factor consisting of five IDP items (9, 11, 12, 17, and 18), labelled Discomfort, met the four criteria for empirical validation of test quality: interval-level scaling (scalability), unidimensionality, lack of differential item functioning (DIF) across the two participant groups and data collection occasions, and hierarchical ordering. Researchers should consider using the Discomfort subscale of the IDP in future attitude research since it exhibits sound measurement properties.
Model wall and recovery temperature effects on experimental heat transfer data analysis
NASA Technical Reports Server (NTRS)
Throckmorton, D. A.; Stone, D. R.
1974-01-01
Basic analytical procedures are used to illustrate, both qualitatively and quantitatively, the relative impact upon heat transfer data analysis of certain factors which may affect the accuracy of experimental heat transfer data. Inaccurate knowledge of adiabatic wall conditions results in a corresponding inaccuracy in the measured heat transfer coefficient. The magnitude of the resulting error is extreme for data obtained at wall temperatures approaching the adiabatic condition. High model wall temperatures and wall temperature gradients affect the level and distribution of heat transfer to an experimental model. The significance of each of these factors is examined and its impact upon heat transfer data analysis is assessed.
Critical management practices influencing on-site waste minimization in construction projects.
Ajayi, Saheed O; Oyedele, Lukumon O; Bilal, Muhammad; Akinade, Olugbenga O; Alaka, Hafiz A; Owolabi, Hakeem A
2017-01-01
As a result of increasing recognition of effective site management as the strategic approach for achieving the required performance in construction projects, this study seeks to identify the key site management practices that are requisite for construction waste minimization. A mixed methods approach, involving a field study and survey research, was used as the means of data collection. After confirmation of construct validity and reliability of scale, data analysis was carried out through a combination of the Kruskal-Wallis test, descriptive statistics and exploratory factor analysis. The study suggests that site management functions could significantly reduce waste generation through strict adherence to project drawings, and by ensuring fewer or no design changes during the construction process. Provision of waste skips for specific materials and maximisation of on-site reuse of materials are also found to be among the key factors for engendering waste minimization. The result of the factor analysis suggests four factors underlying on-site waste management practices, accounting for 96.093% of total variance. These measures include contractual provisions for waste minimization, waste segregation, maximisation of materials reuse and effective logistics management. Strategies through which each of the underlying measures could be achieved are further discussed in the paper. Findings of this study would assist construction site managers and other site operatives in reducing waste generated by construction activities. Copyright © 2016 Elsevier Ltd. All rights reserved.
Ko, Yi-An; Mukherjee, Bhramar; Smith, Jennifer A; Kardia, Sharon L R; Allison, Matthew; Diez Roux, Ana V
2016-11-01
There has been an increased interest in identifying gene-environment interaction (G × E) in the context of multiple environmental exposures. Most G × E studies analyze one exposure at a time, but we are exposed to multiple exposures in reality. Efficient analysis strategies for complex G × E with multiple environmental factors in a single model are still lacking. Using the data from the Multiethnic Study of Atherosclerosis, we illustrate a two-step approach for modeling G × E with multiple environmental factors. First, we utilize common clustering and classification strategies (e.g., k-means, latent class analysis, classification and regression trees, Bayesian clustering using Dirichlet Process) to define subgroups corresponding to distinct environmental exposure profiles. Second, we illustrate the use of an additive main effects and multiplicative interaction model, instead of the conventional saturated interaction model using product terms of factors, to study G × E with the data-driven exposure subgroups defined in the first step. We demonstrate useful analytical approaches to translate multiple environmental exposures into one summary class. These tools not only allow researchers to consider several environmental exposures in G × E analysis but also provide some insight into how genes modify the effect of a comprehensive exposure profile instead of examining effect modification for each exposure in isolation.
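The two-step approach described above, first clustering exposures into profiles and then testing gene-by-cluster interaction, can be outlined as follows. K-means stands in for the several clustering options the authors consider, the interaction model is a plain regression rather than their AMMI formulation, and the data are simulated.

```python
# Two-step G x E sketch: cluster exposures, then test gene x cluster terms.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n = 300
E = rng.normal(size=(n, 4))                    # four environmental exposures
g = rng.integers(0, 3, n)                      # genotype coded 0/1/2
y = 0.5 * g + E[:, 0] + rng.normal(size=n)     # outcome

# Step 1: summarize the exposure profile as one class label.
cluster = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(E)
)

# Step 2: model main effects plus the G x cluster interaction.
df = pd.DataFrame({"y": y, "g": g, "cluster": cluster})
fit = smf.ols("y ~ g * C(cluster)", data=df).fit()
print(fit.summary().tables[1])  # interaction terms test effect modification
```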
Chen, Zhixiang; Shao, Peng; Sun, Qizhao; Zhao, Dong
2015-03-01
The purpose of the present study was to use prospectively collected data to evaluate the rate of incidental durotomy (ID) during lumbar surgery and determine the associated risk factors using univariate and multivariate analysis. We retrospectively reviewed 2184 patients who underwent lumbar surgery from January 1, 2009 to December 31, 2011 at a single hospital. Patients with ID (n=97) were compared with patients without ID (n=2019). The influences of several potential risk factors that might affect the occurrence of ID were assessed using univariate and multivariate analyses. The overall incidence of ID was 4.62%. Univariate analysis demonstrated that older age, diabetes, lumbar central stenosis, posterior approach, revision surgery, prior lumbar surgery and minimally invasive surgery are risk factors for ID during lumbar surgery. However, multivariate analysis identified older age, prior lumbar surgery, revision surgery, and minimally invasive surgery as independent risk factors. These findings may guide clinicians making future surgical decisions regarding ID and aid in the patient counseling process to alleviate risks and complications. Copyright © 2015 Elsevier B.V. All rights reserved.
Role stressors and coping strategies among nurse managers.
Udod, Sonia; Cummings, Greta G; Care, W Dean; Jenkins, Megan
2017-02-06
Purpose The purpose of this paper is to share preliminary evidence about nurse managers' (NMs') role stressors and coping strategies in acute health-care facilities in Western Canada. Design/methodology/approach A qualitative exploratory inquiry provides deeper insight into NMs' perceptions of their role stressors and coping strategies, and into the factors and practices in the organizational context that facilitate and hinder their work. A purposeful sample of 17 NMs participated in this study. Data were collected through individual interviews and a focus group interview. Braun and Clarke's (2006) six-phase approach to thematic analysis guided data analysis. Findings Evidence demonstrates that individual factors and organizational practices and structures affect NMs' stress, creating an evolving role with unrealistic expectations: responding to continuous organizational change, a fragmented ability to effectively process decisions because of work overload, shifting organizational priorities, and being at risk for stress-related ill health. Practical implications These findings have implications for organizational support and intervention programs that enhance leadership approaches, address individual factors and work processes, and redesign the role in consideration of the role stress and work complexity affecting NMs' health. Originality/value It is anticipated that health-care leaders will find these results concerning and be inspired to take action to support NMs in doing meaningful work, as a way to retain existing managers and attract front-line nurses to positions of leadership.
Hecker, Kent; El Kurdi, Syliva; Joshi, Durgadatt; Stephen, Craig
2013-12-01
Japanese encephalitis (JE) is the leading cause of viral encephalitis in Asia and a significant public health problem in Nepal. Its epidemiology is influenced by factors affecting its amplifying hosts (pigs), vectors (mosquitoes), and dead-end hosts (including people). While most control efforts target reduced susceptibility to infection either by vaccination of people or pigs or by reduced exposure to mosquitoes; the economic reality of Nepal makes it challenging to implement standard JE control measures. An ecohealth approach has been nominated as a way to assist in finding and prioritizing locally relevant strategies for JE control that may be viable, feasible, and acceptable. We sought to understand if Nepalese experts responsible for JE management conceived of its epidemiology in terms of a socio-ecological system to determine if they would consider ecohealth approaches. Network analysis suggested that they did not conceive JE risk as a product of a socio-ecological system. Traditional proximal risk factors of pigs, mosquitoes, and vaccination predominated experts' conception of JE risk. People seeking to encourage an ecohealth approach or social change models to JE management in Nepal may benefit from adopting social marketing concepts to encourage and empower local experts to examine JE from a socio-ecological perspective.
Analysis of LDPE-ZnO-clay nanocomposites using novel cumulative rheological parameters
NASA Astrophysics Data System (ADS)
Kracalik, Milan
2017-05-01
Polymer nanocomposites exhibit complex rheological behaviour due to physical and possibly also chemical interactions between individual phases. Up to now, the rheology of dispersive polymer systems has usually been described by evaluating the viscosity curve (shear thinning phenomenon) or the storage modulus curve (formation of a secondary plateau), or by plotting information about damping behaviour (e.g. the Van Gurp-Palmen plot, or comparison of the loss factor tan δ). In contrast to evaluating damping behaviour, values of cot δ were calculated and termed the "storage factor", by analogy with the loss factor. Values of the storage factor were then integrated over a specific frequency range and termed the "cumulative storage factor". In this contribution, LDPE-ZnO-clay nanocomposites with different dispersion grades (physical networks) have been prepared and characterized by both conventional and novel analysis approaches. In addition to the cumulative storage factor, further cumulative rheological parameters such as cumulative complex viscosity, cumulative complex modulus and cumulative storage modulus have been introduced.
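The cumulative storage factor described above is a numerical integral of cot δ over the measured frequency window. A minimal sketch with simulated moduli follows; the power-law G' and G'' curves are stand-ins for measured oscillatory-shear data.

```python
# Cumulative rheological parameters from simulated G', G'' data.
import numpy as np

omega = np.logspace(-1, 2, 50)       # angular frequency, rad/s
G_storage = 1e3 * omega ** 0.6       # G'  (Pa), illustrative
G_loss = 2e3 * omega ** 0.8          # G'' (Pa), illustrative

loss_factor = G_loss / G_storage     # tan(delta)
storage_factor = G_storage / G_loss  # cot(delta)

# Cumulative parameters: integrate over the measured frequency window.
cum_storage_factor = np.trapz(storage_factor, omega)
cum_complex_modulus = np.trapz(np.hypot(G_storage, G_loss), omega)
print(cum_storage_factor, cum_complex_modulus)
```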
CrossTalk: The Journal of Defense Software Engineering. Volume 27, Number 1, January/February 2014
2014-02-01
deficit in trustworthiness and will permit analysis on how this deficit needs to be overcome. This analysis will help identify adaptations that are...approaches to trustworthy analysis split into two categories: product-based and process-based. Product-based techniques [9] identify factors that...Criticalities may also be assigned to decompositions and contributions. 5. Evaluation and analysis : in this task the propagation rules of the NFR
Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder
2009-12-01
To facilitate an in-depth process understanding, and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and to establish a design space that ensures the desired CQAs. Two types of analyses were performed to extract maximal information: DOE effect and response surface analysis, and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time) on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies the application of QbD principles and tools to drug product and process development.
ERIC Educational Resources Information Center
Marsh, Herbert W.; Tracey, Danielle K.; Craven, Rhonda G.
2006-01-01
Confirmatory factor analysis of responses by 211 preadolescents (M age = 10.25 years,SD = 1.48) with mild intellectual disabilities (MIDs) to the individually administered Self Description Questionnaire I-Individual Administration (SDQI-IA) counters widely cited claims that these children cannot differentiate multiple self-concept factors. Results…
ERIC Educational Resources Information Center
Goldweber, Asha; Bradshaw, Catherine P.; Goodman, Kimberly; Monahan, Kathryn; Cooley-Strickland, Michele
2011-01-01
There is compelling evidence for the role of social information processing (SIP) in aggressive behavior. However, less is known about factors that influence stability versus instability in patterns of SIP over time. Latent transition analysis was used to identify SIP patterns over one year and examine how community violence exposure, aggressive…
Yastrebov, V S; Mitikhin, V G; Solokhina, T A; Mitikhina, I A
OBJECTIVE: A system analysis and modeling of important areas of research on the organization of psychiatric services in Russia: the study of the mental health of the population, the identification of factors affecting the formation of the contingent of persons with mental disorders, and the organizational and functional structure of mental health services and mental health care. The authors analyzed scientific publications on the problems of psychiatric care organization, as well as the results of their own research over the last 25 years, using system analysis. An approach was suggested that allows the creation of a range of population models to monitor mental health status based on medical, demographic and social factors of life (more than 60 factors). Basic models and approaches for evaluating the activity of divisions of mental health services at the macro- and micro-social levels, taking into account expert information and individual characteristics of patients and relatives, were demonstrated. To improve treatment quality, models were developed for identifying the factors that positively or negatively influence adherence to psychopharmacotherapy among patients with schizophrenia and their families.
Lee, Jang-Eun; Lee, Bum-Jin; Chung, Jin-Oh; Kim, Hak-Nam; Kim, Eun-Hee; Jung, Sungheuk; Lee, Hyosang; Lee, Sang-Jun; Hong, Young-Shick
2015-05-01
Numerous factors such as geographical origin, cultivar, climate, cultural practices, and manufacturing processes influence the chemical compositions of tea, in the same way as growing conditions and grape variety affect wine quality. However, the relationships between these factors and tea chemical compositions are not well understood. In this study, a new approach for non-targeted or global analysis, i.e., metabolomics, which is highly reproducible and statistically effective in analysing a diverse range of compounds, was used to better understand the metabolome of Camellia sinensis and determine the influence of environmental factors, including geography, climate, and cultural practices, on tea-making. We found a strong correlation between environmental factors and the metabolome of green, white, and oolong teas from China, Japan, and South Korea. In particular, multivariate statistical analysis revealed strong inter-country and inter-city relationships in the levels of theanine and catechin derivatives found in green and white teas. This information might be useful for assessing tea quality or producing distinct tea products across different locations, and highlights simultaneous identification of diverse tea metabolites through an NMR-based metabolomics approach. Copyright © 2014 Elsevier Ltd. All rights reserved.
Evaluating voice characteristics of first-year acting students in Israel: factor analysis.
Amir, Ofer; Primov-Fever, Adi; Kushnir, Tami; Kandelshine-Waldman, Osnat; Wolf, Michael
2013-01-01
Acting students require diverse, high-quality, and high-intensity vocal performance from the early stages of their training. Demanding vocal activities, undertaken before the appropriate vocal skills have developed, put them at high risk of developing vocal problems. We performed a retrospective analysis of the voice characteristics of first-year acting students using several voice evaluation tools. A total of 79 first-year acting students (55 women and 24 men) were assigned to two study groups, laryngeal findings (LFs) and no laryngeal findings, based on stroboscopic findings. Their voice characteristics were evaluated using acoustic analysis, aerodynamic examination, perceptual scales, and self-report questionnaires. Results obtained from each set of measures were examined using a factor analysis approach. Significant differences between the two groups were found for a single fundamental frequency (F0)-Regularity factor; a single Grade, Roughness, Breathiness, Asthenia, Strain perceptual factor; and the three self-evaluation factors. Gender differences were found for two acoustic analysis factors based on F0 and its derivatives; an aerodynamic factor representing expiratory volume measurements; and a single self-evaluation factor representing the tendency to seek therapy. Approximately 50% of the first-year acting students had LFs. These students differed from their peers in the control group in a single acoustic analysis factor, as well as in perceptual and self-report factors. No group differences, however, were found for the aerodynamic factors. Early laryngeal examination and voice evaluation of future professional voice users could provide a valuable individual baseline to which later examinations could be compared, and assist in providing personally tailored treatment. Copyright © 2013 The Voice Foundation. Published by Mosby, Inc. All rights reserved.
Characterisation factors for life cycle impact assessment of sound emissions.
Cucurachi, S; Heijungs, R
2014-01-15
Noise is a serious stressor affecting the health of millions of citizens. It has been suggested that disturbance by noise is responsible for a substantial part of the damage to human health. However, no recommended approach to address noise impacts was proposed by the handbook for life cycle assessment (LCA) of the European Commission, nor are characterisation factors (CFs) and appropriate inventory data available in commonly used databases. This contribution provides CFs to allow for the quantification of noise impacts on human health in the LCA framework. Noise propagation standards and international reports on acoustics and noise impacts were used to define the model parameters. Spatial data was used to calculate spatially-defined CFs in the form of 10-by-10-km maps. The results of this analysis were combined with data from the literature to select input data for representative archetypal situations of emission (e.g. urban day with a frequency of 63 Hz, rural night at 8000 Hz, etc.). A total of 32 spatial and 216 archetypal CFs were produced to evaluate noise impacts at a European level (i.e. EU27). The possibility of a user-defined characterisation factor was added to support the possibility of portraying the situation of full availability of information, as well as a highly-localised impact analysis. A Monte Carlo-based quantitative global sensitivity analysis method was applied to evaluate the importance of the input factors in determining the variance of the output. The factors produced are ready to be implemented in the available LCA databases and software. The spatial approach and archetypal approach may be combined and selected according to the amount of information available and the life cycle under study. The framework proposed and used for calculations is flexible enough to be expanded to account for impacts on target subjects other than humans and to continents other than Europe. © 2013 Elsevier B.V. All rights reserved.
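The Monte Carlo-based global sensitivity analysis mentioned above is typically variance-based. The sketch below computes first-order Sobol indices with a pick-freeze estimator on a toy model; it illustrates the general technique only and is not the paper's characterisation model.

```python
# First-order Sobol indices via a Monte Carlo pick-freeze estimator.
import numpy as np

def model(x):
    # Toy model: output depends strongly on x0, weakly on x1, not on x2.
    return 4.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.0 * x[:, 2]

rng = np.random.default_rng(7)
n, k = 20000, 3
A = rng.uniform(size=(n, k))
B = rng.uniform(size=(n, k))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(k):
    AB = A.copy()
    AB[:, i] = B[:, i]        # "pick-freeze": replace column i only
    S_i = np.mean(fB * (model(AB) - fA)) / var
    print(f"S_{i} = {S_i:.3f}")
```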
Chung, Dongjun; Kuan, Pei Fen; Li, Bo; Sanalkumar, Rajendran; Liang, Kun; Bresnick, Emery H; Dewey, Colin; Keleş, Sündüz
2011-07-01
Chromatin immunoprecipitation followed by high-throughput sequencing (ChIP-seq) is rapidly replacing chromatin immunoprecipitation combined with genome-wide tiling array analysis (ChIP-chip) as the preferred approach for mapping transcription-factor binding sites and chromatin modifications. The state of the art for analyzing ChIP-seq data relies on using only reads that map uniquely to a relevant reference genome (uni-reads). This can lead to the omission of up to 30% of alignable reads. We describe a general approach for utilizing reads that map to multiple locations on the reference genome (multi-reads). Our approach is based on allocating multi-reads as fractional counts using a weighted alignment scheme. Using human STAT1 and mouse GATA1 ChIP-seq datasets, we illustrate that incorporation of multi-reads significantly increases sequencing depths, leads to detection of novel peaks that are not otherwise identifiable with uni-reads, and improves detection of peaks in mappable regions. We investigate various genome-wide characteristics of peaks detected only by utilization of multi-reads via computational experiments. Overall, peaks from multi-read analysis have similar characteristics to peaks that are identified by uni-reads except that the majority of them reside in segmental duplications. We further validate a number of GATA1 multi-read only peaks by independent quantitative real-time ChIP analysis and identify novel target genes of GATA1. These computational and experimental results establish that multi-reads can be of critical importance for studying transcription factor binding in highly repetitive regions of genomes with ChIP-seq experiments.
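The fractional-count idea for multi-reads can be illustrated with a toy allocation rule: distribute each multi-read across its candidate positions in proportion to local uni-read coverage. The paper's weighted alignment scheme is more elaborate; the sketch below only conveys the principle, with invented positions.

```python
# Toy fractional allocation of multi-reads across candidate positions.
import numpy as np

genome_len = 100
coverage = np.zeros(genome_len)

# Uni-reads first contribute whole counts.
uni_positions = [10, 10, 11, 50, 51, 51, 51]
for p in uni_positions:
    coverage[p] += 1.0

# Each multi-read maps to several positions; weight by local uni coverage.
multi_reads = [[10, 50], [11, 51], [50, 51]]
for candidates in multi_reads:
    w = coverage[candidates] + 1e-6      # pseudocount avoids zero weights
    coverage[candidates] += w / w.sum()  # fractional allocation sums to 1

print(coverage[[10, 11, 50, 51]])
```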
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lon N. Haney; David I. Gertman
2003-04-01
Beginning in the 1980s, a primary focus of human reliability analysis was estimation of human error probabilities. However, detailed qualitative modeling with comprehensive representation of contextual variables often was lacking. This was likely due to the lack of comprehensive error and performance shaping factor taxonomies, and the limited data available on observed error rates and their relationship to specific contextual variables. In the mid-90s, Boeing, America West Airlines, NASA Ames Research Center and INEEL partnered in a NASA-sponsored Advanced Concepts grant to: assess the state of the art in human error analysis, identify future needs for human error analysis, and develop an approach addressing these needs. Identified needs included a method to identify and prioritize task and contextual characteristics affecting human reliability, as well as comprehensive taxonomies to support detailed qualitative modeling and to structure meaningful data collection efforts across domains. A result was the development of the FRamework Assessing Notorious Contributing Influences for Error (FRANCIE), with a taxonomy for airline maintenance tasks. The assignment of performance shaping factors to generic errors by experts proved valuable for qualitative modeling. Performance shaping factors and error types from such detailed approaches can be used to structure error reporting schemes. In a recent NASA Advanced Human Support Technology grant, FRANCIE was refined, and two new taxonomies for use on space missions were developed. The development, sharing, and use of error taxonomies, and the refinement of approaches for increased fidelity of qualitative modeling, are offered as a means to help direct useful data collection strategies.
Iguchi, Toshihiro; Hiraki, Takao; Gobara, Hideo; Fujiwara, Hiroyasu; Matsui, Yusuke; Miyoshi, Shinichiro; Kanazawa, Susumu
2016-01-01
To retrospectively evaluate the safety of computed tomography (CT) fluoroscopy-guided short hook wire placement for video-assisted thoracoscopic surgery and the risk factors for pneumothorax associated with this procedure. We analyzed 267 short hook wire placements for 267 pulmonary lesions (mean diameter, 9.9 mm). Multiple variables related to the patients, lesions, and procedures were assessed to determine the risk factors for pneumothorax. Complications (219 grade 1 and 4 grade 2 adverse events) occurred in 196 procedures. No grade 3 or above adverse events were observed. Univariate analysis revealed increased vital capacity (odds ratio [OR], 1.518; P = 0.021), lower lobe lesion (OR, 2.343; P = 0.001), solid lesion (OR, 1.845; P = 0.014), prone positioning (OR, 1.793; P = 0.021), transfissural approach (OR, 11.941; P = 0.017), and longer procedure time (OR, 1.036; P = 0.038) as significant predictors of pneumothorax. Multivariate analysis revealed only the transfissural approach (OR, 12.171; P = 0.018) and a longer procedure time (OR, 1.048; P = 0.012) as significant independent predictors. Complications related to CT fluoroscopy-guided preoperative short hook wire placement occurred often, but all were minor and asymptomatic; a transfissural approach and longer procedure time were significant independent predictors of pneumothorax.
Ben Abid, Sadreddine; Mzoughi, Zeineb; Attaoui, Mohamed Amine; Talbi, Ghofrane; Arfa, Nafaa; Gharbi, Lassaad; Khalfallah, Mohamed Taher
2014-12-01
The feasibility and advantages of the laparoscopic approach for perforated duodenal ulcer no longer need to be demonstrated. Laparoscopic suture and peritoneal cleaning carry a conversion rate of between 10 and 23%, and although lower than with laparotomy, the morbidity of this approach is not negligible. This study aims to analyze the factors predisposing to conversion after a laparoscopic approach to perforated duodenal ulcer, and to define the morbidity of this approach and its predictive factors. Methods: A retrospective descriptive study was conducted covering all cases of perforated duodenal ulcer treated laparoscopically over a period of ten years, from January 2000 to December 2010. All patients were operated on by laparoscopy, with or without conversion, and conversion factors were noted. A statistical analysis with logistic regression was performed to identify independent risk factors for conversion among those found statistically significant in univariate analysis. The significance level was set at 5%. Univariate and multivariate analyses were performed to analyze morbidity factors. A total of 290 patients were included; the median age was 34 years. The intervention was conducted completely laparoscopically in 91.4% of cases, for a conversion rate of 8.6%. Risk factors for conversion were: age > 32 years, a known ulcer, progressive pain, renal failure, difficult peritoneal lavage, and a chronic-appearing ulcer. Postoperative morbidity was 5.1%. Three independent risk factors for surgical complications were identified: renal failure, age > 45 years, and a chronic-appearing ulcer. Laparoscopic treatment of perforated duodenal ulcer carries a conversion risk. Morbidity is certainly lower than with laparotomy, and better knowledge of predictive morbidity factors is necessary for better management of this disease.
Lifestyle Factors in U.S. Residential Electricity Consumption
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanquist, Thomas F.; Orr, Heather M.; Shui, Bin
2012-03-30
A multivariate statistical approach to lifestyle analysis of residential electricity consumption is described and illustrated. Factor analysis of selected variables from the 2005 U.S. Residential Energy Consumption Survey (RECS) identified five lifestyle factors reflecting social and behavioral choices associated with air conditioning, laundry usage, personal computer usage, climate zone of residence, and TV use. These factors were also estimated for 2001 RECS data. Multiple regression analysis using the lifestyle factors yields solutions accounting for approximately 40% of the variance in electricity consumption for both years. By adding the associated household and market characteristics of income, local electricity price, and access to natural gas, the variance accounted for is increased to approximately 54%. Income contributed only about 1% unique variance to the 2005 and 2001 models, indicating that lifestyle factors reflecting social and behavioral choices account for consumption differences better than income. This was not surprising given the 4-fold range of energy use at differing income levels. Geographic segmentation of factor scores is illustrated, and shows distinct clusters of consumption and lifestyle factors, particularly in suburban locations. The implications for tailored policy and planning interventions are discussed in relation to lifestyle issues.
Ross, Amy M; Ilic, Kelley; Kiyoshi-Teo, Hiroko; Lee, Christopher S
2017-12-26
The purpose of this study was to establish the psychometric properties of the new 16-item leadership environment scale. The leadership environment scale was based on complexity science concepts relevant to complex adaptive health care systems. A workforce survey of direct-care nurses was conducted (n = 1,443) in Oregon. Confirmatory factor analysis, exploratory factor analysis, concordant validity test and reliability tests were conducted to establish the structure and internal consistency of the leadership environment scale. Confirmatory factor analysis indices approached acceptable thresholds of fit with a single factor solution. Exploratory factor analysis showed improved fit with a two-factor model solution; the factors were labelled 'influencing relationships' and 'interdependent system supports'. Moderate to strong convergent validity was observed between the leadership environment scale/subscales and both the nursing workforce index and the safety organising scale. Reliability of the leadership environment scale and subscales was strong, with all alphas ≥.85. The leadership environment scale is structurally sound and reliable. Nursing management can employ adaptive complexity leadership attributes, measure their influence on the leadership environment, subsequently modify system supports and relationships and improve the quality of health care systems. The leadership environment scale is an innovative fit to complex adaptive systems and how nurses act as leaders within these systems. © 2017 John Wiley & Sons Ltd.
Xie, Anping; Woods-Hill, Charlotte Z; King, Anne F; Enos-Graves, Heather; Ascenzi, Judy; Gurses, Ayse P; Klaus, Sybil A; Fackler, James C; Milstone, Aaron M
2017-11-20
Work system assessments can facilitate successful implementation of quality improvement programs. Using a human factors engineering approach, we conducted a work system assessment to facilitate the dissemination of a quality improvement program for optimizing blood culture use in pediatric intensive care units at 2 hospitals. Semistructured face-to-face interviews were conducted with clinicians from Johns Hopkins All Children's Hospital and University of Virginia Medical Center. Interview data were analyzed using qualitative content analysis. Blood culture-ordering practices are influenced by various work system factors, including people, tasks, tools and technologies, the physical environment, organizational conditions, and the external environment. A clinical decision-support tool could facilitate implementation by (1) standardizing blood culture-ordering practices, (2) ensuring that prescribing clinicians review the patient's condition before ordering a blood culture, (3) facilitating critical thinking, and (4) empowering nurses to communicate with physicians and advocate for adherence to blood culture-ordering guidelines. The success of interventions for optimizing blood culture use relies heavily on the local context. A work system analysis using a human factors engineering approach can identify key areas to be addressed for the successful dissemination of quality improvement interventions. © The Author 2017. Published by Oxford University Press on behalf of The Journal of the Pediatric Infectious Diseases Society. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Evaluation of the Current Status of the Combinatorial Approach for the Study of Phase Diagrams
Wong-Ng, W.
2012-01-01
This paper provides an evaluation of the effectiveness of using the high throughput combinatorial approach for preparing phase diagrams of thin film and bulk materials. Our evaluation is based primarily on examples of combinatorial phase diagrams that have been reported in the literature as well as based on our own laboratory experiments. Various factors that affect the construction of these phase diagrams are examined. Instrumentation and analytical approaches needed to improve data acquisition and data analysis are summarized. PMID:26900530
Charmless two-body B decays: A global analysis with QCD factorization
NASA Astrophysics Data System (ADS)
Du, Dongsheng; Sun, Junfeng; Yang, Deshan; Zhu, Guohuai
2003-01-01
In this paper, we perform a global analysis of B→PP and PV decays with the QCD factorization approach. It is encouraging to observe that the predictions of QCD factorization are in good agreement with experiment. The best fit γ is around 79°. The penguin-diagram to tree-diagram ratio |Pππ/Tππ| of π+π- decays is preferred to be larger than 0.3. We also show the confidence levels for some interesting channels: B0→π0π0, K+K-, and B+→ωπ+, ωK+. For B→πK* decays, they are expected to have smaller branching ratios with more precise measurements.
The eudaimonic component of satisfaction with life and psychological well-being in Spanish cultures.
Díaz, Darío; Stavraki, María; Blanco, Amalio; Gandarillas, Beatriz
2015-01-01
In the study of well-being there are two partially overlapping traditions that have been developed in parallel. Subjective well-being (SWB) has been associated with the hedonistic approach of well-being, and psychological well-being (PWB) with the eudaimonistic one. However, satisfaction with life, the most common SWB indicator, is not strictly a hedonic concept and contains many eudaimonic components. The objective of this research is to examine whether a Eudaimonic Well-being G-Factor of Satisfaction with Life (SWLS) and Psychological Well-being Scales (PWBS) emerges. 400 people from the general population of Colombia (Study 1) and 401 from Spain (Study 2), recruited via advertisement, voluntarily participated and filled in a booklet containing, in order of appearance, the PWBS and the SWLS. According to our hypothesis, parallel analysis, eigenvalues, scree plot graphs and exploratory factor analysis (Study 1) suggested the existence of a one-factor structure. Confirmatory factor analysis (Study 2) indicated that this one-factor model provided excellent data fit. Results of a multi-group confirmatory factor analysis confirmed cross-cultural factor invariance. These results question the view that the satisfaction with life indicator is uniquely hedonic and point to the need for a greater integration between hedonic and eudaimonic traditions.
Single-diffractive production of dijets within the kt-factorization approach
NASA Astrophysics Data System (ADS)
Łuszczak, Marta; Maciuła, Rafał; Szczurek, Antoni; Babiarz, Izabela
2017-09-01
We discuss single-diffractive production of dijets. The cross section is calculated within the resolved Pomeron picture, for the first time in the kt-factorization approach, neglecting the transverse momentum of the Pomeron. We use Kimber-Martin-Ryskin unintegrated parton (gluon, quark, antiquark) distributions in both the proton and the Pomeron or subleading Reggeon. The unintegrated parton distributions are calculated based on conventional mmht2014nlo parton distribution functions in the proton and on the H1 Collaboration diffractive parton distribution functions used previously in the analysis of the diffractive structure function and dijets at HERA. For comparison, we present results of calculations performed within the collinear-factorization approach. Our results resemble those obtained in the next-to-leading-order approach. The calculation is (must be) supplemented by the so-called gap survival factor, which may, in general, depend on kinematical variables. We try to describe the existing data from the Tevatron and make detailed predictions for possible LHC measurements. Several differential distributions are calculated. The Ē_T, η̄ and x̄_P distributions are compared with the Tevatron data. A reasonable agreement is obtained for the first two distributions. The last one requires introducing a gap survival factor which depends on kinematical variables. We discuss how the phenomenological dependence on one kinematical variable may influence the dependence on other variables such as Ē_T and η̄. Several distributions for the LHC are shown.
Mokhtari, Kambiz; Ren, Jun; Roberts, Charles; Wang, Jin
2011-08-30
Ports and offshore terminals are critical infrastructure resources and play key roles in the transportation of goods and people. With more than 80 percent of international trade by volume being carried by sea, ports and offshore terminals are vital for seaborne trade and international commerce. Furthermore, in today's uncertain and complex environment there is a need to analyse the risk factors involved in order to prioritise protective measures in these critical logistics infrastructures. This study is therefore carried out to support the risk assessment phase of the proposed Risk Management (RM) framework used for sea ports and offshore terminals operations and management (PTOM). This has been fulfilled by integrating a generic bow-tie based risk analysis framework into the risk assessment phase as the backbone of that phase. Fault Tree Analysis (FTA) and Event Tree Analysis (ETA) are used to analyse the risk factors associated with PTOM. This process will eventually help port professionals and port risk managers to investigate the identified risk factors in more detail. To deal with the vagueness of the data, Fuzzy Set Theory (FST) and a possibility approach are used to overcome the disadvantages of conventional probability-based approaches. Copyright © 2011 Elsevier B.V. All rights reserved.
Snell, Deborah L; Surgenor, Lois J; Hay-Smith, E Jean C; Williman, Jonathan; Siegert, Richard J
2015-01-01
Outcomes after mild traumatic brain injury (MTBI) vary, with slow or incomplete recovery for a significant minority. This study examines whether groups of cases with shared psychological factors but with different injury outcomes could be identified using cluster analysis. This is a prospective observational study following 147 adults presenting to a hospital-based emergency department or concussion services in Christchurch, New Zealand. This study examined associations between baseline demographic, clinical and psychological variables (distress, injury beliefs and symptom burden) and outcome 6 months later. A two-step approach to cluster analysis was applied (Ward's method to identify clusters, K-means to refine results). Three meaningful clusters emerged (high-adapters, medium-adapters, low-adapters). Baseline cluster-group membership was significantly associated with outcomes over time. High-adapters appeared recovered by 6 weeks and medium-adapters revealed improvements by 6 months. The low-adapters continued to endorse many symptoms, negative recovery expectations and distress, being significantly at risk for poor outcome more than 6 months after injury (OR (good outcome) = 0.12; CI = 0.03-0.53; p < 0.01). Cluster analysis supported the notion that groups could be identified early post-injury based on psychological factors, with group membership associated with differing outcomes over time. Implications for clinical care providers regarding therapy targets and cases that may benefit from different intensities of intervention are discussed.
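The two-step procedure described here (Ward's hierarchical method to find clusters, K-means to refine them) can be sketched minimally as follows; the random data stand in for the baseline psychological measures.

```python
# Minimal sketch of the two-step clustering described above: Ward's
# hierarchical method proposes clusters, then K-means refines them
# starting from the Ward centroids. Synthetic stand-in data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans

X = np.random.default_rng(0).normal(size=(147, 3))  # distress, beliefs, symptoms

# Step 1: Ward linkage, cut into k = 3 clusters.
labels_ward = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")

# Step 2: K-means refinement, initialized at the Ward cluster centroids.
centroids = np.vstack([X[labels_ward == k].mean(axis=0) for k in (1, 2, 3)])
km = KMeans(n_clusters=3, init=centroids, n_init=1).fit(X)
print(np.bincount(km.labels_))
```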
Risk factors for parastomal hernia in Japanese patients with permanent colostomy.
Funahashi, Kimihiko; Suzuki, Takayuki; Nagashima, Yasuo; Matsuda, Satoshi; Koike, Junichi; Shiokawa, Hiroyuki; Ushigome, Mitsunori; Arai, Kenichiro; Kaneko, Tomoaki; Kurihara, Akiharu; Kaneko, Hironori
2014-08-01
Although the definitive risk factors for parastomal hernia development remain unclear, potential contributing factors have been reported from Western countries. The aim of this study was to identify the risk factors for parastomal hernia in Japanese patients with permanent colostomies. All patients who received abdominoperineal resection or total pelvic exenteration at our institution between December 2004 and December 2011 were reviewed. Patient-related, operation-related and postoperative variables were evaluated, in both univariate and multivariate analyses, to identify the risk factors for parastomal hernia formation. Of the 80 patients who underwent colostomy, 22 (27.5 %) developed a parastomal hernia during a median follow-up period of 953 days (range 15-2792 days). Hernia development was significantly associated with increasing patient age and body mass index, a laparoscopic surgical approach and the transperitoneal route of colostomy formation. In the multivariate analysis, the body mass index (p = 0.022), the laparoscopic approach (p = 0.043) and transperitoneal stoma creation (p = 0.021) retained statistical significance. Our findings in Japanese ostomates match those from Western countries: a higher body mass index, the use of a laparoscopic approach and a transperitoneal colostomy are significant independent risk factors for parastomal hernia formation. The precise role of the stoma creation route remains unclear.
UK Parents' Beliefs about Applied Behaviour Analysis as an Approach to Autism Education
ERIC Educational Resources Information Center
Denne, Louise D.; Hastings, Richard P.; Hughes, J. Carl
2017-01-01
Research into factors underlying the dissemination of evidence-based practice is limited within the field of Applied Behaviour Analysis (ABA). This is pertinent, particularly in the UK where national policies and guidelines do not reflect the emerging ABA evidence base, or policies and practices elsewhere. Theories of evidence-based practice in…
A proposed biophysical approach to Visual absorption capability (VAC)
W. C. Yeomans
1979-01-01
In British Columbia, visual analysis is in its formative stages and has only recently been accepted by Government as a resource component, notably within the Resource Analysis Branch, Ministry of Environment. Visual absorption capability (VAC), is an integral factor in visual resource assessment. VAC is examined by the author in the degree to which it relates to...
Information Acquisition, Analysis and Integration
2016-08-03
Topics include sensing and processing theory, applications, signal processing, image and video processing, machine learning, and technology transfer. Reported accomplishments include new approaches to long-standing problems such as image and video deblurring. Cited work: Polatkan, G. Sapiro, D. Blei, D. B. Dunson, and L. Carin, "Deep learning with hierarchical convolution factor analysis," IEEE.
ERIC Educational Resources Information Center
Duku, Eric; Vaillancourt, Tracy; Szatmari, Peter; Georgiades, Stelios; Zwaigenbaum, Lonnie; Smith, Isabel M.; Bryson, Susan; Fombonne, Eric; Mirenda, Pat; Roberts, Wendy; Volden, Joanne; Waddell, Charlotte; Thompson, Ann; Bennett, Teresa
2013-01-01
The purpose of this study was to examine the measurement properties of the Social Responsiveness Scale in an accelerated longitudinal sample of 4-year-old preschool children with the complementary approaches of categorical confirmatory factor analysis and Rasch analysis. Measurement models based on the literature and other hypothesized measurement…
ERIC Educational Resources Information Center
Salinas, Esther Charlotte
2013-01-01
Using the Gap Analysis problem-solving framework (Clark & Estes, 2008), this project examined collaboration around student achievement at the school site leadership level in the Pasadena Unified School District (PUSD). This project is one of three concurrent studies focused on collaboration around student achievement in the PUSD that include…
DOT National Transportation Integrated Search
1992-06-01
The Aids to Navigation (ATON) Service Force Mix (SFM) 2000 Project is documented in a Project Overview and three separately bound volumes. This is Volume III. The Project Overview describes the purpose, approach, analysis, and results of the ATON SFM...
ERIC Educational Resources Information Center
Llamas, Sonia Rodarte
2013-01-01
Using the Gap Analysis problem-solving framework (Clark & Estes, 2008), this study examined collaboration around student achievement at the central office leadership level in the Pasadena Unified School District (PUSD). This study is one of three concurrent studies focused on collaboration around student achievement in the PUSD that include…
ERIC Educational Resources Information Center
Carruthers, Anthony Steven
2013-01-01
Using the Gap Analysis problem-solving framework (Clark & Estes, 2008), this project examined collaboration around student achievement in the Pasadena Unified School District (PUSD) from the teacher perspective. As part of a tri-level study, two other projects examined collaboration around student achievement in PUSD from the perspectives of…
Ocean wavenumber estimation from wave-resolving time series imagery
Plant, N.G.; Holland, K.T.; Haller, M.C.
2008-01-01
We review several approaches that have been used to estimate ocean surface gravity wavenumbers from wave-resolving remotely sensed image sequences. Two fundamentally different approaches that utilize these data exist. A power spectral density approach identifies wavenumbers where image intensity variance is maximized. Alternatively, a cross-spectral correlation approach identifies wavenumbers where intensity coherence is maximized. We develop a solution to the latter approach based on a tomographic analysis that utilizes a nonlinear inverse method. The solution is tolerant to noise and other forms of sampling deficiency and can be applied to arbitrary sampling patterns, as well as to full-frame imagery. The solution includes error predictions that can be used for data retrieval quality control and for evaluating sample designs. A quantitative analysis of the intrinsic resolution of the method indicates that the cross-spectral correlation fitting improves resolution by a factor of about ten times as compared to the power spectral density fitting approach. The resolution analysis also provides a rule of thumb for nearshore bathymetry retrievals: short-scale cross-shore patterns may be resolved if they are about ten times longer than the average water depth over the pattern. This guidance can be applied to sample design to constrain both the sensor array (image resolution) and the analysis array (tomographic resolution). © 2008 IEEE.
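The cross-spectral idea can be illustrated in one dimension: for a monochromatic wave sampled at two locations a distance dx apart, the phase of the cross-spectrum at the peak frequency yields the wavenumber. This is a toy sketch with synthetic signals, not the paper's tomographic inversion; the sign convention follows scipy's definition of the cross-spectral density.

```python
# Minimal 1-D sketch of wavenumber estimation from cross-spectral phase.
import numpy as np
from scipy.signal import csd

fs, dx, k_true, f0 = 2.0, 5.0, 0.08, 0.125   # Hz, m, rad/m, Hz
t = np.arange(0, 2048) / fs
eta1 = np.cos(2 * np.pi * f0 * t)                 # sensor at x = 0
eta2 = np.cos(2 * np.pi * f0 * t - k_true * dx)   # sensor at x = dx

f, Pxy = csd(eta1, eta2, fs=fs, nperseg=256)
# scipy's csd returns conj(X) * Y, so angle(Pxy) = -k * dx at the peak.
k_est = -np.angle(Pxy[np.argmax(np.abs(Pxy))]) / dx
print(f"estimated k = {k_est:.3f} rad/m (true {k_true})")
```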
Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.
Saccenti, Edoardo; Timmerman, Marieke E
2017-03-01
Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
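Horn's procedure itself is short: retain components whose observed eigenvalues exceed a high percentile of eigenvalues obtained from random data of the same dimensions. A minimal sketch on synthetic data:

```python
# Minimal sketch of Horn's parallel analysis with a 95th-percentile
# criterion. Synthetic data with one injected common factor.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 10))
X[:, :3] += rng.normal(size=(300, 1))        # correlate the first 3 variables
obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X.T)))[::-1]

# Eigenvalue distributions from random data of the same shape.
sims = np.array([
    np.sort(np.linalg.eigvalsh(np.corrcoef(rng.normal(size=X.shape).T)))[::-1]
    for _ in range(500)
])
threshold = np.percentile(sims, 95, axis=0)
print("retain components:", np.sum(obs > threshold))
```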
Malay sentiment analysis based on combined classification approaches and Senti-lexicon algorithm.
Al-Saffar, Ahmed; Awang, Suryanti; Tao, Hai; Omar, Nazlia; Al-Saiagh, Wafaa; Al-Bared, Mohammed
2018-01-01
Sentiment analysis techniques are increasingly exploited to categorize opinion text into one or more predefined sentiment classes for the creation and automated maintenance of review-aggregation websites. In this paper, a Malay sentiment analysis classification model is proposed to improve classification performance based on semantic orientation and machine learning approaches. First, a total of 2,478 Malay sentiment-lexicon phrases and words are assigned synonyms and stored with the help of more than one Malay native speaker, and each entry is manually assigned a polarity score. In addition, supervised machine learning approaches and a lexicon knowledge method are combined for Malay sentiment classification, evaluating thirteen features. Finally, three individual classifiers and a combined classifier are used to evaluate the classification accuracy. In the experiments, a wide range of comparative tests is conducted on a Malay Reviews Corpus (MRC), demonstrating that feature extraction improves the performance of Malay sentiment analysis based on the combined classification. However, the results depend on three factors: the features, the number of features and the classification approach.
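One common way to combine a senti-lexicon with supervised learners, sketched below under assumptions: lexicon polarity counts are appended to bag-of-words features and three classifiers are combined by majority vote. The toy lexicon and documents are placeholders, not the paper's MRC data or feature set.

```python
# Minimal sketch: lexicon-derived features + bag-of-words, combined
# classifiers via hard voting. Lexicon/corpus here are toy placeholders.
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC

lexicon = {"bagus": 1.0, "teruk": -1.0}       # toy Malay senti-lexicon
docs = ["filem ini bagus", "servis teruk", "sangat bagus", "teruk sekali"]
y = [1, 0, 1, 0]

def lex_feats(d):
    # Two nonnegative features: total positive and total negative weight.
    scores = [lexicon.get(w, 0.0) for w in d.split()]
    return [sum(s for s in scores if s > 0), -sum(s for s in scores if s < 0)]

bow = CountVectorizer().fit_transform(docs)
X = hstack([bow, csr_matrix([lex_feats(d) for d in docs])])

clf = VotingClassifier([("nb", MultinomialNB()),
                        ("lr", LogisticRegression(max_iter=1000)),
                        ("svm", LinearSVC())], voting="hard")
print(clf.fit(X, y).predict(X))
```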
ERIC Educational Resources Information Center
Carney, Timothy Jay
2012-01-01
A study design has been developed that employs a dual modeling approach to identify factors associated with facility-level cancer screening improvement and how this is mediated by the use of clinical decision support. This dual modeling approach combines principles of (1) Health Informatics, (2) Cancer Prevention and Control, (3) Health Services…
Nano-Launcher Technologies, Approaches, and Life Cycle Assessment. Phase II
NASA Technical Reports Server (NTRS)
Zapata, Edgar
2014-01-01
Assist in understanding NASA technology and investment approaches, and other driving factors, necessary for enabling dedicated nano-launchers by industry at a cost and flight rate that (1) could support and be supported by an emerging nano-satellite market and (2) would benefit NASA's needs. Develop life-cycle cost, performance and other NASA analysis tools or models required to understand issues, drivers and challenges.
ERIC Educational Resources Information Center
Castro-Schilo, Laura; Ferrer, Emilio
2013-01-01
We illustrate the idiographic/nomothetic debate by comparing 3 approaches to using daily self-report data on affect for predicting relationship quality and breakup. The 3 approaches included (a) the first day in the series of daily data; (b) the mean and variability of the daily series; and (c) parameters from dynamic factor analysis, a…
Yekpe, Ketsia; Abatzoglou, Nicolas; Bataille, Bernard; Gosselin, Ryan; Sharkawi, Tahmer; Simard, Jean-Sébastien; Cournoyer, Antoine
2018-07-01
This study applied the concept of Quality by Design (QbD) to tablet dissolution. Its goal was to propose a quality control strategy to model dissolution testing of solid oral dose products according to International Conference on Harmonization guidelines. The methodology involved the following three steps: (1) a risk analysis to identify the material- and process-related parameters impacting the critical quality attributes of dissolution testing, (2) an experimental design to evaluate the influence of design factors (attributes and parameters selected by risk analysis) on dissolution testing, and (3) an investigation of the relationship between design factors and dissolution profiles. Results show that (a) in the case studied, the two parameters impacting dissolution kinetics are active pharmaceutical ingredient particle size distributions and tablet hardness and (b) these two parameters could be monitored with PAT tools to predict dissolution profiles. Moreover, based on the results obtained, modeling dissolution is possible. The practicality and effectiveness of the QbD approach were demonstrated through this industrial case study. Implementing such an approach systematically in industrial pharmaceutical production would reduce the need for tablet dissolution testing.
Work-related musculoskeletal complaints: some ergonomics challenges upon the start of a new century.
Westgaard, R H
2000-12-01
Three themes likely to be important within health-related ergonomics in the coming years are discussed. The first two themes concern methods for risk analysis of low-level biomechanical and psychosocial exposures. The third theme is approaches to successful implementation of ergonomics interventions. Evidence on the assessment of low-level biomechanical and psychosocial exposures by instrumented measurements is discussed. It is concluded that, despite recent advances in our understanding of exposure-effect associations under these exposure conditions, we must at present rely on more subjective methods, employed in a collaboration between expert and worker. This approach to risk analysis identifies in most cases critical exposures in a work situation. The focus should then be on the successful implementation of measures against those exposures, as identification alone does not solve problems. The aim of improved health for the workers further requires that the full complement of risk factors be considered, including work, leisure time and person-based risk factors. Finally, the need to put ergonomics intervention initiatives in an organisational context is emphasised, and examples of approaches used by Norwegian companies are presented.
Luce, Robert; Hildebrandt, Peter; Kuhlmann, Uwe; Liesen, Jörg
2016-09-01
The key challenge of time-resolved Raman spectroscopy is the identification of the constituent species and the analysis of the kinetics of the underlying reaction network. In this work we present an integral approach that allows for determining both the component spectra and the rate constants simultaneously from a series of vibrational spectra. It is based on an algorithm for nonnegative matrix factorization that is applied to the experimental data set following a few pre-processing steps. As a prerequisite for physically unambiguous solutions, each component spectrum must include one vibrational band that does not significantly interfere with the vibrational bands of other species. The approach is applied to synthetic "experimental" spectra derived from model systems comprising a set of species with component spectra differing with respect to their degree of spectral interferences and signal-to-noise ratios. In each case, the species involved are connected via monomolecular reaction pathways. The potential and limitations of the approach for recovering the respective rate constants and component spectra are discussed. © The Author(s) 2016.
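The factorization step can be sketched as follows: a matrix of spectra (one row per time point) is decomposed as V ≈ W H, where H holds nonnegative component spectra and W their time-dependent concentrations. The synthetic components below each carry a non-interfering band, matching the prerequisite stated in the abstract; this is an illustration with sklearn's NMF, not the authors' algorithm.

```python
# Minimal sketch: nonnegative matrix factorization of a spectral series.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
axis = np.linspace(0, 1, 400)

def band(center, width):
    return np.exp(-0.5 * ((axis - center) / width) ** 2)

S = np.vstack([band(0.3, 0.02), band(0.7, 0.02)])    # two component spectra
t = np.linspace(0, 1, 50)[:, None]
C = np.hstack([np.exp(-3 * t), 1 - np.exp(-3 * t)])  # A -> B kinetics
V = C @ S + 0.01 * rng.random((50, 400))             # observed spectra

model = NMF(n_components=2, init="nndsvda", max_iter=500)
W = model.fit_transform(V)   # concentration profiles (up to scaling)
H = model.components_        # recovered component spectra
print(W.shape, H.shape)
```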
Balasubramanian, M; Spencer, A J; Short, S D; Watkins, K; Chrisopoulos, S; Brennan, D S
2016-09-01
The integration of qualitative and quantitative approaches introduces new avenues to bridge the strengths and address the weaknesses of both methods. The aim was to develop measure(s) of migrant dentist experiences in Australia through a mixed methods approach. The sequential qualitative-quantitative design involved first the harvesting of data items from a qualitative study, followed by a national survey of migrant dentists in Australia. Statements representing unique experiences in migrant dentists' life stories were deployed in the survey questionnaire, using a five-point Likert scale. Factor analysis was used to examine component factors. Eighty-two statements from 51 participants were harvested from the qualitative analysis. A total of 1,022 of 1,977 migrant dentists (response rate 54.5%) returned completed questionnaires. Factor analysis supported an initial eight-factor solution; further scale development and reliability analysis led to five scales with a final list of 38 life story experience (LSE) items. Three scales were based on home country events: health system and general lifestyle concerns (LSE1; 10 items), society and culture (LSE4; 4 items) and career development (LSE5; 4 items). Two scales included migrant experiences in Australia: appreciation towards the Australian way of life (LSE2; 13 items) and settlement concerns (LSE3; 7 items). The five life story experience scales provided the necessary conceptual clarity and empirical grounding to explore migrant dentist experiences in Australia. Being based on original migrant dentist narrations, these scales have the potential to offer in-depth insights for policy makers and support future research on dentist migration. Copyright © 2016 Dennis Barber Ltd
Proteomic profiling of early degenerative retina of RCS rats
Zhu, Zhi-Hong; Fu, Yan; Weng, Chuan-Huang; Zhao, Cong-Jian; Yin, Zheng-Qin
2017-01-01
AIM To identify the underlying cellular and molecular changes in retinitis pigmentosa (RP). METHODS Label-free quantification-based proteomics analysis, with its advantages of being more economical and involving simpler procedures, has been used with increasing frequency in modern biological research. Dystrophic RCS rats, the first laboratory animal model for the study of RP, possess a pathological course similar to that of humans with the disease. Thus, we employed a comparative proteomics approach for in-depth proteome profiling of retinas from dystrophic RCS rats and non-dystrophic congenic controls through Linear Trap Quadrupole-Orbitrap MS/MS, to identify the significant differentially expressed proteins (DEPs). Bioinformatics analyses, including Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway annotation and upstream regulatory analysis, were then performed on these retinal proteins. Finally, a Western blotting experiment was carried out to verify the difference in the abundance of the transcription factor E2F1. RESULTS In this study, we identified a total of 2375 protein groups from the retinal protein samples of RCS rats and non-dystrophic congenic controls. Four hundred thirty-four significant DEPs were selected by Student's t-test. Based on the results of the bioinformatics analysis, we identified mitochondrial dysfunction and the transcription factor E2F1 as key initiation factors in the early retinal degenerative process. CONCLUSION We showed that mitochondrial dysfunction and the transcription factor E2F1 substantially contribute to the disease etiology of RP. The results suggest a new potential therapeutic approach for this retinal degenerative disease. PMID:28730077
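The DEP selection step, a per-protein Student's t-test between groups with a p-value cutoff, is simple to sketch; the intensity matrix below is synthetic, not the study's data.

```python
# Minimal sketch: per-protein t-test between two groups to select DEPs.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(3)
rcs = rng.normal(10, 1, size=(4, 2375))   # 4 dystrophic replicates (synthetic)
ctl = rng.normal(10, 1, size=(4, 2375))   # 4 control replicates (synthetic)
ctl[:, :400] += 1.5                       # simulate 400 true DEPs

t, p = ttest_ind(rcs, ctl, axis=0)
deps = np.where(p < 0.05)[0]
print(f"{deps.size} differentially expressed proteins at p < 0.05")
```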
An Analysis of Machine- and Human-Analytics in Classification.
Tam, Gary K L; Kothari, Vivek; Chen, Min
2017-01-01
In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.
Profitability Analysis of Soybean Oil Processes.
Cheng, Ming-Hsun; Rosentrater, Kurt A
2017-10-07
Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is the factor used to estimate the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time indicate a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When the capacities of annual oil production are larger than 12 and 173 million kg respectively, these two processes are profitable. The solvent-free approach, known as the enzyme-assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.
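The benchmarks named here (NPV and payback time under a given interest rate) reduce to a short discounted cash flow loop. The figures below are illustrative placeholders, not the study's cost data.

```python
# Minimal sketch of the cash flow benchmarks described above: NPV and
# discounted payback time. All numbers are illustrative, not study data.
capital = 50e6        # up-front investment, $
cash_flow = 8e6       # net annual revenue minus operating cost, $
rate = 0.07           # interest (discount) rate
years = 20

npv, payback = -capital, None
for year in range(1, years + 1):
    npv += cash_flow / (1 + rate) ** year
    if payback is None and npv >= 0:
        payback = year   # first year cumulative discounted NPV turns positive
print(f"NPV after {years} y: ${npv/1e6:.1f}M, payback in year {payback}")
```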
A GIS-based approach for comparative analysis of potential fire risk assessment
NASA Astrophysics Data System (ADS)
Sun, Ying; Hu, Lieqiu; Liu, Huiping
2007-06-01
Urban fires are one of the most important sources of property loss and human casualties, and it is therefore necessary to assess potential fire risk with consideration of urban community safety. Two evaluation models are proposed, both of which are integrated with GIS. One is a single-factor model concerning the accessibility of fire passages, and the other is a grey clustering approach based on a multifactor system. In the latter model, fourteen factors are introduced and divided into four categories: security management, evacuation facilities, construction resistance and fire-fighting capability. A case study on the campus of Beijing Normal University is presented to illustrate the potential risk assessment models in detail. A comparative analysis of the two models is carried out to validate their accuracy; the results are approximately consistent with each other. Moreover, modeling with GIS improves the efficiency of potential fire risk assessment.
The Faster, Better, Cheaper Approach to Space Missions: An Engineering Management Assessment
NASA Technical Reports Server (NTRS)
Hamaker, Joe
2000-01-01
This paper describes, in viewgraph form, the faster, better, cheaper approach to space missions. The topics include: 1) What drives "Faster, Better, Cheaper"? 2) Why Space Programs are Costly; 3) Background; 4) Aerospace Project Management (Old Culture); 5) Aerospace Project Management (New Culture); 6) Scope of Analysis Limited to Engineering Management Culture; 7) Qualitative Analysis; 8) Some Basic Principles of the New Culture; 9) Cause and Effect; 10) "New Ways of Doing Business" Survey Results; 11) Quantitative Analysis; 12) Recent Space System Cost Trends; 13) Spacecraft Dry Weight Trend; 14) Complexity Factor Trends; 15) Cost Normalization; 16) Cost Normalization Algorithm; 17) Unnormalized Cost vs. Normalized Cost; and 18) Concluding Observations.
Ehlers, Ute Christine; Ryeng, Eirin Olaussen; McCormack, Edward; Khan, Faisal; Ehlers, Sören
2017-02-01
The safety effects of cooperative intelligent transport systems (C-ITS) are mostly unknown and associated with uncertainties, because these systems represent emerging technology. This study proposes a bowtie analysis as a conceptual framework for evaluating the safety effect of cooperative intelligent transport systems. These seek to prevent road traffic accidents or mitigate their consequences. Under the assumption of the potential occurrence of a particular single vehicle accident, three case studies demonstrate the application of the bowtie analysis approach in road traffic safety. The approach utilizes exemplary expert estimates and knowledge from literature on the probability of the occurrence of accident risk factors and of the success of safety measures. Fuzzy set theory is applied to handle uncertainty in expert knowledge. Based on this approach, a useful tool is developed to estimate the effects of safety-related cooperative intelligent transport systems in terms of the expected change in accident occurrence and consequence probability. Copyright © 2016 Elsevier Ltd. All rights reserved.
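A sketch of how fuzzy expert estimates can be pushed through the gates of such a model: probabilities are represented as triangular fuzzy numbers (low, mode, high), with AND as a product and OR as 1 − ∏(1 − p), applied bound-wise. This componentwise treatment is a common approximation and is exact here because both gates are monotone; the numbers are invented for illustration, not the paper's estimates.

```python
# Minimal sketch: triangular fuzzy probabilities through bowtie-style gates.
import numpy as np

def t_and(*ps):                  # AND gate, bound-wise product
    return np.prod(np.vstack(ps), axis=0)

def t_or(*ps):                   # OR gate, bound-wise 1 - prod(1 - p)
    return 1 - np.prod(1 - np.vstack(ps), axis=0)

# Illustrative expert estimates (low, mode, high) for two risk factors
# and for the C-ITS safety measure failing to prevent the accident.
speeding   = np.array([0.10, 0.20, 0.35])
icy_road   = np.array([0.05, 0.10, 0.20])
cits_fails = np.array([0.01, 0.05, 0.15])

accident = t_and(t_or(speeding, icy_road), cits_fails)
print("fuzzy accident probability (low, mode, high):", accident.round(4))
```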
Radulescu, Georgeta; Gauld, Ian C.; Ilas, Germina; ...
2014-11-01
This paper describes a depletion code validation approach for criticality safety analysis using burnup credit for actinide and fission product nuclides in spent nuclear fuel (SNF) compositions. The technical basis for determining the uncertainties in the calculated nuclide concentrations is comparison of calculations to available measurements obtained from destructive radiochemical assay of SNF samples. Probability distributions developed for the uncertainties in the calculated nuclide concentrations were applied to the SNF compositions of a criticality safety analysis model by the use of a Monte Carlo uncertainty sampling method to determine bias and bias uncertainty in effective neutron multiplication factor. Application of the Monte Carlo uncertainty sampling approach is demonstrated for representative criticality safety analysis models of pressurized water reactor spent fuel pool storage racks and transportation packages using burnup-dependent nuclide concentrations calculated with SCALE 6.1 and the ENDF/B-VII nuclear data. Furthermore, the validation approach and results support a recent revision of the U.S. Nuclear Regulatory Commission Interim Staff Guidance 8.
Comprehensive efficiency analysis of supercomputer resource usage based on system monitoring data
NASA Astrophysics Data System (ADS)
Mamaeva, A. A.; Shaykhislamov, D. I.; Voevodin, Vad V.; Zhumatiy, S. A.
2018-03-01
One of the main problems of modern supercomputers is the low efficiency of their usage, which leads to the significant idle time of computational resources, and, in turn, to the decrease in speed of scientific research. This paper presents three approaches to study the efficiency of supercomputer resource usage based on monitoring data analysis. The first approach performs an analysis of computing resource utilization statistics, which allows to identify different typical classes of programs, to explore the structure of the supercomputer job flow and to track overall trends in the supercomputer behavior. The second approach is aimed specifically at analyzing off-the-shelf software packages and libraries installed on the supercomputer, since efficiency of their usage is becoming an increasingly important factor for the efficient functioning of the entire supercomputer. Within the third approach, abnormal jobs – jobs with abnormally inefficient behavior that differs significantly from the standard behavior of the overall supercomputer job flow – are being detected. For each approach, the results obtained in practice in the Supercomputer Center of Moscow State University are demonstrated.
A Novel Protocol for Model Calibration in Biological Wastewater Treatment
Zhu, Ao; Guo, Jianhua; Ni, Bing-Jie; Wang, Shuying; Yang, Qing; Peng, Yongzhen
2015-01-01
Activated sludge models (ASMs) have been widely used for process design, operation and optimization in wastewater treatment plants. However, it is still a challenge to achieve an efficient calibration for reliable application by using the conventional approaches. Hereby, we propose a novel calibration protocol, i.e. Numerical Optimal Approaching Procedure (NOAP), for the systematic calibration of ASMs. The NOAP consists of three key steps in an iterative scheme flow: i) global factors sensitivity analysis for factors fixing; ii) pseudo-global parameter correlation analysis for non-identifiable factors detection; and iii) formation of a parameter subset through an estimation by using genetic algorithm. The validity and applicability are confirmed using experimental data obtained from two independent wastewater treatment systems, including a sequencing batch reactor and a continuous stirred-tank reactor. The results indicate that the NOAP can effectively determine the optimal parameter subset and successfully perform model calibration and validation for these two different systems. The proposed NOAP is expected to use for automatic calibration of ASMs and be applied potentially to other ordinary differential equations models. PMID:25682959
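The final NOAP step, estimating the identifiable parameter subset with a genetic algorithm, can be illustrated on a toy model. The sketch below calibrates a first-order substrate decay model standing in for an ASM, and uses scipy's differential evolution as a stand-in for the genetic algorithm named in the abstract; data values are invented.

```python
# Minimal sketch: evolutionary calibration of a toy decay model.
import numpy as np
from scipy.optimize import differential_evolution

t_obs = np.array([0.0, 1.0, 2.0, 4.0, 8.0])        # h (invented)
s_obs = np.array([100.0, 61.0, 37.0, 14.0, 2.1])   # mg/L, "measured"

def model(theta, t):
    k, s0 = theta
    return s0 * np.exp(-k * t)

def sse(theta):                                    # calibration objective
    return np.sum((model(theta, t_obs) - s_obs) ** 2)

result = differential_evolution(sse, bounds=[(0.01, 2.0), (50.0, 150.0)],
                                seed=0)
print("estimated k, S0:", result.x.round(3))
```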
Psychosocial Modeling of Insider Threat Risk Based on Behavioral and Word Use Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greitzer, Frank L.; Kangas, Lars J.; Noonan, Christine F.
In many insider crimes, managers and other coworkers observed that the offenders had exhibited signs of stress, disgruntlement, or other issues, but no alarms were raised. Barriers to using such psychosocial indicators include the inability to recognize the signs and the failure to record the behaviors so that they can be assessed. A psychosocial model was developed to assess an employee’s behavior associated with an increased risk of insider abuse. The model is based on case studies and research literature on factors/correlates associated with precursor behavioral manifestations of individuals committing insider crimes. A complementary Personality Factor modeling approach was developed based on analysis to derive relevant personality characteristics from word use. Several implementations of the psychosocial model were evaluated by comparing their agreement with judgments of human resources and management professionals; the personality factor modeling approach was examined using email samples. If implemented in an operational setting, these models should be part of a set of management tools for employee assessment to identify employees who pose a greater insider threat.
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G
2016-09-01
The Cognitive Work Analysis Design Toolkit (CWA-DT) is a recently developed approach that provides guidance and tools to assist in applying the outputs of CWA to design processes to incorporate the values and principles of sociotechnical systems theory. In this paper, the CWA-DT is evaluated based on an application to improve safety at rail level crossings. The evaluation considered the extent to which the CWA-DT met pre-defined methodological criteria and aligned with sociotechnical values and principles. Both process and outcome measures were taken based on the ratings of workshop participants and human factors experts. Overall, workshop participants were positive about the process and indicated that it met the methodological criteria and sociotechnical values. However, expert ratings suggested that the CWA-DT achieved only limited success in producing RLX designs that fully aligned with the sociotechnical approach. Discussion about the appropriateness of the sociotechnical approach in a public safety context is provided. Practitioner Summary: Human factors and ergonomics practitioners need evidence of the effectiveness of methods. A design toolkit for cognitive work analysis, incorporating values and principles from sociotechnical systems theory, was applied to create innovative designs for rail level crossings. Evaluation results based on the application are provided and discussed.
Advances on the Failure Analysis of the Dam-Foundation Interface of Concrete Dams.
Altarejos-García, Luis; Escuder-Bueno, Ignacio; Morales-Torres, Adrián
2015-12-02
Failure analysis of the dam-foundation interface in concrete dams is characterized by complexity, uncertainties on models and parameters, and a strong non-linear softening behavior. In practice, these uncertainties are dealt with by a well-structured mixture of experience, best practices and prudent, conservative design approaches based on the safety factor concept. Yet, a sound, deep knowledge of some aspects of this failure mode remains unveiled, as these have been offset in practical applications by the use of this conservative approach. In this paper we show a strategy to analyse this failure mode under a reliability-based approach. The proposed methodology of analysis integrates epistemic uncertainty on the spatial variability of strength parameters and data from dam monitoring. The purpose is to produce meaningful and useful information regarding the probability of occurrence of this failure mode that can be incorporated in risk-informed dam safety reviews. In addition, relationships between probability of failure and factors of safety are obtained. This research is supported by more than a decade of intensive professional practice on real-world cases and its final purpose is to bring some clarity and guidance, and to contribute to the improvement of current knowledge and best practices on such an important dam safety concern.
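The relationship between safety factor and probability of failure can be illustrated with a crude Monte Carlo sketch: sample uncertain shear strength parameters, compute the sliding safety factor, and count realizations with SF < 1. The load values, distributions and lumped units below are illustrative assumptions, not the paper's data.

```python
# Minimal sketch: Monte Carlo probability of sliding failure vs. mean SF.
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
driving = 80.0                                  # MN, horizontal load (assumed)
weight = 300.0                                  # MN, effective vertical load
phi = np.radians(rng.normal(45, 5, n))          # friction angle, degrees
cohesion = rng.lognormal(np.log(10), 0.4, n)    # lumped cohesion term, MN

resisting = weight * np.tan(phi) + cohesion
sf = resisting / driving
print(f"mean SF = {sf.mean():.2f}, P(failure) = {np.mean(sf < 1):.2e}")
```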
Multivariate Analysis and Machine Learning in Cerebral Palsy Research
Zhang, Jing
2017-01-01
Cerebral palsy (CP), a common pediatric movement disorder, causes the most severe physical disability in children. Early diagnosis in high-risk infants is critical for early intervention and possible early recovery. In recent years, multivariate analytic and machine learning (ML) approaches have been increasingly used in CP research. This paper aims to identify such multivariate studies and provide an overview of this relatively young field. Studies reviewed in this paper have demonstrated that multivariate analytic methods are useful in identification of risk factors, detection of CP, movement assessment for CP prediction, and outcome assessment, and ML approaches have made it possible to automatically identify movement impairments in high-risk infants. In addition, outcome predictors for surgical treatments have been identified by multivariate outcome studies. To make the multivariate and ML approaches useful in clinical settings, further research with large samples is needed to verify and improve these multivariate methods in risk factor identification, CP detection, movement assessment, and outcome evaluation or prediction. As multivariate analysis, ML and data processing technologies advance in the era of Big Data of this century, it is expected that multivariate analysis and ML will play a bigger role in improving the diagnosis and treatment of CP to reduce mortality and morbidity rates, and enhance patient care for children with CP. PMID:29312134
Da Silva, Laeticia; Collino, Sebastiano; Cominetti, Ornella; Martin, Francois-Pierre; Montoliu, Ivan; Moreno, Sergio Oller; Corthesy, John; Kaput, Jim; Kussmann, Martin; Monteiro, Jacqueline Pontes; Guiraud, Seu Ping
2016-09-01
There is increasing interest in the profiling and quantitation of methionine pathway metabolites for health management research. Currently, several analytical approaches are required to cover metabolites and co-factors. We report the development and the validation of a method for the simultaneous detection and quantitation of 13 metabolites in red blood cells. The method, validated in a cohort of healthy human volunteers, shows a high level of accuracy and reproducibility. This high-throughput protocol provides a robust coverage of central metabolites and co-factors in one single analysis and in a high-throughput fashion. In large-scale clinical settings, the use of such an approach will significantly advance the field of nutritional research in health and disease.
Gilbert, Kathryn E
2013-02-01
Recent attempts to regulate Crisis Pregnancy Centers, pseudoclinics that surreptitiously aim to dissuade pregnant women from choosing abortion, have confronted the thorny problem of how to define commercial speech. The Supreme Court has offered three potential answers to this definitional quandary. This Note uses the Crisis Pregnancy Center cases to demonstrate that courts should use one of these solutions, the factor-based approach of Bolger v. Youngs Drugs Products Corp., to define commercial speech in the Crisis Pregnancy Center cases and elsewhere. In principle and in application, the Bolger factor-based approach succeeds in structuring commercial speech analysis at the margins of the doctrine.
Leading for the long haul: a mixed-method evaluation of the Sustainment Leadership Scale (SLS).
Ehrhart, Mark G; Torres, Elisa M; Green, Amy E; Trott, Elise M; Willging, Cathleen E; Moullin, Joanna C; Aarons, Gregory A
2018-01-19
Despite our progress in understanding the organizational context for implementation, and specifically the role of leadership in implementation, its role in sustainment has received little attention. This paper took a mixed-method approach to examine leadership during the sustainment phase of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Utilizing the Implementation Leadership Scale as a foundation, we sought to develop a short, practical measure of sustainment leadership that can be used for both applied and research purposes. Data for this study were collected as part of a larger mixed-method study of the sustainment of the evidence-based intervention SafeCare®. Quantitative data were collected from 157 providers using web-based surveys. Confirmatory factor analysis was used to examine the factor structure of the Sustainment Leadership Scale (SLS). Qualitative data were collected from 95 providers who participated in one of 15 focus groups. A framework approach guided qualitative data analysis. Mixed-method integration was also utilized to examine convergence of quantitative and qualitative findings. Confirmatory factor analysis supported the a priori higher order factor structure of the SLS, with subscales indicating a single higher order sustainment leadership factor. The SLS demonstrated excellent internal consistency reliability. Qualitative analyses offered support for the dimensions of sustainment leadership captured by the quantitative measure, in addition to uncovering a fifth possible factor, available leadership. This study found qualitative and quantitative support for the pragmatic SLS measure. The SLS can be used for assessing leadership of first-level leaders to understand how staff perceive leadership during sustainment and to suggest areas where leaders could direct more attention in order to increase the likelihood that EBIs are institutionalized into the normal functioning of the organization.
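The internal-consistency check reported here (Cronbach's alpha) has a compact closed form; a minimal sketch on random placeholder data, not the SLS responses:

```python
# Minimal sketch: Cronbach's alpha for a respondents-by-items matrix.
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(5)
true_score = rng.normal(size=(157, 1))                     # latent factor
responses = true_score + 0.5 * rng.normal(size=(157, 12))  # 12-item scale
print(f"alpha = {cronbach_alpha(responses):.2f}")
```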
Bonsaksen, Tore; Brown, Ted; Lim, Hua Beng; Fong, Kenneth
2017-05-02
Learning outcomes may be a result of several factors including the learning environment, students' predispositions, study efforts, cultural factors and approaches towards studying. This study examined the influence of demographic variables, education-related factors, and approaches to studying on occupational therapy students' Grade Point Average (GPA). Undergraduate occupational therapy students (n = 712) from four countries completed the Approaches and Study Skills Inventory for Students (ASSIST). Demographic background, education-related factors, and ASSIST scores were used in a hierarchical linear regression analysis to predict the students' GPA. Being older, being female and spending more time on self-study activities were associated with higher GPA among the students. In addition, five ASSIST subscales predicted higher GPA: higher scores on 'seeking meaning', 'achieving', and 'lack of purpose', and lower scores on 'time management' and 'fear of failure'. The full model accounted for 9.6% of the variance related to the occupational therapy students' GPA. To improve academic performance among occupational therapy students, it appears important to increase their personal search for meaning and motivation for achievement, and to reduce their fear of failure. The results should be interpreted with caution due to small effect sizes and the modest amount of variance explained by the regression model, and further research on predictors of academic performance is required.
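Hierarchical regression enters predictor blocks in sequence and inspects the R² gained at each step; a minimal sketch with hypothetical column names standing in for the study variables:

```python
# Minimal sketch: blockwise (hierarchical) OLS with R^2 change per step.
# The data file and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("assist_survey.csv")   # hypothetical data file
blocks = [
    ["age", "female"],                                    # demographics
    ["self_study_hours"],                                 # education-related
    ["seeking_meaning", "achieving", "lack_of_purpose",
     "time_management", "fear_of_failure"],               # ASSIST subscales
]

cols, prev_r2 = [], 0.0
for i, block in enumerate(blocks, 1):
    cols += block
    fit = sm.OLS(df["gpa"], sm.add_constant(df[cols])).fit()
    print(f"step {i}: R^2 = {fit.rsquared:.3f} (+{fit.rsquared - prev_r2:.3f})")
    prev_r2 = fit.rsquared
```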
Alternate Methods in Refining the SLS Nozzle Plug Loads
NASA Technical Reports Server (NTRS)
Burbank, Scott; Allen, Andrew
2013-01-01
Numerical analysis has shown that the SLS nozzle environmental barrier (nozzle plug) design is inadequate for the prelaunch condition, which consists of two dominant loads: (1) the main engines' startup pressure and (2) an environmentally induced pressure. Efforts to reduce load conservatisms included a dynamic analysis, which showed a 31% higher safety factor compared to the standard static analysis. The environmental load is typically approached with a deterministic method using the worst possible combinations of pressures and temperatures. An alternate probabilistic approach, utilizing the distributions of pressures and temperatures, resulted in a 54% reduction in the environmental pressure load. A Monte Carlo simulation of the environmental load that used five years of historical pressure and temperature data supported the results of the probabilistic analysis, indicating the probabilistic load is reflective of a 3-sigma condition (1 in 370 probability). Utilizing the probabilistic load analysis eliminated excessive conservatisms and will prevent a future overdesign of the nozzle plug. Employing a similar probabilistic approach to other design and analysis activities can result in realistic yet adequately conservative solutions.
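A Monte Carlo treatment of the environmental load can be sketched as follows: sample pressure and temperature from distributions fit to historical data, push them through the load model, and read off a 3-sigma (1-in-370) quantile. The distributions and the load expression below are invented placeholders, not the SLS models.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical distributions standing in for five years of pad weather records.
ambient_p = rng.normal(101.3, 0.6, N)      # kPa, ambient pressure
temp = rng.normal(295.0, 6.0, N)           # K, ambient temperature

# Toy load model: induced differential pressure grows with pressure excursions
# and falls with temperature (illustrative only, not the SLS load model).
load = 0.8 * (ambient_p - 101.3) - 0.02 * (temp - 295.0) + rng.normal(0, 0.3, N)

deterministic = load.max()                 # worst-case stacking of samples
p3sigma = np.quantile(load, 1 - 1 / 370)   # ~3-sigma (1-in-370) exceedance level
print(f"worst-case: {deterministic:.2f} kPa, 3-sigma MC load: {p3sigma:.2f} kPa")
```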
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
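As a concrete instance of the sampling-based approaches discussed, the sketch below propagates lognormal input uncertainty through a generic screening-level risk expression. The distributions, slope factor, and risk formula are illustrative assumptions only, not values from the review.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 50_000

# Hypothetical lognormal/normal inputs for a lifetime cancer risk screen:
conc = rng.lognormal(mean=np.log(40), sigma=0.4, size=N)     # DBP conc., ug/L
intake = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=N)  # water intake, L/day
bw = rng.normal(70, 10, N)                                   # body weight, kg
slope = 6.2e-3                                               # slope factor (illustrative)

# Chronic daily intake (mg/kg-day) and risk; ug -> mg conversion via 1e-3.
cdi = conc * 1e-3 * intake / bw
risk = slope * cdi
print(f"median risk {np.median(risk):.2e}, 95th percentile {np.quantile(risk, 0.95):.2e}")
```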
Quality factor analysis for aberrated laser beam
NASA Astrophysics Data System (ADS)
Ghafary, B.; Alavynejad, M.; Kashani, F. D.
2006-12-01
The quality factor of laser beams has attracted considerable attention, and several different approaches have been reported to treat the problem. In this paper we analyze the quality factor of a laser beam and compare the effect of different aberrations on beam quality by expanding the pure phase term of the wavefront in terms of Zernike polynomials. We also analyze experimentally the change in beam quality for different astigmatism aberrations and compare the theoretical results with the experimental ones; the two are in good agreement.
Construction, Analysis, and Data-Driven Augmentation of Supersaturated Designs
2013-09-01
Table 7 presents a supersaturated design example with 8 runs and 14 factors. ... much larger than the guidelines proposed by Marley and Woods (2010), who recommend that the factor-to-run ratio should be less than 2. Because our ratio ... an established approach in experimental design: Box (1992) provided general guidelines to consider, and traditional augmentation strategies like fold-over ...
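Because a supersaturated design has more factors than runs, its analysis needs a sparsity assumption; shrinkage selection such as the LASSO is one common route. A minimal sketch on a simulated 8-run, 14-factor example (factor-to-run ratio 1.75 < 2); the random design and active factors are invented, and this is not the report's own analysis method.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
n_runs, n_factors = 8, 14
# Random +/-1 design for illustration; real supersaturated designs are
# constructed deliberately, e.g. from Hadamard-based columns.
X = rng.choice([-1.0, 1.0], size=(n_runs, n_factors))

# Simulate a sparse truth: only factors 2 and 9 are active.
y = 3.0 * X[:, 2] - 2.0 * X[:, 9] + rng.normal(0, 0.2, n_runs)

# Shrinkage favors a small set of active factors among the many inert ones.
fit = Lasso(alpha=0.3).fit(X, y)
active = np.flatnonzero(np.abs(fit.coef_) > 1e-6)
print("declared active factors:", active)
```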
ERIC Educational Resources Information Center
Bosker, Roel J.; Witziers, Bob
School-effectiveness research has not yet been able to identify the factors that distinguish effective from noneffective schools, the real contribution of the significant factors, the true sizes of school effects, or the generalizability of school-effectiveness results. This paper presents findings of a meta-analysis, the Dutch PSO programme, that was used to…
USDA-ARS's Scientific Manuscript database
The Cocoseae is one of 13 tribes of Arecaceae subfamily Arecoideae, and contains a number of palms with significant economic importance, including the monotypic and pantropical Cocos nucifera, the coconut, and African oil palm (Elaeis guineensis). Using seven single copy WRKY transcription factor g...
De Benedictis, Lorenzo; Huck, Christian
2016-12-01
The optimization of near-infrared spectroscopic parameters was realized via design of experiments. With this approach, objectivity can be introduced into conventional, rather subjective, optimization procedures. The investigated factors were layer thickness, number of scans and temperature during measurement. Response variables in the full factorial design consisted of absorption intensity, signal-to-noise ratio and reproducibility of the spectra. The optimal factor combination was found to be 0.5 mm layer thickness, 64 scans and 25°C ambient temperature for liquid milk measurements. Qualitative analysis of milk indicated a strong influence of environmental factors, as well as the feeding of the cattle, on changes in milk composition. This was illustrated with the aid of near-infrared spectroscopy and the previously optimized parameters through the detection of altered fatty acids in milk, in particular changes in fatty acid content (number of carboxylic functions) and fatty acid chain length.
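A full factorial design over the three instrument factors simply enumerates all level combinations and selects the best-performing one. In the sketch below, only the reported optimum (0.5 mm, 64 scans, 25°C) comes from the abstract; the remaining levels and the toy response function are assumptions.

```python
import itertools
import numpy as np

# Three instrument factors; the intermediate levels are hypothetical.
layer_thickness = [0.5, 1.0, 2.0]   # mm
n_scans = [16, 32, 64]
temperature = [25, 30, 35]          # deg C

def snr(thickness, scans, temp):
    """Toy response surface standing in for measured signal-to-noise ratio."""
    return 10 / thickness + 2 * np.log2(scans) - 0.3 * abs(temp - 25)

design = list(itertools.product(layer_thickness, n_scans, temperature))
best = max(design, key=lambda run: snr(*run))
print(f"{len(design)} runs; best combination: {best}")  # expect (0.5, 64, 25)
```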
Lakhanpal, Meena; Singh, Laishram Chandreshwor; Rahman, Tashnin; Sharma, Jagnnath; Singh, M Madhumangal; Kataki, Amal Chandra; Verma, Saurabh; Pandrangi, Santhi Latha; Singh, Y Mohan; Wajid, Saima; Kapur, Sujala; Saxena, Sunita
2016-01-01
Nasopharyngeal carcinoma (NPC) is an epithelial tumour with a distinctive racial and geographical distribution. High incidence of NPC has been reported from China, Southeast Asia, and the northeast (NE) region of India. Immune mechanisms play a critical role in the pathogenesis of NPC. Tumour necrosis factors (TNFs) and heat shock protein 70 (HSP 70) constitute significant components of innate as well as adaptive host immunity. Multi-analytical approaches including logistic regression (LR), classification and regression tree (CART) and multifactor dimensionality reduction (MDR) were applied in 120 NPC cases and 100 controls to explore high order interactions among TNF-α (-308 G>A), TNF β (+252 A>G), HSP 70-1 (+190 G>C), HSP 70-hom (+2437 T>C) genes and environmental risk factors. TNF β was identified as the primary etiological factor by all three analytical approaches. Individual analysis of results showed a protective effect of the TNF β GG genotype (adjusted odds ratio (OR2) = 0.27, 95% CI = 0.125-0.611, P = 0.001) and the HSP 70 (+2437) CC genotype (OR2 = 0.17, 95% CI = 0.043-0.69, P = 0.013), while the AG genotype of TNF β was found significantly associated with risk of NPC (OR2 = 1.97, 95% CI = 1.019-3.83, P = 0.04). Analysis of environmental factors demonstrated association of alcohol consumption, living in mud houses and use of firewood for cooking as major risk factors for NPC. Individual haplotype association analysis showed significant risk associated with the GTGA haplotype (OR = 68.61, 95% CI = 2.47-190.37, P = 0.013) and a protective effect of the CCAA and GCGA haplotypes (OR = 0.19, 95% CI = 0.05-0.75, P = 0.019; OR = 0.01, 95% CI = 0.05-0.30, P = 0.007). The multi-analytical approaches applied in this study helped in the identification of distinct gene-gene and gene-environment interactions significant in risk assessment of NPC.
NASA Astrophysics Data System (ADS)
Tabibzadeh, Maryam
According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. Reviewing the literature of the offshore drilling industry indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. Yet the results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method for ascertaining well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and its conducted NPT is discussed. The risk analysis methodology consists of three approaches whose integration constitutes the overall methodology. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew; this analysis identifies the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches, as the main contributors to negative pressure test misinterpretation. Finally, a rational decision making model is introduced to quantify a section of the conceptual framework and to analyze the impact of different decision making biases on negative pressure test results. Along with the corroborating findings of previous studies, the analysis of the developed conceptual framework indicates that organizational factors are root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis of this framework identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the highest influence on misinterpreting a negative pressure test. Notably, the organizational factors captured in the conceptual framework are not specific to the NPT: most have been identified as common contributing causes not only of other offshore drilling accidents but also of accidents in other oil and gas operations and in high-risk operations in other industries. In addition, the proposed rational decision making model introduces a quantitative structure for analyzing the results of a conducted NPT. This model provides a structure and parametrically derived formulas to determine a cut-off point value, which assists personnel in accepting or rejecting an implemented negative pressure test.
Moreover, it enables analysts to assess the different decision making biases involved in interpreting a conducted negative pressure test, as well as the root organizational factors of those biases. In general, although the proposed integrated methodology is developed for the risk assessment of human and organizational contributions to negative pressure test misinterpretation, it can be generalized and is potentially useful for other well control situations, both offshore and onshore (e.g., fracking). It can also be applied to the analysis of other high-risk operations, not only in the oil and gas industry but also in industries such as nuclear power, aviation, and transportation.
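The cut-off idea can be illustrated with a textbook two-hypothesis decision rule: accept well integrity when the likelihood ratio, weighted by priors and misclassification costs, favors it, which for normal likelihoods reduces to a single threshold on the observation. All distributions, priors, and costs below are invented for illustration and are not the dissertation's formulas.

```python
import numpy as np

# Hypothetical setup: observed bleed-back volume x is roughly normal under a
# sound well (H0) and under a compromised barrier (H1); all numbers illustrative.
mu0, mu1, sigma = 5.0, 15.0, 3.0
p_leak = 0.05             # prior probability of a compromised barrier
c_fa, c_miss = 1.0, 50.0  # cost of a false alarm vs. cost of a missed leak

def cutoff():
    """Solve p(x|H1)/p(x|H0) = c_fa*(1-p_leak) / (c_miss*p_leak) for x."""
    lr = (c_fa * (1 - p_leak)) / (c_miss * p_leak)
    # For equal variances the log-likelihood ratio is linear in x:
    # (mu1-mu0)/sigma^2 * x + (mu0^2-mu1^2)/(2 sigma^2) = log(lr)
    return (np.log(lr) + (mu1**2 - mu0**2) / (2 * sigma**2)) * sigma**2 / (mu1 - mu0)

x_star = cutoff()
print(f"accept well integrity when bleed-back < {x_star:.1f} units")
```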
Unscaled Bayes factors for multiple hypothesis testing in microarray experiments.
Bertolino, Francesco; Cabras, Stefano; Castellanos, Maria Eugenia; Racugno, Walter
2015-12-01
Multiple hypothesis testing collects a series of techniques usually based on p-values as a summary of the available evidence from many statistical tests. In hypothesis testing, under a Bayesian perspective, the evidence for a specified hypothesis against an alternative, conditionally on data, is given by the Bayes factor. In this study, we approach multiple hypothesis testing based on both Bayes factors and p-values, regarding multiple hypothesis testing as a multiple model selection problem. To obtain the Bayes factors we assume default priors that are typically improper. In this case, the Bayes factor is usually undetermined due to the ratio of prior pseudo-constants. We show that ignoring prior pseudo-constants leads to unscaled Bayes factors, which do not invalidate the inferential procedure in multiple hypothesis testing, because they are used within a comparative scheme. In fact, using partial information from the p-values, we are able to approximate the sampling null distribution of the unscaled Bayes factor and use it within Efron's multiple testing procedure. The simulation study suggests that under a normal sampling model, and even with small sample sizes, our approach provides false positive and false negative proportions that are lower than those of other common multiple hypothesis testing approaches based only on p-values. The proposed procedure is illustrated in two simulation studies, and the advantages of its use are shown in the analysis of two microarray experiments.
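For intuition, the sketch below computes ordinary (scaled) Bayes factors for many simultaneous tests under a conjugate normal model and thresholds them against a simulated null distribution, in the spirit of empirical-null multiple testing. It does not reproduce the authors' unscaled-Bayes-factor construction; sample sizes, prior scale, and effect sizes are assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)
n, sigma, tau = 10, 1.0, 1.0   # per-test sample size, known sd, prior sd (assumed)
se = sigma / np.sqrt(n)

def bf01(xbar):
    """BF for H0: mu=0 vs H1: mu ~ N(0, tau^2), normal data with known sigma.
    Under H1 the sample mean is marginally N(0, tau^2 + se^2)."""
    return norm.pdf(xbar, 0, se) / norm.pdf(xbar, 0, np.sqrt(tau**2 + se**2))

# 1000 'genes': 950 true nulls, 50 with effect mu = 1.
xbar = np.concatenate([rng.normal(0, se, 950), rng.normal(1.0, se, 50)])
bf = bf01(xbar)

# An empirical null distribution of the BF gives a comparative threshold.
thresh = np.quantile(bf01(rng.normal(0, se, 100_000)), 0.05)
print(f"flagged: {(bf < thresh).sum()} of 1000 (threshold BF01 < {thresh:.3f})")
```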
Wen, Kuang-Yi; Gustafson, David H; Hawkins, Robert P; Brennan, Patricia F; Dinauer, Susan; Johnson, Pauley R; Siegler, Tracy
2010-01-01
To develop and validate the Readiness for Implementation Model (RIM). This model predicts a healthcare organization's potential for success in implementing an interactive health communication system (IHCS). The model consists of seven weighted factors, with each factor containing five to seven elements. Two decision-analytic approaches, self-explicated and conjoint analysis, were used to measure the weights of the RIM with a sample of 410 experts. The RIM model with weights was then validated in a prospective study of 25 IHCS implementation cases. Orthogonal main effects design was used to develop 700 conjoint-analysis profiles, which varied on seven factors. Each of the 410 experts rated the importance and desirability of the factors and their levels, as well as a set of 10 different profiles. For the prospective 25-case validation, three time-repeated measures of the RIM scores were collected for comparison with the implementation outcomes. Two of the seven factors, 'organizational motivation' and 'meeting user needs,' were found to be most important in predicting implementation readiness. No statistically significant difference was found in the predictive validity of the two approaches (self-explicated and conjoint analysis). The RIM was a better predictor for the 1-year implementation outcome than the half-year outcome. The expert sample, the order of the survey tasks, the additive model, and basing the RIM cut-off score on experience are possible limitations of the study. The RIM needs to be empirically evaluated in institutions adopting IHCS and sustaining the system in the long term.
Cai, Li-mei; Ma, Jin; Zhou, Yong-zhang; Huang, Lan-chun; Dou, Lei; Zhang, Cheng-bo; Fu, Shan-ming
2008-12-01
One hundred and eighteen surface soil samples were collected from Dongguan City and analyzed for concentrations of Cu, Zn, Ni, Cr, Pb, Cd, As and Hg, as well as pH and OM. The spatial distribution and sources of soil heavy metals were studied using multivariate geostatistical methods and GIS techniques. The results indicated that concentrations of Cu, Zn, Ni, Pb, Cd and Hg exceeded the soil background levels for Guangdong province; concentrations of Pb, Cd and Hg exceeded them markedly. Factor analysis grouped Cu, Zn, Ni, Cr and As in Factor 1, Pb and Hg in Factor 2, and Cd in Factor 3. The spatial maps based on geostatistical analysis show a definite association of Factor 1 with the soil parent material, while Factor 2 was mainly affected by industry. The spatial distribution of Factor 3 was attributed to anthropogenic influence.
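A factor-analysis step like the one reported can be sketched with scikit-learn. The simulated sample below is built to mimic the reported three-factor grouping of the eight metals; loadings above an arbitrary 0.5 cut are printed per factor.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
metals = ["Cu", "Zn", "Ni", "Cr", "As", "Pb", "Hg", "Cd"]

# Simulated stand-in for the 118 soil samples: three latent sources roughly
# mimicking the reported grouping (parent material, industry, anthropogenic).
n = 118
src = rng.normal(size=(n, 3))
load = np.zeros((3, 8))
load[0, :5] = 1.0        # Cu Zn Ni Cr As  <- parent material
load[1, 5:7] = 1.0       # Pb Hg           <- industry
load[2, 7] = 1.0         # Cd              <- anthropogenic
X = src @ load + rng.normal(scale=0.3, size=(n, 8))

fa = FactorAnalysis(n_components=3, rotation="varimax")
scores = fa.fit_transform(StandardScaler().fit_transform(X))
for i, row in enumerate(fa.components_):
    print(f"Factor {i + 1}:", [m for m, l in zip(metals, row) if abs(l) > 0.5])
```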
Multilevel Poisson regression modelling for determining factors of dengue fever cases in Bandung
NASA Astrophysics Data System (ADS)
Arundina, Davila Rubianti; Tantular, Bertho; Pontoh, Resa Septiani
2017-03-01
Dengue fever is caused by a virus of the genus Flavivirus (dengue virus) and is transmitted through the bite of Aedes aegypti mosquitoes infected with the virus. The study was conducted in 151 villages in Bandung. Health analysts believe that two groups of factors affect dengue cases: internal (individual) factors and external (environmental) factors. The data used in this research are hierarchical, so the appropriate method is multilevel modelling, where level 1 is the village and level 2 is the sub-district. According to exploratory data analysis, the suitable multilevel specification is a random intercept model. A penalized quasi-likelihood (PQL) approach to multilevel Poisson regression is a proper analysis for determining the factors affecting dengue cases in the city of Bandung. The clean-and-healthy-behavior factor at the village level affects the number of dengue fever cases in the city of Bandung, while the corresponding factor at the sub-district level has no effect.
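statsmodels has no PQL fitter, but a random-intercept Poisson model can be approximated with its Bayesian mixed GLM, fit by variational Bayes, as a stand-in. A minimal sketch under a simulated village/sub-district hierarchy; the covariate and effect sizes are invented, and the availability of PoissonBayesMixedGLM.from_formula/fit_vb is assumed from recent statsmodels versions.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import PoissonBayesMixedGLM

rng = np.random.default_rng(9)
n_sub, per_sub = 30, 5                          # 150 villages in 30 sub-districts
sub = np.repeat(np.arange(n_sub), per_sub)
clean_behavior = rng.normal(size=n_sub * per_sub)   # village-level covariate
u = rng.normal(scale=0.4, size=n_sub)               # sub-district random intercepts
cases = rng.poisson(np.exp(1.0 - 0.5 * clean_behavior + u[sub]))

df = pd.DataFrame({"cases": cases, "clean": clean_behavior, "sub": sub})
model = PoissonBayesMixedGLM.from_formula(
    "cases ~ clean", vc_formulas={"sub": "0 + C(sub)"}, data=df)
result = model.fit_vb()     # variational Bayes fit of the random-intercept model
print(result.summary())
```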
Quantitative descriptions of generalized arousal, an elementary function of the vertebrate brain
Quinkert, Amy Wells; Vimal, Vivek; Weil, Zachary M.; Reeke, George N.; Schiff, Nicholas D.; Banavar, Jayanth R.; Pfaff, Donald W.
2011-01-01
We review a concept of the most primitive, fundamental function of the vertebrate CNS, generalized arousal (GA). Three independent lines of evidence indicate the existence of GA: statistical, genetic, and mechanistic. Here we ask, is this concept amenable to quantitative analysis? Answering in the affirmative, four quantitative approaches have proven useful: (i) factor analysis, (ii) information theory, (iii) deterministic chaos, and (iv) application of a Gaussian equation. It strikes us that, to date, not just one but at least four different quantitative approaches seem necessary for describing different aspects of scientific work on GA. PMID:21555568
NASA Technical Reports Server (NTRS)
Murphy, M. R.
1980-01-01
A resource management approach to aircrew performance is defined and utilized in structuring an analysis of 84 exemplary incidents from the NASA Aviation Safety Reporting System. The distribution of enabling and associated (evolutionary) and recovery factors between and within five analytic categories suggests that resource management training be concentrated on: (1) interpersonal communications, with air traffic control information of major concern; (2) task management, mainly setting priorities and appropriately allocating tasks under varying workload levels; and (3) planning, coordination, and decisionmaking concerned with preventing and recovering from potentially unsafe situations in certain aircraft maneuvers.
Ali, Niloufer S; Ali, Farzana N; Khuwaja, Ali K; Nanji, Kashmira
2014-08-01
OBJECTIVES. To assess the proportion of women subjected to intimate partner violence and the associated factors, and to identify the attitudes of women towards the use of violence by their husbands. DESIGN. Cross-sectional study. SETTING. Family practice clinics at a teaching hospital in Karachi, Pakistan. PARTICIPANTS. A total of 520 women aged between 16 and 60 years were consecutively approached to participate in the study and interviewed by trained data collectors. Overall, 401 completed questionnaires were available for analysis. Multivariate logistic regression analysis was used to identify the association of various factors of interest. RESULTS. In all, 35% of the women reported being physically abused by their husbands in the last 12 months. Multivariate analysis showed that experiences of violence were independently associated with women's illiteracy (adjusted odds ratio=5.9; 95% confidence interval, 1.8-19.6), husband's illiteracy (3.9; 1.4-10.7), smoking habit of husbands (3.3; 1.9-5.8), and substance use (3.1; 1.7-5.7). CONCLUSION. It is imperative that intimate partner violence be considered a major public health concern. It can be prevented through comprehensive, multifaceted, and integrated approaches. The role of education is greatly emphasised in changing the perspectives of individuals and societies against intimate partner violence.
NASA Astrophysics Data System (ADS)
Samphutthanon, R.; Tripathi, N. K.; Ninsawat, S.; Duboz, R.
2014-12-01
The main objective of this research was the development of a hand, foot, and mouth disease (HFMD) hazard zonation (HFMD-HZ) model by applying AHP and Fuzzy Logic AHP (FAHP) methodologies for weighting spatial factors such as disease incidence, socio-economic factors and physical factors. The outputs of AHP and FAHP were input into a Geographic Information Systems (GIS) process for spatial analysis. Fourteen criteria were selected as important factors: disease incidence over the 10 years from 2003 to 2012, population density, road density, land use and physical features. The results showed a consistency ratio (CR) for the main criteria of 0.075427 for AHP and 0.092436 for FAHP; as both remained below the threshold of 0.1, the CR values were acceptable. After linking to actual geospatial data (disease incidence in 2013) through spatial analysis in GIS for validation, the results of the FAHP approach were found to match more accurately than those of the AHP approach. The zones with the highest hazard of HFMD outbreaks were located in two main areas: central Muang Chiang Mai district including its suburbs, and Muang Chiang Rai district including its vicinity. The resulting hazard maps may be useful for organizing HFMD protection plans.
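The consistency ratio reported for the AHP weights follows Saaty's construction: CR = CI/RI with CI = (lambda_max - n)/(n - 1). A minimal sketch; the 4x4 pairwise comparison matrix is invented and is not the study's 14-criterion matrix.

```python
import numpy as np

# Saaty's random consistency indices for matrix sizes 1..10.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

def consistency_ratio(A: np.ndarray) -> float:
    """CR = CI / RI with CI = (lambda_max - n) / (n - 1)."""
    n = A.shape[0]
    lam_max = np.max(np.linalg.eigvals(A).real)
    ci = (lam_max - n) / (n - 1)
    return ci / RI[n]

# Hypothetical pairwise comparison matrix over four criteria groups
# (incidence, population, roads, land use); entries are illustrative.
A = np.array([[1,   3,   5,   7],
              [1/3, 1,   3,   5],
              [1/5, 1/3, 1,   3],
              [1/7, 1/5, 1/3, 1]], dtype=float)
print(f"CR = {consistency_ratio(A):.3f}  (acceptable if < 0.1)")
```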
Mohammadfam, Iraj; Soltanzadeh, Ahmad; Moghimbeigi, Abbas; Akbarzadeh, Mehdi
2016-09-01
Individual and organizational factors influence the occurrence and severity of traumatic occupational injuries. The aim of the present study was a path analysis of the severity of occupational injuries based on individual and organizational factors. This cross-sectional analytical study covered traumatic occupational injuries over a ten-year timeframe in 13 large Iranian construction industries. Modeling and data analysis were done using the structural equation modeling (SEM) approach and the IBM SPSS AMOS statistical software (version 22.0), respectively. The mean age and working experience of the injured workers were 28.03 ± 5.33 and 4.53 ± 3.82 years, respectively. Construction and installation activities accounted for 64.4% and 18.1% of traumatic occupational injuries, respectively. The SEM findings showed that individual, organizational and accident-type factors were significant predictors of the severity of occupational injuries (P < 0.05). Path analysis based on the SEM reveals that individual and organizational factors and their indicator variables strongly influence the severity of traumatic occupational injuries; they should therefore be addressed to reduce the severity of occupational accidents in large construction industries.
Cleared for the visual approach: Human factor problems in air carrier operations
NASA Technical Reports Server (NTRS)
Monan, W. P.
1983-01-01
In the study described herein, a set of 353 ASRS reports of unique aviation occurrences significantly involving visual approaches was examined to identify hazards and pitfalls embedded in the visual approach procedure and to consider operational practices that might help avoid future mishaps. Analysis of the report set identified nine aspects of the visual approach procedure that appeared to be predisposing conditions for inducing or exacerbating the effects of operational errors by flight crew members or controllers. Predisposing conditions, errors, and operational consequences of the errors are discussed. In summary, operational policies that might mitigate the problems are examined.
Indicators of economic security of the region: a risk-based approach to assessing and rating
NASA Astrophysics Data System (ADS)
Karanina, Elena; Loginov, Dmitri
2017-10-01
The article presents the results of research on theoretical and methodological problems in developing an economic security strategy for a particular region, grounded in the composition of regional risk factors, and provides an analysis of those risk factors. The threshold values of indicators of economic security of regions were determined using the methods of socio-economic statistics. The authors conclude that in modern Russian conditions it is necessary to pay close attention to the analysis of the composition and level of indicators of regional economic security and, based on this analysis, to formulate more precise decisions concerning the strategy of socio-economic development.
A generalized nonlinear model-based mixed multinomial logit approach for crash data analysis.
Zeng, Ziqiang; Zhu, Wenbo; Ke, Ruimin; Ash, John; Wang, Yinhai; Xu, Jiuping; Xu, Xinxin
2017-02-01
The mixed multinomial logit (MNL) approach, which can account for unobserved heterogeneity, is a promising unordered model that has been employed in analyzing the effect of factors contributing to crash severity. However, its basic assumption of using a linear function to explore the relationship between the probability of crash severity and its contributing factors can be violated in reality. This paper develops a generalized nonlinear model-based mixed MNL approach which is capable of capturing non-monotonic relationships by developing nonlinear predictors for the contributing factors in the context of unobserved heterogeneity. The crash data on seven Interstate freeways in Washington between January 2011 and December 2014 are collected to develop the nonlinear predictors in the model. Thirteen contributing factors in terms of traffic characteristics, roadway geometric characteristics, and weather conditions are identified to have significant mixed (fixed or random) effects on the crash density in three crash severity levels: fatal, injury, and property damage only. The proposed model is compared with the standard mixed MNL model. The comparison results suggest a slight superiority of the new approach in terms of model fit measured by the Akaike Information Criterion (12.06 percent decrease) and Bayesian Information Criterion (9.11 percent decrease). The predicted crash densities for all three levels of crash severities of the new approach are also closer (on average) to the observations than the ones predicted by the standard mixed MNL model. Finally, the significance and impacts of the contributing factors are analyzed.
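A standard (non-mixed) multinomial logit baseline of the kind the paper generalizes can be fit with statsmodels. The sketch below simulates three severity levels from Gumbel utilities and reports the AIC used for the sort of model-fit comparison described; variables and coefficients are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(13)
n = 2000
X = pd.DataFrame({
    "aadt": rng.normal(0, 1, n),        # standardized traffic volume
    "curvature": rng.normal(0, 1, n),   # roadway geometry
    "precip": rng.normal(0, 1, n),      # weather
})
# Latent utilities for severity levels: 0=PDO, 1=injury, 2=fatal (simulated).
u0 = rng.gumbel(size=n)
u1 = 0.5 * X.aadt + 0.3 * X.precip + rng.gumbel(size=n)
u2 = -1.5 + 0.8 * X.curvature + rng.gumbel(size=n)
y = np.argmax(np.c_[u0, u1, u2], axis=1)

fit = sm.MNLogit(y, sm.add_constant(X)).fit(disp=False)
print(fit.summary())
print("AIC:", fit.aic)   # basis for the kind of model-fit comparison reported
```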
Factors affecting the surgical approach and timing of bilateral adrenalectomy.
Lan, Billy Y; Taskin, Halit E; Aksoy, Erol; Birsen, Onur; Dural, Cem; Mitchell, Jamie; Siperstein, Allan; Berber, Eren
2015-07-01
Laparoscopic adrenalectomy has gained widespread acceptance. However, the optimal surgical approach to laparoscopic bilateral adrenalectomy has not been clearly defined. The aim of this study is to analyze the patient and intraoperative factors affecting the feasibility and outcome of different surgical approaches, in order to define an algorithm for bilateral adrenalectomy. Between 2000 and 2013, all patients who underwent bilateral adrenalectomy at a single institution were selected for retrospective analysis. Patient factors, surgical approach, operative outcomes, and complications were analyzed. From 2000 to 2013, 28 patients underwent bilateral adrenalectomy. Patient diagnoses included Cushing's disease (n = 19), pheochromocytoma (n = 7), and adrenal metastasis (n = 2). Of these 28 patients, successful laparoscopic adrenalectomy was performed in all but 2. Twenty-three of the 26 laparoscopic cases were completed in a single stage, while three were performed as a staged approach, due to deterioration in intraoperative respiratory status in two patients and patient body habitus in one. Of the adrenalectomies completed using the minimally invasive approach, a posterior retroperitoneal (PR) approach was performed in 17 patients and a lateral transabdominal (LT) approach in 9 patients. Patients who underwent the LT approach had higher BMI, larger tumor size, and other concomitant intraabdominal pathology. Hospital stay for laparoscopic adrenalectomy was 3.5 days, compared to 5 and 12 days for the two open cases. There was no 30-day hospital mortality, and 5 patients had minor complications in the entire cohort. A minimally invasive operation is feasible in 93% of patients undergoing bilateral adrenalectomy, with 65% of adrenalectomies performed using the PR approach. Indications for the LT approach include morbid obesity, tumor size >6 cm, and other concomitant intraabdominal pathology. Single-stage adrenalectomies are feasible in most patients, with prolonged operative time causing respiratory instability being the main indication for a staged approach.
Development of Evidence-Based Health Policy Documents in Developing Countries: A Case of Iran
Imani-Nasab, Mohammad Hasan; Seyedin, Hesam; Majdzadeh, Reza; Yazdizadeh, Bahareh; Salehi, Masoud
2014-01-01
Background: Evidence-based policy documents that are well developed by senior civil servants and are timely available can reduce the barriers to evidence utilization by health policy makers. This study examined the barriers and facilitators in developing evidence-based health policy documents from the perspective of their producers in a developing country. Methods: In a qualitative study with a framework analysis approach, we conducted semi-structured interviews using purposive and snowball sampling. A qualitative analysis software (MAXQDA-10) was used to apply the codes and manage the data. This study was theory-based and the results were compared to exploratory studies about the factors influencing evidence-based health policymaking. Results: 18 codes and three main themes of behavioral, normative, and control beliefs were identified. Factors that influence the development of evidence-based policy documents were identified by the participants: behavioral beliefs included quality of policy documents, use of resources, knowledge and innovation, being time-consuming and contextualization; normative beliefs included policy authorities, policymakers, policy administrators, and co-workers; and control beliefs included recruitment policy, performance management, empowerment, management stability, physical environment, access to evidence, policy making process, and effect of other factors. Conclusion: Most of the cited barriers to the development of evidence-based policy were related to control beliefs, i.e. barriers at the organizational and health system levels. This study identified the factors that influence the development of evidence-based policy documents based on the components of the theory of planned behavior. But in exploratory studies on evidence utilization by health policymakers, the identified factors were only related to control behaviors. This suggests that the theoretical approach may be preferable to the exploratory approach in identifying the barriers and facilitators of a behavior. PMID:24762343
Overweight and obesity in India: policy issues from an exploratory multi-level analysis.
Siddiqui, Md Zakaria; Donato, Ronald
2016-06-01
This article analyses a nationally representative household dataset, the National Family Health Survey (NFHS-3) conducted in 2005 to 2006, to examine factors influencing the prevalence of overweight/obesity in India. The dataset was disaggregated into four sub-population groups (urban and rural females and males) and multi-level logit regression models were used to estimate the impact of particular covariates on the likelihood of overweight/obesity. The multi-level modelling approach aimed to identify individual and macro-level contextual factors influencing this health outcome. In contrast to most studies on low-income developing countries, the findings reveal that education for females beyond a particular level of educational attainment exhibits a negative relationship with the likelihood of overweight/obesity. This relationship was not observed for males. Muslim females and all Sikh sub-populations have a higher likelihood of overweight/obesity, suggesting the importance of socio-cultural influences. The results also show that the relationship between wealth and the probability of overweight/obesity is stronger for males than females, highlighting the differential impact of increasing socio-economic status on gender. Multi-level analysis reveals that states exerted an independent influence on the likelihood of overweight/obesity beyond individual-level covariates, reflecting the importance of spatially related contextual factors on overweight/obesity. While this study does not disentangle macro-level 'obesogenic' environmental factors from socio-cultural network influences, the results highlight the need to refrain from adopting a 'one size fits all' policy approach in addressing the overweight/obesity epidemic facing India. Instead, policy implementation requires a more nuanced and targeted approach that incorporates the growing recognition of socio-cultural and spatial contextual factors impacting on healthy behaviours.
Chen, Gang; Adleman, Nancy E; Saad, Ziad S; Leibenluft, Ellen; Cox, Robert W
2014-10-01
All neuroimaging packages can handle group analysis with t-tests or general linear modeling (GLM). However, they are quite hamstrung when there are multiple within-subject factors or when quantitative covariates are involved in the presence of a within-subject factor. In addition, sphericity is typically assumed for the variance-covariance structure when there are more than two levels in a within-subject factor. To overcome such limitations in the traditional AN(C)OVA and GLM, we adopt a multivariate modeling (MVM) approach to analyzing neuroimaging data at the group level with the following advantages: a) there is no limit on the number of factors as long as sample sizes are deemed appropriate; b) quantitative covariates can be analyzed together with within-subject factors; c) when a within-subject factor is involved, three testing methodologies are provided: traditional univariate testing (UVT) with sphericity assumption (UVT-UC) and with correction when the assumption is violated (UVT-SC), and within-subject multivariate testing (MVT-WS); d) to correct for sphericity violation at the voxel level, we propose a hybrid testing (HT) approach that achieves equal or higher power via combining traditional sphericity correction methods (Greenhouse-Geisser and Huynh-Feldt) with MVT-WS. To validate the MVM methodology, we performed simulations to assess the controllability for false positives and power achievement. A real FMRI dataset was analyzed to demonstrate the capability of the MVM approach. The methodology has been implemented into an open source program 3dMVM in AFNI, and all the statistical tests can be performed through symbolic coding with variable names instead of the tedious process of dummy coding. Our data indicate that the severity of sphericity violation varies substantially across brain regions. The differences among various modeling methodologies were addressed through direct comparisons between the MVM approach and some of the GLM implementations in the field, and the following two issues were raised: a) the improper formulation of test statistics in some univariate GLM implementations when a within-subject factor is involved in a data structure with two or more factors, and b) the unjustified presumption of uniform sphericity violation and the practice of estimating the variance-covariance structure through pooling across brain regions.
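The sphericity corrections the paper combines rest on the Greenhouse-Geisser epsilon, computable from the double-centered covariance of the repeated measures. A minimal sketch, assuming a subjects-by-levels data matrix; the simulated non-spherical data are illustrative.

```python
import numpy as np

def gg_epsilon(data: np.ndarray) -> float:
    """Greenhouse-Geisser epsilon for an (n_subjects, k_levels) data matrix."""
    k = data.shape[1]
    S = np.cov(data, rowvar=False)
    # Double-center the covariance matrix of the repeated measures.
    H = np.eye(k) - np.ones((k, k)) / k
    Sc = H @ S @ H
    # epsilon = (sum of eigenvalues)^2 / ((k-1) * sum of squared eigenvalues).
    return np.trace(Sc) ** 2 / ((k - 1) * np.trace(Sc @ Sc))

rng = np.random.default_rng(21)
n, k = 40, 4
# Induce non-sphericity: level-specific variances differ strongly.
data = rng.normal(size=(n, k)) * np.array([1.0, 1.5, 2.5, 4.0])
eps = gg_epsilon(data)
print(f"epsilon = {eps:.3f}  (1.0 = spherical; multiply df by epsilon to correct)")
```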
Crashes of instructional flights : analysis of cases and remedial approaches.
DOT National Transportation Integrated Search
1996-02-01
Instructional flights experience more than 300 crashes annually and are involved in more than one-third of all midair collisions. Research was undertaken to identify the circumstances of instructional crashes and describe factors related to...
System analysis of automated speed enforcement implementation.
DOT National Transportation Integrated Search
2016-04-01
Speeding is a major factor in a large proportion of traffic crashes, injuries, and fatalities in the United States. Automated Speed Enforcement (ASE) is one of many approaches shown to be effective in reducing speeding violations and crashes. However...
Galyean, Anne A; Filliben, James J; Holbrook, R David; Vreeland, Wyatt N; Weinberg, Howard S
2016-11-18
Asymmetric flow field-flow fractionation (AF4) has several instrumental factors that may have a direct effect on separation performance. A sensitivity analysis was applied to ascertain the relative importance of AF4 primary instrument factor settings for the separation of a complex environmental sample. The analysis evaluated the impact of instrumental factors, namely cross flow, ramp time, focus flow, injection volume, and run buffer concentration, on the multi-angle light scattering measurement of natural organic matter (NOM) molar mass (MM). A 2^(5-1) orthogonal fractional factorial design was used to minimize analysis time while preserving the accuracy and robustness in the determination of the main effects and interactions between any two instrumental factors. By assuming that separations resulting in smaller MM measurements would be more accurate, the analysis produced a ranked list of effect estimates for factors and interactions of factors based on their relative importance in minimizing the MM. The most important and statistically significant AF4 instrumental factors were buffer concentration and cross flow; the least important was ramp time. A parallel 2^(5-2) orthogonal fractional factorial design was also employed on five environmental factors for synthetic natural water samples containing silver nanoparticles (NPs), namely: NP concentration, NP size, NOM concentration, specific conductance, and pH. None of the water quality characteristic effects or interactions were found to be significant in minimizing the measured MM; however, the interaction between NP concentration and NP size was an important effect when considering NOM recovery. This work presents a structured approach for the rigorous assessment of AF4 instrument factors and optimal settings for the separation of complex samples utilizing an efficient orthogonal fractional factorial design and appropriate graphical analysis.
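A 2^(5-1) design can be generated by crossing four factors fully and defining the fifth by a generator column; with E = ABCD the design has resolution V, so main effects are clear of two-factor interactions. The factor names echo the abstract, but the responses below are placeholders.

```python
import itertools
import numpy as np

# Build the 2^(5-1) design: full 2^4 in the first four factors, fifth column
# generated as E = ABCD (defining relation I = ABCDE, resolution V).
factors = ["crossflow", "ramp", "focus", "inj_vol", "buffer"]
base = np.array(list(itertools.product([-1, 1], repeat=4)))   # 16 runs x 4 cols
E = base.prod(axis=1, keepdims=True)                          # generated column
design = np.hstack([base, E])
print(design.shape)                                           # (16, 5)

# Effect estimate for each factor = mean response at +1 minus mean at -1.
y = np.random.default_rng(2).normal(size=16)                  # placeholder responses
for name, col in zip(factors, design.T):
    print(f"{name:9s} effect = {y[col == 1].mean() - y[col == -1].mean():+.3f}")
```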
Kel, Alexander E
2017-02-01
Computational analysis of master regulators, through the search for transcription factor binding sites followed by analysis of the signal transduction networks of a cell, is a new approach to causal analysis of multi-omics data. This paper contains results of an analysis of multi-omics data that include transcriptomics, proteomics and epigenomics data from a methotrexate (MTX)-resistant colon cancer cell line. The data were used to analyze mechanisms of resistance and to predict potential drug targets and promising compounds for reverting the MTX resistance of these cancer cells. We present all results of the analysis, including the lists of identified transcription factors and their binding sites in the genome and the list of predicted master regulators, i.e. potential drug targets. These data were generated in the study recently published in the article "Multi-omics "Upstream Analysis" of regulatory genomic regions helps identifying targets against methotrexate resistance of colon cancer" (Kel et al., 2016) [4]. These data are of interest for researchers in the field of multi-omics data analysis and for biologists interested in the identification of novel drug targets against MTX resistance.
Replica Analysis for Portfolio Optimization with Single-Factor Model
NASA Astrophysics Data System (ADS)
Shinzato, Takashi
2017-06-01
In this paper, we use replica analysis to investigate the influence of correlation among the return rates of assets on the solution of the portfolio optimization problem. We consider the behavior of the optimal solution when the return rates are described by a single-factor model and compare the findings with those obtained under independent return rates. We then analytically assess the increase in the investment risk when correlation is included. Furthermore, we compare our approach with analytical procedures from operations research for minimizing the investment risk.
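The flavor of the comparison can be reproduced numerically: simulate returns with and without a common factor and evaluate the minimal investment risk of the budget-constrained portfolio. The sketch below uses the analytic minimizer of w'Cw under 1'w = N; all sizes and parameters are illustrative, and this is a simulation, not the replica calculation.

```python
import numpy as np

rng = np.random.default_rng(17)
N, T = 500, 1000                      # assets and return observations

# Single-factor return model: x_i(t) = beta_i * f(t) + eps_i(t); values illustrative.
beta = rng.normal(1.0, 0.3, N)
f = rng.normal(size=T)
X_factor = np.outer(f, beta) + rng.normal(size=(T, N))
X_indep = rng.normal(size=(T, N))     # comparison case: independent return rates

def min_risk(X):
    """Minimal investment risk w'Cw/(2N) under the budget constraint 1'w = N."""
    C = np.cov(X, rowvar=False)
    Cinv = np.linalg.inv(C)
    ones = np.ones(C.shape[0])
    w = C.shape[0] * Cinv @ ones / (ones @ Cinv @ ones)   # analytic minimizer
    return w @ C @ w / (2 * C.shape[0])

# Expect the factor-correlated case to carry the higher minimal risk.
print(f"minimal risk with factor correlation: {min_risk(X_factor):.3f}")
print(f"minimal risk with independent returns: {min_risk(X_indep):.3f}")
```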
Integrated Analysis of Transcriptomic and Proteomic Data
Haider, Saad; Pal, Ranadip
2013-01-01
Until recently, understanding the regulatory behavior of cells has been pursued through independent analysis of the transcriptome or the proteome. Based on the central dogma, it was generally assumed that there exists a direct correspondence between mRNA transcripts and generated protein expressions. However, recent studies have shown that the correlation between mRNA and protein expressions can be low due to various factors such as differing half-lives and post-transcriptional machinery. Thus, a joint analysis of the transcriptomic and proteomic data can provide useful insights that may not be deciphered from individual analysis of mRNA or protein expressions. This article reviews the existing major approaches for joint analysis of transcriptomic and proteomic data. We categorize the different approaches into eight main categories based on the initial algorithm and final analysis goal. We further present analogies with other domains and discuss the existing research problems in this area. PMID:24082820
The Importance of Proving the Null
Gallistel, C. R.
2010-01-01
Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is favored. A general solution is a sensitivity analysis: Compute the odds for or against the null as a function of the limit(s) on the vagueness of the alternative. If the odds on the null approach 1 from above as the hypothesized maximum size of the possible effect approaches 0, then the data favor the null over any vaguer alternative to it. The simple computations and the intuitive graphic representation of the analysis are illustrated by the analysis of diverse examples from the current literature. They pose 3 common experimental questions: (a) Are 2 means the same? (b) Is performance at chance? (c) Are factors additive? PMID:19348549
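The suggested sensitivity analysis is easy to implement: sweep the limit L on the size of the possible effect and compute the odds for the null against a Uniform(-L, L) alternative. A minimal sketch for question (a), "are two means the same?", with simulated difference scores; as L shrinks toward 0 the odds approach 1 from above, as described.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def null_odds(xbar, se, L):
    """Odds for H0: mu = 0 against H1: mu ~ Uniform(-L, L), normal likelihood."""
    like0 = norm.pdf(xbar, 0, se)
    like1, _ = quad(lambda mu: norm.pdf(xbar, mu, se) / (2 * L), -L, L)
    return like0 / like1

# Hypothetical data: 25 difference scores with true mean zero.
rng = np.random.default_rng(8)
d = rng.normal(0.0, 1.0, 25)
xbar, se = d.mean(), d.std(ddof=1) / np.sqrt(len(d))

# Sensitivity analysis: sweep the hypothesized maximum effect size L.
for L in [0.1, 0.25, 0.5, 1.0, 2.0]:
    print(f"L = {L:4.2f}: odds for the null = {null_odds(xbar, se, L):6.2f}")
```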
Akterian, S G; Fernandez, P S; Hendrickx, M E; Tobback, P P; Periago, P M; Martinez, A
1999-03-01
A risk analysis was applied to experimental heat resistance data. This analysis is an approach for processing experimental thermobacteriological data in order to study the variability of the D and z values of target microorganisms over the range of deviations of environmental factors, to determine the critical factors, and to specify their critical tolerances. The analysis is based on sets of sensitivity functions applied to a specific set of experimental data on the thermoresistance of Clostridium sporogenes and Bacillus stearothermophilus spores. The effects of the following factors were analyzed: the type of target microorganism; the nature of the heating substrate; pH; temperature; the type of acid employed; and the NaCl concentration. The type of target microorganism to be inactivated, the nature of the substrate (reference or real food) and the heating temperature were identified as critical factors, determining about 90% of the alteration of the microbiological risk. The effects of the type of acid used for the acidification of products and of the NaCl concentration can be assumed to be negligible for the purposes of engineering calculations. The critical non-uniformity in temperature during thermobacteriological studies was set at 0.5%, and the critical tolerances of the pH value and NaCl concentration were 5%. These results relate to a specific case study, so their direct generalization is not appropriate.
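The role of temperature as a critical factor can be seen directly from the z-value model, D(T) = D_ref * 10^((T_ref - T)/z): a 0.5% deviation at roughly 115°C shifts D by about 14% when z = 10°C. A minimal sketch with illustrative reference values, not the paper's sensitivity functions.

```python
def d_value(T, D_ref=1.0, T_ref=121.1, z=10.0):
    """Decimal reduction time (min) at temperature T via the z-value model.
    Reference values are illustrative, not taken from the paper."""
    return D_ref * 10 ** ((T_ref - T) / z)

T0 = 115.0
for dT in [0.0, 0.005 * T0]:            # 0.5% non-uniformity in temperature
    print(f"T = {T0 + dT:6.2f} C  ->  D = {d_value(T0 + dT):.3f} min")
# A 0.5% deviation (~0.58 C) changes D by a factor of 10^(0.58/10) ~ 1.14,
# which is why temperature emerges as a critical factor in the risk analysis.
```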
ERIC Educational Resources Information Center
O'Brien, Terry; Cronin, Kieran
2016-01-01
The purpose of this paper is to quantify, review, and analyze published research output of academic librarians from 21 higher education Institutions in Ireland. A mixed approach using an online survey questionnaire, supplemented by content analysis and extensive literature scoping were used for data collection. Factors inhibiting and predicting…
Analysis Matrix of Resilience in the Face of Disability, Old Age and Poverty
ERIC Educational Resources Information Center
Cardenas, Andrea; Lopez, Lucero
2010-01-01
The purpose of this article is to describe the process of the development of the "Resilience Theoretical Analysis Matrix" (RTAM) (or in its Spanish translation: MATR), a tool designed to facilitate a coherent and organised approach to the assessment of a wide spectrum of factors influencing the development of resilience in the face of disability,…
NASA Astrophysics Data System (ADS)
Khan, Sahubar Ali Mohd. Nadhar; Ramli, Razamin; Baten, M. D. Azizul
2015-12-01
Agricultural production typically yields two types of outputs: economically desirable outputs and environmentally undesirable outputs (such as greenhouse gas emissions, nitrate leaching, effects on humans and organisms, and water pollution). In efficiency analysis, these undesirable outputs cannot be ignored and must be included in order to obtain an accurate estimate of firms' efficiency. Additionally, climatic factors as well as data uncertainty can significantly affect the efficiency analysis. A number of approaches have been proposed in the DEA literature to account for undesirable outputs. Many researchers have argued that the directional distance function (DDF) approach is the best, as it allows for a simultaneous increase in desirable outputs and reduction of undesirable outputs. Additionally, the interval data approach has been found to be the most suitable way to account for data uncertainty, as it is much simpler to model and needs less information regarding distributions and membership functions. In this paper, an enhanced DEA model based on the DDF approach that considers undesirable outputs as well as climatic factors and interval data is proposed. This model will be used to determine the efficiency of rice farmers who produce undesirable outputs and operate under uncertainty. It is hoped that the proposed model will provide a better estimate of rice farmers' efficiency.
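The directional distance function underlying such a model can be written as a small linear program: expand desirable outputs by (1+beta) and contract undesirable outputs by (1-beta) within the production set. A minimal crisp-data sketch with scipy (the paper's interval-data and climatic extensions are not implemented); the four "farms" are invented.

```python
import numpy as np
from scipy.optimize import linprog

def ddf_score(X, Yg, Yb, j0):
    """Directional distance beta for DMU j0 with direction g = (0, y_g0, y_b0)."""
    J = X.shape[0]
    x0, yg0, yb0 = X[j0], Yg[j0], Yb[j0]
    c = np.r_[-1.0, np.zeros(J)]                       # variables: [beta, lambda_1..J]
    # Inputs: sum_j lambda_j x_j <= x0
    A_ub = np.c_[np.zeros((X.shape[1], 1)), X.T]
    b_ub = x0.copy()
    # Desirable: sum_j lambda_j yg_j >= (1+beta) yg0
    A_ub = np.vstack([A_ub, np.c_[yg0[:, None], -Yg.T]])
    b_ub = np.r_[b_ub, -yg0]
    # Undesirable: sum_j lambda_j yb_j = (1-beta) yb0
    A_eq = np.c_[yb0[:, None], Yb.T]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=yb0,
                  bounds=[(0, None)] * (J + 1))
    return res.x[0]

# Hypothetical rice farms: inputs (land, fertilizer), desirable output (yield),
# undesirable output (emissions); one row per farm.
X  = np.array([[2.0, 1.0], [3.0, 1.5], [2.5, 2.0], [4.0, 2.5]])
Yg = np.array([[4.0], [5.0], [3.5], [6.0]])
Yb = np.array([[1.0], [1.6], [1.8], [2.2]])
for j in range(4):
    print(f"farm {j}: beta = {ddf_score(X, Yg, Yb, j):.3f}")
```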
Dynamic safety assessment of natural gas stations using Bayesian network.
Zarei, Esmaeil; Azadeh, Ali; Khakzad, Nima; Aliabadi, Mostafa Mirzaei; Mohammadfam, Iraj
2017-01-05
Pipelines are one of the most popular and effective ways of transporting hazardous materials, especially natural gas. However, the rapid development of gas pipelines and stations in urban areas has introduced a serious threat to public safety and assets. Although different methods have been developed for risk analysis of gas transportation systems, a comprehensive methodology for risk analysis is still lacking, especially in natural gas stations. The present work is aimed at developing a dynamic and comprehensive quantitative risk analysis (DCQRA) approach for accident scenario and risk modeling of natural gas stations. In this approach, an FMEA is used for hazard analysis, while a Bow-tie diagram and Bayesian network are employed to model the worst-case accident scenario and to assess the risks. The results indicated that failure of the regulator system was the worst-case accident scenario, with human error as the most contributing factor. Thus, the risk management plan of a natural gas station should give priority to the most probable root events and main contributing factors, which were identified in the present study, in order to reduce the occurrence probability of the accident scenarios and thus alleviate the risks.
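A fragment of such a Bayesian-network bow-tie can be sketched with pgmpy (assuming it is installed; newer versions may name the model class DiscreteBayesianNetwork). The structure and all probabilities below are invented for illustration, not the study's FMEA-derived values. The diagnostic query shows the kind of updating used to rank contributing factors after an observed failure.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Tiny illustrative fragment: regulator failure depends on human error
# and mechanical wear (states: 0 = absent, 1 = present).
model = BayesianNetwork([("HumanError", "RegulatorFail"),
                         ("MechWear", "RegulatorFail")])
model.add_cpds(
    TabularCPD("HumanError", 2, [[0.9], [0.1]]),
    TabularCPD("MechWear", 2, [[0.95], [0.05]]),
    TabularCPD("RegulatorFail", 2,
               # columns: (HE=0,MW=0), (HE=0,MW=1), (HE=1,MW=0), (HE=1,MW=1)
               [[0.999, 0.90, 0.80, 0.50],    # P(no failure | parents)
                [0.001, 0.10, 0.20, 0.50]],   # P(failure | parents)
               evidence=["HumanError", "MechWear"], evidence_card=[2, 2]))
assert model.check_model()

infer = VariableElimination(model)
print(infer.query(["RegulatorFail"]).values)                   # prior failure risk
# Diagnostic update: given an observed failure, which factor is more likely?
print(infer.query(["HumanError"], evidence={"RegulatorFail": 1}).values)
```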
Kukafka, Rita; Johnson, Stephen B; Linfante, Allison; Allegrante, John P
2003-06-01
Many interventions to improve the success of information technology (IT) implementations are grounded in behavioral science, using theories and models to identify conditions and determinants of successful use. However, each model in the IT literature has evolved to address specific theoretical problems of particular disciplinary concerns, and each model has been tested and has evolved using, in most cases, a more or less restricted set of IT implementation procedures. Functionally, this limits the perspective for taking into account the multiple factors at the individual, group, and organizational levels that influence use behavior. While a rich body of literature has emerged, employing prominent models such as the Technology Adoption Model, Social-Cognitive Theory, and Diffusion of Innovation Theory, the complexity of defining a suitable multi-level intervention has largely been overlooked. A gap exists between the implementation of IT and the integration of theories and models that can be utilized to develop multi-level approaches to identify factors that impede usage behavior. We present a novel framework that is intended to guide synthesis of more than one theoretical perspective for the purpose of planning multi-level interventions to enhance IT use. This integrative framework is adapted from PRECEDE/PROCEED, a conceptual framework used by health planners in hundreds of published studies to direct interventions that account for the multiple determinants of behavior. Since we claim that the literature on IT use behavior does not now include a multi-level approach, we undertook a systematic literature analysis to confirm this assertion. Our framework facilitated organizing this literature synthesis, and our analysis was aimed at determining whether the IT implementation approaches in the published literature considered at least two levels of IT usage determinants. We found that while 61% of studies mentioned or referred to theory, none considered two or more levels. In other words, although the researchers employ behavioral theory, they omit two fundamental propositions: (1) IT usage is influenced by multiple factors and (2) interventions must be multi-dimensional. Our literature synthesis may provide additional insight into the reason for high failure rates associated with underutilized systems, and underscores the need to move beyond the current dominant approach that employs a single model to guide IT implementation plans that aim to address factors associated with IT acceptance and subsequent positive use behavior.
Abate, Massimo Eraldo; Sánchez, Olga Escobosa; Boschi, Rita; Raspanti, Cinzia; Loro, Loretta; Affinito, Domenico; Cesari, Marilena; Paioli, Anna; Palmerini, Emanuela; Ferrari, Stefano
2014-01-01
The incidence of central venous catheter (CVC)-related complications reported in pediatric sarcoma patients is not established, as reports in the available literature are limited. The analysis of risk factors is part of the strategy to reduce the incidence of CVC complications. The objective of this study was to determine the incidence of CVC complications in children with bone sarcomas and whether defined clinical variables represent risk factors. During an 8-year period, 155 pediatric patients with bone sarcomas were prospectively followed up for CVC complications. Incidence and correlation with clinical features including gender, age, body mass index, histology, disease stage, and use of thromboprophylaxis with low-molecular-weight heparin were analyzed. Thirty-three CVC complications were recorded over 42 687 CVC-days (0.77 per 1000 CVC-days). No correlation between the specific clinical variables and the CVC complications was found. A high incidence of CVC-related sepsis secondary to gram-negative bacteria was observed. The analysis of CVC complications and their potential risk factors in this sizable and relatively homogeneous pediatric population with bone sarcomas led to the implementation of a multimodal approach by doctors and nurses to reduce the incidence and morbidity of CVC-related infections, particularly those related to gram-negative bacteria. As a result of this joint medical and nursing study, a multimodal approach was implemented that included equipping faucets with water filters, the reeducation of doctors and nurses, and the systematic review of the CVC protocol.
Experiences of Cigarette Smoking among Iranian Educated Women: A Qualitative Study.
Baheiraei, Azam; Mirghafourvand, Mojgan; Mohammadi, Eesa; Majdzadeh, Reza
2016-01-01
Smoking is a well-known public health problem among women as well as men. In many countries, including Iran, tobacco use among women is increasing. Exploring the experience of smoking among educated women is necessary in order to develop effective tobacco prevention programs for them. This study aimed to explore the experiences of smoking among Iranian educated women. It used qualitative content analysis with in-depth individual, semi-structured interviews of a purposefully selected sample of 14 educated female smokers. Data were analyzed using a conventional qualitative content analysis approach, concurrently with collection. The data analysis led to 16 subcategories grouped into four main categories: (1) personal factors, including the subcategories of imitation, show-off and independence, inexperience and curiosity, personal interest and desire, improved mood, and social defiance; (2) family factors, including smokers in the family, intrafamily conflicts, and family strictures and limitations; (3) social factors, including the effects of work and school environments, gender equality symbols, peer pressure, and acceptance among friends; and (4) negative consequences of smoking, including a sense of being physically hurt, psychological and emotional stress, and being looked upon in a negative and judgmental manner. The findings of this study showed that smoking among Iranian educated women is a multifactorial problem. Thus, it is necessary to address smoking among educated women with a holistic approach that focuses on different determinants, including personal, family, and social factors, particularly gender roles and stereotypes.
Yang, Miyi; Xi, Xuefei; Wu, Xiaoling; Lu, Runhua; Zhou, Wenfeng; Zhang, Sanbing; Gao, Haixiang
2015-02-13
A novel microextraction technique combining magnetic solid-phase microextraction (MSPME) with ionic liquid dispersive liquid-liquid microextraction (IL-DLLME) to determine four fungicides is presented in this work for the first time. The main factors affecting the extraction efficiency were optimized by the one-factor-at-a-time approach, and the impacts of these factors were studied by an orthogonal design. Without a tedious clean-up procedure, analytes were extracted from the sample to the adsorbent and organic solvent and then desorbed in acetonitrile prior to chromatographic analysis. Under the optimum conditions, good linearity and high enrichment factors were obtained for all analytes, with correlation coefficients ranging from 0.9998 to 1.0000 and enrichment factors ranging from 135- to 159-fold. Recoveries for the proposed approach were between 98% and 115%, limits of detection were between 0.02 and 0.04 μg L(-1), and RSDs ranged from 2.96% to 4.16%. The method was successfully applied in the analysis of four fungicides (azoxystrobin, chlorothalonil, cyprodinil and trifloxystrobin) in environmental water samples. The recoveries for the real water samples ranged between 81% and 109%. The procedure proved to be a time-saving, environmentally friendly, and efficient analytical technique. Copyright © 2015 Elsevier B.V. All rights reserved.
Kozinszky, Zoltan; Töreki, Annamária; Hompoth, Emőke A; Dudas, Robert B; Németh, Gábor
2017-04-01
We endeavoured to analyze the factor structure of the Edinburgh Postnatal Depression Scale (EPDS) during a screening programme in Hungary, using exploratory (EFA) and confirmatory factor analysis (CFA), testing both previously published models and newly developed theory-driven ones, after a critical analysis of the literature. Between April 2011 and January 2015, a sample of 2967 pregnant women (between 12th and 30th weeks of gestation) and 714 women 6 weeks after delivery completed the Hungarian version of the EPDS in South-East Hungary. EFAs suggested unidimensionality in both samples. 33 out of 42 previously published models showed good and 6 acceptable fit with our antepartum data in CFAs, whilst 10 of them showed good and 28 acceptable fit in our postpartum sample. Using multiple fit indices, our theory-driven anhedonia (items 1,2) - anxiety (items 4,5) - low mood (items 8,9) model provided the best fit in the antepartum sample. In the postpartum sample, our theory-driven models were again among the best performing models, including an anhedonia and an anxiety factor together with either a low mood or a suicidal risk factor (items 3,6,10). The EPDS showed moderate within- and between-culture invariability, although this would also need to be re-examined with a theory-driven approach. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
Sacks, Jason D; Ito, Kazuhiko; Wilson, William E; Neas, Lucas M
2012-10-01
With the advent of multicity studies, uniform statistical approaches have been developed to examine air pollution-mortality associations across cities. To assess the sensitivity of the air pollution-mortality association to different model specifications in a single and multipollutant context, the authors applied various regression models developed in previous multicity time-series studies of air pollution and mortality to data from Philadelphia, Pennsylvania (May 1992-September 1995). Single-pollutant analyses used daily cardiovascular mortality, fine particulate matter (particles with an aerodynamic diameter ≤2.5 µm; PM(2.5)), speciated PM(2.5), and gaseous pollutant data, while multipollutant analyses used source factors identified through principal component analysis. In single-pollutant analyses, risk estimates were relatively consistent across models for most PM(2.5) components and gaseous pollutants. However, risk estimates were inconsistent for ozone in all-year and warm-season analyses. Principal component analysis yielded factors with species associated with traffic, crustal material, residual oil, and coal. Risk estimates for these factors exhibited less sensitivity to alternative regression models compared with single-pollutant models. Factors associated with traffic and crustal material showed consistently positive associations in the warm season, while the coal combustion factor showed consistently positive associations in the cold season. Overall, mortality risk estimates examined using a source-oriented approach yielded more stable and precise risk estimates, compared with single-pollutant analyses.
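To make the source-apportionment step above concrete, here is a minimal sketch of deriving source factors from a species-by-day pollutant matrix with principal component analysis; all names and data are hypothetical placeholders, not the study's dataset.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical daily matrix: rows = days, columns = measured species
# (e.g., elemental carbon, Si, Ni, V as tracers of traffic, crustal,
# residual oil, and coal sources).
X = np.random.lognormal(size=(1200, 8))

Z = StandardScaler().fit_transform(X)   # standardize each species
pca = PCA(n_components=4).fit(Z)        # retain four source factors

loadings = pca.components_              # species loadings identify sources
scores = pca.transform(Z)               # daily factor scores, usable as
                                        # covariates in a time-series
                                        # mortality regression
print(pca.explained_variance_ratio_)
```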
Williams, Mary R; Sigman, Michael E; Lewis, Jennifer; Pitan, Kelly McHugh
2012-10-10
A Bayesian soft classification method combined with target factor analysis (TFA) is described and tested for the analysis of fire debris data. The method relies on analysis of the average mass spectrum across the chromatographic profile (i.e., the total ion spectrum, TIS) from multiple samples taken from a single fire scene. A library of TIS from reference ignitable liquids with assigned ASTM classification is used as the target factors in TFA. The class-conditional distributions of correlations between the target and predicted factors for each ASTM class are represented by kernel functions and analyzed by Bayesian decision theory. The soft classification approach assists in assessing the probability that ignitable liquid residue from a specific ASTM E1618 class is present in a set of samples from a single fire scene, even in the presence of unspecified background contributions from pyrolysis products. The method is demonstrated with sample data sets and then tested on laboratory-scale burn data and large-scale field test burns. The overall performance achieved in laboratory and field tests of the method is approximately 80% correct classification of fire debris samples. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
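The decision step described here, kernel-density class-conditional distributions of TFA correlations combined by Bayes' rule, can be sketched in a few lines. The class names, priors, and correlation data below are hypothetical stand-ins, not the authors' reference library.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Map each ASTM class to correlations observed for reference liquids of
# that class (placeholder training data).
corr_by_class = {
    "gasoline": np.random.beta(8, 2, 200),
    "distillate": np.random.beta(5, 3, 200),
}
kdes = {c: gaussian_kde(r) for c, r in corr_by_class.items()}
priors = {c: 1.0 / len(kdes) for c in kdes}   # uniform class priors

def posterior(corr):
    """Soft class membership for one observed TFA correlation."""
    lik = {c: float(k(corr)) for c, k in kdes.items()}
    z = sum(priors[c] * lik[c] for c in lik)
    return {c: priors[c] * lik[c] / z for c in lik}

print(posterior(0.85))   # probability of each class given the correlation
```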
From Physical Process to Economic Cost - Integrated Approaches of Landslide Risk Assessment
NASA Astrophysics Data System (ADS)
Klose, M.; Damm, B.
2014-12-01
The nature of landslides is complex in many respects, with landslide hazard and impact being dependent on a variety of factors. This obviously requires an integrated assessment for a fundamental understanding of landslide risk. Integrated risk assessment, according to the approach presented in this contribution, implies combining prediction of future landslide occurrence with analysis of landslide impact in the past. A critical step in assessing landslide risk from an integrated perspective is to analyze what types of landslide damage affected people and property in which way, and how people contributed and responded to these damage types. In integrated risk assessment, the focus is on systematic identification and monetization of landslide damage, and analytical tools that allow deriving economic costs from physical landslide processes are at the heart of this approach. The broad spectrum of landslide types and process mechanisms, as well as the nonlinearity between landslide magnitude, damage intensity, and direct costs, are among the main factors explaining recent challenges in risk assessment. The two prevailing approaches for assessing the impact of landslides in economic terms are cost survey (ex-post) and risk analysis (ex-ante). The two approaches can complement each other, yet a combination of them has not been realized so far. It is common practice today to derive landslide risk without considering landslide process-based cause-effect relationships, since integrated concepts or new modeling tools expanding conventional methods are still widely missing. The approach introduced in this contribution is based on a systematic framework that combines cost survey and GIS-based tools for hazard or cost modeling with methods to assess interactions between land use practices and landslides in historical perspective. A fundamental understanding of landslide risk also requires knowledge about the economic and fiscal relevance of landslide losses, which is why analysis of their impact on public budgets is a further component of this approach. In integrated risk assessment, the combination of methods plays an important role, with the objective of collecting and integrating complex data sets on landslide risk.
NASA Astrophysics Data System (ADS)
Maleque, M. A.; Bello, K. A.; Adebisi, A. A.; Akma, N.
2017-03-01
The tungsten inert gas (TIG) torch is one of the most recently adopted heat sources for surface modification of engineering parts, giving results similar to the more expensive high-power laser technique. In this study, a ceramic-based embedded composite coating has been produced from precoated silicon carbide (SiC) powders on an AISI 4340 low alloy steel substrate using the TIG welding torch process. A design of experiment based on the Taguchi approach was adopted to optimize the TIG cladding process parameters. The L9 orthogonal array and the signal-to-noise ratio were used to study the effect of TIG welding parameters such as arc current, travelling speed, welding voltage and argon flow rate on the tribological response behaviour (wear rate, surface roughness and wear track width). The objective of the study was to identify the optimal design parameters that significantly minimize each of the surface quality characteristics. The analysis of the experimental results revealed that the argon flow rate was the most influential factor contributing to minimum wear and surface roughness of the modified coating surface. On the other hand, the key factor in reducing wear scar was the welding voltage. Finally, the convenient and economical Taguchi approach used in this study proved efficient for finding optimal factor settings that minimize the wear rate, wear scar and surface roughness responses of TIG-coated surfaces.
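As a sketch of the Taguchi analysis named above, the snippet below computes the smaller-is-better signal-to-noise ratio for each run of an L9-style experiment; the wear-rate replicates are invented for illustration.

```python
import numpy as np

# Smaller-is-better signal-to-noise ratio, as used in Taguchi analysis of
# wear rate, wear track width, and surface roughness:
#   S/N = -10 * log10(mean(y^2))
def sn_smaller_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical wear-rate replicates for the 9 runs of an L9 array
runs = [
    [0.12, 0.14], [0.09, 0.10], [0.15, 0.16],
    [0.11, 0.12], [0.08, 0.09], [0.13, 0.14],
    [0.10, 0.11], [0.07, 0.08], [0.14, 0.15],
]
sn = [sn_smaller_is_better(r) for r in runs]
# Averaging S/N over the runs at each level of a factor gives its main
# effect; the level with the highest S/N is optimal for that factor.
print(sn)
```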
NASA Technical Reports Server (NTRS)
Hale, Joseph P., II
1994-01-01
Human Factors Engineering support was provided for the 30% design review of the late Space Station Freedom Payload Control Area (PCA). The PCA was to be the payload operations control room, analogous to the Spacelab Payload Operations Control Center (POCC). This effort began with a systematic collection and refinement of the relevant requirements driving the spatial layout of the consoles and PCA. This information was used as input for specialized human factors analytical tools and techniques in the design and design analysis activities. Design concepts and configuration options were developed and reviewed using sketches, 2-D Computer-Aided Design (CAD) drawings, and immersive Virtual Reality (VR) mockups.
Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia
1996-01-01
The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.
Testing students' e-learning via Facebook through Bayesian structural equation modeling.
Salarzadeh Jenatabadi, Hashem; Moghavvemi, Sedigheh; Wan Mohamed Radzi, Che Wan Jasimah Bt; Babashamsi, Parastoo; Arashi, Mohammad
2017-01-01
Learning is an intentional activity, with several factors affecting students' intention to use new learning technology. Researchers have investigated technology acceptance in different contexts by developing various theories/models and testing them by a number of means. Although most theories/models developed have been examined through regression or structural equation modeling, Bayesian analysis offers more accurate data analysis results. To address this gap, the unified theory of acceptance and use of technology in the context of e-learning via Facebook is re-examined in this study using Bayesian analysis. The data (S1 Data) were collected from 170 students enrolled in a business statistics course at University of Malaya, Malaysia, and tested with the maximum likelihood and Bayesian approaches. The difference between the two methods' results indicates that performance expectancy and hedonic motivation are the strongest factors influencing the intention to use e-learning via Facebook. The Bayesian estimation model exhibited better data fit than the maximum likelihood estimator model. The results of the Bayesian and maximum likelihood estimator approaches are compared and the reasons for the result discrepancy are deliberated.
Indonesian Sea Accident Analysis (Case Study From 2003 – 2013)
NASA Astrophysics Data System (ADS)
Arya Dewanto, Y.; Faturachman, D.
2018-03-01
There are many accidents in sea transportation in Indonesia, most of which happen because of insufficient concern for the safety and security of the crew. In sailing, people, as transport users, interact with the ship and the surrounding environment (including other ships, cruise lines, ports, and local conditions). These interactions are sometimes very complex and relate to various aspects. Given the multiplicity of aspects related to these factors, pursuing sailing safety through a reduction in the number of accidents and in the risk of death, serious injury, and loss of transported goods cannot be achieved through a mono-sector approach; it requires a multi-sector approach. In this paper, we describe an analysis of Indonesian sea transportation accidents over eleven years, divided into four items: total ship accidents by type, ship accident factors, total casualties, and regions of ship accidents. All data were obtained from the Marine Court (Mahkamah Pelayaran). From these four items, we derive the Indonesian sea accident analysis for 2003-2013.
Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly
2015-12-18
This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detector (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in results. This could be due, in part, to the different gas chromatography (GC) conditions, the other steps involved in the method, as well as the soil properties. In addition, there are differences in the interpretation of the GC results, which impacts the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate the analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors to identify those most significantly impacting the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compared favourably with the experimental results, with a percentage difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting. A reduced analytical time offers many benefits, including improved laboratory reporting times and overall improved clean-up efficiency. The method was successfully applied for fast TPH analysis of Bunker C oil-contaminated soil. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
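A 2^(6-2) fractional factorial over six factors in 16 runs can be generated directly; the generator choice below (E = ABC, F = BCD) is one common option assumed for illustration and may differ from the design actually used in the study.

```python
import itertools
import numpy as np

# Full 2^4 design in the first four factors (A, B, C, D)
base = np.array(list(itertools.product([-1, 1], repeat=4)))
A, B, C, D = base.T
E = A * B * C          # generator for the fifth factor (assumed)
F = B * C * D          # generator for the sixth factor (assumed)
design = np.column_stack([A, B, C, D, E, F])

factors = ["inj_volume", "inj_temp", "oven_ramp",
           "det_temp", "carrier_flow", "solvent_ratio"]
for row in design[:4]:
    print(dict(zip(factors, row)))   # -1 = low level, +1 = high level

# Once the 16 runs are measured, a main effect is estimated as the
# difference between the mean responses at +1 and -1 in each column.
```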
Nayeri, Nahid Dehghan; Nazari, Ali Akbar; Salsali, Mahvash; Ahmadi, Fazlollah
2005-01-01
Background Nurses, as the largest human resource element of health care systems, have a major role in providing ongoing, high-quality care to patients. Productivity is a significant indicator of professional development within any professional group, including nurses. The human resource element has been identified as the most important factor affecting productivity. This research aimed to explore nurses' perceptions and experiences of productivity and of the human resource factors improving or impeding it. Method A qualitative approach was used to obtain rich data; open, semi-structured interviews were conducted. Sampling was based on the maximum variation approach; data analysis was carried out by content analysis, with the constant comparative method. Results Participants indicated that human resources issues are the most important factor in promoting or impeding their productivity. They suggested that the factors influencing effectiveness of human resource elements include: systematic evaluation of staff numbers; a sound selection process based on verifiable criteria; provision of an adequate staffing level throughout the year; full involvement of the ward sister in the process of admitting patients; and sound communication within the care team. Paying attention to these factors creates a suitable background for improved productivity and decreases the negative impacts of human resource shortages, whereas ignoring or interfering with them would result in lowering of nurses' productivity. Conclusion Participants maintained that satisfactory human resources can improve nurses' productivity and the quality of care they provide, thereby fulfilling the core objective of the health care system. PMID:16212672
Virtue, Shannon Myers; Pendergast, Laura; Tellez, Marisol; Waldron, Elizabeth; Ismail, Amid
2017-03-01
The aims of this study were to identify noncognitive factors that dental faculty members perceived to contribute to dental students' success and to assess dental faculty members' ratings of the relative importance of these factors to academic performance, clinical performance, and overall success. Out of 184 eligible faculty members at one U.S. dental school, 43 respondents (23.3%) completed a survey in 2015-16. The survey asked respondents to rank the importance of seven noncognitive factors to academic performance, clinical performance, and overall success. Descriptive analysis was conducted to determine the ratings on importance of each noncognitive factor. Two additional open-ended questions asked faculty members to 1) think of dental students who performed very well and list the noncognitive factors they believed contributed to those students' success and 2) identify the two most important of those factors that contributed to success. Qualitative analysis was conducted to identify themes in the open-ended responses. The respondents rated professionalism and preparedness highest in importance for overall success. Preparedness was rated highest in importance for academic performance, and communication was highest in importance for clinical performance. Six themes were identified in the open-ended responses: communication/interpersonal skills, approach to learning, personal characteristics, professionalism, diverse experiences, and technical abilities. On both open-ended items, the most frequently cited noncognitive skill was communication/interpersonal skills followed by approach to learning. In this study, dental faculty members perceived communication, preparedness, and professionalism as important skills contributing to dental students' success.
Following up patients with depression after hospital discharge: a mixed methods approach
2011-01-01
Background A medication information intervention was delivered to patients with a major depressive episode prior to psychiatric hospital discharge. Methods The objective of this study was to explore how patients evolved after hospital discharge and to identify factors influencing this evolution. Using a quasi-experimental longitudinal design, the quantitative analysis measured clinical (using the Hospital Anxiety and Depression Scale, the somatic dimension of the Symptom Checklist 90 and recording the number of readmissions) and humanistic (using the Quality of Life Enjoyment and Satisfaction Questionnaire) outcomes of patients via telephone contacts up to one year following discharge. The qualitative analysis was based on the researcher diary, consisting of reports on the telephone outcome assessment of patients with major depression (n = 99). All reports were analyzed using the thematic framework approach. Results The change in the participants' health status was as diverse as it was at hospital discharge. Participants reported remissions, changes in mood, relapses, and re-admissions (one third of patients). Quantitative data at the group level showed low anxiety, depression and somatic scores over time. Three groups of contributing factors were identified: process, individual and environmental factors. Process factors included the self-caring process, medical care after discharge, resumption of work and managing daily life. Individual factors were symptom control, medication and personality. Environmental factors were the material and social environment. Each of them could ameliorate, deteriorate or be neutral to the patient's health state. A mix of factors was observed in individual patients. Conclusions After hospital discharge, participants with a major depressive episode evolved in many different ways. Process, individual and environmental factors may influence the participant's health status following hospital discharge. Each of the factors could be positive, neutral or negative for the patient. PMID:22074732
Kim, Dokyoon; Basile, Anna O; Bang, Lisa; Horgusluoglu, Emrin; Lee, Seunggeun; Ritchie, Marylyn D; Saykin, Andrew J; Nho, Kwangsik
2017-05-18
Rapid advancement of next-generation sequencing technologies such as whole genome sequencing (WGS) has facilitated the search for genetic factors that influence disease risk in the field of human genetics. To identify rare variants associated with human diseases or traits, an efficient genome-wide binning approach is needed. In this study, we developed a novel biological knowledge-based binning approach for rare-variant association analysis and then applied the approach to structural neuroimaging endophenotypes related to late-onset Alzheimer's disease (LOAD). For rare-variant analysis, we used the knowledge-driven binning approach implemented in Bin-KAT, an automated tool, that provides 1) binning/collapsing methods for multi-level variant aggregation with a flexible, biologically informed binning strategy and 2) an option of performing unified collapsing and statistical rare variant analyses in one tool. A total of 750 non-Hispanic Caucasian participants from the Alzheimer's Disease Neuroimaging Initiative (ADNI) cohort who had both WGS data and magnetic resonance imaging (MRI) scans were used in this study. Mean bilateral cortical thickness of the entorhinal cortex extracted from MRI scans was used as an AD-related neuroimaging endophenotype. SKAT was used for genome-wide gene- and region-based association analysis of rare variants (minor allele frequency (MAF) < 0.05), with potential confounding factors for entorhinal cortex thickness (age, gender, years of education, intracranial volume (ICV) and MRI field strength) used as covariates. Significant associations were determined using FDR adjustment for multiple comparisons. Our knowledge-driven binning approach identified 16 functional exonic rare variants in FANCC significantly associated with entorhinal cortex thickness (FDR-corrected p-value < 0.05). In addition, the approach identified 7 evolutionarily conserved regions, which were mapped to FAF1, RFX7, LYPLAL1 and GOLGA3, significantly associated with entorhinal cortex thickness (FDR-corrected p-value < 0.05). In further analysis, the functional exonic rare variants in FANCC were also significantly associated with hippocampal volume and cerebrospinal fluid (CSF) Aβ 1-42 (p-value < 0.05). Our novel binning approach identified rare variants in FANCC as well as 7 evolutionarily conserved regions significantly associated with a LOAD-related neuroimaging endophenotype. FANCC (Fanconi anemia complementation group C) has been shown to modulate TLR and p38 MAPK-dependent expression of IL-1β in macrophages. Our results warrant further investigation in a larger independent cohort and demonstrate that the biological knowledge-driven binning approach is a powerful strategy to identify rare variants associated with AD and other complex diseases.
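The multiple-comparison step is standard and easy to reproduce: Benjamini-Hochberg FDR adjustment of the bin-level p-values, sketched below with placeholder values.

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

# Hypothetical gene/region-level p-values from a rare-variant test
# (e.g., SKAT); not values from the study.
pvals = np.array([2e-6, 4e-4, 0.003, 0.02, 0.4, 0.77])

reject, p_fdr, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
for p, q, sig in zip(pvals, p_fdr, reject):
    print(f"p={p:.2g}  FDR-corrected={q:.2g}  significant={sig}")
```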
NASA Astrophysics Data System (ADS)
Syed, N. H.; Rehman, A. A.; Hussain, D.; Ishaq, S.; Khan, A. A.
2017-11-01
Morphometric analysis is vital for any watershed investigation and is indispensable for flood risk assessment in sub-watershed basins. The present study was undertaken to carry out a critical evaluation and assessment of sub-watershed morphological parameters for flood risk assessment of the Central Karakorum National Park (CKNP), where a geographical information system and remote sensing (GIS & RS) approach was used for quantifying the parameters and mapping the sub-watershed units. An ASTER DEM was used as the geospatial data source for watershed delineation and the stream network. Morphometric analysis was carried out using the spatial analyst tool of ArcGIS 10.2. The parameters included bifurcation ratio (Rb), drainage texture (Rt), circularity ratio (Rc), elongation ratio (Re), drainage density (Dd), stream length (Lu), stream order (Su), slope and basin length (Lb), each calculated separately. The analysis revealed that stream orders vary from 1 to 6 and that the total number of stream segments of all orders is 52. A multi-criteria analysis process was used to calculate the risk factor. As a result, a map of sub-watershed prioritization was developed using the weighted standardized risk factor. These results help in understanding the sensitivity of different sub-watersheds of the study area to flash floods and lead to better management of the mountainous regions with respect to flash floods.
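The morphometric parameters listed have standard closed-form definitions (after Horton, Schumm and Miller); a minimal sketch with hypothetical sub-watershed measurements follows.

```python
import math

def drainage_density(total_stream_length_km, basin_area_km2):
    """Dd: total stream length per unit basin area."""
    return total_stream_length_km / basin_area_km2

def elongation_ratio(basin_area_km2, basin_length_km):
    """Re: diameter of a circle with the basin's area / basin length."""
    return (2.0 / basin_length_km) * math.sqrt(basin_area_km2 / math.pi)

def circularity_ratio(basin_area_km2, basin_perimeter_km):
    """Rc: basin area / area of a circle with the same perimeter."""
    return 4.0 * math.pi * basin_area_km2 / basin_perimeter_km ** 2

def bifurcation_ratio(n_streams_order_u, n_streams_order_u_plus_1):
    """Rb: ratio of stream counts in successive orders."""
    return n_streams_order_u / n_streams_order_u_plus_1

# Hypothetical sub-watershed measurements
print(drainage_density(48.0, 36.5), elongation_ratio(36.5, 11.2),
      circularity_ratio(36.5, 29.4), bifurcation_ratio(27, 9))
```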
NASA Astrophysics Data System (ADS)
Martyniv, Oleksandra; Kinasz, Roman
2017-10-01
This paper covers a set of basic factors that influence the architectural and spatial design of buildings for higher educational establishments (hereinafter universities). For this purpose, the factors influencing university architecture were systematized and presented. The article concludes by proposing a concept that considers universities as a hierarchical system whose elements act as factors of influence and which, through their alternating influence, lead to the main goal, namely the formation of a new university building.
ERIC Educational Resources Information Center
Barakat, Asia; Othman, Afaf
2015-01-01
The present study aims to identify the relationships between the five-factor model of personality, cognitive style (rush and prudence), and academic achievement among a sample of students. The study is based on a descriptive approach for studying the relationships between the variables of the study, the results, and their analysis. The…
R. J. Dyer; R. D. Westfall; V. L. Sork; P. E. Smouse
2004-01-01
Patterns of pollen dispersal are central to both the ecology and evolution of plant populations. However, the mechanisms controlling either the dispersal process itself or our estimation of that process may be influenced by site-specific factors such as local forest structure and nonuniform adult genetic structure. Here, we present an extension of the AMOVA model...
Trust-Based Analysis of an Air Force Collision Avoidance System
2015-12-01
Analysis revealed that test pilots' trust depended on a number of factors, including the development of a nuisance-free algorithm and the design of the fly-up evasive maneuver for terrain collision evasion. To overcome these limitations, Auto-GCAS was developed with a number of innovative approaches and solutions
a Cognitive Approach to Teaching a Graduate-Level Geobia Course
NASA Astrophysics Data System (ADS)
Bianchetti, Raechel A.
2016-06-01
Remote sensing image analysis training occurs both in the classroom and the research lab. Classroom education for traditional pixel-based image analysis has been standardized across college curricula. However, with the increasing interest in Geographic Object-Based Image Analysis (GEOBIA), there is a need to develop classroom instruction for this method of image analysis. While traditional remote sensing courses emphasize the expansion of skills and knowledge related to the use of computer-based analysis, GEOBIA courses should also examine the cognitive factors underlying visual interpretation. This paper provides an initial analysis of the development, implementation, and outcomes of a GEOBIA course that considers not only the computational methods of GEOBIA but also the cognitive factors of expertise that such software attempts to replicate. Finally, a reflection on the first instantiation of this course is presented, together with plans for the development of an open-source repository for course materials.
Ko, Dae-Hyun; Ji, Misuk; Kim, Sollip; Cho, Eun-Jung; Lee, Woochang; Yun, Yeo-Min; Chun, Sail; Min, Won-Ki
2016-01-01
The results of urine sediment analysis have been reported semiquantitatively. However, as recent guidelines recommend quantitative reporting of urine sediment, and with the development of automated urine sediment analyzers, there is an increasing need for quantitative analysis of urine sediment. Here, we developed a protocol for urine sediment analysis and quantified its results. Based on questionnaires, various reports, guidelines, and experimental results, we designed a protocol with centrifugation at 400 g for 5 min and an average concentration factor of 30. The results of this new protocol were compared with those obtained with a standardized chamber and an automated sediment analyzer, and reference intervals were also estimated using the new protocol. The correlations between the quantitative results of urine sediment analysis, the standardized chamber, and the automated sediment analyzer were generally good. The conversion factor derived from the new protocol showed a better fit with the results of manual counting than the default conversion factor in the automated sediment analyzer. In conclusion, we developed a protocol for manual urine sediment analysis that reports results quantitatively; it may provide a means for the standardization of urine sediment analysis.
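The quantitative reporting hinges on the concentration factor: a chamber count of concentrated sediment is divided by it to recover counts per microliter of native urine. A minimal sketch follows; the chamber volume and counts are hypothetical, not the protocol's actual values.

```python
# Concentration factor of 30 from the protocol: sediment is 30x more
# concentrated than the native (uncentrifuged) urine.
CONCENTRATION_FACTOR = 30.0

def cells_per_uL(cells_counted, chamber_volume_uL):
    """Convert a chamber count of concentrated sediment to cells/uL of
    the original urine."""
    concentrated = cells_counted / chamber_volume_uL
    return concentrated / CONCENTRATION_FACTOR

# e.g., 45 cells counted in a 1-uL chamber area of concentrated sediment
print(cells_per_uL(45, 1.0))   # -> 1.5 cells/uL of native urine
```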
Ye, Yusen; Gao, Lin; Zhang, Shihua
2017-01-01
Transcription factors play a key role in transcriptional regulation of genes and determination of cellular identity through combinatorial interactions. However, current studies of combinatorial regulation are limited by the lack of experimental data from the same cellular environment and by the extensive presence of data noise. Here, we adopt a Bayesian CANDECOMP/PARAFAC (CP) factorization approach (BCPF) to integrate multiple datasets in a network paradigm for determining precise TF interaction landscapes. In our first application, we apply BCPF to integrate three networks, each built from diverse datasets of multiple cell lines from ENCODE, to predict a global and precise TF interaction network. This network gives 38 novel TF interactions with distinct biological functions. In our second application, we apply BCPF to seven cell-type TF regulatory networks and predict seven cell-lineage TF interaction networks. By further exploring their dynamics and modularity, we find that cell lineage-specific hub TFs participate in cell type- or lineage-specific regulation by interacting with non-specific TFs. Furthermore, we illustrate the biological function of hub TFs by taking those of the cancer lineage and blood lineage as examples. Taken together, our integrative analysis reveals a more precise and extensive description of human TF combinatorial interactions. PMID:29033978
Development of Alabama traffic factors for use in mechanistic-empirical pavement design.
DOT National Transportation Integrated Search
2015-02-01
The pavement engineering community is moving toward design practices that use mechanistic-empirical (M-E) approaches to the design and analysis of pavement structures. This effort is : embodied in the Mechanistic-Empirical Pavement Design Guide (MEPD...
Telford, Mark; Senior, Emma
2017-06-08
This article describes the experiences of undergraduate healthcare students taking a module adopting a 'flipped classroom' approach. Evidence suggests that the flipped classroom as a pedagogical tool has the potential to enhance student learning and to improve healthcare practice. This innovative approach was implemented within a healthcare curriculum, in a module on public health delivered at the beginning of year two of a 3-year programme. The focus of the evaluation study was on the e-learning resources used in the module and the students' experiences of them, with the specific aim of evaluating this element of the flipped classroom approach. A mixed-methods approach was adopted, with data collected using questionnaires distributed across a whole cohort and a focus group involving ten participants. Statistical analysis of the data showed a positive student experience of engaging with e-learning. The thematic analysis identified two key themes: factors influencing a positive learning experience, and the challenges of developing e-learning within a flipped classroom approach. The study provides guidance for further developments and improvements in developing e-learning as part of the flipped classroom approach.
An approach to evaluating reactive airborne wind shear systems
NASA Technical Reports Server (NTRS)
Gibson, Joseph P., Jr.
1992-01-01
An approach to evaluating reactive airborne windshear detection systems was developed to support a deployment study for future FAA ground-based windshear detection systems. The deployment study methodology assesses potential future safety enhancements beyond planned capabilities. The reactive airborne systems will be an integral part of planned windshear safety enhancements. The approach to evaluating reactive airborne systems involves separate analyses for the landing and take-off scenarios. The analysis estimates the probability of effective warning considering several factors, including NASA energy height loss characteristics, reactive alert timing, and a probability distribution for microburst strength.
Gurses, Ayse P; Marsteller, Jill A; Ozok, A Ant; Xiao, Yan; Owens, Sharon; Pronovost, Peter J
2010-08-01
Our objective was to identify factors that affect clinicians' compliance with the evidence-based guidelines using an interdisciplinary approach and develop a conceptual framework that can provide a comprehensive and practical guide for designing effective interventions. A literature review and a brainstorming session with 11 researchers from a variety of scientific disciplines were used to identify theoretical and conceptual models describing clinicians' guideline compliance. MEDLINE, EMBASE, CINAHL, and the bibliographies of the papers identified were used as data sources for identifying the relevant theoretical and conceptual models. Thirteen different models that originated from various disciplines including medicine, rural sociology, psychology, human factors and systems engineering, organizational management, marketing, and health education were identified. Four main categories of factors that affect compliance emerged from our analysis: clinician characteristics, guideline characteristics, system characteristics, and implementation characteristics. Based on these findings, we developed an interdisciplinary conceptual framework that specifies the expected interrelationships among these four categories of factors and their impact on clinicians' compliance. An interdisciplinary approach is needed to improve clinicians' compliance with evidence-based guidelines. The conceptual framework from this research can provide a comprehensive and systematic guide to identify barriers to guideline compliance and design effective interventions to improve patient safety.
Multispectral analysis tools can increase utility of RGB color images in histology
NASA Astrophysics Data System (ADS)
Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard
2018-04-01
Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported with a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools are demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV-surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
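The spectral phasor approach mentioned above reduces each pixel's spectrum to first-harmonic Fourier coordinates; with only three RGB channels this is a short computation. The sketch below is a generic phasor implementation on a placeholder image, not the authors' code.

```python
import numpy as np

def spectral_phasor(stack):
    """First-harmonic spectral phasor of an image stack.

    stack: (H, W, N) array of N spectral channels (N = 3 for RGB).
    Returns the (G, S) phasor coordinates per pixel.
    """
    n = stack.shape[-1]
    k = np.arange(n)
    total = stack.sum(axis=-1) + 1e-12                      # avoid /0
    g = (stack * np.cos(2 * np.pi * k / n)).sum(axis=-1) / total
    s = (stack * np.sin(2 * np.pi * k / n)).sum(axis=-1) / total
    return g, s

rgb = np.random.rand(256, 256, 3)   # placeholder RGB image
G, S = spectral_phasor(rgb)
# Pixels with similar spectra cluster together in the (G, S) plane,
# which is what makes the phasor plot useful for sample exploration.
```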
Power flow analysis of two coupled plates with arbitrary characteristics
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1988-01-01
The limitation of keeping the two plates identical is removed, and the vibrational power input and output are evaluated for different area ratios, plate thickness ratios, and different values of the structural damping loss factor for the source plate (the plate with excitation) and the receiver plate. In performing this parametric analysis, the source plate characteristics are kept constant. The purpose of this parametric analysis is to determine the most critical parameters that influence the flow of vibrational power from the source plate to the receiver plate. In the case of the structural damping parametric analysis, the influence of changes in the source plate damping is also investigated. As was done previously, results obtained from the mobility power flow approach are compared to results obtained using a statistical energy analysis (SEA) approach. The significance of the power flow results is discussed, together with a comparison between the SEA and mobility power flow results. Furthermore, the benefits that can be derived from using the mobility power flow approach are also examined.
Puertas, E Benjamín; Rivera, Tamara Y
2016-11-01
To 1) describe patterns of specialty choice; 2) investigate relationships between career selection and selected demographic indicators; and 3) identify salary perception, factors that influence career choice in primary care, and factors that influence the desired location of future medical practice. The study used a mixed-methods approach that included a cross-sectional questionnaire survey of 234 last-year medical students in Honduras (September 2014) and semi-structured interviews with eight key informants (October 2014). Statistical analysis included chi-square tests and factor analysis; an alpha level of 0.05 was used to determine significance. In the qualitative analysis, several codes were associated with each other, and five major themes emerged. Primary care careers were the preferred choice of 8.1% of students, who preferred urban settings for their future practice location. The perceived salary of specialties other than primary care was significantly higher than those of general practitioners, family practitioners, and pediatricians (P < 0.001). Participants considered "making a difference," income, teaching, prestige, and challenging work the most important factors influencing career choice. Practice in ambulatory settings was significantly associated with a preference for primary care specialties (P < 0.05). Logistic regression analysis found that factors related to patient-based care were statistically significant for selecting primary care (P = 0.006). The qualitative analysis further endorsed the survey findings, identifying additional factors that influence career choice (future work options; availability of residency positions; and social factors, including violence). The rationales behind the preference for a specialty appeared to be based on a combination of ambition and prestige and on personal and altruistic considerations. Most factors that influence primary care career choice are similar to those found in the literature, but several are distinctive to medical students in Honduras, most of them barriers to primary care career choice.
Schaefbauer, Chris L; Campbell, Terrance R; Senteio, Charles; Siek, Katie A; Bakken, Suzanne; Veinot, Tiffany C
2016-01-01
Objective We compare 5 health informatics research projects that applied community-based participatory research (CBPR) approaches with the goal of extending existing CBPR principles to address issues specific to health informatics research. Materials and methods We conducted a cross-case analysis of 5 diverse case studies with 1 common element: integration of CBPR approaches into health informatics research. After reviewing publications and other case-related materials, all coauthors engaged in collaborative discussions focused on CBPR. Researchers mapped each case to an existing CBPR framework, examined each case individually for success factors and barriers, and identified common patterns across cases. Results Benefits of applying CBPR approaches to health informatics research across the cases included the following: developing more relevant research with wider impact, greater engagement with diverse populations, improved internal validity, more rapid translation of research into action, and the development of people. Challenges of applying CBPR to health informatics research included requirements to develop strong, sustainable academic-community partnerships and mismatches related to cultural and temporal factors. Several technology-related challenges, including needs to define ownership of technology outputs and to build technical capacity with community partners, also emerged from our analysis. Finally, we created several principles that extended an existing CBPR framework to specifically address health informatics research requirements. Conclusions Our cross-case analysis yielded valuable insights regarding CBPR implementation in health informatics research and identified valuable lessons useful for future CBPR-based research. The benefits of applying CBPR approaches can be significant, particularly in engaging populations that are typically underserved by health care and in designing patient-facing technology. PMID:26228766
Bish, Melanie; Kenny, Amanda; Nay, Rhonda
2015-04-01
To identify factors that influence directors of nursing in their approach to leadership when working in rural Victoria, Australia. In rural areas, nurses account for the largest component of the health workforce and must be equipped with leadership knowledge and skills to lead reform at a service level. A qualitative descriptive design was used. In-depth semi-structured interviews were undertaken with directors of nursing from rural Victoria. Data were analysed using thematic analysis and a thematic network was developed. Empowerment emerged as the highest order category in the thematic network. This was derived from three organising themes: influence, capital and contextual understanding and the respective basic themes: formal power, informal power, self-knowledge; information, support, resources; and situational factors, career trajectory, connectedness. Rural nurse leaders contend with several issues that influence their approach to leadership. This study provides a platform for further research to foster nurse leadership in rural healthcare services. Acknowledgement of what influences the rural nurse leaders' approach to leadership may assist in the implementation of initiatives designed to develop leadership in a manner that is contextually sensitive. © 2013 John Wiley & Sons Ltd.
de Jong, Jan A Stavenga; Wierstra, Ronny F A; Hermanussen, José
2006-03-01
Research on individual learning approaches (or learning styles) is split into two traditions, one of which is biased towards academic learning, and the other towards learning from direct experience. In the reported study, the two traditions are linked by investigating the relationships between school-based (academic) and work-based (experiential) learning approaches of students in vocational education programs. Participants were 899 students of a Dutch school for secondary vocational education; 758 provided data on school-based learning, and 407 provided data on work-based learning, resulting in an overlap of 266 students from whom data were obtained on learning in both settings. Learning approaches in school and work settings were measured with questionnaires. Using factor analysis and cluster analysis, items and students were grouped, both with respect to school- and work-based learning. The study identified two academic learning dimensions (constructive learning and reproductive learning), and three experiential learning dimensions (analysis, initiative, and immersion). Construction and analysis were correlated positively, and reproduction and initiative negatively. Cluster analysis resulted in the identification of three school-based learning orientations and three work-based learning orientations. The relation between the two types of learning orientations, expressed in Cramér's V, appeared to be weak. It is concluded that learning approaches are relatively context specific, which implies that neither theoretical tradition can claim general applicability.
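Cramér's V, used here to express the strength of association between school-based and work-based learning orientations, can be computed directly from the cluster cross-tabulation; the table below is hypothetical, not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

def cramers_v(table):
    """Cramér's V for a contingency table of cluster memberships."""
    table = np.asarray(table, dtype=float)
    chi2, _, _, _ = chi2_contingency(table)
    n = table.sum()
    r, c = table.shape
    return np.sqrt(chi2 / (n * (min(r, c) - 1)))

# Hypothetical 3x3 cross-tabulation of school-based vs work-based
# learning orientations
xtab = [[40, 25, 20], [22, 38, 28], [18, 27, 48]]
print(round(cramers_v(xtab), 2))   # values near 0 indicate a weak relation
```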
A systems biology-led insight into the role of the proteome in neurodegenerative diseases.
Fasano, Mauro; Monti, Chiara; Alberio, Tiziana
2016-09-01
Multifactorial disorders are the result of nonlinear interactions among several factors; therefore, a reductionist approach does not appear to be appropriate. Proteomics is a global approach that can be efficiently used to investigate the pathogenetic mechanisms of neurodegenerative diseases. Here, we provide a general introduction to the systems biology approach and report mechanistic insights recently obtained by over-representation analysis of proteomics data from cellular and animal models of Alzheimer's disease, Parkinson's disease and other neurodegenerative disorders, as well as from affected human tissues. Expert commentary: As an inductive method, proteomics is based on unbiased observations that further require validation of the generated hypotheses. Pathway databases and over-representation analysis tools allow researchers to assign an expectation value to pathogenetic mechanisms linked to neurodegenerative diseases. The systems biology approach based on omics data may be the key to unravelling the complex mechanisms underlying neurodegeneration.
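Over-representation analysis assigns each pathway an expectation value, typically via a one-sided Fisher exact test on a 2x2 table of hit versus background proteins; a sketch with invented counts follows.

```python
from scipy.stats import fisher_exact

# 2x2 table for one pathway: proteins in/out of the hit list vs
# in/out of the pathway (hypothetical counts).
table = [[12, 88],        # hits:       in pathway / not in pathway
         [40, 4860]]      # background: in pathway / not in pathway

odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"enrichment OR={odds_ratio:.1f}, p={p_value:.2e}")
# Repeating this per pathway, with multiple-testing correction, yields
# the expectation values assigned to candidate mechanisms.
```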
Busch, Anne-Kathrin; Rockenbauch, Katrin; Schmutzer, Gabriele; Brähler, Elmar
2015-01-01
Attitudes towards communication skills among medical undergraduates can be assessed using the Communication Skills Attitude Scale (CSAS). We aimed to develop a German version of the CSAS (CSAS-G) in order to explore attitudes towards communication skills in a German cohort; additionally, the potential influence of demographic factors was examined. We developed the CSAS-G and conducted a survey with 529 participants from 3 different years of study. We then carried out exploratory and confirmatory factor analyses and compared the attitudinal scores. Multiple regression analysis was performed. The confirmatory analysis confirmed the two-subscale structure revealed by the exploratory factor analysis. Students indicated low levels of negative attitudes and moderate levels of positive attitudes. Attitudinal scores differed significantly in relation to gender. The CSAS-G can be used in German cohorts to evaluate attitudes towards communication skills. The medical students in our study showed a basically positive attitude. Further investigation is necessary to explore and understand the attitudes of German medical students towards communication skills.
Analysis of spectrally resolved autofluorescence images by support vector machines
NASA Astrophysics Data System (ADS)
Mateasik, A.; Chorvat, D.; Chorvatova, A.
2013-02-01
Spectral analysis of autofluorescence images of isolated cardiac cells was performed to evaluate and classify the metabolic state of the cells with respect to their responses to metabolic modulators. The classification was done using a machine learning approach based on support vector machines, with a set of features automatically calculated from the recorded spectral profiles of the autofluorescence images. This classification method was compared with the classical approach, in which the individual spectral components contributing to cell autofluorescence are estimated by spectral analysis, namely by blind source separation using non-negative matrix factorization. Comparison of both methods showed that machine learning can effectively classify spectrally resolved autofluorescence images without the need for detailed knowledge about the sources of autofluorescence and their spectral properties.
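The machine-learning route described, an SVM over features computed from each cell's spectral profile, follows a standard recipe; the sketch below uses random placeholder features and labels rather than real autofluorescence data.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical feature matrix: one row per cell, columns = features
# computed from its spectral autofluorescence profile;
# labels = metabolic state (0 = control, 1 = modulator-treated).
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 16))
y = rng.integers(0, 2, size=120)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())   # cross-validated classification accuracy
```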
Dynamic Simulation and Analysis of Human Walking Mechanism
NASA Astrophysics Data System (ADS)
Azahari, Athirah; Siswanto, W. A.; Ngali, M. Z.; Salleh, S. Md.; Yusup, Eliza M.
2017-01-01
Behaviour such as gait or posture may affect a person's physiological condition during daily activities. The characteristics of the human gait cycle phases are among the important parameters used to describe human movement, whether the gait is normal or abnormal. This research investigates four types of crouch walking (upright, interpolated, crouched and severe) using a simulation approach. The assessment was conducted by examining the parameters of the hamstring muscle, knee joint and ankle joint. The analysis results show that, from a gait analysis perspective, crouch walking exhibits a weak walking pattern and posture. A short hamstring and the knee joint are the most influential factors contributing to crouch walking, owing to the excessive hip flexion that typically accompanies knee flexion.
Chen, Yongsheng; Persaud, Bhagwant
2014-09-01
Crash modification factors (CMFs) for road safety treatments are developed as multiplicative factors that are used to reflect the expected changes in safety performance associated with changes in highway design and/or the traffic control features. However, current CMFs have methodological drawbacks. For example, variability with application circumstance is not well understood, and, as important, correlation is not addressed when several CMFs are applied multiplicatively. These issues can be addressed by developing safety performance functions (SPFs) with components of crash modification functions (CM-Functions), an approach that includes all CMF related variables, along with others, while capturing quantitative and other effects of factors and accounting for cross-factor correlations. CM-Functions can capture the safety impact of factors through a continuous and quantitative approach, avoiding the problematic categorical analysis that is often used to capture CMF variability. There are two formulations to develop such SPFs with CM-Function components - fully specified models and hierarchical models. Based on sample datasets from two Canadian cities, both approaches are investigated in this paper. While both model formulations yielded promising results and reasonable CM-Functions, the hierarchical model was found to be more suitable in retaining homogeneity of first-level SPFs, while addressing CM-Functions in sub-level modeling. In addition, hierarchical models better capture the correlations between different impact factors. Copyright © 2014 Elsevier Ltd. All rights reserved.
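A fully specified SPF with a CM-Function component can be sketched as a negative binomial GLM with a log link, so that the exponentiated coefficient on a design variable acts as a continuous crash modification function. The variables and data below are hypothetical placeholders, not the Canadian datasets used in the paper.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical sites: expected crashes modeled as
#   mu = exp(a0 + a1*log(AADT) + b*lane_width)
rng = np.random.default_rng(1)
n = 300
aadt = rng.uniform(2000, 40000, n)
lane_width = rng.uniform(3.0, 3.8, n)
mu = np.exp(-6.0 + 0.8 * np.log(aadt) - 0.5 * (lane_width - 3.4))
crashes = rng.poisson(mu)   # placeholder crash counts

X = sm.add_constant(np.column_stack([np.log(aadt), lane_width]))
fit = sm.GLM(crashes, X, family=sm.families.NegativeBinomial()).fit()
print(fit.params)
# exp(coefficient on lane_width) acts as a CM-Function: a continuous
# crash modification factor per unit change in lane width.
```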
Trace DNA analysis: do you know what your neighbour is doing? A multi-jurisdictional survey.
Raymond, Jennifer J; van Oorschot, Roland A H; Walsh, Simon J; Roux, Claude
2008-01-01
Since 1997 the analysis of DNA recovered from handled objects, or 'trace' DNA, has become routine and is frequently demanded from crime scene examinations. However, this analysis often produces unpredictable results. The factors affecting the recovery of full profiles are numerous, and include varying methods of collection and analysis. Communication between forensic laboratories in Australia and New Zealand has been limited in the past, due in some part to sheer distance. Because of its relatively small population and low number of forensic jurisdictions, this region is in an excellent position to adopt a collective approach; however, the protocols, training methods and research of each jurisdiction had not been widely exchanged. A survey was developed to benchmark the current practices involved in trace DNA analysis, aiming to provide information for training programs and research directions, and to identify factors contributing to the success or failure of the analysis. The survey was divided into three target groups: crime scene officers, DNA laboratory scientists, and managers of these staff. In late 2004 surveys were sent to forensic organisations in every Australian jurisdiction and New Zealand. A total of 169 completed surveys were received, with a return rate of 54%. Information was collated regarding sampling, extraction, amplification and analysis methods, contamination prevention, samples collected, success rates, personnel training and education, and concurrent fingerprinting. The data from the survey responses provided an insight into aspects of trace DNA analysis, from crime scene to interpretation and management. Several concerning factors arose from the survey. Results collation is a significant issue: it was identified as poor and as differing widely, preventing inter-jurisdictional comparison and intra-jurisdictional assessment of both the processes and outputs. A second point of note is the widespread lack of refresher training and proficiency testing, with no set standard for initial training courses. A common theme to these and other issues was the need for a collective approach to training and methodology in trace DNA analysis. Trace DNA is a small fraction of the evidence available in current investigations, and parallels to these results and problems will no doubt be found in other forensic disciplines internationally. The significant point to be realised from this study is the need for effective communication lines between forensic organisations to ensure that best practice is followed, ideally with a cohesive pan-jurisdictional approach.
Akinade, Olugbenga O; Oyedele, Lukumon O; Ajayi, Saheed O; Bilal, Muhammad; Alaka, Hafiz A; Owolabi, Hakeem A; Bello, Sururah A; Jaiyeoba, Babatunde E; Kadiri, Kabir O
2017-02-01
The aim of this paper is to identify the Critical Success Factors (CSFs) needed for effective material recovery through Design for Deconstruction (DfD). The research approach employed in this paper is based on a sequential exploratory mixed-method strategy. After a thorough review of the literature and four Focus Group Discussions (FGDs), 43 DfD factors were identified and put together in a questionnaire survey. Data analyses include Cronbach's alpha reliability analysis, mean testing using a significance index, and exploratory factor analysis. The factor analysis reveals an underlying structure of five DfD factor groups: 'stringent legislation and policy', 'deconstruction design process and competencies', 'design for material recovery', 'design for material reuse', and 'design for building flexibility'. These factor groups show that the requirements for DfD go beyond technical competencies, and that non-technical factors such as stringent legislation and policy and design process and competency for deconstruction are key in designing deconstructable buildings. Paying attention to the factors identified in all of these categories will help to tackle impediments that could hinder the effectiveness of DfD. The results of this study would help design and project managers to understand areas of possible improvement in employing DfD as a strategy for diverting waste from landfills. Copyright © 2016 Elsevier Ltd. All rights reserved.
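As a pointer to how the reliability step above is typically carried out, the sketch below computes Cronbach's alpha from scratch; the simulated five-point responses are an assumption for illustration only, not the study's survey data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) array.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy 5-point Likert responses for 4 items driven by one latent trait.
rng = np.random.default_rng(0)
latent = rng.normal(size=200)
data = np.clip(np.round(3 + latent[:, None] + rng.normal(scale=0.8, size=(200, 4))), 1, 5)
print(round(cronbach_alpha(data), 2))
```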
Xie, Weixing; Jin, Daxiang; Ma, Hui; Ding, Jinyong; Xu, Jixi; Zhang, Shuncong; Liang, De
2016-05-01
The risk factors for cement leakage were retrospectively reviewed in 192 patients who underwent percutaneous vertebral augmentation (PVA), to discuss the factors related to cement leakage in the PVA procedure for the treatment of osteoporotic vertebral compression fractures. PVA is widely applied for the treatment of osteoporotic vertebral fractures, and cement leakage is a major complication of this procedure; the risk factors for leakage remain controversial. A retrospective review of 192 patients who underwent PVA was conducted. The following data were recorded: age, sex, bone density, number of fractured vertebrae before surgery, number of treated vertebrae, severity of the treated vertebrae, operative approach, volume of injected bone cement, preoperative vertebral compression ratio, preoperative local kyphosis angle, intraosseous clefts, preoperative vertebral cortical bone defect, and ratio and type of cement leakage. To study the correlation between each factor and the cement leakage ratio, bivariate regression analysis was employed for univariate analysis, whereas multivariate linear regression analysis was employed for multivariate analysis. The study included 192 patients (282 treated vertebrae), and cement leakage occurred in 100 vertebrae (35.46%). Vertebrae with preoperative cortical bone defects generally exhibited a higher cement leakage ratio, and the leakage was typically type C, whereas vertebrae with intact cortical bones before the procedure tended to experience type S leakage. Univariate analysis showed that patient age, bone density, number of fractured vertebrae before surgery, and vertebral cortical bone defect were associated with the cement leakage ratio (P<0.05). Multivariate analysis showed that the main factors influencing bone cement leakage are bone density and vertebral cortical bone defect, with standardized partial regression coefficients of -0.085 and 0.144, respectively. Low bone density and vertebral cortical bone defect are independent risk factors associated with bone cement leakage.
Evaluation of Colorado Learning Attitudes about Science Survey
NASA Astrophysics Data System (ADS)
Douglas, K. A.; Yale, M. S.; Bennett, D. E.; Haugan, M. P.; Bryan, L. A.
2014-12-01
The Colorado Learning Attitudes about Science Survey (CLASS) is a widely used instrument designed to measure student attitudes toward physics and learning physics. Previous research revealed a fairly complex factor structure. In this study, exploratory and confirmatory factor analyses were conducted on data from an undergraduate introductory physics course (n = 3844) to determine whether a more parsimonious factor structure exists. Exploratory factor analysis results indicate that many of the items from the original CLASS have poor psychometric properties and could not be used in a revised factor structure. Cross-validation showed acceptable fit statistics for a three-factor model found in the exploratory factor analysis. This research suggests that a more parsimonious measurement of students' attitudes about physics and learning physics is obtained with a 15-item instrument describing the factors of personal application, personal effort, and problem solving. The proposed revised version of the CLASS offers researchers the opportunity to test a shortened version of the instrument that may be able to provide information about students' attitudes in the areas of personal application of physics, personal effort in a physics course, and approaches to problem solving.
Chemical factor analysis of skin cancer FTIR-FEW spectroscopic data
NASA Astrophysics Data System (ADS)
Bruch, Reinhard F.; Sukuta, Sydney
2002-03-01
Chemical Factor Analysis (CFA) algorithms were applied to transform complex Fourier transform infrared fiberoptic evanescent wave (FTIR-FEW) spectra of normal and malignant skin tissue into factor spaces for analysis and classification. The factor space approach classified melanoma beyond prior pathological classifications by relating specific biochemical alterations to health states in cluster diagrams, allowing diagnosis with greater biochemical specificity, resolving biochemical component spectra, and employing health-state eigenvector angular configurations as disease-state sensors. This study extracted a wealth of new information from in vivo FTIR-FEW spectral tissue data, without extensive a priori information or clinically invasive procedures. In particular, we employed a variety of CFA methods to select the rank of spectroscopic data sets of normal, benign, and cancerous skin tissue. We used the Malinowski indicator function (IND), significance level, and F-tests to rank our data matrices. Normal skin tissue, melanoma, and benign tumors were modeled by four, two, and seven principal abstract factors, respectively. We also showed that the spectrum of the first eigenvalue was equivalent to the mean spectrum. The graphical depiction of angular disparities between the first abstract factors can be adopted as a new way to characterize and diagnose melanoma.
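For readers unfamiliar with the rank-selection step, a minimal sketch of Malinowski's indicator function is given below; the synthetic three-component mixture data are an assumption standing in for the FTIR-FEW spectra.

```python
import numpy as np

def malinowski_ind(X):
    """Malinowski indicator function IND(n) for abstract-factor rank.

    X is an (r x c) data matrix with r >= c. With g_j the descending
    eigenvalues of X^T X, RE(n) = sqrt(sum_{j>n} g_j / (r*(c - n)))
    and IND(n) = RE(n) / (c - n)^2; the rank estimate is the n that
    minimizes IND.
    """
    X = np.asarray(X, dtype=float)
    r, c = X.shape
    g = np.sort(np.linalg.eigvalsh(X.T @ X))[::-1]
    ind = []
    for n in range(1, c):              # IND is undefined at n = c
        re = np.sqrt(g[n:].sum() / (r * (c - n)))
        ind.append(re / (c - n) ** 2)
    return np.array(ind)

# Synthetic 3-component mixture spectra plus noise (60 samples x 20 channels).
rng = np.random.default_rng(1)
S = rng.random((3, 20))
C = rng.random((60, 3))
X = C @ S + rng.normal(scale=0.01, size=(60, 20))
print(int(np.argmin(malinowski_ind(X))) + 1)   # expected rank: 3
```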
Bastani, Peivand; Dinarvand, Rasoul; SamadBeik, Mahnaz; Pourmohammadi, Kimia
2016-01-01
Pharmaceutical access for the poor is an essential factor in developing countries that can be improved through strategic purchasing. This study was conducted to identify the elements affecting price in order to enable insurance organizations to put strategic purchasing into practice. This was a qualitative study conducted through content analysis with an inductive approach, applying a five-stage framework analysis (familiarization, identifying a thematic framework, indexing, mapping, and interpretation). Data analysis was started right after transcribing each interview, applying ATLAS.ti. Data were saturated after 32 semi-structured interviews with experts. These key informants were selected purposefully and through snowball sampling. Findings showed four main themes among pharmaceutical strategic purchasing requirements in Iran: essential and structural factors, international factors, economic factors, and legal factors. In total, 14 related sub-themes were extracted as the main effective variables. It seems that paying adequate attention to these four themes and 14 sub-themes affecting price can enable health-system policy-makers of developing countries like Iran to make the best decisions through strategic purchasing of drugs by the main insurers, in order to improve access and health in the country.
Hannah, Chona T; Lê, Quynh
2012-10-01
Access to health care services is vital for every migrant's health and wellbeing. However, migrants' cultural health beliefs and views can hinder their ability to access available services. This study examined factors affecting access to healthcare services for intermarried Filipino women in rural Tasmania, Australia. A qualitative approach using semi-structured interviews was employed to investigate the factors affecting access to healthcare services for 30 intermarried Filipino women in rural Tasmania. The study used grounded theory and thematic analysis for its data analysis. Nvivo v8 (www.qsrinternational.com) was also used to assist the data coding process and analysis. Five influencing factors were identified: (1) language or communication barriers; (2) area of origin in the Philippines; (3) cultural barriers; (4) length of stay in Tasmania; and (5) expectations of healthcare services before and after migration. Factors affecting intermarried Filipino women in accessing healthcare services are shaped by their socio-demographic and cultural background. The insights gained from this study are useful to health policy-makers, healthcare professionals and to intermarried female migrants. The factors identified can serve as a guide to improve healthcare access for Filipino women and other migrants.
[Application of root cause analysis in healthcare].
Hsu, Tsung-Fu
2007-12-01
The main purpose of this study was to explore various aspects of root cause analysis (RCA), including its definition, underlying rationale, main objective, implementation procedures, most common analysis methodology (fault tree analysis, FTA), and advantages and methodologic limitations in regard to healthcare. Several adverse events that occurred at a certain hospital were also analyzed by the author using FTA as part of this study. RCA is a process employed to identify basic and contributing causal factors underlying performance variations associated with adverse events. The rationale of RCA offers a systemic approach to improving patient safety that does not assign blame or liability to individuals. The four-step process involved in conducting an RCA includes: RCA preparation, proximate cause identification, root cause identification, and recommendation generation and implementation. FTA is a logical, structured process that can help identify potential causes of system failure before actual failures occur. Some advantages and significant methodologic limitations of RCA were discussed. Finally, we emphasized that errors stem principally from faults attributable to system design, practice guidelines, work conditions, and other human factors, which can lead health professionals to commit negligent acts or mistakes in healthcare. We must explore the root causes of medical errors to eliminate potential system failure factors. Also, a systemic approach is needed to resolve medical errors and move beyond a current culture centered on assigning fault to individuals. By constructing a genuinely patient-centered safety environment in healthcare, we can encourage clients to accept state-of-the-art healthcare services.
Use of herd information for predicting Salmonella status in pig herds.
Baptista, F M; Alban, L; Nielsen, L R; Domingos, I; Pomba, C; Almeida, V
2010-11-01
Salmonella surveillance-and-control programs in pigs are highly resource demanding, so alternative cost-effective approaches are desirable. The aim of this study was to develop and evaluate a tool for predicting the Salmonella test status in pig herds based on herd information collected from 108 industrial farrow-to-finish pig herds in Portugal. A questionnaire including known risk factors for Salmonella was used. A factor analysis model was developed to identify relevant factors that were then tested for association with Salmonella status. Three factors were identified and labelled: general biosecurity (factor 1), herd size (factor 2) and sanitary gap implementation (factor 3). Based on the loadings in factor 1 and factor 3, herds were classified according to their biosecurity practices. In total, 59% of the herds had a good level of biosecurity (interpreted as a loading below zero in factor 1) and 37% of the farms had good biosecurity and implemented sanitary gap (loading below zero in factor 1 and loading above zero in factor 3). This implied that they, among other things, implemented preventive measures for visitors and workers entering the herd, controlled biological vectors, had hygiene procedures in place, water quality assessment, and sanitary gap in the fattening and growing sections. In total, 50 herds were tested for Salmonella. Logistic regression analysis showed that factor 1 was significantly associated with Salmonella test status (P = 0.04). Herds with poor biosecurity had a higher probability of testing Salmonella positive compared with herds with good biosecurity. This study shows the potential for using herd information to classify herds according to their Salmonella status in the absence of good testing options. The method might be used as a potentially cost-effective tool for future development of risk-based approaches to surveillance, targeting interventions to high-risk herds or differentiating sampling strategies in herds with different levels of infection. © 2010 Blackwell Verlag GmbH.
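The two-stage analysis described above, factor scores first and then a regression on the tested subset, can be sketched as follows; the binary questionnaire items, factor count, and simulated herd data are assumptions for illustration, not the Portuguese survey data.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_herds, n_items = 108, 10
# Hypothetical 0/1 answers to biosecurity questionnaire items.
items = rng.binomial(1, 0.5, size=(n_herds, n_items)).astype(float)

fa = FactorAnalysis(n_components=3, random_state=0)
scores = fa.fit_transform(items)        # herd scores on factors 1-3
loadings = fa.components_               # item loadings per factor

# Classification rule mirroring the paper: a herd's factor-1 value
# below zero is read as a good level of general biosecurity.
good_biosecurity = scores[:, 0] < 0

# Association between factor 1 and Salmonella test status on the 50
# tested herds (status simulated so poorer biosecurity -> higher risk).
status = (scores[:, 0] + rng.normal(scale=1.0, size=n_herds) > 0).astype(int)
model = LogisticRegression().fit(scores[:50, [0]], status[:50])
print(model.coef_[0][0])                # positive: factor 1 raises risk
```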
NASA Astrophysics Data System (ADS)
Baioni, Davide; Gallerini, Giuliano; Sgavetti, Maria
2013-04-01
The present work focuses on the distribution of landslides in the Foglia river basin (northern Marche-Romagna), using a heuristic approach supported by GIS tools for the construction of statistical and spatial data analyses. The study area is located on the Adriatic side of the northern Apennines, at the boundary that marks the transition between the Marche and Emilia-Romagna regions. The Foglia river basin extends from the Apennines to the Adriatic sea with a NE-SE trend, occupying an area of about 708 km2. The purpose of this study is to investigate relationships between landslides and territorial factors, which were taken into account and divided into classes, trying to identify any possible relationships between them. To this end, the study of landslide distribution was performed with a GIS approach, superimposing each previously created thematic map on the surveyed landslides. Furthermore, we tried to isolate the most recurrent classes, to detect whether, under otherwise equal conditions, one parameter affects landsliding more than the others, so as to recognize direct cause-and-effect relationships. Finally, an analysis was conducted by applying the CF (Certainty Factor) uncertainty model. In the Foglia river basin, a total of 2821 landslides were surveyed, occupying a total area of 155 km2 and corresponding to 22% of the areal extent of the entire basin. The results of the analysis highlighted the importance and role of the individual factors that led to the development of the landslides analyzed. Moreover, this methodology may be applied at any order of magnitude and scale without difficulty, as it does not require a significant commitment of either economic or human resources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Booker, Paul M.; Maple, Scott A.
2010-06-08
Due to international commerce, cross-border conflicts, and corruption, a holistic, information-driven approach to border security is required to best understand how resources should be applied to effect sustainable improvements in border security. The ability to transport goods and people by land, sea, and air across international borders with relative ease for legitimate commercial purposes creates a challenging environment for detecting illicit smuggling activities that destabilize national-level border security. Smuggling activities operated for profit, or smuggling operations driven by cross-border conflicts in which militant or terrorist organizations facilitate the transport of materials and/or extremists to advance a cause, add complexity to smuggling interdiction efforts. Border security efforts are further hampered when corruption thwarts interdiction efforts or reduces the effectiveness of technology deployed to enhance border security. These issues necessitate the implementation of a holistic approach to border security that leverages all available data. Large amounts of information found in hundreds of thousands of documents can be compiled to assess national or regional borders and identify variables that influence border security. Location data associated with border topics of interest may be extracted and plotted to better characterize the current border security environment for a given country or region. This baseline assessment enables further analysis, and also documents the initial state of border security that can be used to evaluate progress after border security improvements are made. Border security threats are then prioritized via a systems analysis approach, and mitigation factors to address risks can be developed and evaluated against inhibiting factors such as corruption. This holistic approach to border security helps address the dynamic smuggling interdiction environment, in which illicit activities divert to a new location that provides less resistance after training or technology is deployed at a given location. This paper presents an approach to holistic border security information analysis.
Spatial Resolution Effects of Digital Terrain Models on Landslide Susceptibility Analysis
NASA Astrophysics Data System (ADS)
Chang, K. T.; Dou, J.; Chang, Y.; Kuo, C. P.; Xu, K. M.; Liu, J. K.
2016-06-01
The purposes of this study are to identify the maximum number of correlated factors for landslide susceptibility mapping and to evaluate landslide susceptibility in the Sihjhong river catchment in southern Taiwan, integrating two techniques: the certainty factor (CF) and an artificial neural network (ANN). The landslide inventory data of the Central Geological Survey (CGS, MOEA) for 2004-2014 and two digital elevation model (DEM) datasets, a 5-meter LiDAR DEM and a 30-meter ASTER DEM, were prepared. We collected thirteen possible landslide-conditioning factors. Considering multi-collinearity and factor redundancy, we applied the CF approach to optimize these thirteen conditioning factors. We hypothesize that if the CF values of the thematic factor layers are positive, the corresponding conditioning factors have a positive relationship with landslide occurrence. Based on this assumption and positive CF values, seven conditioning factors, including slope angle, slope aspect, elevation, terrain roughness index (TRI), terrain position index (TPI), total curvature, and lithology, were selected for further analysis. The results showed that the optimized-factors model provides better accuracy for predicting landslide susceptibility in the study area. In conclusion, the optimized-factors model is suggested for selecting factors relevant to landslide occurrence.
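A compact sketch of the certainty-factor computation used for factor screening is given below; the class and area counts are invented for illustration, and the formula follows the standard CF favourability definition, which may differ in detail from the authors' implementation.

```python
def certainty_factor(landslide_in_class, area_of_class,
                     landslide_total, area_total):
    """Certainty factor (CF) of one conditioning-factor class.

    ppa: conditional landslide probability within the class.
    pps: prior landslide probability over the whole study area.
    CF ranges from -1 (certainly unfavourable) to +1 (certainly
    favourable); positive values mark classes linked to landsliding.
    """
    ppa = landslide_in_class / area_of_class
    pps = landslide_total / area_total
    if ppa >= pps:
        return (ppa - pps) / (ppa * (1.0 - pps))
    return (ppa - pps) / (pps * (1.0 - ppa))

# Illustrative numbers: a slope-angle class covering 120 km2 of a
# 700 km2 catchment contains 45 of 150 mapped landslide cells.
print(round(certainty_factor(45, 120, 150, 700), 3))
```

Computing CF per class of each thematic layer and keeping only factors with positive values is what reduces the thirteen candidate factors to the seven retained above.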
Technologies to Increase PV Hosting Capacity in Distribution Feeders: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Fei; Mather, Barry; Gotseff, Peter
This paper studies the distributed photovoltaic (PV) hosting capacity of distribution feeders using a stochastic analysis approach. Multiple scenario simulations are conducted to analyze several factors that affect PV hosting capacity, including the existence of a voltage regulator, PV location, the power factor of the PV inverter, and Volt/VAR control. Based on the conclusions obtained from the simulation results, three approaches are then proposed to increase distributed PV hosting capacity, which can be formulated as an optimization problem to obtain the optimal solution. All technologies investigated in this paper utilize only existing assets in the feeder and are therefore implementable for a low cost. Additionally, the tool developed for these studies is described.
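A stochastic hosting-capacity study of the kind described can be sketched as follows; the toy voltage model, bus count, and candidate size grid are assumptions, with a real power-flow solver (e.g. OpenDSS) taking the place of `toy_voltages`.

```python
import numpy as np

def hosting_capacity(feeder_voltages_fn, pv_sizes_kw, n_scenarios=500,
                     v_limit=1.05, seed=0):
    """Stochastic hosting-capacity estimate for one feeder.

    For each candidate PV size, place the PV at a random bus in many
    scenarios and count how often any bus voltage exceeds the upper
    limit. Hosting capacity is taken as the largest size with no
    violations across scenarios.
    """
    rng = np.random.default_rng(seed)
    capacity = 0.0
    for size in pv_sizes_kw:
        violations = 0
        for _ in range(n_scenarios):
            bus = rng.integers(0, 33)          # random PV location
            v = feeder_voltages_fn(size, bus)  # per-unit bus voltages
            violations += np.any(v > v_limit)
        if violations == 0:
            capacity = size
    return capacity

# Toy voltage model: PV placed further from the substation produces a
# larger voltage rise (purely illustrative, not a power flow).
def toy_voltages(size_kw, bus):
    base = np.linspace(1.02, 0.98, 33)
    rise = 1e-5 * size_kw * (1 + bus / 33)
    return base + rise

print(hosting_capacity(toy_voltages, [500, 1000, 2000, 4000]))
```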
Beyene, Kebede; Aspden, Trudi; Sheridan, Janie
2018-04-05
Non-recreational sharing of prescribed medicines can have positive outcomes under some circumstances, but can also result in negative health outcomes. This paper describes a theoretically underpinned and systematic approach to exploring potential interventions to reduce harm. Individual, semi-structured, face-to-face interviews were conducted with purposively sampled pharmacists (n = 8), doctors (n = 4), nurses (n = 6) and patients (n = 17) from Auckland, New Zealand. Thematic analysis of suggested interventions was undertaken, and these were linked to relevant intervention functions of the Behaviour Change Wheel (BCW). Analysis of previously defined factors influencing sharing were mapped onto the "Capability, Opportunity, Motivation - Behaviour" (COM-B) model of the BCW. COM-B analysis of the factors influencing sharing behaviour revealed: (i) 'Capability'-related factors, such as patient misconceptions about the safety of certain medicines, forgetting to refill or to carry around own medicines, and lack of knowledge about safe disposal of leftover/unused medicines; (ii) 'Opportunity'-related factors included lack of access to health facilities, lack of time to see a doctor, linguistic and cultural barriers, lack of information from healthcare providers about risks of sharing, and having leftover/unused medicines, and (iii) 'Motivation'-related factors included altruism, illness denial, embarrassment about seeing a doctor, not carrying around own medicines, habit, and fear of negative health consequences from missing a few doses of medicines. Five intervention functions of the BCW appear to be the most likely candidates for targeting the factors which relate to medicine sharing. These are education, persuasion, enablement, environmental restructuring and restriction. A variety of personal and external factors which influence sharing behaviours were identified, and the BCW provided a means by which theoretically underpinned interventions to reduce potential harms from this behaviour could be proposed. The findings can help with the design of approaches to reduce harm associated with non-recreational medicine sharing. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Tarai, Madhumita; Kumar, Keshav; Divya, O.; Bairi, Partha; Mishra, Kishor Kumar; Mishra, Ashok Kumar
2017-09-01
The present work compares dissimilarity-based and covariance-based unsupervised chemometric classification approaches using total synchronous fluorescence spectroscopy data sets acquired for cumin and non-cumin based herbal preparations. The conventional decomposition method involves eigenvalue-eigenvector analysis of the covariance of the data set and finds the factors that explain the overall major sources of variation present in the data set. The conventional approach does this irrespective of the fact that the samples belong to intrinsically different groups, and hence leads to poor class separation. The present work shows that classification of such samples can be optimized by performing the eigenvalue-eigenvector decomposition on the pair-wise dissimilarity matrix.
Evaluating disease management program effectiveness: an introduction to time-series analysis.
Linden, Ariel; Adams, John L; Roberts, Nancy
2003-01-01
Currently, the most widely used method in the disease management (DM) industry for evaluating program effectiveness is referred to as the "total population approach." This model is a pretest-posttest design, with the most basic limitation being that without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Furthermore, with the current inclination of DM programs to use financial indicators rather than program-specific utilization indicators as the principal measure of program success, additional biases are introduced that may cloud evaluation results. This paper presents a non-technical introduction to time-series analysis (using disease-specific utilization measures) as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
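One common time-series design of the kind the paper introduces is segmented (interrupted time-series) regression; the sketch below fits a level change and a trend change at program start to a simulated utilization series, which is an assumption for illustration only.

```python
import numpy as np

def segmented_regression(y, intervention_index):
    """Interrupted time-series (segmented regression) fit by OLS.

    Models a baseline level and trend plus a level change and trend
    change at the start of the DM program, using a disease-specific
    utilization series (e.g. monthly admissions). Returns
    [intercept, baseline trend, level change, trend change].
    """
    n = len(y)
    t = np.arange(n)
    post = (t >= intervention_index).astype(float)
    t_post = post * (t - intervention_index)
    X = np.column_stack([np.ones(n), t, post, t_post])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    return beta

# Simulated monthly admissions: flat baseline, then a drop of ~8
# admissions and a gentle downward trend after month 24.
rng = np.random.default_rng(3)
t = np.arange(48)
y = 100 - 8 * (t >= 24) - 0.5 * np.clip(t - 24, 0, None) + rng.normal(0, 2, 48)
print(np.round(segmented_regression(y, 24), 2))
```

Unlike the total population approach, the pre-intervention segment acts as its own comparison, so secular trends are modeled explicitly rather than absorbed into the apparent program effect.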
Use of video-feedback, reflection, and interactive analysis to improve nurse leadership practices.
Crenshaw, Jeannette T
2012-01-01
The chronic shortage of registered nurses (RNs) affects patient safety and health care quality. Many factors affect the RN shortage in the workforce, including negative work environments, exacerbated by ineffective leadership approaches. Improvements in the use of relationship-based leadership approaches lead to healthier work environments that foster RN satisfaction and reduce RN turnover and vacancy rates in acute care settings. In this article, an innovative approach to reduce nurse turnover and decrease vacancy rates in acute care settings is described. Video feedback with reflection and interactive analysis is an untapped resource for nurse leaders and aspiring nurse leaders in their development of effective leadership skills. This unique method may be an effective leadership strategy for addressing recruitment and retention issues in a diverse workforce.
Learning intervention and the approach to study of engineering undergraduates
NASA Astrophysics Data System (ADS)
Solomonides, Ian Paul
The aim of the research was to investigate the effect of a learning intervention on the Approach to Study of first year engineering degree students. The learning intervention was a local programme of 'learning to learn' workshops designed and facilitated by the author. The primary aim of these was to develop students' Approaches to Study. Fifty-three first year engineering undergraduates at The Nottingham Trent University participated in the workshops. Approaches to Study were quantified using data obtained from the Revised Approach to Study Inventory (RASI), which was also subjected to a validity and reliability study using local data. Quantitative outcomes were supplemented by a qualitative analysis of essays written by students during the workshops, which were analysed for detail regarding student Approach to Study. It was intended that any findings would inform the local system of Engineering Education, although more general findings also emerged, in particular in relation to the utility of the research instrument. It was concluded that the intervention did not promote the preferential Deep Approach and did not affect Approaches to Study generally as measured by the RASI. This concurred with previous attempts to change student Approaches to Study at the group level. It was also established that subsequent years of the Integrated Engineering degree course are associated with progressively deteriorating Approaches to Study. Students who were exposed to the intervention followed a similar pattern of deteriorating Approaches, suggesting that the local course context and its demands had a greater influence over the Approach of students than the intervention did. It was found that academic outcomes were unrelated to the extent to which students took a Deep Approach to the local assessment demands. There appeared therefore to be a mismatch between the Approach students adopted to pass examinations and those that are required for high quality learning outcomes. It is suggested that more co-ordinated and coherent action for changing the local course demands is needed before an improvement in student Approaches will be observed. These conclusions were broadly supported by the results from the qualitative analysis, which also indicated the dominating effect of course context over Approach. However, some students appeared to have gained from the intervention in that they reported being in a better position to evaluate their relationships with the course demands following the workshops. It therefore appeared that some students could be described as being in tension between the desire to take a Deep Approach and the adoption of less desirable Approaches as promoted and encouraged by the course context. It is suggested that questions regarding the integrity of the intervention are thereby left unresolved even though the immediate effects of it are quite clear. It is also suggested that the integrity of the research instrument is open to question, in that the Strategic Approach to Study scale failed to be defined by one factor under common factor analysis. The intentional or motivational element which previously defined this scale was found to be associated with a Deep Approach factor within the local context. The Strategic Approach was found to be defined by skill rather than motivation. This indicates that some reinterpretation of the RASI, and in particular the Strategic Approach to Study scale, is needed.
2006-06-01
dynamic programming approach known as a "rolling horizon" approach. This method accounts for state transitions within the simulation rather than modeling ... model is based on the framework developed for Dynamic Allocation of Fires and Sensors used to evaluate factors associated with networking assets in the ... of UAVs required by all types of maneuver and support brigades. (Witsken, 2004) The Modeling, Virtual Environments, and Simulations Institute
NASA Astrophysics Data System (ADS)
Hsiao, Y. R.; Tsai, C.
2017-12-01
As the WHO Air Quality Guideline indicates, ambient air pollution exposes the world's population to the threat of fatal illnesses (e.g. heart disease, lung cancer, asthma), raising concerns about air pollution sources and related factors. This study presents a novel approach to investigating the multiscale variations of PM2.5 in southern Taiwan over the past decade, together with four meteorological influencing factors (temperature, relative humidity, precipitation, and wind speed), based on the Noise-Assisted Multivariate Empirical Mode Decomposition (NAMEMD) algorithm, Hilbert Spectral Analysis (HSA), and the Time-Dependent Intrinsic Correlation (TDIC) method. The NAMEMD algorithm is a fully data-driven approach designed for nonlinear and nonstationary multivariate signals, and is used to decompose multivariate signals into a collection of channels of Intrinsic Mode Functions (IMFs). The TDIC method is an EMD-based method using a set of sliding window sizes to quantify localized correlation coefficients for multiscale signals. With the alignment property and quasi-dyadic filter bank of the NAMEMD algorithm, one is able to produce the same number of IMFs for all variables and estimate the cross-correlation more accurately. The performance of the spectral representation of the NAMEMD-HSA method is compared with Complementary Ensemble Empirical Mode Decomposition/Hilbert Spectral Analysis (CEEMD-HSA) and wavelet analysis. The NAMEMD-based TDIC analysis is then compared with CEEMD-based TDIC analysis and traditional correlation analysis.
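The TDIC step can be sketched independently of the decomposition: given two aligned IMFs from a multivariate EMD (the NAMEMD stage itself is not reproduced here), a sliding window of each chosen size yields a localized correlation at every time point. The synthetic IMFs below are assumptions for illustration.

```python
import numpy as np

def tdic(imf_a, imf_b, window_sizes):
    """Time-dependent intrinsic correlation between two aligned IMFs.

    For each window size and each centre point, compute the Pearson
    correlation inside the sliding window. Returns a
    (len(window_sizes) x n) array; NaN where the window spills past
    the record ends.
    """
    n = len(imf_a)
    out = np.full((len(window_sizes), n), np.nan)
    for i, w in enumerate(window_sizes):
        half = w // 2
        for c in range(half, n - half):
            a = imf_a[c - half:c + half + 1]
            b = imf_b[c - half:c + half + 1]
            out[i, c] = np.corrcoef(a, b)[0, 1]
    return out

# Two synthetic "IMFs" whose coupling switches sign halfway through.
t = np.linspace(0, 20 * np.pi, 2000)
a = np.sin(t)
b = np.where(t < 10 * np.pi, np.sin(t), -np.sin(t))
corr = tdic(a, b, window_sizes=[101, 301])
print(np.nanmean(corr[0, :900]), np.nanmean(corr[0, 1100:]))  # ~ +1, ~ -1
```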
A simplified approach for slope stability analysis of uncontrolled waste dumps.
Turer, Dilek; Turer, Ahmet
2011-02-01
Slope stability analysis of municipal solid waste has always been problematic because of the heterogeneous nature of the waste materials. The requirement for large testing equipment to obtain representative samples has highlighted the need for simplified approaches to obtaining the unit weight and shear strength parameters of the waste. In the present study, two of the most recently published approaches for determining the unit weight and shear strength parameters of the waste were incorporated into a slope stability analysis using the Bishop method to prepare slope stability charts. The charts were prepared for uncontrolled waste dumps, having no liner and no leachate collection system, with pore pressure ratios of 0, 0.1, 0.2, 0.3, 0.4 and 0.5, considering the most critical slip surface passing through the toe of the slope. As the proposed slope stability charts were prepared by considering the change in unit weight as a function of height, they reflect field conditions better than a constant-unit-weight approach to the stability analysis. They also streamline the selection of slope or height as a function of the desired factor of safety.
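For readers unfamiliar with the underlying computation, a minimal sketch of Bishop's simplified method is given below; the slice geometry and strength values are hypothetical, and the paper's height-dependent unit weight and chart preparation are not reproduced.

```python
import numpy as np

def bishop_fs(slices, tol=1e-6, max_iter=100):
    """Factor of safety by Bishop's simplified method.

    `slices` is a list of dicts with weight W (kN/m), base inclination
    alpha (rad), slice width b (m), cohesion c (kPa), friction angle
    phi (rad) and pore pressure ratio ru.
    FS = sum[(c*b + W*(1 - ru)*tan(phi)) / m_alpha] / sum[W*sin(alpha)]
    with m_alpha = cos(alpha) * (1 + tan(alpha)*tan(phi) / FS),
    solved iteratively because FS appears on both sides.
    """
    fs = 1.0
    for _ in range(max_iter):
        num, den = 0.0, 0.0
        for s in slices:
            m_alpha = np.cos(s["alpha"]) * (1 + np.tan(s["alpha"]) * np.tan(s["phi"]) / fs)
            num += (s["c"] * s["b"] + s["W"] * (1 - s["ru"]) * np.tan(s["phi"])) / m_alpha
            den += s["W"] * np.sin(s["alpha"])
        fs_new = num / den
        if abs(fs_new - fs) < tol:
            return fs_new
        fs = fs_new
    return fs

# Three illustrative slices of a waste-dump slope (values hypothetical).
slices = [dict(W=250.0, alpha=np.radians(a), b=4.0, c=15.0,
               phi=np.radians(25.0), ru=0.2) for a in (10.0, 25.0, 40.0)]
print(round(bishop_fs(slices), 3))
```

Repeating this search over trial slip circles through the toe, for each pore pressure ratio and slope geometry, is what generates a stability chart.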
Deciphering Rashomon: an approach to verbal autopsies of maternal deaths.
Iyer, Aditi; Sen, Gita; Sreevathsa, Anuradha
2013-01-01
The paper discusses an approach to verbal autopsies that engages with the Rashomon phenomenon affecting ex post facto constructions of death and responds to the call for maternal safety. This method differs from other verbal autopsies in its approach to data collection and its framework of analysis. In our approach, data collection entails working with and triangulating multiple narratives, and minimising power inequalities in the investigation process. The framework of analysis focuses on the missed opportunities for death prevention as an alternative to (or deepening of) the Three Delays Model. This framework assesses the behavioural responses of health providers, as well as community and family members at each opportunity for death prevention and categorises them into four groups: non-actions, inadequate actions, inappropriate actions and unavoidably delayed actions. We demonstrate the application of this approach to show how verbal autopsies can delve beneath multiple narratives and rigorously identify health system, behavioural and cultural factors that contribute to avoidable maternal mortality.
Cheng, Han; Koning, Katie; O'Hearn, Aileen; Wang, Minxiu; Rumschlag-Booms, Emily; Varhegyi, Elizabeth; Rong, Lijun
2015-11-24
Genome-wide RNAi screening has been widely used to identify host proteins involved in replication and infection of different viruses, and numerous host factors are implicated in the replication cycles of these viruses, demonstrating the power of this approach. However, discrepancies on target identification of the same viruses by different groups suggest that high throughput RNAi screening strategies need to be carefully designed, developed and optimized prior to the large scale screening. Two genome-wide RNAi screens were performed in parallel against the entry of pseudotyped Marburg viruses and avian influenza virus H5N1 utilizing an HIV-1 based surrogate system, to identify host factors which are important for virus entry. A comparative analysis approach was employed in data analysis, which alleviated systematic positional effects and reduced the false positive number of virus-specific hits. The parallel nature of the strategy allows us to easily identify the host factors for a specific virus with a greatly reduced number of false positives in the initial screen, which is one of the major problems with high throughput screening. The power of this strategy is illustrated by a genome-wide RNAi screen for identifying the host factors important for Marburg virus and/or avian influenza virus H5N1 as described in this study. This strategy is particularly useful for highly pathogenic viruses since pseudotyping allows us to perform high throughput screens in the biosafety level 2 (BSL-2) containment instead of the BSL-3 or BSL-4 for the infectious viruses, with alleviated safety concerns. The screening strategy together with the unique comparative analysis approach makes the data more suitable for hit selection and enables us to identify virus-specific hits with a much lower false positive rate.
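A minimal sketch of the comparative hit-selection idea follows: normalizing each well against its counterpart in the parallel screen cancels shared systematic (e.g. positional) effects, leaving virus-specific signal. The readout layout, cutoff, and simulated data are assumptions, not the study's exact pipeline.

```python
import numpy as np

def virus_specific_hits(marburg_scores, h5n1_scores, z_cut=-3.0):
    """Comparative hit selection across two parallel entry screens.

    Scores are per-well infection readouts laid out identically in
    both screens. The log-ratio of the two readouts cancels effects
    shared by both screens; wells whose ratio deviates strongly in
    one direction are called virus-specific hits.
    """
    log_ratio = np.log2(marburg_scores / h5n1_scores)
    z = (log_ratio - log_ratio.mean()) / log_ratio.std(ddof=1)
    marburg_specific = np.where(z < z_cut)[0]   # entry reduced for Marburg only
    h5n1_specific = np.where(z > -z_cut)[0]     # entry reduced for H5N1 only
    return marburg_specific, h5n1_specific

rng = np.random.default_rng(4)
m = rng.lognormal(0.0, 0.15, 384)
h = rng.lognormal(0.0, 0.15, 384)
m[10] *= 0.2                       # simulated Marburg-specific host factor
mb, hs = virus_specific_hits(m, h)
print(mb)                          # well 10 should be recovered
```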
Design and Analysis of a Stiffened Composite Structure Repair Concept
NASA Technical Reports Server (NTRS)
Przekop, Adam
2011-01-01
A design and analysis of a repair concept applicable to a stiffened thin-skin composite panel based on the Pultruded Rod Stitched Efficient Unitized Structure is presented. Since the repair concept is a bolted repair using metal components, it can easily be applied in the operational environment. Initial analyses are aimed at validating the finite element modeling approach by comparing with available test data. Once confidence in the analysis approach is established several repair configurations are explored and the most efficient one presented. Repairs involving damage to the top of the stiffener alone are considered in addition to repairs involving a damaged stiffener, flange and underlying skin. High fidelity finite element modeling techniques such as mesh-independent definition of compliant fasteners, elastic-plastic metallic material properties and geometrically nonlinear analysis are utilized in the effort. The results of the analysis are presented and factors influencing the design are assessed and discussed.
A refined method for multivariate meta-analysis and meta-regression.
Jackson, Daniel; Riley, Richard D
2014-02-20
Making inferences about the average treatment effect using the random effects model for meta-analysis is problematic in the common situation where there is a small number of studies. This is because estimates of the between-study variance are not precise enough to accurately apply the conventional methods for testing and deriving a confidence interval for the average effect. We have found that a refined method for univariate meta-analysis, which applies a scaling factor to the estimated effects' standard error, provides more accurate inference. We explain how to extend this method to the multivariate scenario and show that our proposal for refined multivariate meta-analysis and meta-regression can provide more accurate inferences than the more conventional approach. We explain how our proposed approach can be implemented using standard output from multivariate meta-analysis software packages and apply our methodology to two real examples. Copyright © 2013 John Wiley & Sons, Ltd.
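The univariate version of the refinement the authors extend can be sketched as follows: a DerSimonian-Laird fit whose pooled standard error is multiplied by a data-driven scaling factor and compared against a t(k-1) distribution, in the spirit of the Hartung-Knapp-type adjustment (shown here with the common truncation of the factor at 1). The example effect sizes and variances are invented.

```python
import numpy as np
from scipy import stats

def refined_meta(y, v):
    """Univariate random-effects meta-analysis with a scaled SE.

    DerSimonian-Laird tau^2, then a scaling factor q applied to the
    conventional standard error of the pooled effect, with a t(k-1)
    reference distribution for the 95% confidence interval.
    """
    y, v = np.asarray(y, float), np.asarray(v, float)
    k = len(y)
    w = 1.0 / v
    yw = (w * y).sum() / w.sum()
    q_stat = (w * (y - yw) ** 2).sum()
    tau2 = max(0.0, (q_stat - (k - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))
    w_star = 1.0 / (v + tau2)
    mu = (w_star * y).sum() / w_star.sum()
    se = np.sqrt(1.0 / w_star.sum())
    q = np.sqrt((w_star * (y - mu) ** 2).sum() / (k - 1))  # scaling factor
    half = stats.t.ppf(0.975, k - 1) * max(q, 1.0) * se    # truncate q at 1
    return mu, (mu - half, mu + half)

effects = [0.32, 0.15, 0.44, 0.05, 0.28]
variances = [0.02, 0.03, 0.025, 0.04, 0.02]
print(refined_meta(effects, variances))
```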
Error and Uncertainty Analysis for Ecological Modeling and Simulation
2001-12-01
management (LRAM) accounting for environmental, training, and economic factors. In the ELVS methodology, soil erosion status is used as a quantitative ... Monte-Carlo approach. The optimization is realized through economic functions or on decision constraints, such as unit sample cost and number of samples ... nitrate flux to the Gulf of Mexico. Nature (Brief Communication) 414: 166-167. (Uncertainty analysis done with SERDP software)
Life Cycle Costing: A Working Level Approach
1981-06-01
Failure Mode and Effects Analysis (FMEA); Logistics Performance Factors (LPFs); Planning the Use of Life Cycle Cost in the Demonstration ... form. Failure Mode and Effects Analysis (FMEA). Description. FMEA is a technique that attempts to improve the design of any particular unit. The FMEA ... failure modes and also eliminate extra parts or ones that are used to achieve more performance than is necessary [16:5-14]. Advantages. FMEA forces
NASA Astrophysics Data System (ADS)
Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.
2015-07-01
In this paper we present improved methods for discriminating and quantifying Primary Biological Aerosol Particles (PBAP) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1×10^6 points on a desktop computer, allowing each fluorescent particle in a dataset to be explicitly clustered. This reduces the potential for misattribution found in the subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient dataset. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4), and the optical size, asymmetry factor and fluorescence measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98% and 98.1% of the data points, respectively. The best-performing methods were applied to the BEACHON-RoMBAS ambient dataset, where the z-score and range normalisation methods were found to yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with a previous approach (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP overestimates the fungal spore concentration by a factor of 1.5 and underestimates the bacterial aerosol concentration by a factor of 5. We suggest that this is likely due to misattribution arising from poor centroid definition and from failure to assign particles to a cluster, both consequences of the subsampling and comparative attribution method employed by WASP. The methods used here allow the entire fluorescent particle population to be analysed, yielding an explicit cluster attribution for each particle, improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
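A minimal sketch of the best-performing combination reported above (Ward linkage on z-score-normalised data) is given below using SciPy; the synthetic four-channel particle data stand in for WIBS-4 measurements and are assumptions for illustration.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_particles(features, n_clusters=3):
    """Ward-linkage HAC on z-score-normalised UV-LIF features.

    `features` is (n_particles x n_channels), e.g. optical size,
    asymmetry factor and fluorescence intensities per particle. Each
    column is z-scored so that no channel dominates the Euclidean
    distances used by the Ward linkage.
    """
    X = np.asarray(features, float)
    Xz = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    Z = linkage(Xz, method="ward")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

# Three synthetic particle classes with distinct size/fluorescence.
rng = np.random.default_rng(5)
classes = [rng.normal(loc=m, scale=0.3, size=(100, 4))
           for m in ([1, 0, 2, 0.5], [3, 1, 0.5, 2], [0.5, 2, 1, 3])]
labels = cluster_particles(np.vstack(classes))
print(np.bincount(labels)[1:])   # particles per cluster, ~100 each
```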
Investigation of type-I interferon dysregulation by arenaviruses: a multidisciplinary approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kozina, Carol L.; Moorman, Matthew Wallace; Branda, Catherine
2011-09-01
This report provides a detailed overview of the work performed for project number 130781, 'A Systems Biology Approach to Understanding Viral Hemorrhagic Fever Pathogenesis.' We report progress in five key areas: single cell isolation devices and control systems, fluorescent cytokine and transcription factor reporters, on-chip viral infection assays, molecular virology analysis of Arenavirus nucleoprotein structure-function, and development of computational tools to predict virus-host protein interactions. Although a great deal of work remains from that begun here, we have developed several novel single cell analysis tools and knowledge of Arenavirus biology that will facilitate and inform future publications and funding proposals.