Sample records for techniques factor analysis

  1. Determining the Number of Factors in P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, the question remains how these methods perform in within-subjects P-technique factor analysis. A…
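
As background to the number-of-factors problem raised in this abstract, a common rule of thumb is the Kaiser criterion (retain factors whose correlation-matrix eigenvalue exceeds 1). The sketch below, in Python with NumPy, is an illustration on simulated data; the two-factor setup and all numbers are assumptions, not material from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 200 observations of 6 variables driven by 2 latent factors.
n, p, k = 200, 6, 2
loadings = rng.normal(size=(p, k))
scores = rng.normal(size=(n, k))
X = scores @ loadings.T + 0.5 * rng.normal(size=(n, p))

# Eigenvalues of the correlation matrix, largest first.
R = np.corrcoef(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]

# Kaiser criterion: retain factors with eigenvalue > 1.
n_factors_kaiser = int(np.sum(eigvals > 1.0))
print(n_factors_kaiser)
```

The eigenvalues of a correlation matrix always sum to the number of variables, so the rule can retain at most p-1 factors; more robust alternatives (parallel analysis, scree inspection) follow the same eigenvalue computation.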

  2. Factor Analysis and Counseling Research

    ERIC Educational Resources Information Center

    Weiss, David J.

    1970-01-01

    Topics discussed include factor analysis versus cluster analysis, analysis of Q correlation matrices, ipsativity and factor analysis, and tests for the significance of a correlation matrix prior to application of factor analytic techniques. Techniques for factor extraction discussed include principal components, canonical factor analysis, alpha…

  3. Application of multivariate statistical techniques for differentiation of ripe banana flour based on the composition of elements.

    PubMed

    Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat

    2009-01-01

    Major (sodium, potassium, calcium, magnesium) and minor elements (iron, copper, zinc, manganese) and one heavy metal (lead) of Cavendish banana flour and Dream banana flour were determined, and data were analyzed using multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising only manganese and copper; the third factor explained 15.66%, comprising zinc and lead; while the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium exhibited a strong contribution in discriminating the two types of banana flour, affording 100% correct assignation. This study presents the usefulness of multivariate statistical techniques for analysis and interpretation of complex mineral content data from banana flour of different varieties.
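
The discriminant step reported above can be illustrated generically. The sketch below implements Fisher's linear discriminant for two groups in NumPy; the two-variable data are hypothetical stand-ins (not the banana-flour measurements), chosen so the groups are well separated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-group data (e.g. two flour types, two elements measured).
A = rng.normal(loc=[10.0, 5.0], scale=0.5, size=(30, 2))
B = rng.normal(loc=[12.0, 3.0], scale=0.5, size=(30, 2))

# Fisher's linear discriminant: w = Sw^{-1} (mean_A - mean_B).
mA, mB = A.mean(axis=0), B.mean(axis=0)
Sw = np.cov(A, rowvar=False) + np.cov(B, rowvar=False)
w = np.linalg.solve(Sw, mA - mB)

# Classify by projecting onto w and thresholding at the midpoint.
threshold = w @ (mA + mB) / 2.0
pred_A = A @ w > threshold
pred_B = B @ w > threshold
accuracy = (pred_A.sum() + (~pred_B).sum()) / 60.0
print(accuracy)
```

With clearly separated group means, the midpoint rule assigns essentially every observation correctly, which is the mechanism behind the "100% correct assignation" result the abstract reports.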

  4. A Study of Item Bias for Attitudinal Measurement Using Maximum Likelihood Factor Analysis.

    ERIC Educational Resources Information Center

    Mayberry, Paul W.

    A technique for detecting item bias that is responsive to attitudinal measurement considerations is a maximum likelihood factor analysis procedure comparing multivariate factor structures across various subpopulations, often referred to as SIFASP. The SIFASP technique allows for factorial model comparisons in the testing of various hypotheses…

  5. The Recoverability of P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  6. Graphical Representation of University Image: A Correspondence Analysis.

    ERIC Educational Resources Information Center

    Yavas, Ugar; Shemwell, Donald J.

    1996-01-01

    Correspondence analysis, an easy-to-interpret interdependence technique, portrays data graphically to show associations of factors more clearly. A study used the technique with 58 students in one university to determine factors in college choice. Results identified the institution's closest competitors and its positioning in terms of college…

  7. Factor weighting in DRASTIC modeling.

    PubMed

    Pacheco, F A L; Pires, L M G R; Santos, R M B; Sanches Fernandes, L F

    2015-02-01

    Evaluation of aquifer vulnerability comprehends the integration of very diverse data, including soil characteristics (texture), hydrologic settings (recharge), aquifer properties (hydraulic conductivity), environmental parameters (relief), and ground water quality (nitrate contamination). It is therefore a multi-geosphere problem to be handled by a multidisciplinary team. The DRASTIC model remains the most popular technique in use for aquifer vulnerability assessments. The algorithm calculates an intrinsic vulnerability index based on a weighted addition of seven factors. In many studies, the method is subject to adjustments, especially in the factor weights, to meet the particularities of the studied regions. However, adjustments made by different techniques may lead to markedly different vulnerabilities and hence to insecurity in the selection of an appropriate technique. This paper reports the comparison of 5 weighting techniques, an enterprise not attempted before. The studied area comprises 26 aquifer systems located in Portugal. The tested approaches include: the Delphi consensus (original DRASTIC, used as reference), Sensitivity Analysis, Spearman correlations, Logistic Regression and Correspondence Analysis (used as adjustment techniques). In all cases but Sensitivity Analysis, adjustment techniques have privileged the factors representing soil characteristics, hydrologic settings, aquifer properties and environmental parameters, by leveling their weights to ≈4.4, and have subordinated the factors describing the aquifer media by downgrading their weights to ≈1.5. Logistic Regression predicts the highest and Sensitivity Analysis the lowest vulnerabilities. Overall, the vulnerability indices may be separated by a maximum value of 51 points. This represents an uncertainty of 2.5 vulnerability classes, because they are 20 points wide. Given this ambiguity, the selection of a weighting technique to integrate a vulnerability index may require additional expertise to be set up satisfactorily. Following a general criterion that weights must be proportional to the range of the ratings, Correspondence Analysis may be recommended as the best adjustment technique. Copyright © 2014 Elsevier B.V. All rights reserved.
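
The weighted addition at the heart of DRASTIC can be sketched directly. The weights below are the commonly published original Delphi-consensus DRASTIC weights; the factor ratings for the example cell are hypothetical.

```python
# Standard DRASTIC weights for the seven factors (Delphi consensus):
# Depth to water, net Recharge, Aquifer media, Soil media,
# Topography, Impact of vadose zone, hydraulic Conductivity.
WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

def drastic_index(ratings):
    """Weighted sum of the seven factor ratings (each rated 1-10)."""
    return sum(WEIGHTS[f] * r for f, r in ratings.items())

# Hypothetical ratings for one aquifer cell.
ratings = {"D": 7, "R": 6, "A": 5, "S": 4, "T": 9, "I": 3, "C": 2}
index = drastic_index(ratings)
print(index)
```

With ratings constrained to 1-10, the index ranges from 23 to 230, which is why the paper can speak of vulnerability classes 20 points wide; the adjustment techniques the paper compares change only the entries of `WEIGHTS`.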

  8. Man-machine analysis of translation and work tasks of Skylab films

    NASA Technical Reports Server (NTRS)

    Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.

    1979-01-01

    An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.

  9. NECAP 4.1: NASA's Energy-Cost Analysis Program input manual

    NASA Technical Reports Server (NTRS)

    Jensen, R. N.

    1982-01-01

    The computer program NECAP (NASA's Energy Cost Analysis Program) is described. The program is a versatile building design and energy analysis tool which has embodied within it state of the art techniques for performing thermal load calculations and energy use predictions. With the program, comparisons of building designs and operational alternatives for new or existing buildings can be made. The major feature of the program is the response factor technique for calculating the heat transfer through the building surfaces which accounts for the building's mass. The program expands the response factor technique into a space response factor to account for internal building temperature swings; this is extremely important in determining true building loads and energy consumption when internal temperatures are allowed to swing.
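
The response factor technique described above amounts to a convolution of precomputed factors with the history of temperature differences, which is how the surface's thermal mass enters the load calculation. A minimal sketch, with entirely hypothetical response factors and temperatures (not NECAP's actual coefficients):

```python
def heat_flux(response_factors, temp_history):
    """Conduction heat flux at the current hour as a convolution of
    response factors with past temperature differences.
    temp_history[0] is the current hour, temp_history[j] is j hours ago."""
    return sum(y * dt for y, dt in zip(response_factors, temp_history))

# Hypothetical response factors for a massive wall: the early terms are
# small and the peak is delayed, reflecting heat storage in the mass.
Y = [0.05, 0.20, 0.35, 0.25, 0.10, 0.05]

# Temperature differences (outdoor minus indoor, degC), most recent first.
dT = [10.0, 12.0, 15.0, 14.0, 11.0, 9.0]

q = heat_flux(Y, dT)
print(round(q, 3))
```

A lightweight wall would concentrate its response factors in the first terms; spreading them over several hours delays and flattens the computed load, which is the effect the abstract attributes to building mass.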

  10. How Factor Analysis Can Be Used in Classification.

    ERIC Educational Resources Information Center

    Harman, Harry H.

    This is a methodological study that suggests a taxometric technique for objective classification of yeasts. It makes use of the minres method of factor analysis and groups strains of yeast according to their factor profiles. The similarities are judged in the higher-dimensional space determined by the factor analysis, but otherwise rely on the…

  11. Methods for Improving Information from ’Undesigned’ Human Factors Experiments.

    DTIC Science & Technology

    Human factors engineering, Information processing, Regression analysis, Experimental design, Least squares method, Analysis of variance, Correlation techniques, Matrices (Mathematics), Multiple disciplines, Mathematical prediction

  12. What School Psychologists Need to Know about Factor Analysis

    ERIC Educational Resources Information Center

    McGill, Ryan J.; Dombrowski, Stefan C.

    2017-01-01

    Factor analysis is a versatile class of psychometric techniques used by researchers to provide insight into the psychological dimensions (factors) that may account for the relationships among variables in a given dataset. The primary goal of a factor analysis is to determine a more parsimonious set of variables (i.e., fewer than the number of…

  13. Replace-approximation method for ambiguous solutions in factor analysis of ultrasonic hepatic perfusion

    NASA Astrophysics Data System (ADS)

    Zhang, Ji; Ding, Mingyue; Yuchi, Ming; Hou, Wenguang; Ye, Huashan; Qiu, Wu

    2010-03-01

    Factor analysis is an efficient technique for the analysis of dynamic structures in medical image sequences and recently has been used in contrast-enhanced ultrasound (CEUS) of hepatic perfusion. Time-intensity curves (TICs) extracted by factor analysis can provide much more diagnostic information for radiologists and improve the diagnostic rate of focal liver lesions (FLLs). However, one of the major drawbacks of factor analysis of dynamic structures (FADS) is the nonuniqueness of the result when only the non-negativity criterion is used. In this paper, we propose a new method of replace-approximation based on apex-seeking for ambiguous FADS solutions. Due to a partial overlap of different structures, factor curves are assumed to be approximately replaced by curves existing in the medical image sequences. Therefore, how to find optimal curves is the key point of the technique. No matter how many structures are assumed, our method always starts to seek apexes in the one-dimensional space onto which the original high-dimensional data are mapped. By finding two stable apexes in one-dimensional space, the method can ascertain the third one. The process can be continued until all structures are found. This technique was tested on two phantoms of blood perfusion and compared to two variants of the apex-seeking method. The results showed that the technique outperformed the two variants in region-of-interest measurements from the phantom data. It can be applied to the estimation of TICs derived from CEUS images and the separation of different physiological regions in hepatic perfusion.

  14. Simulating the Effects of Common and Specific Abilities on Test Performance: An Evaluation of Factor Analysis

    ERIC Educational Resources Information Center

    McFarland, Dennis J.

    2014-01-01

    Purpose: Factor analysis is a useful technique to aid in organizing multivariate data characterizing speech, language, and auditory abilities. However, knowledge of the limitations of factor analysis is essential for proper interpretation of results. The present study used simulated test scores to illustrate some characteristics of factor…

  15. Performance of dental impression materials: Benchmarking of materials and techniques by three-dimensional analysis.

    PubMed

    Rudolph, Heike; Graf, Michael R S; Kuhn, Katharina; Rupf-Köhler, Stephanie; Eirich, Alfred; Edelmann, Cornelia; Quaas, Sebastian; Luthardt, Ralph G

    2015-01-01

    Among other factors, the precision of dental impressions is an important and determining factor for the fit of dental restorations. The aim of this study was to examine the three-dimensional (3D) precision of gypsum dies made using a range of impression techniques and materials. Ten impressions of a steel canine were fabricated for each of the 24 material-method-combinations and poured with type 4 die stone. The dies were optically digitized, aligned to the CAD model of the steel canine, and 3D differences were calculated. The results were statistically analyzed using one-way analysis of variance. Depending on material and impression technique, the mean values had a range between +10.9/-10.0 µm (SD 2.8/2.3) and +16.5/-23.5 µm (SD 11.8/18.8). Qualitative analysis using color-coded graphs showed a characteristic location of deviations for different impression techniques. Three-dimensional analysis provided a comprehensive picture of the achievable precision. Processing aspects and impression technique were of significant influence.

  16. Poster — Thur Eve — 03: Application of the non-negative matrix factorization technique to [11C]-DTBZ dynamic PET data for the early detection of Parkinson's disease

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Dong-Chang; Jans, Hans; McEwan, Sandy

    2014-08-15

    In this work, a class of non-negative matrix factorization (NMF) technique known as alternating non-negative least squares, combined with the projected gradient method, is used to analyze twenty-five [11C]-DTBZ dynamic PET/CT brain data sets. For each subject, a two-factor model is assumed and two factors representing the striatum (factor 1) and the non-striatum (factor 2) tissues are extracted using the proposed NMF technique and the commercially available factor analysis software "Pixies". The extracted factor 1 and 2 curves represent the binding site of the radiotracer and describe the uptake and clearance of the radiotracer by soft tissues in the brain, respectively. The proposed NMF technique uses prior information about the dynamic data to obtain sample time-activity curves representing the striatum and the non-striatum tissues. These curves are then used for "warm" starting the optimization. Factor solutions from the two methods are compared graphically and quantitatively. In healthy subjects, radiotracer uptake by factors 1 and 2 is approximately 35–40% and 60–65%, respectively. The solutions are also used to develop a factor-based metric for the detection of early, untreated Parkinson's disease. The metric stratifies healthy subjects from suspected Parkinson's patients (based on the graphical method). The analysis shows that both techniques produce comparable results with similar computational time. The "semi-automatic" approach used by the NMF technique allows clinicians to manually set a starting condition for "warm" starting the optimization in order to facilitate control and efficient interaction with the data.
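
The alternating non-negative update idea can be sketched with plain projected gradient steps in NumPy. The synthetic two-factor data below stand in for the dynamic PET frames; this is an illustrative sketch, not the authors' implementation or the Pixies software.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "dynamic data": 50 time frames x 40 pixels mixed from two
# non-negative factors (cf. the striatum / non-striatum curves above).
true_curves = np.abs(rng.normal(size=(50, 2)))
true_images = np.abs(rng.normal(size=(2, 40)))
V = true_curves @ true_images

# Alternating projected-gradient NMF: V ~= W @ H with W, H >= 0.
W = np.abs(rng.normal(size=(50, 2)))
H = np.abs(rng.normal(size=(2, 40)))
step = 5e-3
for _ in range(5000):
    # Gradient step on each factor, then projection onto the
    # non-negative orthant (the "projected gradient" part).
    W = np.maximum(0.0, W - step * (W @ H - V) @ H.T)
    H = np.maximum(0.0, H - step * W.T @ (W @ H - V))

err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(err)
```

"Warm" starting, as described in the abstract, would simply replace the random initial `W` (or `H`) with sample time-activity curves supplied by the analyst.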

  17. A systematic review of the relationship factor between women and health professionals within the multivariant analysis of maternal satisfaction.

    PubMed

    Macpherson, Ignacio; Roqué-Sánchez, María V; Legget Bn, Finola O; Fuertes, Ferran; Segarra, Ignacio

    2016-10-01

    Personalised support provided to women by health professionals is one of the prime factors in attaining women's satisfaction during pregnancy and childbirth. However, the multifactorial nature of 'satisfaction' makes it difficult to assess. Statistical multivariate analysis may be an effective technique to obtain in-depth quantitative evidence of the importance of this factor and its interaction with the other factors involved. This technique allows us to estimate the importance of overall satisfaction in its context and suggest actions for healthcare services. A systematic review was conducted of studies that quantitatively measure the personal relationship between women and healthcare professionals (gynecologists, obstetricians, nurses, midwives, etc.) regarding maternity care satisfaction. The literature search focused on studies carried out between 1970 and 2014 that used multivariate analyses and included the woman-caregiver relationship as a factor of their analysis. Twenty-four studies which applied various multivariate analysis tools to different periods of maternity care (antenatal, perinatal, post partum) were selected. The studies included discrete scale scores and questionnaires from women with low-risk pregnancies. The "personal relationship" factor appeared under various names: care received, personalised treatment, professional support, amongst others. The most common multivariate techniques used to assess the percentage of variance explained and the odds ratio of each factor were principal component analysis and logistic regression. The data, variables and factor analysis suggest that continuous, personalised care provided by the usual midwife and delivered within a family or a specialised setting generates the highest level of satisfaction. In addition, these factors foster the woman's psychological and physiological recovery, often surpassing clinical action (e.g. medicalization and hospital organization) and/or physiological determinants (e.g. pain, pathologies, etc.). Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Establishing Evidence for Internal Structure Using Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Watson, Joshua C.

    2017-01-01

    Exploratory factor analysis (EFA) is a data reduction technique used to condense data into smaller sets of summary variables by identifying underlying factors potentially accounting for patterns of collinearity among said variables. Using an illustrative example, the 5 general steps of EFA are described with best practices for decision making…
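
The general EFA workflow sketched in the abstract can be compressed into a few lines. The sketch below uses the principal-component extraction method on simulated data; the choice of two factors, the data, and the omission of rotation are simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: 8 observed variables driven by 2 underlying factors.
n, p, k = 300, 8, 2
L_true = rng.normal(size=(p, k))
X = rng.normal(size=(n, k)) @ L_true.T + rng.normal(size=(n, p))

# Steps 1-2: inspect the correlation matrix and fix the number of factors.
R = np.corrcoef(X, rowvar=False)
vals, vecs = np.linalg.eigh(R)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

# Step 3: extract loadings (eigenvector scaled by sqrt of eigenvalue).
loadings = vecs[:, :k] * np.sqrt(vals[:k])

# Step 4: communalities, the variance of each variable the factors explain.
communalities = (loadings ** 2).sum(axis=1)

# Step 5: proportion of total variance summarized by the retained factors.
explained = vals[:k].sum() / p
print(loadings.shape, explained)
```

A full EFA would add a rotation (e.g. varimax or promax) to make the loading pattern interpretable, but rotation does not change the communalities or the total variance explained.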

  19. Bayesian inference of the number of factors in gene-expression analysis: application to human virus challenge studies.

    PubMed

    Chen, Bo; Chen, Minhua; Paisley, John; Zaas, Aimee; Woods, Christopher; Ginsburg, Geoffrey S; Hero, Alfred; Lucas, Joseph; Dunson, David; Carin, Lawrence

    2010-11-09

    Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data.

  20. Instruments measuring perceived racism/racial discrimination: review and critique of factor analytic techniques.

    PubMed

    Atkins, Rahshida

    2014-01-01

    Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated if the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81 percent of studies. The majority of studies reported appropriate criteria for the acceptance of un-rotated factors (81%) and justified the rotation method (75%). Exactly 94 percent of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81 percent of the studies, the conceptualized dimensions were supported by factor analysis.

  2. Aggregation factor analysis for protein formulation by a systematic approach using FTIR, SEC and design of experiments techniques.

    PubMed

    Feng, Yan Wen; Ooishi, Ayako; Honda, Shinya

    2012-01-05

    A simple systematic approach using Fourier transform infrared (FTIR) spectroscopy, size exclusion chromatography (SEC) and design of experiments (DOE) techniques was applied to the analysis of aggregation factors for protein formulations in stress and accelerated testing. FTIR and SEC were used to evaluate protein conformational and storage stabilities, respectively. DOE was used to determine the suitable formulation and to analyze both the main effect of single factors and the interaction effect of combined factors on aggregation. Our results indicated that (i) analysis at a low protein concentration is not always applicable to high concentration formulations; (ii) an investigation of interaction effects of combined factors as well as main effects of single factors is effective for improving conformational stability of proteins; (iii) with the exception of pH, the results of stress testing with regard to aggregation factors would be available for suitable formulation instead of performing time-consuming accelerated testing; (iv) a suitable pH condition should not be determined in stress testing but in accelerated testing, because of inconsistent effects of pH on conformational and storage stabilities. In summary, we propose a three-step strategy, using FTIR, SEC and DOE techniques, to effectively analyze the aggregation factors and perform a rapid screening for suitable conditions of protein formulation. Copyright © 2011 Elsevier B.V. All rights reserved.

  3. MULTIVARIATE ANALYSIS OF DRINKING BEHAVIOUR IN A RURAL POPULATION

    PubMed Central

    Mathrubootham, N.; Bashyam, V.S.P.; Shahjahan

    1997-01-01

    This study was carried out to find out the drinking pattern in a rural population, using multivariate techniques. 386 current users identified in a community were assessed with regard to their drinking behaviours using a structured interview. For purposes of the study the questions were condensed into 46 meaningful variables. In bivariate analysis, 14 variables including dependent variables such as dependence, MAST & CAGE (measuring alcoholic status), Q.F. Index and troubled drinking were found to be significant. Taking these variables, further multivariate techniques such as ANOVA, correlation, regression analysis and factor analysis were carried out using both SPSS PC+ and an HCL Magnum mainframe computer with the FOCUS package under UNIX. Results revealed that a number of factors such as drinking style, duration of drinking, pattern of abuse, Q.F. Index and various problems influenced drinking, and some of them set up a vicious circle. Factor analysis revealed mainly 3 factors: abuse, dependence and social drinking. Dependence could be divided into low/moderate dependence. The implications and practical applications of these tests are also discussed. PMID:21584077

  4. A new technique for ordering asymmetrical three-dimensional data sets in ecology.

    PubMed

    Pavoine, Sandrine; Blondel, Jacques; Baguette, Michel; Chessel, Daniel

    2007-02-01

    The aim of this paper is to tackle the problem that arises from asymmetrical data cubes formed by two crossed factors fixed by the experimenter (factor A and factor B, e.g., sites and dates) and a factor which is not controlled for (the species). The entries of this cube are densities in species. We approach this kind of data by the comparison of patterns, that is to say by analyzing first the effect of factor B on the species-factor A pattern, and second the effect of factor A on the species-factor B pattern. The analysis of patterns instead of individual responses requires a correspondence analysis. We use a method we call Foucart's correspondence analysis to coordinate the correspondence analyses of several independent matrices of species x factor A (respectively B) type, corresponding to each modality of factor B (respectively A). Such coordination makes it possible to evaluate the effect of factor B (respectively A) on the species-factor A (respectively B) pattern. The results obtained by such a procedure are much more insightful than those resulting from a classical single correspondence analysis applied to the global matrix that is obtained by simply unrolling the data cube, juxtaposing for example the individual species x factor A matrices through modalities of factor B. This is because a single global correspondence analysis combines three effects of factors in a way that cannot be determined from factorial maps (factor A, factor B, and factor A x factor B interaction) whereas the applications of Foucart's correspondence analysis clearly discriminate two different issues. Using two data sets, we illustrate that this technique proves to be particularly powerful in the analyses of ecological convergence which include several distinct data sets and in the analyses of spatiotemporal variations of species distributions.
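
Plain single-table correspondence analysis, the building block that Foucart's method coordinates across matrices, reduces to an SVD of standardized residuals. A minimal sketch with a hypothetical species-by-site count table (not the paper's data):

```python
import numpy as np

# Hypothetical species x site count table.
N = np.array([[20,  5,  2],
              [ 4, 18,  6],
              [ 1,  7, 22],
              [10, 10, 10]], dtype=float)

P = N / N.sum()                       # correspondence matrix
r = P.sum(axis=1)                     # row masses
c = P.sum(axis=0)                     # column masses

# Standardized residuals: (P_ij - r_i c_j) / sqrt(r_i c_j).
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))

# SVD gives the principal axes; squared singular values sum to the
# total inertia (chi-square statistic divided by the grand total).
U, sv, Vt = np.linalg.svd(S, full_matrices=False)
inertia = (sv ** 2).sum()

# Row (species) principal coordinates.
row_coords = (U * sv) / np.sqrt(r)[:, None]
print(row_coords[:, :2].shape)
```

Foucart's variant applies this decomposition to an average table and then projects each individual species-by-factor matrix onto the common axes, which is what makes patterns comparable across the modalities of the other factor.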

  5. Analysis and prediction of Multiple-Site Damage (MSD) fatigue crack growth

    NASA Technical Reports Server (NTRS)

    Dawicke, D. S.; Newman, J. C., Jr.

    1992-01-01

    A technique was developed to calculate the stress intensity factor for multiple interacting cracks. The analysis was verified through comparison with accepted methods of calculating stress intensity factors. The technique was incorporated into a fatigue crack growth prediction model and used to predict the fatigue crack growth life for multiple-site damage (MSD). The analysis was verified through comparison with experiments conducted on uniaxially loaded flat panels with multiple cracks. Configurations with nearly equal and unequal crack distributions were examined. The fatigue crack growth predictions agreed within 20 percent of the experimental lives for all crack configurations considered.
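
For orientation, the textbook baseline that multi-crack interaction methods extend is the single center crack in an infinite plate, K = sigma * sqrt(pi * a). The sketch below computes it for hypothetical values; it is not the paper's interacting-crack technique.

```python
import math

def stress_intensity_center_crack(stress, half_length):
    """Mode-I stress intensity factor for a through crack of length 2a
    in an infinite plate under remote uniaxial tension:
    K = sigma * sqrt(pi * a)."""
    return stress * math.sqrt(math.pi * half_length)

# Example: 100 MPa remote stress, 5 mm crack half-length (0.005 m).
K = stress_intensity_center_crack(100e6, 0.005)   # Pa*sqrt(m)
print(K / 1e6)   # in MPa*sqrt(m)
```

Interaction between neighboring MSD cracks raises K above this baseline, which is why dedicated multi-crack solutions like the one described above are needed for life prediction.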

  6. Factors influencing patient compliance with therapeutic regimens in chronic heart failure: A critical incident technique analysis.

    PubMed

    Strömberg, A; Broström, A; Dahlström, U; Fridlund, B

    1999-01-01

    The aim of this study was to identify factors influencing compliance with prescribed treatment in patients with chronic heart failure. A qualitative design with a critical incident technique was used. Incidents were collected through interviews with 25 patients with heart failure strategically selected from a primary health care clinic, a medical ward, and a specialist clinic. Two hundred sixty critical incidents were identified in the interviews and 2 main areas emerged in the analysis: inward factors and outward factors. The inward factors described how compliance was influenced by the personality of the patient, the disease, and the treatment. The outward factors described how compliance was influenced by social activities, social relationships, and health care professionals. By identifying the inward and outward factors influencing patients with chronic heart failure, health care professionals can assess whether intervention is needed to increase compliance.

  7. Spectral compression algorithms for the analysis of very large multivariate images

    DOEpatents

    Keenan, Michael R.

    2007-10-16

    A method for spectrally compressing data sets enables the efficient analysis of very large multivariate images. The spectral compression algorithm uses a factored representation of the data that can be obtained from Principal Components Analysis or other factorization technique. Furthermore, a block algorithm can be used for performing common operations more efficiently. An image analysis can be performed on the factored representation of the data, using only the most significant factors. The spectral compression algorithm can be combined with a spatial compression algorithm to provide further computational efficiencies.
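
A factored representation of the kind the patent describes can be obtained with a truncated SVD (equivalent, after centering, to PCA): keeping only the most significant factors compresses the spectral dimension while preserving the image for analysis. The data below are synthetic and the rank choice is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical multivariate image: 500 pixels x 100 spectral channels,
# generated from 3 spectral components plus a little noise.
A = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 100))
D = A + 0.01 * rng.normal(size=(500, 100))

# Factored (compressed) representation keeping the q largest factors:
# D ~= scores @ spectra, storing (500 + 100) * q numbers instead of 50000.
q = 3
U, s, Vt = np.linalg.svd(D, full_matrices=False)
scores, spectra = U[:, :q] * s[:q], Vt[:q]

# Downstream image analysis can operate on the factored form; the
# reconstruction error is small because the data are nearly rank-3.
rel_err = np.linalg.norm(D - scores @ spectra) / np.linalg.norm(D)
print(rel_err)
```

The block-wise processing mentioned in the abstract would apply the same factorization to tiles of pixels at a time, so very large images never need to be held in memory uncompressed.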

  8. The Shock and Vibration Digest. Volume 16, Number 1

    DTIC Science & Technology

    1984-01-01

    investigation of the measurement of frequency band average loss factors of structural components for use in the statistical energy analysis method of...stiffness. Matrix methods Key Words: Finite element technique, Statistical energy analysis, Experimental techniques, Framed structures, Computer...programs In order to further understand the practical application of the statistical energy analysis, a two-section plate-like frame structure is

  9. Passive fishing techniques: a cause of turtle mortality in the Mississippi River

    USGS Publications Warehouse

    Barko, V.A.; Briggler, J.T.; Ostendorf, D.E.

    2004-01-01

We investigated variation of incidentally captured turtle mortality in response to environmental factors and passive fishing techniques. We used Long Term Resource Monitoring Program (LTRMP) data collected from 1996 to 2001 in the unimpounded upper Mississippi River (UMR) adjacent to Missouri and Illinois, USA. We used a principal components analysis (PCA) and a stepwise discriminant function analysis to identify factors correlated with mortality of captured turtles. Furthermore, we were interested in what percentage of turtles died from passive fishing techniques and what techniques caused the most turtle mortality. The main factors influencing captured turtle mortality were water temperature and depth at net deployment. Fyke nets captured the most turtles and caused the most turtle mortality. Almost 90% of mortalities occurred in offshore aquatic areas (i.e., side channel or tributary). Our results provide information on causes of turtle mortality (as bycatch) in a riverine system and implications for river turtle conservation by suggesting management strategies to reduce turtle bycatch and decrease mortality of captured turtles.

  10. Biostatistics Series Module 10: Brief Overview of Multivariate Methods.

    PubMed

    Hazra, Avijit; Gogtay, Nithya

    2017-01-01

    Multivariate analysis refers to statistical techniques that simultaneously look at three or more variables in relation to the subjects under investigation with the aim of identifying or clarifying the relationships between them. These techniques have been broadly classified as dependence techniques, which explore the relationship between one or more dependent variables and their independent predictors, and interdependence techniques, that make no such distinction but treat all variables equally in a search for underlying relationships. Multiple linear regression models a situation where a single numerical dependent variable is to be predicted from multiple numerical independent variables. Logistic regression is used when the outcome variable is dichotomous in nature. The log-linear technique models count type of data and can be used to analyze cross-tabulations where more than two variables are included. Analysis of covariance is an extension of analysis of variance (ANOVA), in which an additional independent variable of interest, the covariate, is brought into the analysis. It tries to examine whether a difference persists after "controlling" for the effect of the covariate that can impact the numerical dependent variable of interest. Multivariate analysis of variance (MANOVA) is a multivariate extension of ANOVA used when multiple numerical dependent variables have to be incorporated in the analysis. Interdependence techniques are more commonly applied to psychometrics, social sciences and market research. Exploratory factor analysis and principal component analysis are related techniques that seek to extract from a larger number of metric variables, a smaller number of composite factors or components, which are linearly related to the original variables. Cluster analysis aims to identify, in a large number of cases, relatively homogeneous groups called clusters, without prior information about the groups. 
The calculation-intensive nature of multivariate analysis has so far precluded most researchers from using these techniques routinely. The situation is now changing with the wider availability and increasing sophistication of statistical software, and researchers should no longer shy away from exploring the applications of multivariate methods to real-life data sets.
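As an illustration of the first dependence technique described above, here is a minimal numpy sketch of multiple linear regression fit by ordinary least squares; the data and coefficients are synthetic, invented for the example:

```python
import numpy as np

# Predict one numerical dependent variable from three numerical predictors.
rng = np.random.default_rng(42)
n = 200
X = rng.normal(size=(n, 3))                  # three independent variables
true_beta = np.array([2.0, -1.0, 0.5])
y = 1.5 + X @ true_beta + 0.1 * rng.normal(size=n)

# Add an intercept column and solve the least-squares problem.
A = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(beta_hat, 2))  # ≈ [1.5, 2.0, -1.0, 0.5]
```

The recovered coefficients approximate the generating ones; the same least-squares machinery underlies the ANCOVA and MANOVA extensions mentioned in the record.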

  11. Replication Analysis in Exploratory Factor Analysis: What It Is and Why It Makes Your Analysis Better

    ERIC Educational Resources Information Center

    Osborne, Jason W.; Fitzpatrick, David C.

    2012-01-01

    Exploratory Factor Analysis (EFA) is a powerful and commonly-used tool for investigating the underlying variable structure of a psychometric instrument. However, there is much controversy in the social sciences with regard to the techniques used in EFA (Ford, MacCallum, & Tait, 1986; Henson & Roberts, 2006) and the reliability of the outcome.…

  12. Data-Mining Techniques in Detecting Factors Linked to Academic Achievement

    ERIC Educational Resources Information Center

    Martínez Abad, Fernando; Chaparro Caso López, Alicia A.

    2017-01-01

    In light of the emergence of statistical analysis techniques based on data mining in education sciences, and the potential they offer to detect non-trivial information in large databases, this paper presents a procedure used to detect factors linked to academic achievement in large-scale assessments. The study is based on a non-experimental,…

  13. Application of Information-Theoretic Data Mining Techniques in a National Ambulatory Practice Outcomes Research Network

    PubMed Central

    Wright, Adam; Ricciardi, Thomas N.; Zwick, Martin

    2005-01-01

    The Medical Quality Improvement Consortium data warehouse contains de-identified data on more than 3.6 million patients including their problem lists, test results, procedures and medication lists. This study uses reconstructability analysis, an information-theoretic data mining technique, on the MQIC data warehouse to empirically identify risk factors for various complications of diabetes including myocardial infarction and microalbuminuria. The risk factors identified match those risk factors identified in the literature, demonstrating the utility of the MQIC data warehouse for outcomes research, and RA as a technique for mining clinical data warehouses. PMID:16779156

  14. Random vibration analysis of space flight hardware using NASTRAN

    NASA Technical Reports Server (NTRS)

    Thampi, S. K.; Vidyasagar, S. N.

    1990-01-01

During liftoff and ascent flight phases, the Space Transportation System (STS) and payloads are exposed to the random acoustic environment produced by engine exhaust plumes and aerodynamic disturbances. The analysis of payloads for randomly fluctuating loads is usually carried out using Miles' relationship. This approximation technique computes an equivalent load factor as a function of the natural frequency of the structure, the power spectral density of the excitation, and the magnification factor at resonance. Due to the assumptions inherent in Miles' equation, random load factors are often overestimated by this approach. In such cases, the estimates can be refined using alternate techniques such as time domain simulations or frequency domain spectral analysis. Described here is the use of NASTRAN to compute more realistic random load factors through spectral analysis. The procedure is illustrated using Spacelab Life Sciences (SLS-1) payloads and certain unique features of this problem are described. The solutions are compared with Miles' results in order to establish trends of over- or under-prediction.
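Miles' relationship described above can be sketched in a few lines; the 100 Hz natural frequency, Q = 10 magnification, and 0.04 g^2/Hz input are hypothetical numbers chosen only to exercise the formula:

```python
import math

# Miles' equation: equivalent RMS acceleration for a single-DOF structure
# under a random acceleration spectral density that is flat near resonance.
def miles_grms(f_n_hz: float, q: float, asd_g2_per_hz: float) -> float:
    """g_rms = sqrt((pi/2) * f_n * Q * ASD(f_n))."""
    return math.sqrt((math.pi / 2.0) * f_n_hz * q * asd_g2_per_hz)

# Hypothetical mode: 100 Hz, Q = 10, 0.04 g^2/Hz input.
# Design load factors are often quoted at the 3-sigma level.
g_rms = miles_grms(100.0, 10.0, 0.04)
print(round(g_rms, 2), round(3 * g_rms, 2))  # prints: 7.93 23.78
```

A frequency-domain spectral analysis, as done with NASTRAN in the record, replaces the single-mode flat-spectrum assumption with the actual modal response, which is why its load factors come out lower.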

  15. A human factors analysis of EVA time requirements

    NASA Technical Reports Server (NTRS)

    Pate, D. W.

    1996-01-01

Human Factors Engineering (HFE), also known as Ergonomics, is a discipline whose goal is to engineer a safer, more efficient interface between humans and machines. HFE makes use of a wide range of tools and techniques to fulfill this goal. One of these tools is known as motion and time study, a technique used to develop time standards for given tasks. A human factors motion and time study was initiated with the goal of developing a database of EVA task times and a method of utilizing the database to predict how long an ExtraVehicular Activity (EVA) should take. Initial development relied on the EVA activities performed during the STS-61 mission (Hubble repair). The first step of the analysis was to become familiar with EVAs and with the previous studies and documents produced on EVAs. After reviewing these documents, an initial set of task primitives and task time modifiers was developed. Videotaped footage of STS-61 EVAs was analyzed using these primitives and task time modifiers. Data for two entire EVA missions and portions of several others, each with two EVA astronauts, were collected for analysis. Feedback from the analysis of the data will be used to further refine the primitives and task time modifiers used. Analysis of variance techniques for categorical data will be used to determine which factors may, individually or through interactions, affect the primitive times and how much of an effect they have.

  16. Triangular covariance factorizations for. Ph.D. Thesis. - Calif. Univ.

    NASA Technical Reports Server (NTRS)

    Thornton, C. L.

    1976-01-01

An improved computational form of the discrete Kalman filter is derived using an upper triangular factorization of the error covariance matrix. The covariance P is factored such that P = UDU^T, where U is unit upper triangular and D is diagonal. Recursions are developed for propagating the U-D covariance factors together with the corresponding state estimate. The resulting algorithm, referred to as the U-D filter, combines the superior numerical precision of square root filtering techniques with an efficiency comparable to that of Kalman's original formula. Moreover, this method is easily implemented and involves no more computer storage than the Kalman algorithm. These characteristics make the U-D method an attractive real-time filtering technique. A new covariance error analysis technique is obtained from an extension of the U-D filter equations. This evaluation method is flexible and efficient and may provide significantly improved numerical results. Cost comparisons show that for a large class of problems the U-D evaluation algorithm is noticeably less expensive than conventional error analysis methods.
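A minimal sketch of the U-D factorization itself (not the full filter mechanization) follows, assuming a symmetric positive definite covariance P; the recursion runs over columns from last to first, which is the standard form of this decomposition:

```python
import numpy as np

# Factor P = U D U^T with U unit upper triangular and D diagonal.
def ud_factor(P: np.ndarray):
    P = P.astype(float)
    n = P.shape[0]
    U = np.eye(n)
    d = np.zeros(n)
    for j in range(n - 1, -1, -1):
        # Diagonal entry: remove contributions of already-processed columns.
        d[j] = P[j, j] - np.sum(d[j + 1:] * U[j, j + 1:] ** 2)
        for i in range(j):
            U[i, j] = (P[i, j]
                       - np.sum(d[j + 1:] * U[i, j + 1:] * U[j, j + 1:])) / d[j]
    return U, d

# Verify on a small symmetric positive definite covariance matrix.
A = np.array([[2.0, 0.5, 0.1], [0.3, 1.5, 0.2], [0.4, 0.1, 1.0]])
P = A @ A.T
U, d = ud_factor(P)
assert np.allclose(U @ np.diag(d) @ U.T, P)
assert np.all(d > 0)   # positive D reflects positive definiteness
```

Because only U and d are propagated, positive definiteness of the reconstructed covariance is preserved without computing any square roots, which is the source of the numerical advantage claimed in the record.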

  17. Bayesian inference of the number of factors in gene-expression analysis: application to human virus challenge studies

    PubMed Central

    2010-01-01

Background Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Results Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Conclusions Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data. PMID:21062443

  18. Error analysis of multi-needle Langmuir probe measurement technique.

    PubMed

    Barjatya, Aroh; Merritt, William

    2018-04-01

The multi-needle Langmuir probe is a fairly new instrument technique that has been flown on several recent sounding rockets and is slated to fly on a subset of the QB50 CubeSat constellation. This paper takes a fundamental look into the data analysis procedures used for this instrument to derive absolute electron density. Our calculations suggest that while the technique remains promising, the current data analysis procedures could easily result in errors of 50% or more. We present a simple data analysis adjustment that can reduce errors by at least a factor of five in typical operation.

  19. Error analysis of multi-needle Langmuir probe measurement technique

    NASA Astrophysics Data System (ADS)

    Barjatya, Aroh; Merritt, William

    2018-04-01

The multi-needle Langmuir probe is a fairly new instrument technique that has been flown on several recent sounding rockets and is slated to fly on a subset of the QB50 CubeSat constellation. This paper takes a fundamental look into the data analysis procedures used for this instrument to derive absolute electron density. Our calculations suggest that while the technique remains promising, the current data analysis procedures could easily result in errors of 50% or more. We present a simple data analysis adjustment that can reduce errors by at least a factor of five in typical operation.

  20. Understanding the Support Needs of People with Intellectual and Related Developmental Disabilities through Cluster Analysis and Factor Analysis of Statewide Data

    ERIC Educational Resources Information Center

    Viriyangkura, Yuwadee

    2014-01-01

    Through a secondary analysis of statewide data from Colorado, people with intellectual and related developmental disabilities (ID/DD) were classified into five clusters based on their support needs characteristics using cluster analysis techniques. Prior latent factor models of support needs in the field of ID/DD were examined to investigate the…

  1. [Introduction to Exploratory Factor Analysis (EFA)].

    PubMed

    Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón

    2012-03-01

Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. To present in a clear and concise manner the main applications of this technique, to determine the basic requirements for its use, providing a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation so as not to incur erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology to achieve factor derivation, global adjustment evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  2. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials.

    PubMed

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-12

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.

  3. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials

    PubMed Central

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-01

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis. PMID:28787839

  4. Simple Assessment Techniques for Soil and Water. Environmental Factors in Small Scale Development Projects. Workshops.

    ERIC Educational Resources Information Center

    Coordination in Development, New York, NY.

    This booklet was produced in response to the growing need for reliable environmental assessment techniques that can be applied to small-scale development projects. The suggested techniques emphasize low-technology environmental analysis. Although these techniques may lack precision, they can be extremely valuable in helping to assure the success…

  5. Fluctuations in alliance and use of techniques over time: A bidirectional relation between use of "common factors" techniques and the development of the working alliance.

    PubMed

    Solomonov, Nili; McCarthy, Kevin S; Keefe, John R; Gorman, Bernard S; Blanchard, Mark; Barber, Jacques P

    2018-01-01

The aim of this study was twofold: (a) Investigate whether therapists are consistent in their use of therapeutic techniques throughout supportive-expressive therapy (SET) and (b) Examine the bidirectional relation between therapists' use of therapeutic techniques and the working alliance over the course of SET. Thirty-seven depressed patients were assigned to 16 weeks of SET as part of a larger randomized clinical trial (Barber, Barrett, Gallop, Rynn, & Rickels, ). The Working Alliance Inventory-Short Form (WAI-SF) was collected at Weeks 2, 4, and 8. Use of therapeutic interventions was rated by independent observers using the Multitheoretical List of Therapeutic Interventions (MULTI). Intraclass correlation coefficients assessed therapists' consistency in use of techniques. A cross-lagged path analysis estimated the bidirectional relation between the WAI-SF and the MULTI across time. Therapists were moderately consistent in their use of prescribed techniques (psychodynamic, process-experiential, and person-centred). However, they were inconsistent, or more flexible, in their use of "common factors" techniques (e.g., empathy, active listening, hope, and encouragements). A positive bidirectional relation was found between use of common factors techniques and the working alliance, such that initial high levels of common factors (but not prescribed) techniques predicted higher alliance later on and vice versa. Therapists tend to modulate their use of common factors techniques across treatment. Additionally, when a strong working alliance is developed early in treatment, therapists tend to use more common factors later on. Moreover, high use of common factors techniques is predictive of later improvement in the alliance. Copyright © 2017 John Wiley & Sons, Ltd.

  6. Changes in frontal plane dynamics and the loading response phase of the gait cycle are characteristic of severe knee osteoarthritis: application of a multidimensional analysis technique.

    PubMed

    Astephen, J L; Deluzio, K J

    2005-02-01

    Osteoarthritis of the knee is related to many correlated mechanical factors that can be measured with gait analysis. Gait analysis results in large data sets. The analysis of these data is difficult due to the correlated, multidimensional nature of the measures. A multidimensional model that uses two multivariate statistical techniques, principal component analysis and discriminant analysis, was used to discriminate between the gait patterns of the normal subject group and the osteoarthritis subject group. Nine time varying gait measures and eight discrete measures were included in the analysis. All interrelationships between and within the measures were retained in the analysis. The multidimensional analysis technique successfully separated the gait patterns of normal and knee osteoarthritis subjects with a misclassification error rate of <6%. The most discriminatory feature described a static and dynamic alignment factor. The second most discriminatory feature described a gait pattern change during the loading response phase of the gait cycle. The interrelationships between gait measures and between the time instants of the gait cycle can provide insight into the mechanical mechanisms of pathologies such as knee osteoarthritis. These results suggest that changes in frontal plane loading and alignment and the loading response phase of the gait cycle are characteristic of severe knee osteoarthritis gait patterns. Subsequent investigations earlier in the disease process may suggest the importance of these factors to the progression of knee osteoarthritis.
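The two-stage model can be sketched on synthetic data: PCA for dimension reduction of correlated measures, followed by a Fisher linear discriminant (a basic form of discriminant analysis). The "gait features" below are invented for illustration; the study's actual measures and misclassification procedure are more elaborate:

```python
import numpy as np

# Two groups with correlated features: "normal" and a shifted "OA" group.
rng = np.random.default_rng(1)
n = 100
normal = rng.normal(loc=0.0, scale=1.0, size=(n, 6))
oa = rng.normal(loc=1.0, scale=1.0, size=(n, 6))
X = np.vstack([normal, oa])
y = np.array([0] * n + [1] * n)

# PCA: project centered data onto the top principal components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:3].T                              # keep 3 PCs

# Fisher's discriminant: w = Sw^-1 (m1 - m0), threshold at the midpoint.
m0, m1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
Sw = np.cov(Z[y == 0].T) + np.cov(Z[y == 1].T)
w = np.linalg.solve(Sw, m1 - m0)
scores = Z @ w
threshold = (scores[y == 0].mean() + scores[y == 1].mean()) / 2
pred = (scores > threshold).astype(int)
error_rate = np.mean(pred != y)
print(error_rate)
```

Working in the reduced PC space keeps all the interrelationships among the original measures while making the discriminant stable, which mirrors the rationale of the multidimensional model in the record.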

  7. Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.

    PubMed

    Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V

    2007-01-01

The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow (i) determining natural groups or clusters of control strategies with a similar behaviour, (ii) finding and interpreting hidden, complex and causal relation features in the data set and (iii) identifying important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.
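The cluster-analysis step can be illustrated with a minimal k-means sketch in plain numpy; the two strategy groups and the three "criteria" are invented for the example, and the paper does not specify that k-means was the clustering algorithm used:

```python
import numpy as np

# Tiny k-means with deterministic farthest-point initialization.
def kmeans(X, k=2, iters=50):
    centers = [X[0]]
    for _ in range(k - 1):
        dists = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(dists)])
    centers = np.array(centers)
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute means.
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(axis=2), axis=1)
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Hypothetical evaluation matrix: two groups of control strategies with
# similar behaviour on three criteria (e.g. quality, cost, violations).
rng = np.random.default_rng(3)
group_a = rng.normal([1.0, 0.2, 0.1], 0.05, size=(10, 3))
group_b = rng.normal([0.4, 0.8, 0.6], 0.05, size=(10, 3))
X = np.vstack([group_a, group_b])
labels, centers = kmeans(X, k=2)
print(labels)
```

The recovered labels separate the two built-in strategy groups, which is the "natural groups of control strategies" idea from point (i).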

  8. Ideal, nonideal, and no-marker variables: The confirmatory factor analysis (CFA) marker technique works when it matters.

    PubMed

    Williams, Larry J; O'Boyle, Ernest H

    2015-09-01

A persistent concern in the management and applied psychology literature is the effect of common method variance on observed relations among variables. Recent work (i.e., Richardson, Simmering, & Sturman, 2009) evaluated 3 analytical approaches to controlling for common method variance, including the confirmatory factor analysis (CFA) marker technique. Their findings indicated significant problems with this technique, especially with nonideal marker variables (those with theoretical relations with substantive variables). Based on their simulation results, Richardson et al. concluded that not correcting for method variance provides more accurate estimates than using the CFA marker technique. We reexamined the effects of using marker variables in a simulation study and found the degree of error in estimates of a substantive factor correlation was relatively small in most cases, and much smaller than error associated with making no correction. Further, in instances in which the error was large, the correlations between the marker and substantive scales were higher than that found in organizational research with marker variables. We conclude that in most practical settings, the CFA marker technique yields parameter estimates close to their true values, and the criticisms made by Richardson et al. are overstated. (c) 2015 APA, all rights reserved.

  9. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    PubMed

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for application of principal component analysis (PCA) in mass spectrometry and focused on two whole spectrum based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities, the ms-alone, a python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis, were implemented. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For three tested cultivation media only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied on data normalized by two different normalization techniques. Results from matched peak data and subsequent detailed full spectrum analysis identified only two different metabolic patterns - a cultivation on Enterobacter sakazakii Isolation Agar showed significant differences from the cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also demonstrated a dependence on cultivation time. Both whole spectrum based normalization techniques together with the full spectrum PCA allow identification of important discriminative factors in experiments with several variable condition factors, avoiding problems with improper identification of peaks or emphasis on below-threshold peak data. The amounts of processed data remain manageable.
Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms. Copyright © 2018 John Wiley & Sons, Ltd.
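The general workflow, whole-spectrum normalization followed by full-spectrum PCA, can be sketched on synthetic spectra. Total-intensity normalization is used here as one common whole-spectrum choice; the paper's two specific normalization techniques are not reproduced:

```python
import numpy as np

# Synthetic "spectra" from two conditions sharing a common baseline,
# with one condition-specific band (channels 40-59).
rng = np.random.default_rng(7)
base = np.abs(rng.normal(size=300))
cond_a = base + rng.normal(0, 0.05, size=(8, 300))
cond_b = base + rng.normal(0, 0.05, size=(8, 300))
cond_b[:, 40:60] += 1.0
X = np.abs(np.vstack([cond_a, cond_b]))

# Normalize each spectrum by its total intensity, then mean-center.
X_norm = X / X.sum(axis=1, keepdims=True)
Xc = X_norm - X_norm.mean(axis=0)

# Full-spectrum PCA via SVD; PC1 scores separate the two conditions
# without any prior peak picking or thresholding.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]
print(np.round(pc1, 4))
```

Because the decomposition uses every channel, discriminative structure below any peak-detection threshold still contributes to the components, which is the advantage the record claims for full-spectrum analysis.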

  10. Trace elements in lake sediments measured by the PIXE technique

    NASA Astrophysics Data System (ADS)

    Gatti, Luciana V.; Mozeto, Antônio A.; Artaxo, Paulo

    1999-04-01

Lakes are ecosystems with a great potential for metal accumulation in sediments due to their depositional characteristics. Total concentrations of trace elements were measured on a 50 cm long sediment core from the Infernão Lake, an oxbow lake of the Moji-Guaçu River basin in the state of São Paulo, Brazil. Dating of the core shows sediment layers up to 180 years old. The use of the PIXE technique for elemental analysis avoids the traditional acid digestion procedure common in other techniques. The multielemental character of PIXE allows the simultaneous determination of about 20 elements in the sediment samples, such as Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Rb, Sr, Zr, Ba, and Pb. Average values for the elemental composition were found to be similar to the bulk crustal composition. The lake flooding pattern strongly influences the time series of the elemental profiles. Factor analysis of the elemental variability shows five factors. Two of the factors represent the mineralogical matrix; the others represent the organic component, a factor with lead, and another loaded with chromium. The mineralogical component consists of elements such as Fe, Al, V, Ti, Mn, Ni, K, Zr, Sr, Cu, and Zn. The variability of Si is explained by two distinct factors because it is influenced by two different sources, aluminum silicates and quartz, for which the effects of inundation differ. The organic matter is strongly associated with calcium and also bound with S, Zn, Cu, and P. Lead and chromium appear as separate factors, although the evidence for their anthropogenic origin is not clear. The techniques developed for sample preparation and PIXE analysis proved advantageous and provided very good reproducibility and accuracy.

  11. Eversion Technique to Prevent Biliary Stricture After Living Donor Liver Transplantation in the Universal Minimal Hilar Dissection Era.

    PubMed

    Ikegami, Toru; Shimagaki, Tomonari; Kawasaki, Junji; Yoshizumi, Tomoharu; Uchiyama, Hideaki; Harada, Noboru; Harimoto, Norifumi; Itoh, Shinji; Soejima, Yuji; Maehara, Yoshihiko

    2017-01-01

    Biliary anastomosis stricture (BAS) is still among the major concerns after living donor liver transplantation (LDLT), even after the technical refinements including the universal use of the blood flow-preserving hilar dissection technique. The aim of this study is to investigate what are still the factors for BAS after LDLT. An analysis of 279 adult-to-adult LDLT grafts (left lobe, n = 161; right lobe, n = 118) with duct-to-duct biliary reconstruction, since the universal application of minimal hilar dissection technique and gradual introduction of eversion technique, was performed. There were 39 patients with BAS. Univariate analysis showed that a right lobe graft (P = 0.008), multiple bile ducts (P < 0.001), ductoplasty (P < 0.001), not using the eversion technique (P = 0.004) and fewer biliary stents than bile duct orifices (P = 0.002) were among the factors associated with BAS. The 1-year and 5-year BAS survival rates were 17.7% and 21.2% in the noneversion group (n = 134), and 6.2% and 7.9% in the eversion group (n = 145), respectively (P = 0.002). The perioperative factors including graft biliary anatomy were not different between everted (n = 134) and noneverted (n = 145) patients. The application of eversion technique under minimal hilar dissection technique could be a key for preventing BAS in duct-to-duct biliary reconstruction in LDLT.

  12. Testing all six person-oriented principles in dynamic factor analysis.

    PubMed

    Molenaar, Peter C M

    2010-05-01

    All six person-oriented principles identified by Sterba and Bauer's Keynote Article can be tested by means of dynamic factor analysis in its current form. In particular, it is shown how complex interactions and interindividual differences/intraindividual change can be tested in this way. In addition, the necessity to use single-subject methods in the analysis of developmental processes is emphasized, and attention is drawn to the possibility to optimally treat developmental psychopathology by means of new computational techniques that can be integrated with dynamic factor analysis.

  13. Application of Factor Analysis on the Financial Ratios of Indian Cement Industry and Validation of the Results by Cluster Analysis

    NASA Astrophysics Data System (ADS)

    De, Anupam; Bandyopadhyay, Gautam; Chakraborty, B. N.

    2010-10-01

Financial ratio analysis is an important and commonly used tool for analyzing the financial health of a firm. Quite a large number of financial ratios, which can be categorized into different groups, are used for this analysis. However, to reduce the number of ratios used for financial analysis and to regroup them on the basis of empirical evidence, the Factor Analysis technique has been used successfully by different researchers over the last three decades. In this study Factor Analysis has been applied to audited financial data of Indian cement companies for a period of 10 years. The sample companies are listed on the Indian stock exchanges (BSE and NSE). Factor Analysis, conducted over 44 variables (financial ratios) grouped into 7 categories, resulted in 11 underlying categories (factors). Each factor is named in an appropriate manner considering the factor loadings and constituent variables (ratios). Representative ratios are identified for each such factor. To validate the results of the Factor Analysis and to reach a final conclusion regarding the representative ratios, Cluster Analysis was performed.

  14. Machine learning techniques applied to the determination of road suitability for the transportation of dangerous substances.

    PubMed

    Matías, J M; Taboada, J; Ordóñez, C; Nieto, P G

    2007-08-17

    This article describes a methodology to model the degree of remedial action required to make short stretches of a roadway suitable for dangerous goods transport (DGT), particularly pollutant substances, using different variables associated with the characteristics of each segment. Thirty-one factors determining the impact of an accident on a particular stretch of road were identified and subdivided into two major groups: accident probability factors and accident severity factors. Given the number of factors determining the state of a particular road segment, the only viable statistical methods for implementing the model were machine learning techniques, such as multilayer perceptron networks (MLPs), classification trees (CARTs) and support vector machines (SVMs). The results produced by these techniques on a test sample were more favourable than those produced by traditional discriminant analysis, irrespective of whether dimensionality reduction techniques were applied. The best results were obtained using SVMs specifically adapted to ordinal data. This technique takes advantage of the ordinal information contained in the data without penalising the computational load. Furthermore, the technique permits the estimation of the utility function that is latent in expert knowledge.
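The "SVM specifically adapted to ordinal data" idea can be illustrated with the standard reduction of an ordinal problem to K-1 binary "label greater than k" subproblems (a Frank-and-Hall-style sketch on invented road-segment features, not the paper's actual model or data):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Hypothetical road segments: 5 features -> ordinal remedial-action
# level 0, 1 or 2 (no action / minor / major), driven by a latent score.
X = rng.normal(size=(300, 5))
score = X @ np.array([1.0, 0.8, 0.5, 0.0, 0.0])
y = np.digitize(score, [-0.7, 0.7])  # three ordered classes

# One binary SVM per threshold k: does this segment exceed level k?
thresholds = sorted(set(y))[:-1]
clfs = [SVC(random_state=0).fit(X, (y > k).astype(int)) for k in thresholds]

def predict_ordinal(Xnew):
    # Predicted level = number of thresholds the segment is estimated
    # to exceed; this uses the ordering information directly.
    greater = np.column_stack([c.decision_function(Xnew) > 0 for c in clfs])
    return greater.sum(axis=1)

pred = predict_ordinal(X)
print("training accuracy:", (pred == y).mean())
```

This reduction keeps the computational cost close to ordinary binary SVM training, which matches the abstract's remark about exploiting ordinal information without penalising the computational load.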

  15. Exploratory Factor Analysis with Small Sample Sizes

    ERIC Educational Resources Information Center

    de Winter, J. C. F.; Dodou, D.; Wieringa, P. A.

    2009-01-01

    Exploratory factor analysis (EFA) is generally regarded as a technique for large sample sizes ("N"), with N = 50 as a reasonable absolute minimum. This study offers a comprehensive overview of the conditions in which EFA can yield good quality results for "N" below 50. Simulations were carried out to estimate the minimum required "N" for different…

  16. An Analysis of Losses to the Southern Commercial Timberland Base

    Treesearch

    Ian A. Munn; David Cleaves

    1998-01-01

Demographic and physical factors influencing the conversion of commercial timberland in the South to non-forestry uses between the last two Forest Inventory Analysis (FIA) surveys were investigated. GIS techniques linked Census data and FIA plot-level data. Multinomial logit regression identified factors associated with losses to the timberland base. Conversion to...

  17. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.

    1991-01-01

The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source-term probability distributions from the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor determined, and the functional relationship among all the factors established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDFs) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.

  18. An experimental extrapolation technique using the Gafchromic EBT3 film for relative output factor measurements in small x-ray fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morales, Johnny E., E-mail: johnny.morales@lh.org.

Purpose: An experimental extrapolation technique is presented, which can be used to determine the relative output factors for very small x-ray fields using Gafchromic EBT3 film. Methods: Relative output factors were measured for Brainlab SRS cones ranging in diameter from 4 to 30 mm on a Novalis Trilogy linear accelerator with 6 MV SRS x-rays. The relative output factor was determined from an experimental reducing circular region of interest (ROI) extrapolation technique developed to remove the effects of volume averaging. This was achieved by scanning the EBT3 film measurements at a high resolution of 1200 dpi. From the high-resolution scans, the size of the circular region of interest was varied to produce a plot of relative output factor versus area of analysis. The plot was then extrapolated to zero area to determine the relative output factor corresponding to zero volume. Results: For the 4 mm field size, the extrapolated relative output factor was 0.651 ± 0.018, compared with 0.639 ± 0.019 and 0.633 ± 0.021 for 0.5 and 1.0 mm diameters of analysis, respectively. This corresponds to changes in the relative output factor of 1.8% and 2.8% at these region-of-interest sizes. In comparison, the 25 mm cone showed negligible differences in the measured output factor between the zero-area extrapolation and the 0.5 and 1.0 mm diameter ROIs. Conclusions: This work shows that for very small fields such as the 4.0 mm cone, a measurable difference in the relative output factor can be seen depending on the size of the circular ROI used in radiochromic film dosimetry. The authors recommend scanning Gafchromic EBT3 film at a resolution of 1200 dpi for cone sizes less than 7.5 mm and utilizing an extrapolation technique for output factor measurements in very small field dosimetry.
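The reducing-ROI extrapolation amounts to fitting relative output factor against analysis area and reading off the intercept at zero area. A minimal sketch with invented numbers (not the paper's data), chosen only to show volume averaging pulling the value down as the ROI grows:

```python
import numpy as np

# Illustrative measurements: relative output factor versus circular
# ROI diameter; the trend is extrapolated back to zero analysis area
# to remove volume-averaging effects.
diam_mm = np.array([0.5, 1.0, 1.5, 2.0])
area = np.pi * (diam_mm / 2) ** 2          # ROI area in mm^2
rof = np.array([0.648, 0.643, 0.637, 0.630])

# Linear fit of output factor vs area; the intercept is the
# zero-volume relative output factor.
slope, intercept = np.polyfit(area, rof, 1)
rof_zero_area = intercept
print(round(rof_zero_area, 3))
```

With real film data one would also propagate the fit uncertainty, which is where uncertainties like the quoted ± 0.018 come from.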

  19. Classification Techniques for Multivariate Data Analysis.

    DTIC Science & Technology

    1980-03-28

analysis among biologists, botanists, and ecologists, while some social scientists may refer to "typology". Other frequently encountered terms are pattern... the determinantal equation: |B - λW| = 0. The solutions λᵢ are the eigenvalues of the matrix W⁻¹B, as in discriminant analysis. There are t non... Statistical Package for the Social Sciences (SPSS) (14) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor
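The determinantal equation |B - λW| = 0 in this snippet is a generalized eigenproblem on the between-group (B) and within-group (W) scatter matrices. A small worked example on toy two-group data:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(3)

# Toy two-group, two-variable data set.
X1 = rng.normal(loc=[0, 0], scale=1.0, size=(50, 2))
X2 = rng.normal(loc=[3, 1], scale=1.0, size=(50, 2))
X = np.vstack([X1, X2])
grand_mean = X.mean(axis=0)

# Within-group (W) and between-group (B) scatter matrices.
W = np.zeros((2, 2))
B = np.zeros((2, 2))
for G in (X1, X2):
    m = G.mean(axis=0)
    W += (G - m).T @ (G - m)
    d = (m - grand_mean).reshape(-1, 1)
    B += len(G) * (d @ d.T)

# Generalized symmetric eigenproblem B v = lambda W v, i.e. the
# eigenvalues of W^{-1} B used in discriminant analysis.
eigvals, eigvecs = eigh(B, W)
print(np.round(eigvals, 3))
```

With g groups and p variables there are t = min(g - 1, p) non-zero eigenvalues, so this two-group example yields exactly one non-trivial discriminant direction.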

  20. Integration of different data gap filling techniques to facilitate assessment of polychlorinated biphenyls: A proof of principle case study (ASCCT meeting)

    EPA Science Inventory

    Data gap filling techniques are commonly used to predict hazard in the absence of empirical data. The most established techniques are read-across, trend analysis and quantitative structure-activity relationships (QSARs). Toxic equivalency factors (TEFs) are less frequently used d...

  1. Application of commercial aircraft accident investigation techniques to a railroad derailment.

    DOT National Transportation Integrated Search

    1973-01-01

    Crash investigation techniques utilized by human factors teams in investigating commercial airline crashes have been applied in the analysis of a railroad train derailment - crash. Passengers in cars that remained upright experienced very low deceler...

  2. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Safety within space exploration ground processing operations, the identification and/or classification of underlying contributors and causes of human error must be identified, in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS), as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  3. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Safety within space exploration ground processing operations, the identification and/or classification of underlying contributors and causes of human error must be identified, in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS), as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  4. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Quality within space exploration ground processing operations, the identification and/or classification of underlying contributors and causes of human error must be identified, in order to manage human error. This presentation provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS), as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  5. Effect of various putty-wash impression techniques on marginal fit of cast crowns.

    PubMed

    Nissan, Joseph; Rosner, Ofir; Bukhari, Mohammed Amin; Ghelfan, Oded; Pilo, Raphael

    2013-01-01

    Marginal fit is an important clinical factor that affects restoration longevity. The accuracy of three polyvinyl siloxane putty-wash impression techniques was compared by marginal fit assessment using the nondestructive method. A stainless steel master cast containing three abutments with three metal crowns matching the three preparations was used to make 45 impressions: group A = single-step technique (putty and wash impression materials used simultaneously), group B = two-step technique with a 2-mm relief (putty as a preliminary impression to create a 2-mm wash space followed by the wash stage), and group C = two-step technique with a polyethylene spacer (plastic spacer used with the putty impression followed by the wash stage). Accuracy was assessed using a toolmaker microscope to measure and compare the marginal gaps between each crown and finish line on the duplicated stone casts. Each abutment was further measured at the mesial, buccal, and distal aspects. One-way analysis of variance was used for statistical analysis. P values and Scheffe post hoc contrasts were calculated. Significance was determined at .05. One-way analysis of variance showed significant differences among the three impression techniques in all three abutments and at all three locations (P < .001). Group B yielded dies with minimal gaps compared to groups A and C. The two-step impression technique with 2-mm relief was the most accurate regarding the crucial clinical factor of marginal fit.
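The statistical step in this study, a one-way ANOVA across the three impression-technique groups, looks like this in outline. The gap values are invented (micrometres), chosen only to mirror the reported direction of effect, with group B given the smallest gaps:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(4)

# Illustrative marginal-gap measurements for the three techniques.
gap_a = rng.normal(90, 10, 15)   # single-step putty-wash
gap_b = rng.normal(60, 10, 15)   # two-step with 2-mm relief
gap_c = rng.normal(95, 10, 15)   # two-step with polyethylene spacer

# One-way analysis of variance across the three groups.
f_stat, p_value = f_oneway(gap_a, gap_b, gap_c)
print(f"F = {f_stat:.1f}, p = {p_value:.2g}")
```

A significant omnibus F would then be followed by post hoc contrasts (the study used Scheffe contrasts) to locate which technique pairs differ.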

  6. Using Multilevel Factor Analysis with Clustered Data: Investigating the Factor Structure of the Positive Values Scale

    ERIC Educational Resources Information Center

    Huang, Francis L.; Cornell, Dewey G.

    2016-01-01

    Advances in multilevel modeling techniques now make it possible to investigate the psychometric properties of instruments using clustered data. Factor models that overlook the clustering effect can lead to underestimated standard errors, incorrect parameter estimates, and model fit indices. In addition, factor structures may differ depending on…

  7. Dimensions of Early Speech Sound Disorders: A Factor Analytic Study

    ERIC Educational Resources Information Center

    Lewis, Barbara A.; Freebairn, Lisa A.; Hansen, Amy J.; Stein, Catherine M.; Shriberg, Lawrence D.; Iyengar, Sudha K.; Taylor, H. Gerry

    2006-01-01

    The goal of this study was to classify children with speech sound disorders (SSD) empirically, using factor analytic techniques. Participants were 3-7-year olds enrolled in speech/language therapy (N=185). Factor analysis of an extensive battery of speech and language measures provided support for two distinct factors, representing the skill…

  8. Physical and Cognitive-Affective Factors Associated with Fatigue in Individuals with Fibromyalgia: A Multiple Regression Analysis

    ERIC Educational Resources Information Center

    Muller, Veronica; Brooks, Jessica; Tu, Wei-Mo; Moser, Erin; Lo, Chu-Ling; Chan, Fong

    2015-01-01

    Purpose: The main objective of this study was to determine the extent to which physical and cognitive-affective factors are associated with fibromyalgia (FM) fatigue. Method: A quantitative descriptive design using correlation techniques and multiple regression analysis. The participants consisted of 302 members of the National Fibromyalgia &…

  9. The influence of hand positions on biomechanical injury risk factors at the wrist joint during the round-off skills in female gymnastics.

    PubMed

    Farana, Roman; Jandacka, Daniel; Uchytil, Jaroslav; Zahradnik, David; Irwin, Gareth

    2017-01-01

The aim of this study was to examine the biomechanical injury risk factors at the wrist, including joint kinetics, kinematics and stiffness in the first and second contact limb, for parallel and T-shape round-off (RO) techniques. Seven international-level female gymnasts performed 10 trials of the RO to back handspring with parallel and T-shape hand positions. Synchronised kinematic (3D motion analysis system; 247 Hz) and kinetic (two force plates; 1235 Hz) data were collected for each trial. A two-way repeated-measures analysis of variance (ANOVA) assessed differences in the kinematic and kinetic parameters between the techniques for each contact limb. The main findings highlighted that in both RO techniques, the second contact limb wrist joint is exposed to higher mechanical loads than the first contact limb, as demonstrated by increased axial compression force and loading rate. In the parallel technique, the second contact limb wrist joint is exposed to a higher axial compression load. Differences between wrist joint kinetics suggest that the T-shape technique may reduce these physical loads and consequently protect the second contact limb wrist joint from overload and biological failure. Highlighting the biomechanical risk factors makes the process of technique selection more objective and safe.

  10. Hospital survey on patient safety culture: psychometric analysis on a Scottish sample.

    PubMed

    Sarac, Cakil; Flin, Rhona; Mearns, Kathryn; Jackson, Jeanette

    2011-10-01

To investigate the psychometric properties of the Hospital Survey on Patient Safety Culture on a Scottish NHS data set. The data were collected from 1969 clinical staff (estimated 22% response rate) from one acute hospital in each of seven Scottish Health Boards. Using a split-half validation technique, the data were randomly split; an exploratory factor analysis was conducted on the calibration data set, and confirmatory factor analyses were conducted on the validation data set to investigate and check the original US model fit in a Scottish sample. Following the split-half validation technique, exploratory factor analysis results showed a 10-factor optimal measurement model. The confirmatory factor analyses were then performed to compare the model fit of two competing models (10-factor alternative model vs 12-factor original model). A Satorra-Bentler scaled χ² difference test demonstrated that the original 12-factor model performed significantly better in the Scottish sample. Furthermore, reliability analyses of each component yielded satisfactory results. The mean scores on the climate dimensions in the Scottish sample were comparable with those found in other European countries. This study provided evidence that the original 12-factor structure of the Hospital Survey on Patient Safety Culture scale is replicated in this Scottish sample. Therefore, no modifications are required to the original 12-factor model, which is suggested for use, since it would allow researchers the possibility of cross-national comparisons.
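The split-half idea, fit a factor model on one random half of the respondents and check that the same structure reappears in the other half, can be sketched on synthetic survey-like data. This is a 2-factor toy, not the 12-factor HSOPSC instrument, and Tucker-style congruence stands in for a full confirmatory fit:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)

# Synthetic survey data: 12 items driven by 2 latent dimensions.
n = 1000
latent = rng.normal(size=(n, 2))
load = np.zeros((12, 2))
load[:6, 0] = 0.8
load[6:, 1] = 0.8
items = latent @ load.T + 0.4 * rng.normal(size=(n, 12))

# Random split into calibration and validation halves.
calib, valid = train_test_split(items, test_size=0.5, random_state=0)

# "Exploratory" step on the calibration half, then refit on the
# validation half to see whether the same structure emerges.
fa_c = FactorAnalysis(n_components=2, rotation="varimax", random_state=0).fit(calib)
fa_v = FactorAnalysis(n_components=2, rotation="varimax", random_state=0).fit(valid)

def congruence(a, b):
    # Tucker congruence coefficient between two loading vectors.
    return abs(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Match each calibration factor to its best validation counterpart.
cong = np.array([[congruence(fc, fv) for fv in fa_v.components_]
                 for fc in fa_c.components_])
print(np.round(cong.max(axis=1), 2))
```

High congruence (conventionally above about 0.95) on every factor is evidence that the structure replicates across halves, which is the conclusion the study reaches for the 12-factor model.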

  11. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W.

    An account is given of the method used to quantify the risks accruing to the use of a radioisotope thermoelectric generator fueled by Pu-238 dioxide aboard the Space Shuttle-launched Ulysses mission. After using a Monte Carlo technique to develop probability distributions for the radiological consequences of a range of accident scenarios throughout the mission, factors affecting those consequences are identified in conjunction with their probability distributions. The functional relationship among all the factors is then established, and probability distributions for all factor effects are combined by means of a Monte Carlo technique.

  12. Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer

    NASA Astrophysics Data System (ADS)

    Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad

    2017-04-01

Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges including feature redundancy, unbalanced data, and small sample sizes have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were the optimum predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, the Synthetic Minority Over-sampling Technique (SMOTE) was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.
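The reported pipeline (PCA against feature redundancy, oversampling against class imbalance, Random Forest for prediction) can be outlined as below. The data are synthetic, and plain random oversampling stands in for SMOTE purely to keep the sketch dependency-free:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)

# Synthetic stand-in for a radiomics table: 112 patients x 200 highly
# correlated features, with an unbalanced binary endpoint.
n, p = 112, 200
base = rng.normal(size=(n, 10))                    # latent image traits
X = base @ rng.normal(size=(10, p)) + 0.5 * rng.normal(size=(n, p))
y = (base[:, 0] + 0.5 * rng.normal(size=n) > 0.8).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# PCA to tame feature redundancy (the study's selection choice).
pca = PCA(n_components=10, random_state=0).fit(X_tr)
Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)

# Random oversampling of the minority class (SMOTE stand-in): repeat
# minority rows until both classes have equal counts.
minority = np.flatnonzero(y_tr == 1)
extra = rng.choice(minority, size=(y_tr == 0).sum() - minority.size)
Z_bal = np.vstack([Z_tr, Z_tr[extra]])
y_bal = np.concatenate([y_tr, y_tr[extra]])

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Z_bal, y_bal)
print("test accuracy:", clf.score(Z_te, y_te))
```

Note that oversampling is applied only to the training half, after the split; doing it before splitting would leak duplicated patients into the test set.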

  13. Adaptation of the Practice Environment Scale for military nurses: a psychometric analysis.

    PubMed

    Swiger, Pauline A; Raju, Dheeraj; Breckenridge-Sproat, Sara; Patrician, Patricia A

    2017-09-01

    The aim of this study was to confirm the psychometric properties of Practice Environment Scale of the Nursing Work Index in a military population. This study also demonstrates association rule analysis, a contemporary exploratory technique. One of the instruments most commonly used to evaluate the nursing practice environment is the Practice Environment Scale of the Nursing Work Index. Although the instrument has been widely used, the reliability, validity and individual item function are not commonly evaluated. Gaps exist with regard to confirmatory evaluation of the subscale factors, individual item analysis and evaluation in the outpatient setting and with non-registered nursing staff. This was a secondary data analysis of existing survey data. Multiple psychometric methods were used for this analysis using survey data collected in 2014. First, descriptive analyses were conducted, including exploration using association rules. Next, internal consistency was tested and confirmatory factor analysis was performed to test the factor structure. The specified factor structure did not hold; therefore, exploratory factor analysis was performed. Finally, item analysis was executed using item response theory. The differential item functioning technique allowed the comparison of responses by care setting and nurse type. The results of this study indicate that responses differ between groups and that several individual items could be removed without altering the psychometric properties of the instrument. The instrument functions moderately well in a military population; however, researchers may want to consider nurse type and care setting during analysis to identify any meaningful variation in responses. © 2017 John Wiley & Sons Ltd.

  14. Noise-band factor analysis of cancer Fourier transform infrared evanescent-wave fiber optical (FTIR-FEW) spectra

    NASA Astrophysics Data System (ADS)

    Sukuta, Sydney; Bruch, Reinhard F.

    2002-05-01

The goal of this study is to test the feasibility of using noise factor/eigenvector bands as general clinical analytical tools for diagnoses. We developed a new technique, Noise Band Factor Cluster Analysis (NBFCA), to diagnose benign tumors via their Fourier transform IR fiber optic evanescent wave spectral data for the first time. The middle-IR region spectra of normal human skin tissue, benign tumors, and melanoma tumors were analyzed using this new diagnostic technique. Our results are not in full agreement with pathological classifications; hence, our approach could complement or improve these traditional classification schemes. Moreover, the use of NBFCA makes it much easier to delineate class boundaries, so this method provides results with much higher certainty.

  15. Analysis of the regulation of viral transcription.

    PubMed

    Gloss, Bernd; Kalantari, Mina; Bernard, Hans-Ulrich

    2005-01-01

    Despite the small genomes and number of genes of papillomaviruses, regulation of their transcription is very complex and governed by numerous transcription factors, cis-responsive elements, and epigenetic phenomena. This chapter describes the strategies of how one can approach a systematic analysis of these factors, elements, and mechanisms. From the numerous different techniques useful for studying transcription, we describe in detail three selected protocols of approaches that have been relevant in shaping our knowledge of human papillomavirus transcription. These are DNAse I protection ("footprinting") for location of transcription-factor binding sites, electrophoretic mobility shifts ("gelshifts") for analysis of bound transcription factors, and bisulfite sequencing for analysis of DNA methylation as a prerequisite for epigenetic transcriptional regulation.

  16. Job Performance as Multivariate Dynamic Criteria: Experience Sampling and Multiway Component Analysis.

    PubMed

    Spain, Seth M; Miner, Andrew G; Kroonenberg, Pieter M; Drasgow, Fritz

    2010-08-06

Questions about the dynamic processes that drive behavior at work have been the focus of increasing attention in recent years. Models describing behavior at work and research on momentary behavior indicate that substantial variation exists within individuals. This article examines the rationale behind this body of work and explores a method of analyzing momentary work behavior using experience sampling methods. The article also examines a previously unused set of methods for analyzing data produced by experience sampling. These methods are known collectively as multiway component analysis. Two archetypal techniques of multimode factor analysis, the parallel factor analysis (PARAFAC) and Tucker3 models, are used to analyze data from Miner, Glomb, and Hulin's (2010) experience sampling study of work behavior. The efficacy of these techniques for analyzing experience sampling data is discussed, as are the substantive multimode component models obtained.

  17. Economic and Demographic Factors Impacting Placement of Students with Autism

    ERIC Educational Resources Information Center

    Kurth, Jennifer A.; Mastergeorge, Ann M.; Paschall, Katherine

    2016-01-01

    Educational placement of students with autism is often associated with child factors, such as IQ and communication skills. However, variability in placement patterns across states suggests that other factors are at play. This study used hierarchical cluster analysis techniques to identify demographic, economic, and educational covariates…

  18. Using Interactive Graphics to Teach Multivariate Data Analysis to Psychology Students

    ERIC Educational Resources Information Center

    Valero-Mora, Pedro M.; Ledesma, Ruben D.

    2011-01-01

    This paper discusses the use of interactive graphics to teach multivariate data analysis to Psychology students. Three techniques are explored through separate activities: parallel coordinates/boxplots; principal components/exploratory factor analysis; and cluster analysis. With interactive graphics, students may perform important parts of the…

  19. Bayesian linkage and segregation analysis: factoring the problem.

    PubMed

    Matthysse, S

    2000-01-01

    Complex segregation analysis and linkage methods are mathematical techniques for the genetic dissection of complex diseases. They are used to delineate complex modes of familial transmission and to localize putative disease susceptibility loci to specific chromosomal locations. The computational problem of Bayesian linkage and segregation analysis is one of integration in high-dimensional spaces. In this paper, three available techniques for Bayesian linkage and segregation analysis are discussed: Markov Chain Monte Carlo (MCMC), importance sampling, and exact calculation. The contribution of each to the overall integration will be explicitly discussed.
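Of the three techniques listed, importance sampling is the easiest to show compactly: draw from a tractable proposal and reweight each draw by prior × likelihood over proposal density. A toy one-parameter posterior integration with invented data, not a genetic linkage model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Toy problem: posterior mean of theta under a Cauchy prior and a
# normal likelihood, a combination with no closed-form integral.
data = np.array([1.2, 0.8, 1.5, 1.1])

def log_prior(theta):
    return stats.cauchy.logpdf(theta)

def log_lik(theta):
    return stats.norm.logpdf(data[:, None], loc=theta, scale=1.0).sum(axis=0)

# Heavy-tailed proposal centered near the data; importance weights are
# computed on the log scale for numerical stability.
theta = rng.standard_t(df=3, size=50_000) + data.mean()
log_w = (log_prior(theta) + log_lik(theta)
         - stats.t.logpdf(theta - data.mean(), df=3))
w = np.exp(log_w - log_w.max())

posterior_mean = (w * theta).sum() / w.sum()
print(round(posterior_mean, 2))
```

MCMC replaces the independent weighted draws with a correlated chain, and exact calculation replaces them with a sum over the discrete latent states; all three are answers to the same high-dimensional integration problem the abstract describes.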

  20. Electron-Beam-Induced Deposition as a Technique for Analysis of Precursor Molecule Diffusion Barriers and Prefactors.

    PubMed

    Cullen, Jared; Lobo, Charlene J; Ford, Michael J; Toth, Milos

    2015-09-30

    Electron-beam-induced deposition (EBID) is a direct-write chemical vapor deposition technique in which an electron beam is used for precursor dissociation. Here we show that Arrhenius analysis of the deposition rates of nanostructures grown by EBID can be used to deduce the diffusion energies and corresponding preexponential factors of EBID precursor molecules. We explain the limitations of this approach, define growth conditions needed to minimize errors, and explain why the errors increase systematically as EBID parameters diverge from ideal growth conditions. Under suitable deposition conditions, EBID can be used as a localized technique for analysis of adsorption barriers and prefactors.
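The Arrhenius analysis reduces to a straight-line fit of ln(rate) against 1/T: the slope gives the diffusion energy and the intercept gives the preexponential factor. A sketch with made-up deposition rates (the energy and prefactor below are assumptions, not values from the paper):

```python
import numpy as np

k_B = 8.617e-5  # Boltzmann constant, eV/K

# Made-up rates generated from an assumed Arrhenius law with
# diffusion energy 0.3 eV and prefactor 1e13 (arbitrary units).
E_true, A_true = 0.3, 1e13
T = np.array([280.0, 300.0, 320.0, 340.0, 360.0])
rate = A_true * np.exp(-E_true / (k_B * T))

# ln(rate) = ln(A) - E/(k_B T): a line in 1/T.
slope, intercept = np.polyfit(1.0 / T, np.log(rate), 1)
E_fit = -slope * k_B
A_fit = np.exp(intercept)
print(E_fit, A_fit)
```

With measured data the fit would recover E and A only approximately, and the paper's point is that the error grows systematically as growth conditions depart from the ideal regime.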

  1. Chemical information obtained from Auger depth profiles by means of advanced factor analysis (MLCFA)

    NASA Astrophysics Data System (ADS)

    De Volder, P.; Hoogewijs, R.; De Gryse, R.; Fiermans, L.; Vennik, J.

    1993-01-01

    The advanced multivariate statistical technique "maximum likelihood common factor analysis (MLCFA)" is shown to be superior to "principal component analysis (PCA)" for decomposing overlapping peaks into their individual component spectra of which neither the number of components nor the peak shape of the component spectra is known. An examination of the maximum resolving power of both techniques, MLCFA and PCA, by means of artificially created series of multicomponent spectra confirms this finding unambiguously. Substantial progress in the use of AES as a chemical-analysis technique is accomplished through the implementation of MLCFA. Chemical information from Auger depth profiles is extracted by investigating the variation of the line shape of the Auger signal as a function of the changing chemical state of the element. In particular, MLCFA combined with Auger depth profiling has been applied to problems related to steelcord-rubber tyre adhesion. MLCFA allows one to elucidate the precise nature of the interfacial layer of reaction products between natural rubber vulcanized on a thin brass layer. This study reveals many interesting chemical aspects of the oxi-sulfidation of brass undetectable with classical AES.

  2. Streamflow characterization using functional data analysis of the Potomac River

    NASA Astrophysics Data System (ADS)

    Zelmanow, A.; Maslova, I.; Ticlavilca, A. M.; McKee, M.

    2013-12-01

    Flooding and droughts are extreme hydrological events that affect the United States economically and socially. The severity and unpredictability of flooding has caused billions of dollars in damage and the loss of lives in the eastern United States. In this context, there is an urgent need to build a firm scientific basis for adaptation by developing and applying new modeling techniques for accurate streamflow characterization and reliable hydrological forecasting. The goal of this analysis is to use numerical streamflow characteristics in order to classify, model, and estimate the likelihood of extreme events in the eastern United States, mainly the Potomac River. Functional data analysis techniques are used to study yearly streamflow patterns, with the extreme streamflow events characterized via functional principal component analysis. These methods are merged with more classical techniques such as cluster analysis, classification analysis, and time series modeling. The developed functional data analysis approach is used to model continuous streamflow hydrographs. The forecasting potential of this technique is explored by incorporating climate factors to produce a yearly streamflow outlook.
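A discretized form of functional PCA, ordinary PCA applied to whole yearly curves, can be sketched on synthetic hydrographs (invented seasonal shapes, not Potomac data):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)

# Synthetic "yearly hydrographs": 30 years x 365 days, a spring-peak
# base shape with year-to-year variation in peak timing and magnitude.
days = np.arange(365)
years = []
for _ in range(30):
    shift = rng.normal(0, 10)            # peak-timing variation (days)
    scale = 1.0 + rng.normal(0, 0.3)     # peak-magnitude variation
    curve = scale * np.exp(-0.5 * ((days - 100 - shift) / 30.0) ** 2)
    years.append(curve + 0.02 * rng.normal(size=365))
Y = np.array(years)

# Treat each year's curve as one observation; the leading principal
# components are the dominant functional modes of streamflow variation
# (e.g. overall peak magnitude, peak timing).
pca = PCA(n_components=3).fit(Y)
print(np.round(pca.explained_variance_ratio_, 2))
```

Extreme years then show up as outlying scores on these leading components, which is the basis for the classification and likelihood estimation the abstract describes.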

  3. Embedded expert system for space shuttle main engine maintenance

    NASA Technical Reports Server (NTRS)

    Pooley, J.; Thompson, W.; Homsley, T.; Teoh, W.; Jones, J.; Lewallen, P.

    1987-01-01

The SPARTA Embedded Expert System (SEES) is an intelligent health monitoring system that directs analysis by placing confidence factors on possible engine status and then recommends a course of action to an engineer or engine controller. The technique can prevent catastrophic failures or costly rocket engine downtime due to false alarms. Further, the SEES has potential as an on-board flight monitor for reusable rocket engine systems. The SEES methodology synergistically integrates vibration analysis, pattern recognition, and communications theory techniques with an artificial intelligence technique - the Embedded Expert System (EES).

  4. Critical evaluation of sample pretreatment techniques.

    PubMed

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  5. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    Risk/Uncertainty Analysis - Project Scoring - Utility Scales - Relevance Tree Techniques (Reverse Factor Analysis) 2. Computer Simulation ... effectiveness of mathematical models for R&D project selection. Management Science, April 1973, 18. ... Souder, W.E., A scoring methodology for ... PROFICIENCY: test scores (written); RADIATION: radiation effects, aircrew performance in radiation environments; REACTION TIME ...

  6. Factors Which Influence The Fish Purchasing Decision: A study on Traditional Market in Riau Mainland

    NASA Astrophysics Data System (ADS)

    Siswati, Latifa; Putri, Asgami

    2018-05-01

    The purpose of this research is to analyze and assess the factors that influence fish purchasing by the community of Tenayan Raya district, Pekanbaru. The research methodology used is the survey method, specifically interviews and direct observation at the market located in Tenayan Raya district. The sampling location was chosen by purposive sampling, and respondents were selected by accidental sampling. Factor analysis was applied to data derived from respondents' opinions on various fish-related variables. The results show that the factors influencing the fish purchasing decision in the traditional market of Tenayan Raya district are product, price, social, and individual factors. Product attributes that influence the purchasing decision include the condition of the fish's eyes, the nutritional value of fresh fish, and the diversity of fish sold. Price attributes include the price of fresh fish, a convincing price, and the match between the price and benefits of the fresh fish. Individual factors include education and income levels. Social factors include family, colleagues, and fish-eating habits.

  7. Electrical Load Profile Analysis Using Clustering Techniques

    NASA Astrophysics Data System (ADS)

    Damayanti, R.; Abdullah, A. G.; Purnama, W.; Nandiyanto, A. B. D.

    2017-03-01

    Data mining is one of the data processing techniques used to extract information from a set of stored data. Electricity consumption is recorded daily by the electrical utility, usually at intervals of 15 or 30 minutes. This paper uses clustering, one of the data mining techniques, to analyse electrical load profiles during 2014. Three clustering methods were compared: K-Means (KM), Fuzzy C-Means (FCM), and K-Harmonic Means (KHM). The results show that KHM is the most appropriate method for classifying the electrical load profiles. The optimum number of clusters was determined using the Davies-Bouldin index. Grouping load profiles with similar patterns allows demand-variation analysis and estimation of energy losses per group. From the grouped load profiles, the cluster load factor and the range of the cluster loss factor can be derived, which helps to find the range of coefficient values for estimating energy losses without performing load-flow studies.
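The model-selection step described above, choosing the number of clusters by the Davies-Bouldin index, can be sketched as follows. The load shapes are invented stand-ins for real feeder data, and K-Means stands in for the paper's KHM:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score

rng = np.random.default_rng(1)
hours = np.arange(24)

# Hypothetical daily load profiles from three customer types:
# flat industrial, morning/evening residential, daytime commercial.
shapes = [
    np.full(24, 0.8),
    0.4 + 0.5 * np.exp(-0.5 * ((hours - 19) / 2.0) ** 2)
        + 0.3 * np.exp(-0.5 * ((hours - 7) / 1.5) ** 2),
    0.3 + 0.6 * np.exp(-0.5 * ((hours - 13) / 3.0) ** 2),
]
profiles = np.vstack([s + rng.normal(0, 0.03, (50, 24)) for s in shapes])

# Scan candidate cluster counts; the Davies-Bouldin index is smaller for
# better-separated, more compact clusterings, so we keep its minimiser.
scores = {}
for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(profiles)
    scores[k] = davies_bouldin_score(profiles, labels)

best_k = min(scores, key=scores.get)
print("Davies-Bouldin by k:", {k: round(v, 3) for k, v in scores.items()})
print("optimal number of clusters:", best_k)
```

With the three well-separated synthetic shapes, the index is minimised at k = 3.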

  8. A comparison study on detection of key geochemical variables and factors through three different types of factor analysis

    NASA Astrophysics Data System (ADS)

    Hoseinzade, Zohre; Mokhtari, Ahmad Reza

    2017-10-01

    Large numbers of variables are measured to explain different phenomena, and factor analysis is widely used to reduce the dimension of such datasets; the technique is also employed to highlight underlying factors hidden in a complex system. Because geochemical studies benefit from multivariate assays, application of this method is widespread in geochemistry. However, the conventional protocols for implementing factor analysis have some drawbacks in spite of their advantages. In the present study, a geochemical dataset of 804 soil samples, collected from a mining area in central Iran in a search for MVT-type Pb-Zn deposits, was considered. Routine factor analysis, sequential factor analysis, and staged factor analysis were applied to the dataset, after opening the data with the alr (additive log-ratio) transformation, to extract a mineralization factor. A comparison between these methods indicated that sequential factor analysis revealed the MVT paragenesis elements in surface samples most clearly, with nearly 50% of the variation in F1. Staged factor analysis also gave acceptable results and is easy to apply: it detected mineralization-related elements and assigned them larger factor loadings, so that the mineralization is more clearly pronounced.
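A minimal sketch of the alr-then-factor-analysis pipeline, on an invented two-factor geochemical dataset: element names and loadings are illustrative, and plain maximum-likelihood factor analysis stands in for the sequential/staged variants discussed above.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
n = 500

# Hypothetical soil survey: a latent mineralization factor drives Pb and
# Zn together; a lithological factor drives Ca and Mg; Fe serves as the
# alr denominator. All numbers are invented for illustration.
mineral = rng.normal(0, 1, n)
litho = rng.normal(0, 1, n)
log_ppm = np.column_stack([
    2.0 + 1.2 * mineral + rng.normal(0, 0.1, n),   # Pb
    2.5 + 1.0 * mineral + rng.normal(0, 0.1, n),   # Zn
    4.0 + 0.9 * litho + rng.normal(0, 0.1, n),     # Ca
    3.8 + 0.7 * litho + rng.normal(0, 0.1, n),     # Mg
    4.5 + rng.normal(0, 0.1, n),                   # Fe (denominator)
])
ppm = np.exp(log_ppm)
comp = ppm / ppm.sum(axis=1, keepdims=True)        # closed composition

# alr transform: log-ratios against the last part remove the closure
# constraint before factor analysis is applied.
alr = np.log(comp[:, :-1] / comp[:, -1:])

fa = FactorAnalysis(n_components=2, random_state=0).fit(alr)
loadings = fa.components_                          # 2 x 4 (Pb, Zn, Ca, Mg)

# The factor loading most heavily on Pb should also load heavily on Zn:
# a "mineralization factor" in the sense of the abstract.
f_min = int(np.argmax(np.abs(loadings[:, 0])))
print("loadings:\n", np.round(loadings, 2))
print("Zn loading on the Pb-dominated factor:", round(abs(loadings[f_min, 1]), 2))
```

The Pb-dominated factor picks up Zn with a large loading and Ca/Mg with small cross-loadings, which is the pattern a mineralization factor is expected to show.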

  9. Using Linear Regression To Determine the Number of Factors To Retain in Factor Analysis and the Number of Issues To Retain in Delphi Studies and Other Surveys.

    ERIC Educational Resources Information Center

    Jurs, Stephen; And Others

    The scree test and its linear regression technique are reviewed, and results of its use in factor analysis and Delphi data sets are described. The scree test was originally a visual approach for making judgments about eigenvalues, which considered the relationships of the eigenvalues to one another as well as their actual values. The graph that is…
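One regression variant of the scree test can be sketched as follows, on simulated two-factor data; the stopping rule and its thresholds are illustrative choices, not the exact procedure evaluated in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 1000, 10

# Hypothetical questionnaire data with two true factors, five items each.
f = rng.normal(size=(n, 2))
load = np.zeros((p, 2))
load[:5, 0], load[5:, 1] = 0.7, 0.7
x = f @ load.T + rng.normal(size=(n, p))

eig = np.sort(np.linalg.eigvalsh(np.corrcoef(x, rowvar=False)))[::-1]

# Regression scree: walking leftward from the smallest eigenvalues, fit a
# straight line to the scree and stop when the next eigenvalue rises
# clearly above the line's prediction. The 0.05 floor on the residual
# spread is a practical guard, not part of the original method.
n_factors = 0
for i in range(p - 4, -1, -1):
    tail = np.arange(i + 1, p)
    slope, intercept = np.polyfit(tail, eig[tail], 1)
    sd = max(np.std(eig[tail] - (slope * tail + intercept)), 0.05)
    if eig[i] > slope * i + intercept + 3 * sd:
        n_factors = i + 1
        break

print("eigenvalues:", np.round(eig, 2))
print("factors retained by the regression scree:", n_factors)
```

The trailing eight eigenvalues fall on a nearly straight "scree" line, while the first two stand well above it, so two factors are retained.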

  10. Exploratory graph analysis: A new approach for estimating the number of dimensions in psychological research

    PubMed Central

    Golino, Hudson F.; Epskamp, Sacha

    2017-01-01

    The estimation of the correct number of dimensions is a long-standing problem in psychometrics. Several methods have been proposed, such as parallel analysis (PA), the Kaiser-Guttman eigenvalue-greater-than-one rule, the minimum average partial procedure (MAP), maximum-likelihood approaches that use fit indices such as BIC and EBIC, and the less used and studied very simple structure (VSS) approach. In the present paper a new approach to estimating the number of dimensions is introduced and compared via simulation to the traditional techniques listed above. The proposed approach is called exploratory graph analysis (EGA), since it is based on the graphical lasso with the regularization parameter specified using EBIC. The number of dimensions is verified using walktrap, a random-walk algorithm used to identify communities in networks. In total, 32,000 data sets were simulated from known factor structures, varying the number of factors (2 and 4), the number of items (5 and 10), the sample size (100, 500, 1,000 and 5,000) and the correlation between factors (orthogonal, .20, .50 and .70), resulting in 64 conditions with 500 data sets each, simulated using lavaan. The results show that EGA performs comparably to parallel analysis, BIC, EBIC and the Kaiser-Guttman rule in a number of situations, especially when the number of factors was two. However, EGA was the only technique able to correctly estimate the number of dimensions in the four-factor structure when the correlation between factors was .70, achieving 100% accuracy for a sample size of 5,000 observations. Finally, EGA was used to estimate the number of factors in a real dataset, in order to compare its performance with the other six techniques tested in the simulation study. PMID:28594839
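The EGA idea, estimating a sparse partial-correlation network and counting its communities, can be sketched in a simplified form. Here a thresholded inverse correlation matrix stands in for the EBIC-tuned graphical lasso, and connected components stand in for the walktrap algorithm; both substitutions are simplifications for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 3000, 10

# Hypothetical questionnaire: two orthogonal factors, five items each.
f = rng.normal(size=(n, 2))
load = np.zeros((p, 2))
load[:5, 0], load[5:, 1] = 0.9, 0.9
x = f @ load.T + rng.normal(size=(n, p))

# Estimate partial correlations from the inverse correlation matrix and
# threshold them into an undirected graph (EGA proper uses the graphical
# lasso with EBIC to obtain this network).
prec = np.linalg.inv(np.corrcoef(x, rowvar=False))
d = np.sqrt(np.diag(prec))
pcorr = -prec / np.outer(d, d)
np.fill_diagonal(pcorr, 0.0)
adj = np.abs(pcorr) > 0.1

# Count graph communities; a tiny union-find over connected components
# stands in for walktrap here.
parent = list(range(p))

def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

for i in range(p):
    for j in range(i + 1, p):
        if adj[i, j]:
            parent[find(i)] = find(j)

n_dimensions = len({find(i) for i in range(p)})
print("estimated number of dimensions:", n_dimensions)
```

Items belonging to the same factor are densely linked by partial correlations, so the graph splits into one community per latent dimension.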

  12. An improved technique for the 2H/1H analysis of urines from diabetic volunteers

    USGS Publications Warehouse

    Coplen, T.B.; Harper, I.T.

    1994-01-01

    The H2-H2O ambient-temperature equilibration technique for the determination of 2H/1H ratios in urinary waters from diabetic subjects provides improved accuracy over the conventional Zn reduction technique. The standard deviation, approximately 1-2‰, is at least a factor of three better than that of the Zn reduction technique on urinary waters from diabetic volunteers. Experiments with pure water and solutions containing glucose, urea and albumen indicate that there is no measurable bias in the hydrogen equilibration technique.

  13. Cluster analysis and subgrouping to investigate inter-individual variability to non-invasive brain stimulation: a systematic review.

    PubMed

    Pellegrini, Michael; Zoghi, Maryam; Jaberzadeh, Shapour

    2018-01-12

    Cluster analysis and other subgrouping techniques have risen in popularity in recent years in non-invasive brain stimulation research in the attempt to investigate the issue of inter-individual variability: the question of why some individuals respond to non-invasive brain stimulation protocols as traditionally expected and others do not. Cluster analysis and subgrouping techniques have been used to categorise individuals, based on their response patterns, as responders or non-responders. There is, however, a lack of consensus and consistency on the most appropriate technique to use. This systematic review aimed to provide a systematic summary of the cluster analysis and subgrouping techniques used to date and to suggest recommendations moving forward. Twenty studies were included that utilised subgrouping techniques; seven of these additionally utilised cluster analysis techniques. The results of this systematic review indicate that statistical cluster analysis techniques are effective in identifying subgroups of individuals based on response patterns to non-invasive brain stimulation. This systematic review also reports a lack of consensus amongst researchers on the most effective subgrouping technique and on the criteria used to determine whether an individual is categorised as a responder or a non-responder. Finally, it provides a step-by-step guide to carrying out statistical cluster analyses and subgrouping techniques, as a framework for analysis when developing further insights into the contributing factors of inter-individual variability in response to non-invasive brain stimulation.

  14. Do not blame the driver: a systems analysis of the causes of road freight crashes.

    PubMed

    Newnam, Sharon; Goode, Natassia

    2015-03-01

    Although many have advocated a systems approach in road transportation, this view has not meaningfully penetrated road safety research, practice or policy. In this study, a systems-theory-based approach, Rasmussen's (1997) risk management framework and its associated Accimap technique, is applied to the analysis of road freight transportation crashes. Twenty-seven highway crash investigation reports were downloaded from the National Transport Safety Bureau website. Thematic analysis was used to identify the complex system of contributory factors, and relationships, identified within the reports. The Accimap technique was then used to represent the linkages and dependencies within and across system levels in the road freight transportation industry and to identify common factors and interactions across multiple crashes. The results demonstrate how a systems approach can increase knowledge in this safety-critical domain, while the findings can be used to guide prevention efforts and the development of system-based investigation processes for the heavy vehicle industry. A research agenda for developing an investigation technique to better support the application of the Accimap technique by practitioners in the road freight transportation industry is proposed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Specialty Selections of Jefferson Medical College Students: A Conjoint Analysis.

    ERIC Educational Resources Information Center

    Diamond, James J.; And Others

    1994-01-01

    A consumer research technique, conjoint analysis, was used to assess the relative importance of several factors in 104 fourth-year medical students' selection of specialty. Conjoint analysis appears to be a useful method for investigating the complex process of specialty selection. (SLD)
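The mechanics of a basic full-profile conjoint analysis, estimating per-attribute part-worths by dummy-coded regression on profile ratings, can be sketched as follows; the specialty attributes and utilities are invented for illustration.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(5)

# Hypothetical specialty profiles rated by one student: three two-level
# attributes (income, lifestyle, training length). The true part-worths
# are invented; conjoint analysis recovers them from the ratings alone.
true_pw = {"income": 3.0, "lifestyle": 1.5, "training": 0.5}
profiles = np.array(list(product([0, 1], repeat=3)), dtype=float)
ratings = (profiles @ np.array(list(true_pw.values()))
           + rng.normal(0, 0.2, len(profiles)))

# Dummy-coded OLS yields one part-worth (utility) per attribute.
X = np.column_stack([np.ones(len(profiles)), profiles])
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
est = dict(zip(true_pw, coef[1:]))

# Relative importance: each attribute's utility range as a share of the
# total, the usual conjoint summary of "what drives the choice".
ranges = np.abs(coef[1:])
importance = ranges / ranges.sum()
print("estimated part-worths:", {k: round(v, 2) for k, v in est.items()})
print("relative importance:", np.round(importance, 2))
```

Ranking attributes by relative importance is exactly the kind of output the abstract refers to when assessing which factors dominate specialty selection.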

  16. An analysis of thermal response factors and how to reduce their computational time requirement

    NASA Technical Reports Server (NTRS)

    Wiese, M. R.

    1982-01-01

    The RESFAC2 version of the Thermal Response Factor Program (RESFAC) is the result of numerous modifications and additions to the original RESFAC. These modifications and additions have significantly reduced the program's computational time requirement. As a result, the program is more efficient and its code is both readable and understandable. This report describes what a thermal response factor is; analyzes the original matrix algebra calculations and root-finding techniques; presents a new root-finding technique and streamlined matrix algebra; and supplies ten validation cases and their results.

  17. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimated microbial count, while the effects of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) were highly significant. The most appropriate method for statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. It did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
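The ANOVA screen described above, testing which factors move the microbial count, can be sketched on invented plate-count data (a one-way layout per factor; the abstract's full analysis also covers interactions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Hypothetical plate counts (CFU): the organism has a real effect on the
# recovered count, while the culture medium does not, mirroring the kind
# of result the abstract reports. All means are invented.
counts_by_organism = [rng.normal(mu, 8, 20) for mu in (95, 70, 110)]
counts_by_medium = [rng.normal(90, 8, 20) for _ in range(3)]

# One-way ANOVA per factor: a small p-value flags a significant source
# of variability in the microbial count.
f_org, p_org = stats.f_oneway(*counts_by_organism)
f_med, p_med = stats.f_oneway(*counts_by_medium)
print(f"organism effect: F={f_org:.1f}, p={p_org:.2g}")
print(f"medium effect:   F={f_med:.1f}, p={p_med:.2g}")
```

In a full uncertainty budget, the significant variance components would then be combined into the overall relative uncertainty of the method.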

  18. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.

  19. Atmospheric heavy metal deposition in Northern Vietnam: Hanoi and Thainguyen case study using the moss biomonitoring technique, INAA and AAS.

    PubMed

    Viet, Hung Nguyen; Frontasyeva, Marina Vladimirovna; Thi, Thu My Trinh; Gilbert, Daniel; Bernard, Nadine

    2010-06-01

    The moss technique is widely used to monitor atmospheric deposition of heavy metals in many countries in Europe, whereas this technique is scarcely used in Asia. To implement this international reliable and cheap methodology in the Asian countries, it is necessary to find proper moss types typical for the Asian environment and suitable for the biomonitoring purposes. Such a case study was undertaken in Vietnam for assessing the environmental situation in strongly contaminated areas using local species of moss Barbula indica. The study is focused on two areas characterized by different pollution sources: the Hanoi urban area and the Thainguyen metallurgical zone. Fifty-four moss samples were collected there according to standard sampling procedure adopted in Europe. Two complementary analytical techniques, atomic absorption spectrometry (AAS) and instrumental neutron activation analysis (INAA), were used for determination of elemental concentrations in moss samples. To characterize the pollution sources, multivariate statistical analysis was applied. A total of 38 metal elements were determined in the moss by the two analytical techniques. The results of descriptive statistics of metal concentration in moss from the city center and periphery of Hanoi determined by AAS are presented. The similar results for moss from Thainguyen province determined by INAA and AAS are given also. A comparison of mean elemental concentrations in moss of this work with those in different environmental conditions of other authors provides reasonable information on heavy metal atmospheric deposition levels. Factor loadings and factor scores were used to identify and apportion contamination sources at the sampling sites. 
The percentages of total variance of the factors show two markedly different types of pollution in the two examined areas: the Hanoi pollution composition, with a high portion of urban traffic activity and soil dust (62%), and that of Thainguyen, with factors related to industrial activities (75%). In addition, the scatter of factors in the factor planes reflects the greater diversity of activities in Hanoi than in Thainguyen. The good agreement between the factor analysis results and the known pollution sources indicates that the moss technique is a promising method for assessing air quality in Vietnam. Moss B. indica, widely distributed in Vietnam and Indo-China, is shown to be a reliable bryophyte for biomonitoring purposes in sub-tropical and tropical climates. However, moss interspecies calibration is clearly necessary for further studies in the area, to provide results compatible with those for other Asian countries and Europe.

  20. Investigating the effects of PDC cutters geometry on ROP using the Taguchi technique

    NASA Astrophysics Data System (ADS)

    Jamaludin, A. A.; Mehat, N. M.; Kamaruddin, S.

    2017-10-01

    At times, the polycrystalline diamond compact (PDC) bit's performance drops, affecting the rate of penetration (ROP). The objective of this project is to investigate the effect of PDC cutter geometry and to optimize it; an intensive study of cutter geometry can further enhance ROP performance. A relatively extended analysis was carried out, and four significant geometry factors that directly improve ROP were identified: cutter size, back rake angle, side rake angle, and chamfer angle. An appropriate optimization technique that effectively controls all influential geometry factors during cutter manufacturing is introduced and adopted in this project. By adopting an L9 Taguchi orthogonal array, a simulation experiment is conducted using explicit-dynamics finite element analysis. Through a structured Taguchi analysis, ANOVA confirms that the most significant geometry factor for improving ROP is cutter size (99.16% percentage contribution). The optimized cutter is expected to drill with a high ROP, reducing rig time and, in turn, total drilling cost.
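The L9 Taguchi workflow, running nine orthogonal combinations of the four three-level geometry factors and ranking factors by their effect on a larger-is-better S/N ratio, can be sketched as follows. The ROP responses are simulated so that cutter size dominates, mirroring the reported conclusion; all magnitudes are invented.

```python
import numpy as np

# Standard Taguchi L9 orthogonal array: 9 runs, four 3-level factors
# (columns: cutter size, back rake, side rake, chamfer angle).
L9 = np.array([
    [1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
    [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
    [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1],
]) - 1

rng = np.random.default_rng(7)
# Hypothetical ROP responses (m/h): cutter size dominates, back rake has
# a small effect, the other factors only contribute noise.
size_eff = np.array([0.0, 4.0, 8.0])
rake_eff = np.array([0.0, 0.3, 0.6])
rop = 20 + size_eff[L9[:, 0]] + rake_eff[L9[:, 1]] + rng.normal(0, 0.2, 9)

# Larger-is-better signal-to-noise ratio, per Taguchi.
sn = -10 * np.log10(1.0 / rop**2)

# Main effect of each factor = range of its level-mean S/N values.
effects = []
for col in range(4):
    level_means = [sn[L9[:, col] == lv].mean() for lv in range(3)]
    effects.append(max(level_means) - min(level_means))

names = ["cutter size", "back rake", "side rake", "chamfer"]
dominant = names[int(np.argmax(effects))]
print("S/N effect ranges (dB):", dict(zip(names, np.round(effects, 2))))
print("most significant factor:", dominant)
```

An ANOVA on the same nine runs would convert these effect ranges into the percentage contributions quoted in the abstract.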

  1. Relationship between the structure of anxiety and the self-educational ability in new pharmacists.

    PubMed

    Hirashima, Yutaka; Ito, Marika; Doshi, Masaru; Kunii, Midori; Ideguchi, Naoko

    2009-05-01

    The present study was conducted to evaluate the relationship between the structure of anxiety and self-educational ability in new pharmacists. Ninety-seven new pharmacists rated the 42 items of our scale of anxiety toward working in the pharmacy in June and October 2006, and the 40 items of an established self-educational ability scale in June 2006. A factor analysis of the anxiety scale indicated four factors: communication ability, professional technique of the pharmacist, working conditions, and self-respect. Evaluation of the correlations between the anxiety-scale factors and the self-educational-ability factors showed that anxiety concerning communication ability and problems concerning self-respect correlated significantly with poorer scores on all four self-educational-ability factors (aiming at self-growth and self-development, self-objectification, practice and technique of study, and self-confidence and pride), whereas working conditions did not correlate with any of the four factors. Over the 4 months, anxiety about the professional technique of the pharmacist decreased significantly, while the other three factors showed no significant changes.

  2. A Human Factors Analysis of EVA Time Requirements

    NASA Technical Reports Server (NTRS)

    Pate, Dennis W.

    1997-01-01

    Human Factors Engineering (HFE) is a discipline whose goal is to engineer a safer, more efficient interface between humans and machines. HFE makes use of a wide range of tools and techniques to fulfill this goal. One of these tools is known as motion and time study, a technique used to develop time standards for given tasks. During the summer of 1995, a human factors motion and time study was initiated with the goals of developing a database of EVA task times and developing a method of using the database to predict how long an EVA should take. Initial development relied on the EVA activities performed during the STS-61 (Hubble) mission. The first step of the study was to become familiar with EVAs, previous task-time studies, and documents produced on EVAs. After reviewing these documents, an initial set of task primitives and task-time modifiers was developed. Data were collected from videotaped footage of two entire STS-61 EVA missions and portions of several others, each with two EVA astronauts. Feedback from the analysis of the data was used to further refine the primitives and modifiers. The project continued during the summer of 1996, during which data on human errors were also collected and analyzed, along with additional data from the STS-71 mission. Analysis-of-variance techniques for categorical data were used to determine which factors may affect the primitive times and how large their effects are. Probability distributions for the various tasks were also generated. Further analysis of the modifiers and interactions is planned.

  3. Development and Validation of the Negative Attitudes towards CBT Scale.

    PubMed

    Parker, Zachary J; Waller, Glenn

    2017-11-01

    Clinicians commonly fail to use cognitive behavioural therapy (CBT) adequately, but the reasons for such omissions are not well understood. The objective of this study was to create and validate a measure to assess clinicians' attitudes towards CBT - the Negative Attitudes towards CBT Scale (NACS). The participants were 204 clinicians from various mental healthcare fields. Each completed the NACS, measures of anxiety and self-esteem, and a measure of therapists' use of CBT and non-CBT techniques and their confidence in using those techniques. Exploratory factor analysis was used to determine the factor structure of the NACS, and scale internal consistency was tested. A single, 16-item scale emerged from the factor analysis of the NACS, and that scale had good internal consistency. Clinicians' negative attitudes and their anxiety had different patterns of association with the use of CBT and other therapeutic techniques. The findings suggest that clinicians' attitudes and emotions each need to be considered when understanding why many clinicians fail to deliver the optimum version of evidence-based CBT. They also suggest that training effective CBT clinicians might depend on understanding and targeting such internal states.

  4. Evaluation of the Risk Factors for a Rotator Cuff Retear After Repair Surgery.

    PubMed

    Lee, Yeong Seok; Jeong, Jeung Yeol; Park, Chan-Deok; Kang, Seung Gyoon; Yoo, Jae Chul

    2017-07-01

    A retear is a significant clinical problem after rotator cuff repair. However, no study has evaluated the retear rate with regard to the extent of footprint coverage. To evaluate the preoperative and intraoperative factors for a retear after rotator cuff repair, and to confirm the relationship with the extent of footprint coverage. Cohort study; Level of evidence, 3. Data were retrospectively collected from 693 patients who underwent arthroscopic rotator cuff repair between January 2006 and December 2014. All repairs were classified into 4 types of completeness of repair according to the amount of footprint coverage at the end of surgery. All patients underwent magnetic resonance imaging (MRI) after a mean postoperative duration of 5.4 months. Preoperative demographic data, functional scores, range of motion, and global fatty degeneration on preoperative MRI and intraoperative variables including the tear size, completeness of rotator cuff repair, concomitant subscapularis repair, number of suture anchors used, repair technique (single-row or transosseous-equivalent double-row repair), and surgical duration were evaluated. Furthermore, the factors associated with failure using the single-row technique and transosseous-equivalent double-row technique were analyzed separately. The retear rate was 7.22%. Univariate analysis revealed that rotator cuff retears were affected by age; the presence of inflammatory arthritis; the completeness of rotator cuff repair; the initial tear size; the number of suture anchors; mean operative time; functional visual analog scale scores; Simple Shoulder Test findings; American Shoulder and Elbow Surgeons scores; and fatty degeneration of the supraspinatus, infraspinatus, and subscapularis. Multivariate logistic regression analysis revealed patient age, initial tear size, and fatty degeneration of the supraspinatus as independent risk factors for a rotator cuff retear. 
Multivariate logistic regression analysis of the single-row group revealed patient age and fatty degeneration of the supraspinatus as independent risk factors for a rotator cuff retear. Multivariate logistic regression analysis of the transosseous-equivalent double-row group revealed a frozen shoulder as an independent risk factor for a rotator cuff retear. Our results suggest that patient age, initial tear size, and fatty degeneration of the supraspinatus are independent risk factors for a rotator cuff retear, whereas the completeness of rotator cuff repair based on the extent of footprint coverage and repair technique are not.

  5. An epidemiologic approach to toothbrushing and dental abrasion.

    PubMed

    Bergström, J; Lavstedt, S

    1979-02-01

    Abrasion lesions were recorded in 818 individuals representing the adult population of 430,000 residents of the Stockholm region, Sweden. The subjects were asked about toothbrushing habits, toothbrush quality and dentifrice usage, and these factors were related to abrasion criteria. Abrasion was prevalent in 30%, and wedge-like or deep depressions were observed in 12%. The relationship between abrasion and toothbrushing was evident, the prevalence and severity of abrasion being correlated with the amount of toothbrushing. The importance of toothbrushing technique for the development of abrasion lesions was elucidated: a horizontal brushing technique was strongly correlated with abrasion. Treating the data with the statistical AID analysis demonstrated that toothbrushing factors related to the individual (brushing frequency and brushing technique) exert a greater influence than material-oriented factors such as dentifrice abrasivity and bristle stiffness.

  6. Analysis of the correlative factors for velopharyngeal closure of patients with cleft palate after primary repair.

    PubMed

    Chen, Qi; Li, Yang; Shi, Bing; Yin, Heng; Zheng, Guang-Ning; Zheng, Qian

    2013-12-01

    The objective of this study was to analyze the factors correlated with velopharyngeal closure in patients with cleft palate after primary repair. Ninety-five nonsyndromic patients with cleft palate were enrolled. Two surgical techniques were applied: simple palatoplasty and combined palatoplasty with pharyngoplasty. All patients were assessed 6 months after the operation. The postoperative velopharyngeal closure (VPC) rates were compared by χ² test, and the correlated factors were analyzed with a logistic regression model. The postoperative VPC rate was higher in young patients than in old patients, higher in the group with incomplete cleft palate than in the group with complete cleft palate, and higher with combined palatoplasty and pharyngoplasty than with simple palatoplasty. Operative age, cleft type, and surgical technique were significant factors influencing the postoperative VPC rate of patients with cleft palate. Copyright © 2013 Elsevier Inc. All rights reserved.
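The χ²-test step can be sketched on an invented 2×2 table (the counts below are illustrative, not the study's data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table in the spirit of the abstract: velopharyngeal
# closure (VPC) achieved vs not achieved, by surgical technique.
#                      VPC   no VPC
table = np.array([[30, 18],     # simple palatoplasty
                  [42,  5]])    # palatoplasty + pharyngoplasty

chi2, p, dof, expected = chi2_contingency(table)
rates = table[:, 0] / table.sum(axis=1)
print(f"VPC rates: {rates[0]:.0%} vs {rates[1]:.0%}, chi2={chi2:.2f}, p={p:.3f}")
```

A logistic regression on the same outcome would then adjust the technique effect for operative age and cleft type, as the abstract describes.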

  7. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain deeper knowledge of why certain KPI targets are not met.

  8. Logistic regression for risk factor modelling in stuttering research.

    PubMed

    Reed, Phil; Wu, Yaqionq

    2013-06-01

    To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed is demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques like estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.
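    The modelling approach described above can be sketched in a few lines. This is a hedged illustration on simulated data only: the predictors (onset_age, family_history) and the simulated effect sizes are hypothetical stand-ins, not values from the stuttering literature.

```python
# Hedged sketch: logistic regression for a hypothetical persistence-vs-recovery
# outcome. Predictor names and effect sizes are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
onset_age = rng.uniform(2, 6, n)           # hypothetical age at onset (years)
family_history = rng.integers(0, 2, n)     # 1 = positive family history (assumed)
# Simulate persistence with higher odds for later onset and family history
logit = -3.0 + 0.5 * onset_age + 1.2 * family_history
persist = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = np.column_stack([onset_age, family_history])
model = LogisticRegression().fit(X, persist)
odds_ratios = np.exp(model.coef_[0])       # effect size per predictor as odds ratio
print(odds_ratios)
```

    Reporting exponentiated coefficients as odds ratios is the usual way the results of such a model are interpreted in risk factor studies.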

  9. Frequencies and Flutter Speed Estimation for Damaged Aircraft Wing Using Scaled Equivalent Plate Analysis

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, Thiagarajan

    2010-01-01

    Equivalent plate analysis is often used to replace the computationally expensive finite element analysis in initial design stages or in conceptual design of aircraft wing structures. The equivalent plate model can also be used to design a wind tunnel model to match the stiffness characteristics of the wing box of a full-scale aircraft wing model while satisfying strength-based requirements. An equivalent plate analysis technique is presented to predict the static and dynamic response of an aircraft wing with or without damage. First, a geometric scale factor and a dynamic pressure scale factor are defined to relate the stiffness, load and deformation of the equivalent plate to the aircraft wing. A procedure using an optimization technique is presented to create scaled equivalent plate models from the full-scale aircraft wing using the geometric and dynamic pressure scale factors. The scaled models are constructed by matching the stiffness of the scaled equivalent plate with the scaled aircraft wing stiffness. It is demonstrated that the scaled equivalent plate model can be used to predict the deformation of the aircraft wing accurately. Once the full equivalent plate geometry is obtained, any other scaled equivalent plate geometry can be obtained using the geometric scale factor. Next, an average frequency scale factor is defined as the average ratio of the frequencies of the aircraft wing to the frequencies of the full-scale equivalent plate. The average frequency scale factor combined with the geometric scale factor is used to predict the frequency response of the aircraft wing from the scaled equivalent plate analysis. A procedure is outlined to estimate the frequency response and the flutter speed of an aircraft wing from the equivalent plate analysis using the frequency scale factor and geometric scale factor. The equivalent plate analysis is demonstrated using an aircraft wing without damage and another with damage. Both problems show that the scaled equivalent plate analysis can be successfully used to predict the frequencies and flutter speed of a typical aircraft wing.

  10. The Surface Brightness Contribution of II Peg: A Comparison of TiO Band Analysis and Doppler Imaging

    NASA Astrophysics Data System (ADS)

    Senavci, H. V.; O'Neal, D.; Hussain, G. A. J.; Barnes, J. R.

    2015-01-01

    We investigate the surface brightness contribution of the well-known active SB1 binary II Pegasi, to determine the starspot filling factor and the spot temperature parameters. In this context, we analyze 54 spectra of the system taken over 6 nights in September-October of 1996, using the 2.1 m Otto Struve Telescope equipped with the SES at the McDonald Observatory. We measure the spot temperatures and spot filling factors by fitting TiO molecular bands in this spectroscopic dataset, with model atmosphere approximations using ATLAS9 and with proxy stars observed with the same instrument. The same dataset is then used to produce surface spot maps using the Doppler imaging technique. We compare the spot filling factors obtained with the two independent techniques in order to better characterise the spot properties of the system and to better assess the limitations inherent to both techniques. The results obtained from both techniques show that the variation of spot filling factor as a function of phase agrees well between the two, while the amount of TiO and DI spot…

  11. Using data mining techniques to predict the severity of bicycle crashes.

    PubMed

    Prati, Gabriele; Pietrantoni, Luca; Fraboni, Federico

    2017-04-01

    To investigate the factors predicting severity of bicycle crashes in Italy, we used an observational study of official statistics. We applied two of the most widely used data mining techniques, the CHAID decision tree technique and Bayesian network analysis. We used data provided by the Italian National Institute of Statistics on road crashes that occurred on the Italian road network during the period from 2011 to 2013. We extracted 49,621 road accidents where at least one cyclist was injured or killed from the original database that comprised a total of 575,093 road accidents. The CHAID decision tree technique was employed to establish the relationship between severity of bicycle crashes and factors related to crash characteristics (type of collision and opponent vehicle), infrastructure characteristics (type of carriageway, road type, road signage, pavement type, and type of road segment), cyclists (gender and age), and environmental factors (time of the day, day of the week, month, pavement condition, and weather). CHAID analysis revealed that the most important predictors were, in decreasing order of importance, road type (0.30), crash type (0.24), age of cyclist (0.19), road signage (0.08), gender of cyclist (0.07), type of opponent vehicle (0.05), month (0.04), and type of road segment (0.02). These eight most important predictors of the severity of bicycle crashes were included as predictors of the target (i.e., severity of bicycle crashes) in Bayesian network analysis. Bayesian network analysis identified crash type (0.31), road type (0.19), and type of opponent vehicle (0.18) as the most important predictors of severity of bicycle crashes. Copyright © 2017 Elsevier Ltd. All rights reserved.
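    The predictor-ranking step can be illustrated with a stand-in: CHAID is not implemented in scikit-learn, so the sketch below uses a CART decision tree and its impurity-based feature importances on simulated crash data. All feature names, codings, and the effect structure are hypothetical, not the ISTAT dataset.

```python
# Illustrative stand-in: a CART decision tree (not CHAID) ranking predictors
# of synthetic crash severity. Data-generating assumptions are invented.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n = 2000
road_type = rng.integers(0, 3, n)        # e.g. urban / rural / motorway (assumed codes)
crash_type = rng.integers(0, 4, n)       # pure noise in this toy model
cyclist_age = rng.integers(10, 90, n)
# Severity is driven mostly by road type and age here, by construction
severe = ((road_type == 2) & (cyclist_age > 60) | (rng.random(n) < 0.05)).astype(int)

X = np.column_stack([road_type, crash_type, cyclist_age])
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, severe)
importance = dict(zip(["road_type", "crash_type", "cyclist_age"],
                      tree.feature_importances_))
print(importance)
```

    Impurity-based importances sum to 1 and give a ranking comparable in spirit, though not in method, to the CHAID importance scores quoted in the abstract.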

  12. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    NASA Astrophysics Data System (ADS)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potency of two GIS-based data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by the poor zone of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by good and very good potential zones, respectively. The validation of outcomes displayed that the area under the curve (AUC) of the SI and DST techniques is 81.23% and 79.41%, respectively, which shows that the SI method has slightly better performance than the DST technique. Therefore, the SI and DST methods are advantageous to analyze groundwater capacity and scrutinize the complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, it can be realized that these techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.

  13. Accelerated Bayesian model-selection and parameter-estimation in continuous gravitational-wave searches with pulsar-timing arrays

    NASA Astrophysics Data System (ADS)

    Taylor, Stephen; Ellis, Justin; Gair, Jonathan

    2014-11-01

    We describe several new techniques which accelerate Bayesian searches for continuous gravitational-wave emission from supermassive black-hole binaries using pulsar-timing arrays. These techniques mitigate the problematic increase of search dimensionality with the size of the pulsar array which arises from having to include an extra parameter per pulsar as the array is expanded. This extra parameter corresponds to searching over the phase of the gravitational wave as it propagates past each pulsar so that we can coherently include the pulsar term in our search strategies. Our techniques make the analysis tractable with powerful evidence-evaluation packages like MultiNest. We find good agreement of our techniques with the parameter-estimation and Bayes factor evaluation performed with full signal templates and conclude that these techniques make excellent first-cut tools for detection and characterization of continuous gravitational-wave signals with pulsar-timing arrays. Crucially, at low to moderate signal-to-noise ratios the factor by which the analysis is sped up can be ≳100 , permitting rigorous programs of systematic injection and recovery of signals to establish robust detection criteria within a Bayesian formalism.

  14. Addressing the Hard Factors for Command File Errors by Probabilistic Reasoning

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Bryant, Larry

    2014-01-01

    Command File Errors (CFEs) are managed using standard risk management approaches at the Jet Propulsion Laboratory. Over the last few years, more emphasis has been placed on the collection, organization, and analysis of these errors for the purpose of reducing CFE rates. More recently, probabilistic modeling techniques have been used for more in-depth analysis of the perceived error rates of the DAWN mission and for managing the soft factors in the upcoming phases of the mission. We broadly classify the factors that can lead to CFEs as soft factors, which relate to the cognition of the operators, and hard factors, which relate to the Mission System, which is composed of the hardware, software and procedures used for the generation, verification and validation, and execution of commands. The focus of this paper is to use probabilistic models that represent multiple missions at JPL to determine the root cause and sensitivities of the various components of the mission system and to develop recommendations and techniques for addressing them. The customization of these multi-mission models to a sample interplanetary spacecraft is done for this purpose.

  15. Mapping brain activity in gradient-echo functional MRI using principal component analysis

    NASA Astrophysics Data System (ADS)

    Khosla, Deepak; Singh, Manbir; Don, Manuel

    1997-05-01

    The detection of sites of brain activation in functional MRI has been a topic of immense research interest and many techniques have been proposed to this end. Recently, principal component analysis (PCA) has been applied to extract the activated regions and their time course of activation. This method is based on the assumption that the activation is orthogonal to other signal variations such as brain motion, physiological oscillations and other uncorrelated noises. A distinct advantage of this method is that it does not require any knowledge of the time course of the true stimulus paradigm. This technique is well suited to EPI image sequences where the sampling rate is high enough to capture the effects of physiological oscillations. In this work, we propose and apply two methods that are based on PCA to conventional gradient-echo images and investigate their usefulness as tools to extract reliable information on brain activation. The first method is a conventional technique where a single image sequence with alternating on and off stages is subjected to a principal component analysis. The second method is a PCA-based approach called the common spatial factor analysis (CSF) technique. As the name suggests, this method relies on common spatial factors between the above fMRI image sequence and a background fMRI. We have applied these methods to identify active brain areas during visual stimulation and motor tasks. The results from these methods are compared to those obtained by using the standard cross-correlation technique. We found good agreement in the areas identified as active across all three techniques. The results suggest that the PCA and CSF methods have good potential in detecting the true stimulus-correlated changes in the presence of other interfering signals.
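    The core idea, that PCA can recover an activation time course without knowing the stimulus paradigm, can be shown on synthetic data. The sketch below embeds a block on/off signal in a few "voxels" of a random image sequence; all dimensions, amplitudes, and noise levels are illustrative assumptions, not parameters from the study.

```python
# Minimal sketch of paradigm-free PCA detection: a block on/off activation is
# embedded in 20 "voxels" of a synthetic sequence; the leading principal
# component should recover its time course without being told the paradigm.
import numpy as np

rng = np.random.default_rng(1)
n_time, n_vox = 80, 500
paradigm = np.tile([0.0] * 10 + [1.0] * 10, 4)   # alternating off/on blocks (assumed)
data = rng.normal(0.0, 1.0, (n_time, n_vox))
data[:, :20] += 5.0 * paradigm[:, None]          # 20 "activated" voxels

centered = data - data.mean(axis=0)
# PCA via SVD of the time-by-voxel matrix
u, s, vt = np.linalg.svd(centered, full_matrices=False)
pc1_timecourse = u[:, 0] * s[0]
corr = abs(np.corrcoef(pc1_timecourse, paradigm)[0, 1])
print(round(corr, 3))
```

    The absolute correlation is used because the sign of a principal component is arbitrary; the spatial pattern of the activated voxels is correspondingly found in the first row of `vt`.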

  16. Chemical Fingerprinting of Materials Developed Due To Environmental Issues

    NASA Technical Reports Server (NTRS)

    Smith, Doris A.; McCool, A. (Technical Monitor)

    2000-01-01

    This paper presents viewgraphs on chemical fingerprinting of materials developed due to environmental issues. Some of the topics include: 1) Aerospace Materials; 2) Building Blocks of Capabilities; 3) Spectroscopic Techniques; 4) Chromatographic Techniques; 5) Factors that Determine Fingerprinting Approach; and 6) Fingerprinting: Combination of instrumental analysis methods that diagnostically characterize a material.

  17. Endoscopic versus traditional saphenous vein harvesting: a prospective, randomized trial.

    PubMed

    Allen, K B; Griffith, G L; Heimansohn, D A; Robison, R J; Matheny, R G; Schier, J J; Fitzgerald, E B; Shaar, C J

    1998-07-01

    Saphenous vein harvested with a traditional longitudinal technique often results in leg wound complications. An alternative endoscopic harvest technique may decrease these complications. One hundred twelve patients scheduled for elective coronary artery bypass grafting were prospectively randomized to have vein harvested using either an endoscopic (group A, n = 54) or traditional technique (group B, n = 58). Groups A and B, respectively, were similar with regard to length of vein harvested (41 ± 8 cm versus 40 ± 14 cm), bypasses done (4.1 ± 1.1 versus 4.2 ± 1.4), age, preoperative risk stratification, and risks for wound complication (diabetes, sex, obesity, preoperative anemia, hypoalbuminemia, and peripheral vascular disease). Leg wound complications were significantly (p ≤ 0.02) reduced in group A (4% [2 of 51] versus 19% [11 of 58]). Univariate analysis identified traditional incision (p ≤ 0.02) and diabetes (p ≤ 0.05) as wound complication risk factors. Multiple logistic regression analysis identified only the traditional harvest technique as a risk factor for leg wound complications with no significant interaction between harvest technique and any preoperative risk factor (p ≤ 0.03). Harvest rate (0.9 ± 0.4 cm/min versus 1.2 ± 0.5 cm/min) was slower for group A (p ≤ 0.02) and conversion from endoscopic to a traditional harvest occurred in 5.6% (3 of 54) of patients. In a prospective, randomized trial, saphenous vein harvested endoscopically was associated with fewer wound complications than the traditional longitudinal method.

  18. Using factor analysis to identify neuromuscular synergies during treadmill walking

    NASA Technical Reports Server (NTRS)

    Merkle, L. A.; Layne, C. S.; Bloomberg, J. J.; Zhang, J. J.

    1998-01-01

    Neuroscientists are often interested in grouping variables to facilitate understanding of a particular phenomenon. Factor analysis is a powerful statistical technique that groups variables into conceptually meaningful clusters, but remains underutilized by neuroscience researchers presumably due to its complicated concepts and procedures. This paper illustrates an application of factor analysis to identify coordinated patterns of whole-body muscle activation during treadmill walking. Ten male subjects walked on a treadmill (6.4 km/h) for 20 s during which surface electromyographic (EMG) activity was obtained from the left side sternocleidomastoid, neck extensors, erector spinae, and right side biceps femoris, rectus femoris, tibialis anterior, and medial gastrocnemius. Factor analysis revealed 65% of the variance of seven muscles sampled aligned with two orthogonal factors, labeled 'transition control' and 'loading'. These two factors describe coordinated patterns of muscular activity across body segments that would not be evident by evaluating individual muscle patterns. The results show that factor analysis can be effectively used to explore relationships among muscle patterns across all body segments to increase understanding of the complex coordination necessary for smooth and efficient locomotion. We encourage neuroscientists to consider using factor analysis to identify coordinated patterns of neuromuscular activation that would be obscured using more traditional EMG analyses.
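    A minimal sketch of the technique on simulated data: two latent "synergies" generate seven observed channels, and factor analysis is asked to recover a two-factor structure. The loading pattern and noise level below are invented for illustration; only the number of channels follows the study.

```python
# Hedged sketch: two-factor model recovered from synthetic "EMG envelope" data.
# The latent structure and loadings are assumptions for illustration only.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(7)
n = 400
latent = rng.normal(size=(n, 2))                        # two hidden synergies
loadings = np.array([[1.0, 0.0], [0.9, 0.1], [0.8, 0.2],   # "factor 1" channels
                     [0.1, 0.9], [0.0, 1.0], [0.2, 0.8],   # "factor 2" channels
                     [0.5, 0.5]])                          # a mixed channel
emg = latent @ loadings.T + 0.3 * rng.normal(size=(n, 7))  # 7 observed channels

fa = FactorAnalysis(n_components=2, random_state=0).fit(emg)
print(fa.components_.shape)   # recovered loading matrix: 2 factors x 7 channels
```

    In practice the recovered loadings are rotated (e.g. varimax) before interpretation, which is how labels such as "transition control" and "loading" would be assigned to the factors.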

  19. Use of Latent Profile Analysis in Studies of Gifted Students

    ERIC Educational Resources Information Center

    Mammadov, Sakhavat; Ward, Thomas J.; Cross, Jennifer Riedl; Cross, Tracy L.

    2016-01-01

    To date, in gifted education and related fields various conventional factor analytic and clustering techniques have been used extensively for investigation of the underlying structure of data. Latent profile analysis is a relatively new method in the field. In this article, we provide an introduction to latent profile analysis for gifted education…

  20. Measuring the Impact of Education on Productivity. Working Paper #261.

    ERIC Educational Resources Information Center

    Plant, Mark; Welch, Finis

    A theoretical and conceptual analysis of techniques used to measure education's contribution to productivity is followed by a discussion of the empirical measures implemented by various researchers. Standard methods of growth accounting make sense for simple measurement of factor contributions where outputs are well measured and when factor growth…

  1. Development of an FAA-EUROCONTROL technique for the analysis of human error in ATM : final report.

    DOT National Transportation Integrated Search

    2002-07-01

    Human error has been identified as a dominant risk factor in safety-oriented industries such as air traffic control (ATC). However, little is known about the factors leading to human errors in current air traffic management (ATM) systems. The first s...

  2. AGARD Flight Test Techniques Series. Volume 14. Introduction to Flight Test Engineering (Introduction a la Technique d’essais en vol)

    DTIC Science & Technology

    1995-09-01

    path and aircraft attitude and other flight or aircraft parameters • Calculations in the frequency domain ( Fast Fourier Transform) • Data analysis...Signal filtering Image processing of video and radar data Parameter identification Statistical analysis Power spectral density Fast Fourier Transform...airspeeds both fast and slow, altitude, load factor both above and below 1g, centers of gravity (fore and aft), and with system/subsystem failures. Whether

  3. Accuracy of trace element determinations in alternate fuels

    NASA Technical Reports Server (NTRS)

    Greenbauer-Seng, L. A.

    1980-01-01

    A review of the techniques used at Lewis Research Center (LeRC) in trace metals analysis is presented, including the results of Atomic Absorption Spectrometry and DC Arc Emission Spectrometry of blank levels and recovery experiments for several metals. The design of an Interlaboratory Study conducted by LeRC is presented. Several factors were investigated, including: laboratory, analytical technique, fuel type, concentration, and ashing additive. Conclusions drawn from the statistical analysis will help direct research efforts toward those areas most responsible for the poor interlaboratory analytical results.

  4. Matrix Assisted Laser Desorption Ionization Mass Spectrometric Analysis of Bacillus anthracis: From Fingerprint Analysis of the Bacterium to Quantification of its Toxins in Clinical Samples

    NASA Astrophysics Data System (ADS)

    Woolfitt, Adrian R.; Boyer, Anne E.; Quinn, Conrad P.; Hoffmaster, Alex R.; Kozel, Thomas R.; de, Barun K.; Gallegos, Maribel; Moura, Hercules; Pirkle, James L.; Barr, John R.

    A range of mass spectrometry-based techniques have been used to identify, characterize and differentiate Bacillus anthracis, both in culture for forensic applications and for diagnosis during infection. This range of techniques could usefully be considered to exist as a continuum, based on the degrees of specificity involved. We show two examples here, a whole-organism fingerprinting method and a high-specificity assay for one unique protein, anthrax lethal factor.

  5. Water quality analysis of the Rapur area, Andhra Pradesh, South India using multivariate techniques

    NASA Astrophysics Data System (ADS)

    Nagaraju, A.; Sreedhar, Y.; Thejaswi, A.; Sayadi, Mohammad Hossein

    2017-10-01

    The groundwater samples from the Rapur area were collected from different sites to evaluate the major ion chemistry. The large number of data can lead to difficulties in the integration, interpretation, and representation of the results. Two multivariate statistical methods, hierarchical cluster analysis (HCA) and factor analysis (FA), were applied to evaluate their usefulness to classify and identify geochemical processes controlling groundwater geochemistry. Four statistically significant clusters were obtained from 30 sampling stations. This resulted in two important clusters, viz., cluster 1 (pH, Si, CO3, Mg, SO4, Ca, K, HCO3, alkalinity, Na, Na + K, Cl, and hardness) and cluster 2 (EC and TDS), which are released to the study area from different sources. The application of different multivariate statistical techniques, such as principal component analysis (PCA), assists in the interpretation of complex data matrices for a better understanding of the water quality of a study area. From PCA, it is clear that the first factor (factor 1), which accounted for 36.2% of the total variance, had high positive loadings on EC, Mg, Cl, TDS, and hardness. Based on the PCA scores, four significant cluster groups of sampling locations were detected on the basis of similarity of their water quality.
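    The variable-clustering step (HCA on water-quality parameters) can be sketched as follows, on simulated standardized data in which EC and TDS are made strongly correlated, mirroring the reported cluster 2. Parameter names follow the abstract; the values and the 30-site dimension are simulated.

```python
# Illustrative HCA sketch: cluster the water-quality variables (columns), not
# the sites, using correlation distance. All data values are simulated.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(3)
n_sites = 30
base = rng.normal(size=n_sites)          # shared signal driving EC and TDS
params = {
    "pH": rng.normal(size=n_sites),
    "Ca": rng.normal(size=n_sites),
    "Na": rng.normal(size=n_sites),
    "EC": base + 0.05 * rng.normal(size=n_sites),
    "TDS": base + 0.05 * rng.normal(size=n_sites),
}
names = list(params)
X = np.column_stack([params[k] for k in names])

# Average-linkage clustering of variables with 1 - correlation as distance
Z = linkage(X.T, method="average", metric="correlation")
labels = fcluster(Z, t=2, criterion="maxclust")
groups = dict(zip(names, labels))
print(groups)
```

    Cutting the dendrogram into two flat clusters separates the tightly correlated EC/TDS pair from the remaining parameters, which is the qualitative pattern the abstract reports.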

  6. Impact of Damping Uncertainty on SEA Model Response Variance

    NASA Technical Reports Server (NTRS)

    Schiller, Noah; Cabell, Randolph; Grosveld, Ferdinand

    2010-01-01

    Statistical Energy Analysis (SEA) is commonly used to predict high-frequency vibroacoustic levels. This statistical approach provides the mean response over an ensemble of random subsystems that share the same gross system properties such as density, size, and damping. Recently, techniques have been developed to predict the ensemble variance as well as the mean response. However, these techniques do not account for uncertainties in the system properties. In the present paper uncertainty in the damping loss factor is propagated through SEA to obtain more realistic prediction bounds that account for both ensemble and damping variance. The analysis is performed on a floor-equipped cylindrical test article that resembles an aircraft fuselage. Realistic bounds on the damping loss factor are determined from measurements acquired on the sidewall of the test article. The analysis demonstrates that uncertainties in damping have the potential to significantly impact the mean and variance of the predicted response.

  7. A stiffness derivative finite element technique for determination of crack tip stress intensity factors

    NASA Technical Reports Server (NTRS)

    Parks, D. M.

    1974-01-01

    A finite element technique for determination of elastic crack tip stress intensity factors is presented. The method, based on the energy release rate, requires no special crack tip elements. Further, the solution for only a single crack length is required, and the crack is 'advanced' by moving nodal points rather than by removing nodal tractions at the crack tip and performing a second analysis. The promising straightforward extension of the method to general three-dimensional crack configurations is presented and contrasted with the practical impossibility of conventional energy methods.

  8. Factors that Affect Poverty Areas in North Sumatera Using Discriminant Analysis

    NASA Astrophysics Data System (ADS)

    Nasution, D. H.; Bangun, P.; Sitepu, H. R.

    2018-04-01

    In Indonesia, especially North Sumatera, poverty is one of the fundamental problems that is a focus of both central and local government. Although the poverty rate has decreased, many people remain poor. Poverty covers several aspects such as education, health, demographics, and also structural and cultural factors. This research discusses several factors, such as population density, unemployment rate, GDP per capita (ADHK), GDP per capita (ADHB), economic growth and life expectancy, that affect poverty in Indonesia. To determine the factors that most influence and differentiate the level of poverty of the Regency/City of North Sumatra, the discriminant analysis method was used. Discriminant analysis is a multivariate analysis technique used to classify data into groups based on the dependent variable and independent variables. Using discriminant analysis, it is evident that the factor affecting poverty is the unemployment rate.
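    A hedged sketch of the discriminant-analysis step on simulated data: regions are classified into two poverty groups from macro indicators, with the group label driven mainly by the unemployment rate, echoing the study's conclusion. The variable names mirror the abstract; all values, scales, and effect sizes are invented.

```python
# Illustrative linear discriminant analysis on synthetic regional indicators.
# Values and the data-generating rule are assumptions, not North Sumatera data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(5)
n = 150
unemployment = rng.normal(6, 2, n)        # % (hypothetical)
gdp_per_capita = rng.normal(50, 10, n)    # arbitrary units (hypothetical)
life_expectancy = rng.normal(70, 3, n)    # years (hypothetical)
# Group label driven mainly by the unemployment rate in this toy model
high_poverty = (unemployment + 0.5 * rng.normal(size=n) > 6).astype(int)

X = np.column_stack([unemployment, gdp_per_capita, life_expectancy])
lda = LinearDiscriminantAnalysis().fit(X, high_poverty)
accuracy = lda.score(X, high_poverty)
print(round(accuracy, 3))
```

    Because only one variable carries the group separation here, the fitted discriminant function weights unemployment heavily, which is how such an analysis identifies the dominant discriminating factor.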

  9. Peptidomics: the integrated approach of MS, hyphenated techniques and bioinformatics for neuropeptide analysis.

    PubMed

    Boonen, Kurt; Landuyt, Bart; Baggerman, Geert; Husson, Steven J; Huybrechts, Jurgen; Schoofs, Liliane

    2008-02-01

    MS is currently one of the most important analytical techniques in biological and medical research. ESI and MALDI launched the field of MS into biology. The performance of mass spectrometers increased tremendously over the past decades. Other technological advances increased the analytical power of biological MS even more. First, the advent of the genome projects allowed an automated analysis of mass spectrometric data. Second, improved separation techniques, like nanoscale HPLC, are essential for MS analysis of biomolecules. The recent progress in bioinformatics is the third factor that accelerated the biochemical analysis of macromolecules. The first part of this review will introduce the basics of these techniques. The field that integrates all these techniques to identify endogenous peptides is called peptidomics and will be discussed in the last section. This integrated approach aims at identifying all the present peptides in a cell, organ or organism (the peptidome). Today, peptidomics is used by several fields of research. Special emphasis will be given to the identification of neuropeptides, a class of short proteins that fulfil several important intercellular signalling functions in every animal. MS imaging techniques and biomarker discovery will also be discussed briefly.

  10. Boundary formulations for sensitivity analysis without matrix derivatives

    NASA Technical Reports Server (NTRS)

    Kane, J. H.; Guru Prasad, K.

    1993-01-01

    A new hybrid approach to continuum structural shape sensitivity analysis employing boundary element analysis (BEA) is presented. The approach uses iterative reanalysis to obviate the need to factor perturbed matrices in the determination of surface displacement and traction sensitivities via a univariate perturbation/finite difference (UPFD) step. The UPFD approach makes it possible to immediately reuse existing subroutines for computation of BEA matrix coefficients in the design sensitivity analysis process. The reanalysis technique computes economical response of univariately perturbed models without factoring perturbed matrices. The approach provides substantial computational economy without the burden of a large-scale reprogramming effort.

  11. Quality Assessment of College Admissions Processes.

    ERIC Educational Resources Information Center

    Fisher, Caroline; Weymann, Elizabeth; Todd, Amy

    2000-01-01

    This study evaluated the admissions process for a Master's in Business Administration Program using such quality improvement techniques as customer surveys, benchmarking, and gap analysis. Analysis revealed that student dissatisfaction with the admissions process may be a factor influencing declining enrollment. Cycle time and number of student…

  12. A Proposed Data Fusion Architecture for Micro-Zone Analysis and Data Mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin McCarthy; Milos Manic

    Data Fusion requires the ability to combine or “fuse” data from multiple data sources. Time Series Analysis is a data mining technique used to predict future values from a data set based upon past values. Unlike other data mining techniques, however, Time Series places special emphasis on periodicity and how seasonal and other time-based factors tend to affect trends over time. One of the difficulties encountered in developing generic time series techniques is the wide variability of the data sets available for analysis. This presents challenges all the way from the data gathering stage to results presentation. This paper presents an architecture designed and used to facilitate the collection of disparate data sets well suited to Time Series analysis as well as other predictive data mining techniques. Results show this architecture provides a flexible, dynamic framework for the capture and storage of a myriad of dissimilar data sets and can serve as a foundation from which to build a complete data fusion architecture.

  13. Investigation of machinability characteristics on EN47 steel for cutting force and tool wear using optimization technique

    NASA Astrophysics Data System (ADS)

    M, Vasu; Shivananda Nayaka, H.

    2018-06-01

    In this experimental work, a dry turning process carried out on EN47 spring steel with a coated tungsten carbide tool insert of 0.8 mm nose radius is optimized using statistical techniques. Experiments were conducted at three different cutting speeds (625, 796 and 1250 rpm) with three different feed rates (0.046, 0.062 and 0.093 mm/rev) and depths of cut (0.2, 0.3 and 0.4 mm). The experiments followed a 3^3 full factorial design (FFD): three factors at three levels. Analysis of variance is used to identify the significant factors for each output response. The results reveal that feed rate is the most significant factor influencing cutting force, followed by depth of cut, with cutting speed having less significance. The optimum machining condition for cutting force was obtained from the statistical technique. Tool wear measurements were performed at the optimum condition of Vc = 796 rpm, ap = 0.2 mm, f = 0.046 mm/rev. The minimum tool wear was observed to be 0.086 mm after 5 min of machining. Tool wear was analyzed with a confocal microscope; it was observed that tool wear increases with increasing cutting time.

  14. Optimization of sol-gel technique for coating of metallic substrates by hydroxyapatite using the Taguchi method

    NASA Astrophysics Data System (ADS)

    Pourbaghi-Masouleh, M.; Asgharzadeh, H.

    2013-08-01

    In this study, the Taguchi method of design of experiments (DOE) was used to optimize hydroxyapatite (HA) coatings on various metallic substrates deposited by the sol-gel dip-coating technique. The experimental design consisted of five factors, each at three levels: substrate material (A), surface preparation of the substrate (B), dipping/withdrawal speed (C), number of layers (D), and calcination temperature (E). An orthogonal array of L18 type with mixed levels of the control factors was utilized. Image processing of micrographs of the coatings was conducted to determine the percentage of coated area (PCA). The chemical and phase composition of the HA coatings was studied by XRD, FT-IR, SEM, and EDS techniques. The analysis of variance (ANOVA) indicated that the PCA of the HA coatings was significantly affected by the calcination temperature. The optimum conditions from signal-to-noise (S/N) ratio analysis were A: pure Ti, B: polishing and etching for 24 h, C: 50 cm min-1, D: 1, and E: 300 °C. In the confirmation experiment using the optimum conditions, an HA coating with a high PCA of 98.5 % was obtained.

  15. Confirmatory factor analysis using Microsoft Excel.

    PubMed

    Miles, Jeremy N V

    2005-11-01

    This article presents a method for using Microsoft (MS) Excel for confirmatory factor analysis (CFA). CFA is often seen as an impenetrable technique, and thus, when it is taught, there is frequently little explanation of the mechanisms or underlying calculations. The aim of this article is to demonstrate that this is not the case; it is relatively straightforward to produce a spreadsheet in MS Excel that can carry out simple CFA. It is possible, with few or no programming skills, to effectively program a CFA analysis and, thus, to gain insight into the workings of the procedure.
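
    The kind of computation the article implements in a spreadsheet can be sketched in a few lines of numpy. Below is a hedged illustration of a least-squares one-factor fit, using a principal-axis-style alternation between loadings and uniquenesses (not necessarily the article's exact algorithm), on a hypothetical covariance matrix with known loadings.

```python
import numpy as np

# Hypothetical covariance matrix for 4 indicators of one latent factor
# (true loadings 0.9, 0.8, 0.7, 0.6; uniquenesses fill the diagonal to 1).
lam_true = np.array([0.9, 0.8, 0.7, 0.6])
S = np.outer(lam_true, lam_true)
np.fill_diagonal(S, 1.0)

# Least-squares one-factor fit: alternate between updating the loadings
# (leading eigenvector of the reduced matrix) and the uniquenesses.
psi = np.full(4, 0.5)
for _ in range(200):
    reduced = S - np.diag(psi)
    w, V = np.linalg.eigh(reduced)          # eigenvalues in ascending order
    lam = V[:, -1] * np.sqrt(max(w[-1], 0.0))
    psi = np.clip(np.diag(S) - lam**2, 1e-6, None)
lam = np.abs(lam)  # the sign of a factor is arbitrary

print(np.round(lam, 3))
```

    On these exact-fit data the iteration recovers the generating loadings, which is the insight the spreadsheet exercise aims to provide: simple CFA is just repeated, transparent matrix arithmetic.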

  16. How Does One Assess the Accuracy of Academic Success Predictors? ROC Analysis Applied to University Entrance Factors

    ERIC Educational Resources Information Center

    Vivo, Juana-Maria; Franco, Manuel

    2008-01-01

    This article attempts to present a novel application of a method of measuring accuracy for academic success predictors that could be used as a standard. This procedure is known as the receiver operating characteristic (ROC) curve, which comes from statistical decision techniques. The statistical prediction techniques provide predictor models and…
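
    The ROC construction described above is simple to illustrate. The following minimal numpy sketch uses hypothetical predictor scores and outcomes (not the article's data): every observed score is swept as a decision threshold, and the area under the curve is computed with the trapezoidal rule.

```python
import numpy as np

# Hypothetical entrance-factor scores and outcomes (1 = academic success).
scores  = np.array([0.9, 0.8, 0.75, 0.6, 0.55, 0.4, 0.3, 0.2])
success = np.array([1, 1, 0, 1, 0, 0, 1, 0])
pos, neg = (success == 1).sum(), (success == 0).sum()

# Sweep every observed score as a threshold to trace the ROC curve.
thresholds = np.sort(np.unique(scores))[::-1]
tpr = np.array([((scores >= t) & (success == 1)).sum() / pos for t in thresholds])
fpr = np.array([((scores >= t) & (success == 0)).sum() / neg for t in thresholds])
tpr = np.concatenate(([0.0], tpr))
fpr = np.concatenate(([0.0], fpr))

# Area under the ROC curve by the trapezoidal rule (0.5 = chance, 1 = perfect).
auc = float(np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2))
print(round(auc, 2))  # → 0.75
```

    An AUC of 0.75 would indicate a predictor that is clearly better than chance but far from perfect, which is the kind of standardized accuracy statement the article advocates.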

  17. Lamp mapping technique for independent determination of the water vapor mixing ratio calibration factor for a Raman lidar system

    NASA Astrophysics Data System (ADS)

    Venable, Demetrius D.; Whiteman, David N.; Calhoun, Monique N.; Dirisu, Afusat O.; Connell, Rasheen M.; Landulfo, Eduardo

    2011-08-01

    We have investigated a technique that allows for the independent determination of the water vapor mixing ratio calibration factor for a Raman lidar system. This technique utilizes a procedure whereby a light source of known spectral characteristics is scanned across the aperture of the lidar system's telescope and the overall optical efficiency of the system is determined. Direct analysis of the temperature-dependent differential scattering cross sections for vibration and vibration-rotation transitions (convolved with narrowband filters) along with the measured efficiency of the system, leads to a theoretical determination of the water vapor mixing ratio calibration factor. A calibration factor was also obtained experimentally from lidar measurements and radiosonde data. A comparison of the theoretical and experimentally determined values agrees within 5%. We report on the sensitivity of the water vapor mixing ratio calibration factor to uncertainties in parameters that characterize the narrowband transmission filters, the temperature-dependent differential scattering cross section, and the variability of the system efficiency ratios as the lamp is scanned across the aperture of the telescope used in the Howard University Raman Lidar system.

  18. Assessing Suicide Risk Among Callers to Crisis Hotlines: A Confirmatory Factor Analysis

    PubMed Central

    Witte, Tracy K.; Gould, Madelyn S.; Munfakh, Jimmie Lou Harris; Kleinman, Marjorie; Joiner, Thomas E.; Kalafat, John

    2012-01-01

    Our goal was to investigate the factor structure of a risk assessment tool utilized by suicide hotlines and to determine the predictive validity of the obtained factors in predicting subsequent suicidal behavior. 1,085 suicidal callers to crisis hotlines were divided into three sub-samples, which allowed us to conduct an independent Exploratory Factor Analysis (EFA), EFA in a Confirmatory Factor Analysis (EFA/CFA) framework, and CFA. Similar to previous factor analytic studies (Beck et al., 1997; Holden & DeLisle, 2005; Joiner, Rudd, & Rajab, 1997; Witte et al., 2006), we found consistent evidence for a two-factor solution, with one factor representing a more pernicious form of suicide risk (i.e., Resolved Plans and Preparations) and one factor representing more mild suicidal ideation (i.e., Suicidal Desire and Ideation). Using structural equation modeling techniques, we found preliminary evidence that the Resolved Plans and Preparations factor trended toward being more predictive of suicidal ideation than the Suicidal Desire and Ideation factor. This factor analytic study is the first longitudinal study of the obtained factors. PMID:20578186

  19. Application of Semiparametric Spline Regression Model in Analyzing Factors that Influence Population Density in Central Java

    NASA Astrophysics Data System (ADS)

    Sumantari, Y. D.; Slamet, I.; Sugiyanto

    2017-06-01

    Semiparametric regression is a statistical analysis method that combines parametric and nonparametric regression. There are various approach techniques in nonparametric regression, one of which is the spline. Central Java is one of the most densely populated provinces in Indonesia. Population density in this province can be modeled by semiparametric regression because it involves both parametric and nonparametric components. Therefore, the purpose of this paper is to determine the factors that influence population density in Central Java using the semiparametric spline regression model. The result shows that the factors which influence population density in Central Java are Family Planning (FP) active participants and the district minimum wage.

  20. Determining the Number of Factors to Retain in EFA: Using the SPSS R-Menu v2.0 to Make More Judicious Estimations

    ERIC Educational Resources Information Center

    Courtney, Matthew Gordon Ray

    2013-01-01

    Exploratory factor analysis (EFA) is a common technique utilized in the development of assessment instruments. The key question when performing this procedure is how to best estimate the number of factors to retain. This is especially important as under- or over-extraction may lead to erroneous conclusions. Although recent advancements have been…
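
    One widely used judicious estimator of the number of factors to retain is Horn's parallel analysis, which can be sketched as follows. The data are synthetic (two latent factors with hypothetical loadings), and this illustrates the general technique rather than the SPSS R-Menu implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: 300 respondents, 6 items driven by 2 latent factors.
n, p = 300, 6
f = rng.normal(size=(n, 2))
loadings = np.array([[0.8, 0.0], [0.7, 0.0], [0.6, 0.0],
                     [0.0, 0.8], [0.0, 0.7], [0.0, 0.6]])
X = f @ loadings.T + rng.normal(scale=0.5, size=(n, p))

obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(X.T)))[::-1]

# Horn's parallel analysis: compare the observed eigenvalues with the mean
# eigenvalues of correlation matrices from random data of the same shape.
sims = np.array([
    np.sort(np.linalg.eigvalsh(np.corrcoef(rng.normal(size=(n, p)).T)))[::-1]
    for _ in range(200)
])
ref_eig = sims.mean(axis=0)

# Retain only factors whose eigenvalues exceed the random reference.
n_factors = int(np.sum(obs_eig > ref_eig))
print(n_factors)
```

    Because only two eigenvalues rise above what random data of the same dimensions would produce, the procedure retains two factors, guarding against the over-extraction the abstract warns about.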

  1. The contribution of DInSAR techniques to landslide hazard evaluation in mountain and hilly regions: a case study from Agno Valley (North-Eastern Italian Alps)

    NASA Astrophysics Data System (ADS)

    De Agostini, A.; Floris, M.; Pasquali, P.; Barbieri, M.; Cantone, A.; Riccardi, P.; Stevan, G.; Genevois, R.

    2012-04-01

    In the last twenty years, Differential Synthetic Aperture Radar Interferometry (DInSAR) techniques have been widely used to investigate geological processes such as subsidence, earthquakes and landslides, through the evaluation of the earth surface displacements they cause. In the study of mass movements, the contribution of interferometry can be limited by the acquisition geometry of RADAR images and by the rough morphology of the mountain and hilly regions that are typical landslide-prone areas. In this study, the advanced DInSAR techniques (i.e. the Small Baseline Subset and Persistent Scatterers techniques) available in the SARscape software are used. These methods involve the use of multiple acquisition stacks (large SAR temporal series), allowing improvements and refinements in landslide identification, characterization and hazard evaluation at the basin scale. The potential and limits of the above-mentioned techniques are outlined and discussed. The study area is the Agno Valley, located in the north-eastern sector of the Italian Alps within the Vicenza Province (Veneto Region, Italy). This area and the entire Vicenza Province were hit by an exceptional rainfall event in November 2010 that triggered more than 500 slope instabilities. The main aim of the work is to verify whether the spatial information available before the rainfall event, including ERS and ENVISAT RADAR data from 1992 to 2010, was able to predict the landslides that occurred in the study area, in order to implement an effective forecasting model. In the first step of the work, a susceptibility analysis is carried out using the landslide dataset from the IFFI project (Inventario Fenomeni Franosi in Italia, the Italian Landslide Inventory) and related predisposing factors, which consist of morphometric (elevation, slope, aspect and curvature) and non-morphometric (land use, distance from roads and distance from rivers) factors available from the Veneto Region spatial database. Then, to test the prediction, the results of the susceptibility analysis are compared with the locations of landslides that occurred in the study area during the November 2010 rainfall event. In the second step, the results of the DInSAR analysis (displacement maps over time) are added to the prediction analysis to build a map containing both spatial and temporal information on landslides and, as in the previous case, the prediction is tested using the November 2010 instability dataset. Comparison of the two tests allows us to evaluate the contribution of the interferometric techniques. Finally, morphometric factors and interferometric RADAR data are combined to design a preliminary analysis scheme that provides information on the possible use of DInSAR techniques in the landslide hazard evaluation of a given area.

  2. Comparison of data inversion techniques for remotely sensed wide-angle observations of Earth emitted radiation

    NASA Technical Reports Server (NTRS)

    Green, R. N.

    1981-01-01

    The shape factor, parameter estimation, and deconvolution data analysis techniques were applied to the same set of Earth emitted radiation measurements to determine the effects of different techniques on the estimated radiation field. All three techniques are defined and their assumptions, advantages, and disadvantages are discussed. Their results are compared globally, zonally, regionally, and on a spatial spectrum basis. The standard deviations of the regional differences in the derived radiant exitance varied from 7.4 W·m⁻² to 13.5 W·m⁻².

  3. Otoplasty: A graduated approach.

    PubMed

    Foda, H M

    1999-01-01

    Numerous otoplastic techniques have been described for the correction of protruding ears. Technique selection in otoplasty should be done only after careful analysis of the abnormal anatomy responsible for the protruding ear deformity. A graduated surgical approach is presented that is designed to address all factors contributing to the presenting auricular deformity. The approach starts with the more conservative cartilage-sparing suturing techniques, then proceeds to incorporate other, more aggressive cartilage-weakening maneuvers. Applying this approach resulted in better long-term results, with less postoperative lateralization than encountered when using the cartilage-sparing techniques alone.

  4. Glutenite bodies sequence division of the upper Es4 in northern Minfeng zone of Dongying Sag, Bohai Bay Basin, China

    NASA Astrophysics Data System (ADS)

    Shao, Xupeng

    2017-04-01

    Glutenite bodies are widely developed in the northern Minfeng zone of the Dongying Sag, and their litho-electric relationship is not clear. In addition, because the conventional sequence stratigraphic research method has the drawback of involving too many subjective human factors, it has limited the deepening of regional sequence stratigraphic research. Compared with the conventional methods, the wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data have the advantage of dividing sequence stratigraphy quantitatively. Building on the conventional sequence research method, this paper used the above techniques to divide the fourth-order sequences of the upper Es4 in the northern Minfeng zone of the Dongying Sag. The research shows that the wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data are essentially consistent, both dividing sequence stratigraphy quantitatively in the frequency domain. The wavelet transform technique has a high resolution and is suitable for areas with wells; the seismic time-frequency analysis technique has wide applicability but a low resolution; therefore the two techniques should be combined. The upper Es4 in the northern Minfeng zone of the Dongying Sag is a complete third-order sequence, which can be further subdivided into 5 fourth-order sequences with the depositional characteristics of a fining-upward sequence in granularity. Key words: Dongying Sag, northern Minfeng zone, wavelet transform technique, time-frequency analysis technique, the upper Es4, sequence stratigraphy
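
    The idea of localizing sequence boundaries with a wavelet transform can be illustrated with a one-level Haar transform in numpy. The "log curve" below is entirely hypothetical (a step in the baseline standing in for a sequence boundary), and the Haar wavelet is chosen for brevity, not because the paper used it.

```python
import numpy as np

# Hypothetical log curve: two depositional units with different baselines,
# with the abrupt change standing in for a sequence boundary.
signal = np.concatenate([np.full(63, 1.0), np.full(65, 3.0)])
signal += np.random.default_rng(1).normal(scale=0.05, size=signal.size)

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail); large |detail| flags abrupt changes."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

approx, detail = haar_dwt(signal)
# Pair index of the sharpest change; pair k covers samples 2k and 2k+1.
boundary = int(np.argmax(np.abs(detail)))
print(boundary)
```

    The largest detail coefficient lands on the pair of samples straddling the step, which is the sense in which wavelet coefficients "divide sequence stratigraphy quantitatively in the frequency domain".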

  5. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-05-25

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.
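
    A minimal sketch of the kind of multivariate factor analysis named above (here plain PCA via SVD) applied to a spatially simple spectral image follows. The two-phase sample, pure spectra and noise level are all hypothetical, constructed so that one mean-centred component captures the contrast between the two discrete chemical phases.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical spectral image: 20x20 pixels x 50 channels, built from two
# spatially discrete chemical phases, each with its own pure spectrum.
channels = np.arange(50)
spectrum_a = np.exp(-0.5 * ((channels - 15) / 3.0) ** 2)   # phase A peak
spectrum_b = np.exp(-0.5 * ((channels - 35) / 3.0) ** 2)   # phase B peak
mask = np.zeros((20, 20))
mask[:, 10:] = 1.0                                         # phase B on the right half

cube = mask[..., None] * spectrum_b + (1 - mask[..., None]) * spectrum_a
cube += rng.normal(scale=0.01, size=cube.shape)

# PCA via SVD of the mean-centred (pixel x channel) matrix.
X = cube.reshape(-1, 50)
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

# With two discrete phases, one mean-centred component carries nearly
# all the variance: the spatial simplicity the patent exploits.
print(round(float(explained[0]), 3))
```

    In real data the constraint handling is far more involved, but the same reduction from a vast data cube to a few spectral/abundance components is the core operation.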

  6. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-11-23

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

  7. Phasor Analysis of Binary Diffraction Gratings with Different Fill Factors

    ERIC Educational Resources Information Center

    Martinez, Antonio; Sanchez-Lopez, Ma del Mar; Moreno, Ignacio

    2007-01-01

    In this work, we present a simple analysis of binary diffraction gratings with different slit widths relative to the grating period. The analysis is based on a simple phasor technique directly derived from the Huygens principle. By introducing a slit phasor and a grating phasor, the intensity of the diffracted orders and the grating's resolving…

  8. Ranking the strategies for Indian medical tourism sector through the integration of SWOT analysis and TOPSIS method.

    PubMed

    Ajmera, Puneeta

    2017-10-09

    Purpose Organizations have to evaluate their internal and external environments in this highly competitive world. Strengths, weaknesses, opportunities and threats (SWOT) analysis is a very useful technique that analyzes the strengths, weaknesses, opportunities and threats of an organization for strategic decision making, and it also provides a foundation for the formulation of strategies. A drawback of SWOT analysis, however, is that it does not quantify the importance of the individual factors affecting the organization; the individual factors are described briefly without being weighted. For this reason, SWOT analysis can be integrated with a multiple attribute decision-making (MADM) technique, such as the technique for order preference by similarity to ideal solution (TOPSIS) or the analytical hierarchy process, to evaluate the best alternative among the available strategic alternatives. The paper aims to discuss these issues. Design/methodology/approach In this study, SWOT analysis is integrated with the multicriteria decision-making technique TOPSIS to rank different strategies for Indian medical tourism in order of priority. Findings The SO strategy (providing the best facilitation and care to medical tourists, at par with developed countries) is the best strategy; it matches the four elements of S, W, O and T of the SWOT matrix and the 35 strategic indicators. Practical implications This paper proposes a solution based on a combined SWOT analysis and TOPSIS approach to help organizations evaluate and select strategies. Originality/value Introducing a new technology or administering a new strategy always meets some degree of resistance from employees. To minimize resistance, the author used TOPSIS, as it involves group thinking, requiring every manager of the organization to analyze and evaluate different alternatives, with an average measure of each parameter in the final decision matrix.
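
    The TOPSIS ranking procedure itself is short enough to sketch directly. The decision matrix, criteria and weights below are hypothetical (not the study's 35 indicators), with the first row standing in for the SO strategy and all criteria treated as benefit criteria.

```python
import numpy as np

# Hypothetical decision matrix: 4 strategies (rows) scored on 3 benefit
# criteria (columns), with illustrative weights summing to 1.
D = np.array([
    [9.0, 9.0, 8.0],   # SO strategy (stand-in)
    [7.0, 8.0, 9.0],   # ST strategy (stand-in)
    [9.5, 6.0, 7.0],   # WO strategy (stand-in)
    [6.0, 7.0, 6.0],   # WT strategy (stand-in)
])
w = np.array([0.5, 0.3, 0.2])

# 1. Vector-normalize each criterion, then apply the weights.
V = w * D / np.sqrt((D**2).sum(axis=0))
# 2. Ideal and anti-ideal solutions (for benefit criteria, max is ideal).
ideal, anti = V.max(axis=0), V.min(axis=0)
# 3. Euclidean distances and relative closeness to the ideal solution.
d_pos = np.sqrt(((V - ideal)**2).sum(axis=1))
d_neg = np.sqrt(((V - anti)**2).sum(axis=1))
closeness = d_neg / (d_pos + d_neg)

best = int(np.argmax(closeness))
print(best, np.round(closeness, 3))
```

    The strategy with the highest relative closeness is ranked first; with these illustrative numbers that is the SO row, echoing the study's finding.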

  9. A method to enhance the measurement accuracy of Raman shift based on high precision calibration technique

    NASA Astrophysics Data System (ADS)

    Ding, Xiang; Li, Fei; Zhang, Jiyan; Liu, Wenli

    2016-10-01

    Raman spectrometers are usually calibrated periodically to ensure their measurement accuracy of Raman shift. A combination of a monocrystalline silicon chip and a low-pressure discharge lamp is proposed as a candidate reference standard for Raman shift. A high-precision calibration technique is developed to accurately determine the standard value of the silicon Raman shift around 520 cm-1. The technique is described and illustrated by measuring a silicon chip against three atomic spectral lines of a neon lamp. A commercial Raman spectrometer is employed and its Raman-shift error characteristics are investigated. Error sources are evaluated based on theoretical analysis and experiments, including the sample factor, the instrumental factor, the laser factor and random factors. Experimental results show that the expanded uncertainty of the silicon Raman shift around 520 cm-1 can achieve 0.3 cm-1 (k=2), which is more accurate than most currently used reference materials. The results are validated by comparison measurements between three Raman spectrometers. It is shown that the technique can remarkably enhance the accuracy of Raman shift, making it possible to use the silicon chip and the lamp to calibrate Raman spectrometers.

  10. Measurement of the transition form factor of the η meson with WASA-at-COSY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhatt, H.

    2011-10-24

    The reaction η → e⁺e⁻γ is used to investigate the transition form factor of the η meson with the WASA detector at COSY, where the η meson is produced in pp collisions at 1.4 GeV. We present the analysis techniques and preliminary results for η Dalitz decays.

  11. Job Satisfaction: Factor Analysis of Greek Primary School Principals' Perceptions

    ERIC Educational Resources Information Center

    Saiti, Anna; Fassoulis, Konstantinos

    2012-01-01

    Purpose: The purpose of this paper is to investigate the factors that affect the level of job satisfaction that school principals experience and, based on the findings, to suggest policies or techniques for improving it. Design/methodology/approach: Questionnaires were administered to 180 primary school heads in 13 prefectures--one from each of…

  12. Statistical model to perform error analysis of curve fits of wind tunnel test data using the techniques of analysis of variance and regression analysis

    NASA Technical Reports Server (NTRS)

    Alston, D. W.

    1981-01-01

    The objective of this research was to design a statistical model that could perform an error analysis of curve fits of wind tunnel test data using analysis of variance and regression analysis techniques. Four related subproblems were defined, and by solving each of these a solution to the general research problem was obtained. The capabilities of the resulting statistical model are considered. The least squares fit is used to determine the nature of the force, moment, and pressure data. The order of the curve fit is increased in order to delete the quadratic effect in the residuals. The analysis of variance is used to determine the magnitude and effect of the error factor associated with the experimental data.

  13. Multivariate analysis of heavy metal contamination using river sediment cores of Nankan River, northern Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, An-Sheng; Lu, Wei-Li; Huang, Jyh-Jaan; Chang, Queenie; Wei, Kuo-Yen; Lin, Chin-Jung; Liou, Sofia Ya Hsuan

    2016-04-01

    Owing to the geology and climate of Taiwan, its rivers generally carry large amounts of suspended particles. After these particles settle, they become sediments, which are good sorbents for heavy metals in river systems. Consequently, sediments record the contamination footprint in low-flow-energy regions such as estuaries. Seven sediment cores were collected along the Nankan River, northern Taiwan, which is seriously contaminated by industrial, household and agricultural inputs. Physico-chemical properties of these cores were derived from an Itrax-XRF Core Scanner and grain size analysis. To interpret these complex data matrices, multivariate statistical techniques (cluster analysis, factor analysis and discriminant analysis) were applied. The statistical results indicate four types of sediment. One of them represents a contamination event, showing high concentrations of Cu, Zn, Pb, Ni and Fe and low concentrations of Si and Zr. Furthermore, three possible contamination sources for this type of sediment were revealed by factor analysis. The combination of sediment analysis and multivariate statistical techniques provides new insights into the contamination depositional history of the Nankan River and could be similarly applied to other river systems to determine the scale of anthropogenic contamination.

  14. Texture analysis of medical images for radiotherapy applications

    PubMed Central

    Rizzo, Giovanna

    2017-01-01

    The high-throughput extraction of quantitative information from medical images, known as radiomics, has grown in interest due to the current need to quantitatively characterize tumour heterogeneity. In this context, texture analysis, consisting of a variety of mathematical techniques that can describe the grey-level patterns of an image, plays an important role in assessing the spatial organization of different tissues and organs. For these reasons, the potential of texture analysis in the context of radiotherapy has been widely investigated in several studies, especially for the prediction of the treatment response of tumour and normal tissues. Nonetheless, many different factors can affect the robustness, reproducibility and reliability of textural features, thus limiting the impact of this technique. In this review, an overview of the most recent works that have applied texture analysis in the context of radiotherapy is presented, with particular focus on the assessment of tumour and tissue response to radiation. First, the main factors that influence feature estimation are discussed, highlighting the need for more standardized image acquisition and reconstruction protocols and more accurate methods for region-of-interest identification. Despite all these limitations, texture analysis is increasingly demonstrating its ability to improve the characterization of intratumour heterogeneity and the prediction of clinical outcome, although prospective studies and clinical trials are required to draw a more complete picture of the full potential of this technique. PMID:27885836

  15. Multifactor valuation models of energy futures and options on futures

    NASA Astrophysics Data System (ADS)

    Bertus, Mark J.

    The intent of this dissertation is to investigate continuous-time pricing models for commodity derivative contracts that incorporate mean reversion. The motivation for pricing commodity futures and options on futures contracts is improved practical risk management in markets where uncertainty is increasing. In the dissertation, closed-form solutions for futures contracts are developed for mean-reverting one-factor, two-factor and three-factor Brownian motions. These solutions are obtained through risk-neutral pricing methods that yield tractable expressions for futures prices, which are linear in the state variables, hence making them attractive for estimation. These functions, however, are expressed in terms of latent variables (i.e. spot prices, convenience yield), which complicates the estimation of the futures pricing equation. To address this complication, dynamic factor analysis is discussed; this procedure estimates the latent variables using a Kalman filter, and illustrations show how the technique may be used in the analysis. In addition to the futures contracts, closed-form solutions for two option models are obtained. Solutions to the one- and two-factor models are tailored solutions of the Black-Scholes pricing model. Furthermore, since these contracts are written on the futures contracts, they too are influenced by the underlying parameters of the state variables used to price the futures contracts. The analysis concludes with an investigation of commodity futures options that incorporate random discrete jumps.
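
    The Kalman-filter treatment of latent variables mentioned above can be illustrated with a scalar mean-reverting state-space model. The AR(1) parameters and noise variances below are hypothetical, and this is a sketch of the filtering recursion, not the dissertation's estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical scalar state-space model: a latent mean-reverting (AR(1))
# log price x_t observed with noise, as a stand-in for futures quotes.
phi, q, r = 0.95, 0.05, 0.2          # persistence, state var, observation var
T = 500
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal(scale=np.sqrt(q))
y = x + rng.normal(scale=np.sqrt(r), size=T)

# Scalar Kalman filter: predict the latent state, then update with the gain.
x_hat = np.zeros(T)
P = 1.0
for t in range(1, T):
    x_pred = phi * x_hat[t - 1]       # predicted latent state
    P_pred = phi**2 * P + q           # predicted state variance
    K = P_pred / (P_pred + r)         # Kalman gain
    x_hat[t] = x_pred + K * (y[t] - x_pred)
    P = (1 - K) * P_pred

# Filtering should track the latent state better than the raw observations.
err_filt = np.mean((x_hat - x) ** 2)
err_obs  = np.mean((y - x) ** 2)
print(err_filt < err_obs)
```

    The same predict/update recursion, in vector form, is what lets dynamic factor analysis recover unobserved spot prices and convenience yields from observed futures prices.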

  16. Breaking the Tionghoa-Java conflict in Surakarta in the 1998 reformation period

    NASA Astrophysics Data System (ADS)

    Riyadi; Hermawan, ES; Aji, RNB; Trilaksana, A.; Mastuti, S.

    2018-01-01

    The issues raised in this paper are potential conflicts and efforts to create harmony in the socio-cultural environment of ethnic Chinese and Javanese. This research examines the historical background and development of people of Chinese descent in Surakarta City, as well as the extent of the potential conflict and the causal factors of conflict between ethnic Chinese and the indigenous Javanese, in order to identify the factors that hinder the social integration of the two groups in Surakarta. The approach of this research is descriptive and qualitative. Data collection initially used a questionnaire distribution model, followed by (1) in-depth interviews and (2) participant observation, document content analysis and focus group discussions (FGD). To obtain a high degree of validity, technique triangulation, rechecking and peer debriefing were employed, and the data were examined with interactive analysis. The research concludes that domestic economic and political pressure forced Chinese people to migrate to Southeast Asia, including Indonesia, after which several conflicts arose in many areas of Indonesia. Conflicts between ethnic Chinese and Javanese in Surakarta occurred in 1980 and 1998. Conflict resolution can be achieved by optimizing social, cultural and economic factors, which serve as a social adhesive for integration between the ethnic Chinese and Javanese in Surakarta.

  17. Chromatin Immunoprecipitation Sequencing (ChIP-Seq) for Transcription Factors and Chromatin Factors in Arabidopsis thaliana Roots: From Material Collection to Data Analysis.

    PubMed

    Cortijo, Sandra; Charoensawan, Varodom; Roudier, François; Wigge, Philip A

    2018-01-01

    Chromatin immunoprecipitation combined with next-generation sequencing (ChIP-seq) is a powerful technique to investigate in vivo transcription factor (TF) binding to DNA, as well as chromatin marks. Here we provide a detailed protocol for all the key steps of ChIP-seq in Arabidopsis thaliana roots; the protocol also works for other A. thaliana tissues and for most non-ligneous plants. We detail all steps from material collection, fixation, chromatin preparation, immunoprecipitation and library preparation through to computational analysis based on a combination of publicly available tools.

  18. Some dissociating factors in the analysis of structural and functional progressive damage in open-angle glaucoma.

    PubMed

    Hudson, C J W; Kim, L S; Hancock, S A; Cunliffe, I A; Wild, J M

    2007-05-01

    To identify the presence, and origin, of any "dissociating factors" inherent to the techniques for evaluating progression that mask the relationship between structural and functional progression in open-angle glaucoma (OAG). 23 patients (14 with OAG and 9 with ocular hypertension (OHT)) who had received serial Heidelberg Retina Tomograph (HRT II) and Humphrey Field Analyser (HFA) examinations for ≥5 years (mean 78.4 months (SD 9.5), range 60-101 months) were identified. Evidence of progressive disease was retrospectively evaluated in one eye of each patient using the Topographic Change Analysis (TCA) and Glaucoma Progression Analysis (GPA) for the HRT II and HFA, respectively. Six patients were stable by both techniques; four exhibited both structural and functional progression; seven exhibited structural progression only, and six showed functional progression only. Three types of dissociating factors were identified. TCA failed to identify progressive structural damage in the presence of advanced optic nerve head damage. GPA failed to identify progressive functional damage at stimulus locations with sensitivities exhibiting test-retest variability beyond the maximum stimulus luminance of the perimeter, and where a perimetric learning effect was apparent. The three dissociating factors accounted for nine of the 13 patients who exhibited a lack of concordance between structural and functional progressive damage.

  19. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM]

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  20. Exploring Incomplete Rating Designs with Mokken Scale Analysis

    ERIC Educational Resources Information Center

    Wind, Stefanie A.; Patil, Yogendra J.

    2018-01-01

    Recent research has explored the use of models adapted from Mokken scale analysis as a nonparametric approach to evaluating rating quality in educational performance assessments. A potential limiting factor to the widespread use of these techniques is the requirement for complete data, as practical constraints in operational assessment systems…

  1. Foresight begins with FMEA. Delivering accurate risk assessments.

    PubMed

    Passey, R D

    1999-03-01

    If sufficient factors are taken into account and two- or three-stage analysis is employed, failure mode and effect analysis represents an excellent technique for delivering accurate risk assessments for products and processes, and for relating them to legal liability. This article describes a format that facilitates easy interpretation.

  2. Factors of Compliance of a Child with Rules in a Russian Cultural Context

    ERIC Educational Resources Information Center

    Bayanova, Larisa F.; Mustafin, Timur R.

    2016-01-01

    The article covers the analysis of a child's psychological compliance with cultural rules, termed cultural congruence. It describes a technique aimed at detecting the cultural congruence of five- to six-year-old children. The technique is built on the revealed range of rules of child-adult interaction in a social…

  3. Older driver highway design handbook

    DOT National Transportation Integrated Search

    1998-01-01

    This project included literature reviews and research syntheses, using meta-analytic techniques where appropriate, in the areas of age-related (diminished) functional capabilities, and human factors and highway safety. A User-Requirements Analysi...

  4. Ergonomic study on wrist posture when using laparoscopic tools in four different techniques regarding minimally invasive surgery.

    PubMed

    Bartnicka, Joanna; Zietkiewicz, Agnieszka A; Kowalski, Grzegorz J

    2018-03-19

    With reference to four different minimally invasive surgery (MIS) cholecystectomy techniques, the aims were: to recognize the factors influencing the dominant wrist postures manifested by the surgeon; to detect risk factors involved in maintaining deviated wrist postures; and to compare the wrist postures of surgeons while using laparoscopic tools. Video films were recorded during live surgeries. The films were synchronized with wrist joint angles obtained from wireless electrogoniometers placed on the surgeon's hand. The analysis was conducted for five different laparoscopic tools used during all surgical techniques. The most common wrist posture was extension. In the case of one laparoscopic tool, the mean values defining extended wrist posture were distinct in all four surgical techniques. For the surgical technique considered most beneficial for patients, more extreme postures were noticed with all laparoscopic tools. We recognized a new factor, apart from the tool's handle design, that influences extreme and deviated wrist postures. It involves three areas of task specification: the type of action, the type of motion patterns, and motion dynamism. The outcomes proved that the surgical technique that is most beneficial for the patient imposes the greatest strain on the surgeon's wrist.

  5. Early Oscillation Detection for Hybrid DC/DC Converter Fault Diagnosis

    NASA Technical Reports Server (NTRS)

    Wang, Bright L.

    2011-01-01

    This paper describes a novel fault detection technique for hybrid DC/DC converter oscillation diagnosis. The technique is based on principles of feedback control loop oscillation and RF signal modulation, and is realized by using signal spectral analysis. Real-circuit simulation and analytical study reveal critical factors of the oscillation and indicate significant correlations between the spectral analysis method and the gain/phase margin method. A stability diagnosis index (SDI) is developed as a quantitative measure to accurately assign a degree of stability to the DC/DC converter. This technique is capable of detecting oscillation at an early stage without interfering with the DC/DC converter's normal operation and without the limitations of probing the converter.

  6. DNA-PCR analysis of bloodstains sampled by the polyvinyl-alcohol method.

    PubMed

    Schyma, C; Huckenbeck, W; Bonte, W

    1999-01-01

    Among the usual techniques of sampling gunshot residues (GSR), the polyvinyl-alcohol method (PVAL) includes the advantage of embedding all particles, foreign bodies and stains on the surface of the shooter's hand in exact and reproducible topographic localization. The aim of the present study on ten persons killed by firearms was to check the possibility of DNA-PCR typing of blood traces embedded in the PVAL gloves in a second step following GSR analysis. The results of these examinations verify that the PVAL technique does not include factors that inhibit successful PCR typing. Thus the PVAL method can be recommended as a combination technique to secure and preserve inorganic and biological traces at the same time.

  7. Applying the ICF framework to study changes in quality-of-life for youth with chronic conditions

    PubMed Central

    McDougall, Janette; Wright, Virginia; Schmidt, Jonathan; Miller, Linda; Lowry, Karen

    2011-01-01

    Objective The objective of this paper is to describe how the ICF framework was applied as the foundation for a longitudinal study of changes in quality-of-life (QoL) for youth with chronic conditions. Method This article will describe the study’s aims, methods, measures and data analysis techniques. It will point out how the ICF framework was used—and expanded upon—to provide a model for studying the impact of factors on changes in QoL for youth with chronic conditions. Further, it will describe the instruments that were chosen to measure the components of the ICF framework and the data analysis techniques that will be used to examine the impact of factors on changes in youths’ QoL. Conclusions Qualitative and longitudinal designs for studying QoL based on the ICF framework can be useful for unraveling the complex ongoing inter-relationships among functioning, contextual factors and individuals’ perceptions of their QoL. PMID:21034288

  8. Solar cycle signal in air temperature in North America - Amplitude, gradient, phase and distribution

    NASA Technical Reports Server (NTRS)

    Currie, R. G.

    1981-01-01

    The investigation was motivated by three factors. The first relates to the extension of single-channel MESA to multi-channel by Strand (1977), Morf et al. (1978), and Jones (1978). MESA is a high-resolution signal processing and spectrum analysis technique due to Burg (1975). These developments resulted in the discovery of the 11-year solar cycle signal in the change of the length of day by Currie (1980, 1981), and led Currie (1981) to study the phase spectrum of the 11-year term in the height H of sea level. The investigation tries to clarify the phase relations among the involved parameters. The second factor concerns the application of the linear time domain technique used by Currie (1981) to temperature records to obtain more accurate information regarding the signal amplitude. The third motivating factor relates to increases in the number of stations available for analysis, the greater average length of the records, and the more accurate data set.

  9. Exploring how surgeon teachers motivate residents in the operating room.

    PubMed

    Dath, Deepak; Hoogenes, Jen; Matsumoto, Edward D; Szalay, David A

    2013-02-01

    Motivation in teaching, mainly studied in disciplines outside of surgery, may also be an important part of intraoperative teaching. We explored the techniques surgeons use to motivate learners in the operating room (OR). Forty-four experienced surgeon teachers from multiple specialties participated in 9 focus groups about teaching in the OR. Focus groups were transcribed and subjected to qualitative thematic analysis by 3 reviewers through an iterative, rigorous process. Analysis revealed 8 motivational techniques. Surgeons used motivation techniques tacitly, describing multiple ways in which they facilitate resident motivation while teaching. Two major categories of motivational techniques emerged: (1) the facilitation of intrinsic motivation; and (2) the provision of factors to stimulate extrinsic motivation. Surgeons tacitly and commonly use motivation in intraoperative teaching and employ a variety of techniques to foster learners' intrinsic and extrinsic motivation. Motivating learners is one vital role that surgeon teachers play in nontechnical intraoperative teaching. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Radiation therapy in the treatment of cervical cancer: The University of Chicago/Michael Reese Hospital experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rader, J.S.; Haraf, D.J.; Halpern, H.J.

    1990-07-01

    A retrospective analysis was conducted on 307 patients referred for radiation therapy at The University of Chicago and Michael Reese Hospital between 1971 and 1986. Median follow-up was 6.4 years. Treatment techniques varied during the time of the study. Actuarial disease-free survivals were 78%, 64%, 55%, 33%, 41%, and 60% for stage IB, IIA, IIB, IIIA, IIIB, and IVA, respectively. Stage, size of the cervical lesion, and hemoglobin level during treatment were prognostic factors. Treatment technique as well as time dose factors were analyzed with respect to survival, failures, and complications.

  11. The workload book: Assessment of operator workload to engineering systems

    NASA Technical Reports Server (NTRS)

    Gopher, D.

    1983-01-01

    The structure and initial work performed toward the creation of a handbook for workload analysis directed at the operational community of engineers and human factors psychologists are described. The goal, when complete, will be to make accessible to such individuals the results of theoretically-based research that are of practical interest and utility in the analysis and prediction of operator workload in advanced and existing systems. In addition, the results of laboratory study focused on the development of a subjective rating technique for workload that is based on psychophysical scaling techniques are described.

  12. A pilot study measuring changes in student impressions before and after clinical training using a questionnaire based on the semantic differential technique.

    PubMed

    Tamura, Naomi; Terashita, Takayoshi; Ogasawara, Katsuhiko

    2013-01-01

    Students with a positive impression of their studies can become more motivated. This study measured the learning impact of clinical training by comparing student impressions before and after the training. The study included 32 final-year students of radiological technology in the Division of Radiological Science and Technology, Department of Health Sciences, School of Medicine, Hokkaido University. To measure student impressions of x-ray examination training, we developed a questionnaire using the semantic differential technique. The resulting factor analysis identified two factors that accounted for 44.9% of the variance in the 10 bipolar adjective scales. Factor 1 represented a "resistance" impression of x-ray examination training, and factor 2 represented a "responsibility" impression. The differences in factor scores before and after the clinical training suggest that student impressions are affected by clinical training.

  13. Applications of integrated human error identification techniques on the chemical cylinder change task.

    PubMed

    Cheng, Ching-Min; Hwang, Sheue-Ling

    2015-03-01

    This paper outlines the human error identification (HEI) techniques that currently exist to assess latent human errors. Many formal error identification techniques have existed for years, but few have been validated to cover latent human error analysis in different domains. This study considers many possible error modes and influential factors, including external error modes, internal error modes, psychological error mechanisms, and performance shaping factors, and integrates several execution procedures and frameworks of HEI techniques. The case study in this research was the operational process of changing chemical cylinders in a factory. In addition, the integrated HEI method was used to assess the operational processes and the system's reliability. It was concluded that the integrated method is a valuable aid to develop much safer operational processes and can be used to predict human error rates on critical tasks in the plant. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  14. Moderating Factors of Video-Modeling with Other as Model: A Meta-Analysis of Single-Case Studies

    ERIC Educational Resources Information Center

    Mason, Rose A.; Ganz, Jennifer B.; Parker, Richard I.; Burke, Mack D.; Camargo, Siglia P.

    2012-01-01

    Video modeling with other as model (VMO) is a more practical method for implementing video-based modeling techniques, such as video self-modeling, which requires significantly more editing. Despite this, identification of contextual factors such as participant characteristics and targeted outcomes that moderate the effectiveness of VMO has not…

  15. Community factors to promote parents' quality of child-nurturing life.

    PubMed

    Aoyama, Megumi; Wei, Chang Nian; Chang-nian, Wei; Harada, Koichi; Ueda, Kimiyo; Takano, Miyuki; Ueda, Atsushi

    2013-01-01

    The purpose of this study was to clarify the role of community factors in parents' quality of child-nurturing life (QCNL). We developed a questionnaire to evaluate the degree of QCNL and determine the structural factors related to QCNL as community factors related to parents' QCNL derived from focus group interviews and the Delphi technique. The questionnaire also included the battery of the self-rating depression scale and Tsumori-Inage Infant's Developmental Test. Using the questionnaire, we then conducted a quantitative survey of parents whose children attended nursery schools in Kumamoto Prefecture. Factor analysis, calculation of the mean score and/or ratio to each item, Pearson's correlation coefficient, t test, multiple regression analysis, and covariance structure analysis were performed. The questionnaire we developed consisted of seven items with 75 elements, involving ten elements as community factors. Subjects included 699 parents (mean age 33.6 ± 5.4 years) and 965 children (age range 0-6 years). Factor analysis revealed that community factors consisted of five factors, such as "lifestyle rooted in the ground," "balance of housekeeping and work," "community network," "amenity," and "regeneration of life". These factors may be dominant in a rural area. Finally, we developed a structural model with "community factors," QCNL, QOL, and "child growth" by covariance structural analysis. The analysis revealed that community factors had a positive relation to parents' QCNL (r = 0.81, p < 0.001) and that parental SDS score had a negative relation to parents' QCNL (r = -0.59, p < 0.001). The analysis did show that community factors were positively related to the sound growth of children. The covariance structure analysis revealed that community factors were associated with parents' QCNL, SDS, and "child growth."

  16. Comparative factor analysis models for an empirical study of EEG data, II: A data-guided resolution of the rotation indeterminacy.

    PubMed

    Rogers, L J; Douglas, R R

    1984-02-01

    In this paper (the second in a series), we consider a (generic) pair of datasets, which have been analyzed by the techniques of the previous paper. Thus, their "stable subspaces" have been established by comparative factor analysis. The pair of datasets must satisfy two confirmable conditions. The first is the "Inclusion Condition," which requires that the stable subspace of one of the datasets is nearly identical to a subspace of the other dataset's stable subspace. On the basis of that, we have assumed the pair to have similar generating signals, with stochastically independent generators. The second verifiable condition is that the (presumed same) generating signals have distinct ratios of variances for the two datasets. Under these conditions a small elaboration of some elementary linear algebra reduces the rotation problem to several eigenvalue-eigenvector problems. Finally, we emphasize that an analysis of each dataset by the method of Douglas and Rogers (1983) is an essential prerequisite for the useful application of the techniques in this paper. Nonempirical methods of estimating the number of factors simply will not suffice, as confirmed by simulations reported in the previous paper.

  17. Identifying key hospital service quality factors in online health communities.

    PubMed

    Jung, Yuchul; Hur, Cinyoung; Jung, Dain; Kim, Minki

    2015-04-07

    The volume of health-related user-created content, especially hospital-related questions and answers in online health communities, has rapidly increased. Patients and caregivers participate in online community activities to share their experiences, exchange information, and ask about recommended or discredited hospitals. However, there is little research on how to identify hospital service quality automatically from the online communities. In the past, in-depth analysis of hospitals has used random sampling surveys. However, such surveys are becoming impractical owing to the rapidly increasing volume of online data and the diverse analysis requirements of related stakeholders. As a solution for utilizing large-scale health-related information, we propose a novel approach to identify hospital service quality factors and overtime trends automatically from online health communities, especially hospital-related questions and answers. We defined social media-based key quality factors for hospitals. In addition, we developed text mining techniques to detect such factors that frequently occur in online health communities. After detecting these factors that represent qualitative aspects of hospitals, we applied a sentiment analysis to recognize the types of recommendations in messages posted within online health communities. Korea's two biggest online portals were used to test the effectiveness of detection of social media-based key quality factors for hospitals. To evaluate the proposed text mining techniques, we performed manual evaluations on the extraction and classification results, such as hospital name, service quality factors, and recommendation types using a random sample of messages (ie, 5.44% (9450/173,748) of the total messages). Service quality factor detection and hospital name extraction achieved average F1 scores of 91% and 78%, respectively. In terms of recommendation classification, performance (ie, precision) is 78% on average. 
Extraction and classification performance still has room for improvement, but the extraction results are applicable to more detailed analysis. Further analysis of the extracted information reveals that there are differences in the details of social media-based key quality factors for hospitals according to the regions in Korea, and the patterns of change seem to accurately reflect social events (eg, influenza epidemics). These findings could be used to provide timely information to caregivers, hospital officials, and medical officials for health care policies.
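
    The F1 scores and precision figures reported above are the standard metrics for evaluating extraction and classification tasks. As a minimal sketch, with invented confusion counts rather than the study's data:

```python
# Hypothetical confusion counts for an entity-extraction evaluation
# (illustrative only; not taken from the study).
tp, fp, fn = 78, 12, 14          # true positives, false positives, false negatives

precision = tp / (tp + fp)       # fraction of extracted items that were correct
recall = tp / (tp + fn)          # fraction of true items that were extracted
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two

print(f"P={precision:.2f} R={recall:.2f} F1={f1:.2f}")
```

    Because F1 is a harmonic mean, it always lies between precision and recall, penalizing a large gap between them.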

  18. Multiplication factor versus regression analysis in stature estimation from hand and foot dimensions.

    PubMed

    Krishan, Kewal; Kanchan, Tanuj; Sharma, Abhilasha

    2012-05-01

    Estimation of stature is an important parameter in the identification of human remains in forensic examinations. The present study aims to compare the reliability and accuracy of stature estimation, and to demonstrate the variability between estimated and actual stature, using the multiplication factor and regression analysis methods. The study is based on a sample of 246 subjects (123 males and 123 females) from North India aged between 17 and 20 years. Four anthropometric measurements (hand length, hand breadth, foot length, and foot breadth), taken on the left side of each subject, were included in the study. Stature was measured using standard anthropometric techniques. Multiplication factors were calculated and linear regression models were derived for estimation of stature from hand and foot dimensions. The derived multiplication factors and regression formulae were applied to the hand and foot measurements in the study sample. The stature estimated from the multiplication factors and from regression analysis was compared with actual stature to find the error in estimated stature. The results indicate that the range of error in stature estimation by the regression analysis method is smaller than that of the multiplication factor method, thus confirming that regression analysis is better than multiplication factor analysis for stature estimation. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
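
    The two methods compared in the study can be sketched as follows; the measurements below are invented for illustration, not taken from the paper. Ordinary least squares minimizes squared error over all lines with an intercept, so on the fitting sample its error can never exceed that of the through-the-origin multiplication-factor model:

```python
# Hypothetical foot-length (cm) and stature (cm) pairs, for illustration only.
foot = [24.1, 25.3, 23.8, 26.0, 24.9, 25.5]
stature = [162.0, 170.5, 160.2, 174.8, 167.1, 171.3]
n = len(foot)

# Multiplication factor method: mean of per-subject stature/foot ratios,
# giving the through-the-origin model  stature = mf * foot.
mf = sum(s / f for s, f in zip(stature, foot)) / n
mf_est = [mf * f for f in foot]

# Simple least-squares regression:  stature = a + b * foot.
mx = sum(foot) / n
my = sum(stature) / n
b = (sum((f - mx) * (s - my) for f, s in zip(foot, stature))
     / sum((f - mx) ** 2 for f in foot))
a = my - b * mx
reg_est = [a + b * f for f in foot]

def sse(est):
    """Sum of squared errors against the measured statures."""
    return sum((e - s) ** 2 for e, s in zip(est, stature))

print(sse(mf_est), sse(reg_est))  # regression SSE is never the larger of the two
```
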

  19. Design and statistical problems in prevention.

    PubMed

    Gullberg, B

    1996-01-01

    Clinical and epidemiological research in osteoporosis can benefit from the methods and techniques established in chronic disease epidemiology. However, attention has to be given to its special characteristics, such as its multifactorial nature and the fact that the subjects are usually of advanced age. To evaluate prevention it is of course first necessary to detect and confirm reversible risk factors. The advantages and disadvantages of the different designs (cross-sectional, cohort, and case-control) are well known. The effects of avoidable biases, e.g. selection, observation, and confounding, have to be balanced against practical considerations such as time, expense, and recruitment. The translation of relative risks into population attributable risks (etiologic fractions, prevented fractions) is complex and is usually performed under unrealistic, simplified assumptions. The consequences of interactions (synergy) between risk factors are often neglected. The multifactorial structure requires application of more advanced multi-level statistical techniques. The common strategy in prevention of targeting a cluster of risk factors to circumvent the multifactorial nature implies that, in the end, it is impossible to separate the effect of each unique factor. Experimental designs for evaluating prevention, such as clinical trials and interventions, have to take into account the distinction between explanatory and pragmatic studies. An explanatory approach resembles an idealized laboratory trial, while a pragmatic design is more realistic and practical and takes a general public health perspective. The statistical techniques to be used in osteoporosis research are implemented in readily available computer packages such as SAS, SPSS, BMDP and GLIM. In addition to traditional logistic regression, methods such as Cox analysis and Poisson regression, as well as repeated-measures analysis and cluster analysis, are relevant.

  20. A Qualitative Study on Organizational Factors Affecting Occupational Accidents.

    PubMed

    Eskandari, Davood; Jafari, Mohammad Javad; Mehrabi, Yadollah; Kian, Mostafa Pouya; Charkhand, Hossein; Mirghotbi, Mostafa

    2017-03-01

    Technical, human, operational, and organizational factors influence the sequence of occupational accidents. Among them, organizational factors play a major role in causing occupational accidents. The aim of this research was to understand Iranian safety experts' experiences and perception of organizational factors. This qualitative study was conducted in 2015 using the content analysis technique. Data were collected through semi-structured interviews with 17 safety experts working in Iranian universities and industries and analyzed with a conventional qualitative content analysis method using the MAXQDA software. Eleven organizational-factor sub-themes were identified: management commitment, management participation, employee involvement, communication, blame culture, education and training, job satisfaction, interpersonal relationships, supervision, continuous improvement, and reward system. The participants considered these factors to influence occupational accidents. The 11 organizational factors mentioned are probably involved in occupational accidents in Iran. Naturally, improving organizational factors can increase safety performance and reduce occupational accidents.

  1. Evaluation of drinking quality of groundwater through multivariate techniques in urban area.

    PubMed

    Das, Madhumita; Kumar, A; Mohapatra, M; Muduli, S D

    2010-07-01

    Groundwater is a major source of drinking water in urban areas. Because of the growing threat of degraded water quality due to urbanization and development, monitoring water quality is a prerequisite to ensuring its suitability for drinking. But analyzing a large number of properties and evaluating water quality parameter by parameter is not feasible at regular intervals. Multivariate techniques can streamline the data into a reasonably manageable set without much loss of information. In this study, using principal component analysis, 11 relevant properties of 58 water samples were grouped into three statistical factors. Discriminant analysis identified "pH influence" as the most distinguishing factor and pH, Fe, and NO₃⁻ as the most discriminating variables, which could be treated as water quality indicators. These were used to classify the sampling sites into homogeneous clusters that reflect the location-wise importance of specific indicators for monitoring drinking water quality in the whole study area.
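
    Principal component analysis of the kind used here reduces to an eigen-decomposition of the covariance (or correlation) matrix of the measured variables; components are ranked by the share of total variance they explain. A minimal two-variable sketch with invented, already-centred data (not the study's measurements):

```python
import math

# Hypothetical standardized readings of two water-quality variables
# (say, pH and nitrate) at five sampling sites; both columns sum to zero.
x = [-1.2, -0.5, 0.0, 0.6, 1.1]
y = [-1.0, -0.6, 0.1, 0.5, 1.0]
n = len(x)

# Sample covariance matrix of the centred data.
cxx = sum(v * v for v in x) / (n - 1)
cyy = sum(v * v for v in y) / (n - 1)
cxy = sum(a * b for a, b in zip(x, y)) / (n - 1)

# Closed-form eigenvalues of the symmetric 2x2 matrix [[cxx, cxy], [cxy, cyy]].
tr = cxx + cyy
det = cxx * cyy - cxy * cxy
root = math.sqrt(tr * tr / 4 - det)
lam1, lam2 = tr / 2 + root, tr / 2 - root  # lam1 >= lam2

explained = lam1 / (lam1 + lam2)  # variance share of the first principal component
print(f"PC1 explains {explained:.1%} of the variance")
```

    With more than two variables the same idea applies to the full covariance matrix, which is where library routines take over; highly correlated variables collapse onto one dominant component, exactly the grouping effect the study exploits.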

  2. Insight into dementia care management using social-behavioral theory and mixed methods.

    PubMed

    Connor, Karen; McNeese-Smith, Donna; van Servellen, Gwen; Chang, Betty; Lee, Martin; Cheng, Eric; Hajar, Abdulrahman; Vickrey, Barbara G

    2009-01-01

    For health organizations (private and public) to advance their care-management programs, use resources effectively and efficiently, and improve patient outcomes, it is germane to isolate and quantify care-management activities and to identify overarching domains. The aims of this study were to identify and report on an application of mixed qualitative and statistical methods, based on a theoretical framework, to construct variables for factor analysis, and to carry out exploratory factor analytic steps for identifying domains of dementia care management. Care-management activity data were extracted from the care plans of 181 pairs of individuals (with dementia and their informal caregivers) who had participated in the intervention arm of a randomized controlled trial of a dementia care-management program. Activities were organized into types, using card-sorting methods, influenced by published theoretical constructs on self-efficacy and general strain theory. These activity types were mapped in the initial data set to construct variables for exploratory factor analysis. Principal components extraction with varimax and promax rotations was used to estimate the number of factors. Cronbach's alpha was calculated for the items in each factor to assess internal consistency reliability. The two-phase card-sorting technique yielded 45 activity types out of 450 unique activities. Exploratory factor analysis produced four care-management domains (factors): behavior management, clinical strategies and caregiver support, community agency, and safety. Internal consistency reliability (Cronbach's alpha) of the items for each factor ranged from .63 for the factor "safety" to .89 for the factor "behavior management" (Factor 1). Applying a systematic method to a large set of care-management activities can identify a parsimonious number of higher-order categories of variables and factors to guide the understanding of dementia care-management processes. 
Further application of this methodology in outcome analyses and to other data sets is necessary to test its practicality.
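
    Cronbach's alpha, used above to assess each factor's internal consistency, compares the sum of the item variances with the variance of the total score: alpha = (k/(k-1)) * (1 - sum of item variances / variance of totals). A small sketch with hypothetical ratings (not the study's data):

```python
from statistics import pvariance

# Hypothetical ratings: rows = respondents, columns = the k items in one factor.
items = [
    [3, 4, 4],
    [2, 3, 3],
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
]
k = len(items[0])

# Variance of each item across respondents, and of each respondent's total score.
item_vars = [pvariance([row[j] for row in items]) for j in range(k)]
totals = [sum(row) for row in items]

alpha = (k / (k - 1)) * (1 - sum(item_vars) / pvariance(totals))
print(round(alpha, 3))  # → 0.925
```

    The ratio is unaffected by whether population or sample variance is used, as long as the choice is consistent; values near 1 indicate the items move together, as with the .89 reported for the "behavior management" factor.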

  3. Aqueous Mesocosm Techniques Enabling the Real-Time Measurement of the Chemical and Isotopic Kinetics of Dissolved Methane and Carbon Dioxide.

    PubMed

    Chan, Eric W; Kessler, John D; Shiller, Alan M; Joung, DongJoo; Colombo, Frank

    2016-03-15

    Previous studies of microbially mediated methane oxidation in oceanic environments have examined the many different factors that control oxidation rates. However, there is debate over which factor(s) are limiting in these environments. These factors include the availability of methane, O2, trace metals, and nutrients, the density of the cell population, and the influence that CO2 production may have on pH. To examine this process in its entirety, we developed an automated mesocosm incubation system with a Dissolved Gas Analysis System (DGAS) coupled to a myriad of analytical tools to monitor chemical changes during methane oxidation. Here, we present new high-temporal-resolution techniques for investigating dissolved methane and carbon dioxide concentrations and stable isotopic dynamics during aqueous mesocosm and pure-culture incubations. These techniques analyze the gases dissolved in solution and are nondestructive to both the liquid media and the analyzed gases, enabling the investigation of a mesocosm or pure-culture experiment in a completely closed system, if so desired.

  4. Fourier transform ion cyclotron resonance mass spectrometry

    NASA Astrophysics Data System (ADS)

    Marshall, Alan G.

    1998-06-01

    As for Fourier transform infrared (FT-IR) interferometry and nuclear magnetic resonance (NMR) spectroscopy, the introduction of pulsed Fourier transform techniques revolutionized ion cyclotron resonance mass spectrometry: increased speed (factor of 10,000), increased sensitivity (factor of 100), increased mass resolution (factor of 10,000, an improvement not shared by the introduction of FT techniques to IR or NMR spectroscopy), increased mass range (factor of 500), and automated operation. FT-ICR mass spectrometry is the most versatile technique for unscrambling and quantifying ion-molecule reaction kinetics and equilibria in the absence of solvent (i.e., the gas phase). In addition, FT-ICR MS has the following analytically important features: speed (~1 second per spectrum); ultrahigh mass resolution and ultrahigh mass accuracy for analysis of mixtures and polymers; attomole sensitivity; MS^n with one spectrometer, including two-dimensional FT/FT-ICR/MS; positive and/or negative ions; multiple ion sources (especially MALDI and electrospray); biomolecular molecular weight determination and sequencing; LC/MS; and single-molecule detection up to 10^8 Dalton. Here, some basic features and recent developments of FT-ICR mass spectrometry are reviewed, with applications ranging from crude oil to molecular biology.
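
    The Fourier-transform step at the heart of FT-ICR can be illustrated by recovering a cyclotron frequency from a simulated transient and converting it to a mass via f = qB/(2πm). The field strength, sampling rate, and target frequency below are illustrative assumptions, and a plain O(n²) DFT stands in for the FFT used in practice:

```python
import cmath
import math

B = 7.0                      # assumed magnetic field, tesla
q = 1.602176634e-19          # elementary charge, coulomb
DA = 1.66053906660e-27       # one dalton, kg
n, fs = 512, 1.0e6           # samples in the transient, sampling rate (Hz)
f_true = 55 * fs / n         # a bin-centred cyclotron frequency: 107421.875 Hz

# Simulated transient: a pure cosine at the cyclotron frequency.
signal = [math.cos(2 * math.pi * f_true * t / fs) for t in range(n)]

# Magnitude spectrum by direct DFT (fine at this size; real instruments use FFTs).
mags = [abs(sum(s * cmath.exp(-2j * math.pi * k * t / n)
               for t, s in enumerate(signal)))
        for k in range(n // 2)]

k_peak = max(range(1, n // 2), key=lambda k: mags[k])
f_est = k_peak * fs / n                      # frequency of the spectral peak
mz = q * B / (2 * math.pi * f_est) / DA      # cyclotron equation solved for mass, in Da

print(f"peak bin {k_peak}: {f_est:.1f} Hz -> about {mz:.0f} Da")
```

    Higher mass means lower cyclotron frequency, which is why long, densely sampled transients translate directly into the ultrahigh mass resolution described above.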

  5. Restructuring the rotor analysis program C-60

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The continuing evolution of the rotary wing industry demands increasing analytical capabilities. To keep up with this demand, software must be structured to accommodate change. The approach discussed for meeting this demand is to restructure an existing analysis. The motivational factors, basic principles, application techniques, and practical lessons from experience with this restructuring effort are reviewed.

  6. Development and Validation of a Behavioral Screener for Preschool-Age Children

    ERIC Educational Resources Information Center

    DiStefano, Christine A.; Kamphaus, Randy W.

    2007-01-01

    The purpose of this study was to document the development of a short behavioral scale that could be used to assess preschoolers' behavior while still retaining adequate scale coverage, reliability, and validity. Factor analysis and item analysis techniques were applied to data from a nationally representative, normative database to create a…

  7. Factors Influencing International Students' Choice of an Education Destination--A Correspondence Analysis

    ERIC Educational Resources Information Center

    Shanka, Tekle; Quintal, Vanessa; Taylor, Ruth

    2005-01-01

    A correspondence analysis technique was employed to elicit information from international students pertaining to their choice of study destination. A survey of international students at a major Australian university revealed that the proximity of the city to the students' home countries, in addition to safety, the educational quality/variety, etc.…

  8. Critical incident technique analysis applied to perianesthetic cardiac arrests at a university teaching hospital.

    PubMed

    Hofmeister, Erik H; Reed, Rachel A; Barletta, Michele; Shepard, Molly; Quandt, Jane

    2018-05-01

    To apply the critical incident technique (CIT) methodology to a series of perianesthetic cardiac arrest events at a university teaching hospital to describe the factors that contributed to cardiac arrest. CIT qualitative analysis of a case series. A group of 16 dogs and cats that suffered a perioperative cardiac arrest between November 2013 and November 2016. If an arrest occurred, the event was discussed among the anesthesiologists. The discussion included a description of the case, a description of the sequence of events leading up to the arrest and a discussion of what could have been done to affect the outcome. A written description of the case and the event including animal signalment and a timeline of events was provided by the supervising anesthesiologist following discussion among the anesthesiologists. Only dogs or cats were included. After the data collection period, information from the medical record was collected. A qualitative document analysis was performed on the summaries provided about each case by the supervising anesthesiologist, the medical record and any supporting documents. Each case was then classified into one or more of the following: animal, human, equipment, drug and procedural factors for cardiac arrest. The most common factor was animal (n=14), followed by human (n=12), procedural (n=4), drugs (n=1) and equipment (n=1). The majority (n=11) of animals had multiple factors identified. Cardiac arrests during anesthesia at a referral teaching hospital were primarily a result of animal and human factors. Arrests because of procedural, drug and equipment factors were uncommon. Most animals experienced more than one factor and two animals arrested after a change in recumbency. Future work should focus on root cause analysis and interventions designed to minimize all factors, particularly human ones. Copyright © 2018 Association of Veterinary Anaesthetists and American College of Veterinary Anesthesia and Analgesia. 
Published by Elsevier Ltd. All rights reserved.

  9. Potential barriers to the application of multi-factor portfolio analysis in public hospitals: evidence from a pilot study in the Netherlands.

    PubMed

    Pavlova, Milena; Tsiachristas, Apostolos; Vermaeten, Gerhard; Groot, Wim

    2009-01-01

    Portfolio analysis is a business management tool that can assist health care managers to develop new organizational strategies. The application of portfolio analysis to US hospital settings has been frequently reported. In Europe however, the application of this technique has received little attention, especially concerning public hospitals. Therefore, this paper examines the peculiarities of portfolio analysis and its applicability to the strategic management of European public hospitals. The analysis is based on a pilot application of a multi-factor portfolio analysis in a Dutch university hospital. The nature of portfolio analysis and the steps in a multi-factor portfolio analysis are reviewed along with the characteristics of the research setting. Based on these data, a multi-factor portfolio model is developed and operationalized. The portfolio model is applied in a pilot investigation to analyze the market attractiveness and hospital strengths with regard to the provision of three orthopedic services: knee surgery, hip surgery, and arthroscopy. The pilot portfolio analysis is discussed to draw conclusions about potential barriers to the overall adoption of portfolio analysis in the management of a public hospital. Copyright (c) 2008 John Wiley & Sons, Ltd.

  10. Transportation Systems Evaluation

    NASA Technical Reports Server (NTRS)

    Fanning, M. L.; Michelson, R. A.

    1972-01-01

    A methodology for the analysis of transportation systems consisting of five major interacting elements is reported. The analysis begins with the causes of travel demand: geographic, economic, and demographic characteristics as well as attitudes toward travel. Through the analysis, the interaction of these factors with the physical and economic characteristics of the transportation system is determined. The result is an evaluation of the system from the point of view of both passenger and operator. The methodology is applicable to intraurban transit systems as well as to major airlines. Applications of the technique to the analysis of a PRT system and a study of intraurban air travel are given. In the discussion several unique models or techniques are mentioned, e.g., passenger preference modeling, an integrated intraurban transit model, and a series of models to perform airline analysis.

  11. Steam generator tubing NDE performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, G.; Welty, C.S. Jr.

    1997-02-01

    Steam generator (SG) non-destructive examination (NDE) is a fundamental element in the broader SG in-service inspection (ISI) process, a cornerstone in the management of PWR steam generators. Based on objective performance measures (tube leak forced outages and SG-related capacity factor loss), ISI performance has shown a continually improving trend over the years. Performance of the NDE element is a function of the fundamental capability of the technique, and the ability of the analysis portion of the process in field implementation of the technique. The technology continues to improve in several areas, e.g. system sensitivity, data collection rates, probe/coil design, and data analysis software. With these improvements comes the attendant requirement for qualification of the technique on the damage form(s) to which it will be applied, and for training and qualification of the data analysis element of the ISI process on the field implementation of the technique. The introduction of data transfer via fiber optic line allows for remote data acquisition and analysis, thus improving the efficiency of analysis for a limited pool of data analysts. This paper provides an overview of the current status of SG NDE, and identifies several important issues to be addressed.

  12. Application of Monte Carlo techniques to transient thermal modeling of cavity radiometers having diffuse-specular surfaces

    NASA Technical Reports Server (NTRS)

    Mahan, J. R.; Eskin, L. D.

    1981-01-01

    A viable alternative to the net exchange method of radiative analysis which is equally applicable to diffuse and diffuse-specular enclosures is presented. It is particularly more advantageous to use than the net exchange method in the case of a transient thermal analysis involving conduction and storage of energy as well as radiative exchange. A new quantity, called the distribution factor is defined which replaces the angle factor and the configuration factor. Once obtained, the array of distribution factors for an ensemble of surface elements which define an enclosure permits the instantaneous net radiative heat fluxes to all of the surfaces to be computed directly in terms of the known surface temperatures at that instant. The formulation of the thermal model is described, as is the determination of distribution factors by application of a Monte Carlo analysis. The results show that when fewer than 10,000 packets are emitted, an unsatisfactory approximation for the distribution factors is obtained, but that 10,000 packets is sufficient.
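The packet-counting idea behind the distribution factors can be sketched in miniature. The example below estimates the purely diffuse view factor between two coaxial parallel disks (a geometry with a known closed form, not the paper's cavity-radiometer enclosure) by emitting cosine-weighted packets and checking the fraction that land on the target surface:

```python
import math
import random

def mc_view_factor(r1, r2, h, n_packets=200_000, seed=42):
    """Estimate F(disk1 -> disk2) for coaxial parallel disks by Monte Carlo.

    Packets leave uniformly distributed points on disk 1 (z = 0) in
    cosine-weighted (diffuse) directions; the view factor is the fraction
    crossing the plane z = h inside disk 2.  Adding absorptivities and
    reflections to this counting scheme yields distribution factors.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_packets):
        # uniform point on disk 1
        r = r1 * math.sqrt(rng.random())
        a = 2 * math.pi * rng.random()
        x, y = r * math.cos(a), r * math.sin(a)
        # cosine-weighted emission direction: sin^2(theta) is uniform
        sin_t = math.sqrt(rng.random())
        cos_t = math.sqrt(1.0 - sin_t * sin_t)
        phi = 2 * math.pi * rng.random()
        t = h / cos_t  # path length to the plane z = h
        xh = x + t * sin_t * math.cos(phi)
        yh = y + t * sin_t * math.sin(phi)
        if xh * xh + yh * yh <= r2 * r2:
            hits += 1
    return hits / n_packets

def analytic_view_factor(r1, r2, h):
    """Closed-form F(disk1 -> disk2) for coaxial parallel disks."""
    R1, R2 = r1 / h, r2 / h
    s = 1 + (1 + R2 * R2) / (R1 * R1)
    return 0.5 * (s - math.sqrt(s * s - 4 * (R2 / R1) ** 2))

est = mc_view_factor(1.0, 1.0, 1.0)
exact = analytic_view_factor(1.0, 1.0, 1.0)  # ~0.382
```

With far fewer packets the estimate degrades noticeably, mirroring the paper's observation that fewer than 10,000 packets give an unsatisfactory approximation.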

  13. Piggyback technique in adult orthotopic liver transplantation: an analysis of 1067 liver transplants at a single center

    PubMed Central

    Nakamura, Noboru; Vaidya, Anil; Levi, David M.; Kato, Tomoaki; Nery, Jose R.; Madariaga, Juan R.; Molina, Enrique; Ruiz, Phillip; Gyamfi, Anthony; Tzakis, Andreas G.

    2006-01-01

    Background. Orthotopic liver transplantation (OLT) in adult patients has traditionally been performed using conventional caval reconstruction technique (CV) with veno-venous bypass. Recently, the piggyback technique (PB) without veno-venous bypass has begun to be widely used. The aim of this study was to assess the effect of routine use of PB on OLTs in adult patients. Patients and methods. A retrospective analysis was undertaken of 1067 orthotopic cadaveric whole liver transplantations in adult patients treated between June 1994 and July 2001. PB was used as the routine procedure. Patient demographics, factors including cold ischemia time (CIT), warm ischemia time (WIT), operative time, transfusions, blood loss, and postoperative results were assessed. The effects of clinical factors on graft survival were assessed by univariate and multivariate analyses. In all, 918 transplantations (86%) were performed with PB. Blood transfusion, WIT, and usage of veno-venous bypass were lower with PB. Seventy-five (8.3%) cases with PB had refractory ascites following OLT (p=NS). Five venous outflow stenosis cases (0.54%) with PB were noted (p=NS). Liver and renal function during the postoperative period were similar. Overall 1-, 3-, and 5-year patient survival rates were 85%, 78%, and 72% with PB. Univariate analysis showed that cava reconstruction method, CIT, WIT, amount of transfusion, length of hospital stay, donor age, and tumor presence were significant factors influencing graft survival. Multivariate analysis further reinforced the fact that CIT, donor age, amount of transfusion, and hospital stay were prognostic factors for graft survival. Conclusions. PB can be performed safely in the majority of adult OLTs. Results of OLT with PB are the same as for CV. Liver function, renal function, morbidity, mortality, and patient and graft survival are similar to CV. However, amount of transfusion, WIT, and use of veno-venous bypass are lower with PB. PMID:18333273

  14. Bayesian Factor Analysis as a Variable Selection Problem: Alternative Priors and Consequences

    PubMed Central

    Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric

    2016-01-01

    Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, a Bayesian structural equation modeling (BSEM) approach (Muthén & Asparouhov, 2012) has been proposed as a way to explore the presence of cross-loadings in CFA models. We show that the issue of determining factor loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov’s approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike and slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set (Byrne, 2012; Pettegrew & Wolf, 1982) is used to demonstrate our approach. PMID:27314566
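The variable-selection logic can be illustrated with a one-coefficient conjugate toy model (a sketch of the spike-and-slab idea, not the BSEM-SSP algorithm itself): the posterior probability that a loading is nonzero follows from comparing two normal marginal likelihoods.

```python
import math

def normal_pdf(x, var):
    """Density of N(0, var) at x."""
    return math.exp(-x * x / (2 * var)) / math.sqrt(2 * math.pi * var)

def inclusion_probability(ybar, n, sigma2, tau2, prior_pi):
    """Posterior probability that a single coefficient is nonzero.

    Spike-and-slab prior: beta = 0 with probability 1 - prior_pi (spike),
    beta ~ N(0, tau2) with probability prior_pi (slab).  With the estimate
    ybar ~ N(beta, sigma2/n), both marginal likelihoods are normal, and
    the posterior inclusion probability follows from Bayes' rule.
    """
    m_spike = normal_pdf(ybar, sigma2 / n)
    m_slab = normal_pdf(ybar, tau2 + sigma2 / n)
    return prior_pi * m_slab / (prior_pi * m_slab + (1 - prior_pi) * m_spike)

# A clear signal is retained; a near-zero estimate is shrunk all the way to
# zero -- unlike a ridge (Gaussian) prior, which never zeroes loadings exactly.
p_strong = inclusion_probability(ybar=0.8, n=100, sigma2=1.0, tau2=0.5, prior_pi=0.5)
p_null = inclusion_probability(ybar=0.0, n=100, sigma2=1.0, tau2=0.5, prior_pi=0.5)
```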

  15. Comparative Analysis of Academic Grades in Compulsory Secondary Education in Spain Using Statistical Techniques

    ERIC Educational Resources Information Center

    Veas, Alejandro; Gilar, Raquel; Miñano, Pablo; Castejón, Juan Luis

    2017-01-01

    The present study, based on the construct comparability approach, performs a comparative analysis of grade point averages for seven courses, using exploratory factor analysis (EFA) and the Partial Credit model (PCM) with a sample of 1398 students (M = 12.5, SD = 0.67) from 8 schools in the province of Alicante (Spain). EFA confirmed a…

  16. The support-control continuum: An investigation of staff perspectives on factors influencing the success or failure of de-escalation techniques for the management of violence and aggression in mental health settings.

    PubMed

    Price, Owen; Baker, John; Bee, Penny; Lovell, Karina

    2018-01-01

    De-escalation techniques are recommended to manage violence and aggression in mental health settings yet restrictive practices continue to be frequently used. Barriers and enablers to the implementation and effectiveness of de-escalation techniques in practice are not well understood. To obtain staff descriptions of de-escalation techniques currently used in mental health settings and explore factors perceived to influence their implementation and effectiveness. Qualitative, semi-structured interviews and Framework Analysis. Five in-patient wards including three male psychiatric intensive care units, one female acute ward and one male acute ward in three UK Mental Health NHS Trusts. 20 ward-based clinical staff. Individual semi-structured interviews were digitally recorded, transcribed verbatim and analysed using a qualitative data analysis software package. Participants described 14 techniques used in response to escalated aggression applied on a continuum between support and control. Techniques along the support-control continuum could be classified in three groups: 'support' (e.g. problem-solving, distraction, reassurance) 'non-physical control' (e.g. reprimands, deterrents, instruction) and 'physical control' (e.g. physical restraint and seclusion). Charting the reasoning staff provided for technique selection against the described behavioural outcome enabled a preliminary understanding of staff, patient and environmental influences on de-escalation success or failure. Importantly, the more coercive 'non-physical control' techniques are currently conceptualised by staff as a feature of de-escalation techniques, yet, there was evidence of a link between these and increased aggression/use of restrictive practices. Risk was not a consistent factor in decisions to adopt more controlling techniques. 
Moral judgements regarding the function of the aggression; trial-and-error; ingrained local custom (especially around instruction to low stimulus areas); knowledge of the patient; time-efficiency and staff anxiety had a key role in escalating intervention. This paper provides a new model for understanding staff intervention in response to escalated aggression, a continuum between support and control. It further provides a preliminary explanatory framework for understanding the relationship between patient behaviour, staff response and environmental influences on de-escalation success and failure. This framework reveals potentially important behaviour change targets for interventions seeking to reduce violence and use of restrictive practices through enhanced de-escalation techniques. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Predictive analysis effectiveness in determining the epidemic disease infected area

    NASA Astrophysics Data System (ADS)

    Ibrahim, Najihah; Akhir, Nur Shazwani Md.; Hassan, Fadratul Hafinaz

    2017-10-01

    Epidemic disease outbreaks have raised great public concern over methods for controlling, preventing, and handling infectious disease, with the aim of reducing both the extent of disease dissemination and the size of the infected area. A backpropagation method was used for countermeasure and prediction analysis of epidemic disease. Predictive analysis based on backpropagation can be carried out via a machine-learning process that applies artificial intelligence to pattern recognition, statistics, and feature selection. This computational learning process is integrated with data mining by measuring the output score as a classifier for a given set of input features through a classification technique. The classification step selects disease-dissemination factors that are likely to be strongly interconnected in causing outbreaks. This preliminary study introduces a predictive analysis of epidemic disease for determining the infected area, using the backpropagation method and drawing on others' findings. The study classifies epidemic disease dissemination factors as the features for weight adjustment in the prediction of epidemic disease outbreaks. Through this preliminary study, predictive analysis is shown to be an effective method for determining the infected area of an epidemic disease, minimizing the error value through feature classification.
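A minimal sketch of the backpropagation weight adjustment the study relies on, using a toy two-feature problem rather than epidemiological data (network size, learning rate, and the XOR data are illustrative assumptions):

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_xor(epochs=5000, lr=0.5, seed=1):
    """One-hidden-layer network trained by backpropagation on XOR.

    Each pass propagates the squared prediction error backwards and
    adjusts the weights on the input features (here 2 toy features;
    in the study, disease-dissemination factors).  Returns the per-epoch
    training loss so the error reduction can be inspected.
    """
    rng = random.Random(seed)
    w1 = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
    b1 = [0.0, 0.0]
    w2 = [rng.uniform(-1, 1) for _ in range(2)]
    b2 = 0.0
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    losses = []
    for _ in range(epochs):
        total = 0.0
        for x, t in data:
            h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
            y = sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2)
            total += (y - t) ** 2
            dy = 2 * (y - t) * y * (1 - y)      # gradient at the output unit
            for j in range(2):
                dh = dy * w2[j] * h[j] * (1 - h[j])  # backpropagated to hidden unit j
                w2[j] -= lr * dy * h[j]
                w1[j][0] -= lr * dh * x[0]
                w1[j][1] -= lr * dh * x[1]
                b1[j] -= lr * dh
            b2 -= lr * dy
        losses.append(total)
    return losses

losses = train_xor()
# the training error shrinks as the feature weights adapt
```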

  18. The clinico-radiological paradox of cognitive function and MRI burden of white matter lesions in people with multiple sclerosis: A systematic review and meta-analysis.

    PubMed

    Mollison, Daisy; Sellar, Robin; Bastin, Mark; Mollison, Denis; Chandran, Siddharthan; Wardlaw, Joanna; Connick, Peter

    2017-01-01

    Moderate correlation exists between the imaging quantification of brain white matter lesions and cognitive performance in people with multiple sclerosis (MS). This may reflect the greater importance of other features, including subvisible pathology, or methodological limitations of the primary literature. To summarise the cognitive clinico-radiological paradox and explore the potential methodological factors that could influence the assessment of this relationship. Systematic review and meta-analysis of primary research relating cognitive function to white matter lesion burden. Fifty papers met eligibility criteria for review, and meta-analysis of overall results was possible in thirty-two (2050 participants). Aggregate correlation between cognition and T2 lesion burden was r = -0.30 (95% confidence interval: -0.34, -0.26). Wide methodological variability was seen, particularly related to key factors in the cognitive data capture and image analysis techniques. Resolving the persistent clinico-radiological paradox will likely require simultaneous evaluation of multiple components of the complex pathology using optimum measurement techniques for both cognitive and MRI feature quantification. We recommend a consensus initiative to support common standards for image analysis in MS, enabling benchmarking while also supporting ongoing innovation.
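Aggregating per-study correlations as in this meta-analysis is typically done on Fisher's z scale; a minimal sketch with hypothetical study values (the three (r, n) pairs below are made up for illustration):

```python
import math

def pooled_correlation(studies):
    """Fixed-effect pooled correlation via Fisher's z transform.

    studies: list of (r, n) pairs.  Each correlation is transformed with
    z = atanh(r), weighted by n - 3 (the inverse of z's sampling variance),
    averaged, and transformed back.  Returns (pooled_r, ci_low, ci_high)
    with a 95% confidence interval.
    """
    weights = [n - 3 for _, n in studies]
    zbar = sum(w * math.atanh(r) for (r, n), w in zip(studies, weights)) / sum(weights)
    se = 1.0 / math.sqrt(sum(weights))
    return (math.tanh(zbar), math.tanh(zbar - 1.96 * se), math.tanh(zbar + 1.96 * se))

# three hypothetical studies relating lesion burden to cognitive score
r_pool, lo, hi = pooled_correlation([(-0.25, 60), (-0.35, 120), (-0.30, 80)])
```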

  19. An Analysis of Factors that Inhibit Business Use of User-Centered Design Principles: A Delphi Study

    ERIC Educational Resources Information Center

    Hilton, Tod M.

    2010-01-01

    The use of user-centered design (UCD) principles has a positive impact on the use of web-based interactive systems in customer-centric organizations. User-centered design methodologies are not widely adopted in organizations due to intraorganizational factors. A qualitative study using a modified Delphi technique was used to identify the factors…

  20. State-Space Modeling of Dynamic Psychological Processes via the Kalman Smoother Algorithm: Rationale, Finite Sample Properties, and Applications

    ERIC Educational Resources Information Center

    Song, Hairong; Ferrer, Emilio

    2009-01-01

    This article presents a state-space modeling (SSM) technique for fitting process factor analysis models directly to raw data. The Kalman smoother via the expectation-maximization algorithm to obtain maximum likelihood parameter estimates is used. To examine the finite sample properties of the estimates in SSM when common factors are involved, a…
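The forward (filtering) half of the Kalman smoother can be sketched for the simplest state-space model, a scalar local level (illustrative only; process factor analysis uses a multivariate analogue with common factors):

```python
import random

def kalman_filter_local_level(ys, q, r, m0=0.0, p0=1e6):
    """Scalar Kalman filter for the local-level model
        state:  x_t = x_{t-1} + w_t,  w_t ~ N(0, q)
        obs:    y_t = x_t + v_t,      v_t ~ N(0, r)
    Returns the filtered state means.  The smoother runs a backward pass
    over these same predicted/updated quantities.
    """
    m, p = m0, p0
    means = []
    for y in ys:
        p = p + q                # predict
        k = p / (p + r)          # Kalman gain
        m = m + k * (y - m)      # update with the innovation y - m
        p = (1 - k) * p
        means.append(m)
    return means

# noisy observations of a constant latent level of 5.0
rng = random.Random(0)
obs = [5.0 + rng.gauss(0, 1) for _ in range(200)]
filtered = kalman_filter_local_level(obs, q=0.0, r=1.0)
```

With q = 0 and a diffuse initial variance, the filter reduces to a running mean, so the final estimate converges on the true level.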

  1. Spatial diffusion of influenza outbreak-related climate factors in Chiang Mai Province, Thailand.

    PubMed

    Nakapan, Supachai; Tripathi, Nitin Kumar; Tipdecho, Taravudh; Souris, Marc

    2012-10-24

    Influenza is one of the leading causes of respiratory illness in countries located in the tropical areas of South East Asia, including Thailand. In this study the climate factors associated with influenza incidence in Chiang Mai Province, Northern Thailand, were investigated. Identification of factors responsible for influenza outbreaks and the mapping of potential risk areas in Chiang Mai are long overdue. This work examines the association between yearly climate patterns between 2001 and 2008 and influenza outbreaks in the Chiang Mai Province. The climatic factors included the amount of rainfall, percent of rainy days, relative humidity, maximum and minimum temperatures, and temperature difference. The study develops a statistical analysis to quantitatively assess the relationship between climate and influenza outbreaks and then evaluates its suitability for predicting influenza outbreaks. A multiple linear regression technique was used to fit the statistical model. The Inverse Distance Weighted (IDW) interpolation and Geographic Information System (GIS) techniques were used in mapping the spatial diffusion of influenza risk zones. The results show that there is a significant correlation between influenza outbreaks and climate factors for the majority of the studied area. A statistical analysis was conducted to assess the validity of the model comparing model outputs and actual outbreaks.
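The IDW interpolation step used for the risk maps weights each known sample by the inverse of its distance to the query point raised to a power. A minimal sketch with hypothetical incidence values at made-up coordinates:

```python
def idw_interpolate(points, x, y, power=2.0):
    """Inverse Distance Weighted estimate at (x, y).

    points: list of (px, py, value).  Each known sample contributes with
    weight 1 / d^power; a query that coincides with a sample returns that
    sample's value exactly (IDW is an exact interpolator).
    """
    num = den = 0.0
    for px, py, v in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0.0:
            return v
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# hypothetical influenza incidence rates at three district centroids
samples = [(0.0, 0.0, 10.0), (1.0, 0.0, 20.0), (0.0, 1.0, 30.0)]
mid = idw_interpolate(samples, 0.4, 0.4)
```

IDW estimates always stay within the range of the sampled values, which is why it is a common choice for mapping bounded quantities such as incidence rates.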

  2. A Qualitative Study on Organizational Factors Affecting Occupational Accidents

    PubMed Central

    ESKANDARI, Davood; JAFARI, Mohammad Javad; MEHRABI, Yadollah; KIAN, Mostafa Pouya; CHARKHAND, Hossein; MIRGHOTBI, Mostafa

    2017-01-01

    Background: Technical, human, operational and organizational factors have been influencing the sequence of occupational accidents. Among them, organizational factors play a major role in causing occupational accidents. The aim of this research was to understand the Iranian safety experts’ experiences and perception of organizational factors. Methods: This qualitative study was conducted in 2015 by using the content analysis technique. Data were collected through semi-structured interviews with 17 safety experts working in Iranian universities and industries and analyzed with a conventional qualitative content analysis method using the MAXQDA software. Results: Eleven organizational factors’ sub-themes were identified: management commitment, management participation, employee involvement, communication, blame culture, education and training, job satisfaction, interpersonal relationship, supervision, continuous improvement, and reward system. The participants considered these factors as effective on occupational accidents. Conclusion: The mentioned 11 organizational factors are probably involved in occupational accidents in Iran. Naturally, improving organizational factors can increase the safety performance and reduce occupational accidents. PMID:28435824

  3. A Cross-Cultural Analysis of Personality Structure Through the Lens of the HEXACO Model.

    PubMed

    Ion, Andrei; Iliescu, Dragos; Aldhafri, Said; Rana, Neeti; Ratanadilok, Kattiya; Widyanti, Ari; Nedelcea, Cătălin

    2017-01-01

    Across 5 different samples, totaling more than 1,600 participants from India, Indonesia, Oman, Romania, and Thailand, the authors address the question of cross-cultural replicability of a personality structure, while exploring the utility of exploratory structural equation modeling (ESEM) as a data analysis technique in cross-cultural personality research. Personality was measured with an alternative, non-Five-Factor Model (FFM) personality framework, provided by the HEXACO-PI (Lee & Ashton, 2004 ). The results show that the HEXACO framework was replicated in some of the investigated cultures. The ESEM data analysis technique proved to be especially useful in investigating the between-group measurement equivalence of broad personality measures across different cultures.

  4. Human Factors Research in Anesthesia Patient Safety

    PubMed Central

    Weinger, Matthew B.; Slagle, Jason

    2002-01-01

    Patient safety has become a major public concern. Human factors research in other high-risk fields has demonstrated how rigorous study of factors that affect job performance can lead to improved outcome and reduced errors after evidence-based redesign of tasks or systems. These techniques have increasingly been applied to the anesthesia work environment. This paper describes data obtained recently using task analysis and workload assessment during actual patient care and the use of cognitive task analysis to study clinical decision making. A novel concept of “non-routine events” is introduced and pilot data are presented. The results support the assertion that human factors research can make important contributions to patient safety. Information technologies play a key role in these efforts.

  5. Human factors research in anesthesia patient safety.

    PubMed Central

    Weinger, M. B.; Slagle, J.

    2001-01-01

    Patient safety has become a major public concern. Human factors research in other high-risk fields has demonstrated how rigorous study of factors that affect job performance can lead to improved outcome and reduced errors after evidence-based redesign of tasks or systems. These techniques have increasingly been applied to the anesthesia work environment. This paper describes data obtained recently using task analysis and workload assessment during actual patient care and the use of cognitive task analysis to study clinical decision making. A novel concept of "non-routine events" is introduced and pilot data are presented. The results support the assertion that human factors research can make important contributions to patient safety. Information technologies play a key role in these efforts. PMID:11825287

  6. CrossTalk: The Journal of Defense Software Engineering. Volume 27, Number 1, January/February 2014

    DTIC Science & Technology

    2014-02-01

    deficit in trustworthiness and will permit analysis on how this deficit needs to be overcome. This analysis will help identify adaptations that are... approaches to trustworthy analysis split into two categories: product-based and process-based. Product-based techniques [9] identify factors that... Criticalities may also be assigned to decompositions and contributions. 5. Evaluation and analysis: in this task the propagation rules of the NFR

  7. The standard deviation of extracellular water/intracellular water is associated with all-cause mortality and technique failure in peritoneal dialysis patients.

    PubMed

    Tian, Jun-Ping; Wang, Hong; Du, Feng-He; Wang, Tao

    2016-09-01

    The mortality rate of peritoneal dialysis (PD) patients is still high, and the predicting factors for PD patient mortality remain to be determined. This study aimed to explore the relationship between the standard deviation (SD) of extracellular water/intracellular water (E/I) and all-cause mortality and technique failure in continuous ambulatory PD (CAPD) patients. All 152 patients came from the PD Center between January 1st 2006 and December 31st 2007. Clinical data and at least five-visit E/I ratio defined by bioelectrical impedance analysis were collected. The patients were followed up till December 31st 2010. The primary outcomes were death from any cause and technique failure. Kaplan-Meier analysis and Cox proportional hazards models were used to identify risk factors for mortality and technique failure in CAPD patients. All patients were followed up for 59.6 ± 23.0 months. The patients were divided into two groups according to their SD of E/I values: lower SD of E/I group (≤0.126) and higher SD of E/I group (>0.126). The patients with higher SD of E/I showed a higher all-cause mortality (log-rank χ² = 10.719, P = 0.001) and technique failure (log-rank χ² = 9.724, P = 0.002) than those with lower SD of E/I. Cox regression analysis found that SD of E/I independently predicted all-cause mortality (HR 3.551, 95% CI 1.442-8.746, P = 0.006) and technique failure (HR 2.487, 95% CI 1.093-5.659, P = 0.030) in CAPD patients after adjustment for confounders except when sensitive C-reactive protein was added into the model. The SD of E/I was a strong independent predictor of all-cause mortality and technique failure in CAPD patients.
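The Kaplan-Meier estimator used here multiplies, at each event time, the fraction of at-risk patients surviving that time. A minimal sketch on a toy three-patient cohort (the times and event flags are made up for illustration):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times: follow-up time per patient; events: 1 = event (e.g. death or
    technique failure), 0 = censored.  Returns a list of (t, S(t)) pairs
    at each distinct time with at least one event.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = leaving = 0
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            leaving += 1
            i += 1
        if deaths:
            s *= (at_risk - deaths) / at_risk   # survive this event time
            curve.append((t, s))
        at_risk -= leaving                      # events and censorings leave the risk set
    return curve

# toy cohort: events at t=1 and t=2, one patient censored at t=3
curve = kaplan_meier([1, 2, 3], [1, 1, 0])
```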

  8. Multiparametric in situ mRNA hybridization analysis to predict disease recurrence in patients with colon carcinoma.

    PubMed Central

    Kitadai, Y.; Ellis, L. M.; Tucker, S. L.; Greene, G. F.; Bucana, C. D.; Cleary, K. R.; Takahashi, Y.; Tahara, E.; Fidler, I. J.

    1996-01-01

    We examined the expression level of several genes that regulate different steps of metastasis in formalin-fixed, paraffin-embedded archival specimens of primary human colon carcinomas from patients with at least 5 years of follow-up. The expression of epidermal growth factor receptor, basic fibroblast growth factor, type IV collagenase, E-cadherin, and multidrug resistance (mdr-1) was examined by a colorimetric in situ mRNA hybridization technique concentrating on reactivity at the periphery of the neoplasms. The in situ hybridization technique revealed inter- and intratumor heterogeneity for expression of the metastasis-related genes. The expression of basic fibroblast growth factor, collagenase type IV, epidermal growth factor receptor, and mdr-1 mRNA was higher in Dukes' stage D than in Dukes' stage B tumors. Among the 22 Dukes' stage B neoplasms, 5 specimens exhibited a high expression level of epidermal growth factor receptor, basic fibroblast growth factor, and collagenase type IV. Clinical outcome data (5-year follow-up) revealed that all 5 patients with Dukes' stage B tumors developed distant metastasis (recurrent disease), whereas the other 17 patients with Dukes' stage B tumors expressing low levels of the metastasis-related genes were disease-free. Multivariate analysis identified high levels of expression of collagenase type IV and low levels of expression of E-cadherin as independent factors significantly associated with metastasis or recurrent disease. More specifically, metastatic or recurrent disease was associated with a high ratio (> 1.35) of expression of collagenase type IV to E-cadherin (specificity of 95%). Collectively, the data show that multiparametric in situ hybridization analysis for several metastasis-related genes may predict the metastatic potential, and hence the clinical outcome, of individual lymph-node-negative human colon cancers. PMID:8909244

  9. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for the discovery, monitoring, and improvement of processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques into hospital systems to analyze process execution. Moreover, we illustrate its application in clinical and administrative decision making for the management of hospital activities.

  10. Identifying the arterial input function from dynamic contrast-enhanced magnetic resonance images using an apex-seeking technique

    NASA Astrophysics Data System (ADS)

    Martel, Anne L.

    2004-04-01

    In order to extract quantitative information from dynamic contrast-enhanced MR images (DCE-MRI) it is usually necessary to identify an arterial input function. This is not a trivial problem if there are no major vessels present in the field of view. Most existing techniques rely on operator intervention or use various curve parameters to identify suitable pixels, but these are often specific to the anatomical region or the acquisition method used. They also require the signal from several pixels to be averaged in order to improve the signal-to-noise ratio; however, this introduces errors due to partial volume effects. We have previously described how factor analysis can be used to automatically separate arterial and venous components in DCE-MRI studies of the brain. Although that method works well for single-slice images through the brain when the blood-brain barrier is intact, it runs into problems for multi-slice images with more complex dynamics. This paper describes a factor analysis method that is more robust in such situations and is relatively insensitive to the number of physiological components present in the data set. The technique is very similar to that used to identify spectral end-members from multispectral remote sensing images.

  11. The horizontal working mobility of employees with garment technique educational background

    NASA Astrophysics Data System (ADS)

    Supraptono, Eko; Sudana, I. Made; Rini, Sri Hastuti Eko

    2018-03-01

    The purposes of this report are: 1) to describe the working mobility of garment employees, and 2) to analyze the factors that cause working mobility and the new working orientations sought by garment employees. This research uses both qualitative and quantitative approaches. Informants were selected purposively. Data were collected through observation, interviews, and documentation, and analyzed using descriptive qualitative analysis of every aspect. The results show that labor migration was high, as seen at the Ungaran Sari Garment Company, with migration spells lasting between 1 and 6 months, and that the type of new job sought by employees is a vacancy appropriate to their competence. Factors that influence working mobility include the workers' attitudes and the company management system. The orientation of the new job is feeling comfortable while working.

  12. A technique for estimating the absolute gain of a photomultiplier tube

    NASA Astrophysics Data System (ADS)

    Takahashi, M.; Inome, Y.; Yoshii, S.; Bamba, A.; Gunji, S.; Hadasch, D.; Hayashida, M.; Katagiri, H.; Konno, Y.; Kubo, H.; Kushida, J.; Nakajima, D.; Nakamori, T.; Nagayoshi, T.; Nishijima, K.; Nozaki, S.; Mazin, D.; Mashuda, S.; Mirzoyan, R.; Ohoka, H.; Orito, R.; Saito, T.; Sakurai, S.; Takeda, J.; Teshima, M.; Terada, Y.; Tokanai, F.; Yamamoto, T.; Yoshida, T.

    2018-06-01

    Detection of low-intensity light relies on the conversion of photons to photoelectrons, which are then multiplied and detected as an electrical signal. To measure the actual intensity of the light, one must know the factor by which the photoelectrons have been multiplied. To obtain this amplification factor, we have developed a procedure for estimating precisely the signal caused by a single photoelectron. The method utilizes the fact that the photoelectrons conform to a Poisson distribution. The average signal produced by a single photoelectron can then be estimated from the number of noise events, without requiring analysis of the distribution of the signal produced by a single photoelectron. The signal produced by one or more photoelectrons can be estimated experimentally without any assumptions. This technique, and an example of the analysis of a signal from a photomultiplier tube, are described in this study.
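
    The Poisson relation at the heart of the method can be sketched numerically: if a fraction P(0) of light flashes yields no photoelectron, then the mean number of photoelectrons is μ = -ln P(0), and the mean charge per photoelectron is the overall mean charge divided by μ. A minimal Python sketch on simulated data (all detector numbers are hypothetical, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a PMT charge spectrum: Poisson photoelectron counts with a
# Gaussian response per photoelectron (hypothetical gain and noise values).
mu_true, gain_true = 1.2, 50.0          # mean p.e. per flash, charge per p.e. (a.u.)
n_pe = rng.poisson(mu_true, size=100_000)
charge = rng.normal(n_pe * gain_true, 5.0 + 2.0 * np.sqrt(np.maximum(n_pe, 1)))

# Count "noise" (zero-photoelectron) events below a pedestal threshold.
threshold = 20.0                        # separates the pedestal from >=1 p.e. signals
p0 = np.mean(charge < threshold)

# Poisson statistics: P(0) = exp(-mu)  =>  mu = -ln(P(0))
mu_est = -np.log(p0)

# Average charge per photoelectron, without fitting the single-p.e. peak shape.
gain_est = charge.mean() / mu_est
```

Because only the count of pedestal ("noise") events enters, no fit to the single-photoelectron charge distribution is needed, which is the point of the technique.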

  13. Structural Image Analysis of the Brain in Neuropsychology Using Magnetic Resonance Imaging (MRI) Techniques.

    PubMed

    Bigler, Erin D

    2015-09-01

    Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative and quantitative methods to relate brain structure to neuropsychological outcome, and are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures including volume, thickness, and shape. The challenge for neuropsychology is which metric to use, for which disorder, and when image analysis methods should be applied to assess brain structure and pathology. A basic overview is provided of the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered, including factors related to similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.

  14. Potable water taste enhancement

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An analysis was conducted to determine the causes of and remedies for the unpalatability of potable water in manned spacecraft. Criteria and specifications for palatable water were established, and a quantitative laboratory analysis technique was developed for determining the amounts of volatile organics in good-tasting water. Prototype spacecraft water reclamation systems are evaluated in terms of the essential palatability factors.

  15. Study of the role of tumor necrosis factor-α (-308 G/A) and interleukin-10 (-1082 G/A) polymorphisms as potential risk factors to acute kidney injury in patients with severe sepsis using high-resolution melting curve analysis.

    PubMed

    Hashad, Doaa I; Elsayed, Eman T; Helmy, Tamer A; Elawady, Samier M

    2017-11-01

    Septic acute kidney injury (AKI) is a prevalent complication in intensive care units with an increased incidence of complications. The aim of the present study was to assess the use of high-resolution melting (HRM) curve analysis in investigating whether the genetic polymorphisms -308 G/A of tumor necrosis factor-α (TNF-α) and -1082 G/A of interleukin-10 (IL-10) may predispose patients diagnosed with severe sepsis to the development of AKI. One hundred and fifty patients with severe sepsis participated in the present study, of whom sixty-six developed AKI. Both polymorphisms were studied using HRM analysis. The low-producer genotypes of both studied polymorphisms of the TNF-α and IL-10 genes were associated with AKI, and in logistic regression analysis they remained independent risk factors for AKI. A statistically significant difference was detected between the two studied groups, the low-producer genotypes of both the TNF-α (-308 G/A) and IL-10 (-1082 G/A) polymorphisms being more prevalent in patients developing AKI. Principal conclusions: the low-producer genotypes of the TNF-α (-308 G/A) and IL-10 (-1082 G/A) polymorphisms could be considered risk factors for the development of AKI in critically ill patients with severe sepsis; the management of this category of patients should therefore be adjusted to protect them from deterioration to acute kidney injury. HRM genotyping proved to be a highly efficient, simple, and cost-effective technique that is well suited to the routine study of large-scale samples.

  16. Multivariate analysis on unilateral cleft lip and palate treatment outcome by EUROCRAN index: A retrospective study.

    PubMed

    Yew, Ching Ching; Alam, Mohammad Khursheed; Rahman, Shaifulizan Abdul

    2016-10-01

    This study evaluates the dental arch relationship and palatal morphology of unilateral cleft lip and palate patients using the EUROCRAN index, and assesses the factors that affect them using multivariate statistical analysis. A total of one hundred and seven patients from five to twelve years old with non-syndromic unilateral cleft lip and palate were included in the study. These patients had received cheiloplasty and one-stage palatoplasty surgery but had not yet received an alveolar bone grafting procedure. Five assessors trained in the use of the EUROCRAN index underwent a calibration exercise and ranked the dental arch relationships and palatal morphology of the patients' study models. For intra-rater agreement, the examiners scored the models twice, with a two-week interval between sessions. Variable factors collected for the patients included gender; site, type, and family history of unilateral cleft lip and palate; absence of the lateral incisor on the cleft side; and the cheiloplasty and palatoplasty techniques used. Associations between the various factors and dental arch relationships were assessed using logistic regression analysis. Dental arch relationships among unilateral cleft lip and palate patients in the local population scored relatively worse than in other parts of the world. Crude logistic regression analysis did not demonstrate any significant associations of the various socio-demographic factors or the cheiloplasty and palatoplasty techniques used with the dental arch relationship outcome. This study has limitations that might have affected the results, for example, having multiple operators performing the surgeries and the inability to assess the influence of underlying genetically predisposed cranio-facial variability; these may have a substantial influence on the treatment outcome. The factors that affect unilateral cleft lip and palate treatment outcome are multifactorial in nature and remain controversial in general. Copyright © 2016 Elsevier Ireland Ltd. 
All rights reserved.

  17. The Influence of Accreditation on the Sustainability of Organizations with the Brazilian Accreditation Methodology

    PubMed Central

    de Paiva, Anderson Paulo

    2018-01-01

    This research evaluates the influence of the Brazilian accreditation methodology on the sustainability of organizations. Critical factors for implementing accreditation were also examined, including measurement of the relationships established between these factors and organizational sustainability. The study was developed using the survey methodology, applied to organizations accredited by the ONA (National Accreditation Organization); 288 responses were received from top-level managers. Quantitative data from the measurement models were analyzed with principal components factor analysis. The final model was evaluated using confirmatory factor analysis and structural equation modeling techniques. The results from the research are vital for defining the factors that interfere in accreditation processes, providing a better understanding for accredited organizations and for Brazilian accreditation. PMID:29599939

  18. Diffraction analysis of customized illumination technique

    NASA Astrophysics Data System (ADS)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, and double exposure have been considered as driving forces pushing the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received considerable attention from lithographers. A new approach to illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve the customized illumination shape. With the help of this technique, it was found that a layout change does not necessarily change the shape of the customized illumination mode.

  19. The art of spacecraft design: A multidisciplinary challenge

    NASA Technical Reports Server (NTRS)

    Abdi, F.; Ide, H.; Levine, M.; Austel, L.

    1989-01-01

    Actual design turn-around time has become shorter due to the optimization techniques that have been introduced into the design process. What, how, and when to use these optimization techniques may be the key factor for future aircraft engineering operations. Another important aspect of this technique is that complex physical phenomena can be modeled by a simple mathematical equation. The new, powerful multilevel methodology reduces time-consuming analysis significantly while maintaining the coupling effects. This simultaneous analysis method stems from the implicit function theorem and system sensitivity derivatives of input variables. Use of Taylor series expansion and a finite-differencing technique for sensitivity derivatives in each discipline makes this approach unique for screening dominant variables from nondominant ones. In this study, current Computational Fluid Dynamics (CFD) aerodynamic and sensitivity-derivative/optimization techniques are applied to a simple cone-type forebody of a high-speed vehicle configuration to understand basic aerodynamic/structure interaction in a hypersonic flight condition.

  20. Recent Advances in Techniques for Starch Esters and the Applications: A Review

    PubMed Central

    Hong, Jing; Zeng, Xin-An; Brennan, Charles S.; Brennan, Margaret; Han, Zhong

    2016-01-01

    Esterification is one of the most important methods of altering the structure of starch granules and improving their applications. Conventionally, starch esters are prepared by conventional or dual modification techniques, which have the disadvantages of being expensive, requiring excess reagent, and being time-consuming. In addition, the degree of substitution (DS) is often considered the primary parameter, in view of its contribution to estimating the substituted groups of starch esters. To improve detection accuracy and production efficiency, different detection techniques for DS have been developed, including titration, nuclear magnetic resonance (NMR), Fourier transform infrared spectroscopy (FT-IR), thermal gravimetric analysis/infrared spectroscopy (TGA/IR), and headspace gas chromatography (HS-GC). This paper gives a comprehensive overview of recent advances in DS analysis and starch esterification techniques. Additionally, the advantages and limitations of these techniques, some perspectives on their future trends, and the applications of starch-ester derivatives in the food industry are presented. PMID:28231145

  1. Bi-Frequency Modulated Quasi-Resonant Converters: Theory and Applications

    NASA Astrophysics Data System (ADS)

    Zhang, Yuefeng

    1995-01-01

    To avoid the variable-frequency operation of quasi-resonant converters, many soft-switching PWM converters have been proposed, but all of them require an auxiliary switch, which increases the cost and complexity of the power supply system. In this thesis, a new technique for quasi-resonant converters has been proposed, called the bi-frequency modulation (BFM) technique. By operating quasi-resonant converters at two switching frequencies, this technique enables them to achieve soft-switching, at fixed switching frequencies, without an auxiliary switch. The steady-state analysis of four commonly used quasi-resonant converters, namely the ZVS buck, ZCS buck, ZVS boost, and ZCS boost converters, is presented. Using the concepts of equivalent sources, equivalent sinks, and the resonant tank, large-signal models of these four quasi-resonant converters were developed. Based on these models, the steady-state control characteristics of the BFM ZVS buck, BFM ZCS buck, BFM ZVS boost, and BFM ZCS boost converters were derived. The functional blocks and design considerations of the bi-frequency controller are presented, one implementation of the bi-frequency controller is given, and a complete design example is provided. Both computer simulations and experimental results have verified that bi-frequency modulated quasi-resonant converters can achieve soft-switching, at fixed switching frequencies, without an auxiliary switch. One application of the bi-frequency modulation technique is EMI reduction. The basic principle of using the BFM technique for EMI reduction is introduced. Based on spectral analysis, the EMI performances of PWM, variable-frequency, and bi-frequency modulated control signals were evaluated, and the BFM control signals show the lowest EMI emission. The bi-frequency modulation technique has also been applied to power factor correction: a BFM zero-current switching boost converter has been designed for power factor correction, and simulation results show that the power factor is improved.
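
    The EMI argument can be illustrated with a toy spectral comparison (a numpy sketch using idealized square waveforms and made-up frequencies, not the thesis's converter model): splitting the switching energy between two alternating carriers lowers the worst-case spectral line relative to fixed-frequency switching.

```python
import numpy as np

fs = 1_000_000                       # 1 MHz sample rate
t = np.arange(0, 0.02, 1 / fs)       # 20 ms observation window

def square(f, t):
    """Ideal +/-1 square wave at frequency f (a crude switching waveform)."""
    return np.sign(np.sin(2 * np.pi * f * t))

# Fixed-frequency switching at 50 kHz (PWM-style control signal).
pwm = square(50e3, t)

# Bi-frequency modulation: alternate between 40 kHz and 60 kHz every millisecond.
segment = (t // 1e-3).astype(int) % 2
bfm = np.where(segment == 0, square(40e3, t), square(60e3, t))

# Largest spectral line, a crude proxy for peak conducted EMI.
def peak_line(x):
    return np.abs(np.fft.rfft(x)).max() / len(x)

peak_pwm, peak_bfm = peak_line(pwm), peak_line(bfm)
# Splitting the energy between two carriers roughly halves the worst spectral line.
```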

  2. Maternal infection rates after cesarean delivery by Pfannenstiel or Joel-Cohen incision: a multicenter surveillance study.

    PubMed

    Dumas, Anne Marie; Girard, Raphaële; Ayzac, Louis; Caillat-Vallet, Emmanuelle; Tissot-Guerraz, Françoise; Vincent-Bouletreau, Agnès; Berland, Michel

    2009-12-01

    Our purpose was to evaluate maternal nosocomial infection rates according to the incision technique used for cesarean delivery, in a routine surveillance study. This was a prospective study of 5123 cesarean deliveries (43.2% Joel-Cohen, 56.8% Pfannenstiel incisions) in 35 maternity units (Mater Sud Est network). Data on routine surveillance variables, operative duration, and three additional variables (manual removal of the placenta, uterine exteriorization, and/or cleaning of the parieto-colic gutter) were collected. Multiple logistic regression analysis was used to identify independent risk factors for infection. The overall nosocomial infection and endometritis rates were higher for the Joel-Cohen than the Pfannenstiel incision (4.5% vs. 3.3% and 0.8% vs. 0.3%, respectively). The higher rate of nosocomial infections with the Joel-Cohen incision was due to a greater proportion of patients presenting risk factors (i.e., emergency delivery, primary cesarean, blood loss ≥800 mL, no manual removal of the placenta, and no uterine exteriorization). However, the Joel-Cohen technique was an independent risk factor for endometritis. The Joel-Cohen technique is faster than the Pfannenstiel technique but is associated with a higher incidence of endometritis.

  3. Temporal and spatial assessment of river surface water quality using multivariate statistical techniques: a study in Can Tho City, a Mekong Delta area, Vietnam.

    PubMed

    Phung, Dung; Huang, Cunrui; Rutherford, Shannon; Dwirahmadi, Febi; Chu, Cordia; Wang, Xiaoming; Nguyen, Minh; Nguyen, Nga Huy; Do, Cuong Manh; Nguyen, Trung Hieu; Dinh, Tuan Anh Diep

    2015-05-01

    The present study is an evaluation of temporal/spatial variations of surface water quality using multivariate statistical techniques, comprising cluster analysis (CA), principal component analysis (PCA), factor analysis (FA), and discriminant analysis (DA). Eleven water quality parameters were monitored at 38 different sites in Can Tho City, a Mekong Delta area of Vietnam, from 2008 to 2012. Hierarchical cluster analysis grouped the 38 sampling sites into three clusters, representing mixed urban-rural areas, agricultural areas, and an industrial zone. FA/PCA resulted in three latent factors for the entire research location, three for cluster 1, four for cluster 2, and four for cluster 3, explaining 60, 60.2, 80.9, and 70% of the total variance in the respective water quality. The varifactors from FA indicated that the parameters responsible for water quality variations are related to erosion from disturbed land or inflow of effluent from sewage plants and industry, discharges from wastewater treatment plants and domestic wastewater, agricultural activities and industrial effluents, and contamination by sewage waste with faecal coliform bacteria through sewer and septic systems. Discriminant analysis (DA) revealed that nephelometric turbidity units (NTU), chemical oxygen demand (COD), and NH₃ are the discriminating parameters in space, affording 67% correct assignation in spatial analysis; pH and NO₂ are the discriminating parameters according to season, assigning approximately 60% of cases correctly. The findings suggest a possible revised sampling strategy that can reduce the number of sampling sites and the indicator parameters responsible for large variations in water quality. This study demonstrates the usefulness of multivariate statistical techniques for evaluation of temporal/spatial variations in water quality assessment and management.
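
    As a rough illustration of the CA → FA/PCA → DA pipeline described above, here is a scikit-learn sketch on synthetic data (the 38-site by 11-parameter shape matches the study, but every value, cluster structure, and score is fabricated):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(42)

# Hypothetical monitoring matrix: 38 sites x 11 water-quality parameters.
X = rng.normal(size=(38, 11))
X[:13] += [2, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0]    # e.g. elevated COD/turbidity sites
X[13:26] += [0, 0, 2, 2, 0, 0, 0, 0, 0, 0, 0]  # e.g. elevated nutrient sites

Xs = StandardScaler().fit_transform(X)

# Step 1: hierarchical cluster analysis groups the sampling sites.
clusters = AgglomerativeClustering(n_clusters=3, linkage="ward").fit_predict(Xs)

# Step 2: PCA/FA extracts latent factors and their explained variance.
pca = PCA(n_components=3).fit(Xs)
explained = pca.explained_variance_ratio_.sum()

# Step 3: discriminant analysis identifies parameters separating the clusters;
# the training score plays the role of "% correct assignation".
lda = LinearDiscriminantAnalysis().fit(Xs, clusters)
accuracy = lda.score(Xs, clusters)
```

In a real application the LDA coefficients (`lda.coef_`) would be inspected to name the discriminating parameters, as the study does for NTU, COD, and NH₃.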

  4. Factor Analysis of Drawings: Application to college student models of the greenhouse effect

    NASA Astrophysics Data System (ADS)

    Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel

    2015-09-01

    Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance, suggesting that 4 archetype models of the greenhouse effect dominate thinking within this population. Factor scores, indicating the extent to which each student's drawing aligned with representative models, were compared to performance on conceptual understanding and attitudes measures, demographics, and non-cognitive features of drawings. Student drawings were also compared to drawings made by scientists to ascertain the extent to which models reflect more sophisticated and accurate models. Results indicate that student and scientist drawings share some similarities, most notably the presence of some features of the most sophisticated non-scientific model held among the study population. Prior knowledge, prior attitudes, gender, and non-cognitive components are also predictive of an individual student's model. This work presents a new technique for analyzing drawings, with general implications for the use of drawings in investigating student conceptions.
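
    The drawing-analysis workflow (content analysis into salient features, then exploratory factor analysis, then per-drawing factor scores) can be sketched as follows, assuming scikit-learn's FactorAnalysis and a hypothetical 10-feature binary coding scheme; the 200-drawing and 4-factor sizes mirror the study but the data are simulated:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Hypothetical coding: 200 drawings x 10 binary features
# (e.g. "sun drawn", "atmosphere layer present", "rays shown reflecting", ...).
latent = rng.normal(size=(200, 4))                  # 4 underlying archetype models
loadings = rng.normal(scale=0.8, size=(4, 10))
features = (latent @ loadings
            + rng.normal(scale=0.5, size=(200, 10)) > 0).astype(float)

# Exploratory factor analysis groups co-occurring features into factors.
fa = FactorAnalysis(n_components=4, random_state=0).fit(features)
scores = fa.transform(features)   # per-drawing factor scores (alignment with models)
pattern = fa.components_          # loading of each feature on each factor
```

The `scores` matrix is what would then be correlated with conceptual-understanding and attitude measures, as described in the abstract.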

  5. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  6. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
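
    The Monte Carlo weight-sensitivity phase can be sketched for a generic weighted-overlay susceptibility model (hypothetical criteria, weights, and perturbation scale; this is not the papers' AHP/OWA implementation): perturb the criteria weights, recompute the map, and summarize the per-cell spread as uncertainty.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical raster of 5 susceptibility criteria over 1000 cells, scaled to [0, 1].
criteria = rng.random((1000, 5))
base_w = np.array([0.35, 0.25, 0.20, 0.12, 0.08])  # AHP-style weights, sum to 1

def susceptibility(weights):
    # Weighted linear combination, the usual GIS-MCDA overlay.
    return criteria @ weights

# Monte Carlo simulation: perturb the weights, renormalize, recompute the map.
runs = np.empty((500, criteria.shape[0]))
for i in range(500):
    w = np.clip(base_w + rng.normal(scale=0.05, size=5), 1e-6, None)
    runs[i] = susceptibility(w / w.sum())

# Per-cell uncertainty of the susceptibility map under weight perturbation.
cell_uncertainty = runs.std(axis=0)
base_map = susceptibility(base_w)
```

Cells with high `cell_uncertainty` are those whose classification depends most on the subjective weighting, which is what the integrated uncertainty-sensitivity analysis is meant to expose.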

  7. Apportionment of urban aerosol sources in Cork (Ireland) by synergistic measurement techniques.

    PubMed

    Dall'Osto, Manuel; Hellebust, Stig; Healy, Robert M; O'Connor, Ian P; Kourtchev, Ivan; Sodeau, John R; Ovadnevaite, Jurgita; Ceburnis, Darius; O'Dowd, Colin D; Wenger, John C

    2014-09-15

    The sources of ambient fine particulate matter (PM2.5) during wintertime at a background urban location in Cork city (Ireland) have been determined. Aerosol chemical analyses were performed by multiple techniques including on-line high resolution aerosol time-of-flight mass spectrometry (Aerodyne HR-ToF-AMS), on-line single particle aerosol time-of-flight mass spectrometry (TSI ATOFMS), on-line elemental carbon-organic carbon analysis (Sunset_EC-OC), and off-line gas chromatography/mass spectrometry and ion chromatography analysis of filter samples collected at 6-h resolution. Positive matrix factorization (PMF) has been carried out to better elucidate aerosol sources not clearly identified when analyzing results from individual aerosol techniques on their own. Two datasets have been considered: on-line measurements averaged over 2-h periods, and both on-line and off-line measurements averaged over 6-h periods. Five aerosol sources were identified by PMF in both datasets, with excellent agreement between the two solutions: (1) regional domestic solid fuel burning--"DSF_Regional," 24-27%; (2) local urban domestic solid fuel burning--"DSF_Urban," 22-23%; (3) road vehicle emissions--"Traffic," 15-20%; (4) secondary aerosols from regional anthropogenic sources--"SA_Regional" 9-13%; and (5) secondary aged/processed aerosols related to urban anthropogenic sources--"SA_Urban," 21-26%. The results indicate that, despite regulations for restricting the use of smoky fuels, solid fuel burning is the major source (46-50%) of PM2.5 in wintertime in Cork, and also likely other areas of Ireland. Whilst wood combustion is strongly associated with OC and EC, it was found that peat and coal combustion is linked mainly with OC and the aerosol from these latter sources appears to be more volatile than that produced by wood combustion. Ship emissions from the nearby port were found to be mixed with the SA_Regional factor. 
The PMF analysis allowed us to link the AMS cooking organic aerosol factor (AMS_PMF_COA) to oxidized organic aerosol, chloride and locally produced nitrate, indicating that AMS_PMF_COA cannot be attributed to primary cooking emissions only. Overall, there are clear benefits from factor analysis applied to results obtained from multiple techniques, which allows better association of aerosols with sources and atmospheric processes. Copyright © 2014 Elsevier B.V. All rights reserved.
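
    PMF proper weights each observation by its measurement uncertainty; as a simplified stand-in, non-negative matrix factorization captures the same decomposition X ≈ GF, with non-negative source contributions G and chemical profiles F. A scikit-learn sketch on synthetic data (the sample count, species count, and five-factor solution are illustrative, not the study's dataset):

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(7)

# Hypothetical data matrix: 240 six-hour samples x 20 chemical species,
# built from 5 "true" sources so the factorization has structure to recover.
true_profiles = rng.random((5, 20))
true_contrib = rng.gamma(2.0, 1.0, size=(240, 5))
X = true_contrib @ true_profiles + rng.random((240, 20)) * 0.01

# Unweighted NMF as a PMF-like factorization: X ~ G F with G, F >= 0.
model = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)   # time series of source contributions
F = model.components_        # chemical profile of each source

# Mass apportioned to each factor, as a percentage of the total.
share = (G * F.sum(axis=1)).sum(axis=0)
share = 100 * share / share.sum()
```

Interpreting each row of `F` against known source fingerprints (wood smoke, traffic, secondary aerosol) is the step that turns mathematical factors into named sources, as done in the study.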

  8. Heisenberg principle applied to the analysis of speckle interferometry fringes

    NASA Astrophysics Data System (ADS)

    Sciammarella, C. A.; Sciammarella, F. M.

    2003-11-01

    Optical techniques that are used to measure displacements utilize a carrier. When a load is applied, the displacement field modulates the carrier. The accuracy of the information that can be recovered from the modulated carrier is limited by a number of factors. In this paper, these factors are analyzed, and conclusions concerning the limitations on information recovery are illustrated with examples taken from experimental data.

  9. Executive Order 12898 and Social, Economic, and Sociopolitical Factors Influencing Toxic Release Inventory Facility Location in EPA Region 6: A Multi-Scale Spatial Assessment of Environmental Justice

    ERIC Educational Resources Information Center

    Moore, Andrea Lisa

    2013-01-01

    Toxic Release Inventory facilities are among the many environmental hazards shown to create environmental inequities in the United States. This project examined four factors associated with Toxic Release Inventory, specifically, manufacturing facility location at multiple spatial scales using spatial analysis techniques (i.e., O-ring statistic and…

  10. A Novel Technique to Detect Code for SAC-OCDMA System

    NASA Astrophysics Data System (ADS)

    Bharti, Manisha; Kumar, Manoj; Sharma, Ajay K.

    2018-04-01

    The main task of an optical code division multiple access (OCDMA) system is the detection of the code used by a user in the presence of multiple access interference (MAI). In this paper, a new detection method, known as XOR subtraction detection, for spectral amplitude coding OCDMA (SAC-OCDMA) based on double weight codes is proposed and presented. As MAI is the main source of performance deterioration in OCDMA systems, the SAC technique is used here to eliminate the effect of MAI to a large extent. A comparative analysis is then made between the proposed scheme and other conventional detection schemes, namely complementary subtraction detection, AND subtraction detection, and NAND subtraction detection. The system performance is characterized by Q-factor, BER, and received optical power (ROP) with respect to input laser power and fiber length. The theoretical and simulation investigations reveal that the proposed detection technique provides a better quality factor, security, and received power than the conventional techniques. The wide opening of the eye diagram in the case of the proposed technique also demonstrates its robustness.

  11. Machine learning methods reveal the temporal pattern of dengue incidence using meteorological factors in metropolitan Manila, Philippines.

    PubMed

    Carvajal, Thaddeus M; Viacrusis, Katherine M; Hernandez, Lara Fides T; Ho, Howell T; Amalin, Divina M; Watanabe, Kozo

    2018-04-17

    Several studies have applied ecological factors such as meteorological variables to develop models that accurately predict the temporal pattern of dengue incidence or occurrence. Across the many studies that have investigated this premise, the modeling approaches differ and each typically uses only a single statistical technique, which raises the question of which technique is robust and reliable. Hence, our study aims to compare the predictive accuracy of the temporal pattern of dengue incidence in Metropolitan Manila, as influenced by meteorological factors, across four modeling techniques: (a) General Additive Modeling, (b) Seasonal Autoregressive Integrated Moving Average with exogenous variables, (c) Random Forest, and (d) Gradient Boosting. Dengue incidence and meteorological data (flood, precipitation, temperature, southern oscillation index, relative humidity, wind speed and direction) for Metropolitan Manila from January 1, 2009 to December 31, 2013 were obtained from the respective government agencies. Two types of datasets were used in the analysis: observed meteorological factors (MF) and their corresponding delayed or lagged effects (LG). These datasets were then subjected to the four modeling techniques, and the predictive accuracy and variable importance of each technique were calculated and evaluated. Among the statistical modeling techniques, Random Forest showed the best predictive accuracy. Moreover, the delayed or lagged effects of the meteorological variables were shown to be the best dataset to use for this purpose. Thus, the Random Forest model with delayed meteorological effects (RF-LG) was deemed the best among all assessed models. Relative humidity was shown to be the most important meteorological factor in the best model.
The study showed that the statistical modeling techniques indeed generate different predictive outcomes, and it further revealed the Random Forest model with delayed meteorological effects to be the best at predicting the temporal pattern of dengue incidence in Metropolitan Manila. It is also noteworthy that the study identified relative humidity, along with rainfall and temperature, as an important meteorological factor that can influence this temporal pattern.
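
    The winning RF-LG configuration pairs lagged predictors with a random forest. A minimal sketch of that idea on a fabricated weekly series (the variables, lag structure, and effect sizes are invented, not the study's data; uses scikit-learn):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
weeks = 260
# Fabricated weekly humidity series; cases respond to humidity 4 weeks earlier
humidity = 70 + 10 * np.sin(np.arange(weeks) / 8) + rng.normal(0, 1, weeks)
cases = 2 * np.roll(humidity, 4) + rng.normal(0, 2, weeks)

# Lagged predictors (1..8 weeks back), mirroring the "LG" dataset idea
lags = list(range(1, 9))
X = np.column_stack([np.roll(humidity, l) for l in lags])[8:]
y = cases[8:]

split = int(0.8 * len(y))          # simple chronological train/test split
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X[:split], y[:split])
r2 = rf.score(X[split:], y[split:])
best_lag = lags[int(np.argmax(rf.feature_importances_))]  # most influential delay
```

    Inspecting `feature_importances_` over the lag columns is one simple way to read off which delay matters most, analogous to the study's variable-importance step.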

  12. Estuarial fingerprinting through multidimensional fluorescence and multivariate analysis.

    PubMed

    Hall, Gregory J; Clow, Kerin E; Kenny, Jonathan E

    2005-10-01

    As part of a strategy for preventing the introduction of aquatic nuisance species (ANS) to U.S. estuaries, ballast water exchange (BWE) regulations have been imposed. Enforcing these regulations requires a reliable method for determining the port of origin of water in the ballast tanks of ships entering U.S. waters. This study shows that a three-dimensional fluorescence fingerprinting technique, excitation emission matrix (EEM) spectroscopy, holds great promise as a ballast water analysis tool. In our technique, EEMs are analyzed by multivariate classification and curve resolution methods, such as N-way partial least squares regression-discriminant analysis (NPLS-DA) and parallel factor analysis (PARAFAC). We demonstrate that classification techniques can be used to discriminate among sampling sites less than 10 miles apart, encompassing Boston Harbor and two tributaries in the Mystic River Watershed. To our knowledge, this work is the first to use multivariate analysis to classify water by location of origin. Furthermore, it is shown that curve resolution can reveal seasonal features within the multidimensional fluorescence data sets that correlate with difficulty in classification.
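
    PARAFAC models an EEM stack as a sum of trilinear components (sample loading x excitation spectrum x emission spectrum), typically fitted by alternating least squares. A bare-bones ALS sketch on a synthetic rank-2 tensor (dimensions and spectra are invented; dedicated packages add non-negativity constraints and convergence checks):

```python
import numpy as np

def khatri_rao(U, V):
    # Column-wise Kronecker product used in the ALS normal equations
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

rng = np.random.default_rng(2)
# Synthetic EEM stack: 30 samples x 20 excitation x 25 emission, 2 fluorophores
ex = np.linspace(0, 1, 20)
em = np.linspace(0, 1, 25)
A = rng.random((30, 2))                                   # concentrations
B = np.column_stack([np.exp(-((ex - m) / 0.10) ** 2) for m in (0.30, 0.70)])
C = np.column_stack([np.exp(-((em - m) / 0.12) ** 2) for m in (0.35, 0.65)])
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Alternating least squares for the rank-2 trilinear (PARAFAC) model
Ah, Bh, Ch = (rng.random((n, 2)) for n in (30, 20, 25))
for _ in range(200):
    Ah = X.reshape(30, -1) @ khatri_rao(Bh, Ch) @ np.linalg.pinv((Bh.T @ Bh) * (Ch.T @ Ch))
    Bh = np.moveaxis(X, 1, 0).reshape(20, -1) @ khatri_rao(Ah, Ch) @ np.linalg.pinv((Ah.T @ Ah) * (Ch.T @ Ch))
    Ch = np.moveaxis(X, 2, 0).reshape(25, -1) @ khatri_rao(Ah, Bh) @ np.linalg.pinv((Ah.T @ Ah) * (Bh.T @ Bh))

rel_err = np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', Ah, Bh, Ch)) / np.linalg.norm(X)
```

    The recovered factor matrices play the role of resolved excitation/emission spectra and per-sample loadings; a classifier such as NPLS-DA would then operate on loadings like `Ah`.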

  13. On Bi-Grid Local Mode Analysis of Solution Techniques for 3-D Euler and Navier-Stokes Equations

    NASA Technical Reports Server (NTRS)

    Ibraheem, S. O.; Demuren, A. O.

    1994-01-01

    A procedure is presented for utilizing a bi-grid stability analysis as a practical tool for predicting multigrid performance in a range of numerical methods for solving Euler and Navier-Stokes equations. Model problems based on the convection, diffusion and Burgers' equations are used to illustrate the superiority of the bi-grid analysis as a predictive tool for multigrid performance in comparison to the smoothing factor derived from conventional von Neumann analysis. For the Euler equations, bi-grid analysis is presented for three upwind difference based factorizations, namely Spatial, Eigenvalue and Combination splits, and two central difference based factorizations, namely LU and ADI methods. In the former, both the Steger-Warming and van Leer flux-vector splitting methods are considered. For the Navier-Stokes equations, only the Beam-Warming (ADI) central difference scheme is considered. In each case, estimates of multigrid convergence rates from the bi-grid analysis are compared to smoothing factors obtained from single-grid stability analysis. Effects of grid aspect ratio and flow skewness are examined. Both predictions are compared with practical multigrid convergence rates for 2-D Euler and Navier-Stokes solutions based on the Beam-Warming central scheme.
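
    The conventional von Neumann smoothing factor mentioned above can be computed directly for a model problem. A sketch for weighted Jacobi on the 1-D Laplacian (the scheme and relaxation weight are textbook choices, not taken from the paper):

```python
import numpy as np

# Von Neumann smoothing analysis for weighted Jacobi on the 1-D Laplacian:
# a Fourier mode theta is damped by g(theta) = 1 - omega * (1 - cos(theta));
# the smoothing factor is max |g| over the high-frequency range [pi/2, pi]
omega = 2.0 / 3.0
theta = np.linspace(np.pi / 2, np.pi, 1001)
g = 1 - omega * (1 - np.cos(theta))
smoothing_factor = np.abs(g).max()   # classic value 1/3 for omega = 2/3
```

    Bi-grid analysis generalizes this single-grid quantity by also tracking how coarse-grid correction couples the frequency pairs, which is why it predicts practical multigrid rates more faithfully.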

  14. Evaluation of Landslide Mapping Techniques and LiDAR-based Conditioning Factors

    NASA Astrophysics Data System (ADS)

    Mahalingam, R.; Olsen, M. J.

    2014-12-01

    Landslides are a major geohazard that results in significant human, infrastructure, and economic losses. Landslide susceptibility mapping can help communities plan and prepare for these damaging events. Mapping landslide-susceptible locations using GIS and remote sensing techniques has gained popularity over the past three decades. These efforts use a wide variety of procedures and consider a wide range of factors. Unfortunately, each study is often completed differently and independently of others. Further, the quality of the datasets used varies in terms of source, data collection, and generation, which can propagate errors or inconsistencies into the resulting output maps. Light detection and ranging (LiDAR) has proved to have higher accuracy in representing the continuous topographic surface, which can help minimize this uncertainty. The primary objectives of this paper are to investigate the applicability and performance of terrain factors in landslide hazard mapping, determine whether LiDAR-derived datasets (slope, slope roughness, terrain roughness, stream power index and compound topographic index) can be used for predictive mapping without data representing other common landslide conditioning factors, and evaluate the differences in landslide susceptibility mapping using widely used statistical approaches. The aforementioned factors were used to produce landslide susceptibility maps for a 140 km² study area in northwest Oregon using six representative techniques: frequency ratio, weights of evidence, logistic regression, discriminant analysis, artificial neural network, and support vector machine. Most notably, the research showed an advantage in selecting fewer critical conditioning factors. The most reliable factors could all be derived from a single LiDAR DEM, reducing the need for laborious and costly data gathering. Most of the six techniques showed similar statistical results; however, ANN showed less accuracy for predictive mapping.
Keywords: LiDAR, Landslides, Oregon, Inventory, Hazard
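
    Of the six techniques listed, the frequency ratio is the simplest to state: for each class of a conditioning factor, it is the share of landslide cells in that class divided by that class's share of all cells (equivalently, the class landslide rate over the overall rate). A toy sketch with an invented slope-class raster, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(3)
n_cells = 10_000
# Invented raster: each cell gets a slope class 0-3; steeper classes fail more often
slope_class = rng.integers(0, 4, size=n_cells)
p_slide = np.array([0.005, 0.01, 0.03, 0.08])[slope_class]
landslide = rng.random(n_cells) < p_slide     # toy landslide inventory mask

# Frequency ratio per class: landslide rate within the class / overall rate
fr = np.array([
    landslide[slope_class == c].mean() / landslide.mean()
    for c in range(4)
])

# A cell's susceptibility index sums the FRs of its factor classes (one factor here)
susceptibility = fr[slope_class]
```

    With several conditioning factors, the per-cell index is the sum of the FR values of the cell's class in each factor; values above 1 mark classes over-represented in the inventory.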

  15. Polymerase Chain Reaction/Rapid Methods Are Gaining a Foothold in Developing Countries.

    PubMed

    Ragheb, Suzan Mohammed; Jimenez, Luis

    Detection of microbial contamination in pharmaceutical raw materials and finished products is a critical factor in guaranteeing their safety, stability, and potency. Rapid microbiological methods, such as polymerase chain reaction, have been widely applied in clinical and food quality control analysis. However, polymerase chain reaction applications in pharmaceutical quality control have been rather slow and sporadic. Successful implementation of these methods in pharmaceutical companies in developing countries requires important considerations to provide sensitive and robust assays that will comply with good manufacturing practices. In recent years several publications have encouraged the application of molecular techniques in the microbiological assessment of pharmaceuticals. One of these techniques is the polymerase chain reaction (PCR). The successful application of PCR in the pharmaceutical industry in developing countries depends on several factors and requirements. These include setting up a PCR laboratory and choosing appropriate equipment and reagents. In addition, the presence of well-trained analysts and the establishment of quality control and quality assurance programs are important requirements. Pharmaceutical firms should take these factors into account to improve the chances of regulatory acceptance and wide application of this technique. © PDA, Inc. 2014.

  16. Fundamentals of functional imaging II: emerging MR techniques and new methods of analysis.

    PubMed

    Luna, A; Martín Noguerol, T; Mata, L Alcalá

    2018-05-01

    Current multiparameter MRI protocols integrate structural, physiological, and metabolic information about cancer. Emerging techniques such as arterial spin-labeling (ASL), blood oxygen level dependent (BOLD), MR elastography, chemical exchange saturation transfer (CEST), and hyperpolarization provide new information and will likely be integrated into daily clinical practice in the near future. Furthermore, there is great interest in the study of tumor heterogeneity as a prognostic factor and in relation to resistance to treatment, and this interest is leading to the application of new methods of analysis of multiparametric protocols. In parallel, new oncologic biomarkers that integrate the information from MR with clinical, laboratory, genetic, and histologic findings are being developed, thanks to the application of big data and artificial intelligence. This review analyzes different emerging MR techniques that are able to evaluate the physiological, metabolic, and mechanical characteristics of cancer, as well as the main clinical applications of these techniques. In addition, it summarizes the most novel methods of analysis of functional radiologic information in oncology. Copyright © 2018 SERAM. Publicado por Elsevier España, S.L.U. All rights reserved.

  17. Advances in Instrumental Analysis of Brominated Flame Retardants: Current Status and Future Perspectives

    PubMed Central

    2014-01-01

    This review aims to highlight the recent advances and methodological improvements in instrumental techniques applied for the analysis of different brominated flame retardants (BFRs). The literature search strategy was based on the recent analytical reviews published on BFRs. The main selection criteria involved the successful development and application of analytical methods for determination of the target compounds in various environmental matrices. Different factors affecting chromatographic separation and mass spectrometric detection of brominated analytes were evaluated and discussed. Techniques using advanced instrumentation to achieve outstanding results in quantification of different BFRs and their metabolites/degradation products were highlighted. Finally, research gaps in the field of BFR analysis were identified and recommendations for future research were proposed. PMID:27433482

  18. Comparison of analysis and flight test data for a drone aircraft with active flutter suppression

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Pototzky, A. S.

    1981-01-01

    A drone aircraft equipped with an active flutter suppression system is considered with emphasis on the comparison of modal dampings and frequencies as a function of Mach number. Results are presented for both symmetric and antisymmetric motion with flutter suppression off. Only symmetric results are given for flutter suppression on. Frequency response functions of the vehicle are presented from both flight test data and analysis. The analysis correlation is improved by using an empirical aerodynamic correction factor which is proportional to the ratio of experimental to analytical steady-state lift curve slope. The mathematical models are included and existing analytical techniques are described as well as an alternative analytical technique for obtaining closed-loop results.

  19. Statistical analysis of RHIC beam position monitors performance

    NASA Astrophysics Data System (ADS)

    Calaga, R.; Tomás, R.

    2004-04-01

    A detailed statistical analysis of beam position monitor (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison using run 2003 data shows striking agreement between the two methods, which can hence be used to improve BPM functioning at RHIC and possibly other accelerators.
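
    SVD can flag a malfunctioning BPM because correlated betatron motion occupies only a few singular modes, while a faulty monitor's signal does not project onto them. A self-contained sketch on synthetic turn-by-turn data (tune, noise levels, and the faulty channel are invented, not RHIC data):

```python
import numpy as np

rng = np.random.default_rng(4)
turns, n_bpm = 1000, 40
# Synthetic turn-by-turn orbit data: coherent betatron motion seen by all BPMs
tune = 0.22
phase = np.linspace(0, 2 * np.pi, n_bpm, endpoint=False)
t = np.arange(turns)[:, None]
data = np.sin(2 * np.pi * tune * t + phase) + 0.01 * rng.normal(size=(turns, n_bpm))
data[:, 7] = rng.normal(size=turns)   # BPM 7 returns pure noise (malfunctioning)

# The coherent motion spans two singular modes (sine/cosine); a fault does not
U, s, Vt = np.linalg.svd(data, full_matrices=False)
model = (U[:, :2] * s[:2]) @ Vt[:2]              # rank-2 reconstruction
residual = np.sqrt(((data - model) ** 2).mean(axis=0))
suspect = int(np.argmax(residual))               # channel with the worst fit
```

    The Fourier-based check described in the abstract would instead look for the absence of the betatron tune line in a channel's spectrum; agreement between the two flags is what the study exploits.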

  20. Methodological Synthesis in Quantitative L2 Research: A Review of Reviews and a Case Study of Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Plonsky, Luke; Gonulal, Talip

    2015-01-01

    Research synthesis and meta-analysis provide a pathway to bring together findings in a given domain with greater systematicity, objectivity, and transparency than traditional reviews. The same techniques and corresponding benefits can be and have been applied to examine methodological practices in second language (L2) research (e.g., Plonsky,…

  1. SEM-PLS Analysis of Inhibiting Factors of Cost Performance for Large Construction Projects in Malaysia: Perspective of Clients and Consultants

    PubMed Central

    Memon, Aftab Hameed; Rahman, Ismail Abdul

    2014-01-01

    This study uncovered inhibiting factors to cost performance in large construction projects of Malaysia. Questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped in 7 categories were presented to the respondents for rating significant level of each factor. A total of 300 questionnaire forms were distributed. Only 144 completed sets were received and analysed using advanced multivariate statistical software of Structural Equation Modelling (SmartPLS v2). The analysis involved three iteration processes where several of the factors were deleted in order to make the model acceptable. The result of the analysis found that R² value of the model is 0.422 which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, contractor's site management category is the most prominent in exhibiting effect on cost performance of large construction projects. This finding is validated using advanced techniques of power analysis. This vigorous multivariate analysis has explicitly found the significant category which consists of several causative factors to poor cost performance in large construction projects. This will benefit all parties involved in construction projects for controlling cost overrun. PMID:24693227

  3. Factors Associated With Healthcare-Acquired Catheter-Associated Urinary Tract Infections: Analysis Using Multiple Data Sources and Data Mining Techniques.

    PubMed

    Park, Jung In; Bliss, Donna Z; Chi, Chih-Lin; Delaney, Connie W; Westra, Bonnie L

    The purpose of this study was to identify factors associated with healthcare-acquired catheter-associated urinary tract infections (HA-CAUTIs) using multiple data sources and data mining techniques. Three data sets were integrated for analysis: electronic health record data from a university hospital in the Midwestern United States was combined with staffing and environmental data from the hospital's National Database of Nursing Quality Indicators and a list of patients with HA-CAUTIs. Three data mining techniques were used for identification of factors associated with HA-CAUTI: decision trees, logistic regression, and support vector machines. Fewer total nursing hours per patient-day, lower percentage of direct care RNs with specialty nursing certification, higher percentage of direct care RNs with associate's degree in nursing, and higher percentage of direct care RNs with BSN, MSN, or doctoral degree are associated with HA-CAUTI occurrence. The results also support the association of the following factors with HA-CAUTI identified by previous studies: female gender; older age (>50 years); longer length of stay; severe underlying disease; glucose lab results (>200 mg/dL); longer use of the catheter; and RN staffing. Additional findings from this study demonstrated that the presence of more nurses with specialty nursing certifications can reduce HA-CAUTI occurrence. While there may be valid reasons for leaving in a urinary catheter, findings show that having a catheter in for more than 48 hours contributes to HA-CAUTI occurrence. Finally, the findings suggest that more nursing hours per patient-day are related to better patient outcomes.
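
    The study's pattern of fitting several learners to the same predictor matrix and comparing them can be sketched as follows. The cohort, predictors, and effect sizes below are entirely fabricated for illustration (uses scikit-learn):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 800
# Fabricated cohort: two predictors echoing factors named in the study
catheter_hours = rng.uniform(0, 120, n)     # time with catheter in place
nursing_hours = rng.uniform(4, 12, n)       # nursing hours per patient-day
logit = 0.04 * (catheter_hours - 48) - 0.5 * (nursing_hours - 8)
cauti = rng.random(n) < 1 / (1 + np.exp(-logit))   # synthetic infection label
X = np.column_stack([catheter_hours, nursing_hours])

# Same comparative pattern as the study: three learners, cross-validated accuracy
models = {
    "decision_tree": DecisionTreeClassifier(max_depth=3, random_state=0),
    "logistic_regression": LogisticRegression(),
    "svm": SVC(kernel="rbf"),
}
scores = {name: cross_val_score(m, X, cauti, cv=5).mean()
          for name, m in models.items()}
```

    Reading factor effects out of the three models differs by design: coefficients for logistic regression, split rules for the tree, and (less directly) support-vector weights, which is one reason such studies fit all three.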

  4. Long-term follow-up results of umbilical hernia repair.

    PubMed

    Venclauskas, Linas; Jokubauskas, Mantas; Zilinskas, Justas; Zviniene, Kristina; Kiudelis, Mindaugas

    2017-12-01

    Multiple suture techniques and various mesh repairs are used in open or laparoscopic umbilical hernia (UH) surgery. To compare long-term follow-up results of UH repair in different hernia surgery groups and to identify risk factors for UH recurrence. A retrospective analysis of 216 patients who underwent elective surgery for UH during a 10-year period was performed. The patients were divided into three groups according to surgery technique (suture, mesh and laparoscopic repair). Early and long-term follow-up results including hospital stay, postoperative general and wound complications, recurrence rate and postoperative patient complaints were reviewed. Risk factors for recurrence were also analyzed. One hundred and forty-six patients were operated on using suture repair, 52 using open mesh and 18 using laparoscopic repair technique. 77.8% of patients underwent long-term follow-up. The postoperative wound complication rate and long-term postoperative complaints were significantly higher in the open mesh repair group. The overall hernia recurrence rate was 13.1%. Only 2 (1.7%) patients with small hernias (< 2 cm) had a recurrence in the suture repair group. Logistic regression analysis showed that body mass index (BMI) > 30 kg/m², diabetes and wound infection were independent risk factors for umbilical hernia recurrence. The overall umbilical hernia recurrence rate was 13.1%. Body mass index > 30 kg/m², diabetes and wound infection were independent risk factors for UH recurrence. According to our study results, laparoscopic medium and large umbilical hernia repair has slight advantages over open mesh repair concerning early postoperative complications, long-term postoperative pain and recurrence.

  5. Identifying Key Hospital Service Quality Factors in Online Health Communities

    PubMed Central

    Jung, Yuchul; Hur, Cinyoung; Jung, Dain

    2015-01-01

    Background The volume of health-related user-created content, especially hospital-related questions and answers in online health communities, has rapidly increased. Patients and caregivers participate in online community activities to share their experiences, exchange information, and ask about recommended or discredited hospitals. However, there is little research on how to identify hospital service quality automatically from the online communities. In the past, in-depth analysis of hospitals has used random sampling surveys. However, such surveys are becoming impractical owing to the rapidly increasing volume of online data and the diverse analysis requirements of related stakeholders. Objective As a solution for utilizing large-scale health-related information, we propose a novel approach to identify hospital service quality factors and overtime trends automatically from online health communities, especially hospital-related questions and answers. Methods We defined social media–based key quality factors for hospitals. In addition, we developed text mining techniques to detect such factors that frequently occur in online health communities. After detecting these factors that represent qualitative aspects of hospitals, we applied a sentiment analysis to recognize the types of recommendations in messages posted within online health communities. Korea’s two biggest online portals were used to test the effectiveness of detection of social media–based key quality factors for hospitals. Results To evaluate the proposed text mining techniques, we performed manual evaluations on the extraction and classification results, such as hospital name, service quality factors, and recommendation types using a random sample of messages (ie, 5.44% (9450/173,748) of the total messages). Service quality factor detection and hospital name extraction achieved average F1 scores of 91% and 78%, respectively. 
In terms of recommendation classification, performance (ie, precision) is 78% on average. Extraction and classification performance still has room for improvement, but the extraction results are applicable to more detailed analysis. Further analysis of the extracted information reveals that there are differences in the details of social media–based key quality factors for hospitals according to the regions in Korea, and the patterns of change seem to accurately reflect social events (eg, influenza epidemics). Conclusions These findings could be used to provide timely information to caregivers, hospital officials, and medical officials for health care policies. PMID:25855612

  6. Design Criteria For Networked Image Analysis System

    NASA Astrophysics Data System (ADS)

    Reader, Cliff; Nitteberg, Alan

    1982-01-01

    Image systems design is currently undergoing a metamorphosis from the conventional computing systems of the past into a new generation of special purpose designs. This change is motivated by several factors, notable among which is the increased opportunity for high performance with low cost offered by advances in semiconductor technology. Another key issue is a maturing in understanding of problems and the applicability of digital processing techniques. These factors allow the design of cost-effective systems that are functionally dedicated to specific applications and used in a utilitarian fashion. Following an overview of the above stated issues, the paper presents a top-down approach to the design of networked image analysis systems. The requirements for such a system are presented, with orientation toward the hospital environment. The three main areas are image data base management, viewing of image data and image data processing. This is followed by a survey of the current state of the art, covering image display systems, data base techniques, communications networks and software systems control. The paper concludes with a description of the functional subsystems and architectural framework for networked image analysis in a production environment.

  7. Study of relationship between clinical factors and velopharyngeal closure in cleft palate patients

    PubMed Central

    Chen, Qi; Zheng, Qian; Shi, Bing; Yin, Heng; Meng, Tian; Zheng, Guang-ning

    2011-01-01

    BACKGROUND: This study was carried out to analyze the relationship between clinical factors and velopharyngeal closure (VPC) in cleft palate patients. METHODS: Chi-square test was used to compare the postoperative velopharyngeal closure rate. Logistic regression model was used to analyze independent variables associated with velopharyngeal closure. RESULTS: Difference of postoperative VPC rate in different cleft types, operative ages and surgical techniques was significant (P=0.000). Results of logistic regression analysis suggested that when operative age was beyond deciduous dentition stage, or cleft palate type was complete, or just had undergone a simple palatoplasty without levator veli palatini retropositioning, patients would suffer a higher velopharyngeal insufficiency rate after primary palatal repair. CONCLUSIONS: Cleft type, operative age and surgical technique were the contributing factors influencing VPC rate after primary palatal repair of cleft palate patients. PMID:22279464

  8. Reduction and analysis techniques for infrared imaging data

    NASA Technical Reports Server (NTRS)

    Mccaughrean, Mark

    1989-01-01

    Infrared detector arrays are becoming increasingly available to the astronomy community, with a number of array cameras already in use at national observatories, and others under development at many institutions. As the detector technology and imaging instruments grow more sophisticated, more attention is focussed on the business of turning raw data into scientifically significant information. Turning pictures into papers or, equivalently, astronomy into astrophysics, both accurately and efficiently, is discussed. Also discussed are some of the factors that can be considered at each of three major stages (acquisition, reduction, and analysis), concentrating in particular on several of the questions most relevant to the techniques currently applied to near-infrared imaging.

  9. Data management system performance modeling

    NASA Technical Reports Server (NTRS)

    Kiser, Larry M.

    1993-01-01

    This paper discusses analytical techniques that have been used to gain a better understanding of the Space Station Freedom's (SSF's) Data Management System (DMS). The DMS is a complex, distributed, real-time computer system that has been redesigned numerous times. The implications of these redesigns have not been fully analyzed. This paper discusses the advantages and disadvantages for static analytical techniques such as Rate Monotonic Analysis (RMA) and also provides a rationale for dynamic modeling. Factors such as system architecture, processor utilization, bus architecture, queuing, etc. are well suited for analysis with a dynamic model. The significance of performance measures for a real-time system are discussed.
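
    The best-known static RMA check referenced above is the Liu-Layland utilization bound: n periodic tasks are schedulable under rate-monotonic priorities if total utilization does not exceed n(2^(1/n) - 1). A sketch with made-up task parameters (not SSF DMS figures):

```python
# Liu-Layland utilization-bound test for Rate Monotonic Analysis; the task set
# of (period, worst-case execution time) pairs in ms is invented for illustration
tasks = [(10, 2), (20, 4), (50, 10)]
n = len(tasks)
utilization = sum(c / p for p, c in tasks)   # 0.60 for this task set
bound = n * (2 ** (1 / n) - 1)               # ~0.7798 for n = 3
schedulable = utilization <= bound           # sufficient (not necessary) test
```

    The bound is only sufficient: task sets above it may still be schedulable, which is one reason dynamic simulation complements the static check for queuing and bus effects.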

  10. The Impact of Information Technology on Organizations: Implications for Organizational Integration and the Management of Information Technology.

    DTIC Science & Technology

    1998-03-01

    and role model for its mission: "To be a great company by the year 2000 - to be to the cycling industry what Nike is to athletic shoes and Apple is to...position in industry by using three analytical techniques: strengths, weaknesses, opportunities, and threats (SWOT) analysis; strategic cost...analysis; and competitive strength assessment. SWOT analysis shows a company's internal and external factors. SWOT analysis provides a quick way for an

  11. Vortex-assisted magnetic β-cyclodextrin/attapulgite-linked ionic liquid dispersive liquid-liquid microextraction coupled with high-performance liquid chromatography for the fast determination of four fungicides in water samples.

    PubMed

    Yang, Miyi; Xi, Xuefei; Wu, Xiaoling; Lu, Runhua; Zhou, Wenfeng; Zhang, Sanbing; Gao, Haixiang

    2015-02-13

    A novel microextraction technique combining magnetic solid-phase microextraction (MSPME) with ionic liquid dispersive liquid-liquid microextraction (IL-DLLME) to determine four fungicides is presented in this work for the first time. The main factors affecting the extraction efficiency were optimized by the one-factor-at-a-time approach, and the impacts of these factors were studied by an orthogonal design. Without a tedious clean-up procedure, analytes were extracted from the sample to the adsorbent and organic solvent and then desorbed in acetonitrile prior to chromatographic analysis. Under the optimum conditions, good linearity and high enrichment factors were obtained for all analytes, with correlation coefficients ranging from 0.9998 to 1.0000 and enrichment factors ranging from 135- to 159-fold. The recoveries for the proposed approach were between 98% and 115%, the limits of detection were between 0.02 and 0.04 μg L(-1), and the RSDs ranged from 2.96 to 4.16. The method was successfully applied in the analysis of four fungicides (azoxystrobin, chlorothalonil, cyprodinil and trifloxystrobin) in environmental water samples. The recoveries for the real water samples ranged between 81% and 109%. The procedure proved to be a time-saving, environmentally friendly, and efficient analytical technique. Copyright © 2015 Elsevier B.V. All rights reserved.
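
    Two of the figures of merit quoted above are simple ratios: recovery compares the amount found in a spiked sample against the amount added, and the enrichment factor compares analyte concentration after versus before preconcentration. A sketch with invented numbers, not the paper's measurements:

```python
# Invented spike-recovery calculation (all concentrations hypothetical)
spiked_conc = 2.00        # μg/L of fungicide added to the water sample
blank_found = 0.10        # μg/L measured in the unspiked sample
spike_found = 2.05        # μg/L measured in the spiked sample
recovery_pct = 100 * (spike_found - blank_found) / spiked_conc   # 97.5 %

# Enrichment factor: concentration after vs before the microextraction step
conc_after = 310.0        # μg/L in the final acetonitrile phase (hypothetical)
conc_before = 2.0         # μg/L in the original sample (hypothetical)
enrichment_factor = conc_after / conc_before                      # 155-fold
```

    An enrichment factor in the reported 135- to 159-fold range is what lets sub-μg/L detection limits be reached with a conventional HPLC detector.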

  12. Topological data analysis (TDA) applied to reveal pedogenetic principles of European topsoil system.

    PubMed

    Savic, Aleksandar; Toth, Gergely; Duponchel, Ludovic

    2017-05-15

    Recent developments in applied mathematics are bringing new tools capable of synthesizing knowledge in various disciplines and of finding hidden relationships between variables. One such technique is topological data analysis (TDA), a fusion of classical exploration techniques, such as principal component analysis (PCA), with a topological point of view applied to the clustering of results. Various phenomena have already received new interpretations thanks to TDA, from the proper choice of sport teams to cancer treatments. For the first time, this technique has been applied in soil science, to show the interaction between physical and chemical soil attributes and main soil-forming factors, such as climate and land use. The topsoil data set of the Land Use/Land Cover Area Frame survey (LUCAS) was used as a comprehensive database consisting of approximately 20,000 samples, each described by 12 physical and chemical parameters. After the application of TDA, the results obtained were cross-checked against known grouping parameters, including five types of land cover, nine types of climate and the organic carbon content of soil. Some of the grouping characteristics observed using standard approaches were confirmed by TDA (e.g., organic carbon content), but novel subtle relationships (e.g., the magnitude of the anthropogenic effect in soil formation) were discovered as well. The importance of this finding is that TDA is a unique mathematical technique capable of extracting complex relations hidden in soil science data sets, giving the opportunity to see the influence of physicochemical, biotic and abiotic factors on topsoil formation through fresh eyes. Copyright © 2017 Elsevier B.V. All rights reserved.
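
    The Mapper construction that underlies this kind of TDA can be sketched without a dedicated library: project the data through a lens (here PCA), cover the lens with overlapping intervals, and cluster within each interval; the clusters become nodes of a topological summary. The sketch below uses synthetic stand-ins for the 12 LUCAS parameters; the bin count, overlap and DBSCAN settings are arbitrary choices, not the study's:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Two synthetic "soil" groups in a 12-dimensional parameter space
# (stand-ins for the 12 LUCAS physico-chemical parameters).
X = np.vstack([rng.normal(0, 1, (100, 12)),
               rng.normal(4, 1, (100, 12))])

# Mapper-style pipeline: PCA lens, overlapping intervals along it,
# clustering inside each interval; each cluster is a Mapper node.
lens = PCA(n_components=1).fit_transform(X).ravel()
lo, hi = lens.min(), lens.max()
n_bins, overlap = 6, 0.3
width = (hi - lo) / n_bins

nodes = []
for i in range(n_bins):
    a = lo + i * width - overlap * width
    b = lo + (i + 1) * width + overlap * width
    idx = np.where((lens >= a) & (lens <= b))[0]
    if len(idx) < 2:
        continue
    labels = DBSCAN(eps=5.0, min_samples=3).fit_predict(X[idx])
    for lab in set(labels) - {-1}:
        nodes.append(idx[labels == lab])

print(f"{len(nodes)} Mapper nodes from {len(X)} samples")
```

Nodes sharing samples would then be joined by edges to form the Mapper graph; a production analysis would use a maintained implementation rather than this sketch.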

  13. Load and dynamic assessment of B-52B-008 carrier aircraft for finned configuration 1 space shuttle solid rocket booster decelerator subsystem drop test vehicle. Volume 2: Airplane flutter and load analysis results

    NASA Technical Reports Server (NTRS)

    Quade, D. A.

    1978-01-01

    The airplane flutter and maneuver-gust load analysis results obtained during the B-52B drop test vehicle configuration (with fins) evaluation are presented. These data supplement those given in Volume 1 of this document. A brief mathematical description of the airspeed notation and gust load factor criteria is provided as a help to the user. References are given that provide mathematical descriptions of the airplane flutter and load analysis techniques. Airspeed-load factor diagrams are provided for the airplane weight configurations reanalyzed for the finned drop test vehicle configuration.

  14. Towards scar-free surgery: An analysis of the increasing complexity from laparoscopic surgery to NOTES

    PubMed Central

    Chellali, Amine; Schwaitzberg, Steven D.; Jones, Daniel B.; Romanelli, John; Miller, Amie; Rattner, David; Roberts, Kurt E.; Cao, Caroline G.L.

    2014-01-01

    Background NOTES is an emerging technique for performing surgical procedures, such as cholecystectomy. Debate about its real benefit over the traditional laparoscopic technique is on-going. There have been several clinical studies comparing NOTES to conventional laparoscopic surgery. However, no work has been done to compare these techniques from a Human Factors perspective. This study presents a systematic analysis describing and comparing different existing NOTES methods to laparoscopic cholecystectomy. Methods Videos of endoscopic/laparoscopic views from fifteen live cholecystectomies were analyzed to conduct a detailed task analysis of the NOTES technique. A hierarchical task analysis of laparoscopic cholecystectomy and several hybrid transvaginal NOTES cholecystectomies was performed and validated by expert surgeons. To identify similarities and differences between these techniques, their hierarchical decomposition trees were compared. Finally, a timeline analysis was conducted to compare the steps and substeps. Results At least three variations of the NOTES technique were used for cholecystectomy. Differences between the observed techniques at the substep level of hierarchy and on the instruments being used were found. The timeline analysis showed an increase in time to perform some surgical steps and substeps in NOTES compared to laparoscopic cholecystectomy. Conclusion As pure NOTES is extremely difficult given the current state of development in instrumentation design, most surgeons utilize different hybrid methods – combination of endoscopic and laparoscopic instruments/optics. Results of our hierarchical task analysis yielded an identification of three different hybrid methods to perform cholecystectomy with significant variability amongst them. The varying degrees to which laparoscopic instruments are utilized to assist in NOTES methods appear to introduce different technical issues and additional tasks leading to an increase in the surgical time. 
The NOTES continuum of invasiveness is proposed here as a classification scheme for these methods, which was used to construct a clear roadmap for training and technology development. PMID:24902811

  15. Modelling and analysis of FMS productivity variables by ISM, SEM and GTMA approach

    NASA Astrophysics Data System (ADS)

    Jain, Vineet; Raj, Tilak

    2014-09-01

    Productivity has often been cited as a key factor in flexible manufacturing system (FMS) performance, and actions to increase it are said to improve profitability and the wage-earning capacity of employees. Improving productivity is seen as a key issue for the long-term survival and success of a manufacturing system. The purpose of this paper is to model and analyse the productivity variables of FMS. The study employed several approaches: interpretive structural modelling (ISM), structural equation modelling (SEM), the graph theory and matrix approach (GTMA) and a cross-sectional survey within manufacturing firms in India. ISM was used to develop a model of the productivity variables, which was then analyzed. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) are powerful statistical techniques; CFA is carried out through SEM. EFA was applied to extract the factors in FMS using the Statistical Package for the Social Sciences (SPSS 20) software, and these factors were confirmed by CFA using the Analysis of Moment Structures (AMOS 20) software. Twenty productivity variables were identified from the literature, and four factors involved in FMS productivity were extracted: people, quality, machine and flexibility. SEM using AMOS 20 was used to fit the first-order four-factor structure. GTMA is a multiple attribute decision making (MADM) methodology used to quantify the intensity of productivity variables in an organization. An FMS productivity index is proposed to quantify the factors that affect FMS.
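
    A minimal open-source illustration of the EFA step, using scikit-learn's FactorAnalysis with varimax rotation in place of SPSS; the eight items and the two-factor structure are simulated, not the study's twenty productivity variables:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)
# Synthetic survey: 200 respondents, 8 items driven by two latent
# factors (hypothetical stand-ins for e.g. "people" and "machine").
n = 200
f = rng.normal(size=(n, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0], [0.9, 0.1],
                     [0.0, 0.8], [0.1, 0.9], [0.0, 0.7], [0.1, 0.8]])
X = f @ loadings.T + 0.3 * rng.normal(size=(n, 8))

fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(X)
L = fa.components_.T      # estimated loadings, items x factors

# After rotation, each item should load mostly on one factor.
dominant = np.abs(L).argmax(axis=1)
print(dominant)
```

In a real EFA the number of factors would be chosen first (scree plot, parallel analysis), and a CFA would then test the recovered structure on fresh data.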

  16. Decomposition techniques

    USGS Publications Warehouse

    Chao, T.T.; Sanzolone, R.F.

    1992-01-01

    Sample decomposition is a fundamental and integral step in the procedure of geochemical analysis. It is often the limiting factor in sample throughput, especially with the recent application of fast, modern multi-element measurement instrumentation. The complexity of geological materials makes it necessary to choose the sample decomposition technique that is compatible with the specific objective of the analysis. When selecting a decomposition technique, consideration should be given to the chemical and mineralogical characteristics of the sample, the elements to be determined, precision and accuracy requirements, sample throughput, technical capability of personnel, and time constraints. This paper addresses these concerns and discusses the attributes and limitations of many techniques of sample decomposition, along with examples of their application to geochemical analysis. The chemical properties of reagents as they function as decomposition agents are also reviewed. The section on acid dissolution techniques addresses the various inorganic acids that are used individually or in combination in both open and closed systems. Fluxes used in sample fusion are discussed. The promising microwave-oven technology and the emerging field of automation are also examined. A section on applications highlights the use of decomposition techniques for the determination of Au, platinum group elements (PGEs), Hg, U, hydride-forming elements, rare earth elements (REEs), and multi-elements in geological materials. Partial dissolution techniques used for geochemical exploration, which have been treated in detail elsewhere, are not discussed here; nor are fire assaying for noble metals and decomposition techniques for X-ray fluorescence or nuclear methods. © 1992.

  17. Factors affecting plant species composition of hedgerows: relative importance and hierarchy

    NASA Astrophysics Data System (ADS)

    Deckers, Bart; Hermy, Martin; Muys, Bart

    2004-07-01

    Although there has been a clear quantitative and qualitative decline in traditional hedgerow network landscapes during the last century, hedgerows are crucial for the conservation of rural biodiversity, functioning as an important habitat, refuge and corridor for numerous species. To safeguard this conservation function, insight into the basic organizing principles of hedgerow plant communities is needed. The vegetation composition of 511 individual hedgerows situated within an ancient hedgerow network landscape in Flanders, Belgium was recorded, in combination with a wide range of explanatory variables, including a selection of spatial variables. Non-parametric statistics in combination with multivariate data analysis techniques were used to study the effect of individual explanatory variables. Next, variables were grouped in five distinct subsets, and the relative importance of these variable groups was assessed by two related variation partitioning techniques, partial regression and partial canonical correspondence analysis, explicitly taking into account the existence of intercorrelations between variables of different factor groups. Most explanatory variables significantly affected hedgerow species richness and composition. Multivariate analysis showed that, besides adjacent land use, hedgerow management, soil conditions, hedgerow type and origin, the role of other factors such as hedge dimensions and intactness could certainly not be neglected. Furthermore, both methods revealed the same overall ranking of the five distinct factor groups. Besides a predominant impact of abiotic environmental conditions, it was found that management variables and structural aspects have a relatively larger influence on the distribution of plant species in hedgerows than their historical background or spatial configuration.

  18. Bad splits in bilateral sagittal split osteotomy: systematic review and meta-analysis of reported risk factors.

    PubMed

    Steenen, S A; van Wijk, A J; Becking, A G

    2016-08-01

    An unfavourable and unanticipated pattern of the bilateral sagittal split osteotomy (BSSO) is generally referred to as a 'bad split'. Patient factors predictive of a bad split reported in the literature are controversial. Suggested risk factors are reviewed in this article. A systematic review was undertaken, yielding a total of 30 studies published between 1971 and 2015 reporting the incidence of bad split and patient age, and/or surgical technique employed, and/or the presence of third molars. These included 22 retrospective cohort studies, six prospective cohort studies, one matched-pair analysis, and one case series. Spearman's rank correlation showed a statistically significant but weak correlation between increasing average age and increasing occurrence of bad splits in 18 studies (ρ=0.229; P<0.01). No comparative studies were found that assessed the incidence of bad split among the different splitting techniques. A meta-analysis pooling the effect sizes of seven cohort studies showed no significant difference in the incidence of bad split between cohorts of patients with third molars present and concomitantly removed during surgery, and patients in whom third molars were removed at least 6 months preoperatively (odds ratio 1.16, 95% confidence interval 0.73-1.85, Z=0.64, P=0.52). In summary, there is no robust evidence to date to show that any risk factor influences the incidence of bad split. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
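
    The pooled odds ratio reported above is the usual product of inverse-variance weighting of per-study log odds ratios; a fixed-effect sketch with hypothetical 2x2 counts (not the review's data):

```python
import numpy as np

# Hypothetical per-study counts (bad split events) for cohorts with
# third molars present vs. removed preoperatively; illustrative only.
studies = [  # (events_present, n_present, events_removed, n_removed)
    (4, 120, 3, 110),
    (2, 80, 2, 95),
    (5, 150, 4, 140),
]

log_ors, weights = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c
    log_or = np.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance of log OR
    log_ors.append(log_or)
    weights.append(1 / var)               # inverse-variance weight

log_ors, weights = np.array(log_ors), np.array(weights)
pooled = (weights * log_ors).sum() / weights.sum()
se = np.sqrt(1 / weights.sum())
z = pooled / se
ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
print(f"pooled OR = {np.exp(pooled):.2f}, "
      f"95% CI {ci[0]:.2f}-{ci[1]:.2f}, Z = {z:.2f}")
```

A random-effects model (e.g. DerSimonian-Laird) would additionally estimate between-study heterogeneity before pooling.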

  19. [Exploration of factors influencing the price of herbal medicine based on a VAR model].

    PubMed

    Wang, Nuo; Liu, Shu-Zhen; Yang, Guang

    2014-10-01

    Based on a vector auto-regression (VAR) model, this paper takes advantage of Granger causality tests, variance decomposition and impulse response analysis techniques to carry out a comprehensive study of the factors influencing the price of Chinese herbal medicine, including herbal cultivation costs, acreage, natural disasters, residents' needs and inflation. The study found that there are Granger causality relationships between inflation and herbal prices and between cultivation costs and herbal prices. In the variance decomposition of the Chinese herbal medicine price index, the largest contribution comes from its own fluctuations, followed by cultivation costs and inflation.

  20. A Componential Interpretation of the General Factor in Human Intelligence.

    DTIC Science & Technology

    1980-10-01

    individual-differences data. If one delves into the nature of variation across stimulus types rather than across subjects, however, a result... the conclusion we will reach from an evaluation of the data we have collected, we assert here that individual differences in general intelligence can... data. The technique we used was nonmetric multidimensional scaling rather than factor analysis, however (see Kruskal, 1964a, 1964b; Shepard, 1962a

  1. Understanding Pediatric Dentists' Dental Caries Management Treatment Decisions: A Conjoint Experiment.

    PubMed

    Kateeb, E T; Warren, J J; Gaeth, G J; Momany, E T; Damiano, P C

    2016-04-01

    When traditional ranking and rating surveys are used to assess dentists' treatment decisions, the patient's source of payment appears to be of little importance. Therefore, this study used the marketing research tool conjoint analysis to investigate the relative impact of source of payment along with the child's age and cooperativeness on pediatric dentists' willingness to use Atraumatic Restorative Treatment (ART) to restore posterior primary teeth. A conjoint survey was completed by 707 pediatric dentists. Three factors (age of the child, cooperativeness, type of insurance) were varied across 3 levels to create 9 patient scenarios. The relative weights that dentists placed on these factors in the restorative treatment decision process were determined by conjoint analysis. "Cooperativeness" (52%) was the most important factor, "age of the child" (26%) the second-most important factor, followed by "insurance status of the child" (22%). For the third factor, insurance, pediatric dentists were least willing to use ART with publicly insured children (-0.082), and this was significantly different from their willingness to use ART with uninsured children (0.010) but not significantly different from their willingness to use ART for children with private insurance (0.073). Unlike traditional ranking and rating tools, conjoint analysis found that the insurance status of the patient appeared to be an important factor in dentists' decisions about different restorative treatment options. When pediatric dentists were forced to make tradeoffs among different patients' factors, they were most willing to use the ART technique with young, uncooperative patients when they had no insurance. Knowledge Transfer Statement: The present study suggests the feasibility of using techniques borrowed from marketing research, such as conjoint analysis, to understand dentists' restorative treatment decisions.
Results of this study demonstrate pediatric dentists' willingness to use a particular restorative treatment option (Atraumatic Restorative Treatment in this application) when forced to make tradeoffs in a "conjoined," or holistic, context among different factors presented in real-life patient scenarios. A deeper understanding of dentists' treatment decisions is vital to develop valid practice guidelines and interventions that encourage the use of appropriate restorative treatment modalities.
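
    Part-worth utilities and attribute importances of the kind reported above are typically estimated by regressing profile ratings on effects-coded attribute levels; a simulated single-respondent sketch in which all "true" utilities are hypothetical:

```python
import itertools
import numpy as np

# Hypothetical 3x3x3 full-factorial design over the study's three
# attributes; part-worths and ratings are simulated, not the paper's.
levels = {"age": ["2y", "5y", "8y"],
          "coop": ["low", "medium", "high"],
          "insurance": ["public", "none", "private"]}
attrs = list(levels)
profiles = list(itertools.product(*levels.values()))

true = {"2y": -1.0, "5y": 0.2, "8y": 0.8,          # simulated "truth"
        "low": -2.0, "medium": 0.5, "high": 1.5,
        "public": -0.5, "none": 0.1, "private": 0.4}
rng = np.random.default_rng(7)
y = np.array([sum(true[l] for l in p) for p in profiles])
y = y + 0.1 * rng.normal(size=len(profiles))

# Effects coding: one column per non-baseline level; rows at the
# baseline level get -1, so part-worths sum to zero per attribute.
cols = []
for j, attr in enumerate(attrs):
    lv = levels[attr]
    for level in lv[:-1]:
        cols.append([1.0 if p[j] == level else -1.0 if p[j] == lv[-1] else 0.0
                     for p in profiles])
X = np.column_stack([np.ones(len(profiles))] + cols)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Relative importance = range of an attribute's part-worths divided by
# the total range across attributes (the usual conjoint convention).
importance, k = {}, 1
for attr in attrs:
    est = list(beta[k:k + len(levels[attr]) - 1])
    est.append(-sum(est))              # baseline part-worth
    importance[attr] = max(est) - min(est)
    k += len(levels[attr]) - 1
total = sum(importance.values())
importance = {a: v / total for a, v in importance.items()}
print(importance)
```

With the utilities chosen above, cooperativeness dominates the importance ranking, mirroring the ordering the study reports.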

  2. Qualitative computer aided evaluation of dental impressions in vivo.

    PubMed

    Luthardt, Ralph G; Koch, Rainer; Rudolph, Heike; Walter, Michael H

    2006-01-01

    Clinical investigations dealing with the precision of different impression techniques are rare. The objective of the present study was to develop and evaluate a procedure for the qualitative analysis of three-dimensional impression precision based on an established in-vitro procedure. The null hypothesis to be tested was that the precision of impressions does not differ depending on the impression technique used (single-step, monophase and two-step techniques) and on clinical variables. Digital surface data of patients' teeth prepared for crowns were gathered from standardized manufactured master casts after impressions with three different techniques were taken in a randomized order. Data sets were analyzed for each patient in comparison with the one-step impression chosen as the reference. The qualitative analysis was limited to data points within the 99.5% range. Based on the color-coded representation, areas with maximum deviations were determined (preparation margin and the mantle and occlusal surface). To qualitatively analyze the precision of the impression techniques, the hypothesis was tested in linear models for repeated measures factors (p < 0.05). For the positive 99.5% deviations, no variables with significant influence were determined in the statistical analysis. In contrast, the impression technique and the position of the preparation margin significantly influenced the negative 99.5% deviations. The influence of clinical parameters on the deviations between impression techniques can be determined reliably using the 99.5th percentile of the deviations. An analysis regarding the areas with maximum deviations showed high clinical relevance. The preparation margin was identified as the weak spot of impression taking.

  3. Analysis of Mesa Dislocation Gettering in HgCdTe/CdTe/Si(211) by Scanning Transmission Electron Microscopy

    NASA Astrophysics Data System (ADS)

    Jacobs, R. N.; Stoltz, A. J.; Benson, J. D.; Smith, P.; Lennon, C. M.; Almeida, L. A.; Farrell, S.; Wijewarnasuriya, P. S.; Brill, G.; Chen, Y.; Salmon, M.; Zu, J.

    2013-11-01

    Due to its strong infrared absorption and variable band-gap, HgCdTe is the ideal detector material for high-performance infrared focal-plane arrays (IRFPAs). Next-generation IRFPAs will utilize dual-color high-definition formats on large-area substrates such as Si or GaAs. However, heteroepitaxial growth on these substrates is plagued by high densities of lattice-mismatch-induced threading dislocations (TDs) that ultimately reduce IRFPA operability. Previously we demonstrated a postgrowth technique with the potential to eliminate or move TDs such that they have less impact on detector operability. In this technique, highly reticulated mesa structures are produced in as-grown HgCdTe epilayers, and then subjected to thermal cycle annealing. To fully exploit this technique, better understanding of the inherent mechanism is required. In this work, we employ scanning transmission electron microscopy (STEM) analysis of HgCdTe/CdTe/Si(211) samples prepared by focused ion beam milling. A key factor is the use of defect-decorated samples, which allows for a correlation of etch pits observed on the surface with underlying dislocation segments viewed in cross-section STEM images. We perform an analysis of these dislocations in terms of the general distribution, density, and mobility at various locations within the mesa structures. Based on our observations, we suggest factors that contribute to the underlying mechanism for dislocation gettering.

  4. Determination of total sulfur in lichens and plants by combustion-infrared analysis

    USGS Publications Warehouse

    Jackson, L.L.; Engleman, E.E.; Peard, J.L.

    1985-01-01

    Sulfur was determined in plants and lichens by combustion of the sample and infrared detection of the evolved sulfur dioxide using an automated sulfur analyzer. Vanadium pentoxide was used as a combustion accelerator. Pelletization of the sample prior to combustion was not found to be advantageous. Washing studies showed that leaching of sulfur was not a major factor in the sample preparation. The combustion-IR analysis usually gave a higher sulfur content than the turbidimetric analysis, as well as a shorter analysis time. Relative standard deviations of less than 7% were obtained by the combustion-IR technique when sulfur levels in plant material ranged from 0.05 to 0.70%. Determination of sulfur in National Bureau of Standards botanical reference materials showed good agreement between the combustion-IR technique and other instrumental procedures. Seven NBS botanical reference materials were analyzed.

  5. Comparison of analysis and flight test data for a drone aircraft with active flutter suppression

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Pototzky, A. S.

    1981-01-01

    This paper presents a comparison of analysis and flight test data for a drone aircraft equipped with an active flutter suppression system. Emphasis is placed on the comparison of modal dampings and frequencies as a function of Mach number. Results are presented for both symmetric and antisymmetric motion with flutter suppression off. Only symmetric results are presented for flutter suppression on. Frequency response functions of the vehicle are presented from both flight test data and analysis. The analysis correlation is improved by using an empirical aerodynamic correction factor which is proportional to the ratio of experimental to analytical steady-state lift curve slope. In addition to presenting the mathematical models and a brief description of existing analytical techniques, an alternative analytical technique for obtaining closed-loop results is presented.

  6. [Multivariate geostatistics and GIS-based approach to study the spatial distribution and sources of heavy metals in agricultural soil in the Pearl River Delta, China].

    PubMed

    Cai, Li-mei; Ma, Jin; Zhou, Yong-zhang; Huang, Lan-chun; Dou, Lei; Zhang, Cheng-bo; Fu, Shan-ming

    2008-12-01

    One hundred and eighteen surface soil samples were collected from Dongguan City and analyzed for Cu, Zn, Ni, Cr, Pb, Cd, As and Hg concentrations, pH and OM. The spatial distribution and sources of soil heavy metals were studied using multivariate geostatistical methods and GIS techniques. The results indicated that concentrations of Cu, Zn, Ni, Pb, Cd and Hg were beyond the soil background content in Guangdong province, with Pb, Cd and Hg especially so. The results of factor analysis grouped Cu, Zn, Ni, Cr and As in Factor 1, Pb and Hg in Factor 2 and Cd in Factor 3. The spatial maps based on geostatistical analysis showed a definite association of Factor 1 with the soil parent material, while Factor 2 was mainly affected by industries. The spatial distribution of Factor 3 was attributed to anthropogenic influence.

  7. A GIS-based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
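
    The Monte Carlo weight-sensitivity step can be illustrated compactly: perturb the criteria weights, renormalise, and measure how much the susceptibility ranking of the map cells moves. The scores and baseline weights below are illustrative, not the study's AHP values:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical MCDA setup: 500 map cells scored on 5 criteria in [0, 1],
# with baseline AHP-style weights; values are illustrative only.
scores = rng.random((500, 5))
w0 = np.array([0.35, 0.25, 0.20, 0.12, 0.08])

def susceptibility(weights):
    # Weighted linear combination of criterion scores per cell.
    return scores @ weights

base_rank = susceptibility(w0).argsort().argsort()

# Monte Carlo: jitter the weights, renormalise to sum to 1, and track
# the average absolute rank shift of each cell.
n_runs = 1000
rank_shift = np.zeros(len(scores))
for _ in range(n_runs):
    w = np.clip(w0 + rng.normal(0, 0.05, size=5), 1e-6, None)
    w /= w.sum()
    r = susceptibility(w).argsort().argsort()
    rank_shift += np.abs(r - base_rank)
rank_shift /= n_runs

print(f"mean rank shift: {rank_shift.mean():.1f} of {len(scores)} cells")
```

Cells with large rank shifts are those whose susceptibility class is least robust to the choice of criteria weights.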

  9. The use of the modified Cholesky decomposition in divergence and classification calculations

    NASA Technical Reports Server (NTRS)

    Van Rooy, D. L.; Lynn, M. S.; Snyder, C. H.

    1973-01-01

    This report analyzes the use of the modified Cholesky decomposition technique as applied to the feature selection and classification algorithms used in the analysis of remote sensing data (e.g., as in LARSYS). This technique is approximately 30% faster in classification and a factor of 2-3 faster in divergence, as compared with LARSYS. Also numerical stability and accuracy are slightly improved. Other methods necessary to deal with numerical stability problems are briefly discussed.
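
    The Cholesky route replaces explicit covariance inversion in the pairwise divergence computation between Gaussian classes; a sketch using scipy's triangular solves, with synthetic class statistics rather than LARSYS data:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(5)
d = 6  # number of spectral bands (illustrative)
A1, A2 = rng.normal(size=(d, d)), rng.normal(size=(d, d))
S1, S2 = A1 @ A1.T + d * np.eye(d), A2 @ A2.T + d * np.eye(d)
m1, m2 = rng.normal(size=d), rng.normal(size=d)

def divergence(S1, m1, S2, m2):
    # Pairwise divergence between two Gaussian classes:
    # 0.5*tr(S2^-1 S1 + S1^-1 S2 - 2I) + 0.5*dm^T (S1^-1 + S2^-1) dm.
    # cho_solve(c, B) computes S^{-1} B from the Cholesky factor,
    # avoiding an explicit (and less stable) matrix inverse.
    c1, c2 = cho_factor(S1), cho_factor(S2)
    dm = (m1 - m2)[:, None]
    t1 = 0.5 * np.trace(cho_solve(c2, S1) + cho_solve(c1, S2)
                        - 2 * np.eye(len(S1)))
    t2 = 0.5 * float(dm.T @ (cho_solve(c1, dm) + cho_solve(c2, dm)))
    return t1 + t2

print(f"divergence = {divergence(S1, m1, S2, m2):.3f}")
```

The same factorizations can be reused in the Gaussian classifier itself, which is where the speedups reported above come from.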

  10. Types of Sensory Integrative Dysfunction among Disabled Learners

    ERIC Educational Resources Information Center

    Ayres, A. Jean

    1972-01-01

    R-technique factor analysis was used to correlate results of sensorimotor, psycholinguistic and cognitive tests given to California children with learning disabilities. Results show not all children with specific neural disorders perform poorly on related tests where low scores would be expected. (PD)

  11. Computational method for analysis of polyethylene biodegradation

    NASA Astrophysics Data System (ADS)

    Watanabe, Masaji; Kawai, Fusako; Shibata, Masaru; Yokoyama, Shigeo; Sudate, Yasuhiro

    2003-12-01

    In a previous study concerning the biodegradation of polyethylene, we proposed a mathematical model based on two primary factors: the direct consumption or absorption of small molecules and the successive weight loss of large molecules due to β-oxidation. Our model is an initial value problem consisting of a differential equation whose independent variable is time. Its unknown variable represents the total weight of all the polyethylene molecules that belong to a molecular-weight class specified by a parameter. In this paper, we describe a numerical technique to introduce experimental results into the analysis of our model. We first establish its mathematical foundation in order to guarantee its validity, by showing that the initial value problem associated with the differential equation has a unique solution. Our computational technique is based on a linear system of differential equations derived from the original problem. We introduce some numerical results to illustrate our technique as a practical application of the linear approximation. In particular, we show how to solve the inverse problem to determine the consumption rate and the β-oxidation rate numerically, and illustrate our numerical technique by analyzing the GPC patterns of polyethylene wax obtained before and after 5 weeks of cultivation of a fungus, Aspergillus sp. AK-3. A numerical simulation based on these degradation rates confirms that the primary factors of polyethylene biodegradation posed in the modeling are indeed appropriate.
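
    The model's structure, a linear system over molecular-weight classes, can be sketched as an initial value problem; the class count and rates below are hypothetical, not the paper's fitted values:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Linear sketch of a weight-class model: w[i] is the total weight in
# molecular-weight class i. Beta-oxidation moves mass from class i to
# class i-1; the smallest class is also consumed directly.
# All rates are hypothetical, not the paper's fitted values.
n_classes = 20
beta_rate = 0.15       # beta-oxidation rate (per week)
consume_rate = 0.40    # direct consumption of small molecules (per week)

A = np.zeros((n_classes, n_classes))
for i in range(1, n_classes):
    A[i, i] -= beta_rate        # class i loses mass to class i-1
    A[i - 1, i] += beta_rate    # class i-1 gains it
A[0, 0] -= consume_rate         # smallest class is absorbed directly

w0 = np.ones(n_classes)         # initial (flat) weight distribution
sol = solve_ivp(lambda t, w: A @ w, (0.0, 5.0), w0, dense_output=True)
w5 = sol.sol(5.0)               # distribution after "5 weeks"

print(f"total weight after 5 weeks: {w5.sum():.3f} (initially {w0.sum():.0f})")
```

The paper's inverse problem would amount to fitting `beta_rate` and `consume_rate` so that the simulated distribution matches the measured GPC patterns.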

  12. An acoustic emission and acousto-ultrasonic analysis of impact damaged composite pressure vessels

    NASA Technical Reports Server (NTRS)

    Workman, Gary L. (Principal Investigator); Walker, James L.

    1996-01-01

    The use of acoustic emission to characterize impact damage in composite structures is being studied on composite bottles wrapped with graphite epoxy and Kevlar. Further development of the acoustic emission methodology will include neural net analysis and/or other multivariate techniques to enhance the capability of the technique to identify dominant failure mechanisms during fracture. The acousto-ultrasonics technique will also continue to be investigated to determine its ability to predict regions prone to failure prior to the burst tests. Characterization of the stress wave factor before and after impact damage will be useful for inspection purposes in manufacturing processes. The combination of the two methods will also allow for simple nondestructive tests capable of predicting the performance of a composite structure prior to its being placed in service and during service.

  13. Spatial patterns of heavy metals in soil under different geological structures and land uses for assessing metal enrichments.

    PubMed

    Krami, Loghman Khoda; Amiri, Fazel; Sefiyanian, Alireza; Shariff, Abdul Rashid B Mohamed; Tabatabaie, Tayebeh; Pradhan, Biswajeet

    2013-12-01

    One hundred and thirty composite soil samples were collected from Hamedan county, Iran, to characterize the spatial distribution and trace the sources of heavy metals including As, Cd, Co, Cr, Cu, Ni, Pb, V, Zn, and Fe. Multivariate gap statistical analysis was used; for the interrelation of spatial patterns of pollution, the disjunctive kriging and geoenrichment factor (EF(G)) techniques were applied. Heavy metals and soil properties were grouped using agglomerative hierarchical clustering and the gap statistic. Principal component analysis was used to identify the sources of the metals in the data set. Geostatistics was used for the geospatial data processing. Based on the comparison between the original data and background values of the ten metals, the disjunctive kriging and EF(G) techniques were used to quantify their geospatial patterns and assess the contamination levels of the heavy metals. The spatial distribution map combined with the statistical analysis showed that the main source of Cr, Co, Ni, Zn, Pb, and V in group A land use (agriculture, rocky, and urban) was geogenic; the origin of As, Cd, and Cu was industrial and agricultural activities (anthropogenic sources). In group B land use (rangeland and orchards), the origin of the metals Cr, Co, Ni, Zn, and V was mainly controlled by natural factors, and As, Cd, Cu, and Pb had been added by organic factors. In group C land use (water), the origin of most heavy metals is natural, without anthropogenic sources. The Cd and As pollution was relatively more serious across the different land uses. The EF(G) technique confirmed the anthropogenic influence of heavy metal pollution. All metals showed concentrations substantially higher than their background values, suggesting anthropogenic pollution.
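
    The geoenrichment factor itself is a simple ratio normalised to a conservative reference element such as Fe; a sketch with illustrative concentrations (the EF > 2 cut-off used below is a common rule of thumb for flagging enrichment, not necessarily the study's threshold):

```python
# Geo-enrichment factor, normalising to Fe as the reference element:
# EF = (C_metal / C_Fe)_sample / (C_metal / C_Fe)_background.
# Concentrations (mg/kg) below are illustrative, not the study's data.
background = {"Fe": 35000.0, "Cd": 0.15, "As": 8.0, "Ni": 30.0}
sample = {"Fe": 33000.0, "Cd": 0.90, "As": 21.0, "Ni": 32.0}

def enrichment_factor(metal, sample, background, ref="Fe"):
    return ((sample[metal] / sample[ref])
            / (background[metal] / background[ref]))

for metal in ("Cd", "As", "Ni"):
    ef = enrichment_factor(metal, sample, background)
    origin = "likely anthropogenic" if ef > 2 else "mainly geogenic"
    print(f"{metal}: EF = {ef:.1f} ({origin})")
```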

  14. Human factors issues and approaches in the spatial layout of a space station control room, including the use of virtual reality as a design analysis tool

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P., II

    1994-01-01

    Human Factors Engineering support was provided for the 30% design review of the late Space Station Freedom Payload Control Area (PCA). The PCA was to be the payload operations control room, analogous to the Spacelab Payload Operations Control Center (POCC). This effort began with a systematic collection and refinement of the relevant requirements driving the spatial layout of the consoles and PCA. This information was used as input for specialized human factors analytical tools and techniques in the design and design analysis activities. Design concepts and configuration options were developed and reviewed using sketches, 2-D Computer-Aided Design (CAD) drawings, and immersive Virtual Reality (VR) mockups.

  15. Medical data mining: knowledge discovery in a clinical data warehouse.

    PubMed Central

    Prather, J. C.; Lobach, D. F.; Goodwin, L. K.; Hales, J. W.; Hage, M. L.; Hammond, W. E.

    1997-01-01

    Clinical databases have accumulated large quantities of information about patients and their medical conditions. Relationships and patterns within this data could provide new medical knowledge. Unfortunately, few methodologies have been developed and applied to discover this hidden knowledge. In this study, the techniques of data mining (also known as Knowledge Discovery in Databases) were used to search for relationships in a large clinical database. Specifically, data accumulated on 3,902 obstetrical patients were evaluated for factors potentially contributing to preterm birth using exploratory factor analysis. Three factors were identified by the investigators for further exploration. This paper describes the processes involved in mining a clinical database including data warehousing, data query and cleaning, and data analysis. PMID:9357597
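
    A common first step in exploratory factor analysis of this kind is deciding how many factors to retain, for example by the Kaiser criterion (eigenvalues of the correlation matrix greater than 1). A minimal numpy sketch on simulated stand-in data; the variables and loadings are invented for illustration and are not the obstetrical data:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Simulate 200 "patients" x 6 measures driven by two latent factors plus noise
    latent = rng.normal(size=(200, 2))
    loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                         [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
    X = latent @ loadings.T + 0.4 * rng.normal(size=(200, 6))

    R = np.corrcoef(X, rowvar=False)        # correlation matrix of the measures
    eigvals = np.linalg.eigvalsh(R)[::-1]   # eigenvalues, sorted descending
    n_factors = int(np.sum(eigvals > 1.0))  # Kaiser criterion: retain eigenvalues > 1
    ```

    On this simulated data the criterion recovers the two latent factors; real clinical data would typically also be inspected with a scree plot before settling on a factor count.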

  16. Spatial epidemiological techniques in cholera mapping and analysis towards a local scale predictive modelling

    NASA Astrophysics Data System (ADS)

    Rasam, A. R. A.; Ghazali, R.; Noor, A. M. M.; Mohd, W. M. N. W.; Hamid, J. R. A.; Bazlan, M. J.; Ahmad, N.

    2014-02-01

    Cholera spatial epidemiology is the study of the spread and control of the disease's spatial pattern and epidemics. Previous studies have shown that multi-factorial causes such as human behaviour, ecology and other infectious risk factors influence disease outbreaks. Thus, the spatial pattern of the outbreaks and the possible interrelationships among these factors are crucial to explore in an in-depth study. This study focuses on the integration of geographical information system (GIS) and epidemiological techniques in an exploratory analysis of the cholera spatial pattern and distribution in a selected district of Sabah. The Spatial Statistics and Pattern tools in ArcGIS and Microsoft Excel were utilized to map and analyze the reported cholera cases and other data used. Meanwhile, a cohort study, an epidemiological technique, was applied to investigate multiple outcomes of disease exposure. The general spatial pattern of cholera was highly clustered, showing that the disease spread easily from a place or person to others, especially within 1500 meters of an infected person or location. Although the cholera outbreaks in the districts are not critical, the disease could become endemic in crowded areas, unhygienic environments, and places close to contaminated water. It was also strongly believed that the coastal water of the study areas has a possible relationship with cholera transmission and phytoplankton blooms, since these areas recorded higher numbers of cases. GIS demonstrates a vital spatial epidemiological capability for determining the distribution pattern of the disease and elucidating hypotheses about it. Future research will apply more advanced geo-analysis methods and other disease risk factors to produce a significant local-scale predictive risk model of the disease in Malaysia.

  17. The Possible Application of Socioeconomic Careers and Path Analysis Concepts to the Study of Factors Relevant to Physicians' Choice of Practice Location.

    ERIC Educational Resources Information Center

    Grimes, Walter F.

    In response to the current shortage of rural physicians and the difficulties encountered in studying this problem, this paper attempts to apply a specific multivariate technique (path analysis) and the socioeconomic careers model of Featherman and others to the study of the physician's choice of practice location. The socioeconomic careers model…

  18. Simultaneous Analysis of the Behavioural Phenotype, Physical Factors, and Parenting Stress in People with Cornelia De Lange Syndrome

    ERIC Educational Resources Information Center

    Wulffaert, J.; van Berckelaer-Onnes, I.; Kroonenberg, P.; Scholte, E.; Bhuiyan, Z.; Hennekam, R.

    2009-01-01

    Background: Studies into the phenotype of rare genetic syndromes largely rely on bivariate analysis. The aim of this study was to describe the phenotype of Cornelia de Lange syndrome (CdLS) in depth by examining a large number of variables with varying measurement levels. Virtually the only suitable multivariate technique for this is categorical…

  19. Life Cycle Costing: A Working Level Approach

    DTIC Science & Technology

    1981-06-01

    Contents include: Failure Mode and Effects Analysis (FMEA); Logistics Performance Factors (LPFs); Planning the Use of Life Cycle Cost in the Demonstration... Failure Mode and Effects Analysis (FMEA). Description: FMEA is a technique that attempts to improve the design of any particular unit. The FMEA ... failure modes and also eliminates extra parts, or ones that are used to achieve more performance than is necessary [16:5-14]. Advantages: FMEA forces...

  20. Two-dimensional fuzzy fault tree analysis for chlorine release from a chlor-alkali industry using expert elicitation.

    PubMed

    Renjith, V R; Madhu, G; Nayagam, V Lakshmana Gomathi; Bhasi, A B

    2010-11-15

    The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of these hazards related to chemical industries. Fault tree analysis (FTA) is an established technique in hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from storage and filling facility of chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. Copyright © 2010 Elsevier B.V. All rights reserved.
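
    In a fault tree of this kind, the top-event probability follows from the basic-event probabilities through the gate logic: assuming independent events, OR gates combine as 1 - prod(1 - p_i) and AND gates as prod(p_i). A minimal sketch with invented basic-event probabilities, not the study's chlorine data:

    ```python
    def or_gate(probs):
        """P(at least one of several independent basic events occurs)."""
        p_none = 1.0
        for p in probs:
            p_none *= (1.0 - p)
        return 1.0 - p_none

    def and_gate(probs):
        """P(all independent basic events occur)."""
        p_all = 1.0
        for p in probs:
            p_all *= p
        return p_all

    # Hypothetical branch: release requires a valve failure AND (alarm failure OR operator miss)
    p_valve, p_alarm, p_operator = 1e-3, 1e-2, 5e-2
    p_release = and_gate([p_valve, or_gate([p_alarm, p_operator])])  # ~5.95e-5
    ```

    The fuzzy extension described in the paper replaces these crisp probabilities with fuzzy numbers elicited from experts; the gate algebra stays the same.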

  1. Comparison of ICP-OES and MP-AES in determining soil nutrients by the Mehlich 3 method

    NASA Astrophysics Data System (ADS)

    Tonutare, Tonu; Penu, Priit; Krebstein, Kadri; Rodima, Ako; Kolli, Raimo; Shanskiy, Merrit

    2014-05-01

    Accurate, routine testing of nutrients in soil samples is critical to understanding potential soil fertility. Different factors must be taken into account when selecting the best analytical technique for soil laboratory analysis. Several techniques can provide an adequate detection range for the same analyte; in such cases the choice of technique will depend on factors such as sample throughput, required infrastructure, ease of use, chemicals used, need for a gas supply, and operating costs. The Mehlich 3 extraction method is widely used for determining the plant-available nutrient element contents of agricultural soils. For determination of Ca, K, and Mg from the soil extract, ICP and AAS techniques are used depending on the laboratory, as is flame photometry for K in some laboratories. For determination of extracted P, ICP or Vis spectrometry is used. The excellent sensitivity and wide working range for all extracted elements make ICP a nearly ideal method, so long as the sample throughput is big enough to justify the initial capital outlay. Another advantage of ICP techniques is their multiplex character (simultaneous acquisition of all wavelengths). Depending on the element, the detection limits are in the range 0.1-1000 μg/L. For smaller laboratories with low sample throughput requirements, the use of AAS is more common. Flame AAS is a fast, relatively cheap and easy technique for elemental analysis. The disadvantages of the method are single-element analysis and the use of a flammable gas such as C2H2, with N2O as an oxidant gas for some elements. Detection limits of elements for AAS lie between 1 and 1000 μg/L. MP-AES offers a unique alternative to both AAS and ICP-OES with its detection power and speed of analysis. MP-AES is a quite new, simple and relatively inexpensive multielemental technique that uses a self-sustained atmospheric-pressure microwave plasma (MP) run on nitrogen gas produced by a nitrogen generator. There is therefore no need for argon or flammable (C2H2) gases or cylinder handling, and the running costs of the equipment are low. Detection limits of elements for MP-AES lie between those of AAS and ICP. The objective of this study was to compare the results of soil analysis using two multielemental analytical methods, ICP-OES and MP-AES. In the experiment, soils of different types with various textures, organic matter contents and pH were used; samples of Albeluvisols, Leptosols, Cambisols, Regosols and Histosols were included. The plant-available nutrients were estimated by Mehlich 3 extraction. The ICP-OES analyses were performed at the Estonian Agricultural Research Centre and the MP-AES analyses in the Department of Soil Science and Agrochemistry at the Estonian University of Life Sciences. The detection limits and limits of quantification of Ca, K, Mg and P in the extracts are calculated and reported.
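
    Detection and quantification limits of the kind reported here are conventionally estimated from replicate blank measurements and the calibration slope (LOD = 3·s/slope, LOQ = 10·s/slope). A sketch with made-up blank readings and slope; the numbers are illustrative, not the study's:

    ```python
    import statistics

    def detection_limits(blank_signals, slope):
        """IUPAC-style estimates: LOD = 3*sd(blank)/slope, LOQ = 10*sd(blank)/slope."""
        sd = statistics.stdev(blank_signals)
        return 3 * sd / slope, 10 * sd / slope

    # Hypothetical blank intensities (counts) and calibration slope (counts per ug/L)
    lod, loq = detection_limits([10.1, 9.8, 10.3, 10.0, 9.9, 10.2], slope=50.0)
    ```

    Both limits are then reported in concentration units (here µg/L), since the blank standard deviation is divided by the calibration slope.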

  2. Systems thinking applied to safety during manual handling tasks in the transport and storage industry.

    PubMed

    Goode, Natassia; Salmon, Paul M; Lenné, Michael G; Hillard, Peter

    2014-07-01

    Injuries resulting from manual handling tasks represent an on-going problem for the transport and storage industry. This article describes an application of a systems theory-based approach, Rasmussen's (1997. Safety Science 27, 183), risk management framework, to the analysis of the factors influencing safety during manual handling activities in a freight handling organisation. Observations of manual handling activities, cognitive decision method interviews with workers (n=27) and interviews with managers (n=35) were used to gather information about three manual handling activities. Hierarchical task analysis and thematic analysis were used to identify potential risk factors and performance shaping factors across the levels of Rasmussen's framework. These different data sources were then integrated using Rasmussen's Accimap technique to provide an overall analysis of the factors influencing safety during manual handling activities in this context. The findings demonstrate how a systems theory-based approach can be applied to this domain, and suggest that policy-orientated, rather than worker-orientated, changes are required to prevent future manual handling injuries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Risk Factors of Catheter-Related Thrombosis (CRT) in Cancer Patients: A Patient-Level Data (IPD) Meta-Analysis of Clinical Trials and Prospective Studies

    PubMed Central

    Saber, W.; Moua, T.; Williams, E. C.; Verso, M.; Agnelli, G.; Couban, S.; Young, A.; De Cicco, M.; Biffi, R.; van Rooden, C. J.; Huisman, M. V.; Fagnani, D.; Cimminiello, C.; Moia, M.; Magagnoli, M.; Povoski, S. P.; Malak, S. F.; Lee, A. Y.

    2010-01-01

    Background Knowledge of independent, baseline risk factors for catheter-related thrombosis (CRT) may help select adult cancer patients at high risk to receive thromboprophylaxis. Objectives We conducted a meta-analysis of individual patient-level data to identify these baseline risk factors. Patients/Methods MEDLINE, EMBASE, CINAHL, CENTRAL, DARE and grey literature databases were searched in all languages from 1995-2008. Prospective studies and randomized controlled trials (RCTs) were eligible. Studies were included if original patient-level data were provided by the investigators and if CRT was objectively confirmed with valid imaging. Multivariate logistic regression analysis of 17 prespecified baseline characteristics was conducted. Adjusted odds ratios (ORs) and 95% confidence intervals (CIs) were estimated. Results A total sample of 5636 subjects from 5 RCTs and 7 prospective studies was included in the analysis. Among these subjects, 425 CRT events were observed. In multivariate logistic regression, the use of implanted ports, as compared with peripherally inserted central catheters (PICCs), decreased CRT risk (OR = 0.43; 95% CI, 0.23-0.80), whereas a past history of deep vein thrombosis (DVT) (OR = 2.03; 95% CI, 1.05-3.92), subclavian venipuncture insertion technique (OR = 2.16; 95% CI, 1.07-4.34), and improper catheter tip location (OR = 1.92; 95% CI, 1.22-3.02) increased CRT risk. Conclusions CRT risk is increased with the use of PICCs, a previous history of DVT, the subclavian venipuncture insertion technique and improper positioning of the catheter tip. These factors may be useful for risk-stratifying patients to select those for thromboprophylaxis. Prospective studies are needed to validate these findings. PMID:21040443

  4. Construct validation of emotional labor scale for a sample of Pakistani corporate employees.

    PubMed

    Akhter, Noreen

    2017-02-01

    To translate, adapt and validate the emotional labour scale for Pakistani corporate employees. This study was conducted in the locale of Rawalpindi and Islamabad from October 2014 to December 2015 and comprised customer-service employees of commercial banks and telecommunication companies. It comprised two independent parts. Part one had two steps: step one involved translation and adaptation of the instrument; in the second step, psychometric properties of the translated scale were established by administering it to customer-services employees from commercial banks and the telecommunication sector. Data from the pilot study were analysed using exploratory factor analysis to extract the initial factors of emotional labour. Part two comprised the main study. Commercial bank employees were included in the sample using a convenience sampling technique. SPSS 20 was used for data analysis. There were 145 participants in the first study and 495 in the second. Exploratory factor analysis initially generated a three-factor model of emotional labour, which was further confirmed by confirmatory factor analysis, suggesting that emotional labour has three distinct dimensions: surface acting, deep acting and genuine expression of emotions. The emotional labour scale was found to be a valid and reliable measure.
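
    Scale reliability of the kind validated in such studies is typically summarized with Cronbach's alpha, computed from the item variances and the variance of the summed score. A minimal numpy sketch; the item responses below are invented for illustration:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, k_items); alpha = k/(k-1) * (1 - sum(item var)/var(total))."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        sum_item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - sum_item_var / total_var)

    # Hypothetical responses: 4 respondents x 3 items on a 5-point scale
    scores = [[4, 5, 4], [2, 2, 3], [5, 4, 5], [1, 2, 1]]
    alpha = cronbach_alpha(scores)
    ```

    Alpha near 1 indicates items that vary together; in practice it would be computed per subscale (surface acting, deep acting, genuine expression) rather than over the whole instrument.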

  5. Data mining-based coefficient of influence factors optimization of test paper reliability

    NASA Astrophysics Data System (ADS)

    Xu, Peiyao; Jiang, Huiping; Wei, Jieyao

    2018-05-01

    Testing is a significant part of the teaching process. It demonstrates the final outcome of school teaching through teachers' teaching level and students' scores. The analysis of a test paper is a complex operation with non-linear relationships among the length of the paper, its time duration and its degree of difficulty. It is therefore difficult with general methods to optimize the coefficients of the influence factors under different conditions so as to obtain test papers with clearly higher reliability [1]. With data mining techniques such as Support Vector Regression (SVR) and Genetic Algorithms (GA), we can model the test paper analysis and optimize the coefficients of the influence factors for higher reliability. The test results show that the combination of SVR and GA yields an effective improvement in reliability. The optimization of the influence-factor coefficients is practical in actual application, and the whole optimizing operation can offer a model basis for test paper analysis.

  6. "Feeling unsafe": a photovoice analysis of factors influencing physical activity behavior among Malaysian adolescents.

    PubMed

    Saimon, Rosalia; Choo, Wan Yuen; Bulgiba, Awang

    2015-03-01

    Understanding the factors influencing physical activity (PA) in the Asia-Pacific region is critical, given the high prevalence of inactivity in this area. The photovoice technique explores the types of PA and factors influencing PA among adolescents in Kuching, Sarawak. A total of 160 photographs were collected from participants (adolescents, n = 22, mean age = 14.27 ± 0.7 years, and parents, n = 8, mean age = 48 ± 6.8 years). Data analysis used constant comparison methods of a grounded theory. The Analysis Grid for Environments Linked to Obesity was used to categorize PA factors. Study findings were centered on the concept of safety, facilities, parental restriction, friends, cultural traits, media, community cohesiveness, and weather. The central theme was "feeling unsafe" when being outdoors. To promote PA behavior, provision of PA facilities needs to be supported by other programs that build on peer support, crime prevention, and traffic safety, together with other educational campaigns. © 2013 APJPH.

  7. Concrete pavement mixture design and analysis (MDA) : application of a portable x-ray fluorescence technique to assess concrete mix proportions.

    DOT National Transportation Integrated Search

    2012-03-01

    Any transportation infrastructure system is inherently concerned with durability and performance issues. The proportioning and : uniformity control of concrete mixtures are critical factors that directly affect the longevity and performance of the po...

  8. Automated Student Model Improvement

    ERIC Educational Resources Information Center

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  9. Analysis of Proportional Integral and Optimized Proportional Integral Controllers for Resistance Spot Welding System (RSWS) - A Performance Perspective

    NASA Astrophysics Data System (ADS)

    Rama Subbanna, S.; Suryakalavathi, M., Dr.

    2017-08-01

    This paper presents a performance analysis of different control techniques for spike reduction applied to a medium-frequency-transformer-based DC spot welding system. Spike reduction is an important factor to consider in spot welding systems. During normal RSWS operation the welding transformer's magnetic core can become saturated due to unbalanced resistances of the two transformer secondary windings and different characteristics of the output rectifier diodes, which causes current spikes and over-current protection switch-off of the entire system. The current control technique is a piecewise linear control technique, inspired by DC-DC converter control algorithms, that provides a novel spike reduction method for MFDC spot welding applications. The two controllers used for the spike-reduction portion of the application are a traditional PI controller and an optimized PI controller. Care is taken that the current control technique maintains reduced spikes in the primary current of the transformer while it reduces the Total Harmonic Distortion (THD). The performance parameters for the spike-reduction technique are the THD and the percentage of current-spike reduction for both techniques. A Matlab/Simulink-based simulation is carried out for the MFDC RSWS with KW, and results are tabulated for the PI and optimized PI controllers, and a trade-off analysis is carried out.
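
    Total Harmonic Distortion, the performance metric used in this record, can be estimated from a sampled waveform as the ratio of the RMS of the harmonic amplitudes to the fundamental amplitude. A numpy sketch on a synthetic signal; the 50 Hz fundamental and 10% third harmonic are invented test values, not the paper's welding currents:

    ```python
    import numpy as np

    def thd(signal, fs, f0, n_harmonics=5):
        """THD = sqrt(sum of squared harmonic amplitudes) / fundamental amplitude."""
        n = len(signal)
        spectrum = np.abs(np.fft.rfft(signal)) / n
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        amp = lambda f: spectrum[np.argmin(np.abs(freqs - f))]
        harmonics = [amp(k * f0) for k in range(2, n_harmonics + 1) if k * f0 < fs / 2]
        return np.sqrt(sum(h ** 2 for h in harmonics)) / amp(f0)

    fs, f0 = 10000, 50                      # 10 kHz sampling, 50 Hz fundamental
    t = np.arange(0, 1, 1.0 / fs)           # exactly one second -> whole periods, no leakage
    x = np.sin(2 * np.pi * f0 * t) + 0.1 * np.sin(2 * np.pi * 3 * f0 * t)
    distortion = thd(x, fs, f0)             # ~0.10, the injected 10% third harmonic
    ```

    Because the record length here covers whole periods, each harmonic lands exactly on an FFT bin; with real measured currents, windowing would be needed to control spectral leakage.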

  10. Modified skin window technique for the extended characterisation of acute inflammation in humans

    PubMed Central

    Marks, D. J. B.; Radulovic, M.; McCartney, S.; Bloom, S.; Segal, A. W.

    2009-01-01

    Objective To modify the skin window technique for extended analysis of acute inflammatory responses in humans, and demonstrate its applicability for investigating disease. Subjects 15 healthy subjects and 5 Crohn’s patients. Treatment Skin windows, created by dermal abrasion, were overlaid for various durations with filter papers saturated in saline, 100 ng/ml muramyl dipeptide (MDP) or 10 μg/ml interleukin-8 (IL-8). Methods Exuded leukocytes were analyzed by microscopy, immunoblot, DNA-bound transcription factor arrays and RT-PCR. Inflammatory mediators were quantified by ELISA. Results Infiltrating leukocytes were predominantly neutrophils. Numerous secreted mediators were detectable. MDP and IL-8 enhanced responses. Many signalling proteins were phosphorylated with differential patterns in Crohn’s patients, notably PKC α/β hyperphosphorylation (11.3 ± 3.1 vs 1.2 ± 0.9 units, P < 0.02). Activities of 44 transcription factors were detectable, and sufficient RNA isolated for expression analysis of over 400 genes. Conclusions The modifications enable broad characterisation of inflammatory responses and administration of exogenous immunomodulators. PMID:17522815

  11. Structure of the Nucleon and its Excitations

    NASA Astrophysics Data System (ADS)

    Kamleh, Waseem; Leinweber, Derek; Liu, Zhan-wei; Stokes, Finn; Thomas, Anthony; Thomas, Samuel; Wu, Jia-jun

    2018-03-01

    The structure of the ground state nucleon and its finite-volume excitations are examined from three different perspectives. Using new techniques to extract the relativistic components of the nucleon wave function, the node structure of both the upper and lower components of the nucleon wave function are illustrated. A non-trivial role for gluonic components is manifest. In the second approach, the parity-expanded variational analysis (PEVA) technique is utilised to isolate states at finite momenta, enabling a novel examination of the electric and magnetic form factors of nucleon excitations. Here the magnetic form factors of low-lying odd-parity nucleons are particularly interesting. Finally, the structure of the nucleon spectrum is examined in a Hamiltonian effective field theory analysis incorporating recent lattice-QCD determinations of low-lying two-particle scattering-state energies in the finite volume. The Roper resonance of Nature is observed to originate from multi-particle coupled-channel interactions while the first radial excitation of the nucleon sits much higher at approximately 1.9 GeV.

  12. Types of provincial structure and population health.

    PubMed

    Young, Frank W; Rodriguez, Eunice

    2005-01-01

    This paper explores the potential of using large administrative units for studies of population health within a country. The objective is to illustrate a new way of defining structural dimensions and to use them in examining variation in life expectancy rates. We use data from the 50 provinces of Spain as a case study. A factor analysis of organizational items such as schools, hotels and medical personnel is employed to define and generate "collective" measures for well-known provincial types, in this case: urban, commercial, industrial and tourist provinces. The scores derived from the factor analysis are then used in a regression model to predict life expectancy. The City-centered and Commercial provinces showed positive correlations with life expectancy while those for the Tourist provinces were negative. The industrial type was nonsignificant. Explanations of these correlations are proposed and the advantages and disadvantages of this exploratory technique are reviewed. The use of this technique for generating an overview of social organization and population health is discussed.

  13. Pneumatic jigging: Influence of operating parameters on separation efficiency of solid waste materials.

    PubMed

    Abd Aziz, Mohd Aizudin; Md Isa, Khairuddin; Ab Rashid, Radzuwan

    2017-06-01

    This article aims to provide insights into the factors that contribute to the separation efficiency of solid particles. In this study, a pneumatic jigging technique was used to assess the separation of solid waste materials that consisted of copper, glass and rubber insulator. Several initial experiments were carried out to evaluate the strengths and limitations of the technique. It is found that despite some limitations of the technique, all the samples prepared for the experiments were successfully separated. The follow-up experiments were then carried out to further assess the separation of copper wire and rubber insulator. The effects of air flow and pulse rates on the separation process were examined. The data for these follow-up experiments were analysed using a sink float analysis technique. The analysis shows that the air flow rate was very important in determining the separation efficiency. However, the separation efficiency may be influenced by the type of materials used.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, L.L.; Engleman, E.E.; Peard, J.L.

    Sulfur was determined in plants and lichens by combustion of the sample and infrared detection of the evolved sulfur dioxide using an automated sulfur analyzer. Vanadium pentoxide was used as a combustion accelerator. Pelletization of the sample prior to combustion was not found to be advantageous. Washing studies showed that leaching of sulfur was not a major factor in the sample preparation. The combustion-IR analysis usually gave higher sulfur contents than the turbidimetric analysis, as well as shorter analysis times. Relative standard deviations of less than 7% were obtained by the combustion-IR technique for sulfur levels in plant material ranging from 0.05 to 0.70%. Determination of sulfur in National Bureau of Standards botanical reference materials showed good agreement between the combustion-IR technique and other instrumental procedures. Seven NBS botanical reference materials were analyzed.

  15. Successive ion layer adsorption and reaction (SILAR) technique synthesis of Al(III)-8-hydroxy-5-nitrosoquinolate nano-sized thin films: characterization and factors optimization.

    PubMed

    Haggag, Sawsan M S; Farag, A A M; Abdel Refea, M

    2013-02-01

    Nano Al(III)-8-hydroxy-5-nitrosoquinolate [Al(III)-(HNOQ)(3)] thin films were synthesized by the rapid, direct, simple and efficient successive ion layer adsorption and reaction (SILAR) technique. The factors optimizing thin-film formation were evaluated. Stoichiometry and structure were confirmed by elemental analysis and FT-IR. The particle size (27-71 nm) was determined using a scanning electron microscope (SEM). Thermal stability and thermal parameters were determined by thermal gravimetric analysis (TGA). Optical properties were investigated using spectrophotometric measurements of transmittance and reflectance at normal incidence. The refractive index, n, and absorption index, k, were determined. Spectral behavior of the absorption coefficient in the intrinsic absorption region revealed a direct allowed transition with a 2.45 eV band gap. The current-voltage (I-V) characteristics of the [Al(III)-(HNOQ)(3)]/p-Si heterojunction were measured at room temperature. The forward and reverse I-V characteristics were analyzed. The calculated zero-bias barrier height (Φ(b)) and ideality factor (n) showed strong bias dependence. The energy distribution of interface states (N(ss)) was obtained. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. Defining the questions: a research agenda for nontraditional authentication in arms control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hauck, Danielle K; Mac Arthur, Duncan W; Smith, Morag K

    Many traditional authentication techniques have been based on hardware solutions. Thus authentication of measurement system hardware has been considered in terms of physical inspection and destructive analysis. Software authentication has implied hash function analysis or authentication tools such as Rose. Continuity of knowledge is maintained through TIDs and cameras. Although there is ongoing progress improving all of these authentication methods, there has been little discussion of the human factors involved in authentication. Issues of non-traditional authentication include sleight-of-hand substitutions, monitor perception vs. reality, and visual diversions. Since monitors' confidence in a measurement system depends on the product of their confidences in each authentication element, it is important to investigate all authentication techniques, including the human factors. This paper presents an initial effort to identify the most important problems that traditional authentication approaches in safeguards have not addressed and that are especially relevant to arms control verification. This includes a survey of the literature and direct engagement with nontraditional experts in areas like psychology and human factors. Based on the identification of problem areas, potential research areas will be identified and a possible research agenda will be developed.

  17. Prediction of Return-to-original-work after an Industrial Accident Using Machine Learning and Comparison of Techniques

    PubMed Central

    2018-01-01

    Background Many studies have tried to develop predictors for return-to-work (RTW). However, since complex factors have been demonstrated to predict RTW, it is difficult to use them practically. This study investigated whether factors used in previous studies could predict whether an individual had returned to his/her original work by four years after termination of the worker's recovery period. Methods An initial logistic regression analysis of 1,567 participants of the fourth Panel Study of Worker's Compensation Insurance yielded odds ratios. The participants were divided into two subsets, a training dataset and a test dataset. Using the training dataset, logistic regression, decision tree, random forest, and support vector machine models were established, and important variables of each model were identified. The predictive abilities of the different models were compared. Results The analysis showed that only earned income and company-related factors significantly affected return-to-original-work (RTOW). The random forest model showed the best accuracy among the tested machine learning models; however, the difference was not prominent. Conclusion It is possible to predict a worker's probability of RTOW using machine learning techniques with moderate accuracy. PMID:29736160
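
    The model comparison described in this record can be reproduced in outline with scikit-learn: fit a logistic regression and a random forest on the same train/test split and compare held-out accuracy. The data below are simulated stand-ins for the panel-study predictors (names and effect sizes are invented), assuming scikit-learn is available:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    X = rng.normal(size=(1000, 5))                # 5 hypothetical predictors
    y = (X[:, 0] + 0.5 * X[:, 1]                  # outcome driven by two of them
         + rng.normal(scale=0.5, size=1000) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    models = {
        "logistic": LogisticRegression(max_iter=1000),
        "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    }
    scores = {name: accuracy_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
              for name, m in models.items()}
    ```

    As in the study, the accuracy gap between the learners on data like this tends to be modest; the more informative output is often each model's ranking of important variables.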

  18. Analysis of Weibull Grading Test for Solid Tantalum Capacitors

    NASA Technical Reports Server (NTRS)

    Teverovsky, Alexander

    2010-01-01

    The Weibull grading test is a powerful technique that allows selection and reliability rating of solid tantalum capacitors for military and space applications. However, inaccuracies in the existing method and inadequate acceleration factors can result in significant errors, up to three orders of magnitude, in the calculated failure rate of capacitors. This paper analyzes deficiencies of the existing technique and recommends a more accurate method of calculation. A physical model treating failures of tantalum capacitors as time-dependent dielectric breakdown is used to determine voltage and temperature acceleration factors and to select adequate Weibull grading test conditions. This model is verified by highly accelerated life testing (HALT) at different temperature and voltage conditions for three types of solid chip tantalum capacitors. It is shown that the parameters of the model and the acceleration factors can be calculated using a general log-linear relationship for the characteristic life with two stress levels.
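
    The log-linear relationship mentioned in the closing sentence expresses the Weibull characteristic life as ln(eta) = a0 + a1*s1 + a2*s2, with, for example, s1 = 1/T (an Arrhenius temperature term) and s2 = ln(V); the acceleration factor between use and test conditions is the ratio of the two characteristic lives. A sketch with invented coefficients; the a-values, temperatures and voltages are illustrative, not the paper's fitted parameters:

    ```python
    import math

    def char_life(a0, a1, a2, temp_k, volt):
        """Log-linear life model: ln(eta) = a0 + a1*(1/T) + a2*ln(V)."""
        return math.exp(a0 + a1 / temp_k + a2 * math.log(volt))

    a0, a1, a2 = -20.0, 12000.0, -8.0             # hypothetical fitted coefficients
    eta_use = char_life(a0, a1, a2, 358.0, 6.3)   # 85 C, rated voltage 6.3 V
    eta_test = char_life(a0, a1, a2, 398.0, 8.0)  # 125 C, 8 V overstress
    af = eta_use / eta_test                       # acceleration factor of the test condition
    ```

    Under this model the whole Weibull time axis scales by the same factor, so failure times observed at the HALT condition map back to use conditions by multiplying by af.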

  19. Influence of Topographic and Hydrographic Factors on the Spatial Distribution of Leptospirosis Disease in São Paulo County, Brazil: An Approach Using Geospatial Techniques and GIS Analysis

    NASA Astrophysics Data System (ADS)

    Ferreira, M. C.; Ferreira, M. F. M.

    2016-06-01

    Leptospirosis is a zoonosis caused by bacteria of the genus Leptospira. Rodents, especially Rattus norvegicus, are the most frequent hosts of this microorganism in cities. Human transmission occurs through contact with the urine, blood or tissues of rodents, or with water or mud contaminated by rodent urine. Spatial patterns of concentration of leptospirosis are related to multiple environmental and socioeconomic factors, such as housing near flooding areas, domestic garbage disposal sites, and high densities of people living in slums located near river channels. We used geospatial techniques and a geographical information system (GIS) to analyze the spatial relationship between the distribution of leptospirosis cases and distance from rivers, river density in the census sector, and terrain slope, in Sao Paulo County, Brazil. To test this methodology we used a sample of 183 geocoded leptospirosis cases confirmed in 2007, ASTER GDEM2 data, and hydrography and census sector shapefiles. Our results showed that GIS and geospatial analysis techniques improved the mapping of the disease and made it possible to identify the spatial pattern of association between the location of cases and the spatial distribution of the environmental variables analyzed. This study also showed that leptospirosis cases may be more closely related to census sectors located in higher river density areas and to households situated at shorter distances from rivers. On the other hand, it was not possible to assert that terrain slope contributes significantly to the location of leptospirosis cases.

  20. A tandem regression-outlier analysis of a ligand cellular system for key structural modifications around ligand binding.

    PubMed

    Lin, Ying-Ting

    2013-04-30

    A tandem technique of hard equipment is often used for the chemical analysis of a single cell: the first part separates the wanted chemicals from the bulk of the cell; the second part detects the important identities. To identify the key structural modifications around ligand binding, the present study aims to develop a cheminformatics counterpart of this tandem technique. A statistical regression and its outliers act as a computational technique for separation. A PPARγ (peroxisome proliferator-activated receptor gamma) agonist cellular system was subjected to such an investigation. Results show that this tandem regression-outlier analysis, or the prioritization of the context equations tagged with features of the outliers, is an effective cheminformatics regression technique for detecting key structural modifications, as well as their tendency to affect ligand binding. The key structural modifications around ligand binding are effectively extracted or characterized from the cellular reactions. This is because molecular binding is the paramount factor in such a ligand cellular system, and key structural modifications around ligand binding are expected to create outliers. Therefore, such outliers can be captured by this tandem regression-outlier analysis.

  1. [Sanitation of the health service centre in Warsaw (Samodzielny Zespół Publicznych Zakładów Lecznictwa Otwartego Warszawa-Mokotów). Financial and economic analysis].

    PubMed

    Buczak-Stec, Elzbieta

    2010-01-01

    The aim of the financial and economic analysis, conducted in March 2010, was to identify all significant factors that had a positive influence on the restructuring process in the health service centre (Samodzielny Zespół Publicznych Zakładów Lecznictwa Otwartego Warszawa-Mokotów) in Warsaw. Within the framework of the analysis, financial data from the period 1999-2009 were analyzed, and the managing director and financial director were interviewed. The research results indicate that the improvement of the health service centre's condition was influenced not by a single factor but by a collection of purposeful efforts. Apart from the public aid received, the most significant factors include: a rational restructuring process, management of personnel development, a professionally managed financial department, cooperation between departments, good internal communication, and the use of modern management techniques.

  2. Experiments to Evaluate and Implement Passive Tracer Gas Methods to Measure Ventilation Rates in Homes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lunden, Melissa; Faulkner, David; Heredia, Elizabeth

    2012-10-01

    This report documents experiments performed in three homes to assess the methodology used to determine air exchange rates using passive tracer techniques. The experiments used four different tracer gases emitted simultaneously but implemented with different spatial coverage in the home. Two different tracer gas sampling methods were used. The results characterize the factors of the execution and analysis of the passive tracer technique that affect the uncertainty in the calculated air exchange rates. These factors include uncertainties in tracer gas emission rates, differences in measured concentrations for different tracer gases, temporal and spatial variability of the concentrations, the comparison between different gas sampling methods, and the effect of different ventilation conditions.

  3. Discrete geometric analysis of message passing algorithm on graphs

    NASA Astrophysics Data System (ADS)

    Watanabe, Yusuke

    2010-04-01

    We often encounter probability distributions given as unnormalized products of non-negative functions. The factorization structures are represented by hypergraphs called factor graphs. Such distributions appear in various fields, including statistics, artificial intelligence, statistical physics, and error-correcting codes. Given such a distribution, computations of marginal distributions and the normalization constant are often required, but they are computationally intractable in general. One successful approximation method is the Loopy Belief Propagation (LBP) algorithm. The focus of this thesis is an analysis of the LBP algorithm. If the factor graph is a tree, i.e., has no cycles, the algorithm gives the exact quantities. If the factor graph has cycles, however, the LBP algorithm does not give exact results and can exhibit oscillatory and non-convergent behavior. The thematic question of this thesis is: how are the behaviors of the LBP algorithm affected by the discrete geometry of the factor graph? The primary contribution of this thesis is the discovery of a formula that establishes the relation between the LBP, the Bethe free energy, and the graph zeta function. This formula provides new techniques for analysis of the LBP algorithm, connecting properties of the graph with those of the LBP and the Bethe free energy. We demonstrate applications of these techniques to several problems, including the (non)convexity of the Bethe free energy and the uniqueness and stability of the LBP fixed point. We also discuss the loop series initiated by Chertkov and Chernyak. The loop series is a subgraph expansion of the normalization constant, or partition function, and reflects the graph geometry. We investigate the theoretical properties of the series and show a partial connection between the loop series and the graph zeta function.
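
    The tree-exactness of belief propagation mentioned above can be demonstrated in a few lines. The sketch below runs sum-product message passing on a 3-variable binary chain (a tree) and checks the marginal of the middle variable against brute-force enumeration; the factor tables are arbitrary illustrative numbers.

```python
from itertools import product

# Pairwise factors on the chain x1 - x2 - x3 (binary states); values are arbitrary.
f12 = {(0, 0): 1.0, (0, 1): 0.5, (1, 0): 0.5, (1, 1): 2.0}
f23 = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 0.5}

# Sum-product messages from each factor into x2 (leaf messages are uniform).
m_f12_to_x2 = {x2: sum(f12[(x1, x2)] for x1 in (0, 1)) for x2 in (0, 1)}
m_f23_to_x2 = {x2: sum(f23[(x2, x3)] for x3 in (0, 1)) for x2 in (0, 1)}
belief = {x2: m_f12_to_x2[x2] * m_f23_to_x2[x2] for x2 in (0, 1)}
z = sum(belief.values())
marginal_bp = {x2: b / z for x2, b in belief.items()}

# Brute-force marginal of x2 for comparison.
raw = {0: 0.0, 1: 0.0}
for x1, x2, x3 in product((0, 1), repeat=3):
    raw[x2] += f12[(x1, x2)] * f23[(x2, x3)]
zz = sum(raw.values())
marginal_exact = {x2: raw[x2] / zz for x2 in (0, 1)}
print(marginal_bp, marginal_exact)
```

    On a graph with cycles the same message updates would be iterated (Loopy BP) and the agreement with the exact marginal is no longer guaranteed, which is exactly the gap the thesis analyzes.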

  4. Different techniques of multispectral data analysis for vegetation fraction retrieval

    NASA Astrophysics Data System (ADS)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. In respect to farmlands, the assessment of crop condition constitutes the basis of monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, and leaf area index. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the spectral signatures of the soil-vegetation system. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus value sum, chromaticity coordinates, and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.

  5. Chromatin immunoprecipitation assays: application of ChIP-on-chip for defining dynamic transcriptional mechanisms in bone cells.

    PubMed

    van der Deen, Margaretha; Hassan, Mohammad Q; Pratap, Jitesh; Teplyuk, Nadiya M; Young, Daniel W; Javed, Amjad; Zaidi, Sayyed K; Lian, Jane B; Montecino, Martin; Stein, Janet L; Stein, Gary S; van Wijnen, Andre J

    2008-01-01

    Normal growth and differentiation of bone cells require the sequential expression of cell type-specific genes to permit lineage specification and development of cellular phenotypes. Transcriptional activation and repression of distinct sets of genes support the anabolic functions of osteoblasts and the catabolic properties of osteoclasts. Furthermore, metastasis of tumors to the bone environment is controlled by transcriptional mechanisms. Insights into the transcriptional regulation of genes in bone cells may provide a conceptual basis for improved therapeutic approaches to treat bone fractures, genetic osteopathologies, and/or cancer metastases to bone. Chromatin immunoprecipitation (ChIP) is a powerful technique to establish in vivo binding of transcription factors to the promoters of genes that are either activated or repressed in bone cells. Combining ChIP with genomic microarray analysis, colloquially referred to as "ChIP-on-chip," has become a valuable method for analysis of endogenous protein/DNA interactions. This technique permits assessment of chromosomal binding sites for transcription factors or the location of histone modifications at a genomic scale. This chapter discusses protocols for performing chromatin immunoprecipitation experiments, with a focus on ChIP-on-chip analysis. The information presented is based on the authors' experience with defining interactions of Runt-related (RUNX) transcription factors with bone-related genes within the context of the native nucleosomal organization of intact osteoblastic cells.

  6. Selecting Strategies to Reduce High-Risk Unsafe Work Behaviors Using the Safety Behavior Sampling Technique and Bayesian Network Analysis.

    PubMed

    Ghasemi, Fakhradin; Kalatpour, Omid; Moghimbeigi, Abbas; Mohammadfam, Iraj

    2017-03-04

    High-risk unsafe behaviors (HRUBs) are known as the main cause of occupational accidents. Considering the financial and societal costs of accidents and the limitations of available resources, there is an urgent need for managing unsafe behaviors at workplaces. The aim of the present study was to find strategies for decreasing the rate of HRUBs using an integrated approach of the safety behavior sampling technique and Bayesian network analysis. In this cross-sectional study, the Bayesian network was constructed using a focus group approach. The required data were collected using safety behavior sampling, and the parameters of the network were estimated using the Expectation-Maximization algorithm. Using sensitivity analysis and belief updating, it was determined which factors had the highest influence on unsafe behavior. Based on the BN analyses, safety training was the most important factor influencing employees' behavior at the workplace. High-quality safety training courses can reduce the rate of HRUBs by about 10%. Moreover, the rate of HRUBs increased with decreasing employee age, and was higher in the afternoon and on the last days of the week. Among the investigated variables, training was the most important factor affecting the safety behavior of employees. By holding high-quality safety training courses, companies would be able to reduce the rate of HRUBs significantly.

  7. Laser-induced breakdown spectroscopy application in environmental monitoring of water quality: a review.

    PubMed

    Yu, Xiaodong; Li, Yang; Gu, Xiaofeng; Bao, Jiming; Yang, Huizhong; Sun, Li

    2014-12-01

    Water quality monitoring is a critical part of environmental management and protection, and the ability to qualitatively and quantitatively determine contamination and impurity levels in water is especially important. Compared to the currently available water quality monitoring methods and techniques, laser-induced breakdown spectroscopy (LIBS) has several advantages, including no need for sample pre-preparation, fast and easy operation, and a chemical-free process. Therefore, it is of great importance to understand the fundamentals of aqueous LIBS analysis and effectively apply this technique to environmental monitoring. This article reviews the research conducted on LIBS analysis of liquid samples, covering LIBS theory, history and applications, quantitative analysis of metallic species in liquids, LIBS signal enhancement methods and data processing, characteristics of plasma generated by laser in water, and the factors affecting the accuracy of analysis results. Although many research works have focused on aqueous LIBS analysis, the detection limit and stability of this technique still need to be improved to satisfy the requirements of environmental monitoring standards. In addition, determination of nonmetallic species in liquids by LIBS is equally important and needs immediate attention from the community. This comprehensive review will assist readers in better understanding the aqueous LIBS technique and help to identify current research needs for environmental monitoring of water quality.

  8. Assessment of phytoplankton class abundance using fluorescence excitation-emission matrix by parallel factor analysis and nonnegative least squares

    NASA Astrophysics Data System (ADS)

    Su, Rongguo; Chen, Xiaona; Wu, Zhenzhen; Yao, Peng; Shi, Xiaoyong

    2015-07-01

    The feasibility of using fluorescence excitation-emission matrix (EEM) along with parallel factor analysis (PARAFAC) and the nonnegative least squares (NNLS) method for the differentiation of phytoplankton taxonomic groups was investigated. Forty-one phytoplankton species belonging to 28 genera of five divisions were studied. First, the PARAFAC model was applied to EEMs, and 15 fluorescence components were generated. Second, the 15 fluorescence components were found to have a strong discriminating capability based on Bayesian discriminant analysis (BDA). Third, all spectra of the fluorescence component compositions for the 41 phytoplankton species were spectrographically sorted into 61 reference spectra using hierarchical cluster analysis (HCA), and the reference spectra were used to establish a database. Finally, the phytoplankton taxonomic groups were differentiated using the reference spectra database and the NNLS method. The five phytoplankton groups were differentiated with correct discrimination ratios (CDRs) of 100% for single-species samples at the division level. The CDRs for the mixtures were above 91% for the dominant phytoplankton species and above 73% for the subdominant phytoplankton species. Sixteen of the 85 field samples collected from the Changjiang River estuary were analyzed by both HPLC-CHEMTAX and the fluorometric technique developed. The results of both methods reveal that Bacillariophyta was the dominant algal group in these 16 samples and that the subdominant algal groups comprised Dinophyta, Chlorophyta and Cryptophyta. The differentiation results of the fluorometric technique were in good agreement with those from HPLC-CHEMTAX. The results indicate that the fluorometric technique can differentiate algal taxonomic groups accurately at the division level.
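
    The NNLS step, expressing a measured spectrum as a nonnegative combination of reference spectra, can be sketched with projected gradient descent; production code would use a dedicated NNLS solver, and the mixing matrix and abundances below are invented illustrations, not fluorescence data.

```python
# Sketch of nonnegative least squares (NNLS) by projected gradient descent:
# find x >= 0 minimizing ||A x - b||. A plays the role of the reference-spectra
# matrix and x the (nonnegative) group abundances; all numbers are invented.
def matvec(M, v):
    return [sum(m * u for m, u in zip(row, v)) for row in M]

A = [[1.0, 0.2, 0.0],
     [0.3, 1.0, 0.1],
     [0.0, 0.4, 1.0],
     [0.1, 0.0, 0.5]]
At = [list(col) for col in zip(*A)]          # transpose of A
x_true = [0.6, 0.0, 1.2]                     # one component absent
b = matvec(A, x_true)

x = [0.0, 0.0, 0.0]
step = 0.1                                   # small enough for this A to converge
for _ in range(5000):
    r = [ai - bi for ai, bi in zip(matvec(A, x), b)]   # residual A x - b
    g = matvec(At, r)                                  # gradient A^T (A x - b)
    x = [max(0.0, xi - step * gi) for xi, gi in zip(x, g)]
print([round(xi, 4) for xi in x])            # recovers [0.6, 0.0, 1.2]
```

    The nonnegativity projection is what keeps absent components pinned at zero instead of going negative, which is the reason NNLS rather than plain least squares is used for abundance estimation.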

  9. The Incremental Multiresolution Matrix Factorization Algorithm

    PubMed Central

    Ithapu, Vamsi K.; Kondor, Risi; Johnson, Sterling C.; Singh, Vikas

    2017-01-01

    Multiresolution analysis and matrix factorization are foundational tools in computer vision. In this work, we study the interface between these two distinct topics and obtain techniques to uncover hierarchical block structure in symmetric matrices – an important aspect in the success of many vision problems. Our new algorithm, the incremental multiresolution matrix factorization, uncovers such structure one feature at a time, and hence scales well to large matrices. We describe how this multiscale analysis goes much farther than what a direct “global” factorization of the data can identify. We evaluate the efficacy of the resulting factorizations for relative leveraging within regression tasks using medical imaging data. We also use the factorization on representations learned by popular deep networks, providing evidence of their ability to infer semantic relationships even when they are not explicitly trained to do so. We show that this algorithm can be used as an exploratory tool to improve the network architecture, and within numerous other settings in vision. PMID:29416293

  10. Factors Affecting the Communication Competence in Iranian Nursing Students: A Qualitative Study

    PubMed Central

    Jouzi, Mina; Vanaki, Zohreh; Mohammadi, Easa

    2015-01-01

    Background: Communication competence in nursing students is one of the requirements of nursing education, especially during the internship period, the final stage of bachelor nursing education in Iran. Several factors can influence this competence, and identifying them could help nursing students provide safe care in the future. Objectives: This study aimed to investigate factors that influence nursing students' communication competence. Patients and Methods: A purposeful sampling technique was used to select 18 nursing students who had completed their internship. Semi-structured interviews were conducted and data were analyzed by the conventional qualitative content analysis method. Results: After data analysis, three main categories emerged: organizational factors, humanistic factors and socio-cultural factors. The main and latent theme that affected the students' communication competence was not being accepted as a caregiver in the clinical environment. Conclusions: Given that students are not accepted in health care environments, it is recommended to plan special programs for empowering students to acquire better social standing and acceptance by the health care team. PMID:26019902

  11. Long-term follow-up results of umbilical hernia repair

    PubMed Central

    Venclauskas, Linas; Zilinskas, Justas; Zviniene, Kristina; Kiudelis, Mindaugas

    2017-01-01

    Introduction Multiple suture techniques and various mesh repairs are used in open or laparoscopic umbilical hernia (UH) surgery. Aim To compare long-term follow-up results of UH repair in different hernia surgery groups and to identify risk factors for UH recurrence. Material and methods A retrospective analysis of 216 patients who underwent elective surgery for UH during a 10-year period was performed. The patients were divided into three groups according to surgery technique (suture, mesh and laparoscopic repair). Early and long-term follow-up results including hospital stay, postoperative general and wound complications, recurrence rate and postoperative patient complaints were reviewed. Risk factors for recurrence were also analyzed. Results One hundred and forty-six patients were operated on using suture repair, 52 using open mesh and 18 using laparoscopic repair technique. 77.8% of patients underwent long-term follow-up. The postoperative wound complication rate and long-term postoperative complaints were significantly higher in the open mesh repair group. The overall hernia recurrence rate was 13.1%. Only 2 (1.7%) patients with small hernias (< 2 cm) had a recurrence in the suture repair group. Logistic regression analysis showed that body mass index (BMI) > 30 kg/m2, diabetes and wound infection were independent risk factors for umbilical hernia recurrence. Conclusions The overall umbilical hernia recurrence rate was 13.1%. Body mass index > 30 kg/m2, diabetes and wound infection were independent risk factors for UH recurrence. According to our study results, laparoscopic medium and large umbilical hernia repair has slight advantages over open mesh repair concerning early postoperative complications, long-term postoperative pain and recurrence. PMID:29362649

  12. Spectroscopic ellipsometry and polarimetry for materials and systems analysis at the nanometer scale: state-of-the-art, potential, and perspectives

    PubMed Central

    Bergmair, Michael; Bruno, Giovanni; Cattelan, Denis; Cobet, Christoph; de Martino, Antonello; Fleischer, Karsten; Dohcevic-Mitrovic, Zorana; Esser, Norbert; Galliet, Melanie; Gajic, Rados; Hemzal, Dušan; Hingerl, Kurt; Humlicek, Josef; Ossikovski, Razvigor; Popovic, Zoran V.; Saxl, Ottilia

    2009-01-01

    This paper discusses the fundamentals, applications, potential, limitations, and future perspectives of polarized light reflection techniques for the characterization of materials and related systems and devices at the nanoscale. These techniques include spectroscopic ellipsometry, polarimetry, and reflectance anisotropy. We give an overview of the various ellipsometry strategies for the measurement and analysis of nanometric films, metal nanoparticles and nanowires, semiconductor nanocrystals, and submicron periodic structures. We show that ellipsometry is capable of more than the determination of thickness and optical properties, and it can be exploited to gain information about process control, geometry factors, anisotropy, defects, and quantum confinement effects of nanostructures. PMID:21170135

  13. Application of sensitivity-analysis techniques to the calculation of topological quantities

    NASA Astrophysics Data System (ADS)

    Gilchrist, Stuart

    2017-08-01

    Magnetic reconnection in the corona occurs preferentially at sites where the magnetic connectivity is either discontinuous or has a large spatial gradient. Hence there is a general interest in computing quantities (like the squashing factor) that characterize the gradient of the field-line mapping function. Here we present an algorithm for calculating certain (quasi)topological quantities using mathematical techniques from the field of "sensitivity analysis". The method is based on the calculation of a three-dimensional field-line mapping Jacobian from which all of the topological quantities of interest can be derived. We will present the algorithm and the details of a publicly available set of libraries that implement it.

  14. Prediction of light aircraft interior noise

    NASA Technical Reports Server (NTRS)

    Howlett, J. T.; Morales, D. A.

    1976-01-01

    At the present time, predictions of aircraft interior noise depend heavily on empirical correction factors derived from previous flight measurements. However, to design for acceptable interior noise levels and to optimize acoustic treatments, analytical techniques which do not depend on empirical data are needed. This paper describes a computerized interior noise prediction method for light aircraft. An existing analytical program (developed for commercial jets by Cockburn and Jolly in 1968) forms the basis of some modal analysis work which is described. The accuracy of this modal analysis technique for predicting low-frequency coupled acoustic-structural natural frequencies is discussed along with trends indicating the effects of varying parameters such as fuselage length and diameter, structural stiffness, and interior acoustic absorption.

  15. Efficient and robust analysis of complex scattering data under noise in microwave resonators.

    PubMed

    Probst, S; Song, F B; Bushev, P A; Ustinov, A V; Weides, M

    2015-02-01

    Superconducting microwave resonators are reliable circuits widely used for detection and as test devices for material research. A reliable determination of their external and internal quality factors is crucial for many modern applications, which either require fast measurements or operate in the single-photon regime with small signal-to-noise ratios. Here, we use the circle fit technique with diameter correction and provide a step-by-step guide for implementing an algorithm for robust fitting and calibration of complex resonator scattering data in the presence of noise. The speedup and robustness of the analysis are achieved by employing an algebraic rather than an iterative fit technique for the resonance circle.
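
    The algebraic trick behind circle fitting is that the circle equation x^2 + y^2 + B1*x + B2*y + B3 = 0 is linear in (B1, B2, B3), so a single linear solve replaces iterative fitting. Below is a generic Kasa-style sketch on synthetic points, not the published circle-fit routine with diameter correction.

```python
import math

# Algebraic circle fit: solve the normal equations of the linear model
# x^2 + y^2 + B1*x + B2*y + B3 = 0. Points here are synthetic, not S21 data.
pts = [(3.0, 1.0), (2.0, 2.0), (1.0, 1.0), (2.0, 0.0)]

M = [[0.0] * 3 for _ in range(3)]
v = [0.0] * 3
for x, y in pts:
    row = (x, y, 1.0)
    rhs = -(x * x + y * y)
    for i in range(3):
        v[i] += row[i] * rhs
        for j in range(3):
            M[i][j] += row[i] * row[j]

# Gaussian elimination (no pivoting; fine for this well-conditioned example).
for i in range(3):
    p = M[i][i]
    for j in range(i + 1, 3):
        f = M[j][i] / p
        M[j] = [a - f * b for a, b in zip(M[j], M[i])]
        v[j] -= f * v[i]
B = [0.0] * 3
for i in (2, 1, 0):
    B[i] = (v[i] - sum(M[i][j] * B[j] for j in range(i + 1, 3))) / M[i][i]

cx, cy = -B[0] / 2, -B[1] / 2
r = math.sqrt(cx * cx + cy * cy - B[2])
print(cx, cy, r)   # center (2, 1), radius 1
```

    Because the solve is non-iterative, it needs no starting guess and cannot fail to converge, which is the robustness property the abstract highlights.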

  16. Risk factor analysis of new brain lesions associated with carotid endarterectomy.

    PubMed

    Lee, Jae Hoon; Suh, Bo Yang

    2014-01-01

    Carotid endarterectomy (CEA) is the standard treatment for carotid artery stenosis. New brain ischemia is a major concern associated with CEA, and diffusion weighted imaging (DWI) is a good imaging modality for detecting early ischemic brain lesions. We aimed to investigate the surgical complications and identify the potential risk factors for the incidence of new brain lesions (NBL) on DWI after CEA. From January 2006 to November 2011, 94 patients who had been studied by magnetic resonance imaging including DWI within 1 week after CEA were included in this study. Data were retrospectively investigated by review of the vascular registry protocol. Seven clinical variables and three procedural variables were analyzed as risk factors for NBL after CEA. The incidence of periprocedural NBL on DWI was 27.7%. There were no fatal complications, such as ipsilateral disabling stroke, myocardial infarction or mortality. A significantly higher incidence of NBL was found in ulcer-positive patients as opposed to ulcer-negative patients (P = 0.029). The incidence of NBL after operation was significantly higher in patients treated with the conventional technique than with the eversion technique (P = 0.042). Our data show that CEA has acceptable periprocedural complication rates and that the existence of ulcerative plaque and the conventional technique of endarterectomy are high-risk factors for NBL development after CEA.

  17. Analysis of endodontist posture utilizing cinemetry, surface electromyography and ergonomic checklists.

    PubMed

    Onety, Geraldo Celso da Silva; Leonel, Daniel Vilela; Saquy, Paulo César; Silva, Gabriel Pádua da; Ferreira, Bruno; Varise, Tiago Gilioli; Sousa, Luiz Gustavo de; Verri, Edson Donizetti; Siéssere, Selma; Semprini, Marisa; Nepomuceno, Victor Rodrigues; Regalo, Simone Cecilio Hallak

    2014-01-01

    The postural risk factors for dentists include the ease of vision in the workplace, cold, vibration and mechanical pressure in tissues, incorrect posture, functional fixity, cognitive requirements and work-related organizational and psychosocial factors. The objective was to analyze the posture of endodontists at the workplace. Eighteen right-handed endodontists aged 25 to 60 years (34±3) participated in the study. Electromyography, kinemetry, ergonomic scales (RULA and Couto's checklist) and biophotogrammetry were used to analyze the posture of endodontists during root canal treatment of the maxillary right first and second molars using rotary and manual instrumentation. The variations observed in the electromyographic activities during the performance of the rotary and manual techniques suggest that the fibers of the longissimus region, anterior and medium deltoid, medium trapezium, biceps, triceps brachii, brachioradialis and short thumb abductor muscles underwent adaptations to provide more accurate functional movements. Computerized kinemetry and biophotogrammetry showed that, in terms of posture, the rotary technique was more demanding than the manual technique. In conclusion, the group of endodontists evaluated in this study exhibited posture disorders regardless of whether the rotary or manual technique was used.

  18. Evaluation of the environmental contamination at an abandoned mining site using multivariate statistical techniques--the Rodalquilar (Southern Spain) mining district.

    PubMed

    Bagur, M G; Morales, S; López-Chicano, M

    2009-11-15

    Unsupervised and supervised pattern recognition techniques, such as hierarchical cluster analysis, principal component analysis, factor analysis and linear discriminant analysis, have been applied to water samples collected in the Rodalquilar mining district (Southern Spain) in order to identify different sources of environmental pollution caused by the abandoned mining industry. The effect of mining activity on the waters was monitored by determining the concentrations of eleven elements (Mn, Ba, Co, Cu, Zn, As, Cd, Sb, Hg, Au and Pb) by inductively coupled plasma mass spectrometry (ICP-MS). The Box-Cox transformation was used to bring the data set into normal form in order to minimize the effect of the non-normal distribution of the geochemical data. The environmental impact is driven mainly by the mining activity developed in the zone, the acid drainage and, finally, the chemical treatment used for gold beneficiation.
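
    The Box-Cox step can be sketched briefly: the transform is parameterized by lambda, which is usually chosen by maximizing a profile log-likelihood under a normality assumption. The concentration values below are made up for illustration, not the Rodalquilar data.

```python
import math

# Box-Cox transform and grid-search choice of lambda by profile log-likelihood.
def boxcox(x, lam):
    return math.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def boxcox_loglik(xs, lam):
    n = len(xs)
    ys = [boxcox(x, lam) for x in xs]
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / n
    # Normality term plus the Jacobian of the transform.
    return -n / 2.0 * math.log(var) + (lam - 1.0) * sum(math.log(x) for x in xs)

concentrations = [0.5, 1.1, 2.3, 4.8, 10.2, 21.0, 44.0]   # strongly right-skewed
grid = [i / 10.0 for i in range(-20, 21)]
best = max(grid, key=lambda lam: boxcox_loglik(concentrations, lam))
print(best)
```

    lambda = 1 leaves the data essentially untransformed and lambda = 0 corresponds to a log transform, so the selected lambda indicates how much compression the skewed concentrations need.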

  19. Using the Science Writing Heuristic in the General Chemistry Laboratory to Improve Students' Academic Performance

    ERIC Educational Resources Information Center

    Poock, Jason R.; Burke, K. A.; Greenbowe, Thomas J.; Hand, Brian M.

    2007-01-01

    The analysis describes the effects of using the science writing heuristic (SWH) in the general chemistry laboratory on students' academic performance. The technique has been found to be an extremely important factor in students' learning processes and achievement in science.

  20. Aromatherapy hand massage for older adults with chronic pain living in long-term care.

    PubMed

    Cino, Kathleen

    2014-12-01

    Older adults living in long-term care experience high rates of chronic pain. Concerns with pharmacologic management have spurred alternative approaches. The purpose of this study was to examine a nursing intervention for older adults with chronic pain. This prospective, randomized controlled trial compared the effects of aromatherapy M technique hand massage, M technique without aromatherapy, and nurse presence on chronic pain. Chronic pain was measured with the Geriatric Multidimensional Pain and Illness Inventory factors, pain and suffering, life interference, and emotional distress, and the Iowa Pain Thermometer, a pain intensity scale. Three groups of 39 to 40 participants recruited from seven long-term care facilities participated twice weekly for 4 weeks. Analysis included multivariate analysis of variance and analysis of variance. Participants experienced decreased levels of chronic pain intensity. Group membership had a significant effect on the Geriatric Multidimensional Pain Inventory Pain and Suffering scores; Iowa Pain Thermometer scores differed significantly within groups. M technique hand massage with or without aromatherapy significantly decreased chronic pain intensity compared to nurse presence visits. M technique hand massage is a safe, simple, but effective intervention. Caregivers using it could improve chronic pain management in this population. © The Author(s) 2014.

  1. Analysis of psychological factors for quality assessment of interactive multimodal service

    NASA Astrophysics Data System (ADS)

    Yamagishi, Kazuhisa; Hayashi, Takanori

    2005-03-01

    We proposed a subjective quality assessment model for interactive multimodal services. First, psychological factors of an audiovisual communication service were extracted by using the semantic differential (SD) technique and factor analysis. Forty subjects participated in subjective tests and performed point-to-point conversational tasks on a PC-based TV phone that exhibits various network qualities. The subjects assessed those qualities on the basis of 25 pairs of adjectives. Two psychological factors, i.e., an aesthetic feeling and a feeling of activity, were extracted from the results. Then, quality impairment factors affecting these two psychological factors were analyzed. We found that the aesthetic feeling is mainly affected by IP packet loss and video coding bit rate, and the feeling of activity depends on delay time and video frame rate. We then proposed an opinion model derived from the relationships among quality impairment factors, psychological factors, and overall quality. The results indicated that the estimation error of the proposed model is almost equivalent to the statistical reliability of the subjective score. Finally, using the proposed model, we discuss guidelines for quality design of interactive audiovisual communication services.
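
    The factor-extraction step described here can be sketched with a toy calculation: eigendecompose the correlation matrix of rating data and retain factors by the Kaiser (eigenvalue > 1) criterion. The simulated ratings below, with a built-in two-factor structure standing in for the aesthetic and activity factors, are an illustrative assumption, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ratings: 40 subjects x 6 adjective-pair scales, constructed so
# that scales 0-2 share one latent factor and scales 3-5 share another.
aesthetic = rng.normal(size=(40, 1))
activity = rng.normal(size=(40, 1))
noise = 0.3 * rng.normal(size=(40, 6))
ratings = np.hstack([aesthetic.repeat(3, axis=1),
                     activity.repeat(3, axis=1)]) + noise

# Principal-axis style extraction: eigendecompose the correlation matrix
# and keep factors whose eigenvalue exceeds 1 (Kaiser criterion).
corr = np.corrcoef(ratings, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_factors = int(np.sum(eigvals > 1.0))
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])

print(n_factors)  # 2: the two dominant factors built into the data
```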

  2. A computational intelligent approach to multi-factor analysis of violent crime information system

    NASA Astrophysics Data System (ADS)

    Liu, Hongbo; Yang, Chao; Zhang, Meng; McLoone, Seán; Sun, Yeqing

    2017-02-01

    Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests in particular applied to the analysis of crime factors. Relationships between pairs of factors, such as the link between age and crime, have also been studied extensively. In reality, many factors interact to produce criminal behaviour, so a greater level of insight into its complex nature is needed. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic-algorithm and dynamic reduct-based techniques for reduct identification, and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). The identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interactions between several factors. As such, the results help improve our understanding of the factors contributing to violent crime and highlight the existence of hidden and intangible relationships between crime factors.
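
    The reduct idea at the core of this approach can be sketched without the fuzzy and swarm machinery: a reduct is a minimal subset of condition attributes that classifies cases exactly as well as the full attribute set. The toy decision table below is an invented illustration, not crime data.

```python
from itertools import combinations

# Toy decision table: each row is (condition-attribute values, decision).
table = [
    ((0, 1, 0), 0),
    ((1, 1, 0), 1),
    ((0, 0, 1), 0),
    ((1, 0, 1), 1),
    ((1, 1, 1), 1),
]

def consistent(attrs):
    """True if projecting cases onto attrs never maps one condition
    pattern to two different decisions."""
    seen = {}
    for cond, dec in table:
        key = tuple(cond[a] for a in attrs)
        if seen.setdefault(key, dec) != dec:
            return False
    return True

# Exhaustive search, smallest subsets first, for all minimal consistent
# subsets (the reducts); supersets of a found reduct are skipped.
n_attrs = 3
reducts = []
for size in range(1, n_attrs + 1):
    for attrs in combinations(range(n_attrs), size):
        if consistent(attrs) and not any(set(r) <= set(attrs) for r in reducts):
            reducts.append(attrs)

print(reducts)  # [(0,)]: attribute 0 alone decides the outcome here
```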

  3. Study to determine cloud motion from meteorological satellite data

    NASA Technical Reports Server (NTRS)

    Clark, B. B.

    1972-01-01

    Processing techniques were tested for deducing cloud motion vectors from overlapped portions of pairs of pictures made from meteorological satellites. This was accomplished by programming and testing techniques for estimating pattern motion by means of cross correlation analysis with emphasis placed upon identifying and reducing errors resulting from various factors. Techniques were then selected and incorporated into a cloud motion determination program which included a routine which would select and prepare sample array pairs from the preprocessed test data. The program was then subjected to limited testing with data samples selected from the Nimbus 4 THIR data provided by the 11.5 micron channel.
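
    The cross-correlation idea can be illustrated with a minimal sketch: given two image arrays where one is a displaced copy of the other, search a window of lags for the displacement that maximizes the correlation coefficient. The synthetic brightness field and the (3, 5)-pixel shift are assumptions for illustration, not satellite data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical cloud brightness field and a copy displaced by (3, 5) pixels,
# standing in for two overlapped satellite images taken minutes apart.
field = rng.normal(size=(64, 64))
shifted = np.roll(field, shift=(3, 5), axis=(0, 1))

def best_shift(a, b, max_lag=8):
    """Estimate the (row, col) displacement of b relative to a by
    maximizing the correlation coefficient over a window of lags."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_lag, max_lag + 1):
        for dx in range(-max_lag, max_lag + 1):
            c = np.roll(b, shift=(-dy, -dx), axis=(0, 1))
            score = np.corrcoef(a.ravel(), c.ravel())[0, 1]
            if score > best_score:
                best, best_score = (dy, dx), score
    return best

print(best_shift(field, shifted))  # (3, 5)
```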

  4. Concepts and techniques for ultrasonic evaluation of material mechanical properties

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic methods that can be used to evaluate material strength are reviewed. Emerging technology involving advanced ultrasonic techniques and associated measurements is described. It is shown that ultrasonic NDE is particularly useful in this area because it involves mechanical elastic waves that are strongly modulated by the morphological factors that govern mechanical strength and dynamic failure modes. These aspects of ultrasonic NDE are described in conjunction with advanced approaches and theoretical concepts for signal acquisition and analysis for materials characterization. It is emphasized that the technology is in its infancy and that much effort is still required before the techniques and concepts can be transferred from laboratory to field conditions.

  5. Epidemiological analysis of factors influencing rate of progress in Echinococcus granulosus control in New Zealand.

    PubMed Central

    Burridge, M. J.; Schwabe, C. W.

    1977-01-01

    The factors influencing the rate of progress in Echinococcus granulosus control in New Zealand were analysed by hydatid control area using stepwise multiple regression techniques. The results indicated that the rate of progress was related positively to initial E. granulosus prevalence in dogs and the efficiency with which local authorities implemented national control policy, and negatively to the Maori proportion in the local population and the number of dogs per owner. Problems in analysis of the New Zealand data are discussed and improved methods of monitoring progress in hydatid disease control programmes are described. PMID:265340
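
    Stepwise regression of the kind used here can be sketched as a greedy forward-selection loop: repeatedly add whichever candidate predictor most improves R², and stop when no candidate clears a gain threshold. The simulated predictors and coefficients below are assumptions for illustration, not the New Zealand data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical control-area data: four candidate predictors, of which only
# the first two actually drive the response (analogous to initial prevalence
# and local-authority efficiency in the abstract).
X = rng.normal(size=(80, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.2 * rng.normal(size=80)

def r_squared(Xs, y):
    # Ordinary least squares with an intercept; returns the fit's R^2.
    Xd = np.column_stack([np.ones(len(y)), Xs])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

def forward_stepwise(X, y, min_gain=0.01):
    """Greedy forward selection: add the predictor that most improves R^2;
    stop when no candidate improves it by at least min_gain."""
    selected, r2 = [], 0.0
    while len(selected) < X.shape[1]:
        gains = {j: r_squared(X[:, selected + [j]], y) - r2
                 for j in range(X.shape[1]) if j not in selected}
        j_best = max(gains, key=gains.get)
        if gains[j_best] < min_gain:
            break
        selected.append(j_best)
        r2 += gains[j_best]
    return selected

print(forward_stepwise(X, y))  # [0, 1]: the two true predictors
```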

  6. Incorporation of Precipitation Data Into FIA Analyses: A Case Study of Factors Influencing Susceptibility to Oak Decline in Southern Missouri, U.S.A.

    Treesearch

    W. Keith Moser; Greg Liknes; Mark Hansen; Kevin Nimerfro

    2005-01-01

    The Forest Inventory and Analysis program at the North Central Research Station focuses on understanding the forested ecosystems in the North Central and Northern Great Plains States through analyzing the results of annual inventories. The program also researches techniques for data collection and analysis. The FIA process measures the above-ground vegetation and the...

  7. Grade of hypospadias is the only factor predicting for re-intervention after primary hypospadias repair: a multivariate analysis from a cohort of 474 patients.

    PubMed

    Spinoit, Anne-Françoise; Poelaert, Filip; Van Praet, Charles; Groen, Luitzen-Albert; Van Laecke, Erik; Hoebeke, Piet

    2015-04-01

    There is an ongoing quest to minimize complications in hypospadias surgery. There is, however, a lack of high-quality data on the parameters that might influence the outcome of primary hypospadias repair: age at initial surgery, type of suture material, initial technique, and type of hypospadias. The objective of this study was to identify independent predictors of re-intervention in primary hypospadias repair. We retrospectively analyzed our database of 474 children undergoing primary hypospadias surgery. Univariate and multivariate logistic regression were performed to identify variables associated with re-intervention; a p-value < 0.05 was considered statistically significant. Distal penile hypospadias was reported in 77.2% (n = 366), midpenile in 11.4% (n = 54) and proximal in 11.4% (n = 54) of children. Initial repair was based on an incised plate technique in 39.9% (n = 189), meatal advancement in 36.0% (n = 171), an onlay flap in 17.3% (n = 82) and other or combined techniques in 5.3% (n = 25). Re-intervention was required in 114 patients (24.1%), of whom 54 (47.4%) underwent re-intervention within the first year post-surgery, 17 (14.9%) in the second year and 43 (37.7%) later than 2 years after initial surgery. The reason for the first re-intervention was fistula in 52 patients (46.4%), meatal stenosis in 32 (28.6%), cosmesis in 35 (31.3%) and other in 14 (12.5%). The median time to re-intervention was 14 months after surgery [range 0-114]. Significant predictors of re-intervention on univariate logistic regression (polyglactin versus poliglecaprone suture material, proximal hypospadias, lower age at operation, and repair other than meatal advancement) were entered into a multivariate logistic regression model. Of all significant variables, only proximal hypospadias remained an independent predictor of re-intervention (OR 3.27; p = 0.012).
According to our retrospective analysis, the grade of hypospadias remains the only objective independent predictor of re-intervention in hypospadias surgery, a finding that will be unsurprising to anyone who operates on hypospadias. Curiously, midpenile hypospadias cases fared slightly better than distal hypospadias in terms of re-intervention rates. Our study, however, has some shortcomings. First, data were gathered retrospectively and follow-up time was ill-balanced across several variables. We tried to correct for this with sensitivity analyses, but possible associations between some variables and re-intervention might still be obscured. Standard questionnaires for analyzing surgical outcome were not available; we therefore focused our analysis on the re-intervention rate, as this is a hard and clinically relevant end point. This retrospective analysis of a large hypospadias database with long-term follow-up indicates that the long-standing debate about factors influencing the reoperation rate in hypospadias surgery might be futile: in experienced hands, the only variable that independently predicts re-intervention is the severity of hypospadias, the one factor we cannot modify. In our series, the surgical technique did not influence the re-intervention rate. Copyright © 2015 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.
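
The univariate-versus-multivariate logic in this abstract can be sketched with simulated data: a binary "proximal" indicator truly drives re-intervention while a correlated covariate (young age at surgery) has no independent effect, and a Newton-Raphson logistic fit recovers an adjusted odds ratio. All numbers below (prevalences, coefficients, codings) are assumptions for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 474
proximal = (rng.random(n) < 0.11).astype(float)
# Covariate correlated with proximal hypospadias but with no direct effect.
young_age = np.clip(0.5 * proximal + rng.random(n), 0.0, 1.0)
true_logit = -1.5 + 1.2 * proximal        # assumed true OR ~ e^1.2 ~ 3.3
reint = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

# Newton-Raphson fit of an (unregularized) logistic regression model.
X = np.column_stack([np.ones(n), proximal, young_age])
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    grad = X.T @ (reint - p)
    hess = (X * W[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)

odds_ratio = float(np.exp(beta[1]))  # adjusted OR for proximal hypospadias
print(round(odds_ratio, 2))
```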

  8. Gas flow headspace liquid phase microextraction.

    PubMed

    Yang, Cui; Qiu, Jinxue; Ren, Chunyan; Piao, Xiangfan; Li, Xifeng; Wu, Xue; Li, Donghao

    2009-11-06

    There is a trend towards the use of enrichment techniques such as microextraction in the analysis of trace chemicals. Based on the theory of ideal gases, theory of gas chromatography and the original headspace liquid phase microextraction (HS-LPME) technique, a simple gas flow headspace liquid phase microextraction (GF-HS-LPME) technique has been developed, where the extracting gas phase volume is increased using a gas flow. The system is an open system, where an inert gas containing the target compounds flows continuously through a special gas outlet channel (D=1.8mm), and the target compounds are trapped on a solvent microdrop (2.4 microL) hanging on the microsyringe tip, as a result, a high enrichment factor is obtained. The parameters affecting the enrichment factor, such as the gas flow rate, the position of the microdrop, the diameter of the gas outlet channel, the temperatures of the extracting solvent and of the sample, and the extraction time, were systematically optimized for four types of polycyclic aromatic hydrocarbons. The results were compared with results obtained from HS-LPME. Under the optimized conditions (where the extraction time and the volume of the extracting sample vial were fixed at 20min and 10mL, respectively), detection limits (S/N=3) were approximately a factor of 4 lower than those for the original HS-LPME technique. The method was validated by comparison of the GF-HS-LPME and HS-LPME techniques using data for PAHs from environmental sediment samples.

  9. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards is done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: There exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  10. Pancreatic thickness as a predictive factor for postoperative pancreatic fistula after distal pancreatectomy using an endopath stapler.

    PubMed

    Okano, Keiichi; Oshima, Minoru; Kakinoki, Keitaro; Yamamoto, Naoki; Akamoto, Shintaro; Yachida, Shinichi; Hagiike, Masanobu; Kamada, Hideki; Masaki, Tsutomu; Suzuki, Yasuyuki

    2013-02-01

    No consistent risk factor has yet been established for the development of pancreatic fistula (PF) after distal pancreatectomy (DP) with a stapler. A total of 31 consecutive patients underwent DP with an endopath stapler between June 2006 and December 2010 using a slow parenchymal flattening technique. The risk factors for PF after DP with an endopath stapler were identified based on univariate and multivariate analyses. Clinical PF developed in 7 of 31 (22 %) patients who underwent DP with a stapler. The pancreata were significantly thicker at the transection line in patients with PF (19.4 ± 1.47 mm) in comparison to patients without PF (12.6 ± 0.79 mm; p = 0.0003). A 16-mm cut-off for pancreatic thickness was established based on the receiver operating characteristic (ROC) curve; the area under the ROC curve was 0.875 (p = 0.0215). Pancreatic thickness (p = 0.0006) and blood transfusion (p = 0.028) were associated with postoperative PF in a univariate analysis. Pancreatic thickness was the only significant independent factor (odds ratio 9.99; p = 0.036) according to a multivariate analysis with a specificity of 72 %, and a sensitivity of 85 %. Pancreatic thickness is a significant independent risk factor for PF development after DP with an endopath stapler. The stapler technique is thus considered to be an appropriate modality in patients with a pancreatic thicknesses of <16 mm.
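
    The ROC-based cutoff selection can be sketched as follows: compute the AUC via the Mann-Whitney identity and pick the cutoff maximizing the Youden index (sensitivity + specificity - 1). The thickness values below are invented, only loosely shaped like the reported group means (19.4 vs 12.6 mm), so the resulting cutoff differs from the study's 16 mm.

```python
import numpy as np

# Hypothetical thickness measurements (mm) for illustration only.
fistula = np.array([17.5, 18.0, 19.0, 19.5, 20.0, 21.0, 14.5])
no_fistula = np.array([10.0, 11.0, 11.5, 12.0, 12.5, 13.0, 13.5,
                       14.0, 14.5, 15.0, 15.5, 16.5, 12.2, 11.8,
                       13.2, 12.8, 10.5, 13.8, 12.4, 11.2, 14.2,
                       13.4, 12.6, 15.8])

values = np.concatenate([fistula, no_fistula])
labels = np.concatenate([np.ones(len(fistula)), np.zeros(len(no_fistula))])

def roc_auc_and_cutoff(values, labels):
    """Empirical ROC: AUC by the Mann-Whitney U identity, plus the
    Youden-optimal cutoff swept over all observed values."""
    pos, neg = values[labels == 1], values[labels == 0]
    auc = (np.mean(pos[:, None] > neg[None, :])
           + 0.5 * np.mean(pos[:, None] == neg[None, :]))
    best_cut, best_j = None, -1.0
    for c in np.sort(values):
        tpr = np.mean(pos >= c)   # sensitivity at this cutoff
        fpr = np.mean(neg >= c)   # 1 - specificity at this cutoff
        if tpr - fpr > best_j:
            best_j, best_cut = tpr - fpr, c
    return auc, best_cut

auc, cutoff = roc_auc_and_cutoff(values, labels)
print(round(float(auc), 3), float(cutoff))  # 0.973 17.5
```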

  11. Application of Avco data analysis and prediction techniques (ADAPT) to prediction of sunspot activity

    NASA Technical Reports Server (NTRS)

    Hunter, H. E.; Amato, R. A.

    1972-01-01

    The results of applying Avco Data Analysis and Prediction Techniques (ADAPT) to the derivation of new algorithms for predicting future sunspot activity are presented. The ADAPT-derived algorithms show a factor of 2 to 3 reduction in the expected 2-sigma errors in the estimates of the 81-day running average of the Zurich sunspot numbers. The report presents: (1) the best estimates for sunspot cycles 20 and 21, (2) a comparison of ADAPT performance with conventional techniques, and (3) specific approaches to further reducing the errors of estimated sunspot activity and to recovering earlier sunspot historical data. The ADAPT programs are used both to derive regression algorithms for predicting the entire 11-year sunspot cycle from the preceding two cycles and to derive extrapolation algorithms for extrapolating a given sunspot cycle from any available portion of the cycle.

  12. Application of small-signal modeling and measurement techniques to the stability analysis of an integrated switching-mode power system. [onboard Dynamics Explorer Satellite

    NASA Technical Reports Server (NTRS)

    Wong, R. C.; Owen, H. A., Jr.; Wilson, T. G.; Rodriguez, G. E.

    1980-01-01

    Small-signal modeling techniques are used in a system stability analysis of a breadboard version of a complete functional electrical power system. The system consists of a regulated switching dc-to-dc converter, a solar-cell-array simulator, a solar-array EMI filter, battery chargers and linear shunt regulators. Loss mechanisms in the converter power stage, including switching-time effects in the semiconductor elements, are incorporated into the modeling procedure to provide an accurate representation of the system without requiring frequency-domain measurements to determine the damping factor. The small-signal system model is validated by the use of special measurement techniques which are adapted to the poor signal-to-noise ratio encountered in switching-mode systems. The complete electrical power system with the solar-array EMI filter is shown to be stable over the intended range of operation.

  13. CRISPR/Cas9 and genome editing in Drosophila.

    PubMed

    Bassett, Andrew R; Liu, Ji-Long

    2014-01-20

    Recent advances in our ability to design DNA binding factors with specificity for desired sequences have resulted in a revolution in genetic engineering, enabling directed changes to the genome to be made relatively easily. Traditional techniques for generating genetic mutations in most organisms have relied on selection from large pools of randomly induced mutations for those of particular interest, or on time-consuming gene targeting by homologous recombination. Drosophila melanogaster has always been at the forefront of genetic analysis, and applying these new genome editing techniques to this organism will revolutionise how we analyse gene function in the future. We discuss recent techniques that apply the CRISPR/Cas9 system to Drosophila, highlight potential uses for this technology and speculate on the future of genome engineering in this model organism. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Spatial Resolution Effects of Digital Terrain Models on Landslide Susceptibility Analysis

    NASA Astrophysics Data System (ADS)

    Chang, K. T.; Dou, J.; Chang, Y.; Kuo, C. P.; Xu, K. M.; Liu, J. K.

    2016-06-01

    The purposes of this study are to identify the most relevant conditioning factors for landslide susceptibility mapping and to evaluate landslide susceptibility in the Sihjhong River catchment in southern Taiwan, integrating two techniques, namely the certainty factor (CF) and an artificial neural network (ANN). The landslide inventory data of the Central Geological Survey (CGS, MOEA) for 2004-2014 and two digital elevation model (DEM) datasets, a 5-meter LiDAR DEM and a 30-meter ASTER DEM, were prepared, and thirteen possible landslide-conditioning factors were collected. To address multicollinearity and factor redundancy, we applied the CF approach to screen these thirteen conditioning factors, on the premise that a positive CF value indicates a positive relationship between a conditioning factor and landslide occurrence. On this basis, seven conditioning factors, including slope angle, slope aspect, elevation, terrain roughness index (TRI), terrain position index (TPI), total curvature, and lithology, were selected for further analysis. The results showed that the optimized-factors model provides better accuracy for predicting landslide susceptibility in the study area, and it is therefore recommended for selecting the relevant factors of landslide occurrence.
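
    The certainty-factor screening can be sketched directly: for each class of a conditioning factor, compare the conditional landslide probability with the study-area prior. The cell counts below are invented for illustration.

```python
import numpy as np

# Hypothetical grid cells per class of one conditioning factor
# (e.g. three slope-angle bins) and landslide cells observed in each.
cells_per_class = np.array([500, 300, 200])
landslides_per_class = np.array([10, 30, 40])

ppa = landslides_per_class.sum() / cells_per_class.sum()  # prior probability
ppb = landslides_per_class / cells_per_class              # conditional prob.

# Standard CF formula: positive where a class favours landslide occurrence,
# negative where it disfavours it.
cf = np.where(
    ppb >= ppa,
    (ppb - ppa) / (ppb * (1 - ppa)),
    (ppb - ppa) / (ppa * (1 - ppb)),
)
print(np.round(cf, 3))  # [-0.765  0.217  0.652]
```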

  15. Aneurysmal subarachnoid hemorrhage prognostic decision-making algorithm using classification and regression tree analysis.

    PubMed

    Lo, Benjamin W Y; Fukuda, Hitoshi; Angle, Mark; Teitelbaum, Jeanne; Macdonald, R Loch; Farrokhyar, Forough; Thabane, Lehana; Levine, Mitchell A H

    2016-01-01

    Classification and regression tree analysis involves the creation of a decision tree by recursive partitioning of a dataset into more homogeneous subgroups. Thus far, there is scarce literature on using this technique to create clinical prediction tools for aneurysmal subarachnoid hemorrhage (SAH). The classification and regression tree analysis technique was applied to the multicenter Tirilazad database (3551 patients) in order to create the decision-making algorithm. In order to elucidate prognostic subgroups in aneurysmal SAH, neurologic, systemic, and demographic factors were taken into account. The dependent variable used for analysis was the dichotomized Glasgow Outcome Score at 3 months. Classification and regression tree analysis revealed seven prognostic subgroups. Neurological grade, occurrence of post-admission stroke, occurrence of post-admission fever, and age represented the explanatory nodes of this decision tree. Split sample validation revealed classification accuracy of 79% for the training dataset and 77% for the testing dataset. In addition, the occurrence of fever at 1-week post-aneurysmal SAH is associated with increased odds of post-admission stroke (odds ratio: 1.83, 95% confidence interval: 1.56-2.45, P < 0.01). A clinically useful classification tree was generated, which serves as a prediction tool to guide bedside prognostication and clinical treatment decision making. This prognostic decision-making algorithm also shed light on the complex interactions between a number of risk factors in determining outcome after aneurysmal SAH.
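
Recursive partitioning of the kind described here starts from a single impurity-minimizing split. A minimal sketch of that first step follows, on simulated data whose structure (a dominant "grade" effect and a secondary "age" effect) is invented for illustration, not the Tirilazad data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical cohort: good outcome mostly determined by a low grade and
# younger age, mirroring how CART exposes prognostic subgroups.
n = 400
grade = rng.integers(1, 6, size=n).astype(float)
age = rng.uniform(20, 85, size=n)
good_outcome = ((grade <= 3) & (age < 65)).astype(int)

def gini(y):
    p = np.bincount(y, minlength=2) / len(y)
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Find the (feature, threshold) pair that most reduces Gini impurity;
    CART applies this search recursively to each resulting subgroup."""
    best = (None, None, -1.0)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or not left.any():
                continue
            gain = gini(y) - (left.mean() * gini(y[left])
                              + (~left).mean() * gini(y[~left]))
            if gain > best[2]:
                best = (j, t, gain)
    return best

X = np.column_stack([grade, age])
feature, threshold, gain = best_split(X, good_outcome)
print(feature, round(float(threshold), 1))  # 0 3.0: splits on grade first
```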

  16. Strategies for Fermentation Medium Optimization: An In-Depth Review

    PubMed Central

    Singh, Vineeta; Haque, Shafiul; Niwas, Ram; Srivastava, Akansha; Pasupuleti, Mukesh; Tripathi, C. K. M.

    2017-01-01

    Optimization of the production medium is required to maximize metabolite yield. This can be achieved using a wide range of techniques, from the classical "one-factor-at-a-time" approach to modern statistical and mathematical techniques such as artificial neural networks (ANN) and genetic algorithms (GA). Every technique has its own advantages and disadvantages, and despite their drawbacks some techniques are still applied to obtain the best results; combining several optimization techniques can also yield the desired outcome. This article reviews the media optimization techniques currently applied during fermentation processes for metabolite production. A comparative analysis of the merits and demerits of conventional as well as modern optimization techniques has been carried out, and a logical basis for selecting among them when designing fermentation media is given. Overall, this review provides a rationale for selecting a suitable optimization technique for media design in fermentation processes for metabolite production. PMID:28111566
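
    As a concrete sketch of one of the modern techniques mentioned, here is a minimal evolutionary search (selection plus mutation only, no crossover) over a toy two-component medium whose yield response is an assumed quadratic surface peaking at concentrations (5.0, 2.0); all of these details are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def metabolite_yield(x):
    # Assumed quadratic response surface with its optimum at (5.0, 2.0).
    return -((x[:, 0] - 5.0) ** 2) - 2.0 * (x[:, 1] - 2.0) ** 2

pop = rng.uniform(0, 10, size=(30, 2))           # random initial recipes
for _ in range(60):
    fitness = metabolite_yield(pop)
    parents = pop[np.argsort(fitness)[-10:]]      # selection: keep best third
    children = parents[rng.integers(0, 10, 30)]   # clone selected parents
    children += rng.normal(0, 0.3, size=children.shape)  # mutation
    pop = np.clip(children, 0, 10)                # keep within feasible range

best = pop[np.argmax(metabolite_yield(pop))]
print(np.round(best, 1))  # near the optimum (5.0, 2.0)
```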

  17. Synergistic effects of Mo and F doping on the quality factor of ZnO thin films prepared by a fully automated home-made nebulizer spray technique

    NASA Astrophysics Data System (ADS)

    Ravichandran, K.; Dineshbabu, N.; Arun, T.; Manivasaham, A.; Sindhuja, E.

    2017-01-01

    Transparent conducting oxide films of undoped, Mo-doped, and Mo + F co-doped ZnO were deposited using a facile home-made nebulizer spray pyrolysis technique. The effects of Mo and F doping on the structural, optical, electrical, surface morphological and compositional properties were investigated using XRD, UV-vis-NIR spectroscopy, I-V and Hall probe techniques, FESEM and AFM, and XPS, respectively. The XRD analysis confirms that all the films are well crystallized with hexagonal wurtzite structure. All the synthesized samples exhibit high transmittance (above 85%) in the visible region. The current-voltage (I-V) characteristics show the ohmic conduction nature of the films. The Hall probe measurements show that the synergistic effects of Mo and F doping cause desirable improvements in the quality factor of the ZnO films. A minimum resistivity of 5.12 × 10-3 Ω cm with remarkably higher values of mobility and carrier concentration is achieved for Mo (2 at.%) + F (15 at.%) co-doped ZnO films. A considerable variation in the intensity of deep level emission caused by Mo and F doping is observed in the photoluminescence (PL) studies. The presence of the constituent elements in the samples is confirmed by XPS analysis.

  18. Cross-cultural examination of measurement invariance of the Beck Depression Inventory-II.

    PubMed

    Dere, Jessica; Watters, Carolyn A; Yu, Stephanie Chee-Min; Bagby, R Michael; Ryder, Andrew G; Harkness, Kate L

    2015-03-01

    Given substantial rates of major depressive disorder among college and university students, as well as the growing cultural diversity on many campuses, establishing the cross-cultural validity of relevant assessment tools is important. In the current investigation, we examined the Beck Depression Inventory-Second Edition (BDI-II; Beck, Steer, & Brown, 1996) among Chinese-heritage (n = 933) and European-heritage (n = 933) undergraduates in North America. The investigation integrated 3 distinct lines of inquiry: (a) the literature on cultural variation in depressive symptom reporting between people of Chinese and Western heritage; (b) recent developments regarding the factor structure of the BDI-II; and (c) the application of advanced statistical techniques to the issue of cross-cultural measurement invariance. A bifactor model was found to represent the optimal factor structure of the BDI-II. Multigroup confirmatory factor analysis showed that the BDI-II had strong measurement invariance across both culture and gender. In group comparisons with latent and observed variables, Chinese-heritage students scored higher than European-heritage students on cognitive symptoms of depression. This finding deviates from the commonly held view that those of Chinese heritage somatize depression. These findings hold implications for the study and use of the BDI-II, highlight the value of advanced statistical techniques such as multigroup confirmatory factor analysis, and offer methodological lessons for cross-cultural psychopathology research more broadly. 2015 APA, all rights reserved

  19. Analysis of Factors Affecting the Success of Onions Development Program in Kampar Regency

    NASA Astrophysics Data System (ADS)

    Amalia; Putri, Asgami

    2017-12-01

    The purpose of this study is to analyze the factors influencing the success of the onion development program in Kampar regency. The research used a survey method with interviews and direct observation at the study sites; structured questionnaires guided the interviews and ensured the accuracy of the data collected. Sampling locations were determined purposively based on the potential and capacity for commodity development, while respondents were selected by cluster purposive sampling to match the purpose of the study; 100 members of farmer groups were included. Logistic regression analysis was used to determine which farmer characteristics influence the success of the program. The factors considered were age (X1), education (X2), income (X3), ethnicity (X4), occupation (X5) and family responsibility (X6), giving the fitted model: Log(p/(1-p)) = -1.778 + 0.021X1 + 0.028X2 - 0.213X3 + 1.986X4 + 2.930X5 - 0.455X6. From this equation, the positively related attributes are X1 (age), X2 (education), X4 (ethnicity) and X5 (occupation), while X3 (income) and X6 (family responsibility) are negatively related. A variable with a significance value < 0.05 influences the dependent variable; on this criterion, the factors affecting the success of the red onion development program in Kampar regency were X2 (education), X4 (ethnicity), X5 (occupation), and X6 (family responsibility).
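
    The reported coefficients can be used directly to score a profile through the logistic link. The farmer values below are hypothetical, and the variable codings (years, category codes) are assumptions, since the abstract does not define them.

```python
import math

# Coefficients as reported in the abstract; variable codings are assumed.
coef = {"intercept": -1.778, "age": 0.021, "education": 0.028,
        "income": -0.213, "ethnicity": 1.986, "occupation": 2.930,
        "family_responsibility": -0.455}

# Hypothetical farmer profile, purely for illustration.
farmer = {"age": 45, "education": 9, "income": 2,
          "ethnicity": 1, "occupation": 1, "family_responsibility": 4}

logit = coef["intercept"] + sum(coef[k] * farmer[k] for k in farmer)
prob = 1 / (1 + math.exp(-logit))   # predicted probability of success
print(round(prob, 3))  # ~0.89
```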

  20. Testing sample stability using four storage methods and the macroalgae Ulva and Gracilaria

    EPA Science Inventory

    Concern over the relative importance of different sample preparation and storage techniques frequently used in stable isotope analysis of particulate nitrogen (δ15N) and carbon (δ13C) prompted an experiment to determine how important such factors were to measured values in marine...

  1. Using Symbolic-Logic Matrices To Improve Confirmatory Factor Analysis Techniques.

    ERIC Educational Resources Information Center

    Creighton, Theodore B.; Coleman, Donald G.; Adams, R. C.

    A continuing and vexing problem associated with survey instrument development is the creation of items, initially, that correlate favorably a posteriori with constructs being measured. This study tests the use of symbolic-logic matrices developed by D. G. Coleman (1979) in creating factorially "pure" statistically discrete constructs in…

  2. Coping with Drinking Pressures: Adolescent Versus Parent Perspectives.

    ERIC Educational Resources Information Center

    Brown, Sandra A.; Stetson, Barbara A.

    1988-01-01

    Fifteen techniques to limit or stop alcohol consumption were rated by 94 adolescents, aged 12 to 19, and their parents. Factor analysis of effectiveness ratings demonstrated consistency in appraisal of adult options for coping strategies, but significant differences in adolescent and parent views of how teenagers should cope with drinking…

  3. AN ALTERNATIVE METHOD FOR ESTABLISHING TEFS FOR DIOXIN-LIKE COMPOUNDS. PART 1. EVALUATION OF DECISION ANALYSIS METHODS FOR USE IN WEIGHTING RELATIVE POTENCY DATA

    EPA Science Inventory

    A number of investigators have recently examined the utility of applying probabilistic techniques in the derivation of toxic equivalency factors (TEFs) for polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs) and dioxin-like polychlorinated biphenyls (...

  4. Factors Influencing the Academic Achievement of First-Generation College Students

    ERIC Educational Resources Information Center

    Strayhorn, Terrell L.

    2006-01-01

    First-generation college students face a number of unique challenges in college. These obstacles may have a disparate effect on educational outcomes such as academic achievement. This study presents findings from an analysis of the Baccalaureate & Beyond Longitudinal Study using hierarchical multiple regression techniques to measure the influence…

  5. Assessment of Idiographic Organizational Climate

    ERIC Educational Resources Information Center

    Offenberg, Robert M.; Cernius, Vytas

    1978-01-01

    It was hypothesized that factor analysis and elements of social exchange theory could be used to integrate the different perceptions of individuals who make up an organization. An instrument was administered to the faculties of two schools. The results indicate a promising technique for organizational diagnosis. Available from: JABS Order Dept.,…

  6. Metal Pollutant Exposure and Behavior Disorders: Implications for School Practices.

    ERIC Educational Resources Information Center

    Marlowe, Mike

    1986-01-01

    The article summarizes research on relationships between low (below metal poisoning) metal exposure and childhood behavior disorders. Symptoms, assessment techniques (hair analysis), and environmental and dietary factors that may increase the risk of metal pollutant exposure are described. School programs emphasizing education and the role of…

  7. X-Ray Microanalysis and Electron Energy Loss Spectrometry in the Analytical Electron Microscope: Review and Future Directions

    NASA Technical Reports Server (NTRS)

    Goldstein, J. I.; Williams, D. B.

    1992-01-01

This paper reviews and discusses future directions in analytical electron microscopy for microchemical analysis using X-ray and Electron Energy Loss Spectroscopy (EELS). The technique of X-ray microanalysis, using the ratio method and k_AB factors, is outlined. The X-ray absorption correction is the major barrier to the objective of obtaining 1% accuracy and precision in analysis. Spatial resolution and Minimum Detectability Limits (MDL) are considered with present limitations of spatial resolution in the 2 to 3 microns range and of MDL in the 0.1 to 0.2 wt. % range when a Field Emission Gun (FEG) system is used. Future directions of X-ray analysis include improvement in X-ray spatial resolution to the 1 to 2 microns range and MDL as low as 0.01 wt. %. With these improvements the detection of single atoms in the analysis volume will be possible. Other future improvements include the use of clean room techniques for thin specimen preparation, quantification available at the 1% accuracy and precision level with light element analysis quantification available at better than the 10% accuracy and precision level, the incorporation of a compact wavelength dispersive spectrometer to improve X-ray spectral resolution, light element analysis and MDL, and instrument improvements including source stability, on-line probe current measurements, stage stability, and computerized stage control. The paper reviews the EELS technique, recognizing that it has been slow to develop and still remains firmly in research laboratories rather than in applications laboratories. Consideration of microanalysis with core-loss edges is given along with a discussion of the limitations such as specimen thickness. Spatial resolution and MDL are considered, recognizing that single atom detection is already possible. Plasmon loss analysis is discussed as well as fine structure analysis. New techniques for energy-loss imaging are also summarized.
Future directions in the EELS technique will be the development of new spectrometers and improvements in thin specimen preparation. The microanalysis technique needs to be simplified and software developed so that the EELS technique approaches the relative simplicity of the X-ray technique. Finally, one can expect major improvements in EELS imaging as data storage and processing improvements occur.
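The ratio (Cliff-Lorimer) method with k_AB factors mentioned in the abstract can be sketched for a binary thin specimen; the intensities and k-factor below are illustrative values, not data from the paper.

```python
# Sketch of the Cliff-Lorimer ratio method for thin-film X-ray microanalysis.
# Numbers here are illustrative, not from the paper.

def cliff_lorimer_binary(i_a, i_b, k_ab):
    """Return weight fractions (c_a, c_b) for a binary thin specimen.

    The ratio method states C_A / C_B = k_AB * (I_A / I_B); combined with
    the normalization C_A + C_B = 1 this gives a closed-form solution.
    """
    ratio = k_ab * i_a / i_b          # C_A / C_B
    c_a = ratio / (1.0 + ratio)
    return c_a, 1.0 - c_a

# Example: equal measured intensities with k_AB = 1.5 give C_A/C_B = 1.5.
ca, cb = cliff_lorimer_binary(1000.0, 1000.0, 1.5)
print(round(ca, 3), round(cb, 3))  # 0.6 0.4
```

In practice the measured intensities would first receive the X-ray absorption correction before forming the ratio, which is the accuracy barrier the abstract refers to.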

  8. Linking Spatial Variations in Water Quality with Water and Land Management using Multivariate Techniques.

    PubMed

    Wan, Yongshan; Qian, Yun; Migliaccio, Kati White; Li, Yuncong; Conrad, Cecilia

    2014-03-01

Most studies using multivariate techniques for pollution source evaluation are conducted in free-flowing rivers with distinct point and nonpoint sources. This study extended previous research to a managed "canal" system discharging into the Indian River Lagoon, Florida, where water and land management is the single most important anthropogenic factor influencing water quality. Hydrometric and land use data of four drainage basins were uniquely integrated into the analysis of 25 yr of monthly water quality data collected at seven stations to determine the impact of water and land management on the spatial variability of water quality. Cluster analysis (CA) classified seven monitoring stations into four groups (CA groups). All water quality parameters identified by discriminant analysis showed distinct spatial patterns among the four CA groups. Two-step principal component analysis/factor analysis (PCA/FA) was conducted with (i) water quality data alone and (ii) water quality data in conjunction with rainfall, flow, and land use data. The results indicated that PCA/FA of water quality data alone was unable to identify factors associated with management activities. The addition of hydrometric and land use data into PCA/FA revealed close associations of nutrients and color with land management and storm-water retention in pasture and citrus lands; total suspended solids, turbidity, and NO3 + NO2 with flow and Lake Okeechobee releases; specific conductivity with supplemental irrigation supply; and dissolved O2 with wetland preservation. The practical implication emphasizes the importance of basin-specific land and water management for ongoing pollutant loading reduction and ecosystem restoration programs. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
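The first PCA/FA step on water quality data alone can be sketched in plain numpy; eigendecomposition of the correlation matrix yields the explained variance and loadings that such studies interpret. The data below are synthetic, not the Indian River Lagoon measurements.

```python
import numpy as np

# Minimal PCA/FA-style sketch (numpy only): principal components of
# standardized water-quality-like data. Synthetic numbers, purely illustrative.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 5))
x[:, 1] = x[:, 0] + 0.1 * rng.normal(size=100)    # two correlated "parameters"

z = (x - x.mean(axis=0)) / x.std(axis=0)          # standardize
corr = z.T @ z / len(z)                           # correlation matrix
eigval, eigvec = np.linalg.eigh(corr)             # ascending eigenvalues
order = np.argsort(eigval)[::-1]                  # sort by explained variance
eigval, eigvec = eigval[order], eigvec[:, order]

explained = eigval / eigval.sum()
loadings = eigvec * np.sqrt(eigval)               # component loadings
print(np.round(explained, 2))
```

The second step in the paper simply repeats this with hydrometric and land use columns appended to the data matrix, which changes which variables load on which factors.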

  9. Indonesian railway accidents--utilizing Human Factors Analysis and Classification System in determining potential contributing factors.

    PubMed

    Iridiastadi, Hardianto; Ikatrinasari, Zulfa Fitri

    2012-01-01

The prevalence of Indonesian railway accidents has not been declining, with hundreds of fatalities reported in the past decade. As an effort to help the National Transportation Safety Committee (NTSC), this study was conducted to understand factors that might have contributed to the accidents. The Human Factors Analysis and Classification System (HFACS) was utilized for this purpose. A total of nine accident reports (provided by the Indonesian NTSC) involving fatalities were studied using the technique. Results of this study indicated 72 factors that were closely related to the accidents. Of these, roughly 22% were considered operator acts, while about 39% were related to preconditions for operator acts. Supervisory factors represented 14%, and the remaining (about 25%) were associated with organizational factors. It was concluded that, while train drivers indeed played an important role in the accidents, interventions solely directed toward train drivers may not be adequate. A more comprehensive approach to minimizing the accidents should be adopted, one that addresses all four levels of HFACS.

  10. Lower Education Level Is a Risk Factor for Peritonitis and Technique Failure but Not a Risk for Overall Mortality in Peritoneal Dialysis under Comprehensive Training System

    PubMed Central

    Kim, Hyo Jin; Lee, Joongyub; Park, Miseon; Kim, Yuri; Lee, Hajeong; Kim, Dong Ki; Joo, Kwon Wook; Kim, Yon Su; Cho, Eun Jin; Ahn, Curie

    2017-01-01

Background Lower education level could be a risk factor for higher peritoneal dialysis (PD)-associated peritonitis, potentially resulting in technique failure. This study evaluated the influence of lower education level on the development of peritonitis, technique failure, and overall mortality. Methods Patients over 18 years of age who started PD at Seoul National University Hospital between 2000 and 2012 with information on the academic background were enrolled. Patients were divided into three groups: middle school or lower (academic year ≤ 9, n = 102), high school (9 < academic year ≤ 12, n = 229), and higher than high school (academic year > 12, n = 324). Outcomes were analyzed using Cox proportional hazards models and competing risk regression. Results A total of 655 incident PD patients (60.9% male, age 48.4±14.1 years) were analyzed. During follow-up for 41 (interquartile range, 20–65) months, 255 patients (38.9%) experienced more than one episode of peritonitis, 138 patients (21.1%) underwent technique failure, and 78 patients (11.9%) died. After adjustment, the middle school or lower education group was an independent risk factor for peritonitis (adjusted hazard ratio [HR], 1.61; 95% confidence interval [CI], 1.10–2.36; P = 0.015) and technique failure (adjusted HR, 1.87; 95% CI, 1.10–3.18; P = 0.038), compared with the higher than high school education group. However, lower education was not associated with increased mortality either by as-treated (adjusted HR, 1.11; 95% CI, 0.53–2.33; P = 0.788) or intent-to-treat analysis (P = 0.726). Conclusions Although lower education was a significant risk factor for peritonitis and technique failure, it was not associated with increased mortality in PD patients. Comprehensive training and multidisciplinary education may overcome the lower education level in PD. PMID:28056058

  11. Lower Education Level Is a Risk Factor for Peritonitis and Technique Failure but Not a Risk for Overall Mortality in Peritoneal Dialysis under Comprehensive Training System.

    PubMed

    Kim, Hyo Jin; Lee, Joongyub; Park, Miseon; Kim, Yuri; Lee, Hajeong; Kim, Dong Ki; Joo, Kwon Wook; Kim, Yon Su; Cho, Eun Jin; Ahn, Curie; Oh, Kook-Hwan

    2017-01-01

Lower education level could be a risk factor for higher peritoneal dialysis (PD)-associated peritonitis, potentially resulting in technique failure. This study evaluated the influence of lower education level on the development of peritonitis, technique failure, and overall mortality. Patients over 18 years of age who started PD at Seoul National University Hospital between 2000 and 2012 with information on the academic background were enrolled. Patients were divided into three groups: middle school or lower (academic year ≤ 9, n = 102), high school (9 < academic year ≤ 12, n = 229), and higher than high school (academic year > 12, n = 324). Outcomes were analyzed using Cox proportional hazards models and competing risk regression. A total of 655 incident PD patients (60.9% male, age 48.4±14.1 years) were analyzed. During follow-up for 41 (interquartile range, 20-65) months, 255 patients (38.9%) experienced more than one episode of peritonitis, 138 patients (21.1%) underwent technique failure, and 78 patients (11.9%) died. After adjustment, the middle school or lower education group was an independent risk factor for peritonitis (adjusted hazard ratio [HR], 1.61; 95% confidence interval [CI], 1.10-2.36; P = 0.015) and technique failure (adjusted HR, 1.87; 95% CI, 1.10-3.18; P = 0.038), compared with the higher than high school education group. However, lower education was not associated with increased mortality either by as-treated (adjusted HR, 1.11; 95% CI, 0.53-2.33; P = 0.788) or intent-to-treat analysis (P = 0.726). Although lower education was a significant risk factor for peritonitis and technique failure, it was not associated with increased mortality in PD patients. Comprehensive training and multidisciplinary education may overcome the lower education level in PD.
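The abstract's time-to-event analysis uses Cox models and competing-risk regression; as a simpler self-contained illustration of survival estimation, here is a Kaplan-Meier estimator in numpy, with made-up times rather than patient data.

```python
import numpy as np

# The study fits Cox proportional hazards models; as a simpler, self-contained
# illustration of time-to-event analysis, here is a Kaplan-Meier estimator.
# Times (months) and event flags below are invented, not the paper's data.

def kaplan_meier(times, events):
    """Return (event times, survival estimates S(t)); events=1 means observed,
    0 means censored. Assumes distinct event times for simplicity."""
    times = np.asarray(times, float)
    events = np.asarray(events, bool)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, out_t, out_s = 1.0, [], []
    n = len(times)
    for i, (t, e) in enumerate(zip(times, events)):
        at_risk = n - i                     # subjects still under observation
        if e:
            surv *= 1.0 - 1.0 / at_risk     # product-limit update
            out_t.append(t)
            out_s.append(surv)
    return np.array(out_t), np.array(out_s)

t, s = kaplan_meier([5, 8, 12, 20, 33], [1, 1, 0, 1, 0])
print(np.round(s, 3))  # [0.8 0.6 0.3]
```

A Cox model additionally relates the hazard to covariates (such as education group here), which is what yields the adjusted hazard ratios quoted in the abstract.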

  12. Elucidating Environmental Fingerprinting Mechanisms of Unconventional Gas Development through Hydrocarbon Analysis.

    PubMed

    Piotrowski, Paulina K; Weggler, Benedikt A; Yoxtheimer, David A; Kelly, Christina N; Barth-Naftilan, Erica; Saiers, James E; Dorman, Frank L

    2018-04-17

    Hydraulic fracturing is an increasingly common technique for the extraction of natural gas entrapped in shale formations. This technique has been highly criticized due to the possibility of environmental contamination, underscoring the need for method development to identify chemical factors that could be utilized in point-source identification of environmental contamination events. Here, we utilize comprehensive two-dimensional gas chromatography (GC × GC) coupled to high-resolution time-of-flight (HRT) mass spectrometry, which offers a unique instrumental combination allowing for petroleomics hydrocarbon fingerprinting. Four flowback fluids from Marcellus shale gas wells in geographic proximity were analyzed for differentiating factors that could be exploited in environmental forensics investigations of shale gas impacts. Kendrick mass defect (KMD) plots of these flowback fluids illustrated well-to-well differences in heteroatomic substituted hydrocarbons, while GC × GC separations showed variance in cyclic hydrocarbons and polyaromatic hydrocarbons among the four wells. Additionally, generating plots that combine GC × GC separation with KMD established a novel data-rich visualization technique that further differentiated the samples.
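A Kendrick mass defect calculation of the kind used for the KMD plots above can be sketched in a few lines; the masses and the sign convention below are illustrative assumptions.

```python
# Sketch of a Kendrick mass defect (KMD) calculation on the CH2 base
# (exact mass 14.01565 Da, nominal mass 14). Masses below are illustrative.

CH2_EXACT = 14.01565

def kendrick_mass_defect(mz):
    kendrick_mass = mz * 14.0 / CH2_EXACT
    return round(kendrick_mass) - kendrick_mass   # one common sign convention

# Members of a CH2 homologous series share (nearly) the same KMD,
# which is why KMD plots line up alkylation series horizontally.
series = [200.2504, 214.2661, 228.2817]   # illustrative hydrocarbon masses
kmds = [kendrick_mass_defect(m) for m in series]
print([round(k, 4) for k in kmds])
```

Heteroatom-substituted species fall on different horizontal lines in the KMD plot, which is the well-to-well differentiator the abstract describes.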

  13. [PROGNOSTIC MODELS IN MODERN MANAGEMENT OF VULVAR CANCER].

    PubMed

    Tsvetkov, Ch; Gorchev, G; Tomov, S; Nikolova, M; Genchev, G

    2016-01-01

The aim of the research was to evaluate and analyse prognosis and prognostic factors in patients with squamous cell vulvar carcinoma after primary surgery with an individual approach applied during the course of treatment. In the period between January 2000 and July 2010, 113 patients with squamous cell carcinoma of the vulva were diagnosed and operated on at the Gynecologic Oncology Clinic of Medical University, Pleven. All the patients were monitored at the same clinic. An individual approach was applied to each patient and, whenever possible, more conservative operative techniques were applied. The probable clinicopathological characteristics influencing overall survival and recurrence-free survival were analyzed. Univariate statistical analysis and Cox regression analysis were performed to evaluate the characteristics that were statistically significant for overall survival and survival without recurrence. A multivariate logistic regression analysis (Forward Wald procedure) was applied to evaluate the combined influence of the significant factors. In the multivariate analysis, the synergic effect of the independent prognostic factors on both kinds of survival was also evaluated. Approaching each patient individually, we applied the following operative techniques: 1. Deep total radical vulvectomy with separate incisions for lymph dissection (LD) or without dissection--68 (60.18%) patients. 2. En-bloc vulvectomy with bilateral LD without vulva reconstruction--10 (8.85%). 3. Modified radical vulvectomy (hemivulvectomy, partial vulvectomy)--25 (22.02%). 4. Wide local excision--3 (2.65%). 5. Simple (total/partial) vulvectomy--5 (4.43%) patients. 6. En-bloc resection with reconstruction--2 (1.77%). After a thorough analysis of overall survival and recurrence-free survival, we concluded that relapse occurrence and FIGO clinical stage were independent prognostic factors for overall survival, and the independent prognostic factors for recurrence-free survival were: metastatic inguinal nodes (unilateral or bilateral), tumor size (above or below 3 cm), and lymphovascular space invasion. On the basis of these results we created two prognostic models: 1. A prognostic model of overall survival; 2. A prognostic model for survival without recurrence. Following the surgical staging of the disease, we were able to gather and analyse important clinicopathological indexes, which gave us the opportunity to form prognostic groups for overall survival and recurrence-free survival.

  14. Biomedical and Human Factors Requirements for a Manned Earth Orbiting Station

    NASA Technical Reports Server (NTRS)

    Helvey, W.; Martell, C.; Peters, J.; Rosenthal, G.; Benjamin, F.; Albright, G.

    1964-01-01

    The primary objective of this study is to determine which biomedical and human factors measurements must be made aboard a space station to assure adequate evaluation of the astronaut's health and performance during prolonged space flights. The study has employed, where possible, a medical and engineering systems analysis to define the pertinent life sciences and space station design parameters and their influence on a measurement program. The major areas requiring evaluation in meeting the study objectives include a definition of the space environment, man's response to the environment, selection of measurement and data management techniques, experimental program, space station design requirements, and a trade-off analysis with final recommendations. The space environment factors that are believed to have a significant effect on man were evaluated. This includes those factors characteristic of the space environment (e. g. weightlessness, radiation) as well as those created within the space station (e. g. toxic contaminants, capsule atmosphere). After establishing the general features of the environment, an appraisal was made of the anticipated response of the astronaut to each of these factors. For thoroughness, the major organ systems and functions of the body were delineated, and a determination was made of their anticipated response to each of the environmental categories. A judgment was then made on the medical significance or importance of each response, which enabled a determination of which physiological and psychological effects should be monitored. Concurrently, an extensive list of measurement techniques and methods of data management was evaluated for applicability to the space station program. The various space station configurations and design parameters were defined in terms of the biomedical and human factors requirements to provide the measurements program. 
Research designs of experimental programs for various station configurations, mission durations, and crew sizes were prepared, and, finally, a trade-off analysis of the critical variables in the station planning was completed with recommendations to enhance the confidence in the measurement program.

  15. A Study on Regional Frequency Analysis using Artificial Neural Network - the Sumjin River Basin

    NASA Astrophysics Data System (ADS)

    Jeong, C.; Ahn, J.; Ahn, H.; Heo, J. H.

    2017-12-01

Regional frequency analysis compensates for the main shortcoming of at-site frequency analysis, the limited sample size, by pooling data across a region. Regional rainfall quantiles depend on the identification of hydrologically homogeneous regions, so regional classification under the assumption of hydrological homogeneity is critical. For regional clustering of rainfall, multidimensional variables describing geographical and meteorological characteristics are considered, such as mean annual precipitation, the number of days with precipitation in a year, and the average maximum daily precipitation in a month. The Self-Organizing Feature Map (SOM), an unsupervised artificial neural network algorithm, handles N-dimensional, nonlinear problems and presents results simply as a data visualization technique. In this study, cluster analysis of the Sumjin river basin in South Korea was performed with the SOM method using high-dimensional geographical features and meteorological factors as input data. The homogeneity of the resulting regions was then evaluated with the L-moment-based discordancy and heterogeneity measures. Rainfall quantiles were estimated with the index flood method, a standard regional rainfall frequency analysis technique, and the variation in rainfall quantiles resulting from the SOM-based clustering was analyzed. This research was supported by a grant (2017-MPSS31-001) from the Supporting Technology Development Program for Disaster Management funded by the Ministry of Public Safety and Security (MPSS) of the Korean government.
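A minimal self-organizing map can be written in plain numpy to illustrate the SOM clustering step; the data are two synthetic "regions" rather than the Sumjin basin variables, and the network size and training schedule are arbitrary choices.

```python
import numpy as np

# Minimal 1-D self-organizing map (SOM) in plain numpy, standing in for the
# full SOM clustering used in the study. Data are synthetic, not the basin's
# precipitation variables.

def train_som(data, n_nodes=2, epochs=300, lr0=0.5, sigma0=1.0, seed=0):
    """Competitive learning with a decaying Gaussian neighborhood."""
    rng = np.random.default_rng(seed)
    weights = rng.normal(size=(n_nodes, data.shape[1]))
    idx = np.arange(n_nodes)
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1.0 - frac)                     # decaying learning rate
        sigma = max(sigma0 * (1.0 - frac), 0.3)     # shrinking neighborhood
        for x in data[rng.permutation(len(data))]:  # shuffle each epoch
            bmu = int(np.argmin(((weights - x) ** 2).sum(axis=1)))
            h = np.exp(-((idx - bmu) ** 2) / (2 * sigma ** 2))
            weights += lr * h[:, None] * (x - weights)
    return weights

def assign(data, weights):
    """Best-matching unit index for every sample."""
    return np.array([int(np.argmin(((weights - x) ** 2).sum(axis=1)))
                     for x in data])

rng = np.random.default_rng(1)
# Two well-separated synthetic "regions" in a 3-variable climate-like space.
data = np.vstack([rng.normal(0.0, 0.1, (20, 3)),
                  rng.normal(5.0, 0.1, (20, 3))])
weights = train_som(data)
labels = assign(data, weights)
print(labels)
```

In the study the cluster labels would then be checked with the L-moment discordancy and heterogeneity measures before regional quantile estimation.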

  16. Confirmatory factors analysis of science teacher leadership in the Thailand world-class standard schools

    NASA Astrophysics Data System (ADS)

    Thawinkarn, Dawruwan

    2018-01-01

This research aims to analyze factors of science teacher leadership in the Thailand World-Class Standard Schools. The research instrument was a five-point rating-scale questionnaire with a reliability of 0.986. The sample group included 500 science teachers from World-Class Standard Schools, selected using the stratified random sampling technique. Factor analysis of science teacher leadership in the Thailand World-Class Standard Schools was conducted using Mplus for Windows. The results are as follows: confirmatory factor analysis of science teacher leadership in the Thailand World-Class Standard Schools revealed that the model was consistent with the empirical data. The fit indices were χ2 = 105.655, df = 88, P-value = 0.086, TLI = 0.997, CFI = 0.999, RMSEA = 0.022, and SRMR = 0.019. The factor loadings of science teacher leadership were positive, with statistical significance at the 0.01 level. The loadings of the six factors ranged from 0.871 to 0.996. The highest factor loading was the professional learning community, followed by child-centered instruction, participation in development, the role model in teaching, transformational leaders, and self-development, with factor loadings of 0.996, 0.928, 0.911, 0.907, 0.901, and 0.871, respectively. The reliability of each factor was 99.1%, 86.0%, 83.0%, 82.2%, 81.0%, and 75.8%, respectively.
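The study fits a confirmatory model in Mplus; as a self-contained numpy illustration of the underlying idea, first-factor loadings can be extracted from a correlation matrix. The correlation matrix below is invented, not the questionnaire data.

```python
import numpy as np

# The paper fits a confirmatory model in Mplus; this is only a sketch of the
# related exploratory idea: loadings on a dominant factor, obtained from the
# eigendecomposition of an (invented) correlation matrix.

corr = np.array([
    [1.0, 0.8, 0.7],
    [0.8, 1.0, 0.6],
    [0.7, 0.6, 1.0],
])

eigval, eigvec = np.linalg.eigh(corr)       # ascending eigenvalues
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# Loadings of each observed variable on the first (dominant) factor.
loadings = eigvec[:, 0] * np.sqrt(eigval[0])
loadings *= np.sign(loadings.sum())         # fix the sign indeterminacy
print(np.round(loadings, 3))
```

A confirmatory analysis instead fixes which variables load on which factor in advance and tests the fit, which is where indices such as CFI, RMSEA, and SRMR come from.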

  17. Logistic regression applied to natural hazards: rare event logistic regression with replications

    NASA Astrophysics Data System (ADS)

    Guns, M.; Vanacker, V.

    2012-06-01

Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce concepts from Monte Carlo simulation into rare event logistic regression. This technique, termed rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods, and overcomes some of the limitations of previous developments through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
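The replication idea can be sketched as refitting a logistic regression on many bootstrap samples and checking the sign stability of each coefficient; this is a simplified stand-in for the authors' procedure, with synthetic data and a plain gradient-ascent logit.

```python
import numpy as np

# Sketch of the "replications" idea: refit a logistic regression on many
# bootstrap samples and keep only predictors whose coefficient sign is stable
# across replications. Plain gradient-ascent logit; data are synthetic.

def fit_logit(X, y, lr=0.1, steps=500):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)   # gradient ascent on log-likelihood
    return w

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([np.ones(n),            # intercept
                     rng.normal(size=n),    # real controlling factor
                     rng.normal(size=n)])   # irrelevant factor
# Rare outcome: intercept -2 gives a low base rate, as in landslide data.
y = (rng.random(n) < 1 / (1 + np.exp(-(-2.0 + 1.5 * X[:, 1])))).astype(float)

signs = []
for _ in range(100):                        # bootstrap replications
    idx = rng.integers(0, n, n)
    signs.append(np.sign(fit_logit(X[idx], y[idx])))
stable = np.abs(np.mean(signs, axis=0))     # 1.0 = same sign in every refit
print(np.round(stable, 2))
```

A predictor kept by chance in a single fit tends to flip sign across replications, which is the sample-dependence the abstract warns about.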

  18. The flotation and adsorption of mixed collectors on oxide and silicate minerals.

    PubMed

    Xu, Longhua; Tian, Jia; Wu, Houqin; Lu, Zhongyuan; Sun, Wei; Hu, Yuehua

    2017-12-01

The analysis of flotation and adsorption of mixed collectors on oxide and silicate minerals is of great importance for both industrial applications and theoretical research. In recent years, significant progress has been achieved in understanding the adsorption of single collectors in micelles as well as at interfaces. By contrast, the self-assembly of mixed collectors at liquid/air and solid/liquid interfaces remains a developing area as a result of the complexity of the mixed systems involved and the limited availability of suitable analytical techniques. In this work, we systematically review the processes involved in the adsorption of mixed collectors onto micelles and at interfaces by examining four specific points, namely, theoretical background, factors that affect adsorption, analytical techniques, and self-assembly of mixed surfactants at the mineral/liquid interface. In the first part, the theoretical background of collector mixtures is introduced, together with several core solution theories, which are classified according to their application in the analysis of physicochemical properties of mixed collector systems. In the second part, we discuss the factors that can influence adsorption, including factors related to the structure of collectors and environmental conditions. We summarize their influence on the adsorption of mixed systems, with the objective to provide guidance on the progress achieved in this field to date. Advances in measurement techniques can greatly promote our understanding of adsorption processes. In the third part, therefore, modern techniques such as optical reflectometry, neutron scattering, neutron reflectometry, thermogravimetric analysis, fluorescence spectroscopy, ultrafiltration, atomic force microscopy, analytical ultracentrifugation, X-ray photoelectron spectroscopy, vibrational sum frequency generation spectroscopy, and molecular dynamics simulations are introduced with respect to their applications. 
Finally, focusing on oxide and silicate minerals, we review and summarize the flotation and adsorption of the three most widely used mixed surfactant systems (anionic-cationic, anionic-nonionic, and cationic-nonionic) at the liquid/mineral interface in order to fully understand the self-assembly process. The paper closes with a brief outlook on possible developments in mixed surfactant systems. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. NASA: Model development for human factors interfacing

    NASA Technical Reports Server (NTRS)

    Smith, L. L.

    1984-01-01

    The results of an intensive literature review in the general topics of human error analysis, stress and job performance, and accident and safety analysis revealed no usable techniques or approaches for analyzing human error in ground or space operations tasks. A task review model is described and proposed to be developed in order to reduce the degree of labor intensiveness in ground and space operations tasks. An extensive number of annotated references are provided.

  20. Analysis and Validation of Contactless Time-Gated Interrogation Technique for Quartz Resonator Sensors

    PubMed Central

    Baù, Marco; Ferrari, Marco; Ferrari, Vittorio

    2017-01-01

    A technique for contactless electromagnetic interrogation of AT-cut quartz piezoelectric resonator sensors is proposed based on a primary coil electromagnetically air-coupled to a secondary coil connected to the electrodes of the resonator. The interrogation technique periodically switches between interleaved excitation and detection phases. During the excitation phase, the resonator is set into vibration by a driving voltage applied to the primary coil, whereas in the detection phase, the excitation signal is turned off and the transient decaying response of the resonator is sensed without contact by measuring the voltage induced back across the primary coil. This approach ensures that the readout frequency of the sensor signal is to a first order approximation independent of the interrogation distance between the primary and secondary coils. A detailed theoretical analysis of the interrogation principle based on a lumped-element equivalent circuit is presented. The analysis has been experimentally validated on a 4.432 MHz AT-cut quartz crystal resonator, demonstrating the accurate readout of the series resonant frequency and quality factor over an interrogation distance of up to 2 cm. As an application, the technique has been applied to the measurement of liquid microdroplets deposited on a 4.8 MHz AT-cut quartz crystal. More generally, the proposed technique can be exploited for the measurement of any physical or chemical quantities affecting the resonant response of quartz resonator sensors. PMID:28574459
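The time-gated readout described above amounts to estimating the frequency and decay time of a transient; a numpy sketch with illustrative parameters (not the 4.432 MHz device in the paper) shows how both the resonant frequency and the quality factor can be recovered, using Q = π·f0·τ.

```python
import numpy as np

# Sketch of the time-gated readout idea: after excitation stops, the sensed
# voltage is a decaying sinusoid; its frequency gives the series resonance and
# its decay time gives the quality factor (Q = pi * f0 * tau). The parameters
# below are illustrative, not the 4.432 MHz device in the paper.

f0, q_true, fs = 100_000.0, 5_000.0, 10_000_000.0   # Hz, -, samples/s
tau = q_true / (np.pi * f0)                          # amplitude decay constant
t = np.arange(int(0.05 * fs)) / fs                   # 50 ms of transient
signal = np.exp(-t / tau) * np.sin(2 * np.pi * f0 * t)

# Resonant frequency from the FFT peak of the transient.
spectrum = np.abs(np.fft.rfft(signal))
f_est = np.fft.rfftfreq(len(signal), 1 / fs)[np.argmax(spectrum)]

# Decay constant from a log-linear fit to a coarse peak envelope.
peaks = np.abs(signal).reshape(-1, 100).max(axis=1)
t_peaks = t.reshape(-1, 100).mean(axis=1)
slope = np.polyfit(t_peaks, np.log(peaks + 1e-30), 1)[0]   # slope = -1/tau
q_est = -np.pi * f_est / slope

print(round(f_est), round(q_est))
```

Because both quantities are read from the free decay rather than from a driven amplitude, the estimates are, to first order, independent of the coupling (interrogation distance), which is the key property the abstract claims.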

  1. Analysis and Validation of Contactless Time-Gated Interrogation Technique for Quartz Resonator Sensors.

    PubMed

    Baù, Marco; Ferrari, Marco; Ferrari, Vittorio

    2017-06-02

    A technique for contactless electromagnetic interrogation of AT-cut quartz piezoelectric resonator sensors is proposed based on a primary coil electromagnetically air-coupled to a secondary coil connected to the electrodes of the resonator. The interrogation technique periodically switches between interleaved excitation and detection phases. During the excitation phase, the resonator is set into vibration by a driving voltage applied to the primary coil, whereas in the detection phase, the excitation signal is turned off and the transient decaying response of the resonator is sensed without contact by measuring the voltage induced back across the primary coil. This approach ensures that the readout frequency of the sensor signal is to a first order approximation independent of the interrogation distance between the primary and secondary coils. A detailed theoretical analysis of the interrogation principle based on a lumped-element equivalent circuit is presented. The analysis has been experimentally validated on a 4.432 MHz AT-cut quartz crystal resonator, demonstrating the accurate readout of the series resonant frequency and quality factor over an interrogation distance of up to 2 cm. As an application, the technique has been applied to the measurement of liquid microdroplets deposited on a 4.8 MHz AT-cut quartz crystal. More generally, the proposed technique can be exploited for the measurement of any physical or chemical quantities affecting the resonant response of quartz resonator sensors.

  2. Vibroacoustic optimization using a statistical energy analysis model

    NASA Astrophysics Data System (ADS)

Culla, Antonio; D'Ambrogio, Walter; Fregolent, Annalisa; Milana, Silvia

    2016-08-01

    In this paper, an optimization technique for medium-high frequency dynamic problems based on Statistical Energy Analysis (SEA) method is presented. Using a SEA model, the subsystem energies are controlled by internal loss factors (ILF) and coupling loss factors (CLF), which in turn depend on the physical parameters of the subsystems. A preliminary sensitivity analysis of subsystem energy to CLF's is performed to select CLF's that are most effective on subsystem energies. Since the injected power depends not only on the external loads but on the physical parameters of the subsystems as well, it must be taken into account under certain conditions. This is accomplished in the optimization procedure, where approximate relationships between CLF's, injected power and physical parameters are derived. The approach is applied on a typical aeronautical structure: the cabin of a helicopter.
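The SEA power balance that the optimization rests on can be sketched for two subsystems: injected powers relate to subsystem energies through the internal and coupling loss factors, so the energies follow from a linear solve. The loss factor values below are illustrative, not the helicopter-cabin model.

```python
import numpy as np

# Two-subsystem SEA power balance sketch: injected powers P relate to
# subsystem energies E through the internal (ILF) and coupling (CLF) loss
# factors. All numerical values are illustrative.

omega = 2 * np.pi * 1000.0          # band centre frequency, rad/s
eta1, eta2 = 0.01, 0.02             # internal loss factors (ILF)
eta12, eta21 = 0.003, 0.005         # coupling loss factors (CLF)
P = np.array([1.0, 0.0])            # power injected into subsystem 1 only

L = omega * np.array([
    [eta1 + eta12, -eta21],
    [-eta12,       eta2 + eta21],
])
E = np.linalg.solve(L, P)           # steady-state subsystem energies
print(np.round(E, 5))

# Finite-difference sensitivity of E to one CLF (eta12), mirroring the
# preliminary sensitivity analysis described in the abstract.
dL = omega * np.array([[1.0, 0.0], [-1.0, 0.0]]) * 1e-5
E_pert = np.linalg.solve(L + dL, P)
print(np.round((E_pert - E) / 1e-5, 3))
```

The optimization then treats the most influential CLFs (and, through them, the physical parameters) as design variables while keeping the injected power's parameter dependence in the model.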

  3. Content analysis of 100 consecutive media reports of amusement ride accidents.

    PubMed

    Woodcock, Kathryn

    2008-01-01

Accident investigations influence public perceptions and safety management strategies by determining the amount and type of information learned about the accident. To examine the factors considered in investigations, this study used a content analysis of 100 consecutive media reports of amusement ride accidents from an online media archive. Fatalities were overrepresented in the media dataset compared with U.S. national estimates. For analysis of reports, a modified "Haddon matrix" was developed using human-factors categories. This approach was useful to show differences between the proportions and types of factors considered in the different accident stages and between employee and rider accidents. Employee injury accounts primarily referred to the employee's task and to the employee. Rider injury reports were primarily related to the ride device itself and rarely referred to the rider's "task", social influences, or the rider's own actions, with only occasional reference to rider characteristics. Qualitatively, it was evident that more human factors analysis is required to augment scant pre-failure information about the task, social environment, and the person, to make that information available for prevention of amusement ride accidents. By design, this study reflected information reported by the media. Future work will use the same techniques with official reports.
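The Haddon-matrix bookkeeping can be sketched as a cross-tabulation of coded (accident stage, human-factors category) pairs; the coded reports below are invented examples, not the study's data.

```python
from collections import Counter

# Sketch of the content-analysis bookkeeping: tally coded factors into a
# Haddon-style matrix of (accident stage, human-factors category).
# The coded items below are invented examples, not the study's data.

codes = [
    ("pre-event", "task"), ("pre-event", "person"), ("event", "device"),
    ("event", "device"), ("post-event", "environment"), ("pre-event", "task"),
]
matrix = Counter(codes)

stages = ["pre-event", "event", "post-event"]
categories = ["person", "task", "device", "environment"]
for stage in stages:
    row = {c: matrix[(stage, c)] for c in categories}
    print(stage, row)
```

Comparing row and column totals between employee and rider reports is what reveals the imbalance the abstract describes (device-centred rider reports, task-centred employee reports).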

  4. Meteor tracking via local pattern clustering in spatio-temporal domain

    NASA Astrophysics Data System (ADS)

Kukal, Jaromír; Klimt, Martin; Švihlík, Jan; Fliegel, Karel

    2016-09-01

    Reliable meteor detection is one of the crucial disciplines in astronomy. A variety of imaging systems is used for meteor path reconstruction. The traditional approach is based on analysis of 2D image sequences obtained from a double-station video observation system. Precise localization of a meteor path is difficult due to atmospheric turbulence and other factors causing spatio-temporal fluctuations of the image background. The proposed technique performs non-linear preprocessing of image intensity using the Box-Cox transform, as recommended in our previous work. Both symmetric and asymmetric spatio-temporal differences are designed to be robust in the statistical sense. The resulting local patterns are processed by a data whitening technique, and the obtained vectors are classified via cluster analysis and a Self-Organizing Map (SOM).
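    The preprocessing chain the abstract describes, a Box-Cox intensity transform followed by whitening of the local pattern vectors, can be sketched as follows. The synthetic pattern vectors and the choice of λ = 0.5 are illustrative assumptions, not values from the paper:

```python
import numpy as np

def box_cox(x, lam):
    """Box-Cox transform; stabilizes the variance of positive intensities."""
    x = np.asarray(x, dtype=float)
    if lam == 0:
        return np.log(x)
    return (x ** lam - 1.0) / lam

def whiten(vectors):
    """Decorrelate feature vectors: zero mean, identity covariance (ZCA whitening)."""
    X = np.asarray(vectors, dtype=float)
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    W = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + 1e-12)) @ eigvecs.T
    return Xc @ W

# Illustrative local-pattern vectors (e.g. spatio-temporal intensity differences)
rng = np.random.default_rng(0)
patterns = rng.lognormal(mean=1.0, sigma=0.5, size=(200, 4))
transformed = box_cox(patterns, lam=0.5)
white = whiten(transformed)
print(white.shape)
```

    The whitened vectors would then feed a clustering stage (e.g. k-means or a SOM); whitening matters there because distance-based clustering is distorted by correlated, unequally scaled features.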

  5. Exploring the relationships between free-time management and boredom in leisure.

    PubMed

    Wang, Wei-Ching; Wu, Chung-Chi; Wu, Chang-Yang; Huan, Tzung-Cheng

    2012-04-01

    The purpose of the study was to examine the relations of five dimensions of free-time management (goal setting and evaluating, technique, values, immediate response, and scheduling) with leisure boredom, and whether these factors could predict leisure boredom. A total of 500 undergraduates from a university in southern Taiwan were surveyed, and 403 usable questionnaires were returned. Pearson correlation analysis revealed that all five dimensions of free-time management had significant negative relationships with leisure boredom. Furthermore, the results of stepwise regression analysis revealed that four dimensions of free-time management were significant contributors to leisure boredom. Finally, we suggest that students can avoid boredom by properly planning and organizing leisure time and applying techniques for managing leisure time.
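    A Pearson correlation of the kind reported here is straightforward to compute. The scores below are synthetic, and the negative slope is assumed only to mirror the direction of the reported relationship:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 403  # number of usable questionnaires in the study

# Hypothetical scores: better free-time management -> lower leisure boredom
management = rng.normal(3.5, 0.6, n)
boredom = 4.0 - 0.5 * management + rng.normal(0, 0.4, n)

# Pearson correlation coefficient between the two score vectors
r = np.corrcoef(management, boredom)[0, 1]
print(round(r, 2))
```

    A stepwise regression, as in the study, would then add or drop the five dimension scores one at a time based on their incremental contribution to predicting boredom.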

  6. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell.
We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch Dr Liliana Teodorescu Brunel University ACATgroup The PDF also contains details of the workshop's committees and sponsors.

  7. Approach for gait analysis in persons with limb loss including residuum and prosthesis socket dynamics.

    PubMed

    LaPrè, A K; Price, M A; Wedge, R D; Umberger, B R; Sup, Frank C

    2018-04-01

    Musculoskeletal modeling and marker-based motion capture techniques are commonly used to quantify the motions of body segments, and the forces acting on them during human gait. However, when these techniques are applied to analyze the gait of people with lower limb loss, the clinically relevant interaction between the residual limb and prosthesis socket is typically overlooked. It is known that there is considerable motion and loading at the residuum-socket interface, yet traditional gait analysis techniques do not account for these factors due to the inability to place tracking markers on the residual limb inside of the socket. In the present work, we used a global optimization technique and anatomical constraints to estimate the motion and loading at the residuum-socket interface as part of standard gait analysis procedures. We systematically evaluated a range of parameters related to the residuum-socket interface, such as the number of degrees of freedom, and determined the configuration that yields the best compromise between faithfully tracking experimental marker positions while yielding anatomically realistic residuum-socket kinematics and loads that agree with data from the literature. Application of the present model to gait analysis for people with lower limb loss will deepen our understanding of the biomechanics of walking with a prosthesis, which should facilitate the development of enhanced rehabilitation protocols and improved assistive devices. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Seal Analysis for the Ares-I Upper Stage Fuel Tank Manhole Cover

    NASA Technical Reports Server (NTRS)

    Phillips, Dawn R.; Wingate, Robert J.

    2010-01-01

    Techniques for studying the performance of Naflex pressure-assisted seals in the Ares-I Upper Stage liquid hydrogen tank manhole cover seal joint are explored. To assess the feasibility of using the identical seal design for the Upper Stage as was used for the Space Shuttle External Tank manhole covers, a preliminary seal deflection analysis using the ABAQUS commercial finite element software is employed. The ABAQUS analyses are performed using three-dimensional symmetric wedge finite element models. This analysis technique is validated by first modeling a heritage External Tank liquid hydrogen tank manhole cover joint and correlating the results to heritage test data. Once the technique is validated, the Upper Stage configuration is modeled. The Upper Stage analyses are performed at 1.4 times the expected pressure to comply with the Constellation Program factor of safety requirement on joint separation. Results from the analyses performed with the External Tank and Upper Stage models demonstrate the effects of several modeling assumptions on the seal deflection. The analyses for Upper Stage show that the integrity of the seal is successfully maintained.

  9. Use of Several Thermal Analysis Techniques on a Hypalon Paint Coating for the Solid Rocket Booster (SRB) of the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Wingard, Charles D.; Whitaker, Ann F. (Technical Monitor)

    2000-01-01

    White Hypalon paint is brush-applied as a moisture barrier coating over cork surfaces on each of the two Space Shuttle SRBs. Fine cracks have been observed in the Hypalon coating three times historically on laboratory witness panels, but never on flight hardware. Samples of the cracked and standard ("good") Hypalon were removed from witness panel cork surfaces and were tested in 1998 by Thermogravimetric Analysis (TGA), Thermomechanical Analysis (TMA) and Differential Scanning Calorimetry (DSC) thermal analysis techniques. The TGA data showed that at 700 °C, where only paint pigment solids remain, the cracked material had about 9 weight percent more material remaining than the standard material, probably indicating incomplete mixing of the paint before it was brush-applied to produce the cracked material. The TMA film/fiber technique showed that the average modulus (stiffness) vs. temperature was about 3 to 6 times higher for the cracked material than for the standard material. The TMA data also showed that an increase in coating thickness for the cracked Hypalon was not a factor in the anomaly.

  10. Physics Structure Analysis of Parallel Waves Concept of Physics Teacher Candidate

    NASA Astrophysics Data System (ADS)

    Sarwi, S.; Supardi, K. I.; Linuwih, S.

    2017-04-01

    The aim of this research was to identify the parallel structure of wave physics concepts and the factors that influence the formation of parallel conceptions among physics teacher candidates. The method used was qualitative research of the cross-sectional design type. The subjects were five third-semester students of a basic physics course and six fifth-semester students of a wave course. Data were collected through think-aloud protocols and written tests. Quantitative data were analysed with a descriptive percentage technique, while belief and awareness of answers were examined with an explanatory analysis. Results of the research include: 1) the structure of the concept can be displayed through the illustration of a map containing the theoretical core, supplements to the theory, and everyday phenomena; 2) a trend toward parallel conceptions of wave physics was identified for stationary waves, resonance of sound, and the propagation of transverse electromagnetic waves; 3) the parallel conceptions are influenced by less comprehensive reading of textbooks and by partial understanding in forming the structure of the theory.

  11. A framework for graph-based synthesis, analysis, and visualization of HPC cluster job data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, Jackson R.; Kegelmeyer, W. Philip, Jr.; Wong, Matthew H.

    The monitoring and system analysis of high performance computing (HPC) clusters is of increasing importance to the HPC community. Analysis of HPC job data can be used to characterize system usage and diagnose and examine failure modes and their effects. This analysis is not straightforward, however, due to the complex relationships that exist between jobs. These relationships are based on a number of factors, including shared compute nodes between jobs, proximity of jobs in time, etc. Graph-based techniques represent an approach that is particularly well suited to this problem, and provide an effective technique for discovering important relationships in job queuing and execution data. The efficacy of these techniques is rooted in the use of a semantic graph as a knowledge representation tool. In a semantic graph, job data, represented in a combination of numerical and textual forms, can be flexibly processed into edges, with corresponding weights, expressing relationships between jobs, nodes, users, and other relevant entities. This graph-based representation permits formal manipulation by a number of analysis algorithms. This report presents a methodology and software implementation that leverages semantic graph-based techniques for the system-level monitoring and analysis of HPC clusters based on job queuing and execution data. Ontology development and graph synthesis is discussed with respect to the domain of HPC job data. The framework developed automates the synthesis of graphs from a database of job information. It also provides a front end, enabling visualization of the synthesized graphs. Additionally, an analysis engine is incorporated that provides performance analysis, graph-based clustering, and failure prediction capabilities for HPC systems.
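    The graph synthesis step described above, deriving weighted job-to-job edges from shared compute nodes, can be sketched with plain dictionaries. The job records are hypothetical:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical job records: job id -> set of compute nodes it ran on
jobs = {
    "job1": {"n01", "n02"},
    "job2": {"n02", "n03"},
    "job3": {"n07"},
    "job4": {"n01", "n03"},
}

# Edge weight between two jobs = number of compute nodes they shared
edges = defaultdict(int)
for a, b in combinations(sorted(jobs), 2):
    shared = len(jobs[a] & jobs[b])
    if shared:
        edges[(a, b)] = shared

print(dict(edges))
```

    In the actual framework, edges would also encode temporal proximity, shared users, and other ontology-defined relationships, with weights combined across relationship types.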

  12. Supplemental Conceptual Design Study of an Integrated Voice/Data Switching and Multiplexing Technique for an Access Area Exchange

    DTIC Science & Technology

    1976-11-11

    exchange. The basis for this choice was derived from several factors. One was a timing analysis that was made for certain basic time-critical software... candidate system designs were developed and examined with respect to their capability to demonstrate the workability of the basic concept and for factors... algorithm requires a bit time completion, while SOF production allows byte timing and the involved SOF correlation procedure may be performed during

  13. Type A Accident Investigation Board report on the January 17, 1996, electrical accident with injury in Technical Area 21 Tritium Science and Fabrication Facility Los Alamos National Laboratory. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-04-01

    An electrical accident was investigated in which a crafts person received serious injuries as a result of coming into contact with a 13.2 kilovolt (kV) electrical cable in the basement of Building 209 in Technical Area 21 (TA-21-209) in the Tritium Science and Fabrication Facility (TSFF) at Los Alamos National Laboratory (LANL). In conducting its investigation, the Accident Investigation Board used various analytical techniques, including events and causal factor analysis, barrier analysis, change analysis, fault tree analysis, materials analysis, and root cause analysis. The board inspected the accident site, reviewed events surrounding the accident, conducted extensive interviews and document reviews, and performed causation analyses to determine the factors that contributed to the accident, including any management system deficiencies. Relevant management systems and factors that could have contributed to the accident were evaluated in accordance with the guiding principles of safety management identified by the Secretary of Energy in an October 1994 letter to the Defense Nuclear Facilities Safety Board and subsequently to Congress.

  14. Informatics for Metabolomics.

    PubMed

    Kusonmano, Kanthida; Vongsangnak, Wanwipa; Chumnanpuen, Pramote

    2016-01-01

    Metabolome profiling of biological systems has the powerful ability to provide biological understanding of their metabolic functional states in response to environmental factors or other perturbations. Large amounts of metabolomics data have thus accumulated since the pre-metabolomics era, driven directly by high-throughput analytical techniques, especially mass spectrometry (MS)- and nuclear magnetic resonance (NMR)-based techniques. A significant number of informatics techniques for data processing, statistical analysis, and data mining have been developed in parallel. Tools and databases have been advanced for the metabolomics community that provide useful metabolomics information, e.g., chemical structures, mass spectrum patterns for peak identification, metabolite profiles, biological functions, dynamic metabolite changes, and biochemical transformations of thousands of small molecules. In this chapter, we aim to introduce metabolomics studies from the pre- to the post-metabolomics era and their impact on society. Focusing on the post-metabolomics era, we provide a conceptual framework of informatics techniques for metabolomics and show useful examples of techniques, tools, and databases for metabolomics data analysis, from preprocessing to functional interpretation. The framework of informatics techniques provided here can further be used as a scaffold for translational biomedical research, which can in turn reveal new metabolite biomarkers, potential metabolic targets, or key metabolic pathways for future disease therapy.

  15. Feedforward interview technique in obstetrics and gynaecology residents: a fact or fallacy.

    PubMed

    Sami, Shehla; Ahmad, Amina

    2015-01-01

    To determine the role of the Feedforward Interview (FFI) technique in motivating residents of Obstetrics and Gynaecology toward better learning and performance. An exploratory study employing a mixed-methods approach. Department of Obstetrics and Gynaecology, Sandeman (Provincial) Hospital, Quetta, from November 2010 till May 2013. The Feedforward Interview technique was complemented by a survey questionnaire employing the same philosophy as FFI, to triangulate data through two methods. The survey questionnaire was completed by 21 residents and analysed with SPSS version 17. Fourteen of these participants were identified for in-depth Feedforward Interviews (FFI), based on nonprobability purposive sampling after informed consent, and content analysis was done. The Feedforward Interview technique enabled the majority of residents to recall a minimum of 3 positive experiences, mainly related to surgical experiences, which enhanced their motivation to aspire for further improvement in this area. Hard work was the main personal contributing factor in both FFI and the survey. In addition to identifying clinical experiences that enhanced the desire to learn, residents also reported the need for more academic support as an important factor that could boost motivation to attain better performance. The Feedforward Interview technique not only helps residents recall positive learning experiences during their training but also has a significant influence on developing insight about one's performance and motivating residents to achieve higher academic goals.

  16. Bio-speckle assessment of bruising in fruits

    NASA Astrophysics Data System (ADS)

    Pajuelo, M.; Baldwin, G.; Rabal, H.; Cap, N.; Arizaga, R.; Trivi, M.

    2003-07-01

    The dynamic speckle patterns or bio-speckle is a phenomenon produced by laser illumination of active materials, such as a biological tissue. Fruits, even hard peel ones, show a speckle activity that can be related to maturity, turgor, damage, aging, and mechanical properties. In this case, we suggest a bio-speckle technique as a potential methodology for the study of impact on apples and the analysis of bruises produced by them. The aim is to correlate physical properties of apples with quality factors using a non-contact and non-invasive technique.

  17. Assessing risk factors for periodontitis using regression

    NASA Astrophysics Data System (ADS)

    Lobo Pereira, J. A.; Ferreira, Maria Cristina; Oliveira, Teresa

    2013-10-01

    Multivariate statistical analysis is indispensable for assessing the associations and interactions between different factors and the risk of periodontitis. Among others, regression analysis is a statistical technique widely used in healthcare to investigate and model the relationship between variables. In our work we study the impact of socio-demographic, medical and behavioral factors on periodontal health. Using linear and logistic regression models, we assess the relevance, as risk factors for periodontitis, of the following independent variables (IVs): Age, Gender, Diabetic Status, Education, Smoking Status and Plaque Index. The multiple linear regression model was built to evaluate the influence of the IVs on mean Attachment Loss (AL); the regression coefficients are obtained along with the p-values from the respective significance tests. The classification of a case (individual) adopted in the logistic model was the extent of the destruction of periodontal tissues, defined as an Attachment Loss greater than or equal to 4 mm in 25% (AL≥4mm/≥25%) of sites surveyed. The association measures include the Odds Ratios together with the corresponding 95% confidence intervals.
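    An odds ratio with a 95% confidence interval, as mentioned at the end, can be computed from a 2×2 exposure-by-outcome table using Woolf's log method. The counts below are hypothetical, not data from this study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table (Woolf's log method):
       a = exposed cases,   b = exposed non-cases,
       c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: smokers vs non-smokers, case = AL>=4mm in >=25% of sites
or_, lo, hi = odds_ratio_ci(30, 70, 15, 135)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

    A logistic regression produces the same kind of estimate after adjustment: exponentiating a fitted coefficient gives the odds ratio for that variable, and exponentiating coefficient ± 1.96 standard errors gives its 95% interval.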

  18. Factor Analysis of the Modified Sexual Adjustment Questionnaire-Male

    PubMed Central

    Wilmoth, Margaret C.; Hanlon, Alexandra L.; Ng, Lit Soo; Bruner, Debra W.

    2015-01-01

    Background and Purpose The Sexual Adjustment Questionnaire (SAQ) is used in National Cancer Institute–sponsored clinical trials as an outcome measure for sexual functioning. The tool was revised to meet the need for a clinically useful, theory-based outcome measure for use in both research and clinical settings. This report describes the modifications and validity testing of the modified Sexual Adjustment Questionnaire-Male (mSAQ-Male). Methods This secondary analysis of data from a large Radiation Therapy Oncology Group trial employed principal axis factor analytic techniques in estimating the validity of the revised tool. The sample size was 686; most subjects were White, older than 60 years, with a high school education and a Karnofsky performance scale (KPS) score greater than 90. Results A 16-item, 3-factor solution resulted from the factor analysis. The mSAQ-Male was also found to be sensitive to changes in physical sexual functioning as measured by the KPS. Conclusion The mSAQ-Male is a valid self-report measure of sexuality that can be used clinically to detect changes in male sexual functioning. PMID:25255676
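    Eigenvalue-based factor extraction of the kind used in such validity studies can be sketched briefly. The study itself used principal axis factoring; for simplicity this sketch extracts principal-component factors from the item correlation matrix and retains those with eigenvalue > 1 (the Kaiser criterion). The item responses are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic responses: 100 subjects, 6 items loading on 2 latent factors
latent = rng.normal(size=(100, 2))
loadings_true = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                          [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
items = latent @ loadings_true.T + 0.3 * rng.normal(size=(100, 6))

R = np.corrcoef(items, rowvar=False)          # item correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)          # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_factors = int(np.sum(eigvals > 1.0))        # Kaiser criterion
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
print(n_factors)
```

    Principal axis factoring differs in that it iteratively replaces the unit diagonal of R with communality estimates before the eigendecomposition, so that only shared (common) variance is factored.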

  19. Sources of Variability in Chlorophyll Analysis by Fluorometry and by High Performance Liquid Chromatography. Chapter 22

    NASA Technical Reports Server (NTRS)

    VanHeukelem, Laurie; Thomas, Crystal S.; Glibert, Patricia M.

    2001-01-01

    The need for accurate determination of chlorophyll a (chl a) is of interest for numerous reasons. From the need for ground-truth data for remote sensing to pigment detection for laboratory experimentation, it is essential to know the accuracy of the analyses and the factors potentially contributing to variability and error. Numerous methods and instrument techniques are currently employed in the analysis of chl a. These methods range from spectrophotometric quantification, to fluorometric analysis, to determination by high performance liquid chromatography (HPLC). Even within the application of HPLC techniques, methods vary. Here we provide the results of a comparison among methods and provide some guidance for improving the accuracy of these analyses. These results are based on a round-robin conducted among numerous investigators, including several in the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) and HyCODE Programs. Our purpose here is not to present the full results of the laboratory intercalibration; those results will be presented elsewhere. Rather, here we highlight some of the major factors that may contribute to the variability observed. Specifically, we aim to assess the comparability of chl a analyses performed by fluorometry and HPLC, and we identify several factors in the analyses which may contribute disproportionately to this variability.

  20. Micro-heterogeneity versus clustering in binary mixtures of ethanol with water or alkanes.

    PubMed

    Požar, Martina; Lovrinčević, Bernarda; Zoranić, Larisa; Primorać, Tomislav; Sokolić, Franjo; Perera, Aurélien

    2016-08-24

    Ethanol is a hydrogen bonding liquid. When mixed in small concentrations with water or alkanes, it forms aggregate structures reminiscent of, respectively, the direct and inverse micellar aggregates found in emulsions, albeit at much smaller sizes. At higher concentrations, micro-heterogeneous mixing with segregated domains is found. We examine how different statistical methods, namely correlation function analysis, structure factor analysis and cluster distribution analysis, can efficiently describe these morphological changes in these mixtures. In particular, we explain how the neat alcohol pre-peak of the structure factor evolves into the domain pre-peak under mixing conditions, and how this evolution differs depending on whether the co-solvent is water or alkane. This study clearly establishes the heuristic superiority of the correlation function/structure factor analysis for studying micro-heterogeneity, since cluster distribution analysis is insensitive to domain segregation. Correlation functions detect the domains, with a clear structure factor pre-peak signature, while the cluster techniques detect the cluster hierarchy within domains. The main conclusion is that, in micro-segregated mixtures, the domain structure is a more fundamental statistical entity than the underlying cluster structures. These findings could help better understand, comparatively, radiation scattering experiments, which are sensitive to domains, versus spectroscopy-NMR experiments, which are sensitive to clusters.

  1. The effect of push factors in the leisure sports participation of the retired elderly on re-socialization recovery resilience.

    PubMed

    Lee, Kwang-Uk; Kim, Hong-Rok; Yi, Eun-Surk

    2014-04-01

    This study aimed to provide useful materials for the realization of a healthy and happy welfare society through the re-socialization of the retired elderly, by identifying the effect of the push factors in the leisure sports participation of the retired elderly on re-socialization and recovery resilience. To achieve the study purpose, 304 subjects over the age of 55 residing in Seoul and Gyeonggin among the retired elderly were selected using systematic stratified cluster random sampling. Questionnaires were used as the research method. The data were collected, and responses judged to be incomplete or unreliable were excluded from the analysis. The data available for analysis were entered, and the SPSS 18.0 program was used for the statistical techniques; the data were processed by factor analysis, correlation analysis, and multiple regression analysis. The study results obtained from this analysis are as follows: First, psychological stability among the push factors in the leisure sports participation of the elderly had a significant effect on re-socialization, while health pursuit had a significant effect on personal exchange and economic activity among the sub-factors of re-socialization. Second, psychological stability among the push factors in the leisure sports participation of the retired elderly had a significant effect on recovery resilience; personal relationships had an effect on empathy skills, impulse control, and self-efficacy; and health pursuit had a significant effect on impulse control, optimism, and self-efficacy.

  2. The effect of push factors in the leisure sports participation of the retired elderly on re-socialization recovery resilience

    PubMed Central

    Lee, Kwang-Uk; Kim, Hong-Rok; Yi, Eun-Surk

    2014-01-01

    This study aimed to provide useful materials for the realization of a healthy and happy welfare society through the re-socialization of the retired elderly, by identifying the effect of the push factors in the leisure sports participation of the retired elderly on re-socialization and recovery resilience. To achieve the study purpose, 304 subjects over the age of 55 residing in Seoul and Gyeonggin among the retired elderly were selected using systematic stratified cluster random sampling. Questionnaires were used as the research method. The data were collected, and responses judged to be incomplete or unreliable were excluded from the analysis. The data available for analysis were entered, and the SPSS 18.0 program was used for the statistical techniques; the data were processed by factor analysis, correlation analysis, and multiple regression analysis. The study results obtained from this analysis are as follows: First, psychological stability among the push factors in the leisure sports participation of the elderly had a significant effect on re-socialization, while health pursuit had a significant effect on personal exchange and economic activity among the sub-factors of re-socialization. Second, psychological stability among the push factors in the leisure sports participation of the retired elderly had a significant effect on recovery resilience; personal relationships had an effect on empathy skills, impulse control, and self-efficacy; and health pursuit had a significant effect on impulse control, optimism, and self-efficacy. PMID:24877044

  3. Analysis and compensation for the effect of the catheter position on image intensities in intravascular optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Liu, Shengnan; Eggermont, Jeroen; Wolterbeek, Ron; Broersen, Alexander; Busk, Carol A. G. R.; Precht, Helle; Lelieveldt, Boudewijn P. F.; Dijkstra, Jouke

    2016-12-01

    Intravascular optical coherence tomography (IVOCT) is an imaging technique used to analyze the underlying cause of cardiovascular disease. Because a catheter is used during imaging, the intensities can be affected by the catheter position. This work aims to analyze the effect of the catheter position on IVOCT image intensities and to propose a compensation method that minimizes this effect, in order to improve the visualization and the automatic analysis of IVOCT images. The effect of catheter position is modeled with respect to the distance between the catheter and the arterial wall (distance-dependent factor) and the incident angle onto the arterial wall (angle-dependent factor). A light transmission model incorporating both factors is introduced. On the basis of this model, the interaction effect of both factors is estimated with a hierarchical multivariate linear regression model. Statistical analysis shows that IVOCT intensities are significantly affected by both factors (p < 0.001); as either factor increases, the intensity decreases. This effect differs between pullbacks. The regression results were used to compensate for this effect. Experiments show that the proposed compensation method can improve the performance of automatic bioresorbable vascular scaffold strut detection.
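    The compensation idea, dividing out distance- and angle-dependent attenuation predicted by a fitted model, can be illustrated with a deliberately simplified exponential/cosine model. Both the functional form and the parameter values here are assumptions for illustration, not the regression model fitted in the paper:

```python
import math

def compensate(intensity, distance, angle_rad, mu=0.8, k=1.0):
    """Invert a simple illustrative attenuation model
       I_measured = I_true * exp(-mu * distance) * cos(angle)**k.
       mu and k are assumed parameters; the paper estimates its own
       distance- and angle-dependent factors by regression."""
    attenuation = math.exp(-mu * distance) * math.cos(angle_rad) ** k
    return intensity / attenuation

# A strut imaged far from the catheter, at an oblique incident angle:
raw = 0.25
corrected = compensate(raw, distance=1.2, angle_rad=math.radians(30))
print(corrected > raw)
```

    The key property is that pixels imaged at greater distance or more oblique angles are scaled up more strongly, so that intensities become comparable across the pullback before thresholding or strut detection.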

  4. Stress Management and Relaxation Techniques use among underserved inpatients in an inner city hospital.

    PubMed

    Gardiner, Paula; Sadikova, Ekaterina; Filippelli, Amanda C; Mitchell, Suzanne; White, Laura F; Saper, Robert; Kaptchuk, Ted J; Jack, Brian W; Fredman, Lisa

    2015-06-01

    Little is known about the use of Stress Management and Relaxation Techniques (SMART) among racially diverse inpatients. We aimed to identify socioeconomic status (SES) factors, health behavior factors, and clinical factors associated with the use of SMART. We conducted a secondary analysis of baseline data from 623 hospitalized patients enrolled in the Re-Engineered Discharge (RED) clinical trial. We assessed socio-demographic characteristics and use of SMART. We used bivariate and multivariate logistic regression to test the association of SMART with socio-demographic characteristics, health behaviors, and clinical factors. A total of 26.6% of participants reported using SMART, and 23.6% used mind-body techniques. Thirty-six percent of work-disabled patients, 39% of illicit drug users, and 38% of participants with depressive symptoms used SMART. Patients who both reported illicit drug use and screened positive for depression had significantly increased odds of using SMART [OR=4.94, 95% CI (1.59, 15.13)]. Compared to non-Hispanic whites, non-Hispanic blacks [0.55 (0.34-0.87)] and Hispanic/other race individuals [0.40 (0.20-0.76)] were less likely to use SMART. We found greater utilization of SMART among all racial groups compared with previous national studies. In the inner-city inpatient setting, patients with depression, illicit drug use, and work disability reported higher rates of using SMART. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Recurrent tricuspid insufficiency: is the surgical repair technique a risk factor?

    PubMed

    Kara, Ibrahim; Koksal, Cengiz; Cakalagaoglu, Canturk; Sahin, Muslum; Yanartas, Mehmet; Ay, Yasin; Demir, Serdar

    2013-01-01

    This study compares the medium-term results of De Vega, modified De Vega, and ring annuloplasty techniques for the correction of tricuspid insufficiency and investigates the risk factors for recurrent grades 3 and 4 tricuspid insufficiency after repair. In our clinic, 93 patients with functional tricuspid insufficiency underwent surgical tricuspid repair from May 2007 through October 2010. The study was retrospective, and all the data pertaining to the patients were retrieved from hospital records. Functional capacity, recurrent tricuspid insufficiency, and risk factors aggravating the insufficiency were analyzed for each patient. In the medium term (25.4 ± 10.3 mo), the rates of grades 3 and 4 tricuspid insufficiency in the De Vega, modified De Vega, and ring annuloplasty groups were 31%, 23.1%, and 6.1%, respectively. Logistic regression analysis revealed that chronic obstructive pulmonary disease, left ventricular dysfunction (ejection fraction, < 0.50), pulmonary artery pressure ≥60 mmHg, and the De Vega annuloplasty technique were risk factors for medium-term recurrent grades 3 and 4 tricuspid insufficiency. Medium-term survival was 90.6% for the De Vega group, 96.3% for the modified De Vega group, and 97.1% for the ring annuloplasty group. Ring annuloplasty provided the best relief from recurrent tricuspid insufficiency when compared with De Vega annuloplasty. Modified De Vega annuloplasty might be a suitable alternative to ring annuloplasty when rings are not available.

  6. Adhesive blood microsampling systems for steroid measurement via LC-MS/MS in the rat.

    PubMed

    Heussner, Kirsten; Rauh, Manfred; Cordasic, Nada; Menendez-Castro, Carlos; Huebner, Hanna; Ruebner, Matthias; Schmidt, Marius; Hartner, Andrea; Rascher, Wolfgang; Fahlbusch, Fabian B

    2017-04-01

    Liquid Chromatography Tandem Mass Spectrometry (LC-MS/MS) allows for the direct analysis of multiple hormones in a single probe with minimal sample volume. Rodent-based animal studies strongly rely on microsampling, such as the dry blood spot (DBS) method. However, DBS suffers the drawback of hematocrit dependence (non-volumetric). Hence, novel volumetric microsampling techniques were introduced recently, allowing sampling of fixed, accurate volumes. We compared these methods for steroid analysis in the rat to improve inter-system comparability. We analyzed steroid levels in blood using the absorptive microsampling devices Whatman® 903 Protein Saver Cards, Noviplex™ Plasma Prep Cards and the Mitra™ Microsampling device and compared the obtained results to the respective EDTA plasma levels. Quantitative steroid analysis was performed via LC-MS/MS. To determine the plasma volume factor for each steroid, levels in pooled blood samples from human adults and from adult rats (18 weeks old) were compared, and the transferability of these factors was evaluated in a new set of juvenile (21 days old) and adult (18 weeks old) rats. Hematocrit was determined concomitantly. Using these approaches, we were unable to apply one single volume factor for each steroid. Instead, plasma volume factors had to be adjusted for the recovery rate of each steroid and device individually. The tested microsampling systems did not allow the use of one single volume factor for adult and juvenile rats, owing to an unexpectedly strong hematocrit dependency and other steroid-specific (pre-analytic) factors. Our study provides correction factors for LC-MS/MS steroid analysis of volumetric and non-volumetric microsampling systems in comparison to plasma. It argues for thorough analysis of chromatographic effects before the use of novel volumetric systems for steroid analysis. Copyright © 2017 Elsevier Inc. All rights reserved.
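
    The plasma-volume-factor idea above can be sketched as follows. The function and variable names are assumptions, and the numbers are illustrative, not the study's measurements: a device's measured level is converted to an equivalent plasma level by a factor that, per the study's finding, must be adjusted for each steroid's recovery rate on each device.

```python
# Sketch of a per-steroid, per-device correction factor (hypothetical names/values).
def plasma_volume_factor(plasma_level, device_level, recovery=1.0):
    """Factor such that device_level * recovery * factor reproduces plasma_level."""
    return plasma_level / (device_level * recovery)

# Hypothetical paired measurement for one steroid on one device:
factor = plasma_volume_factor(plasma_level=12.0, device_level=8.0, recovery=0.9)
estimated_plasma = 8.0 * 0.9 * factor   # recovers the plasma level
```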

  7. Factor analysis and psychometric properties of the Mother-Adolescent Sexual Communication (MASC) instrument for sexual risk behavior.

    PubMed

    Cox, Mary Foster; Fasolino, Tracy K; Tavakoli, Abbas S

    2008-01-01

    Sexual risk behavior is a public health problem among adolescents living at or below poverty level. Approximately 1 million pregnancies and 3 million cases of sexually transmitted infections (STIs) are reported yearly. Parenting plays a significant role in adolescent behavior, with mother-adolescent sexual communication correlated with absent or delayed sexual behavior. This study developed an instrument examining constructs of mother-adolescent communication, the Mother-Adolescent Sexual Communication (MASC) instrument. A convenience sample of 99 mothers of middle school children completed the self-administered questionnaires. The original 34-item MASC was reduced to 18 items. Exploratory factor analysis was conducted on the 18-item scale, which resulted in four factors explaining 84.63% of the total variance. Internal consistency analysis produced Cronbach alpha coefficients of .87, .90, .82, and .71 for the four factors, respectively. Convergent validity via hypothesis testing was supported by significant correlations with several subscales of the Parent-Child Relationship Questionnaire (PCRQ) with MASC factors, that is, content and style factors with warmth, personal relationships and disciplinary warmth subscales of the PCRQ, the context factor with personal relationships, and the timing factor with warmth. In light of these findings, the psychometric characteristics and multidimensional perspective of the MASC instrument show evidence of usefulness for measuring and advancing knowledge of mother and adolescent sexual communication techniques.
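
    The internal-consistency coefficients reported above (Cronbach's alpha) can be computed directly from an item-response matrix. The simulated 5-item subscale below is an assumption for illustration, not MASC data.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array; returns Cronbach's alpha."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Items sharing one latent factor produce a high alpha, as for the MASC subscales.
rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))
items = latent + rng.normal(scale=0.5, size=(200, 5))
alpha = cronbach_alpha(items)
```

    Alphas in the .7-.9 range, like those reported for the four MASC factors, indicate that the items within each factor measure a common construct.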

  8. Milch versus Stimson technique for nonsedated reduction of anterior shoulder dislocation: a prospective randomized trial and analysis of factors affecting success.

    PubMed

    Amar, Eyal; Maman, Eran; Khashan, Morsi; Kauffman, Ehud; Rath, Ehud; Chechik, Ofir

    2012-11-01

    The shoulder is regarded as the most commonly dislocated major joint in the human body. Most dislocations can be reduced by simple methods in the emergency department, whereas others require more complicated approaches. We compared the efficacy, safety, pain, and duration of the reduction between the Milch technique and the Stimson technique in treating dislocations. We also identified factors that affected success rate. All enrolled patients were randomized to either the Milch technique or the Stimson technique for dislocated shoulder reduction. The study cohort consisted of 60 patients (mean age, 43.9 years; age range, 18-88 years) who were randomly assigned to treatment by either the Stimson technique (n = 25) or the Milch technique (n = 35). Oral analgesics were available for both groups. The 2 groups were similar in demographics, patient characteristics, and pain levels. The first reduction attempt in the Milch and Stimson groups was successful in 82.8% and 28% of cases, respectively (P < .001), and the mean reduction time was 4.68 and 8.84 minutes, respectively (P = .007). The success rate was found to be affected by the reduction technique, the interval between dislocation occurrence and first reduction attempt, and the pain level on admittance. The success rate and time to achieve reduction without sedation were superior for the Milch technique compared with the Stimson technique. Early implementation of reduction measures and low pain levels at presentation favor successful reduction, which, in combination with oral pain medication, constitutes an acceptable and reasonable management alternative to reduction with sedation. Copyright © 2012 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Mosby, Inc. All rights reserved.

  9. Factors That Attenuate the Correlation Coefficient and Its Analogs.

    ERIC Educational Resources Information Center

    Dolenz, Beverly

    The correlation coefficient is an integral part of many other statistical techniques (analysis of variance, t-tests, etc.), since all analytic methods are actually correlational (G. V. Glass and K. D. Hopkins, 1984). The correlation coefficient is a statistical summary that represents the degree and direction of relationship between two variables.…

  10. Application of Markov Models for Analysis of Development of Psychological Characteristics

    ERIC Educational Resources Information Center

    Kuravsky, Lev S.; Malykh, Sergey B.

    2004-01-01

    A technique to study combined influence of environmental and genetic factors on the base of changes in phenotype distributions is presented. Histograms are exploited as base analyzed characteristics. A continuous time, discrete state Markov process with piece-wise constant interstate transition rates is associated with evolution of each histogram.…

  11. Validation of the Consumer Values versus Perceived Product Attributes Model Measuring the Purchase of Athletic Team Merchandise

    ERIC Educational Resources Information Center

    Lee, Donghun; Byon, Kevin K.; Schoenstedt, Linda; Johns, Gary; Bussell, Leigh Ann; Choi, Hwansuk

    2012-01-01

    Various consumer values and perceived product attributes trigger consumptive behaviors of athletic team merchandise (Lee, Trail, Kwon, & Anderson, 2011). Likewise, using a principal component analysis technique on a student sample, a measurement scale was proposed that consisted of nine factors affecting the purchase of athletic team…

  12. Basic Research in Human Factors

    DTIC Science & Technology

    1990-07-01

    settings as military organizations, voluntary organizations, multinational corporations, diplomatic corps, government agencies, and couples managing a...development, the analysis suggests both problems and possible solutions. It also derives some general conclusions regarding the design and management ...organize and manage information spontaneously in order to develop techniques which will help them do so more effectively. Attitudes towards and

  13. Visual Attention for Solving Multiple-Choice Science Problem: An Eye-Tracking Analysis

    ERIC Educational Resources Information Center

    Tsai, Meng-Jung; Hou, Huei-Tse; Lai, Meng-Lung; Liu, Wan-Yi; Yang, Fang-Ying

    2012-01-01

    This study employed an eye-tracking technique to examine students' visual attention when solving a multiple-choice science problem. Six university students participated in a problem-solving task to predict occurrences of landslide hazards from four images representing four combinations of four factors. Participants' responses and visual attention…

  14. An Application of Indian Health Service Standards for Alcoholism Programs.

    ERIC Educational Resources Information Center

    Burns, Thomas R.

    1984-01-01

    Discusses Phoenix-area applications of 1981 Indian Health Service standards for alcoholism programs. Results of standard statistical techniques note areas of deficiency through application of a one-tailed z test at .05 level of significance. Factor analysis sheds further light on design of standards. Implications for revisions are suggested.…

  15. Validity Evidence in Scale Development: The Application of Cross Validation and Classification-Sequencing Validation

    ERIC Educational Resources Information Center

    Acar, Tülin

    2014-01-01

    In the literature, it has been observed that many scale-validation efforts are limited to factor analysis techniques. Besides examinations of statistical structure and/or psychological structure, such validity studies as cross validation and classification-sequencing studies should be performed frequently. The purpose of this study is to examine cross…

  16. The Motor Domain and its Correlates in Educationally Handicapped Children.

    ERIC Educational Resources Information Center

    Rarick, G. Lawrence; And Others

    This monograph is an account of two related investigations of the motor domain and its correlates in educationally handicapped children. Part I describes an investigation primarily concerned with the identification of the basic components of the motor behavior of educable mentally retarded children through the use of factor analysis techniques.…

  17. Development and Validation of Academic Dishonesty Scale (ADS): Presenting a Multidimensional Scale

    ERIC Educational Resources Information Center

    Bashir, Hilal; Bala, Ranjan

    2018-01-01

    The purpose of the study was to develop a scale measuring academic dishonesty of undergraduate students. The sample of the study constitutes nine hundred undergraduate students selected via random sampling technique. After receiving expert's opinions for the face and content validity of the scale, the exploratory factor analysis (EFA) and…

  18. Assigning Cases to Groups Using Taxometric Results: An Empirical Comparison of Classification Techniques

    ERIC Educational Resources Information Center

    Ruscio, John

    2009-01-01

    Determining whether individuals belong to different latent classes (taxa) or vary along one or more latent factors (dimensions) has implications for assessment. For example, no instrument can simultaneously maximize the efficiency of categorical and continuous measurement. Methods such as taxometric analysis can test the relative fit of taxonic…

  19. Real-Time PCR for the Detection of Precise Transgene Copy Number in Wheat.

    PubMed

    Giancaspro, Angelica; Gadaleta, Agata; Blanco, Antonio

    2017-01-01

    Despite the unceasing advances in genetic transformation techniques, the success of common delivery methods still lies in the behavior of the integrated transgenes in the host genome. Stability and expression of the introduced genes are influenced by several factors such as chromosomal location, transgene copy number and interaction with the host genotype. Such factors are traditionally characterized by Southern blot analysis, which can be time-consuming, laborious, and often unable to detect the exact copy number of rearranged transgenes. Recent research in the crop field suggests real-time PCR as an effective and reliable tool for the precise quantification and characterization of transgene loci. This technique overcomes most problems linked to phenotypic segregation analysis and can analyze hundreds of samples in a day, making it an efficient method for estimating the gene copy number integrated in a transgenic line. This protocol describes the use of real-time PCR for the detection of transgene copy number in durum wheat transgenic lines by means of two different chemistries (SYBR® Green I dye and TaqMan® probes).
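
    One common way to turn real-time PCR Ct values into a copy-number estimate is the comparative Ct (2^-ΔΔCt) method against a single-copy endogenous reference gene. This is a generic sketch of that calculation; the Ct values are hypothetical, not taken from this protocol.

```python
# Comparative Ct (2^-ddCt) estimate of transgene copy number, relative to a
# calibrator line of known copy number and a single-copy reference gene.
def copy_number(ct_transgene, ct_reference, ct_transgene_cal, ct_reference_cal,
                calibrator_copies=1):
    """Copies relative to a calibrator line of known copy number."""
    delta_sample = ct_transgene - ct_reference
    delta_cal = ct_transgene_cal - ct_reference_cal
    return calibrator_copies * 2 ** -(delta_sample - delta_cal)

# A transgene amplifying one cycle earlier (relative to the reference) than in a
# known single-copy calibrator line indicates roughly two copies:
copies = copy_number(ct_transgene=22.0, ct_reference=20.0,
                     ct_transgene_cal=23.0, ct_reference_cal=20.0)
```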

  20. Spatial analysis of falls in an urban community of Hong Kong

    PubMed Central

    Lai, Poh C; Low, Chien T; Wong, Martin; Wong, Wing C; Chan, Ming H

    2009-01-01

    Background: Falls are an issue of great public health concern. This study focuses on outdoor falls within an urban community in Hong Kong. Urban environmental hazards are often place-specific and dependent upon the built features, landscape characteristics, and habitual activities. Therefore, falls must be examined with respect to local situations. Results: This paper uses spatial analysis methods to map fall occurrences and examine possible environmental attributes of falls in an urban community of Hong Kong. The nearest neighbour hierarchical (Nnh) and standard deviational ellipse (SDE) techniques can offer additional insights about the circumstances and environmental factors that contribute to falls. The results affirm the multi-factorial nature of falls at specific locations and for selected groups of the population. Conclusion: The techniques to detect hot spots of falls yield meaningful results that enable the identification of high-risk locations. The combined use of descriptive and spatial analyses can be beneficial to policy makers because different preventive measures can be devised based on the types of environmental risk factors identified. The analyses are also important preludes to establishing research hypotheses for more focused studies. PMID:19291326
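
    The standard deviational ellipse (SDE) mentioned above summarizes the center, orientation, and directional spread of point events. This is a minimal sketch of the computation on synthetic coordinates, not the study's fall data.

```python
import numpy as np

def standard_deviational_ellipse(x, y):
    """Return (center, rotation angle in radians, std devs along the two axes)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cx, cy = x.mean(), y.mean()
    dx, dy = x - cx, y - cy
    # Orientation that maximizes deviation along one axis (standard SDE formula).
    theta = 0.5 * np.arctan2(2 * (dx * dy).sum(), (dx**2 - dy**2).sum())
    u = dx * np.cos(theta) + dy * np.sin(theta)     # deviations along one axis
    v = -dx * np.sin(theta) + dy * np.cos(theta)    # deviations along the other
    return (cx, cy), theta, (u.std(ddof=1), v.std(ddof=1))

# Synthetic point cloud elongated along the x direction:
rng = np.random.default_rng(2)
pts = rng.normal(size=(300, 2)) * np.array([3.0, 1.0])
center, angle, axes = standard_deviational_ellipse(pts[:, 0], pts[:, 1])
```

    An elongated ellipse indicates a directional trend in the events (e.g. falls concentrated along a corridor), which is the kind of insight the study draws from SDE.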

  1. Peritoneal dialysis technique success during the initial 90 days of therapy.

    PubMed

    Guest, Steven; Hayes, Andrew C; Story, Kenneth; Davis, Ira D

    2012-01-01

    Comparisons of technique success by peritoneal dialysis (PD) modality have typically excluded the initial 90 days of therapy. We analyzed a database of 51,469 new PD starts from 2004 to 2008 in the United States. The analysis concentrated on the initial 90 days of therapy to determine technique success and the impact of the continuous ambulatory PD (CAPD) and automated PD (APD) modalities. Overall, 13.3% of patients stopped PD within 90 days. Of patients starting directly on APD, 14.3% stopped PD within 90 days. Of patients starting on CAPD, 12.6% stopped PD within 90 days, and 63.4% changed to APD within 90 days. Only 3.3% of the latter patients failed to reach 90 days of therapy. By comparison, technique failure occurred in 28.8% of those initiating with and remaining on CAPD. We conclude that initial training to perform CAPD, with timely transfer to APD within the first 3 months, was associated with the greatest technique success at 90 days. The reasons for that success are unclear, and further research should be directed to determining factors responsible. It is possible that patients trained initially to CAPD but converted to APD have a greater understanding of the total therapy, which improves confidence. Those converted to APD may be more appreciative of the lifestyle benefits of APD, which translates into improved compliance; alternatively, technical factors associated with APD may be responsible. Those technical factors may include improved catheter function in the recumbent position during APD or the reduced infection risk associated with just 2 connect/disconnect procedures in APD compared with 8 in CAPD.

  2. Grouping of Bulgarian wines according to grape variety by using statistical methods

    NASA Astrophysics Data System (ADS)

    Milev, M.; Nikolova, Kr.; Ivanova, Ir.; Minkova, St.; Evtimov, T.; Krustev, St.

    2017-12-01

    68 different types of Bulgarian wine were studied with respect to 9 optical parameters: color parameters in the XYZ and CIE Lab color systems, lightness, hue angle, chroma, fluorescence intensity and emission wavelength. The main objective of this research is to use hierarchical cluster analysis to evaluate the similarity and the distance between the examined types of Bulgarian wine and to group them based on physical parameters. We have found that the wines are grouped in clusters on the basis of the degree of identity between them. There are two main clusters, each with two subclusters: the first contains white wines and Sira, the second contains red wines and rosé. The results from the cluster analysis are presented graphically by a dendrogram. The other statistical technique used is factor analysis performed by the method of principal components (PCA). The aim is to reduce the large number of variables to a few factors by grouping the correlated variables into one factor and subdividing the noncorrelated variables into different factors. Moreover, the factor analysis makes it possible to determine the parameters with the greatest influence on the distribution of samples in different clusters. In our study, after rotation of the factors with the varimax method, the parameters were combined into two factors, which explain about 80% of the total variation. The first explains 61.49% and correlates with the color characteristics; the second explains 18.34% of the variation and correlates with the parameters connected with fluorescence spectroscopy.
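
    The PCA step above can be sketched with synthetic stand-ins for the nine optical parameters. The grouping of variables and the noise level below are assumptions chosen so that two components capture most of the variance, mirroring the study's two-factor result.

```python
import numpy as np

# Five "color" variables share one latent factor and two "fluorescence"
# variables share another, so two principal components dominate.
rng = np.random.default_rng(3)
n = 68                                # number of wines, as in the study
color = rng.normal(size=n)
fluor = rng.normal(size=n)
data = np.column_stack(
    [color + rng.normal(scale=0.6, size=n) for _ in range(5)]
    + [fluor + rng.normal(scale=0.6, size=n) for _ in range(2)]
)

# Eigenvalues of the correlation matrix give the variance share per component.
eigvals = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
explained = eigvals / eigvals.sum()
two_factor_share = explained[:2].sum()
```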

  3. NECAP: NASA's Energy-Cost Analysis Program. Part 1: User's manual

    NASA Technical Reports Server (NTRS)

    Henninger, R. H. (Editor)

    1975-01-01

    NECAP is a sophisticated building design and energy analysis tool that embodies the latest ASHRAE state-of-the-art techniques for performing thermal load calculations and energy usage predictions. It is a set of six individual computer programs: a response factor program, a data verification program, a thermal load analysis program, a variable temperature program, a system and equipment simulation program, and an owning and operating cost program. Each segment of NECAP is described, and instructions are set forth for preparing the required input data and for interpreting the resulting reports.

  4. Influences of geological parameters to probabilistic assessment of slope stability of embankment

    NASA Astrophysics Data System (ADS)

    Nguyen, Qui T.; Le, Tuan D.; Konečný, Petr

    2018-04-01

    This article considers the influence of geological parameters on the slope stability of an embankment in a probabilistic analysis using the SLOPE/W computational system. Stability of a simple slope is evaluated with and without pore-water pressure on the basis of variation in soil properties. Normal distributions of unit weight, cohesion and internal friction angle are assumed. The Monte Carlo simulation technique is employed to perform analysis of the critical slip surface. Sensitivity analysis is performed to observe the variation of the geological parameters and their effects on the safety factor of the slope.
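
    A minimal sketch of the Monte Carlo step, assuming an infinite-slope factor-of-safety model (a simplification of the general slip-surface search SLOPE/W performs). The distribution parameters and slope geometry below are illustrative, not the article's values.

```python
import numpy as np

# Draw soil properties from the normal distributions the abstract assumes,
# then evaluate the infinite-slope factor of safety for each realization.
rng = np.random.default_rng(4)
n_sim = 100_000
gamma = rng.normal(18.0, 1.0, n_sim)             # unit weight, kN/m^3
c = rng.normal(10.0, 2.0, n_sim)                 # cohesion, kPa
phi = np.radians(rng.normal(30.0, 3.0, n_sim))   # internal friction angle

beta, h = np.radians(25.0), 5.0                  # slope angle, slip depth (m)
fs = (c + gamma * h * np.cos(beta) ** 2 * np.tan(phi)) / (
    gamma * h * np.sin(beta) * np.cos(beta))
prob_failure = float((fs < 1.0).mean())          # P(factor of safety < 1)
```

    The fraction of realizations with FS < 1 is the probability of failure, the quantity a probabilistic slope-stability assessment reports alongside the mean safety factor.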

  5. A retrospective analysis of laparoscopic partial nephrectomy with segmental renal artery clamping and factors that predict postoperative renal function.

    PubMed

    Li, Pu; Qin, Chao; Cao, Qiang; Li, Jie; Lv, Qiang; Meng, Xiaoxin; Ju, Xiaobing; Tang, Lijun; Shao, Pengfei

    2016-10-01

    To evaluate the feasibility and efficiency of laparoscopic partial nephrectomy (LPN) with segmental renal artery clamping, and to analyse the factors affecting postoperative renal function. We conducted a retrospective analysis of 466 consecutive patients undergoing LPN using main renal artery clamping (group A, n = 152) or segmental artery clamping (group B, n = 314) between September 2007 and July 2015 in our department. Blood loss, operating time, warm ischaemia time (WIT) and renal function were compared between groups. Univariable and multivariable linear regression analyses were applied to assess the correlations of selected variables with postoperative glomerular filtration rate (GFR) reduction. Volumetric data and estimated GFR of a subset of 60 patients in group B were compared with GFR to evaluate the correlation between these functional variables and preserved renal function after LPN. The novel technique slightly increased operating time, WIT and intra-operative blood loss (P < 0.001), while it provided better postoperative renal function (P < 0.001) compared with the conventional technique. The blocking method and tumour characteristics were independent factors affecting GFR reduction, while WIT was not an independent factor. Correlation analysis showed that estimated GFR presented better correlation with GFR compared with kidney volume (R² = 0.794 cf. R² = 0.199) in predicting renal function after LPN. LPN with segmental artery clamping minimizes warm ischaemia injury and provides better early postoperative renal function compared with clamping the main renal artery. Kidney volume has a significantly inferior role compared with eGFR in predicting preserved renal function. © 2016 The Authors BJU International © 2016 BJU International Published by John Wiley & Sons Ltd.

  6. Effects of soil and topographic factors on vegetation restoration in opencast coal mine dumps located in a loess area

    NASA Astrophysics Data System (ADS)

    Wang, Jinman; Wang, Hongdan; Cao, Yingui; Bai, Zhongke; Qin, Qian

    2016-02-01

    Vegetation plays an important role in improving and restoring fragile ecological environments. In the Antaibao opencast coal mine, located in a loess area, the eco-environment has been substantially disturbed by mining activities, and the relationship between the vegetation and environmental factors is not very clear. Therefore, it is crucial to understand the effects of soil and topographic factors on vegetation restoration to improve the fragile ecosystems of damaged land. An investigation of the soil, topography and vegetation in 50 reclamation sample plots in Shanxi Pingshuo Antaibao opencast coal mine dumps was performed. Statistical analyses in this study included one-way ANOVA and significance testing using SPSS 20.0, and multivariate techniques of detrended correspondence analysis (DCA) and redundancy analysis (RDA) using CANOCO 4.5. The RDA revealed the environmental factors that affected vegetation restoration. Various vegetation and soil variables were significantly correlated. The available K and rock content were good explanatory variables, and they were positively correlated with tree volume. The effects of the soil factors on vegetation restoration were higher than those of the topographic factors.
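
    The redundancy analysis (RDA) used above can be sketched as a principal component analysis of the fitted values from a multivariate regression of the vegetation matrix on the environmental matrix. The variables below are synthetic assumptions, not the study's plot data.

```python
import numpy as np

# Synthetic stand-ins: 50 plots, 3 environmental variables, 4 vegetation variables.
rng = np.random.default_rng(5)
n = 50                                          # sample plots, as in the study
X = rng.normal(size=(n, 3))                     # e.g. available K, rock content, slope
B = rng.normal(size=(3, 4))
Y = X @ B + rng.normal(scale=0.5, size=(n, 4))  # vegetation variables

# Center, regress Y on X, and take principal components of the fitted values.
Xc, Yc = X - X.mean(0), Y - Y.mean(0)
coef, *_ = np.linalg.lstsq(Xc, Yc, rcond=None)
Y_fit = Xc @ coef
constrained_share = (Y_fit ** 2).sum() / (Yc ** 2).sum()  # variance explained by X
s = np.linalg.svd(Y_fit, compute_uv=False)                # RDA axes
axis_share = s ** 2 / (s ** 2).sum()
```

    The constrained share plays the role of the variance the environmental factors explain; the loadings on the leading axes identify explanatory variables such as available K and rock content.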

  7. Effects of soil and topographic factors on vegetation restoration in opencast coal mine dumps located in a loess area

    PubMed Central

    Wang, Jinman; Wang, Hongdan; Cao, Yingui; Bai, Zhongke; Qin, Qian

    2016-01-01

    Vegetation plays an important role in improving and restoring fragile ecological environments. In the Antaibao opencast coal mine, located in a loess area, the eco-environment has been substantially disturbed by mining activities, and the relationship between the vegetation and environmental factors is not very clear. Therefore, it is crucial to understand the effects of soil and topographic factors on vegetation restoration to improve the fragile ecosystems of damaged land. An investigation of the soil, topography and vegetation in 50 reclamation sample plots in Shanxi Pingshuo Antaibao opencast coal mine dumps was performed. Statistical analyses in this study included one-way ANOVA and significance testing using SPSS 20.0, and multivariate techniques of detrended correspondence analysis (DCA) and redundancy analysis (RDA) using CANOCO 4.5. The RDA revealed the environmental factors that affected vegetation restoration. Various vegetation and soil variables were significantly correlated. The available K and rock content were good explanatory variables, and they were positively correlated with tree volume. The effects of the soil factors on vegetation restoration were higher than those of the topographic factors. PMID:26916152

  8. Effects of soil and topographic factors on vegetation restoration in opencast coal mine dumps located in a loess area.

    PubMed

    Wang, Jinman; Wang, Hongdan; Cao, Yingui; Bai, Zhongke; Qin, Qian

    2016-02-26

    Vegetation plays an important role in improving and restoring fragile ecological environments. In the Antaibao opencast coal mine, located in a loess area, the eco-environment has been substantially disturbed by mining activities, and the relationship between the vegetation and environmental factors is not very clear. Therefore, it is crucial to understand the effects of soil and topographic factors on vegetation restoration to improve the fragile ecosystems of damaged land. An investigation of the soil, topography and vegetation in 50 reclamation sample plots in Shanxi Pingshuo Antaibao opencast coal mine dumps was performed. Statistical analyses in this study included one-way ANOVA and significance testing using SPSS 20.0, and multivariate techniques of detrended correspondence analysis (DCA) and redundancy analysis (RDA) using CANOCO 4.5. The RDA revealed the environmental factors that affected vegetation restoration. Various vegetation and soil variables were significantly correlated. The available K and rock content were good explanatory variables, and they were positively correlated with tree volume. The effects of the soil factors on vegetation restoration were higher than those of the topographic factors.

  9. POPA: A Personality and Object Profiling Assistant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dreicer, J.S.

    POPA: A Personality and Object Profiling Assistant system utilizes an extension and variation of a process developed for decision analysis as a tool to quantify intuitive feelings and subjective judgments. The technique is based on a manipulation of the Analytical Hierarchy Process. The POPA system models an individual in terms of his character type, life orientation, and incentive (motivational) factors. Then an object (i.e., individual, project, situation, or policy) is modeled with respect to its three most important factors. The individual and object models are combined to indicate the influence each of the three object factors has on the individual. We have investigated this problem: 1) to develop a technique that models personality types in a quantitative and organized manner, 2) to develop a tool capable of evaluating the probable success of obtaining funding for proposed programs at Los Alamos National Laboratory, 3) to determine the feasibility of quantifying feelings and intuition, and 4) to better understand subjective knowledge acquisition (especially intuition). 49 refs., 10 figs., 5 tabs.
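
    The Analytical Hierarchy Process step that POPA builds on derives priority weights as the normalized principal eigenvector of a pairwise comparison matrix. The 3x3 matrix below is a hypothetical comparison of three incentive factors, not POPA's actual model.

```python
import numpy as np

def ahp_priorities(pairwise, iters=200):
    """Power iteration for the principal eigenvector, normalized to sum to 1."""
    w = np.ones(pairwise.shape[0])
    for _ in range(iters):
        w = pairwise @ w
        w /= w.sum()
    return w

# Hypothetical judgments: factor A is 3x as important as B and 5x as important
# as C; B is 2x as important as C. Reciprocals fill the lower triangle.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])
weights = ahp_priorities(A)
```

    The resulting weights quantify the subjective judgments, which is exactly the role the AHP manipulation plays in POPA's individual and object models.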

  10. Insufficient Knowledge of Breast Cancer Risk Factors Among Malaysian Female University Students

    PubMed Central

    Samah, Asnarulkhadi Abu; Ahmadian, Maryam; Latiff, Latiffah A.

    2016-01-01

    Background: Despite continuing debate about the efficacy of breast self-examination, it can still be a life-saving technique by inspiring and empowering women to take better control over their body/breast and health. This study investigated Malaysian female university students' knowledge about breast cancer risk factors, signs, and symptoms and assessed breast self-examination frequency among students. Method: A cross-sectional survey was conducted in 2013 in nine public and private universities in the Klang Valley and Selangor. A total of 842 female students responded to the self-administered survey. Simple descriptive and inferential statistics were employed for data analysis. Results: The uptake of breast self-examination (BSE) was less than 50% among the students. Most students had insufficient knowledge of several breast cancer risk factors. Conclusion: Actions and efforts should be taken to increase knowledge of breast cancer through the development of ethnically and culturally sensitive educational training on BSE and breast cancer literacy. PMID:26234996

  11. Physiological ICSI (PICSI) vs. Conventional ICSI in Couples with Male Factor: A Systematic Review.

    PubMed

    Avalos-Durán, Georgina; Ángel, Ana María Emilia Cañedo-Del; Rivero-Murillo, Juana; Zambrano-Guerrero, Jaime Enoc; Carballo-Mondragón, Esperanza; Checa-Vizcaíno, Miguel Ángel

    2018-04-19

    To determine the efficacy of the physiological ICSI technique (PICSI) vs. conventional ICSI in couples with male factor infertility, with respect to the following outcome measures: live birth, clinical pregnancy, implantation, embryo quality, fertilization and miscarriage rates. A systematic review of the literature, extracting raw data and performing data analysis. Patient(s): couples with male factor infertility who underwent in-vitro fertilization. Main Outcome Measures: rates of live births, clinical pregnancy, implantation, embryo quality, fertilization and miscarriage. In the systematic search, we found 2,918 studies and an additional study from other sources; only two studies fulfilled the inclusion criteria for this systematic review. The rates of live births, clinical pregnancy, implantation, embryo quality, fertilization and miscarriage were similar for both groups. There was no statistically significant difference between PICSI and ICSI for any of the outcomes analyzed in this study. There is still not enough information available to prove the efficacy of the PICSI technique over ICSI in couples with male factor infertility.

  12. Application of optimization techniques to vehicle design: A review

    NASA Technical Reports Server (NTRS)

    Prasad, B.; Magee, C. L.

    1984-01-01

    The work that has been done in the last decade or so in the application of optimization techniques to vehicle design is discussed. Much of the work reviewed deals with the design of body or suspension (chassis) components for reduced weight. Also reviewed are studies dealing with system optimization problems for improved functional performance, such as ride or handling. In reviewing the work on the use of optimization techniques, one notes the transition from rare mention of the methods in the 1970s to an increased effort in the early 1980s. Efficient and convenient optimization and analysis tools still need to be developed so that they can be regularly applied in the early design stage of the vehicle development cycle to be most effective. Based on the reported applications, an attempt is made to assess the potential for automotive application of optimization techniques. The major issue involved remains the creation of quantifiable means of analysis to be used in vehicle design. The conventional process of vehicle design still contains much experience-based input because it has not yet proven possible to quantify all important constraints. This limitation of the analysis will continue to be a major factor restricting the application of optimization to vehicle design.

  13. Reducing the blame culture through clinical audit in nuclear medicine: a mixed methods study.

    PubMed

    Ross, P; Hubert, J; Wong, W L

    2017-02-01

    To identify the barriers and facilitators of doctors' engagement with clinical audit and to explore how and why these factors influenced doctors' decisions to engage with the NHS National Clinical Audit Programme. A single-embedded case study. Mixed methods sequential approach with explorative pilot study and follow-up survey. The pilot study comprised 13 semi-structured interviews with purposefully selected consultant doctors over a six-month period. Interview data were coded and analysed using directed thematic content analysis, with themes compared against the study's propositions. Themes derived from the pilot study informed the online survey question items. Exploratory factor analysis using Stata and descriptive statistical methods were applied to summarise findings. Data triangulation techniques were used to corroborate and validate findings across the different methodological techniques. NHS National PET-CT Clinical Audit Programme. Doctors reporting on the Audit Programme. Extent of engagement with clinical audit; factors that influence engagement with clinical audit. Online survey: 58/59 doctors responded (98.3%). Audit was found to be initially threatening (79%); audit was reassuring (85%); audit helped validate professional competence (93%); participation in audit improved reporting skills (76%). Three key factors accounted for 97.6% of the variance in survey responses: (1) perception of audit's usefulness, (2) a common purpose, (3) a supportive, blame-free culture of trust. Factor 1 influenced medical engagement most. The study documents performance feedback as a key facilitator of medical engagement with clinical audit. It found that medical engagement with clinical audit was associated with reduced levels of professional anxiety and higher levels of perceived self-efficacy.

  14. Early diagnosis of tongue malignancy using laser induced fluorescence spectroscopy technique

    NASA Astrophysics Data System (ADS)

    Patil, Ajeetkumar; Unnikrishnan V., K.; Ongole, Ravikiran; Pai, Keerthilatha M.; Kartha, V. B.; Chidangil, Santhosh

    2015-07-01

    Oral cancer together with pharyngeal cancer is the sixth most common malignancy reported worldwide, and one with a high mortality ratio among all malignancies [1]. Worldwide, 450,000 new cases were estimated in 2014 [2]. About 90% are squamous cell carcinomas (SCC). SCC of the tongue is the most common oral malignancy, accounting for approximately 40% of all oral carcinomas. One of the most important factors for successful therapy of any malignancy is early diagnosis. Although considerable progress has been made in understanding the cellular and molecular mechanisms of tumorigenesis, the lack of reliable diagnostic methods for early detection, leading to delays in therapy, is an important factor responsible for the increase in the mortality rate in various types of cancer. Spectroscopy techniques are extremely sensitive for the analysis of biochemical changes in cellular systems. These techniques can provide valuable information on alterations that occur during the development of cancer. This is especially important in oral cancer, where "tumor detection is complicated by a tendency towards field cancerization, leading to multi-centric lesions" and "current techniques detect malignant change too late" [3], and "biopsies are not representative of the whole premalignant lesion" [4].

  15. Laplace-SGBEM analysis of the dynamic stress intensity factors and the dynamic T-stress for the interaction between a crack and auxetic inclusions

    NASA Astrophysics Data System (ADS)

    Kwon, Kibum

    A dynamic analysis of the interaction between a crack and an auxetic (negative Poisson's ratio)/non-auxetic inclusion is presented. The two most important fracture parameters, namely the stress intensity factors and the T-stress, are analyzed by using the symmetric Galerkin boundary element method in the Laplace domain for three different models of crack-inclusion interaction. To investigate the effects of auxetic inclusions on the fracture behavior of composites reinforced by this new type of material, comparisons of the dynamic stress intensity factors and the dynamic T-stress are made between the use of auxetic inclusions and the use of traditional inclusions. Furthermore, the technique presented in this research can be employed to analyze the interaction between a crack and a cluster of auxetic/non-auxetic inclusions. Results from the latter models can be employed in crack growth analysis in auxetic-fiber-reinforced composites.

  16. Exploratory factor analysis of self-reported symptoms in a large, population-based military cohort

    PubMed Central

    2010-01-01

    Background US military engagements have consistently raised concern over the array of health outcomes experienced by service members postdeployment. Exploratory factor analysis has been used in studies of 1991 Gulf War-related illnesses, and may increase understanding of symptoms and health outcomes associated with current military conflicts in Iraq and Afghanistan. The objective of this study was to use exploratory factor analysis to describe the correlations among numerous physical and psychological symptoms in terms of a smaller number of unobserved variables or factors. Methods The Millennium Cohort Study collects extensive self-reported health data from a large, population-based military cohort, providing a unique opportunity to investigate the interrelationships of numerous physical and psychological symptoms among US military personnel. This study used data from the Millennium Cohort Study, a large, population-based military cohort. Exploratory factor analysis was used to examine the covariance structure of symptoms reported by approximately 50,000 cohort members during 2004-2006. Analyses incorporated 89 symptoms, including responses to several validated instruments embedded in the questionnaire. Techniques accommodated the categorical and sometimes incomplete nature of the survey data. Results A 14-factor model accounted for 60 percent of the total variance in symptoms data and included factors related to several physical, psychological, and behavioral constructs. A notable finding was that many factors appeared to load in accordance with symptom co-location within the survey instrument, highlighting the difficulty in disassociating the effects of question content, location, and response format on factor structure. Conclusions This study demonstrates the potential strengths and weaknesses of exploratory factor analysis to heighten understanding of the complex associations among symptoms. 
Further research is needed to investigate the relationship between factor analytic results and survey structure, as well as to assess the relationship between factor scores and key exposure variables. PMID:20950474
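    The extraction step described above can be sketched with synthetic data: a principal-axis style factor extraction from the correlation matrix, reporting the proportion of total variance explained. The study itself used categorical-data techniques on 89 symptoms; the data, loadings, and two-factor structure below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic symptom data: two latent factors driving six observed symptoms.
n = 500
f = rng.normal(size=(n, 2))                      # latent factors
loadings_true = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                          [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
x = f @ loadings_true.T + 0.4 * rng.normal(size=(n, 6))

# Principal-axis style extraction from the correlation matrix.
r = np.corrcoef(x, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(r)
order = np.argsort(eigvals)[::-1]                # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 2                                            # retained factors (Kaiser: eigenvalue > 1)
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])
explained = eigvals[:k].sum() / eigvals.sum()    # proportion of total variance
print(f"variance explained by {k} factors: {explained:.2f}")
```

In a real analysis the retained-factor count would be chosen by scree inspection or parallel analysis rather than fixed in advance, and a rotation (e.g. varimax) would usually follow extraction.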

  17. Zero mortality in more than 300 hepatic resections: validity of preoperative volumetric analysis.

    PubMed

    Itoh, Shinji; Shirabe, Ken; Taketomi, Akinobu; Morita, Kazutoyo; Harimoto, Norifumi; Tsujita, Eiji; Sugimachi, Keishi; Yamashita, Yo-Ichi; Gion, Tomonobu; Maehara, Yoshihiko

    2012-05-01

    We reviewed a series of patients who underwent hepatic resection at our institution, to investigate the risk factors for postoperative complications after hepatic resection of liver tumors and for procurement of living donor liver transplantation (LDLT) grafts. Between April 2004 and August 2007, we performed 304 hepatic resections for liver tumors or to procure grafts for LDLT. Preoperative volumetric analysis was done using 3-dimensional computed tomography (3D-CT) prior to major hepatic resection. We compared the clinicopathological factors between patients with and without postoperative complications. There was no operative mortality. According to the 3D-CT volumetry, the mean error ratio between the actual and the estimated remnant liver volume was 13.4%. Postoperative complications developed in 96 (31.6%) patients. According to logistic regression analysis, histological liver cirrhosis and intraoperative blood loss >850 mL were significant risk factors of postoperative complications after hepatic resection. Meticulous preoperative evaluation based on volumetric analysis, together with sophisticated surgical techniques, achieved zero mortality and minimized intraoperative blood loss, which was classified as one of the most significant predictors of postoperative complications after major hepatic resection.
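    A logistic regression of the kind used in this risk-factor analysis can be sketched in a few lines; the cohort, effect sizes, and factor names below are hypothetical stand-ins, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic cohort: complication risk driven by two binary factors
# (hypothetical stand-ins for cirrhosis and blood loss > 850 mL).
n = 2000
cirrhosis = rng.binomial(1, 0.3, n)
high_loss = rng.binomial(1, 0.25, n)
logit = -2.0 + 1.0 * cirrhosis + 0.8 * high_loss
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Logistic regression fitted by plain gradient descent on the mean
# negative log-likelihood.
X = np.column_stack([np.ones(n), cirrhosis, high_loss])
w = np.zeros(3)
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / n

odds_ratios = np.exp(w[1:])   # adjusted odds ratios for the two factors
print(odds_ratios)
```

The exponentiated coefficients recover the adjusted odds ratios reported in studies like this one; a production analysis would use a statistics package that also provides confidence intervals and p-values.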

  18. Spatial data analysis and the use of maps in scientific health articles.

    PubMed

    Nucci, Luciana Bertoldi; Souccar, Patrick Theodore; Castilho, Silvia Diez

    2016-07-01

    Despite the growing number of studies featuring elements of spatial analysis, the application of these techniques is not always clear, and their continued use in epidemiological studies requires careful evaluation. The aim was to assess the spread and use of these methods in national and international scientific papers. Periodicals were assessed according to their impact index. Among 8,281 journals surveyed, four national and four international journals were selected, from which 1,274 articles were analyzed for the presence or absence of spatial analysis techniques. Just over 10% of articles published in 2011 in high-impact journals, both national and international, included some element of geographical location. Although these percentages vary greatly from one journal to another, denoting different publication profiles, we consider this percentage an indication that location variables have become an important factor in health studies.

  19. Comparative study of two approaches to model the offshore fish cages

    NASA Astrophysics Data System (ADS)

    Zhao, Yun-peng; Wang, Xin-xin; Decew, Jud; Tsukrov, Igor; Bai, Xiao-dong; Bi, Chun-wei

    2015-06-01

    The goal of this paper is to provide a comparative analysis of two commonly used approaches to discretize offshore fish cages: the lumped-mass approach and the finite element technique. Two case studies are chosen to compare predictions of the LMA (lumped-mass approach) and FEA (finite element analysis) based numerical modeling techniques. In both case studies, we consider several loading conditions consisting of different uniform currents and monochromatic waves. We investigate the motion of the cage, its deformation, and the resultant tension in the mooring lines. Both models' predictions are sufficiently close to the experimental data, but for the first experiment, the DUT-FlexSim predictions are slightly more accurate than the ones provided by Aqua-FE™. According to the comparisons, both models can be successfully applied to the design and analysis of offshore fish cages provided that an appropriate safety factor is chosen.

  20. Economic Analysis in the Pacific Northwest Land Resources Project: Theoretical Considerations and Preliminary Results

    NASA Technical Reports Server (NTRS)

    Morse, D. R. A.; Sahlberg, J. T.

    1977-01-01

    The Pacific Northwest Land Resources Inventory Demonstration Project is an attempt to combine a whole spectrum of heterogeneous geographic, institutional and applications elements in a synergistic approach to the evaluation of remote sensing techniques. This diversity is the prime motivating factor behind a theoretical investigation of alternative economic analysis procedures. For a multitude of reasons--simplicity, ease of understanding, financial constraints and credibility, among others--cost-effectiveness emerges as the most practical tool for conducting such evaluation determinations in the Pacific Northwest. Preliminary findings in two water resource application areas suggest, in conformity with most published studies, that Landsat-aided data collection methods enjoy substantial cost advantages over alternative techniques. The potential for sensitivity analysis based on cost/accuracy tradeoffs is considered on a theoretical plane in the absence of current accuracy figures concerning the Landsat-aided approach.

  1. Tutorial: Advanced fault tree applications using HARP

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta; Bavuso, Salvatore J.; Boyd, Mark A.

    1993-01-01

    Reliability analysis of fault tolerant computer systems for critical applications is complicated by several factors. These modeling difficulties are discussed and dynamic fault tree modeling techniques for handling them are described and demonstrated. Several advanced fault tolerant computer systems are described, and fault tree models for their analysis are presented. HARP (Hybrid Automated Reliability Predictor) is a software package developed at Duke University and NASA Langley Research Center that is capable of solving the fault tree models presented.
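    The static core of fault-tree evaluation can be sketched directly; HARP's dynamic gates (sequence dependence, spares) go beyond this, and the system and failure probabilities below are hypothetical.

```python
# A minimal static fault-tree evaluator assuming independent basic events.
# Dynamic gates (sequence dependence, cold/warm spares) handled by tools
# like HARP are beyond this sketch.

def p_and(*probs):
    """AND gate: all inputs must fail."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(*probs):
    """OR gate: any input failing causes the output (computed via the
    complement of all-inputs-succeed)."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical system: two redundant processors (both must fail) OR a
# shared power supply failure brings the system down.
cpu_a, cpu_b, power = 1e-3, 1e-3, 1e-5
top = p_or(p_and(cpu_a, cpu_b), power)
print(top)  # ~1.1e-05
```

Note that the OR-gate formula is exact only for independent events; shared-cause failures are one of the modeling difficulties the tutorial's dynamic fault-tree techniques address.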

  2. Behaviour and Analysis of Mechanically Fastened Joints in Composite Structures

    DTIC Science & Technology

    1988-03-01

    [Abstract unavailable; the DTIC record text is OCR-garbled. Recoverable fragments are reference entries, including "Safety Factors for Use When Designing Bolted Joints in GRP," Composites, April 1979, and Dastin, S., "Joining and Machining Techniques..."]

  3. A comparison of TSS and TRASYS in form factor calculation

    NASA Technical Reports Server (NTRS)

    Golliher, Eric

    1993-01-01

    As the workstation and personal computer become more popular than a centralized mainframe to perform thermal analysis, the methods for space vehicle thermal analysis will change. Already, many thermal analysis codes are now available for workstations, which were not in existence just five years ago. As these changes occur, some organizations will adopt the new codes and analysis techniques, while others will not. This might lead to misunderstandings between thermal shops in different organizations. If thermal analysts make an effort to understand the major differences between the new and old methods, a smoother transition to a more efficient and more versatile thermal analysis environment will be realized.

  4. [Application of text mining approach to pre-education prior to clinical practice].

    PubMed

    Koinuma, Masayoshi; Koike, Katsuya; Nakamura, Hitoshi

    2008-06-01

    We developed a new survey analysis technique to understand students' actual aims for effective pretraining prior to clinical practice. We asked third-year undergraduate students to write fixed-style complete and free sentences on "preparation of drug dispensing." We then converted their sentence data into text format and performed Japanese-language morphologic analysis on the data using language analysis software. We classified key words, which were created on the basis of the word-class information from the Japanese-language morphologic analysis, into categories based on causes and characteristics. In addition, we classified the characteristics into six categories comprising concepts including "knowledge," "skill and attitude," "image," etc., with the KJ method technique. The results showed that the awareness of students of "preparation of drug dispensing" tended to be approximately three-fold more frequent in "skill and attitude," "risk," etc. than in "knowledge." Regarding the characteristics in the category of the "image," words like "hard," "challenging," "responsibility," "life," etc. frequently occurred. The results of correspondence analysis showed that the characteristics of the words "knowledge" and "skills and attitude" were independent. As the result of developing a cause-and-effect diagram, it was demonstrated that the phrase "hanging tough" described most of the various factors. We thus could understand students' actual feelings by applying text mining as a new survey analysis technique.

  5. New approaches to the analysis of complex samples using fluorescence lifetime techniques and organized media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertz, P.R.

    Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.

  6. Hot mill process parameters impacting on hot mill tertiary scale formation

    NASA Astrophysics Data System (ADS)

    Kennedy, Jonathan Ian

    For high-end steel applications surface quality is paramount to deliver a suitable product. A major cause of surface quality issues is the formation of tertiary scale. Scale formation depends on numerous factors such as thermo-mechanical processing routes, chemical composition, thickness and the rolls used. This thesis utilises a collection of data mining techniques to better understand the influence of Hot Mill process parameters on scale formation at Port Talbot Hot Strip Mill in South Wales. The dataset to which these data mining techniques were applied was carefully chosen to reduce process variation. Several main factors were considered to minimise this variability, including the time period, grade and gauge investigated. The following data mining techniques were chosen to investigate this dataset: Partial Least Squares (PLS); Logit Analysis; Principal Component Analysis (PCA); Multinomial Logistic Regression (MLR); Adaptive Neuro-Fuzzy Inference Systems (ANFIS). The analysis indicated that the most significant variable for scale formation is the temperature entering the finishing mill. If the temperature is controlled on entering the finishing mill, scale will not be formed. Values greater than 1070 °C for the average Roughing Mill and above 1050 °C for the average Crop Shear temperature are considered high, with values greater than this increasing the chance of scale formation. As the temperature increases more scale suppression measures are required to limit scale formation, with high temperatures more likely to generate a greater amount of scale even with fully functional scale suppression systems in place. Chemistry is also a significant factor in scale formation, with phosphorus being the most significant of the chemistry variables. It is recommended that the chemistry specification for phosphorus be limited to a maximum value of 0.015% rather than 0.020% to limit scale formation. Slabs with higher values should be treated with particular care when being processed through the Hot Mill to limit scale formation.
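    The first of the listed techniques, PLS, can be sketched as a one-component NIPALS fit: the weight vector maximises covariance between the (autoscaled) process parameters and the response. The variables and coefficients below are illustrative stand-ins, not Port Talbot data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic mill data: scale severity driven mainly by finishing-mill
# entry temperature (illustrative stand-ins, not real mill values).
n = 300
temp = rng.normal(1060, 15, n)       # entry temperature, degC
phos = rng.normal(0.012, 0.004, n)   # phosphorus, %
gauge = rng.normal(3.0, 0.5, n)      # strip gauge, mm (irrelevant here)
y = 0.05 * (temp - 1050) + 40 * phos + 0.1 * rng.normal(size=n)

X = np.column_stack([temp, phos, gauge])
X = (X - X.mean(0)) / X.std(0)       # autoscale predictors
yc = y - y.mean()

# One-component PLS1 (NIPALS): weight vector proportional to X'y.
w = X.T @ yc
w /= np.linalg.norm(w)
t = X @ w                            # scores
b = (t @ yc) / (t @ t)               # regression of y on the score

print("PLS weights (temp, P, gauge):", np.round(w, 3))
```

The dominant weight on temperature mirrors the thesis's finding that finishing-mill entry temperature is the most significant variable; further components would be extracted by deflating X and y and repeating.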

  7. Low pacemaker incidence with continuous-sutured valves: a retrospective analysis.

    PubMed

    Niclauss, Lars; Delay, Dominique; Pfister, Raymond; Colombier, Sebastien; Kirsch, Matthias; Prêtre, René

    2017-06-01

    Background Permanent pacemaker implantation after surgical aortic valve replacement depends on patient selection and risk factors for conduction disorders. We aimed to identify risk criteria and obtain a selected group comparable to patients assigned to transcatheter aortic valve implantation. Methods Isolated sutured aortic valve replacements in 994 patients treated from 2007 to 2015 were reviewed. Demographics, hospital stay, preexisting conduction disorders, surgical technique, and etiology in patients with and without permanent pacemaker implantation were compared. Reported outcomes after transcatheter aortic valve implantation were compared with those of a subgroup including only degenerative valve disease and first redo. Results The incidence of permanent pacemaker implantation was 2.9%. Longer hospital stay (p = 0.01), preexisting rhythm disorders (p < 0.001), complex prosthetic endocarditis (p = 0.01), and complex redo (p < 0.001) were associated with permanent pacemaker implantation. Although prostheses were sutured with continuous monofilament in the majority of cases (86%), interrupted pledgetted sutures were used more often in the pacemaker group (p = 0.002). In the subgroup analysis, the incidence of permanent pacemaker implantation was 2%; preexisting rhythm disorders and the suture technique were still major risk factors. Conclusion Permanent pacemaker implantation depends on etiology, preexisting rhythm disorders, and suture technique, and the 2% incidence compares favorably with the reported 5- to 10-fold higher incidence after transcatheter aortic valve implantation. Cost analysis should take this into account. Often dismissed as a minor complication, permanent pacemaker implantation increases the risks of endocarditis, impaired myocardial recovery, and higher mortality if associated with prosthesis regurgitation.

  8. [SWOT analysis: the analytical method in the process of planning and its application in the development of orthopaedic hospital department].

    PubMed

    Terzić, Zorica; Vukasinović, Zoran; Bjegović-Mikanović, Vesna; Jovanović, Vesna; Janicić, Radmila

    2010-01-01

    SWOT analysis is a managerial tool used to evaluate the internal and external environment through strengths, weaknesses, opportunities and threats. The aim was to demonstrate the application of SWOT analysis on the example of the Department for Paediatric Orthopaedics and Traumatology at the Institute of Orthopaedic Surgery "Banjica" in Belgrade. Qualitative research was conducted during December 2008 at the Department for Paediatric Orthopaedics and Traumatology of the Institute of Orthopaedic Surgery "Banjica" by applying the focus group technique. Participants were members of the medical staff and patients. In the first phase of the focus group, brainstorming was applied to collect the factors of the internal and external environment and to identify strengths and weaknesses, opportunities and threats, respectively. In the second phase the nominal group technique was applied in order to reduce the list of factors. The factors were assessed according to their influence on the Department and ranked on a three-point Likert scale from 3 (highest impact) to 1 (lowest impact). The most important strengths of the Department are: competent and skilled staff, high quality of services, average hospital bed utilization, the Department providing the educational basis of the School of Medicine, satisfied patients, pleasant setting, and additional working hours. The weaknesses are: poor spatial organization, personnel unmotivated to refresh knowledge, lack of specifically trained personnel, inadequate sanitary facilities, services not covered by the Insurance Fund, long average hospital stay, and low economic status of patients. The opportunities are: legislative regulations, a paediatric traumatology service formed at the City level, good regional position of the Institute, and extension of referral areas. The threats are: lack of Department autonomy in the personnel policy of the Institute, competition within the Institute, inability to increase the Department's capacities, inadequate nutrition, few opportunities for expert training of personnel, outdated equipment, and presence of informal payments. SWOT analysis is a frequently used managerial instrument which enables a systematic approach in the decision-making process.

  9. Improved Variable Selection Algorithm Using a LASSO-Type Penalty, with an Application to Assessing Hepatitis B Infection Relevant Factors in Community Residents

    PubMed Central

    Guo, Pi; Zeng, Fangfang; Hu, Xiaomin; Zhang, Dingmei; Zhu, Shuming; Deng, Yu; Hao, Yuantao

    2015-01-01

    Objectives In epidemiological studies, it is important to identify independent associations between collective exposures and a health outcome. The current stepwise selection technique ignores stochastic errors and suffers from a lack of stability. The alternative LASSO-penalized regression model can be applied to detect significant predictors from a pool of candidate variables. However, this technique is prone to false positives and tends to create excessive biases. It remains challenging to develop robust variable selection methods and enhance predictability. Material and methods Two improved algorithms denoted the two-stage hybrid and bootstrap ranking procedures, both using a LASSO-type penalty, were developed for epidemiological association analysis. The performance of the proposed procedures and other methods including conventional LASSO, Bolasso, stepwise and stability selection models were evaluated using intensive simulation. In addition, methods were compared by using an empirical analysis based on large-scale survey data of hepatitis B infection-relevant factors among Guangdong residents. Results The proposed procedures produced comparable or less biased selection results when compared to conventional variable selection models. In total, the two newly proposed procedures were stable with respect to various scenarios of simulation, demonstrating a higher power and a lower false positive rate during variable selection than the compared methods. In empirical analysis, the proposed procedures yielding a sparse set of hepatitis B infection-relevant factors gave the best predictive performance and showed that the procedures were able to select a more stringent set of factors. The individual history of hepatitis B vaccination, family and individual history of hepatitis B infection were associated with hepatitis B infection in the studied residents according to the proposed procedures. 
Conclusions The newly proposed procedures improve the identification of significant variables and enable us to derive a new insight into epidemiological association analysis. PMID:26214802
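    The LASSO penalty underlying both proposed procedures can be sketched via cyclic coordinate descent with soft-thresholding; the data below are simulated with a sparse ground truth, not the Guangdong survey.

```python
import numpy as np

rng = np.random.default_rng(3)

# Sparse ground truth: only 3 of 20 candidate exposures truly matter.
n, p = 200, 20
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [1.5, -1.0, 0.8]
y = X @ beta_true + 0.5 * rng.normal(size=n)

def lasso_cd(X, y, lam, n_iter=200):
    """LASSO via cyclic coordinate descent with soft-thresholding."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            resid = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ resid
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

beta = lasso_cd(X, y, lam=n * 0.1)
selected = np.flatnonzero(np.abs(beta) > 1e-6)
print("selected predictors:", selected)
```

The two-stage hybrid and bootstrap ranking procedures in the paper wrap refinements around this base estimator (refitting and resampling, respectively) to reduce the false positives and shrinkage bias that plain LASSO exhibits.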

  10. [Microvascular decompression for trigeminal neuralgia: prognostic [corrected] factors].

    PubMed

    Alberione, F; Arena, A; Matera, R

    2008-06-01

    We describe our experience with MVD for typical trigeminal neuralgia and identify prognostic factors. A retrospective study of 89 cases treated between 1995 and 2005 was conducted. The prognostically significant data evaluated were: demographic data; duration of neuralgia; the affected trigeminal divisions; surgical findings; and the material used for the decompression. Data analysis was performed with the chi-square test. An excellent outcome was found in 77% of patients at one year. Age and a history of hypertension were not statistically significant. A poor outcome was observed for: female sex, neuralgia lasting longer than two years, involvement of all three divisions, venous compression, and muscle used as the surgical material. MVD is an effective and reliable technique. The use of muscle is not recommended. When all three trigeminal divisions are involved, another technique should be chosen.
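    The chi-square test used for this analysis can be sketched on a hypothetical 2x2 table; the counts below are illustrative, not the paper's data.

```python
import numpy as np

def chi2_stat(table):
    """Pearson chi-square statistic for a contingency table."""
    table = np.asarray(table, dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row @ col / table.sum()   # independence-model expected counts
    return ((table - expected) ** 2 / expected).sum()

# Hypothetical 2x2 table: outcome (excellent / poor) by compressing vessel
# (arterial / venous) -- illustrative counts only.
table = [[52, 8],
         [17, 12]]
stat = chi2_stat(table)
# df = (2-1)*(2-1) = 1; the 5% critical value for chi-square with 1 df is 3.841.
print(f"chi2 = {stat:.2f}, significant at 5%: {stat > 3.841}")
```

For a 2x2 table the degrees of freedom are 1, so the statistic is compared against 3.841 at the 5% level; with small expected counts a continuity correction or Fisher's exact test would be preferred.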

  11. Technique for information retrieval using enhanced latent semantic analysis generating rank approximation matrix by factorizing the weighted morpheme-by-document matrix

    DOEpatents

    Chew, Peter A; Bader, Brett W

    2012-10-16

    A technique for information retrieval includes parsing a corpus to identify a number of wordform instances within each document of the corpus. A weighted morpheme-by-document matrix is generated based at least in part on the number of wordform instances within each document of the corpus and based at least in part on a weighting function. The weighted morpheme-by-document matrix separately enumerates instances of stems and affixes. Additionally or alternatively, a term-by-term alignment matrix may be generated based at least in part on the number of wordform instances within each document of the corpus. At least one lower rank approximation matrix is generated by factorizing the weighted morpheme-by-document matrix and/or the term-by-term alignment matrix.
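    The factorization step can be sketched with a toy term-by-document matrix: apply a weighting function, then form the lower rank approximation by truncated SVD. The counts and the log(1 + tf) weighting below are illustrative; the patented technique enumerates stems and affixes separately in its weighted matrix.

```python
import numpy as np

# Toy term-by-document counts (rows: terms/morphemes, cols: documents).
A = np.array([[2, 0, 1, 0],
              [1, 0, 2, 0],
              [0, 3, 0, 1],
              [0, 1, 0, 2]], dtype=float)

# A simple log(1 + tf) weighting stands in for the patent's weighting function.
W = np.log1p(A)

# Rank-k approximation via truncated SVD -- the "lower rank approximation
# matrix" of latent semantic analysis.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
k = 2
W_k = U[:, :k] * s[:k] @ Vt[:k, :]

# Documents compared in the k-dimensional latent space (one row per document).
doc_vecs = s[:k] * Vt[:k, :].T
print(np.round(doc_vecs, 3))
```

Documents 0 and 2, which share terms, end up nearly parallel in the latent space, while documents with disjoint vocabulary are close to orthogonal; retrieval then ranks documents by cosine similarity to a query projected into the same space.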

  12. Line width measurement below 60 nm using an optical interferometer and artificial neural network

    NASA Astrophysics Data System (ADS)

    See, Chung W.; Smith, Richard J.; Somekh, Michael G.; Yacoot, Andrew

    2007-03-01

    We have recently described a technique for optical line-width measurement. The system is currently capable of measuring line-widths down to 60 nm with a precision of 2 nm, and potentially should be able to measure down to 10 nm. The system consists of an ultra-stable interferometer and artificial neural networks (ANNs). The former is used to generate optical profiles which are input to the ANNs. The outputs of the ANNs are the desired sample parameters. Different types of samples have been tested with equally impressive results. In this paper we will discuss the factors that are essential to extend the application of the technique. Two of these factors are signal conditioning and sample classification. Methods, including principal component analysis, that are capable of performing these tasks will be considered.

  13. Exploring Northwest China's agricultural water-saving strategy: analysis of water use efficiency based on an SE-DEA model conducted in Xi'an, Shaanxi Province.

    PubMed

    Mu, L; Fang, L; Wang, H; Chen, L; Yang, Y; Qu, X J; Wang, C Y; Yuan, Y; Wang, S B; Wang, Y N

    Worldwide, water scarcity threatens the delivery of water to urban centers. Increasing water use efficiency (WUE) is often recommended to reduce water demand, especially in water-scarce areas. In this paper, agricultural water use efficiency (AWUE) in Xi'an, Northwest China, is examined at temporal and spatial scales using the super-efficiency data envelopment analysis (DEA) approach. The grey systems analysis technique was then adopted to identify the factors that influenced the efficiency differentials under water scarcity. At the temporal scale, AWUE increased year by year during 2004-2012, peaking at 2.05 in 2009. At the spatial scale, AWUE was highest in the urban area. The key factors influencing AWUE are the financial situation and agricultural water-saving technology. Finally, we identified several knowledge gaps and proposed water-saving strategies for increasing AWUE and reducing water demand by: (1) improving irrigation practices (timing and amounts) based on compatible water-saving techniques; and (2) maximizing regional WUE by managing water resources and allocation at regional scales, as well as enhancing coordination among Chinese water governance institutions.
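
    The SE-DEA model itself is a linear program. A minimal sketch of an input-oriented CCR super-efficiency score, assuming constant returns to scale (the two-DMU data set is a toy example, not the paper's data):

```python
import numpy as np
from scipy.optimize import linprog

def super_efficiency(X, Y, o):
    """Input-oriented CCR super-efficiency score of DMU `o`.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs).  DMU `o` is removed
    from its own reference set, so efficient units can score above 1."""
    peers = [j for j in range(X.shape[0]) if j != o]
    Xp, Yp = X[peers], Y[peers]
    # variables z = (theta, lambda_1..lambda_m); minimise theta
    c = np.r_[1.0, np.zeros(len(peers))]
    # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o][:, None], Xp.T]
    b_in = np.zeros(X.shape[1])
    # outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Yp.T]
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[b_in, b_out], bounds=(0, None))
    return res.fun

X = np.array([[1.0], [2.0]])   # one input per DMU
Y = np.array([[1.0], [1.0]])   # one output per DMU
```

    Excluding the evaluated DMU from its own reference set is what lets efficient units score above 1 (e.g. the 2.05 reported in the study); this is how super-efficiency DEA ranks otherwise tied efficient units.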

  14. Demonstration of emulator-based Bayesian calibration of safety analysis codes: Theory and formulation

    DOE PAGES

    Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert

    2015-05-28

    System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This “function factorization” Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP-based process.
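
    The FFGP formulation is specific to the paper, but the core loop it accelerates — a cheap GP emulator evaluated inside an MCMC calibration — can be sketched with a standard GP and a Metropolis sampler. Everything below (the stand-in "code", the kernel settings, the single observation) is illustrative, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_code(theta):          # stand-in for a slow safety-analysis code
    return theta ** 2

# --- build a Gaussian-process emulator from a handful of code runs ---
X = np.linspace(0.0, 4.0, 9)
y = expensive_code(X)

def kernel(a, b, ell=1.0, var=10.0):
    return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

K = kernel(X, X) + 1e-6 * np.eye(len(X))     # jitter for numerical stability
alpha = np.linalg.solve(K, y)

def emulate(theta):                 # GP posterior mean: cheap to evaluate
    return kernel(np.atleast_1d(theta), X) @ alpha

# --- Metropolis MCMC calibration against one noisy observation ---
y_obs, sigma = 4.0, 0.1

def log_post(theta):
    if not 0.0 <= theta <= 4.0:     # flat prior on [0, 4]
        return -np.inf
    return -0.5 * ((y_obs - emulate(theta)[0]) / sigma) ** 2

theta, lp = 1.0, log_post(1.0)
samples = []
for _ in range(4000):
    prop = theta + 0.3 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)
posterior = np.array(samples[1000:])           # discard burn-in
```

    The point of the emulator is that `log_post` never calls the expensive code inside the sampling loop, which is what makes many thousands of MCMC steps affordable.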

  15. Factors associated with sealant outcome in 2 pediatric dental clinics: a multivariate hierarchical analysis.

    PubMed

    West, Nathan G; Ilief-Ala, Melina A; Douglass, Joanna M; Hagadorn, James I

    2011-01-01

    This study's purpose was to determine whether one-time sealants placed by pediatric dental residents vs dental students have different outcomes. The effect of isolation technique, behavior, duration of follow-up, and caries history was also examined. Records from 2 inner-city pediatric dental clinics were audited for 6- to 10-year-old patients with a permanent first molar sealant with at least 2 years of follow-up. A successful sealant was a one-time sealant that received no further treatment and was sealed or unsealed but not carious or restored at the final audit. Charts from 203 children with 481 sealants were audited. Of these, 281 sealants were failures. Univariate analysis revealed longer follow-up and younger age were associated with sealant failure. Operator type, child behavior, and isolation technique were not associated with sealant failure. After adjusting for follow-up duration, increased age at treatment reduced the odds of sealant failure while a history of caries reduced the protective effect of increased age. After adjusting for these factors, practitioner type, behavior, and type of isolation were not associated with sealant outcome in multivariate analysis. Age at sealant placement, history of caries prior to placement, and longer duration of follow-up are associated with sealant failure.

  16. [Delphi study to identify the management skills of nursing executives].

    PubMed

    Yañez, M R; Avila, J A; Bermudez, M I; De Miguel, I; Bellver, V; Guilabert, M; Mira, J J

    2016-01-01

    To determine and update the skills map for the position of Nurse Administrator in hospitals and Primary Care. An observational, descriptive, cross-sectional study based on a Delphi technique was conducted in hospital and Primary Care settings. Two nominal groups of 15 nurses each were used to define the contents of questionnaire 0 of the Delphi technique. All nurses registered in the professional associations of Alicante, Castellón and Valencia were invited to participate. The results of the Delphi study were submitted to factor analysis to identify the set of skills and, subsequently, to compare them with the post-graduate courses offered by colleges and universities during the 2014-15 academic year. Forty-five competences were extracted by the nominal groups. In total, 705 nurses replied to the first wave of the Delphi technique, and 394 to the second (a response rate of 56%). Factor analysis grouped the skills into 10 factors: managing people, conflict management, independent learning, ethics, emotional balance, commitment, self-discipline, continuous improvement, critical thinking, and innovation. Four skill groups identified in this study (emotional balance, commitment, self-discipline and courage) were not usually included in the post-graduate courses. Nurse administrator skills should be related to relational and ethical behaviour, and the training offered in post-graduate courses must be reoriented. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.

  17. [Development of an attitude-measurement questionnaire using the semantic differential technique: defining the attitudes of radiological technology students toward X-ray examination].

    PubMed

    Tamura, Naomi; Terashita, Takayoshi; Ogasawara, Katsuhiko

    2014-03-01

    In general, it is difficult to objectively evaluate the results of an educational program. The semantic differential (SeD) technique, a methodology used to measure the connotative meaning of objects, words, and concepts, can, however, be applied to the evaluation of students' attitudes. In this study, we aimed to achieve an objective evaluation of the effects of radiological technology education. We therefore investigated the attitudes of radiological technology students using the SeD technique, focusing on X-ray examinations in the field of radiological technology science. Bipolar adjective scales were used for the SeD questionnaire; to create it, appropriate adjectives were selected from past reports of X-ray examination practice. The participants were 32 senior students in the Division of Radiological Technology, Department of Health Sciences, School of Medicine, Hokkaido University. All the participants completed the questionnaire. The study was conducted in early June 2012. Attitudes toward X-ray examination were identified using a factor analysis of 11 adjectives, which revealed three attitudes: feelings of expectation, responsibility, and resistance. Knowledge regarding the attitudes that students hold toward X-ray examination will prove useful for evaluating the effects of educational interventions. A sampling bias may have occurred in this study due to the small sample size; however, no other biases were observed.

  18. An Examination of Sampling Characteristics of Some Analytic Factor Transformation Techniques.

    ERIC Educational Resources Information Center

    Skakun, Ernest N.; Hakstian, A. Ralph

    Two population raw data matrices were constructed by computer simulation techniques. Each consisted of 10,000 subjects and 12 variables, and each was constructed according to an underlying factorial model consisting of four major common factors, eight minor common factors, and 12 unique factors. The computer simulation techniques were employed to…

  19. The pitfalls of hair analysis for toxicants in clinical practice: three case reports.

    PubMed Central

    Frisch, Melissa; Schwartz, Brian S

    2002-01-01

    Hair analysis is used to assess exposure to heavy metals in patients presenting with nonspecific symptoms and is a commonly used procedure in patients referred to our clinic. We are frequently called on to evaluate patients who have health-related concerns as a result of hair analysis. Three patients first presented to outside physicians with nonspecific, multisystemic symptoms. A panel of analytes was measured in hair, and one or more values were interpreted as elevated. As a result of the hair analysis and other unconventional diagnostic tests, the patients presented to us believing they suffered from metal toxicity. In this paper we review the clinical efficacy of this procedure within the context of a patient population with somatic disorders and no clear risk factors for metal intoxication. We also review limitations of hair analysis in this setting; these limitations include patient factors such as low pretest probability of disease and test factors such as the lack of validation of analytic techniques, the inability to discern between exogenous contaminants and endogenous toxicants in hair, the variability of analytic procedures, low interlaboratory reliability, and the increased likelihood of false positive test results in the measurement of panels of analytes. PMID:11940463

  20. Resolving and quantifying overlapped chromatographic bands by transmutation

    PubMed

    Malinowski

    2000-09-15

    A new chemometric technique called "transmutation" is developed for the purpose of sharpening overlapped chromatographic bands in order to quantify the components. The "transmutation function" is created from the chromatogram of the pure component of interest, obtained from the same instrument, operating under the same experimental conditions used to record the unresolved chromatogram of the sample mixture. The method is used to quantify mixtures containing toluene, ethylbenzene, m-xylene, naphthalene, and biphenyl from unresolved chromatograms previously reported. The results are compared to those obtained using window factor analysis, rank annihilation factor analysis, and matrix regression analysis. Unlike the latter methods, the transmutation method is not restricted to two-dimensional arrays of data, such as those obtained from HPLC/DAD, but is also applicable to chromatograms obtained from single detector experiments. Limitations of the method are discussed.

  1. Comparative study of glass tube and mist chamber sampling techniques for the analysis of gaseous carbonyl compounds

    NASA Astrophysics Data System (ADS)

    François, Stéphanie; Perraud, Véronique; Pflieger, Maryline; Monod, Anne; Wortham, Henri

    In this work, glass tube and mist chamber sampling techniques using 2,4-dinitrophenylhydrazine as the derivatising agent for the analysis of gaseous carbonyl compounds are compared. Trapping efficiencies of formaldehyde, acetaldehyde, propionaldehyde, acetone, acrolein, glyoxal, crotonaldehyde, benzaldehyde, butyraldehyde and valeraldehyde are experimentally determined using a gas-phase generator. In addition, to generalise our results to all atmospheric gaseous compounds and derivative agents, theoretical trapping efficiencies and enrichment factors are expressed taking into account the mechanisms involved in the two kinds of traps. Theoretical and experimental results show that, as expected, the trapping efficiency of the glass tube depends mainly on the solubility of the compounds. The results provide new information and a better understanding of the phenomena occurring in the mist chamber and of the ability of this sampler to concentrate the samples. Hence, the mist chamber is the more convenient sampling method when trapping is combined with fast derivatisation of the compounds, whereas the glass tube technique must be used to trap atmospheric compounds without simultaneous derivatisation.

  2. Measurements of Cuspal Slope Inclination Angles in Palaeoanthropological Applications

    NASA Astrophysics Data System (ADS)

    Gaboutchian, A. V.; Knyaz, V. A.; Leybova, N. A.

    2017-05-01

    Tooth crown morphological features, studied in palaeoanthropology, provide valuable information about human evolution and the development of civilization. Tooth crown morphology represents biological and historical data of high taxonomical value, as it characterizes genetically conditioned tooth relief features that are resistant to substantial change under environmental factors during a lifetime. Palaeoanthropological studies are still based mainly on descriptive techniques and manual measurements of a limited number of morphological parameters, and feature evaluation and analysis of measurement results are expert-based. The development of new methods and techniques in 3D imaging creates a background for better palaeoanthropological data processing, analysis and distribution. The goals of the presented research are to propose new features for automated odontometry and to explore their applicability to palaeoanthropological studies. A technique for automated measurement of the morphological tooth parameters needed for anthropological study is developed. It is based on an original photogrammetric system, used as a device for acquiring 3D models of teeth, and on a set of algorithms for estimating the given tooth parameters.

  3. Unification of color postprocessing techniques for 3-dimensional computational mechanics

    NASA Technical Reports Server (NTRS)

    Bailey, Bruce Charles

    1985-01-01

    To facilitate the understanding of complex three-dimensional numerical models, advanced interactive color postprocessing techniques are introduced. These techniques are sufficiently flexible so that postprocessing difficulties arising from model size, geometric complexity, response variation, and analysis type can be adequately overcome. Finite element, finite difference, and boundary element models may be evaluated with the prototype postprocessor. Elements may be removed from parent models to be studied as independent subobjects. Discontinuous responses may be contoured including responses which become singular, and nonlinear color scales may be input by the user for the enhancement of the contouring operation. Hit testing can be performed to extract precise geometric, response, mesh, or material information from the database. In addition, stress intensity factors may be contoured along the crack front of a fracture model. Stepwise analyses can be studied, and the user can recontour responses repeatedly, as if he were paging through the response sets. As a system, these tools allow effective interpretation of complex analysis results.

  4. Two-dimensional fracture analysis of piezoelectric material based on the scaled boundary node method

    NASA Astrophysics Data System (ADS)

    Shen-Shen, Chen; Juan, Wang; Qing-Hua, Li

    2016-04-01

    A scaled boundary node method (SBNM) is developed for two-dimensional fracture analysis of piezoelectric material, which allows the stress and electric displacement intensity factors to be calculated directly and accurately. As a boundary-type meshless method, the SBNM employs the moving Kriging (MK) interpolation technique to approximate the unknown field in the circumferential direction, and therefore only a set of scattered nodes is required to discretize the boundary. As the shape functions satisfy the Kronecker delta property, no special techniques are required to impose the essential boundary conditions. In the radial direction, the SBNM seeks analytical solutions by making use of analytical techniques available for solving ordinary differential equations. Numerical examples are investigated and satisfactory solutions are obtained, which validates the accuracy and simplicity of the proposed approach. Project supported by the National Natural Science Foundation of China (Grant Nos. 11462006 and 21466012), the Foundation of Jiangxi Provincial Educational Committee, China (Grant No. KJLD14041), and the Foundation of East China Jiaotong University, China (Grant No. 09130020).

  5. A Novel approach for predicting monthly water demand by combining singular spectrum analysis with neural networks

    NASA Astrophysics Data System (ADS)

    Zubaidi, Salah L.; Dooley, Jayne; Alkhaddar, Rafid M.; Abdellatif, Mawada; Al-Bugharbee, Hussein; Ortega-Martorell, Sandra

    2018-06-01

    Valid and dependable water demand prediction is a major element of the effective and sustainable expansion of municipal water infrastructure. This study provides a novel approach to quantifying water demand through the assessment of climatic factors, using a combination of a signal pretreatment technique, a hybrid particle swarm optimisation algorithm and an artificial neural network (PSO-ANN). The Singular Spectrum Analysis (SSA) technique was adopted to decompose and reconstruct water consumption in relation to six weather variables, creating a seasonal and stochastic time series. The results revealed that SSA is a powerful technique, capable of decomposing the original time series into many independent components, including trend, oscillatory behaviours and noise. In addition, the PSO-ANN algorithm was shown to be a reliable prediction model, outperforming the hybrid Backtracking Search Algorithm (BSA-ANN) in terms of the fitness function (RMSE). The findings of this study also support the view that water demand is driven by climatological variables.
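
    The decomposition step can be sketched with basic SSA: embed the series in a trajectory matrix, take its SVD, and Hankelize each rank-1 term back into a series. The toy trend-plus-oscillation series below stands in for the water-consumption data:

```python
import numpy as np

def ssa_components(series, window):
    """Basic Singular Spectrum Analysis: embed the series into a
    trajectory matrix, take its SVD, and return one reconstructed
    series per elementary component (via anti-diagonal averaging)."""
    n = len(series)
    k = n - window + 1
    traj = np.column_stack([series[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    comps = []
    for j in range(len(s)):
        elem = s[j] * np.outer(U[:, j], Vt[j])       # rank-1 piece of traj
        # average each anti-diagonal back to a series entry
        rec = np.array([np.mean(elem[::-1].diagonal(i - window + 1))
                        for i in range(n)])
        comps.append(rec)
    return np.array(comps)

t = np.arange(100)
series = 0.05 * t + np.sin(0.3 * t)                  # trend + oscillation
comps = ssa_components(series, window=20)
```

    Summing all elementary components recovers the original series exactly; grouping the leading ones separates trend and oscillatory behaviour from noise, which is the reconstruction step the abstract refers to.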

  6. Rapid analysis of colipase gene variants by multicapillary electrophoresis.

    PubMed

    Jaczó, Zsuzsanna; Pál, Eszter; Dénes, Réka; Somogyi, Anikó; Sasvári-Székely, Mária; Guttman, András; Rónai, Zsolt

    2015-06-01

    Despite the fact that the Human Genome Project was completed more than a decade ago, identification of the genetic background of polygenic diseases is still challenging. Several somewhat different approaches are available to investigate inheritable factors of complex phenotypes; all, however, require efficient, high-throughput techniques for SNP genotyping. In this paper, we report a robust and reliable multiplex PCR-RFLP for genotype and haplotype analysis of six SNPs (rs41270082, rs3748051, rs142027015, rs3748048, rs73404011, and rs72925892) of the colipase (CLPS) gene. A multicapillary (12-capillary) electrophoresis unit was used for high-throughput, sensitive analysis of the digestion fragments. A Microsoft Excel-based spreadsheet was designed for flexible visualization and evaluation of the electrophoretic separations, which is readily adaptable to any kind of electrophoresis application. Haplotype analysis of the two loci located in close proximity to each other was carried out by a molecular method, and extended haplotypes including all five SNPs in the 5' upstream region were calculated. The techniques were applied in a case-control association study of type 2 diabetes mellitus. Although single-marker analysis did not reveal any significant association, the rare GGCCG haplotype of the five 5' upstream region SNPs was about three times more frequent among patients than in the healthy control population. Our results demonstrate the applicability of multicapillary CGE in large-scale, high-throughput SNP analysis and suggest that CLPS gene polymorphisms might be considered a genetic risk factor for type 2 diabetes mellitus. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Extremely Low Loss Phonon-Trapping Cryogenic Acoustic Cavities for Future Physical Experiments

    PubMed Central

    Galliou, Serge; Goryachev, Maxim; Bourquin, Roger; Abbé, Philippe; Aubry, Jean Pierre; Tobar, Michael E.

    2013-01-01

    Low loss Bulk Acoustic Wave devices are considered from the point of view of the solid state approach as phonon-confining cavities. We demonstrate effective design of such acoustic cavities with phonon-trapping techniques exhibiting extremely high quality factors for trapped longitudinally-polarized phonons of various wavelengths. Quality factors of observed modes exceed 1 billion, with a maximum Q-factor of 8 billion and a Q × f product of 1.6 × 10¹⁸ at liquid helium temperatures. Such high sensitivities allow analysis of intrinsic material losses in resonant phonon systems. Various mechanisms of phonon losses are discussed and estimated. PMID:23823569

  8. Updating QR factorization procedure for solution of linear least squares problem with equality constraints.

    PubMed

    Zeb, Salman; Yousaf, Muhammad

    2017-01-01

    In this article, we present a QR updating procedure as a solution approach for the linear least squares problem with equality constraints. We reduce the constrained problem to an unconstrained linear least squares problem and partition it to obtain a small subproblem. The QR factorization of the subproblem is calculated, and updating techniques are then applied to its upper triangular factor R to obtain the solution. We carry out an error analysis of the proposed algorithm to show that it is backward stable. We also illustrate the implementation and accuracy of the proposed algorithm through numerical experiments, with particular emphasis on dense problems.
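
    The paper's updating scheme is not reproduced here, but the problem it solves can be sketched with the classical null-space method, which likewise reduces the constrained problem to an unconstrained one via a QR factorization (the small matrices are illustrative):

```python
import numpy as np

def lse_nullspace(A, b, B, d):
    """Solve min ||A x - b||_2 subject to B x = d via the null-space
    method: a QR factorization of B^T splits x into a particular
    solution of the constraints plus a free part in the null space of B."""
    p = B.shape[0]
    Q, R = np.linalg.qr(B.T, mode='complete')   # B^T = Q R, Q is n x n
    Q1, Q2 = Q[:, :p], Q[:, p:]
    # B x = d  =>  R1^T (Q1^T x) = d  with R1 the top p x p block of R
    u1 = np.linalg.solve(R[:p].T, d)
    # free part: unconstrained least squares in the null space of B
    z, *_ = np.linalg.lstsq(A @ Q2, b - A @ Q1 @ u1, rcond=None)
    return Q1 @ u1 + Q2 @ z

A = np.array([[1.0, 2.0, 0.5],
              [0.0, 1.0, 1.0],
              [2.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0, 4.0])
B = np.array([[1.0, 1.0, 1.0]])     # constraint: x1 + x2 + x3 = 1
d = np.array([1.0])
x = lse_nullspace(A, b, B, d)
```

    The solution satisfies the constraint exactly, while optimality means the residual is orthogonal to every feasible direction (every column of Q2).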

  9. Autonomous selection of PDE inpainting techniques vs. exemplar inpainting techniques for void fill of high resolution digital surface models

    NASA Astrophysics Data System (ADS)

    Rahmes, Mark; Yates, J. Harlan; Allen, Josef DeVaughn; Kelley, Patrick

    2007-04-01

    High resolution Digital Surface Models (DSMs) may contain voids (missing data) due to the data collection process used to obtain the DSM, inclement weather conditions, low returns, system errors/malfunctions for various collection platforms, and other factors. DSM voids are also created during bare earth processing where culture and vegetation features have been extracted. The Harris LiteSite TM Toolkit handles these void regions in DSMs via two novel techniques. We use both partial differential equations (PDEs) and exemplar based inpainting techniques to accurately fill voids. The PDE technique has its origin in fluid dynamics and heat equations (a particular subset of partial differential equations). The exemplar technique has its origin in texture analysis and image processing. Each technique is optimally suited for different input conditions. The PDE technique works better where the area to be void filled does not have disproportionately high frequency data in the neighborhood of the boundary of the void. Conversely, the exemplar based technique is better suited for high frequency areas. Both are autonomous with respect to detecting and repairing void regions. We describe a cohesive autonomous solution that dynamically selects the best technique as each void is being repaired.
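
    The PDE branch of such a pipeline can be sketched as harmonic inpainting: solve Laplace's equation (the steady-state heat equation) over the void, with the surrounding valid cells as boundary data. This is a minimal stand-in for the low-frequency case the abstract describes, not the Harris LiteSite implementation:

```python
import numpy as np

def harmonic_fill(dsm, void_mask, iters=5000):
    """Fill void cells of a DSM by Jacobi iteration on Laplace's
    equation; valid cells act as fixed Dirichlet boundary data."""
    filled = dsm.copy()
    filled[void_mask] = dsm[~void_mask].mean()       # rough initial guess
    for _ in range(iters):
        # average of the four neighbours, applied only inside the void
        avg = 0.25 * (np.roll(filled, 1, 0) + np.roll(filled, -1, 0) +
                      np.roll(filled, 1, 1) + np.roll(filled, -1, 1))
        filled[void_mask] = avg[void_mask]
    return filled

# a planar ramp with a square void: harmonic filling recovers the plane,
# since linear surfaces are harmonic
yy, xx = np.mgrid[0:20, 0:20]
dsm = 2.0 * xx + 1.0 * yy
mask = np.zeros_like(dsm, dtype=bool)
mask[8:13, 8:13] = True
restored = harmonic_fill(dsm, mask)
```

    This smooth diffusion behaviour is exactly why the PDE branch suits low-frequency neighbourhoods, while voids bordered by high-frequency texture are better served by the exemplar branch.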

  10. Single-Molecule Studies of Actin Assembly and Disassembly Factors

    PubMed Central

    Smith, Benjamin A.; Gelles, Jeff; Goode, Bruce L.

    2014-01-01

    The actin cytoskeleton is very dynamic and highly regulated by multiple associated proteins in vivo. Understanding how this system of proteins functions in the processes of actin network assembly and disassembly requires methods to dissect the mechanisms of activity of individual factors and of multiple factors acting in concert. The advent of single-filament and single-molecule fluorescence imaging methods has provided a powerful new approach to discovering actin-regulatory activities and obtaining direct, quantitative insights into the pathways of molecular interactions that regulate actin network architecture and dynamics. Here we describe techniques for acquisition and analysis of single-molecule data, applied to the novel challenges of studying the filament assembly and disassembly activities of actin-associated proteins in vitro. We discuss the advantages of single-molecule analysis in directly visualizing the order of molecular events, measuring the kinetic rates of filament binding and dissociation, and studying the coordination among multiple factors. The methods described here complement traditional biochemical approaches in elucidating actin-regulatory mechanisms in reconstituted filamentous networks. PMID:24630103

  11. The relationship between the availability of the supporting elements of pedestrian with pedestrian crossing facility usage based on user preferences (Case Study corridor of Sumbersari Street, Gajayana Street, MT. Haryono Street, Malang City)

    NASA Astrophysics Data System (ADS)

    Soetrisno, D. P.

    2017-06-01

    Pedestrian crossing facilities are effective at separating pedestrians from vehicles, but their utilization is still quite low. This indicates that safety is not the only factor influencing whether a person uses a crossing facility. In addition, the availability of supporting elements for pedestrians still receives little attention, which also contributes to pedestrians not using the crossing facilities. This research therefore examines the relationship between the availability of supporting elements for pedestrians and crossing facility usage, based on user preferences. Data were collected through a primary survey consisting of observation and a questionnaire. Purposive sampling was used to select 211 respondents, who answered a questionnaire with ordinal scales identifying their level of consideration of pedestrian supporting elements and of crossing facility utilization factors. The survey covered 15 crossing facilities in 3 locations with the same land-use characteristics: higher education (university) areas and trade and services areas. Frequency distribution analysis was used to identify pedestrian preferences regarding the availability of supporting elements and crossing facility utilization factors, and chi-square analysis was used to analyze the relationship between the two.
    Based on the chi-square analysis at the 5% significance level, the following supporting elements correlate with pedestrian crossing facility utilization factors: the availability of sidewalks, pedestrian lights, street lighting, crossing markings, crossing signs, vegetation, and dustbins. These results can be considered by the government as the main stakeholder, especially local governments, when preparing policy on the supporting elements that should surround pedestrian crossing facilities.
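
    A chi-square test of independence of the kind used above can be run directly on a contingency table; the 2x2 table here is hypothetical, not the study's data:

```python
from scipy.stats import chi2_contingency

# hypothetical 2x2 table: rows = sidewalk available (yes/no),
# columns = respondent uses the crossing facility (yes/no)
table = [[10, 20],
         [20, 10]]
chi2, p, dof, expected = chi2_contingency(table)
if p < 0.05:
    print("association between sidewalk availability and facility use")
```

    `chi2_contingency` applies the Yates continuity correction by default for 2x2 tables and also returns the expected frequencies, which is convenient for checking the test's validity conditions.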

  12. What are the important surgical factors affecting the wound healing after primary total knee arthroplasty?

    PubMed

    Harato, Kengo; Tanikawa, Hidenori; Morishige, Yutaro; Kaneda, Kazuya; Niki, Yasuo

    2016-01-13

    Wound condition after primary total knee arthroplasty (TKA) is an important issue in avoiding postoperative adverse events. Our purpose was to investigate and clarify the important surgical factors affecting the wound score after TKA. A total of 139 knees in 128 patients (mean age: 73 years) without severe comorbidity were enrolled in the present study. All primary unilateral or bilateral TKAs were done using the same skin incision line, measured resection technique, and wound closure technique using unidirectional barbed suture. Wound healing was assessed with the Hollander Wound Evaluation Score (HWES) on postoperative day 14. We performed multiple regression analysis using a stepwise method to identify the factors affecting the HWES. Variables considered in the analysis were age, sex, body mass index (kg/m(2)), HbA1c (%), femorotibial angle (degrees) on plain radiographs, intraoperative patella eversion during the cutting phase of the femur and tibia in knee flexion, intraoperative anterior translation of the tibia, patella resurfacing, surgical time (min), tourniquet time (min), length of skin incision (cm), postoperative drainage (ml), and patellar height on postoperative lateral radiographs. The HWES was treated as the dependent variable and the others as independent variables. The average HWES was 5.0 ± 0.8 points. In the stepwise forward regression, patella eversion during the cutting phase of the femur and tibia in knee flexion and anterior translation of the tibia entered the model, while the other factors did not. The standardized partial regression coefficients were 0.57 for anterior translation of the tibia and 0.38 for patella eversion. In the present study, which used unidirectional barbed suture, no major wound healing problems occurred. With respect to surgical technique, intraoperative patella eversion and anterior translation of the tibia should be avoided for quality cosmesis in primary TKA.
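
    Standardized partial regression coefficients like the 0.57 and 0.38 reported above are the slopes of an ordinary least squares fit on z-scored variables; a sketch with hypothetical predictor data (the variable names are illustrative only, not the study's data):

```python
import numpy as np

def standardized_betas(X, y):
    """OLS on z-scored predictors and outcome; the resulting slopes are
    the standardized partial regression coefficients reported by
    stepwise models like the one above."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    yz = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(np.c_[np.ones(len(yz)), Xz], yz, rcond=None)
    return beta[1:]                               # drop the intercept

rng = np.random.default_rng(7)
tibia_shift = rng.normal(size=60)      # hypothetical predictor 1
patella_evert = rng.normal(size=60)    # hypothetical predictor 2
X = np.c_[tibia_shift, patella_evert]
score = 0.6 * tibia_shift + 0.3 * patella_evert + 0.1 * rng.normal(size=60)
betas = standardized_betas(X, score)
```

    Because both sides are z-scored, each coefficient expresses the change in the outcome, in standard deviations, per standard deviation of that predictor with the others held fixed, which makes the 0.57 vs 0.38 comparison meaningful.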

  13. Using Recursive Regression to Explore Nonlinear Relationships and Interactions: A Tutorial Applied to a Multicultural Education Study

    ERIC Educational Resources Information Center

    Strang, Kenneth David

    2009-01-01

    This paper discusses how a seldom-used statistical procedure, recursive regression (RR), can numerically and graphically illustrate data-driven nonlinear relationships and interaction of variables. This routine falls into the family of exploratory techniques, yet a few interesting features make it a valuable compliment to factor analysis and…

  14. Hypothesis Generation, Evaluation, and Memory Abilities in Adult Human Concept Learning.

    ERIC Educational Resources Information Center

    Cason, Carolyn L.; And Others

    Relationships between performance on tests of mental abilities and performance on concept learning tasks were studied; it is pointed out that the researcher is usually confronted with administering large batteries of tests of mental abilities and then analyzing the results with one of the factor analytic techniques. An information process analysis of tests of mental…

  15. A Comprehensive Careers Cluster Curriculum Model. Health Occupations Cluster Curriculum Project and Health-Care Aide Curriculum Project.

    ERIC Educational Resources Information Center

    Bortz, Richard F.

    To prepare learning materials for health careers programs at the secondary level, the developmental phase of two curriculum projects--the Health Occupations Cluster Curriculum Project and Health-Care Aide Curriculum Project--utilized a model which incorporated a key factor analysis technique. Entitled "A Comprehensive Careers Cluster Curriculum…

  16. Management Development of Scientists and Engineers in the Federal Government; An Analysis of Basic Behavioral and Systems Considerations.

    ERIC Educational Resources Information Center

    Berniklau, Vladimir V.

    Focusing on management development of scientists and engineers within the Federal government, this study was done to form a framework of factors (mainly attitudes, motives or needs, and leadership styles) to be evaluated before choosing suitable techniques and alternatives. Such variables as differing program objectives, characteristics of…

  17. Differences between Peer Victimization in Cyber and Physical Settings and Associated Psychosocial Adjustment in Early Adolescence

    ERIC Educational Resources Information Center

    Dempsey, Allison G.; Sulkowski, Michael L.; Nichols, Rebecca; Storch, Eric A.

    2009-01-01

    The increasing use of cyberspace as a social networking forum creates a new medium for youth to become victims of peer aggression. This study used factor analysis techniques to confirm whether survey questions about frequency of cyber victimization formed a distinct latent construct from questions about relational and overt victimization…

  18. How Do the Different Types of Computer Use Affect Math Achievement?

    ERIC Educational Resources Information Center

    Flores, Raymond; Inan, Fethi; Lin, Zhangxi

    2013-01-01

    In this study, the National Educational Longitudinal Study (ELS:2002) dataset was used and a predictive data mining technique, decision tree analysis, was implemented in order to examine which factors, in conjunction with computer use, can be used to predict high or low probability of success in high school mathematics. Specifically, this study…

  19. Socioeconomic Status and Asian American and Pacific Islander Students' Transition to College: A Structural Equation Modeling Analysis

    ERIC Educational Resources Information Center

    Museus, Samuel D.; Vue, Rican

    2013-01-01

    The purpose of this study is to examine socioeconomic differences in the interpersonal factors that influence college access among Asian Americans and Pacific Islanders (AAPIs). Data on 1,460 AAPIs from the Education Longitudinal Study (ELS: 02/06) were analyzed using structural equation modeling techniques. Findings suggest that parental…

  20. A Quantitative Analysis of Organizational Factors That Relate to Data Mining Success

    ERIC Educational Resources Information Center

    Huebner, Richard A.

    2017-01-01

    The ubiquity of data in various forms has fueled the need for advanced data-mining techniques within organizations. The advent of data mining methods used to uncover hidden nuggets of information buried within large data sets has also fueled the need for determining how these unique projects can be successful. There are many challenges associated…

  1. Theoretical Significance in Q Methodology: A Qualitative Approach to a Mixed Method

    ERIC Educational Resources Information Center

    Ramlo, Susan

    2015-01-01

    Q methodology (Q) has offered researchers a unique scientific measure of subjectivity since William Stephenson's first article in 1935. Q's focus on subjectivity includes self-referential meaning and interpretation. Q is most often identified with its technique (Q-sort) and its method (factor analysis to group people); yet, it consists of a…

  2. Income Inequality across Micro and Meso Geographic Scales in the Midwestern United States, 1979-2009

    ERIC Educational Resources Information Center

    Peters, David J.

    2012-01-01

    This article examines the spatial distribution of income inequality and the socioeconomic factors affecting it using spatial analysis techniques across 16,285 block groups, 5,050 tracts, and 618 counties in the western part of the North Central Region of the United States. Different geographic aggregations result in different inequality outcomes,…

  3. Development and Initial Validation of an Instrument for Human Capital Planning

    ERIC Educational Resources Information Center

    Zula, Kenneth J.; Chermack, Thomas J.

    2008-01-01

    This article reports on development and validation of an instrument for use in human capital approaches for organizational planning. The article describes use of a team of subject matter experts in developing a measure of human capital planning, and use of exploratory factor analysis techniques to validate the resulting instrument. These data were…

  4. White Paper: A Defect Prioritization Method Based on the Risk Priority Number

    DTIC Science & Technology

    2013-11-01

    The Failure Modes and Effects Analysis (FMEA) method employs a measurement technique called Risk Priority Number (RPN) to quantify the… [Table 1 – Time Scaling Factors: e.g., "Up to an hour" (16-60 min) scales by 1.5; "Brief Interrupt" (0-15 min) by 1.] In the FMEA formulation, RPN is a product of the three categories
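    The product structure of RPN lends itself to a one-line computation. A minimal sketch, assuming the conventional 1-10 severity/occurrence/detection ratings (the defect names and ratings below are purely illustrative; the report's time-scaling factors are not reproduced):

```python
# Minimal sketch of FMEA Risk Priority Number (RPN) ranking.
# Assumes the conventional 1-10 severity/occurrence/detection scales;
# the defect names and ratings below are purely illustrative.

def rpn(severity, occurrence, detection):
    """RPN is the product of the three category ratings."""
    for r in (severity, occurrence, detection):
        if not 1 <= r <= 10:
            raise ValueError("ratings are expected on a 1-10 scale")
    return severity * occurrence * detection

defects = {
    "crash on save": (9, 3, 2),
    "slow startup": (4, 8, 3),
    "typo in dialog": (2, 6, 1),
}

# Prioritize defects by descending RPN.
ranked = sorted(defects, key=lambda d: rpn(*defects[d]), reverse=True)
for name in ranked:
    print(name, rpn(*defects[name]))
```

    Note that RPN deliberately treats the three categories as equally weighted; a frequent but hard-to-detect defect can outrank a severe but easily caught one, as in this example.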

  5. Human Factors in Field Experimentation Design and Analysis of Analytical Suppression Model

    DTIC Science & Technology

    1978-09-01

    Research on men in man-machine systems supports the development of new doctrines, the design of weapon systems, and training programs for troops. Techniques are examined for including the suppressive effects of weapon systems in Lanchester-type combat models, which may be…

  6. Multivariate analysis of selected metals in tannery effluents and related soil.

    PubMed

    Tariq, Saadia R; Shah, Munir H; Shaheen, N; Khalique, A; Manzoor, S; Jaffar, M

    2005-06-30

    Effluent and relevant soil samples from 38 tanning units housed in Kasur, Pakistan, were obtained for metal analysis by the flame atomic absorption spectrophotometric method. The levels of 12 metals, Na, Ca, K, Mg, Fe, Mn, Cr, Co, Cd, Ni, Pb and Zn, were determined in the two media. The data were evaluated for metal distribution and metal-to-metal correlations. The study evidenced enhanced levels of Cr (391 and 16.7 mg/L) and Na (25,519 and 9,369 mg/L) in tannery effluents and relevant soil samples, respectively. The effluent versus soil trace metal content relationship confirmed that effluent Cr was strongly correlated with soil Cr. For metal source identification, the techniques of principal component analysis and cluster analysis were applied. The principal component analysis yielded two factors for effluents: factor 1 (49.6% variance) showed significant loadings for Ca, Fe, Mn, Cr, Cd, Ni, Pb and Zn, referring to a tanning-related source for these metals, while factor 2 (12.6% variance), with higher loadings of Na, K, Mg and Co, was associated with the processes during skin/hide treatment. Similarly, two factors with a cumulative variance of 34.8% were obtained for soil samples: factor 1 manifested the contribution from Mg, Mn, Co, Cd, Ni and Pb, which, though soil-based, is basically effluent-derived, while factor 2 was found to be associated with Na, K, Ca, Cr and Zn, which referred to a tannery-based source. The dendrograms obtained from cluster analysis also support the observed results. The study exhibits gross pollution of soils with Cr at levels far exceeding the stipulated safe limit laid down for tannery effluents.
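    The principal-component step used above for source identification can be sketched with plain NumPy: z-score the element concentrations, eigendecompose the correlation matrix, and read off explained variance and loadings. The data below are synthetic, built from two latent "sources" to mimic a two-factor outcome; nothing here reproduces the paper's measurements:

```python
import numpy as np

# Sketch of PCA for source identification. The 200x4 data matrix is
# synthetic: two latent "sources" each drive a pair of element columns.
rng = np.random.default_rng(0)
n = 200
tanning = rng.normal(size=n)               # latent "tanning" source
treatment = rng.normal(size=n)             # latent "hide treatment" source
X = np.column_stack([
    tanning + 0.1 * rng.normal(size=n),    # "Cr"
    tanning + 0.1 * rng.normal(size=n),    # "Cd"
    treatment + 0.1 * rng.normal(size=n),  # "Na"
    treatment + 0.1 * rng.normal(size=n),  # "K"
])

# Standardize, then eigendecompose the correlation matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = np.corrcoef(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)    # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
# Loadings: eigenvectors scaled by sqrt(eigenvalue).
loadings = eigvecs * np.sqrt(eigvals)
print("variance explained:", np.round(explained, 3))
```

    With two latent sources, the first two components absorb nearly all the variance, and each element loads strongly on the component corresponding to its source, which is exactly how the paper reads tanning-related versus treatment-related metal groups off the factor loadings.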

  7. Novel conformal technique to reduce staircasing artifacts at material boundaries for FDTD modeling of the bioheat equation.

    PubMed

    Neufeld, E; Chavannes, N; Samaras, T; Kuster, N

    2007-08-07

    The modeling of thermal effects, often based on the Pennes Bioheat Equation, is becoming increasingly popular. The FDTD technique commonly used in this context suffers considerably from staircasing errors at boundaries. A new conformal technique is proposed that can easily be integrated into existing implementations without requiring a special update scheme. It scales fluxes at interfaces with factors derived from the local surface normal. The new scheme is validated using an analytical solution, and an error analysis is performed to understand its behavior. The new scheme behaves considerably better than the standard scheme. Furthermore, in contrast to the standard scheme, it yields increasingly accurate solutions as the grid resolution is increased.

  8. Confirmatory factor analysis of the School Refusal Assessment Scale – Revised in an African American community sample

    PubMed Central

    Lyon, Aaron R.

    2010-01-01

    The current study used confirmatory factor analysis techniques to investigate the construct validity of the child version of the School Refusal Assessment Scale – Revised (SRAS-R) in a community sample of low socioeconomic status, urban, African American fifth and sixth graders (n = 174). The SRAS-R is the best-researched measure of school refusal behavior in youth and typically yields four functional dimensions. Results of the investigation suggested that a modified version of the four-factor model, in which three items from the tangible reinforcement dimension are removed, may have construct validity in the current sample of youth. In addition, youth endorsement of the dimension measuring avoidance of social and/or evaluative situations was positively associated with unexcused absences. Implications for further psychometric research and early identification and prevention of problematic absenteeism in low-SES, ethnic minority community samples are highlighted. PMID:20567603

  9. Environmental applications of single collector high resolution ICP-MS.

    PubMed

    Krachler, Michael

    2007-08-01

    The number of environmental applications of single collector high resolution ICP-MS (HR-ICP-MS) has increased rapidly in recent years. Many factors combine to make HR-ICP-MS a very powerful tool in environmental analysis: the extremely low detection limits achievable, tremendously high sensitivity, the ability to separate the ICP-MS signals of the analyte from spectral interferences (enabling the reliable determination of many trace elements), and the reasonable precision of isotope ratio measurements. These assets are improved even further by high-efficiency sample introduction systems. As a result, external factors such as the stability of laboratory blanks, rather than detection power, are frequently the limiting factor in HR-ICP-MS analysis. This review aims to highlight the most recent applications of HR-ICP-MS in this sector, focusing on matrices and applications where the superior capabilities of the instrumental technique are most useful and often ultimately required.

  10. Development and examination of the psychometric properties of the Learning Experience Scale in nursing.

    PubMed

    Takase, Miyuki; Imai, Takiko; Uemura, Chizuru

    2016-06-01

    This paper examines the psychometric properties of the Learning Experience Scale. A survey method was used to collect data from a total of 502 nurses. Data were analyzed by factor analysis and the known-groups technique to examine the construct validity of the scale. In addition, internal consistency was evaluated by Cronbach's alpha, and stability was examined by test-retest correlation. Factor analysis showed that the Learning Experience Scale consisted of five factors: learning from practice, others, training, feedback, and reflection. The scale also had the power to discriminate between nurses with high and low levels of nursing competence. The internal consistency and the stability of the scale were also acceptable. The Learning Experience Scale is a valid and reliable instrument, and helps organizations to effectively design learning interventions for nurses. © 2015 Wiley Publishing Asia Pty Ltd.
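    Cronbach's alpha, used above for internal consistency, has a short closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A NumPy sketch on synthetic item responses (not the scale's actual data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Synthetic 5-item scale driven by one latent trait -> high alpha.
rng = np.random.default_rng(1)
trait = rng.normal(size=300)
responses = trait[:, None] + 0.5 * rng.normal(size=(300, 5))
print(round(cronbach_alpha(responses), 2))
```

    When items covary strongly (as here, where one trait drives all five items), the total-score variance dwarfs the summed item variances and alpha approaches 1; uncorrelated items drive it toward 0.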

  11. Harmonic analysis of the precipitation in Greece

    NASA Astrophysics Data System (ADS)

    Nastos, P. T.; Zerefos, C. S.

    2009-04-01

    Greece presents a wide variety of climates owing to its geographical position, its many mountain ranges, and its long, indented coastline. The orientation of the mountain ranges influences the distribution of precipitation, with the result that Western Greece differs markedly from Central and Eastern Greece. The goal of this study is to apply harmonic analysis to the annual variability of precipitation, so that the components composing that variability can be elicited. For this purpose, mean monthly precipitation data from 30 meteorological stations of the National Meteorological Service were used for the period 1950-2000. The initial target is to reduce the number of variables and to detect structure in the relationships between them. The most commonly used technique for this purpose is Factor Analysis, applied here to a table whose columns are the meteorological stations (variables) and whose rows are the monthly mean precipitation values; 2 main factors were calculated, which explain 98% of the total variability of precipitation in Greece. Factor 1, representing the so-called uniform field and interpreting most of the total variance, refers to the Mediterranean depressions, which mainly affect the west of Greece as well as the East Aegean and the Asia Minor coasts. Subsequently, Fourier analysis was applied to the factor scores extracted from the Factor Analysis, yielding 2 harmonic components that explain above 98% of the total variability of each main factor and that are due to different synoptic and thermodynamic processes associated with precipitation over Greece. Finally, the calculation of the time of occurrence of the maximum precipitation, for each harmonic component of each of the two main factors, gives the spatial distribution of the occurrence of maximum precipitation in the Hellenic region.
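    The Fourier step applied to the factor scores can be sketched for a single 12-month series: project onto the first annual harmonic and recover its amplitude and the time of maximum. The monthly values below are synthetic, not the study's station data:

```python
import math

# Fit the first annual harmonic to a 12-month series:
#   y(t) ~ mean + A*cos(2*pi*(t - t_max)/12)
# and recover mean, amplitude A, and the month index of the maximum.
def first_harmonic(monthly):
    n = len(monthly)
    mean = sum(monthly) / n
    a = (2 / n) * sum((y - mean) * math.cos(2 * math.pi * t / n)
                      for t, y in enumerate(monthly))
    b = (2 / n) * sum((y - mean) * math.sin(2 * math.pi * t / n)
                      for t, y in enumerate(monthly))
    amplitude = math.hypot(a, b)
    phase = math.atan2(b, a)                # radians
    t_max = phase * n / (2 * math.pi) % n   # index of maximum (0 = first month)
    return mean, amplitude, t_max

# Synthetic precipitation peaking in December (index 11).
monthly = [100 + 60 * math.cos(2 * math.pi * (t - 11) / 12) for t in range(12)]
mean, amp, t_max = first_harmonic(monthly)
print(round(mean), round(amp), round(t_max))
```

    Because the 12 sample points span exactly one period, the projection recovers the amplitude and the time of maximum exactly; on real station data the residual goes into higher harmonics, whose explained-variance shares are what the study reports.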

  12. Theoretical Bound of CRLB for Energy Efficient Technique of RSS-Based Factor Graph Geolocation

    NASA Astrophysics Data System (ADS)

    Kahar Aziz, Muhammad Reza; Heriansyah; Saputra, EfaMaydhona; Musa, Ardiansyah

    2018-03-01

    To support the continued development of wireless geolocation as a key future technology, this paper derives a theoretical bound, i.e., the Cramer-Rao lower bound (CRLB), for the energy-efficient received signal strength (RSS)-based factor graph wireless geolocation technique. The theoretical bound is crucially important both for evaluating whether the energy-efficient RSS-based factor graph technique is effective and for opening opportunities for further innovation. The CRLB is derived using the Fisher information matrix (FIM) of the main formula of the RSS-based factor graph geolocation technique, which relies on the Jacobian matrix. The simulation results show that the derived CRLB is the tightest bound, its root mean squared error (RMSE) curve lying below the RMSE curve of the RSS-based factor graph geolocation technique itself. Hence, the derived CRLB serves as the lower bound for the energy-efficient RSS-based factor graph wireless geolocation technique.
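    As a concrete (and simplified) illustration of the derivation route, the FIM for a plain log-distance RSS model can be built from the Jacobian of the path-loss mean with respect to the unknown position and inverted to give a position-error bound. The model and parameters below are standard textbook assumptions, not the paper's factor graph formulation:

```python
import numpy as np

# CRLB for RSS-based 2-D geolocation under a log-distance path-loss model
#   P_i = P0 - 10*eta*log10(d_i) + noise,  noise ~ N(0, sigma^2).
# eta (path-loss exponent) and sigma (shadowing std, dB) are illustrative.
def rss_crlb(target, anchors, eta=3.0, sigma=4.0):
    target, anchors = np.asarray(target, float), np.asarray(anchors, float)
    diff = target - anchors                 # (n_anchors, 2)
    d2 = (diff ** 2).sum(axis=1)            # squared distances
    # Jacobian of the mean RSS w.r.t. (x, y): -10*eta/ln(10) * diff / d^2
    J = -(10 * eta / np.log(10)) * diff / d2[:, None]
    fim = J.T @ J / sigma ** 2              # Fisher information matrix
    crlb = np.linalg.inv(fim)               # covariance lower bound
    return np.sqrt(np.trace(crlb))          # position RMSE bound

anchors = np.array([[0, 0], [100, 0], [0, 100], [100, 100]])
bound = rss_crlb([40, 60], anchors)
print(round(bound, 2))
```

    The bound scales linearly with the shadowing standard deviation and shrinks as anchors surround the target, which is why CRLB curves are the natural baseline for judging an RSS estimator's RMSE.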

  13. Localisation of stem cell factor, stanniocalcin-1, connective tissue growth factor and heparin-binding epidermal growth factor in the bovine uterus at the time of blastocyst formation.

    PubMed

    Muñoz, M; Martin, D; Carrocera, S; Alonso-Guervos, M; Mora, M I; Corrales, F J; Peynot, N; Giraud-Delville, C; Duranthon, V; Sandra, O; Gómez, E

    2017-10-01

    Early embryonic losses before implantation account for the highest rates of reproductive failure in mammals, in particular when in vitro-produced embryos are transferred. In the present study, we used molecular biology techniques (real-time quantitative polymerase chain reaction), classical immunohistochemical staining coupled with confocal microscopy and proteomic analysis (multiple reaction monitoring and western blot analysis) to investigate the role of four growth factors in embryo-uterine interactions during blastocyst development. Supported by a validated embryo transfer model, the study investigated: (1) the expression of stem cell factor (SCF), stanniocalcin-1 (STC1), connective tissue growth factor (CTGF) and heparin-binding epidermal growth factor-like growth factor (HB-EGF) in bovine uterine fluid; (2) the presence of SCF, STC1, CTGF and HB-EGF mRNA and protein in the bovine endometrium and embryos; and (3) the existence of reciprocal regulation between endometrial and embryonic expression of SCF, STC1, CTGF and HB-EGF. The results suggest that these growth factors most likely play an important role during preimplantation embryo development in cattle. The information obtained from the present study can contribute to improving the performance of in vitro culture technology in cattle and other species.

  14. Effective Factors in Enhancing School Manager's Job Motivation

    PubMed Central

    Mirzamani, S. Mahmoud; Esfahani, Hamideh Darb

    2011-01-01

    Objective This study examines the effective factors in enhancing school managers' job motivation from the viewpoint of school managers, teachers, and managerial and staff experts in the education department, and also identifies and prioritizes each of these factors and indicators. Method For selecting a representative sample and increasing measurement precision, 587 people were selected using stratified random sampling. The measurement tool was a 79-item questionnaire developed by the researcher, constructed from motivation theories and the findings of previous research. Then, according to the three-stage Delphi technique, the questionnaire was sent to experts in education. The reliability of the instrument was measured by calculating Cronbach's alpha coefficient; the total reliability of the test was 0.99. The validity of the instrument was assessed by factor analysis (construct validity), and its factor loading was 0.4, which was high. Results The results from the factor analysis show that the effective factors in enhancing managers' job motivation are as follows: self-actualization (51%) including 28 indices; social factors (7.9%) including 22 indices; self-esteem (3.2%) including 17 indices; desirable job features (2.2%) including 4 indices; physiological factors (1.8%) including 4 indices; and job richness (1.6%) including 4 indices. Conclusions The results show that the six factors above determine 68% of the total variance of managers' motivation. PMID:22952541

  15. Advantages of Synthetic Noise and Machine Learning for Analyzing Radioecological Data Sets.

    PubMed

    Shuryak, Igor

    2017-01-01

    The ecological effects of accidental or malicious radioactive contamination are insufficiently understood because of the hazards and difficulties associated with conducting studies in radioactively-polluted areas. Data sets from severely contaminated locations can therefore be small. Moreover, many potentially important factors, such as soil concentrations of toxic chemicals, pH, and temperature, can be correlated with radiation levels and with each other. In such situations, commonly-used statistical techniques like generalized linear models (GLMs) may not be able to provide useful information about how radiation and/or these other variables affect the outcome (e.g. abundance of the studied organisms). Ensemble machine learning methods such as random forests offer powerful alternatives. We propose that analysis of small radioecological data sets by GLMs and/or machine learning can be made more informative by using the following techniques: (1) adding synthetic noise variables to provide benchmarks for distinguishing the performances of valuable predictors from irrelevant ones; (2) adding noise directly to the predictors and/or to the outcome to test the robustness of analysis results against random data fluctuations; (3) adding artificial effects to selected predictors to test the sensitivity of the analysis methods in detecting predictor effects; (4) running a selected machine learning method multiple times (with different random-number seeds) to test the robustness of the detected "signal"; (5) using several machine learning methods to test the "signal's" sensitivity to differences in analysis techniques. Here, we applied these approaches to simulated data, and to two published examples of small radioecological data sets: (I) counts of fungal taxa in samples of soil contaminated by the Chernobyl nuclear power plant accident (Ukraine), and (II) bacterial abundance in soil samples under a ruptured nuclear waste storage tank (USA).
We show that the proposed techniques were advantageous compared with the methodology used in the original publications where the data sets were presented. Specifically, our approach identified a negative effect of radioactive contamination in data set I, and suggested that in data set II stable chromium could have been a stronger limiting factor for bacterial abundance than the radionuclides 137Cs and 99Tc. This new information, which was extracted from these data sets using the proposed techniques, can potentially enhance the design of radioactive waste bioremediation.
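    Technique (1), benchmarking real predictors against synthetic noise variables, can be illustrated without any machine learning library by using a simple importance proxy (absolute Pearson correlation with the outcome) in place of random forest importance. The data and variable names below are synthetic, not the paper's data sets:

```python
import numpy as np

# Synthetic data: "radiation" drives the outcome, "pH" does not.
rng = np.random.default_rng(42)
n = 300
radiation = rng.normal(size=n)
ph = rng.normal(size=n)
outcome = -0.8 * radiation + 0.5 * rng.normal(size=n)

X = {"radiation": radiation, "pH": ph}
# Technique (1): append pure-noise columns as an importance benchmark.
for i in range(5):
    X[f"noise_{i}"] = rng.normal(size=n)

def score(x, y):
    """Importance proxy: absolute Pearson correlation with the outcome."""
    return abs(np.corrcoef(x, y)[0, 1])

scores = {name: score(col, outcome) for name, col in X.items()}
# A real predictor is called informative only if it beats every noise column.
benchmark = max(v for k, v in scores.items() if k.startswith("noise_"))
informative = [k for k, v in scores.items()
               if not k.startswith("noise_") and v > benchmark]
print(sorted(informative))
```

    The noise columns set an empirical floor for what "importance" a purely irrelevant variable can achieve by chance in a sample this small; only predictors clearing that floor are treated as signal, which is the logic the paper applies to random forest importances.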

  17. Metabolomic Strategies Involving Mass Spectrometry Combined with Liquid and Gas Chromatography.

    PubMed

    Lopes, Aline Soriano; Cruz, Elisa Castañeda Santa; Sussulini, Alessandra; Klassen, Aline

    2017-01-01

    Amongst all omics sciences, there is no doubt that metabolomics has undergone the most important growth in the last decade. Advances in analytical techniques and data analysis tools are the main factors that have made possible the development and establishment of metabolomics as a significant research field in systems biology. As metabolomic analysis demands high sensitivity for detecting metabolites present in low concentrations in biological samples, high resolving power for identifying the metabolites, and a wide dynamic range to detect metabolites with variable concentrations in complex matrices, mass spectrometry is the most extensively used analytical technique for fulfilling these requirements. Mass spectrometry alone can be used in a metabolomic analysis; however, issues such as ion suppression may hamper the quantification/identification of metabolites present at lower concentrations or of metabolite classes that do not ionise as well as others. The best choice is to couple separation techniques, such as gas or liquid chromatography, to mass spectrometry in order to improve the sensitivity and resolving power of the analysis, besides obtaining extra information (retention time) that facilitates the identification of the metabolites, especially in untargeted metabolomic strategies. In this chapter, the main aspects of mass spectrometry (MS), liquid chromatography (LC) and gas chromatography (GC) are discussed, and recent clinical applications of LC-MS and GC-MS are also presented.

  18. Bim may be a poor prognostic biomarker in breast cancer patients especially in those with luminal A tumors.

    PubMed

    Maimaiti, Yusufu; Dong, Lingling; Aili, Aikebaier; Maimaitiaili, Maimaitiaili; Huang, Tao; Abudureyimu, Kelimu

    2017-07-04

    Bcl-2 interacting mediator of cell death (Bim) appears to have contradictory roles in cancer, and it is uncertain whether Bim shows prognostic significance in patients with breast cancer. We aimed to investigate the correlation between Bim expression and the clinicopathological characteristics of breast cancer, and to evaluate Bim's effect on overall survival (OS). We used the immunohistochemistry (IHC) technique to detect the expression of Bim via tissue microarray in 275 breast cancer samples, Kaplan-Meier analysis to perform survival analysis, and a Cox proportional hazards regression model to explore the risk factors of breast cancer. The results revealed that Bim expression was significantly correlated with age, estrogen receptor (ER) and/or progesterone receptor (PR), human epidermal growth factor receptor 2 (HER2) and Ki67 expression (P < 0.05). Bim expression differed significantly across the four molecular subtypes (P = 0.000). Survival analysis showed that Bim-positive expression contributed to a shorter OS (P = 0.034), especially in patients with luminal A tumors (P = 0.039). Univariate and multivariate regression analyses showed that Bim was an independent prognostic factor for breast cancer (P < 0.05). Bim may serve as an effective predictive factor for lower OS in breast cancer patients, especially in those with luminal A tumors.

  19. Analysis of motor fan radiated sound and vibration waveform by automatic pattern recognition technique using "Mahalanobis distance"

    NASA Astrophysics Data System (ADS)

    Toma, Eiji

    2018-06-01

    In recent years, as IT equipment has become lighter, demand for motor fans for cooling the interiors of electronic equipment has been rising. Sensory testing by inspectors remains the mainstream technique for quality inspection of motor fans in the field. Such testing requires considerable experience to diagnose accurately the subtle differences in the sounds (sound pressures) of the fans, and the judgment varies with the condition of the inspector and the environment. Solving these quality problems requires the development of an analysis method capable of quantitatively and automatically diagnosing the sound/vibration level of a fan. In this study, it was clarified that an analysis method applying the MT system to the waveform information of noise and vibration is more effective for discriminating normal from abnormal items than the conventional frequency-analysis method. Furthermore, automation of the vibration waveform analysis system revealed that the relation between fan installation posture and vibration waveform is a factor influencing discrimination accuracy.

  20. TDRSS telecommunications system, PN code analysis

    NASA Technical Reports Server (NTRS)

    Dixon, R.; Gold, R.; Kaiser, F.

    1976-01-01

    The pseudo noise (PN) codes required to support the TDRSS telecommunications services are analyzed and the impact of alternate coding techniques on the user transponder equipment, the TDRSS equipment, and all factors that contribute to the acquisition and performance of these telecommunication services is assessed. Possible alternatives to the currently proposed hybrid FH/direct sequence acquisition procedures are considered and compared relative to acquisition time, implementation complexity, operational reliability, and cost. The hybrid FH/direct sequence technique is analyzed and rejected in favor of a recommended approach which minimizes acquisition time and user transponder complexity while maximizing probability of acquisition and overall link reliability.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diepold, Marc, E-mail: marc.diepold@mpq.mpg.de; Franke, Beatrice; Götzfried, Johannes

    Avalanche photodiodes are commonly used as detectors for low energy x-rays. In this work, we report on a fitting technique used to account for different detector responses resulting from photoabsorption in the various avalanche photodiode layers. The use of this technique results in an improvement of the energy resolution at 8.2 keV by up to a factor of 2 and corrects the timing information by up to 25 ns to account for space dependent electron drift time. In addition, this waveform analysis is used for particle identification, e.g., to distinguish between x-rays and MeV electrons in our experiment.

  2. Disability: a model and measurement technique.

    PubMed Central

    Williams, R G; Johnston, M; Willis, L A; Bennett, A E

    1976-01-01

    Current methods of ranking or scoring disability tend to be arbitrary. A new method is put forward based on the hypothesis that disability progresses in regular, cumulative patterns. A model of disability is defined and tested using Guttman scale analysis. Its validity is indicated by data from a community survey and from postsurgical patients, and some factors involved in scale variation are identified. The model provides a simple measurement technique and has implications for the assessment of individual disadvantage, for the prediction of progress in recovery or deterioration, and for evaluation of the outcome of treatment regimes. PMID:953379
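    Guttman scaling assumes items are cumulative: a respondent who passes a harder item passes all easier ones. A sketch of the coefficient of reproducibility under one common error-counting convention (the items and response patterns below are illustrative, not the survey's data):

```python
# Sketch of Guttman scale analysis: items ordered easiest -> hardest,
# so a perfect scale means each pattern is a run of 1s followed by 0s.
# Reproducibility = 1 - errors / total responses, where errors count
# deviations from the ideal pattern implied by each respondent's score.

def reproducibility(patterns):
    errors = total = 0
    for pattern in patterns:
        total += len(pattern)
        score = sum(pattern)  # ideal pattern: `score` ones, then zeros
        ideal = [1] * score + [0] * (len(pattern) - score)
        errors += sum(p != q for p, q in zip(pattern, ideal))
    return 1 - errors / total

# Illustrative disability items ordered easy -> hard
# (e.g. walking indoors, climbing stairs, walking half a mile).
patterns = [
    [1, 1, 1],  # fully able
    [1, 1, 0],  # scale-consistent
    [1, 0, 0],  # scale-consistent
    [1, 0, 1],  # one deviation from the cumulative pattern
]
print(reproducibility(patterns))
```

    A coefficient near 1 (conventionally above 0.9) supports the cumulative model; values much lower suggest the items do not form a single Guttman scale.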

  3. Group interaction and flight crew performance

    NASA Technical Reports Server (NTRS)

    Foushee, H. Clayton; Helmreich, Robert L.

    1988-01-01

    The application of human-factors analysis to the performance of aircraft-operation tasks by the crew as a group is discussed in an introductory review and illustrated with anecdotal material. Topics addressed include the function of a group in the operational environment, the classification of group performance factors (input, process, and output parameters), input variables and the flight crew process, and the effect of process variables on performance. Consideration is given to aviation safety issues, techniques for altering group norms, ways of increasing crew effort and coordination, and the optimization of group composition.

  4. Contributions of in vitro transcription to the understanding of human RNA polymerase III transcription

    PubMed Central

    Dumay-Odelot, Hélène; Durrieu-Gaillard, Stéphanie; El Ayoubi, Leyla; Parrot, Camila; Teichmann, Martin

    2014-01-01

    Human RNA polymerase III transcribes small untranslated RNAs that contribute to the regulation of essential cellular processes, including transcription, RNA processing and translation. Analysis of this transcription system by in vitro transcription techniques has largely contributed to the discovery of its transcription factors and to the understanding of the regulation of human RNA polymerase III transcription. Here we review some of the key steps that led to the identification of transcription factors and to the definition of minimal promoter sequences for human RNA polymerase III transcription. PMID:25764111

  5. Computer analysis of femoral angiograms for evaluation of atherosclerosis in post-infarct males-clinical correlates

    NASA Technical Reports Server (NTRS)

    Sanmarco, M. E.; Blankenhorn, D. H.

    1975-01-01

    Femoral artery atheromatous lesions were studied and their changes as a measure of therapeutic effectiveness were assessed. The incidence of coronary risk factors in 100 patients was determined. Abnormal cholesterol was present in 42 percent, abnormal triglycerides in 66 percent, and an abnormal intravenous glucose tolerance test in 52 percent, as judged from a K value of 0.9 or less by the technique of Wahlbert. A history of high blood pressure was present in 32 percent. Smoking was one of the most common factors.

  6. [Prevalence and risk factors of Enterobius vermicularis among preschool children in kindergartens in Luohu District, Shenzhen City].

    PubMed

    Kuang, Cui-ping; Wu, Xiao-liang; Chen, Wu-shen; Wu, Fei-fei; Zhuo, Fei

    2015-02-01

    To understand the prevalence and risk factors of Enterobius vermicularis among preschool children in kindergartens in Luohu District, Shenzhen City. A total of 489 children in 6 kindergartens were selected by the stratified sampling method and investigated for E. vermicularis infection by the cellophane anal swab technique. Information on the sanitary condition of the kindergartens, personal hygiene, and family hygiene was collected by questionnaire. The infection rate of E. vermicularis was 10.2% (50/489). Single-factor analysis indicated that the following factors might be related to infection: class and grade within the kindergarten, bedroom flooring, private toilet, types of taps and beds, bed management, education levels of parents, frequency of showering and perianal washing, and handwashing before meals and after using the toilet. Multivariate logistic analysis indicated that bed management, the mother's education level, frequency of perianal washing, and private toilet were independent risk factors for E. vermicularis infection. To control E. vermicularis infection, the environment and management of kindergartens, parents' knowledge of E. vermicularis infection, and children's hygiene habits need to be improved.
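    The single-factor screening step described in this record is commonly done with crude odds ratios before fitting a multivariate logistic model. A minimal sketch (the 2x2 counts below are hypothetical, not the study's data):

```python
import numpy as np

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = np.exp(np.log(or_) - z * se)
    hi = np.exp(np.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: infection by a binary hygiene factor
or_, lo, hi = odds_ratio_ci(20, 80, 30, 359)
```

    A factor whose confidence interval excludes 1 would then be carried forward into the multivariate logistic analysis.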

  7. Exploring factors affecting registered nurses' pursuit of postgraduate education in Australia.

    PubMed

    Ng, Linda; Eley, Robert; Tuckett, Anthony

    2016-12-01

    The aim of this study was to explore the factors influencing registered nurses' pursuit of postgraduate education in specialty nursing practice in Australia. Despite the increased requirement for postgraduate education for advanced practice, little has been reported on the contributory factors involved in the decision to undertake further education. The Nurses' Attitudes Towards Postgraduate Education instrument was administered to 1632 registered nurses from the Nurses and Midwives e-Cohort Study across Australia, with a response rate of 35.9% (n = 568). Data were reduced using principal component analysis with varimax rotation. The analysis identified a three-factor solution for 14 items, accounting for 52.5% of the variance of the scale: "facilitators," "professional recognition," and "inhibiting factors." Facilitators of postgraduate education accounted for 28.5% of the variance, including: (i) improves knowledge; (ii) increases nurses' confidence in clinical decision-making; (iii) enhances nurses' careers; (iv) improves critical thinking; (v) improves nurses' clinical skill; and (vi) increases job satisfaction. This new instrument has potential clinical and research applications to support registered nurses' pursuit of postgraduate education. © 2016 John Wiley & Sons Australia, Ltd.
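    The data-reduction procedure named here, principal components with varimax rotation, can be sketched with NumPy alone. This is an illustrative sketch on synthetic data (random responses standing in for the 14 survey items), not a reproduction of the study's analysis:

```python
import numpy as np

def pca_loadings(X, n_factors):
    """Principal-component loadings from the item correlation matrix."""
    R = np.corrcoef(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(R)
    idx = np.argsort(eigvals)[::-1][:n_factors]
    loadings = eigvecs[:, idx] * np.sqrt(eigvals[idx])
    explained = eigvals[idx] / eigvals.sum()  # proportion of variance
    return loadings, explained

def varimax(L, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a loading matrix (Kaiser's criterion)."""
    n, k = L.shape
    R = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr**3 - Lr @ np.diag((Lr**2).sum(axis=0)) / n))
        R = u @ vt
        if s.sum() - var < tol:
            break
        var = s.sum()
    return L @ R

rng = np.random.default_rng(0)
X = rng.standard_normal((568, 14))  # stand-in for 568 respondents x 14 items
L, explained = pca_loadings(X, 3)
L_rot = varimax(L)
```

    Because varimax is an orthogonal rotation, it redistributes variance across factors while leaving each item's communality (row sum of squared loadings) unchanged.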

  8. Evaluation of the laser-induced breakdown spectroscopy technique for determination of the chemical composition of copper concentrates

    NASA Astrophysics Data System (ADS)

    Łazarek, Łukasz; Antończak, Arkadiusz J.; Wójcik, Michał R.; Drzymała, Jan; Abramski, Krzysztof M.

    2014-07-01

    Laser-induced breakdown spectroscopy (LIBS), like many other spectroscopic techniques, is a comparative method. Typically, in qualitative analysis, a synthetic certified standard with a well-known elemental composition is used to calibrate the system. Nevertheless, in all laser-induced techniques, such calibration can affect the accuracy through differences in the overall composition of the chosen standard. There are also some intermediate factors which can cause imprecision in measurements, such as optical absorption, surface structure and thermal conductivity. In this work the calibration performed for the LIBS technique utilizes pellets made directly from the tested materials (old, well-characterized samples). This choice produces a considerable improvement in the accuracy of the method. This technique was adopted for the determination of trace elements in industrial copper concentrates, standardized by conventional atomic absorption spectroscopy with a flame atomizer. A series of copper flotation concentrate samples was analyzed for three elements: silver, cobalt and vanadium. We also propose a method of post-processing the measurement data to minimize matrix effects and permit reliable analysis. It has been shown that the described technique can be used in qualitative and quantitative analyses of complex inorganic materials, such as copper flotation concentrates. It was noted that the final validation of such a methodology is limited mainly by the accuracy of the characterization of the standards.
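    The calibration at the heart of any comparative method like LIBS reduces to fitting line intensity against known concentration and inverting the fit for unknowns. A minimal sketch; the silver concentrations and intensities below are hypothetical illustrations, not the paper's measurements:

```python
import numpy as np

# Hypothetical calibration standards: Ag concentration (ppm) in pellet
# standards vs. measured LIBS emission-line intensity (arbitrary units)
conc = np.array([10.0, 25.0, 50.0, 100.0, 200.0])
intensity = np.array([118.0, 292.0, 571.0, 1140.0, 2265.0])

slope, intercept = np.polyfit(conc, intensity, 1)  # linear calibration line

def predict_concentration(measured_intensity):
    """Invert the calibration line to quantify an unknown sample."""
    return (measured_intensity - intercept) / slope
```

    Building the standards from the tested material itself, as the paper does, keeps the matrix of the standards close to that of the unknowns, which is exactly what a calibration line of this kind assumes.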

  9. Fabrication of dense wavelength division multiplexing filters with large useful area

    NASA Astrophysics Data System (ADS)

    Lee, Cheng-Chung; Chen, Sheng-Hui; Hsu, Jin-Cherng; Kuo, Chien-Cheng

    2006-08-01

    Dense wavelength division multiplexing (DWDM) filters, a kind of narrow band-pass filter, are extremely sensitive to optical thickness errors in each composite layer. Achieving a large useful coating area is therefore extremely difficult because of the uniformity problem. To enlarge the useful coating area, both the design and the fabrication must be improved. In this study, we discuss how the tooling factors at different positions and for different materials are related to the optical performance of the design. 100 GHz DWDM filters were fabricated by E-gun evaporation with ion-assisted deposition (IAD). To improve the coating uniformity, an analysis technique called the shaping tooling factor (STF) was used to analyze the deviation of the optical thickness in different materials so as to enlarge the useful coating area. A technique of etching the deposited layers with oxygen ions was also introduced. When the above techniques were applied in the fabrication of 100 GHz DWDM filters, the uniformity was better than +/-0.002% over an area of 72 mm in diameter and better than +/-0.0006% over 20 mm in diameter.

  10. Management of the second phase of labour: perineum protection techniques.

    PubMed

    Laganà, A S; Burgio, M A; Retto, G; Pizzo, A; Granese, R; Sturlese, E; Ciancimino, L; Chiofalo, B; Retto, A; Triolo, O

    2015-06-01

    Obstetric experience, alongside the scientific evidence in the literature, indicates several management techniques for the expulsive period of labour to minimize obstetric complications. Among the various methods that can be used for the protection of the perineum during the expulsive phase, some are performed prepartum (perineal massage), while most are used during childbirth. Among the second group, progressively increasing importance is assumed by the manual techniques to protect the perineum ("hands-on" and "hands-off") and by episiotomy. These techniques, when used in accordance with the guidelines, may reduce adverse outcomes for both the mother and the newborn, both immediately after birth and in the longer term. The midwife should be aware of the evidence in the literature so that a critical analysis of the available techniques can be made and put into action during the expulsive phase in order to protect the mother and the foetus from unfavourable outcomes. Currently, clinical evidence in the literature is directing obstetric and medical staff towards a careful analysis of maternal-foetal parameters, in order to achieve a precise assessment of the risk factors for intrapartum and postpartum outcomes. Increasingly, there is a need for close collaboration between the midwife and medical staff to ensure proper personalized assistance based on the particular characteristics of the woman and the fetus.

  11. Anterior segment sparing to reduce charged particle radiotherapy complications in uveal melanoma

    NASA Technical Reports Server (NTRS)

    Daftari, I. K.; Char, D. H.; Verhey, L. J.; Castro, J. R.; Petti, P. L.; Meecham, W. J.; Kroll, S.; Blakely, E. A.; Chatterjee, A. (Principal Investigator)

    1997-01-01

    PURPOSE: The purpose of this investigation is to delineate the risk factors in the development of neovascular glaucoma (NVG) after helium-ion irradiation of uveal melanoma patients and to propose a treatment technique that may reduce this risk. METHODS AND MATERIALS: 347 uveal melanoma patients were treated with helium ions using a single-port treatment technique. Using univariate and multivariate statistics, the NVG complication rate was analyzed according to the percent of anterior chamber in the radiation field, tumor size, tumor location, sex, age, dose, and other risk factors. Several University of California San Francisco-Lawrence Berkeley National Laboratory (LBNL) patients in each size category (medium, large, and extralarge) were retrospectively replanned using two ports instead of a single port. By using appropriate polar and azimuthal gaze angles or by treating patients with two ports, the maximum dose to the anterior segment of the eye can often be reduced, although a larger volume of the anterior chamber may receive a lower dose with two ports than with a single-port treatment. We hypothesize that this could reduce the level of complications that result from the irradiation of the anterior chamber of the eye. Dose-volume histograms were calculated for the lens and compared for the single- and two-port techniques. RESULTS: NVG developed in 121 (35%) patients. The risk of NVG peaked between 1 and 2.5 years posttreatment. By univariate and multivariate analysis, the percent of lens in the field was strongly correlated with the development of NVG. Other contributing factors were tumor height, history of diabetes, and vitreous hemorrhage. Dose-volume histogram analysis of single-port vs. two-port techniques demonstrates that for some patients in the medium and large category tumor groups, a significant decrease in dose to the structures in the anterior segment of the eye could have been achieved with the use of two ports. 
CONCLUSION: The development of NVG after helium-ion irradiation is correlated with the amount of lens and anterior chamber in the treatment field, tumor height, proximity to the fovea, history of diabetes, and the development of vitreous hemorrhage. Although the influence of the higher LET deposition of helium ions is unclear, this study suggests that reducing the dose to the anterior segment of the eye may reduce NVG complications. Based on this retrospective analysis of LBNL patients, we have implemented techniques to reduce the amount of the anterior segment receiving a high dose in our new series of patients treated with protons using the cyclotron at the UC Davis Crocker Nuclear Laboratory (CNL).
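    The dose-volume histograms compared in this record are computed from the per-voxel doses of a contoured structure: for each dose level, the fraction of the structure's volume receiving at least that dose. A minimal sketch with synthetic dose values (not the study's dosimetry):

```python
import numpy as np

def cumulative_dvh(dose_per_voxel, bin_width=0.5):
    """Cumulative DVH: fraction of the structure receiving >= each dose level."""
    dose = np.asarray(dose_per_voxel, dtype=float)
    levels = np.arange(0.0, dose.max() + bin_width, bin_width)
    volume_fraction = np.array([(dose >= d).mean() for d in levels])
    return levels, volume_fraction

# Synthetic per-voxel lens doses for an illustrative single-port plan
single_port = np.array([10.0, 12.0, 14.0, 15.0])
levels, vf = cumulative_dvh(single_port, bin_width=1.0)
```

    Comparing the two curves for single- and two-port plans then shows directly how much lens volume is spared at each dose level.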

  12. Practical application of the benchmarking technique to increase reliability and efficiency of power installations and main heat-mechanic equipment of thermal power plants

    NASA Astrophysics Data System (ADS)

    Rimov, A. A.; Chukanova, T. I.; Trofimov, Yu. V.

    2016-12-01

    Approaches to comparative quality analysis (benchmarking) of power installations in the power industry are systematized. It is shown that the most efficient implementation of the benchmarking technique is the analysis of statistical distributions of indicators within a homogeneous group of comparable power installations. Building on this approach, a benchmarking technique is developed that is aimed at revealing the available reserves for improving the reliability and heat-efficiency indicators of power installations at thermal power plants. The technique enables reliable comparison of the quality of power installations within a homogeneous group of limited size and supports adequate decisions on improving particular technical characteristics of a given installation. It structures the list of comparison indicators and the internal factors affecting them according to the requirements of the sectoral standards, taking into account price-formation characteristics in the Russian power industry; this structuring ensures traceability of the reasons for deviations of the internal influencing factors from their specified values. The starting point for further detailed analysis of a power installation's lag behind best practice, expressed in specific monetary terms, is the positioning of that installation on the distribution of the key indicator, a convolution of the comparison indicators. The distribution of the key indicator is simulated by the Monte-Carlo method from the actual distributions of the comparison indicators: specific lost profit due to short supply of electric energy and power, specific cost of losses due to nonoptimal expenditures on repairs, and specific cost of excess fuel-equivalent consumption. 
Quality-loss indicators are developed to facilitate analysis of the benchmarking results; they represent the quality loss of a power installation as the difference between the actual value of the key indicator (or a comparison indicator) and the best quartile of the existing distribution. The uncertainty of the obtained quality-loss values was evaluated by transforming the standard uncertainties of the input values into expanded uncertainties of the output values at a confidence level of 95%. The efficiency of the technique is demonstrated by benchmarking the main thermal and mechanical equipment of T-250 extraction power-generating units and power installations of thermal power plants with a main steam pressure of 130 atm.
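    The Monte-Carlo step described in this record, simulating the key indicator as the sum (convolution) of the comparison indicators and positioning a unit against the best quartile, might be sketched as follows. All distribution choices, parameters, and the unit's observed value are hypothetical placeholders, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical fitted distributions for the three comparison indicators
lost_profit = rng.lognormal(mean=1.0, sigma=0.4, size=n)    # short supply
repair_losses = rng.lognormal(mean=0.5, sigma=0.6, size=n)  # nonoptimal repairs
excess_fuel = rng.lognormal(mean=0.2, sigma=0.5, size=n)    # excess fuel use

# Key indicator as the sum of the three: sampling realizes the convolution
key = lost_profit + repair_losses + excess_fuel
best_quartile = np.quantile(key, 0.25)

unit_key = 6.8  # a given installation's observed key indicator (hypothetical)
quality_loss = max(0.0, unit_key - best_quartile)
```

    The quality-loss indicator is then the monetary gap between the unit's actual value and the best quartile of the simulated distribution.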

  13. Methodological issues in microdialysis sampling for pharmacokinetic studies.

    PubMed

    de Lange, E C; de Boer, A G; Breimer, D D

    2000-12-15

    Microdialysis is an in vivo technique that permits monitoring of local concentrations of drugs and metabolites at specific sites in the body. Microdialysis has several characteristics that make it an attractive tool for pharmacokinetic research. About a decade ago the microdialysis technique entered the field of pharmacokinetic research, first in the brain and later also in peripheral tissues and blood. Within this period much has been learned about the proper use of this technique. Today, it has outgrown its teething problems and its potential and limitations have become more or less well defined. As microdialysis is a delicate technique for which experimental factors appear to be critical with respect to the validity of the experimental outcomes, several factors should be considered. These include the probe; the perfusion solution; the post-surgery interval in relation to surgical trauma, tissue integrity and repeated experiments; the analysis of microdialysate samples; and the quantification of microdialysate data. Provided that experimental conditions are optimized to give valid and quantitative results, microdialysis can provide numerous data points from a relatively small number of individual animals to determine detailed pharmacokinetic information. An example of the added value of this technique compared with other in vivo pharmacokinetic techniques is that microdialysis reflects free concentrations in tissues and plasma. This gives the opportunity to assess information on drug transport equilibration across membranes such as the blood-brain barrier, which has already provided new insights. With the progress of analytical methodology, especially with respect to low volume/low concentration measurements and simultaneous measurement of multiple compounds, the applications and importance of the microdialysis technique in pharmacokinetic research will continue to increase.
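    The quantification step mentioned in this record hinges on probe recovery: dialysate concentrations understate tissue concentrations because equilibration across the membrane is incomplete. One common calibration, retrodialysis by loss, can be sketched as follows (all concentrations hypothetical):

```python
def relative_recovery(c_perfusate_in, c_dialysate_out):
    """Retrodialysis-by-loss: the fraction of analyte lost from the
    perfusate estimates the probe's in vivo relative recovery."""
    return (c_perfusate_in - c_dialysate_out) / c_perfusate_in

def free_tissue_concentration(c_dialysate, recovery):
    """Correct a measured dialysate concentration for incomplete recovery."""
    return c_dialysate / recovery

# Hypothetical calibration: 100 uM perfused in, 70 uM comes back out
rec = relative_recovery(100.0, 70.0)          # recovery = 0.30
conc = free_tissue_concentration(15.0, rec)   # 15 uM measured -> 50 uM free
```

    The corrected value is a free (unbound) concentration, which is what makes microdialysis useful for studying transport across barriers such as the blood-brain barrier.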

  14. Calibrationless parallel magnetic resonance imaging: a joint sparsity model.

    PubMed

    Majumdar, Angshul; Chaudhury, Kunal Narayan; Ward, Rabab

    2013-12-05

    State-of-the-art parallel MRI techniques either explicitly or implicitly require certain parameters to be estimated, e.g., the sensitivity map for SENSE and SMASH, and interpolation weights for GRAPPA and SPIRiT. Thus all these techniques are sensitive to the calibration (parameter estimation) stage. In this work, we propose a parallel MRI technique that does not require any calibration but yields reconstruction results that are on par with (or even better than) state-of-the-art methods in parallel MRI. Our proposed method requires solving non-convex analysis- and synthesis-prior joint-sparsity problems; this work also derives the algorithms for solving them. Experimental validation was carried out on two datasets: an eight-channel brain scan and an eight-channel Shepp-Logan phantom. Two sampling methods were used: variable-density random sampling and non-Cartesian radial sampling. For the brain data an acceleration factor of 4 was used, and for the phantom an acceleration factor of 6. The reconstruction results were quantitatively evaluated using the normalised mean squared error between the reconstructed image and the original; the qualitative evaluation was based on the reconstructed images themselves. We compared our work with four state-of-the-art parallel imaging techniques: two calibrated methods (CS SENSE and l1SPIRiT) and two calibration-free techniques (Distributed CS and SAKE). Our method yields better reconstruction results than all of them.
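    The quantitative criterion used in this record, normalised mean squared error, is simple to state: the squared error energy divided by the energy of the reference image. A minimal sketch:

```python
import numpy as np

def nmse(reconstruction, reference):
    """Normalised mean squared error between a reconstruction and its
    ground-truth reference: ||x_hat - x||^2 / ||x||^2."""
    reconstruction = np.asarray(reconstruction, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return (np.linalg.norm(reconstruction - reference) ** 2
            / np.linalg.norm(reference) ** 2)
```

    NMSE is 0 for a perfect reconstruction and 1 for an all-zero one, which makes it a convenient scale-free way to compare reconstructions across acceleration factors.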

  15. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    NASA Astrophysics Data System (ADS)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006--2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction of 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. 
Finally, editors and reviewers are encouraged to recognize this gap in best practices and subsequently to promote instrument development research that is more consistent through the peer-review process.

  16. [Methods of a posteriori identification of food patterns in Brazilian children: a systematic review].

    PubMed

    Carvalho, Carolina Abreu de; Fonsêca, Poliana Cristina de Almeida; Nobre, Luciana Neri; Priore, Silvia Eloiza; Franceschini, Sylvia do Carmo Castro

    2016-01-01

    The objective of this study is to provide guidance for identifying dietary patterns using the a posteriori approach, and to analyze the methodological aspects of the studies conducted in Brazil that identified the dietary patterns of children. Articles were selected from the Latin American and Caribbean Literature on Health Sciences, Scientific Electronic Library Online and Pubmed databases. The key words were: Dietary pattern; Food pattern; Principal Components Analysis; Factor analysis; Cluster analysis; Reduced rank regression. We included studies that identified dietary patterns of children using the a posteriori approach. Seven studies published between 2007 and 2014 were selected, six cross-sectional and one cohort. Five studies used a food frequency questionnaire for dietary assessment; one used a 24-hour dietary recall and another a food list. The exploratory method used in most publications was principal components factor analysis, followed by cluster analysis. The sample size of the studies ranged from 232 to 4231, values of the Kaiser-Meyer-Olkin test from 0.524 to 0.873, and Cronbach's alpha from 0.51 to 0.69. Few Brazilian studies have identified dietary patterns of children using the a posteriori approach, and principal components factor analysis was the technique most used.
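    Two of the statistics reported across the reviewed studies, the Kaiser-Meyer-Olkin measure of sampling adequacy and Cronbach's alpha, can be computed directly from a score matrix. A sketch on synthetic data (not the reviewed studies' data):

```python
import numpy as np

def cronbach_alpha(X):
    """Internal consistency of a subjects x items score matrix."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def kmo(X):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    R = np.corrcoef(X, rowvar=False)
    Rinv = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(Rinv), np.diag(Rinv)))
    P = -Rinv / d                      # anti-image partial correlations
    np.fill_diagonal(P, 0.0)
    R = R - np.eye(R.shape[0])         # zero the diagonal correlations
    return (R**2).sum() / ((R**2).sum() + (P**2).sum())
```

    KMO values above roughly 0.5 are conventionally taken as the minimum for factor analysis to be appropriate, which is why the reviewed studies report values in the 0.524-0.873 range.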

  17. Development and validation of a questionnaire to evaluate infection control in oral radiology.

    PubMed

    da Costa, Eliana D; Pinelli, Camila; da Silva Tagliaferro, Elaine P; Corrente, José E; Ambrosano, Glaucia M B

    2017-04-01

    To create and validate a questionnaire to evaluate infection control in oral radiology. The questionnaire was developed after review of the literature, which included published articles and the biosafety protocols available from healthcare agencies. The initial version of the questionnaire was composed of 14 multiple choice questions and was divided into 3 domains on handwashing, disinfection/protection of surfaces and disinfectant used. Content validity was assessed by two expert committees, which reviewed the content and scope of the questionnaire and the relevance of each item, respectively. Reliability was evaluated using test-retest and internal consistency methods with 115 undergraduate dentistry students. Construct validity was assessed using the known-groups technique and factor analysis. The known-groups technique involved 641 undergraduate dentistry students, 20 PhD students and 15 oral radiology professors. In the factor analysis, 3 radiology technicians also participated in addition to the 641 undergraduates, 20 PhD students and 15 oral radiology professors. Reliability was satisfactory to excellent for the ordinal variables (intraclass correlation coefficient = 0.722-1.000) and good to excellent for the yes/no questions (kappa = 0.662-0.913), with good internal consistency (Cronbach's alpha = 0.88). After a factor analysis, some questions were excluded, and the questions were grouped into new domains. Significant differences were observed between answers from different groups. The final version of the questionnaire was composed of nine domains. The questionnaire created was found to exhibit good psychometric properties for assessing infection control in oral radiology.
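    The chance-corrected agreement statistic reported for the yes/no items, Cohen's kappa, can be sketched as follows. The test-retest answers below are hypothetical, not the study's data:

```python
import numpy as np

def cohen_kappa(rater1, rater2):
    """Chance-corrected agreement between two sets of categorical ratings."""
    r1, r2 = np.asarray(rater1), np.asarray(rater2)
    categories = np.unique(np.concatenate([r1, r2]))
    p_observed = (r1 == r2).mean()
    p_chance = sum((r1 == c).mean() * (r2 == c).mean() for c in categories)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical test-retest answers (1 = yes, 0 = no) from ten students
first  = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
second = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]
kappa = cohen_kappa(first, second)  # about 0.78
```

    Unlike raw percent agreement, kappa discounts the agreement two raters would reach by chance alone, which is why it is preferred for test-retest reliability of binary items.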

  18. STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitch, S.H.; Morris, J.W.

    1962-12-15

    Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described as to their application for determining the effects of various processing steps, establishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished and the number of required tests was substantially reduced by application of these statistical methods to the SNAP 10A production development effort. (auth)
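    The workhorse named in this record, one-way analysis of variance, reduces to a ratio of between-group to within-group mean squares. A self-contained sketch; the three process-variant measurements are synthetic illustrations, not SNAP 10A data:

```python
import numpy as np

def one_way_anova_F(*groups):
    """F statistic for a one-way analysis of variance."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    all_data = np.concatenate(groups)
    grand_mean = all_data.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_data) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical element measurements from three process variants
batch_a = [210.0, 214.0, 209.0]
batch_b = [221.0, 219.0, 224.0]
batch_c = [211.0, 213.0, 215.0]
F = one_way_anova_F(batch_a, batch_b, batch_c)
```

    A large F relative to the F distribution's critical value flags a processing step whose variants genuinely differ, which is how unnecessary or detrimental steps can be screened out.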

  19. MEG-SIM: a web portal for testing MEG analysis methods using realistic simulated and empirical data.

    PubMed

    Aine, C J; Sanfratello, L; Ranken, D; Best, E; MacArthur, J A; Wallace, T; Gilliam, K; Donahue, C H; Montaño, R; Bryant, J E; Scott, A; Stephen, J M

    2012-04-01

    MEG and EEG measure electrophysiological activity in the brain with exquisite temporal resolution. Because of this unique strength relative to noninvasive hemodynamic-based measures (fMRI, PET), the complementary nature of hemodynamic and electrophysiological techniques is becoming more widely recognized (e.g., Human Connectome Project). However, the available analysis methods for solving the inverse problem for MEG and EEG have not been compared and standardized to the extent that they have for fMRI/PET. A number of factors, including the non-uniqueness of the solution to the inverse problem for MEG/EEG, have led to multiple analysis techniques which have not been tested on consistent datasets, making direct comparisons of techniques challenging (or impossible). Since each of the methods is known to have their own set of strengths and weaknesses, it would be beneficial to quantify them. Toward this end, we are announcing the establishment of a website containing an extensive series of realistic simulated data for testing purposes ( http://cobre.mrn.org/megsim/ ). Here, we present: 1) a brief overview of the basic types of inverse procedures; 2) the rationale and description of the testbed created; and 3) cases emphasizing functional connectivity (e.g., oscillatory activity) suitable for a wide assortment of analyses including independent component analysis (ICA), Granger Causality/Directed transfer function, and single-trial analysis.

  20. MEG-SIM: A Web Portal for Testing MEG Analysis Methods using Realistic Simulated and Empirical Data

    PubMed Central

    Aine, C. J.; Sanfratello, L.; Ranken, D.; Best, E.; MacArthur, J. A.; Wallace, T.; Gilliam, K.; Donahue, C. H.; Montaño, R.; Bryant, J. E.; Scott, A.; Stephen, J. M.

    2012-01-01

    MEG and EEG measure electrophysiological activity in the brain with exquisite temporal resolution. Because of this unique strength relative to noninvasive hemodynamic-based measures (fMRI, PET), the complementary nature of hemodynamic and electrophysiological techniques is becoming more widely recognized (e.g., Human Connectome Project). However, the available analysis methods for solving the inverse problem for MEG and EEG have not been compared and standardized to the extent that they have for fMRI/PET. A number of factors, including the non-uniqueness of the solution to the inverse problem for MEG/EEG, have led to multiple analysis techniques which have not been tested on consistent datasets, making direct comparisons of techniques challenging (or impossible). Since each of the methods is known to have their own set of strengths and weaknesses, it would be beneficial to quantify them. Toward this end, we are announcing the establishment of a website containing an extensive series of realistic simulated data for testing purposes (http://cobre.mrn.org/megsim/). Here, we present: 1) a brief overview of the basic types of inverse procedures; 2) the rationale and description of the testbed created; and 3) cases emphasizing functional connectivity (e.g., oscillatory activity) suitable for a wide assortment of analyses including independent component analysis (ICA), Granger Causality/Directed transfer function, and single-trial analysis. PMID:22068921
