Sample records for factor analysis technique

  1. Determining the Number of Factors in P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…
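
    For readers implementing such criteria, one widely used method is Horn's parallel analysis, which retains factors whose eigenvalues exceed those obtained from random data of the same shape. A minimal sketch in Python/NumPy under that assumption (the data below are synthetic, not from the study):

        import numpy as np

        def parallel_analysis(X, n_sims=100, seed=0):
            """Keep factors whose correlation-matrix eigenvalues exceed the
            95th percentile of eigenvalues from random data of equal shape."""
            rng = np.random.default_rng(seed)
            n, p = X.shape
            obs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
            rand = np.empty((n_sims, p))
            for i in range(n_sims):
                R = rng.standard_normal((n, p))
                rand[i] = np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]
            return int(np.sum(obs > np.percentile(rand, 95, axis=0)))

        # Toy example: two correlated blocks of variables -> two factors expected.
        rng = np.random.default_rng(1)
        f = rng.standard_normal((200, 2))
        X = np.hstack([f[:, [0]] + 0.5 * rng.standard_normal((200, 3)),
                       f[:, [1]] + 0.5 * rng.standard_normal((200, 3))])
        print(parallel_analysis(X))   # typically prints 2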

  2. The Recoverability of P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  3. Factor Analysis and Counseling Research

    ERIC Educational Resources Information Center

    Weiss, David J.

    1970-01-01

    Topics discussed include factor analysis versus cluster analysis, analysis of Q correlation matrices, ipsativity and factor analysis, and tests for the significance of a correlation matrix prior to application of factor analytic techniques. Techniques for factor extraction discussed include principal components, canonical factor analysis, alpha…

  4. Aggregation factor analysis for protein formulation by a systematic approach using FTIR, SEC and design of experiments techniques.

    PubMed

    Feng, Yan Wen; Ooishi, Ayako; Honda, Shinya

    2012-01-05

    A simple systematic approach using Fourier transform infrared (FTIR) spectroscopy, size exclusion chromatography (SEC) and design of experiments (DOE) techniques was applied to the analysis of aggregation factors for protein formulations in stress and accelerated testing. FTIR and SEC were used to evaluate protein conformational and storage stabilities, respectively. DOE was used to determine the suitable formulation and to analyze both the main effect of single factors and the interaction effect of combined factors on aggregation. Our results indicated that (i) analysis at a low protein concentration is not always applicable to high concentration formulations; (ii) an investigation of interaction effects of combined factors as well as main effects of single factors is effective for improving conformational stability of proteins; (iii) with the exception of pH, the results of stress testing with regard to aggregation factors can be used to select a suitable formulation instead of performing time-consuming accelerated testing; (iv) a suitable pH condition should not be determined in stress testing but in accelerated testing, because of the inconsistent effects of pH on conformational and storage stabilities. In summary, we propose a three-step strategy, using FTIR, SEC and DOE techniques, to effectively analyze the aggregation factors and perform a rapid screening for suitable conditions of protein formulation.

  5. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As part of safety within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and classified in order to manage human error. This research provides a framework and methodology, using the Human Error Assessment and Reduction Technique (HEART) and the Human Factor Analysis and Classification System (HFACS), as an analysis tool to identify contributing factors, assess their impact on human error events, and predict the human error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle-related mishap data. This modifiable framework can be used and followed by other space and similarly complex operations.
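
    The HEART calculation itself is compact: a nominal HEP for a generic task type is scaled by each applicable error-producing condition (EPC), weighted by its assessed proportion of affect. A minimal sketch of that standard formula; the task type and EPC values below are illustrative, not taken from this research:

        # HEART human-error-probability sketch.
        # nominal_hep: nominal unreliability of the generic task type;
        # epcs: list of (epc_multiplier, assessed_proportion_of_affect) pairs.
        def heart_hep(nominal_hep, epcs):
            hep = nominal_hep
            for multiplier, proportion in epcs:
                hep *= (multiplier - 1.0) * proportion + 1.0
            return min(hep, 1.0)  # a probability cannot exceed 1

        # Illustrative numbers only: a fairly simple task (nominal HEP 0.0004)
        # degraded by time pressure (x11, 40% weighting) and inexperience (x3, 60%).
        print(heart_hep(0.0004, [(11, 0.4), (3, 0.6)]))  # ~0.0044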

  6. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As part of quality within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and classified in order to manage human error. This presentation provides a framework and methodology, using the Human Error Assessment and Reduction Technique (HEART) and the Human Factor Analysis and Classification System (HFACS), as an analysis tool to identify contributing factors, assess their impact on human error events, and predict the human error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle-related mishap data. This modifiable framework can be used and followed by other space and similarly complex operations.

  7. Factors influencing patient compliance with therapeutic regimens in chronic heart failure: A critical incident technique analysis.

    PubMed

    Strömberg, A; Broström, A; Dahlström, U; Fridlund, B

    1999-01-01

    The aim of this study was to identify factors influencing compliance with prescribed treatment in patients with chronic heart failure. A qualitative design with a critical incident technique was used. Incidents were collected through interviews with 25 patients with heart failure strategically selected from a primary health care clinic, a medical ward, and a specialist clinic. Two hundred sixty critical incidents were identified in the interviews and 2 main areas emerged in the analysis: inward factors and outward factors. The inward factors described how compliance was influenced by the personality of the patient, the disease, and the treatment. The outward factors described how compliance was influenced by social activities, social relationships, and health care professionals. By identifying the inward and outward factors influencing patients with chronic heart failure, health care professionals can assess whether intervention is needed to increase compliance.

  8. Ideal, nonideal, and no-marker variables: The confirmatory factor analysis (CFA) marker technique works when it matters.

    PubMed

    Williams, Larry J; O'Boyle, Ernest H

    2015-09-01

    A persistent concern in the management and applied psychology literature is the effect of common method variance on observed relations among variables. Recent work (i.e., Richardson, Simmering, & Sturman, 2009) evaluated 3 analytical approaches to controlling for common method variance, including the confirmatory factor analysis (CFA) marker technique. Their findings indicated significant problems with this technique, especially with nonideal marker variables (those with theoretical relations with substantive variables). Based on their simulation results, Richardson et al. concluded that not correcting for method variance provides more accurate estimates than using the CFA marker technique. We reexamined the effects of using marker variables in a simulation study and found the degree of error in estimates of a substantive factor correlation was relatively small in most cases, and much smaller than the error associated with making no correction. Further, in instances in which the error was large, the correlations between the marker and substantive scales were higher than those found in organizational research with marker variables. We conclude that in most practical settings, the CFA marker technique yields parameter estimates close to their true values, and the criticisms made by Richardson et al. are overstated.

  9. Data analysis techniques

    NASA Technical Reports Server (NTRS)

    Park, Steve

    1990-01-01

    A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
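
    As an illustration of the principal component step on multiband imagery, a minimal NumPy sketch, assuming the image has been flattened to a pixels-by-bands matrix (the data here are synthetic):

        import numpy as np

        # Fake multiband image data: 1000 pixels x 6 correlated bands.
        rng = np.random.default_rng(0)
        X = rng.standard_normal((1000, 6)) @ rng.standard_normal((6, 6))

        Xc = X - X.mean(axis=0)                  # center each band
        cov = np.cov(Xc, rowvar=False)           # band-to-band covariance
        eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending order
        order = np.argsort(eigvals)[::-1]
        components = eigvecs[:, order]           # principal axes
        scores = Xc @ components                 # decorrelated component images
        explained = eigvals[order] / eigvals.sum()
        print(explained.round(3))                # variance fraction per component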

  10. How Factor Analysis Can Be Used in Classification.

    ERIC Educational Resources Information Center

    Harman, Harry H.

    This is a methodological study that suggests a taxometric technique for objective classification of yeasts. It makes use of the minres method of factor analysis and groups strains of yeast according to their factor profiles. The similarities are judged in the higher-dimensional space determined by the factor analysis, but otherwise rely on the…

  11. Electrical Load Profile Analysis Using Clustering Techniques

    NASA Astrophysics Data System (ADS)

    Damayanti, R.; Abdullah, A. G.; Purnama, W.; Nandiyanto, A. B. D.

    2017-03-01

    Data mining is one of the data processing techniques used to extract information from a set of stored data. Every day, electricity load consumption is recorded by the electrical company, usually at intervals of 15 or 30 minutes. This paper uses a clustering technique, one of the data mining techniques, to analyse the electrical load profiles during 2014. Three clustering methods were compared, namely K-Means (KM), Fuzzy C-Means (FCM), and K-Harmonic Means (KHM). The results show that KHM is the most appropriate method for classifying the electrical load profiles. The optimum number of clusters was determined using the Davies-Bouldin index. By grouping the load profiles, analysis of demand variation and estimation of energy loss can be carried out for groups of load profiles with a similar pattern. From the groups of electrical load profiles, the cluster load factor and the range of the cluster loss factor can be determined, which helps to find the range of coefficient values for estimating energy loss without performing load flow studies.
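
    A minimal sketch of the clustering-plus-validation loop with scikit-learn; KHM is not in scikit-learn, so plain K-Means stands in, and the half-hourly profiles are simulated (both are assumptions, not the paper's setup):

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import davies_bouldin_score

        # Fake daily load profiles: 365 days x 48 half-hour readings.
        rng = np.random.default_rng(0)
        t = np.linspace(0, 2 * np.pi, 48)
        profiles = np.vstack([np.sin(t + s) + 0.1 * rng.standard_normal(48)
                              for s in rng.choice([0.0, 1.5, 3.0], size=365)])

        # Pick the cluster count that minimizes the Davies-Bouldin index.
        scores = {}
        for k in range(2, 8):
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(profiles)
            scores[k] = davies_bouldin_score(profiles, labels)
        best_k = min(scores, key=scores.get)
        print(best_k, round(scores[best_k], 3))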

  12. Error analysis of multi-needle Langmuir probe measurement technique.

    PubMed

    Barjatya, Aroh; Merritt, William

    2018-04-01

    The multi-needle Langmuir probe is a fairly new instrument technique that has been flown on several recent sounding rockets and is slated to fly on a subset of the QB50 CubeSat constellation. This paper takes a fundamental look at the data analysis procedures used for this instrument to derive absolute electron density. Our calculations suggest that while the technique remains promising, the current data analysis procedures could easily result in errors of 50% or more. We present a simple data analysis adjustment that can reduce errors by at least a factor of five in typical operation.

  13. Error analysis of multi-needle Langmuir probe measurement technique

    NASA Astrophysics Data System (ADS)

    Barjatya, Aroh; Merritt, William

    2018-04-01

    The multi-needle Langmuir probe is a fairly new instrument technique that has been flown on several recent sounding rockets and is slated to fly on a subset of the QB50 CubeSat constellation. This paper takes a fundamental look at the data analysis procedures used for this instrument to derive absolute electron density. Our calculations suggest that while the technique remains promising, the current data analysis procedures could easily result in errors of 50% or more. We present a simple data analysis adjustment that can reduce errors by at least a factor of five in typical operation.

  14. Data-Mining Techniques in Detecting Factors Linked to Academic Achievement

    ERIC Educational Resources Information Center

    Martínez Abad, Fernando; Chaparro Caso López, Alicia A.

    2017-01-01

    In light of the emergence of statistical analysis techniques based on data mining in education sciences, and the potential they offer to detect non-trivial information in large databases, this paper presents a procedure used to detect factors linked to academic achievement in large-scale assessments. The study is based on a non-experimental,…

  15. What School Psychologists Need to Know about Factor Analysis

    ERIC Educational Resources Information Center

    McGill, Ryan J.; Dombrowski, Stefan C.

    2017-01-01

    Factor analysis is a versatile class of psychometric techniques used by researchers to provide insight into the psychological dimensions (factors) that may account for the relationships among variables in a given dataset. The primary goal of a factor analysis is to determine a more parsimonious set of variables (i.e., fewer than the number of…

  16. Diffraction analysis of customized illumination technique

    NASA Astrophysics Data System (ADS)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received considerable attention from lithographers. A new approach to illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve the customized illumination shape. With the help of this technique, it was found that a layout change does not change the shape of the customized illumination mode.

  17. Instruments measuring perceived racism/racial discrimination: review and critique of factor analytic techniques.

    PubMed

    Atkins, Rahshida

    2014-01-01

    Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated if the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81 percent of studies. The majority of studies reported appropriate criteria for the acceptance of un-rotated factors (81%) and justified the rotation method (75%). Exactly 94 percent of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81 percent of the studies, the conceptualized dimensions were supported by factor analysis.

  18. INSTRUMENTS MEASURING PERCEIVED RACISM/RACIAL DISCRIMINATION: REVIEW AND CRITIQUE OF FACTOR ANALYTIC TECHNIQUES

    PubMed Central

    Atkins, Rahshida

    2015-01-01

    Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated if the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81 percent of studies. The majority of studies reported appropriate criteria for the acceptance of un-rotated factors (81%) and justified the rotation method (75%). Exactly 94 percent of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81 percent of the studies, the conceptualized dimensions were supported by factor analysis. PMID:25626225

  19. Fluctuations in alliance and use of techniques over time: A bidirectional relation between use of "common factors" techniques and the development of the working alliance.

    PubMed

    Solomonov, Nili; McCarthy, Kevin S; Keefe, John R; Gorman, Bernard S; Blanchard, Mark; Barber, Jacques P

    2018-01-01

    The aim of this study was twofold: (a) investigate whether therapists are consistent in their use of therapeutic techniques throughout supportive-expressive therapy (SET) and (b) examine the bidirectional relation between therapists' use of therapeutic techniques and the working alliance over the course of SET. Thirty-seven depressed patients were assigned to 16 weeks of SET as part of a larger randomized clinical trial (Barber, Barrett, Gallop, Rynn, & Rickels, ). The Working Alliance Inventory-Short Form (WAI-SF) was administered at Weeks 2, 4, and 8. Use of therapeutic interventions was rated by independent observers using the Multitheoretical List of Therapeutic Interventions (MULTI). Intraclass correlation coefficients assessed therapists' consistency in use of techniques. A cross-lagged path analysis estimated the bidirectional WAI-SF-MULTI relation across time. Therapists were moderately consistent in their use of prescribed techniques (psychodynamic, process-experiential, and person-centred). However, they were inconsistent, or more flexible, in their use of "common factors" techniques (e.g., empathy, active listening, hope, and encouragement). A positive bidirectional relation was found between use of common factors techniques and the working alliance, such that initial high levels of common factors (but not prescribed) techniques predicted higher alliance later on, and vice versa. Therapists tend to modulate their use of common factors techniques across treatment. Additionally, when a strong working alliance is developed early in treatment, therapists tend to use more common factors techniques later on. Moreover, high use of common factors techniques is predictive of later improvement in the alliance.

  1. A human factors analysis of EVA time requirements

    NASA Technical Reports Server (NTRS)

    Pate, D. W.

    1996-01-01

    Human Factors Engineering (HFE), also known as Ergonomics, is a discipline whose goal is to engineer a safer, more efficient interface between humans and machines. HFE makes use of a wide range of tools and techniques to fulfill this goal. One of these tools is known as motion and time study, a technique used to develop time standards for given tasks. A human factors motion and time study was initiated with the goal of developing a database of EVA task times and a method of utilizing the database to predict how long an ExtraVehicular Activity (EVA) should take. Initial development relied on the EVA activities performed during the STS-61 mission (Hubble repair). The first step of the analysis was to become familiar with EVAs and with the previous studies and documents produced on EVAs. After reviewing these documents, an initial set of task primitives and task time modifiers was developed. Videotaped footage of STS-61 EVAs was analyzed using these primitives and task time modifiers. Data for two entire EVA missions and portions of several others, each with two EVA astronauts, was collected for analysis. Feedback from the analysis of the data will be used to further refine the primitives and task time modifiers used. Analysis of variance techniques for categorical data will be used to determine which factors may, individually or through interactions, affect the primitive times and how much of an effect they have.

  2. [Introduction to Exploratory Factor Analysis (EFA)].

    PubMed

    Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón

    2012-03-01

    Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. The objectives are to present in a clear and concise manner the main applications of this technique, to determine the basic requirements for its use, providing a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation in order to avoid erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology for achieving factor derivation, global adjustment evaluation, and adequate interpretation of results.
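
    A minimal sketch of the workflow such introductions describe, using the third-party factor_analyzer package (an assumed choice; any EFA implementation would do) on simulated two-factor data: factorability checks, extraction, rotation, and inspection of loadings:

        import numpy as np
        from factor_analyzer import FactorAnalyzer
        from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                                     calculate_kmo)

        # Simulated item responses with two underlying factors (illustrative).
        rng = np.random.default_rng(0)
        f = rng.standard_normal((300, 2))
        X = np.hstack([f[:, [0]] + 0.6 * rng.standard_normal((300, 4)),
                       f[:, [1]] + 0.6 * rng.standard_normal((300, 4))])

        # Step 1: check factorability (Bartlett's sphericity test, KMO measure).
        chi2, p = calculate_bartlett_sphericity(X)
        kmo_per_item, kmo_total = calculate_kmo(X)
        print(f"Bartlett p={p:.3g}, KMO={kmo_total:.2f}")

        # Step 2: extract and rotate; Step 3: interpret the loading pattern.
        fa = FactorAnalyzer(n_factors=2, rotation="varimax")
        fa.fit(X)
        print(fa.loadings_.round(2))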

  3. Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.

    PubMed

    Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques make it possible i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of complex multicriteria data sets and enables an improved use of information for effective evaluation of control strategies.
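
    A rough scikit-learn analogue of the CA, PCA/FA, and DA chain on an evaluation matrix (strategies by criteria); the matrix below is simulated, not BSM2 output:

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import AgglomerativeClustering
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Illustrative evaluation matrix: 30 control strategies x 8 criteria.
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(loc, 0.5, size=(10, 8)) for loc in (0.0, 1.5, 3.0)])
        Z = StandardScaler().fit_transform(X)

        clusters = AgglomerativeClustering(n_clusters=3).fit_predict(Z)  # CA
        scores = PCA(n_components=2).fit_transform(Z)                    # PCA/FA
        lda = LinearDiscriminantAnalysis().fit(Z, clusters)              # DA
        # Large |scalings_| flag the criteria that best separate the clusters.
        print(np.abs(lda.scalings_).sum(axis=1).round(2))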

  4. Evaluation of Landslide Mapping Techniques and LiDAR-based Conditioning Factors

    NASA Astrophysics Data System (ADS)

    Mahalingam, R.; Olsen, M. J.

    2014-12-01

    Landslides are a major geohazard, resulting in significant human, infrastructure, and economic losses. Landslide susceptibility mapping can help communities plan and prepare for these damaging events. Mapping landslide-susceptible locations using GIS and remote sensing techniques has gained popularity over the past three decades. These efforts use a wide variety of procedures and consider a wide range of factors. Unfortunately, each study is often completed differently and independently of others. Further, the quality of the datasets used varies in terms of source, data collection, and generation, which can propagate errors or inconsistencies into the resulting output maps. Light detection and ranging (LiDAR) has proved to have higher accuracy in representing the continuous topographic surface, which can help minimize this uncertainty. The primary objectives of this paper are to investigate the applicability and performance of terrain factors in landslide hazard mapping, determine if LiDAR-derived datasets (slope, slope roughness, terrain roughness, stream power index and compound topographic index) can be used for predictive mapping without data representing other common landslide conditioning factors, and evaluate the differences in landslide susceptibility mapping using widely used statistical approaches. The aforementioned factors were used to produce landslide susceptibility maps for a 140 km2 study area in northwest Oregon using six representative techniques: frequency ratio, weights of evidence, logistic regression, discriminant analysis, artificial neural network, and support vector machine. Most notably, the research showed an advantage in selecting fewer critical conditioning factors. The most reliable factors could all be derived from a single LiDAR DEM, reducing the need for laborious and costly data gathering. Most of the six techniques showed similar statistical results; however, the artificial neural network showed less accuracy for predictive mapping.
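
    A minimal sketch of one of the six techniques (logistic regression) applied to a table of LiDAR-derived conditioning factors; the factor values and labels are synthetic, not the Oregon data:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        # Fake conditioning-factor table: slope, slope roughness, terrain
        # roughness, stream power index, compound topographic index.
        rng = np.random.default_rng(0)
        X = rng.standard_normal((2000, 5))
        y = (1.2 * X[:, 0] + 0.8 * X[:, 1] + rng.standard_normal(2000) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = LogisticRegression().fit(X_tr, y_tr)
        susceptibility = model.predict_proba(X_te)[:, 1]   # per-cell probability
        print(round(roc_auc_score(y_te, susceptibility), 3))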

  5. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
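
    A minimal sketch of the linear, quadratic, and exponential fits the standard calls for, using NumPy and SciPy on a synthetic monthly series (the data are illustrative):

        import numpy as np
        from scipy.optimize import curve_fit

        t = np.arange(24, dtype=float)        # e.g. 24 months of data
        y = 5.0 * np.exp(0.08 * t) + np.random.default_rng(0).normal(0, 0.5, 24)

        lin = np.polyfit(t, y, 1)             # linear trend model
        quad = np.polyfit(t, y, 2)            # quadratic trend model
        (a, b), _ = curve_fit(lambda t, a, b: a * np.exp(b * t), t, y, p0=(1.0, 0.01))

        print("linear slope:", round(lin[0], 3))
        print("quadratic coeff:", round(quad[0], 3))
        print("exponential rate:", round(b, 3))   # ~0.08 for this data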

  6. Establishing Evidence for Internal Structure Using Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Watson, Joshua C.

    2017-01-01

    Exploratory factor analysis (EFA) is a data reduction technique used to condense data into smaller sets of summary variables by identifying underlying factors potentially accounting for patterns of collinearity among said variables. Using an illustrative example, the 5 general steps of EFA are described with best practices for decision making…

  7. Factors Associated With Healthcare-Acquired Catheter-Associated Urinary Tract Infections: Analysis Using Multiple Data Sources and Data Mining Techniques.

    PubMed

    Park, Jung In; Bliss, Donna Z; Chi, Chih-Lin; Delaney, Connie W; Westra, Bonnie L

    The purpose of this study was to identify factors associated with healthcare-acquired catheter-associated urinary tract infections (HA-CAUTIs) using multiple data sources and data mining techniques. Three data sets were integrated for analysis: electronic health record data from a university hospital in the Midwestern United States was combined with staffing and environmental data from the hospital's National Database of Nursing Quality Indicators and a list of patients with HA-CAUTIs. Three data mining techniques were used for identification of factors associated with HA-CAUTI: decision trees, logistic regression, and support vector machines. Fewer total nursing hours per patient-day, lower percentage of direct care RNs with specialty nursing certification, higher percentage of direct care RNs with associate's degree in nursing, and higher percentage of direct care RNs with BSN, MSN, or doctoral degree are associated with HA-CAUTI occurrence. The results also support the association of the following factors with HA-CAUTI identified by previous studies: female gender; older age (>50 years); longer length of stay; severe underlying disease; glucose lab results (>200 mg/dL); longer use of the catheter; and RN staffing. Additional findings from this study demonstrated that the presence of more nurses with specialty nursing certifications can reduce HA-CAUTI occurrence. While there may be valid reasons for leaving in a urinary catheter, findings show that having a catheter in for more than 48 hours contributes to HA-CAUTI occurrence. Finally, the findings suggest that more nursing hours per patient-day are related to better patient outcomes.

  8. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.

  9. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  10. Performance of dental impression materials: Benchmarking of materials and techniques by three-dimensional analysis.

    PubMed

    Rudolph, Heike; Graf, Michael R S; Kuhn, Katharina; Rupf-Köhler, Stephanie; Eirich, Alfred; Edelmann, Cornelia; Quaas, Sebastian; Luthardt, Ralph G

    2015-01-01

    Among other factors, the precision of dental impressions is an important determinant of the fit of dental restorations. The aim of this study was to examine the three-dimensional (3D) precision of gypsum dies made using a range of impression techniques and materials. Ten impressions of a steel canine were fabricated for each of the 24 material-method combinations and poured with type 4 die stone. The dies were optically digitized, aligned to the CAD model of the steel canine, and 3D differences were calculated. The results were statistically analyzed using one-way analysis of variance. Depending on material and impression technique, the mean values ranged between +10.9/-10.0 µm (SD 2.8/2.3) and +16.5/-23.5 µm (SD 11.8/18.8). Qualitative analysis using color-coded graphs showed a characteristic location of deviations for different impression techniques. Three-dimensional analysis provided a comprehensive picture of the achievable precision. Processing aspects and impression technique had a significant influence.

  11. A Study of Item Bias for Attitudinal Measurement Using Maximum Likelihood Factor Analysis.

    ERIC Educational Resources Information Center

    Mayberry, Paul W.

    A technique for detecting item bias that is responsive to attitudinal measurement considerations is a maximum likelihood factor analysis procedure comparing multivariate factor structures across various subpopulations, often referred to as SIFASP. The SIFASP technique allows for factorial model comparisons in the testing of various hypotheses…

  12. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte signal theory. The results have strong implications for the planning of multiway analytical experiments.

  13. The Factorability of Quadratics: Motivation for More Techniques

    ERIC Educational Resources Information Center

    Bosse, Michael J.; Nandakumar, N. R.

    2005-01-01

    Typically, secondary and college algebra students attempt to utilize either completing the square or the quadratic formula as techniques to solve a quadratic equation only after frustration with factoring has arisen. While both completing the square and the quadratic formula are techniques which can determine solutions for all quadratic equations,…
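
    A short worked check that connects the techniques: an integer-coefficient quadratic factors over the rationals exactly when its discriminant is a nonnegative perfect square. A sketch in Python:

        import math

        def factors_over_rationals(a, b, c):
            """ax^2 + bx + c (integer coefficients) factors over the rationals
            exactly when the discriminant b^2 - 4ac is a perfect square >= 0."""
            d = b * b - 4 * a * c
            if d < 0:
                return False
            r = math.isqrt(d)
            return r * r == d

        print(factors_over_rationals(1, 5, 6))   # True:  (x + 2)(x + 3)
        print(factors_over_rationals(1, 1, 1))   # False: complex roots
        print(factors_over_rationals(2, 7, 3))   # True:  (2x + 1)(x + 3)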

  14. Low energy analysis techniques for CUORE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alduino, C.; Alfonso, K.; Artusa, D. R.

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable for searches for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires improving the energy threshold to 10 keV. In this article, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0.

  15. Low energy analysis techniques for CUORE

    DOE PAGES

    Alduino, C.; Alfonso, K.; Artusa, D. R.; ...

    2017-12-12

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable for searches for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires improving the energy threshold to 10 keV. In this article, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0.

  16. An Examination of Sampling Characteristics of Some Analytic Factor Transformation Techniques.

    ERIC Educational Resources Information Center

    Skakun, Ernest N.; Hakstian, A. Ralph

    Two population raw data matrices were constructed by computer simulation techniques. Each consisted of 10,000 subjects and 12 variables, and each was constructed according to an underlying factorial model consisting of four major common factors, eight minor common factors, and 12 unique factors. The computer simulation techniques were employed to…

  17. Exploratory Factor Analysis with Small Sample Sizes

    ERIC Educational Resources Information Center

    de Winter, J. C. F.; Dodou, D.; Wieringa, P. A.

    2009-01-01

    Exploratory factor analysis (EFA) is generally regarded as a technique for large sample sizes ("N"), with N = 50 as a reasonable absolute minimum. This study offers a comprehensive overview of the conditions in which EFA can yield good quality results for "N" below 50. Simulations were carried out to estimate the minimum required "N" for different…

  18. Testing all six person-oriented principles in dynamic factor analysis.

    PubMed

    Molenaar, Peter C M

    2010-05-01

    All six person-oriented principles identified by Sterba and Bauer's Keynote Article can be tested by means of dynamic factor analysis in its current form. In particular, it is shown how complex interactions and interindividual differences/intraindividual change can be tested in this way. In addition, the necessity to use single-subject methods in the analysis of developmental processes is emphasized, and attention is drawn to the possibility to optimally treat developmental psychopathology by means of new computational techniques that can be integrated with dynamic factor analysis.

  19. Confirmatory factor analysis using Microsoft Excel.

    PubMed

    Miles, Jeremy N V

    2005-11-01

    This article presents a method for using Microsoft (MS) Excel for confirmatory factor analysis (CFA). CFA is often seen as an impenetrable technique, and thus, when it is taught, there is frequently little explanation of the mechanisms or underlying calculations. The aim of this article is to demonstrate that this is not the case; it is relatively straightforward to produce a spreadsheet in MS Excel that can carry out simple CFA. It is possible, with few or no programming skills, to effectively program a CFA analysis and, thus, to gain insight into the workings of the procedure.
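
    In the same spirit of exposing the underlying calculations, here is a minimal one-factor CFA fitted by direct minimization of the maximum-likelihood discrepancy, in Python rather than Excel (synthetic data; a sketch of the idea, not the article's spreadsheet):

        import numpy as np
        from scipy.optimize import minimize

        # One-factor CFA model: Sigma(theta) = lambda lambda' + diag(psi).
        rng = np.random.default_rng(0)
        true_l = np.array([0.8, 0.7, 0.6, 0.5])
        f = rng.standard_normal(500)
        X = np.outer(f, true_l) + rng.standard_normal((500, 4)) * np.sqrt(1 - true_l**2)
        S = np.cov(X, rowvar=False)
        p = S.shape[0]

        def ml_discrepancy(theta):
            lam, psi = theta[:p], np.exp(theta[p:])   # exp keeps variances positive
            sigma = np.outer(lam, lam) + np.diag(psi)
            logdet = np.linalg.slogdet(sigma)[1]
            # F_ML = log|Sigma| + tr(S Sigma^-1) - log|S| - p
            return logdet + np.trace(S @ np.linalg.inv(sigma)) - np.linalg.slogdet(S)[1] - p

        res = minimize(ml_discrepancy, x0=np.concatenate([np.full(p, 0.5), np.zeros(p)]))
        print(res.x[:p].round(2))   # estimated loadings (sign is arbitrary)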

  20. A comparison study on detection of key geochemical variables and factors through three different types of factor analysis

    NASA Astrophysics Data System (ADS)

    Hoseinzade, Zohre; Mokhtari, Ahmad Reza

    2017-10-01

    Large numbers of variables have been measured to explain different phenomena. Factor analysis has widely been used to reduce the dimension of datasets. Additionally, the technique has been employed to highlight underlying factors hidden in a complex system. As geochemical studies benefit from multivariate assays, application of this method is widespread in geochemistry. However, the conventional protocols for implementing factor analysis have some drawbacks in spite of their advantages. In the present study, a geochemical dataset including 804 soil samples, collected from a mining area in central Iran in order to search for MVT-type Pb-Zn deposits, was considered to examine geochemical analysis through various factor analysis methods. Routine factor analysis, sequential factor analysis, and staged factor analysis were applied to the dataset, after opening the data with an alr (additive log-ratio) transformation, to extract the mineralization factor in the dataset. A comparison between these methods indicated that sequential factor analysis more clearly revealed the MVT paragenesis elements in surface samples, with nearly 50% variation in F1. In addition, staged factor analysis gave acceptable results while being easy to practice. It detected mineralization-related elements and assigned them larger factor loadings, resulting in a more pronounced mineralization factor.

  1. A Human Factors Analysis of EVA Time Requirements

    NASA Technical Reports Server (NTRS)

    Pate, Dennis W.

    1997-01-01

    Human Factors Engineering (HFE) is a discipline whose goal is to engineer a safer, more efficient interface between humans and machines. HFE makes use of a wide range of tools and techniques to fulfill this goal. One of these tools is known as motion and time study, a technique used to develop time standards for given tasks. During the summer of 1995, a human factors motion and time study was initiated with the goals of developing a database of EVA task times and developing a method of utilizing the database to predict how long an EVA should take. Initial development relied on the EVA activities performed during the STS-61 (Hubble) mission. The first step of the study was to become familiar with EVAs, the previous task-time studies, and documents produced on EVAs. After reviewing these documents, an initial set of task primitives and task-time modifiers was developed. Data was collected from videotaped footage of two entire STS-61 EVA missions and portions of several others, each with two EVA astronauts. Feedback from the analysis of the data was used to further refine the primitives and modifiers used. The project was continued during the summer of 1996, during which data on human errors was also collected and analyzed. Additional data from the STS-71 mission was also collected. Analysis of variance techniques for categorical data were used to determine which factors may affect the primitive times and how much of an effect they have. Probability distributions for the various tasks were also generated. Further analysis of the modifiers and interactions is planned.

  2. Recurrent tricuspid insufficiency: is the surgical repair technique a risk factor?

    PubMed

    Kara, Ibrahim; Koksal, Cengiz; Cakalagaoglu, Canturk; Sahin, Muslum; Yanartas, Mehmet; Ay, Yasin; Demir, Serdar

    2013-01-01

    This study compares the medium-term results of the De Vega, modified De Vega, and ring annuloplasty techniques for the correction of tricuspid insufficiency and investigates the risk factors for recurrent grades 3 and 4 tricuspid insufficiency after repair. In our clinic, 93 patients with functional tricuspid insufficiency underwent surgical tricuspid repair from May 2007 through October 2010. The study was retrospective, and all the data pertaining to the patients were retrieved from hospital records. Functional capacity, recurrent tricuspid insufficiency, and risk factors aggravating the insufficiency were analyzed for each patient. In the medium term (25.4 ± 10.3 mo), the rates of grades 3 and 4 tricuspid insufficiency in the De Vega, modified De Vega, and ring annuloplasty groups were 31%, 23.1%, and 6.1%, respectively. Logistic regression analysis revealed that chronic obstructive pulmonary disease, left ventricular dysfunction (ejection fraction < 0.50), pulmonary artery pressure ≥60 mmHg, and the De Vega annuloplasty technique were risk factors for medium-term recurrent grades 3 and 4 tricuspid insufficiency. Medium-term survival was 90.6% for the De Vega group, 96.3% for the modified De Vega group, and 97.1% for the ring annuloplasty group. Ring annuloplasty provided the best relief from recurrent tricuspid insufficiency when compared with De Vega annuloplasty. Modified De Vega annuloplasty might be a suitable alternative to ring annuloplasty when rings are not available.

  3. Replication Analysis in Exploratory Factor Analysis: What It Is and Why It Makes Your Analysis Better

    ERIC Educational Resources Information Center

    Osborne, Jason W.; Fitzpatrick, David C.

    2012-01-01

    Exploratory Factor Analysis (EFA) is a powerful and commonly-used tool for investigating the underlying variable structure of a psychometric instrument. However, there is much controversy in the social sciences with regard to the techniques used in EFA (Ford, MacCallum, & Tait, 1986; Henson & Roberts, 2006) and the reliability of the outcome.…

  4. Factor Analysis via Components Analysis

    ERIC Educational Resources Information Center

    Bentler, Peter M.; de Leeuw, Jan

    2011-01-01

    When the factor analysis model holds, component loadings are linear combinations of factor loadings, and vice versa. This interrelation permits us to define new optimization criteria and estimation methods for exploratory factor analysis. Although this article is primarily conceptual in nature, an illustrative example and a small simulation show…

  5. Replace-approximation method for ambiguous solutions in factor analysis of ultrasonic hepatic perfusion

    NASA Astrophysics Data System (ADS)

    Zhang, Ji; Ding, Mingyue; Yuchi, Ming; Hou, Wenguang; Ye, Huashan; Qiu, Wu

    2010-03-01

    Factor analysis is an efficient technique for the analysis of dynamic structures in medical image sequences and has recently been used in contrast-enhanced ultrasound (CEUS) of hepatic perfusion. Time-intensity curves (TICs) extracted by factor analysis can provide much more diagnostic information for radiologists and improve the diagnostic rate of focal liver lesions (FLLs). However, one of the major drawbacks of factor analysis of dynamic structures (FADS) is the nonuniqueness of the result when only the non-negativity criterion is used. In this paper, we propose a new replace-approximation method based on apex-seeking for ambiguous FADS solutions. Due to a partial overlap of different structures, factor curves are assumed to be approximately replaceable by curves existing in the medical image sequences. Therefore, finding optimal curves is the key point of the technique. No matter how many structures are assumed, our method always starts to seek apexes in the one-dimensional space onto which the original high-dimensional data are mapped. By finding two stable apexes in the one-dimensional space, the method can ascertain the third one. The process can be continued until all structures are found. The technique was tested on two blood-perfusion phantoms and compared to two variants of the apex-seeking method. The results showed that the technique outperformed the two variants in comparisons of region-of-interest measurements from the phantom data. It can be applied to the estimation of TICs derived from CEUS images and the separation of different physiological regions in hepatic perfusion.

  6. Graph Embedding Techniques for Bounding Condition Numbers of Incomplete Factor Preconditioning

    NASA Technical Reports Server (NTRS)

    Guattery, Stephen

    1997-01-01

    We extend graph embedding techniques for bounding the spectral condition number of preconditioned systems involving symmetric, irreducibly diagonally dominant M-matrices to systems where the preconditioner is not diagonally dominant. In particular, this allows us to bound the spectral condition number when the preconditioner is based on an incomplete factorization. We provide a review of previous techniques, describe our extension, and give examples both of a bound for a model problem and of ways in which our techniques give an intuitive way of looking at incomplete factor preconditioners.

  7. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials.

    PubMed

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-12

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.

  8. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials

    PubMed Central

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-01

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis. PMID:28787839

  9. Using factor analysis to identify neuromuscular synergies during treadmill walking

    NASA Technical Reports Server (NTRS)

    Merkle, L. A.; Layne, C. S.; Bloomberg, J. J.; Zhang, J. J.

    1998-01-01

    Neuroscientists are often interested in grouping variables to facilitate understanding of a particular phenomenon. Factor analysis is a powerful statistical technique that groups variables into conceptually meaningful clusters, but remains underutilized by neuroscience researchers, presumably due to its complicated concepts and procedures. This paper illustrates an application of factor analysis to identify coordinated patterns of whole-body muscle activation during treadmill walking. Ten male subjects walked on a treadmill (6.4 km/h) for 20 s, during which surface electromyographic (EMG) activity was obtained from the left side sternocleidomastoid, neck extensors, erector spinae, and right side biceps femoris, rectus femoris, tibialis anterior, and medial gastrocnemius. Factor analysis revealed that 65% of the variance of the seven muscles sampled aligned with two orthogonal factors, labeled 'transition control' and 'loading'. These two factors describe coordinated patterns of muscular activity across body segments that would not be evident by evaluating individual muscle patterns. The results show that factor analysis can be effectively used to explore relationships among muscle patterns across all body segments to increase understanding of the complex coordination necessary for smooth and efficient locomotion. We encourage neuroscientists to consider using factor analysis to identify coordinated patterns of neuromuscular activation that would be obscured using more traditional EMG analyses.
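
    A minimal sketch of the idea with scikit-learn's FactorAnalysis on synthetic EMG-like envelopes (the muscle count and activation patterns are illustrative, not the study's recordings):

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        # Fake EMG envelopes: 2000 time samples x 7 muscles driven by 2 shared
        # activation patterns (a stand-in for the coordinated "synergies").
        rng = np.random.default_rng(0)
        patterns = np.abs(rng.standard_normal((2000, 2)))
        mixing = np.abs(rng.standard_normal((2, 7)))
        emg = patterns @ mixing + 0.2 * rng.standard_normal((2000, 7))

        fa = FactorAnalysis(n_components=2, random_state=0).fit(emg)
        # Rows = factors; columns = muscles; large loadings group muscles
        # that tend to activate together.
        print(fa.components_.round(2))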

  10. Digital techniques for ULF wave polarization analysis

    NASA Technical Reports Server (NTRS)

    Arthur, C. W.

    1979-01-01

    Digital power spectral and wave polarization analysis are powerful techniques for studying ULF waves in the earth's magnetosphere. Four different techniques for using the spectral matrix to perform such an analysis have been presented in the literature. Three of these techniques are similar in that they require transformation of the spectral matrix to the principal axis system prior to performing the polarization analysis. The differences among the three techniques lie in the manner in which they determine this transformation. A comparative study of these three techniques using both simulated and real data has shown them to be approximately equal in quality of performance. The fourth technique does not require transformation of the spectral matrix. Rather, it uses the measured spectral matrix and state vectors for a desired wave type to design a polarization detector function in the frequency domain. The design of various detector functions and their application to both simulated and real data will be presented.
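
    A minimal sketch of the common starting point for such techniques: estimating the cross-spectral matrix of a three-component signal at one frequency and reading a simple degree-of-polarization measure off its eigenvalues (one common variant under synthetic data; the paper's four techniques differ in the subsequent steps):

        import numpy as np

        # Synthetic three-component record: a 10 Hz circularly polarized wave
        # in (x, y) plus noise on all components (illustrative data).
        rng = np.random.default_rng(0)
        fs, n, f0, seg = 100.0, 4096, 10.0, 512
        t = np.arange(n) / fs
        x = np.cos(2 * np.pi * f0 * t) + 0.3 * rng.standard_normal(n)
        y = np.sin(2 * np.pi * f0 * t) + 0.3 * rng.standard_normal(n)
        z = 0.3 * rng.standard_normal(n)

        # Average the 3x3 cross-spectral matrix at the 10 Hz bin over segments.
        XYZ = np.vstack([x, y, z])
        win = np.hanning(seg)
        k = int(f0 * seg / fs)                       # frequency-bin index
        S = np.zeros((3, 3), dtype=complex)
        for i in range(0, n - seg + 1, seg):
            F = np.fft.rfft(XYZ[:, i:i + seg] * win, axis=1)
            S += np.outer(F[:, k], np.conj(F[:, k]))
        S /= n // seg

        # Eigenanalysis of the Hermitian spectral matrix: the fraction of power
        # along the dominant eigenvector is a simple polarization measure.
        lam = np.linalg.eigvalsh(S)[::-1]
        print("degree of polarization:", round(float(lam[0] / lam.sum()), 2))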

  11. Peptidomics: the integrated approach of MS, hyphenated techniques and bioinformatics for neuropeptide analysis.

    PubMed

    Boonen, Kurt; Landuyt, Bart; Baggerman, Geert; Husson, Steven J; Huybrechts, Jurgen; Schoofs, Liliane

    2008-02-01

    MS is currently one of the most important analytical techniques in biological and medical research. ESI and MALDI launched the field of MS into biology. The performance of mass spectrometers increased tremendously over the past decades. Other technological advances increased the analytical power of biological MS even more. First, the advent of the genome projects allowed an automated analysis of mass spectrometric data. Second, improved separation techniques, like nanoscale HPLC, are essential for MS analysis of biomolecules. The recent progress in bioinformatics is the third factor that accelerated the biochemical analysis of macromolecules. The first part of this review will introduce the basics of these techniques. The field that integrates all these techniques to identify endogenous peptides is called peptidomics and will be discussed in the last section. This integrated approach aims at identifying all the present peptides in a cell, organ or organism (the peptidome). Today, peptidomics is used by several fields of research. Special emphasis will be given to the identification of neuropeptides, a class of short proteins that fulfil several important intercellular signalling functions in every animal. MS imaging techniques and biomarker discovery will also be discussed briefly.

  12. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
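    The detection-performance computation described here reduces, in its simplest form, to sweeping a threshold over a detection statistic and tabulating probability of detection (PD) against probability of false alarm (PFA); a minimal sketch with invented statistics:

        # Trace PD versus PFA for a synthetic fault-detection statistic.
        import numpy as np

        rng = np.random.default_rng(1)
        healthy = rng.normal(0.0, 1.0, 2000)    # statistic under 'no fault'
        faulty = rng.normal(2.0, 1.0, 2000)     # statistic under 'fault'

        thresholds = np.linspace(-4.0, 6.0, 101)
        pfa = np.array([(healthy > th).mean() for th in thresholds])
        pd_ = np.array([(faulty > th).mean() for th in thresholds])
        idx = np.argmin(np.abs(pfa - 0.05))     # operating point near PFA = 5%
        print("PD at PFA ~ 0.05:", pd_[idx])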

  13. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures that represent the dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and on QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain deeper knowledge of why certain KPI targets are not met.
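    A minimal sketch of the dependency-tree idea follows: a regression tree relating a KPI to lower-level metrics. The metric names and data are illustrative assumptions, not taken from the framework itself.

        # Learn how a KPI depends on process and QoS metrics with a regression tree.
        import numpy as np
        from sklearn.tree import DecisionTreeRegressor, export_text

        rng = np.random.default_rng(1)
        n = 500
        X = np.column_stack([
            rng.uniform(10, 200, n),   # service_response_ms (QoS metric)
            rng.uniform(0, 0.1, n),    # error_rate (QoS metric)
            rng.uniform(1, 8, n),      # approval_steps (process metric)
        ])
        # Hypothetical KPI: fulfilment time, driven mainly by response time and steps
        kpi = 0.05 * X[:, 0] + 30 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 1, n)

        tree = DecisionTreeRegressor(max_depth=3).fit(X, kpi)
        names = ["service_response_ms", "error_rate", "approval_steps"]
        print(export_text(tree, feature_names=names))   # drill-down view of influences
        print("influence ranking:", dict(zip(names, tree.feature_importances_.round(2))))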

  14. BATMAN: Bayesian Technique for Multi-image Analysis

    NASA Astrophysics Data System (ADS)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e., identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with a low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.

  15. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    NASA Astrophysics Data System (ADS)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternate and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments), are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also discussed. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants, including pesticides and antibiotics, are covered as well. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  16. Flow analysis techniques for phosphorus: an overview.

    PubMed

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review of the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence, and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification and a brief description of the basis, advantages, and disadvantages of the different existing flow techniques, namely segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA), and multipumped FIA (MPFIA), is also provided. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are classified according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall view. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrices, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizers, hydroponic solutions, soil extracts, and cyanobacterial biofilms, is tabulated.

  17. Application of Factor Analysis on the Financial Ratios of Indian Cement Industry and Validation of the Results by Cluster Analysis

    NASA Astrophysics Data System (ADS)

    De, Anupam; Bandyopadhyay, Gautam; Chakraborty, B. N.

    2010-10-01

    Financial ratio analysis is an important and commonly used tool in analyzing the financial health of a firm. Quite a large number of financial ratios, which can be categorized into different groups, are used for this analysis. However, to reduce the number of ratios used for financial analysis and to regroup them on the basis of empirical evidence, the Factor Analysis technique has been used successfully by different researchers during the last three decades. In this study, Factor Analysis has been applied to the audited financial data of Indian cement companies over a period of 10 years. The sample companies are listed on Indian stock exchanges (BSE and NSE). Factor Analysis, conducted over 44 variables (financial ratios) grouped in 7 categories, resulted in 11 underlying categories (factors). Each factor is named in an appropriate manner considering the factor loadings and constituent variables (ratios). Representative ratios are identified for each such factor. To validate the results of the Factor Analysis and to reach a final conclusion regarding the representative ratios, Cluster Analysis was performed.

  18. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General Services Administration... price analysis technique in order to establish a fair and reasonable price. DATES: Interested parties....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use to...

  19. Theoretical Bound of CRLB for Energy Efficient Technique of RSS-Based Factor Graph Geolocation

    NASA Astrophysics Data System (ADS)

    Kahar Aziz, Muhammad Reza; Heriansyah; Saputra, EfaMaydhona; Musa, Ardiansyah

    2018-03-01

    To support the growth of wireless geolocation as a key technology of the future, this paper presents a theoretical bound derivation, i.e., the Cramer Rao lower bound (CRLB), for the energy-efficient received signal strength (RSS)-based factor graph wireless geolocation technique. The theoretical bound is crucially important for evaluating whether the energy-efficient RSS-based factor graph technique is effective, and it opens the opportunity for further innovation on the technique. The CRLB is derived from the Fisher information matrix (FIM) of the main formula of the RSS-based factor graph geolocation technique, which relies on the Jacobian matrix. The simulation results show that the derived CRLB is the tightest bound, exhibiting the lowest root mean squared error (RMSE) curve compared to the RMSE curve of the RSS-based factor graph geolocation technique itself. Hence, the derived CRLB serves as the lower bound for the energy-efficient RSS-based factor graph wireless geolocation technique.
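    For illustration, a minimal sketch of a CRLB computation for RSS-based positioning follows, assuming the standard log-distance path-loss model rather than the paper's exact factor-graph formulation; all geometry and parameters are invented.

        # CRLB for RSS positioning: Jacobian of mean received power, then FIM.
        import numpy as np

        anchors = np.array([[0, 0], [100, 0], [0, 100], [100, 100]], float)
        target = np.array([40.0, 60.0])
        n_pl, sigma = 3.0, 4.0          # path-loss exponent, shadowing std (dB)

        diff = target - anchors         # (4, 2)
        d2 = np.sum(diff ** 2, axis=1)  # squared anchor-target distances
        # d(mean RSS)/d(position) under P = P0 - 10*n*log10(d)
        J = -(10.0 * n_pl / np.log(10.0)) * diff / d2[:, None]
        fim = J.T @ J / sigma ** 2      # Fisher information matrix (2 x 2)
        crlb = np.linalg.inv(fim)       # lower bound on estimator covariance
        print("position RMSE bound [m]:", np.sqrt(np.trace(crlb)))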

  20. Reduction and analysis techniques for infrared imaging data

    NASA Technical Reports Server (NTRS)

    Mccaughrean, Mark

    1989-01-01

    Infrared detector arrays are becoming increasingly available to the astronomy community, with a number of array cameras already in use at national observatories and others under development at many institutions. As the detector technology and imaging instruments grow more sophisticated, more attention is focused on the business of turning raw data into scientifically significant information, that is, turning pictures into papers or, equivalently, astronomy into astrophysics, both accurately and efficiently. Also discussed are some of the factors that can be considered at each of three major stages (acquisition, reduction, and analysis), concentrating in particular on several of the questions most relevant to the techniques currently applied to near-infrared imaging.

  1. Different techniques of multispectral data analysis for vegetation fraction retrieval

    NASA Astrophysics Data System (ADS)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. With respect to farmland, the assessment of crop condition constitutes the basis for monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, and leaf area index. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time, it is a defining factor of the spectral signatures of the soil-vegetation system. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates, and dominant wavelength). The objective is to reveal their potential, accuracy, and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
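    Of the listed techniques, linear spectral unmixing is the simplest to sketch: a mixed-pixel spectrum is decomposed into endmember fractions by non-negative least squares. The endmember spectra below are invented for illustration.

        # Decompose a mixed pixel into soil and vegetation fractions.
        import numpy as np
        from scipy.optimize import nnls

        # Five spectral bands; vegetation shows the characteristic red-edge jump
        soil = np.array([0.20, 0.25, 0.30, 0.35, 0.38])
        vegetation = np.array([0.05, 0.08, 0.04, 0.45, 0.50])
        E = np.column_stack([soil, vegetation])   # endmember matrix (bands x 2)

        true_fractions = np.array([0.35, 0.65])
        pixel = E @ true_fractions + 0.005 * np.random.default_rng(2).standard_normal(5)

        fractions, residual = nnls(E, pixel)      # non-negative least squares
        fractions /= fractions.sum()              # optional sum-to-one normalization
        print("estimated vegetation fraction:", round(fractions[1], 3))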

  2. Simulating the Effects of Common and Specific Abilities on Test Performance: An Evaluation of Factor Analysis

    ERIC Educational Resources Information Center

    McFarland, Dennis J.

    2014-01-01

    Purpose: Factor analysis is a useful technique to aid in organizing multivariate data characterizing speech, language, and auditory abilities. However, knowledge of the limitations of factor analysis is essential for proper interpretation of results. The present study used simulated test scores to illustrate some characteristics of factor…

  3. Factors that Affect Poverty Areas in North Sumatera Using Discriminant Analysis

    NASA Astrophysics Data System (ADS)

    Nasution, D. H.; Bangun, P.; Sitepu, H. R.

    2018-04-01

    In Indonesia, and especially in North Sumatera, poverty is one of the fundamental problems on which both central and local governments focus. Although the poverty rate has decreased, many people remain poor. Poverty covers several aspects such as education, health, demographics, and structural and cultural factors. This research discusses several factors, such as population density, unemployment rate, GDP per capita at constant prices (ADHK), GDP per capita at current prices (ADHB), economic growth, and life expectancy, that affect poverty in Indonesia. To determine the factors that most influence and differentiate the poverty level of the regencies/cities of North Sumatera, the discriminant analysis method was used. Discriminant analysis is a multivariate analysis technique used to classify data into groups based on dependent and independent variables. Using discriminant analysis, it is evident that the factor most affecting poverty is the unemployment rate.
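    A minimal sketch of the discriminant-analysis step follows; the synthetic indicators and poverty groups stand in for the regency/city data used in the study.

        # Classify regions into poverty groups from socio-economic indicators.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(3)
        n = 60
        unemployment = np.concatenate([rng.normal(4, 1, 30), rng.normal(9, 1, 30)])
        density = rng.normal(500, 150, n)          # weakly informative by construction
        X = np.column_stack([unemployment, density])
        y = np.array([0] * 30 + [1] * 30)          # 0 = low poverty, 1 = high poverty

        lda = LinearDiscriminantAnalysis().fit(X, y)
        # Standardize features first when comparing coefficient magnitudes
        print("discriminant coefficients:", lda.coef_.round(3))
        print("training accuracy:", lda.score(X, y))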

  4. A new technique for measuring gas conversion factors for hydrocarbon mass flowmeters

    NASA Technical Reports Server (NTRS)

    Singh, J. J.; Sprinkle, D. R.

    1983-01-01

    A technique for measuring calibration conversion factors for hydrocarbon mass flowmeters was developed. It was applied to a widely used type of commercial thermal mass flowmeter for hydrocarbon gases. The values of the conversion factors for two common hydrocarbons measured using this technique are in good agreement with the empirical values cited by the manufacturer. Similar agreement can be expected for all other hydrocarbons. The technique is based on the Nernst theorem, matching the partial pressure of oxygen in the combustion product gases with that in normal air. It is simple, quick, and relatively safe, particularly for toxic or poisonous hydrocarbons.

  5. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    PubMed

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

    This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices, with an input-orientation approach, were used to analyze the data and estimate productivity. Data were analyzed using the SPSS.18 and DEAP.2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale, and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996, respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. Hospital productivity rates generally showed an increasing trend; however, the average total productivity of the hospitals decreased. Moreover, among the several components of total productivity, variation in technological efficiency had the highest impact on the reduction of the average total productivity.
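    A minimal sketch of the input-oriented DEA building block used by such studies, solved as a linear program with SciPy, is shown below; the two inputs and one output are invented, and this illustrates plain CCR efficiency rather than the Malmquist decomposition.

        # Input-oriented CCR DEA: for each DMU j, min theta s.t.
        # X @ lam <= theta * x_j, Y @ lam >= y_j, lam >= 0.
        import numpy as np
        from scipy.optimize import linprog

        X = np.array([[5, 8, 7, 4], [300, 500, 400, 200]], float)  # inputs x DMUs
        Y = np.array([[140, 160, 190, 120]], float)                # outputs x DMUs

        def efficiency(j):
            m, n = X.shape            # m inputs, n DMUs
            s = Y.shape[0]            # s outputs
            c = np.zeros(1 + n); c[0] = 1.0            # variables z = [theta, lam]
            A_in = np.hstack([-X[:, [j]], X])          # X lam - theta*x_j <= 0
            A_out = np.hstack([np.zeros((s, 1)), -Y])  # -Y lam <= -y_j
            b = np.concatenate([np.zeros(m), -Y[:, j]])
            res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b,
                          bounds=[(0, None)] * (1 + n))
            return res.fun                             # optimal theta (efficiency)

        print([round(efficiency(j), 3) for j in range(X.shape[1])])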

  6. Scale-model charge-transfer technique for measuring enhancement factors

    NASA Technical Reports Server (NTRS)

    Kositsky, J.; Nanevicz, J. E.

    1991-01-01

    Determination of aircraft electric field enhancement factors is crucial when using airborne field mill (ABFM) systems to accurately measure electric fields aloft. SRI used the scale-model charge-transfer technique to determine the enhancement factors of several canonical shapes and of a scale-model Learjet 36A. The measured values for the canonical shapes agreed with known analytic solutions to within about 6 percent. The laboratory-determined enhancement factors for the aircraft were compared with those derived from in-flight data gathered by a Learjet 36A outfitted with eight field mills. The values agreed to within experimental error (approx. 15 percent).

  7. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    NASA Astrophysics Data System (ADS)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potential of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by the poor groundwater-potential zone, and 43.93% and 36.3% of the Broujerd region is covered by the good and very good potential zones, respectively. Validation of the outcomes showed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, indicating that the SI method performs slightly better than the DST technique. SI and DST methods are therefore advantageous for analyzing groundwater capacity and for scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, permitting investigation of both systemic and stochastic uncertainty. These techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
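    A minimal sketch of the statistical index (SI) weighting that such bivariate methods assign to each class of a conditioning factor is given below; all counts and areas are invented.

        # SI weight per class: log ratio of class spring density to overall density.
        import numpy as np

        springs_in_class = np.array([40, 90, 60, 30])       # per slope class
        class_area_km2 = np.array([400.0, 900.0, 1100.0, 600.0])
        total_springs = springs_in_class.sum()
        total_area = class_area_km2.sum()

        si = np.log((springs_in_class / class_area_km2) / (total_springs / total_area))
        print("SI weights per class:", si.round(2))         # > 0 favours groundwater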

  8. An experimental extrapolation technique using the Gafchromic EBT3 film for relative output factor measurements in small x-ray fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morales, Johnny E., E-mail: johnny.morales@lh.org.

    Purpose: An experimental extrapolation technique is presented, which can be used to determine the relative output factors for very small x-ray fields using Gafchromic EBT3 film. Methods: Relative output factors were measured for the Brainlab SRS cones ranging in diameter from 4 to 30 mm on a Novalis Trilogy linear accelerator with 6 MV SRS x-rays. The relative output factor was determined from an experimental reducing circular region of interest (ROI) extrapolation technique developed to remove the effects of volume averaging. This was achieved by scanning the EBT3 film measurements at a high scanning resolution of 1200 dpi. From the high-resolution scans, the size of the circular region of interest was varied to produce a plot of relative output factor versus area of analysis. The plot was then extrapolated to zero to determine the relative output factor corresponding to zero volume. Results: For the 4 mm field size, the extrapolated relative output factor was measured as 0.651 ± 0.018, compared to 0.639 ± 0.019 and 0.633 ± 0.021 for 0.5 and 1.0 mm diameters of analysis, respectively. This represents a change in the relative output factor of 1.8% and 2.8% at these regions of interest. In comparison, the 25 mm cone showed negligible differences in the measured output factor between the zero extrapolation and the 0.5 and 1.0 mm diameter ROIs. Conclusions: This work shows that for very small fields, such as 4.0 mm cone sizes, a measurable difference can be seen in the relative output factor depending on the size of the circular ROI used in radiochromic film dosimetry. The authors recommend scanning Gafchromic EBT3 film at a resolution of 1200 dpi for cone sizes less than 7.5 mm and utilizing an extrapolation technique for output factor measurements in very small field dosimetry.
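    A minimal sketch of the reducing-ROI extrapolation follows: fit the measured relative output factor against ROI area and take the intercept at zero area. The readings below are invented, not the paper's measurements.

        # Extrapolate relative output factor (ROF) to zero ROI area.
        import numpy as np

        roi_diameter_mm = np.array([1.0, 0.8, 0.6, 0.5, 0.4])
        roi_area = np.pi * (roi_diameter_mm / 2) ** 2      # ROI areas (mm^2)
        rof = np.array([0.633, 0.636, 0.640, 0.639, 0.645])  # hypothetical readings

        slope, intercept = np.polyfit(roi_area, rof, 1)    # linear trend vs area
        print("extrapolated ROF at zero area: %.3f" % intercept)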

  9. Using Horn's Parallel Analysis Method in Exploratory Factor Analysis for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Çokluk, Ömay; Koçak, Duygu

    2016-01-01

    In this study, the number of factors obtained from parallel analysis, a method used for determining the number of factors in exploratory factor analysis, was compared to that of the factors obtained from eigenvalue and scree plot--two traditional methods for determining the number of factors--in terms of consistency. Parallel analysis is based on…
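    A minimal sketch of Horn's parallel analysis on synthetic data follows: retain factors whose observed eigenvalues exceed the mean eigenvalues obtained from random data of the same size.

        # Parallel analysis: compare observed eigenvalues to random-data eigenvalues.
        import numpy as np

        rng = np.random.default_rng(4)
        n, p = 300, 10
        latent = rng.standard_normal((n, 2))               # two true factors
        X = latent @ rng.uniform(0.5, 1.0, (2, p)) + rng.standard_normal((n, p))

        obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
        sims = [np.sort(np.linalg.eigvalsh(
                    np.corrcoef(rng.standard_normal((n, p)), rowvar=False)))[::-1]
                for _ in range(200)]
        ref = np.mean(sims, axis=0)                        # random-data reference
        print("retain", int(np.sum(obs_eig > ref)), "factors")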

  10. Signal analysis techniques for incipient failure detection in turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1985-01-01

    Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.

  11. Assessing Suicide Risk Among Callers to Crisis Hotlines: A Confirmatory Factor Analysis

    PubMed Central

    Witte, Tracy K.; Gould, Madelyn S.; Munfakh, Jimmie Lou Harris; Kleinman, Marjorie; Joiner, Thomas E.; Kalafat, John

    2012-01-01

    Our goal was to investigate the factor structure of a risk assessment tool utilized by suicide hotlines and to determine the predictive validity of the obtained factors in predicting subsequent suicidal behavior. 1,085 suicidal callers to crisis hotlines were divided into three sub-samples, which allowed us to conduct an independent Exploratory Factor Analysis (EFA), an EFA in a Confirmatory Factor Analysis framework (EFA/CFA), and a CFA. Similar to previous factor analytic studies (Beck et al., 1997; Holden & DeLisle, 2005; Joiner, Rudd, & Rajab, 1997; Witte et al., 2006), we found consistent evidence for a two-factor solution, with one factor representing a more pernicious form of suicide risk (i.e., Resolved Plans and Preparations) and one factor representing milder suicidal ideation (i.e., Suicidal Desire and Ideation). Using structural equation modeling techniques, we found preliminary evidence that the Resolved Plans and Preparations factor trended toward being more predictive of suicidal ideation than the Suicidal Desire and Ideation factor. This factor analytic study is the first longitudinal study of the obtained factors. PMID:20578186

  12. Automated Sneak Circuit Analysis Technique

    DTIC Science & Technology

    1990-06-01

    the OrCAD/SDT module port facility. 2. The terminals of all in-circuit voltage sources (e.g., batteries) must be labeled using the OrCAD/SDT module port facility. Automated Sneak Circuit Analysis Technique, RADC 94-14062, June 1990, Systems Reliability & Engineering Division, Rome Air Development Center.

  13. Electron-Beam-Induced Deposition as a Technique for Analysis of Precursor Molecule Diffusion Barriers and Prefactors.

    PubMed

    Cullen, Jared; Lobo, Charlene J; Ford, Michael J; Toth, Milos

    2015-09-30

    Electron-beam-induced deposition (EBID) is a direct-write chemical vapor deposition technique in which an electron beam is used for precursor dissociation. Here we show that Arrhenius analysis of the deposition rates of nanostructures grown by EBID can be used to deduce the diffusion energies and corresponding preexponential factors of EBID precursor molecules. We explain the limitations of this approach, define growth conditions needed to minimize errors, and explain why the errors increase systematically as EBID parameters diverge from ideal growth conditions. Under suitable deposition conditions, EBID can be used as a localized technique for analysis of adsorption barriers and prefactors.

  14. Classification Techniques for Multivariate Data Analysis.

    DTIC Science & Technology

    1980-03-28

    analysis among biologists, botanists, and ecologists, while some social scientists may prefer "typology". Other frequently encountered terms are pattern... the determinantal equation |B - λW| = 0. The solutions λ_i are the eigenvalues of the matrix W^-1 B, as in discriminant analysis. There are t non... The Statistical Package for the Social Sciences (SPSS) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor...

  15. A stiffness derivative finite element technique for determination of crack tip stress intensity factors

    NASA Technical Reports Server (NTRS)

    Parks, D. M.

    1974-01-01

    A finite element technique for determination of elastic crack tip stress intensity factors is presented. The method, based on the energy release rate, requires no special crack tip elements. Further, the solution for only a single crack length is required, and the crack is 'advanced' by moving nodal points rather than by removing nodal tractions at the crack tip and performing a second analysis. The promising straightforward extension of the method to general three-dimensional crack configurations is presented and contrasted with the practical impossibility of conventional energy methods.

  16. Accelerometer Data Analysis and Presentation Techniques

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering are introduced along with the related topics of resolution and aliasing. Specific information about Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed, and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three-dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information via the internet are provided.
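    As one concrete example of the frequency-domain analyses listed above, a minimal power-spectral-density estimate via Welch's method is shown below; the signal, sample rate, and units are invented.

        # PSD of a synthetic acceleration record (17 Hz vibration plus noise).
        import numpy as np
        from scipy.signal import welch

        fs = 250.0                                   # sample rate (Hz), assumed
        t = np.arange(0, 60, 1 / fs)
        accel = 1e-3 * np.sin(2 * np.pi * 17 * t)    # 17 Hz component, in g
        accel += 1e-4 * np.random.default_rng(5).standard_normal(t.size)

        f, psd = welch(accel, fs=fs, nperseg=4096)   # PSD in g^2/Hz
        print("dominant frequency: %.1f Hz" % f[np.argmax(psd)])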

  17. Fundamentals of functional imaging II: emerging MR techniques and new methods of analysis.

    PubMed

    Luna, A; Martín Noguerol, T; Mata, L Alcalá

    2018-05-01

    Current multiparameter MRI protocols integrate structural, physiological, and metabolic information about cancer. Emerging techniques such as arterial spin-labeling (ASL), blood oxygen level dependent (BOLD), MR elastography, chemical exchange saturation transfer (CEST), and hyperpolarization provide new information and will likely be integrated into daily clinical practice in the near future. Furthermore, there is great interest in the study of tumor heterogeneity as a prognostic factor and in relation to resistance to treatment, and this interest is leading to the application of new methods of analysis of multiparametric protocols. In parallel, new oncologic biomarkers that integrate the information from MR with clinical, laboratory, genetic, and histologic findings are being developed, thanks to the application of big data and artificial intelligence. This review analyzes different emerging MR techniques that are able to evaluate the physiological, metabolic, and mechanical characteristics of cancer, as well as the main clinical applications of these techniques. In addition, it summarizes the most novel methods of analysis of functional radiologic information in oncology. Copyright © 2018 SERAM. Publicado por Elsevier España, S.L.U. All rights reserved.

  18. Bayesian Factor Analysis as a Variable Selection Problem: Alternative Priors and Consequences

    PubMed Central

    Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric

    2016-01-01

    Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, a Bayesian structural equation modeling (BSEM) approach (Muthén & Asparouhov, 2012) has been proposed as a way to explore the presence of cross-loadings in CFA models. We show that the issue of determining factor loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov’s approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike and slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set (Byrne, 2012; Pettegrew & Wolf, 1982) is used to demonstrate our approach. PMID:27314566

  19. Understanding the Support Needs of People with Intellectual and Related Developmental Disabilities through Cluster Analysis and Factor Analysis of Statewide Data

    ERIC Educational Resources Information Center

    Viriyangkura, Yuwadee

    2014-01-01

    Through a secondary analysis of statewide data from Colorado, people with intellectual and related developmental disabilities (ID/DD) were classified into five clusters based on their support needs characteristics using cluster analysis techniques. Prior latent factor models of support needs in the field of ID/DD were examined to investigate the…

  20. VLBI Analysis with the Multi-Technique Software GEOSAT

    NASA Technical Reports Server (NTRS)

    Kierulf, Halfdan Pascal; Andersen, Per-Helge; Boeckmann, Sarah; Kristiansen, Oddgeir

    2010-01-01

    GEOSAT is a multi-technique geodetic analysis software package developed at Forsvarets Forsknings Institutt (the Norwegian defence research establishment). The Norwegian Mapping Authority has now installed the software and has, together with Forsvarets Forsknings Institutt, adapted it to deliver datum-free normal equation systems in SINEX format. The goal is to be accepted as an IVS Associate Analysis Center and to provide contributions to the IVS EOP combination on a routine basis. GEOSAT is based on an upper-diagonal factorized Kalman filter, which allows estimation of time-variable parameters such as tropospheric delays and clocks as stochastic parameters. The tropospheric delays in various directions are mapped to the tropospheric zenith delay using ray-tracing. Meteorological data from ECMWF with a resolution of six hours are used to perform the ray-tracing, which depends on both elevation and azimuth. Other models follow the IERS and IVS conventions. The Norwegian Mapping Authority has submitted test SINEX files produced with GEOSAT to IVS. The results have been compared with the existing IVS combined products. In this paper the outcome of these comparisons is presented.

  1. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...

  2. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...

  3. A systematic review of the relationship factor between women and health professionals within the multivariant analysis of maternal satisfaction.

    PubMed

    Macpherson, Ignacio; Roqué-Sánchez, María V; Legget Bn, Finola O; Fuertes, Ferran; Segarra, Ignacio

    2016-10-01

    Personalised support provided to women by health professionals is one of the prime factors in attaining women's satisfaction during pregnancy and childbirth. However, the multifactorial nature of 'satisfaction' makes it difficult to assess. Statistical multivariate analysis may be an effective technique for obtaining in-depth quantitative evidence of the importance of this factor and its interaction with the other factors involved. This technique allows us to estimate the importance of overall satisfaction in its context and to suggest actions for healthcare services. A systematic review was conducted of studies that quantitatively measure the personal relationship between women and healthcare professionals (gynecologists, obstetricians, nurses, midwives, etc.) with regard to maternity care satisfaction. The literature search focused on studies carried out between 1970 and 2014 that used multivariate analyses and included the woman-caregiver relationship as a factor in their analysis. Twenty-four studies which applied various multivariate analysis tools to different periods of maternity care (antenatal, perinatal, post partum) were selected. The studies included discrete scale scores and questionnaires from women with low-risk pregnancies. The "personal relationship" factor appeared under various names: care received, personalised treatment, and professional support, among others. The most common multivariate techniques used to assess the percentage of variance explained and the odds ratio of each factor were principal component analysis and logistic regression. The data, variables, and factor analyses suggest that continuous, personalised care provided by the usual midwife and delivered within a family or a specialised setting generates the highest level of satisfaction. In addition, these factors foster the woman's psychological and physiological recovery, often surpassing clinical action (e.g. medicalization and hospital organization) and/or physiological determinants (e.g. pain, pathologies, etc

  4. Lamp mapping technique for independent determination of the water vapor mixing ratio calibration factor for a Raman lidar system

    NASA Astrophysics Data System (ADS)

    Venable, Demetrius D.; Whiteman, David N.; Calhoun, Monique N.; Dirisu, Afusat O.; Connell, Rasheen M.; Landulfo, Eduardo

    2011-08-01

    We have investigated a technique that allows for the independent determination of the water vapor mixing ratio calibration factor for a Raman lidar system. This technique utilizes a procedure whereby a light source of known spectral characteristics is scanned across the aperture of the lidar system's telescope and the overall optical efficiency of the system is determined. Direct analysis of the temperature-dependent differential scattering cross sections for vibration and vibration-rotation transitions (convolved with narrowband filters), along with the measured efficiency of the system, leads to a theoretical determination of the water vapor mixing ratio calibration factor. A calibration factor was also obtained experimentally from lidar measurements and radiosonde data. The theoretical and experimentally determined values agree within 5%. We report on the sensitivity of the water vapor mixing ratio calibration factor to uncertainties in the parameters that characterize the narrowband transmission filters, the temperature-dependent differential scattering cross section, and the variability of the system efficiency ratios as the lamp is scanned across the aperture of the telescope used in the Howard University Raman Lidar system.

  5. Poster — Thur Eve — 03: Application of the non-negative matrix factorization technique to [{sup 11}C]-DTBZ dynamic PET data for the early detection of Parkinson's disease

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Dong-Chang; Jans, Hans; McEwan, Sandy

    2014-08-15

    In this work, a class of non-negative matrix factorization (NMF) techniques known as alternating non-negative least squares, combined with the projected gradient method, is used to analyze twenty-five [11C]-DTBZ dynamic PET/CT brain data sets. For each subject, a two-factor model is assumed, and two factors representing the striatum (factor 1) and the non-striatum (factor 2) tissues are extracted using the proposed NMF technique and the commercially available factor analysis software "Pixies". The extracted factor 1 and factor 2 curves represent the binding site of the radiotracer and describe the uptake and clearance of the radiotracer by soft tissues in the brain, respectively. The proposed NMF technique uses prior information about the dynamic data to obtain sample time-activity curves representing the striatum and the non-striatum tissues; these curves are then used to "warm" start the optimization. Factor solutions from the two methods are compared graphically and quantitatively. In healthy subjects, radiotracer uptake by factors 1 and 2 is approximately 35-40% and 60-65%, respectively. The solutions are also used to develop a factor-based metric for the detection of early, untreated Parkinson's disease. The metric stratifies healthy subjects from suspected Parkinson's patients (based on the graphical method). The analysis shows that both techniques produce comparable results with similar computational time. The "semi-automatic" approach used by the NMF technique allows clinicians to manually set a starting condition for "warm" starting the optimization, in order to facilitate control and efficient interaction with the data.
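    A minimal sketch of a two-factor NMF by alternating non-negative least squares follows; per-column NNLS stands in for the projected-gradient solver described above, and all data are synthetic.

        # Two-factor NMF of a (voxels x time) matrix: A ~ W @ H, W, H >= 0.
        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(6)
        t = np.linspace(0, 60, 30)                       # 30 time frames (min)
        tac = np.vstack([t * np.exp(-t / 20),            # striatum-like curve
                         np.exp(-t / 10)])               # non-striatum clearance
        W_true = rng.uniform(0, 1, (500, 2))             # 500 voxels, 2 factors
        A = W_true @ tac + 0.01 * rng.standard_normal((500, 30))

        W = rng.uniform(0, 1, (500, 2))                  # "warm" starts could go here
        H = rng.uniform(0, 1, (2, 30))
        for _ in range(20):                              # alternate the two NNLS fits
            H = np.array([nnls(W, A[:, j])[0] for j in range(A.shape[1])]).T
            W = np.array([nnls(H.T, A[i, :])[0] for i in range(A.shape[0])])
        print("reconstruction error:", round(np.linalg.norm(A - W @ H), 3))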

  6. Structural Image Analysis of the Brain in Neuropsychology Using Magnetic Resonance Imaging (MRI) Techniques.

    PubMed

    Bigler, Erin D

    2015-09-01

    Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative as well as quantitative methods to relate brain structure with neuropsychological outcome and are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures including volume, thickness and shape. The challenge for neuropsychology is which metric to use, for which disorder and the timing of when image analysis methods are applied to assess brain structure and pathology. A basic overview is provided as to the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered including factors related to similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.

  7. Bayesian inference of the number of factors in gene-expression analysis: application to human virus challenge studies

    PubMed Central

    2010-01-01

    Background Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Results Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Conclusions Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data. PMID:21062443

  8. Bayesian inference of the number of factors in gene-expression analysis: application to human virus challenge studies.

    PubMed

    Chen, Bo; Chen, Minhua; Paisley, John; Zaas, Aimee; Woods, Christopher; Ginsburg, Geoffrey S; Hero, Alfred; Lucas, Joseph; Dunson, David; Carin, Lawrence

    2010-11-09

    Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data.

  9. Statistical model to perform error analysis of curve fits of wind tunnel test data using the techniques of analysis of variance and regression analysis

    NASA Technical Reports Server (NTRS)

    Alston, D. W.

    1981-01-01

    The research objective was to design a statistical model that could perform an error analysis of curve fits of wind tunnel test data using analysis of variance and regression analysis techniques. Four related subproblems were defined, and by solving each of these a solution to the general research problem was obtained. The capabilities of the resulting statistical model are considered. The least squares fit is used to determine the nature of the force, moment, and pressure data. The order of the curve fit is increased in order to remove the quadratic effect in the residuals. The analysis of variance is used to determine the magnitude and effect of the error factor associated with the experimental data.

  10. Indonesian railway accidents--utilizing Human Factors Analysis and Classification System in determining potential contributing factors.

    PubMed

    Iridiastadi, Hardianto; Ikatrinasari, Zulfa Fitri

    2012-01-01

    The prevalence of Indonesian railway accidents has not been declining, with hundreds of fatalities reported in the past decade. In an effort to help the National Transportation Safety Committee (NTSC), this study was conducted with the aim of understanding the factors that might have contributed to the accidents. The Human Factors Analysis and Classification System (HFACS) was utilized for this purpose. A total of nine accident reports (provided by the Indonesian NTSC) involving fatalities were studied using the technique. The results indicated 72 factors that were closely related to the accidents. Of these, roughly 22% were considered operator acts, while about 39% were related to preconditions for operator acts. Supervisory factors represented 14%, and the remaining (about 25%) were associated with organizational factors. It was concluded that, while train drivers indeed played an important role in the accidents, interventions directed solely toward train drivers may not be adequate. A more comprehensive approach to minimizing the accidents should be adopted that addresses all four levels of HFACS.

  11. Spectroscopic analysis technique for arc-welding process control

    NASA Astrophysics Data System (ADS)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to the monitoring and control of industrial processes. In particular, it has been demonstrated that analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g., the aerospace sector), early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable, as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve fitting peaks to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple-peak analysis is less than 20 ms running on a conventional PC.
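    A minimal sketch of the underlying electronic-temperature estimate via a Boltzmann plot follows, assuming optically thin lines and local thermodynamic equilibrium; the line data are invented.

        # Boltzmann plot: ln(I*lambda/(g*A)) is linear in E_k with slope -1/(k_B*T_e).
        import numpy as np

        k_B = 8.617e-5                                 # Boltzmann constant (eV/K)
        E_k = np.array([13.0, 13.3, 13.5, 14.1])       # upper-level energies (eV)
        g_A = np.array([2.0e7, 1.5e7, 9.0e6, 4.0e6])   # g_k * A_ki (1/s)
        lam = np.array([696.5, 706.7, 738.4, 750.4])   # wavelengths (nm)
        T_true = 11000.0
        I = g_A / lam * np.exp(-E_k / (k_B * T_true))  # simulated peak intensities

        y = np.log(I * lam / g_A)
        slope = np.polyfit(E_k, y, 1)[0]
        print("estimated T_e: %.0f K" % (-1.0 / (k_B * slope)))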

  12. Factor Analysis of the Modified Sexual Adjustment Questionnaire-Male

    PubMed Central

    Wilmoth, Margaret C.; Hanlon, Alexandra L.; Ng, Lit Soo; Bruner, Debra W.

    2015-01-01

    Background and Purpose The Sexual Adjustment Questionnaire (SAQ) is used in National Cancer Institute–sponsored clinical trials as an outcome measure for sexual functioning. The tool was revised to meet the need for a clinically useful, theory-based outcome measure for use in both research and clinical settings. This report describes the modifications and validity testing of the modified Sexual Adjustment Questionnaire-Male (mSAQ-Male). Methods This secondary analysis of data from a large Radiation Therapy Oncology Group trial employed principal axis factor analytic techniques to estimate the validity of the revised tool. The sample size was 686; most subjects were White, older than 60 years of age, with a high school education and a Karnofsky performance scale (KPS) score greater than 90. Results A 16-item, 3-factor solution resulted from the factor analysis. The mSAQ-Male was also found to be sensitive to changes in physical sexual functioning as measured by the KPS. Conclusion The mSAQ-Male is a valid self-report measure of sexuality that can be used clinically to detect changes in male sexual functioning. PMID:25255676

  13. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line items...

  14. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  15. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  16. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  17. Generalized five-dimensional dynamic and spectral factor analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    El Fakhri, Georges; Sitek, Arkadiusz; Zimmerman, Robert E.

    2006-04-15

    We have generalized the spectral factor analysis and the factor analysis of dynamic sequences (FADS) in SPECT imaging to a five-dimensional general factor analysis model (5D-GFA), where the five dimensions are the three spatial dimensions, photon energy, and time. The generalized model yields a significant advantage in terms of the ratio of the number of equations to that of unknowns in the factor analysis problem in dynamic SPECT studies. We solved the 5D model using a least-squares approach. In addition to the traditional non-negativity constraints, we constrained the solution using a priori knowledge of both time and energy, assuming that primary factors (spectra) are Gaussian-shaped with full-width at half-maximum equal to the gamma camera energy resolution. 5D-GFA was validated in a simultaneous pre-/post-synaptic dual-isotope dynamic phantom study where 99mTc and 123I activities were used to model early Parkinson disease studies. 5D-GFA was also applied to simultaneous perfusion/dopamine transporter (DAT) dynamic SPECT in rhesus monkeys. In the striatal phantom, 5D-GFA yielded significantly more accurate and precise estimates of both primary 99mTc (bias=6.4%±4.3%) and 123I (-1.7%±6.9%) time-activity curves (TAC) compared to conventional FADS (biases=15.5%±10.6% in 99mTc and 8.3%±12.7% in 123I, p<0.05). Our technique was also validated in two primate dynamic dual-isotope perfusion/DAT studies. Biases of 99mTc-HMPAO and 123I-DAT activity estimates with respect to estimates obtained in the presence of only one radionuclide (sequential imaging) were significantly lower with 5D-GFA (9.4%±4.3% for 99mTc-HMPAO and 8.7%±4.1% for 123I-DAT) compared to biases greater than 15% for volumes of interest (VOI) over the reconstructed volumes (p<0.05). 5D-GFA is a novel and promising approach in dynamic SPECT imaging that can also be used in other modalities. It allows accurate and

  18. Analysis and Validation of Contactless Time-Gated Interrogation Technique for Quartz Resonator Sensors

    PubMed Central

    Baù, Marco; Ferrari, Marco; Ferrari, Vittorio

    2017-01-01

    A technique for contactless electromagnetic interrogation of AT-cut quartz piezoelectric resonator sensors is proposed based on a primary coil electromagnetically air-coupled to a secondary coil connected to the electrodes of the resonator. The interrogation technique periodically switches between interleaved excitation and detection phases. During the excitation phase, the resonator is set into vibration by a driving voltage applied to the primary coil, whereas in the detection phase, the excitation signal is turned off and the transient decaying response of the resonator is sensed without contact by measuring the voltage induced back across the primary coil. This approach ensures that the readout frequency of the sensor signal is to a first order approximation independent of the interrogation distance between the primary and secondary coils. A detailed theoretical analysis of the interrogation principle based on a lumped-element equivalent circuit is presented. The analysis has been experimentally validated on a 4.432 MHz AT-cut quartz crystal resonator, demonstrating the accurate readout of the series resonant frequency and quality factor over an interrogation distance of up to 2 cm. As an application, the technique has been applied to the measurement of liquid microdroplets deposited on a 4.8 MHz AT-cut quartz crystal. More generally, the proposed technique can be exploited for the measurement of any physical or chemical quantities affecting the resonant response of quartz resonator sensors. PMID:28574459

  19. Analysis and Validation of Contactless Time-Gated Interrogation Technique for Quartz Resonator Sensors.

    PubMed

    Baù, Marco; Ferrari, Marco; Ferrari, Vittorio

    2017-06-02

    A technique for contactless electromagnetic interrogation of AT-cut quartz piezoelectric resonator sensors is proposed based on a primary coil electromagnetically air-coupled to a secondary coil connected to the electrodes of the resonator. The interrogation technique periodically switches between interleaved excitation and detection phases. During the excitation phase, the resonator is set into vibration by a driving voltage applied to the primary coil, whereas in the detection phase, the excitation signal is turned off and the transient decaying response of the resonator is sensed without contact by measuring the voltage induced back across the primary coil. This approach ensures that the readout frequency of the sensor signal is to a first order approximation independent of the interrogation distance between the primary and secondary coils. A detailed theoretical analysis of the interrogation principle based on a lumped-element equivalent circuit is presented. The analysis has been experimentally validated on a 4.432 MHz AT-cut quartz crystal resonator, demonstrating the accurate readout of the series resonant frequency and quality factor over an interrogation distance of up to 2 cm. As an application, the technique has been applied to the measurement of liquid microdroplets deposited on a 4.8 MHz AT-cut quartz crystal. More generally, the proposed technique can be exploited for the measurement of any physical or chemical quantities affecting the resonant response of quartz resonator sensors.
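
    A minimal sketch of the detection-phase readout these two records describe, under the assumption that the sensed transient is an exponentially damped sinusoid: the resonant frequency comes from the FFT peak and the quality factor from the slope of the log envelope. The sample rate and the assumed Q are illustrative.

    ```python
    # Estimate series resonant frequency and Q from a simulated ring-down.
    import numpy as np
    from scipy.signal import hilbert

    fs = 50e6                                    # sample rate (Hz), assumed
    t = np.arange(0.0, 2e-3, 1.0 / fs)
    f0, Q_true = 4.432e6, 40_000.0               # 4.432 MHz resonator; Q assumed
    tau = Q_true / (np.pi * f0)                  # ring-down time constant
    v = np.exp(-t / tau) * np.sin(2 * np.pi * f0 * t)

    # Frequency from the FFT peak of the decaying transient.
    spec = np.abs(np.fft.rfft(v))
    f_hat = np.fft.rfftfreq(len(v), 1.0 / fs)[np.argmax(spec)]

    # Q from the envelope decay: log|analytic signal| has slope -1/tau.
    env = np.abs(hilbert(v))
    sl = slice(len(t) // 20, -len(t) // 20)      # trim Hilbert edge effects
    slope, _ = np.polyfit(t[sl], np.log(env[sl]), 1)
    Q_hat = -np.pi * f_hat / slope
    print(f_hat, Q_hat)                          # ~4.432 MHz and ~40,000
    ```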

  20. Multiplication factor versus regression analysis in stature estimation from hand and foot dimensions.

    PubMed

    Krishan, Kewal; Kanchan, Tanuj; Sharma, Abhilasha

    2012-05-01

    Estimation of stature is an important parameter in identification of human remains in forensic examinations. The present study aimed to compare the reliability and accuracy of stature estimation, and to demonstrate the variability between estimated and actual stature, using the multiplication factor and regression analysis methods. The study is based on a sample of 246 subjects (123 males and 123 females) from North India aged between 17 and 20 years. Four anthropometric measurements, hand length, hand breadth, foot length and foot breadth, taken on the left side of each subject were included in the study. Stature was measured using standard anthropometric techniques. Multiplication factors were calculated and linear regression models were derived for estimation of stature from hand and foot dimensions. The derived multiplication factors and regression formulae were applied to the hand and foot measurements in the study sample. The stature estimated from the multiplication factors and from regression analysis was compared with the actual stature to find the error in estimated stature. The results indicate that the range of error in stature estimation by the regression analysis method is smaller than that of the multiplication factor method, confirming that regression analysis is better than multiplication factor analysis for stature estimation. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
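
    A minimal sketch of the comparison on simulated data (the study's 246 measured subjects are not reproduced): a mean-ratio multiplication factor versus an ordinary least-squares regression, with the mean absolute estimation error of each. The foot-length and stature distributions are invented for illustration.

    ```python
    # Multiplication factor vs. linear regression for stature estimation.
    import numpy as np

    rng = np.random.default_rng(1)
    foot_length = rng.normal(24.5, 1.4, 200)                       # cm
    stature = 6.0 * foot_length + 18.0 + rng.normal(0, 3.0, 200)   # cm

    # Multiplication factor: mean ratio of stature to foot length.
    mf = np.mean(stature / foot_length)
    est_mf = mf * foot_length

    # Regression: stature = a * foot_length + b, fit by least squares.
    a, b = np.polyfit(foot_length, stature, 1)
    est_reg = a * foot_length + b

    print(np.mean(np.abs(est_mf - stature)))    # larger mean error
    print(np.mean(np.abs(est_reg - stature)))   # smaller mean error
    ```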

  1. Factors affecting construction performance: exploratory factor analysis

    NASA Astrophysics Data System (ADS)

    Soewin, E.; Chinda, T.

    2018-04-01

    The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors, with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) applied to the collected data gave rise to 10 factors, with 57 items, affecting construction performance. The findings further reveal the items constituting the ten key performance factors (KPIs), namely: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multi-dimensional performance evaluation framework for effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technology aspects. It is important to build the multi-dimensional performance evaluation framework around all key factors affecting the construction performance of a company, so that management can plan and implement an effective performance development plan that matches the mission and vision of the company.
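
    A minimal sketch of the EFA step on stand-in data, assuming a recent scikit-learn whose FactorAnalysis supports varimax rotation; the 120 x 12 item matrix and the |loading| > 0.4 cut-off are illustrative assumptions, not the study's 57-item survey.

    ```python
    # Exploratory factor analysis of questionnaire items (stand-in data).
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(7)
    X = rng.normal(size=(120, 12))        # 120 respondents x 12 items, placeholder

    fa = FactorAnalysis(n_components=3, rotation="varimax")
    scores = fa.fit_transform(X)          # per-respondent factor scores
    loadings = fa.components_.T           # item-by-factor loading matrix

    # Items loading strongly on a factor (|loading| > 0.4 is a common cut-off)
    # define that factor, analogous to the KPI groupings reported above.
    for j in range(loadings.shape[1]):
        print(f"factor {j}:", np.where(np.abs(loadings[:, j]) > 0.4)[0])
    ```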

  2. Cognitive task analysis: Techniques applied to airborne weapons training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role of cognitive task analysis in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  3. Extension Procedures for Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Nagy, Gabriel; Brunner, Martin; Lüdtke, Oliver; Greiff, Samuel

    2017-01-01

    We present factor extension procedures for confirmatory factor analysis that provide estimates of the relations of common and unique factors with external variables that do not undergo factor analysis. We present identification strategies that build upon restrictions of the pattern of correlations between unique factors and external variables. The…

  4. Review and classification of variability analysis techniques with clinical applications.

    PubMed

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine, and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, which often necessitates a mathematical transform of the time-series. Our aims are to summarize a broad literature, to promote a shared vocabulary that would improve the exchange of ideas, and to facilitate the comparison of analyses and results between different studies. We conclude with challenges for the evolving science of variability analysis.

  5. Review and classification of variability analysis techniques with clinical applications

    PubMed Central

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine, and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, which often necessitates a mathematical transform of the time-series. Our aims are to summarize a broad literature, to promote a shared vocabulary that would improve the exchange of ideas, and to facilitate the comparison of analyses and results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357
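
    A minimal sketch of three simple measures from the proposed domains (statistical and informational), computed on a synthetic series; clinical time-series data would replace it, and the bin count for the entropy estimate is an arbitrary choice.

    ```python
    # Simple variability measures on a synthetic time-series.
    import numpy as np

    rng = np.random.default_rng(3)
    x = np.cumsum(rng.normal(size=1000)) * 0.1 + rng.normal(size=1000)

    sd = np.std(x)                               # statistical domain
    rmssd = np.sqrt(np.mean(np.diff(x) ** 2))    # statistical: successive differences

    # Informational domain: Shannon entropy of a histogram of the series.
    counts, _ = np.histogram(x, bins=20)
    p = counts[counts > 0] / counts.sum()
    shannon = -np.sum(p * np.log2(p))

    print(sd, rmssd, shannon)
    ```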

  6. Anatomy-based transmission factors for technique optimization in portable chest x-ray

    NASA Astrophysics Data System (ADS)

    Liptak, Christopher L.; Tovey, Deborah; Segars, William P.; Dong, Frank D.; Li, Xiang

    2015-03-01

    Portable x-ray examinations often account for a large percentage of all radiographic examinations. Currently, portable examinations do not employ automatic exposure control (AEC). To aid in the design of a size-specific technique chart, acrylic slabs of various thicknesses are often used to estimate x-ray transmission for patients of various body thicknesses. This approach, while simple, does not account for patient anatomy, tissue heterogeneity, and the attenuation properties of the human body. To better account for these factors, in this work, we determined x-ray transmission factors using computational patient models that are anatomically realistic. A Monte Carlo program was developed to model a portable x-ray system. Detailed modeling was done of the x-ray spectrum, detector positioning, collimation, and source-to-detector distance. Simulations were performed using 18 computational patient models from the extended cardiac-torso (XCAT) family (9 males, 9 females; age range: 2-58 years; weight range: 12-117 kg). The ratio of air kerma at the detector with and without a patient model was calculated as the transmission factor. Our study showed that the transmission factor decreased exponentially with increasing patient thickness. For the range of patient thicknesses examined (12-28 cm), the transmission factor ranged from approximately 21% to 1.9% when the air kerma used in the calculation represented an average over the entire imaging field of view. The transmission factor ranged from approximately 21% to 3.6% when the air kerma used in the calculation represented the average signals from two discrete AEC cells behind the lung fields. These exponential relationships may be used to optimize imaging techniques for patients of various body thicknesses to aid in the design of clinical technique charts.
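
    A minimal sketch of the exponential relationship reported above: fit T(t) = T0 exp(-mu t) to transmission-versus-thickness points and read off an intermediate value for a technique chart. The sample points are invented to mimic the quoted 21% (12 cm) to 1.9% (28 cm) full-field range.

    ```python
    # Fit an exponential transmission-vs-thickness curve and interpolate.
    import numpy as np

    thickness = np.array([12.0, 16.0, 20.0, 24.0, 28.0])        # cm
    transmission = np.array([0.21, 0.115, 0.063, 0.035, 0.019]) # invented points

    # T(t) = T0 * exp(-mu * t)  =>  ln T is linear in t.
    neg_mu, lnT0 = np.polyfit(thickness, np.log(transmission), 1)

    def transmission_factor(t_cm):
        return np.exp(lnT0) * np.exp(neg_mu * t_cm)

    print(transmission_factor(18.0))    # interpolated transmission at 18 cm
    ```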

  7. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  8. A methodological comparison of customer service analysis techniques

    Treesearch

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  9. Model building techniques for analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for analysis results needs to be decreased for it to have an impact on overall product development. This effort can be reduced immensely through simple Pro/ENGINEER modeling techniques that come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.

  10. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  11. Modular techniques for dynamic fault-tree analysis

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
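
    A minimal sketch of the Markov half of such a hybrid analysis: a tiny continuous-time Markov model of a duplex system with repair, solved with a matrix exponential for the probability of system failure at several mission times. The failure and repair rates are invented.

    ```python
    # Solve a small continuous-time Markov reliability model.
    import numpy as np
    from scipy.linalg import expm

    lam, mu = 1e-3, 1e-1     # failure and repair rates (per hour), assumed
    # States: 0 = both units up, 1 = one failed, 2 = system failed (absorbing).
    Q = np.array([
        [-2 * lam,      2 * lam,  0.0],
        [      mu, -(mu + lam),   lam],
        [     0.0,          0.0,  0.0],
    ])

    p0 = np.array([1.0, 0.0, 0.0])
    for t in (10.0, 100.0, 1000.0):     # mission times in hours
        pt = p0 @ expm(Q * t)
        print(t, pt[2])                 # unreliability at time t
    ```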

  12. Milch versus Stimson technique for nonsedated reduction of anterior shoulder dislocation: a prospective randomized trial and analysis of factors affecting success.

    PubMed

    Amar, Eyal; Maman, Eran; Khashan, Morsi; Kauffman, Ehud; Rath, Ehud; Chechik, Ofir

    2012-11-01

    The shoulder is regarded as the most commonly dislocated major joint in the human body. Most dislocations can be reduced by simple methods in the emergency department, whereas others require more complicated approaches. We compared the efficacy, safety, pain, and duration of reduction between the Milch technique and the Stimson technique in treating dislocations. We also identified factors that affected the success rate. All enrolled patients were randomized to either the Milch technique or the Stimson technique for dislocated shoulder reduction. The study cohort consisted of 60 patients (mean age, 43.9 years; age range, 18-88 years) who were randomly assigned to treatment by either the Stimson technique (n = 25) or the Milch technique (n = 35). Oral analgesics were available for both groups. The 2 groups were similar in demographics, patient characteristics, and pain levels. The first reduction attempt in the Milch and Stimson groups was successful in 82.8% and 28% of cases, respectively (P < .001), and the mean reduction time was 4.68 and 8.84 minutes, respectively (P = .007). The success rate was found to be affected by the reduction technique, the interval between dislocation occurrence and first reduction attempt, and the pain level on admittance. The success rate and time to achieve reduction without sedation were superior for the Milch technique compared with the Stimson technique. Early implementation of reduction measures and low pain levels at presentation favor successful reduction, which, in combination with oral pain medication, constitutes an acceptable and reasonable management alternative to reduction with sedation. Copyright © 2012 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Mosby, Inc. All rights reserved.

  13. Fourier Spectroscopy: A Simple Analysis Technique

    ERIC Educational Resources Information Center

    Oelfke, William C.

    1975-01-01

    Presents a simple method of analysis in which the student can integrate, point by point, any interferogram to obtain its Fourier transform. The manual technique requires no special equipment and is based on relationships that most undergraduate physics students can derive from the Fourier integral equations. (Author/MLH)
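
    A minimal sketch of the manual method this record describes, done numerically: integrate an interferogram point by point against cos(2 pi nu x) to recover the spectrum. The single-line source and the sampling grid are illustrative.

    ```python
    # Point-by-point cosine integration of an interferogram.
    import numpy as np

    x = np.linspace(0.0, 1.0, 400)                   # optical path difference (cm)
    interferogram = np.cos(2 * np.pi * 150.0 * x)    # one spectral line at 150 cm^-1

    def spectrum_at(nu, x, I):
        # B(nu) ~ sum of I(x_k) * cos(2 pi nu x_k) * dx, the point-by-point sum.
        dx = x[1] - x[0]
        return np.sum(I * np.cos(2 * np.pi * nu * x)) * dx

    nus = np.arange(100.0, 200.0, 1.0)
    B = np.array([spectrum_at(nu, x, interferogram) for nu in nus])
    print(nus[np.argmax(B)])                         # peaks near 150 cm^-1
    ```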

  14. Application of sensitivity-analysis techniques to the calculation of topological quantities

    NASA Astrophysics Data System (ADS)

    Gilchrist, Stuart

    2017-08-01

    Magnetic reconnection in the corona occurs preferentially at sites where the magnetic connectivity is either discontinuous or has a large spatial gradient. Hence there is a general interest in computing quantities (like the squashing factor) that characterize the gradient of the field-line mapping function. Here we present an algorithm for calculating certain (quasi)topological quantities using mathematical techniques from the field of "sensitivity analysis". The method is based on the calculation of a three-dimensional field-line mapping Jacobian from which all the topological quantities of interest can be derived. We will present the algorithm and the details of a publicly available set of libraries that implement it.
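
    A minimal sketch of the Jacobian-based calculation, assuming a toy analytic field-line mapping in place of traced field lines: a finite-difference 2x2 mapping Jacobian and the squashing factor Q = (a^2 + b^2 + c^2 + d^2) / |ad - bc| built from it.

    ```python
    # Squashing factor from a finite-difference field-line mapping Jacobian.
    import numpy as np

    def mapping(x, y):
        # Placeholder mapping; a real code would trace field lines here.
        return x + 0.5 * np.tanh(5.0 * y), y + 0.1 * x

    def squashing_factor(x, y, h=1e-5):
        X0, Y0 = mapping(x, y)
        a = (mapping(x + h, y)[0] - X0) / h    # dX/dx
        b = (mapping(x, y + h)[0] - X0) / h    # dX/dy
        c = (mapping(x + h, y)[1] - Y0) / h    # dY/dx
        d = (mapping(x, y + h)[1] - Y0) / h    # dY/dy
        return (a * a + b * b + c * c + d * d) / abs(a * d - b * c)

    print(squashing_factor(0.0, 0.0))   # large Q flags a quasi-separatrix layer
    ```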

  15. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  16. Chromatographic Techniques for Rare Earth Elements Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved through the development of two instrumental techniques. The efficiency of spectroscopic methods has been extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on their preconcentration and separation, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases can be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  17. Development of analysis techniques for remote sensing of vegetation resources

    NASA Technical Reports Server (NTRS)

    Draeger, W. C.

    1972-01-01

    Various data handling and analysis techniques are summarized for evaluation of ERTS-A and supporting high flight imagery. These evaluations are concerned with remote sensors applied to wildland and agricultural vegetation resource inventory problems. Monitoring California's annual grassland, automatic texture analysis, agricultural ground data collection techniques, and spectral measurements are included.

  18. Analysis techniques for residual acceleration data

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Alexander, J. Iwan D.; Snyder, Robert S.

    1990-01-01

    Various aspects of residual acceleration data are of interest to low-gravity experimenters. Maximum and mean values and various other statistics can be obtained from data as collected in the time domain. Additional information may be obtained through manipulation of the data. Fourier analysis is discussed as a means of obtaining information about the dominant frequency components of a given data window. Transformation of data into different coordinate axes is useful in the analysis of experiments with different orientations and can be achieved by the use of a transformation matrix. Applying such analysis techniques to residual acceleration data provides more information than a time history alone and increases the effectiveness of post-flight analysis of low-gravity experiments.
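
    A minimal sketch of the two manipulations discussed: rotating a synthetic three-axis acceleration record into an experiment-aligned frame with a transformation matrix, then locating the dominant frequency component with an FFT. The sample rate, tilt angle, and signal content are illustrative.

    ```python
    # Coordinate transformation and Fourier analysis of residual acceleration.
    import numpy as np

    fs = 100.0                                    # samples per second, assumed
    t = np.arange(0, 60, 1 / fs)
    acc = np.vstack([                             # synthetic 3-axis g-jitter
        1e-4 * np.sin(2 * np.pi * 0.1 * t),
        5e-5 * np.sin(2 * np.pi * 2.0 * t),
        2e-5 * np.random.default_rng(0).standard_normal(len(t)),
    ])

    theta = np.deg2rad(30.0)                      # experiment tilted 30 deg about z
    R = np.array([[np.cos(theta), -np.sin(theta), 0],
                  [np.sin(theta),  np.cos(theta), 0],
                  [0,              0,             1]])
    acc_exp = R @ acc                             # data in experiment coordinates

    spec = np.abs(np.fft.rfft(acc_exp[0]))
    freqs = np.fft.rfftfreq(len(t), 1 / fs)
    print(freqs[np.argmax(spec[1:]) + 1])         # dominant frequency, DC excluded
    ```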

  19. Exploratory Bi-factor Analysis: The Oblique Case.

    PubMed

    Jennrich, Robert I; Bentler, Peter M

    2012-07-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford (Psychometrika 47:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler (Psychometrika 76:537-549, 2011) introduced an exploratory form of bi-factor analysis that does not require one to provide an explicit bi-factor structure a priori. They use exploratory factor analysis and a bifactor rotation criterion designed to produce a rotated loading matrix that has an approximate bi-factor structure. Among other things this can be used as an aid in finding an explicit bi-factor structure for use in a confirmatory bi-factor analysis. They considered only orthogonal rotation. The purpose of this paper is to consider oblique rotation and to compare it to orthogonal rotation. Because there are many more oblique rotations of an initial loading matrix than orthogonal rotations, one expects the oblique results to approximate a bi-factor structure better than orthogonal rotations and this is indeed the case. A surprising result arises when oblique bi-factor rotation methods are applied to ideal data.

  20. Influence of Topographic and Hydrographic Factors on the Spatial Distribution of Leptospirosis Disease in São Paulo County, Brazil: An Approach Using Geospatial Techniques and GIS Analysis

    NASA Astrophysics Data System (ADS)

    Ferreira, M. C.; Ferreira, M. F. M.

    2016-06-01

    Leptospirosis is a zoonosis caused by bacteria of the genus Leptospira. Rodents, especially Rattus norvegicus, are the most frequent hosts of this microorganism in cities. Human transmission occurs through contact with the urine, blood, or tissues of rodents, or through contact with water or mud contaminated by rodent urine. Spatial patterns of concentration of leptospirosis are related to multiple environmental and socioeconomic factors, such as housing near flooding areas, domestic garbage disposal sites, and high densities of people living in slums located near river channels. We used geospatial techniques and a geographical information system (GIS) to analyse the spatial relationship between the distribution of leptospirosis cases and distance from rivers, river density in the census sector, and terrain slope, in Sao Paulo County, Brazil. To test this methodology we used a sample of 183 geocoded leptospirosis cases confirmed in 2007, ASTER GDEM2 data, and hydrography and census sector shapefiles. Our results showed that GIS and geospatial analysis techniques improved the mapping of the disease and permitted identification of the spatial pattern of association between the location of cases and the spatial distribution of the environmental variables analyzed. This study also showed that leptospirosis cases may be more related to census sectors located in higher river density areas and to households situated at shorter distances from rivers. On the other hand, it was not possible to assert that terrain slope contributes significantly to the location of leptospirosis cases.

  1. Decomposition techniques

    USGS Publications Warehouse

    Chao, T.T.; Sanzolone, R.F.

    1992-01-01

    Sample decomposition is a fundamental and integral step in the procedure of geochemical analysis. It is often the limiting factor in sample throughput, especially with the recent application of fast, modern multi-element measurement instrumentation. The complexity of geological materials makes it necessary to choose the sample decomposition technique that is compatible with the specific objective of the analysis. When selecting a decomposition technique, consideration should be given to the chemical and mineralogical characteristics of the sample, the elements to be determined, precision and accuracy requirements, sample throughput, technical capability of personnel, and time constraints. This paper addresses these concerns and discusses the attributes and limitations of many techniques of sample decomposition, along with examples of their application to geochemical analysis. The chemical properties of reagents in their function as decomposition agents are also reviewed. The section on acid dissolution techniques addresses the various inorganic acids that are used individually or in combination in both open and closed systems. Fluxes used in sample fusion are discussed. The promising microwave-oven technology and the emerging field of automation are also examined. A section on applications highlights the use of decomposition techniques for the determination of Au, platinum group elements (PGEs), Hg, U, hydride-forming elements, rare earth elements (REEs), and multi-elements in geological materials. Partial dissolution techniques used for geochemical exploration, which have been treated in detail elsewhere, are not discussed here; nor are fire-assaying for noble metals or decomposition techniques for X-ray fluorescence and nuclear methods. © 1992.

  2. Developing techniques for cause-responsibility analysis of occupational accidents.

    PubMed

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents, determine social responsibility, and delineate the role of the groups involved in work-related accidents. This study develops an occupational accidents causes tree, an occupational accidents responsibility tree, and an occupational accidents component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques, and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting that were randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel for the determination of responsible groups and their rates of responsibility. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for determining detailed lists of tasks, responsibilities, and their rates. They are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. [Analysis of virulence factors of Porphyromonas endodontalis based on comparative proteomics technique].

    PubMed

    Li, H; Ji, H; Wu, S S; Hou, B X

    2016-12-09

    Objective: To analyze the protein expression profile and the potential virulence factors of Porphyromonas endodontalis (Pe) via comparison with two strains of Porphyromonas gingivalis (Pg) of high and low virulence, respectively. Methods: Whole-cell comparative proteomics of Pe ATCC35406 were examined and compared with those of the high-virulence strain Pg W83 and the low-virulence strain Pg ATCC33277, respectively. Isobaric tags for relative and absolute quantitation (iTRAQ) combined with nano liquid chromatography-tandem mass spectrometry (Nano-LC-MS/MS) were adopted to identify and quantitate the proteins of Pe and the two Pg strains of differing virulence, using isotopically labeled peptides, mass spectrometric detection, and bioinformatics analysis. The biological functions of similar proteins expressed by Pe ATCC35406 and the two Pg strains were quantified and analyzed. Results: In total, 1210 proteins were identified when Pe was compared with Pg W83. There were 130 proteins (10.74% of the total) expressed similarly, including 89 known functional proteins and 41 proteins of unknown function. In total, 1223 proteins were identified when Pe was compared with Pg ATCC33277. There were 110 proteins (8.99% of the total) expressed similarly, including 72 known functional proteins and 38 proteins of unknown function. The similarly expressed proteins in Pe and the Pg strains mainly involved catalytic activity and binding functions, including recombination activation gene (RagA), lipoprotein, chaperonin DnaK, Clp family proteins (ClpC and ClpX) and various iron-binding proteins. They were involved in metabolism and cellular processes. In addition, the type and number of similar virulence proteins between Pe and high-virulence Pg were higher than those between Pe and low-virulence Pg. Conclusions: Lipoprotein, oxygen resistance proteins, and iron-binding proteins were probably the potential virulence factors of Pe ATCC35406. It was

  4. Applications Of Binary Image Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head is freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  5. Water quality analysis of the Rapur area, Andhra Pradesh, South India using multivariate techniques

    NASA Astrophysics Data System (ADS)

    Nagaraju, A.; Sreedhar, Y.; Thejaswi, A.; Sayadi, Mohammad Hossein

    2017-10-01

    Groundwater samples from the Rapur area were collected from different sites to evaluate the major ion chemistry. The large number of data can lead to difficulties in the integration, interpretation, and representation of the results. Two multivariate statistical methods, hierarchical cluster analysis (HCA) and factor analysis (FA), were applied to evaluate their usefulness for classifying and identifying the geochemical processes controlling groundwater geochemistry. Four statistically significant clusters were obtained from 30 sampling stations. This resulted in two important clusters, viz. cluster 1 (pH, Si, CO3, Mg, SO4, Ca, K, HCO3, alkalinity, Na, Na + K, Cl, and hardness) and cluster 2 (EC and TDS), which are released to the study area from different sources. The application of different multivariate statistical techniques, such as principal component analysis (PCA), assists in the interpretation of complex data matrices for a better understanding of the water quality of a study area. From PCA, it is clear that the first factor (factor 1), which accounted for 36.2% of the total variance, had high positive loadings on EC, Mg, Cl, TDS, and hardness. Based on the PCA scores, four significant cluster groups of sampling locations were detected on the basis of the similarity of their water quality.
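
    A minimal sketch of the two multivariate steps on stand-in data (the 30-station ion-chemistry matrix is not reproduced): Ward hierarchical clustering of standardized samples into four groups, followed by PCA loadings and explained variance.

    ```python
    # Hierarchical clustering plus PCA on a stand-in hydrochemistry matrix.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(5)
    X = rng.normal(size=(30, 14))                  # 30 stations x 14 parameters
    Xs = StandardScaler().fit_transform(X)         # standardize before HCA/PCA

    Z = linkage(Xs, method="ward")
    clusters = fcluster(Z, t=4, criterion="maxclust")    # four station groups

    pca = PCA(n_components=3).fit(Xs)
    print(pca.explained_variance_ratio_)           # e.g. factor 1 share of variance
    loadings = pca.components_[0]                  # parameters loading on factor 1
    ```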

  6. Critical incident technique analysis applied to perianesthetic cardiac arrests at a university teaching hospital.

    PubMed

    Hofmeister, Erik H; Reed, Rachel A; Barletta, Michele; Shepard, Molly; Quandt, Jane

    2018-05-01

    To apply the critical incident technique (CIT) methodology to a series of perianesthetic cardiac arrest events at a university teaching hospital to describe the factors that contributed to cardiac arrest. CIT qualitative analysis of a case series. A group of 16 dogs and cats that suffered a perioperative cardiac arrest between November 2013 and November 2016. If an arrest occurred, the event was discussed among the anesthesiologists. The discussion included a description of the case, a description of the sequence of events leading up to the arrest and a discussion of what could have been done to affect the outcome. A written description of the case and the event including animal signalment and a timeline of events was provided by the supervising anesthesiologist following discussion among the anesthesiologists. Only dogs or cats were included. After the data collection period, information from the medical record was collected. A qualitative document analysis was performed on the summaries provided about each case by the supervising anesthesiologist, the medical record and any supporting documents. Each case was then classified into one or more of the following: animal, human, equipment, drug and procedural factors for cardiac arrest. The most common factor was animal (n=14), followed by human (n=12), procedural (n=4), drugs (n=1) and equipment (n=1). The majority (n=11) of animals had multiple factors identified. Cardiac arrests during anesthesia at a referral teaching hospital were primarily a result of animal and human factors. Arrests because of procedural, drug and equipment factors were uncommon. Most animals experienced more than one factor and two animals arrested after a change in recumbency. Future work should focus on root cause analysis and interventions designed to minimize all factors, particularly human ones. Copyright © 2018 Association of Veterinary Anaesthetists and American College of Veterinary Anesthesia and Analgesia. Published by Elsevier Ltd

  7. Application of dermoscopy image analysis technique in diagnosing urethral condylomata acuminata.

    PubMed

    Zhang, Yunjie; Jiang, Shuang; Lin, Hui; Guo, Xiaojuan; Zou, Xianbiao

    2018-01-01

    In this study, cases with suspected urethral condylomata acuminata were examined by dermoscopy, in order to explore an effective method for clinical diagnosis. To study the application of dermoscopy image analysis technique in the clinical diagnosis of urethral condylomata acuminata. A total of 220 suspected urethral condylomata acuminata were clinically diagnosed first with the naked eye, and then by using dermoscopy image analysis technique. Afterwards, a comparative analysis was made of the two diagnostic methods. Among the 220 suspected urethral condylomata acuminata, there was a higher positive rate by dermoscopy examination than by visual observation. Dermoscopy examination is still restricted by its inapplicability in the deep urethral orifice and in skin wrinkles, and concordance between different clinicians may also vary. Dermoscopy image analysis technique features high sensitivity and quick, accurate diagnosis, and is non-invasive; we recommend its use.

  8. Sensitivity analysis of hybrid thermoelastic techniques

    Treesearch

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  9. Exploratory Bi-Factor Analysis: The Oblique Case

    ERIC Educational Resources Information Center

    Jennrich, Robert I.; Bentler, Peter M.

    2012-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford ("Psychometrika" 47:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler ("Psychometrika" 76:537-549, 2011) introduced an exploratory form of bi-factor…

  10. Factor Analysis of Drawings: Application to college student models of the greenhouse effect

    NASA Astrophysics Data System (ADS)

    Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel

    2015-09-01

    Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance, suggesting that 4 archetype models of the greenhouse effect dominate thinking within this population. Factor scores, indicating the extent to which each student's drawing aligned with representative models, were compared to performance on conceptual understanding and attitudes measures, demographics, and non-cognitive features of drawings. Student drawings were also compared to drawings made by scientists to ascertain the extent to which models reflect more sophisticated and accurate models. Results indicate that student and scientist drawings share some similarities, most notably the presence of some features of the most sophisticated non-scientific model held among the study population. Prior knowledge, prior attitudes, gender, and non-cognitive components are also predictive of an individual student's model. This work presents a new technique for analyzing drawings, with general implications for the use of drawings in investigating student conceptions.

  11. Bayesian Exploratory Factor Analysis

    PubMed Central

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  12. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  13. Chemical information obtained from Auger depth profiles by means of advanced factor analysis (MLCFA)

    NASA Astrophysics Data System (ADS)

    De Volder, P.; Hoogewijs, R.; De Gryse, R.; Fiermans, L.; Vennik, J.

    1993-01-01

    The advanced multivariate statistical technique "maximum likelihood common factor analysis (MLCFA)" is shown to be superior to "principal component analysis (PCA)" for decomposing overlapping peaks into their individual component spectra when neither the number of components nor the peak shapes of the component spectra are known. An examination of the maximum resolving power of both techniques, MLCFA and PCA, by means of artificially created series of multicomponent spectra confirms this finding unambiguously. Substantial progress in the use of AES as a chemical-analysis technique is accomplished through the implementation of MLCFA. Chemical information from Auger depth profiles is extracted by investigating the variation of the line shape of the Auger signal as a function of the changing chemical state of the element. In particular, MLCFA combined with Auger depth profiling has been applied to problems related to steelcord-rubber tyre adhesion. MLCFA allows one to elucidate the precise nature of the interfacial layer of reaction products between natural rubber and the thin brass layer on which it is vulcanized. This study reveals many interesting chemical aspects of the oxi-sulfidation of brass undetectable with classical AES.

  14. Technique for quantitative RT-PCR analysis directly from single muscle fibers.

    PubMed

    Wacker, Michael J; Tehel, Michelle M; Gallagher, Philip M

    2008-07-01

    The use of single-cell quantitative RT-PCR has greatly aided the study of gene expression in fields such as muscle physiology. For this study, we hypothesized that single muscle fibers from a biopsy can be placed directly into the reverse transcription buffer and that gene expression data can be obtained without having to first extract the RNA. To test this hypothesis, biopsies were taken from the vastus lateralis of five male subjects. Single muscle fibers were isolated and underwent RNA isolation (technique 1) or were placed directly into reverse transcription buffer (technique 2). After cDNA conversion, individual fiber cDNA was pooled and quantitative PCR was performed using primer-probes for beta(2)-microglobulin, glyceraldehyde-3-phosphate dehydrogenase, insulin-like growth factor I receptor, and glucose transporter subtype 4. The no-RNA-extraction method provided quantitative PCR data similar to those of the RNA extraction method. A third technique was also tested in which we used one-quarter of an individual fiber's cDNA for PCR (not pooled); the average coefficient of variation between fibers was <8% (cycle threshold value) for all genes studied. The no-RNA-extraction technique was tested on isolated muscle fibers using a gene known to increase after exercise (pyruvate dehydrogenase kinase 4). We observed a 13.9-fold change in expression after resistance exercise, which is consistent with what has been previously observed. These results demonstrate a successful method for gene expression analysis directly from single muscle fibers.
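
    A minimal sketch of the downstream calculation only, using the standard 2^-ddCt method for relative expression from cycle thresholds; the Ct values are invented so that the result lands near the reported 13.9-fold change.

    ```python
    # Relative expression by the 2^(-ddCt) method (illustrative Ct values).
    pdk4_rest, ref_rest = 28.0, 18.0     # target and reference gene Ct, at rest
    pdk4_ex,   ref_ex   = 24.4, 18.2     # Ct values after resistance exercise

    d_ct_rest = pdk4_rest - ref_rest     # normalize target to reference gene
    d_ct_ex = pdk4_ex - ref_ex
    fold_change = 2 ** (-(d_ct_ex - d_ct_rest))
    print(fold_change)                   # ~13.9-fold increase
    ```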

  15. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  16. Application of pattern recognition techniques to crime analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  17. A Simplified Technique for Scoring DSM-IV Personality Disorders with the Five-Factor Model

    ERIC Educational Resources Information Center

    Miller, Joshua D.; Bagby, R. Michael; Pilkonis, Paul A.; Reynolds, Sarah K.; Lynam, Donald R.

    2005-01-01

    The current study compares the use of two alternative methodologies for using the Five-Factor Model (FFM) to assess personality disorders (PDs). Across two clinical samples, a technique using the simple sum of selected FFM facets is compared with a previously used prototype matching technique. The results demonstrate that the more easily…

  18. Magnetic separation techniques in sample preparation for biological analysis: a review.

    PubMed

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with the advantages of superparamagnetism, good biocompatibility, and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, recent advancements in magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis are reviewed. The strategy of magnetic separation techniques is summarized. The synthesis, stabilization, and bio-functionalization of magnetic nanoparticles are reviewed in detail. Characterization of magnetic materials is also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of proteins, nucleic acids, cells, and bioactive compounds, and for the immobilization of enzymes, are described. Finally, existing problems and possible future trends of magnetic separation techniques for biological analysis are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Towards Effective Clustering Techniques for the Analysis of Electric Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Emilie A.; Cotilla Sanchez, Jose E.; Halappanavar, Mahantesh

    2013-11-30

    Clustering is an important data analysis technique with numerous applications in the analysis of electric power grids. Standard clustering techniques are oblivious to the rich structural and dynamic information available for power grids. Therefore, by exploiting the inherent topological and electrical structure in the power grid data, we propose new methods for clustering with applications to model reduction, locational marginal pricing, phasor measurement unit (PMU or synchrophasor) placement, and power system protection. We focus our attention on model reduction for analysis based on time-series information from synchrophasor measurement devices, and on spectral techniques for clustering. By comparing different clustering techniques on two instances of realistic power grids we show that the solutions are related and therefore one could leverage that relationship for a computational advantage. Thus, by contrasting different clustering techniques we make a case for exploiting structure inherent in the data, with implications for several domains including power systems.
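
    A minimal sketch of topology-aware clustering in this spirit: spectral clustering applied to a precomputed affinity matrix for a toy six-bus network. The adjacency values are placeholders for electrical structure such as admittances or electrical distances.

    ```python
    # Spectral clustering of a toy grid graph via a precomputed affinity matrix.
    import numpy as np
    from sklearn.cluster import SpectralClustering

    # Two tightly coupled bus groups joined by one weak tie.
    A = np.array([
        [0, 1, 1, 0, 0, 0],
        [1, 0, 1, 0, 0, 0],
        [1, 1, 0, 1, 0, 0],
        [0, 0, 1, 0, 1, 1],
        [0, 0, 0, 1, 0, 1],
        [0, 0, 0, 1, 1, 0],
    ], dtype=float)

    sc = SpectralClustering(n_clusters=2, affinity="precomputed", random_state=0)
    labels = sc.fit_predict(A)
    print(labels)       # buses split into the two coherent groups
    ```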

  20. Performance analysis of clustering techniques over microarray data: A case study

    NASA Astrophysics Data System (ADS)

    Dash, Rasmita; Misra, Bijan Bihari

    2018-03-01

    Handling big data is one of the major issues in the field of statistical data analysis. In such investigations, cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster analysis approaches, but which approach suits a particular dataset is difficult to predict. To deal with this problem, a grading approach is introduced over many clustering techniques to identify a stable technique. The grading approach depends on the characteristics of the dataset as well as on the validity indices, so a two-stage grading approach is implemented. In this study the grading approach is implemented over five clustering techniques: hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significant is also established by the Nemenyi post-hoc hypothesis test.
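
    A minimal sketch of the grading idea, scoring two of the named techniques (k-means and AGNES) on one synthetic dataset with one validity index (silhouette) and ranking them; the full study spans five techniques, five microarray datasets, and seven indices.

    ```python
    # Rank clustering techniques on a dataset by a validity index.
    import numpy as np
    from sklearn.cluster import KMeans, AgglomerativeClustering
    from sklearn.datasets import make_blobs
    from sklearn.metrics import silhouette_score

    X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

    scores = {
        "k-means": silhouette_score(
            X, KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)),
        "AGNES": silhouette_score(
            X, AgglomerativeClustering(n_clusters=4, linkage="ward").fit_predict(X)),
    }
    for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(name, round(s, 3))    # higher silhouette ranks first
    ```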

  1. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    NASA Technical Reports Server (NTRS)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.

  2. Analysis technique for controlling system wavefront error with active/adaptive optics

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
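
    A minimal sketch of the control step under a linear optics model: actuator commands that minimize system WFE solve a least-squares problem ||A c - w||, where A holds actuator influence functions and w is the disturbance wavefront. Both are random placeholders here.

    ```python
    # Least-squares actuator commands from a linear optics model.
    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.normal(size=(500, 24))          # 500 WFE samples x 24 actuators
    w = rng.normal(size=500)                # measured wavefront error

    c, residual, *_ = np.linalg.lstsq(A, w, rcond=None)
    wfe_before = np.linalg.norm(w)
    wfe_after = np.linalg.norm(w - A @ c)   # residual WFE with commands applied
    print(wfe_before, wfe_after)
    ```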

  3. Factor Analysis of Intern Effectiveness

    ERIC Educational Resources Information Center

    Womack, Sid T.; Hannah, Shellie Louise; Bell, Columbus David

    2012-01-01

    Four factors in teaching intern effectiveness, as measured by a Praxis III-similar instrument, were found among observational data of teaching interns during the 2010 spring semester. Those factors were lesson planning, teacher/student reflection, fairness & safe environment, and professionalism/efficacy. This factor analysis was as much of a…

  4. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
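
    A minimal sketch of the pooling idea only, not the authors' model: a two-level Beta-Binomial treatment of sparse event counts from several sources, with hyperparameters set by a crude method-of-moments step (a full hierarchical Bayesian analysis would infer them instead). All counts and exposures are invented.

    ```python
    # Empirical-Bayes Beta-Binomial pooling of sparse event counts.
    import numpy as np

    events = np.array([0, 1, 0, 2, 0])              # rare-event counts per source
    exposure = np.array([120, 340, 90, 410, 150])   # trials per source

    rates = (events + 0.5) / (exposure + 1.0)       # smoothed per-source rates
    m, v = rates.mean(), rates.var() + 1e-12
    k = m * (1 - m) / v - 1                         # Beta(a, b) by moments
    a, b = m * k, (1 - m) * k

    post_mean = (a + events) / (a + b + exposure)   # shrunken source-level rates
    print(post_mean)
    ```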

  5. A computational intelligent approach to multi-factor analysis of violent crime information system

    NASA Astrophysics Data System (ADS)

    Liu, Hongbo; Yang, Chao; Zhang, Meng; McLoone, Seán; Sun, Yeqing

    2017-02-01

    Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. The relationship between pairs of factors has also been extensively studied, including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour, and as such a greater level of insight into its complex nature is needed. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). Identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.

  6. Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Peters, Jeanne M.

    1987-01-01

    An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
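
    A small sketch of the idea (generic dense stand-ins for the stiffness matrices; the actual finite element implementation is not reproduced): precondition CG with the orthotropic part K0 of the stiffness matrix K = K0 + dK, so each iteration costs no more than an orthotropic analysis:

    ```python
    import numpy as np
    from scipy.linalg import cho_factor, cho_solve
    from scipy.sparse.linalg import LinearOperator, cg

    rng = np.random.default_rng(1)
    n = 200
    K0 = np.diag(rng.uniform(1.0, 4.0, n))     # "orthotropic" part (SPD)
    S = rng.normal(scale=0.01, size=(n, n))
    K = K0 + (S + S.T) / 2                     # small anisotropic coupling added
    f = rng.normal(size=n)

    # Preconditioner: exact solves with the orthotropic part only.
    factor = cho_factor(K0)
    M = LinearOperator((n, n), matvec=lambda r: cho_solve(factor, r))

    u, info = cg(K, f, M=M)
    print("converged:", info == 0, "residual:", np.linalg.norm(K @ u - f))
    ```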

  7. Using Human Factors Techniques to Design Text Message Reminders for Childhood Immunization

    ERIC Educational Resources Information Center

    Ahlers-Schmidt, Carolyn R.; Hart, Traci; Chesser, Amy; Williams, Katherine S.; Yaghmai, Beryl; Shah-Haque, Sapna; Wittler, Robert R.

    2012-01-01

    This study engaged parents to develop concise, informative, and comprehensible text messages for an immunization reminder system using Human Factors techniques. Fifty parents completed a structured interview including demographics, technology questions, willingness to receive texts from their child's doctor, and health literacy. Each participant…

  8. Analysis of nasopharyngeal carcinoma risk factors with Bayesian networks.

    PubMed

    Aussem, Alex; de Morais, Sérgio Rodrigues; Corbex, Marilys

    2012-01-01

    We propose a new graphical framework for extracting the relevant dietary, social and environmental risk factors that are associated with an increased risk of nasopharyngeal carcinoma (NPC) in a case-control epidemiologic study that consists of 1289 subjects and 150 risk factors. This framework builds on the use of Bayesian networks (BNs) for representing statistical dependencies between the random variables. We discuss a novel constraint-based procedure, called Hybrid Parents and Children (HPC), that recursively builds a local graph that includes all the relevant features statistically associated with the NPC, without having to find the whole BN first. The local graph is afterwards directed by the domain expert according to domain knowledge. It provides a statistical profile of the recruited population, and meanwhile helps identify the risk factors associated with NPC. Extensive experiments on synthetic data sampled from known BNs show that HPC outperforms state-of-the-art algorithms that appeared in the recent literature. From a biological perspective, the present study confirms that chemical products, pesticides and domestic fume intake from incomplete combustion of coal and wood are significantly associated with NPC risk. These results suggest that industrial workers are often exposed to noxious chemicals and poisonous substances that are used in the course of manufacturing. This study also supports previous findings that the consumption of a number of preserved food items, like house-made proteins and sheep fat, is a major risk factor for NPC. BNs are valuable data mining tools for the analysis of epidemiologic data. They can explicitly combine both expert knowledge from the field and information inferred from the data. These techniques therefore merit consideration as valuable alternatives to traditional multivariate regression techniques in epidemiologic studies. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. An evaluation of the carbon balance technique for estimating emission factors and fuel consumption in forest fires

    Treesearch

    Nelson, Jr. Ralph M.

    1982-01-01

    Measured and calculated values of emission factors and fuel consumption from eighteen experimental fires were compared to evaluate the carbon balance technique. The technique is based on a model for the emission factor of carbon dioxide, corrected for the production of other emissions, and requires measurements of effluent concentrations and air volume in the...
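
    In outline (a hedged reconstruction of the generic carbon mass-balance argument, not the paper's exact model): essentially all fuel carbon leaves in the plume, so the fuel burned is recoverable from the measured carbon, and the CO2 emission factor is the fuel carbon not emitted as other carbonaceous species:

    $$ W = \frac{m_C}{f_C}, \qquad \mathrm{EF}_{\mathrm{CO_2}} = \frac{44}{12}\Big(f_C - \sum_{i \neq \mathrm{CO_2}} \mathrm{EF}^{C}_{i}\Big), $$

    where $W$ is the fuel consumed, $m_C$ the total carbon measured in the effluent (from concentrations and air volume), $f_C$ the carbon mass fraction of the fuel, and $\mathrm{EF}^{C}_{i}$ the carbon emitted as species $i$ (CO, CH$_4$, particulates) per unit mass of fuel; the factor 44/12 converts carbon mass to CO$_2$ mass.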

  10. Analysis of psychological factors for quality assessment of interactive multimodal service

    NASA Astrophysics Data System (ADS)

    Yamagishi, Kazuhisa; Hayashi, Takanori

    2005-03-01

    We proposed a subjective quality assessment model for interactive multimodal services. First, psychological factors of an audiovisual communication service were extracted by using the semantic differential (SD) technique and factor analysis. Forty subjects participated in subjective tests and performed point-to-point conversational tasks on a PC-based TV phone that exhibits various network qualities. The subjects assessed those qualities on the basis of 25 pairs of adjectives. Two psychological factors, i.e., an aesthetic feeling and a feeling of activity, were extracted from the results. Then, quality impairment factors affecting these two psychological factors were analyzed. We found that the aesthetic feeling is mainly affected by IP packet loss and video coding bit rate, and the feeling of activity depends on delay time and video frame rate. We then proposed an opinion model derived from the relationships among quality impairment factors, psychological factors, and overall quality. The results indicated that the estimation error of the proposed model is almost equivalent to the statistical reliability of the subjective score. Finally, using the proposed model, we discuss guidelines for quality design of interactive audiovisual communication services.
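
    As an illustrative sketch of the extraction step (synthetic ratings; the study's actual SD data and rotation choices are not reproduced), a two-factor model can be fitted to subject-by-adjective-pair ratings:

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis  # rotation needs sklearn >= 0.24

    rng = np.random.default_rng(2)
    latent = rng.normal(size=(40, 2))     # e.g., aesthetic feeling, activity
    load = rng.normal(size=(2, 25))       # 25 adjective-pair scales
    ratings = latent @ load + rng.normal(scale=0.5, size=(40, 25))

    fa = FactorAnalysis(n_components=2, rotation="varimax").fit(ratings)
    print("loadings (2 factors x 25 scales):", fa.components_.shape)
    ```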

  11. Robust Bayesian Factor Analysis

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Yuan, Ke-Hai

    2003-01-01

    Bayesian factor analysis (BFA) assumes the normal distribution of the current sample conditional on the parameters. Practical data in social and behavioral sciences typically have significant skewness and kurtosis. If the normality assumption is not attainable, the posterior analysis will be inaccurate, although the BFA depends less on the current…

  12. Factor weighting in DRASTIC modeling.

    PubMed

    Pacheco, F A L; Pires, L M G R; Santos, R M B; Sanches Fernandes, L F

    2015-02-01

    Evaluation of aquifer vulnerability comprehends the integration of very diverse data, including soil characteristics (texture), hydrologic settings (recharge), aquifer properties (hydraulic conductivity), environmental parameters (relief), and ground water quality (nitrate contamination). It is therefore a multi-geosphere problem to be handled by a multidisciplinary team. The DRASTIC model remains the most popular technique in use for aquifer vulnerability assessments. The algorithm calculates an intrinsic vulnerability index based on a weighted addition of seven factors. In many studies, the method is subject to adjustments, especially in the factor weights, to meet the particularities of the studied regions. However, adjustments made by different techniques may lead to markedly different vulnerabilities and hence to insecurity in the selection of an appropriate technique. This paper reports the comparison of 5 weighting techniques, an enterprise not attempted before. The studied area comprises 26 aquifer systems located in Portugal. The tested approaches include: the Delphi consensus (original DRASTIC, used as reference), Sensitivity Analysis, Spearman correlations, Logistic Regression and Correspondence Analysis (used as adjustment techniques). In all cases but Sensitivity Analysis, adjustment techniques have privileged the factors representing soil characteristics, hydrologic settings, aquifer properties and environmental parameters, by leveling their weights to ≈4.4, and have subordinated the factors describing the aquifer media by downgrading their weights to ≈1.5. Logistic Regression predicts the highest and Sensitivity Analysis the lowest vulnerabilities. Overall, the vulnerability indices may be separated by a maximum value of 51 points. This represents an uncertainty of 2.5 vulnerability classes, because they are 20 points wide. Given this ambiguity, the selection of a weighting technique to integrate a vulnerability index may require additional
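
    For concreteness, the weighted addition the paper adjusts looks like this (cell ratings and the mapping of the leveled weights onto the seven factors are illustrative; only the Delphi weights are the published DRASTIC values):

    ```python
    import numpy as np

    # DRASTIC factors: Depth to water, net Recharge, Aquifer media, Soil media,
    # Topography, Impact of vadose zone, hydraulic Conductivity. Index = sum w*r.
    ratings = np.array([7, 8, 6, 5, 9, 4, 6])   # hypothetical cell ratings (1-10)
    weights = {
        "Delphi (original DRASTIC)": np.array([5, 4, 3, 2, 1, 5, 3]),
        "adjusted (illustrative)":   np.array([4.4, 4.4, 1.5, 4.4, 4.4, 1.5, 4.4]),
    }
    for name, w in weights.items():
        print(f"{name}: index = {ratings @ w:.1f}")
    ```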

  13. Piggyback technique in adult orthotopic liver transplantation: an analysis of 1067 liver transplants at a single center

    PubMed Central

    Nakamura, Noboru; Vaidya, Anil; Levi, David M.; Kato, Tomoaki; Nery, Jose R.; Madariaga, Juan R.; Molina, Enrique; Ruiz, Phillip; Gyamfi, Anthony; Tzakis, Andreas G.

    2006-01-01

    Background. Orthotopic liver transplantation (OLT) in adult patients has traditionally been performed using conventional caval reconstruction technique (CV) with veno-venous bypass. Recently, the piggyback technique (PB) without veno-venous bypass has begun to be widely used. The aim of this study was to assess the effect of routine use of PB on OLTs in adult patients. Patients and methods. A retrospective analysis was undertaken of 1067 orthotopic cadaveric whole liver transplantations in adult patients treated between June 1994 and July 2001. PB was used as the routine procedure. Patient demographics, factors including cold ischemia time (CIT), warm ischemia time (WIT), operative time, transfusions, blood loss, and postoperative results were assessed. The effects of clinical factors on graft survival were assessed by univariate and multivariate analyses.In all, 918 transplantations (86%) were performed with PB. Blood transfusion, WIT, and usage of veno-venous bypass were less with PB. Seventy-five (8.3%) cases with PB had refractory ascites following OLT (p=NS). Five venous outflow stenosis cases (0.54%) with PB were noted (p=NS). The liver and renal function during the postoperative periods was similar. Overall 1-, 3-, and 5-year patient survival rates were 85%, 78%, and 72% with PB. Univariate analysis showed that cava reconstruction method, CIT, WIT, amount of transfusion, length of hospital stay, donor age, and tumor presence were significant factors influencing graft survival. Multivariate analysis further reinforced the fact that CIT, donor age, amount of transfusion, and hospital stay were prognostic factors for graft survival. Conclusions. PB can be performed safely in the majority of adult OLTs. Results of OLT with PB are as same as for CV. Liver function, renal function, morbidity, mortality, and patient and graft survival are similar to CV. However, amount of transfusion, WIT, and use of veno-venous bypass are less with PB. PMID:18333273

  14. Spatial epidemiological techniques in cholera mapping and analysis towards a local scale predictive modelling

    NASA Astrophysics Data System (ADS)

    Rasam, A. R. A.; Ghazali, R.; Noor, A. M. M.; Mohd, W. M. N. W.; Hamid, J. R. A.; Bazlan, M. J.; Ahmad, N.

    2014-02-01

    Cholera spatial epidemiology is the study of the spread and control of the disease spatial pattern and epidemics. Previous studies have shown that multi-factorial causation such as human behaviour, ecology and other infectious risk factors influence the disease outbreaks. Thus, understanding spatial pattern and possible interrelationship factors of the outbreaks are crucial to be explored an in-depth study. This study focuses on the integration of geographical information system (GIS) and epidemiological techniques in exploratory analyzing the cholera spatial pattern and distribution in the selected district of Sabah. Spatial Statistic and Pattern tools in ArcGIS and Microsoft Excel software were utilized to map and analyze the reported cholera cases and other data used. Meanwhile, cohort study in epidemiological technique was applied to investigate multiple outcomes of the disease exposure. The general spatial pattern of cholera was highly clustered showed the disease spread easily at a place or person to others especially 1500 meters from the infected person and locations. Although the cholera outbreaks in the districts are not critical, it could be endemic at the crowded areas, unhygienic environment, and close to contaminated water. It was also strongly believed that the coastal water of the study areas has possible relationship with the cholera transmission and phytoplankton bloom since the areas recorded higher cases. GIS demonstrates a vital spatial epidemiological technique in determining the distribution pattern and elucidating the hypotheses generating of the disease. The next research would be applying some advanced geo-analysis methods and other disease risk factors for producing a significant a local scale predictive risk model of the disease in Malaysia.

  15. Analysis of the Harrier forebody/inlet design using computational techniques

    NASA Technical Reports Server (NTRS)

    Chow, Chuen-Yen

    1993-01-01

    Under the support of this Cooperative Agreement, computations of transonic flow past the complex forebody/inlet configuration of the AV-8B Harrier II have been performed. The actual aircraft configuration was measured and its surface and surrounding domain were defined using computational structured grids. The thin-layer Navier-Stokes equations were used to model the flow along with the Chimera embedded multi-grid technique. A fully conservative, alternating direction implicit (ADI), approximately-factored, partially flux-split algorithm was employed to perform the computation. An existing code was altered to conform with the needs of the study, and some special engine face boundary conditions were developed. The algorithm incorporated the Chimera technique and an algebraic turbulence model in order to deal with the embedded multi-grids and viscous governing equations. Comparison with experimental data has yielded good agreement for the simplifications incorporated into the analysis. The aim of the present research was to provide a methodology for the numerical solution of complex, combined external/internal flows. This is the first time-dependent Navier-Stokes solution for a geometry in which the fuselage and inlet share a wall. The results indicate the methodology used here is a viable tool for transonic aircraft modeling.

  16. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

    Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback with py-GCMS, aside from its destructive nature, is that this technique is relatively time-intensive in comparison to other techniques. Direct analysis in real-time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as the rapidity of analysis and minimal sample preparation afford a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize the intact paint chips from the vehicles, to ascertain whether the linear temperature gradient provided additional discriminatory information. All the paint samples could be discriminated based on the distinctive thermal desorption plots afforded by this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool to the forensic paint examiner.
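
    A schematic of the chemometric step (synthetic intensities standing in for binned DART-TOFMS spectra; not the study's data pipeline):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    vehicle_means = rng.normal(size=(4, 300))      # 4 clear coats, 300 m/z bins
    X = np.vstack([m + rng.normal(scale=0.1, size=(5, 300))
                   for m in vehicle_means])        # 5 replicates per vehicle

    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
    print(scores.round(1))  # replicates of the same vehicle cluster in PC space
    ```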

  17. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
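
    The normalized-contrast quantity at the heart of the technique can be sketched as follows (a toy one-dimensional cooling model, not NASA's calibrated simulation):

    ```python
    import numpy as np

    t = np.linspace(0.05, 5.0, 200)                # time after the flash, s
    T_sound = 1.0 / np.sqrt(t)                     # 1D cooling over sound material
    T_anom = T_sound * (1 + 0.3 * np.exp(-np.log(t) ** 2))  # toy anomaly response

    contrast = (T_anom - T_sound) / T_sound        # normalized contrast evolution
    i = contrast.argmax()
    # Peak contrast amplitude and time relate to anomaly depth/width, which the
    # technique maps to an equivalent flat-bottom hole via calibration.
    print(f"peak contrast {contrast[i]:.3f} at t = {t[i]:.2f} s")
    ```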

  18. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  19. Development of a sensitivity analysis technique for multiloop flight control systems

    NASA Technical Reports Server (NTRS)

    Vaillard, A. H.; Paduano, J.; Downing, D. R.

    1985-01-01

    This report presents the development and application of a sensitivity analysis technique for multiloop flight control systems. This analysis yields very useful information on the sensitivity of the relative-stability criteria of the control system to variations or uncertainties in the system and controller elements. The sensitivity analysis technique developed is based on the computation of the singular values and singular-value gradients of a feedback-control system. The method is applicable to single-input/single-output as well as multiloop continuous-control systems. Application to sampled-data systems is also explored. The sensitivity analysis technique was applied to a continuous yaw/roll damper stability augmentation system of a typical business jet, and the results show that the analysis is very useful in determining the system elements which have the largest effect on the relative stability of the closed-loop system. As a secondary product of the research reported here, relative-stability criteria based on the concept of singular values were explored.
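
    The core computation can be sketched in a few lines (random matrices as stand-ins for the closed-loop matrix and its parameter derivative): for a distinct singular value, the gradient with respect to a parameter p is u_i^T (dA/dp) v_i:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    A  = rng.normal(size=(4, 4))   # system matrix at one frequency (stand-in)
    dA = rng.normal(size=(4, 4))   # dA/dp for one uncertain element (stand-in)

    U, s, Vt = np.linalg.svd(A)
    grad = np.array([U[:, i] @ dA @ Vt[i] for i in range(len(s))])

    # Finite-difference check of the singular-value gradients; elements with
    # large |grad| dominate changes in the relative-stability measure.
    eps = 1e-7
    fd = (np.linalg.svd(A + eps * dA, compute_uv=False) - s) / eps
    print(np.allclose(grad, fd, atol=1e-5))   # True
    ```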

  20. New approaches to the analysis of complex samples using fluorescence lifetime techniques and organized media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertz, P.R.

    Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.

  1. Exploratory factor analysis of self-reported symptoms in a large, population-based military cohort

    PubMed Central

    2010-01-01

    Background US military engagements have consistently raised concern over the array of health outcomes experienced by service members postdeployment. Exploratory factor analysis has been used in studies of 1991 Gulf War-related illnesses, and may increase understanding of symptoms and health outcomes associated with current military conflicts in Iraq and Afghanistan. The objective of this study was to use exploratory factor analysis to describe the correlations among numerous physical and psychological symptoms in terms of a smaller number of unobserved variables or factors. Methods The Millennium Cohort Study collects extensive self-reported health data from a large, population-based military cohort, providing a unique opportunity to investigate the interrelationships of numerous physical and psychological symptoms among US military personnel. Exploratory factor analysis was used to examine the covariance structure of symptoms reported by approximately 50,000 cohort members during 2004-2006. Analyses incorporated 89 symptoms, including responses to several validated instruments embedded in the questionnaire. Techniques accommodated the categorical and sometimes incomplete nature of the survey data. Results A 14-factor model accounted for 60 percent of the total variance in symptoms data and included factors related to several physical, psychological, and behavioral constructs. A notable finding was that many factors appeared to load in accordance with symptom co-location within the survey instrument, highlighting the difficulty in disassociating the effects of question content, location, and response format on factor structure. Conclusions This study demonstrates the potential strengths and weaknesses of exploratory factor analysis to heighten understanding of the complex associations among symptoms. Further research is needed to investigate the relationship between

  2. Physical and Cognitive-Affective Factors Associated with Fatigue in Individuals with Fibromyalgia: A Multiple Regression Analysis

    ERIC Educational Resources Information Center

    Muller, Veronica; Brooks, Jessica; Tu, Wei-Mo; Moser, Erin; Lo, Chu-Ling; Chan, Fong

    2015-01-01

    Purpose: The main objective of this study was to determine the extent to which physical and cognitive-affective factors are associated with fibromyalgia (FM) fatigue. Method: A quantitative descriptive design using correlation techniques and multiple regression analysis. The participants consisted of 302 members of the National Fibromyalgia &…

  3. S-192 analysis: Conventional and special data processing techniques. [Michigan

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Morganstern, J.; Cicone, R.; Sarno, J.; Lambeck, P.; Malila, W.

    1975-01-01

    The author has identified the following significant results. Multispectral scanner data gathered over test sites in southeast Michigan were analyzed. This analysis showed the data to be somewhat deficient especially in terms of the limited signal range in most SDOs and also in regard to SDO-SDO misregistration. Further analysis showed that the scan line straightening algorithm increased the misregistration of the data. Data were processed using the conic format. The effects of such misregistration on classification accuracy was analyzed via simulation and found to be significant. Results of employing conventional as well as special, unresolved object, processing techniques were disappointing due, at least in part, to the limited signal range and noise content of the data. Application of a second class of special processing techniques, signature extension techniques, yielded better results. Two of the more basic signature extension techniques seemed to be useful in spite of the difficulties.

  4. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    NASA Technical Reports Server (NTRS)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
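
    As a small sketch of one of the indicators named above (the traditional Modal Assurance Criterion; the mode shapes here are random stand-ins, not a real model pair):

    ```python
    import numpy as np

    def mac(Phi_a, Phi_b):
        """Modal Assurance Criterion matrix between two mode-shape sets
        (columns are modes). MAC near 1 => likely the same physical mode."""
        num = np.abs(Phi_a.T @ Phi_b) ** 2
        den = np.outer(np.sum(Phi_a**2, axis=0), np.sum(Phi_b**2, axis=0))
        return num / den

    rng = np.random.default_rng(5)
    Phi1 = rng.normal(size=(100, 6))              # model A: 100 DOFs, 6 modes
    Phi2 = Phi1[:, [1, 0, 2, 3, 5, 4]] + 0.05 * rng.normal(size=(100, 6))

    M = mac(Phi1, Phi2)
    print("tracked pairing:", M.argmax(axis=1))   # recovers the permutation
    ```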

  5. The Infinitesimal Jackknife with Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Preacher, Kristopher J.; Jennrich, Robert I.

    2012-01-01

    The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both…

  6. Factor Retention in Exploratory Factor Analysis: A Comparison of Alternative Methods.

    ERIC Educational Resources Information Center

    Mumford, Karen R.; Ferron, John M.; Hines, Constance V.; Hogarty, Kristine Y.; Kromrey, Jeffery D.

    This study compared the effectiveness of 10 methods of determining the number of factors to retain in exploratory common factor analysis. The 10 methods included the Kaiser rule and a modified Kaiser criterion, 3 variations of parallel analysis, 4 regression-based variations of the scree procedure, and the minimum average partial procedure. The…
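
    One of the compared retention methods, Horn's parallel analysis, is compact enough to sketch (a generic implementation on synthetic data, not the study's simulation design):

    ```python
    import numpy as np

    def parallel_analysis(X, n_sims=200, quantile=0.95, seed=0):
        """Horn's parallel analysis: retain factors whose observed correlation-
        matrix eigenvalues exceed the chosen quantile of eigenvalues from
        random data of the same size."""
        rng = np.random.default_rng(seed)
        n, p = X.shape
        obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
        sim = np.empty((n_sims, p))
        for s in range(n_sims):
            R = np.corrcoef(rng.normal(size=(n, p)), rowvar=False)
            sim[s] = np.sort(np.linalg.eigvalsh(R))[::-1]
        return int(np.sum(obs > np.quantile(sim, quantile, axis=0)))

    rng = np.random.default_rng(1)
    F = rng.normal(size=(300, 3)) @ rng.normal(size=(3, 12))
    X = F + rng.normal(size=(300, 12))
    print("factors to retain:", parallel_analysis(X))   # expect ~3
    ```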

  7. [Development of sample pretreatment techniques-rapid detection coupling methods for food security analysis].

    PubMed

    Huang, Yichun; Ding, Weiwei; Zhang, Zhuomin; Li, Gongke

    2013-07-01

    This paper summarizes the recent developments of the rapid detection methods for food security, such as sensors, optical techniques, portable spectral analysis, enzyme-linked immunosorbent assay, portable gas chromatograph, etc. Additionally, the applications of these rapid detection methods coupled with sample pretreatment techniques in real food security analysis are reviewed. The coupling technique has the potential to provide references to establish the selective, precise and quantitative rapid detection methods in food security analysis.

  8. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  9. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Functional modeling techniques or object-oriented graphical representations, which are more useful to someone trying to understand the general design or high level requirements of a system? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function oriented software development, while taking advantage of the descriptive power available in object oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  10. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    PubMed

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis, with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.

  11. Noise-band factor analysis of cancer Fourier transform infrared evanescent-wave fiber optical (FTIR-FEW) spectra

    NASA Astrophysics Data System (ADS)

    Sukuta, Sydney; Bruch, Reinhard F.

    2002-05-01

    The goal of this study is to test the feasibility of using noise factor/eigenvector bands as general clinical analytical tools for diagnosis. We developed a new technique, Noise Band Factor Cluster Analysis (NBFCA), to diagnose benign tumors via their Fourier transform IR fiber optic evanescent wave spectral data for the first time. The middle IR region of human normal skin tissue and of benign and melanoma tumors was analyzed using this new diagnostic technique. Our results are not in full agreement with pathological classifications; hence, our approach could complement or improve these traditional classification schemes. Moreover, the use of NBFCA makes it much easier to delineate class boundaries, and this method therefore provides results with much higher certainty.

  12. Optimization Techniques for Analysis of Biological and Social Networks

    DTIC Science & Technology

    2012-03-28

    analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: implement the proposed algorithms, test and fine...alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms, and heuristics. Originally, clusters...systematic fashion under a unifying theoretical and algorithmic framework. Optimization, Complex Networks, Social Network Analysis, Computational

  13. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
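
    The flavor of a probabilistic power analysis can be conveyed with plain Monte Carlo sampling (a toy power model with hypothetical input distributions; the Glenn techniques are faster, more sophisticated methods applied to the full SPACE model):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n = 100_000
    area   = rng.normal(70.0, 1.0, n)      # array area, m^2 (hypothetical)
    eff    = rng.normal(0.14, 0.005, n)    # cell efficiency
    flux   = rng.normal(1367.0, 5.0, n)    # solar flux, W/m^2
    degrad = rng.uniform(0.90, 1.00, n)    # lifetime degradation factor

    power = area * eff * flux * degrad     # W; distribution, not a point value
    print("mean power  : %.0f W" % power.mean())
    print("5th-95th pct: %.0f - %.0f W" % tuple(np.percentile(power, [5, 95])))
    ```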

  14. Human Factors Virtual Analysis Techniques for NASA's Space Launch System Ground Support using MSFC's Virtual Environments Lab (VEL)

    NASA Technical Reports Server (NTRS)

    Searcy, Brittani

    2017-01-01

    Using virtual environments to assess complex large scale human tasks provides timely and cost effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance to human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. Through virtual analysis, engineers are able to determine whether a specific task can be safely performed by both a 5th-percentile (approx. 5 ft) female and a 95th-percentile (approx. 6 ft 1 in) male. In addition, the analysis helps identify any tools or other accommodations that may help complete the task. These assessments are critical for the safety of ground support engineers and for keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame by frame basis, while virtual reality gives the actor (person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need for human factors work on SLS and the benefits of analyzing tasks in NASA MSFC's VEL.

  15. Techniques for the Analysis of Human Movement.

    ERIC Educational Resources Information Center

    Grieve, D. W.; And Others

    This book presents the major analytical techniques that may be used in the appraisal of human movement. Chapter 1 is devoted to the photopgraphic analysis of movement with particular emphasis on cine filming. Cine film may be taken with little or no restriction on the performer's range of movement; information on the film is permanent and…

  16. Critical evaluation of sample pretreatment techniques.

    PubMed

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  17. "Feeling unsafe": a photovoice analysis of factors influencing physical activity behavior among Malaysian adolescents.

    PubMed

    Saimon, Rosalia; Choo, Wan Yuen; Bulgiba, Awang

    2015-03-01

    Understanding the factors influencing physical activity (PA) in the Asia-Pacific region is critical, given the high prevalence of inactivity in this area. The photovoice technique explores the types of PA and factors influencing PA among adolescents in Kuching, Sarawak. A total of 160 photographs were collected from participants (adolescents, n = 22, mean age = 14.27 ± 0.7 years, and parents, n = 8, mean age = 48 ± 6.8 years). Data analysis used constant comparison methods of a grounded theory. The Analysis Grid for Environments Linked to Obesity was used to categorize PA factors. Study findings were centered on the concept of safety, facilities, parental restriction, friends, cultural traits, media, community cohesiveness, and weather. The central theme was "feeling unsafe" when being outdoors. To promote PA behavior, provision of PA facilities needs to be supported by other programs that build on peer support, crime prevention, and traffic safety, together with other educational campaigns. © 2013 APJPH.

  18. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    NASA Astrophysics Data System (ADS)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
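
    Of the unsupervised techniques named, fuzzy c-means is compact enough to sketch from scratch (synthetic two-dimensional points standing in for ADCP-derived features):

    ```python
    import numpy as np

    def fuzzy_cmeans(X, c=3, m=2.0, iters=100, seed=0):
        """Minimal fuzzy c-means: soft memberships U (n x c) and centers V."""
        rng = np.random.default_rng(seed)
        U = rng.dirichlet(np.ones(c), size=len(X))       # random soft labels
        for _ in range(iters):
            W = U ** m
            V = (W.T @ X) / W.sum(axis=0)[:, None]       # weighted centers
            d = np.linalg.norm(X[:, None, :] - V[None], axis=2) + 1e-12
            U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)),
                             axis=2)
        return U, V

    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(k, 0.3, size=(50, 2)) for k in range(3)])
    U, V = fuzzy_cmeans(X)
    print("centers:\n", V.round(2))
    ```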

  19. Bayesian linkage and segregation analysis: factoring the problem.

    PubMed

    Matthysse, S

    2000-01-01

    Complex segregation analysis and linkage methods are mathematical techniques for the genetic dissection of complex diseases. They are used to delineate complex modes of familial transmission and to localize putative disease susceptibility loci to specific chromosomal locations. The computational problem of Bayesian linkage and segregation analysis is one of integration in high-dimensional spaces. In this paper, three available techniques for Bayesian linkage and segregation analysis are discussed: Markov Chain Monte Carlo (MCMC), importance sampling, and exact calculation. The contribution of each to the overall integration will be explicitly discussed.
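
    Of the three techniques, importance sampling is the simplest to illustrate (a toy one-parameter posterior, not a genetics likelihood; all values hypothetical):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    data = rng.normal(1.0, 1.0, size=20)             # hypothetical observations

    theta = rng.standard_t(df=4, size=50_000) * 2    # heavy-tailed proposal q
    log_q = stats.t.logpdf(theta / 2, df=4) - np.log(2)
    log_p = stats.norm.logpdf(theta, 0, 10)          # vague prior
    log_l = stats.norm.logpdf(data[:, None], theta, 1).sum(axis=0)

    lw = log_p + log_l - log_q
    w = np.exp(lw - lw.max())
    w /= w.sum()                                     # self-normalized weights
    print("posterior mean       :", (w * theta).sum())
    print("effective sample size:", 1.0 / np.sum(w**2))
    ```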

  20. A Sensitivity Analysis of Circular Error Probable Approximation Techniques

    DTIC Science & Technology

    1992-03-01

    The two most accurate techniques require numerical integration and can take several hours to run on a personal computer [2:1-2,4-6]. Some

  1. Application of Avco data analysis and prediction techniques (ADAPT) to prediction of sunspot activity

    NASA Technical Reports Server (NTRS)

    Hunter, H. E.; Amato, R. A.

    1972-01-01

    The results are presented of the application of Avco Data Analysis and Prediction Techniques (ADAPT) to the derivation of new algorithms for the prediction of future sunspot activity. The ADAPT-derived algorithms show a factor of 2 to 3 reduction in the expected 2-sigma errors in the estimates of the 81-day running average of the Zurich sunspot numbers. The report presents: (1) the best estimates for sunspot cycles 20 and 21, (2) a comparison of the ADAPT performance with conventional techniques, and (3) specific approaches to further reduction in the errors of estimated sunspot activity and to recovery of earlier sunspot historical data. The ADAPT programs are used both to derive a regression algorithm for predicting the entire 11-year sunspot cycle from the preceding two cycles and to derive extrapolation algorithms for extrapolating a given sunspot cycle based on any available portion of the cycle.

  2. Simple Assessment Techniques for Soil and Water. Environmental Factors in Small Scale Development Projects. Workshops.

    ERIC Educational Resources Information Center

    Coordination in Development, New York, NY.

    This booklet was produced in response to the growing need for reliable environmental assessment techniques that can be applied to small-scale development projects. The suggested techniques emphasize low-technology environmental analysis. Although these techniques may lack precision, they can be extremely valuable in helping to assure the success…

  3. Using Separable Nonnegative Matrix Factorization Techniques for the Analysis of Time-Resolved Raman Spectra

    NASA Astrophysics Data System (ADS)

    Luce, R.; Hildebrandt, P.; Kuhlmann, U.; Liesen, J.

    2016-09-01

    The key challenge of time-resolved Raman spectroscopy is the identification of the constituent species and the analysis of the kinetics of the underlying reaction network. In this work we present an integral approach that allows for determining both the component spectra and the rate constants simultaneously from a series of vibrational spectra. It is based on an algorithm for non-negative matrix factorization which is applied to the experimental data set following a few pre-processing steps. As a prerequisite for physically unambiguous solutions, each component spectrum must include one vibrational band that does not significantly interfere with vibrational bands of other species. The approach is applied to synthetic "experimental" spectra derived from model systems comprising a set of species with component spectra differing with respect to their degree of spectral interferences and signal-to-noise ratios. In each case, the species involved are connected via monomolecular reaction pathways. The potential and limitations of the approach for recovering the respective rate constants and component spectra are discussed.

  4. Using Separable Nonnegative Matrix Factorization Techniques for the Analysis of Time-Resolved Raman Spectra.

    PubMed

    Luce, Robert; Hildebrandt, Peter; Kuhlmann, Uwe; Liesen, Jörg

    2016-09-01

    The key challenge of time-resolved Raman spectroscopy is the identification of the constituent species and the analysis of the kinetics of the underlying reaction network. In this work we present an integral approach that allows for determining both the component spectra and the rate constants simultaneously from a series of vibrational spectra. It is based on an algorithm for nonnegative matrix factorization that is applied to the experimental data set following a few pre-processing steps. As a prerequisite for physically unambiguous solutions, each component spectrum must include one vibrational band that does not significantly interfere with the vibrational bands of other species. The approach is applied to synthetic "experimental" spectra derived from model systems comprising a set of species with component spectra differing with respect to their degree of spectral interferences and signal-to-noise ratios. In each case, the species involved are connected via monomolecular reaction pathways. The potential and limitations of the approach for recovering the respective rate constants and component spectra are discussed. © The Author(s) 2016.
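
    A rough sketch of the data model (generic scikit-learn NMF as a stand-in for the authors' separability-based algorithm; synthetic A -> B kinetics with non-overlapping marker bands, per the prerequisite stated above):

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    # Synthetic time-resolved spectra: D (times x wavenumbers) = C @ S, with
    # first-order kinetics A -> B and two nonnegative component spectra.
    t = np.linspace(0, 10, 60)[:, None]
    CA = np.exp(-0.5 * t)
    C = np.hstack([CA, 1 - CA])                        # concentration profiles
    x = np.linspace(0, 1, 400)
    S = np.vstack([np.exp(-((x - 0.3) / 0.02) ** 2),   # each spectrum keeps one
                   np.exp(-((x - 0.7) / 0.02) ** 2)])  # non-interfering band
    D = C @ S + 0.01 * np.abs(np.random.default_rng(8).normal(size=(60, 400)))

    model = NMF(n_components=2, init="nndsvd", max_iter=1000)
    C_hat = model.fit_transform(D)    # recovered kinetics (up to scale/order)
    S_hat = model.components_         # recovered component spectra
    print(C_hat.shape, S_hat.shape)
    ```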

  5. Some failure modes and analysis techniques for terrestrial solar cell modules

    NASA Technical Reports Server (NTRS)

    Shumka, A.; Stern, K. H.

    1978-01-01

    Analysis data are presented on failed/defective silicon solar cell modules of various types produced by different manufacturers. The failure modes (e.g., internal short and open circuits, output power degradation, isolation resistance degradation, etc.) are discussed in detail and in many cases related to the type of technology used in the manufacture of the modules; wherever applicable, appropriate corrective actions are recommended. Consideration is also given to some failure analysis techniques that are applicable to such modules, including X-ray radiography, capacitance measurement, cell shunt resistance measurement by the shadowing technique, a steady-state illumination test station for module performance evaluation, laser scanning techniques, and the SEM.

  6. Reliability analysis of a robotic system using hybridized technique

    NASA Astrophysics Data System (ADS)

    Kumar, Naveen; Komal; Lather, J. S.

    2017-09-01

    In this manuscript, the reliability of a robotic system has been analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads, as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, the computed reliability parameters have a wide range of predictions. Therefore, the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique in which fuzzy set theory is utilized to quantify uncertainties, a fault tree is utilized for system modeling, the lambda-tau method is utilized to formulate mathematical expressions for failure/repair rates of the system, and a genetic algorithm is utilized to solve the established nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow an exponential distribution, i.e., constant failure rates. Sensitivity analysis is also performed, and the impact on system mean time between failures (MTBF) is addressed by varying the other reliability parameters. Based on the analysis, some influential suggestions are given to improve the system performance.

  7. Confirmatory factors analysis of science teacher leadership in the Thailand world-class standard schools

    NASA Astrophysics Data System (ADS)

    Thawinkarn, Dawruwan

    2018-01-01

    This research aims to analyze factors of science teacher leadership in the Thailand World-Class Standard Schools. The research instrument was a five-point rating-scale questionnaire with reliability 0.986. The sample group included 500 science teachers from World-Class Standard Schools who had been selected by using the stratified random sampling technique. Factor analysis of science teacher leadership in the Thailand World-Class Standard Schools was conducted using Mplus for Windows. The results are as follows: confirmatory factor analysis of science teacher leadership in the Thailand World-Class Standard Schools revealed that the model fit the empirical data. The fit index values were χ2 = 105.655, df = 88, P-value = 0.086, TLI = 0.997, CFI = 0.999, RMSEA = 0.022, and SRMR = 0.019. The factor loadings of science teacher leadership were positive, with statistical significance at the 0.01 level, and the loadings of the six factors ranged from 0.871 to 0.996. The highest factor loading was the professional learning community, followed by child-centered instruction, participation in development, role model in teaching, transformational leadership, and self-development, with factor loadings of 0.996, 0.928, 0.911, 0.907, 0.901, and 0.871, respectively. The reliability of each factor was 99.1%, 86.0%, 83.0%, 82.2%, 81.0%, and 75.8%, respectively.

  8. Three Techniques for Task Analysis: Examples from the Nuclear Utilities.

    ERIC Educational Resources Information Center

    Carlisle, Kenneth E.

    1984-01-01

    Discusses three task analysis techniques utilized at the Palo Verde Nuclear Generating Station to review training programs: analysis of (1) job positions, (2) procedures, and (3) instructional presentations. All of these include task breakdown, relationship determination, and task restructuring. (MBR)

  9. Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques

    USGS Publications Warehouse

    Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.

    2013-01-01

    Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.

  10. Some dissociating factors in the analysis of structural and functional progressive damage in open-angle glaucoma.

    PubMed

    Hudson, C J W; Kim, L S; Hancock, S A; Cunliffe, I A; Wild, J M

    2007-05-01

    To identify the presence and origin of any "dissociating factors" inherent to the techniques for evaluating progression that mask the relationship between structural and functional progression in open-angle glaucoma (OAG). 23 patients (14 with OAG and 9 with ocular hypertension (OHT)) who had received serial Heidelberg Retina Tomograph (HRT II) and Humphrey Field Analyser (HFA) examinations for ≥5 years (mean 78.4 months (SD 9.5), range 60-101 months) were identified. Evidence of progressive disease was retrospectively evaluated in one eye of each patient using the Topographic Change Analysis (TCA) and Glaucoma Progression Analysis (GPA) for the HRT II and HFA, respectively. Six patients were stable by both techniques; four exhibited both structural and functional progression; seven exhibited structural progression only; and six showed functional progression only. Three types of dissociating factors were identified. TCA failed to identify progressive structural damage in the presence of advanced optic nerve head damage. GPA failed to identify progressive functional damage at stimulus locations with sensitivities exhibiting test-retest variability beyond the maximum stimulus luminance of the perimeter, and where a perimetric learning effect was apparent. The three dissociating factors accounted for nine of the 13 patients who exhibited a lack of concordance between structural and functional progressive damage.

  11. Analysis of Factors Related to Hypopituitarism in Patients with Nonsellar Intracranial Tumor.

    PubMed

    Lu, Song-Song; Gu, Jian-Jun; Luo, Xiao-Hong; Zhang, Jian-He; Wang, Shou-Sen

    2017-09-01

    Previous studies have suggested that postoperative hypopituitarism in patients with nonsellar intracranial tumors is caused by traumatic surgery. However, with the development of minimally invasive and precise neurosurgical techniques, the degree of injury to brain tissue has been reduced significantly, especially for parenchymal tumors. Therefore, understanding preexisting hypopituitarism and related risk factors can improve perioperative management for patients with nonsellar intracranial tumors. Chart data were collected retrospectively from 83 patients with nonsellar intracranial tumors admitted to our hospital from May 2014 to April 2015. Pituitary function of each subject was determined based on results of preoperative serum pituitary hormone analysis. Univariate and multivariate logistic regression methods were used to analyze relationships between preoperative hypopituitarism and factors including age, sex, history of hypertension and secondary epilepsy, course of disease, tumor mass effect, site of tumor, intracranial pressure (ICP), cerebrospinal fluid content, and pituitary morphology. A total of 30 patients (36.14%) presented with preoperative hypopituitarism in either 1 axis or multiple axes; 23 (27.71%) were affected in 1 axis, and 7 (8.43%) were affected in multiple axes. Univariate analysis showed that risk factors for preoperative hypopituitarism in patients with a nonsellar intracranial tumor include an acute or subacute course (≤3 months), intracranial hypertension (ICP >200 mm H2O), and mass effect (P < 0.05). Multivariate logistic regression analysis showed that mass effect is an independent risk factor for preoperative hypopituitarism in patients with nonsellar intracranial tumors (P < 0.05; odds ratio, 3.197). Prevalence of hypopituitarism is high in patients with nonsellar intracranial tumors. The occurrence of hypopituitarism is correlated with factors including an acute or subacute course (≤3 months), intracranial hypertension (ICP >200 mm H2O), and mass effect.
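
    The univariate and multivariate screening described above is ordinary logistic regression, with exponentiated coefficients read as odds ratios. A minimal sketch with statsmodels on synthetic data (predictor names and effect sizes are illustrative, not the study's dataset):

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 83  # cohort size from the abstract
      # Synthetic binary predictors: mass effect, intracranial hypertension, acute course.
      X = rng.integers(0, 2, size=(n, 3)).astype(float)
      true_logit = -1.5 + 1.16 * X[:, 0]  # exp(1.16) ~ 3.2, near the reported odds ratio
      y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

      result = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
      print(np.exp(result.params))  # exponentiated coefficients = odds ratios
      print(result.pvalues)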

  12. Micropowder collecting technique for stable isotope analysis of carbonates.

    PubMed

    Sakai, Saburo; Kodan, Tsuyoshi

    2011-05-15

    Micromilling is a conventional technique used in the analysis of the isotopic composition of geological materials, which improves the spatial resolution of sample collection for analysis. However, a problem still remains concerning the recovery ratio of the milled sample. We constructed a simple apparatus consisting of a vacuum pump, a sintered metal filter, electrically conductive rubber stopper and a stainless steel tube for transferring the milled powder into a reaction vial. In our preliminary experiments on carbonate powder, we achieved a rapid recovery of 5 to 100 µg of carbonate with a high recovery ratio (>90%). This technique shortens the sample preparation time, improves the recovery ratio, and homogenizes the sample quantity, which, in turn, improves the analytical reproducibility. Copyright © 2011 John Wiley & Sons, Ltd.

  13. LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation

    NASA Technical Reports Server (NTRS)

    Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.

    1997-01-01

    This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.

  14. Numerical analysis of thermal drilling technique on titanium sheet metal

    NASA Astrophysics Data System (ADS)

    Kumar, R.; Hynes, N. Rajesh Jesudoss

    2018-05-01

    Thermal drilling is a technique used in the drilling of sheet metal for various applications. It involves rotating a conical tool at high speed to drill the sheet metal, forming a hole with a bush below the surface of the sheet. This article investigates the finite element analysis of thermal drilling on Ti6Al4V alloy sheet metal. The analysis was carried out by means of the DEFORM-3D simulation software to simulate the performance characteristics of the thermal drilling technique. Because high-temperature deformation dominates in this technique, output performance characteristics that are difficult to measure experimentally can be successfully obtained by the finite element method. Therefore, the modeling and simulation of thermal drilling is an essential tool to predict the strain rate, stress distribution and temperature of the workpiece.

  15. The balance sheet technique. Volume I. The balance sheet analysis technique for preconstruction review of airports and highways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaBelle, S.J.; Smith, A.E.; Seymour, D.A.

    1977-02-01

    The technique applies equally well to new or existing airports. The importance of accurate accounting of emissions cannot be overstated. The regional oxidant modelling technique used in conjunction with a balance sheet review must be a proportional reduction technique. This type of emission balancing presumes equality of all sources in the analysis region. The technique can be applied successfully in the highway context, either in planning at the system level or looking only at projects individually. The project-by-project reviews could be used to examine each project in the same way as the airport projects are examined for their impact on regional desired emission levels. The primary limitation of this technique is that it should not be used when simulation models have been used for regional oxidant air quality. In the case of highway projects, the balance sheet technique might appear to be limited; the real limitations are in the transportation planning process. That planning process is not well-suited to the needs of air quality forecasting. If the transportation forecasting techniques are insensitive to change in the variables that affect HC emissions, then no internal emission trade-offs can be identified, and the initial highway emission forecasts are themselves suspect. In general, the balance sheet technique is limited by the quality of the data used in the review. Additionally, the technique does not point out effective trade-off strategies, nor does it indicate when it might be worthwhile to ignore small amounts of excess emissions. Used in the context of regional air quality plans based on proportional reduction models, the balance sheet analysis technique shows promise as a useful method by state or regional reviewing agencies.

  16. Flow Injection Technique for Biochemical Analysis with Chemiluminescence Detection in Acidic Media

    PubMed Central

    Chen, Jing; Fang, Yanjun

    2007-01-01

    A review with 90 references is presented to show the development of acidic chemiluminescence methods for biochemical analysis by use of the flow injection technique in the last 10 years. A brief discussion of both the chemiluminescence and the flow injection technique is given. The proposed methods for biochemical analysis are described and compared according to the chemiluminescence system used.

  17. Implementation of numerical simulation techniques in analysis of the accidents in complex technological systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klishin, G.S.; Seleznev, V.E.; Aleoshin, V.V.

    1997-12-31

    Gas industry enterprises such as main pipelines, compressor gas transfer stations, and gas extracting complexes belong to the energy intensive industry. Accidents there can result in catastrophes and great social, environmental and economic losses. Annually, according to official data, several dozen large accidents take place at pipelines in the USA and Russia. That is why prevention of accidents, analysis of the mechanisms of their development and prediction of their possible consequences are acute and important tasks nowadays. The causes of accidents are usually of a complicated character and can be presented as a complex combination of natural, technical and human factors. Mathematical and computer simulations are safe, rather effective and comparatively inexpensive methods of accident analysis. They make it possible to analyze different mechanisms of a failure occurrence and development, to assess its consequences and give recommendations to prevent it. Besides investigation of failure cases, numerical simulation techniques play an important role in the treatment of the diagnostics results of the objects and in further construction of mathematical prognostic simulations of the object behavior in the period of time between two inspections. While solving diagnostics tasks and in the analysis of failure cases, the techniques of theoretical mechanics, of the qualitative theory of differential equations, of mechanics of a continuous medium, of chemical macro-kinetics and optimizing techniques are implemented in the Conversion Design Bureau #5 (DB#5). Both universal and special numerical techniques and software (SW) are being developed in DB#5 for the solution of such tasks. Almost all of them are calibrated on the calculations of the simulated and full-scale experiments performed at the VNIIEF and MINATOM testing sites. It is worth noting that in the long years of work there has been established a fruitful and

  18. DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES

    EPA Science Inventory

    A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (pUC19 plasmid from E. coli). DN...

  19. Phylogenetic Factor Analysis.

    PubMed

    Tolkoff, Max R; Alfaro, Michael E; Baele, Guy; Lemey, Philippe; Suchard, Marc A

    2018-05-01

    Phylogenetic comparative methods explore the relationships between quantitative traits adjusting for shared evolutionary history. This adjustment often occurs through a Brownian diffusion process along the branches of the phylogeny that generates model residuals or the traits themselves. For high-dimensional traits, inferring all pair-wise correlations within the multivariate diffusion is limiting. To circumvent this problem, we propose phylogenetic factor analysis (PFA) that assumes a small unknown number of independent evolutionary factors arise along the phylogeny and these factors generate clusters of dependent traits. Set in a Bayesian framework, PFA provides measures of uncertainty on the factor number and groupings, combines both continuous and discrete traits, integrates over missing measurements and incorporates phylogenetic uncertainty with the help of molecular sequences. We develop Gibbs samplers based on dynamic programming to estimate the PFA posterior distribution, over 3-fold faster than for multivariate diffusion and a further order-of-magnitude more efficiently in the presence of latent traits. We further propose a novel marginal likelihood estimator for previously impractical models with discrete data and find that PFA also provides a better fit than multivariate diffusion in evolutionary questions in columbine flower development, placental reproduction transitions and triggerfish fin morphometry.

  20. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and particle swarm optimization techniques.

    PubMed

    Chen, Shyi-Ming; Manalu, Gandhi Maruli Tua; Pan, Jeng-Shyang; Liu, Hsiang-Chuan

    2013-06-01

    In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and particle swarm optimization (PSO) techniques. First, we fuzzify the historical training data of the main factor and the secondary factor, respectively, to form two-factors second-order fuzzy logical relationships. Then, we group the two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Then, we obtain the optimal weighting vector for each fuzzy-trend logical relationship group by using PSO techniques to perform the forecasting. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index and the NTD/USD exchange rates. The experimental results show that the proposed method gets better forecasting performance than the existing methods.
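
    The optimization step can be illustrated with a bare-bones particle swarm optimizer. A minimal sketch in Python that minimizes a generic least-squares weighting objective (the constants and the toy objective are illustrative; the paper's fuzzy-trend objective is more involved):

      import numpy as np

      def pso(objective, dim, n_particles=30, iters=200, seed=0):
          """Minimal particle swarm optimizer; returns the best position found."""
          rng = np.random.default_rng(seed)
          w, c1, c2 = 0.7, 1.5, 1.5                 # inertia and acceleration constants
          x = rng.uniform(0, 1, (n_particles, dim))  # candidate weighting vectors
          v = np.zeros_like(x)
          pbest = x.copy()
          pbest_val = np.array([objective(p) for p in x])
          gbest = pbest[pbest_val.argmin()].copy()
          for _ in range(iters):
              r1, r2 = rng.random((2, n_particles, dim))
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
              x = x + v
              vals = np.array([objective(p) for p in x])
              improved = vals < pbest_val
              pbest[improved], pbest_val[improved] = x[improved], vals[improved]
              gbest = pbest[pbest_val.argmin()].copy()
          return gbest

      # Toy objective: squared error of a weighted combination against a target series.
      target = np.array([1.0, 2.0, 3.0])
      components = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
      best_w = pso(lambda w_: ((components @ w_ - target) ** 2).sum(), dim=2)
      print(best_w)  # should approach [1, 1], since target = 1*col0 + 1*col1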

  1. Changes in frontal plane dynamics and the loading response phase of the gait cycle are characteristic of severe knee osteoarthritis: application of a multidimensional analysis technique.

    PubMed

    Astephen, J L; Deluzio, K J

    2005-02-01

    Osteoarthritis of the knee is related to many correlated mechanical factors that can be measured with gait analysis. Gait analysis results in large data sets. The analysis of these data is difficult due to the correlated, multidimensional nature of the measures. A multidimensional model that uses two multivariate statistical techniques, principal component analysis and discriminant analysis, was used to discriminate between the gait patterns of the normal subject group and the osteoarthritis subject group. Nine time varying gait measures and eight discrete measures were included in the analysis. All interrelationships between and within the measures were retained in the analysis. The multidimensional analysis technique successfully separated the gait patterns of normal and knee osteoarthritis subjects with a misclassification error rate of <6%. The most discriminatory feature described a static and dynamic alignment factor. The second most discriminatory feature described a gait pattern change during the loading response phase of the gait cycle. The interrelationships between gait measures and between the time instants of the gait cycle can provide insight into the mechanical mechanisms of pathologies such as knee osteoarthritis. These results suggest that changes in frontal plane loading and alignment and the loading response phase of the gait cycle are characteristic of severe knee osteoarthritis gait patterns. Subsequent investigations earlier in the disease process may suggest the importance of these factors to the progression of knee osteoarthritis.
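
    The two-technique pipeline, principal component analysis for dimension reduction followed by discriminant analysis for group separation, is straightforward to reproduce. A minimal sketch with scikit-learn on synthetic gait-like features (not the study's data):

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(1)
      # Synthetic "gait measures": 40 normal and 40 OA subjects, 17 correlated features.
      normal = rng.normal(0.0, 1.0, (40, 17))
      oa = rng.normal(0.8, 1.0, (40, 17))    # shifted mean mimics disease-related change
      X = np.vstack([normal, oa])
      y = np.array([0] * 40 + [1] * 40)

      # Retain a few principal components, then discriminate in the reduced space.
      clf = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
      err = 1.0 - cross_val_score(clf, X, y, cv=5).mean()
      print(f"misclassification rate ~ {err:.2%}")  # low error, analogous to the <6% reported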

  2. A dynamic mechanical analysis technique for porous media

    PubMed Central

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the approach by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 – 14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in

  3. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    48 CFR 215.404-1, Proposal analysis techniques. Federal Acquisition Regulations System, Defense Acquisition Regulations (2010-10-01 edition). The retrieved fragment concerns the reliability of the contractor's estimating and accounting systems. [63 FR 55040, Oct. 14, 1998, as amended at 71 FR 69494...]

  4. An improved technique for the 2H/1H analysis of urines from diabetic volunteers

    USGS Publications Warehouse

    Coplen, T.B.; Harper, I.T.

    1994-01-01

    The H2-H2O ambient-temperature equilibration technique for the determination of 2H/1H ratios in urinary waters from diabetic subjects provides improved accuracy over the conventional Zn reduction technique. The standard deviation, approximately 1-2‰, is at least a factor of three better than that of the Zn reduction technique on urinary waters from diabetic volunteers. Experiments with pure water and solutions containing glucose, urea and albumen indicate that there is no measurable bias in the hydrogen equilibration technique.

  5. Comparative evaluation of power factor improvement techniques for squirrel cage induction motors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spee, R.; Wallace, A.K.

    1992-04-01

    This paper describes the results obtained from a series of tests of relatively simple methods of improving the power factor of squirrel-cage induction motors. The methods, which are evaluated under controlled laboratory conditions for a 10-hp, high-efficiency motor, include terminal voltage reduction; terminal static capacitors; and a "floating" winding with static capacitors. The test results are compared with equivalent circuit model predictions that are then used to identify optimum conditions for each of the power factor improvement techniques compared with the basic induction motor. Finally, the relative economic value, and the implications of component failures, of the three methods are discussed.

  6. Performance Analysis of Ranging Techniques for the KPLO Mission

    NASA Astrophysics Data System (ADS)

    Park, Sungjoon; Moon, Sangman

    2018-03-01

    In this study, the performance of ranging techniques for the Korea Pathfinder Lunar Orbiter (KPLO) space communication system is investigated. KPLO is the first lunar mission of Korea, and pseudo-noise (PN) ranging will be used to support the mission along with sequential ranging. We compared the performance of both ranging techniques using the criteria of accuracy, acquisition probability, and measurement time. First, we investigated the end-to-end accuracy error of a ranging technique incorporating all sources of errors such as from ground stations and the spacecraft communication system. This study demonstrates that increasing the clock frequency of the ranging system is not required when the dominant factor of accuracy error is independent of the thermal noise of the ranging technique being used in the system. Based on the understanding of ranging accuracy, the measurement time of PN and sequential ranging are further investigated and compared, while both techniques satisfied the accuracy and acquisition requirements. We demonstrated that PN ranging performed better than sequential ranging in the signal-to-noise ratio (SNR) regime where KPLO will be operating, and we found that the T2B (weighted-voting balanced Tausworthe, voting v = 2) code is the best choice among the PN codes available for the KPLO mission.

  7. A single factor underlies the metabolic syndrome: a confirmatory factor analysis.

    PubMed

    Pladevall, Manel; Singal, Bonita; Williams, L Keoki; Brotons, Carlos; Guyer, Heidi; Sadurni, Josep; Falces, Carles; Serrano-Rios, Manuel; Gabriel, Rafael; Shaw, Jonathan E; Zimmet, Paul Z; Haffner, Steven

    2006-01-01

    Confirmatory factor analysis (CFA) was used to test the hypothesis that the components of the metabolic syndrome are manifestations of a single common factor. Three different datasets were used to test and validate the model. The Spanish and Mauritian studies included 207 men and 203 women and 1,411 men and 1,650 women, respectively. A third analytical dataset including 847 men was obtained from a previously published CFA of a U.S. population. The one-factor model included the metabolic syndrome core components (central obesity, insulin resistance, blood pressure, and lipid measurements). We also tested an expanded one-factor model that included uric acid and leptin levels. Finally, we used CFA to compare the goodness of fit of one-factor models with the fit of two previously published four-factor models. The simplest one-factor model showed the best goodness-of-fit indexes (comparative fit index 1, root mean-square error of approximation 0.00). Comparisons of one-factor with four-factor models in the three datasets favored the one-factor model structure. The selection of variables to represent the different metabolic syndrome components and model specification explained why previous exploratory and confirmatory factor analysis, respectively, failed to identify a single factor for the metabolic syndrome. These analyses support the current clinical definition of the metabolic syndrome, as well as the existence of a single factor that links all of the core components.

  8. Probability techniques for reliability analysis of composite materials

    NASA Technical Reports Server (NTRS)

    Wetherhold, Robert C.; Ucci, Anthony M.

    1994-01-01

    Traditional design approaches for composite materials have employed deterministic criteria for failure analysis. New approaches are required to predict the reliability of composite structures since strengths and stresses may be random variables. This report will examine and compare methods used to evaluate the reliability of composite laminae. The two types of methods that will be evaluated are fast probability integration (FPI) methods and Monte Carlo methods. In these methods, reliability is formulated as the probability that an explicit function of random variables is less than a given constant. Using failure criteria developed for composite materials, a function of design variables can be generated which defines a 'failure surface' in probability space. A number of methods are available to evaluate the integration over the probability space bounded by this surface; this integration delivers the required reliability. The methods which will be evaluated are: the first order, second moment FPI methods; second order, second moment FPI methods; the simple Monte Carlo; and an advanced Monte Carlo technique which utilizes importance sampling. The methods are compared for accuracy, efficiency, and for the conservatism of the reliability estimation. The methodology involved in determining the sensitivity of the reliability estimate to the design variables (strength distributions) and importance factors is also presented.
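
    The simple Monte Carlo estimate described above amounts to sampling the random variables and counting limit-state violations. A minimal sketch for a stress-strength formulation (the distribution parameters are illustrative):

      import numpy as np

      rng = np.random.default_rng(42)
      n = 1_000_000

      # Illustrative lamina model: failure when applied stress exceeds material strength.
      strength = rng.normal(loc=600.0, scale=40.0, size=n)   # MPa
      stress = rng.normal(loc=450.0, scale=50.0, size=n)     # MPa

      p_fail = np.mean(stress > strength)  # P(g < 0) for limit state g = strength - stress
      print(f"reliability ~ {1.0 - p_fail:.5f}")
      # Analytic check: g is normal with mean 150 and sd sqrt(40^2 + 50^2) ~ 64,
      # so P(g < 0) = Phi(-150/64) ~ 0.0096.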

  9. Anthropometric data reduction using confirmatory factor analysis.

    PubMed

    Rohani, Jafri Mohd; Olusegun, Akanbi Gabriel; Rani, Mat Rebi Abdul

    2014-01-01

    The unavailability of anthropometric data, especially in developing countries, has remained a limiting factor in the design of learning facilities with sufficient ergonomic consideration. Attempts to use anthropometric data from developed countries have led to the provision of school facilities unfit for the users. The purpose of this paper is to use factor analysis to investigate the suitability of the collected anthropometric data as a database for school design in Nigerian tertiary institutions. Anthropometric data were collected from 288 male students, aged 18-25 years, in a Federal Polytechnic in the North-West of Nigeria. Nine vertical anthropometric dimensions related to heights were collected using conventional traditional equipment. Exploratory factor analysis was used to categorize the variables into a model consisting of two factors. Thereafter, confirmatory factor analysis was used to investigate the fit of the data to the proposed model. A just-identified model, made of two factors, each with three variables, was developed. The variables within the model accounted for 81% of the total variation of the entire data. The model was found to demonstrate adequate validity and reliability. Various measuring indices were used to verify that the model fits the data properly. The final model reveals that stature height and eye height sitting were the most stable variables for designs that have to do with standing and sitting constructs. The study has shown the application of factor analysis in anthropometric data analysis. It highlights the relevance of these statistical tools for investigating variability in anthropometric data from diverse populations, tools which have not been widely used in analyzing previous anthropometric data. The collected data are therefore suitable for use in designing for Nigerian students.

  10. Using Linear Regression To Determine the Number of Factors To Retain in Factor Analysis and the Number of Issues To Retain in Delphi Studies and Other Surveys.

    ERIC Educational Resources Information Center

    Jurs, Stephen; And Others

    The scree test and its linear regression technique are reviewed, and results of its use in factor analysis and Delphi data sets are described. The scree test was originally a visual approach for making judgments about eigenvalues, which considered the relationships of the eigenvalues to one another as well as their actual values. The graph that is…
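
    The scree procedure starts from the eigenvalues of the correlation matrix. A minimal sketch that extracts them for synthetic two-factor data and applies the simple Kaiser criterion alongside the scree idea (the paper's regression variant instead fits lines to segments of the eigenvalue plot):

      import numpy as np

      rng = np.random.default_rng(7)
      # Synthetic data with two underlying factors driving eight observed variables.
      n = 500
      factors = rng.normal(size=(n, 2))
      loadings = rng.uniform(0.5, 0.9, (2, 8))
      X = factors @ loadings + rng.normal(scale=0.6, size=(n, 8))

      eigvals = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]  # descending order
      print(np.round(eigvals, 2))
      print("Kaiser rule retains:", int((eigvals > 1.0).sum()), "factors")
      # The scree/regression approach instead looks for the 'elbow' where successive
      # eigenvalues flatten into an approximately straight line.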

  11. Misplacement of left-sided double-lumen tubes into the right mainstem bronchus: incidence, risk factors and blind repositioning techniques.

    PubMed

    Seo, Jeong-Hwa; Bae, Jun-Yeol; Kim, Hyun Joo; Hong, Deok Man; Jeon, Yunseok; Bahk, Jae-Hyon

    2015-10-28

    Double-lumen endobronchial tubes (DLTs) are commonly advanced into the mainstem bronchus either blindly or under fiberoptic bronchoscopic guidance. However, blind advancement may result in misplacement of left-sided DLTs into the right bronchus. Therefore, the incidence, risk factors, and blind repositioning techniques for right bronchial misplacement of left-sided DLTs were investigated. This was an observational cohort study performed on a data depository collected consecutively from patients who underwent intubation with left-sided DLTs over 2 years. Patients' clinical and anatomical characteristics were analyzed to investigate risk factors for DLT misplacement with logistic regression analysis. Moreover, when DLTs were misplaced into the right bronchus, the bronchial tube was withdrawn into the trachea and blindly readvanced without rotation, or with 90° or 180° counterclockwise rotation while the patient's head was turned right. DLTs were inadvertently advanced into the right bronchus in 48 of 1135 (4.2%) patients. DLT misplacements occurred more frequently in females, in patients of short stature or with narrow trachea and bronchi, and when small-sized DLTs were used. All of these factors were significantly intercorrelated (P < 0.001). In 40 of the 48 (83.3%) patients, blind repositioning was successful. Smaller left-sided DLTs were more frequently misplaced into the right mainstem bronchus than larger DLTs. Moreover, we were usually able to reposition the misplaced DLTs into the left bronchus by using the blind techniques. ClinicalTrials.gov Identifier: NCT01371773.

  12. Quadrant Analysis as a Strategic Planning Technique in Curriculum Development and Program Marketing.

    ERIC Educational Resources Information Center

    Lynch, James; And Others

    1996-01-01

    Quadrant analysis, a widely-used research technique, is suggested as useful in college or university strategic planning. The technique uses consumer preference data and produces information suitable for a wide variety of curriculum and marketing decisions. Basic quadrant analysis design is described, and advanced variations are discussed, with…

  13. Medical University admission test: a confirmatory factor analysis of the results.

    PubMed

    Luschin-Ebengreuth, Marion; Dimai, Hans P; Ithaler, Daniel; Neges, Heide M; Reibnegger, Gilbert

    2016-05-01

    The Graz Admission Test has been applied since the academic year 2006/2007. The validity of the Test was demonstrated by a significant improvement of study success and a significant reduction of dropout rate. The purpose of this study was a detailed analysis of the internal correlation structure of the various components of the Graz Admission Test. In particular, the question investigated was whether or not the various test parts constitute a suitable construct which might be designated as "Basic Knowledge in Natural Science." This study is an observational investigation, analyzing the results of the Graz Admission Test for the study of human medicine and dentistry. A total of 4741 applicants were included in the analysis. Principal component factor analysis (PCFA) as well as techniques from structural equation modeling, specifically confirmatory factor analysis (CFA), were employed to detect potential underlying latent variables governing the behavior of the measured variables. PCFA showed good clustering of the science test parts, including also text comprehension. A putative latent variable "Basic Knowledge in Natural Science," investigated by CFA, was indeed shown to govern the response behavior of the applicants in biology, chemistry, physics, and mathematics as well as text comprehension. The analysis of the correlation structure of the various test parts confirmed that the science test parts together with text comprehension constitute a satisfactory instrument for measuring a latent construct variable "Basic Knowledge in Natural Science." The present results suggest the fundamental importance of basic science knowledge for results obtained in the framework of the admission process for medical universities.

  14. Background recovery via motion-based robust principal component analysis with matrix factorization

    NASA Astrophysics Data System (ADS)

    Pan, Peng; Wang, Yongli; Zhou, Mingyuan; Sun, Zhipeng; He, Guoping

    2018-03-01

    Background recovery is a key technique in video analysis, but it still suffers from many challenges, such as camouflage, lighting changes, and diverse types of image noise. Robust principal component analysis (RPCA), which aims to recover a low-rank matrix and a sparse matrix, is a general framework for background recovery. The nuclear norm is widely used as a convex surrogate for the rank function in RPCA, which requires computing the singular value decomposition (SVD), a task that is increasingly costly as matrix sizes and ranks increase. However, matrix factorization greatly reduces the dimension of the matrix for which the SVD must be computed. Motion information has been shown to improve low-rank matrix recovery in RPCA, but this method still finds it difficult to handle original video data sets because of its batch-mode formulation and implementation. Hence, in this paper, we propose a motion-assisted RPCA model with matrix factorization (FM-RPCA) for background recovery. Moreover, an efficient linear alternating direction method of multipliers with a matrix factorization (FL-ADM) algorithm is designed for solving the proposed FM-RPCA model. Experimental results illustrate that the method provides stable results and is more efficient than the current state-of-the-art algorithms.
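
    The nuclear-norm RPCA step that the paper accelerates can be written in a few lines. A minimal sketch of the classical ADMM iteration with singular value thresholding (without the motion assistance or the matrix-factorization speed-up the paper proposes; the step-size heuristic is a common default):

      import numpy as np

      def shrink(x, tau):
          """Soft-thresholding operator (proximal map of the l1 norm)."""
          return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

      def rpca(M, iters=200):
          """Decompose M into low-rank L plus sparse S via a basic ADMM scheme."""
          lam = 1.0 / np.sqrt(max(M.shape))       # standard weight on the sparse term
          mu = M.size / (4.0 * np.abs(M).sum())   # common step-size heuristic
          S = np.zeros_like(M)
          Y = np.zeros_like(M)                    # Lagrange multiplier
          for _ in range(iters):
              # Low-rank update: singular value thresholding of (M - S + Y/mu).
              U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
              L = (U * shrink(sig, 1.0 / mu)) @ Vt
              # Sparse update: soft-thresholding of the residual.
              S = shrink(M - L + Y / mu, lam / mu)
              Y = Y + mu * (M - L - S)
          return L, S

      # Toy check: rank-1 "background" plus a few sparse "foreground" spikes.
      rng = np.random.default_rng(3)
      M = np.outer(rng.random(50), rng.random(40))
      M[rng.integers(0, 50, 30), rng.integers(0, 40, 30)] += 5.0
      L, S = rpca(M)
      # Expect a near-rank-1 L and roughly 30 nonzero entries in S.
      print(np.linalg.matrix_rank(L, tol=1e-3), np.count_nonzero(np.abs(S) > 1e-3))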

  15. Comparisons of Exploratory and Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Daniel, Larry G.

    Historically, most researchers conducting factor analysis have used exploratory methods. However, more recently, confirmatory factor analytic methods have been developed that can directly test theory either during factor rotation using "best fit" rotation methods or during factor extraction, as with the LISREL computer programs developed…

  16. Impact during equine locomotion: techniques for measurement and analysis.

    PubMed

    Burn, J F; Wilson, A; Nason, G P

    1997-05-01

    Impact is implicated in the development of several types of musculoskeletal injury in the horse. Characterisation of the impact experienced during strenuous exercise is an important first step towards understanding the mechanism for injury. Measurement and analysis of large, short-duration impacts is difficult. The measurement system must be able to record transient peaks and high frequencies accurately. The analysis technique must be able to characterise the impact signal in time and frequency. This paper presents a measurement system and analysis technique for the characterisation of large impacts. A piezo-electric accelerometer was securely mounted on the dorsal surface of the horse's hoof. Saddle-mounted charge amplifiers and a 20 m coaxial cable transferred these data to a PC-based logging system. Data were downloaded onto a UNIX workstation and analysed using a proprietary statistics package. The values of parameters calculated from the time series data were comparable to those of other authors. A wavelet decomposition showed that the frequency profile of the signal changed with time. While most spectral energy was seen at impact, a significant amount of energy was contained in the signal immediately following impact. Over 99% of this energy was contained in frequencies less than 1250 Hz. The sampling rate and the frequency response of a measurement system for recording impact should be chosen carefully to prevent loss or corruption of data. Time-scale analysis using a wavelet decomposition is a powerful technique which can be used to characterise impact data. The use of contour plots provides a highly visual representation of the time and frequency localisation of power during impact.
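
    The time-scale characterisation described above can be reproduced with a discrete wavelet decomposition. A minimal sketch using PyWavelets on a synthetic impact-like burst (the authors' exact wavelet, sampling rate, and signal are not specified here, so all settings are illustrative):

      import numpy as np
      import pywt

      fs = 10_000                                  # Hz; illustrative sampling rate
      t = np.arange(0, 0.1, 1 / fs)
      # Synthetic hoof-impact signal: a sharp decaying high-frequency burst at t = 20 ms.
      signal = np.exp(-((t - 0.02) * 400) ** 2) * np.sin(2 * np.pi * 800 * t)

      coeffs = pywt.wavedec(signal, "db4", level=5)  # multilevel discrete wavelet transform
      for i, c in enumerate(coeffs):
          # coeffs[0] is the coarsest approximation; the rest are detail levels.
          print(f"level {i}: energy = {float(np.sum(c ** 2)):.4f}")
      # Most energy concentrates in the detail levels spanning the burst's frequency band.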

  17. Using Multilevel Factor Analysis with Clustered Data: Investigating the Factor Structure of the Positive Values Scale

    ERIC Educational Resources Information Center

    Huang, Francis L.; Cornell, Dewey G.

    2016-01-01

    Advances in multilevel modeling techniques now make it possible to investigate the psychometric properties of instruments using clustered data. Factor models that overlook the clustering effect can lead to underestimated standard errors, incorrect parameter estimates, and model fit indices. In addition, factor structures may differ depending on…

  18. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    PubMed Central

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures that can be removed without damaging existing structures as the historical masonry. The digital image analysis was done using software specifically designed in Matlab to follow the tracking of the markers and determine the evolution of the deformation state. The method can be used in any type of structure but is particularly suitable when it is necessary not to damage the surface of structures. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) have allowed the validation of the procedure elaborated by comparing the results with those derived from traditional measuring techniques.

  19. Image Analysis Technique for Material Behavior Evaluation in Civil Structures.

    PubMed

    Speranzini, Emanuela; Marsili, Roberto; Moretti, Michele; Rossi, Gianluca

    2017-07-08

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures that can be removed without damaging existing structures as the historical masonry. The digital image analysis was done using software specifically designed in Matlab to follow the tracking of the markers and determine the evolution of the deformation state. The method can be used in any type of structure but is particularly suitable when it is necessary not to damage the surface of structures. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) have allowed the validation of the procedure elaborated by comparing the results with those derived from traditional measuring techniques.

  20. Bootstrap Standard Error Estimates in Dynamic Factor Analysis

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Browne, Michael W.

    2010-01-01

    Dynamic factor analysis summarizes changes in scores on a battery of manifest variables over repeated measurements in terms of a time series in a substantially smaller number of latent factors. Algebraic formulae for standard errors of parameter estimates are more difficult to obtain than in the usual intersubject factor analysis because of the…
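
    The bootstrap idea behind the title is easy to sketch. A minimal example estimating the standard error of a simple parameter (a correlation standing in for a factor loading) by resampling; note that time-series data, as in dynamic factor analysis, call for block resampling rather than the plain i.i.d. scheme shown:

      import numpy as np

      rng = np.random.default_rng(11)
      n = 200
      latent = rng.normal(size=n)
      x1 = 0.8 * latent + rng.normal(scale=0.6, size=n)   # two indicators of one factor
      x2 = 0.7 * latent + rng.normal(scale=0.7, size=n)

      def stat(a, b):
          return np.corrcoef(a, b)[0, 1]   # stand-in for a model parameter estimate

      boot = []
      for _ in range(2000):
          idx = rng.integers(0, n, n)      # i.i.d. resampling; use blocks for time series
          boot.append(stat(x1[idx], x2[idx]))
      print(f"estimate = {stat(x1, x2):.3f}, bootstrap SE = {np.std(boot, ddof=1):.3f}")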

  1. An ALGOL program for dissimilarity analysis: a divisive-omnithetic clustering technique

    USGS Publications Warehouse

    Tipper, J.C.

    1979-01-01

    Clustering techniques are properly used to generate hypotheses about patterns in data. Of the hierarchical techniques, those which are divisive and omnithetic possess many theoretically optimal properties. One such method, dissimilarity analysis, is implemented here in ALGOL 60, and determined to be competitive computationally with most other methods. © 1979.

  2. Application of a sensitivity analysis technique to high-order digital flight control systems

    NASA Technical Reports Server (NTRS)

    Paduano, James D.; Downing, David R.

    1987-01-01

    A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to accomplish the singular-value sensitivity analysis techniques. Thus computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. The SVA is a fully public domain program, running on the NASA/Dryden Elxsi computer.

  3. Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques

    DTIC Science & Technology

    2018-04-30

    Monthly progress report. The program goal is analysis of sea ice dynamical behavior using Koopman Mode Decomposition (KMD) techniques. The work in the program's first month consisted of improvements to data processing code and inclusion of additional arctic sea ice

  4. Application of multivariate statistical techniques for differentiation of ripe banana flour based on the composition of elements.

    PubMed

    Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat

    2009-01-01

    Major elements (sodium, potassium, calcium, magnesium), minor elements (iron, copper, zinc, manganese) and one heavy metal (lead) in Cavendish banana flour and Dream banana flour were determined, and the data were analyzed using the multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising only manganese and copper; the third factor explained 15.66%, comprising zinc and lead; while the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium exhibited a strong contribution in discriminating the two types of banana flour, affording 100% correct assignation. This study presents the usefulness of multivariate statistical techniques for the analysis and interpretation of complex mineral content data from banana flour of different varieties.

  5. Measurements of K shell absorption jump factors and jump ratios using EDXRF technique

    NASA Astrophysics Data System (ADS)

    Kacal, Mustafa Recep; Han, İbrahim; Akman, Ferdi

    2015-04-01

    In the present work, the K-shell absorption jump factors and jump ratios for 30 elements between Ti (Z = 22) and Er (Z = 68) were measured by the energy dispersive X-ray fluorescence (EDXRF) technique. The jump factors and jump ratios for these elements were determined by measuring K shell fluorescence parameters such as the Kα X-ray production cross-sections, K shell fluorescence yields, Kβ-to-Kα X-ray intensity ratios, total atomic absorption cross sections and mass attenuation coefficients. The measurements were performed using an Am-241 radioactive point source and a Si(Li) detector in direct excitation and transmission experimental geometry. The results for jump factors and jump ratios were compared with theoretically calculated values and the ones available in the literature.

  6. A CHARTING TECHNIQUE FOR THE ANALYSIS OF BUSINESS SYSTEMS,

    DTIC Science & Technology

    This paper describes a charting technique useful in the analysis of business systems and in studies of the information economics of the firm. The...planning advanced systems. It is not restricted to any particular kind of business or information system. (Author)

  7. Success rate and risk factors of failure of the induced membrane technique in children: a systematic review.

    PubMed

    Aurégan, Jean-Charles; Bégué, Thierry; Rigoulot, Guillaume; Glorion, Christophe; Pannier, Stéphanie

    2016-12-01

    The induced membrane technique was designed by Masquelet et al. to address segmental bone defects of critical size in adults. It has been used after bone defects of traumatic, infectious and tumoral origin with satisfactory results. Recently, it has been used in children but, after an initial enthusiasm, several cases of failure have been reported. The purpose of this study was to assess the success rate and the risk factors of failure of the induced membrane technique for children. We conducted a systematic review of all the studies reporting the results of the induced membrane technique to address bone defects of critical size in children. Our primary outcome was the success rate of the technique, defined as bone union before any iterative surgery. Our secondary outcomes were the complications and the risk factors of failure. We searched Medline via PubMed, EMBASE and the Cochrane Library. Twelve studies, including 69 patients, met the inclusion criteria. There were 41 boys and 28 girls. Mean age at surgery was 10 years. Mean size of resection was 12.38 cm and the mean time between the two stages was 5.86 months. The mean rate of bone union after the two stages of the induced membrane technique was 58% (40/69), but this rate increased to 87% after revision surgeries (60/69). Main complications were non-unions (19/69), lysis of the graft (6/69) and fractures of the bone graft (6/69). Only 1/69 deep infection was reported. Other nonspecific complications were regularly reported, such as limb length discrepancies, joint stiffness and protruding wires. Suspected risk factors of failure comprised the resection of a malignant tumour, a bone defect located at the femur, a wide resection, a long time between the two stages, an unstable osteosynthesis and a bone graft combining autograft with other graft materials. The induced membrane technique is suitable for bone defects of critical size in children. It is a reliable technique with no need of microvascular surgery

  8. A diagnostic analysis of the VVP single-doppler retrieval technique

    NASA Technical Reports Server (NTRS)

    Boccippio, Dennis J.

    1995-01-01

    A diagnostic analysis of the VVP (volume velocity processing) retrieval method is presented, with emphasis on understanding the technique as a linear, multivariate regression. Similarities and differences to the velocity-azimuth display and extended velocity-azimuth display retrieval techniques are discussed, using this framework. Conventional regression diagnostics are then employed to quantitatively determine situations in which the VVP technique is likely to fail. An algorithm for preparation and analysis of a robust VVP retrieval is developed and applied to synthetic and actual datasets with high temporal and spatial resolution. A fundamental (but quantifiable) limitation to some forms of VVP analysis is inadequate sampling dispersion in the n space of the multivariate regression, manifest as a collinearity between the basis functions of some fitted parameters. Such collinearity may be present either in the definition of these basis functions or in their realization in a given sampling configuration. This nonorthogonality may cause numerical instability, variance inflation (decrease in robustness), and increased sensitivity to bias from neglected wind components. It is shown that these effects prevent the application of VVP to small azimuthal sectors of data. The behavior of the VVP regression is further diagnosed over a wide range of sampling constraints, and reasonable sector limits are established.
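
    Since VVP is a linear least-squares fit, the sector-width failure mode described above shows up directly in the conditioning of the design matrix. A minimal sketch comparing a full azimuth scan with a narrow sector, using a simplified two-parameter basis rather than the full VVP model:

      import numpy as np

      def condition_number(azimuths_deg):
          """Condition number of a toy VVP-style design matrix [sin(az), cos(az)]."""
          az = np.radians(azimuths_deg)
          A = np.column_stack([np.sin(az), np.cos(az)])  # basis functions of two wind components
          return np.linalg.cond(A)

      full_scan = np.arange(0, 360, 1.0)       # well-dispersed sampling
      narrow_sector = np.arange(0, 20, 1.0)    # 20-degree sector: basis functions nearly collinear
      print(f"full scan:     cond = {condition_number(full_scan):.1f}")
      print(f"narrow sector: cond = {condition_number(narrow_sector):.1f}")
      # The narrow sector's notably larger condition number signals variance inflation
      # and numerical instability in the retrieved parameters.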

  9. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar in different PRF's in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of PRF's. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross correlation technique.

  10. Chromatin Immunoprecipitation Sequencing (ChIP-Seq) for Transcription Factors and Chromatin Factors in Arabidopsis thaliana Roots: From Material Collection to Data Analysis.

    PubMed

    Cortijo, Sandra; Charoensawan, Varodom; Roudier, François; Wigge, Philip A

    2018-01-01

    Chromatin immunoprecipitation combined with next-generation sequencing (ChIP-seq) is a powerful technique to investigate in vivo transcription factor (TF) binding to DNA, as well as chromatin marks. Here we provide a detailed protocol for all the key steps of ChIP-seq in Arabidopsis thaliana roots, which also works on other A. thaliana tissues and in most non-ligneous plants. We detail all steps from material collection, fixation, chromatin preparation, immunoprecipitation, and library preparation to the final computational analysis based on a combination of publicly available tools.

  11. Data analysis techniques used at the Oak Ridge Y-12 plant flywheel evaluation laboratory

    NASA Astrophysics Data System (ADS)

    Steels, R. S., Jr.; Babelay, E. F., Jr.

    1980-07-01

    Some of the more advanced data analysis techniques applied to the problem of experimentally evaluating the performance of high performance composite flywheels are presented. Real-time applications include polar plots of runout with interruptions relating to balance and relative motions between parts, radial growth measurements, and temperature of the spinning part. The technique used to measure torque applied to a containment housing during flywheel failure is also presented. The discussion of pre- and post-test analysis techniques includes resonant frequency determination with modal analysis, waterfall charts, and runout signals at failure.

  12. The composite sequential clustering technique for analysis of multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
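
    The composite idea, a first pass that produces initial clusters which then seed a generalized K-means refinement, maps directly onto scikit-learn's explicit-initialization interface. A minimal sketch in which a crude brightness partition stands in for the sequential variance analysis of part (1):

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(5)
      # Synthetic "multispectral" pixels: three spectral classes in 4 bands.
      X = np.vstack([rng.normal(m, 0.3, (100, 4)) for m in (0.0, 1.5, 3.0)])

      # Stage 1 stand-in: coarse initial centers from a crude partition of the data.
      order = np.argsort(X @ np.ones(4))  # order pixels by overall brightness
      initial = np.array([X[chunk].mean(axis=0) for chunk in np.array_split(order, 3)])

      # Stage 2: generalized K-means refinement seeded with the stage-1 clusters.
      km = KMeans(n_clusters=3, init=initial, n_init=1).fit(X)
      print(np.round(km.cluster_centers_, 2))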

  13. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
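
    Of the three techniques named, control charting is the simplest to illustrate. A minimal sketch that flags readings outside three-sigma limits (synthetic thermocouple data, not AGR measurements):

      import numpy as np

      rng = np.random.default_rng(8)
      readings = rng.normal(1000.0, 5.0, 200)  # synthetic thermocouple temperatures, deg C
      readings[150:] += 25.0                   # simulated sensor drift / failure onset

      baseline = readings[:100]                # in-control reference period
      center, sigma = baseline.mean(), baseline.std(ddof=1)
      ucl, lcl = center + 3 * sigma, center - 3 * sigma  # upper and lower control limits

      out = np.flatnonzero((readings > ucl) | (readings < lcl))
      first = out[0] if out.size else None
      print(f"limits: [{lcl:.1f}, {ucl:.1f}]; first out-of-control sample: {first}")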

  14. Evaluation of analysis techniques for low frequency interior noise and vibration of commercial aircraft

    NASA Technical Reports Server (NTRS)

    Landmann, A. E.; Tillema, H. F.; Marshall, S. E.

    1989-01-01

    The application of selected analysis techniques to low frequency cabin noise associated with advanced propeller engine installations is evaluated. Three design analysis techniques were chosen for evaluation, including finite element analysis, statistical energy analysis (SEA), and a power flow method using elements of SEA (the computer program Propeller Aircraft Interior Noise). An overview of the three procedures is provided. Data from tests of a 727 airplane (modified to accept a propeller engine) were used to compare with predictions. Comparisons of predicted and measured levels at the end of the first year's effort showed reasonable agreement, leading to the conclusion that each technique had value for propeller engine noise predictions on large commercial transports. However, variations in agreement were large enough to remain cautious and to lead to recommendations for further work with each technique. Assessment of the second year's results leads to the conclusion that the selected techniques can accurately predict trends and can be useful to a designer, but that absolute level predictions remain unreliable due to the complexity of the aircraft structure and low modal densities.

  15. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking technique: one philosophical, the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of…

  16. A microhistological technique for analysis of food habits of mycophagous rodents.

    Treesearch

    Patrick W. McIntire; Andrew B. Carey

    1989-01-01

    We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...

  17. Brazilian version of the Jefferson Scale of Empathy: psychometric properties and factor analysis

    PubMed Central

    2012-01-01

    Background Empathy is a central characteristic of medical professionalism and has recently gained attention in medical education research. The Jefferson Scale of Empathy is the most commonly used measure of empathy worldwide, and to date it has been translated into 39 languages. This study aimed to adapt the Jefferson Scale of Empathy to the Brazilian culture and to test its reliability and validity among Brazilian medical students. Methods The Portuguese version of the Jefferson Scale of Empathy was adapted to Brazil using back-translation techniques. This version was pretested among 39 fifth-year medical students in September 2010. During the final fifth- and sixth-year Objective Structured Clinical Examination (October 2011), 319 students were invited to respond to the scale anonymously. Cronbach’s alpha, exploratory factor analysis, item-total correlation, and gender comparisons were performed to check the reliability and validity of the scale. Results The student response rate was 93.7% (299 students). Cronbach’s alpha for the scale was 0.84. A principal component analysis confirmed the construct validity of the scale for three main factors: Compassionate Care (first factor), Ability to Stand in the Patient’s Shoes (second factor), and Perspective Taking (third factor). Gender comparisons did not reveal differences in the scores between female and male students. Conclusions The adapted Brazilian version of the Jefferson Scale of Empathy proved to be a valid, reliable instrument for use in national and cross-cultural studies in medical education. PMID:22873730
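
    A short sketch of the reliability statistic reported above, computed on simulated item scores (not the Jefferson Scale data); the formula is the standard Cronbach's alpha:

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, n_items) array of item scores."""
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        rng = np.random.default_rng(2)
        trait = rng.normal(size=(299, 1))                  # shared latent trait
        scores = trait + rng.normal(scale=0.8, size=(299, 20))
        print(round(cronbach_alpha(scores), 2))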

  18. A Review of CEFA Software: Comprehensive Exploratory Factor Analysis Program

    ERIC Educational Resources Information Center

    Lee, Soon-Mook

    2010-01-01

    CEFA 3.02 (Browne, Cudeck, Tateneni, & Mels, 2008) is a factor analysis computer program designed to perform exploratory factor analysis. It provides the main properties that are needed for exploratory factor analysis, namely a variety of factoring methods employing eight different discrepancy functions to be minimized to yield initial…

  19. Recent advances in capillary electrophoretic migration techniques for pharmaceutical analysis.

    PubMed

    Deeb, Sami El; Wätzig, Hermann; El-Hady, Deia Abd; Albishri, Hassan M; de Griend, Cari Sänger-van; Scriba, Gerhard K E

    2014-01-01

    Since their introduction about 30 years ago, CE techniques have had a significant impact on pharmaceutical analysis. The present review covers recent advances and applications of CE for the analysis of pharmaceuticals. Both small molecules and biomolecules such as proteins are considered. The applications range from the determination of drug-related substances to the analysis of counterions and the determination of physicochemical parameters. Furthermore, general considerations of CE methods in pharmaceutical analysis are described. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Simultaneous Comparison of Two Roller Compaction Techniques and Two Particle Size Analysis Methods.

    PubMed

    Saarinen, Tuomas; Antikainen, Osmo; Yliruusi, Jouko

    2017-11-01

    A new dry granulation technique, gas-assisted roller compaction (GARC), was compared with conventional roller compaction (CRC) by manufacturing 34 granulation batches. The process variables studied were roll pressure, roll speed, and the sieve size of the conical mill. The main quality attributes measured were granule size and flow characteristics. Within the granulations, the practical applicability of two particle size analysis techniques, sieve analysis (SA) and a fast imaging technique (Flashsizer, FS), was also tested. All granules obtained were acceptable. In general, the particle size of GARC granules was slightly larger than that of CRC granules. In addition, the GARC granules had better flowability. For example, the tablet weight variation of GARC granules was close to 2%, indicating good flowing and packing characteristics. The comparison of the two particle size analysis techniques showed that SA was more accurate in determining wide and bimodal size distributions, while FS showed narrower and mono-modal distributions. However, both techniques gave good estimates of mean granule size. Overall, SA was a time-consuming but accurate technique that provided reliable information on the entire granule size distribution. By contrast, FS oversimplified the shape of the size distribution but nevertheless yielded acceptable estimates of mean particle size. In general, FS was two to three orders of magnitude faster than SA.

  1. A look-ahead probabilistic contingency analysis framework incorporating smart sampling techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Etingov, Pavel V.; Ren, Huiying

    2016-07-18

    This paper describes a framework for incorporating smart sampling techniques into a probabilistic look-ahead contingency analysis application. The predictive probabilistic contingency analysis helps to reflect the impact of uncertainties caused by variable generation and load on potential violations of transmission limits.

  2. Physics Metacognition Inventory Part II: Confirmatory factor analysis and Rasch analysis

    NASA Astrophysics Data System (ADS)

    Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John

    2015-11-01

    The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition, planning, monitoring, evaluation, debugging, and information management. The college students' scores on the inventory were found to be reliable and related to students' physics motivation and physics grade. However, the results of the exploratory factor analysis indicated that the questionnaire could be revised to improve its construct validity. The goal of this study was to revise the questionnaire and establish its construct validity through a confirmatory factor analysis. In addition, a Rasch analysis was applied to the data to better understand the psychometric properties of the inventory and to further evaluate the construct validity. Results indicated that the final, revised inventory is a valid, reliable, and efficient tool for assessing student metacognition for physics problem solving.

  3. The combined use of order tracking techniques for enhanced Fourier analysis of order components

    NASA Astrophysics Data System (ADS)

    Wang, K. S.; Heyns, P. S.

    2011-04-01

    Order tracking is one of the most important vibration analysis techniques for diagnosing faults in rotating machinery. It can be performed in many different ways, each of these with distinct advantages and disadvantages. However, in the end the analyst will often use Fourier analysis to transform the data from a time series to frequency or order spectra. It is therefore surprising that the study of the Fourier analysis of order-tracked systems seems to have been largely ignored in the literature. This paper considers the frequently used Vold-Kalman filter-based order tracking and computed order tracking techniques. The main pros and cons of each technique for Fourier analysis are discussed and the sequential use of Vold-Kalman filtering and computed order tracking is proposed as a novel idea to enhance the results of Fourier analysis for determining the order components. The advantages of the combined use of these order tracking techniques are demonstrated numerically on an SDOF rotor simulation model. Finally, the approach is also demonstrated on experimental data from a real rotating machine.
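
    A minimal sketch of the computed order tracking step described above: resample the vibration signal at uniform shaft-angle increments so that an FFT of the resampled signal yields an order spectrum even during a speed ramp (synthetic single-order signal; the Vold-Kalman filtering stage is omitted):

        import numpy as np

        fs, T = 5000.0, 10.0
        t = np.arange(0, T, 1 / fs)
        f0, f1 = 10.0, 30.0                       # shaft speed ramps 10 -> 30 Hz
        phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * T))
        x = np.sin(3 * phase)                     # vibration locked to order 3

        samples_per_rev = 64
        rev = phase / (2 * np.pi)                 # shaft revolutions vs. time
        rev_uniform = np.arange(0, rev[-1], 1 / samples_per_rev)
        x_angle = np.interp(rev_uniform, rev, x)  # angle-domain resampling

        spec = np.abs(np.fft.rfft(x_angle))
        orders = np.fft.rfftfreq(len(x_angle), d=1 / samples_per_rev)
        print(orders[np.argmax(spec)])            # ~3.0, despite the speed ramp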

  4. A study of data analysis techniques for the multi-needle Langmuir probe

    NASA Astrophysics Data System (ADS)

    Hoang, H.; Røed, K.; Bekkeng, T. A.; Moen, J. I.; Spicher, A.; Clausen, L. B. N.; Miloch, W. J.; Trondsen, E.; Pedersen, A.

    2018-06-01

    In this paper we evaluate two data analysis techniques for the multi-needle Langmuir probe (m-NLP). The instrument uses several cylindrical Langmuir probes, which are positively biased with respect to the plasma potential in order to operate in the electron saturation region. Since the currents collected by these probes can be sampled at kilohertz rates, the instrument is capable of resolving the ionospheric plasma structure down to the meter scale. The two data analysis techniques, a linear fit and a non-linear least squares fit, are discussed in detail using data from the Investigation of Cusp Irregularities 2 sounding rocket. It is shown that each technique has pros and cons with respect to the m-NLP implementation. Even though the linear fitting technique seems to be better than measurements from incoherent scatter radar and in situ instruments, m-NLPs can be longer and can be cleaned during operation to improve instrument performance. The non-linear least squares fitting technique would be more reliable provided that a higher number of probes are deployed.
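
    An illustrative sketch only (not the authors' pipeline) of the two kinds of fit being compared, assuming an OML-like electron saturation model I(V) = c*sqrt(V + V0) for a few fixed-bias needles; c, V0, the bias values, and the noise level are all invented for the example:

        import numpy as np
        from scipy.optimize import curve_fit

        bias = np.array([2.5, 4.0, 5.5, 10.0])             # probe biases [V]
        c_true, V0_true = 1e-6, 1.2
        rng = np.random.default_rng(3)
        I = c_true * np.sqrt(bias + V0_true) + rng.normal(0, 5e-9, bias.size)

        # Linear technique: I^2 is linear in bias; the slope carries the
        # density information.
        slope, intercept = np.polyfit(bias, I**2, 1)

        # Non-linear least-squares technique: fit the model directly.
        model = lambda V, c, V0: c * np.sqrt(V + V0)
        (c_fit, V0_fit), _ = curve_fit(model, bias, I, p0=[1e-6, 1.0])
        print(slope, c_fit, V0_fit)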

  5. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
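
    A compact sketch of what commonality analysis adds beyond beta weights, for the two-predictor case: the full-model R^2 is partitioned into each predictor's unique contribution and their shared (common) variance. Data are simulated:

        import numpy as np
        from sklearn.linear_model import LinearRegression

        def r2(X, y):
            return LinearRegression().fit(X, y).score(X, y)

        rng = np.random.default_rng(4)
        x1 = rng.normal(size=500)
        x2 = 0.6 * x1 + rng.normal(size=500)     # correlated predictors
        y = x1 + x2 + rng.normal(size=500)

        r2_full = r2(np.c_[x1, x2], y)
        r2_1, r2_2 = r2(x1[:, None], y), r2(x2[:, None], y)
        unique1 = r2_full - r2_2                 # unique to x1
        unique2 = r2_full - r2_1                 # unique to x2
        common = r2_1 + r2_2 - r2_full           # shared by x1 and x2
        print(unique1, unique2, common)          # the three sum to r2_full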

  6. Exploring phlebotomy technique as a pre-analytical factor in proteomic analyses by mass spectrometry.

    PubMed

    Penn, Andrew M; Lu, Linghong; Chambers, Andrew G; Balshaw, Robert F; Morrison, Jaclyn L; Votova, Kristine; Wood, Eileen; Smith, Derek S; Lesperance, Maria; del Zoppo, Gregory J; Borchers, Christoph H

    2015-12-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is an emerging technology for blood biomarker verification and validation; however, the results may be influenced by pre-analytical factors. This exploratory study was designed to determine if differences in phlebotomy techniques would significantly affect the abundance of plasma proteins in an upcoming biomarker development study. Blood was drawn from 10 healthy participants using four techniques: (1) a 20-gauge IV with vacutainer, (2) a 21-gauge direct vacutainer, (3) an 18-gauge butterfly with vacutainer, and (4) an 18-gauge butterfly with syringe draw. The abundances of a panel of 122 proteins (117 proteins, plus 5 matrix metalloproteinase (MMP) proteins) were targeted by LC/MRM-MS. In addition, complete blood count (CBC) data were also compared across the four techniques. Phlebotomy technique significantly affected 2 of the 11 CBC parameters (red blood cell count, p = 0.010; hemoglobin concentration, p = 0.035) and only 12 of the targeted 117 proteins (p < 0.05). Of the five MMP proteins, only MMP7 was detectable and its concentration was not significantly affected by different techniques. Overall, most proteins in this exploratory study were not significantly influenced by phlebotomy technique; however, a larger study with additional patients will be required for confirmation.

  7. Regional environmental analysis and management: New techniques for current problems

    NASA Technical Reports Server (NTRS)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  8. Measured extent of agricultural expansion depends on analysis technique

    DOE PAGES

    Dunn, Jennifer B.; Merz, Dylan; Copenhaver, Ken L.; ...

    2017-01-31

    Concern is rising that ecologically important, carbon-rich natural lands in the United States are losing ground to agriculture. We investigate how quantitative assessments of historical land use change to address this concern differ in their conclusions depending on the data set used. We examined land use change between 2006 and 2014 in 20 counties in the Prairie Pothole Region using the Cropland Data Layer, a modified Cropland Data Layer, data from the National Agricultural Imagery Program, and in-person ground-truthing. The Cropland Data Layer analyses overwhelmingly returned the largest amount of land use change, with associated error that limits drawing conclusions from it. Analysis with visual imagery estimated a fraction of this land use change. Clearly, analysis technique drives understanding of the measured extent of land use change; different techniques produce vastly different results that would inform land management policy in strikingly different ways. As a result, best practice guidelines are needed.

  10. Fourier transform infrared spectroscopy techniques for the analysis of drugs of abuse

    NASA Astrophysics Data System (ADS)

    Kalasinsky, Kathryn S.; Levine, Barry K.; Smith, Michael L.; Magluilo, Joseph J.; Schaefer, Teresa

    1994-01-01

    Cryogenic deposition techniques for Gas Chromatography/Fourier Transform Infrared (GC/FT-IR) can be successfully employed in urinalysis for drugs of abuse with detection limits comparable to those of the established Gas Chromatography/Mass Spectrometry (GC/MS) technique. The additional confidence of the data that infrared analysis can offer has been helpful in identifying ambiguous results, particularly, in the case of amphetamines where drugs of abuse can be confused with over-the-counter medications or naturally occurring amines. Hair analysis has been important in drug testing when adulteration of urine samples has been a question. Functional group mapping can further assist the analysis and track drug use versus time.

  11. Application of Petri net based analysis techniques to signal transduction pathways.

    PubMed

    Sackmann, Andrea; Heiner, Monika; Koch, Ina

    2006-11-02

    Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODE-based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the system's behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to build systematically a discrete model, which reflects provably the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as case study. We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets being active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into smallest biologically meaningful functional units.
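
    A toy sketch of the algebra behind t-invariants: a t-invariant is a non-negative integer vector x with C·x = 0, where C is the place-by-transition incidence matrix. The three-transition cyclic net below is an illustration, not the pheromone-pathway model of the paper:

        from sympy import Matrix

        # Places p1..p3, transitions t1..t3 forming the cycle p1->p2->p3->p1.
        C = Matrix([[-1,  0,  1],
                    [ 1, -1,  0],
                    [ 0,  1, -1]])

        for v in C.nullspace():        # basis of {x : C*x = 0}
            print(v.T)                 # (1, 1, 1): firing every transition once
                                       # reproduces the initial marking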

  12. Bad splits in bilateral sagittal split osteotomy: systematic review and meta-analysis of reported risk factors.

    PubMed

    Steenen, S A; van Wijk, A J; Becking, A G

    2016-08-01

    An unfavourable and unanticipated pattern of the bilateral sagittal split osteotomy (BSSO) is generally referred to as a 'bad split'. Patient factors predictive of a bad split reported in the literature are controversial. Suggested risk factors are reviewed in this article. A systematic review was undertaken, yielding a total of 30 studies published between 1971 and 2015 reporting the incidence of bad split and patient age, and/or surgical technique employed, and/or the presence of third molars. These included 22 retrospective cohort studies, six prospective cohort studies, one matched-pair analysis, and one case series. Spearman's rank correlation showed a statistically significant but weak correlation between increasing average age and increasing occurrence of bad splits in 18 studies (ρ=0.229; P<0.01). No comparative studies were found that assessed the incidence of bad split among the different splitting techniques. A meta-analysis pooling the effect sizes of seven cohort studies showed no significant difference in the incidence of bad split between cohorts of patients with third molars present and concomitantly removed during surgery, and patients in whom third molars were removed at least 6 months preoperatively (odds ratio 1.16, 95% confidence interval 0.73-1.85, Z=0.64, P=0.52). In summary, there is no robust evidence to date to show that any risk factor influences the incidence of bad split. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  13. Determination of minor and trace elements concentration in kidney stones using elemental analysis techniques

    NASA Astrophysics Data System (ADS)

    Srivastava, Anjali

    The determination of the accurate material composition of a kidney stone is crucial for understanding the formation of the kidney stone as well as for preventive therapeutic strategies. Radiation-probing instrumental activation analysis techniques are excellent tools for identification of the materials present in the kidney stone. X-ray fluorescence (XRF) and neutron activation analysis (NAA) experiments were performed and different kidney stones were analyzed. The interactions of X-ray photons and neutrons with matter are complementary in nature, resulting in distinctly different materials detection. This is the first approach to utilize combined X-ray fluorescence and neutron activation analysis for a comprehensive analysis of kidney stones. In the present work, experimental studies in conjunction with analytical techniques were used to determine the exact composition of the kidney stone. The open-source program Python Multi-Channel Analyzer was used to unfold the XRF spectrum. A new type of experimental set-up was developed and utilized for XRF and NAA analysis of the kidney stone. To verify the experimental results against analytical calculation, several sets of kidney stones were analyzed using the XRF and NAA techniques. The elements identified by the XRF technique are Br, Cu, Ga, Ge, Mo, Nb, Ni, Rb, Se, Sr, Y, and Zr; those identified by neutron activation analysis (NAA) are Au, Br, Ca, Er, Hg, I, K, Na, Pm, Sb, Sc, Sm, Tb, Yb, and Zn. This thesis presents a new approach for accurate detection of the material composition of kidney stones using XRF and NAA instrumental activation analysis techniques.

  14. Techniques for the analysis of data from coded-mask X-ray telescopes

    NASA Technical Reports Server (NTRS)

    Skinner, G. K.; Ponman, T. J.; Hammersley, A. P.; Eyles, C. J.

    1987-01-01

    Several techniques useful in the analysis of data from coded-mask telescopes are presented. Methods of handling changes in the instrument pointing direction are reviewed and ways of using FFT techniques to perform the deconvolution are considered. Emphasis is on techniques for optimally-coded systems, but it is shown that the range of systems included in this class can be extended through the new concept of 'partial cycle averaging'.
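
    A one-dimensional sketch of FFT-based cross-correlation decoding for a coded-mask system: the detector image is the sky circularly convolved with the mask, and correlating it with a balanced decoding array recovers the sky. The random mask here is an assumption of the example (an optimally-coded URA would give cleaner sidelobes):

        import numpy as np

        rng = np.random.default_rng(5)
        mask = rng.integers(0, 2, 64).astype(float)       # open/closed elements
        decoder = 2 * mask - 1                            # +1 open, -1 closed
        sky = np.zeros(64); sky[10] = 1.0; sky[40] = 0.5  # two point sources

        detector = np.real(np.fft.ifft(np.fft.fft(sky) * np.fft.fft(mask)))
        recon = np.real(np.fft.ifft(np.fft.fft(detector) *
                                    np.conj(np.fft.fft(decoder))))
        print(np.argsort(recon)[-2:])   # peaks near 10 and 40, up to sidelobes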

  15. Job compensable factors and factor weights derived from job analysis data.

    PubMed

    Chi, Chia-Fen; Chang, Tin-Chang; Hsia, Ping-Ling; Song, Jen-Chieh

    2007-06-01

    Government data on 1,039 job titles in Taiwan were analyzed to assess possible relationships between job attributes and compensation. For each job title, 79 specific variables in six major classes (required education and experience, aptitude, interest, work temperament, physical demands, task environment) were coded to derive the statistical predictors of wage for managers, professionals, technical, clerical, service, farm, craft, operatives, and other workers. Of the 79 variables, only 23 significantly related to pay rate were subjected to a factor and multiple regression analysis for predicting monthly wages. Given the heterogeneous nature of collected job titles, a 4-factor solution (occupational knowledge and skills, human relations skills, work schedule hardships, physical hardships) explaining 43.8% of the total variance but predicting only 23.7% of the monthly pay rate was derived. On the other hand, multiple regression with 9 job analysis items (required education, professional training, professional certificate, professional experience, coordinating, leadership and directing, demand on hearing, proportion of shift working indoors, outdoors and others, rotating shift) better predicted pay and explained 32.5% of the variance. A direct comparison of factors and subfactors of job evaluation plans indicated mental effort and responsibility (accountability) had not been measured with the current job analysis data. Cross-validation of job evaluation factors and ratings with the wage rates is required to calibrate both.

  16. Temporal Trends and Factors Associated with Home Hemodialysis Technique Survival in Canada.

    PubMed

    Perl, Jeffrey; Na, Yingbo; Tennankore, Karthik K; Chan, Christopher T

    2017-07-24

    The last 15 years have seen growth in home hemodialysis (HD) utilization in Canada owing to reports of improved outcomes relative to patients on conventional in-center HD. What effect growth has had on home HD technique and patient survival during this period is not known. We compared the risk of home HD technique failure, mortality, and the composite outcome among three incident cohorts of patients on home HD in Canada: 1996-2002, 2003-2007, and 2008-2012. A multivariable piece-wise exponential model was used to evaluate all outcomes using inverse probability of treatment and censoring weights. A total of 1869 incident patients on home HD were identified from the Canadian Organ Replacement Register. Relative to those treated between 2003 and 2007 (n=568), the risk of home HD technique failure was similar among patients treated between 1996 and 2002 (n=233; adjusted hazard ratio [AHR], 1.39; 95% confidence interval [95% CI], 0.78 to 2.46) but higher among incident patients on home HD treated between 2008 and 2012 (n=1068; AHR, 1.51; 95% CI, 1.06 to 2.15). Relative to patients treated between 2003 and 2007, adjusted mortality was similar among those treated between 2008 and 2012 (AHR, 0.83; 95% CI, 0.58 to 1.19) and those treated between 1996 and 2002 (AHR, 0.67; 95% CI, 0.38 to 1.21). The risk of the composite outcome of death and technique failure was similar across cohorts, as was the risk of receiving a kidney transplant. Increasing age, diabetes as a comorbidity, and smoking status were associated with an increased risk of death as well as the composite outcome. Medium-sized facilities had a lower risk of death, technique failure, and the composite outcome compared with larger facilities. A higher risk of technique failure was seen in the most contemporary era. Further characterization of the risk factors for, and causes of, technique failure is needed to develop strategies to improve patient retention on home HD. Copyright © 2017 by the American Society

  17. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by the chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
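
    A sketch of the PLS calibration step, using scikit-learn in place of the authors' implementation: X holds the LIBS spectra (one row per standard), Y the known elemental compositions. All data below are synthetic:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(6)
        n_samples, n_channels = 18, 200
        conc = rng.uniform(0, 1, (n_samples, 2))   # e.g. two oxide fractions
        lines = rng.normal(size=(2, n_channels))   # emission-line patterns
        X = conc @ lines + rng.normal(0, 0.05, (n_samples, n_channels))

        pls = PLSRegression(n_components=2).fit(X, conc)
        print(pls.predict(X[:3]))                  # compare with conc[:3]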

  18. Analysis of Factors Affecting the Success of Onions Development Program in Kampar Regency

    NASA Astrophysics Data System (ADS)

    Amalia; Putri, Asgami

    2017-12-01

    The purpose of this study is to analyze the factors influencing the success of the onion plant development program in Kampar regency. The research method used was an applied survey employing interviews and direct observation at the object's location; the interviews and the collection of the required data were guided by structured questionnaires. The sampling locations/regions were determined purposively based on the potency and capacity of commodity development, while the respondents, 100 members of farmer groups, were taken by cluster purposive sampling in order to classify the samples in accordance with the purpose of the study. The analytical technique used was logistic regression analysis, to determine the factors that influence the success of the program as seen from the characteristics of the farmers. From the results of this study it can be concluded that the factors influencing the success of the onion development program in Kampar regency were age (X1), education (X2), income (X3), ethnicity (X4), jobs (X5), and family responsibility (X6), with the fitted model: log(p/(1-p)) = -1.778 + 0.021X1 + 0.028X2 - 0.213X3 + 1.986X4 + 2.930X5 - 0.455X6. From this equation, the positively related attributes are X1 (age), X2 (education), X4 (ethnicity), and X5 (jobs), while X3 (income) and X6 (family responsibility) are negatively related. Since the significance values were <0.05, the independent variables influenced the dependent variable; the factors significantly affecting the success rate of the red onion development program in Kampar regency were X2 (education), X4 (ethnicity), X5 (jobs), and X6 (family responsibility).
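
    A brief sketch of how such a logit model is estimated, with simulated data standing in for the farmer survey (the coefficients above seed the simulation; n is enlarged beyond the survey's 100 respondents for a stable fit):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 500
        X = rng.normal(size=(n, 6))               # X1..X6, standardized here
        beta = np.array([0.021, 0.028, -0.213, 1.986, 2.930, -0.455])
        p = 1 / (1 + np.exp(-(-1.778 + X @ beta)))
        y = rng.binomial(1, p)

        fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
        print(fit.params)                         # rough recovery of the betas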

  19. New Factorization Techniques and Fast Serial and Parallel Algorithms for Operational Space Control of Robot Manipulators

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Djouani, Karim; Fried, George; Pontnau, Jean

    1997-01-01

    In this paper, a new factorization technique is presented for computing the inverse of the mass matrix and the operational space mass matrix, as they arise in the implementation of the operational space control scheme.

  20. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  1. Decision Analysis Techniques for Adult Learners: Application to Leadership

    ERIC Educational Resources Information Center

    Toosi, Farah

    2017-01-01

    Most decision analysis techniques are not taught at higher education institutions. Leaders, project managers and procurement agents in industry have strong technical knowledge, and it is crucial for them to apply this knowledge at the right time to make critical decisions. There are uncertainties, problems, and risks involved in business…

  2. Analysis of questioning technique during classes in medical education.

    PubMed

    Cho, Young Hye; Lee, Sang Yeoup; Jeong, Dong Wook; Im, Sun Ju; Choi, Eun Jung; Lee, Sun Hee; Baek, Sun Yong; Kim, Yun Jin; Lee, Jeong Gyu; Yi, Yu Hyone; Bae, Mi Jin; Yune, So Jung

    2012-06-12

    Questioning is one of the essential techniques used by lecturers to make lectures more interactive and effective. This study surveyed the perception of questioning techniques by medical school faculty members and analyzed how the questioning technique is used in actual classes. Data on the perceptions of the questioning skills used during lectures was collected using a self-questionnaire for faculty members (N = 33) during the second semester of 2008. The questionnaire consisted of 18 items covering the awareness and characteristics of questioning skills. Recorded video tapes were used to observe the faculty members' questioning skills. Most faculty members regarded the questioning technique during classes as being important and expected positive outcomes in terms of the students' participation in class, concentration in class and understanding of the class contents. In the 99 classes analyzed, the median number of questions per class was 1 (0-29). Among them, 40 classes (40.4%) did not use questioning techniques. The frequency of questioning per lecture was similar regardless of the faculty members' perception. On the other hand, the faculty members perceived that their usual wait time after a question was approximately 10 seconds, compared to only 2.5 seconds measured from video analysis. More lecture-experienced faculty members tended to ask more questions in class. There were some discrepancies regarding the questioning technique between the faculty members' perceptions and reality, even though they had positive opinions of the technique. The questioning skills during a lecture need to be emphasized to faculty members.

  3. A double sealing technique for increasing the precision of headspace-gas chromatographic analysis.

    PubMed

    Xie, Wei-Qi; Yu, Kong-Xian; Gong, Yi-Xian

    2018-01-19

    This paper investigates a new double sealing technique for increasing the precision of the headspace gas chromatographic method. The air leakage caused by the high pressure in the headspace vial during the headspace sampling process has a great impact on the measurement precision of conventional headspace analysis (i.e., the single sealing technique). The results (using ethanol solution as the model sample) show that the present technique is effective in minimizing this problem. The double sealing technique has excellent measurement precision (RSD < 0.15%) and accuracy (recovery = 99.1%-100.6%) for ethanol quantification. The detection precision of the present method was 10-20 times higher than that of earlier HS-GC work that used the conventional single sealing technique. The present double sealing technique may open up a new avenue, and also serve as a general strategy, for improving the performance (i.e., accuracy and precision) of headspace analysis of various volatile compounds. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Using BMDP and SPSS for a Q factor analysis.

    PubMed

    Tanner, B A; Koning, S M

    1980-12-01

    While Euclidean distances and Q factor analysis may sometimes be preferred to correlation coefficients and cluster analysis for developing a typology, commercially available software does not always facilitate their use. Commands are provided for using BMDP and SPSS in a Q factor analysis with Euclidean distances.

  5. Analysis of significant factors for dengue fever incidence prediction.

    PubMed

    Siriyasatien, Padet; Phumee, Atchara; Ongruk, Phatsavee; Jampachaisri, Katechan; Kesorn, Kraisak

    2016-04-16

    Many popular dengue forecasting techniques have been used by several researchers to extrapolate dengue incidence rates, including the K-H model, support vector machines (SVM), and artificial neural networks (ANN). The time series analysis methodology, particularly ARIMA and SARIMA, has been increasingly applied to the field of epidemiological research for dengue fever, dengue hemorrhagic fever, and other infectious diseases. The main drawback of these methods is that they do not consider other variables that are associated with the dependent variable. Additionally, new factors correlated to the disease are needed to enhance the prediction accuracy of the model when it is applied to areas of similar climates, where weather factors such as temperature, total rainfall, and humidity are not substantially different. Such drawbacks may consequently lower the predictive power for the outbreak. The predictive power of the forecasting model-assessed by Akaike's information criterion (AIC), Bayesian information criterion (BIC), and the mean absolute percentage error (MAPE)-is improved by including the new parameters for dengue outbreak prediction. This study's selected model outperforms all three other competing models with the lowest AIC, the lowest BIC, and a small MAPE value. The exclusive use of climate factors from similar locations decreases a model's prediction power. The multivariate Poisson regression, however, effectively forecasts even when climate variables are slightly different. Female mosquitoes and seasons were strongly correlated with dengue cases. Therefore, the dengue incidence trends provided by this model will assist the optimization of dengue prevention. The present work demonstrates the important roles of female mosquito infection rates from the previous season and climate factors (represented as seasons) in dengue outbreaks. Incorporating these two factors in the model significantly improves the predictive power of dengue hemorrhagic fever forecasting
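
    A minimal sketch of a multivariate Poisson regression of the kind described, modeling monthly case counts from a rainy-season indicator and the prior season's female-mosquito infection rate; all data and the variable coding are simulated assumptions:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(8)
        n = 120                                    # ten years of monthly counts
        rainy = rng.integers(0, 2, n).astype(float)
        infection_prev = rng.uniform(0, 1, n)      # prior-season infection rate
        mu = np.exp(1.0 + 0.4 * rainy + 1.2 * infection_prev)
        cases = rng.poisson(mu)

        X = sm.add_constant(np.c_[rainy, infection_prev])
        fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()
        print(fit.params)                          # near (1.0, 0.4, 1.2)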

  6. A comparative analysis of soft computing techniques for gene prediction.

    PubMed

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. Likelihood-Based Confidence Intervals in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Oort, Frans J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…

  8. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    PubMed Central

    Wang, Chuji; Sahay, Peeyush

    2009-01-01

    Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have high-sensitivity and high-selectivity, as equivalently offered by the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC function. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions and the detection limits achieved by the laser techniques range from parts per million to parts per billion levels. Sensors using the laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide, nitric oxide, etc. are commercially available. This review presents an update on the latest developments in laser-based breath analysis. PMID:22408503

  9. Techniques for Analysis of DSN 64-meter Antenna Azimuth Bearing Film Height Records

    NASA Technical Reports Server (NTRS)

    Stevens, R.; Quach, C. T.

    1983-01-01

    The DSN 64-m antennas use oil pad azimuth thrust bearings. Instrumentation on the bearing pads measures the height of the oil film between the pad and the bearing runner. Techniques to analyze the film height record are developed and discussed. The analysis techniques present the unwieldy data in a compact form for assessment of bearing condition. The techniques are illustrated by analysis of a small sample of film height records from each of the three 64-m antennas. The results show the general condition of the bearings of DSS 43 and DSS 63 as good to excellent, and that of DSS 14 as marginal.

  10. Comparison of Spares Logistics Analysis Techniques for Long Duration Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Owens, Andrew; de Weck, Olivier; Mattfeld, Bryan; Stromgren, Chel; Cirillo, William

    2015-01-01

    As the durations and distances involved in human exploration missions increase, the logistics associated with repair and maintenance become more challenging. Whereas the operation of the International Space Station (ISS) depends upon regular resupply from the Earth, this paradigm may not be feasible for future missions. Longer mission durations result in higher probabilities of component failures as well as higher uncertainty regarding which components may fail, and longer distances from Earth increase the cost of resupply and reduce the speed at which the crew can abort to Earth in the event of an emergency. As such, mission development efforts must take into account the logistics requirements associated with maintenance and spares. Accurate prediction of the spare parts demand for a given mission plan, and of how that demand changes as a result of changes to the system architecture, enables full consideration of the lifecycle cost associated with different options. In this paper, we utilize a range of analysis techniques - Monte Carlo, semi-Markov, binomial, and heuristic - to examine the relationship between the mass of spares and the probability of loss of function related to the Carbon Dioxide Removal System (CRS) for a notional, simplified mission profile. The Exploration Maintainability Analysis Tool (EMAT), developed at NASA Langley Research Center, is utilized for the Monte Carlo analysis. We discuss the implications of these results and the features and drawbacks of each method. In particular, we identify the limitations of heuristic methods for logistics analysis, and the additional insights provided by more in-depth techniques. We discuss the potential impact of system complexity on each technique, as well as their respective abilities to examine dynamic events. This work is the first step in an effort that will quantitatively examine how well these techniques handle increasingly more complex systems by gradually expanding the system boundary.
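
    A sketch of the simplest of the techniques compared above, a Poisson-style sufficiency calculation: with a constant failure rate, the probability that k spares cover all failures of one component over the mission is a Poisson tail sum. The rate and duration below are illustrative, not CRS data:

        from math import exp, factorial

        def p_sufficient(lam, t, k):
            """P(failures <= k) for a Poisson(lam * t) failure process."""
            m = lam * t
            return sum(m**i * exp(-m) / factorial(i) for i in range(k + 1))

        lam = 1 / 8760.0          # one expected failure per year [1/h]
        t = 3 * 8760.0            # three-year mission [h]
        for k in range(7):
            print(k, round(p_sufficient(lam, t, k), 4))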

  11. Advanced analysis technique for the evaluation of linear alternators and linear motors

    NASA Technical Reports Server (NTRS)

    Holliday, Jeffrey C.

    1995-01-01

    A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.

  12. Use of different spectroscopic techniques in the analysis of Roman age wall paintings.

    PubMed

    Agnoli, Francesca; Calliari, Irene; Mazzocchin, Gian-Antonio

    2007-01-01

    In this paper the analysis of samples of Roman age wall paintings coming from Pordenone, Vicenza, and Verona is carried out using three different techniques: energy dispersive X-ray spectroscopy (EDS), X-ray fluorescence (XRF), and proton-induced X-ray emission (PIXE). The features of the three spectroscopic techniques in the analysis of samples of archaeological interest are discussed. The studied pigments were: cinnabar, yellow ochre, green earth, Egyptian blue and carbon black.

  13. A multiple technique approach to the analysis of urinary calculi.

    PubMed

    Rodgers, A L; Nassimbeni, L R; Mulder, K J

    1982-01-01

    10 urinary calculi have been qualitatively and quantitatively analysed using X-ray diffraction, infra-red, scanning electron microscopy, X-ray fluorescence, atomic absorption and density gradient procedures. Constituents and compositional features which often go undetected due to limitations in the particular analytical procedure being used, have been identified and a detailed picture of each stone's composition and structure has been obtained. In all cases at least two components were detected suggesting that the multiple technique approach might cast some doubt as to the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism has been detected. In addition, the usefulness of each technique in the analysis of urinary stones has been assessed and the multiple technique approach has been evaluated as a whole.

  14. Item Factor Analysis: Current Approaches and Future Directions

    ERIC Educational Resources Information Center

    Wirth, R. J.; Edwards, Michael C.

    2007-01-01

    The rationale underlying factor analysis applies to continuous and categorical variables alike; however, the models and estimation methods for continuous (i.e., interval or ratio scale) data are not appropriate for item-level data that are categorical in nature. The authors provide a targeted review and synthesis of the item factor analysis (IFA)…

  15. Q-Type Factor Analysis of Healthy Aged Men.

    ERIC Educational Resources Information Center

    Kleban, Morton H.

    Q-type factor analysis was used to re-analyze baseline data collected in 1957, on 47 men aged 65-91. Q-type analysis is the use of factor methods to study persons rather than tests. Although 550 variables were originally studied involving psychiatry, medicine, cerebral metabolism and chemistry, personality, audiometry, dichotic and diotic memory,…

  16. Creep-Rupture Data Analysis - Engineering Application of Regression Techniques. Ph.D. Thesis - North Carolina State Univ.

    NASA Technical Reports Server (NTRS)

    Rummler, D. R.

    1976-01-01

    The results are presented of investigations to apply regression techniques to the development of methodology for creep-rupture data analysis. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.

  17. Text mining factor analysis (TFA) in green tea patent data

    NASA Astrophysics Data System (ADS)

    Rahmawati, Sela; Suprijadi, Jadi; Zulhanif

    2017-03-01

    Factor analysis has become one of the most widely used multivariate statistical procedures in applied research endeavors across a multitude of domains. There are two main types of analyses based on factor analysis: Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA). Both EFA and CFA aim to model observed relationships among a group of indicators with a latent variable, but they differ fundamentally in the a priori restrictions placed on the factor model. This method is applied to patent data for the green tea technology sector to trace the development of green tea technology worldwide. Patent analysis is useful in identifying future technological trends in a specific field of technology. The patent database was obtained from the European Patent Organization (EPO). In this paper, the CFA model is applied to nominal data obtained from a presence-absence matrix; the CFA for nominal data is based on the tetrachoric correlation matrix. Meanwhile, the EFA model is applied to titles from the dominant technology sector, after pre-processing with text mining analysis.

  18. Technique for information retrieval using enhanced latent semantic analysis generating rank approximation matrix by factorizing the weighted morpheme-by-document matrix

    DOEpatents

    Chew, Peter A; Bader, Brett W

    2012-10-16

    A technique for information retrieval includes parsing a corpus to identify a number of wordform instances within each document of the corpus. A weighted morpheme-by-document matrix is generated based at least in part on the number of wordform instances within each document of the corpus and based at least in part on a weighting function. The weighted morpheme-by-document matrix separately enumerates instances of stems and affixes. Additionally or alternatively, a term-by-term alignment matrix may be generated based at least in part on the number of wordform instances within each document of the corpus. At least one lower rank approximation matrix is generated by factorizing the weighted morpheme-by-document matrix and/or the term-by-term alignment matrix.
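
    A sketch of the latent-semantic-analysis pipeline the patent builds on: a weighted term-by-document matrix factorized with a truncated SVD into a lower-rank approximation. The morpheme-level (stem/affix) weighting that the patent claims is not reproduced; plain TF-IDF stands in for it:

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD

        corpus = ["walk walked walking", "run running runs", "walk run"]
        A = TfidfVectorizer().fit_transform(corpus)  # weighted term-by-document
        svd = TruncatedSVD(n_components=2, random_state=0)
        docs_k = svd.fit_transform(A)                # rank-2 document vectors
        print(docs_k)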

  19. Mixture Factor Analysis for Approximating a Nonnormally Distributed Continuous Latent Factor with Continuous and Dichotomous Observed Variables

    ERIC Educational Resources Information Center

    Wall, Melanie M.; Guo, Jia; Amemiya, Yasuo

    2012-01-01

    Mixture factor analysis is examined as a means of flexibly estimating nonnormally distributed continuous latent factors in the presence of both continuous and dichotomous observed variables. A simulation study compares mixture factor analysis with normal maximum likelihood (ML) latent factor modeling. Different results emerge for continuous versus…

  20. BaTMAn: Bayesian Technique for Multi-image Analysis

    NASA Astrophysics Data System (ADS)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.

  1. Passive fishing techniques: a cause of turtle mortality in the Mississippi River

    USGS Publications Warehouse

    Barko, V.A.; Briggler, J.T.; Ostendorf, D.E.

    2004-01-01

    We investigated variation of incidentally captured turtle mortality in response to environmental factors and passive fishing techniques. We used Long Term Resource Monitoring Program (LTRMP) data collected from 1996 to 2001 in the unimpounded upper Mississippi River (UMR) adjacent to Missouri and Illinois, USA. We used a principal components analysis (PCA) and a stepwise discriminant function analysis to identify factors correlated with mortality of captured turtles. Furthermore, we were interested in what percentage of turtles died from passive fishing techniques and what techniques caused the most turtle mortality. The main factors influencing captured turtle mortality were water temperature and depth at net deployment. Fyke nets captured the most turtles and caused the most turtle mortality. Almost 90% of mortalities occurred in offshore aquatic areas (i.e., side channel or tributary). Our results provide information on causes of turtle mortality (as bycatch) in a riverine system and implications for river turtle conservation by suggesting management strategies to reduce turtle bycatch and decrease mortality of captured turtles.

  3. Common factor analysis versus principal component analysis: choice for symptom cluster research.

    PubMed

    Kim, Hee-Ju

    2008-03-01

    The purpose of this paper is to examine the differences between two factor-analytic methods and their relevance for symptom cluster research: common factor analysis (CFA) versus principal component analysis (PCA). Literature was critically reviewed to elucidate the differences between CFA and PCA. A secondary analysis (N = 84) was used to show the actual differences in results between the two methods. CFA analyzes only the reliable common variance of the data, while PCA analyzes all of the variance. An underlying hypothetical process or construct is involved in CFA but not in PCA. PCA tends to inflate factor loadings, especially in a study with a small number of variables and/or low estimated communality; thus, PCA is not appropriate for examining the structure of data. If the study purpose is to explain correlations among variables and to examine the structure of the data (as is usual in symptom cluster research), CFA provides the more accurate result. If the purpose of a study is to summarize data with a smaller number of variables, PCA is the choice. PCA can also be used as an initial step in CFA because it provides information regarding the maximum number and nature of factors. In using factor analysis for symptom cluster research, several issues need to be considered, including subjectivity of the solution, sample size, symptom selection, and level of measurement.
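
    The distinction is easy to see numerically. A minimal sketch contrasting the two methods on simulated symptom data (sklearn's PCA and FactorAnalysis; the loadings below are invented):

      import numpy as np
      from sklearn.decomposition import PCA, FactorAnalysis

      rng = np.random.default_rng(0)
      # Two latent symptom clusters plus item-specific noise.
      latent = rng.normal(size=(200, 2))
      loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.0, 0.9], [0.1, 0.7]])
      X = latent @ loadings.T + 0.5 * rng.normal(size=(200, 4))

      # PCA decomposes the total variance; FA models only the common
      # variance, estimating a unique (item-specific) variance per item.
      pca = PCA(n_components=2).fit(X)
      fa = FactorAnalysis(n_components=2).fit(X)
      print(pca.components_)     # component weights (total-variance based)
      print(fa.components_)      # factor loadings (common-variance based)
      print(fa.noise_variance_)  # unique variances, which PCA does not model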

  4. Potential barriers to the application of multi-factor portfolio analysis in public hospitals: evidence from a pilot study in the Netherlands.

    PubMed

    Pavlova, Milena; Tsiachristas, Apostolos; Vermaeten, Gerhard; Groot, Wim

    2009-01-01

    Portfolio analysis is a business management tool that can assist health care managers in developing new organizational strategies. The application of portfolio analysis to US hospital settings has been frequently reported. In Europe, however, the application of this technique has received little attention, especially concerning public hospitals. Therefore, this paper examines the peculiarities of portfolio analysis and its applicability to the strategic management of European public hospitals. The analysis is based on a pilot application of a multi-factor portfolio analysis in a Dutch university hospital. The nature of portfolio analysis and the steps in a multi-factor portfolio analysis are reviewed, along with the characteristics of the research setting. Based on these data, a multi-factor portfolio model is developed and operationalized. The portfolio model is applied in a pilot investigation to analyze market attractiveness and hospital strengths with regard to the provision of three orthopedic services: knee surgery, hip surgery, and arthroscopy. The pilot portfolio analysis is discussed to draw conclusions about potential barriers to the overall adoption of portfolio analysis in the management of a public hospital. Copyright (c) 2008 John Wiley & Sons, Ltd.
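
    In a multi-factor portfolio model of this kind, each axis of the portfolio grid is a weighted sum of factor ratings for a service line. A minimal sketch; the factors, ratings, and weights are invented for illustration and are not taken from the study:

      # Hypothetical factor ratings (1-5) and weights for one service line.
      market_factors = {            # market attractiveness axis
          "demand growth": (4, 0.4),
          "competitive intensity": (2, 0.3),
          "reimbursement level": (3, 0.3),
      }
      strength_factors = {          # hospital strength axis
          "clinical expertise": (5, 0.5),
          "capacity utilisation": (3, 0.3),
          "referral network": (4, 0.2),
      }

      def weighted_score(factors):
          # Axis score = sum of rating * weight over all factors.
          return sum(rating * weight for rating, weight in factors.values())

      # The two scores place the service (e.g. knee surgery) on the grid.
      print(weighted_score(market_factors), weighted_score(strength_factors))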

  5. Reduced-Smoke Solid Propellant Combustion Products Analysis. Development of a Micromotor Combustor Technique.

    DTIC Science & Technology

    1976-10-01

    A low-cost micromotor combustor technique has been devised to support the development of reduced-smoke solid propellant formulations. The technique...includes a simple, reusable micromotor capable of high chamber pressures, a combustion products collection system, and procedures for analysis of

  6. Density-cluster NMA: A new protein decomposition technique for coarse-grained normal mode analysis.

    PubMed

    Demerdash, Omar N A; Mitchell, Julie C

    2012-07-01

    Normal mode analysis has emerged as a useful technique for investigating protein motions on long time scales. This is largely due to the advent of coarse-graining techniques, particularly Hooke's Law-based potentials and the rotational-translational blocking (RTB) method for reducing the size of the force-constant matrix, the Hessian. Here we present a new method for domain decomposition for use in RTB that is based on hierarchical clustering of atomic density gradients, which we call Density-Cluster RTB (DCRTB). The method reduces the number of degrees of freedom by 85-90% compared with the standard blocking approaches. We compared the normal modes from DCRTB against standard RTB using 1-4 residues in sequence in a single block, with good agreement between the two methods. We also show that Density-Cluster RTB and standard RTB perform well in capturing the experimentally determined direction of conformational change. Significantly, we report superior correlation of DCRTB with B-factors compared with 1-4 residue per block RTB. Finally, we show significant reduction in computational cost for Density-Cluster RTB that is nearly 100-fold for many examples. Copyright © 2012 Wiley Periodicals, Inc.
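
    The blocking step can be conveyed in outline: cluster atoms hierarchically, then treat each cluster as a rigid block with six degrees of freedom. A sketch (scipy; random features stand in for the atomic density gradients, and the choice of 8 blocks is arbitrary):

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      # Toy stand-in for atomic density-gradient features (N atoms x 3).
      rng = np.random.default_rng(1)
      features = rng.normal(size=(60, 3))

      # Hierarchical (Ward) clustering groups atoms into rigid blocks; in an
      # RTB scheme each block keeps only 6 rigid-body degrees of freedom,
      # shrinking the Hessian that normal mode analysis diagonalises.
      Z = linkage(features, method="ward")
      blocks = fcluster(Z, t=8, criterion="maxclust")  # 8 blocks -> 48 DOF
      print(np.bincount(blocks)[1:])                   # atoms per block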

  7. Scanning angle Raman spectroscopy: Investigation of Raman scatter enhancement techniques for chemical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Matthew W.

    2013-01-01

    This thesis outlines advancements in Raman scatter enhancement techniques by applying evanescent fields, standing waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include the inherently weak Raman cross section and susceptibility to fluorescence interference.

  8. Derived Basic Ability Factors: A Factor Analysis Replication Study.

    ERIC Educational Resources Information Center

    Lee, Mickey M.; Lee, Lynda Newby

    The purpose of this study was to replicate the study conducted by Potter, Sagraves, and McDonald to determine whether their recommended analysis could separate criterion variables into similar factors that were stable from year to year and from school to school. The replication samples consisted of all students attending Louisiana State University…

  9. Logistic regression analysis of risk factors for postoperative recurrence of spinal tumors and analysis of prognostic factors.

    PubMed

    Zhang, Shanyong; Yang, Lili; Peng, Chuangang; Wu, Minfei

    2018-02-01

    The aim of the present study was to investigate the risk factors for postoperative recurrence of spinal tumors by logistic regression analysis and to analyze prognostic factors. In total, 77 male and 48 female patients with spinal tumor were selected in our hospital from January 2010 to December 2015 and divided into the benign (n=76) and malignant (n=49) groups. All the patients underwent microsurgical resection of spinal tumors and were reviewed regularly 3 months after operation. The McCormick grading system was used to evaluate postoperative spinal cord function. Data were subjected to statistical analysis. Of the 125 cases, 63 showed improvement after operation, 50 were stable, and deterioration was found in 12. The improvement rate of patients with cervical spine tumor, at 56.3%, was the highest. Fifty-two cases of sensory disturbance, 34 cases of pain, 30 cases of inability to exercise, 26 cases of ataxia, and 12 cases of sphincter disorders were found after operation. Seventy-two cases (57.6%) underwent total resection, 18 (14.4%) received subtotal resection, 23 (18.4%) received partial resection, and 12 (9.6%) were treated with biopsy/decompression only. Postoperative recurrence was found in 57 cases (45.6%). The mean recurrence time was 27.49±6.09 months in the malignant group and 40.62±4.34 months in the benign group, a significant difference (P<0.001). Recurrence was found in 18 cases of the benign group and 39 cases of the malignant group, also a significant difference (P<0.001). Recurrence occurred sooner in patients with a higher McCormick grade (P<0.001). Recurrence was found in 13 patients with total resection and in all the patients with partial resection or biopsy/decompression, a significant difference (P<0.001). Logistic regression analysis of total resection-related factors showed that total resection

  10. Logistic regression analysis of risk factors for postoperative recurrence of spinal tumors and analysis of prognostic factors

    PubMed Central

    Zhang, Shanyong; Yang, Lili; Peng, Chuangang; Wu, Minfei

    2018-01-01

    The aim of the present study was to investigate the risk factors for postoperative recurrence of spinal tumors by logistic regression analysis and to analyze prognostic factors. In total, 77 male and 48 female patients with spinal tumor were selected in our hospital from January 2010 to December 2015 and divided into the benign (n=76) and malignant (n=49) groups. All the patients underwent microsurgical resection of spinal tumors and were reviewed regularly 3 months after operation. The McCormick grading system was used to evaluate postoperative spinal cord function. Data were subjected to statistical analysis. Of the 125 cases, 63 showed improvement after operation, 50 were stable, and deterioration was found in 12. The improvement rate of patients with cervical spine tumor, at 56.3%, was the highest. Fifty-two cases of sensory disturbance, 34 cases of pain, 30 cases of inability to exercise, 26 cases of ataxia, and 12 cases of sphincter disorders were found after operation. Seventy-two cases (57.6%) underwent total resection, 18 (14.4%) received subtotal resection, 23 (18.4%) received partial resection, and 12 (9.6%) were treated with biopsy/decompression only. Postoperative recurrence was found in 57 cases (45.6%). The mean recurrence time was 27.49±6.09 months in the malignant group and 40.62±4.34 months in the benign group, a significant difference (P<0.001). Recurrence was found in 18 cases of the benign group and 39 cases of the malignant group, also a significant difference (P<0.001). Recurrence occurred sooner in patients with a higher McCormick grade (P<0.001). Recurrence was found in 13 patients with total resection and in all the patients with partial resection or biopsy/decompression, a significant difference (P<0.001). Logistic regression analysis of total resection-related factors showed that total resection
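
    For context on the method itself, fitting a binary-outcome logistic model and reading exponentiated coefficients as odds ratios follows the same pattern regardless of the clinical variables. A minimal sketch with synthetic stand-ins for the predictors named above (statsmodels; not the study's data):

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 125
      # Hypothetical predictors: malignancy (0/1), total resection (0/1),
      # McCormick grade (1-4); outcome: recurrence (0/1).
      X = np.column_stack([
          rng.integers(0, 2, n),
          rng.integers(0, 2, n),
          rng.integers(1, 5, n),
      ]).astype(float)
      y = rng.integers(0, 2, n)

      model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
      print(model.params)          # log-odds coefficients
      print(np.exp(model.params))  # odds ratios for each risk factor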

  11. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
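
    The accuracy-versus-dimensions comparison can be reproduced in outline by sweeping the number of retained components for a fixed classifier. A sketch (sklearn; synthetic features in place of object-code features, and naive Bayes is an arbitrary stand-in for the paper's simple classifier):

      from sklearn.datasets import make_classification
      from sklearn.decomposition import PCA
      from sklearn.model_selection import cross_val_score
      from sklearn.naive_bayes import GaussianNB
      from sklearn.pipeline import make_pipeline

      # Synthetic stand-in for object-code feature vectors.
      X, y = make_classification(n_samples=400, n_features=64,
                                 n_informative=10, random_state=0)

      # Accuracy of a simple classifier as the number of retained
      # principal components grows.
      for k in (2, 5, 10, 20, 40):
          clf = make_pipeline(PCA(n_components=k), GaussianNB())
          acc = cross_val_score(clf, X, y, cv=5).mean()
          print(k, round(acc, 3))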

  12. Individual and organizational factors related to community clinicians’ use of therapy techniques in a large public mental health system

    PubMed Central

    Marcus, Steven; Aarons, Gregory A.; Hoagwood, Kimberly E.; Schoenwald, Sonja; Evans, Arthur C.; Hurford, Matthew O.; Hadley, Trevor; Barg, Frances K.; Walsh, Lucia M.; Adams, Danielle R.; Mandell, David S.

    2015-01-01

    Importance Few studies have examined the effects of both clinician and organizational characteristics on the use of evidence-based practices in mental healthcare. Improved understanding of these factors could guide future implementation efforts to ensure effective adoption, implementation, and sustainment of evidence-based practices. Objective To estimate the relative contributions of clinician and organizational factors to clinician self-reported use of cognitive-behavioral, family, and psychodynamic techniques within the context of a large-scale effort to increase the use of evidence-based practices in an urban public mental health system serving youth and families. Design Observational and cross-sectional. Data collected in 2013. Setting Twenty-three organizations. Participants We used purposive sampling to recruit the 29 largest child-serving agencies, which together serve approximately 80% of youth receiving publicly funded mental health care. The final sample included 19 agencies with 23 sites, 130 therapists, 36 supervisors, and 22 executive administrators. Main Outcome Measures Clinician self-reported use of cognitive-behavioral, family, and psychodynamic techniques, as measured by the Therapist Procedures Checklist – Family Revised. Results Linear mixed-effects regression models were used; models included random intercepts for organization to account for nesting of clinicians within organizations. Clinician factors accounted for the following percentages of the overall variation: cognitive-behavioral (16%), family (7%), psychodynamic (20%). Organizational factors accounted for the following percentages: cognitive-behavioral (23%), family (19%), psychodynamic (7%). Older clinicians and clinicians with more open attitudes were more likely to endorse use of cognitive-behavioral techniques, as were those in organizations that had spent fewer years participating in evidence-based practice initiatives, had more resistant cultures, and had
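
    The clinician-versus-organization variance split reported here comes from a random-intercept model: clinicians are nested in organizations, and between-organization intercept variance is compared with clinician-level residual variance. A minimal sketch (statsmodels; simulated data and illustrative variable names):

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(3)
      n_orgs, per_org = 20, 8
      org = np.repeat(np.arange(n_orgs), per_org)
      org_effect = rng.normal(0, 1, n_orgs)[org]     # organizational variance
      openness = rng.normal(size=n_orgs * per_org)   # clinician-level factor
      use = 0.5 * openness + org_effect + rng.normal(size=n_orgs * per_org)

      df = pd.DataFrame({"use": use, "openness": openness, "org": org})
      # Random intercept for organization accounts for the nesting.
      m = smf.mixedlm("use ~ openness", df, groups=df["org"]).fit()
      print(m.summary())  # fixed effect + between-organization variance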

  13. On the Likelihood Ratio Test for the Number of Factors in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Bentler, Peter M.; Yuan, Ke-Hai

    2007-01-01

    In exploratory factor analysis, when the number of factors exceeds the true number of factors, the likelihood ratio test statistic no longer follows the chi-square distribution, due to rank deficiency and nonidentifiability of model parameters. As a result, decisions regarding the number of factors may be incorrect. Several…
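
    For intuition, the likelihood ratio machinery is simple to sketch: fit models with k and k+1 factors, take twice the log-likelihood difference, and refer it to a chi-square whose degrees of freedom equal the extra free parameters (p - k once rotational indeterminacy is accounted for). A simplified illustration (sklearn; it compares nested factor models rather than the usual test against the saturated model, and the abstract's caveat that the chi-square reference fails beyond the true number of factors applies here too):

      import numpy as np
      from scipy import stats
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(4)
      n, p, k_true = 500, 8, 2
      L = rng.normal(size=(p, k_true))
      X = rng.normal(size=(n, k_true)) @ L.T + rng.normal(size=(n, p))

      def loglik(k):
          # FactorAnalysis.score returns the mean log-likelihood per sample.
          fa = FactorAnalysis(n_components=k).fit(X)
          return n * fa.score(X)

      for k in (1, 2, 3):
          lr = 2 * (loglik(k + 1) - loglik(k))
          df = p - k  # extra free loadings after rotational indeterminacy
          print(f"{k} -> {k+1}: LR = {lr:.1f}, p = {stats.chi2.sf(lr, df):.4f}")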

  14. A Brief History of the Philosophical Foundations of Exploratory Factor Analysis.

    ERIC Educational Resources Information Center

    Mulaik, Stanley A.

    1987-01-01

    Exploratory factor analysis derives its key ideas from many sources, including Aristotle, Francis Bacon, Descartes, Pearson and Yule, and Kant. The conclusions of exploratory factor analysis are never complete without subsequent confirmatory factor analysis. (Author/GDC)

  15. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    PubMed Central

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, sit at the top of the evidence-based practice hierarchy. They are important tools for drug approval, the formulation of clinical protocols and guidelines, and decision-making. However, this traditional technique yields only part of the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. Regardless of the clinical condition under evaluation, many interventions are usually available on the market, and few of them have been studied in head-to-head trials. This scenario precludes conclusions from being drawn about the full profile (e.g. efficacy and safety) of all interventions. The recent development and introduction of a new technique, usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons, has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all the assumptions of pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of statistical methods, assumptions, and steps for performing the analysis. PMID:28503228
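
    The core of indirect comparison is compact enough to show directly: with a common comparator B, an A-versus-C estimate is the difference of the two direct effects, and the variances add (the Bucher method, which assumes transitivity; the numbers are invented):

      import numpy as np

      # Direct estimates (log odds ratios) from two sets of trials:
      d_AB, se_AB = 0.35, 0.10   # A vs B
      d_CB, se_CB = 0.10, 0.12   # C vs B

      # Indirect A-vs-C estimate via the common comparator B.
      d_AC = d_AB - d_CB
      se_AC = np.hypot(se_AB, se_CB)        # variances add
      lo, hi = d_AC - 1.96 * se_AC, d_AC + 1.96 * se_AC
      print(np.exp([d_AC, lo, hi]))         # odds ratio and 95% CI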

  16. SEM-PLS Analysis of Inhibiting Factors of Cost Performance for Large Construction Projects in Malaysia: Perspective of Clients and Consultants

    PubMed Central

    Memon, Aftab Hameed; Rahman, Ismail Abdul

    2014-01-01

    This study uncovered factors inhibiting cost performance in large construction projects in Malaysia. A questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped into 7 categories were presented to the respondents for rating the significance level of each factor. A total of 300 questionnaire forms were distributed; only 144 completed sets were received and analysed using the advanced multivariate Structural Equation Modelling software SmartPLS v2. The analysis involved three iterations in which several factors were deleted in order to make the model acceptable. The analysis found an R² value of 0.422 for the model, which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, the contractor's site management category is the most prominent in its effect on the cost performance of large construction projects. This finding is validated using advanced power analysis techniques. This rigorous multivariate analysis explicitly identified the significant category, which consists of several factors causing poor cost performance in large construction projects. This will benefit all parties involved in construction projects in controlling cost overrun. PMID:24693227

  17. SEM-PLS analysis of inhibiting factors of cost performance for large construction projects in Malaysia: perspective of clients and consultants.

    PubMed

    Memon, Aftab Hameed; Rahman, Ismail Abdul

    2014-01-01

    This study uncovered factors inhibiting cost performance in large construction projects in Malaysia. A questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped into 7 categories were presented to the respondents for rating the significance level of each factor. A total of 300 questionnaire forms were distributed; only 144 completed sets were received and analysed using the advanced multivariate Structural Equation Modelling software SmartPLS v2. The analysis involved three iterations in which several factors were deleted in order to make the model acceptable. The analysis found an R² value of 0.422 for the model, which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, the contractor's site management category is the most prominent in its effect on the cost performance of large construction projects. This finding is validated using advanced power analysis techniques. This rigorous multivariate analysis explicitly identified the significant category, which consists of several factors causing poor cost performance in large construction projects. This will benefit all parties involved in construction projects in controlling cost overrun.

  18. Analysis of filter tuning techniques for sequential orbit determination

    NASA Technical Reports Server (NTRS)

    Lee, T.; Yee, C.; Oza, D.

    1995-01-01

    This paper examines filter tuning techniques for a sequential orbit determination (OD) covariance analysis. Recently, there has been renewed interest in sequential OD, primarily due to the successful flight qualification of the Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) using Doppler data extracted onboard the Extreme Ultraviolet Explorer (EUVE) spacecraft. TONS computes highly accurate orbit solutions onboard the spacecraft in real time using a sequential filter. As a result of the successful TONS-EUVE flight qualification experiment, the Earth Observing System (EOS) AM-1 Project has selected TONS as the prime navigation system. In addition, sequential OD methods can be used successfully for ground OD. Whether data are processed onboard or on the ground, a sequential OD procedure is generally favored over a batch technique when a real-time automated OD system is desired. Recently, OD covariance analyses were performed for the TONS-EUVE and TONS-EOS missions using the sequential processing options of the Orbit Determination Error Analysis System (ODEAS). ODEAS is the primary covariance analysis system used by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD). The results of these analyses revealed a high sensitivity of the OD solutions to the state process noise filter tuning parameters. The covariance analysis results show that the state estimate error contributions from measurement-related error sources, especially those due to random noise and satellite-to-satellite ionospheric refraction correction errors, increase rapidly as the state process noise increases. These results prompted an in-depth investigation of the role of the filter tuning parameters in sequential OD covariance analysis. This paper analyzes how the spacecraft state estimate errors due to dynamic and measurement-related error sources are affected by the process noise level used. This information is then used to establish
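
    The sensitivity described here can be reproduced with a scalar toy filter: raising the process noise q inflates the predicted covariance, raises the Kalman gain, and lets more measurement noise pass into the state estimate. A minimal sketch (a hypothetical scalar random-walk model, not ODEAS):

      def steady_state_var(q, r, n=500):
          # Scalar Kalman covariance recursion for a random-walk state,
          # iterated until it settles; q is the process noise variance,
          # r the measurement noise variance.
          p = 1.0
          for _ in range(n):
              p_pred = p + q                 # time update
              k = p_pred / (p_pred + r)      # Kalman gain
              p = (1 - k) * p_pred           # measurement update
          return p

      # Larger process noise makes the filter lean on noisy measurements,
      # so the steady-state estimate error grows.
      for q in (1e-4, 1e-2, 1.0):
          print(q, steady_state_var(q, r=1.0))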

  19. The association of placenta previa and assisted reproductive techniques: a meta-analysis.

    PubMed

    Karami, Manoochehr; Jenabi, Ensiyeh; Fereidooni, Bita

    2018-07-01

    Several epidemiological studies have determined that assisted reproductive techniques (ART) can increase the risk of placenta previa. To date, only one meta-analysis has been performed to assess the relationship between placenta previa and ART. This meta-analysis was conducted to estimate the association between placenta previa and ART in singleton and twin pregnancies. A literature search was performed in the major databases PubMed, Web of Science, and Scopus from the earliest possible year to April 2017. Heterogeneity across studies was explored by the Q-test and the I² statistic. Publication bias was assessed using Begg's and Egger's tests. The results were reported as odds ratio (OR) and relative risk (RR) estimates with 95% confidence intervals (CI) using a random-effects model. The literature search yielded 1529 publications up to September 2016, with 1,388,592 participants. The overall estimate of the OR was 2.67 (95%CI: 2.01, 3.34) and the RR was 3.62 (95%CI: 0.21, 7.03) based on singleton pregnancies. The overall estimate of the OR was 1.50 (95%CI: 1.26, 1.74) based on twin pregnancies. Based on the odds ratios reported in observational studies, we showed that ART procedures are a risk factor for placenta previa.
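
    The random-effects pooling used in such a meta-analysis weights each study by the inverse of its within-study variance plus a between-study variance tau-squared, here estimated by the DerSimonian-Laird method. A minimal sketch (numpy; the per-study log odds ratios are invented):

      import numpy as np

      # Hypothetical per-study log odds ratios and standard errors.
      yi = np.array([0.9, 1.1, 0.7, 1.3])
      se = np.array([0.30, 0.25, 0.40, 0.35])

      w = 1 / se**2                                   # fixed-effect weights
      q = np.sum(w * (yi - np.sum(w * yi) / w.sum())**2)   # Cochran's Q
      tau2 = max(0.0, (q - (len(yi) - 1))
                 / (w.sum() - (w**2).sum() / w.sum()))     # DL estimator
      w_re = 1 / (se**2 + tau2)                       # random-effects weights
      pooled = np.sum(w_re * yi) / w_re.sum()
      se_pooled = np.sqrt(1 / w_re.sum())
      print(np.exp(pooled),                           # pooled OR
            np.exp(pooled - 1.96 * se_pooled),        # 95% CI lower
            np.exp(pooled + 1.96 * se_pooled))        # 95% CI upper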

  20. Hand function evaluation: a factor analysis study.

    PubMed

    Jarus, T; Poremba, R

    1993-05-01

    The purpose of this study was to investigate hand function evaluations. Factor analysis with varimax rotation was used to assess the fundamental characteristics of the items included in the Jebsen Hand Function Test and the Smith Hand Function Evaluation. The study sample consisted of 144 subjects without disabilities and 22 subjects with Colles fracture. Results suggest a four-factor solution: Factor I--pinch movement; Factor II--grasp; Factor III--target accuracy; and Factor IV--activities of daily living. These categories differentiated the subjects without Colles fracture from the subjects with Colles fracture. A hand function evaluation consisting of these four factors would be useful. Such an evaluation, suitable for current clinical use, is provided.

  1. Recent development in mass spectrometry and its hyphenated techniques for the analysis of medicinal plants.

    PubMed

    Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan

    2018-04-23

    Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their role as a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, and they are highly dependent on comprehensive analysis of the chemical components of the medicinal plants. With advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs of the analysis of medicinal plants. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and their applications in the comprehensive analysis of medicinal plants. Applications of various MS and hyphenated MS techniques to the analysis of medicinal plants, including but not limited to one-dimensional and multiple-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, have been reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants due to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a great role as the major platform for further research aimed at gaining insight into both their empirical therapeutic efficacy and quality control. Copyright © 2018 John Wiley & Sons, Ltd.

  2. Flame analysis using image processing techniques

    NASA Astrophysics Data System (ADS)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques, combined with fuzzy logic and neural network approaches, to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermal acoustic oscillations and background noise affect the stability of the flame, and flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis from which flame stability can be approximated; however, a more intelligent diagnostic system is needed to determine flame stability automatically. In this paper, flame features at different flow rates are compared and analyzed, and the selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
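
    Estimating a PSD from a flame-intensity time series is the step the stability assessment rests on. A minimal sketch (scipy's Welch estimator on a synthetic flicker signal; the 120 Hz component and sampling rate are invented):

      import numpy as np
      from scipy.signal import welch

      fs = 1000.0                        # sampling frequency, Hz
      t = np.arange(0, 2.0, 1 / fs)
      # Toy luminous-intensity signal: a 120 Hz flicker standing in for a
      # thermo-acoustic oscillation, plus background noise.
      rng = np.random.default_rng(5)
      sig = np.sin(2 * np.pi * 120 * t) + 0.5 * rng.normal(size=t.size)

      f, psd = welch(sig, fs=fs, nperseg=512)
      print(f[np.argmax(psd)])           # dominant frequency, close to 120 Hz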

  3. Plasma spectroscopy analysis technique based on optimization algorithms and spectral synthesis for arc-welding quality assurance.

    PubMed

    Mirapeix, J; Cobo, A; González, D A; López-Higuera, J M

    2007-02-19

    A new plasma spectroscopy analysis technique based on the generation of synthetic spectra by means of optimization processes is presented in this paper. The technique has been developed for its application in arc-welding quality assurance. The new approach has been checked through several experimental tests, yielding results in reasonably good agreement with the ones offered by the traditional spectroscopic analysis technique.
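
    The idea of fitting a synthetic spectrum by optimization can be sketched in a few lines: parametrize the emission lines, then minimize the residual against the measured spectrum. A simplified illustration (Gaussian line profiles and scipy's least_squares; the line positions are invented and not tied to any welding plasma):

      import numpy as np
      from scipy.optimize import least_squares

      wl = np.linspace(400, 420, 400)    # wavelength grid, nm

      def synthetic(params, wl):
          # Sum of two Gaussian emission lines: (amplitude, centre, width) x2.
          a1, c1, s1, a2, c2, s2 = params
          return (a1 * np.exp(-0.5 * ((wl - c1) / s1) ** 2)
                  + a2 * np.exp(-0.5 * ((wl - c2) / s2) ** 2))

      true = [1.0, 405.0, 0.5, 0.6, 412.0, 0.8]
      observed = synthetic(true, wl) \
          + 0.02 * np.random.default_rng(6).normal(size=wl.size)

      # Optimization adjusts the synthetic spectrum until it matches the
      # measured one; the fitted parameters then describe the plasma lines.
      fit = least_squares(lambda p: synthetic(p, wl) - observed,
                          x0=[0.8, 404.0, 1.0, 0.8, 411.0, 1.0])
      print(fit.x)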

  4. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    NASA Astrophysics Data System (ADS)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not previously been conducted. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010), to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extracted from 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicate significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize

  5. Lower Education Level Is a Risk Factor for Peritonitis and Technique Failure but Not a Risk for Overall Mortality in Peritoneal Dialysis under Comprehensive Training System

    PubMed Central

    Kim, Hyo Jin; Lee, Joongyub; Park, Miseon; Kim, Yuri; Lee, Hajeong; Kim, Dong Ki; Joo, Kwon Wook; Kim, Yon Su; Cho, Eun Jin; Ahn, Curie

    2017-01-01

    Background Lower education level could be a risk factor for higher peritoneal dialysis (PD)-associated peritonitis, potentially resulting in technique failure. This study evaluated the influence of lower education level on the development of peritonitis, technique failure, and overall mortality. Methods Patients over 18 years of age who started PD at Seoul National University Hospital between 2000 and 2012 with information on their academic background were enrolled. Patients were divided into three groups: middle school or lower (academic year ≤9, n = 102), high school (academic year 9–12, n = 229), and higher than high school (academic year >12, n = 324). Outcomes were analyzed using Cox proportional hazards models and competing risk regression. Results A total of 655 incident PD patients (60.9% male, age 48.4±14.1 years) were analyzed. During follow-up of 41 (interquartile range, 20–65) months, 255 patients (38.9%) experienced more than one episode of peritonitis, 138 patients (21.1%) underwent technique failure, and 78 patients (11.9%) died. After adjustment, the middle school or lower education group was at independently higher risk of peritonitis (adjusted hazard ratio [HR], 1.61; 95% confidence interval [CI], 1.10–2.36; P = 0.015) and technique failure (adjusted HR, 1.87; 95% CI, 1.10–3.18; P = 0.038) compared with the higher than high school education group. However, lower education was not associated with increased mortality in either as-treated (adjusted HR, 1.11; 95% CI, 0.53–2.33; P = 0.788) or intent-to-treat analysis (P = 0.726). Conclusions Although lower education was a significant risk factor for peritonitis and technique failure, it was not associated with increased mortality in PD patients. Comprehensive training and multidisciplinary education may overcome the lower education level in PD. PMID:28056058

  6. Lower Education Level Is a Risk Factor for Peritonitis and Technique Failure but Not a Risk for Overall Mortality in Peritoneal Dialysis under Comprehensive Training System.

    PubMed

    Kim, Hyo Jin; Lee, Joongyub; Park, Miseon; Kim, Yuri; Lee, Hajeong; Kim, Dong Ki; Joo, Kwon Wook; Kim, Yon Su; Cho, Eun Jin; Ahn, Curie; Oh, Kook-Hwan

    2017-01-01

    Lower education level could be a risk factor for higher peritoneal dialysis (PD)-associated peritonitis, potentially resulting in technique failure. This study evaluated the influence of lower education level on the development of peritonitis, technique failure, and overall mortality. Patients over 18 years of age who started PD at Seoul National University Hospital between 2000 and 2012 with information on their academic background were enrolled. Patients were divided into three groups: middle school or lower (academic year ≤9, n = 102), high school (academic year 9-12, n = 229), and higher than high school (academic year >12, n = 324). Outcomes were analyzed using Cox proportional hazards models and competing risk regression. A total of 655 incident PD patients (60.9% male, age 48.4±14.1 years) were analyzed. During follow-up of 41 (interquartile range, 20-65) months, 255 patients (38.9%) experienced more than one episode of peritonitis, 138 patients (21.1%) underwent technique failure, and 78 patients (11.9%) died. After adjustment, the middle school or lower education group was at independently higher risk of peritonitis (adjusted hazard ratio [HR], 1.61; 95% confidence interval [CI], 1.10-2.36; P = 0.015) and technique failure (adjusted HR, 1.87; 95% CI, 1.10-3.18; P = 0.038) compared with the higher than high school education group. However, lower education was not associated with increased mortality in either as-treated (adjusted HR, 1.11; 95% CI, 0.53-2.33; P = 0.788) or intent-to-treat analysis (P = 0.726). Although lower education was a significant risk factor for peritonitis and technique failure, it was not associated with increased mortality in PD patients. Comprehensive training and multidisciplinary education may overcome the lower education level in PD.
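
    The adjusted hazard ratios quoted above come from Cox proportional hazards fits. A minimal sketch of the same kind of model (the lifelines library and a small invented cohort; the study's actual covariate set and competing-risk models are richer):

      import pandas as pd
      from lifelines import CoxPHFitter

      # Hypothetical PD cohort: follow-up in months, peritonitis event flag,
      # and an indicator for the lowest education group.
      df = pd.DataFrame({
          "months":        [12, 41, 20, 65, 33, 8, 55, 27, 47, 15],
          "peritonitis":   [1, 0, 1, 0, 1, 1, 0, 1, 0, 1],
          "low_education": [1, 0, 1, 0, 0, 1, 0, 1, 0, 1],
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="months", event_col="peritonitis")
      cph.print_summary()   # exp(coef) is the hazard ratio for the covariate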

  7. Is There an Economical Running Technique? A Review of Modifiable Biomechanical Factors Affecting Running Economy.

    PubMed

    Moore, Isabel S

    2016-06-01

    Running economy (RE) has a strong relationship with running performance, and modifiable running biomechanics are a determining factor of RE. The purposes of this review were to (1) examine the intrinsic and extrinsic modifiable biomechanical factors affecting RE; (2) assess training-induced changes in RE and running biomechanics; (3) evaluate whether an economical running technique can be recommended; and (4) discuss potential areas for future research. Based on current evidence, the intrinsic factors that appeared beneficial for RE were using a preferred stride length range, which allows for stride length deviations up to 3% shorter than the preferred stride length; lower vertical oscillation; greater leg stiffness; low lower-limb moment of inertia; less leg extension at toe-off; larger stride angles; alignment of the ground reaction force and leg axis during propulsion; maintaining arm swing; low thigh antagonist-agonist muscular coactivation; and low activation of lower-limb muscles during propulsion. Extrinsic factors associated with a better RE were a firm, compliant shoe-surface interaction and being barefoot or wearing lightweight shoes. Several other modifiable biomechanical factors presented inconsistent relationships with RE. Running biomechanics during ground contact appeared to play an important role, specifically during propulsion; therefore, this phase has the strongest direct links with RE. Recurring methodological problems exist within the literature, such as cross-comparisons, assessing variables in isolation, and acute to short-term interventions. Therefore, recommending a general economical running technique should be approached with caution. Future work should focus on interdisciplinary longitudinal investigations combining RE, kinematics, kinetics, and neuromuscular and anatomical aspects, as well as applying a synergistic approach to understanding the role of kinetics.

  8. A comparison of autonomous techniques for multispectral image analysis and classification

    NASA Astrophysics Data System (ADS)

    Valdiviezo-N., Juan C.; Urcid, Gonzalo; Toxqui-Quitl, Carina; Padilla-Vivanco, Alfonso

    2012-10-01

    Multispectral imaging has given rise to important applications related to the classification and identification of objects in a scene. Because multispectral instruments can be used to estimate the reflectance of materials in the scene, these techniques constitute fundamental tools for materials analysis and quality control. In recent years, a variety of algorithms has been developed to work with multispectral data, whose main purpose has been the correct classification of the objects in the scene. The present study introduces a brief review of some classical techniques, as well as a novel one, that have been used for such purposes. The use of principal component analysis and K-means clustering as important classification algorithms is discussed here. Moreover, a recent method based on the min-W and max-M lattice auto-associative memories, originally proposed for endmember determination in hyperspectral imagery, is introduced as a classification method. Besides a discussion of their mathematical foundations, we emphasize their main characteristics and the results achieved for two exemplar images composed of objects similar in appearance but spectrally different. The classification results show that the first components computed from principal component analysis can be used to highlight areas with different spectral characteristics. In addition, the use of lattice auto-associative memories provides good results for materials classification even in cases where spectral similarities appear in the spectral responses.
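
    Of the methods reviewed, K-means clustering is the most compact to sketch: treat each pixel's spectrum as a feature vector and cluster the vectors. A minimal illustration (sklearn; random data in place of a real multispectral cube, with 4 classes chosen arbitrarily):

      import numpy as np
      from sklearn.cluster import KMeans

      # Toy multispectral cube: 50x50 pixels, 6 spectral bands.
      rng = np.random.default_rng(7)
      cube = rng.random((50, 50, 6))

      pixels = cube.reshape(-1, 6)          # one spectrum per pixel
      labels = KMeans(n_clusters=4, n_init=10,
                      random_state=0).fit_predict(pixels)
      classified = labels.reshape(50, 50)   # per-pixel material class map
      print(np.bincount(labels))            # pixels per class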

  9. Technique Feature Analysis or Involvement Load Hypothesis: Estimating Their Predictive Power in Vocabulary Learning.

    PubMed

    Gohar, Manoochehr Jafari; Rahmanian, Mahboubeh; Soleimani, Hassan

    2018-02-05

    Vocabulary learning has always been a great concern and has attracted the attention of many researchers. Among the vocabulary learning hypotheses, the involvement load hypothesis and technique feature analysis have been proposed, which attempt to bring concepts like noticing, motivation, and generation into focus. In the current study, 90 high-proficiency EFL students were assigned to three vocabulary tasks (sentence making, composition, and reading comprehension) in order to examine the power of the involvement load hypothesis and technique feature analysis frameworks in predicting vocabulary learning. The results revealed that the involvement load hypothesis was not a good predictor, whereas technique feature analysis was a good predictor of pretest-to-posttest score change, though not of during-task activity. The implications of these results are discussed in light of preparing vocabulary tasks.

  10. Trace elements in lake sediments measured by the PIXE technique

    NASA Astrophysics Data System (ADS)

    Gatti, Luciana V.; Mozeto, Antônio A.; Artaxo, Paulo

    1999-04-01

    Lakes are ecosystems with great potential for metal accumulation in sediments due to their depositional characteristics. Total concentrations of trace elements were measured on a 50 cm long sediment core from the Infernão Lake, an oxbow lake of the Moji-Guaçu River basin in the state of São Paulo, Brazil. Dating of the core shows sediment layers up to 180 years old. The use of the PIXE technique for elemental analysis avoids the traditional acid digestion procedure common to other techniques. The multielemental character of PIXE allows the simultaneous determination of about 20 elements in the sediment samples, such as Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Rb, Sr, Zr, Ba, and Pb. Average values of the elemental composition were found to be similar to the bulk crustal composition. The lake flooding pattern strongly influences the time series of the elemental profiles. Factor analysis of the elemental variability shows five factors. Two of the factors represent the mineralogical matrix; the others represent the organic component, a factor loaded with lead, and another loaded with chromium. The mineralogical component consists of elements such as Fe, Al, V, Ti, Mn, Ni, K, Zr, Sr, Cu and Zn. The variability of Si is explained by two distinct factors because it is influenced by two different sources, aluminosilicates and quartz, and the effects of inundation differ between them. The organic matter is strongly associated with calcium and is also bound with S, Zn, Cu and P. Lead and chromium appear as separate factors, although the evidence for their anthropogenic origin is not clear. The techniques developed for sample preparation and PIXE analysis proved advantageous and provided very good reproducibility and accuracy.

  11. Factor analysis and psychometric properties of the Mother-Adolescent Sexual Communication (MASC) instrument for sexual risk behavior.

    PubMed

    Cox, Mary Foster; Fasolino, Tracy K; Tavakoli, Abbas S

    2008-01-01

    Sexual risk behavior is a public health problem among adolescents living at or below the poverty level. Approximately 1 million pregnancies and 3 million cases of sexually transmitted infections (STIs) are reported yearly. Parenting plays a significant role in adolescent behavior, with mother-adolescent sexual communication correlated with absent or delayed sexual behavior. This study developed an instrument examining constructs of mother-adolescent communication, the Mother-Adolescent Sexual Communication (MASC) instrument. A convenience sample of 99 mothers of middle school children completed the self-administered questionnaires. The original 34-item MASC was reduced to 18 items. Exploratory factor analysis was conducted on the 18-item scale, which resulted in four factors explaining 84.63% of the total variance. Internal consistency analysis produced Cronbach alpha coefficients of .87, .90, .82, and .71 for the four factors, respectively. Convergent validity via hypothesis testing was supported by significant correlations of several subscales of the Parent-Child Relationship Questionnaire (PCRQ) with MASC factors: the content and style factors with the warmth, personal relationships and disciplinary warmth subscales of the PCRQ, the context factor with personal relationships, and the timing factor with warmth. In light of these findings, the psychometric characteristics and multidimensional perspective of the MASC instrument show evidence of its usefulness for measuring and advancing knowledge of mother-adolescent sexual communication techniques.
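
    The internal-consistency coefficients reported here follow the standard Cronbach's alpha formula, alpha = k/(k - 1) * (1 - sum of item variances / variance of the total score). A minimal sketch (numpy; simulated responses, not the MASC data):

      import numpy as np

      def cronbach_alpha(items):
          # items: (n_respondents, n_items) array for one factor/subscale.
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      rng = np.random.default_rng(8)
      common = rng.normal(size=(99, 1))                # shared latent trait
      scale = common + 0.6 * rng.normal(size=(99, 5))  # five correlated items
      print(round(cronbach_alpha(scale), 2))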

  12. Multivariate analysis of prognostic factors in synovial sarcoma.

    PubMed

    Koh, Kyoung Hwan; Cho, Eun Yoon; Kim, Dong Wook; Seo, Sung Wook

    2009-11-01

    Many studies have described the diversity of synovial sarcoma in terms of its biological characteristics and clinical features. Moreover, much effort has been expended on the identification of prognostic factors because of the unpredictable behavior of synovial sarcomas. However, with the exception of tumor size, published results have been inconsistent. We attempted to identify independent risk factors using survival analysis. Forty-one consecutive patients with synovial sarcoma were prospectively followed from January 1997 to March 2008. Overall and progression-free survival by age, sex, tumor size, tumor location, metastasis at presentation, histologic subtype, chemotherapy, radiation therapy, and resection margin were analyzed, and standard multivariate Cox proportional hazard regression analysis was used to evaluate potential prognostic factors. Tumor size (>5 cm), non-limb-based tumors, metastasis at presentation, and a monophasic subtype were associated with poorer overall survival. Multivariate analysis showed that metastasis at presentation and monophasic tumor subtype affected overall survival. For progression-free survival, the monophasic subtype was the only prognostic factor. The study confirmed that histologic subtype is the single most important independent prognostic factor of synovial sarcoma, regardless of tumor stage.

  13. [Analysis of syndrome discipline of generalized anxiety disorder using data mining techniques].

    PubMed

    Tang, Qi-sheng; Sun, Wen-jun; Qu, Miao; Guo, Dong-fang

    2012-09-01

    To study the use of data mining techniques in analyzing the syndrome discipline of generalized anxiety disorder (GAD). From August 1, 2009 to July 31, 2010, 705 patients with GAD in 10 hospitals of Beijing were investigated over one year. Data mining techniques, such as Bayes net and cluster analysis, were used to analyze the syndrome discipline of GAD. A total of 61 symptoms of GAD were screened out. By using Bayes net, nine syndromes of GAD were abstracted based on the symptoms. Eight syndromes were abstracted by cluster analysis. After screening for duplicate syndromes and combining the experts' experience and traditional Chinese medicine theory, six syndromes of GAD were defined. These included depressed liver qi transforming into fire, phlegm-heat harassing the heart, liver depression and spleen deficiency, heart-kidney non-interaction, dual deficiency of the heart and spleen, and kidney deficiency and liver yang hyperactivity. Based on the results, the draft of Syndrome Diagnostic Criteria for Generalized Anxiety Disorder was developed. Data mining techniques such as Bayes net and cluster analysis have certain future potential for establishing syndrome models and analyzing syndrome discipline, thus they are suitable for the research of syndrome differentiation.

  14. Analysis of hairy root culture of Rauvolfia serpentina using direct analysis in real time mass spectrometric technique.

    PubMed

    Madhusudanan, K P; Banerjee, Suchitra; Khanuja, Suman P S; Chattopadhyay, Sunil K

    2008-06-01

    The applicability of a new mass spectrometric technique, DART (direct analysis in real time) has been studied in the analysis of the hairy root culture of Rauvolfia serpentina. The intact hairy roots were analyzed by holding them in the gap between the DART source and the mass spectrometer for measurements. Two nitrogen-containing compounds, vomilenine and reserpine, were characterized from the analysis of the hairy roots almost instantaneously. The confirmation of the structures of the identified compounds was made through their accurate molecular formula determinations. This is the first report of the application of DART technique for the characterization of compounds that are expressed in the hairy root cultures of Rauvolfia serpentina. Moreover, this also constitutes the first report of expression of reserpine in the hairy root culture of Rauvolfia serpentina. Copyright (c) 2008 John Wiley & Sons, Ltd.

  15. Application of thermal analysis techniques in activated carbon production

    USGS Publications Warehouse

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  16. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    DOE PAGES

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  17. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    PubMed Central

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed. PMID:25610632

  18. Enhanced Analysis Techniques for an Imaging Neutron and Gamma Ray Spectrometer

    NASA Astrophysics Data System (ADS)

    Madden, Amanda C.

    The presence of gamma rays and neutrons is a strong indicator of the presence of Special Nuclear Material (SNM). The imaging Neutron and gamma ray SPECTrometer (NSPECT), developed by the University of New Hampshire and Michigan Aerospace Corporation, detects the fast neutrons and prompt gamma rays from fissile material and the gamma rays from radioactive material. The instrument operates as a double-scatter device, requiring a neutron or a gamma ray to interact twice in the instrument. While this detection requirement decreases the efficiency of the instrument, it offers superior background rejection and the ability to measure the energy and momentum of the incident particle. These measurements create energy spectra and images of the emitting source for source identification and localization. The dual-species instrument provides superior detection compared with a single species alone. In realistic detection scenarios, few particles are detected from a potential threat due to source shielding, detection at a distance, high background, and weak sources. This contributes to a small signal-to-noise ratio, and threat detection becomes difficult. To address these difficulties, several enhanced data analysis tools were developed. A Receiver Operating Characteristic (ROC) curve helps set instrumental alarm thresholds as well as identify the presence of a source, and analysis of a dual-species ROC curve provides superior detection capabilities. Bayesian analysis helps to detect and identify the presence of a source through model comparisons, and helps create a background-corrected count spectrum for enhanced spectroscopy. Development of an instrument response using simulations and numerical analyses will help perform spectral and image deconvolution. This thesis will outline the principles of operation of the NSPECT instrument using the double-scatter technology, traditional analysis techniques, and enhanced analysis techniques as applied to data from the NSPECT instrument, and an
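
    An ROC curve for alarm-threshold setting is straightforward to sketch: score events, sweep the threshold, and trace the true-positive rate against the false-positive rate. A minimal illustration (sklearn; Poisson counts invented to mimic background versus a weak source):

      import numpy as np
      from sklearn.metrics import roc_curve, auc

      rng = np.random.default_rng(9)
      # Simulated detector counts: background vs background + weak source.
      background = rng.poisson(20, 500)
      source = rng.poisson(26, 500)
      scores = np.concatenate([background, source])
      truth = np.concatenate([np.zeros(500), np.ones(500)])

      fpr, tpr, thresholds = roc_curve(truth, scores)
      print(auc(fpr, tpr))  # area under the ROC; pick a threshold from (fpr, tpr)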

  19. Change analysis in the United Arab Emirates: An investigation of techniques

    USGS Publications Warehouse

    Sohl, Terry L.

    1999-01-01

    Much of the landscape of the United Arab Emirates has been transformed over the past 15 years by massive afforestation, beautification, and agricultural programs. The "greening" of the United Arab Emirates has had environmental consequences, however, including degraded groundwater quality and possible damage to natural regional ecosystems. Personnel from the Ground-Water Research project, a joint effort between the National Drilling Company of the Abu Dhabi Emirate and the U.S. Geological Survey, were interested in studying landscape change in the Abu Dhabi Emirate using Landsat thematic mapper (TM) data. The EROS Data Center in Sioux Falls, South Dakota, was asked to investigate land-cover change techniques that (1) provided locational, quantitative, and qualitative information on land-cover change within the Abu Dhabi Emirate; and (2) could be easily implemented by project personnel who were relatively inexperienced in remote sensing. A number of products were created with 1987 and 1996 Landsat TM data using change-detection techniques, including univariate image differencing, an "enhanced" image differencing, vegetation index differencing, post-classification differencing, and change-vector analysis. The different techniques provided products that varied in adequacy according to the specific application and the ease of implementation and interpretation. Specific quantitative values of change were most accurately and easily provided by the enhanced image-differencing technique, while the change-vector analysis excelled at providing rich qualitative detail about the nature of a change.
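
    Univariate image differencing, the simplest technique compared, subtracts co-registered images and flags pixels whose difference exceeds a threshold. A minimal sketch (numpy; synthetic rasters, with a 3-sigma threshold as an illustrative choice):

      import numpy as np

      rng = np.random.default_rng(10)
      before = rng.random((100, 100))           # e.g. a 1987 TM band
      after = before.copy()
      after[40:60, 40:60] += 0.5                # simulated land-cover change
      after += 0.02 * rng.normal(size=after.shape)

      diff = after - before
      # Flag pixels beyond k standard deviations of the difference image.
      k = 3.0
      mask = np.abs(diff - diff.mean()) > k * diff.std()
      print(mask.sum(), "changed pixels")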

  20. An in Situ Technique for Elemental Analysis of Lunar Surfaces

    NASA Technical Reports Server (NTRS)

    Kane, K. Y.; Cremers, D. A.

    1992-01-01

    An in situ analytical technique that can remotely determine the elemental constituents of solids has been demonstrated. Laser-Induced Breakdown Spectroscopy (LIBS) is a form of atomic emission spectroscopy in which a powerful laser pulse is focused on a solid to generate a laser spark, or microplasma. Material in the plasma is vaporized, and the resulting atoms are excited to emit light. The light is spectrally resolved to identify the emitting species. LIBS is a simple technique that can be automated for inclusion aboard a remotely operated vehicle. Since only optical access to a sample is required, areas inaccessible to a rover can be analyzed remotely. A single laser spark both vaporizes and excites the sample, so that near real-time analysis (a few minutes) is possible. This technique provides simultaneous multielement detection and has good sensitivity for many elements. LIBS also eliminates the need for sample retrieval and preparation, preventing possible sample contamination. These qualities make the LIBS technique uniquely suited for use in the lunar environment.

  1. Factor Analysis for Clustered Observations.

    ERIC Educational Resources Information Center

    Longford, N. T.; Muthen, B. O.

    1992-01-01

    A two-level model for factor analysis is defined, and formulas for a scoring algorithm for this model are derived. A simple noniterative method based on decomposition of total sums of the squares and cross-products is discussed and illustrated with simulated data and data from the Second International Mathematics Study. (SLD)

  2. Analysis of motor fan radiated sound and vibration waveform by automatic pattern recognition technique using "Mahalanobis distance"

    NASA Astrophysics Data System (ADS)

    Toma, Eiji

    2018-06-01

    In recent years, as the weight of IT equipment has been reduced, demand for motor fans for cooling the interior of electronic equipment has been rising. Sensory testing by inspectors is the mainstream method for quality inspection of motor fans in the field. This sensory test requires extensive experience to accurately diagnose differences in the subtle sounds (sound pressures) of the fans, and the judgment varies depending on the condition of the inspector and the environment. To solve these quality problems, an analysis method capable of quantitatively and automatically diagnosing the sound/vibration level of a fan is required. In this study, it was clarified that an analysis method applying the MT system based on the waveform information of noise and vibration is more effective for discriminating normal from abnormal items than the conventional frequency analysis method. Furthermore, it was found that with the automation of the vibration waveform analysis system, the relation between fan installation posture and vibration waveform was a factor influencing discrimination accuracy.
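
    The MT (Mahalanobis-Taguchi) style of discrimination described here can be sketched as follows, under the assumption that each fan is summarized by a small feature vector extracted from its sound/vibration waveform; the features and values below are hypothetical.

    ```python
    # Mahalanobis-distance discrimination sketch (MT-system style): build a
    # reference "unit space" from feature vectors of known-good fans, then
    # score new fans by their distance from that space.
    import numpy as np

    rng = np.random.default_rng(2)
    # rows = known-good fans, columns = hypothetical waveform features
    normal = rng.normal([1.0, 0.2, 5.0], [0.1, 0.02, 0.5], size=(200, 3))

    mean = normal.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

    def mahalanobis(x):
        d = x - mean
        return float(np.sqrt(d @ cov_inv @ d))

    good_fan = np.array([1.02, 0.21, 5.1])
    bad_fan = np.array([1.40, 0.35, 7.0])
    # The abnormal fan scores far from the unit space.
    print(mahalanobis(good_fan), mahalanobis(bad_fan))
    ```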

  3. A novel pulse height analysis technique for nuclear spectroscopic and imaging systems

    NASA Astrophysics Data System (ADS)

    Tseng, H. H.; Wang, C. Y.; Chou, H. P.

    2005-08-01

    The proposed pulse height analysis technique is based on the constant, linear relationship between pulse width and pulse height generated by the front-end electronics of nuclear spectroscopic and imaging systems. The technique has been successfully implemented in the sump water radiation monitoring system of a nuclear power plant. The radiation monitoring system uses a NaI(Tl) scintillator to detect radioactive radon daughters brought down by rain. The technique is also used in a nuclear medical imaging system, which uses a position-sensitive photomultiplier tube coupled with a scintillator. The proposed technique has greatly simplified the electronic design and made the systems feasible for portable applications.
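
    A minimal sketch of the underlying idea, assuming the linear width-to-height relationship holds for the front-end electronics and using made-up calibration points:

    ```python
    # Pulse-width-to-height calibration sketch: fit the linear relation,
    # then recover pulse height from a measured width. Values are
    # illustrative, not the system's actual calibration.
    import numpy as np

    width = np.array([1.0, 1.5, 2.0, 2.5, 3.0])        # pulse widths, microseconds
    height = np.array([0.52, 0.78, 1.01, 1.27, 1.49])  # pulse heights, volts

    slope, intercept = np.polyfit(width, height, 1)
    # Estimated pulse height for a measured 2.2 us pulse width.
    print(f"{slope * 2.2 + intercept:.2f} V")
    ```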

  4. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    NASA Technical Reports Server (NTRS)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

    The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.
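
    A minimal PCA sketch in the spirit of this analysis, using random compositional vectors in place of the actual meteorite and protein data:

    ```python
    # PCA sketch for comparing amino acid composition vectors; the data
    # here are random stand-ins, not the Murchison/ALH84001/protein values.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    # rows = samples, columns = relative abundances of 10 amino acids
    compositions = rng.dirichlet(np.ones(10), size=30)

    scaled = StandardScaler().fit_transform(compositions)
    scores = PCA(n_components=2).fit_transform(scaled)
    print(scores[:3])  # sample coordinates in the first two principal components
    ```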

  5. Comparative study of glass tube and mist chamber sampling techniques for the analysis of gaseous carbonyl compounds

    NASA Astrophysics Data System (ADS)

    François, Stéphanie; Perraud, Véronique; Pflieger, Maryline; Monod, Anne; Wortham, Henri

    In this work, glass tube and mist chamber sampling techniques using 2,4-dinitrophenylhydrazine as a derivatisation agent for the analysis of gaseous carbonyl compounds are compared. Trapping efficiencies of formaldehyde, acetaldehyde, propionaldehyde, acetone, acrolein, glyoxal, crotonaldehyde, benzaldehyde, butyraldehyde and valeraldehyde are experimentally determined using a gas-phase generator. In addition, to generalise our results to all atmospheric gaseous compounds and derivatisation agents, theoretical trapping efficiencies and enrichment factors are expressed taking into account the mechanisms involved in the two kinds of traps. Theoretical and experimental results show that, as expected, the trapping efficiencies of the glass tube depend mainly on the solubility of the compounds. The results provide new information and a better understanding of the phenomena occurring in the mist chamber and of the ability of this sampler to concentrate the samples. Hence, the mist chamber is the more convenient sampling method when trapping is combined with fast derivatisation of the compounds, whereas the glass tube technique should be used to trap atmospheric compounds without simultaneous derivatisation.

  6. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    NASA Astrophysics Data System (ADS)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary, noninvasive and nondestructive approach to the investigation of artworks that can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of the Mongol empire's power, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can complement the results of high-energy techniques.

  7. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound) sediments

    PubMed Central

    Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana MF; Silva, Rosângela; de Souza, Sheila Mendonça; Araujo, Adauto

    2013-01-01

    Parasite findings in sambaquis (shell mounds) are scarce. Although 121 shell mound samples had previously been analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis. PMID:23579793

  8. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound) sediments.

    PubMed

    Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana M F; Silva, Rosângela; Souza, Sheila Mendonça de; Araujo, Adauto

    2013-04-01

    Parasite findings in sambaquis (shell mounds) are scarce. Although 121 shell mound samples had previously been analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis.

  9. Technique for measuring gas conversion factors

    NASA Technical Reports Server (NTRS)

    Singh, J. J.; Sprinkle, D. R. (Inventor)

    1985-01-01

    A method for determining hydrocarbon conversion factors for a flowmeter is described. A mixture of air, O2, and CxHy is burned, and the partial pressure of O2 in the resulting gas is forced to equal the partial pressure of O2 in air. The flowrate of O2 flowing into the mixture is measured by a flowmeter, and the flowrate of CxHy flowing into the mixture is measured by the flowmeter whose conversion factor is to be determined. These measured values are used to calculate the conversion factor.

  10. Post-test navigation data analysis techniques for the shuttle ALT

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Postflight test analysis data processing techniques for shuttle approach and landing test (ALT) navigation data are defined. Postflight test processor requirements are described, along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described following the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented. Conclusions are presented.

  11. SEM Analysis Techniques for LSI Microcircuits. Volume 2

    DTIC Science & Technology

    1980-08-01

    RADC-TR-80-250, Vol II (of two), Final Technical Report, August 1980: SEM Analysis Techniques for LSI Microcircuits (Martin…). Volume II covers a 1024-bit static RAM, a 4096-bit dynamic RAM (SiGate MOS), a 4096-bit dynamic RAM (I2L bipolar), and a summary.

  12. Advanced analysis techniques for uranium assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication, based on both an analytical derivation and empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  13. A Factor Analysis of the BSRI and the PAQ.

    ERIC Educational Resources Information Center

    Edwards, Teresa A.; And Others

    Factor analysis of the Bem Sex Role Inventory (BSRI) and the Personality Attributes Questionnaire (PAQ) was undertaken to study the independence of the masculine and feminine scales within each instrument. Both instruments were administered to undergraduate education majors. Analysis of primary first and second order factors of the BSRI indicated…

  14. Investigation of spectral analysis techniques for randomly sampled velocimetry data

    NASA Technical Reports Server (NTRS)

    Sree, Dave

    1993-01-01

    It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scale information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'direct transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, on how high-frequency and how accurate an estimate can be made for a given mean sampling rate. These high-frequency estimates are important in obtaining the microscale information of the turbulence structure. It was found from previous studies that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e., up to the Nyquist frequency); otherwise, an aliasing problem would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high-frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of the low-frequency estimates; the pre-filtering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson-sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable…
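
    A bare-bones version of the slotting technique can be sketched as follows: products of sample pairs are accumulated into lag bins ("slots") to estimate the autocorrelation of the randomly sampled signal, from which a spectrum could then be obtained by transform. The signal and sampling times below are synthetic.

    ```python
    # Slotted autocorrelation sketch for randomly sampled velocity data.
    import numpy as np

    rng = np.random.default_rng(4)
    t = np.sort(rng.uniform(0, 100, 2000))   # random (Poisson-like) sample times
    u = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.normal(size=t.size)
    u -= u.mean()                            # work with velocity fluctuations

    dt_slot, n_slots = 0.1, 50               # slot width (s) and number of slots
    acf = np.zeros(n_slots)
    counts = np.zeros(n_slots)
    for i in range(t.size):
        lags = t[i + 1:] - t[i]
        k = (lags / dt_slot).astype(int)     # slot index for each pair
        valid = k < n_slots
        np.add.at(acf, k[valid], u[i] * u[i + 1:][valid])
        np.add.at(counts, k[valid], 1)

    acf = np.where(counts > 0, acf / np.maximum(counts, 1), 0.0) / u.var()
    print(acf[:5])  # normalized autocorrelation estimates at the first lags
    ```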

  15. Artificial intelligence techniques for automatic screening of amblyogenic factors.

    PubMed

    Van Eenwyk, Jonathan; Agah, Arvin; Giangiacomo, Joseph; Cibis, Gerhard

    2008-01-01

    To develop a low-cost automated video system to effectively screen children aged 6 months to 6 years for amblyogenic factors. In 1994 one of the authors (G.C.) described video vision development assessment, a digitizable analog video-based system combining Brückner pupil red reflex imaging and eccentric photorefraction to screen young children for amblyogenic factors. With this system the images were analyzed manually. We automated the capture of digital video frames and pupil images and applied computer vision and artificial intelligence to analyze and interpret results. The artificial intelligence systems were evaluated by a tenfold testing method. The best system was the decision tree learning approach, which had an accuracy of 77%, compared to the "gold standard" specialist examination with a "refer/do not refer" decision. Criteria for referral were strabismus, including microtropia, and refractive errors and anisometropia considered to be amblyogenic. Eighty-two percent of strabismic individuals were correctly identified. High refractive errors were also correctly identified and referred 90% of the time, as was significant anisometropia. The program was less accurate in identifying more moderate refractive errors, below +5 and less than -7. Although we are pursuing a variety of avenues to improve the accuracy of the automated analysis, the program in its present form provides acceptable cost benefits for detecting amblyogenic factors in children aged 6 months to 6 years.
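
    The decision-tree-with-tenfold-evaluation approach reported here can be sketched generically; the feature matrix below is a random stand-in for the image-derived measurements, not the authors' data.

    ```python
    # Decision-tree screening sketch with tenfold evaluation, assuming a
    # feature matrix of image-derived measurements (synthetic here) and a
    # binary refer/do-not-refer label from specialist examination.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)
    X = rng.normal(size=(300, 6))   # e.g. refraction estimates, reflex asymmetry
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=300) > 0).astype(int)

    clf = DecisionTreeClassifier(max_depth=4, random_state=0)
    scores = cross_val_score(clf, X, y, cv=10)   # tenfold cross-validation
    print(f"mean accuracy: {scores.mean():.2f}")
    ```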

  16. The Review of Nuclear Microscopy Techniques: An Approach for Nondestructive Trace Elemental Analysis and Mapping of Biological Materials.

    PubMed

    Mulware, Stephen Juma

    2015-01-01

    The properties of many biological materials often depend on the spatial distribution and concentration of the trace elements present in a matrix. Scientists have over the years tried various techniques, including classical physical and chemical analysis techniques, each with a relative level of accuracy. However, with the development of spatially sensitive submicron beams, the nuclear microprobe techniques using focused proton beams for the elemental analysis of biological materials have yielded significant success. In this paper, the basic principles of the commonly used microprobe techniques of STIM, RBS, and PIXE for trace elemental analysis are discussed, as are the details of sample preparation, detection, and data collection and analysis. Finally, an application of the techniques to the analysis of corn roots for elemental distribution and concentration is presented.

  17. A retrospective analysis of mathieu and tip urethroplasty techniques for distal hypospadias repair; A 20 year experience.

    PubMed

    Oztorun, Kenan; Bagbanci, Sahin; Dadali, Mumtaz; Emir, Levent; Karabulut, Ayhan

    2017-09-01

    We aimed to identify the changes in the application rates of two surgical techniques for distal hypospadias repair over the years, and to compare the two most popular repair techniques in terms of surgical outcomes and the factors affecting those outcomes, over a 20-year period. In this study, the records of 492 consecutive patients who had undergone an operation for distal hypospadias in the urology clinic of Ankara between May 1990 and December 2010, using either the Mathieu or the TIPU surgical technique, were reviewed retrospectively. Patients who had glanular, coronal, or subcoronal meatus were accepted as distal hypospadias cases. Among the 492 examined medical records, 331 and 161 surgical interventions were performed using the Mathieu urethroplasty technique (Group 1) and the TIP urethroplasty technique (Group 2), respectively. Group 1 was divided into two subgroups: Group 1a (patients with primary hypospadias) and Group 1b (patients with a previous hypospadias operation). Likewise, Group 2 was divided into two subgroups, Group 2a and Group 2b. The patients' ages, the number of previous urethroplasty operations, the localization of the external urethral meatus before the operation, chordee state, the length of the newly formed urethra, whether urinary diversion was done or not, postoperative complications, and follow-up data were evaluated, and the effects of these variables on the surgical outcome were investigated via statistical analyses. The primary objective of this study is to identify the changes in the application rates of the two surgical techniques in distal hypospadias repair over a 20-year period; the secondary objective is to compare the two most popular repair techniques in terms of surgical outcomes and the factors affecting the outcomes. The independent samples t test and Pearson's Chi-square test were used for statistical…

  18. Flood Vulnerability Analysis of the part of Karad Region, Satara District, Maharashtra using Remote Sensing and Geographic Information System technique

    NASA Astrophysics Data System (ADS)

    Warghat, Sumedh R.; Das, Sandipan; Doad, Atul; Mali, Sagar; Moon, Vishal S.

    2012-07-01

    Karad City is situated at the confluence of the Krishna and Koyna rivers and is a severely flood-prone area. Floodwaters enter the city through the roads and disrupt infrastructure throughout the city. Furthermore, due to negligence of the authorities and unplanned growth of the city, people living in the city have constrained the natural flow of water by constructing unnecessary embankments in the Koyna river. For this reason the Koyna now flows as a narrow channel, which very easily overflows during even minor flooding. Flood vulnerability analysis has been carried out for the Karad region of Satara District, Maharashtra, using remote sensing and geographic information system techniques. The aim of this study is to identify flood vulnerability zones by using GIS and RS techniques, and an attempt has been made to demonstrate the application of remote sensing and GIS in mapping flood-vulnerable areas by utilizing ArcMap and Erdas software. The analysis was conducted with the following objectives: identify the flood-prone areas in the Koyna and Krishna river basins; calculate surface runoff and delineate flood-sensitive areas; delineate a classified hazard map; evaluate the flood-affected area; and prepare a flood vulnerability map utilizing remote sensing and GIS techniques (C.J. Kumanan; S.M. Ramasamy). The study is based on GIS, and spatial techniques are used for the analysis and understanding of the flood problem in Karad Tahsil. The flood-affected areas of different magnitudes have been identified and mapped using ArcGIS software. The analysis is useful for the local planning authority in identifying risk areas and taking proper decisions at the right moment. In the analysis, the causative factors for flooding in the watershed that are taken into account are annual rainfall, size of the watershed, basin slope, drainage density of natural channels, and land use (Dinand Alkema; Farah Aziz). This study of…

  19. The Shock and Vibration Bulletin. Part 3. Dynamic Analysis, Design Techniques

    DTIC Science & Technology

    1980-09-01

    The Shock and Vibration Bulletin, Part 3: Dynamic Analysis, Design Techniques, September 1980. Recoverable fragments of the OCR text note that the response is computed at certain discrete frequencies, not over a random-frequency spectrum, and that a technique for dynamic analysis was pioneered by Myklestad [1] and later developed by Pestel and Leckie; the cited references include "Fundamentals of Vibration Analysis" (McGraw-Hill, New York, 1956) and E.C. Pestel and F.A. Leckie, "Matrix…

  20. Establishing Factor Validity Using Variable Reduction in Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Hofmann, Rich

    1995-01-01

    Using a 21-statement attitude-type instrument, an iterative procedure for improving confirmatory model fit is demonstrated within the context of the EQS program of P. M. Bentler and maximum likelihood factor analysis. Each iteration systematically eliminates the poorest fitting statement as identified by a variable fit index. (SLD)

  1. An iterative forward analysis technique to determine the equation of state of dynamically compressed materials

    DOE PAGES

    Ali, S. J.; Kraus, R. G.; Fratanduono, D. E.; ...

    2017-05-18

    Here, we developed an iterative forward analysis (IFA) technique with the ability to use hydrocode simulations as a fitting function for analysis of dynamic compression experiments. The IFA method optimizes over parameterized quantities in the hydrocode simulations, breaking the degeneracy of contributions to the measured material response. Velocity profiles from synthetic data generated using a hydrocode simulation are analyzed as a first-order validation of the technique. We also analyze multiple magnetically driven ramp compression experiments on copper and compare with more conventional techniques. Excellent agreement is obtained in both cases.
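
    The IFA idea, a parameterized forward model iteratively optimized against measured profiles, can be sketched with a toy forward model standing in for the hydrocode; the quadratic form below is assumed purely for illustration.

    ```python
    # Iterative forward analysis sketch: optimize forward-model parameters
    # until the predicted velocity profile matches the measurement. The
    # quadratic "forward model" is a placeholder for a hydrocode run.
    import numpy as np
    from scipy.optimize import minimize

    t = np.linspace(0, 1, 200)

    def forward_model(params, t):
        a, b = params
        return a * t + b * t**2

    true = forward_model([1.5, 0.8], t)
    measured = true + np.random.default_rng(6).normal(0, 0.02, t.size)

    def misfit(params):
        return np.sum((forward_model(params, t) - measured) ** 2)

    result = minimize(misfit, x0=[1.0, 0.0], method="Nelder-Mead")
    print(result.x)  # recovered parameters, close to [1.5, 0.8]
    ```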

  2. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    NASA Astrophysics Data System (ADS)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking technique: one philosophical, the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skill development focused on includes: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions made, to name a few.

  3. Non destructive multi elemental analysis using prompt gamma neutron activation analysis techniques: Preliminary results for concrete sample

    NASA Astrophysics Data System (ADS)

    Dahing, Lahasen@Normanshah; Yahya, Redzuan; Yahya, Roslan; Hassan, Hearie

    2014-09-01

    In this study, the principle of prompt gamma neutron activation analysis was used as a technique to determine the elements in a sample. The system consists of a collimated isotopic neutron source, Cf-252, with an HPGe detector and a multichannel analyzer (MCA). Concrete samples with sizes of 10×10×10 cm3 and 15×15×15 cm3 were analysed. When neutrons enter and interact with elements in the concrete, neutron capture reactions occur and produce characteristic prompt gamma rays of the elements. The preliminary results of this study show that the major elements in the concrete, such as Si, Mg, Ca, Al, Fe and H, as well as other elements such as Cl, were determined by analysing their respective gamma-ray lines. The results obtained were compared with NAA and XRF techniques for reference and validation. The potential and capability of neutron-induced prompt gamma rays as a tool for qualitative multi-elemental analysis, to identify the elements present in concrete samples, are discussed.

  4. A technique for conducting point pattern analysis of cluster plot stem-maps

    Treesearch

    C.W. Woodall; J.M. Graham

    2004-01-01

    Point pattern analysis of forest inventory stem-maps may aid interpretation and inventory estimation of forest attributes. To evaluate the techniques and benefits of conducting point pattern analysis of forest inventory stem-maps, Ripley's K(t) was calculated for simulated tree spatial distributions and for over 600 USDA Forest Service Forest…
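
    A naive Ripley's K(t) estimator can be sketched for a simulated plot as below; note that it omits the edge correction a real stem-map analysis would apply.

    ```python
    # Naive Ripley's K(t) sketch for a point pattern in a unit square
    # (no edge correction). Under complete spatial randomness (CSR),
    # K(t) is expected to equal pi * t^2.
    import numpy as np

    rng = np.random.default_rng(7)
    pts = rng.uniform(0, 1, size=(100, 2))   # hypothetical tree coordinates
    n, area = len(pts), 1.0
    lam = n / area                           # point density

    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)              # exclude self-pairs

    for t in (0.05, 0.1, 0.2):
        K = (d < t).sum() / (n * lam)
        print(f"K({t}) = {K:.4f}  (CSR expectation: {np.pi * t * t:.4f})")
    ```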

  5. Factors affecting job satisfaction in nurse faculty: a meta-analysis.

    PubMed

    Gormley, Denise K

    2003-04-01

    Evidence in the literature suggests job satisfaction can make a difference in keeping qualified workers on the job, but little research has been conducted focusing specifically on nursing faculty. Several studies have examined nurse faculty satisfaction in relationship to one or two influencing factors. These factors include professional autonomy, leader role expectations, organizational climate, perceived role conflict and role ambiguity, leadership behaviors, and organizational characteristics. This meta-analysis attempts to synthesize the various studies conducted on job satisfaction in nursing faculty and analyze which influencing factors have the greatest effect. The procedure used for this meta-analysis consisted of reviewing studies to identify factors influencing job satisfaction, research questions, sample size reported, instruments used for measurement of job satisfaction and influencing factors, and results of statistical analysis.

  6. Recurrent-neural-network-based Boolean factor analysis and its application to word clustering.

    PubMed

    Frolov, Alexander A; Husek, Dusan; Polyakov, Pavel Yu

    2009-07-01

    The objective of this paper is to introduce a neural-network-based algorithm for word clustering as an extension of the neural-network-based Boolean factor analysis algorithm (Frolov, 2007). It is shown that this extended algorithm supports an even more complex model of signals that are supposed to be related to textual documents. It is hypothesized that every topic in textual data is characterized by a set of words which coherently appear in documents dedicated to a given topic. The appearance of each word in a document is coded by the activity of a particular neuron. In accordance with the Hebbian learning rule implemented in the network, sets of coherently appearing words (treated as factors) create tightly connected groups of neurons, hence revealing them as attractors of the network dynamics. The found factors are eliminated from the network memory by the Hebbian unlearning rule, facilitating the search for other factors. Topics related to the found sets of words can be identified based on the words' semantics. To make the method complete, a special technique based on a Bayesian procedure has been developed for two purposes: first, to provide a complete description of factors in terms of component probability, and second, to enhance the accuracy of classifying whether a signal contains a given factor. Since it is assumed that every word may contribute to several topics, the proposed method might be related to the method of fuzzy clustering. In this paper, we show that the results of Boolean factor analysis and fuzzy clustering are not contradictory, but complementary. To demonstrate the capabilities of this approach, the method is applied to two types of textual data on neural networks in two different languages. The obtained topics and corresponding words are at a good level of agreement despite the fact that identical topics in Russian and English conferences contain different sets of keywords.

  7. Mathematical analysis techniques for modeling the space network activities

    NASA Technical Reports Server (NTRS)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.
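
    A minimal linear-programming sketch of the kind of scheduling question such a model addresses, with purely illustrative coefficients rather than actual TDRSS parameters:

    ```python
    # Linear-programming sketch with scipy: allocate hours to two relay
    # services to maximize returned data under resource budgets. All
    # coefficients are illustrative placeholders.
    from scipy.optimize import linprog

    # maximize 5*x1 + 3*x2  ->  minimize -(5*x1 + 3*x2)
    c = [-5, -3]                   # data returned per hour of each service
    A_ub = [[1, 1],                # total contact-hour budget
            [2, 1]]                # power-equivalent-hour budget
    b_ub = [10, 15]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)         # optimal hours per service, total data
    ```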

  8. Maintenance Audit through Value Analysis Technique: A Case Study

    NASA Astrophysics Data System (ADS)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    The increase in competitiveness, technological change, and rising requirements for quality and service have forced a change in the design and application of maintenance, as well as in the way it is considered within managerial strategy. There are numerous maintenance activities that must be carried out in a service company; as a result, the maintenance function as a whole often has to be outsourced. Nevertheless, delegating this work to specialized personnel does not exempt the company from responsibility, but rather leads to the need for control of each maintenance activity. In order to achieve this control and to evaluate the efficiency and effectiveness of the company, it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology is based on expert systems. The expert system, by means of rules, uses the SMART weighting technique and value analysis to obtain the weightings between the decision functions and between the alternatives. The expert system applies numerous rules and relations between different variables associated with the specific maintenance functions to obtain the maintenance state by sections and the general maintenance state of the enterprise. The contributions of this paper relate to the development of a maintenance audit in a service enterprise, where maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical in the design of new products, in the area of rule-based expert systems.
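
    The SMART weighting step can be sketched as a normalized weighted sum; the criteria, weights, and scores below are illustrative placeholders, not the audit's actual values.

    ```python
    # SMART-style weighted scoring sketch: normalized weights on decision
    # functions times alternative scores give an overall maintenance rating.
    import numpy as np

    criteria = ["preventive plan", "spares management", "documentation"]
    weights = np.array([60, 25, 15], dtype=float)
    weights /= weights.sum()           # SMART weights normalized to sum to 1

    # rows = maintenance sections, columns = criteria, scores on a 0-10 scale
    scores = np.array([[7, 5, 8],
                       [4, 6, 3]], dtype=float)
    overall = scores @ weights
    for name, s in zip(["section A", "section B"], overall):
        print(f"{name}: {s:.2f} / 10")
    ```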

  9. Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.

    1981-01-01

    Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.

  10. An interdisciplinary analysis of ERTS data for Colorado mountain environments using ADP Techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1972-01-01

    Significant preliminary results identified by the author from the Ouachita portion of the Texoma frame of data indicate many potential applications in the analysis and interpretation of ERTS data. It is believed that one of the more significant aspects of this analysis sequence has been the investigation of a technique to relate ERTS analysis and surface observation analysis. At present, a sequence involving (1) preliminary analysis based solely upon the spectral characteristics of the data, followed by (2) a surface observation mission to obtain visual information and oblique photography of particular points of interest in the test site area, appears to provide an extremely efficient technique for obtaining particularly meaningful surface observation data. Following such a procedure permits concentration on particular points of interest in the entire ERTS frame and thereby makes the surface observation data obtained particularly significant and meaningful. The analysis of the Texoma frame has also been significant in demonstrating a fast turnaround analysis capability. Additionally, the analysis has shown the potential accuracy and degree of complexity of features that can be identified and mapped using ERTS data.

  11. Risk factor analysis of new brain lesions associated with carotid endarterectomy.

    PubMed

    Lee, Jae Hoon; Suh, Bo Yang

    2014-01-01

    Carotid endarterectomy (CEA) is the standard treatment for carotid artery stenosis. New brain ischemia is a major concern associated with CEA, and diffusion-weighted imaging (DWI) is a good modality for detecting early ischemic brain lesions. We aimed to investigate the surgical complications and identify the potential risk factors for the incidence of new brain lesions (NBL) on DWI after CEA. From January 2006 to November 2011, 94 patients who had been studied by magnetic resonance imaging including DWI within 1 week after CEA were included in this study. Data were retrospectively investigated by review of the vascular registry protocol. Seven clinical variables and three procedural variables were analyzed as risk factors for NBL after CEA. The incidence of periprocedural NBL on DWI was 27.7%. There were no fatal complications, such as ipsilateral disabling stroke, myocardial infarction, or mortality. A significantly higher incidence of NBL was found in ulcer-positive patients than in ulcer-negative patients (P = 0.029). The incidence of NBL after operation was significantly higher in patients treated with the conventional technique than with the eversion technique (P = 0.042). Our data show that CEA has acceptable periprocedural complication rates, and that the presence of ulcerative plaque and the conventional endarterectomy technique are risk factors for NBL development after CEA.

  12. Bootstrap Confidence Intervals for Ordinary Least Squares Factor Loadings and Correlations in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong

    2010-01-01

    This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile…
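
    The percentile-interval flavor of this procedure can be sketched for a simple statistic; the same case-resampling pattern extends to rotated loadings from a factor analysis refit inside the loop. The data below are synthetic.

    ```python
    # Bootstrap percentile-interval sketch for a correlation coefficient.
    import numpy as np

    rng = np.random.default_rng(8)
    x = rng.normal(size=200)
    y = 0.5 * x + rng.normal(scale=0.9, size=200)

    boot = np.empty(2000)
    for b in range(boot.size):
        idx = rng.integers(0, x.size, x.size)   # resample cases with replacement
        boot[b] = np.corrcoef(x[idx], y[idx])[0, 1]

    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"95% percentile interval: [{lo:.3f}, {hi:.3f}]")
    ```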

  13. Donor retention in health care in Iran: a factor analysis

    PubMed Central

    Aghababa, Sara; Nasiripour, Amir Ashkan; Maleki, Mohammadreza; Gohari, Mahmoodreza

    2017-01-01

    Background: Long-term financial support is essential for the survival of a charitable organization. Health charities need to identify the factors that effectively influence donor retention. Methods: In the present study, the items of a questionnaire were derived from both a literature review and semi-structured interviews related to donor retention. Using purposive sampling, 300 academic and executive practitioners were selected. After follow-up, a total of 243 usable questionnaires were available for factor analysis. The questionnaire was validated on the basis of face and content validity, and its reliability through Cronbach's α-coefficient. Results: Exploratory factor analysis extracted 2 retention factors: a donor factor (variance = 33.841%; Cronbach's α-coefficient = 90.2) and a charity factor (variance = 29.038%; Cronbach's α-coefficient = 82.8). Subsequently, confirmatory factor analysis was applied and supported an overall reasonable fit. Conclusions: In this study, it was found that repeated monetary donations are supplied to charitable organizations when both retention factors (the donor factor and the charity factor) are taken into consideration. This model could provide a perspective for sustaining donations and charitable giving. PMID:28955663

  14. The Factor Structure of the English Language Development Assessment: A Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Kuriakose, Anju

    2011-01-01

    This study investigated the internal factor structure of the English language development Assessment (ELDA) using confirmatory factor analysis. ELDA is an English language proficiency test developed by a consortium of multiple states and is used to identify and reclassify English language learners in kindergarten to grade 12. Scores on item…

  15. Influential Observations in Principal Factor Analysis.

    ERIC Educational Resources Information Center

    Tanaka, Yutaka; Odaka, Yoshimasa

    1989-01-01

    A method is proposed for detecting influential observations in iterative principal factor analysis. Theoretical influence functions are derived for two components of the common variance decomposition. The major mathematical tool is the influence function derived by Tanaka (1988). (SLD)

  16. Patient size and x-ray technique factors in head computed tomography examinations. I. Radiation doses.

    PubMed

    Huda, Walter; Lieberman, Kristin A; Chang, Jack; Roskopf, Marsha L

    2004-03-01

    We investigated how patient age, size, and composition, together with the choice of x-ray technique factors, affect radiation doses in head computed tomography (CT) examinations. Head size dimensions, cross-sectional areas, and mean Hounsfield unit (HU) values were obtained from head CT images of 127 patients. For radiation dosimetry purposes patients were modeled as uniform cylinders of water. Dose computations were performed for 18 x 7 mm sections, scanned at a constant 340 mAs, for x-ray tube voltages ranging from 80 to 140 kV. Values of mean section dose, energy imparted, and effective dose were computed for patients ranging from newborns to adults. There was a rapid growth of head size over the first two years, followed by a more modest increase until about the age of 18. Newborns have a mean HU value of about 50 that monotonically increases with age over the first two decades of life. Average adult A-P and lateral dimensions were 186+/-8 mm and 147+/-8 mm, respectively, with an average HU value of 209+/-40. An infant head was found to be equivalent to a water cylinder with a radius of approximately 60 mm, whereas an adult head had an equivalent radius 50% greater. Adult male head dimensions are about 5% larger than female dimensions, and the average x-ray attenuation of adult male heads is approximately 20 HU greater. For adult examinations performed at 120 kV, typical values were 32 mGy for the mean section dose, 105 mJ for the total energy imparted, and 0.64 mSv for the effective dose. Increasing the x-ray tube voltage from 80 to 140 kV increases patient doses by about a factor of 5. For the same technique factors, mean section doses in infants are 35% higher than in adults. Energy imparted for adults is 50% higher than for infants, but infant effective doses are four times higher than adult ones. CT doses need to take into account patient age, head size, and composition, as well as the selected x-ray technique factors.

  17. Preliminary assessment of aerial photography techniques for canvasback population analysis

    USGS Publications Warehouse

    Munro, R.E.; Trauger, D.L.

    1976-01-01

    Recent intensive research on the canvasback has focused attention on the need for more precise estimates of population parameters. During the 1972-75 period, various types of aerial photographing equipment were evaluated to determine the problems and potentials for employing these techniques in appraisals of canvasback populations. The equipment and procedures available for automated analysis of aerial photographic imagery were also investigated. Serious technical problems remain to be resolved, but some promising results were obtained. Final conclusions about the feasibility of operational implementation await a more rigorous analysis of the data collected.

  18. Discrete ordinates-Monte Carlo coupling: A comparison of techniques in NERVA radiation analysis

    NASA Technical Reports Server (NTRS)

    Lindstrom, D. G.; Normand, E.; Wilcox, A. D.

    1972-01-01

    In the radiation analysis of the NERVA nuclear rocket system, two-dimensional discrete ordinates calculations are sufficient to provide detail in the pressure vessel and reactor assembly. Other parts of the system, however, require three-dimensional Monte Carlo analyses. To use these two methods in a single analysis, a means of coupling was developed whereby the results of a discrete ordinates calculation can be used to produce source data for a Monte Carlo calculation. Several techniques for producing source detail were investigated. Results of calculations on the NERVA system are compared and limitations and advantages of the coupling techniques discussed.

  19. Methods for Improving Information from ’Undesigned’ Human Factors Experiments.

    DTIC Science & Technology

    Human factors engineering, Information processing, Regression analysis, Experimental design, Least squares method, Analysis of variance, Correlation techniques, Matrices (Mathematics), Multiple disciplines, Mathematical prediction

  20. Differences in head impulse test results due to analysis techniques.

    PubMed

    Cleworth, Taylor W; Carpenter, Mark G; Honegger, Flurin; Allum, John H J

    2017-01-01

    Different analysis techniques are used to define vestibulo-ocular reflex (VOR) gain between eye and head angular velocity during the video head impulse test (vHIT). Comparisons would aid selection of gain techniques best related to head impulse characteristics and promote standardisation. Compare and contrast known methods of calculating vHIT VOR gain. We examined lateral canal vHIT responses recorded from 20 patients twice within 13 weeks of acute unilateral peripheral vestibular deficit onset. Ten patients were tested with an ICS Impulse system (GN Otometrics) and 10 with an EyeSeeCam (ESC) system (Interacoustics). Mean gain and variance were computed with area, average sample gain, and regression techniques over specific head angular velocity (HV) and acceleration (HA) intervals. Results for the same gain technique were not different between measurement systems. Area and average sample gain yielded equally lower variances than regression techniques. Gains computed over the whole impulse duration were larger than those computed for increasing HV. Gain over decreasing HV was associated with larger variances. Gains computed around peak HV were smaller than those computed around peak HA. The median gain over 50-70 ms was not different from gain around peak HV. However, depending on technique used, the gain over increasing HV was different from gain around peak HA. Conversion equations between gains obtained with standard ICS and ESC methods were computed. For low gains, the conversion was dominated by a constant that needed to be added to ESC gains to equal ICS gains. We recommend manufacturers standardize vHIT gain calculations using 2 techniques: area gain around peak HA and peak HV.
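
    One of the compared measures, area gain over a chosen head-velocity interval, can be sketched as a ratio of integrals; the waveforms below are synthetic stand-ins for recorded impulses.

    ```python
    # Area-gain sketch: VOR gain as the ratio of the areas under the eye
    # and head angular-velocity curves over a chosen interval.
    import numpy as np

    t = np.linspace(0, 0.15, 151)                    # time, seconds
    head = 200 * np.exp(-((t - 0.075) / 0.03) ** 2)  # head velocity, deg/s
    eye = 0.8 * head + np.random.default_rng(10).normal(0, 2, t.size)

    window = (t >= 0.03) & (t <= 0.12)               # e.g. an increasing-HV interval
    gain = np.trapz(eye[window], t[window]) / np.trapz(head[window], t[window])
    print(f"area gain: {gain:.2f}")
    ```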

  1. Thermal Response Analysis of Phospholipid Bilayers Using Ellipsometric Techniques.

    PubMed

    González-Henríquez, Carmen M; Villegas-Opazo, Vanessa A; Sagredo-Oyarce, Dallits H; Sarabia-Vallejos, Mauricio A; Terraza, Claudio A

    2017-08-18

    Biomimetic planar artificial membranes have been widely studied due to their multiple applications in several research fields. Their humectation and thermal response are crucial for reaching stability; these characteristics are related to the molecular organization inside the bilayer, which is affected by the aliphatic chain length, saturation, and molecular polarity, among other factors. Bilayer stability becomes a fundamental factor when technological devices such as biosensors are developed based on these systems. Thermal studies were performed for different types of phosphatidylcholine (PC) molecules: two pure PC bilayers and four binary PC mixtures. These analyses were carried out through the detection of slight changes in their optical and structural parameters via ellipsometry and Surface Plasmon Resonance (SPR) techniques. Phospholipid bilayers were prepared by the Langmuir-Blodgett technique and deposited over a hydrophilic silicon wafer. Their molecular inclination degree, mobility, and the stability of the different phases were detected and analyzed through bilayer thickness changes and their optical phase-amplitude response. Results show that certain binary lipid mixtures, with differences in their aliphatic chain lengths, present a coexistence of two thermal responses due to non-ideal mixing.

  2. Qualitative and quantitative analysis of lignocellulosic biomass using infrared techniques: A mini-review

    USDA-ARS?s Scientific Manuscript database

    Current wet chemical methods for biomass composition analysis using two-step sulfuric acid hydrolysis are time-consuming, labor-intensive, and unable to provide structural information about biomass. Infrared techniques provide fast, low-cost analysis, are non-destructive, and have shown promising re...

  3. Application of factor analysis to the water quality in reservoirs

    NASA Astrophysics Data System (ADS)

    Silva, Eliana Costa e.; Lopes, Isabel Cristina; Correia, Aldina; Gonçalves, A. Manuela

    2017-06-01

    In this work we present a Factor Analysis of chemical and environmental variables of the water column and hydro-morphological features of several Portuguese reservoirs. The objective is to reduce the initial number of variables while keeping their common characteristics. Using Factor Analysis, the environmental variables measured in the epilimnion and the hypolimnion, together with the hydro-morphological characteristics of the dams, were reduced from 63 variables to only 13 factors, which explained a total of 83.348% of the variance in the original data. After rotation using the Varimax method, the relations between the factors and the original variables became clearer and more interpretable, providing a Factor Analysis model for these environmental variables with 13 varifactors: water quality and distance to the source, hypolimnion chemical composition, sulfite-reducing bacteria and nutrients, coliforms and faecal streptococci, reservoir depth, temperature, and location, among other factors.
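
    A compact sketch of the reduce-and-rotate workflow, using synthetic data in place of the reservoir measurements; the rotation argument assumes scikit-learn 0.24 or newer.

    ```python
    # Variable-reduction sketch: fit a small factor model with varimax
    # rotation on standardized data (synthetic stand-ins here).
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(9)
    latent = rng.normal(size=(150, 2))           # two underlying factors
    loadings = rng.normal(size=(2, 8))
    X = latent @ loadings + 0.3 * rng.normal(size=(150, 8))

    fa = FactorAnalysis(n_components=2, rotation="varimax")
    fa.fit(StandardScaler().fit_transform(X))
    print(np.round(fa.components_, 2))  # rotated loadings, easier to interpret
    ```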

  4. Selecting Strategies to Reduce High-Risk Unsafe Work Behaviors Using the Safety Behavior Sampling Technique and Bayesian Network Analysis.

    PubMed

    Ghasemi, Fakhradin; Kalatpour, Omid; Moghimbeigi, Abbas; Mohammadfam, Iraj

    2017-03-04

    High-risk unsafe behaviors (HRUBs) are known to be the main cause of occupational accidents. Considering the financial and societal costs of accidents and the limitations of available resources, there is an urgent need for managing unsafe behaviors at workplaces. The aim of the present study was to find strategies for decreasing the rate of HRUBs using an integrated approach combining the safety behavior sampling technique and Bayesian network analysis. This was a cross-sectional study. The Bayesian network was constructed using a focus group approach. The required data were collected using safety behavior sampling, and the parameters of the network were estimated using the Expectation-Maximization algorithm. Using sensitivity analysis and belief updating, it was determined which factors had the greatest influence on unsafe behavior. Based on the BN analyses, safety training was the most important factor influencing employees' behavior at the workplace. High-quality safety training courses can reduce the rate of HRUBs by about 10%. Moreover, the rate of HRUBs increased with decreasing employee age. The rate of HRUBs was higher in the afternoon and on the last days of the week. Among the investigated variables, training was the most important factor affecting the safety behavior of employees. By holding high-quality safety training courses, companies would be able to reduce the rate of HRUBs significantly.
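
    The belief-updating step can be sketched with a toy two-node network in plain numpy; the conditional probabilities below are illustrative, not the study's estimates.

    ```python
    # Two-node belief-updating sketch: P(unsafe behavior) recomputed after
    # observing training quality. CPT values are illustrative placeholders.
    import numpy as np

    p_training = np.array([0.6, 0.4])                 # P(training = [high, low])
    p_unsafe_given_training = np.array([0.15, 0.25])  # P(unsafe | training)

    prior_unsafe = p_training @ p_unsafe_given_training
    posterior_unsafe = p_unsafe_given_training[0]     # observe high-quality training
    print(f"P(unsafe) marginal: {prior_unsafe:.3f}, "
          f"given high-quality training: {posterior_unsafe:.3f}")
    ```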

  5. Management of septic non-union of the tibia by the induced membrane technique. What factors could improve results?

    PubMed

    Siboni, Renaud; Joseph, Etienne; Blasco, Laurent; Barbe, Coralie; Bajolet, Odile; Diallo, Saïdou; Ohl, Xavier

    2018-06-07

    Management of septic non-union of the tibia requires debridement and excision of all infected bone and soft tissues. Various surgical techniques have been described to fill the bone defect. The "Induced Membrane" technique, described by A. C. Masquelet in 1986, is a two-step procedure using a PMMA cement spacer around which an induced membrane develops, to be used in the second step as a bone graft holder for the bone graft. The purpose of this study was to assess our clinical and radiological results with this technique in a series managed in our department. Nineteen traumatic septic non-unions of the tibia were included in a retrospective single-center study between November 2007 and November 2014. All patients were followed up clinically and radiologically to assess bone union time. Multivariate analysis was used to identify factors influencing union. The series comprised 4 women and 14 men (19 legs); mean age was 53.9 years. Vascularized flap transfer was required in 26% of cases before the first stage of treatment. All patients underwent a two-step procedure, with a mean interval of 7.9 weeks. Mean bone defect after the first step was 52.4mm. The bone graft was harvested from the iliac crest in the majority of cases (18/19). The bone was stabilized with an external fixator, locking plate or plaster cast after the second step. Mean follow-up was 34 months. Bony union rate was 89% (17/19), at a mean 16 months after step 2. Eleven patients underwent one or more (mean 2.1) complementary procedures. Severity of index fracture skin opening was significantly correlated with union time (Gustilo III vs. Gustilo I or II, p=0.028). A trend was found for negative impact of smoking on union (p=0.06). Bone defect size did not correlate with union rate or time. The union rate was acceptable, at 89%, but with longer union time than reported in the literature. Many factors could explain this: lack of rigid fixation after step 2 (in case of plaster cast or external fixator

  6. Artificial Intelligence Techniques for Automatic Screening of Amblyogenic Factors

    PubMed Central

    Van Eenwyk, Jonathan; Agah, Arvin; Giangiacomo, Joseph; Cibis, Gerhard

    2008-01-01

    Purpose To develop a low-cost automated video system to effectively screen children aged 6 months to 6 years for amblyogenic factors. Methods In 1994 one of the authors (G.C.) described video vision development assessment, a digitizable analog video-based system combining Brückner pupil red reflex imaging and eccentric photorefraction to screen young children for amblyogenic factors. With this system the images were analyzed manually. We automated the capture of digital video frames and pupil images and applied computer vision and artificial intelligence to analyze and interpret results. The artificial intelligence systems were evaluated by a tenfold testing method. Results The best system was the decision tree learning approach, which had an accuracy of 77%, compared to the “gold standard” specialist examination with a “refer/do not refer” decision. Criteria for referral were strabismus, including microtropia, and refractive errors and anisometropia considered to be amblyogenic. Eighty-two percent of strabismic individuals were correctly identified. High refractive errors were also correctly identified and referred 90% of the time, as was significant anisometropia. The program was less accurate in identifying more moderate refractive errors, below +5 and less than −7. Conclusions Although we are pursuing a variety of avenues to improve the accuracy of the automated analysis, the program in its present form provides acceptable cost benefits for detecting amblyogenic factors in children aged 6 months to 6 years. PMID:19277222

  7. Noncontact techniques for diesel engine diagnostics using exhaust waveform analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gore, D.A.; Cooke, G.J.

    1987-01-01

    RCA Corporation's continuing efforts to develop noncontact test techniques for diesel engines have led to recent advancements in deep engine diagnostics. The U.S. Army Tank-Automotive Command (TACOM) has been working with RCA on the development of new noncontact sensors and test techniques which use these sensors in conjunction with their family of Simplified Test Equipment (STE) to perform vehicle diagnostics. The STE systems are microprocessor-based maintenance tools that assist the Army mechanic in diagnosing malfunctions in both tactical and combat vehicles. The test systems support the mechanic by providing the sophisticated signal processing capabilities necessary for a wide range of diagnostic testing, including exhaust waveform analysis.

  8. Sampling and analysis techniques for monitoring serum for trace elements.

    PubMed

    Ericson, S P; McHalsky, M L; Rabinow, B E; Kronholm, K G; Arceo, C S; Weltzer, J A; Ayd, S W

    1986-07-01

    We describe techniques for controlling contamination in the sampling and analysis of human serum for trace metals. The relatively simple procedures do not require clean-room conditions. The atomic absorption and atomic emission methods used have been applied in studying zinc, copper, chromium, manganese, molybdenum, selenium, and aluminum concentrations. Values obtained for a group of 16 normal subjects agree with the most reliable values reported in the literature, obtained by much more elaborate techniques. All of these metals can be measured in 3 to 4 mL of serum. The methods may prove especially useful in monitoring concentrations of essential trace elements in blood of patients being maintained on total parenteral nutrition.

  9. Factors associated with sealant outcome in 2 pediatric dental clinics: a multivariate hierarchical analysis.

    PubMed

    West, Nathan G; Ilief-Ala, Melina A; Douglass, Joanna M; Hagadorn, James I

    2011-01-01

    This study's purpose was to determine whether one-time sealants placed by pediatric dental residents vs dental students have different outcomes. The effect of isolation technique, behavior, duration of follow-up, and caries history was also examined. Records from 2 inner-city pediatric dental clinics were audited for 6- to 10-year-old patients with a permanent first molar sealant with at least 2 years of follow-up. A successful sealant was a one-time sealant that received no further treatment and was sealed or unsealed but not carious or restored at the final audit. Charts from 203 children with 481 sealants were audited. Of these, 281 sealants were failures. Univariate analysis revealed longer follow-up and younger age were associated with sealant failure. Operator type, child behavior, and isolation technique were not associated with sealant failure. After adjusting for follow-up duration, increased age at treatment reduced the odds of sealant failure while a history of caries reduced the protective effect of increased age. After adjusting for these factors, practitioner type, behavior, and type of isolation were not associated with sealant outcome in multivariate analysis. Age at sealant placement, history of caries prior to placement, and longer duration of follow-up are associated with sealant failure.

  10. [Analysis of lifestyle and risk factors of atherosclerosis in students of selected universities in Krakow].

    PubMed

    Skrzypek, Agnieszka; Szeliga, Marta; Stalmach-Przygoda, Agata; Kowalska, Bogumila; Jabłoński, Konrad; Nowakowski, Michal

    Reduction of atherosclerosis risk factors and lifestyle modification significantly reduce the incidence, morbidity, and mortality of cardiovascular diseases (CVDs). Objective: To evaluate cardiovascular risk factors and analyze the lifestyle of students finishing the first year of studies at selected universities in Krakow. The study was performed in 2015. A total of 566 students finishing the first year of study were examined, including 319 (56.4%) men and 247 (43.6%) women. The students were aged 18 to 27 years (mean 20.11 ± 1.15 years) and represented 6 different universities in Krakow. A diagnostic survey using a questionnaire technique was used to assess eating habits and lifestyle and to analyze cardiovascular risk factors. BMI was calculated from anthropometric measurements. Statistica 12.0 was used for statistical analysis. The analysis showed that students of UR and AWF consumed the most fruits and vegetables, and students of AGH the least. Only 34.8% of students regularly consumed sea fish, with no significant differences between universities. Sports were practiced most frequently by students of AWF (93% of the students of this university). Academy of Fine Arts (ASP) students drank the most coffee, and students of AGH consumed alcohol most frequently. Sixty percent of all students had never tried drugs, but only 25.7% of ASP students had never tried drugs. Overweight occurred in 12.6% of students and obesity in 1.1%. Risk factors of atherosclerosis were most prevalent among students of AGH and ASP. The results of the study clearly indicate the need to implement prevention programmes and improve health behaviours among students of the AGH and ASP universities.

  11. Analysis of factors in successful nasal endoscopic resection of nasopharyngeal angiofibroma.

    PubMed

    Ye, Dong; Shen, Zhisen; Wang, Guoli; Deng, Hongxia; Qiu, Shijie; Zhang, Yuna

    2016-01-01

    Endoscopic resection of nasopharyngeal angiofibroma is less traumatic, causes less bleeding, and provides a good curative effect. With pre-operative embolization and controlled hypotension, reasonable surgical strategies and techniques lead to successful resection of tumors up to Andrews-Fisch classification stage III. To investigate surgical indications, methods, surgical technique, and curative effects of transnasal endoscopic resection of nasopharyngeal angiofibroma, this study evaluated factors that improve diagnosis and treatment, prevent large intra-operative blood loss and residual tumor, and increase the cure rate. A retrospective analysis was performed of the clinical data and treatment programs of 23 patients with nasopharyngeal angiofibroma who underwent endoscopic resection with pre-operative embolization and controlled hypotension. The surgical method applied was based on the size of the tumor and the extent of invasion. Curative effects were observed. No intra-operative or perioperative complications were observed in 22 patients. Upon removal of nasal packing material 3-7 days post-operatively, one patient experienced heavy bleeding of the nasopharyngeal wound, which was treated with compression hemostasis using post-nasal packing. Twenty-three patients were followed up for 6-60 months. Twenty-two patients were cured; one patient experienced recurrence 10 months post-operatively and was cured after repeat nasal endoscopic surgery.

  12. Metabolomic analysis using porcine skin: a pilot study of analytical techniques.

    PubMed

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-06-15

    Metabolic byproducts serve as indicators of the underlying chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction of skin samples serve as a critical foundation to this field but have not been developed. We sought to determine the optimal cell lysis techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. Using porcine skin samples, we pulverized the skin via various combinations of mechanical techniques for cell lysis. After extraction, the samples were subjected to GC-TOF-MS and/or UHPLC-QTOF-MS. Signal intensities from GC-TOF-MS analysis showed that ultrasonication (2.7×10^7) was most effective for cell lysis when compared to mortar-and-pestle (2.6×10^7), ball mill followed by ultrasonication (1.6×10^7), mortar-and-pestle followed by ultrasonication (1.4×10^7), and homogenization (trial 1: 8.4×10^6; trial 2: 1.6×10^7). Due to the similar signal intensities, ultrasonication and mortar-and-pestle were applied to additional samples and subjected to GC-TOF-MS and UHPLC-QTOF-MS. Ultrasonication yielded greater signal intensities than mortar-and-pestle for 92% of detected metabolites following GC-TOF-MS and for 68% of detected metabolites following UHPLC-QTOF-MS. Overall, ultrasonication is the preferred method for efficient cell lysis of skin tissue for both metabolomic platforms. With standardized sample preparation, metabolomic analysis of skin can serve as a powerful tool in elucidating underlying biological processes in dermatological conditions.

  13. Epidemiological analysis of factors influencing rate of progress in Echinococcus granulosus control in New Zealand.

    PubMed Central

    Burridge, M. J.; Schwabe, C. W.

    1977-01-01

    The factors influencing the rate of progress in Echinococcus granulosus control in New Zealand were analysed by hydatid control area using stepwise multiple regression techniques. The results indicated that the rate of progress was related positively to initial E. granulosus prevalence in dogs and the efficiency with which local authorities implemented national control policy, and negatively to the Maori proportion in the local population and the number of dogs per owner. Problems in analysis of the New Zealand data are discussed and improved methods of monitoring progress in hydatid disease control programmes are described. PMID:265340

  14. Comparative factor analysis models for an empirical study of EEG data, II: A data-guided resolution of the rotation indeterminacy.

    PubMed

    Rogers, L J; Douglas, R R

    1984-02-01

    In this paper (the second in a series), we consider a (generic) pair of datasets, which have been analyzed by the techniques of the previous paper. Thus, their "stable subspaces" have been established by comparative factor analysis. The pair of datasets must satisfy two confirmable conditions. The first is the "Inclusion Condition," which requires that the stable subspace of one of the datasets is nearly identical to a subspace of the other dataset's stable subspace. On the basis of that, we have assumed the pair to have similar generating signals, with stochastically independent generators. The second verifiable condition is that the (presumed same) generating signals have distinct ratios of variances for the two datasets. Under these conditions a small elaboration of some elementary linear algebra reduces the rotation problem to several eigenvalue-eigenvector problems. Finally, we emphasize that an analysis of each dataset by the method of Douglas and Rogers (1983) is an essential prerequisite for the useful application of the techniques in this paper. Nonempirical methods of estimating the number of factors simply will not suffice, as confirmed by simulations reported in the previous paper.
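
    The core algebraic step can be sketched concretely. Assuming, as the abstract states, that both datasets share loadings A with independent generators whose variance ratios differ, the covariances projected onto the stable subspace satisfy B1 = M D1 M^T and B2 = M D2 M^T with M invertible, so a generalized symmetric eigenproblem recovers the factor directions up to sign and scale. A minimal numpy/scipy sketch under these assumptions (all names hypothetical; the published method involves further conditions and checks):

```python
import numpy as np
from scipy.linalg import eigh

def recover_loadings(C1, C2, P):
    """Sketch: resolve the rotation given two covariance matrices C1, C2
    and an orthonormal basis P (columns) of the shared stable subspace."""
    B1 = P.T @ C1 @ P              # project both covariances onto the subspace
    B2 = P.T @ C2 @ P
    # Generalized eigenproblem B1 w = lam * B2 w: with distinct variance
    # ratios d1_i/d2_i, the eigenvector matrix W equals M^{-T} up to
    # column scaling, where M = P^T A.
    lam, W = eigh(B1, B2)
    A = P @ np.linalg.inv(W).T     # loadings, each column up to sign/scale
    return lam, A
```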

  15. Successive ion layer adsorption and reaction (SILAR) technique synthesis of Al(III)-8-hydroxy-5-nitrosoquinolate nano-sized thin films: characterization and factors optimization.

    PubMed

    Haggag, Sawsan M S; Farag, A A M; Abdel Refea, M

    2013-02-01

    Nano Al(III)-8-hydroxy-5-nitrosoquinolate [Al(III)-(HNOQ)(3)] thin films were synthesized by the rapid, direct, simple and efficient successive ion layer adsorption and reaction (SILAR) technique. Factors for optimizing thin-film formation were evaluated. Stoichiometry and structure were confirmed by elemental analysis and FT-IR. The particle size (27-71 nm) was determined using a scanning electron microscope (SEM). Thermal stability and thermal parameters were determined by thermal gravimetric analysis (TGA). Optical properties were investigated using spectrophotometric measurements of transmittance and reflectance at normal incidence. The refractive index, n, and absorption index, k, were determined. Spectral behavior of the absorption coefficient in the intrinsic absorption region revealed a direct allowed transition with a 2.45 eV band gap. The current-voltage (I-V) characteristics of the [Al(III)-(HNOQ)(3)]/p-Si heterojunction were measured at room temperature. The forward and reverse I-V characteristics were analyzed. The calculated zero-bias barrier height (Φ(b)) and ideality factor (n) showed strong bias dependence. The energy distribution of interface states (N(ss)) was obtained. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. Application of commercial aircraft accident investigation techniques to a railroad derailment.

    DOT National Transportation Integrated Search

    1973-01-01

    Crash investigation techniques utilized by human factors teams in investigating commercial airline crashes have been applied in the analysis of a railroad train derailment - crash. Passengers in cars that remained upright experienced very low deceler...

  17. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    PubMed

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

    Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for application of principal component analysis (PCA) in mass spectrometry and focused on two whole spectrum based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities were implemented: ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For the three tested cultivation media only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied on data normalized by the two different normalization techniques. Results from matched peak data and subsequent detailed full spectrum analysis identified only two different metabolic patterns - cultivation on Enterobacter sakazakii Isolation Agar showed significant differences from cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved to depend on cultivation time. Both whole spectrum based normalization techniques together with the full spectrum PCA allow identification of important discriminative factors in experiments with several variable condition factors, avoiding any problems with improper identification of peaks or emphasis on below-threshold peak data. The amount of processed data remains manageable. Both implemented software utilities are available
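
    As a rough illustration of the whole-spectrum workflow described above, the sketch below applies a total-intensity normalization to each spectrum and then PCA, with no peak picking. The data array and its dimensions are hypothetical stand-ins; the ms-alone/multiMS-toolbox utilities perform considerably more preprocessing:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical data: one row per spectrum, one column per m/z bin.
spectra = np.random.rand(24, 20000)  # stand-in for MALDI-TOF intensities

# Whole-spectrum normalization: divide each spectrum by its total
# intensity so runs with different absolute signal are comparable.
totals = spectra.sum(axis=1, keepdims=True)
normalized = spectra / totals

# PCA on the full normalized spectra -- no peak identification required.
pca = PCA(n_components=2)
scores = pca.fit_transform(normalized)
print(pca.explained_variance_ratio_)  # variance captured by each component
```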

  18. Behavior Change Techniques in Apps for Medication Adherence: A Content Analysis.

    PubMed

    Morrissey, Eimear C; Corbett, Teresa K; Walsh, Jane C; Molloy, Gerard J

    2016-05-01

    There are a vast number of smartphone applications (apps) aimed at promoting medication adherence on the market; however, the theory and evidence base in terms of applying established health behavior change techniques underpinning these apps remains unclear. This study aimed to code these apps using the Behavior Change Technique Taxonomy (v1) for the presence or absence of established behavior change techniques. The sample of apps was identified through systematic searches in both the Google Play Store and Apple App Store in February 2015. All apps that fell into the search categories were downloaded for analysis. The downloaded apps were screened with exclusion criteria, and suitable apps were reviewed and coded for behavior change techniques in March 2015. Two researchers performed coding independently. In total, 166 medication adherence apps were identified and coded. The number of behavior change techniques contained in an app ranged from zero to seven (mean=2.77). A total of 12 of a possible 96 behavior change techniques were found to be present across apps. The most commonly included behavior change techniques were "action planning" and "prompt/cues," which were included in 96% of apps, followed by "self-monitoring" (37%) and "feedback on behavior" (36%). The current extent to which established behavior change techniques are used in medication adherence apps is limited. The development of medication adherence apps may not have benefited from advances in the theory and practice of health behavior change. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  19. A replication of a factor analysis of motivations for trapping

    USGS Publications Warehouse

    Schroeder, Susan; Fulton, David C.

    2015-01-01

    Using a 2013 sample of Minnesota trappers, we employed confirmatory factor analysis to replicate an exploratory factor analysis of trapping motivations conducted by Daigle, Muth, Zwick, and Glass (1998). We employed the same 25 items used by Daigle et al. and tested the same five-factor structure using a recent sample of Minnesota trappers. We also compared motivations in our sample to those reported by Daigle et al.

  20. On the Relations among Regular, Equal Unique Variances, and Image Factor Analysis Models.

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Bentler, Peter M.

    2000-01-01

    Investigated the conditions under which the matrix of factor loadings from the factor analysis model with equal unique variances will give a good approximation to the matrix of factor loadings from the regular factor analysis model. Extends the results to the image factor analysis model. Discusses implications for practice. (SLD)

  1. Fit Analysis of Different Framework Fabrication Techniques for Implant-Supported Partial Prostheses.

    PubMed

    Spazzin, Aloísio Oro; Bacchi, Atais; Trevisani, Alexandre; Farina, Ana Paula; Dos Santos, Mateus Bertolini

    2016-01-01

    This study evaluated the vertical misfit of implant-supported frameworks made using different techniques to obtain passive fit. Thirty three-unit fixed partial dentures were fabricated in cobalt-chromium alloy (n = 10) using three fabrication methods: one-piece casting, framework cemented on prepared abutments, and laser welding. The vertical misfit between the frameworks and the abutments was evaluated with an optical microscope using the single-screw test. Data were analyzed using one-way analysis of variance and Tukey test (α = .05). The one-piece casted frameworks presented significantly higher vertical misfit values than those found for framework cemented on prepared abutments and laser welding techniques (P < .001 and P < .003, respectively). Laser welding and framework cemented on prepared abutments are effective techniques to improve the adaptation of three-unit implant-supported prostheses. These techniques presented similar fit.

  2. The support-control continuum: An investigation of staff perspectives on factors influencing the success or failure of de-escalation techniques for the management of violence and aggression in mental health settings.

    PubMed

    Price, Owen; Baker, John; Bee, Penny; Lovell, Karina

    2018-01-01

    De-escalation techniques are recommended to manage violence and aggression in mental health settings, yet restrictive practices continue to be frequently used. Barriers and enablers to the implementation and effectiveness of de-escalation techniques in practice are not well understood. To obtain staff descriptions of de-escalation techniques currently used in mental health settings and explore factors perceived to influence their implementation and effectiveness. Qualitative, semi-structured interviews and Framework Analysis. Five in-patient wards including three male psychiatric intensive care units, one female acute ward and one male acute ward in three UK Mental Health NHS Trusts. 20 ward-based clinical staff. Individual semi-structured interviews were digitally recorded, transcribed verbatim and analysed using a qualitative data analysis software package. Participants described 14 techniques used in response to escalated aggression, applied on a continuum between support and control. Techniques along the support-control continuum could be classified in three groups: 'support' (e.g. problem-solving, distraction, reassurance), 'non-physical control' (e.g. reprimands, deterrents, instruction) and 'physical control' (e.g. physical restraint and seclusion). Charting the reasoning staff provided for technique selection against the described behavioural outcome enabled a preliminary understanding of staff, patient and environmental influences on de-escalation success or failure. Importantly, the more coercive 'non-physical control' techniques are currently conceptualised by staff as a feature of de-escalation techniques, yet there was evidence of a link between these and increased aggression/use of restrictive practices. Risk was not a consistent factor in decisions to adopt more controlling techniques. Moral judgements regarding the function of the aggression; trial-and-error; ingrained local custom (especially around instruction to low stimulus areas); knowledge of

  3. Figure analysis: A teaching technique to promote visual literacy and active Learning.

    PubMed

    Wiles, Amy M

    2016-07-08

    Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based biology courses. An additional challenge is that visual literacy is often overlooked in undergraduate science education. To address both of these challenges, a technique called figure analysis was developed and implemented in three different levels of undergraduate biology courses. Here, students learn content while gaining practice in interpreting visual information by discussing figures with their peers. Student groups also make connections between new and previously learned concepts on their own while in class. The instructor summarizes the material for the class only after students grapple with it in small groups. Students reported a preference for learning by figure analysis over traditional lecture, and female students in particular reported increased confidence in their analytical abilities. There is not a technology requirement for this technique; therefore, it may be utilized both in classrooms and in nontraditional spaces. Additionally, the amount of preparation required is comparable to that of a traditional lecture. © 2016 by The International Union of Biochemistry and Molecular Biology, 44(4):336-344, 2016.

  4. A study of the stress wave factor technique for the characterization of composite materials

    NASA Technical Reports Server (NTRS)

    Govada, A. K.; Duke, J. C., Jr.; Henneke, E. G., II; Stinchcomb, W. W.

    1985-01-01

    This study has investigated the potential of the Stress Wave Factor as an NDT technique for thin composite laminates. The conventional SWF and an alternate method for quantifying the SWF were investigated. Agreement between the initial SWF number, ultrasonic C-scan, inplane displacements as obtained by full field moire interferometry, and the failure location has been observed. The SWF number was observed to be the highest when measured along the fiber direction and the lowest when measured across the fibers. The alternate method for quantifying the SWF used the square root of the zeroth moment (√M₀) of the frequency spectrum of the received signal as a quantitative parameter. From this study it therefore appears that the stress wave factor has an excellent potential to monitor damage development in thin composite laminates.
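
    For concreteness, the alternate SWF parameter can be written out. Assuming the conventional definition of the zeroth spectral moment (the study's exact normalization is not reproduced here):

```latex
M_0 \;=\; \int_0^{\infty} S(f)\,\mathrm{d}f, \qquad \mathrm{SWF}_{\mathrm{alt}} \;=\; \sqrt{M_0}
```

    where S(f) is the power spectrum of the received stress-wave signal, so that √M₀ scales with the overall energy transmitted through the laminate.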

  5. A Spatiotemporal Analysis of Extreme Heat Vulnerability Across the United States using Geospatial Techniques

    NASA Astrophysics Data System (ADS)

    Schoessow, F. S.; Li, Y.; Howe, P. D.

    2016-12-01

    Extreme heat events are the deadliest natural hazard in the United States and are expected to increase in both severity and frequency in the coming years due to the effects of climate change. The risks of climate change and weather-related events such as heat waves to a population can be more comprehensively assessed by coupling the traditional examination of natural hazards using remote sensing and geospatial analysis techniques with human vulnerability factors and individual perceptions of hazards. By analyzing remotely sensed and empirical survey data alongside national hazards advisories, this study endeavors to establish a nationally representative baseline quantifying the spatiotemporal variation of individual heat vulnerabilities at multiple scales and between disparate population groups affected by their unique socioenvironmental factors. This is of immediate academic interest because the study of heat-wave risk perceptions remains relatively unexplored - despite the intensification of extreme heat events. The use of "human sensors", georeferenced and timestamped individual response data, provides invaluable contextualized data at a high spatial resolution, which will enable policy-makers to more effectively implement targeted strategies for risk prevention, mitigation, and communication. As climate change risks are further defined, this cognizance will help identify vulnerable populations and enhance national hazard preparedness and recovery frameworks.

  6. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale

    PubMed Central

    Sediyama, Cristina Y. N.; Moura, Ricardo; Garcia, Marina S.; da Silva, Antonio G.; Soraggi, Carolina; Neves, Fernando S.; Albuquerque, Maicon R.; Whiteside, Setephen P.; Malloy-Diniz, Leandro F.

    2017-01-01

    Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency, (b) lack of premeditation; (c) lack of perseverance; (d) sensation seeking. In the present study 384 participants (278 women and 106 men), recruited from schools, universities, leisure centers and workplaces, completed the UPPS scale. An exploratory factor analysis was performed using Varimax factor rotation and Kaiser Normalization, and we also conducted two confirmatory analyses to test the independence of the UPPS components found in previous analyses. Results: Results showed a decrease in mean UPPS total scores with age; the youngest participants (below 30 years) scored significantly higher than the groups over 30 years. No gender difference was found. Cronbach’s alpha results indicated satisfactory values for all subscales, with similarly high values across subscales, whereas confirmatory factor analysis indexes indicated a poor model fit. The results of the two exploratory factor analyses were satisfactory. Conclusion: Our results showed that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS. PMID:28484414
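
    For readers who want to run a comparable analysis, here is a minimal sketch of a four-factor exploratory factor analysis with varimax rotation using scikit-learn. The response matrix is a random stand-in, and the authors' actual software and estimation settings are not specified here:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical responses: 384 participants x 40 UPPS items.
X = np.random.randint(1, 5, size=(384, 40)).astype(float)

# Four factors with varimax rotation, mirroring the UPPS structure:
# urgency, lack of premeditation, lack of perseverance, sensation seeking.
fa = FactorAnalysis(n_components=4, rotation="varimax")
fa.fit(X)
loadings = fa.components_.T  # items x factors loading matrix
print(loadings.shape)        # (40, 4)
```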

  7. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale.

    PubMed

    Sediyama, Cristina Y N; Moura, Ricardo; Garcia, Marina S; da Silva, Antonio G; Soraggi, Carolina; Neves, Fernando S; Albuquerque, Maicon R; Whiteside, Setephen P; Malloy-Diniz, Leandro F

    2017-01-01

    Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency, (b) lack of premeditation; (c) lack of perseverance; (d) sensation seeking. In the present study 384 participants (278 women and 106 men), recruited from schools, universities, leisure centers and workplaces, completed the UPPS scale. An exploratory factor analysis was performed using Varimax factor rotation and Kaiser Normalization, and we also conducted two confirmatory analyses to test the independence of the UPPS components found in previous analyses. Results: Results showed a decrease in mean UPPS total scores with age; the youngest participants (below 30 years) scored significantly higher than the groups over 30 years. No gender difference was found. Cronbach's alpha results indicated satisfactory values for all subscales, with similarly high values across subscales, whereas confirmatory factor analysis indexes indicated a poor model fit. The results of the two exploratory factor analyses were satisfactory. Conclusion: Our results showed that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS.

  8. Hyphenated analytical techniques for materials characterisation

    NASA Astrophysics Data System (ADS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surface, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  9. Use of Several Thermal Analysis Techniques on a Hypalon Paint Coating for the Solid Rocket Booster (SRB) of the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Wingard, Charles D.; Whitaker, Ann F. (Technical Monitor)

    2000-01-01

    White Hypalon paint is brush-applied as a moisture barrier coating over cork surfaces on each of the two Space Shuttle SRBs. Fine cracks have been observed in the Hypalon coating three times historically on laboratory witness panels, but never on flight hardware. Samples of the cracked and standard ("good") Hypalon were removed from witness panel cork surfaces and were tested in 1998 by the Thermogravimetric Analysis (TGA), Thermomechanical Analysis (TMA) and Differential Scanning Calorimetry (DSC) thermal analysis techniques. The TGA data showed that at 700°C, where only paint pigment solids remain, the cracked material had about 9 weight percent more material remaining than the standard material, probably indicating incomplete mixing of the paint before it was brush-applied to produce the cracked material. Use of the TMA film/fiber technique showed that the average modulus (stiffness) vs. temperature was about 3 to 6 times higher for the cracked material than for the standard material. The TMA data also showed that an increase in coating thickness for the cracked Hypalon was not a factor in the anomaly.

  10. A Factor Analysis of Learning Data and Selected Ability Test Scores

    ERIC Educational Resources Information Center

    Jones, Dorothy L.

    1976-01-01

    A verbal concept-learning task permitting the externalizing and quantifying of learning behavior and 16 ability tests were administered to female graduate students. Data were analyzed by alpha factor analysis and incomplete image analysis. Six alpha factors and 12 image factors were extracted and orthogonally rotated. Four areas of cognitive…

  11. Analysis of composite laminates with multiple fasteners by boundary collocation technique

    NASA Astrophysics Data System (ADS)

    Sergeev, Boris Anatolievich

    Mechanical fasteners remain the primary means of load transfer between structural components made of composite laminates. As operational loads continue to grow in pursuit of greater structural efficiency, the load carried by each fastener increases accordingly. This accelerates initiation of fatigue-related cracks near the fastener holes and increases the probability of failure. Therefore, the assessment of the stresses around the fastener holes and the stress intensity factors associated with edge cracks becomes critical for damage-tolerant design. Because of the presence of unknown contact stresses and the contact region between the fastener and the laminate, the analysis of a pin-loaded hole becomes considerably more complex than that of a traction-free hole. The accurate prediction of the contact stress distribution along the hole boundary is critical for determining the stress intensity factors and is essential for reliable strength evaluation and failure prediction. This study concerns the development of an analytical methodology, based on the boundary collocation technique, to determine the contact stresses and stress intensity factors required for strength and life prediction of bolted joints with many fasteners. It provides an analytical capability for determining the non-linear contact stresses in mechanically fastened composite laminates while capturing the effects of finite geometry, presence of edge cracks, interaction among fasteners, material anisotropy, fastener flexibility, fastener-hole clearance, friction between the pin and the laminate, and by-pass loading. Also, the proposed approach permits the determination of the fastener load distribution, which significantly influences the failure load of a multi-fastener joint. The well-known influence of the fastener tightening torque (clamping force) on the load distribution among the different fasteners in a multi-fastener joint is taken into account by means of bi

  12. Comparison between ultrasound guided technique and digital palpation technique for radial artery cannulation in adult patients: An updated meta-analysis of randomized controlled trials.

    PubMed

    Bhattacharjee, Sulagna; Maitra, Souvik; Baidya, Dalim K

    2018-06-01

    Possible advantages and risks associated with ultrasound guided radial artery cannulation in comparison to the digital palpation guided method in adult patients are not fully known. We have compared ultrasound guided radial artery cannulation with the digital palpation technique in this meta-analysis. Meta-analysis of randomized controlled trials. Trials conducted in operating room, emergency department, cardiac catheterization laboratory. PubMed and Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials in adult patients. Two-dimensional ultrasound guided radial artery catheterization versus digital palpation guided radial artery cannulation. Overall cannulation success rate, first attempt success rate, time to cannulation and mean number of attempts to successful cannulation. Odds ratio (OR) and standardized mean difference (SMD) or mean difference (MD) with 95% confidence interval (CI) were calculated for categorical and continuous variables respectively. Data of 1895 patients from 10 studies have been included in this meta-analysis. Overall cannulation success rate was similar between the ultrasound guided technique and digital palpation [OR (95% CI) 2.01 (1.00, 4.06); p = 0.05]. Ultrasound guided radial artery cannulation is associated with a higher first attempt success rate of radial artery cannulation in comparison to digital palpation [OR (95% CI) 2.76 (1.86, 4.10); p < 0.001]. No difference was seen in time to cannulate [SMD (95% CI) -0.31 (-0.65, 0.04); p = 0.30] or mean number of attempts [MD (95% CI) -0.65 (-1.32, 0.02); p = 0.06] between the USG guided technique and the palpation technique. Radial artery cannulation by ultrasound guidance may increase the first attempt success rate but not the overall cannulation success when compared to the digital palpation technique. However, results of this meta-analysis should be interpreted with caution due to the presence of
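
    To make the pooling arithmetic concrete, the sketch below performs inverse-variance pooling of log odds ratios with a DerSimonian-Laird random-effects adjustment, one standard way such summary ORs are computed; the study-level numbers are invented for illustration:

```python
import numpy as np

# Hypothetical per-study odds ratios and 95% CIs (lo, hi).
ors = np.array([2.1, 1.5, 3.0, 1.2])
ci_lo = np.array([1.1, 0.8, 1.6, 0.7])
ci_hi = np.array([4.0, 2.8, 5.6, 2.1])

y = np.log(ors)                                    # log odds ratios
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)  # SE recovered from CI width
w = 1 / se**2                                      # fixed-effect weights

# DerSimonian-Laird between-study variance tau^2.
y_fe = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fe) ** 2)
C = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - (len(y) - 1)) / C)

w_re = 1 / (se**2 + tau2)                          # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(np.exp(y_re),                                # pooled OR
      np.exp(y_re - 1.96 * se_re),                 # 95% CI lower bound
      np.exp(y_re + 1.96 * se_re))                 # 95% CI upper bound
```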

  13. Effect of various putty-wash impression techniques on marginal fit of cast crowns.

    PubMed

    Nissan, Joseph; Rosner, Ofir; Bukhari, Mohammed Amin; Ghelfan, Oded; Pilo, Raphael

    2013-01-01

    Marginal fit is an important clinical factor that affects restoration longevity. The accuracy of three polyvinyl siloxane putty-wash impression techniques was compared by marginal fit assessment using the nondestructive method. A stainless steel master cast containing three abutments with three metal crowns matching the three preparations was used to make 45 impressions: group A = single-step technique (putty and wash impression materials used simultaneously), group B = two-step technique with a 2-mm relief (putty as a preliminary impression to create a 2-mm wash space followed by the wash stage), and group C = two-step technique with a polyethylene spacer (plastic spacer used with the putty impression followed by the wash stage). Accuracy was assessed using a toolmaker microscope to measure and compare the marginal gaps between each crown and finish line on the duplicated stone casts. Each abutment was further measured at the mesial, buccal, and distal aspects. One-way analysis of variance was used for statistical analysis. P values and Scheffe post hoc contrasts were calculated. Significance was determined at .05. One-way analysis of variance showed significant differences among the three impression techniques in all three abutments and at all three locations (P < .001). Group B yielded dies with minimal gaps compared to groups A and C. The two-step impression technique with 2-mm relief was the most accurate regarding the crucial clinical factor of marginal fit.

  14. Biological risk factors for suicidal behaviors: a meta-analysis

    PubMed Central

    Chang, B P; Franklin, J C; Ribeiro, J D; Fox, K R; Bentley, K H; Kleiman, E M; Nock, M K

    2016-01-01

    Prior studies have proposed a wide range of potential biological risk factors for future suicidal behaviors. Although strong evidence exists for biological correlates of suicidal behaviors, it remains unclear if these correlates are also risk factors for suicidal behaviors. We performed a meta-analysis to integrate the existing literature on biological risk factors for suicidal behaviors and to determine their statistical significance. We conducted a systematic search of PubMed, PsycInfo and Google Scholar for studies that used a biological factor to predict either suicide attempt or death by suicide. Inclusion criteria included studies with at least one longitudinal analysis using a biological factor to predict either of these outcomes in any population through 2015. From an initial screen of 2541 studies we identified 94 cases. Random effects models were used for both meta-analyses and meta-regression. The combined effect of biological factors produced statistically significant but relatively weak prediction of suicide attempts (weighted mean odds ratio (wOR)=1.41; CI: 1.09–1.81) and suicide death (wOR=1.28; CI: 1.13–1.45). After accounting for publication bias, prediction was nonsignificant for both suicide attempts and suicide death. Only two factors remained significant after accounting for publication bias—cytokines (wOR=2.87; CI: 1.40–5.93) and low levels of fish oil nutrients (wOR=1.09; CI: 1.01–1.19). Our meta-analysis revealed that currently known biological factors are weak predictors of future suicidal behaviors. This conclusion should be interpreted within the context of the limitations of the existing literature, including long follow-up intervals and a lack of tests of interactions with other risk factors. Future studies addressing these limitations may more effectively test for potential biological risk factors. PMID:27622931

  15. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Yonggang

    In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, and digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.

  16. Hierarchical Factoring Based On Image Analysis And Orthoblique Rotations.

    PubMed

    Stankov, L

    1979-07-01

    The procedure for hierarchical factoring suggested by Schmid and Leiman (1957) is applied within the framework of image analysis and orthoblique rotational procedures. It is shown that this approach necessarily leads to correlated higher order factors. Also, one can obtain a smaller number of factors than produced by typical hierarchical procedures.
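
    The Schmid-Leiman step referenced here has a compact matrix form. For a single second-order factor with first-order loadings Λ₁ and second-order loadings γ = (γ₁, …, γᵣ)ᵀ, the orthogonalized hierarchical solution is (standard formulation, not quoted from the paper):

```latex
\Lambda_{\mathrm{SL}} \;=\; \Bigl[\; \Lambda_1 \gamma \;\Bigm|\; \Lambda_1\, \mathrm{diag}\!\bigl(\sqrt{1-\gamma_1^2},\,\ldots,\,\sqrt{1-\gamma_r^2}\,\bigr) \;\Bigr]
```

    where the first column holds loadings on the general factor and the remaining block holds residualized group-factor loadings.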

  17. Efficient geometric rectification techniques for spectral analysis algorithm

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Pang, S. S.; Curlander, J. C.

    1992-01-01

    The spectral analysis algorithm is a viable technique for processing synthetic aperture radar (SAR) data at near-real-time throughput rates by trading off image resolution. One major challenge of the spectral analysis algorithm is that the output image, often referred to as the range-Doppler image, is represented on iso-range and iso-Doppler lines, a curved grid format. This phenomenon is known as the fan-shape effect. Therefore, resampling is required to convert the range-Doppler image into a rectangular grid format before the individual images can be overlaid together to form seamless multi-look strip imagery. An efficient algorithm for geometric rectification of the range-Doppler image is presented. The proposed algorithm, realized in two one-dimensional resampling steps, takes into consideration the fan-shape phenomenon of the range-Doppler image as well as the high squint angle and updates of the cross-track and along-track Doppler parameters. No ground reference points are required.

  18. Application of Information-Theoretic Data Mining Techniques in a National Ambulatory Practice Outcomes Research Network

    PubMed Central

    Wright, Adam; Ricciardi, Thomas N.; Zwick, Martin

    2005-01-01

    The Medical Quality Improvement Consortium data warehouse contains de-identified data on more than 3.6 million patients including their problem lists, test results, procedures and medication lists. This study uses reconstructability analysis, an information-theoretic data mining technique, on the MQIC data warehouse to empirically identify risk factors for various complications of diabetes including myocardial infarction and microalbuminuria. The risk factors identified match those risk factors identified in the literature, demonstrating the utility of the MQIC data warehouse for outcomes research, and RA as a technique for mining clinical data warehouses. PMID:16779156
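
    Reconstructability analysis itself works with full multivariate probability structures, but its information-theoretic flavor can be suggested by a simple mutual-information screen that ranks candidate risk factors by how much they reduce uncertainty about an outcome. A minimal sketch (variable names and data are hypothetical, and this is not the RA algorithm used in the study):

```python
import numpy as np
from sklearn.metrics import mutual_info_score

# Hypothetical binary patient records: columns are candidate risk
# factors; outcome is 1 if the complication (e.g., MI) occurred.
rng = np.random.default_rng(0)
factors = rng.integers(0, 2, size=(1000, 5))
outcome = rng.integers(0, 2, size=1000)

# Rank candidate factors by mutual information with the outcome (nats).
for j in range(factors.shape[1]):
    mi = mutual_info_score(factors[:, j], outcome)
    print(f"factor {j}: MI = {mi:.4f}")
```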

  19. Understanding factors influencing vulnerable older people keeping warm and well in winter: a qualitative study using social marketing techniques

    PubMed Central

    Lusambili, Adelaide; Homer, Catherine; Abbott, Joanne; Cooke, Joanne Mary; Stocks, Amanda Jayne; McDaid, Kathleen Anne

    2012-01-01

    Objectives To understand the influences and decisions of vulnerable older people in relation to keeping warm in winter. Design A qualitative study incorporating in-depth, semi-structured individual and group interviews, framework analysis and social marketing segmentation techniques. Setting Rotherham, South Yorkshire, UK. Participants 50 older people (>55) and 25 health and social care staff underwent individual interview. The older people also had household temperature measurements. 24 older people and 19 health and social care staff participated in one of the six group interviews. Results Multiple complex factors emerged to explain whether vulnerable older people were able to keep warm. These influences combined in various ways that meant older people were not able to or preferred not to access help or change home heating behaviour. Factors influencing behaviours and decisions relating to use of heating, spending money, accessing cheaper tariffs, accessing benefits or asking for help fell into three main categories. These were situational and contextual factors, attitudes and values, and barriers. Barriers included poor knowledge and awareness, technology, disjointed systems and the invisibility of fuel and fuel payment. Findings formed the basis of a social marketing segmentation model used to develop six pen portraits that illustrate how factors conspire against older people being able to keep warm. Conclusions The findings illustrate how and why vulnerable older people may be at risk of a cold home. The pen portraits provide an accessible vehicle and reflective tool to raise the capacity of the NHS in responding to their needs in line with the Cold Weather Plan. PMID:22798252

  20. Understanding factors influencing vulnerable older people keeping warm and well in winter: a qualitative study using social marketing techniques.

    PubMed

    Tod, Angela Mary; Lusambili, Adelaide; Homer, Catherine; Abbott, Joanne; Cooke, Joanne Mary; Stocks, Amanda Jayne; McDaid, Kathleen Anne

    2012-01-01

    To understand the influences and decisions of vulnerable older people in relation to keeping warm in winter. A qualitative study incorporating in-depth, semi-structured individual and group interviews, framework analysis and social marketing segmentation techniques. Rotherham, South Yorkshire, UK. 50 older people (>55) and 25 health and social care staff underwent individual interview. The older people also had household temperature measurements. 24 older people and 19 health and social care staff participated in one of the six group interviews. Multiple complex factors emerged to explain whether vulnerable older people were able to keep warm. These influences combined in various ways that meant older people were not able to or preferred not to access help or change home heating behaviour. Factors influencing behaviours and decisions relating to use of heating, spending money, accessing cheaper tariffs, accessing benefits or asking for help fell into three main categories. These were situational and contextual factors, attitudes and values, and barriers. Barriers included poor knowledge and awareness, technology, disjointed systems and the invisibility of fuel and fuel payment. Findings formed the basis of a social marketing segmentation model used to develop six pen portraits that illustrate how factors conspire against older people being able to keep warm. The findings illustrate how and why vulnerable older people may be at risk of a cold home. The pen portraits provide an accessible vehicle and reflective tool to raise the capacity of the NHS in responding to their needs in line with the Cold Weather Plan.

  1. Human Factors Vehicle Displacement Analysis: Engineering In Motion

    NASA Technical Reports Server (NTRS)

    Atencio, Laura Ashley; Reynolds, David; Robertson, Clay

    2010-01-01

    While positioned on the launch pad at the Kennedy Space Center, tall stacked launch vehicles are exposed to the natural environment. Varying directional winds and vortex shedding cause the vehicle to sway in an oscillating motion. The Human Factors team recognizes that vehicle sway may hinder ground crew operations, impact the ground system designs, and ultimately affect launch availability. The objective of this study is to physically simulate predicted oscillation envelopes identified by analysis and to conduct a Human Factors Analysis to assess the ability to carry out essential Upper Stage (US) ground operator tasks based on predicted vehicle motion.

  2. Collection analysis techniques used to evaluate a graduate-level toxicology collection.

    PubMed

    Crawley-Low, Jill V

    2002-07-01

    Collections librarians from academic libraries are often asked, on short notice, to evaluate whether their collections are able to support changes in their institutions' curricula, such as new programs or courses or revisions to existing programs or courses. With insufficient time to perform an exhaustive critique of the collection and a need to prepare a report for faculty external to the library, a selection of reliable but brief qualitative and quantitative tests is needed. In this study, materials-centered and use-centered methods were chosen to evaluate the toxicology collection of the University of Saskatchewan (U of S) Library. Strengths and weaknesses of the techniques are reviewed, along with examples of their use in evaluating the toxicology collection. The monograph portion of the collection was evaluated using list checking, citation analysis, and classified profile methods. Cost-effectiveness and impact factor data were compiled to rank journals from the collection. Use-centered methods such as circulation and interlibrary loan data identified highly used items that should be added to the collection. Finally, although the data were insufficient to evaluate the toxicology electronic journals at the U of S, a brief discussion of three initiatives that aim to assist librarians as they evaluate the use of networked electronic resources in their collections is presented.

  3. Upper limb kinetic analysis of three sitting pivot wheelchair transfer techniques.

    PubMed

    Koontz, Alicia M; Kankipati, Padmaja; Lin, Yen-Sheng; Cooper, Rory A; Boninger, Michael L

    2011-11-01

    The objective of this study was to investigate differences in shoulder, elbow and hand kinetics while performing three different sitting pivot transfers (SPTs) that varied in terms of hand and trunk positioning. Fourteen unimpaired individuals (8 male and 6 female) performed three variations of sitting pivot transfers in a random order from a wheelchair to a level tub bench. Two transfers involved a forward flexed trunk (head-hips technique) and the third was performed with the trunk remaining upright. The two transfers involving a head-hips technique were performed with two different leading hand initial positions. Motion analysis equipment recorded upper body movements and force sensors recorded hand reaction forces. Shoulder and elbow joint and hand kinetics were computed for the lift phase of the transfer. Transferring using either of the head-hips techniques compared to the trunk upright style of transferring resulted in reduced superior forces at the shoulder (P<0.002), elbow (P<0.004) and hand (P<0.013). There was a significant increase in the medial forces in the leading elbow (P=0.049) for both head-hips transfers and the trailing hand for the head-hips technique with the arm further away from the body (P<0.028). The head-hips techniques resulted in higher shoulder external rotation, flexion and extension moments compared to the trunk upright technique (P<0.021). Varying the hand placement and trunk positioning during transfers changes the load distribution across all upper limb joints. The results of this study may be useful for determining a technique that helps preserve upper limb function over time. Published by Elsevier Ltd.

  4. KAP Surveys and Dengue Control in Colombia: Disentangling the Effect of Sociodemographic Factors Using Multiple Correspondence Analysis

    PubMed Central

    Quintero, Juliana

    2016-01-01

    During the last few decades, several studies have analyzed and described knowledge, attitudes, and practices (KAP) of populations regarding dengue. However, few studies have applied geometric data analytic techniques to generate indices from KAP domains, and results of such analyses have not been used to determine the potential effects of sociodemographic variables on the levels of KAP. The objective was to determine the sociodemographic factors related to different levels of KAP regarding dengue in two hyper-endemic cities of Colombia, using a multiple correspondence analysis (MCA) technique. In the context of a cluster randomized trial, 3,998 households were surveyed in Arauca and Armenia between 2012 and 2013. To generate KAP indexes, we performed an MCA followed by a hierarchical cluster analysis to classify each score in different groups. A quantile regression for each of the score groups was conducted. The KAP indexes explained 56.1%, 79.7%, and 83.2% of the variance, with means of 4.2, 1.4, and 3.2 and values that ranged from 1 to 7, 1 to 7, and 1 to 11, respectively. The highest values of the index denoted higher levels of knowledge and practices. The attitudes index did not show the same relationship and was excluded from the analysis. In the quantile regression, age (0.06; CI: 0.03, 0.09), years of education (0.14; CI: 0.06, 0.22), and history of dengue in the family (0.21; CI: 0.12, 0.31) were positively related to lower levels of knowledge regarding dengue. The effect of such factors gradually decreased or disappeared when knowledge was higher. The practices indexes did not show a correlation with sociodemographic variables. These results suggest that the transformation of categorical variables into a single index by the use of MCA is possible when analyzing knowledge and practices regarding dengue from KAP questionnaires. Additionally, the magnitude of the effect of socioeconomic variables on the knowledge scores varies according to the levels of knowledge, suggesting

  5. Bayesian Factor Analysis When Only a Sample Covariance Matrix Is Available

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Arav, Marina

    2006-01-01

    In traditional factor analysis, the variance-covariance matrix or the correlation matrix has often been a form of inputting data. In contrast, in Bayesian factor analysis, the entire data set is typically required to compute the posterior estimates, such as Bayes factor loadings and Bayes unique variances. We propose a simple method for computing…

  6. Evaluation of Parallel Analysis Methods for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.

    2010-01-01

    Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…
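
    Both retention criteria mentioned in the abstract are straightforward to state in code. A minimal numpy sketch of the PA-PCA variant: generate random datasets of the same size, then retain factors whose observed correlation-matrix eigenvalues exceed the mean (or the 95th percentile) of the corresponding random eigenvalues. The PA-PAF variant would substitute reduced correlation matrices:

```python
import numpy as np

def n_retained(obs, crit):
    """Count leading eigenvalues exceeding the criterion, stopping at
    the first failure (the usual parallel-analysis stopping rule)."""
    exceeds = obs > crit
    return int(np.argmax(~exceeds)) if not exceeds.all() else len(obs)

def parallel_analysis(X, n_sims=500, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
    rand = np.empty((n_sims, p))
    for s in range(n_sims):
        R = rng.standard_normal((n, p))  # random data of the same shape
        rand[s] = np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]
    return {
        "mean criterion": n_retained(obs, rand.mean(axis=0)),
        "95th percentile criterion": n_retained(obs, np.percentile(rand, 95, axis=0)),
    }
```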

  7. Automated processing of first-pass radionuclide angiocardiography by factor analysis of dynamic structures.

    PubMed

    Cavailloles, F; Bazin, J P; Capderou, A; Valette, H; Herbert, J L; Di Paola, R

    1987-05-01

    A method for automatic processing of cardiac first-pass radionuclide studies is presented. This technique, factor analysis of dynamic structures (FADS), provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. FADS has been applied to 76 studies. A description of factor patterns obtained in various pathological categories is presented. FADS provides easy diagnosis of shunts and tricuspid insufficiency. Quantitative information derived from the factors (cardiac output and mean transit time) was compared to that obtained by the region of interest method. Using FADS, a higher correlation with cardiac catheterization was found for cardiac output calculation. Thus compared to the ROI method, FADS presents obvious advantages: a good separation of overlapping cardiac chambers is obtained; this operator-independent method provides more objective and reproducible results. A number of parameters of the cardio-pulmonary function can be assessed by first-pass radionuclide angiocardiography (RNA) [1,2]. Usually, they are calculated using time-activity curves (TAC) from regions of interest (ROI) drawn on the cardiac chambers and the lungs. This method has two main drawbacks: (1) the lack of inter- and intra-observer reproducibility; (2) the problem of crosstalk which affects the evaluation of the cardio-pulmonary performance. The crosstalk on planar imaging is due to anatomical superimposition of the cardiac chambers and lungs. The activity measured in any ROI is the sum of the activity in several organs and 'decontamination' of the TAC cannot easily be performed using the ROI method [3]. Factor analysis of dynamic structures (FADS) [4,5] can solve the two problems mentioned above. It provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. The resulting factors are estimates of the time evolution of the activity in each
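
    FADS proper estimates nonnegative temporal factors and their spatial distributions under a factor model tailored to gamma-camera data; as a loose modern analogue (not the authors' algorithm), nonnegative matrix factorization of the pixel-by-time count matrix separates superimposed structures by their temporal behaviour:

```python
import numpy as np
from sklearn.decomposition import NMF

# Hypothetical first-pass study: 64x64 images over 120 time frames,
# flattened to a nonnegative (pixels x frames) matrix of counts.
counts = np.random.poisson(5.0, size=(64 * 64, 120)).astype(float)

# Three factors, e.g., right heart, lungs, left heart.
model = NMF(n_components=3, init="nndsvda", max_iter=500)
spatial = model.fit_transform(counts)  # pixels x factors: factor images
temporal = model.components_           # factors x frames: time-activity curves
```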

  8. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    PubMed

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  9. Self-Normalized Photoacoustic Technique for the Quantitative Analysis of Paper Pigments

    NASA Astrophysics Data System (ADS)

    Balderas-López, J. A.; Gómez y Gómez, Y. M.; Bautista-Ramírez, M. E.; Pescador-Rojas, J. A.; Martínez-Pérez, L.; Lomelí-Mejía, P. A.

    2018-03-01

    A self-normalized photoacoustic technique was applied for quantitative analysis of pigments embedded in solids. Paper samples (filter paper, Whatman No. 1) dyed with the pigment Direct Fast Turquoise Blue GL were used for this study. This pigment is a blue dye commonly used in industry to dye paper and other fabrics. The optical absorption coefficient, at a wavelength of 660 nm, was measured for this pigment at various concentrations in the paper substrate. It was shown that the Beer-Lambert model for light absorption applies well to pigments in solid substrates, and optical absorption coefficients as large as 220 cm^{-1} can be measured with this photoacoustic technique.
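
    The Beer-Lambert check reported above amounts to a linear fit of the measured absorption coefficient against pigment concentration; a minimal sketch with invented numbers (the study's raw data are not reproduced here):

        import numpy as np

        # Hypothetical pigment loadings (arbitrary units) and absorption
        # coefficients at 660 nm (cm^-1); Beer-Lambert predicts alpha = k * c.
        c = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
        alpha = np.array([14.1, 27.9, 55.6, 110.3, 221.0])

        k, intercept = np.polyfit(c, alpha, 1)
        r = np.corrcoef(c, alpha)[0, 1]
        print(f"slope={k:.1f}, intercept={intercept:.2f}, r={r:.4f}")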

  10. Paper simulation techniques in user requirements analysis for interactive computer systems

    NASA Technical Reports Server (NTRS)

    Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.

    1979-01-01

    This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer', as in normal man-in-the-loop simulation. In this procedure, though, the computer does not exist, but is simulated by the experimenters. This allows simulated problem solving early in the design effort, and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task.

  11. Analysis of the effects of geological and geomorphological factors on earthquake triggered landslides using artificial neural networks (ANN)

    NASA Astrophysics Data System (ADS)

    Kawabata, D.; Bandibas, J.

    2007-12-01

    The occurrence of landslides is the result of the interaction of complex and diverse environmental factors. Geomorphic and geologic features, rock types, and vegetative cover are important base factors of landslide occurrence, but determining the relationship between these factors and landslide occurrence is very difficult using conventional mathematical analysis, so advanced computing techniques are important for this kind of analysis. The artificial neural network (ANN) has recently been included in the list of analytical tools for a wide range of applications in the natural sciences. One of the advantages of using an ANN for pattern recognition is that it can handle data at any measurement scale, from nominal and ordinal to interval and ratio, and any form of data distribution (Wang et al., 1995). In addition, it can easily handle qualitative variables, which makes it widely used in integrated analysis of spatial data from multiple sources for prediction and classification. This study focuses on defining the relationship between geological factors and landslide occurrence using artificial neural networks, and on the effect of the DTM used (e.g., ASTER DTM, ALSM, DTMs digitized from paper maps, and digital photogrammetric measurement data). The main aim of the study is to generate a landslide susceptibility index map from the relationship defined using the ANN. Landslide data from the Chuetsu region, where the 2004 earthquake triggered many landslides, were used in this research. The initial results of the study showed that the ANN is more accurate in defining the relationship between geological and geomorphological factors and landslide occurrence. It also determined the best combination of geological and geomorphological factors that is directly related to landslide occurrence.
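
    A minimal sketch of the kind of ANN classification described, using scikit-learn's multilayer perceptron on per-cell terrain attributes; the arrays, file names, and network size are hypothetical and differ from the study's setup:

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # X: one row per terrain cell (slope, curvature, lithology code, ...);
        # y: 1 if a landslide occurred in the cell, else 0.
        X = np.load("cell_features.npy")
        y = np.load("landslide_labels.npy")

        clf = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000))
        clf.fit(X, y)
        susceptibility = clf.predict_proba(X)[:, 1]  # per-cell susceptibility index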

  12. Spatial analysis of leprosy incidence and associated socioeconomic factors.

    PubMed

    Cury, Maria Rita de Cassia Oliveira; Paschoal, Vania Del'Arco; Nardi, Susilene Maria Tonelli; Chierotti, Ana Patrícia; Rodrigues Júnior, Antonio Luiz; Chiaravalloti-Neto, Francisco

    2012-02-01

    To identify clusters of the major occurrences of leprosy and their associated socioeconomic and demographic factors. Cases of leprosy that occurred between 1998 and 2007 in São José do Rio Preto (southeastern Brazil) were geocoded and the incidence rates were calculated by census tract. A socioeconomic classification score was obtained using principal component analysis of socioeconomic variables. Thematic maps to visualize the spatial distribution of the incidence of leprosy with respect to socioeconomic levels and demographic density were constructed using geostatistics. While the incidence rate for the entire city was 10.4 cases per 100,000 inhabitants annually between 1998 and 2007, the incidence rates of individual census tracts were heterogeneous, with values that ranged from 0 to 26.9 cases per 100,000 inhabitants per year. Areas with a high leprosy incidence were associated with lower socioeconomic levels. Clusters of leprosy cases were identified; however, there was no association between disease incidence and demographic density. There was a disparity between the places where the majority of ill people lived and the location of healthcare services. The spatial analysis techniques utilized identified the poorer neighborhoods of the city as the areas with the highest risk for the disease. These data show that health departments must prioritize politico-administrative policies to minimize the effects of social inequality and improve the standards of living, hygiene, and education of the population in order to reduce the incidence of leprosy.
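
    The socioeconomic classification score mentioned above can be reproduced in outline as a first principal component; a sketch assuming a hypothetical array of census-tract indicators:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Rows: census tracts; columns: income, literacy, crowding, ... (hypothetical).
        raw = np.load("tract_socioeconomic.npy")
        Z = StandardScaler().fit_transform(raw)

        # The first principal component serves as a single socioeconomic score.
        score = PCA(n_components=1).fit_transform(Z).ravel()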

  13. Investigation of safety analysis methods using computer vision techniques

    NASA Astrophysics Data System (ADS)

    Shirazi, Mohammad Shokrolah; Morris, Brendan Tran

    2017-09-01

    This work investigates safety analysis methods using computer vision techniques. A vision-based tracking system is developed to provide the trajectories of road users, including vehicles and pedestrians. Safety analysis methods are developed to estimate time to collision (TTC) and post-encroachment time (PET), two important safety measures. The corresponding algorithms are presented, and their advantages and drawbacks are shown through their success in capturing conflict events in real time. The performance of the tracking system is evaluated first, and probability density estimates of TTC and PET are shown for 1 h of monitoring of a Las Vegas intersection. Finally, the idea of an intersection safety map is introduced, and TTC values of two different intersections are estimated for 1 day from 8:00 a.m. to 6:00 p.m.
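
    Both safety measures reduce to simple kinematics once trajectories are available. The sketch below assumes a constant-velocity, same-path conflict, which is a simplification of the paper's vision-based estimates:

        def time_to_collision(gap_m, v_follower, v_leader):
            """TTC in seconds; infinite when the follower is not closing the gap."""
            closing = v_follower - v_leader
            return gap_m / closing if closing > 0 else float("inf")

        def post_encroachment_time(t_first_leaves, t_second_arrives):
            """PET: time between the first road user leaving the conflict
            area and the second road user arriving at it."""
            return t_second_arrives - t_first_leaves

        print(time_to_collision(20.0, 15.0, 10.0))  # 4.0 s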

  14. Development of solution techniques for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Andrews, J. S.

    1974-01-01

    Nonlinear structural solution methods in the current research literature are classified according to the order of the solution scheme, and it is shown that the analytical tools for these methods are uniformly derivable by perturbation techniques. A new perturbation formulation is developed for treating an arbitrary nonlinear material, in terms of a finite-difference generated stress-strain expansion. Nonlinear geometric effects are included in an explicit manner by appropriate definition of an applicable strain tensor. A new finite-element pilot computer program PANES (Program for Analysis of Nonlinear Equilibrium and Stability) is presented for treatment of problems involving material and geometric nonlinearities, as well as certain forms of nonconservative loading.

  15. Deep Learning with Hierarchical Convolutional Factor Analysis

    PubMed Central

    Chen, Bo; Polatkan, Gungor; Sapiro, Guillermo; Blei, David; Dunson, David; Carin, Lawrence

    2013-01-01

    Unsupervised multi-layered (“deep”) models are considered for general data, with a particular focus on imagery. The model is represented using a hierarchical convolutional factor-analysis construction, with sparse factor loadings and scores. The computation of layer-dependent model parameters is implemented within a Bayesian setting, employing a Gibbs sampler and variational Bayesian (VB) analysis that explicitly exploit the convolutional nature of the expansion. In order to address large-scale and streaming data, an online version of VB is also developed. The number of basis functions or dictionary elements at each layer is inferred from the data, based on a beta-Bernoulli implementation of the Indian buffet process. Example results are presented for several image-processing applications, with comparisons to related models in the literature. PMID:23787342

  16. Factor Analysis by Generalized Least Squares.

    ERIC Educational Resources Information Center

    Joreskog, Karl G.; Goldberger, Arthur S.

    Aitken's generalized least squares (GLS) principle, with the inverse of the observed variance-covariance matrix as a weight matrix, is applied to estimate the factor analysis model in the exploratory (unrestricted) case. It is shown that the GLS estimates are scale free and asymptotically efficient. The estimates are computed by a rapidly…

  17. Exploratory Factor Analysis of a Force Concept Inventory Data Set

    ERIC Educational Resources Information Center

    Scott, Terry F.; Schumayer, Daniel; Gray, Andrew R.

    2012-01-01

    We perform a factor analysis on a "Force Concept Inventory" (FCI) data set collected from 2109 respondents. We address two questions: the appearance of conceptual coherence in student responses to the FCI and some consequences of this factor analysis on the teaching of Newtonian mechanics. We will highlight the apparent conflation of Newton's…

  18. Autonomous selection of PDE inpainting techniques vs. exemplar inpainting techniques for void fill of high resolution digital surface models

    NASA Astrophysics Data System (ADS)

    Rahmes, Mark; Yates, J. Harlan; Allen, Josef DeVaughn; Kelley, Patrick

    2007-04-01

    High resolution Digital Surface Models (DSMs) may contain voids (missing data) due to the data collection process used to obtain the DSM, inclement weather conditions, low returns, system errors/malfunctions for various collection platforms, and other factors. DSM voids are also created during bare earth processing where culture and vegetation features have been extracted. The Harris LiteSite™ Toolkit handles these void regions in DSMs via two novel techniques. We use both partial differential equations (PDEs) and exemplar based inpainting techniques to accurately fill voids. The PDE technique has its origin in fluid dynamics and heat equations (a particular subset of partial differential equations). The exemplar technique has its origin in texture analysis and image processing. Each technique is optimally suited for different input conditions. The PDE technique works better where the area to be void filled does not have disproportionately high frequency data in the neighborhood of the boundary of the void. Conversely, the exemplar based technique is better suited for high frequency areas. Both are autonomous with respect to detecting and repairing void regions. We describe a cohesive autonomous solution that dynamically selects the best technique as each void is being repaired.
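
    OpenCV ships a PDE-style (Navier-Stokes) inpainter alongside Telea's fast-marching method; neither is the Harris toolkit's exemplar engine, but they illustrate switching fill techniques per void. A sketch treating a DSM tile (hypothetical file) as a single-channel image:

        import cv2
        import numpy as np

        dsm = np.load("dsm_tile.npy").astype(np.float32)
        mask = np.isnan(dsm).astype(np.uint8)          # 1 where data is void
        img = cv2.normalize(np.nan_to_num(dsm), None, 0, 255,
                            cv2.NORM_MINMAX).astype(np.uint8)

        smooth_fill = cv2.inpaint(img, mask, 5, cv2.INPAINT_NS)      # PDE-based
        texture_fill = cv2.inpaint(img, mask, 5, cv2.INPAINT_TELEA)  # fast-marching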

  19. [Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].

    PubMed

    Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina

    2012-09-01

    The validation of a measurement tool in mental health is a complex process that usually starts with estimating reliability and later approaches validity. Factor analysis is a way to determine the number of dimensions, domains, or factors of a measuring tool, generally related to the construct validity of the scale. The analysis can be exploratory or confirmatory, and helps in selecting the items with the best performance. For an acceptable factor analysis, it is necessary to follow some steps and recommendations, conduct some statistical tests, and rely on a proper sample of participants. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
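
    A compact sketch of the recommended workflow (sampling-adequacy tests, then exploratory extraction with rotation), assuming the third-party factor_analyzer package and a hypothetical DataFrame of item responses:

        import pandas as pd
        from factor_analyzer import FactorAnalyzer
        from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                                     calculate_kmo)

        items = pd.read_csv("scale_items.csv")   # hypothetical item responses

        chi2, p = calculate_bartlett_sphericity(items)  # should be significant
        _, kmo = calculate_kmo(items)                   # rule of thumb: > 0.6

        fa = FactorAnalyzer(n_factors=2, rotation="varimax")
        fa.fit(items)
        print(fa.loadings_)  # retain items with the strongest loadings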

  20. Infrared Spectroscopy of Explosives Residues: Measurement Techniques and Spectral Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, Mark C.; Bernacki, Bruce E.

    2015-03-11

    Infrared laser spectroscopy of explosives is a promising technique for standoff and non-contact detection applications. However, the interpretation of spectra obtained in typical standoff measurement configurations presents numerous challenges. Understanding the variability in observed spectra from explosives residues and particles is crucial for design and implementation of detection algorithms with high detection confidence and low false alarm probability. We discuss a series of infrared spectroscopic techniques applied toward measuring and interpreting the reflectance spectra obtained from explosives particles and residues. These techniques utilize the high spectral radiance, broad tuning range, rapid wavelength tuning, high scan reproducibility, and low noise of an external cavity quantum cascade laser (ECQCL) system developed at Pacific Northwest National Laboratory. The ECQCL source permits measurements in configurations which would be either impractical or overly time-consuming with broadband, incoherent infrared sources, and enables a combination of rapid measurement speed and high detection sensitivity. The spectroscopic methods employed include standoff hyperspectral reflectance imaging, quantitative measurements of diffuse reflectance spectra, reflection-absorption infrared spectroscopy, microscopic imaging and spectroscopy, and nano-scale imaging and spectroscopy. Measurements of explosives particles and residues reveal important factors affecting observed reflectance spectra, including measurement geometry, substrate on which the explosives are deposited, and morphological effects such as particle shape, size, orientation, and crystal structure.

  1. Glyphosate analysis using sensors and electromigration separation techniques as alternatives to gas or liquid chromatography.

    PubMed

    Gauglitz, Günter; Wimmer, Benedikt; Melzer, Tanja; Huhn, Carolin

    2018-01-01

    Since its introduction in 1974, the herbicide glyphosate has experienced a tremendous increase in use, with about one million tons used annually today. This review focuses on sensors and electromigration separation techniques as alternatives to chromatographic methods for the analysis of glyphosate and its metabolite aminomethylphosphonic acid. Even with the large number of studies published, glyphosate analysis remains challenging. With its polar (and, depending on pH, even ionic) functional groups and its lack of a chromophore, it is difficult to analyze with chromatographic techniques, and its analysis is mostly achieved after derivatization. Its purification from food and environmental samples inevitably results in coextraction of ionic matrix components, with a further impact on analysis and also on derivatization reactions. Its ability to form chelates with metal cations is another obstacle to precise quantification. Lastly, the low limits of detection required by legislation have to be met. These challenges preclude glyphosate from being analyzed together with many other pesticides in common multiresidue (chromatographic) methods. For better monitoring of glyphosate in environmental and food samples, further fast and robust methods are required. In this review, analytical methods are summarized and discussed from the perspective of biosensors and various formats of electromigration separation techniques, including modes such as capillary electrophoresis and micellar electrokinetic chromatography, combined with various detection techniques. These methods are critically discussed with regard to matrix tolerance, limits of detection reached, and selectivity.

  2. Examination of fungi in domestic interiors by using factor analysis: Correlations and associations with home factors. [Cladosporium, Alternaria, Epicoccum, Aureobasidium, Aspergillus, Penicillium]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, H.J.; Rotnitzky, A.; Spengler, J.D.

    1992-01-01

    Factor analysis was utilized to investigate correlations among airborne microorganisms collected with Andersen samplers from homes in Topeka, Kans., during the winter of 1987 to 1988. The factors derived were used to relate microbial concentrations with categorical, questionnaire-derived descriptions of housing conditions. This approach successfully identified groups of common aboveground decay fungi including Cladosporium, Alternaria, Epicoccum, and Aureobasidium spp. The common soil fungi Aspergillus and Penicillium spp. were also separated as a group. These previously known ecological groupings were confirmed with air sampling data by a quantitative evaluation technique. The aboveground decay fungi sampled indoors in winter were present at relatively high concentrations in homes with gas stoves for cooking, suggesting a possible association between these fungi and increased humidity from the combustion process. Elevated concentrations of the soil fungi were significantly associated with the dirt-floor, crawl-space type of basement. Elevated concentrations of water-requiring fungi, such as Fusarium spp., were shown to be associated with water collection in domestic interiors. Also, elevated mean concentrations for the group of fungi including Cladosporium, Epicoccum, Aureobasidium, and yeast spp. were found to be associated with symptoms reported on a health questionnaire. This finding was consistent with the authors' previous study of associations between respiratory health and airborne microorganisms by univariate logistic regression analysis.

  3. Analysis of Factors Influencing Creative Personality of Elementary School Students

    ERIC Educational Resources Information Center

    Park, Jongman; Kim, Minkee; Jang, Shinho

    2017-01-01

    This quantitative research examined factors that affect elementary students' creativity and how those factors correlate. Aiming to identify significant factors that affect creativity and to clarify the relationship between these factors by path analysis, this research was designed to be a stepping stone for creativity enhancement studies. Data…

  4. Blood volume analysis: a new technique and new clinical interest reinvigorate a classic study.

    PubMed

    Manzone, Timothy A; Dam, Hung Q; Soltis, Daniel; Sagar, Vidya V

    2007-06-01

    Blood volume studies using the indicator dilution technique and radioactive tracers have been performed in nuclear medicine departments for over 50 y. A nuclear medicine study is the gold standard for blood volume measurement, but the classic dual-isotope blood volume study is time-consuming and can be prone to technical errors. Moreover, a lack of normal values and a rubric for interpretation made volume status measurement of limited interest to most clinicians other than some hematologists. A new semiautomated system for blood volume analysis is now available and provides highly accurate results for blood volume analysis within only 90 min. The availability of rapid, accurate blood volume analysis has brought about a surge of clinical interest in using blood volume data for clinical management. Blood volume analysis, long a low-volume nuclear medicine study all but abandoned in some laboratories, is poised to enter the clinical mainstream. This article will first present the fundamental principles of fluid balance and the clinical means of volume status assessment. We will then review the indicator dilution technique and how it is used in nuclear medicine blood volume studies. We will present an overview of the new semiautomated blood volume analysis technique, showing how the study is done, how it works, what results are provided, and how those results are interpreted. Finally, we will look at some of the emerging areas in which data from blood volume analysis can improve patient care. The reader will gain an understanding of the principles underlying blood volume assessment, know how current nuclear medicine blood volume analysis studies are performed, and appreciate their potential clinical impact.
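
    At its core, the indicator dilution principle is a single equation: the distribution volume is the injected tracer dose divided by its equilibrium concentration. A minimal sketch with invented numbers:

        def dilution_volume(injected_dose_counts, equilibrium_conc_counts_per_ml):
            """Indicator dilution: V = D / C."""
            return injected_dose_counts / equilibrium_conc_counts_per_ml

        # 2.0e6 counts/min injected; plasma sample reads 650 counts/min per mL.
        print(dilution_volume(2.0e6, 650.0))  # ~3077 mL plasma volume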

  5. Wheeze sound analysis using computer-based techniques: a systematic review.

    PubMed

    Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian

    2017-10-31

    Wheezes are high pitched continuous respiratory acoustic sounds which are produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction and diseases or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques on the SCOPUS, IEEE Xplore, ACM, PubMed and Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that 1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, 2) further research is required to achieve acceptable rates of identification on the degree of airway obstruction with normal breathing, 3) analysis using combinations of features and on subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathology that stem from airway obstruction.

  6. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Appendix D to Part 172—Rail Risk Analysis Factors. This appendix concerns the safety and security risk analyses required by § 172.820. The risk analysis to be performed may be…

  7. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Appendix D to Part 172—Rail Risk Analysis Factors. This appendix concerns the safety and security risk analyses required by § 172.820. The risk analysis to be performed may be…

  8. A Cross-Cultural Examination of Music Instruction Analysis and Evaluation Techniques.

    ERIC Educational Resources Information Center

    Price, Harry E.; Ogawa, Yoko; Arizumi, Koji

    1997-01-01

    Examines whether analysis techniques of student/teacher interactions widely used throughout the United States could be applied to a music instruction setting in Japan by analyzing two videotaped music lessons of different teachers. Finds that teacher A, who used five times more feedback, was rated higher overall by both Japanese and U.S. students.…

  9. A novel CT acquisition and analysis technique for breathing motion modeling

    NASA Astrophysics Data System (ADS)

    Low, Daniel A.; White, Benjamin M.; Lee, Percy P.; Thomas, David H.; Gaudio, Sergio; Jani, Shyam S.; Wu, Xiao; Lamb, James M.

    2013-06-01

    To report on a novel technique for providing artifact-free quantitative four-dimensional computed tomography (4DCT) image datasets for breathing motion modeling. Commercial clinical 4DCT methods have difficulty managing irregular breathing. The resulting images contain motion-induced artifacts that can distort structures and inaccurately characterize breathing motion. We have developed a novel scanning and analysis method for motion-correlated CT that utilizes standard repeated fast helical acquisitions, a simultaneous breathing surrogate measurement, deformable image registration, and a published breathing motion model. The motion model differs from the CT-measured motion by an average of 0.65 mm, indicating the precision of the motion model. The integral of the divergence of one of the motion model parameters is predicted to be a constant 1.11 and is found in this case to be 1.09, indicating the accuracy of the motion model. The proposed technique shows promise for providing motion-artifact free images at user-selected breathing phases, accurate Hounsfield units, and noise characteristics similar to non-4D CT techniques, at a patient dose similar to or less than current 4DCT techniques.

  10. Triangular covariance factorizations for Kalman filtering. Ph.D. Thesis - Calif. Univ.

    NASA Technical Reports Server (NTRS)

    Thornton, C. L.

    1976-01-01

    An improved computational form of the discrete Kalman filter is derived using an upper triangular factorization of the error covariance matrix. The covariance P is factored such that P = UDU^T, where U is unit upper triangular and D is diagonal. Recursions are developed for propagating the U-D covariance factors together with the corresponding state estimate. The resulting algorithm, referred to as the U-D filter, combines the superior numerical precision of square root filtering techniques with an efficiency comparable to that of Kalman's original formula. Moreover, this method is easily implemented and involves no more computer storage than the Kalman algorithm. These characteristics make the U-D method an attractive real-time filtering technique. A new covariance error analysis technique is obtained from an extension of the U-D filter equations. This evaluation method is flexible and efficient and may provide significantly improved numerical results. Cost comparisons show that for a large class of problems the U-D evaluation algorithm is noticeably less expensive than conventional error analysis methods.
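
    A minimal sketch of the U-D factorization itself, the building block of the filter, following the standard backward sweep; this is an illustration, not Thornton's full filter code:

        import numpy as np

        def ud_factorize(P):
            """Factor symmetric positive-definite P as U @ diag(d) @ U.T,
            with U unit upper triangular."""
            P = P.astype(float).copy()
            n = P.shape[0]
            U = np.eye(n)
            d = np.zeros(n)
            for j in range(n - 1, -1, -1):
                d[j] = P[j, j]
                for i in range(j):
                    U[i, j] = P[i, j] / d[j]
                for i in range(j):
                    for k in range(i + 1):
                        P[k, i] -= U[k, j] * d[j] * U[i, j]
            return U, d

        P = np.array([[4.0, 2.0], [2.0, 3.0]])
        U, d = ud_factorize(P)
        assert np.allclose(U @ np.diag(d) @ U.T, P)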

  11. Phasor Analysis of Binary Diffraction Gratings with Different Fill Factors

    ERIC Educational Resources Information Center

    Martinez, Antonio; Sanchez-Lopez, Ma del Mar; Moreno, Ignacio

    2007-01-01

    In this work, we present a simple analysis of binary diffraction gratings with different slit widths relative to the grating period. The analysis is based on a simple phasor technique directly derived from the Huygens principle. By introducing a slit phasor and a grating phasor, the intensity of the diffracted orders and the grating's resolving…
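
    For a binary amplitude grating, the phasor construction reduces to a closed form: the relative intensity of diffraction order m is (f sinc(mf))^2, with f the fill factor and sinc(x) = sin(pi x)/(pi x), which is exactly NumPy's np.sinc. A quick numerical check:

        import numpy as np

        def order_intensity(m, fill):
            """Relative intensity of order m for a binary amplitude grating."""
            return (fill * np.sinc(m * fill)) ** 2

        for f in (0.25, 0.5):
            print(f, [round(order_intensity(m, f), 4) for m in range(4)])
        # At f = 0.5 the even orders vanish, as the phasor sum predicts.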

  12. Examining evolving performance on the Force Concept Inventory using factor analysis

    NASA Astrophysics Data System (ADS)

    Semak, M. R.; Dietz, R. D.; Pearson, R. H.; Willis, C. W.

    2017-06-01

    The application of factor analysis to the Force Concept Inventory (FCI) has proven to be problematic. Some studies have suggested that factor analysis of test results serves as a helpful tool in assessing the recognition of Newtonian concepts by students. Other work has produced at best ambiguous results. For the FCI administered as a pre- and post-test, we see factor analysis as a tool by which the changes in conceptual associations made by our students may be gauged given the evolution of their response patterns. This analysis allows us to identify and track conceptual linkages, affording us insight as to how our students have matured due to instruction. We report on our analysis of 427 pre- and post-tests. The factor models for the pre- and post-tests are explored and compared along with the methodology by which these models were fit to the data. The post-test factor pattern is more aligned with an expert's interpretation of the questions' content, as it allows for a more readily identifiable relationship between factors and physical concepts. We discuss this evolution in the context of approaching the characteristics of an expert with force concepts. Also, we find that certain test items do not significantly contribute to the pre- or post-test factor models and attempt explanations as to why this is so. This may suggest that such questions may not be effective in probing the conceptual understanding of our students.

  13. Data Analysis Techniques for LIGO Detector Characterization

    NASA Astrophysics Data System (ADS)

    Valdes Sanchez, Guillermo A.

    Gravitational-wave astronomy is a branch of astronomy which aims to use gravitational waves to collect observational data about astronomical objects and events such as black holes, neutron stars, supernovae, and processes including those of the early universe shortly after the Big Bang. Einstein first predicted gravitational waves in the early twentieth century, but it was not until September 14, 2015, that the Laser Interferometer Gravitational-Wave Observatory (LIGO) directly observed the first gravitational waves in history. LIGO consists of two twin detectors, one in Livingston, Louisiana and another in Hanford, Washington. Instrumental and sporadic noises limit the sensitivity of the detectors. Scientists conduct data quality studies to distinguish a gravitational-wave signal from the noise, and new techniques are continuously developed to identify, mitigate, and veto unwanted noise. This work presents the application of data analysis techniques, such as the Hilbert-Huang transform (HHT) and Kalman filtering (KF), in LIGO detector characterization. We investigated the application of the HHT to characterize the gravitational-wave signal of the first detection, demonstrated the use of the HHT to identify noise originating from light scattered by perturbed surfaces, and estimated thermo-optical aberration using a KF. We paid particular attention to the scattering application, for which a tool was developed to identify the disturbed surfaces originating scattering noise. The results considerably reduced the time needed to search for the scattering surface and helped LIGO commissioners mitigate the noise.

  14. Analysis of biochemical phase shift oscillators by a harmonic balancing technique.

    PubMed

    Rapp, P

    1976-11-25

    The use of harmonic balancing techniques for theoretically investigating a large class of biochemical phase shift oscillators is outlined, and the accuracy of this approximate technique for high-dimensional nonlinear chemical systems is considered. It is concluded that for the equations under study these techniques can be successfully employed both to find periodic solutions and to indicate those cases which cannot oscillate. The technique is a general one, and it is possible to state a step-by-step procedure for its application. It has a substantial advantage in producing results which are immediately valid for arbitrary dimension. As the accuracy of the method increases with dimension, it complements classical small-dimension methods. The results obtained by harmonic balancing analysis are compared with those obtained by studying the local stability properties of the singular points of the differential equation. A general theorem is derived which identifies those special cases where the results of first-order harmonic balancing are identical to those of local stability analysis, and a necessary condition for this equivalence is derived. As a concrete example, the n-dimensional Goodwin oscillator is considered, where p, the Hill coefficient of the feedback metabolite, is equal to three and four. It is shown that for p = 3 or 4 and n less than or equal to 4 the approximation indicates that it is impossible to construct a set of physically permissible reaction constants such that the system possesses a periodic solution. However, for n greater than or equal to 5 it is always possible to find a large domain in the reaction constant space giving stable oscillations. A means of constructing such a parameter set is given. The results obtained here are compared with previously derived results for p = 1 and p = 2.

  15. Search for a supersymmetric partner to the top quark using a multivariate analysis technique

    NASA Astrophysics Data System (ADS)

    Darmora, Smita

    Supersymmetry (SUSY) is an extension to the Standard Model (SM) which introduces supersymmetric partners of the known fermions and bosons. Top squark (stop) searches are a natural extension of inclusive SUSY searches at the LHC. If SUSY solves the naturalness problem, the stop should be light enough to cancel the top loop contribution to the Higgs mass parameter. The 3rd generation squarks may be the first SUSY particles to be discovered at the LHC. The stop can decay into a variety of final states, depending, amongst other factors, on the hierarchy of the mass eigenstates formed from the linear superposition of the SUSY partners of the Higgs boson and electroweak gauge bosons. In this study the relevant mass eigenstates are the lightest chargino (chi_1^+/-) and the lightest neutralino (chi_1^0). A search is presented for a heavy SUSY top partner decaying to a lepton, neutrino, and the lightest supersymmetric particle (chi_1^0), via a b-quark and a chargino (chi_1^+/-), in events with two leptons in the final state. The analysis targets searches for a SUSY top partner by means of a multivariate analysis technique, used to discriminate between the stop signal and the background with a learning algorithm based on Monte Carlo generated signal and background samples. The analysis uses data corresponding to 20.3 fb^-1 of integrated luminosity at √s = 8 TeV, collected by the ATLAS experiment at the Large Hadron Collider in 2012.

  16. Job Satisfaction: Factor Analysis of Greek Primary School Principals' Perceptions

    ERIC Educational Resources Information Center

    Saiti, Anna; Fassoulis, Konstantinos

    2012-01-01

    Purpose: The purpose of this paper is to investigate the factors that affect the level of job satisfaction that school principals experience and, based on the findings, to suggest policies or techniques for improving it. Design/methodology/approach: Questionnaires were administered to 180 primary school heads in 13 prefectures--one from each of…

  17. An analysis of thermal response factors and how to reduce their computational time requirement

    NASA Technical Reports Server (NTRS)

    Wiese, M. R.

    1982-01-01

    The RESFAC2 version of the Thermal Response Factor Program (RESFAC) is the result of numerous modifications and additions to the original RESFAC. These modifications and additions have significantly reduced the program's computational time requirement. As a result of this work, the program is more efficient and its code is both readable and understandable. This report describes what a thermal response factor is; analyzes the original matrix algebra calculations and root-finding techniques; presents a new root-finding technique and streamlined matrix algebra; and supplies ten validation cases and their results.

  18. The Immersive Virtual Reality Experience: A Typology of Users Revealed Through Multiple Correspondence Analysis Combined with Cluster Analysis Technique.

    PubMed

    Rosa, Pedro J; Morais, Diogo; Gamito, Pedro; Oliveira, Jorge; Saraiva, Tomaz

    2016-03-01

    Immersive virtual reality is thought to be advantageous by leading to higher levels of presence. However, and despite users getting actively involved in immersive three-dimensional virtual environments that incorporate sound and motion, there are individual factors, such as age, video game knowledge, and the predisposition to immersion, that may be associated with the quality of the virtual reality experience. Moreover, one particular concern for users engaged in immersive virtual reality environments (VREs) is the possibility of side effects, such as cybersickness. The literature suggests that at least 60% of virtual reality users report having felt symptoms of cybersickness, which reduces the quality of the virtual reality experience. The aim of this study was thus to profile the right user to be involved in a VRE through a head-mounted display. To examine which user characteristics are associated with the most effective virtual reality experience (lower cybersickness), a multiple correspondence analysis combined with a cluster analysis technique was performed. Results revealed three distinct profiles, showing that the PC gamer profile is associated with higher levels of virtual reality effectiveness, that is, a higher predisposition to immersion and reduced cybersickness symptoms in the VRE, than the console gamer and nongamer profiles. These findings can usefully orient clinical practice and future research, as they help identify which users are more predisposed to benefit from immersive VREs.

  19. Application of Hyphenated Techniques in Speciation Analysis of Arsenic, Antimony, and Thallium

    PubMed Central

    Michalski, Rajmund; Szopa, Sebastian; Jabłońska, Magdalena; Łyko, Aleksandra

    2012-01-01

    Due to the fact that metals and metalloids have a strong impact on the environment, the methods of their determination and speciation have received special attention in recent years. Arsenic, antimony, and thallium are important examples of such toxic elements. Their speciation is especially important in the environmental and biomedical fields because of their toxicity, bioavailability, and reactivity. Recently, speciation analytics has been playing a unique role in the studies of biogeochemical cycles of chemical compounds, determination of toxicity and ecotoxicity of selected elements, quality control of food products, control of medicines and pharmaceutical products, technological process control, research on the impact of technological installation on the environment, examination of occupational exposure, and clinical analysis. Conventional methods are usually labor intensive, time consuming, and susceptible to interferences. The hyphenated techniques, in which separation method is coupled with multidimensional detectors, have become useful alternatives. The main advantages of those techniques consist in extremely low detection and quantification limits, insignificant interference, influence as well as high precision and repeatability of the determinations. In view of their importance, the present work overviews and discusses different hyphenated techniques used for arsenic, antimony, and thallium species analysis, in different clinical, environmental and food matrices. PMID:22654649

  20. An overview of data acquisition, signal coding and data analysis techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1986-01-01

    An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, which are characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations. Finally, several aspects of statistical data processing, such as signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques, are discussed.
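
    As a concrete instance of the autocovariance analysis mentioned above, the standard pulse-pair estimator recovers the mean Doppler velocity from the phase of the lag-one autocovariance of the complex voltage series. The sketch below uses invented radar constants (roughly a 46.5 MHz MST radar with a 1 ms inter-pulse period), and the sign convention depends on the radar's definition of positive velocity:

        import numpy as np

        def pulse_pair_velocity(v, prt, wavelength):
            """Mean radial velocity from lag-1 autocovariance of complex samples."""
            r1 = np.mean(v[1:] * np.conj(v[:-1]))
            return wavelength * np.angle(r1) / (4 * np.pi * prt)

        # Synthetic test: a 15 Hz Doppler tone plus complex noise.
        rng = np.random.default_rng(1)
        t = np.arange(512) * 1e-3
        v = np.exp(2j * np.pi * 15.0 * t) + 0.1 * (rng.standard_normal(512)
                                                   + 1j * rng.standard_normal(512))
        print(pulse_pair_velocity(v, prt=1e-3, wavelength=6.45))  # ~48 m/s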

  1. Dimensions of Early Speech Sound Disorders: A Factor Analytic Study

    ERIC Educational Resources Information Center

    Lewis, Barbara A.; Freebairn, Lisa A.; Hansen, Amy J.; Stein, Catherine M.; Shriberg, Lawrence D.; Iyengar, Sudha K.; Taylor, H. Gerry

    2006-01-01

    The goal of this study was to classify children with speech sound disorders (SSD) empirically, using factor analytic techniques. Participants were 3-7-year olds enrolled in speech/language therapy (N=185). Factor analysis of an extensive battery of speech and language measures provided support for two distinct factors, representing the skill…

  2. Factors Affecting the Adoption of R&D Project Selection Techniques at the Air Force Wright Aeronautical Laboratories

    DTIC Science & Technology

    1988-09-01

    To measure the adequacy of the sample, the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy was used. Given the relatively large number of variables, there was concern about the adequacy of the sample size.

  3. The Coplane Analysis Technique for Three-Dimensional Wind Retrieval Using the HIWRAP Airborne Doppler Radar

    NASA Technical Reports Server (NTRS)

    Didlake, Anthony C., Jr.; Heymsfield, Gerald M.; Tian, Lin; Guimond, Stephen R.

    2015-01-01

    The coplane analysis technique for mapping the three-dimensional wind field of precipitating systems is applied to the NASA High Altitude Wind and Rain Airborne Profiler (HIWRAP). HIWRAP is a dual-frequency Doppler radar system with two downward pointing and conically scanning beams. The coplane technique interpolates radar measurements to a natural coordinate frame, directly solves for two wind components, and integrates the mass continuity equation to retrieve the unobserved third wind component. This technique is tested using a model simulation of a hurricane and compared to a global optimization retrieval. The coplane method produced lower errors for the cross-track and vertical wind components, while the global optimization method produced lower errors for the along-track wind component. Cross-track and vertical wind errors were dependent upon the accuracy of the estimated boundary condition winds near the surface and at nadir, which were derived by making certain assumptions about the vertical velocity field. The coplane technique was then applied successfully to HIWRAP observations of Hurricane Ingrid (2013). Unlike the global optimization method, the coplane analysis allows for a transparent connection between the radar observations and specific analysis results. With this ability, small-scale features can be analyzed more adequately and erroneous radar measurements can be identified more easily.

  4. Exploratory factor analysis of borderline personality disorder criteria in hospitalized adolescents.

    PubMed

    Becker, Daniel F; McGlashan, Thomas H; Grilo, Carlos M

    2006-01-01

    The authors examined the factor structure of borderline personality disorder (BPD) in hospitalized adolescents and also sought to add to the theoretical and clinical understanding of any homogeneous components by determining whether they may be related to specific forms of Axis I pathology. Subjects were 123 adolescent inpatients, who were reliably assessed with structured diagnostic interviews for Diagnostic and Statistical Manual of Mental Disorders, Revised Third Edition Axes I and II disorders. Exploratory factor analysis identified BPD components, and logistic regression analyses tested whether these components were predictive of specific Axis I disorders. Factor analysis revealed a 4-factor solution that accounted for 67.0% of the variance. Factor 1 ("suicidal threats or gestures" and "emptiness or boredom") predicted depressive disorders and alcohol use disorders. Factor 2 ("affective instability," "uncontrolled anger," and "identity disturbance") predicted anxiety disorders and oppositional defiant disorder. Factor 3 ("unstable relationships" and "abandonment fears") predicted only anxiety disorders. Factor 4 ("impulsiveness" and "identity disturbance") predicted conduct disorder and substance use disorders. Exploratory factor analysis of BPD criteria in adolescent inpatients revealed 4 BPD factors that appear to differ from those reported for similar studies of adults. The factors represent components of self-negation, irritability, poorly modulated relationships, and impulsivity--each of which is associated with characteristic Axis I pathology. These findings shed light on the nature of BPD in adolescents and may also have implications for treatment.

  5. Exploratory factor analysis of the Oral Health Impact Profile.

    PubMed

    John, M T; Reissmann, D R; Feuerstahler, L; Waller, N; Baba, K; Larsson, P; Celebić, A; Szabo, G; Rener-Sitar, K

    2014-09-01

    Although oral health-related quality of life (OHRQoL) as measured by the Oral Health Impact Profile (OHIP) is thought to be multidimensional, the nature of these dimensions is not known. The aim of this report was to explore the dimensionality of the OHIP using the Dimensions of OHRQoL (DOQ) Project, an international study of general population subjects and prosthodontic patients. Using the project's Learning Sample (n = 5173), we conducted an exploratory factor analysis on the 46 OHIP items not specifically referring to dentures for 5146 subjects with sufficiently complete data. The first eigenvalue (27·0) of the polychoric correlation matrix was more than ten times larger than the second eigenvalue (2·6), suggesting the presence of a dominant, higher-order general factor. Follow-up analyses with Horn's parallel analysis revealed a viable second-order, four-factor solution. An oblique rotation of this solution revealed four highly correlated factors that we named Oral Function, Oro-facial Pain, Oro-facial Appearance and Psychosocial Impact. These four dimensions and the strong general factor are two viable hypotheses for the factor structure of the OHIP. © 2014 John Wiley & Sons Ltd.

  6. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    Automated validation of flight-critical embedded systems is being done at ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques. The paper also discusses flight control system features which foster the use of automated testing techniques.

  7. Factor analysis of an instrument to measure the impact of disease on daily life.

    PubMed

    Pedrosa, Rafaela Batista Dos Santos; Rodrigues, Roberta Cunha Matheus; Padilha, Kátia Melissa; Gallani, Maria Cecília Bueno Jayme; Alexandre, Neusa Maria Costa

    2016-01-01

    To verify the structure of factors of an instrument to measure the Heart Valve Disease Impact on Daily Life (IDCV) when applied to coronary artery disease patients. The study included 153 coronary artery disease patients undergoing outpatient follow-up care. The IDCV structure of factors was initially assessed by means of confirmatory factor analysis and, subsequently, by exploratory factor analysis. The Varimax rotation method was used to estimate the main components of analysis, eigenvalues greater than one for extraction of factors, and factor loadings greater than 0.40 for selection of items. Internal consistency was estimated using Cronbach's alpha coefficient. Confirmatory factor analysis did not confirm the original structure of factors of the IDCV. Exploratory factor analysis showed three dimensions, which together explained 78% of the measurement variance. Future studies with expansion of case selection are necessary to confirm the IDCV's new structure of factors.

  8. Analysis of the correlative factors for velopharyngeal closure of patients with cleft palate after primary repair.

    PubMed

    Chen, Qi; Li, Yang; Shi, Bing; Yin, Heng; Zheng, Guang-Ning; Zheng, Qian

    2013-12-01

    The objective of this study was to analyze the correlative factors for velopharyngeal closure in patients with cleft palate after primary repair. Ninety-five nonsyndromic patients with cleft palate were enrolled. Two surgical techniques were applied: simple palatoplasty and palatoplasty combined with pharyngoplasty. All patients were assessed 6 months after the operation. The postoperative velopharyngeal closure (VPC) rates were compared by the χ² test, and the correlative factors were analyzed with a logistic regression model. The postoperative VPC rate of young patients was higher than that of older patients, that of the incomplete cleft palate group was higher than that of the complete cleft palate group, and that of combined palatoplasty with pharyngoplasty was higher than that of simple palatoplasty. Operative age, cleft type, and surgical technique were significant factors influencing the postoperative VPC rate of patients with cleft palate. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. A Novel Technique to Detect Code for SAC-OCDMA System

    NASA Astrophysics Data System (ADS)

    Bharti, Manisha; Kumar, Manoj; Sharma, Ajay K.

    2018-04-01

    The main task of an optical code division multiple access (OCDMA) system is the detection of the code used by a user in the presence of multiple access interference (MAI). In this paper, a new method of detection, known as XOR subtraction detection, for spectral amplitude coding OCDMA (SAC-OCDMA) based on double weight codes is proposed and presented. As MAI is the main source of performance deterioration in OCDMA systems, the SAC technique is used here to eliminate the effect of MAI to a large extent. A comparative analysis is then made between the proposed scheme and other conventional detection schemes, such as complementary subtraction detection, AND subtraction detection, and NAND subtraction detection. The system performance is characterized by Q-factor, BER, and received optical power (ROP) with respect to input laser power and fiber length. The theoretical and simulation investigations reveal that the proposed detection technique provides a better quality factor, security, and received power in comparison to the other conventional techniques. The wide opening of the eye diagram in the case of the proposed technique also demonstrates its robustness.

  10. Machine learning techniques applied to the determination of road suitability for the transportation of dangerous substances.

    PubMed

    Matías, J M; Taboada, J; Ordóñez, C; Nieto, P G

    2007-08-17

    This article describes a methodology to model the degree of remedial action required to make short stretches of a roadway suitable for dangerous goods transport (DGT), particularly pollutant substances, using different variables associated with the characteristics of each segment. Thirty-one factors determining the impact of an accident on a particular stretch of road were identified and subdivided into two major groups: accident probability factors and accident severity factors. Given the number of factors determining the state of a particular road segment, the only viable statistical methods for implementing the model were machine learning techniques, such as multilayer perceptron networks (MLPs), classification trees (CARTs) and support vector machines (SVMs). The results produced by these techniques on a test sample were more favourable than those produced by traditional discriminant analysis, irrespective of whether dimensionality reduction techniques were applied. The best results were obtained using SVMs specifically adapted to ordinal data. This technique takes advantage of the ordinal information contained in the data without penalising the computational load. Furthermore, the technique permits the estimation of the utility function that is latent in expert knowledge.

  11. Tensor-Dictionary Learning with Deep Kruskal-Factor Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Andrew J.; Pu, Yunchen; Sun, Yannan

    We introduce new dictionary learning methods for tensor-variate data of any order. We represent each data item as a sum of Kruskal decomposed dictionary atoms within the framework of beta-process factor analysis (BPFA). Our model is nonparametric and can infer the tensor-rank of each dictionary atom. This Kruskal-Factor Analysis (KFA) is a natural generalization of BPFA. We also extend KFA to a deep convolutional setting and develop online learning methods. We test our approach on image processing and classification tasks, achieving state of the art results for 2D & 3D inpainting and Caltech 101. The experiments also show that atom-rank impacts both overcompleteness and sparsity.

  12. Analysis of soil samples from Gebeng area using NAA technique

    NASA Astrophysics Data System (ADS)

    Elias, Md Suhaimi; Wo, Yii Mei; Hamzah, Mohd Suhaimi; Shukor, Shakirah Abd; Rahman, Shamsiah Ab; Salim, Nazaratul Ashifa Abdullah; Azman, Muhamad Azfar; Hashim, Azian

    2017-01-01

    Rapid development and urbanization will increase the number of residential and industrial areas. Without proper management and pollution control, these will have adverse effects on the environment and human life. The objective of this study was to identify and quantify key contaminants in the environment of the Gebeng area resulting from industrial and human activities. Gebeng was gazetted as one of the industrial estates in Pahang state. Elemental pollution in the soil of the Gebeng area was assessed on the basis of concentration levels, enrichment factors, and the geo-accumulation index. The enrichment factors (EFs) were determined by the element-ratioing method, whilst the geo-accumulation index (Igeo) was obtained by comparing current element concentrations to continental crustal averages. Twenty-seven soil samples were collected from the Gebeng area and analysed using the neutron activation analysis (NAA) technique. The obtained data showed a higher concentration of iron (Fe), due to its abundance in soil, compared to other elements. The enrichment factor results showed that the Gebeng area is enriched in As, Br, Hf, Sb, Th and U. Based on the geo-accumulation index (Igeo) classification, the soil quality of the Gebeng area can be classified as class 0 (uncontaminated) to class 3 (moderately to heavily contaminated).
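
    The two indices used above have simple closed forms; the sketch below assumes Fe as the normalizing element for the EF (the study's reference element may differ), and the crustal values are invented for illustration:

        import numpy as np

        def enrichment_factor(c_el, c_ref, crust_el, crust_ref):
            """EF = (C_el / C_ref)_sample / (C_el / C_ref)_crust."""
            return (c_el / c_ref) / (crust_el / crust_ref)

        def geoaccumulation_index(c_el, background_el):
            """Igeo = log2(C_n / (1.5 * B_n)); 1.5 buffers natural variability."""
            return np.log2(c_el / (1.5 * background_el))

        # Hypothetical: As at 12 mg/kg in soil vs. a 1.8 mg/kg crustal average.
        print(geoaccumulation_index(12.0, 1.8))  # ~2.15 -> class 3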

  13. Fault detection in digital and analog circuits using an i(DD) temporal analysis technique

    NASA Technical Reports Server (NTRS)

    Beasley, J.; Magallanes, D.; Vridhagiri, A.; Ramamurthy, Hema; Deyong, Mark

    1993-01-01

    An i(DD) temporal analysis technique is presented that detects defects (faults) and fabrication variations in both digital and analog ICs by pulsing the power supply rails and analyzing the temporal data obtained from the resulting transient rail currents. A simple bias voltage is required at all inputs to excite the defects. Data from hardware tests supporting this technique are presented.

  14. Factor Analysis of Drawings: Application to College Student Models of the Greenhouse Effect

    ERIC Educational Resources Information Center

    Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel

    2015-01-01

    Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance,…

  15. Analysis techniques for multivariate root loci. [a tool in linear control systems

    NASA Technical Reports Server (NTRS)

    Thompson, P. M.; Stein, G.; Laub, A. J.

    1980-01-01

    Analysis techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.

  16. Confirmatory factor analysis applied to the Force Concept Inventory

    NASA Astrophysics Data System (ADS)

    Eaton, Philip; Willoughby, Shannon D.

    2018-06-01

    In 1995, Huffman and Heller used exploratory factor analysis to call into question the factors of the Force Concept Inventory (FCI). Since then, several papers have been published examining the factors of the FCI on larger sets of student responses, and understandable factors were extracted as a result. However, none of these proposed factor models have been verified not to be unique to their original sample through the use of independent sets of data. This paper seeks to confirm the factor models proposed by Scott et al. in 2012, and Hestenes et al. in 1992, as well as another expert model proposed within this study, through the use of confirmatory factor analysis (CFA) and a sample of 20 822 postinstruction student responses to the FCI. Upon application of CFA using the full sample, all three models were found to fit the data with acceptable global fit statistics. However, when CFA was performed using these models on smaller sample sizes, the models proposed by Scott et al. and Eaton and Willoughby were found to be far more stable than the model proposed by Hestenes et al. The goodness of fit of these models to the data suggests that the FCI can be scored on factors that are not unique to a single class. These scores could then be used to comment on how instruction methods affect the performance of students along a single factor, and more in-depth analyses of curriculum changes may be possible as a result.

  17. Using the technique of computed tomography for nondestructive analysis of pharmaceutical dosage forms

    NASA Astrophysics Data System (ADS)

    de Oliveira, José Martins, Jr.; Mangini, F. Salvador; Carvalho Vila, Marta Maria Duarte; Vinícius Chaud, Marco

    2013-05-01

    This work presents an alternative, non-conventional technique for evaluating the physico-chemical properties of pharmaceutical dosage forms: computed tomography (CT), used as a nondestructive technique to visualize the internal structures of pharmaceutical dosage forms and to conduct static and dynamic studies. The studies covered both static and dynamic situations through the use of tomographic images generated by the scanner at the University of Sorocaba (Uniso). We have shown that tomographic images make it possible to study porosity and density, to analyse morphological parameters and to perform dissolution studies. Our results are in agreement with the literature, showing that CT is a powerful tool for the pharmaceutical sciences.

  18. Temperature analysis of laser ignited metalized material using spectroscopic technique

    NASA Astrophysics Data System (ADS)

    Bassi, Ishaan; Sharma, Pallavi; Daipuriya, Ritu; Singh, Manpreet

    2018-05-01

    The temperature measurement of a laser-ignited aluminized nanoenergetic mixture using spectroscopy has great scope in analysing material characteristics and combustion behaviour. Spectroscopic analysis permits an in-depth study of the combustion of materials that is difficult to achieve with standard pyrometric methods. Laser ignition was used because it consumes less energy than electric ignition, while the ignited material dissipates the same energy, with the same impact. The research presented here is primarily focused on the temperature analysis of an energetic material comprising explosive material mixed with nanomaterial and ignited by laser. A spectroscopic technique is used to estimate the temperature during the ignition process. The nanoenergetic mixture used in this research does not contain any material that is sensitive to high impact.

  19. Toward Reflective Judgment in Exploratory Factor Analysis Decisions: Determining the Extraction Method and Number of Factors To Retain.

    ERIC Educational Resources Information Center

    Knight, Jennifer L.

    This paper considers some decisions that must be made by the researcher conducting an exploratory factor analysis. The primary purpose is to aid the researcher in making informed decisions during the factor analysis instead of relying on defaults in statistical programs or traditions of previous researchers. Three decision areas are addressed.…

  20. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis.

    PubMed

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-07-23

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amount required and high-throughput performance) make this tool well suited to answering and solving biological questions of interest about a single cell. This review introduces microfluidics-related techniques for the isolation, trapping and manipulation of a single cell. The major approaches to detection in single-cell analysis are introduced, and the applications of single-cell analysis are then summarized. The review concludes with a discussion of the future directions and opportunities of microfluidic systems applied to the analysis of a single cell.

  1. Characterization of rock populations on planetary surfaces - Techniques and a preliminary analysis of Mars and Venus

    NASA Technical Reports Server (NTRS)

    Garvin, J. B.; Mouginis-Mark, P. J.; Head, J. W.

    1981-01-01

    A data collection and analysis scheme developed for the interpretation of rock morphology from lander images is reviewed with emphasis on rock population characterization techniques. Data analysis techniques are also discussed in the context of identifying key characteristics of a rock that place it in a single category with similar rocks. Actual rock characteristics observed from Viking and Venera lander imagery are summarized. Finally, some speculations regarding the block fields on Mars and Venus are presented.

  2. Enrichment and separation techniques for large-scale proteomics analysis of the protein post-translational modifications.

    PubMed

    Huang, Junfeng; Wang, Fangjun; Ye, Mingliang; Zou, Hanfa

    2014-11-06

    Comprehensive analysis of the post-translational modifications (PTMs) on proteins at the proteome level is crucial to elucidate the regulatory mechanisms of various biological processes. In the past decades, thanks to the development of specific PTM enrichment techniques and efficient multidimensional liquid chromatography (LC) separation strategies, the identification of protein PTMs has made tremendous progress. A huge number of modification sites for some major protein PTMs have been identified by proteomics analysis. In this review, we first introduce the recent progress in PTM enrichment methods for the analysis of several major PTMs, including phosphorylation, glycosylation, ubiquitination, acetylation, methylation, and oxidation/reduction status. We then briefly summarize the challenges of PTM enrichment. Finally, we introduce the fractionation and separation techniques for efficient separation of PTM peptides in large-scale PTM analysis. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. SEPARABLE FACTOR ANALYSIS WITH APPLICATIONS TO MORTALITY DATA

    PubMed Central

    Fosdick, Bailey K.; Hoff, Peter D.

    2014-01-01

    Human mortality data sets can be expressed as multiway data arrays, the dimensions of which correspond to categories by which mortality rates are reported, such as age, sex, country and year. Regression models for such data typically assume an independent error distribution or an error model that allows for dependence along at most one or two dimensions of the data array. However, failing to account for other dependencies can lead to inefficient estimates of regression parameters, inaccurate standard errors and poor predictions. An alternative to assuming independent errors is to allow for dependence along each dimension of the array using a separable covariance model. However, the number of parameters in this model increases rapidly with the dimensions of the array and, for many arrays, maximum likelihood estimates of the covariance parameters do not exist. In this paper, we propose a submodel of the separable covariance model that estimates the covariance matrix for each dimension as having factor analytic structure. This model can be viewed as an extension of factor analysis to array-valued data, as it uses a factor model to estimate the covariance along each dimension of the array. We discuss properties of this model as they relate to ordinary factor analysis, describe maximum likelihood and Bayesian estimation methods, and provide a likelihood ratio testing procedure for selecting the factor model ranks. We apply this methodology to the analysis of data from the Human Mortality Database, and show in a cross-validation experiment how it outperforms simpler methods. Additionally, we use this model to impute mortality rates for countries that have no mortality data for several years. Unlike other approaches, our methodology is able to estimate similarities between the mortality rates of countries, time periods and sexes, and use this information to assist with the imputations. PMID:25489353
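
    As a rough illustration of the model's structure (not the authors' estimation code), one can give each array dimension a factor-analytic covariance and combine them with Kronecker products; the dimension sizes and ranks below are made up:

        import numpy as np

        rng = np.random.default_rng(0)

        def fa_cov(dim, rank):
            """Factor-analytic covariance: Lambda Lambda' + diagonal uniquenesses."""
            lam = rng.normal(size=(dim, rank))
            return lam @ lam.T + np.diag(rng.uniform(0.5, 1.0, size=dim))

        # e.g., hypothetical dimensions: 10 age groups x 4 countries x 2 sexes
        covs = [fa_cov(10, 2), fa_cov(4, 1), fa_cov(2, 1)]
        full_cov = covs[0]
        for c in covs[1:]:
            full_cov = np.kron(full_cov, c)   # separable model: Sigma = S1 (x) S2 (x) S3
        print(full_cov.shape)                  # (80, 80)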

  4. Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling

    NASA Astrophysics Data System (ADS)

    Wada, Yoshihisa; Tsuji, Hiroshi

    In order to analyze the success/failure factors in offshore software development services by structural equation modeling, this paper proposes to follow two approaches together: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former serves to generate and verify hypotheses about factors and causalities. The latter serves to verify factors introduced by theory, in order to build the model without heuristics. Applying the proposed combined approach to questionnaire responses from skilled project managers, this paper found that vendor properties have a higher causal influence on success than software and project properties.

  5. Model reduction of the numerical analysis of Low Impact Developments techniques

    NASA Astrophysics Data System (ADS)

    Brunetti, Giuseppe; Šimůnek, Jirka; Wöhling, Thomas; Piro, Patrizia

    2017-04-01

    Mechanistic models have proven to be accurate and reliable tools for the numerical analysis of the hydrological behavior of Low Impact Development (LID) techniques. However, their widespread adoption is limited by their complexity and computational cost. Recent studies have tried to address this issue by investigating the application of new techniques, such as surrogate-based modeling. However, current results are still limited and fragmented. One such approach, the Model Order Reduction (MOR) technique, can be a valuable tool for reducing the computational complexity of numerical problems by computing an approximation of the original model. While this technique has been extensively used in water-related problems, no studies have evaluated its use in LID modeling. Thus, the main aim of this study is to apply the MOR technique to develop a reduced order model (ROM) for the numerical analysis of the hydrologic behavior of LIDs, in particular green roofs. The model should correctly reproduce all the hydrological processes of a green roof while reducing the computational cost. The proposed model decouples the subsurface water dynamics of a green roof into (a) one-dimensional (1D) vertical flow through the green roof itself and (b) 1D saturated lateral flow along the impervious rooftop. The green roof is horizontally discretized into N elements. Each element represents a vertical domain, which can have different properties or boundary conditions. The 1D Richards equation is used to simulate flow in the substrate and drainage layers. Simulated outflow from the vertical domain is used as a recharge term for the saturated lateral flow, which is described using the kinematic wave approximation of the Boussinesq equation. The proposed model has been compared with the mechanistic model HYDRUS-2D, which numerically solves the Richards equation for the whole domain. The HYDRUS-1D code has been used for the description of vertical flow

  6. Using data mining techniques to predict the severity of bicycle crashes.

    PubMed

    Prati, Gabriele; Pietrantoni, Luca; Fraboni, Federico

    2017-04-01

    To investigate the factors predicting the severity of bicycle crashes in Italy, we conducted an observational study of official statistics, applying two of the most widely used data mining techniques: the CHAID decision tree technique and Bayesian network analysis. The dataset, provided by the Italian National Institute of Statistics, contains information about road crashes that occurred on the Italian road network from 2011 to 2013. We extracted 49,621 road accidents in which at least one cyclist was injured or killed from the original database of 575,093 road accidents. The CHAID decision tree technique was employed to establish the relationship between the severity of bicycle crashes and factors related to crash characteristics (type of collision and opponent vehicle), infrastructure characteristics (type of carriageway, road type, road signage, pavement type, and type of road segment), cyclists (gender and age), and environmental factors (time of the day, day of the week, month, pavement condition, and weather). The CHAID analysis revealed that the most important predictors were, in decreasing order of importance, road type (0.30), crash type (0.24), age of cyclist (0.19), road signage (0.08), gender of cyclist (0.07), type of opponent vehicle (0.05), month (0.04), and type of road segment (0.02). These eight most important predictors of the severity of bicycle crashes were then included as predictors of the target (i.e., severity of bicycle crashes) in a Bayesian network analysis, which identified crash type (0.31), road type (0.19), and type of opponent vehicle (0.18) as the most important predictors of the severity of bicycle crashes. Copyright © 2017 Elsevier Ltd. All rights reserved.
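
    scikit-learn does not implement CHAID, but a CART-style tree gives the flavour of ranking severity predictors by importance; the column names and codes below are hypothetical stand-ins for the ISTAT variables, not the study's data:

        import pandas as pd
        from sklearn.tree import DecisionTreeClassifier

        df = pd.DataFrame({                      # toy stand-in for the crash records
            "road_type":   [0, 1, 2, 1, 0, 2, 1, 0],
            "crash_type":  [1, 0, 1, 1, 0, 0, 1, 0],
            "cyclist_age": [34, 61, 45, 23, 70, 55, 19, 40],
            "severity":    [0, 1, 0, 0, 1, 1, 0, 0],   # 0 = injury, 1 = fatal
        })
        X, y = df.drop(columns="severity"), df["severity"]
        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        print(dict(zip(X.columns, tree.feature_importances_.round(2))))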

  7. Methodological factors affecting gas and methane production during in vitro rumen fermentation evaluated by meta-analysis approach.

    PubMed

    Maccarana, Laura; Cattani, Mirko; Tagliapietra, Franco; Schiavon, Stefano; Bailoni, Lucia; Mantovani, Roberto

    2016-01-01

    Effects of some methodological factors on in vitro measures of gas production (GP, mL/g DM), CH4 production (mL/g DM) and proportion (% CH4 on total GP) were investigated by meta-analysis. These factors were considered: pressure in the GP equipment (0 = constant; 1 = increasing), incubation time (0 = 24; 1 = ≥ 48 h), time of rumen fluid collection (0 = before feeding; 1 = after feeding of donor animals), donor species of rumen fluid (0 = sheep; 1 = bovine), presence of N in the buffer solution (0 = presence; 1 = absence), and ratio between amount of buffered rumen fluid and feed sample (BRF/FS; 0 = ≤ 130 mL/g DM; 1 = 130-140 mL/g DM; 2 = ≥ 140 mL/g DM). The NDF content of feed sample incubated (NDF) was considered as a continuous variable. From an initial database of 105 papers, 58 were discarded because one of the above-mentioned factors was not stated. After discarding 17 papers, the final dataset comprised 30 papers (339 observations). A preliminary mixed model analysis was carried out on experimental data considering the study as random factor. Variables adjusted for study effect were analyzed using a backward stepwise analysis including the above-mentioned variables. The analysis showed that the extension of incubation time and reduction of NDF increased GP and CH4 values. Values of GP and CH4 also increased when rumen fluid was collected after feeding compared to before feeding (+26.4 and +9.0 mL/g DM, for GP and CH4), from bovine compared to sheep (+32.8 and +5.2 mL/g DM, for GP and CH4), and when the buffer solution did not contain N (+24.7 and +6.7 mL/g DM for GP and CH4). The increase of BRF/FS ratio enhanced GP and CH4 production (+7.7 and +3.3 mL/g DM per each class of increase, respectively). In vitro techniques for measuring GP and CH4 production are mostly used as screening methods, thus a full standardization of such techniques is not feasible. However, a greater harmonization

  8. Physics Metacognition Inventory Part Ii: Confirmatory Factor Analysis and Rasch Analysis

    ERIC Educational Resources Information Center

    Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John

    2015-01-01

    The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition,…

  9. Synergistic effects of Mo and F doping on the quality factor of ZnO thin films prepared by a fully automated home-made nebulizer spray technique

    NASA Astrophysics Data System (ADS)

    Ravichandran, K.; Dineshbabu, N.; Arun, T.; Manivasaham, A.; Sindhuja, E.

    2017-01-01

    Transparent conducting oxide films of undoped, Mo-doped, and Mo + F co-doped ZnO were deposited using a facile homemade nebulizer spray pyrolysis technique. The effects of Mo and F doping on the structural, optical, electrical and surface morphological properties were investigated using XRD, UV-vis-NIR spectroscopy, I-V and Hall probe measurements, FESEM, AFM and XPS. The XRD analysis confirms that all the films are well crystallized with the hexagonal wurtzite structure. All the synthesized samples exhibit high transmittance (above 85%) in the visible region. The current-voltage (I-V) characteristics show the ohmic conduction nature of the films. The Hall probe measurements show that the synergistic effects of Mo and F doping cause desirable improvements in the quality factor of the ZnO films. A minimum resistivity of 5.12 × 10⁻³ Ω cm with remarkably higher values of mobility and carrier concentration is achieved for Mo (2 at.%) + F (15 at.%) co-doped ZnO films. A considerable variation in the intensity of the deep-level emission caused by Mo and F doping is observed in the photoluminescence (PL) studies. The presence of the constituent elements in the samples is confirmed by XPS analysis.

  10. Estimating the settling velocity of bioclastic sediment using common grain-size analysis techniques

    USGS Publications Warehouse

    Cuttler, Michael V. W.; Lowe, Ryan J.; Falter, James L.; Buscombe, Daniel D.

    2017-01-01

    Most techniques for estimating settling velocities of natural particles have been developed for siliciclastic sediments. Therefore, to understand how these techniques apply to bioclastic environments, measured settling velocities of bioclastic sedimentary deposits sampled from a nearshore fringing reef in Western Australia were compared with settling velocities calculated using results from several common grain-size analysis techniques (sieve, laser diffraction and image analysis) and established models. The effects of sediment density and shape were also examined using a range of density values and three different models of settling velocity. Sediment density was found to have a significant effect on calculated settling velocity, causing a range in normalized root-mean-square error of up to 28%, depending upon settling velocity model and grain-size method. Accounting for particle shape reduced errors in predicted settling velocity by 3% to 6% and removed any velocity-dependent bias, which is particularly important for the fastest settling fractions. When shape was accounted for and measured density was used, normalized root-mean-square errors were 4%, 10% and 18% for laser diffraction, sieve and image analysis, respectively. The results of this study show that established models of settling velocity that account for particle shape can be used to estimate settling velocity of irregularly shaped, sand-sized bioclastic sediments from sieve, laser diffraction, or image analysis-derived measures of grain size with a limited amount of error. Collectively, these findings will allow for grain-size data measured with different methods to be accurately converted to settling velocity for comparison. This will facilitate greater understanding of the hydraulic properties of bioclastic sediment which can help to increase our general knowledge of sediment dynamics in these environments.
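
    One widely used settling-velocity model that folds particle shape into its two constants is that of Ferguson and Church (2004). The abstract does not name the specific models compared, so the following is an assumption-level sketch, with carbonate-like density values as placeholders:

        def settling_velocity(d, rho_s=2800.0, rho_f=1025.0, nu=1.05e-6,
                              g=9.81, C1=18.0, C2=1.0):
            """Ferguson & Church (2004). d: grain diameter (m); rho_s ~ bioclastic
            carbonate, rho_f ~ seawater; C1, C2 encode shape/roughness effects."""
            R = (rho_s - rho_f) / rho_f   # submerged specific gravity
            return R * g * d**2 / (C1 * nu + (0.75 * C2 * R * g * d**3) ** 0.5)

        print(f"{settling_velocity(250e-6):.3f} m/s")  # ~0.03 m/s for a 250-micron grain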

  11. The palisade cartilage tympanoplasty technique: a systematic review and meta-analysis.

    PubMed

    Jeffery, Caroline C; Shillington, Cameron; Andrews, Colin; Ho, Allan

    2017-06-17

    Tympanoplasty is a common procedure performed by otolaryngologists. Many types of autologous grafts have been used, with variations of techniques and varying results. This is the first systematic review of the literature and meta-analysis aiming to evaluate the effectiveness of a technique which is gaining popularity, the palisade cartilage tympanoplasty. PubMed, EMBASE, and Cochrane databases were searched for "palisade", "cartilage", "tympanoplasty", "perforation" and their synonyms. In total, 199 articles reporting results of palisade cartilage tympanoplasty were identified. Five articles satisfied the following inclusion criteria: adult patients, minimum 6 months follow-up, hearing and surgical outcomes reported. Studies with patients undergoing combined mastoidectomy, ossicular chain reconstruction, and/or other middle ear surgery were excluded. Perforation closure, rate of complications, and post-operative pure-tone average change were extracted for pooled analysis. Study failure and complication proportions that were used to generate odds ratios were pooled. Fixed effects and random effects weightings were generated. The resulting pooled odds ratios are reported. Palisade cartilage tympanoplasty has an overall take rate of 96% beyond 6 months and has similar odds of complications compared to temporalis fascia (OR 0.89, 95% CI 0.62, 1.30). The air-bone gap closure is statistically similar to reported results from temporalis fascia tympanoplasty. Cartilage palisade tympanoplasty offers excellent graft take rates and good postoperative hearing outcomes for perforations of various sizes and for both primary and revision cases. This technique has predictable, long-term results with low complication rates, similar to temporalis fascia tympanoplasty.

  12. Connectivism in Postsecondary Online Courses: An Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Hogg, Nanette; Lomicky, Carol S.

    2012-01-01

    This study explores 465 postsecondary students' experiences in online classes through the lens of connectivism. Downes' 4 properties of connectivism (diversity, autonomy, interactivity, and openness) were used as the study design. An exploratory factor analysis was performed. This study found a 4-factor solution. Subjects indicated that autonomy…

  13. Assessment of phytoplankton class abundance using fluorescence excitation-emission matrix by parallel factor analysis and nonnegative least squares

    NASA Astrophysics Data System (ADS)

    Su, Rongguo; Chen, Xiaona; Wu, Zhenzhen; Yao, Peng; Shi, Xiaoyong

    2015-07-01

    The feasibility of using the fluorescence excitation-emission matrix (EEM) along with parallel factor analysis (PARAFAC) and the nonnegative least squares (NNLS) method for the differentiation of phytoplankton taxonomic groups was investigated. Forty-one phytoplankton species belonging to 28 genera of five divisions were studied. First, the PARAFAC model was applied to EEMs, and 15 fluorescence components were generated. Second, the 15 fluorescence components were found to have a strong discriminating capability based on Bayesian discriminant analysis (BDA). Third, all spectra of the fluorescence component compositions for the 41 phytoplankton species were spectrographically sorted into 61 reference spectra using hierarchical cluster analysis (HCA), and the reference spectra were used to establish a database. Finally, the phytoplankton taxonomic groups were differentiated against the reference spectra database using the NNLS method. The five phytoplankton groups were differentiated with correct discrimination ratios (CDRs) of 100% for single-species samples at the division level. The CDRs for the mixtures were above 91% for the dominant phytoplankton species and above 73% for the subdominant phytoplankton species. Sixteen of the 85 field samples collected from the Changjiang River estuary were analyzed by both HPLC-CHEMTAX and the fluorometric technique developed. The results of both methods reveal that Bacillariophyta was the dominant algal group in these 16 samples and that the subdominant algal groups comprised Dinophyta, Chlorophyta and Cryptophyta. The differentiation results of the fluorometric technique were in good agreement with those from HPLC-CHEMTAX. The results indicate that the fluorometric technique can differentiate algal taxonomic groups accurately at the division level.
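
    The final unmixing step maps naturally onto a standard nonnegative least-squares solve. The sketch below uses scipy.optimize.nnls with random stand-in reference spectra, not the paper's 61-spectrum database:

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(1)
        refs = rng.random((15, 6))             # 15 PARAFAC components x 6 reference spectra
        true_abund = np.array([0.7, 0.0, 0.3, 0.0, 0.0, 0.0])
        mixture = refs @ true_abund            # synthetic mixed-sample composition
        abund, residual = nnls(refs, mixture)  # nonnegative abundance estimates
        print(abund.round(2))                  # recovers [0.7, 0, 0.3, 0, 0, 0]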

  14. A retrospective analysis of laparoscopic partial nephrectomy with segmental renal artery clamping and factors that predict postoperative renal function.

    PubMed

    Li, Pu; Qin, Chao; Cao, Qiang; Li, Jie; Lv, Qiang; Meng, Xiaoxin; Ju, Xiaobing; Tang, Lijun; Shao, Pengfei

    2016-10-01

    To evaluate the feasibility and efficiency of laparoscopic partial nephrectomy (LPN) with segmental renal artery clamping, and to analyse the factors affecting postoperative renal function. We conducted a retrospective analysis of 466 consecutive patients undergoing LPN using main renal artery clamping (group A, n = 152) or segmental artery clamping (group B, n = 314) between September 2007 and July 2015 in our department. Blood loss, operating time, warm ischaemia time (WIT) and renal function were compared between groups. Univariable and multivariable linear regression analyses were applied to assess the correlations of selected variables with postoperative glomerular filtration rate (GFR) reduction. Volumetric data and estimated GFR of a subset of 60 patients in group B were compared with GFR to evaluate the correlation between these functional variables and preserved renal function after LPN. The novel technique slightly increased operating time, WIT and intra-operative blood loss (P < 0.001), while it provided better postoperative renal function (P < 0.001) compared with the conventional technique. The blocking method and tumour characteristics were independent factors affecting GFR reduction, while WIT was not an independent factor. Correlation analysis showed that estimated GFR presented better correlation with GFR compared with kidney volume (R² = 0.794 vs. R² = 0.199) in predicting renal function after LPN. LPN with segmental artery clamping minimizes warm ischaemia injury and provides better early postoperative renal function compared with clamping the main renal artery. Kidney volume has a significantly inferior role compared with eGFR in predicting preserved renal function. © 2016 The Authors BJU International © 2016 BJU International Published by John Wiley & Sons Ltd.

  15. Sparse multivariate factor analysis regression models and its applications to integrative genomics analysis.

    PubMed

    Zhou, Yan; Wang, Pei; Wang, Xianlong; Zhu, Ji; Song, Peter X-K

    2017-01-01

    The multivariate regression model is a useful tool to explore complex associations between two kinds of molecular markers, which enables the understanding of the biological pathways underlying disease etiology. For a set of correlated response variables, accounting for such dependency can increase statistical power. Motivated by integrative genomic data analyses, we propose a new methodology-sparse multivariate factor analysis regression model (smFARM), in which correlations of response variables are assumed to follow a factor analysis model with latent factors. This proposed method not only allows us to address the challenge that the number of association parameters is larger than the sample size, but also to adjust for unobserved genetic and/or nongenetic factors that potentially conceal the underlying response-predictor associations. The proposed smFARM is implemented by the EM algorithm and the blockwise coordinate descent algorithm. The proposed methodology is evaluated and compared to the existing methods through extensive simulation studies. Our results show that accounting for latent factors through the proposed smFARM can improve sensitivity of signal detection and accuracy of sparse association map estimation. We illustrate smFARM by two integrative genomics analysis examples, a breast cancer dataset, and an ovarian cancer dataset, to assess the relationship between DNA copy numbers and gene expression arrays to understand genetic regulatory patterns relevant to the disease. We identify two trans-hub regions: one in cytoband 17q12 whose amplification influences the RNA expression levels of important breast cancer genes, and the other in cytoband 9q21.32-33, which is associated with chemoresistance in ovarian cancer. © 2016 WILEY PERIODICALS, INC.

  16. Factor analysis of the Hamilton Depression Rating Scale in Parkinson's disease.

    PubMed

    Broen, M P G; Moonen, A J H; Kuijf, M L; Dujardin, K; Marsh, L; Richard, I H; Starkstein, S E; Martinez-Martin, P; Leentjens, A F G

    2015-02-01

    Several studies have validated the Hamilton Depression Rating Scale (HAMD) in patients with Parkinson's disease (PD), and reported adequate reliability and construct validity. However, the factorial validity of the HAMD has not yet been investigated. The aim of our analysis was to explore the factor structure of the HAMD in a large sample of PD patients. A principal component analysis of the 17-item HAMD was performed on data of 341 PD patients, available from a previous cross sectional study on anxiety. An eigenvalue ≥1 was used to determine the number of factors. Factor loadings ≥0.4 in combination with oblique rotations were used to identify which variables made up the factors. Kaiser-Meyer-Olkin measure (KMO), Cronbach's alpha, Bartlett's test, communality, percentage of non-redundant residuals and the component correlation matrix were computed to assess factor validity. KMO verified the sample's adequacy for factor analysis and Cronbach's alpha indicated a good internal consistency of the total scale. Six factors had eigenvalues ≥1 and together explained 59.19% of the variance. The number of items per factor varied from 1 to 6. Inter-item correlations within each component were low. There was a high percentage of non-redundant residuals and low communality. This analysis demonstrates that the factorial validity of the HAMD in PD is unsatisfactory. This implies that the scale is not appropriate for studying specific symptom domains of depression based on factorial structure in a PD population. Copyright © 2014 Elsevier Ltd. All rights reserved.
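
    The eigenvalue ≥ 1 retention rule used in this study is easy to reproduce from a correlation matrix. The sketch below uses random stand-in data; the KMO measure, Bartlett's test and the oblique rotation would need a dedicated package such as factor_analyzer and are omitted:

        import numpy as np

        rng = np.random.default_rng(2)
        X = rng.normal(size=(341, 17))             # stand-in for 341 patients x 17 HAMD items
        R = np.corrcoef(X, rowvar=False)           # 17 x 17 correlation matrix
        eigvals = np.linalg.eigvalsh(R)[::-1]      # eigenvalues, descending
        n_factors = int((eigvals >= 1.0).sum())    # Kaiser criterion
        explained = eigvals[:n_factors].sum() / eigvals.sum()
        print(n_factors, f"{explained:.1%}")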

  17. Development of a Fourier transform infrared spectroscopy coupled to UV-Visible analysis technique for aminosides and glycopeptides quantitation in antibiotic locks.

    PubMed

    Sayet, G; Sinegre, M; Ben Reguiga, M

    2014-01-01

    The antibiotic lock technique maintains catheter sterility in high-risk patients on long-term parenteral nutrition. In our institution, vancomycin, teicoplanin, amikacin and gentamicin locks are prepared in the pharmaceutical department. In order to ensure patient safety and to comply with regulatory requirements, antibiotic locks are submitted to qualitative and quantitative assays prior to their release. The aim of this study was to develop an alternative quantitation technique for each of these four antibiotics, using Fourier transform infrared (FTIR) spectroscopy coupled to UV-Visible spectroscopy, and to compare the results with HPLC or immunochemistry assays. Prevalidation studies established the spectroscopic conditions used for antibiotic lock quantitation: FTIR/UV combinations were used for amikacin (1091-1115 cm⁻¹ and 208-224 nm), vancomycin (1222-1240 cm⁻¹ and 276-280 nm), and teicoplanin (1226-1230 cm⁻¹ and 278-282 nm). Gentamicin was quantified with FTIR only (1045-1169 cm⁻¹ and 2715-2850 cm⁻¹) because of UV-domain interference from parabens, the preservatives present in the commercial brand used to prepare the locks. For all antibiotic locks, the method was linear (R² = 0.996 to 0.999), accurate, repeatable (intra-day RSD: 2.9 to 7.1%; inter-day RSD: 2.9 to 5.1%) and precise. Compared with the reference methods, the FTIR/UV method was tightly correlated (Pearson coefficient: 97.4 to 99.9%) and did not show significant differences in recovery determinations. We developed a new, simple and reliable analysis technique for antibiotic quantitation in locks using an original combination of FTIR and UV analysis, allowing rapid identification and quantification of the studied antibiotics. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  18. Evaluating voice characteristics of first-year acting students in Israel: factor analysis.

    PubMed

    Amir, Ofer; Primov-Fever, Adi; Kushnir, Tami; Kandelshine-Waldman, Osnat; Wolf, Michael

    2013-01-01

    Acting students require diverse, high-quality and high-intensity vocal performance from the early stages of their training. Demanding vocal activities, undertaken before the appropriate vocal skills have developed, put them at high risk of developing vocal problems. We performed a retrospective analysis of the voice characteristics of first-year acting students using several voice evaluation tools. A total of 79 first-year acting students (55 women and 24 men) were assigned to two study groups, laryngeal findings (LFs) and no laryngeal findings, based on stroboscopic findings. Their voice characteristics were evaluated using acoustic analysis, aerodynamic examination, perceptual scales, and self-report questionnaires. Results obtained from each set of measures were examined using a factor analysis approach. Significant differences between the two groups were found for a single fundamental frequency (F0) regularity factor; a single Grade, Roughness, Breathiness, Asthenia, Strain perceptual factor; and the three self-evaluation factors. Gender differences were found for two acoustic analysis factors based on F0 and its derivatives, for an aerodynamic factor representing expiratory volume measurements, and for a single self-evaluation factor representing the tendency to seek therapy. Approximately 50% of the first-year acting students had LFs. These students differed from their peers in the control group on a single acoustic analysis factor, as well as on perceptual and self-report factors. No group differences, however, were found for the aerodynamic factors. Early laryngeal examination and voice evaluation of future professional voice users could provide a valuable individual baseline, to which later examinations could be compared, and assist in providing personally tailored treatment. Copyright © 2013 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  19. Kinematic and kinetic analysis of overhand, sidearm and underhand lacrosse shot techniques.

    PubMed

    Macaulay, Charles A J; Katz, Larry; Stergiou, Pro; Stefanyshyn, Darren; Tomaghelli, Luciano

    2017-12-01

    Lacrosse requires the coordinated performance of many complex skills. One of these skills is shooting on the opponents' net using one of three techniques: overhand, sidearm or underhand. The purpose of this study was to (i) determine which technique generated the highest ball velocity and greatest shot accuracy and (ii) identify kinematic and kinetic variables that contribute to a high velocity and high accuracy shot. Twelve elite male lacrosse players participated in this study. Kinematic data were sampled at 250 Hz, while two-dimensional force plates collected ground reaction force data (1000 Hz). Statistical analysis showed significantly greater ball velocity for the sidearm technique than overhand (P < 0.001) and underhand (P < 0.001) techniques. No statistical difference was found for shot accuracy (P > 0.05). Kinematic and kinetic variables were not significantly correlated to shot accuracy or velocity across all shot types; however, when analysed independently, the lead foot horizontal impulse showed a negative correlation with underhand ball velocity (P = 0.042). This study identifies the technique with the highest ball velocity, defines kinematic and kinetic predictors related to ball velocity and provides information to coaches and athletes concerned with improving lacrosse shot performance.

  20. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  1. A simple 2D composite image analysis technique for the crystal growth study of L-ascorbic acid.

    PubMed

    Kumar, Krishan; Kumar, Virender; Lal, Jatin; Kaur, Harmeet; Singh, Jasbir

    2017-06-01

    This work addresses 2D crystal-growth studies of L-ascorbic acid using a composite image analysis technique. Growth experiments on the L-ascorbic acid crystals were carried out by standard (optical) microscopy, laser diffraction analysis, and composite image analysis. For the image analysis, the growth of L-ascorbic acid crystals was captured as digital 2D RGB images, which were then processed into composite images. After processing, the crystal boundaries emerged as white lines against the black (cancelled) background. The crystal boundaries were well differentiated by peaks in the intensity graphs generated for the composite images. The lengths of crystal boundaries measured from the intensity graphs of the composite images were in good agreement (correlation coefficient r = 0.99) with the lengths measured by standard microscopy. In contrast, the lengths measured by laser diffraction correlated poorly with both techniques. The composite image analysis can therefore replace the standard microscopy technique for crystal-growth studies of L-ascorbic acid. © 2017 Wiley Periodicals, Inc.

  2. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    NASA Astrophysics Data System (ADS)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.

  3. Constraint treatment techniques and parallel algorithms for multibody dynamic analysis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chiou, Jin-Chern

    1990-01-01

    Computational procedures for the kinematic and dynamic analysis of three-dimensional multibody dynamic (MBD) systems are developed from the differential-algebraic equations (DAEs) viewpoint. Constraint violations during the time integration process are minimized, and penalty constraint stabilization techniques and partitioning schemes are developed. The governing equations of motion are treated with a two-stage staggered explicit-implicit numerical algorithm that takes advantage of a partitioned solution procedure. A robust and parallelizable integration algorithm is developed that uses a two-stage staggered central difference scheme to integrate the translational coordinates and the angular velocities. The angular orientations of bodies in MBD systems are then obtained with an implicit algorithm via the kinematic relationship between Euler parameters and angular velocities. It is shown that the combination of the present solution procedures yields a computationally more accurate solution. To speed up the computational procedures, the present constraint treatment techniques and the two-stage staggered explicit-implicit numerical algorithm were efficiently parallelized. The DAEs and the constraint treatment techniques were transformed into arrowhead matrices, from which a Schur complement form was derived. By fully exploiting sparse matrix structural analysis techniques, a parallel preconditioned conjugate gradient algorithm is used to solve the system equations written in Schur complement form. A software testbed was designed and implemented on both sequential and parallel computers, and was used to demonstrate the robustness and efficiency of the constraint treatment techniques, the accuracy of the two-stage staggered explicit-implicit numerical algorithm, and the speedup of the Schur-complement-based parallel preconditioned conjugate gradient algorithm on a parallel computer.

  4. Economic and Demographic Factors Impacting Placement of Students with Autism

    ERIC Educational Resources Information Center

    Kurth, Jennifer A.; Mastergeorge, Ann M.; Paschall, Katherine

    2016-01-01

    Educational placement of students with autism is often associated with child factors, such as IQ and communication skills. However, variability in placement patterns across states suggests that other factors are at play. This study used hierarchical cluster analysis techniques to identify demographic, economic, and educational covariates…

  5. Catchment process affecting drinking water quality, including the significance of rainfall events, using factor analysis and event mean concentrations.

    PubMed

    Cinque, Kathy; Jayasuriya, Niranjali

    2010-12-01

    To ensure the protection of drinking water, an understanding of the catchment processes that can affect water quality is important, as it enables targeted catchment management actions to be implemented. In this study, factor analysis (FA) and the comparison of event mean concentrations (EMCs) with baseline values were used to assess the relationships between water quality parameters and to link those parameters to processes within an agricultural drinking water catchment. FA found that 55% of the variance in the water quality data could be explained by the first factor, which was dominated by parameters usually associated with erosion. Inclusion of pathogenic indicators in an additional FA showed that Enterococcus and Clostridium perfringens (C. perfringens) were also related to the erosion factor. Analysis of the EMCs found that most parameters were significantly higher during periods of rainfall runoff. This study shows that the most dominant processes in an agricultural catchment are surface runoff and erosion. It also shows that it is these processes which mobilise pathogenic indicators and are therefore most likely to influence the transport of pathogens. Catchment management efforts need to focus on reducing the effect of these processes on water quality.
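
    An event mean concentration is simply the flow-weighted average concentration over an event, EMC = total load / total runoff volume. A minimal sketch with toy arrays (not the catchment's data):

        import numpy as np

        def event_mean_concentration(conc, flow, dt):
            """conc: mg/L, flow: L/s, dt: s between samples (uniform spacing assumed)."""
            load = np.sum(conc * flow) * dt    # total load, mg
            volume = np.sum(flow) * dt         # total runoff volume, L
            return load / volume               # flow-weighted concentration, mg/L

        emc = event_mean_concentration(
            conc=np.array([12.0, 85.0, 140.0, 60.0, 20.0]),
            flow=np.array([5.0, 40.0, 90.0, 35.0, 10.0]),
            dt=600.0)
        print(f"EMC = {emc:.1f} mg/L")         # compare against the baseline value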

  6. Elimination of chromatographic and mass spectrometric problems in GC-MS analysis of Lavender essential oil by multivariate curve resolution techniques: Improving the peak purity assessment by variable size moving window-evolving factor analysis.

    PubMed

    Jalali-Heravi, Mehdi; Moazeni-Pourasil, Roudabeh Sadat; Sereshti, Hassan

    2015-03-01

    In the analysis of complex natural matrices by gas chromatography-mass spectrometry (GC-MS), many disturbing factors, such as baseline drift, spectral background, homoscedastic and heteroscedastic noise, peak-shape deformation (non-Gaussian peaks), low S/N ratio and co-elution (overlapped and/or embedded peaks), must be handled to save time, money and experimental effort. This study aimed to improve the GC-MS analysis of complex natural matrices by utilizing multivariate curve resolution (MCR) methods. In addition, to assess the peak purity of the two-dimensional data, a method called variable size moving window-evolving factor analysis (VSMW-EFA) is introduced and examined. The proposed methodology was applied to the GC-MS analysis of Iranian Lavender essential oil, extending the number of identified constituents from 56 to 143 components. The most abundant constituents of the Iranian Lavender essential oil were found to be α-pinene (16.51%), camphor (10.20%), 1,8-cineole (9.50%), bornyl acetate (8.11%) and camphene (6.50%), indicating that Iranian-type Lavender contains a relatively high percentage of α-pinene. Comparison of different types of Lavender essential oils showed a compositional similarity between Iranian and Italian (Sardinia Island) Lavenders. Published by Elsevier B.V.
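
    The peak-purity idea behind moving-window evolving factor analysis can be sketched with local SVDs along the elution axis: more than one significant singular value in a region signals co-elution. The fixed-window version below is a simplification; VSMW-EFA additionally adapts the window size, which this sketch does not attempt:

        import numpy as np

        def mw_efa(D, window=7, n_sv=3):
            """D: (retention times x m/z channels) matrix; returns local singular values."""
            half = window // 2
            out = np.full((D.shape[0], n_sv), np.nan)
            for t in range(half, D.shape[0] - half):
                sv = np.linalg.svd(D[t - half:t + half + 1], compute_uv=False)
                out[t] = sv[:n_sv]
            return out   # >1 significant value in a region suggests an impure peak

        rng = np.random.default_rng(3)
        print(mw_efa(rng.random((60, 40)))[10])   # local singular values at scan 10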

  7. Teaching Tip: Using Activity Diagrams to Model Systems Analysis Techniques: Teaching What We Preach

    ERIC Educational Resources Information Center

    Lending, Diane; May, Jeffrey

    2013-01-01

    Activity diagrams are used in Systems Analysis and Design classes as a visual tool to model the business processes of "as-is" and "to-be" systems. This paper presents the idea of using these same activity diagrams in the classroom to model the actual processes (practices and techniques) of Systems Analysis and Design. This tip…

  8. A new technique for ordering asymmetrical three-dimensional data sets in ecology.

    PubMed

    Pavoine, Sandrine; Blondel, Jacques; Baguette, Michel; Chessel, Daniel

    2007-02-01

    The aim of this paper is to tackle the problem that arises from asymmetrical data cubes formed by two crossed factors fixed by the experimenter (factor A and factor B, e.g., sites and dates) and a factor which is not controlled for (the species). The entries of this cube are species densities. We approach this kind of data by the comparison of patterns, that is to say by analyzing first the effect of factor B on the species-factor A pattern, and second the effect of factor A on the species-factor B pattern. The analysis of patterns instead of individual responses requires a correspondence analysis. We use a method we call Foucart's correspondence analysis to coordinate the correspondence analyses of several independent matrices of species x factor A (respectively B) type, corresponding to each modality of factor B (respectively A). Such coordination makes it possible to evaluate the effect of factor B (respectively A) on the species-factor A (respectively B) pattern. The results obtained by such a procedure are much more insightful than those resulting from a classical single correspondence analysis applied to the global matrix that is obtained by simply unrolling the data cube, juxtaposing for example the individual species x factor A matrices through modalities of factor B. This is because a single global correspondence analysis combines three effects of factors in a way that cannot be determined from factorial maps (factor A, factor B, and factor A x factor B interaction), whereas the applications of Foucart's correspondence analysis clearly discriminate two different issues. Using two data sets, we illustrate that this technique proves to be particularly powerful in the analyses of ecological convergence, which include several distinct data sets, and in the analyses of spatiotemporal variations of species distributions.
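
    For reference, plain correspondence analysis of a single species x factor-A table reduces to an SVD of standardized residuals; Foucart's method coordinates several such analyses (one per modality of factor B), which the minimal sketch below, with a made-up count table, does not attempt:

        import numpy as np

        N = np.array([[12., 3., 0.], [5., 9., 2.], [0., 4., 11.], [2., 2., 6.]])
        P = N / N.sum()                                # correspondence matrix
        r, c = P.sum(axis=1), P.sum(axis=0)            # row and column margins
        S = np.diag(r**-0.5) @ (P - np.outer(r, c)) @ np.diag(c**-0.5)
        U, sv, Vt = np.linalg.svd(S, full_matrices=False)
        row_coords = np.diag(r**-0.5) @ U * sv         # principal coordinates of rows (species)
        print(row_coords[:, :2].round(3))              # first two CA axes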

  9. An Evaluation on Factors Influencing Decision making for Malaysia Disaster Management: The Confirmatory Factor Analysis Approach

    NASA Astrophysics Data System (ADS)

    Zubir, S. N. A.; Thiruchelvam, S.; Mustapha, K. N. M.; Che Muda, Z.; Ghazali, A.; Hakimie, H.

    2017-12-01

    For the past few years, natural disasters have been the subject of debate in disaster management, especially flood disasters. Each year, natural disasters result in significant loss of life, destruction of homes and public infrastructure, and economic hardship. Hence, effective and efficient flood disaster management helps to ensure that life-saving efforts are not futile. The aim of this article is to examine the relationship of approach, decision maker, influence factors, results, and ethics to decision making for flood disaster management in Malaysia. The key elements of decision making in disaster management were identified from the literature. Questionnaire surveys were administered among lead agencies on the East Coast of Malaysia in the states of Kelantan and Pahang, and a total of 307 valid responses were obtained for further analysis. Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were carried out to analyse the measurement model involved in the study. The CFA for the second-order and first-order reflective measurement models indicates that approach, decision maker, influence factors, results, and ethics have a significant and direct effect on decision making during a disaster. The results show that decision making during a disaster is an important element of disaster management and is necessary for successful collaborative decision making. The measurement model was accepted for further analysis using Structural Equation Modeling (SEM) and can be assessed in future research.

  10. Identifying configurations of behavior change techniques in effective medication adherence interventions: a qualitative comparative analysis.

    PubMed

    Kahwati, Leila; Viswanathan, Meera; Golin, Carol E; Kane, Heather; Lewis, Megan; Jacobs, Sara

    2016-05-04

    Interventions to improve medication adherence are diverse and complex. Consequently, synthesizing this evidence is challenging. We aimed to extend the results from an existing systematic review of interventions to improve medication adherence by using qualitative comparative analysis (QCA) to identify necessary or sufficient configurations of behavior change techniques among effective interventions. We used data from 60 studies in a completed systematic review to examine the combinations of nine behavior change techniques (increasing knowledge, increasing awareness, changing attitude, increasing self-efficacy, increasing intention formation, increasing action control, facilitation, increasing maintenance support, and motivational interviewing) among studies demonstrating improvements in adherence. Among the 60 studies, 34 demonstrated improved medication adherence. Among effective studies, increasing patient knowledge was a necessary but not sufficient technique. We identified seven configurations of behavior change techniques sufficient for improving adherence, which together accounted for 26 (76 %) of the effective studies. The intervention configuration that included increasing knowledge and self-efficacy was the most empirically relevant, accounting for 17 studies (50 %) and uniquely accounting for 15 (44 %). This analysis extends the completed review findings by identifying multiple combinations of behavior change techniques that improve adherence. Our findings offer direction for policy makers, practitioners, and future comparative effectiveness research on improving adherence.
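
    In crisp-set QCA, claims of sufficiency rest on consistency and coverage scores. The sketch below computes both for one hypothetical configuration (knowledge AND self-efficacy) on toy 0/1 codings, not the review's 60 studies:

        import numpy as np

        techniques = np.array([[1, 1], [1, 1], [1, 0], [0, 1], [1, 1], [0, 0]])  # knowledge, self-efficacy
        effective  = np.array([1, 1, 0, 0, 1, 0])        # study showed improved adherence?
        config = techniques.min(axis=1)                  # logical AND of the two techniques
        consistency = (config & effective).sum() / config.sum()    # how reliably config -> outcome
        coverage    = (config & effective).sum() / effective.sum() # how much outcome it explains
        print(consistency, coverage)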

  11. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis

    PubMed Central

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-01-01

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amount required and high-throughput performance) make this tool well suited to answering and solving biological questions of interest about a single cell. This review introduces microfluidics-related techniques for the isolation, trapping and manipulation of a single cell. The major approaches to detection in single-cell analysis are introduced, and the applications of single-cell analysis are then summarized. The review concludes with a discussion of the future directions and opportunities of microfluidic systems applied to the analysis of a single cell. PMID:26213918

  12. Analysis of Learning Curve Fitting Techniques.

    DTIC Science & Technology

    1987-09-01

    Random errors are assumed to be normally distributed when using ordinary least-squares, according to Johnston, with each lot estimated by the improvement curve formula. For a more detailed explanation of the ordinary least-squares technique, see Neter et al., Applied Linear Regression Models (Homewood, IL: Irwin) and the SAS User's Guide: Basics, Version 5 Edition.
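
    The standard improvement-curve model behind such analyses is y = a * x**b, usually fit by ordinary least-squares on logarithms. A minimal sketch with synthetic unit costs (all values assumed):

        import numpy as np

        units = np.arange(1, 9)                     # cumulative unit numbers
        cost = 100 * units**-0.322 * np.exp(np.random.default_rng(4).normal(0, 0.02, 8))
        b, log_a = np.polyfit(np.log(units), np.log(cost), 1)   # OLS on logs
        slope = 2.0**b                              # b = -0.322 -> an 80% learning curve
        print(f"a = {np.exp(log_a):.1f}, b = {b:.3f}, slope = {slope:.0%}")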

  13. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis]

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  14. Graphical Representation of University Image: A Correspondence Analysis.

    ERIC Educational Resources Information Center

    Yavas, Ugar; Shemwell, Donald J.

    1996-01-01

    Correspondence analysis, an easy-to-interpret interdependence technique, portrays data graphically to show associations of factors more clearly. A study used the technique with 58 students in one university to determine factors in college choice. Results identified the institution's closest competitors and its positioning in terms of college…
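
    For readers unfamiliar with the mechanics, correspondence analysis reduces to a singular value decomposition of the standardized residuals of a contingency table; the numpy sketch below uses an invented table, not the study's data.

    ```python
    import numpy as np

    # Hypothetical contingency table: rows = institutions, cols = choice factors
    N = np.array([[30.0, 12.0,  8.0],
                  [10.0, 25.0, 15.0],
                  [ 5.0, 10.0, 22.0]])

    P = N / N.sum()                      # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)  # row and column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals

    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = (U * sv) / np.sqrt(r)[:, None]     # row principal coordinates
    cols = (Vt.T * sv) / np.sqrt(c)[:, None]  # column principal coordinates
    print(rows[:, :2])  # the first two dimensions are what gets plotted
    ```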

  15. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    PubMed

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method that uses multivariate reference signals of fused silica and sapphire Raman signals generated from a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., the shoulder of the prominent fused silica boson peak (~130 cm⁻¹); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm⁻¹)) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe was used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R² = 0.981) with a root-mean-square error of cross-validation (RMSECV) of 2.5 mW was obtained for predicting laser excitation power changes based on leave-one-subject-out cross-validation, superior to the standard univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R² = 0.985) was also achieved for real-time laser power prediction when the multivariate method was applied independently to five new subjects (n = 166 spectra). We further applied the multivariate reference technique to quantitative analysis of gelatin tissue phantoms, yielding an RMSEP of ~2.0% (R² = 0.998) independent of laser excitation power variations. This work demonstrates that the multivariate reference technique can be advantageously used to monitor and correct variations of laser excitation power and fiber coupling efficiency in situ, standardizing tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in
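
    The PLS modeling step can be sketched with scikit-learn; the random spectra, the component count, and the grouping below are placeholders standing in for the authors' actual data and pipeline.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_predict

    # Placeholder data: rows = Raman spectra, y = laser excitation power (mW),
    # groups = subject IDs for leave-one-subject-out cross-validation
    rng = np.random.default_rng(0)
    X = rng.random((100, 800))
    y = rng.uniform(5, 65, size=100)
    groups = np.repeat(np.arange(25), 4)

    pls = PLSRegression(n_components=5)  # component count is an assumption
    y_pred = cross_val_predict(pls, X, y, groups=groups, cv=LeaveOneGroupOut())

    rmsecv = np.sqrt(np.mean((y - y_pred.ravel()) ** 2))
    print(f"RMSECV = {rmsecv:.2f} mW")
    ```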

  16. Analysis of the influencing factors of global energy interconnection development

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; He, Yongxiu; Ge, Sifan; Liu, Lin

    2018-04-01

    Against the background of building a global energy interconnection and achieving green, low-carbon development, this paper considers the new round of energy restructuring and trends in energy technology and, based on the present state of global and Chinese energy interconnection development, establishes an index system of the factors influencing global energy interconnection development. Subjective and objective weights for these factors were derived separately by the analytic network process and the entropy method, then combined by additive aggregation to yield comprehensive weights and a ranking of the influencing factors.
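
    The entropy method mentioned in this abstract has a simple closed form; here is a numpy sketch on an invented indicator matrix, with the subjective (ANP-style) weights given as fixed numbers and the 50/50 blend an assumption rather than the paper's aggregation.

    ```python
    import numpy as np

    # Hypothetical indicator matrix: rows = scenarios, cols = influencing factors
    X = np.array([[0.8, 120.0, 3.2],
                  [0.6, 150.0, 2.1],
                  [0.9,  90.0, 4.0]])

    # Entropy weighting: normalize each column, compute its information entropy;
    # columns with lower entropy (more dispersion) receive higher weight
    P = X / X.sum(axis=0)
    E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
    objective_w = (1 - E) / (1 - E).sum()

    # Additive aggregation with assumed subjective weights
    subjective_w = np.array([0.5, 0.3, 0.2])
    combined = 0.5 * subjective_w + 0.5 * objective_w
    print(combined, np.argsort(combined)[::-1])  # weights and influence ranking
    ```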

  17. General methodology: Costing, budgeting, and techniques for benefit-cost and cost-effectiveness analysis

    NASA Technical Reports Server (NTRS)

    Stretchberry, D. M.; Hein, G. F.

    1972-01-01

    The general concepts of costing, budgeting, benefit-cost ratio, and cost-effectiveness analysis are discussed. The three common methods of costing are presented, budgeting distributions are discussed, and the use of discounting procedures is outlined. Benefit-cost ratio and cost-effectiveness analysis are defined, and their current application to NASA planning is pointed out. Specific practices and techniques are discussed, actual costing and budgeting procedures are outlined, and the recommended method of calculating benefit-cost ratios is described. A standardized method of cost-effectiveness analysis and long-range planning is also discussed.
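
    As a minimal illustration of the discounting and benefit-cost calculations described, consider the sketch below; the cash flows and the 7% discount rate are invented, not NASA figures.

    ```python
    # Discounted benefit-cost ratio: BCR = PV(benefits) / PV(costs)
    def present_value(flows, rate):
        # flows[t] is assumed to occur at the end of year t + 1
        return sum(f / (1 + rate) ** (t + 1) for t, f in enumerate(flows))

    benefits = [0, 40, 60, 80, 80]  # hypothetical annual benefits
    costs = [100, 20, 10, 10, 10]   # hypothetical annual costs
    rate = 0.07                     # assumed discount rate

    bcr = present_value(benefits, rate) / present_value(costs, rate)
    print(f"Benefit-cost ratio = {bcr:.2f}")  # > 1 favors the project
    ```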

  18. Examining Evolving Performance on the Force Concept Inventory Using Factor Analysis

    ERIC Educational Resources Information Center

    Semak, M. R.; Dietz, R. D.; Pearson, R. H.; Willis, C. W.

    2017-01-01

    The application of factor analysis to the "Force Concept Inventory" (FCI) has proven to be problematic. Some studies have suggested that factor analysis of test results serves as a helpful tool in assessing the recognition of Newtonian concepts by students. Other work has produced at best ambiguous results. For the FCI administered as a…

  19. A Markov Chain Monte Carlo Approach to Confirmatory Item Factor Analysis

    ERIC Educational Resources Information Center

    Edwards, Michael C.

    2010-01-01

    Item factor analysis has a rich tradition in both the structural equation modeling and item response theory frameworks. The goal of this paper is to demonstrate a novel combination of various Markov chain Monte Carlo (MCMC) estimation routines to estimate parameters of a wide variety of confirmatory item factor analysis models. Further, I show…
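
    A confirmatory one-factor item factor model (the two-parameter logistic) can be estimated by MCMC in a few lines of PyMC; this generic sketch uses simulated binary responses and is not the paper's estimation routine.

    ```python
    import numpy as np
    import pymc as pm

    # Simulated binary response matrix: 200 persons x 10 items
    rng = np.random.default_rng(0)
    Y = rng.binomial(1, 0.5, size=(200, 10))
    n_persons, n_items = Y.shape

    with pm.Model():
        theta = pm.Normal("theta", 0.0, 1.0, shape=n_persons)  # latent trait
        a = pm.LogNormal("a", 0.0, 0.5, shape=n_items)         # discriminations
        b = pm.Normal("b", 0.0, 1.0, shape=n_items)            # difficulties
        eta = a[None, :] * (theta[:, None] - b[None, :])
        pm.Bernoulli("obs", logit_p=eta, observed=Y)
        trace = pm.sample(1000, tune=1000)  # NUTS sampling
    ```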

  20. Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR

    ERIC Educational Resources Information Center

    Baglin, James

    2014-01-01

    Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many…
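
    One practice this literature commonly recommends, parallel analysis for choosing the number of factors, is easy to sketch with numpy; the simulated data below are placeholders. For ordinal items the same comparison is usually run on a polychoric rather than Pearson correlation matrix, which is among the refinements the FACTOR program provides.

    ```python
    import numpy as np

    def parallel_analysis(data, n_sims=100, seed=0):
        """Retain factors whose observed eigenvalues exceed the mean
        eigenvalues of correlation matrices from random data of equal shape."""
        rng = np.random.default_rng(seed)
        n, p = data.shape
        observed = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
        random_eigs = np.array([
            np.linalg.eigvalsh(
                np.corrcoef(rng.normal(size=(n, p)), rowvar=False))[::-1]
            for _ in range(n_sims)
        ])
        return int((observed > random_eigs.mean(axis=0)).sum())

    X = np.random.default_rng(1).normal(size=(300, 12))  # placeholder data
    print("suggested number of factors:", parallel_analysis(X))
    ```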