Sample records for statistical techniques principal

  1. Data analysis techniques

    NASA Technical Reports Server (NTRS)

    Park, Steve

    1990-01-01

    A large and diverse set of computational techniques is routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques is considered in turn.

  2. The potential of statistical shape modelling for geometric morphometric analysis of human teeth in archaeological research

    PubMed Central

    Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia

    2017-01-01

    This paper introduces statistical shape modelling (SSM) for use in osteoarchaeological research. SSM is a full-field, multi-material analytical technique, presented here as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique's application to inter-sample comparison was demonstrated through analysis of the principal component (PC) weights. It was found that SSM could provide highly detailed qualitative and quantitative insight into archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications, including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199

  3. Towards Solving the Mixing Problem in the Decomposition of Geophysical Time Series by Independent Component Analysis

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2000-01-01

    The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that the Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of the components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. Furthermore, ICA does not require additional a priori information, such as the localization constraint used in Rotational Techniques.
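
    A minimal sketch of the mixing problem described in this record, assuming synthetic sinusoidal sources and scikit-learn (neither comes from the paper): PCA tends to return mixtures of two linearly summed sources, while ICA recovers them up to permutation and sign.

    ```python
    # Toy demonstration: PCA mixes two linearly summed sources, FastICA unmixes.
    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 2000)
    s1 = np.sin(2 * np.pi * 1.0 * t)            # first "physical" mode
    s2 = np.sign(np.sin(2 * np.pi * 0.3 * t))   # second, non-Gaussian mode
    S = np.c_[s1, s2] + 0.05 * rng.standard_normal((t.size, 2))

    A = np.array([[1.0, 0.5], [0.4, 1.0]])      # linear mixing of the sources
    X = S @ A.T

    pca_comps = PCA(n_components=2).fit_transform(X)
    ica_comps = FastICA(n_components=2, random_state=0).fit_transform(X)

    # Correlate each estimate with each true source: ICA should give a near-
    # diagonal pattern (up to permutation and sign), PCA a mixed one.
    for name, comps in [("PCA", pca_comps), ("ICA", ica_comps)]:
        corr = np.corrcoef(S.T, comps.T)[:2, 2:]
        print(name, np.round(np.abs(corr), 2))
    ```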

  4. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    NASA Technical Reports Server (NTRS)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

    The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.

  5. Principal Component Analysis with Incomplete Data: A Simulation of the R pcaMethods Package in Constructing an Environmental Quality Index with Missing Data

    EPA Science Inventory

    Missing data is a common problem in the application of statistical techniques. In principal component analysis (PCA), a technique for dimensionality reduction, incomplete data points are either discarded or imputed using interpolation methods. Such approaches are less valid when ...
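
    The record names the R pcaMethods package; as a hedged Python sketch (scikit-learn and synthetic data, not the EPA workflow), the two common handlings of incomplete data points mentioned above look like this:

    ```python
    # Discard vs. impute before PCA; illustrative only, not the pcaMethods code.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.impute import SimpleImputer

    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 6))
    X[rng.random(X.shape) < 0.1] = np.nan       # ~10% missing at random

    # Option 1: complete-case analysis (discard rows with any missing value).
    complete = X[~np.isnan(X).any(axis=1)]
    pcs_discard = PCA(n_components=2).fit_transform(complete)

    # Option 2: mean imputation, then PCA on all rows.
    X_imp = SimpleImputer(strategy="mean").fit_transform(X)
    pcs_impute = PCA(n_components=2).fit_transform(X_imp)

    print(pcs_discard.shape, pcs_impute.shape)
    ```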

  6. Information extraction from multivariate images

    NASA Technical Reports Server (NTRS)

    Park, S. K.; Kegley, K. A.; Schiess, J. R.

    1986-01-01

    An overview of several multivariate image processing techniques is presented, with emphasis on techniques based upon the principal component transformation (PCT). A multiimage associates with each pixel location a multivariate pixel value that has been scaled and quantized into a gray-level vector; the bivariate correlation between component images measures the extent to which two images are correlated. The PCT decorrelates a multiimage, reducing its dimensionality and revealing intercomponent dependencies when some off-diagonal covariance elements are not small; for display purposes, the principal component images must be postprocessed back into multiimage format. The principal component analysis of a multiimage is a statistical analysis based upon the PCT whose primary application is to determine the intrinsic component dimensionality of the multiimage. Computational considerations are also discussed.
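
    A sketch of the principal component transformation on a multiimage, assuming synthetic correlated bands in NumPy rather than real remote-sensing data: each pixel's gray-level vector is decorrelated across bands, and variance concentrates in the leading component image.

    ```python
    # PCT of a multiimage: decorrelate the band dimension of an (h, w, bands) stack.
    import numpy as np

    rng = np.random.default_rng(2)
    h, w, bands = 64, 64, 4
    base = rng.standard_normal((h, w))
    multiimage = np.stack([base + 0.3 * rng.standard_normal((h, w))
                           for _ in range(bands)], axis=-1)

    pixels = multiimage.reshape(-1, bands)
    pixels = pixels - pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]               # sort descending
    pct = (pixels @ eigvecs[:, order]).reshape(h, w, bands)

    # Off-diagonal covariances of the transformed bands are ~0 (decorrelated),
    # and most variance concentrates in the first principal component image.
    print(np.round(np.cov(pct.reshape(-1, bands), rowvar=False), 3))
    ```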

  7. Proceedings of the NASA Symposium on Mathematical Pattern Recognition and Image Analysis

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.

    1983-01-01

    The application of mathematical and statistical analysis techniques to imagery obtained by remote sensors is described by the Principal Investigators. Scene-to-map registration, geometric rectification, and image matching are among the pattern recognition aspects discussed.

  8. Principal component analysis of the cytokine and chemokine response to human traumatic brain injury.

    PubMed

    Helmy, Adel; Antoniades, Chrystalina A; Guilfoyle, Mathew R; Carpenter, Keri L H; Hutchinson, Peter J

    2012-01-01

    There is a growing realisation that neuro-inflammation plays a fundamental role in the pathology of Traumatic Brain Injury (TBI). This has led to the search for biomarkers that reflect these underlying inflammatory processes, using techniques such as cerebral microdialysis. The interpretation of such biomarker data has been limited by the statistical methods used. When analysing data of this sort, the multiple putative interactions between mediators need to be considered, as well as the timing of production and the high degree of statistical covariance in the levels of these mediators. Here we present a cytokine and chemokine dataset from human brain following traumatic brain injury and use principal component analysis and partial least squares discriminant analysis to demonstrate the pattern of production following TBI, the distinct phases of the humoral inflammatory response, and the differing patterns of response in brain and in peripheral blood. This technique has the added advantage of making no assumptions about the Relative Recovery (RR) of microdialysis-derived parameters. Taken together, these techniques can be used in complex microdialysis datasets to summarise the data succinctly and generate hypotheses for future study.
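
    PLS-DA has no dedicated class in scikit-learn; a common sketch (an assumption here, not the authors' pipeline) is PLSRegression on indicator-coded class labels, with synthetic "mediator" data standing in for the cytokine panel:

    ```python
    # PLS-DA sketched as PLS regression on a 0/1 class indicator.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(3)
    X = rng.standard_normal((60, 12))           # e.g., 12 mediator levels
    y = np.repeat([0, 1], 30)                   # brain vs. peripheral blood
    X[y == 1, :3] += 1.0                        # class-dependent mediators

    pls = PLSRegression(n_components=2).fit(X, y)
    scores = pls.transform(X)                   # latent scores for plotting
    pred = (pls.predict(X).ravel() > 0.5).astype(int)
    print("training accuracy:", (pred == y).mean())
    ```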

  9. Identifying Nanoscale Structure-Function Relationships Using Multimodal Atomic Force Microscopy, Dimensionality Reduction, and Regression Techniques.

    PubMed

    Kong, Jessica; Giridharagopal, Rajiv; Harrison, Jeffrey S; Ginger, David S

    2018-05-31

    Correlating nanoscale chemical specificity with operational physics is a long-standing goal of functional scanning probe microscopy (SPM). We employ a data-analytic approach combining multiple microscopy modes, using compositional information in infrared vibrational excitation maps acquired via photoinduced force microscopy (PiFM) together with electrical information from conductive atomic force microscopy. We study a model polymer blend comprising insulating poly(methyl methacrylate) (PMMA) and semiconducting poly(3-hexylthiophene) (P3HT). We show that PiFM spectra differ from FTIR spectra but can still be used to identify local composition. We use principal component analysis to extract statistically significant principal components, and principal component regression to predict local current and identify local polymer composition. In doing so, we observe evidence of semiconducting P3HT within PMMA aggregates. These methods are generalizable to correlated SPM data and provide a meaningful technique for extracting complex compositional information that is impossible to measure with any one technique alone.

  10. Proposal for a biometrics of the cortical surface: a statistical method for relative surface distance metrics

    NASA Astrophysics Data System (ADS)

    Bookstein, Fred L.

    1995-08-01

    Recent advances in computational geometry have greatly extended the range of neuroanatomical questions that can be approached by rigorous quantitative methods. One of the major current challenges in this area is to describe the variability of human cortical surface form and its implications for individual differences in neurophysiological functioning. Existing techniques for representation of stochastically invaginated surfaces do not conduce to the necessary parametric statistical summaries. In this paper, following a hint from David Van Essen and Heather Drury, I sketch a statistical method customized for the constraints of this complex data type. Cortical surface form is represented by its Riemannian metric tensor and averaged according to parameters of a smooth averaged surface. Sulci are represented by integral trajectories of the smaller principal strains of this metric, and their statistics follow the statistics of that relative metric. The diagrams visualizing this tensor analysis look like alligator leather but summarize all aspects of cortical surface form in between the principal sulci, the reliable ones; no flattening is required.

  11. Incorporating principal component analysis into air quality model evaluation

    EPA Science Inventory

    The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Princi...

  12. Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.

    PubMed

    Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques make it possible i) to determine natural groups or clusters of control strategies with similar behaviour, ii) to find and interpret hidden, complex and causal relationships in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.

  13. Detecting subtle hydrochemical anomalies with multivariate statistics: an example from homogeneous groundwaters in the Great Artesian Basin, Australia

    NASA Astrophysics Data System (ADS)

    O'Shea, Bethany; Jankowski, Jerzy

    2006-12-01

    The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous in chemical composition. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identified by the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques for waters appearing homogeneous is emphasized for all investigations of this type.

  14. Principal Component Analysis in the Spectral Analysis of the Dynamic Laser Speckle Patterns

    NASA Astrophysics Data System (ADS)

    Ribeiro, K. M.; Braga, R. A., Jr.; Horgan, G. W.; Ferreira, D. D.; Safadi, T.

    2014-02-01

    Dynamic laser speckle is a phenomenon observed in the optical patterns formed when a changing surface is illuminated with coherent light; the dynamic change of the speckle patterns caused by biological material is known as biospeckle. Usually, these patterns of optical interference evolving in time are analyzed by graphical or numerical methods; analysis in the frequency domain has also been an option, but it involves large computational requirements, which demands new approaches to filtering the images in time. Principal component analysis (PCA) works with the statistical decorrelation of data and can be used as a data filter. In this context, the present work evaluated the PCA technique for filtering biospeckle image data in time, aiming to reduce computing time and improve the robustness of the filtering. Sixty-four biospeckle images observed over time in a maize seed were used. The images were arranged in a data matrix and statistically decorrelated by the PCA technique, and the reconstructed signals were analyzed using the routine graphical and numerical biospeckle methods. Results showed the potential of the PCA tool for filtering dynamic laser speckle data, with the definition of markers of principal components related to the biological phenomena and with the advantage of fast computational processing.
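
    A minimal sketch of PCA used as a temporal filter on an image stack, in the spirit of the biospeckle analysis above; the 64-frame stack here is synthetic noise, and the rank-k reconstruction via SVD is one standard way to implement the filtering:

    ```python
    # Keep a few principal components of a frame stack and reconstruct.
    import numpy as np

    rng = np.random.default_rng(4)
    n_frames, n_pixels = 64, 32 * 32
    frames = rng.standard_normal((n_frames, n_pixels))       # stand-in stack

    mean = frames.mean(axis=0)
    U, s, Vt = np.linalg.svd(frames - mean, full_matrices=False)

    k = 5                                                    # components kept
    filtered = mean + (U[:, :k] * s[:k]) @ Vt[:k]            # rank-k reconstruction
    print("retained variance:", (s[:k] ** 2).sum() / (s ** 2).sum())
    ```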

  15. Rotation of EOFs by the Independent Component Analysis: Towards A Solution of the Mixing Problem in the Decomposition of Geophysical Time Series

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2001-01-01

    The Independent Component Analysis is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has been used recently for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e. exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. This rotation uses no localization criterion like other Rotation Techniques (RT), only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution seems to be able to solve the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.

  16. PM10 and gaseous pollutants trends from air quality monitoring networks in Bari province: principal component analysis and absolute principal component scores on a two years and half data set

    PubMed Central

    2014-01-01

    Background: The chemical composition of aerosols and particle size distributions are the most significant factors affecting air quality. In particular, exposure to finer particles can cause short- and long-term effects on human health. In the present paper, PM10 (particulate matter with aerodynamic diameter lower than 10 μm), CO, NOx (NO and NO2), benzene and toluene trends monitored in six monitoring stations of Bari province are shown. The data set was composed of bi-hourly means for all parameters (12 bi-hourly means per day for each parameter) and refers to the period from January 2005 to May 2007. The main aim of the paper is to provide a clear illustration of how large data sets from monitoring stations can give information about the number and nature of the pollutant sources, and mainly to assess the contribution of the traffic source to the PM10 concentration level by using multivariate statistical techniques such as Principal Component Analysis (PCA) and Absolute Principal Component Scores (APCS). Results: Comparing the night and day mean concentrations for each parameter pointed out that CO, benzene and toluene behave differently between night and day, whereas PM10 does not. This suggests that CO, benzene and toluene concentrations are mainly connected with transport systems, whereas PM10 is mostly influenced by other factors. The statistical techniques identified three recurrent sources, associated with vehicular traffic and particulate transport, covering over 90% of variance. The contemporaneous analysis of gases and PM10 underlined the differences between the sources of these pollutants. Conclusions: The analysis of pollutant trends from large data sets and the application of multivariate statistical techniques such as PCA and APCS can give useful information about air quality and pollutant sources. This knowledge can provide useful advice for environmental policies in order to reach the WHO recommended levels. PMID:24555534

  17. Reservoir zonation based on statistical analyses: A case study of the Nubian sandstone, Gulf of Suez, Egypt

    NASA Astrophysics Data System (ADS)

    El Sharawy, Mohamed S.; Gaafar, Gamal R.

    2016-12-01

    Both reservoir engineers and petrophysicists have been concerned with dividing a reservoir into zones for engineering and petrophysical purposes. Over the decades, several techniques and approaches have been introduced. Of these, statistical reservoir zonation, the stratigraphic modified Lorenz (SML) plot, and principal component and clustering analyses were chosen and applied to the Nubian sandstone reservoir of Palaeozoic-Lower Cretaceous age, Gulf of Suez, Egypt, using five adjacent wells. The studied reservoir consists mainly of sandstone with some intercalations of shale layers whose thickness varies from one well to another. The permeability ranges from less than 1 md to more than 1000 md. The statistical reservoir zonation technique, based on core permeability, indicated that the cored interval of the studied reservoir can be divided into two zones. Using reservoir properties such as porosity, bulk density, acoustic impedance and interval transit time also indicated two zones, with an obvious variation in separation depth and zone continuity. The stratigraphic modified Lorenz (SML) plot indicated the presence of more than 9 flow units in the cored interval as well as a high degree of microscopic heterogeneity. On the other hand, principal component and cluster analyses, based on well logging data (gamma ray, sonic, density and neutron), indicated that the whole reservoir can be divided into at least four electrofacies with noticeable variation in reservoir quality, as correlated with the measured permeability. Furthermore, the continuity or discontinuity of the reservoir zones can be determined using this analysis.

  18. [Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].

    PubMed

    Golder, W

    1999-09-01

    To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basic and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basic, and 0.5% advanced statistics. The corresponding figures for 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and in articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors increasingly make use of statistical experts' assistance and statistical software. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.

  19. Improved Statistical Fault Detection Technique and Application to Biological Phenomena Modeled by S-Systems.

    PubMed

    Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N

    2017-09-01

    In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) with different window sizes. However, most real systems are nonlinear, and the linear PCA method cannot handle this nonlinearity to a great extent. Thus, in this paper, we first apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model, which is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), as this can further improve fault detection performance by reducing the FAR using an exponentially weighted moving average (EWMA). The developed detection method, called EWMA-GLRT, provides improved properties, such as smaller missed detection rates and FARs and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and applied to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean. The idea is to combine the advantages of the proposed EWMA-GLRT fault detection chart with the KPCA model. It is used to enhance fault detection in the Cad System in E. coli model by monitoring some of the key variables involved in this model, such as enzymes, transport proteins, regulatory proteins, lysine, and cadaverine. The results demonstrate the effectiveness of the proposed KPCA-based EWMA-GLRT method over the Q, GLRT, EWMA, Shewhart, and moving window-GLRT methods. The detection performance is assessed and evaluated in terms of FAR, missed detection rates, and average run length (ARL1) values.
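
    A hedged sketch of the EWMA ingredient of the method above, not the authors' full KPCA-based EWMA-GLRT: residuals are weighted in a decreasing exponential fashion so that recent samples dominate the monitoring statistic.

    ```python
    # EWMA chart on model residuals with a simulated fault onset.
    import numpy as np

    def ewma(residuals, lam):
        """z_t = lam * r_t + (1 - lam) * z_{t-1}, with z_0 = 0."""
        z = np.zeros(residuals.size)
        for t, r in enumerate(residuals):
            z[t] = lam * r + (1 - lam) * (z[t - 1] if t else 0.0)
        return z

    rng = np.random.default_rng(5)
    r = rng.standard_normal(300)        # residuals from some reference model
    r[200:] += 1.5                      # simulated fault from sample 200 on

    lam = 0.2
    z = ewma(r, lam)
    # Steady-state EWMA std is sqrt(lam / (2 - lam)) for unit-variance noise;
    # a 3.5-sigma limit keeps false alarms rare over this window length.
    limit = 3.5 * np.sqrt(lam / (2 - lam))
    print("first alarm at sample:", int(np.argmax(np.abs(z) > limit)))
    ```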

  20. [Analysis of the movement of long axis and the distribution of principal stress in abutment tooth retained by conical telescope].

    PubMed

    Lin, Ying-he; Man, Yi; Qu, Yi-li; Guan, Dong-hua; Lu, Xuan; Wei, Na

    2006-01-01

    To study the movement of the long axis and the distribution of principal stress in abutment teeth of a removable partial denture retained by a conical telescope. An idealized three-dimensional finite element model was constructed using SCT image reconstruction, self-developed programs and the ANSYS software. Static loads were applied, and the displacement along the long axis and the distribution of principal stress in the abutment teeth were analyzed. There was no statistically significant difference in displacement or stress distribution among the different three-dimensional finite element models. In general, the abutment teeth move along their own long axes. A similar stress distribution was observed in each three-dimensional finite element model; the maximal principal compressive stress was observed at the distal cervix of the second premolar. The abutment teeth can be well protected by the use of a conical telescope.

  1. Ripening-dependent metabolic changes in the volatiles of pineapple (Ananas comosus (L.) Merr.) fruit: II. Multivariate statistical profiling of pineapple aroma compounds based on comprehensive two-dimensional gas chromatography-mass spectrometry.

    PubMed

    Steingass, Christof Björn; Jutzi, Manfred; Müller, Jenny; Carle, Reinhold; Schmarr, Hans-Georg

    2015-03-01

    Ripening-dependent changes of pineapple volatiles were studied in a nontargeted profiling analysis. Volatiles were isolated via headspace solid phase microextraction and analyzed by comprehensive 2D gas chromatography and mass spectrometry (HS-SPME-GC×GC-qMS). Profile patterns presented in the contour plots were evaluated applying image processing techniques and subsequent multivariate statistical data analysis. Statistical methods comprised unsupervised hierarchical cluster analysis (HCA) and principal component analysis (PCA) to classify the samples. Supervised partial least squares discriminant analysis (PLS-DA) and partial least squares (PLS) regression were applied to discriminate different ripening stages and to describe the development of volatiles during postharvest storage, respectively. Hereby, substantial chemical markers allowing for class separation were revealed. The workflow permitted the rapid distinction between premature green-ripe pineapples and postharvest-ripened sea-freighted fruits. In a PCA with only two principal components, volatile profiles of fully ripe air-freighted pineapples were similar to those of green-ripe fruits postharvest-ripened for 6 days after simulated sea-freight export. However, a PCA also considering the third principal component allowed differentiation between air-freighted fruits and the four progressing postharvest maturity stages of sea-freighted pineapples.

  2. A PRINCIPAL COMPONENT ANALYSIS OF THE CLEAN AIR STATUS AND TRENDS NETWORK (CASTNET) AIR CONCENTRATION DATA

    EPA Science Inventory

    The spatial and temporal variability of ambient air concentrations of SO2, SO42-, NO3, HNO3, and NH4+ obtained from EPA's CASTNet was examined using an objective, statistically based technique...

  3. Source Evaluation and Trace Metal Contamination in Benthic Sediments from Equatorial Ecosystems Using Multivariate Statistical Techniques

    PubMed Central

    Benson, Nsikak U.; Asuquo, Francis E.; Williams, Akan B.; Essien, Joseph P.; Ekong, Cyril I.; Akpabio, Otobong; Olajire, Abaas A.

    2016-01-01

    Trace metal (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches including principal component analysis (PCA), cluster analysis and correlation tests were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk index by ICF showed significant potential mobility and bioavailability for Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in these ecosystems was influenced by multiple pollution sources. PMID:27257934

  4. Chemometric and multivariate statistical analysis of time-of-flight secondary ion mass spectrometry spectra from complex Cu-Fe sulfides.

    PubMed

    Kalegowda, Yogesh; Harmer, Sarah L

    2012-03-20

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) spectra of mineral samples are complex, comprised of large mass ranges and many peaks. Consequently, characterization and classification analysis of these systems is challenging. In this study, different chemometric and statistical data evaluation methods, based on monolayer sensitive TOF-SIMS data, have been tested for the characterization and classification of copper-iron sulfide minerals (chalcopyrite, chalcocite, bornite, and pyrite) at different flotation pulp conditions (feed, conditioned feed, and Eh modified). The complex mass spectral data sets were analyzed using the following chemometric and statistical techniques: principal component analysis (PCA); principal component-discriminant functional analysis (PC-DFA); soft independent modeling of class analogy (SIMCA); and k-Nearest Neighbor (k-NN) classification. PCA was found to be an important first step in multivariate analysis, providing insight into both the relative grouping of samples and the elemental/molecular basis for those groupings. For samples exposed to oxidative conditions (at Eh ~430 mV), each technique (PCA, PC-DFA, SIMCA, and k-NN) was found to produce excellent classification. For samples at reductive conditions (at Eh ~ -200 mV SHE), k-NN and SIMCA produced the most accurate classification. Phase identification of particles that contain the same elements but a different crystal structure in a mixed multimetal mineral system has been achieved.
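
    A sketch of one of the classification chains named above, PCA scores fed to k-NN, assuming synthetic spectra and scikit-learn (SIMCA and PC-DFA are omitted):

    ```python
    # PCA dimensionality reduction followed by k-NN classification.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(6)
    X = rng.standard_normal((80, 200))            # 80 spectra, 200 m/z bins
    y = np.repeat([0, 1, 2, 3], 20)               # four mineral classes
    for c in range(4):
        X[y == c, c * 10:(c + 1) * 10] += 1.0     # class-specific peaks

    clf = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=3))
    print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(2))
    ```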

  5. Evidence of tampering in watermark identification

    NASA Astrophysics Data System (ADS)

    McLauchlan, Lifford; Mehrübeoglu, Mehrübe

    2009-08-01

    In this work, watermarks are embedded in digital images in the discrete wavelet transform (DWT) domain. Principal component analysis (PCA) is performed on the DWT coefficients. Next, higher-order statistics based on the principal components and the eigenvalues are determined for different sets of images. Feature sets are analyzed for different types of attacks in m-dimensional space. The results demonstrate the separability of the features for the tampered digital copies. Different feature sets are studied to determine more effective tamper-evident feature sets. In digital forensics, the probable manipulation(s) or modification(s) performed on the digital information can be identified using the described technique.
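
    A hedged sketch of the feature pipeline described above (DWT coefficients, principal components, higher-order statistics), assuming PyWavelets and synthetic images rather than real watermarked data:

    ```python
    # DWT -> PCA -> skew/kurtosis features that shift under tampering.
    import numpy as np
    import pywt
    from scipy.stats import kurtosis, skew
    from sklearn.decomposition import PCA

    def features(img):
        cA, (cH, cV, cD) = pywt.dwt2(img, "haar")
        coeffs = np.stack([c.ravel() for c in (cA, cH, cV, cD)], axis=1)
        pcs = PCA(n_components=2).fit_transform(coeffs)
        return np.r_[skew(pcs, axis=0), kurtosis(pcs, axis=0)]

    rng = np.random.default_rng(12)
    img = rng.random((64, 64))
    tampered = img.copy()
    tampered[20:30, 20:30] = 1.0            # simulated local manipulation
    print(np.round(features(img), 3), np.round(features(tampered), 3))
    ```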

  6. Hard, harder, hardest: principal stratification, statistical identifiability, and the inherent difficulty of finding surrogate endpoints.

    PubMed

    Wolfson, Julian; Henn, Lisa

    2014-01-01

    In many areas of clinical investigation there is great interest in identifying and validating surrogate endpoints, biomarkers that can be measured a relatively short time after a treatment has been administered and that can reliably predict the effect of treatment on the clinical outcome of interest. However, despite dramatic advances in the ability to measure biomarkers, the recent history of clinical research is littered with failed surrogates. In this paper, we present a statistical perspective on why identifying surrogate endpoints is so difficult. We view the problem from the framework of causal inference, with a particular focus on the technique of principal stratification (PS), an approach which is appealing because the resulting estimands are not biased by unmeasured confounding. In many settings, PS estimands are not statistically identifiable and their degree of non-identifiability can be thought of as representing the statistical difficulty of assessing the surrogate value of a biomarker. In this work, we examine the identifiability issue and present key simplifying assumptions and enhanced study designs that enable the partial or full identification of PS estimands. We also present example situations where these assumptions and designs may or may not be feasible, providing insight into the problem characteristics which make the statistical evaluation of surrogate endpoints so challenging.

  7. Time-oriented hierarchical method for computation of principal components using subspace learning algorithm.

    PubMed

    Jankovic, Marko; Ogawa, Hidemitsu

    2004-10-01

    Principal Component Analysis (PCA) and Principal Subspace Analysis (PSA) are classic techniques in statistical data analysis, feature extraction and data compression. Given a set of multivariate measurements, PCA and PSA provide a smaller set of "basis vectors" with less redundancy, and a subspace spanned by them, respectively. Artificial neurons and neural networks have been shown to perform PSA and PCA when gradient ascent (descent) learning rules are used, which is related to the constrained maximization (minimization) of statistical objective functions. Due to their low complexity, such algorithms and their implementation in neural networks are potentially useful for tracking slow changes of correlations in the input data or for updating eigenvectors with new samples. In this paper we propose a PCA learning algorithm that is fully homogeneous with respect to neurons. The algorithm is obtained by modification of one of the best-known PSA learning algorithms, the Subspace Learning Algorithm (SLA). The modification is based on the Time-Oriented Hierarchical Method (TOHM), which uses two distinct time scales. On a faster time scale, the PSA algorithm is responsible for the "behavior" of all output neurons; on a slower scale, output neurons compete to fulfill their "own interests", and basis vectors in the principal subspace are rotated toward the principal eigenvectors. The paper concludes with a brief analysis of how (and why) the time-oriented hierarchical method can be used to transform any existing neural network PSA method into a PCA method.
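
    For concreteness, a sketch of the classical symmetric subspace rule that SLA is built on (Oja's rule, W <- W + eta * (x y^T - W y y^T) with y = W^T x); this illustrates plain SLA behavior, not the authors' TOHM modification:

    ```python
    # Hebbian subspace learning: W converges to a basis of the top-k subspace.
    import numpy as np

    rng = np.random.default_rng(7)
    C = np.diag([5.0, 2.0, 1.0, 0.5])                 # data covariance
    X = rng.multivariate_normal(np.zeros(4), C, size=5000)

    W = rng.standard_normal((4, 2)) * 0.1             # 4 inputs -> 2 neurons
    eta = 0.01
    for x in X:
        y = W.T @ x
        W += eta * (np.outer(x, y) - W @ np.outer(y, y))

    # Columns of W should span the top-2 eigenvector subspace (e1, e2 here).
    print(np.round(W, 2))
    ```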

  8. Comparative forensic soil analysis of New Jersey state parks using a combination of simple techniques with multivariate statistics.

    PubMed

    Bonetti, Jennifer; Quarino, Lawrence

    2014-05-01

    This study has shown that the combination of simple techniques with the use of multivariate statistics offers the potential for the comparative analysis of soil samples. Five samples were obtained from each of twelve state parks across New Jersey in both the summer and fall seasons. Each sample was examined using particle-size distribution, pH analysis in both water and 1 M CaCl2 , and a loss on ignition technique. Data from each of the techniques were combined, and principal component analysis (PCA) and canonical discriminant analysis (CDA) were used for multivariate data transformation. Samples from different locations could be visually differentiated from one another using these multivariate plots. Hold-one-out cross-validation analysis showed error rates as low as 3.33%. Ten blind study samples were analyzed resulting in no misclassifications using Mahalanobis distance calculations and visual examinations of multivariate plots. Seasonal variation was minimal between corresponding samples, suggesting potential success in forensic applications.

  9. Regional Morphology Analysis Package (RMAP): Empirical Orthogonal Function Analysis, Background and Examples

    DTIC Science & Technology

    2007-10-01

    Indexed excerpt (reference-list fragments, not an abstract): 1984. Complex principal component analysis: Theory and examples. Journal of Climate and Applied Meteorology 23: 1660-1673. Hotelling, H. 1933. … Sediments 99. ASCE: 2,566-2,581. Von Storch, H., and A. Navarra. 1995. Analysis of climate variability: Applications of statistical techniques. Berlin. … ERDC TN-SWWRP-07-9, October 2007. Regional Morphology Analysis Package (RMAP): Empirical Orthogonal Function Analysis, Background and Examples.

  10. Principal Curves and Surfaces

    DTIC Science & Technology

    1984-11-01

    Indexed excerpt (OCR fragments): … The subspace is found by using the usual linear eigenvector solution in the new enlarged space. This technique was first suggested by Gnanadesikan and Wilk (1966, 1968), and a good description can be found in Gnanadesikan (1977). They suggested using polynomial functions of the original p co… Heidelberg: Springer Verlag. Gnanadesikan, R. (1977), Methods for Statistical Data Analysis of Multivariate Observations, Wiley, New York.

  11. Detecting transitions in protein dynamics using a recurrence quantification analysis based bootstrap method.

    PubMed

    Karain, Wael I

    2017-11-28

    Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques, such as principal component analysis, are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that has the ability to detect transitions in dynamics based on the recurrences in the dynamical system. It combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis-based bootstrap technique is able to detect transitions between different dynamics states for a protein over different time scales. It is not limited to linear dynamics regimes and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need for large enough time windows to ensure good statistical quality of the recurrence complexity measures needed to detect the transitions.
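
    A sketch of the recurrence idea underlying the method, assuming a toy one-dimensional signal and a recurrence-rate measure in sliding windows; the authors' bootstrap significance test is omitted:

    ```python
    # Recurrence matrix R_ij = [|x_i - x_j| < eps]; track recurrence rate per window.
    import numpy as np

    rng = np.random.default_rng(11)
    x = np.concatenate([np.sin(0.2 * np.arange(500)),             # state A
                        np.sin(0.2 * np.arange(500)) ** 3 + 0.5]) # state B
    x += 0.05 * rng.standard_normal(x.size)

    eps, win = 0.3, 100
    rates = []
    for start in range(0, x.size - win, win):
        seg = x[start:start + win]
        R = np.abs(seg[:, None] - seg[None, :]) < eps
        rates.append(R.mean())
    print(np.round(rates, 2))   # a jump in recurrence rate marks the transition
    ```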

  12. Reconstruction of spatio-temporal temperature from sparse historical records using robust probabilistic principal component regression

    USGS Publications Warehouse

    Tipton, John; Hooten, Mevin B.; Goring, Simon

    2017-01-01

    Scientific records of temperature and precipitation have been kept for several hundred years, but for many areas, only a shorter record exists. To understand climate change, there is a need for rigorous statistical reconstructions of the paleoclimate using proxy data. Paleoclimate proxy data are often sparse, noisy, indirect measurements of the climate process of interest, making each proxy uniquely challenging to model statistically. We reconstruct spatially explicit temperature surfaces from sparse and noisy measurements recorded at historical United States military forts and other observer stations from 1820 to 1894. One common method for reconstructing the paleoclimate from proxy data is principal component regression (PCR). With PCR, one learns a statistical relationship between the paleoclimate proxy data and a set of climate observations that are used as patterns for potential reconstruction scenarios. We explore PCR in a Bayesian hierarchical framework, extending classical PCR in a variety of ways. First, we model the latent principal components probabilistically, accounting for measurement error in the observational data. Next, we extend our method to better accommodate outliers that occur in the proxy data. Finally, we explore alternatives to the truncation of lower-order principal components using different regularization techniques. One fundamental challenge in paleoclimate reconstruction efforts is the lack of out-of-sample data for predictive validation. Cross-validation is of potential value, but is computationally expensive and potentially sensitive to outliers in sparse data scenarios. To overcome the limitations that a lack of out-of-sample records presents, we test our methods using a simulation study, applying proper scoring rules including a computationally efficient approximation to leave-one-out cross-validation using the log score to validate model performance. The result of our analysis is a spatially explicit reconstruction of spatio-temporal temperature from a very sparse historical record.
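
    Classical PCR, the point of departure for the Bayesian extension described above, can be sketched as follows (synthetic latent "climate patterns" and scikit-learn; not the authors' hierarchical model):

    ```python
    # Principal component regression: PCA scores as regressors.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(8)
    Z = rng.standard_normal((150, 3))              # latent climate patterns
    loadings = rng.standard_normal((3, 30))
    X = Z @ loadings + 0.1 * rng.standard_normal((150, 30))  # proxy-like data
    y = Z @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.standard_normal(150)

    pcr = make_pipeline(PCA(n_components=3), LinearRegression()).fit(X, y)
    print("R^2:", round(pcr.score(X, y), 3))
    ```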

  13. Cross-visit tumor sub-segmentation and registration with outlier rejection for dynamic contrast-enhanced MRI time series data.

    PubMed

    Buonaccorsi, G A; Rose, C J; O'Connor, J P B; Roberts, C; Watson, Y; Jackson, A; Jayson, G C; Parker, G J M

    2010-01-01

    Clinical trials of anti-angiogenic and vascular-disrupting agents often use biomarkers derived from DCE-MRI, typically reporting whole-tumor summary statistics and so overlooking spatial parameter variations caused by tissue heterogeneity. We present a data-driven segmentation method comprising tracer-kinetic model-driven registration for motion correction, conversion from MR signal intensity to contrast agent concentration for cross-visit normalization, iterative principal components analysis for imputation of missing data and dimensionality reduction, and statistical outlier detection using the minimum covariance determinant to obtain a robust Mahalanobis distance. After applying these techniques we cluster in the principal components space using k-means. We present results from a clinical trial of a VEGF inhibitor, using time-series data selected because of problems due to motion and outlier time series. We obtained spatially-contiguous clusters that map to regions with distinct microvascular characteristics. This methodology has the potential to uncover localized effects in trials using DCE-MRI-based biomarkers.
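
    A sketch of the outlier-rejection and clustering steps described above, assuming synthetic voxel time series and scikit-learn's MinCovDet for the robust Mahalanobis distance:

    ```python
    # Robust Mahalanobis distances (MCD), outlier rejection, then k-means in PC space.
    import numpy as np
    from sklearn.covariance import MinCovDet
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(9)
    X = rng.standard_normal((500, 20))
    X[:10] += 6.0                                   # a few outlier time series

    d2 = MinCovDet(random_state=0).fit(X).mahalanobis(X)  # squared distances
    keep = d2 < np.quantile(d2, 0.95)               # reject the extreme 5%

    scores = PCA(n_components=3).fit_transform(X[keep])
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
    print("kept:", keep.sum(), "cluster sizes:", np.bincount(labels))
    ```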

  14. Detecting the most influential courses on students' grades using block PCA

    NASA Astrophysics Data System (ADS)

    Othman, Osama H.; Gebril, Rami Salah

    2014-12-01

    One of the modern solutions for dealing with large numbers of variables in statistical analyses is Block Principal Component Analysis (Block PCA). This modified technique can be used to reduce the vertical dimension (variables) of the data matrix Xn×p by selecting a smaller number of variables (say m) containing most of the statistical information. These selected variables can then be employed in further investigations and analyses. Block PCA is an adapted multistage version of the original PCA that involves cluster analysis (CA) and variable selection through sub-principal component scores (PCs). The application of Block PCA in this paper is a modified version of the original work of Liu et al. (2002). The main objective was to apply PCA to each group of variables (established using cluster analysis) instead of to the whole large set of variables, which was shown to be unreliable. In this work, Block PCA is used to reduce the size of a large data matrix ((n = 41) × (p = 251)) consisting of the Grade Point Averages (GPA) of students in 251 courses (variables) in the Faculty of Science at Benghazi University; in other words, a smaller analytical data matrix of student GPAs is constructed with fewer variables containing most of the variation (statistical information) in the original database. By applying Block PCA, 12 courses were found to absorb most of the variation or influence in the original data matrix, and hence are worth keeping for future statistical exploration and analysis. In addition, the course Independent Study (Math.) was found to be the most influential of the 12 selected courses on students' GPA.
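
    A minimal sketch of the Block PCA idea, under assumed details (correlation-based variable clustering, one representative variable per block); the data are random stand-ins for the 41 × 251 GPA matrix:

    ```python
    # Cluster variables into blocks, run PCA per block, keep one representative.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(10)
    X = rng.standard_normal((41, 24))            # stand-in for students x courses

    # Group variables by correlation distance, then run PCA inside each block.
    corr_dist = 1 - np.abs(np.corrcoef(X, rowvar=False))
    Z = linkage(corr_dist[np.triu_indices(24, k=1)], method="average")
    blocks = fcluster(Z, t=4, criterion="maxclust")

    selected = []
    for b in np.unique(blocks):
        idx = np.where(blocks == b)[0]
        pc1 = PCA(n_components=1).fit(X[:, idx])
        selected.append(int(idx[np.argmax(np.abs(pc1.components_[0]))]))
    print("representative variables:", selected)
    ```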

  15. Histogram of gradient and binarized statistical image features of wavelet subband-based palmprint features extraction

    NASA Astrophysics Data System (ADS)

    Attallah, Bilal; Serir, Amina; Chahir, Youssef; Boudjelal, Abdelwahhab

    2017-11-01

    Palmprint recognition systems depend on feature extraction. A feature extraction method using higher discrimination information was developed to characterize palmprint images. In this method, two individual feature extraction techniques are applied to a discrete wavelet transform of a palmprint image, and their outputs are fused. The two techniques used in the fusion are the histogram of gradients and binarized statistical image features. The fused features are reduced by feature selection based on principal component analysis and then evaluated using an extreme learning machine classifier. Three palmprint databases, the Hong Kong Polytechnic University (PolyU) Multispectral Palmprint Database, the Hong Kong PolyU Palmprint Database II, and the Delhi Touchless (IIDT) Palmprint Database, are used in this study. The study shows that our method effectively identifies and verifies palmprints and outperforms other feature extraction methods.

  16. Independent component analysis for automatic note extraction from musical trills

    NASA Astrophysics Data System (ADS)

    Brown, Judith C.; Smaragdis, Paris

    2004-05-01

    The method of principal component analysis, which is based on second-order statistics (or linear independence), has long been used for redundancy reduction of audio data. The more recent technique of independent component analysis, enforcing much stricter statistical criteria based on higher-order statistical independence, is introduced and shown to be far superior in separating independent musical sources. This theory has been applied to piano trills and a database of trill rates was assembled from experiments with a computer-driven piano, recordings of a professional pianist, and commercially available compact disks. The method of independent component analysis has thus been shown to be an outstanding, effective means of automatically extracting interesting musical information from a sea of redundant data.

  17. Classification Techniques for Multivariate Data Analysis.

    DTIC Science & Technology

    1980-03-28

    Indexed excerpt (OCR fragments): … analysis among biologists, botanists, and ecologists, while some social scientists may prefer "typology". Other frequently encountered terms are pattern … the determinantal equation |B − λW| = 0 (42). The solutions λ_i are the eigenvalues of the matrix W⁻¹B, as in discriminant analysis. There are t non-… The Statistical Package for the Social Sciences (SPSS) (14) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor …
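
    The determinantal equation |B − λW| = 0 in the excerpt is a generalized symmetric eigenproblem; a worked example with small made-up scatter matrices, using SciPy:

    ```python
    # Solve |B - lambda*W| = 0, i.e. B v = lambda W v, as in discriminant analysis.
    import numpy as np
    from scipy.linalg import eigh

    B = np.array([[4.0, 1.0], [1.0, 2.0]])   # between-groups scatter (made up)
    W = np.array([[2.0, 0.5], [0.5, 1.0]])   # within-groups scatter (made up)

    lam, V = eigh(B, W)                      # generalized eigenvalues/vectors
    print("eigenvalues:", np.round(lam, 3))
    # Same values (possibly reordered) as the eigenvalues of W^-1 B:
    print("check:", np.round(sorted(np.linalg.eigvals(np.linalg.inv(W) @ B).real), 3))
    ```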

  18. Quality improvement of diagnosis of the electromyography data based on statistical characteristics of the measured signals

    NASA Astrophysics Data System (ADS)

    Selivanova, Karina G.; Avrunin, Oleg G.; Zlepko, Sergii M.; Romanyuk, Sergii O.; Zabolotna, Natalia I.; Kotyra, Andrzej; Komada, Paweł; Smailova, Saule

    2016-09-01

    The research and systematization of motor disorders, taking into account clinical and neurophysiological phenomena, is an important and topical problem in neurology. The article describes a technique for decomposing surface electromyography (EMG) signals using Principal Component Analysis; the decomposition is achieved by a set of algorithms developed specifically for EMG analysis. The accuracy was verified by calculating the Mahalanobis distance and the probability of error.

  19. Classifying Facial Actions

    PubMed Central

    Donato, Gianluca; Bartlett, Marian Stewart; Hager, Joseph C.; Ekman, Paul; Sejnowski, Terrence J.

    2010-01-01

    The Facial Action Coding System (FACS) [23] is an objective method for quantifying facial movement in terms of component actions. This system is widely used in behavioral investigations of emotion, cognitive processes, and social interaction. The coding is presently performed by highly trained human experts. This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. These techniques include analysis of facial motion through estimation of optical flow; holistic spatial analysis, such as principal component analysis, independent component analysis, local feature analysis, and linear discriminant analysis; and methods based on the outputs of local filters, such as Gabor wavelet representations and local principal components. Performance of these systems is compared to naive and expert human subjects. Best performances were obtained using the Gabor wavelet representation and the independent component representation, both of which achieved 96 percent accuracy for classifying 12 facial actions of the upper and lower face. The results provide converging evidence for the importance of using local filters, high spatial frequencies, and statistical independence for classifying facial actions. PMID:21188284

  1. Metabolomic evaluation of ginsenosides distribution in Panax genus (Panax ginseng and Panax quinquefolius) using multivariate statistical analysis.

    PubMed

    Pace, Roberto; Martinelli, Ernesto Marco; Sardone, Nicola; D E Combarieu, Eric

    2015-03-01

    Ginseng is any one of eleven species belonging to the genus Panax of the family Araliaceae, found in North America and eastern Asia, and is characterized by the presence of ginsenosides. Panax ginseng and Panax quinquefolius, principally, are adaptogenic herbs commonly distributed in health food markets. In the present study, high performance liquid chromatography has been used to identify and quantify ginsenosides in the two subject species and in different parts of the plant (roots, neck, leaves, flowers, fruits). The power of this chromatographic technique to evaluate the identity of botanical material and to distinguish different parts of the plant has been investigated with metabolomic techniques such as principal component analysis. Metabolomics provides a good opportunity for mining useful chemical information from the chromatographic data set, resulting in an important tool for quality evaluation of medicinal plants with respect to authenticity, consistency and efficacy.

  2. Groundwater quality assessment of urban Bengaluru using multivariate statistical techniques

    NASA Astrophysics Data System (ADS)

    Gulgundi, Mohammad Shahid; Shetty, Amba

    2018-03-01

    Groundwater quality deterioration due to anthropogenic activities has become a subject of prime concern. The objective of the study was to assess the spatial and temporal variations in groundwater quality and to identify the sources of contamination in the western half of Bengaluru city using multivariate statistical techniques. A water quality index rating was calculated for the pre- and post-monsoon seasons to quantify overall water quality for human consumption. The post-monsoon samples show poorer quality for drinking purposes than the pre-monsoon samples. Cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA) were applied to the groundwater quality data measured on 14 parameters from 67 sites distributed across the city. Hierarchical cluster analysis (CA) grouped the 67 sampling stations into two groups, cluster 1 with higher pollution and cluster 2 with lower pollution. Discriminant analysis (DA) was applied to delineate the most meaningful parameters accounting for temporal and spatial variations in groundwater quality of the study area. Temporal DA identified pH as the most important parameter, which discriminates between water quality in the pre-monsoon and post-monsoon seasons and accounts for 72% seasonal assignation of cases. Spatial DA identified Mg, Cl and NO3 as the three most important parameters discriminating between the two clusters and accounting for 89% spatial assignation of cases. Principal component analysis was applied to the data sets obtained from the two clusters, which yielded three factors in each cluster, explaining 85.4 and 84% of the total variance, respectively. Varifactors obtained from principal component analysis showed that groundwater quality variation is mainly explained by dissolution of minerals from rock-water interactions in the aquifer, the effect of anthropogenic activities and ion exchange processes in the water.

  3. Combination of complementary data mining methods for geographical characterization of extra virgin olive oils based on mineral composition.

    PubMed

    Sayago, Ana; González-Domínguez, Raúl; Beltrán, Rafael; Fernández-Recamales, Ángeles

    2018-09-30

    This work explores the potential of multi-element fingerprinting in combination with advanced data mining strategies to assess the geographical origin of extra virgin olive oil samples. For this purpose, the concentrations of 55 elements were determined in 125 oil samples from multiple Spanish geographic areas. Several unsupervised and supervised multivariate statistical techniques were used to build classification models and investigate the relationship between mineral composition of olive oils and their provenance. Results showed that Spanish extra virgin olive oils exhibit characteristic element profiles, which can be differentiated on the basis of their origin in accordance with three geographical areas: Atlantic coast (Huelva province), Mediterranean coast and inland regions. Furthermore, statistical modelling yielded high sensitivity and specificity, principally when random forest and support vector machines were employed, thus demonstrating the utility of these techniques in food traceability and authenticity research.

  4. Automatic detection of slight parameter changes associated to complex biomedical signals using multiresolution q-entropy.

    PubMed

    Torres, M E; Añino, M M; Schlotthauer, G

    2003-12-01

    It is well known that, from a dynamical point of view, sudden variations in the physiological parameters which govern certain diseases can cause qualitative changes in the dynamics of the corresponding physiological process. The purpose of this paper is to introduce a technique that allows the automated temporal localization of slight changes in a parameter of the law that governs the nonlinear dynamics of a given signal. This tool exploits the ability of multiresolution entropies to reveal these changes as statistical variations at each scale; these variations are captured in the corresponding principal component. By appropriately combining these techniques with a statistical change detector, a complexity-change detection algorithm is obtained. The relevance of the approach, together with its robustness in the presence of moderate noise, is discussed via numerical simulations, and the automatic detector is applied to real and simulated biological signals.

  5. Arsenic distribution and valence state variation studied by fast hierarchical length-scale morphological, compositional, and speciation imaging at the Nanoscopium, Synchrotron Soleil

    NASA Astrophysics Data System (ADS)

    Somogyi, Andrea; Medjoubi, Kadda; Sancho-Tomas, Maria; Visscher, P. T.; Baranton, Gil; Philippot, Pascal

    2017-09-01

    The understanding of real complex geological, environmental and geo-biological processes depends increasingly on in-depth, non-invasive study of chemical composition and morphology. In this paper we used scanning hard X-ray nanoprobe techniques to study the elemental composition, morphology and As speciation in complex, highly heterogeneous geological samples. Multivariate statistical techniques, such as principal component analysis and clustering, were used for data interpretation. These measurements revealed the quantitative and valence-state inhomogeneity of As and its relation to the total compositional and morphological variation of the sample at sub-μm scales.

  6. Application of Principal Component Analysis to NIR Spectra of Phyllosilicates: A Technique for Identifying Phyllosilicates on Mars

    NASA Technical Reports Server (NTRS)

    Rampe, E. B.; Lanza, N. L.

    2012-01-01

    Orbital near-infrared (NIR) reflectance spectra of the martian surface from the OMEGA and CRISM instruments have identified a variety of phyllosilicates in Noachian terrains. The types of phyllosilicates present on Mars have important implications for the aqueous environments in which they formed, and, thus, for recognizing locales that may have been habitable. Current identifications of phyllosilicates from martian NIR data are based on the positions of spectral absorptions relative to laboratory data of well-characterized samples and on spectral ratios; however, some phyllosilicates can be difficult to distinguish from one another with these methods (e.g., illite vs. muscovite). Here we employ a multivariate statistical technique, principal component analysis (PCA), to differentiate between spectrally similar phyllosilicate minerals. PCA is commonly used in a variety of industries (pharmaceutical, agricultural, viticultural) to discriminate between samples. Previous work using PCA to analyze raw NIR reflectance data from mineral mixtures has shown that it is a viable technique for identifying mineral types, abundances, and particle sizes. Here, we evaluate PCA of second-derivative NIR reflectance data as a method for classifying phyllosilicates and test whether this method can be used to identify phyllosilicates on Mars.
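
    A minimal sketch of the preprocessing being evaluated: a Savitzky-Golay filter computes the second derivative of each spectrum, and PCA is then applied to the derivative spectra. The wavelength grid and spectra below are synthetic placeholders.

    ```python
    # Minimal sketch: second-derivative spectra via Savitzky-Golay smoothing,
    # followed by PCA on the derivatives. Data are synthetic.
    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(3)
    wavelengths = np.linspace(1.0, 2.5, 400)    # microns, illustrative grid
    spectra = rng.normal(size=(30, wavelengths.size)).cumsum(axis=1)  # 30 spectra

    # Smooth and differentiate in one pass (window and order are assumptions)
    d2 = savgol_filter(spectra, window_length=15, polyorder=3, deriv=2, axis=1)

    scores = PCA(n_components=3).fit_transform(d2)
    print(scores.shape)  # (30, 3): coordinates used to separate mineral classes
    ```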

  7. Statistical interpretation of chromatic indicators in correlation to phytochemical profile of a sulfur dioxide-free mulberry (Morus nigra) wine submitted to non-thermal maturation processes.

    PubMed

    Tchabo, William; Ma, Yongkun; Kwaw, Emmanuel; Zhang, Haining; Xiao, Lulu; Apaliya, Maurice T

    2018-01-15

    The four different methods of color measurement of wine proposed by Boulton, Giusti, Glories and the Commission Internationale de l'Eclairage (CIE) were applied to assess the statistical relationship between the phytochemical profile and chromatic characteristics of sulfur dioxide-free mulberry (Morus nigra) wine submitted to non-thermal maturation processes. The alterations in chromatic properties and phenolic composition of non-thermally aged mulberry wine were examined, aided by the use of Pearson correlation, cluster and principal component analysis. The results revealed a positive effect of the non-thermal processes on the phytochemical families of the wines. Pearson correlation analysis established relationships between chromatic indexes and flavonols as well as anthocyanins. Cluster analysis highlighted similarities between the Boulton and Giusti parameters, as well as the Glories and CIE parameters, in the assessment of the chromatic properties of the wines. Finally, principal component analysis was able to discriminate wines subjected to different maturation techniques on the basis of their chromatic and phenolic characteristics. Copyright © 2017. Published by Elsevier Ltd.
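
    A minimal sketch of the correlation step, with synthetic stand-ins for one chromatic index and one phenolic family:

    ```python
    # Minimal sketch: Pearson r between a chromatic index and a phenolic
    # family across wine samples. Variable names and data are illustrative.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(13)
    color_intensity = rng.normal(10, 2, size=30)                    # e.g. a Glories index
    anthocyanins = 5 * color_intensity + rng.normal(0, 4, size=30)  # mg/L, synthetic

    r, p = pearsonr(color_intensity, anthocyanins)
    print(f"r = {r:.2f}, p = {p:.3g}")
    ```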

  8. Statistical techniques applied to aerial radiometric surveys (STAARS): principal components analysis user's manual. [NURE program]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.

    1981-01-01

    A Principal Components Analysis (PCA) program has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained.

  9. Measurement of surface microtopography

    NASA Technical Reports Server (NTRS)

    Wall, S. D.; Farr, T. G.; Muller, J.-P.; Lewis, P.; Leberl, F. W.

    1991-01-01

    Acquisition of ground truth data for use in microwave interaction modeling requires measurement of surface roughness sampled at intervals comparable to a fraction of the microwave wavelength and extensive enough to adequately represent the statistics of a surface unit. Sub-centimetric measurement accuracy is thus required over large areas, and existing techniques are usually inadequate. A technique is discussed for acquiring the necessary photogrammetric data using twin film cameras mounted on a helicopter. In an attempt to eliminate tedious data reduction, an automated technique was applied to the helicopter photographs, and results were compared to those produced by conventional stereogrammetry. Derived root-mean-square (RMS) roughness for the same stereo-pair was 7.5 cm for the automated technique versus 6.5 cm for the manual method. The principal source of error is probably vegetation in the scene, which affects the automated technique but is ignored by a human operator.

  10. Influence of bicortical techniques in internal connection placed in premaxillary area by 3D finite element analysis.

    PubMed

    Verri, Fellippo Ramos; Cruz, Ronaldo Silva; Lemos, Cleidiel Aparecido Araújo; de Souza Batista, Victor Eduardo; Almeida, Daniel Augusto Faria; Verri, Ana Caroline Gonçales; Pellizzer, Eduardo Piza

    2017-02-01

    The aim of this study was to evaluate the stress distribution in implant-supported prostheses and peri-implant bone using internal hexagon (IH) implants in the premaxillary area, varying surgical techniques (conventional, bicortical and bicortical in association with nasal floor elevation) and loading directions (0°, 30° and 60°) by three-dimensional (3D) finite element analysis. Three models were designed with Invesalius, Rhinoceros 3D and Solidworks software. Each model contained a bone block of the premaxillary area including an implant (IH, Ø4 × 10 mm) supporting a metal-ceramic crown. A load of 178 N was applied at different inclinations (0°, 30°, 60°). The results were analyzed with von Mises, maximum principal stress, microstrain and displacement maps, including the ANOVA statistical test for some situations. Von Mises maps of the implant, screws and abutment showed increasing stress concentration as loading inclination increased. Bicortical techniques showed reduced stress in the implant apical area and in the heads of the fixation screws. Bicortical techniques showed slightly increased stress in cortical bone in the maximum principal stress and microstrain maps under 60° loading. No differences in bone tissue regarding surgical techniques were observed. In conclusion, non-axial loads increased stress concentration in all maps. Bicortical techniques showed lower stress for the implant and screws; however, there was slightly higher stress on cortical bone only under loads of higher inclination (60°).

  11. Performance evaluation of a hybrid-passive landfill leachate treatment system using multivariate statistical techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wallace, Jack, E-mail: jack.wallace@ce.queensu.ca; Champagne, Pascale, E-mail: champagne@civil.queensu.ca; Monnier, Anne-Charlotte, E-mail: anne-charlotte.monnier@insa-lyon.fr

    Highlights: • Performance of a hybrid passive landfill leachate treatment system was evaluated. • 33 water chemistry parameters were sampled for 21 months and statistically analyzed. • Parameters were strongly linked and explained most (>40%) of the variation in data. • Alkalinity, ammonia, COD, heavy metals, and iron were criteria for performance. • Eight other parameters were key in modeling system dynamics and criteria. - Abstract: A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), “heavy” metals of interest, with atomic weights above calcium, and iron) were set as criteria for the evaluation of system performance based on their toxicity to aquatic ecosystems and importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling the five criteria parameters (set as dependent variables), on a statistically significant level: conductivity, dissolved oxygen (DO), nitrite (NO2−), organic nitrogen (N), oxidation reduction potential (ORP), pH, sulfate and total volatile solids (TVS). The criteria parameters and the significant explanatory parameters were most important in modeling the dynamics of the passive treatment system during the study period. Such techniques and procedures were found to be highly valuable and could be applied to other sites to determine parameters of interest in similar naturalized engineered systems.
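
    A minimal sketch of a PLS-style regression of one criterion parameter on the eight explanatory parameters named above, with synthetic data standing in for the 21-month monitoring record:

    ```python
    # Minimal sketch: PLS regression of a criterion parameter (e.g. COD) on
    # the explanatory water-quality parameters. Data are synthetic.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    explanatory = ["conductivity", "DO", "NO2", "organic N",
                   "ORP", "pH", "sulfate", "TVS"]
    X = rng.normal(size=(90, len(explanatory)))                  # monitoring samples
    cod = X @ rng.normal(size=len(explanatory)) + rng.normal(scale=0.5, size=90)

    pls = PLSRegression(n_components=3)
    print("CV R^2:", cross_val_score(pls, X, cod, cv=5).mean())

    pls.fit(X, cod)
    for name, c in zip(explanatory, pls.coef_.ravel()):
        print(f"{name}: {c:+.2f}")   # sign and size of each parameter's effect
    ```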

  12. Improving GEFS Weather Forecasts for Indian Monsoon with Statistical Downscaling

    NASA Astrophysics Data System (ADS)

    Agrawal, Ankita; Salvi, Kaustubh; Ghosh, Subimal

    2014-05-01

    Weather forecasting has always been a challenging research problem, yet one of paramount importance, as it serves as a key input in formulating plans for the immediate future. Short range rainfall forecasts influence a wide range of entities, from the agricultural industry to the common man. Accurate forecasts help minimize possible damage by allowing pre-decided plans of action to be implemented, and hence it is necessary to gauge the quality of forecasts, which may vary with the complexity of the weather state and regional parameters. Indian Summer Monsoon Rainfall (ISMR) is one such perfect arena to check the quality of weather forecasts, not only because of the intricacy of its spatial and temporal patterns, but also because of the damage poor forecasts can cause to the Indian economy by affecting the agricultural industry. The present study assesses the ability of the Global Ensemble Forecast System (GEFS) to predict ISMR over central India, and the skill of a statistical downscaling technique in adding value to the predictions by taking them closer to the observed target dataset. GEFS is a global numerical weather prediction system providing forecasts of different climate variables at fine resolution (0.5 degree and 1 degree). GEFS shows good skill in predicting different climatic variables but performs poorly for Indian summer monsoon rainfall, which is evident from the very low to negative correlations between predicted and observed rainfall. To address the second objective, a statistical relationship is established between the reasonably well predicted GEFS climate variables and observed rainfall. The GEFS predictors are treated with multicollinearity and dimensionality reduction techniques, such as principal component analysis (PCA) and the least absolute shrinkage and selection operator (LASSO). A statistical relationship is established between the principal components and observed rainfall over the training period, and predictions are obtained for the testing period. The validation shows a large improvement in the correlation coefficient between observed and predicted data (from 0.25 to 0.55). The results speak in favour of the statistical downscaling methodology, which shows the capability to reduce the gap between observations and predictions. A detailed study applying different downscaling techniques is required to quantify the improvements in predictions.
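
    A minimal sketch of the downscaling chain, with synthetic predictors and rainfall: PCA compresses the collinear GEFS predictors, and a cross-validated LASSO regresses observed rainfall on the leading components.

    ```python
    # Minimal sketch: PCA for multicollinearity reduction, then LASSO
    # regression of observed rainfall on the principal components.
    # Predictor fields and rainfall are synthetic stand-ins.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LassoCV

    rng = np.random.default_rng(5)
    predictors = rng.normal(size=(1000, 40))    # daily GEFS variables, training period
    rainfall = rng.gamma(shape=2.0, size=1000)  # observed daily rainfall

    pcs = PCA(n_components=10).fit_transform(
        StandardScaler().fit_transform(predictors))
    model = LassoCV(cv=5).fit(pcs, rainfall)
    print("components retained by LASSO:", np.flatnonzero(model.coef_))
    ```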

  13. Variability in source sediment contributions by applying different statistical tests for a Pyrenean catchment.

    PubMed

    Palazón, L; Navas, A

    2017-06-01

    Information on sediment contributions and transport dynamics from the contributing catchments is needed to develop management plans that tackle environmental problems related to the effects of fine sediment, such as reservoir siltation. In this respect, the fingerprinting technique is an indirect technique known to be valuable and effective for sediment source identification in river catchments. Large variability in sediment delivery was found in previous studies in the Barasona catchment (1509 km2, Central Spanish Pyrenees). Simulation results with SWAT and fingerprinting approaches identified badlands and agricultural uses as the main contributors to sediment supply in the reservoir. In this study the <63 μm fraction of the surface reservoir sediments (2 cm) is investigated following the fingerprinting procedure, to assess how the choice of statistical procedure affects the estimated source contributions. Three optimum composite fingerprints were selected from the same dataset to discriminate between source contributions based on land use/land cover, by the application of (1) discriminant function analysis, and its combination (as a second step) with (2) the Kruskal-Wallis H-test and (3) principal components analysis. Source contribution results differed between the assessed options, with the greatest differences observed for option #3, the two-step process of principal components analysis followed by discriminant function analysis. The characteristics of the solutions produced by the applied mixing model, and the conceptual understanding of the catchment, showed that the most reliable solution was achieved using option #2, the two-step process of the Kruskal-Wallis H-test and discriminant function analysis. The assessment showed the importance of the statistical procedure used to define the optimum composite fingerprint in sediment fingerprinting applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
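
    A minimal sketch of the Kruskal-Wallis screening step for a single tracer property, with synthetic values for three hypothetical source groups: tracers that do not differ significantly across sources are dropped before discriminant function analysis.

    ```python
    # Minimal sketch: Kruskal-Wallis H-test to decide whether one tracer
    # property discriminates between candidate sediment sources.
    # Group names and values are illustrative.
    import numpy as np
    from scipy.stats import kruskal

    rng = np.random.default_rng(6)
    sources = {"badlands": rng.normal(1.0, 0.3, 15),
               "agricultural": rng.normal(1.4, 0.3, 15),
               "forest": rng.normal(1.1, 0.3, 15)}

    h, p = kruskal(*sources.values())
    print(f"H = {h:.2f}, p = {p:.3f}",
          "-> retain tracer" if p < 0.05 else "-> discard tracer")
    ```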

  14. Integrated GIS and multivariate statistical analysis for regional scale assessment of heavy metal soil contamination: A critical review.

    PubMed

    Hou, Deyi; O'Connor, David; Nathanail, Paul; Tian, Li; Ma, Yan

    2017-12-01

    Heavy metal soil contamination is associated with potential toxicity to humans or ecotoxicity. Scholars have increasingly used a combination of geographical information science (GIS) with geostatistical and multivariate statistical analysis techniques to examine the spatial distribution of heavy metals in soils at a regional scale. A review of such studies showed that most soil sampling programs were based on grid patterns and composite sampling methodologies. Many programs intended to characterize various soil types and land use types. The most often used sampling depth intervals were 0-0.10 m, or 0-0.20 m, below surface; and the sampling densities used ranged from 0.0004 to 6.1 samples per km2, with a median of 0.4 samples per km2. The most widely used spatial interpolators were inverse distance weighted interpolation and ordinary kriging; and the most often used multivariate statistical analysis techniques were principal component analysis and cluster analysis. The review also identified several determining and correlating factors in heavy metal distribution in soils, including soil type, soil pH, soil organic matter, land use type, Fe, Al, and heavy metal concentrations. The major natural and anthropogenic sources of heavy metals were found to derive from lithogenic origin, roadway and transportation, atmospheric deposition, wastewater and runoff from industrial and mining facilities, fertilizer application, livestock manure, and sewage sludge. This review argues that the full potential of integrated GIS and multivariate statistical analysis for assessing heavy metal distribution in soils on a regional scale has not yet been fully realized. It is proposed that future research be conducted to map multivariate results in GIS to pinpoint specific anthropogenic sources, to analyze temporal trends in addition to spatial patterns, to optimize modeling parameters, and to expand the use of different multivariate analysis tools beyond principal component analysis (PCA) and cluster analysis (CA). Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Quality of stormwater runoff discharged from Massachusetts highways, 2005-07

    USGS Publications Warehouse

    Smith, Kirk P.; Granato, Gregory E.

    2010-01-01

    The U.S. Geological Survey (USGS), in cooperation with U.S. Department of Transportation Federal Highway Administration and the Massachusetts Department of Transportation, conducted a field study from September 2005 through September 2007 to characterize the quality of highway runoff for a wide range of constituents. The highways studied had annual average daily traffic (AADT) volumes from about 3,000 to more than 190,000 vehicles per day. Highway-monitoring stations were installed at 12 locations in Massachusetts on 8 highways. The 12 monitoring stations were subdivided into 4 primary, 4 secondary, and 4 test stations. Each site contained a 100-percent impervious drainage area that included two or more catch basins sharing a common outflow pipe. Paired primary and secondary stations were located within a few miles of each other on a limited-access section of the same highway. Most of the data were collected at the primary and secondary stations, which were located on four principal highways (Route 119, Route 2, Interstate 495, and Interstate 95). The secondary stations were operated simultaneously with the primary stations for at least a year. Data from the four test stations (Route 8, Interstate 195, Interstate 190, and Interstate 93) were used to determine the transferability of the data collected from the principal highways to other highways characterized by different construction techniques, land use, and geography. Automatic-monitoring techniques were used to collect composite samples of highway runoff and make continuous measurements of several physical characteristics. Flow-weighted samples of highway runoff were collected automatically during approximately 140 rain and mixed rain, sleet, and snowstorms. These samples were analyzed for physical characteristics and concentrations of 6 dissolved major ions, total nutrients, 8 total-recoverable metals, suspended sediment, and 85 semivolatile organic compounds (SVOCs), which include priority polyaromatic hydrocarbons (PAHs), phthalate esters, and other anthropogenic or naturally occurring organic compounds. The distribution of particle size of suspended sediment also was determined for composite samples of highway runoff. Samples of highway runoff were collected year round and under various dry antecedent conditions throughout the 2-year sampling period. In addition to samples of highway runoff, supplemental samples also were collected of sediment in highway runoff, background soils, berm materials, maintenance sands, deicing compounds, and vegetation matter. These additional samples were collected near or on the highways to support data analysis. There were few statistically significant differences between populations of constituent concentrations in samples from the primary and secondary stations on the same principal highways (Mann-Whitney test, 95-percent confidence level). Similarly, there were few statistically significant differences between populations of constituent concentrations for the four principal highways (data from the paired primary and secondary stations for each principal highway) and populations for test stations with similar AADT volumes. Exceptions to this include several total-recoverable metals for stations on Route 2 and Interstate 195 (highways with moderate AADT volumes), and for stations on Interstate 95 and Interstate 93 (highways with high AADT volumes).
Supplemental data collected during this study indicate that many of these differences may be explained by the quantity, as well as the quality, of the sediment in samples of highway runoff. Nonparametric statistical methods also were used to test for differences between populations of sample constituent concentrations among the four principal highways that differed mainly in traffic volume. These results indicate that there were few statistically significant differences (Mann-Whitney test, 95-percent confidence level) for populations of concentrations of most total-recoverable metals.
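
    A minimal sketch of the station-comparison test reported above: a two-sided Mann-Whitney U test at the 95-percent confidence level, with synthetic concentration data standing in for a paired primary/secondary station record.

    ```python
    # Minimal sketch: two-sided Mann-Whitney U test comparing constituent
    # concentrations at a paired primary and secondary station. Synthetic data.
    import numpy as np
    from scipy.stats import mannwhitneyu

    rng = np.random.default_rng(7)
    primary = rng.lognormal(mean=1.0, sigma=0.4, size=40)    # e.g. copper, ug/L
    secondary = rng.lognormal(mean=1.1, sigma=0.4, size=40)

    u, p = mannwhitneyu(primary, secondary, alternative="two-sided")
    print(f"U = {u:.0f}, p = {p:.3f}",
          "-> significantly different" if p < 0.05 else "-> no significant difference")
    ```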

  16. Dynamics and spatio-temporal variability of environmental factors in Eastern Australia using functional principal component analysis

    USGS Publications Warehouse

    Szabo, J.K.; Fedriani, E.M.; Segovia-Gonzalez, M. M.; Astheimer, L.B.; Hooper, M.J.

    2010-01-01

    This paper introduces a new technique in ecology to analyze spatial and temporal variability in environmental variables. By using simple statistics, we explore the relations between abiotic and biotic variables that influence animal distributions. However, spatial and temporal variability in rainfall, a key variable in ecological studies, can cause difficulties for any basic model that includes time evolution. The study was of a landscape scale (three million square kilometers in eastern Australia), mainly over the period 1998-2004. We simultaneously considered qualitative spatial (soil and habitat types) and quantitative temporal (rainfall) variables in a Geographical Information System environment. In addition to some techniques commonly used in ecology, we applied a new method, Functional Principal Component Analysis, which proved to be very suitable for this case, as it explained more than 97% of the total variance of the rainfall data, providing us with substitute variables that are easier to manage and are even able to explain rainfall patterns. The main variable came from a habitat classification that showed strong correlations with rainfall values and soil types. © 2010 World Scientific Publishing Company.
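
    A minimal sketch under a common simplification: for curves sampled on a shared regular grid, functional PCA reduces to ordinary PCA on the discretized curves. The rainfall series below are synthetic.

    ```python
    # Minimal sketch: discretized functional PCA. Each row is one site's
    # monthly rainfall series on a common grid; data are synthetic.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(8)
    months = 12 * 7                                    # monthly grid, 1998-2004
    curves = rng.gamma(shape=2.0, size=(50, months))   # 50 hypothetical sites

    fpca = PCA(n_components=3).fit(curves)
    print("cumulative variance explained:", fpca.explained_variance_ratio_.cumsum())
    scores = fpca.transform(curves)   # low-dimensional substitutes for rainfall
    ```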

  17. Near-infrared Raman spectroscopy for estimating biochemical changes associated with different pathological conditions of cervix

    NASA Astrophysics Data System (ADS)

    Daniel, Amuthachelvi; Prakasarao, Aruna; Ganesan, Singaravelu

    2018-02-01

    The molecular level changes associated with oncogenesis precede the morphological changes in cells and tissues; hence, molecular level diagnosis would promote early diagnosis of the disease. Raman spectroscopy is capable of providing specific spectral signatures of the various biomolecules present in cells and tissues under various pathological conditions. The aim of this work is to develop a non-linear multi-class statistical methodology for discrimination of normal, neoplastic and malignant cells/tissues. The tissues were classified as normal, pre-malignant and malignant by employing Principal Component Analysis followed by an Artificial Neural Network (PC-ANN). The overall accuracy achieved was 99%. Further, to gain insight into the quantitative biochemical composition of the normal, neoplastic and malignant tissues, a linear combination of the major biochemicals was fitted to the measured Raman spectra of the tissues by a non-negative least squares technique. This analysis confirms the changes in major biomolecules such as lipids, nucleic acids, actin, glycogen and collagen associated with the different pathological conditions. To study the efficacy of this technique in comparison with histopathology, we utilized Principal Component Analysis followed by Linear Discriminant Analysis (PC-LDA) to discriminate well differentiated, moderately differentiated and poorly differentiated squamous cell carcinoma with an accuracy of 94.0%. The results demonstrate that Raman spectroscopy has the potential to complement the established technique of histopathology.
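
    A minimal sketch of the spectral fit, assuming a library of reference component spectra (the component names and spectra below are synthetic): non-negative least squares recovers the component weights.

    ```python
    # Minimal sketch: fit a measured spectrum as a non-negative combination
    # of reference biochemical spectra. References and data are synthetic.
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(9)
    components = ["lipid", "DNA", "actin", "glycogen", "collagen"]
    refs = np.abs(rng.normal(size=(800, len(components))))   # 800 Raman shifts
    true_w = np.array([0.3, 0.1, 0.2, 0.15, 0.25])
    measured = refs @ true_w + rng.normal(scale=0.01, size=800)

    weights, residual = nnls(refs, measured)
    for name, w in zip(components, weights):
        print(f"{name}: {w:.2f}")   # estimated biochemical contribution
    ```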

  18. Feature extraction through parallel Probabilistic Principal Component Analysis for heart disease diagnosis

    NASA Astrophysics Data System (ADS)

    Shah, Syed Muhammad Saqlain; Batool, Safeera; Khan, Imran; Ashraf, Muhammad Usman; Abbas, Syed Hussnain; Hussain, Syed Adnan

    2017-09-01

    Automatic diagnosis of human diseases is mostly achieved through decision support systems. The performance of these systems is mainly dependent on the selection of the most relevant features. This becomes harder when the dataset contains missing values for different features. Probabilistic Principal Component Analysis (PPCA) has a reputation for dealing with the problem of missing attribute values. This research presents a methodology which uses the results of medical tests as input, extracts a reduced-dimensional feature subset and provides a diagnosis of heart disease. The proposed methodology extracts high-impact features in a new projection by using PPCA, which extracts the projection vectors contributing the highest covariance; these projection vectors are used to reduce the feature dimension. The selection of projection vectors is done through Parallel Analysis (PA). The feature subset with the reduced dimension is provided to radial basis function (RBF) kernel based Support Vector Machines (SVM), which classify subjects into two categories, i.e., Heart Patient (HP) and Normal Subject (NS). The proposed methodology is evaluated through accuracy, specificity and sensitivity over three UCI datasets, i.e., Cleveland, Switzerland and Hungarian. The statistical results achieved through the proposed technique are presented in comparison to the existing research, showing its impact. The proposed technique achieved an accuracy of 82.18%, 85.82% and 91.30% for the Cleveland, Hungarian and Switzerland datasets, respectively.
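
    A minimal sketch of the classification stage, substituting ordinary PCA for the paper's PPCA-with-parallel-analysis step: PC scores feed an RBF-kernel SVM, and sensitivity/specificity are computed from cross-validated predictions. The data are synthetic, not the UCI sets.

    ```python
    # Minimal sketch: dimensionality reduction + RBF SVM classification,
    # evaluated by accuracy, sensitivity and specificity. Synthetic data.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(10)
    X = rng.normal(size=(300, 13))       # 13 clinical attributes, synthetic
    y = rng.integers(0, 2, size=300)     # 1 = heart patient, 0 = normal subject

    clf = make_pipeline(StandardScaler(), PCA(n_components=6), SVC(kernel="rbf"))
    pred = cross_val_predict(clf, X, y, cv=5)
    tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
    print("accuracy:", (tp + tn) / len(y))
    print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
    ```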

  19. Raman signatures of ferroic domain walls captured by principal component analysis.

    PubMed

    Nataf, G F; Barrett, N; Kreisel, J; Guennou, M

    2018-01-24

    Ferroic domain walls are currently investigated by several state-of-the-art techniques in order to get a better understanding of their distinct, functional properties. Here, principal component analysis (PCA) of Raman maps is used to study ferroelectric domain walls (DWs) in LiNbO3 and ferroelastic DWs in NdGaO3. It is shown that PCA allows us to quickly and reliably identify small Raman peak variations at ferroelectric DWs, and that the value of a peak shift can be deduced, accurately and without a priori assumptions, from a first order Taylor expansion of the spectra. The ability of PCA to separate the contributions of ferroelastic domains and DWs to Raman spectra is emphasized. More generally, our results provide a novel route for the statistical analysis of any property mapped across a DW.

  20. Transforming Graph Data for Statistical Relational Learning

    DTIC Science & Technology

    2012-10-01


  1. Statistical analysis of fNIRS data: a comprehensive review.

    PubMed

    Tak, Sungho; Ye, Jong Chul

    2014-01-15

    Functional near-infrared spectroscopy (fNIRS) is a non-invasive method to measure brain activities using the changes of optical absorption in the brain through the intact skull. fNIRS has many advantages over other neuroimaging modalities such as positron emission tomography (PET), functional magnetic resonance imaging (fMRI), or magnetoencephalography (MEG), since it can directly measure blood oxygenation level changes related to neural activation with high temporal resolution. However, fNIRS signals are highly corrupted by measurement noises and physiology-based systemic interference. Careful statistical analyses are therefore required to extract neuronal activity-related signals from fNIRS data. In this paper, we provide an extensive review of historical developments of statistical analyses of fNIRS signal, which include motion artifact correction, short source-detector separation correction, principal component analysis (PCA)/independent component analysis (ICA), false discovery rate (FDR), serially-correlated errors, as well as inference techniques such as the standard t-test, F-test, analysis of variance (ANOVA), and statistical parameter mapping (SPM) framework. In addition, to provide a unified view of various existing inference techniques, we explain a linear mixed effect model with restricted maximum likelihood (ReML) variance estimation, and show that most of the existing inference methods for fNIRS analysis can be derived as special cases. Some of the open issues in statistical analysis are also described. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Incorporating principal component analysis into air quality ...

    EPA Pesticide Factsheets

    The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Principal Component Analysis (PCA) with the intent of motivating its use by the evaluation community. One of the main objectives of PCA is to identify, through data reduction, the recurring and independent modes of variations (or signals) within a very large dataset, thereby summarizing the essential information of that dataset so that meaningful and descriptive conclusions can be made. In this demonstration, PCA is applied to a simple evaluation metric – the model bias associated with EPA's Community Multi-scale Air Quality (CMAQ) model when compared to weekly observations of sulfate (SO42−) and ammonium (NH4+) ambient air concentrations measured by the Clean Air Status and Trends Network (CASTNet). The advantages of using this technique are demonstrated as it identifies strong and systematic patterns of CMAQ model bias across a myriad of spatial and temporal scales that are neither constrained to geopolitical boundaries nor monthly/seasonal time periods (a limitation of many current studies). The technique also identifies locations (station–grid cell pairs) that are used as indicators for a more thorough diagnostic evaluation, thereby hastening and facilitating understanding of the problem.

  3. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    PubMed

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

    Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for the application of principal component analysis (PCA) in mass spectrometry and focused on two whole-spectrum normalization techniques and their application in the analysis of registered peak data and, in comparison, in full-spectrum data analysis. We used this technique to identify different metabolic patterns in bacterial cultures of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities were implemented: ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For the three tested cultivation media, only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied to data normalized by the two different normalization techniques. Results from matched peak data and subsequent detailed full-spectrum analysis likewise identified only two different metabolic patterns: cultivation on Enterobacter sakazakii Isolation Agar showed significant differences from cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also showed a dependence on cultivation time. Both whole-spectrum normalization techniques, together with full-spectrum PCA, allow identification of important discriminative factors in experiments with several variable condition factors, avoiding problems with improper peak identification or undue emphasis on below-threshold peak data. The amount of processed data remains manageable. Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms. Copyright © 2018 John Wiley & Sons, Ltd.
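
    A minimal sketch of one plausible whole-spectrum normalization (total ion current) followed by full-spectrum PCA; the spectra are synthetic stand-ins for the MALDI-TOF data described above.

    ```python
    # Minimal sketch: total-ion-current normalization of full spectra,
    # then PCA on the normalized spectra. Data are synthetic.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(11)
    spectra = rng.poisson(lam=5.0, size=(24, 5000)).astype(float)  # 24 spectra

    tic = spectra.sum(axis=1, keepdims=True)   # total ion current per spectrum
    normalized = spectra / tic

    scores = PCA(n_components=2).fit_transform(normalized)
    print(scores[:3])   # PC scores used to separate metabolic patterns
    ```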

  4. Multivariate Statistical Analysis: a tool for groundwater quality assessment in the hidrogeologic region of the Ring of Cenotes, Yucatan, Mexico.

    NASA Astrophysics Data System (ADS)

    Ye, M.; Pacheco Castro, R. B.; Pacheco Avila, J.; Cabrera Sansores, A.

    2014-12-01

    The karstic aquifer of Yucatan is a vulnerable and complex system. The first fifteen meters of this aquifer have been polluted; protecting this resource is therefore important, because it is the only source of potable water for the entire State. Through the assessment of groundwater quality we can gain knowledge about the main processes governing water chemistry, as well as spatial patterns that are important for establishing protection zones. In this work multivariate statistical techniques are used to assess the groundwater quality of the supply wells (30 to 40 meters deep) in the hydrogeologic region of the Ring of Cenotes, located in Yucatan, Mexico. Cluster analysis and principal component analysis were applied to groundwater chemistry data from the study area. Results of the principal component analysis show that the main sources of variation in the data are seawater intrusion, the interaction of the water with the carbonate rocks of the system, and some pollution processes. The cluster analysis shows that the data can be divided into four clusters. The spatial distribution of the clusters seems to be random, but is consistent with seawater intrusion and pollution with nitrates. The overall results show that multivariate statistical analysis can be successfully applied in the groundwater quality assessment of this karstic aquifer.

  5. Accuracy of Different Implant Impression Techniques: Evaluation of New Tray Design Concept.

    PubMed

    Liu, David Yu; Cader, Fathima Nashmie; Abduo, Jaafar; Palamara, Joseph

    2017-12-29

    To evaluate implant impression accuracy with a new tray design concept in comparison to nonsplinted and splinted impression techniques for a 2-implant situation. A reference bar titanium framework was fabricated to fit on 2 parallel implants. The framework was used to generate a resin master model with 2 implants that fit precisely against the framework. Three impression techniques were evaluated: (1) nonsplinted, (2) splinted, and (3) nonsplinted with modified tray impressions. All the trays were fabricated from light-cured acrylic resin material with openings that corresponded to the implant impression copings. Ten impressions were taken for each technique using poly(vinyl siloxane) impression material. The impressions were poured with type IV dental stone to generate the test casts. A rosette strain gauge was bonded to the middle of the framework. As the framework retaining screws were tightened on each test cast, the developed strains were recorded until the completion of the tightening to 35 Ncm. The generated strains of the rosette strain gauge were used to calculate the maximum principal strain. A statistically significant difference was observed among the different impression techniques. The modified tray design impression technique was associated with the least framework strains, which indicates greater accuracy compared with the other techniques. There was no significant difference between the splinted and the nonsplinted impression techniques. The new tray design concept appeared to produce more accurate implant impressions than the other techniques. Despite the statistical difference among the impression techniques, the clinical significance of this difference is yet to be determined. © 2017 by the American College of Prosthodontists.
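
    For reference, the computation of principal strains from rosette readings is standard; assuming a 0°/45°/90° rectangular rosette with gauge strains εa, εb and εc (the rosette geometry is not stated in the abstract), the maximum and minimum principal strains are

    $$\varepsilon_{1,2} = \frac{\varepsilon_a + \varepsilon_c}{2} \pm \frac{1}{\sqrt{2}}\sqrt{(\varepsilon_a - \varepsilon_b)^2 + (\varepsilon_b - \varepsilon_c)^2}$$

    with the larger root taken as the maximum principal strain reported for each cast.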

  6. Machine learning of frustrated classical spin models. I. Principal component analysis

    NASA Astrophysics Data System (ADS)

    Wang, Ce; Zhai, Hui

    2017-10-01

    This work aims at determining whether artificial intelligence can recognize a phase transition without prior human knowledge. If this were successful, it could be applied to, for instance, analyzing data from the quantum simulation of unsolved physical models. Toward this goal, we first need to apply the machine learning algorithm to well-understood models and see whether the outputs are consistent with our prior knowledge, which serves as the benchmark for this approach. In this work, we feed the computer data generated by classical Monte Carlo simulation of the XY model on frustrated triangular and union-jack lattices, which has two order parameters and exhibits two phase transitions. We show that the outputs of the principal component analysis agree very well with our understanding of the different orders in the different phases, and that the temperature dependences of the major components detect the nature and the locations of the phase transitions. Our work offers promise for using machine learning techniques to study sophisticated statistical models, and our results can be further improved by using principal component analysis with kernel tricks and the neural network method.

  7. Medial-based deformable models in nonconvex shape-spaces for medical image segmentation.

    PubMed

    McIntosh, Chris; Hamarneh, Ghassan

    2012-01-01

    We explore the application of genetic algorithms (GA) to deformable models through the proposition of a novel method for medical image segmentation that combines GA with nonconvex, localized, medial-based shape statistics. We replace the more typical gradient descent optimizer used in deformable models with GA, and the convex, implicit, global shape statistics with nonconvex, explicit, localized ones. Specifically, we propose GA to reduce typical deformable model weaknesses pertaining to model initialization, pose estimation and local minima, through the simultaneous evolution of a large number of models. Furthermore, we constrain the evolution, and thus reduce the size of the search-space, by using statistically-based deformable models whose deformations are intuitive (stretch, bulge, bend) and are driven in terms of localized principal modes of variation, instead of modes of variation across the entire shape that often fail to capture localized shape changes. Although GA are not guaranteed to achieve the global optimum, our method compares favorably to the prevalent optimization techniques, convex/nonconvex gradient-based optimizers and globally optimal graph-theoretic combinatorial optimization techniques, when applied to the task of corpus callosum segmentation in 50 mid-sagittal brain magnetic resonance images.

  8. Water quality analysis of the Rapur area, Andhra Pradesh, South India using multivariate techniques

    NASA Astrophysics Data System (ADS)

    Nagaraju, A.; Sreedhar, Y.; Thejaswi, A.; Sayadi, Mohammad Hossein

    2017-10-01

    Groundwater samples from the Rapur area were collected from different sites to evaluate the major ion chemistry. The large volume of data can lead to difficulties in the integration, interpretation, and representation of the results. Two multivariate statistical methods, hierarchical cluster analysis (HCA) and factor analysis (FA), were applied to evaluate their usefulness for classifying and identifying the geochemical processes controlling groundwater geochemistry. Four statistically significant clusters were obtained from 30 sampling stations, of which two are most important: cluster 1 (pH, Si, CO3, Mg, SO4, Ca, K, HCO3, alkalinity, Na, Na + K, Cl, and hardness) and cluster 2 (EC and TDS), whose constituents are released to the study area from different sources. The application of multivariate statistical techniques such as principal component analysis (PCA) assists in the interpretation of complex data matrices for a better understanding of the water quality of a study area. The PCA shows that the first factor (factor 1), which accounted for 36.2% of the total variance, had high positive loadings on EC, Mg, Cl, TDS, and hardness. Based on the PCA scores, four significant cluster groups of sampling locations were detected on the basis of the similarity of their water quality.

  9. A climatology of total ozone mapping spectrometer data using rotated principal component analysis

    NASA Astrophysics Data System (ADS)

    Eder, Brian K.; Leduc, Sharon K.; Sickles, Joseph E.

    1999-02-01

    The spatial and temporal variability of total column ozone (Ω) obtained from the total ozone mapping spectrometer (TOMS version 7.0) during the period 1980-1992 was examined through the use of a multivariate statistical technique called rotated principal component analysis. Utilization of Kaiser's varimax orthogonal rotation led to the identification of 14, mostly contiguous subregions that together accounted for more than 70% of the total Ω variance. Each subregion displayed statistically unique Ω characteristics that were further examined through time series and spectral density analyses, revealing significant periodicities on semiannual, annual, quasi-biennial, and longer term time frames. This analysis facilitated identification of the probable mechanisms responsible for the variability of Ω within the 14 homogeneous subregions. The mechanisms were either dynamical in nature (i.e., advection associated with baroclinic waves, the quasi-biennial oscillation, or El Niño-Southern Oscillation) or photochemical in nature (i.e., production of odd oxygen (O or O3) associated with the annual progression of the Sun). The analysis has also revealed that the influence of a data retrieval artifact, found in equatorial latitudes of version 6.0 of the TOMS data, has been reduced in version 7.0.
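
    A minimal sketch of the rotation step, using a standard iterative varimax implementation applied to PCA loadings; the monthly ozone matrix is synthetic. Kaiser's varimax keeps the components orthogonal while concentrating each component's loadings on a few variables, which is what yields contiguous, interpretable subregions.

    ```python
    # Minimal sketch: PCA followed by varimax rotation of the loadings.
    # The varimax routine below is a standard iterative implementation.
    import numpy as np
    from sklearn.decomposition import PCA

    def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
        """Rotate a (variables x factors) loading matrix to the varimax criterion."""
        p, k = loadings.shape
        rotation = np.eye(k)
        var = 0.0
        for _ in range(max_iter):
            lam = loadings @ rotation
            u, s, vt = np.linalg.svd(
                loadings.T @ (lam**3 - (gamma / p) * lam
                              @ np.diag((lam**2).sum(axis=0))))
            rotation = u @ vt
            new_var = s.sum()
            if new_var < var * (1 + tol):
                break
            var = new_var
        return loadings @ rotation

    rng = np.random.default_rng(12)
    ozone = rng.normal(size=(156, 30))    # monthly column-ozone series, 30 grid cells
    pca = PCA(n_components=4).fit(ozone)
    rotated = varimax(pca.components_.T)  # simple-structure loadings per subregion
    print(rotated.shape)
    ```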

  10. Characterization of exopolymers of aquatic bacteria by pyrolysis-mass spectrometry

    NASA Technical Reports Server (NTRS)

    Ford, T.; Sacco, E.; Black, J.; Kelley, T.; Goodacre, R.; Berkeley, R. C.; Mitchell, R.

    1991-01-01

    Exopolymers from a diverse collection of marine and freshwater bacteria were characterized by pyrolysis-mass spectrometry (Py-MS). Py-MS provides spectra of pyrolysis fragments that are characteristic of the original material. Analysis of the spectra by multivariate statistical techniques (principal component and canonical variate analysis) separated these exopolymers into distinct groups. Py-MS clearly distinguished characteristic fragments, which may be derived from components responsible for functional differences between polymers. The importance of these distinctions and the relevance of pyrolysis information to exopolysaccharide function in aquatic bacteria is discussed.

  11. Let's Keep Our Quality School Principals on the Job

    ERIC Educational Resources Information Center

    Norton, M. Scott

    2003-01-01

    Research studies strongly support the fact that the leadership of the school principal impacts directly on the climate of the school and, in turn, on student achievement. National statistics relating to principal turnover and dwindling supplies of qualified replacements show clearly that principal turnover has reached crisis proportions.…

  12. Measuring Principals' Effectiveness: Results from New Jersey's Principal Evaluation Pilot. REL 2015-089

    ERIC Educational Resources Information Center

    Ross, Christine; Herrmann, Mariesa; Angus, Megan Hague

    2015-01-01

    The purpose of this study was to describe the measures used to evaluate principals in New Jersey in the first (pilot) year of the new principal evaluation system and examine three of the statistical properties of the measures: their variation among principals, their year-to-year stability, and the associations between these measures and the…

  13. Advanced Treatment Monitoring for Olympic-Level Athletes Using Unsupervised Modeling Techniques

    PubMed Central

    Siedlik, Jacob A.; Bergeron, Charles; Cooper, Michael; Emmons, Russell; Moreau, William; Nabhan, Dustin; Gallagher, Philip; Vardiman, John P.

    2016-01-01

    Context Analysis of injury and illness data collected at large international competitions provides the US Olympic Committee and the national governing bodies for each sport with information to best prepare for future competitions. Research in which authors have evaluated medical contacts to provide the expected level of medical care and sports medicine services at international competitions is limited. Objective To analyze the medical-contact data for athletes, staff, and coaches who participated in the 2011 Pan American Games in Guadalajara, Mexico, using unsupervised modeling techniques to identify underlying treatment patterns. Design Descriptive epidemiology study. Setting Pan American Games. Patients or Other Participants A total of 618 US athletes (337 males, 281 females) participated in the 2011 Pan American Games. Main Outcome Measure(s) Medical data were recorded from the injury-evaluation and injury-treatment forms used by clinicians assigned to the central US Olympic Committee Sport Medicine Clinic and satellite locations during the operational 17-day period of the 2011 Pan American Games. We used principal components analysis and agglomerative clustering algorithms to identify and define grouped modalities. Lift statistics were calculated for within-cluster subgroups. Results Principal component analyses identified 3 components, accounting for 72.3% of the variability in datasets. Plots of the principal components showed that individual contacts focused on 4 treatment clusters: massage, paired manipulation and mobilization, soft tissue therapy, and general medical. Conclusions Unsupervised modeling techniques were useful for visualizing complex treatment data and provided insights for improved treatment modeling in athletes. Given its ability to detect clinically relevant treatment pairings in large datasets, unsupervised modeling should be considered a feasible option for future analyses of medical-contact data from international competitions. PMID:26794628

  14. Relationships between Association of Research Libraries (ARL) Statistics and Bibliometric Indicators: A Principal Components Analysis

    ERIC Educational Resources Information Center

    Hendrix, Dean

    2010-01-01

    This study analyzed 2005-2006 Web of Science bibliometric data from institutions belonging to the Association of Research Libraries (ARL) and corresponding ARL statistics to find any associations between indicators from the two data sets. Principal components analysis on 36 variables from 103 universities revealed obvious associations between…

  15. Partial Least Squares Regression Can Aid in Detecting Differential Abundance of Multiple Features in Sets of Metagenomic Samples

    PubMed Central

    Libiger, Ondrej; Schork, Nicholas J.

    2015-01-01

    It is now feasible to examine the composition and diversity of microbial communities (i.e., “microbiomes”) that populate different human organs and orifices using DNA sequencing and related technologies. To explore the potential links between changes in microbial communities and various diseases in the human body, it is essential to test associations involving different species within and across microbiomes, environmental settings and disease states. Although a number of statistical techniques exist for carrying out relevant analyses, it is unclear which of these techniques exhibit the greatest statistical power to detect associations given the complexity of most microbiome datasets. We compared the statistical power of principal component regression, partial least squares regression, regularized regression, distance-based regression, Hill's diversity measures, and a modified test implemented in the popular and widely used microbiome analysis methodology “Metastats” across a wide range of simulated scenarios involving changes in feature abundance between two sets of metagenomic samples. For this purpose, simulation studies were used to change the abundance of microbial species in a real dataset from a published study examining human hands. Each technique was applied to the same data, and its ability to detect the simulated change in abundance was assessed. We hypothesized that a small subset of methods would outperform the rest in terms of the statistical power. Indeed, we found that the Metastats technique modified to accommodate multivariate analysis and partial least squares regression yielded high power under the models and data sets we studied. The statistical power of diversity measure-based tests, distance-based regression and regularized regression was significantly lower. Our results provide insight into powerful analysis strategies that utilize information on species counts from large microbiome data sets exhibiting skewed frequency distributions obtained on a small to moderate number of samples. PMID:26734061

  16. An Independent Filter for Gene Set Testing Based on Spectral Enrichment.

    PubMed

    Frost, H Robert; Li, Zhigang; Asselbergs, Folkert W; Moore, Jason H

    2015-01-01

    Gene set testing has become an indispensable tool for the analysis of high-dimensional genomic data. An important motivation for testing gene sets, rather than individual genomic variables, is to improve statistical power by reducing the number of tested hypotheses. Given the dramatic growth in common gene set collections, however, testing is often performed with nearly as many gene sets as underlying genomic variables. To address the challenge to statistical power posed by large gene set collections, we have developed spectral gene set filtering (SGSF), a novel technique for independent filtering of gene set collections prior to gene set testing. The SGSF method uses as a filter statistic the p-value measuring the statistical significance of the association between each gene set and the sample principal components (PCs), taking into account the significance of the associated eigenvalues. Because this filter statistic is independent of standard gene set test statistics under the null hypothesis but dependent under the alternative, the proportion of enriched gene sets is increased without impacting the type I error rate. As shown using simulated and real gene expression data, the SGSF algorithm accurately filters gene sets unrelated to the experimental outcome resulting in significantly increased gene set testing power.

  17. Combined data preprocessing and multivariate statistical analysis characterizes fed-batch culture of mouse hybridoma cells for rational medium design.

    PubMed

    Selvarasu, Suresh; Kim, Do Yun; Karimi, Iftekhar A; Lee, Dong-Yup

    2010-10-01

    We present an integrated framework for characterizing fed-batch cultures of mouse hybridoma cells producing monoclonal antibody (mAb). This framework systematically combines data preprocessing, elemental balancing and statistical analysis technique. Initially, specific rates of cell growth, glucose/amino acid consumptions and mAb/metabolite productions were calculated via curve fitting using logistic equations, with subsequent elemental balancing of the preprocessed data indicating the presence of experimental measurement errors. Multivariate statistical analysis was then employed to understand physiological characteristics of the cellular system. The results from principal component analysis (PCA) revealed three major clusters of amino acids with similar trends in their consumption profiles: (i) arginine, threonine and serine, (ii) glycine, tyrosine, phenylalanine, methionine, histidine and asparagine, and (iii) lysine, valine and isoleucine. Further analysis using partial least square (PLS) regression identified key amino acids which were positively or negatively correlated with the cell growth, mAb production and the generation of lactate and ammonia. Based on these results, the optimal concentrations of key amino acids in the feed medium can be inferred, potentially leading to an increase in cell viability and productivity, as well as a decrease in toxic waste production. The study demonstrated how the current methodological framework using multivariate statistical analysis techniques can serve as a potential tool for deriving rational medium design strategies. Copyright © 2010 Elsevier B.V. All rights reserved.

  18. Evaluating filterability of different types of sludge by statistical analysis: The role of key organic compounds in extracellular polymeric substances.

    PubMed

    Xiao, Keke; Chen, Yun; Jiang, Xie; Zhou, Yan

    2017-03-01

    An investigation was conducted for 20 different types of sludge in order to identify the key organic compounds in extracellular polymeric substances (EPS) that are important in assessing variations of sludge filterability. The different types of sludge varied in initial total solids (TS) content, organic composition and pre-treatment methods. For instance, some of the sludges were pre-treated by acid, ultrasonic, thermal, alkaline, or advanced oxidation techniques. Pearson's correlation results showed significant correlations between sludge filterability and zeta potential, pH, dissolved organic carbon, protein and polysaccharide in soluble EPS (SB EPS), loosely bound EPS (LB EPS) and tightly bound EPS (TB EPS). The principal component analysis (PCA) method was used to further explore correlations between variables and similarities among EPS fractions of different types of sludge. Two principal components were extracted: principal component 1 accounted for 59.24% of total EPS variations, while principal component 2 accounted for 25.46% of total EPS variations. Dissolved organic carbon, protein and polysaccharide in LB EPS showed higher eigenvector projection values than the corresponding compounds in SB EPS and TB EPS in principal component 1. Further characterization of the fractionized key organic compounds in LB EPS was conducted with size-exclusion chromatography-organic carbon detection-organic nitrogen detection (LC-OCD-OND). A numerical multiple linear regression model was established to describe the relationship between organic compounds in LB EPS and sludge filterability. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. [The principal components analysis--method to classify the statistical variables with applications in medicine].

    PubMed

    Dascălu, Cristina Gena; Antohe, Magda Ecaterina

    2009-01-01

Based on eigenvalue and eigenvector analysis, principal component analysis has the purpose of identifying the subspace of main components from a set of parameters that is enough to characterize the whole set of parameters. Interpreting the data as a cloud of points, we find through geometrical transformations the directions where the cloud's dispersion is maximal--the lines that pass through the cloud's center of weight and have a maximal density of points around them (obtained by defining an appropriate criterion function and minimizing it). This method can be successfully used to simplify the statistical analysis of questionnaires, because it helps us select from a set of items only the most relevant ones, those that cover the variation of the whole set of data. For instance, in the presented sample we started from a questionnaire with 28 items and, by applying principal component analysis, we identified 7 principal components--or main items--a fact that significantly simplifies the further statistical analysis of the data.
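
    A minimal sketch of this item-reduction idea, retaining the components that cover a chosen share of the total variation (the data are synthetic and uncorrelated, so the retained number will differ from the 7 reported above):

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical questionnaire: 100 respondents, 28 Likert-scale items.
rng = np.random.default_rng(2)
answers = rng.integers(1, 6, size=(100, 28)).astype(float)

pca = PCA().fit(answers)
cum = np.cumsum(pca.explained_variance_ratio_)
k = int(np.searchsorted(cum, 0.80)) + 1  # components covering 80% of variance
print(f"{k} principal components cover {cum[k-1]:.0%} of the total variation")
# Real questionnaire items are correlated, so far fewer components
# are usually needed than with this random example.
```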

  20. Discrimination of premalignant lesions and cancer tissues from normal gastric tissues using Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Luo, Shuwen; Chen, Changshui; Mao, Hua; Jin, Shaoqin

    2013-06-01

The feasibility of early detection of gastric cancer using near-infrared (NIR) Raman spectroscopy (RS) by distinguishing premalignant lesions (adenomatous polyp, n=27) and cancer tissues (adenocarcinoma, n=33) from normal gastric tissues (n=45) is evaluated. Significant differences in Raman spectra are observed among the normal, adenomatous polyp, and adenocarcinoma gastric tissues at 936, 1003, 1032, 1174, 1208, 1323, 1335, 1450, and 1655 cm-1. Diverse statistical methods are employed to develop effective diagnostic algorithms for classifying the Raman spectra of different types of ex vivo gastric tissues, including principal component analysis (PCA), linear discriminant analysis (LDA), and naive Bayesian classifier (NBC) techniques. Compared with PCA-LDA algorithms, PCA-NBC techniques together with the leave-one-out cross-validation method provide better discrimination of normal, adenomatous polyp, and adenocarcinoma gastric tissues, yielding superior sensitivities of 96.3%, 96.9%, and 96.9%, and specificities of 93%, 100%, and 95.2%, respectively. Therefore, NIR RS associated with multivariate statistical algorithms has the potential for early diagnosis of gastric premalignant lesions and cancer tissues at the molecular level.
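
    A minimal sketch of a PCA-NBC classifier with leave-one-out cross-validation, on synthetic spectra; the channel count, component count and labels are placeholder assumptions:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Hypothetical Raman spectra: 105 tissue spectra x 500 wavenumber channels,
# labels 0 = normal, 1 = adenomatous polyp, 2 = adenocarcinoma.
rng = np.random.default_rng(3)
spectra = rng.normal(size=(105, 500))
labels = rng.integers(0, 3, size=105)

clf = make_pipeline(PCA(n_components=10), GaussianNB())
acc = cross_val_score(clf, spectra, labels, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.1%}")  # near chance on random data
```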

  1. Assuring reliability program effectiveness.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  2. The bio-optical properties of CDOM as descriptor of lake stratification.

    PubMed

    Bracchini, Luca; Dattilo, Arduino Massimo; Hull, Vincent; Loiselle, Steven Arthur; Martini, Silvia; Rossi, Claudio; Santinelli, Chiara; Seritti, Alfredo

    2006-11-01

Multivariate statistical techniques are used to demonstrate the fundamental role of CDOM optical properties in the description of water masses during the summer stratification of a deep lake. PC1 was linked with dissolved species and PC2 with suspended particles. The first principal component showed that the bio-optical properties of CDOM describe the stratification of Salto Lake better than temperature does. The proposed multivariate approach can be used for the analysis of other stratified aquatic ecosystems in relation to the interaction between bio-optical properties and stratification of the water body.

  3. Area estimation using multiyear designs and partial crop identification

    NASA Technical Reports Server (NTRS)

    Sielken, R. L., Jr.

    1984-01-01

Statistical procedures were developed for large area assessments using both satellite and conventional data. Crop acreages, other ground cover indices, and measures of change were the principal characteristics of interest. These characteristics are capable of being estimated from samples collected possibly from several sources at varying times, with different levels of identification. Multiyear analysis techniques were extended to include partially identified samples; the best current year sampling design corresponding to a given sampling history was determined; weights reflecting the precision or confidence in each observation were identified and utilized; and the variation in estimates incorporating partially identified samples was quantified.

  4. Color enhancement of landsat agricultural imagery: JPL LACIE image processing support task

    NASA Technical Reports Server (NTRS)

    Madura, D. P.; Soha, J. M.; Green, W. B.; Wherry, D. B.; Lewis, S. D.

    1978-01-01

Color enhancement techniques were applied to LACIE LANDSAT segments to determine if such enhancement can assist analysts in crop identification. The procedure involved increasing the color range by removing correlation between components. First, a principal component transformation was performed, followed by contrast enhancement to equalize component variances, followed by an inverse transformation to restore familiar color relationships. Filtering was applied to lower order components to reduce color speckle in the enhanced products. The use of single-acquisition and multiple-acquisition statistics to control the enhancement was compared, and the effects of normalization were investigated. Evaluation is left to LACIE personnel.
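
    A minimal sketch of that three-step enhancement (forward PCA, variance equalization, inverse transform), written as a decorrelation-stretch style function on a synthetic multi-band patch; all sizes and data are hypothetical:

```python
import numpy as np

def decorrelation_stretch(bands):
    """Sketch of the enhancement described above: PCA transform,
    equalize component variances, inverse transform."""
    x = bands.reshape(bands.shape[0], -1).astype(float)  # (bands, pixels)
    mean = x.mean(axis=1, keepdims=True)
    cov = np.cov(x - mean)
    eigvals, eigvecs = np.linalg.eigh(cov)   # assumes non-degenerate bands
    pcs = eigvecs.T @ (x - mean)             # forward PCA
    pcs /= np.sqrt(eigvals)[:, None]         # equalize component variances
    stretched = eigvecs @ pcs * np.sqrt(eigvals.mean()) + mean  # invert
    return stretched.reshape(bands.shape)

# Example: a synthetic 4-band, 8x8 image patch.
img = np.random.default_rng(4).normal(size=(4, 8, 8))
print(decorrelation_stretch(img).shape)      # (4, 8, 8)
```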

  5. Principal Attrition and Mobility: Results from the 2008-09 Principal Follow-Up Survey. First Look. NCES 2010-337

    ERIC Educational Resources Information Center

    Battle, Danielle

    2010-01-01

    While the National Center for Education Statistics (NCES) has conducted surveys of attrition and mobility among school teachers for two decades, little was known about similar movements among school principals. In order to inform discussions and decisions among policymakers, researchers, and parents, the 2008-09 Principal Follow-up Survey (PFS)…

  6. Non-targeted 1H NMR fingerprinting and multivariate statistical analyses for the characterisation of the geographical origin of Italian sweet cherries.

    PubMed

    Longobardi, F; Ventrella, A; Bianco, A; Catucci, L; Cafagna, I; Gallo, V; Mastrorilli, P; Agostiano, A

    2013-12-01

In this study, non-targeted (1)H NMR fingerprinting was used in combination with multivariate statistical techniques for the classification of Italian sweet cherries based on their different geographical origins (Emilia Romagna and Puglia). As classification techniques, Soft Independent Modelling of Class Analogy (SIMCA), Partial Least Squares Discriminant Analysis (PLS-DA), and Linear Discriminant Analysis (LDA) were carried out and the results were compared. For LDA, before performing a refined selection of the number/combination of variables, two different strategies for a preliminary reduction of the number of variables were tested. The best average recognition and cross-validation prediction abilities (both 100.0%) were obtained for all the LDA models, although PLS-DA also showed remarkable performance (94.6%). All the statistical models were validated by observing the prediction abilities with respect to an external set of cherry samples. The best result (94.9%) was obtained with LDA by performing a best-subset selection procedure on a set of 30 principal components previously selected by stepwise decorrelation. The metabolites that contributed most to the classification performance of this LDA model were found to be malate, glucose, fructose, glutamine and succinate. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Anomaly Detection in Gamma-Ray Vehicle Spectra with Principal Components Analysis and Mahalanobis Distances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tardiff, Mark F.; Runkle, Robert C.; Anderson, K. K.

    2006-01-23

The goal of primary radiation monitoring in support of routine screening and emergency response is to detect characteristics in vehicle radiation signatures that indicate the presence of potential threats. Two conceptual approaches to analyzing gamma-ray spectra for threat detection are isotope identification and anomaly detection. While isotope identification is the time-honored method, an emerging technique is anomaly detection, which uses benign vehicle gamma-ray signatures to define an expectation of the radiation signature for vehicles that do not pose a threat. Newly acquired spectra are then compared to this expectation using statistical criteria that reflect acceptable false alarm rates and probabilities of detection. The gamma-ray spectra analyzed here were collected at a U.S. land Port of Entry (POE) using a NaI-based radiation portal monitor (RPM). The raw data were analyzed to develop a benign vehicle expectation by decimating the original pulse-height channels to 35 energy bins, extracting composite variables via principal components analysis (PCA), and estimating statistically weighted distances from the mean vehicle spectrum with the Mahalanobis distance (MD) metric. This paper reviews the methods used to establish the anomaly identification criteria and presents a systematic analysis of the response of the combined PCA and MD algorithm to modeled mono-energetic gamma-ray sources.
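
    A minimal sketch of the combined PCA and MD scoring, with synthetic benign spectra standing in for the 35-bin training data; the component count and threshold logic are illustrative assumptions:

```python
import numpy as np

# Hypothetical benign-vehicle training spectra: 1000 vehicles x 35 energy bins.
rng = np.random.default_rng(5)
benign = rng.normal(size=(1000, 35))

# Extract composite variables with PCA, then score new spectra by their
# Mahalanobis distance from the benign population in PC space.
mean = benign.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(benign - mean, rowvar=False))
top = eigvecs[:, -8:]                        # keep 8 leading components
scores = (benign - mean) @ top
inv_cov_pc = np.linalg.inv(np.cov(scores, rowvar=False))

def mahalanobis_distance(spectrum):
    z = (spectrum - mean) @ top - scores.mean(axis=0)
    return float(np.sqrt(z @ inv_cov_pc @ z))

new_vehicle = rng.normal(size=35)
print("MD of new spectrum:", mahalanobis_distance(new_vehicle))
# A spectrum would be flagged when its MD exceeds a threshold chosen
# from the desired false-alarm rate on the benign population.
```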

  8. Identification of fungal phytopathogens using Fourier transform infrared-attenuated total reflection spectroscopy and advanced statistical methods

    NASA Astrophysics Data System (ADS)

    Salman, Ahmad; Lapidot, Itshak; Pomerantz, Ami; Tsror, Leah; Shufan, Elad; Moreh, Raymond; Mordechai, Shaul; Huleihel, Mahmoud

    2012-01-01

The early diagnosis of phytopathogens is of great importance; it could prevent large economic losses due to crops damaged by fungal diseases, and prevent unnecessary soil fumigation or the use of fungicides and bactericides, thus avoiding considerable environmental pollution. In this study, 18 isolates of three different fungal genera were investigated: six isolates of Colletotrichum coccodes, six isolates of Verticillium dahliae and six isolates of Fusarium oxysporum. Our main goal was to differentiate these fungal samples at the isolate level, based on their infrared absorption spectra obtained using the Fourier transform infrared-attenuated total reflection (FTIR-ATR) sampling technique. Advanced statistical and mathematical methods, namely principal component analysis (PCA), linear discriminant analysis (LDA), and k-means, were applied to the spectra after preprocessing. Our results showed significant spectral differences between the various genera examined. The use of k-means enabled classification between the genera with 94.5% accuracy, whereas the use of PCA [3 principal components (PCs)] and LDA achieved a 99.7% success rate. However, at the isolate level, the best differentiation results were obtained using PCA (9 PCs) and LDA for the lower wavenumber region (800-1775 cm-1), with identification success rates of 87%, 85.5%, and 94.5% for Colletotrichum, Fusarium, and Verticillium strains, respectively.
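
    A minimal sketch of the k-means step on synthetic spectra; a real analysis would first baseline-correct and normalize the spectra, and everything here is hypothetical:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical FTIR-ATR absorbance spectra: 18 isolates x 976 wavenumbers
# (the 800-1775 cm^-1 region), with three genera expected.
rng = np.random.default_rng(6)
spectra = rng.normal(size=(18, 976))

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(spectra)
print("cluster assignment per isolate:", km.labels_)
# Agreement between clusters and the known genus labels gives the
# classification accuracy quoted in studies of this kind.
```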

  9. Predicting Teacher Job Satisfaction Based on Principals' Instructional Supervision Behaviours: A Study of Turkish Teachers

    ERIC Educational Resources Information Center

    Ilgan, Abdurrahman; Parylo, Oksana; Sungu, Hilmi

    2015-01-01

    This quantitative research examined instructional supervision behaviours of school principals as a predictor of teacher job satisfaction through the analysis of Turkish teachers' perceptions of principals' instructional supervision behaviours. There was a statistically significant difference found between the teachers' job satisfaction level and…

  10. Preparing Principals To Lead in the New Millennium: A Response to the Leadership Crisis in American Schools.

    ERIC Educational Resources Information Center

    Chirichello, Michael

    There are about 80,000 public school principals in the United States. The Bureau of Labor Statistics estimates there will be a 10 percent increase in the employment of educational administrators of all types through 2006. The National Association of Elementary School Principals estimates that more than 40 percent of principals will retire or leave…

  11. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    PubMed

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinates of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, the Mantel test, and RDA based on linear time trends) on all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to decrease the statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-)communities.

  12. Investigation of the Impact of Extracting and Exchanging Health Information by Using Internet and Social Networks.

    PubMed

    Pistolis, John; Zimeras, Stelios; Chardalias, Kostas; Roupa, Zoe; Fildisis, George; Diomidous, Marianna

    2016-06-01

Social networks (1) have been embedded in our daily life for a long time. They constitute a powerful tool used nowadays for both searching and exchanging information on different issues by using Internet search engines (Google, Bing, etc.) and social networks (Facebook, Twitter, etc.). In this paper, we present the results of a study of the frequency and the type of usage of the Internet and social networks by the general public and by health professionals. The objectives of the research were focused on the investigation of the frequency of seeking and meticulously searching for health information in the social media by both individuals and health practitioners. The exchange of information is a procedure that involves issues of reliability and quality of information. In this research, advanced statistical techniques are used to investigate the participants' profiles in using social networks for searching and exchanging information on health issues. Based on the answers, 93% of the people use the Internet to find information on health subjects. In the principal component analysis, the most important health subjects were nutrition (loading 0.719), respiratory issues (0.790), cardiological issues (0.777) and psychological issues (0.667), with 73.8% of the total variance explained. The research results, based on different statistical techniques, revealed that 61.2% of the males and 56.4% of the females intended to use the social networks for searching medical information. Based on the principal component analysis, the most important sources that the participants mentioned were the use of the Internet and social networks for exchanging information on health issues. These sources proved to be of paramount importance to the participants of the study. The same holds for nursing, medical and administrative staff in hospitals.

  13. Characterizing pigments with hyperspectral imaging variable false-color composites

    NASA Astrophysics Data System (ADS)

    Hayem-Ghez, Anita; Ravaud, Elisabeth; Boust, Clotilde; Bastian, Gilles; Menu, Michel; Brodie-Linder, Nancy

    2015-11-01

Hyperspectral imaging has been used for pigment characterization on paintings for the last 10 years. It is a noninvasive technique that combines the power of spectrophotometry with that of imaging technologies. We have access to a visible and near-infrared hyperspectral camera, ranging from 400 to 1000 nm in 80-160 spectral bands. In order to treat the large amount of data that this imaging technique generates, one can use statistical tools such as principal component analysis (PCA). To characterize pigments, researchers mostly use PCA, convex geometry algorithms and the comparison of the resulting clusters to database spectra with a specific tolerance (like the Spectral Angle Mapper tool in the dedicated software ENVI). Our approach originates from false-color photography and aims at providing a simple tool to identify pigments through imaging spectroscopy. It can be considered a quick first analysis to see the principal pigments of a painting, before using a more complete multivariate statistical tool. We study pigment spectra for each kind of hue (blue, green, red and yellow) to identify the wavelengths that maximize spectral differences. The case of red pigments is the most interesting because our methodology can discriminate the red pigments very well, even red lakes, which are always difficult to identify. For the yellow and blue categories, the method represents a clear improvement over IRFC photography for pigment discrimination. We apply our methodology to study the pigments on a painting by Eustache Le Sueur, a French painter of the seventeenth century. We compare the results to other noninvasive analyses such as X-ray fluorescence and optical microscopy. Finally, we draw conclusions about the advantages and limits of the variable false-color image method using hyperspectral imaging.

  14. Using statistical text classification to identify health information technology incidents

    PubMed Central

    Chai, Kevin E K; Anthony, Stephen; Coiera, Enrico; Magrabi, Farah

    2013-01-01

Objective: To examine the feasibility of using statistical text classification to automatically identify health information technology (HIT) incidents in the USA Food and Drug Administration (FDA) Manufacturer and User Facility Device Experience (MAUDE) database. Design: We used a subset of 570 272 incidents including 1534 HIT incidents reported to MAUDE between 1 January 2008 and 1 July 2010. Text classifiers using regularized logistic regression were evaluated with both ‘balanced’ (50% HIT) and ‘stratified’ (0.297% HIT) datasets for training, validation, and testing. Dataset preparation, feature extraction, feature selection, cross-validation, classification, performance evaluation, and error analysis were performed iteratively to further improve the classifiers. Feature-selection techniques such as removing short words and stop words, stemming, lemmatization, and principal component analysis were examined. Measurements: κ statistic, F1 score, precision and recall. Results: Classification performance was similar on both the stratified (0.954 F1 score) and balanced (0.995 F1 score) datasets. Stemming was the most effective technique, reducing the feature set size to 79% while maintaining comparable performance. Training with balanced datasets improved recall (0.989) but reduced precision (0.165). Conclusions: Statistical text classification appears to be a feasible method for identifying HIT reports within large databases of incidents. Automated identification should enable more HIT problems to be detected, analyzed, and addressed in a timely manner. Semi-supervised learning may be necessary when applying machine learning to big data analysis of patient safety incidents and requires further investigation. PMID:23666777
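
    A minimal sketch of a regularized logistic regression text classifier of this kind; the feature choice (TF-IDF over word unigrams) and all example reports are illustrative assumptions, not the paper's exact pipeline:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical incident reports; 1 = HIT-related, 0 = other.
reports = [
    "interface between lab system and EHR dropped results",
    "infusion pump battery failed during transport",
    "order entry software displayed wrong patient record",
    "catheter kinked after insertion",
]
labels = [1, 0, 1, 0]

# Regularized logistic regression over word features; the C parameter
# controls the strength of the regularization penalty.
clf = make_pipeline(TfidfVectorizer(stop_words="english"),
                    LogisticRegression(C=1.0))
clf.fit(reports, labels)
print(clf.predict(["EHR software froze during order entry"]))
```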

  15. A study of the professional development needs of Shiraz high schools' principals in the area of educational leadership.

    PubMed

    Hayat, Aliasghar; Abdollahi, Bijan; Zainabadi, Hasan Reza; Arasteh, Hamid Reza

    2015-07-01

    The increased emphasis on standards-based school accountability since the passage of the no child left behind act of 2001 is focusing critical attention on the professional development of school principals and their ability to meet the challenges of improving the student outcomes. Due to this subject, the current study examined professional development needs of Shiraz high schools principals. The statistical population consisted of 343 principals of Shiraz high schools, of whom 250 subjects were selected using Krejcie and Morgan (1978) sample size determination table. To collect the data, a questionnaire developed by Salazar (2007) was administered. This questionnaire was designed for professional development in the leadership skills/competencies and consisted of 25 items in each leadership performance domain using five-point Likert-type scales. The content validity of the questionnaire was confirmed and the Cronbach's Alpha coefficient was (a=0.78). To analyze the data, descriptive statistics and Paired-Samples t-test were used. Also, the data was analyzed through SPSS14 software. The findings showed that principals' "Importance" ratings were always higher than their "Actual proficiency" ratings. The mean score of the difference between "Importance" and "Actual proficiency" pair on "Organizing resources" was 2.11, making it the highest "need" area. The lowest need area was "Managing the organization and operational procedures" at 0.81. Also, the results showed that there was a statistically significant difference between the means of the "Importance" and the corresponding means on the "Actual proficiency" (Difference of means=1.48, t=49.38, p<0.001). Based on the obtained results, the most important professional development needs of the principals included organizing resources, resolving complex problems, understanding student development and learning, developing the vision and the mission, building team commitment, understanding measurements, evaluation and assessment strategies, facilitating the change process, solving problems and making decisions. In other words, the principals had statistically significant professional development needs in all areas of the educational leadership. Also, the results suggested that today's school principals need to grow and learn throughout their careers by ongoing professional development.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre

Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference from moment-based statistics (which we discussed in [1]), where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
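
    The moment-based update the authors contrast against can be written compactly; a sketch of the standard pairwise merge formula for count, mean and second moment, which lets per-processor partial results be combined without revisiting the raw data:

```python
from dataclasses import dataclass

@dataclass
class Moments:
    n: int = 0
    mean: float = 0.0
    m2: float = 0.0     # sum of squared deviations from the mean

def merge(a: Moments, b: Moments) -> Moments:
    """Pairwise update formula: combine two partial results
    (map-reduce friendly; no raw data needed)."""
    n = a.n + b.n
    delta = b.mean - a.mean
    mean = a.mean + delta * b.n / n
    m2 = a.m2 + b.m2 + delta * delta * a.n * b.n / n
    return Moments(n, mean, m2)

def accumulate(data) -> Moments:
    m = Moments()
    for x in data:       # per-processor sequential pass
        m = merge(m, Moments(1, float(x), 0.0))
    return m

left, right = accumulate([1, 2, 3]), accumulate([4, 5, 6, 7])
total = merge(left, right)
print(total.mean, total.m2 / (total.n - 1))   # mean and sample variance
```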

  17. 15 CFR 758.3 - Responsibilities of parties to the transaction.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... principal party in interest the exporter for EAR purposes. One writing may cover multiple transactions between the same principals. See § 748.4(a)(3) of the EAR. Note to paragraph (b): For statistical purposes.... principal party in interest. For purposes of licensing responsibility under the EAR, the U.S. agent of the...

  18. 15 CFR 758.3 - Responsibilities of parties to the transaction.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... principal party in interest the exporter for EAR purposes. One writing may cover multiple transactions between the same principals. See § 748.4(a)(3) of the EAR. Note to paragraph (b): For statistical purposes.... principal party in interest. For purposes of licensing responsibility under the EAR, the U.S. agent of the...

  19. 15 CFR 758.3 - Responsibilities of parties to the transaction.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... principal party in interest the exporter for EAR purposes. One writing may cover multiple transactions between the same principals. See § 748.4(a)(3) of the EAR. Note to paragraph (b): For statistical purposes.... principal party in interest. For purposes of licensing responsibility under the EAR, the U.S. agent of the...

  20. 15 CFR 758.3 - Responsibilities of parties to the transaction.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... principal party in interest the exporter for EAR purposes. One writing may cover multiple transactions between the same principals. See § 748.4(a)(3) of the EAR. Note to paragraph (b): For statistical purposes.... principal party in interest. For purposes of licensing responsibility under the EAR, the U.S. agent of the...

  1. 15 CFR 758.3 - Responsibilities of parties to the transaction.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... principal party in interest the exporter for EAR purposes. One writing may cover multiple transactions between the same principals. See § 748.4(a)(3) of the EAR. Note to paragraph (b): For statistical purposes.... principal party in interest. For purposes of licensing responsibility under the EAR, the U.S. agent of the...

  2. Comparing and combining biomarkers as principle surrogates for time-to-event clinical endpoints.

    PubMed

    Gabriel, Erin E; Sachs, Michael C; Gilbert, Peter B

    2015-02-10

    Principal surrogate endpoints are useful as targets for phase I and II trials. In many recent trials, multiple post-randomization biomarkers are measured. However, few statistical methods exist for comparison of or combination of biomarkers as principal surrogates, and none of these methods to our knowledge utilize time-to-event clinical endpoint information. We propose a Weibull model extension of the semi-parametric estimated maximum likelihood method that allows for the inclusion of multiple biomarkers in the same risk model as multivariate candidate principal surrogates. We propose several methods for comparing candidate principal surrogates and evaluating multivariate principal surrogates. These include the time-dependent and surrogate-dependent true and false positive fraction, the time-dependent and the integrated standardized total gain, and the cumulative distribution function of the risk difference. We illustrate the operating characteristics of our proposed methods in simulations and outline how these statistics can be used to evaluate and compare candidate principal surrogates. We use these methods to investigate candidate surrogates in the Diabetes Control and Complications Trial. Copyright © 2014 John Wiley & Sons, Ltd.

  3. Study of T-wave morphology parameters based on Principal Components Analysis during acute myocardial ischemia

    NASA Astrophysics Data System (ADS)

    Baglivo, Fabricio Hugo; Arini, Pedro David

    2011-12-01

Electrocardiographic repolarization abnormalities can be detected by principal components analysis of the T-wave. In this work we studied the effect of signal averaging on the mean value and reproducibility of the ratio of the 2nd to the 1st eigenvalue of the T-wave (T21W) and the absolute and relative T-wave residuum (TrelWR and TabsWR) in the ECG during ischemia induced by percutaneous coronary intervention. Also, the intra-subject and inter-subject variability of T-wave parameters was analyzed. Results showed that TrelWR and TabsWR evaluated from the average of 10 complexes had lower values and higher reproducibility than those obtained from a single complex. On the other hand, T21W calculated from 10 complexes did not show statistical differences versus T21W calculated on single beats. The results of this study corroborate that, with a signal averaging technique, the 1st and 2nd eigenvalues are not affected by noise, while the 4th to 8th eigenvalues are strongly affected, suggesting the use of signal averaging before calculation of the absolute and relative T-wave residuum. Finally, we have shown that T-wave morphology parameters present high intra-subject stability.
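
    A minimal sketch of the two kinds of parameter, assuming the usual definitions over the inter-lead covariance of a signal-averaged T-wave (8 independent leads; the residuum as the nondipolar share of variance); all data here are synthetic:

```python
import numpy as np

# Hypothetical signal-averaged T-wave: 8 independent ECG leads x 100 samples.
rng = np.random.default_rng(13)
twave = rng.normal(size=(8, 100))

eigvals = np.sort(np.linalg.eigvalsh(np.cov(twave)))[::-1]
t21w = eigvals[1] / eigvals[0]               # 2nd-to-1st eigenvalue ratio
trel_wr = eigvals[3:].sum() / eigvals.sum()  # relative T-wave residuum
print(f"T21W = {t21w:.3f}, relative T-wave residuum = {trel_wr:.4f}")
```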

  4. Mitotic instability in triploid and tetraploid one-year-old eastern oyster, Crassostrea virginica, assessed by cytogenetic and flow cytometry techniques.

    PubMed

    de Sousa, Joana Teixeira; Allen, Standish K; Wolfe, Brittany M; Small, Jessica Moss

    2018-02-01

For commercial oyster aquaculture, triploidy has significant advantages. To produce triploids, the principal technology uses diploid × tetraploid crosses. The development of tetraploid brood stock for this purpose has been successful, but as more is understood about tetraploids, it seems clear that chromosome instability is a principal feature in oysters. This paper is a continuation of work to investigate chromosome instability in polyploid Crassostrea virginica. We established families between tetraploids, both apparently stable (non-mosaic) and unstable (mosaic), and normal reference diploids, creating triploid groups, as well as tetraploid groups from crosses between mosaic and non-mosaic tetraploids. Chromosome loss was about the same for triploid juveniles produced from either mosaic or non-mosaic tetraploids, or from either male or female tetraploids. However, there was a statistically significant difference in chromosome loss in tetraploid juveniles produced from mosaic versus non-mosaic parents, with mosaics producing more unstable progeny. These results confirm that chromosome instability, as manifested in mosaic tetraploids, is of little concern for producing triploids, but it is clearly problematic for tetraploid breeding. Concordance between the results from cytogenetics and flow cytometry was also tested for the first time in oysters, by assessing the ploidy of individuals using both techniques. Results between the two were non-concordant.

  5. Door Security using Face Detection and Raspberry Pi

    NASA Astrophysics Data System (ADS)

    Bhutra, Venkatesh; Kumar, Harshav; Jangid, Santosh; Solanki, L.

    2018-03-01

With the world moving towards advanced technologies, security forms a crucial part of daily life. Among the many techniques used for this purpose, face recognition stands as an effective means of authentication and security. This paper deals with the use of principal component analysis for security. PCA is a statistical approach used to simplify a data set. The minimum Euclidean distance found via the PCA technique is used to recognize the face. A Raspberry Pi, a low-cost ARM-based computer on a small circuit board, controls the servo motor and other sensors. The servo motor is in turn attached to the door of the home and opens it when the face is recognized. The proposed work has been done using a self-made training database of students from B.K. Birla Institute of Engineering and Technology, Pilani, Rajasthan, India.
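
    A minimal sketch of the PCA-plus-minimum-Euclidean-distance recognition step (the eigenfaces idea); the database, image size and component count are hypothetical stand-ins for the paper's setup:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical training set: 40 face images flattened to 64x64 = 4096 pixels.
rng = np.random.default_rng(7)
faces = rng.normal(size=(40, 4096))
person_ids = np.repeat(np.arange(10), 4)     # 10 people, 4 images each

pca = PCA(n_components=20).fit(faces)
train_proj = pca.transform(faces)

def recognize(image):
    """Project the probe image and return the identity of the training
    face at minimum Euclidean distance in eigenface space."""
    probe = pca.transform(image.reshape(1, -1))
    dists = np.linalg.norm(train_proj - probe, axis=1)
    return person_ids[int(np.argmin(dists))]

print("recognized person:", recognize(faces[13]))   # prints 3 (its own label)
```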

  6. On the brain structure heterogeneity of autism: Parsing out acquisition site effects with significance-weighted principal component analysis.

    PubMed

    Martinez-Murcia, Francisco Jesús; Lai, Meng-Chuan; Górriz, Juan Manuel; Ramírez, Javier; Young, Adam M H; Deoni, Sean C L; Ecker, Christine; Lombardo, Michael V; Baron-Cohen, Simon; Murphy, Declan G M; Bullmore, Edward T; Suckling, John

    2017-03-01

Neuroimaging studies have reported structural and physiological differences that could help understand the causes and development of Autism Spectrum Disorder (ASD). Many of them rely on multisite designs, with the recruitment of larger samples increasing statistical power. However, recent large-scale studies have put some findings into question, considering the results to be strongly dependent on the database used, and demonstrating the substantial heterogeneity within this clinically defined category. One major source of variance may be the acquisition of the data in multiple centres. In this work we analysed the differences found in the multisite, multi-modal neuroimaging database from the UK Medical Research Council Autism Imaging Multicentre Study (MRC AIMS) in terms of both diagnosis and acquisition sites. Since the dissimilarities between sites were higher than between diagnostic groups, we developed a technique called Significance Weighted Principal Component Analysis (SWPCA) to reduce the undesired intensity variance due to acquisition site and to increase the statistical power in detecting group differences. After eliminating site-related variance, statistically significant group differences were found, including Broca's area and the temporo-parietal junction. However, discriminative power was not sufficient to classify diagnostic groups, yielding accuracies close to random. Our work supports recent claims that ASD is a highly heterogeneous condition that is difficult to characterize globally by neuroimaging, and therefore different (and more homogeneous) subgroups should be defined to obtain a deeper understanding of ASD. Hum Brain Mapp 38:1208-1223, 2017. © 2016 Wiley Periodicals, Inc.

  7. Principal component regression analysis with SPSS.

    PubMed

    Liu, R X; Kuang, J; Gong, Q; Hou, X L

    2003-06-01

The paper introduces the indices used for multicollinearity diagnosis, the basic principle of principal component regression, and a method for determining the 'best' equation. The paper uses an example to describe how to do principal component regression analysis with SPSS 10.0, including all calculation steps of principal component regression and the operations of the linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity. A simplified, faster and accurate statistical analysis is achieved through principal component regression with SPSS.
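
    The paper works in SPSS; a rough Python equivalent of principal component regression, on synthetic collinear data, illustrates the idea:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Hypothetical collinear predictors: the second column nearly copies the first.
rng = np.random.default_rng(8)
x1 = rng.normal(size=100)
X = np.column_stack([x1, x1 + rng.normal(0, 0.01, 100), rng.normal(size=100)])
y = 3 * x1 + X[:, 2] + rng.normal(0, 0.1, 100)

# Regressing on a few principal components sidesteps the unstable
# coefficient estimates that multicollinearity causes in ordinary OLS.
pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(X, y)
print("R^2 of the principal component regression:", pcr.score(X, y))
```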

  8. Rapid analysis of pharmaceutical drugs using LIBS coupled with multivariate analysis.

    PubMed

    Tiwari, P K; Awasthi, S; Kumar, R; Anand, R K; Rai, P K; Rai, A K

    2018-02-01

Type 2 diabetes drug tablets of various brands containing voglibose at dose strengths of 0.2 and 0.3 mg have been examined using the laser-induced breakdown spectroscopy (LIBS) technique. Statistical methods such as principal component analysis (PCA) and partial least squares regression (PLSR) have been employed on the LIBS spectral data for classifying the drug samples and developing calibration models. We have developed a ratio-based calibration model applying PLSR, in which the relative spectral intensity ratios H/C, H/N and O/N are used. Further, the developed model has been employed to predict the relative concentrations of elements in unknown drug samples. The experiment was performed in air and in an argon atmosphere, and the results were compared. The present model provides a rapid spectroscopic method for drug analysis with high statistical significance for online control and measurement processes in a wide variety of pharmaceutical industrial applications.
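
    A minimal sketch of a PLSR calibration of this kind, assuming hypothetical intensity-ratio predictors and known dose strengths; none of these numbers come from the paper:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical training data: spectral intensity ratios (H/C, H/N, O/N)
# for tablets of known active-ingredient strength (mg).
rng = np.random.default_rng(9)
ratios = rng.uniform(0.5, 2.0, size=(30, 3))
strength = 0.1 * ratios @ np.array([1.0, 0.6, 0.3]) + rng.normal(0, 0.005, 30)

pls = PLSRegression(n_components=2).fit(ratios, strength)
unknown = np.array([[1.2, 0.9, 1.5]])   # ratios measured on an unknown tablet
print("predicted strength (mg):", pls.predict(unknown).ravel())
```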

  9. Kinetics of bacterial fluorescence staining with 3,3'-diethylthiacyanine.

    PubMed

    Thomas, Marlon S; Nuñez, Vicente; Upadhyayula, Srigokul; Zielins, Elizabeth R; Bao, Duoduo; Vasquez, Jacob M; Bahmani, Baharak; Vullev, Valentine I

    2010-06-15

For more than a century, colorimetric and fluorescence staining have been the foundation of a broad range of key bioanalytical techniques. The dynamics of such staining processes, however, still remains largely unexplored. We investigated the kinetics of fluorescence staining of two gram-negative and two gram-positive species with 3,3'-diethylthiacyanine (THIA) iodide. An increase in the THIA fluorescence quantum yield, induced by the bacterial dye uptake, was the principal reason for the observed emission enhancement. The fluorescence quantum yield of THIA depended on the media viscosity and not on the media polarity, which suggested that the microenvironment of the dye molecules taken up by the cells was restrictive. The kinetics of fluorescence staining did not manifest a statistically significant dependence on either the dye concentration or the cell count. In the presence of surfactant additives, however, the fluorescence-enhancement kinetic patterns manifested species specificity with statistically significant discernibility.

  10. The Role of School Principals in the Governorate of Ma'an in Promoting Intellectual Security among Students

    ERIC Educational Resources Information Center

    Waswas, Dima; Gasaymeh, Al-Mothana M.

    2017-01-01

    This study aims at identifying the role played by school principals in the Governorate of Ma'an to strengthen intellectual security of the school students; and identifying whether there are statistically significant differences in the roles of principals attributed to the variables: gender, academic level, and years of experience in…

  11. Assessing heavy metal toxicity in sediments of Chennai Coast of Tamil Nadu using Energy Dispersive X-Ray Fluorescence Spectroscopy (EDXRF) with statistical approach.

    PubMed

    Tholkappian, M; Ravisankar, R; Chandrasekaran, A; Jebakumar, J Prince Prakash; Kanagasabapathy, K V; Prasad, M V R; Satapathy, K K

    2018-01-01

The concentrations of some heavy metals (Al, Ca, K, Fe, Ti, Mg, Mn, V, Cr, Zn, Ni and Co) in sediments from Pulicat Lake to Vadanemmeli along the Chennai Coast, Tamil Nadu, have been determined using the EDXRF technique. The mean concentrations of Mg, Al, K, Ca, Ti, Fe, V, Cr, Mn, Co, Ni, and Zn were found to be 1918, 25436, 9832, 9859, 2109, 8209, 41.58, 34.14, 160.80, 2.85, 18.79 and 29.12 mg kg-1, respectively. These mean concentrations do not exceed the world crustal average. The level of pollution attributed to heavy metals was evaluated using several pollution indicators in order to determine anthropogenically derived contamination. The Enrichment Factor (EF), Geoaccumulation Index (Igeo), Contamination Factor (CF) and Pollution Load Index (PLI) were used in evaluating the contamination status of the sediments. Enrichment Factors reveal anthropogenic sources of V, Cr, Ni and Zn. Geoaccumulation Index results reveal that the study area is not contaminated by the heavy metals; similar results were also obtained using the Pollution Load Index. The results of the pollution indices indicate that most of the locations were not polluted by heavy metals. Multivariate statistical analyses using principal component and clustering techniques were performed to identify the sources of the heavy metals; the results indicate that heavy metals in the sediments are mainly of natural origin. This study provides a relatively novel technique for identifying and mapping the distribution of metal pollutants and their sources in sediment.
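
    For reference, these indicators are commonly defined as follows, where C_n is the measured concentration of a metal, B_n its geochemical background value, C_ref a conservative reference element (often Al or Fe), and CF_i the contamination factor of the i-th metal:

```latex
\begin{aligned}
EF &= \frac{(C_n/C_{\mathrm{ref}})_{\text{sample}}}{(C_n/C_{\mathrm{ref}})_{\text{background}}}, &
I_{\mathrm{geo}} &= \log_2\!\left(\frac{C_n}{1.5\,B_n}\right), \\
CF &= \frac{C_n}{B_n}, &
PLI &= \Bigl(\prod_{i=1}^{n} CF_i\Bigr)^{1/n}.
\end{aligned}
```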

  12. Forensic Discrimination of Latent Fingerprints Using Laser-Induced Breakdown Spectroscopy (LIBS) and Chemometric Approaches.

    PubMed

    Yang, Jun-Ho; Yoh, Jack J

    2018-01-01

A novel technique is reported for separating overlapping latent fingerprints using chemometric approaches that combine laser-induced breakdown spectroscopy (LIBS) and multivariate analysis. The LIBS technique provides the capability of real-time analysis and high-frequency scanning, as well as data on the chemical composition of overlapping latent fingerprints. These spectra offer valuable information for the classification and reconstruction of overlapping latent fingerprints when appropriate statistical multivariate analysis is applied. The current study employs principal component analysis and partial least squares methods for the classification of latent fingerprints from the LIBS spectra. This technique was successfully demonstrated through a classification study of four distinct latent fingerprints using classification methods such as soft independent modeling of class analogy (SIMCA) and partial least squares discriminant analysis (PLS-DA). The novel method yielded an accuracy of more than 85% and was proven to be sufficiently robust. Furthermore, through laser scanning analysis at a spatial interval of 125 µm, the overlapping fingerprints were reconstructed as separate two-dimensional forms.

  13. Review of Statistical Learning Methods in Integrated Omics Studies (An Integrated Information Science).

    PubMed

    Zeng, Irene Sui Lan; Lumley, Thomas

    2018-01-01

Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary, with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from a statistical perspective and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and that integrated omics is part of an integrated information science, which collates and integrates different types of information for inference and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than features, and the use of a Bayesian approach when there is prior knowledge to be integrated, are also included in the commentary. For completeness of the review, a table of currently available software and packages from 23 publications for omics is summarized in the appendix.

  14. High-spatial-resolution passive microwave sounding systems

    NASA Technical Reports Server (NTRS)

    Staelin, D. H.; Rosenkranz, P. W.

    1994-01-01

    The principal contributions of this combined theoretical and experimental effort were to advance and demonstrate new and more accurate techniques for sounding atmospheric temperature, humidity, and precipitation profiles at millimeter wavelengths, and to improve the scientific basis for such soundings. Some of these techniques are being incorporated in both research and operational systems. Specific results include: (1) development of the MIT Microwave Temperature Sounder (MTS), a 118-GHz eight-channel imaging spectrometer plus a switched-frequency spectrometer near 53 GHz, for use on the NASA ER-2 high-altitude aircraft, (2) conduct of ER-2 MTS missions in multiple seasons and locations in combination with other instruments, mapping with unprecedented approximately 2-km lateral resolution atmospheric temperature and precipitation profiles, atmospheric transmittances (at both zenith and nadir), frontal systems, and hurricanes, (3) ground based 118-GHz 3-D spectral images of wavelike structure within clouds passing overhead, (4) development and analysis of approaches to ground- and space-based 5-mm wavelength sounding of the upper stratosphere and mesosphere, which supported the planning of improvements to operational weather satellites, (5) development of improved multidimensional and adaptive retrieval methods for atmospheric temperature and humidity profiles, (6) development of combined nonlinear and statistical retrieval techniques for 183-GHz humidity profile retrievals, (7) development of nonlinear statistical retrieval techniques for precipitation cell-top altitudes, and (8) numerical analyses of the impact of remote sensing data on the accuracy of numerical weather predictions; a 68-km gridded model was used to study the spectral properties of error growth.

  15. Using Shakespeare's Sotto Voce to Determine True Identity From Text

    PubMed Central

    Kernot, David; Bossomaier, Terry; Bradbury, Roger

    2018-01-01

    Little is known of the private life of William Shakespeare, but he is famous for his collection of plays and poems, even though many of the works attributed to him were published anonymously. Determining the identity of Shakespeare has fascinated scholars for 400 years, and four significant figures in English literary history have been suggested as likely alternatives to Shakespeare for some disputed works: Bacon, de Vere, Stanley, and Marlowe. A myriad of computational and statistical tools and techniques have been used to determine the true authorship of his works. Many of these techniques rely on basic statistical correlations, word counts, collocated word groups, or keyword density, but no one method has been decided on. We suggest that an alternative technique that uses word semantics to draw on personality can provide an accurate profile of a person. To test this claim, we analyse the works of Shakespeare, Christopher Marlowe, and Elizabeth Cary. We use Word Accumulation Curves, Hierarchical Clustering overlays, Principal Component Analysis, and Linear Discriminant Analysis techniques in combination with RPAS, a multi-faceted text analysis approach that draws on a writer's personality, or self to identify subtle characteristics within a person's writing style. Here we find that RPAS can separate the known authored works of Shakespeare from Marlowe and Cary. Further, it separates their contested works, works suspected of being written by others. While few authorship identification techniques identify self from the way a person writes, we demonstrate that these stylistic characteristics are as applicable 400 years ago as they are today and have the potential to be used within cyberspace for law enforcement purposes. PMID:29599734

  16. Assessing Principals' Quality Assurance Strategies in Osun State Secondary Schools, Nigeria

    ERIC Educational Resources Information Center

    Fasasi, Yunus Adebunmi; Oyeniran, Saheed

    2014-01-01

    This paper examined principals' quality assurance strategies in secondary schools in Osun State, Nigeria. The study adopted a descriptive survey research design. Stratified random sampling technique was used to select 10 male and 10 female principals, and 190 male and190 female teachers. "Secondary School Principal Quality Assurance…

  17. Use of multispectral Ikonos imagery for discriminating between conventional and conservation agricultural tillage practices

    USGS Publications Warehouse

    Vina, Andres; Peters, Albert J.; Ji, Lei

    2003-01-01

    There is a global concern about the increase in atmospheric concentrations of greenhouse gases. One method being discussed to encourage greenhouse gas mitigation efforts is based on a trading system whereby carbon emitters can buy effective mitigation efforts from farmers implementing conservation tillage practices. These practices sequester carbon from the atmosphere, and such a trading system would require a low-cost and accurate method of verification. Remote sensing technology can offer such a verification technique. This paper is focused on the use of standard image processing procedures applied to a multispectral Ikonos image, to determine whether it is possible to validate that farmers have complied with agreements to implement conservation tillage practices. A principal component analysis (PCA) was performed in order to isolate image variance in cropped fields. Analyses of variance (ANOVA) statistical procedures were used to evaluate the capability of each Ikonos band and each principal component to discriminate between conventional and conservation tillage practices. A logistic regression model was implemented on the principal component most effective in discriminating between conventional and conservation tillage, in order to produce a map of the probability of conventional tillage. The Ikonos imagery, in combination with ground-reference information, proved to be a useful tool for verification of conservation tillage practices.
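
    A minimal sketch of the final modeling step, regressing tillage class on a principal component of the image bands; the band values, labels and weights here are synthetic:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

# Hypothetical 4-band pixel samples from fields with known tillage labels
# (1 = conventional, 0 = conservation).
rng = np.random.default_rng(12)
pixels = rng.normal(size=(300, 4))
labels = (pixels @ np.array([0.8, -0.2, 0.4, 0.1])
          + rng.normal(0, 0.3, 300) > 0).astype(int)

# Use a principal component as the single predictor, then map the
# per-pixel probability of conventional tillage.
pc = PCA(n_components=1).fit_transform(pixels)
model = LogisticRegression().fit(pc, labels)
prob_conventional = model.predict_proba(pc)[:, 1]
print("mean P(conventional tillage):", prob_conventional.mean().round(2))
```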

  18. Multivariate analysis of chromatographic retention data as a supplementary means for grouping structurally related compounds.

    PubMed

    Fasoula, S; Zisi, Ch; Sampsonidis, I; Virgiliou, Ch; Theodoridis, G; Gika, H; Nikitas, P; Pappa-Louisi, A

    2015-03-27

In the present study a series of 45 metabolite standards belonging to four chemically similar metabolite classes (sugars, amino acids, nucleosides and nucleobases, and amines) was subjected to LC analysis on three HILIC columns under 21 different gradient conditions, with the aim of exploring whether the retention properties of these analytes are determined by the chemical group they belong to. Two multivariate techniques, principal component analysis (PCA) and discriminant analysis (DA), were used for statistical evaluation of the chromatographic data and for extracting similarities between chemically related compounds. The total variance explained by the first two principal components of PCA was found to be about 98%, whereas both statistical analyses indicated that all analytes are successfully grouped in four clusters of chemical structure based on the retention obtained in four, or at least three, chromatographic runs, which, however, should be performed on two different HILIC columns. Moreover, leave-one-out cross-validation of the above retention data set showed that the chemical group to which an analyte belongs can be correctly predicted 95.6% of the time when the analyte is subjected to LC analysis under the same four or three experimental conditions under which the whole set of analytes was run beforehand. That, in turn, may assist with disambiguation of analyte identification in complex biological extracts. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. The Perceptions of Principals and Teachers Regarding Mental Health Providers' Impact on Student Achievement in High Poverty Schools

    ERIC Educational Resources Information Center

    Perry, Teresa

    2012-01-01

    This study examined the perceptions of principals and teachers regarding mental health provider's impact on student achievement and behavior in high poverty schools using descriptive statistics, t-test, and two-way ANOVA. Respondents in this study shared similar views concerning principal and teacher satisfaction and levels of support for the…

  20. Statistical analysis of major ion and trace element geochemistry of water, 1986-2006, at seven wells transecting the freshwater/saline-water interface of the Edwards Aquifer, San Antonio, Texas

    USGS Publications Warehouse

    Mahler, Barbara J.

    2008-01-01

    The statistical analyses taken together indicate that the geochemistry at the freshwater-zone wells is more variable than that at the transition-zone wells. The geochemical variability at the freshwater-zone wells might result from dilution of ground water by meteoric water. This is indicated by relatively constant major ion molar ratios; a preponderance of positive correlations between SC, major ions, and trace elements; and a principal components analysis in which the major ions are strongly loaded on the first principal component. Much of the variability at three of the four transition-zone wells might result from the use of different laboratory analytical methods or reporting procedures during the period of sampling. This is reflected by a lack of correlation between SC and major ion concentrations at the transition-zone wells and by a principal components analysis in which the variability is fairly evenly distributed across several principal components. The statistical analyses further indicate that, although the transition-zone wells are less well connected to surficial hydrologic conditions than the freshwater-zone wells, there is some connection but the response time is longer. 

  1. Principal curvatures and area ratio of propagating surfaces in isotropic turbulence

    NASA Astrophysics Data System (ADS)

    Zheng, Tianhang; You, Jiaping; Yang, Yue

    2017-10-01

    We study the statistics of principal curvatures and the surface area ratio of propagating surfaces with a constant or nonconstant propagating velocity in isotropic turbulence using direct numerical simulation. Propagating surface elements initially constitute a plane to model a planar premixed flame front. When the statistics of evolving propagating surfaces reach the stationary stage, the statistical profiles of principal curvatures scaled by the Kolmogorov length scale versus the constant displacement speed scaled by the Kolmogorov velocity scale collapse at different Reynolds numbers. The magnitude of averaged principal curvatures and the number of surviving surface elements without cusp formation decrease with increasing displacement speed. In addition, the effect of surface stretch on the nonconstant displacement speed inhibits the cusp formation on surface elements at negative Markstein numbers. In order to characterize the wrinkling process of the global propagating surface, we develop a model to demonstrate that the increase of the surface area ratio is primarily due to positive Lagrangian time integrations of the area-weighted averaged tangential strain-rate term and propagation-curvature term. The difference between the negative averaged mean curvature and the positive area-weighted averaged mean curvature characterizes the cellular geometry of the global propagating surface.

  2. A study of the professional development needs of Shiraz high schools’ principals in the area of educational leadership

    PubMed Central

    HAYAT, ALIASGHAR; ABDOLLAHI, BIJAN; ZAINABADI, HASAN REZA; ARASTEH, HAMID REZA

    2015-01-01

    Introduction The increased emphasis on standards-based school accountability since the passage of the no child left behind act of 2001 is focusing critical attention on the professional development of school principals and their ability to meet the challenges of improving the student outcomes. Due to this subject, the current study examined professional development needs of Shiraz high schools principals. Methods The statistical population consisted of 343 principals of Shiraz high schools, of whom 250 subjects were selected using Krejcie and Morgan (1978) sample size determination table. To collect the data, a questionnaire developed by Salazar (2007) was administered. This questionnaire was designed for professional development in the leadership skills/competencies and consisted of 25 items in each leadership performance domain using five-point Likert-type scales. The content validity of the questionnaire was confirmed and the Cronbach’s Alpha coefficient was (a=0.78). To analyze the data, descriptive statistics and Paired-Samples t-test were used. Also, the data was analyzed through SPSS14 software. Results The findings showed that principals’ “Importance” ratings were always higher than their “Actual proficiency” ratings. The mean score of the difference between “Importance” and “Actual proficiency” pair on “Organizing resources” was 2.11, making it the highest “need” area. The lowest need area was “Managing the organization and operational procedures” at 0.81. Also, the results showed that there was a statistically significant difference between the means of the “Importance” and the corresponding means on the “Actual proficiency” (Difference of means=1.48, t=49.38, p<0.001). Conclusion Based on the obtained results, the most important professional development needs of the principals included organizing resources, resolving complex problems, understanding student development and learning, developing the vision and the mission, building team commitment, understanding measurements, evaluation and assessment strategies, facilitating the change process, solving problems and making decisions. In other words, the principals had statistically significant professional development needs in all areas of the educational leadership. Also, the results suggested that today’s school principals need to grow and learn throughout their careers by ongoing professional development. PMID:26269786

  3. Progress in the Development of CdZnTe Unipolar Detectors for Different Anode Geometries and Data Corrections

    PubMed Central

    Zhang, Qiushi; Zhang, Congzhe; Lu, Yanye; Yang, Kun; Ren, Qiushi

    2013-01-01

    CdZnTe detectors have been under development for the past two decades, providing good stopping power for gamma rays, lightweight camera heads and improved energy resolution. However, the performance of this type of detector is limited primarily by incomplete charge collection resulting from charge-carrier trapping. This paper is a review of the progress in the development of CdZnTe unipolar detectors, with some data correction techniques for improving the performance of the detectors. We will first briefly review the relevant theories. Thereafter, two aspects of the techniques for overcoming the hole trapping issue are summarized, including irradiation direction configuration and pulse shape correction methods. CdZnTe detectors of different geometries are discussed in detail, covering the principle of electrode geometry design, the design and performance characteristics, some detector prototype developments and special correction techniques to improve the energy resolution. Finally, the state-of-the-art development of 3-D position sensing and Compton imaging techniques is also discussed. The spectroscopic performance of CdZnTe semiconductor detectors will be greatly improved, even approaching the statistical limit on energy resolution, with the combination of some of these techniques. PMID:23429509
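
    The incomplete-charge-collection problem described above is commonly modeled with the Hecht equation; a minimal Python sketch follows, assuming a planar detector of thickness L with the interaction depth x measured from the cathode (the parameter values are illustrative, not taken from this review):

      import numpy as np

      def hecht_cce(x, L, lam_e, lam_h):
          """Charge collection efficiency for a planar detector.

          x: interaction depth from the cathode; L: detector thickness;
          lam_e, lam_h: electron/hole mean drift lengths (mu * tau * E).
          """
          cce_e = (lam_e / L) * (1.0 - np.exp(-(L - x) / lam_e))  # electron term
          cce_h = (lam_h / L) * (1.0 - np.exp(-x / lam_h))        # hole term
          return cce_e + cce_h

      L = 0.5                                  # cm, illustrative CdZnTe thickness
      x = np.linspace(0.0, L, 6)
      # With poor hole transport, cathode-side events collect best; the
      # depth dependence is what unipolar geometries aim to flatten.
      print(hecht_cce(x, L, lam_e=2.0, lam_h=0.05))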

  4. ELICIT: An alternative imprecise weight elicitation technique for use in multi-criteria decision analysis for healthcare.

    PubMed

    Diaby, Vakaramoko; Sanogo, Vassiki; Moussa, Kouame Richard

    2016-01-01

    In this paper, the readers are introduced to ELICIT, an imprecise weight elicitation technique for multi-criteria decision analysis in healthcare. The application of ELICIT consists of two steps: the rank ordering of evaluation criteria based on decision-makers' (DMs) preferences using principal component analysis; and the estimation of criteria weights and their descriptive statistics using variable interdependent analysis and the Monte Carlo method. The application of ELICIT is illustrated with a hypothetical case study involving the elicitation of weights for five criteria used to select the best device for eye surgery. The criteria were ranked from 1 to 5, based on a strict preference relationship established by the DMs. For each criterion, the deterministic weight was estimated, as well as the standard deviation and 95% credibility interval. ELICIT is appropriate in situations where only ordinal DMs' preferences are available to elicit decision criteria weights.
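
    ELICIT's estimation step is only described at a high level above; the following Python sketch illustrates the general Monte Carlo idea of turning a strict criteria ranking into weight statistics. Uniform sampling over the rank-constrained simplex is an assumption of this sketch, not necessarily the variable interdependent analysis ELICIT actually uses:

      import numpy as np

      rng = np.random.default_rng(1)
      n_criteria, n_draws = 5, 100_000

      # Sample uniformly on the simplex, then sort each draw so that the
      # criterion ranked 1st always receives the largest weight, and so on.
      w = rng.dirichlet(np.ones(n_criteria), size=n_draws)
      w = -np.sort(-w, axis=1)             # descending: rank 1 ... rank 5

      mean, sd = w.mean(axis=0), w.std(axis=0)
      lo, hi = np.percentile(w, [2.5, 97.5], axis=0)
      for r in range(n_criteria):
          print(f"rank {r + 1}: w = {mean[r]:.3f} +/- {sd[r]:.3f} "
                f"(95% interval {lo[r]:.3f}-{hi[r]:.3f})")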

  5. Evaluation of the environmental contamination at an abandoned mining site using multivariate statistical techniques--the Rodalquilar (Southern Spain) mining district.

    PubMed

    Bagur, M G; Morales, S; López-Chicano, M

    2009-11-15

    Unsupervised and supervised pattern recognition techniques such as hierarchical cluster analysis, principal component analysis, factor analysis and linear discriminant analysis have been applied to water samples collected in the Rodalquilar mining district (Southern Spain) in order to identify different sources of environmental pollution caused by the abandoned mining industry. The effect of the mining activity on waters was monitored by determining the concentrations of eleven elements (Mn, Ba, Co, Cu, Zn, As, Cd, Sb, Hg, Au and Pb) by inductively coupled plasma mass spectrometry (ICP-MS). The Box-Cox transformation was used to bring the data set closer to normality, minimizing the effect of the non-normal distribution typical of geochemical data. The environmental impact is driven mainly by the mining activity carried out in the zone, by acid drainage and, finally, by the chemical treatments used for gold beneficiation.
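
    A minimal Python sketch of the Box-Cox step described above, using SciPy; the lognormal test data stand in for skewed geochemical concentrations:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      concentrations = rng.lognormal(mean=1.0, sigma=0.8, size=200)  # skewed, positive

      transformed, lam = stats.boxcox(concentrations)  # lambda fit by max. likelihood
      print(f"optimal lambda = {lam:.3f}")
      print(f"skewness before = {stats.skew(concentrations):.2f}, "
            f"after = {stats.skew(transformed):.2f}")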

  6. Increasing the perceptual salience of relationships in parallel coordinate plots.

    PubMed

    Harter, Jonathan M; Wu, Xunlei; Alabi, Oluwafemi S; Phadke, Madhura; Pinto, Lifford; Dougherty, Daniel; Petersen, Hannah; Bass, Steffen; Taylor, Russell M

    2012-01-01

    We present three extensions to parallel coordinates that increase the perceptual salience of relationships between axes in multivariate data sets: (1) luminance modulation maintains the ability to preattentively detect patterns in the presence of overplotting, (2) adding a one-vs.-all variable display highlights relationships between one variable and all others, and (3) adding a scatter plot within the parallel-coordinates display preattentively highlights clusters and spatial layouts without strongly interfering with the parallel-coordinates display. These techniques can be combined with one another and with existing extensions to parallel coordinates, and two of them generalize beyond cases with known-important axes. We applied these techniques to two real-world data sets (relativistic heavy-ion collision hydrodynamics and weather observations with statistical principal component analysis) as well as the popular car data set. We present relationships discovered in the data sets using these methods.
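
    A rough Python sketch of extension (1), using pandas' built-in parallel-coordinates plot with reduced per-line opacity so that dense, coherent patterns accumulate luminance while individual lines stay faint; this uses generic pandas/matplotlib facilities as a stand-in for the authors' implementation:

      import numpy as np
      import pandas as pd
      import matplotlib.pyplot as plt
      from pandas.plotting import parallel_coordinates

      rng = np.random.default_rng(3)
      n = 500
      df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
      df["x3"] = df["x1"] * 0.8 + rng.normal(scale=0.3, size=n)  # correlated axis
      df["cls"] = np.where(df["x1"] > 0, "a", "b")

      # Low alpha: overplotted regions build up brightness, so patterns
      # between adjacent axes remain detectable despite heavy overdraw.
      parallel_coordinates(df, class_column="cls", alpha=0.08,
                           color=("gray", "gray"))
      plt.show()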

  7. An Examination of Potential Racial and Gender Bias in the Principal Version of the Interactive Computer Interview System

    ERIC Educational Resources Information Center

    DiPonio, Joseph M.

    2010-01-01

    The primary objective of this study was to determine whether racial and/or gender bias is evidenced in the use of the ICIS-Principal; specifically, whether the use of the ICIS-Principal results in biased scores at a statistically significant level when rating current practicing administrators of varying gender and race. The study involved simulated…

  8. A Study of Strengths and Weaknesses of Descriptive Assessment from Principals, Teachers and Experts Points of View in Chaharmahal and Bakhteyari Primary Schools

    ERIC Educational Resources Information Center

    Sharief, Mostafa; Naderi, Mahin; Hiedari, Maryam Shoja; Roodbari, Omolbanin; Jalilvand, Mohammad Reza

    2012-01-01

    The aim of the current study is to determine the strengths and weaknesses of descriptive evaluation from the viewpoint of principals, teachers and experts of Chaharmahal and Bakhtiari province. A descriptive survey was performed. The statistical population included 208 principals, 303 teachers, and 100 executive experts of the descriptive evaluation scheme in…

  9. A Study on the Attitudes of Students, Instructors, and Educational Principals to Electronic Administration of Final-Semester Examinations in Payame Noor University in Iran

    ERIC Educational Resources Information Center

    Omidian, Faranak; Nedayeh Ali, Farzaneh

    2015-01-01

    The aim of this study was to investigate the attitudes of students, instructors, and educational principals to electronic administration of final-semester examinations at undergraduate and postgraduate levels in Payame Noor University in Khuzestan. The statistical population of this study consisted of all educational principals, instructors, of…

  10. Fractal analysis of scatter imaging signatures to distinguish breast pathologies

    NASA Astrophysics Data System (ADS)

    Eguizabal, Alma; Laughney, Ashley M.; Krishnaswamy, Venkataramanan; Wells, Wendy A.; Paulsen, Keith D.; Pogue, Brian W.; López-Higuera, José M.; Conde, Olga M.

    2013-02-01

    Fractal analysis combined with a label-free scattering technique is proposed for describing the pathological architecture of tumors. Clinicians and pathologists are conventionally trained to classify abnormal features such as structural irregularities or high indices of mitosis. The potential of fractal analysis lies in its being a morphometric measure of irregular structures, providing a measure of an object's complexity and self-similarity. As cancer is characterized by disorder and irregularity in tissues, this measure could be related to tumor growth. Fractal analysis has previously been explored for understanding the tumor vasculature network. This work addresses the feasibility of applying fractal analysis to the scattering power map (as a physical model) and to the principal components (as a statistical model) provided by a localized reflectance spectroscopic system. Disorder, irregularity and cell-size variation in tissue samples are translated into the magnitudes of the scattering power and principal components, and their fractal dimension is correlated with the pathologist's assessment of the samples. The fractal dimension is computed by applying the box-counting technique. Results show that fractal analysis of ex-vivo fresh tissue samples yields separated ranges of fractal dimension that could help a classifier combining the fractal results with other morphological features. This contrast trend would help in the discrimination of tissues in the intraoperative context and may serve as a useful adjunct to surgeons.
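
    A minimal Python sketch of the box-counting estimate of fractal dimension mentioned above, applied to a 2-D binary map standing in for a thresholded scattering-power or principal-component image; the code is illustrative, not the authors':

      import numpy as np

      def box_count_dimension(img, sizes=(2, 4, 8, 16, 32)):
          """Estimate the fractal dimension of a binary 2-D array by box counting."""
          counts = []
          for s in sizes:
              h = img.shape[0] // s * s
              w = img.shape[1] // s * s
              blocks = img[:h, :w].reshape(h // s, s, w // s, s)
              occupied = blocks.any(axis=(1, 3)).sum()   # boxes touching the set
              counts.append(occupied)
          # Slope of log N(s) versus log(1/s) gives the dimension estimate.
          slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)),
                                np.log(counts), 1)
          return slope

      rng = np.random.default_rng(4)
      img = rng.random((256, 256)) > 0.7                 # toy irregular structure
      print(f"estimated dimension = {box_count_dimension(img):.2f}")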

  11. Spectral gene set enrichment (SGSE).

    PubMed

    Frost, H Robert; Li, Zhigang; Moore, Jason H

    2015-03-03

    Gene set testing is typically performed in a supervised context to quantify the association between groups of genes and a clinical phenotype. In many cases, however, a gene set-based interpretation of genomic data is desired in the absence of a phenotype variable. Although methods exist for unsupervised gene set testing, they predominantly compute enrichment relative to clusters of the genomic variables with performance strongly dependent on the clustering algorithm and number of clusters. We propose a novel method, spectral gene set enrichment (SGSE), for unsupervised competitive testing of the association between gene sets and empirical data sources. SGSE first computes the statistical association between gene sets and principal components (PCs) using our principal component gene set enrichment (PCGSE) method. The overall statistical association between each gene set and the spectral structure of the data is then computed by combining the PC-level p-values using the weighted Z-method with weights set to the PC variance scaled by Tracy-Widom test p-values. Using simulated data, we show that the SGSE algorithm can accurately recover spectral features from noisy data. To illustrate the utility of our method on real data, we demonstrate the superior performance of the SGSE method relative to standard cluster-based techniques for testing the association between MSigDB gene sets and the variance structure of microarray gene expression data. Unsupervised gene set testing can provide important information about the biological signal held in high-dimensional genomic data sets. Because it uses the association between gene sets and sample PCs to generate a measure of unsupervised enrichment, the SGSE method is independent of cluster or network creation algorithms and, most importantly, is able to utilize the statistical significance of PC eigenvalues to ignore elements of the data most likely to represent noise.
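
    The combination step described above is the weighted Z-method (a weighted Stouffer combination); a minimal Python sketch, with arbitrary example p-values and weights standing in for the PC-level p-values and variance-based weights:

      import numpy as np
      from scipy import stats

      def weighted_z(pvals, weights):
          """Combine one-sided p-values with the weighted Z-method."""
          z = stats.norm.isf(np.asarray(pvals))          # p -> Z scores
          w = np.asarray(weights, dtype=float)
          z_comb = (w * z).sum() / np.sqrt((w ** 2).sum())
          return stats.norm.sf(z_comb)                   # combined p-value

      pc_pvals = [0.01, 0.20, 0.65]     # hypothetical gene-set vs PC associations
      pc_weights = [0.50, 0.30, 0.05]   # hypothetical variance-scaled weights
      print(f"combined p = {weighted_z(pc_pvals, pc_weights):.4f}")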

  12. Quantitative analysis of fetal facial morphology using 3D ultrasound and statistical shape modeling: a feasibility study.

    PubMed

    Dall'Asta, Andrea; Schievano, Silvia; Bruse, Jan L; Paramasivam, Gowrishankar; Kaihura, Christine Tita; Dunaway, David; Lees, Christoph C

    2017-07-01

    The antenatal detection of facial dysmorphism using 3-dimensional ultrasound may raise the suspicion of an underlying genetic condition but infrequently leads to a definitive antenatal diagnosis. Despite advances in array and noninvasive prenatal testing, not all genetic conditions can be ascertained from such testing. The aim of this study was to investigate the feasibility of quantitative assessment of fetal face features using prenatal 3-dimensional ultrasound volumes and statistical shape modeling. STUDY DESIGN: Thirteen normal and 7 abnormal stored 3-dimensional ultrasound fetal face volumes were analyzed, at a median gestation of 29+4 weeks (range 25+0 to 36+1). The 20 3-dimensional surface meshes generated were aligned and served as input for a statistical shape model, which computed the mean 3-dimensional face shape and 3-dimensional shape variations using principal component analysis. Ten shape modes explained more than 90% of the total shape variability in the population. While the first mode accounted for overall size differences, the second highlighted shape-feature changes from an overall proportionate toward a more asymmetric face shape with a wide prominent forehead and an undersized, posteriorly positioned chin. Analysis of the Mahalanobis distance in principal component analysis shape space suggested differences between normal and abnormal fetuses (median ± interquartile range distance values, 7.31 ± 5.54 for the normal group vs 13.27 ± 9.82 for the abnormal group; P = .056). This feasibility study demonstrates that objective characterization and quantification of fetal facial morphology is possible from 3-dimensional ultrasound. This technique has the potential to assist in utero diagnosis, particularly of rare conditions in which facial dysmorphology is a feature.
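
    A minimal Python sketch of the statistical-shape-model machinery described above: aligned meshes flattened to coordinate vectors, a PCA giving the mean shape and shape modes, and the Mahalanobis distance of a shape in model space. The array contents are random placeholders; real use would start from registered 3-D meshes:

      import numpy as np

      rng = np.random.default_rng(5)
      n_shapes, n_points = 20, 500
      X = rng.normal(size=(n_shapes, n_points * 3))   # each row: aligned (x,y,z) mesh

      mean_shape = X.mean(axis=0)
      Xc = X - mean_shape
      U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

      var = S ** 2 / (n_shapes - 1)                   # variance per shape mode
      k = np.searchsorted(np.cumsum(var) / var.sum(), 0.90) + 1
      print(f"{k} modes explain >90% of shape variability")

      def mahalanobis(shape, n_modes=k):
          """Distance of a flattened shape from the mean, in model space."""
          b = Vt[:n_modes] @ (shape - mean_shape)     # mode weights
          return np.sqrt(np.sum(b ** 2 / var[:n_modes]))

      print(f"d = {mahalanobis(X[0]):.2f}")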

  13. Statistical assessment of normal mitral annular geometry using automated three-dimensional echocardiographic analysis.

    PubMed

    Pouch, Alison M; Vergnat, Mathieu; McGarvey, Jeremy R; Ferrari, Giovanni; Jackson, Benjamin M; Sehgal, Chandra M; Yushkevich, Paul A; Gorman, Robert C; Gorman, Joseph H

    2014-01-01

    The basis of mitral annuloplasty ring design has progressed from qualitative surgical intuition to experimental and theoretical analysis of annular geometry with quantitative imaging techniques. In this work, we present an automated three-dimensional (3D) echocardiographic image analysis method that can be used to statistically assess variability in normal mitral annular geometry to support advancement in annuloplasty ring design. Three-dimensional patient-specific models of the mitral annulus were automatically generated from 3D echocardiographic images acquired from subjects with normal mitral valve structure and function. Geometric annular measurements including annular circumference, annular height, septolateral diameter, intercommissural width, and the annular height to intercommissural width ratio were automatically calculated. A mean 3D annular contour was computed, and principal component analysis was used to evaluate variability in normal annular shape. The following means ± standard deviations were obtained from 3D echocardiographic image analysis: annular circumference, 107.0 ± 14.6 mm; annular height, 7.6 ± 2.8 mm; septolateral diameter, 28.5 ± 3.7 mm; intercommissural width, 33.0 ± 5.3 mm; and annular height to intercommissural width ratio, 22.7% ± 6.9%. Principal component analysis indicated that shape variability was primarily related to overall annular size, with more subtle variation in the skewness and height of the anterior annular peak, independent of annular diameter. Patient-specific 3D echocardiographic-based modeling of the human mitral valve enables statistical analysis of physiologically normal mitral annular geometry. The tool can potentially lead to the development of a new generation of annuloplasty rings that restore the diseased mitral valve annulus back to a truly normal geometry.

  14. Predicting Statistical Response and Extreme Events in Uncertainty Quantification through Reduced-Order Models

    NASA Astrophysics Data System (ADS)

    Qi, D.; Majda, A.

    2017-12-01

    A low-dimensional reduced-order statistical closure model is developed for quantifying the uncertainty in statistical sensitivity and intermittency in the principal model directions with largest variability in high-dimensional turbulent systems and turbulent transport models. Imperfect model sensitivity is improved through a recent mathematical strategy for calibrating model errors in a training phase, where information theory and linear statistical response theory are combined in a systematic fashion to achieve optimal model performance. The idea behind the reduced-order method comes from a self-consistent mathematical framework for general systems with quadratic nonlinearity, where crucial high-order statistics are approximated by a systematic model calibration procedure. Model efficiency is improved through additional damping and noise corrections to replace the expensive energy-conserving nonlinear interactions. Model errors due to the imperfect nonlinear approximation are corrected by tuning the model parameters using linear response theory with an information metric in a training phase before prediction. A statistical energy principle is adopted to introduce a global scaling factor in characterizing the higher-order moments in a consistent way to improve model sensitivity. Stringent test models of barotropic and baroclinic turbulence are used to demonstrate the feasibility of the reduced-order methods. Principal statistical responses in mean and variance can be captured by the reduced-order models with accuracy and efficiency. In addition, the reduced-order models are used to capture the crucial passive tracer field that is advected by the baroclinic turbulent flow. It is demonstrated that crucial principal statistical quantities, like the tracer spectrum and fat tails in the tracer probability density functions at the most important large scales, can be captured efficiently and accurately using the reduced-order tracer model in various dynamical regimes of the flow field with distinct statistical structures.

  15. Spatial patterns of heavy metals in soil under different geological structures and land uses for assessing metal enrichments.

    PubMed

    Krami, Loghman Khoda; Amiri, Fazel; Sefiyanian, Alireza; Shariff, Abdul Rashid B Mohamed; Tabatabaie, Tayebeh; Pradhan, Biswajeet

    2013-12-01

    One hundred and thirty composite soil samples were collected from Hamedan county, Iran, to characterize the spatial distribution and trace the sources of heavy metals including As, Cd, Co, Cr, Cu, Ni, Pb, V, Zn, and Fe. Multivariate and gap statistical analyses were used; to interrelate the spatial patterns of pollution, the disjunctive kriging and geo-enrichment factor (EF(G)) techniques were applied. Heavy metals and soil properties were grouped using agglomerative hierarchical clustering and the gap statistic. Principal component analysis was used to identify the sources of the metals in the data set. Geostatistics was used for geospatial data processing. Based on the comparison between the original data and background values of the ten metals, the disjunctive kriging and EF(G) techniques were used to quantify their geospatial patterns and assess the contamination levels of the heavy metals. The spatial distribution maps combined with the statistical analysis showed that the main source of Cr, Co, Ni, Zn, Pb, and V in group A land use (agriculture, rocky, and urban) was geogenic, while the origin of As, Cd, and Cu was industrial and agricultural activities (anthropogenic sources). In group B land use (rangeland and orchards), the origin of the metals Cr, Co, Ni, Zn, and V was mainly controlled by natural factors, whereas As, Cd, Cu, and Pb had been added by organic factors. In group C land use (water), the origin of most heavy metals was natural, without anthropogenic sources. Cd and As pollution was relatively more serious across the different land uses. The EF(G) technique confirmed the anthropogenic influence on heavy metal pollution. All metals showed concentrations substantially higher than their background values, suggesting anthropogenic pollution.
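
    A minimal Python sketch of the geo-enrichment factor computation underlying the EF(G) technique, assuming the common normalization of each metal to a conservative reference element (Fe here) against local background values; the numbers are illustrative:

      def enrichment_factor(metal, fe, metal_bg, fe_bg):
          """EF = (M/Fe)_sample / (M/Fe)_background; EF >> 1 suggests
          anthropogenic enrichment over the natural background."""
          return (metal / fe) / (metal_bg / fe_bg)

      # Hypothetical soil sample versus background values (mg/kg)
      ef_cd = enrichment_factor(metal=1.8, fe=28000.0, metal_bg=0.3, fe_bg=35000.0)
      ef_ni = enrichment_factor(metal=40.0, fe=28000.0, metal_bg=45.0, fe_bg=35000.0)
      print(f"EF(Cd) = {ef_cd:.1f}  (enriched)")
      print(f"EF(Ni) = {ef_ni:.1f}  (near background)")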

  16. Temporal and spatial assessment of river surface water quality using multivariate statistical techniques: a study in Can Tho City, a Mekong Delta area, Vietnam.

    PubMed

    Phung, Dung; Huang, Cunrui; Rutherford, Shannon; Dwirahmadi, Febi; Chu, Cordia; Wang, Xiaoming; Nguyen, Minh; Nguyen, Nga Huy; Do, Cuong Manh; Nguyen, Trung Hieu; Dinh, Tuan Anh Diep

    2015-05-01

    The present study is an evaluation of temporal/spatial variations of surface water quality using multivariate statistical techniques, comprising cluster analysis (CA), principal component analysis (PCA), factor analysis (FA) and discriminant analysis (DA). Eleven water quality parameters were monitored at 38 different sites in Can Tho City, a Mekong Delta area of Vietnam, from 2008 to 2012. Hierarchical cluster analysis grouped the 38 sampling sites into three clusters, representing mixed urban-rural areas, agricultural areas and an industrial zone. FA/PCA resulted in three latent factors for the entire research location, three for cluster 1, four for cluster 2, and four for cluster 3, explaining 60, 60.2, 80.9, and 70% of the total variance in the respective water-quality data sets. The varifactors from FA indicated that the parameters responsible for water quality variations are related to erosion from disturbed land or inflow of effluent from sewage plants and industry, discharges from wastewater treatment plants and domestic wastewater, agricultural activities and industrial effluents, and contamination by sewage waste with faecal coliform bacteria through sewer and septic systems. Discriminant analysis (DA) revealed that nephelometric turbidity units (NTU), chemical oxygen demand (COD) and NH₃ are the discriminating parameters in space, affording 67% correct assignation in spatial analysis; pH and NO₂ are the discriminating parameters according to season, assigning approximately 60% of cases correctly. The findings suggest a possible revised sampling strategy that can reduce the number of sampling sites and the indicator parameters responsible for large variations in water quality. This study demonstrates the usefulness of multivariate statistical techniques for the evaluation of temporal/spatial variations in water quality assessment and management.
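
    A rough Python sketch of the CA-then-PCA workflow described above, applied to a hypothetical site-by-parameter matrix; scikit-learn and SciPy stand in for whatever software the study used:

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(6)
      X = rng.normal(size=(38, 11))            # 38 sites x 11 quality parameters
      Xz = StandardScaler().fit_transform(X)   # z-scores, as is usual before CA/PCA

      # Hierarchical cluster analysis on the sites (Ward linkage, 3 clusters).
      clusters = fcluster(linkage(Xz, method="ward"), t=3, criterion="maxclust")

      # PCA per cluster: how many latent factors explain the variance?
      for c in (1, 2, 3):
          pca = PCA().fit(Xz[clusters == c])
          explained = np.cumsum(pca.explained_variance_ratio_)
          print(f"cluster {c}: {np.searchsorted(explained, 0.6) + 1} "
                f"components reach 60% of variance")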

  17. Biostatistics Series Module 10: Brief Overview of Multivariate Methods.

    PubMed

    Hazra, Avijit; Gogtay, Nithya

    2017-01-01

    Multivariate analysis refers to statistical techniques that simultaneously look at three or more variables in relation to the subjects under investigation, with the aim of identifying or clarifying the relationships between them. These techniques have been broadly classified as dependence techniques, which explore the relationship between one or more dependent variables and their independent predictors, and interdependence techniques, which make no such distinction but treat all variables equally in a search for underlying relationships. Multiple linear regression models a situation where a single numerical dependent variable is to be predicted from multiple numerical independent variables. Logistic regression is used when the outcome variable is dichotomous in nature. The log-linear technique models count-type data and can be used to analyze cross-tabulations where more than two variables are included. Analysis of covariance is an extension of analysis of variance (ANOVA) in which an additional independent variable of interest, the covariate, is brought into the analysis. It examines whether a difference persists after "controlling" for the effect of a covariate that can impact the numerical dependent variable of interest. Multivariate analysis of variance (MANOVA) is a multivariate extension of ANOVA used when multiple numerical dependent variables have to be incorporated in the analysis. Interdependence techniques are more commonly applied in psychometrics, the social sciences and market research. Exploratory factor analysis and principal component analysis are related techniques that seek to extract, from a larger number of metric variables, a smaller number of composite factors or components that are linearly related to the original variables. Cluster analysis aims to identify, in a large number of cases, relatively homogeneous groups called clusters, without prior information about the groups. The calculation-intensive nature of multivariate analysis has so far precluded most researchers from using these techniques routinely. The situation is now changing with the wider availability and increasing sophistication of statistical software, and researchers should no longer shy away from exploring the applications of multivariate methods to real-life data sets.
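
    To make the dependence/interdependence distinction concrete, here is a small Python sketch with invented data: a multiple linear regression predicting one outcome from several predictors, next to a PCA that treats all variables symmetrically:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(7)
      X = rng.normal(size=(100, 3))                    # three numerical predictors
      y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.5, size=100)

      # Dependence technique: one dependent variable, several predictors.
      reg = LinearRegression().fit(X, y)
      print("regression coefficients:", np.round(reg.coef_, 2))

      # Interdependence technique: all four variables treated equally.
      pca = PCA().fit(np.column_stack([X, y]))
      print("variance explained per component:",
            np.round(pca.explained_variance_ratio_, 2))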

  18. Understanding software faults and their role in software reliability modeling

    NASA Technical Reports Server (NTRS)

    Munson, John C.

    1994-01-01

    This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability becomes better understood in the modeling process, this information begins to have important implications for the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality: many of the metrics are highly correlated. Consider the two attributes lines of code, LOC, and number of program statements, Stmts. It is quite obvious that a program with a high value of LOC will probably also have a relatively high value of Stmts. In the case of low-level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analyses, such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation; the estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the regression equation. Since most of the existing metrics have common elements and are linear combinations of these common elements, it seems reasonable to investigate the structure of the underlying common factors or components that make up the raw metrics. The technique we have chosen to explore this structure is principal components analysis. Principal components analysis is a decomposition technique that may be used to detect and analyze collinearity in software metrics. When confronted with a large number of metrics measuring a single construct, it may be desirable to represent the set by some smaller number of variables that convey all, or most, of the information in the original set. Principal components are linear transformations of a set of random variables that summarize the information contained in the variables. The transformations are chosen so that the first component accounts for the maximal amount of variation of the measures of any possible linear transform; the second component accounts for the maximal amount of residual variation; and so on. The principal components are constructed so that they represent transformed scores on dimensions that are orthogonal. Through the use of principal components analysis, it is possible to have a set of highly related software attributes mapped into a small number of uncorrelated attribute domains, which definitively solves the problem of multicollinearity in subsequent regression analysis.

There are many software metrics in the literature, but principal component analysis reveals that there are few distinct sources of variation, i.e. dimensions, in this set of metrics. It would appear perfectly reasonable to characterize the measurable attributes of a program with a simple function of a small number of orthogonal metrics, each of which represents a distinct software attribute domain.
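
    A minimal Python sketch of the collinearity situation described above: two nearly redundant raw metrics (simulated LOC and statement counts) collapse onto essentially one principal component, whose scores can then serve as orthogonal regressors:

      import numpy as np

      rng = np.random.default_rng(8)
      loc = rng.integers(50, 2000, size=120).astype(float)    # lines of code
      stmts = loc * 0.8 + rng.normal(scale=20.0, size=120)    # nearly collinear metric

      M = np.column_stack([loc, stmts])
      Mz = (M - M.mean(axis=0)) / M.std(axis=0)               # standardize

      eigvals, eigvecs = np.linalg.eigh(np.cov(Mz, rowvar=False))
      share = eigvals[::-1] / eigvals.sum()                   # descending order
      print(f"correlation(LOC, Stmts) = {np.corrcoef(loc, stmts)[0, 1]:.3f}")
      print(f"variance captured by PC1 = {share[0]:.1%}")     # ~ one real dimension

      scores = Mz @ eigvecs[:, ::-1]                          # orthogonal domains
      print(f"corr(PC1, PC2) = {np.corrcoef(scores[:, 0], scores[:, 1])[0, 1]:.2e}")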

  19. Use of multivariate statistics to identify unreliable data obtained using CASA.

    PubMed

    Martínez, Luis Becerril; Crispín, Rubén Huerta; Mendoza, Maximino Méndez; Gallegos, Oswaldo Hernández; Martínez, Andrés Aragón

    2013-06-01

    In order to identify unreliable data in a dataset of motility parameters obtained from a pilot study acquired by a veterinarian with experience in boar semen handling, but without experience in the operation of a computer assisted sperm analysis (CASA) system, a multivariate graphical and statistical analysis was performed. Sixteen boar semen samples were aliquoted, then incubated with varying concentrations of progesterone from 0 to 3.33 µg/ml and analyzed in a CASA system. After standardization of the data, Chernoff faces were drawn for each measurement, and a principal component analysis (PCA) was used to reduce the dimensionality and pre-process the data before hierarchical clustering. The first twelve individual measurements showed abnormal features when Chernoff faces were drawn. PCA revealed that principal components 1 and 2 explained 63.08% of the variance in the dataset. Values of the principal components for each individual measurement of the semen samples were mapped to identify differences among treatments or among boars. Twelve individual measurements presented low values of principal component 1. Confidence ellipses on the map of principal components showed no statistically significant effects of treatment or boar. Hierarchical clustering performed on the first two principal components produced three clusters. Cluster 1 contained evaluations of the first two samples in each treatment, each from a different boar. With the exception of one individual measurement, all other measurements in cluster 1 were the same as those observed in abnormal Chernoff faces. The unreliable data in cluster 1 are probably related to the operator's inexperience with a CASA system. These findings could be used to objectively evaluate the skill level of an operator of a CASA system. This may be particularly useful in the quality control of semen analysis using CASA systems.

  20. Assessing the Independent Contribution of Maternal Educational Expectations to Children’s Educational Attainment in Early Adulthood: A Propensity Score Matching Analysis

    PubMed Central

    Pingault, Jean Baptiste; Côté, Sylvana M.; Petitclerc, Amélie; Vitaro, Frank; Tremblay, Richard E.

    2015-01-01

    Background Parental educational expectations have been associated with children’s educational attainment in a number of long-term longitudinal studies, but whether this relationship is causal has long been debated. The aims of this prospective study were twofold: 1) test whether low maternal educational expectations contributed to failure to graduate from high school; and 2) compare the results obtained using different strategies for accounting for confounding variables (i.e. multivariate regression and propensity score matching). Methodology/Principal Findings The study sample included 1,279 participants from the Quebec Longitudinal Study of Kindergarten Children. Maternal educational expectations were assessed when the participants were aged 12 years. High school graduation – measuring educational attainment – was determined through the Quebec Ministry of Education when the participants were aged 22–23 years. Findings show that when using the most common statistical approach (i.e. multivariate regressions to adjust for a restricted set of potential confounders) the contribution of low maternal educational expectations to failure to graduate from high school was statistically significant. However, when using propensity score matching, the contribution of maternal expectations was reduced and remained statistically significant only for males. Conclusions/Significance The results of this study are consistent with the possibility that the contribution of parental expectations to educational attainment is overestimated in the available literature. This may be explained by the use of a restricted range of potential confounding variables as well as the dearth of studies using appropriate statistical techniques and study designs in order to minimize confounding. Each of these techniques and designs, including propensity score matching, has its strengths and limitations: A more comprehensive understanding of the causal role of parental expectations will stem from a convergence of findings from studies using different techniques and designs. PMID:25803867
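
    A rough Python sketch of the propensity-score-matching strategy compared above, using scikit-learn: logistic-regression propensity scores, then 1-to-1 nearest-neighbour matching on the score. The data are entirely simulated, not the Quebec cohort:

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.neighbors import NearestNeighbors

      rng = np.random.default_rng(9)
      n = 1000
      X = rng.normal(size=(n, 5))                            # confounders
      treated = rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))   # exposure depends on X

      # Step 1: propensity scores from a logistic model of exposure on confounders.
      ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

      # Step 2: match each treated unit to the nearest control on the score.
      nn = NearestNeighbors(n_neighbors=1).fit(ps[~treated].reshape(-1, 1))
      _, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
      controls = np.flatnonzero(~treated)[idx.ravel()]

      outcome = 0.5 * X[:, 0] + rng.normal(size=n)   # outcome driven by confounder only
      effect = outcome[treated].mean() - outcome[controls].mean()
      print(f"matched estimate of treatment effect = {effect:.3f}  (true effect is 0)")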

  1. Seasonal assessment and apportionment of surface water pollution using multivariate statistical methods: Sinos River, southern Brazil.

    PubMed

    Alves, Darlan Daniel; Riegel, Roberta Plangg; de Quevedo, Daniela Müller; Osório, Daniela Montanari Migliavacca; da Costa, Gustavo Marques; do Nascimento, Carlos Augusto; Telöken, Franko

    2018-06-08

    Assessment of surface water quality is an issue of high current importance, especially in polluted rivers which provide water for treatment and distribution as drinking water, as is the case of the Sinos River, southern Brazil. Multivariate statistical techniques allow a better understanding of seasonal variations in water quality, as well as the source identification and source apportionment of water pollution. In this study, the multivariate statistical techniques of cluster analysis (CA), principal component analysis (PCA), and positive matrix factorization (PMF) were used, along with the Kruskal-Wallis test and Spearman's correlation analysis, in order to interpret a water quality data set resulting from a monitoring program conducted over a period of almost two years (May 2013 to April 2015). The water samples were collected from the raw water inlet of the municipal water treatment plant (WTP) operated by the Water and Sewage Services of Novo Hamburgo (COMUSA). CA allowed the data to be grouped into three periods (autumn and summer (AUT-SUM); winter (WIN); spring (SPR)). Through the PCA, it was possible to identify that the most important parameters contributing to water quality variations are total coliforms (TCOLI) in SUM-AUT; water level (WL), water temperature (WT), and electrical conductivity (EC) in WIN; and color (COLOR) and turbidity (TURB) in SPR. PMF was applied to the complete data set and enabled the source apportionment of water pollution through three factors, which are related to anthropogenic sources such as the discharge of domestic sewage (mostly represented by Escherichia coli (ECOLI)), industrial wastewaters, and agriculture runoff. The results of this study demonstrate the value of integrated statistical techniques in the interpretation and understanding of large water quality data sets, and show that this approach can be used as an efficient methodology to optimize indicators for water quality assessment.
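
    PMF itself weights each observation by its measurement uncertainty; as a rough analogue only, the following Python sketch factorizes a concentration matrix into non-negative source contributions and profiles with scikit-learn's NMF, which omits PMF's uncertainty weighting:

      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(10)
      true_profiles = rng.random((3, 11))             # 3 sources x 11 parameters
      contributions = rng.random((100, 3))            # 100 samples x 3 sources
      X = contributions @ true_profiles + rng.random((100, 11)) * 0.01

      model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
      G = model.fit_transform(X)       # sample-by-factor contributions
      F = model.components_            # factor-by-parameter profiles
      print("reconstruction error:", round(model.reconstruction_err_, 4))
      print("factor profiles (rows sum-normalized):")
      print(np.round(F / F.sum(axis=1, keepdims=True), 2))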

  2. Crude oil price forecasting based on hybridizing wavelet multiple linear regression model, particle swarm optimization techniques, and principal component analysis.

    PubMed

    Shabri, Ani; Samsudin, Ruhaidah

    2014-01-01

    Crude oil prices play a significant role in the global economy and are a key input into option pricing formulas, portfolio allocation, and risk measurement. In this paper, a hybrid model integrating wavelets and multiple linear regression (MLR) is proposed for crude oil price forecasting. In this model, the Mallat wavelet transform is first selected to decompose an original time series into several subseries with different scales. Then, principal component analysis (PCA) is used in processing the subseries data in MLR for crude oil price forecasting. Particle swarm optimization (PSO) is used to select the optimal parameters of the MLR model. To assess the effectiveness of this model, daily crude oil prices from the West Texas Intermediate (WTI) market were used as the case study. The time series prediction performance of the WMLR model is compared with the MLR, ARIMA, and GARCH models using various statistical measures. The experimental results show that the proposed model outperforms the individual models in forecasting the crude oil price series.

  3. Physician performance assessment using a composite quality index.

    PubMed

    Liu, Kaibo; Jain, Shabnam; Shi, Jianjun

    2013-07-10

    Assessing physician performance is important for the purposes of measuring and improving quality of service and reducing healthcare delivery costs. In recent years, physician performance scorecards have been used to provide feedback on individual measures; however, one key challenge is how to develop a composite quality index that combines multiple measures for overall physician performance evaluation. A controversy arises over establishing appropriate weights to combine indicators in multiple dimensions, and it cannot be easily resolved. In this study, we proposed a generic unsupervised learning approach to develop a single composite index for physician performance assessment by using non-negative principal component analysis. We developed a new algorithm, named iterative quadratic programming, to solve the numerical issue in the non-negative principal component analysis approach. We conducted real case studies to demonstrate the performance of the proposed method. We provided interpretations from both statistical and clinical perspectives to evaluate the developed composite ranking score in practice. In addition, we implemented root cause assessment techniques to explain physician performance for improvement purposes.

  4. Crude Oil Price Forecasting Based on Hybridizing Wavelet Multiple Linear Regression Model, Particle Swarm Optimization Techniques, and Principal Component Analysis

    PubMed Central

    Shabri, Ani; Samsudin, Ruhaidah

    2014-01-01

    Crude oil prices play a significant role in the global economy and are a key input into option pricing formulas, portfolio allocation, and risk measurement. In this paper, a hybrid model integrating wavelets and multiple linear regression (MLR) is proposed for crude oil price forecasting. In this model, the Mallat wavelet transform is first selected to decompose an original time series into several subseries with different scales. Then, principal component analysis (PCA) is used in processing the subseries data in MLR for crude oil price forecasting. Particle swarm optimization (PSO) is used to select the optimal parameters of the MLR model. To assess the effectiveness of this model, daily crude oil prices from the West Texas Intermediate (WTI) market were used as the case study. The time series prediction performance of the WMLR model is compared with the MLR, ARIMA, and GARCH models using various statistical measures. The experimental results show that the proposed model outperforms the individual models in forecasting the crude oil price series. PMID:24895666

  5. Differential use of fresh water environments by wintering waterfowl of coastal Texas

    USGS Publications Warehouse

    White, D.H.; James, D.

    1978-01-01

    A comparative study of the environmental relationships among 14 species of wintering waterfowl was conducted at the Welder Wildlife Foundation, San Patricio County, near Sinton, Texas, during the fall and early winter of 1973. Measurements of 20 environmental factors (social, vegetational, physical, and chemical) were subjected to multivariate statistical methods to determine certain niche characteristics and environmental relationships of waterfowl wintering in the aquatic community.

    Each waterfowl species occupied a unique realized niche by responding to distinct combinations of environmental factors identified by principal component analysis. One percent confidence ellipses circumscribing the mean scores plotted for the first and second principal components gave an indication of relative niche width for each species. The waterfowl environments were significantly different interspecifically, and water depth at the feeding site and % emergent vegetation were most important in the separation; this was shown by subjecting the transformed data to multivariate analysis of variance with an associated step-down procedure. The species were distributed along a community cline extending from shallow water with abundant emergent vegetation to open deep water with little emergent vegetation of any kind. Four waterfowl subgroups were significantly separated along the cline, as indicated by one-way analysis of variance with Duncan's multiple range test. Clumping of the bird species toward the middle of the available habitat hyperspace was shown in a plot of the principal component scores for the random samples and individual species.

    Naturally occurring relationships among waterfowl were clarified using principal component analysis and related multivariate procedures. These techniques may prove useful in wetland management for particular groups of waterfowl based on habitat preferences.

  6. Statistics as Tools in Library Planning: On the State and Institutional Level.

    ERIC Educational Resources Information Center

    Trezza, Alphonse F.

    The principal uses of statistics in library planning may be illustrated by examples from the state of Illinois. State law specifies that the Illinois State Library compile and publish statistics on libraries. State agencies also play an important and expanding role in this effort. The state library now compiles statistics on all types of…

  7. An extended data mining method for identifying differentially expressed assay-specific signatures in functional genomic studies.

    PubMed

    Rollins, Derrick K; Teh, Ailing

    2010-12-17

    Microarray data sets provide relative expression levels for thousands of genes for a comparatively small number of different experimental conditions, called assays. Data mining techniques are used to extract specific information about genes as they relate to the assays. The multivariate statistical technique of principal component analysis (PCA) has proven useful in providing effective data mining methods. This article extends the PCA approach of Rollins et al. to the ranking of genes in microarray data sets that are expressed most differently between two biologically different groupings of assays. This method is evaluated on real and simulated data and compared to a current approach on the basis of false discovery rate (FDR) and statistical power (SP), which is the ability to correctly identify important genes. This work developed and evaluated two new test statistics based on PCA and compared them to a popular method that is not PCA based. Both test statistics were found to be effective as evaluated in three case studies: (i) exposing E. coli cells to two different ethanol levels; (ii) application of myostatin to two groups of mice; and (iii) a simulated data study derived from the properties of (ii). The proposed method (PM) effectively identified critical genes in these studies based on comparison with the current method (CM). The simulation study supports higher identification accuracy for PM over CM for both proposed test statistics when the gene variance is constant, and for one of the test statistics when the gene variance is non-constant. PM compares quite favorably to CM in terms of lower FDR and much higher SP. Thus, PM can be quite effective in producing accurate signatures from large microarray data sets for differential expression between assay groups identified in a preliminary step of the PCA procedure and is, therefore, recommended for use in these applications.

  8. Application of multivariate statistical analysis concepts for assessment of hydrogeochemistry of groundwater—a study in Suri I and II blocks of Birbhum District, West Bengal, India

    NASA Astrophysics Data System (ADS)

    Das, Shreya; Nag, S. K.

    2017-05-01

    Multivariate statistical techniques, namely cluster analysis and principal component analysis, were applied to data on the groundwater quality of Suri I and II Blocks of Birbhum District, West Bengal, India, to extract principal factors corresponding to the different sources of variation in the hydrochemistry, as well as the main controls on the hydrochemistry. For this, borewell water samples were collected in two phases, during post-monsoon (November 2012) and pre-monsoon (April 2013), from 26 sampling locations spread homogeneously over the two blocks. Excess fluoride in groundwater was reported at two locations in both the post- and pre-monsoon seasons, with a rise observed in pre-monsoon. A localized presence of excess iron was also observed during both seasons. The water was found to be mildly alkaline in post-monsoon but slightly acidic at some locations during pre-monsoon. Correlation and cluster analysis studies demonstrate that fluoride shares a moderately positive correlation with pH in post-monsoon and a very strong one with carbonate in pre-monsoon, indicating the dominance of rock-water interaction and ion exchange activity in the study area. Certain locations in the study area were reported with less than 0.6 mg/l fluoride in groundwater, raising the possibility of severe dental caries, especially in children. Low values of sulfate and phosphate in the water indicate a meager chance of contamination of groundwater due to anthropogenic factors.

  9. Improving the Principal Selection Process to Enhance the Opportunities for Women.

    ERIC Educational Resources Information Center

    Chapman, Judith

    1986-01-01

    Presents statistical profiles of Australian women principals and reviews research on school administrator selection in Australia, the United Kingdom, and the United States. To ensure equity, specific recommendations are given concerning vacancy announcements, criteria identification, consideration of evidence, and interviewing and decision-making…

  10. Teacher Contract Non-Renewal: Midwest, Rocky Mountains, and Southeast

    ERIC Educational Resources Information Center

    Nixon, Andy; Dam, Margaret; Packard, Abbot L.

    2012-01-01

    This quantitative study investigated reasons that school principals recommend non-renewal of probationary teachers' contracts. Principal survey results from three regions of the US (Midwest, Rocky Mountains, & Southeast) were analyzed using the Kruskal-Wallis and Mann-Whitney U statistical procedures, while significance was tested applying a…

  11. USBM (United States Bureau of Mines) borehole deformation gage absolute stress measurement test procedure: Final draft

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-12-01

    The technique described herein for determining the magnitudes and directions of the in situ principal stresses utilizes the stress relief in a small volume of rock when it is physically isolated from the surrounding rock mass. Measurements of deformation are related to stress magnitudes through an understanding of the constitutive behavior of the rock. The behaviors of the non-salt strata around the ESF are expected to conform approximately to that of uniform homogeneous linear-elastic materials having either isotropic or transverse isotropic properties, for which constitutive relations are developed. The constitutive behavior of the salt strata is not well understood, and so the overcoring technique yields information of only very limited use; for this reason the overcoring technique will not be used in the salt strata. The technique also has limited application in rocks containing joints spaced less than 8 in. (0.2 m) apart, unless a large number of tests can be performed to obtain a good statistical average. However, such unfavorably discontinuous rocks are not expected as a norm at the Deaf Smith County site. 7 refs., 22 figs., 4 tabs.

  12. Chemical information obtained from Auger depth profiles by means of advanced factor analysis (MLCFA)

    NASA Astrophysics Data System (ADS)

    De Volder, P.; Hoogewijs, R.; De Gryse, R.; Fiermans, L.; Vennik, J.

    1993-01-01

    The advanced multivariate statistical technique "maximum likelihood common factor analysis (MLCFA)" is shown to be superior to "principal component analysis (PCA)" for decomposing overlapping peaks into their individual component spectra when neither the number of components nor the peak shapes of the component spectra are known. An examination of the maximum resolving power of both techniques, MLCFA and PCA, by means of artificially created series of multicomponent spectra confirms this finding unambiguously. Substantial progress in the use of AES as a chemical-analysis technique has been accomplished through the implementation of MLCFA. Chemical information from Auger depth profiles is extracted by investigating the variation of the line shape of the Auger signal as a function of the changing chemical state of the element. In particular, MLCFA combined with Auger depth profiling has been applied to problems related to steelcord-rubber tyre adhesion. MLCFA allows one to elucidate the precise nature of the interfacial layer of reaction products formed when natural rubber is vulcanized on a thin brass layer. This study reveals many interesting chemical aspects of the oxi-sulfidation of brass undetectable with classical AES.

  13. Automatic Cataract Hardness Classification Ex Vivo by Ultrasound Techniques.

    PubMed

    Caixinha, Miguel; Santos, Mário; Santos, Jaime

    2016-04-01

    To demonstrate the feasibility of a new methodology for cataract hardness characterization and automatic classification using ultrasound techniques, different cataract degrees were induced in 210 porcine lenses. A 25-MHz ultrasound transducer was used to obtain acoustical parameters (velocity and attenuation) and backscattering signals. B-Scan and parametric Nakagami images were constructed. Ninety-seven parameters were extracted and subjected to a Principal Component Analysis. Bayes, K-Nearest-Neighbours, Fisher Linear Discriminant and Support Vector Machine (SVM) classifiers were used to automatically classify the different cataract severities. Statistically significant increases with cataract formation were found for velocity, attenuation, mean brightness intensity of the B-Scan images and the mean Nakagami m parameter (p < 0.01). The four classifiers showed good performance for healthy versus cataractous lenses (F-measure ≥ 92.68%), while for initial versus severe cataracts the SVM classifier showed the highest performance (90.62%). The results showed that ultrasound techniques can be used for non-invasive cataract hardness characterization and automatic classification.
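
    A rough Python sketch of the classification stage described above: the 97 extracted parameters reduced by PCA and fed to an SVM, with the F-measure estimated by cross-validation. This uses scikit-learn on simulated features and is not the authors' code:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(11)
      X = rng.normal(size=(210, 97))       # 210 lenses x 97 extracted parameters
      y = rng.integers(0, 2, size=210)     # 0 = healthy, 1 = cataractous (toy labels)
      X[y == 1, :10] += 1.0                # inject a separable signal for the demo

      clf = make_pipeline(StandardScaler(), PCA(n_components=0.95), SVC(kernel="rbf"))
      f1 = cross_val_score(clf, X, y, cv=5, scoring="f1")
      print(f"F-measure = {f1.mean():.3f} +/- {f1.std():.3f}")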

  14. An Examination of Principal Leadership Styles and Their Influence on School Performance as Measured by Adequate Yearly Progress at Selected Title I Elementary Schools in South Carolina

    ERIC Educational Resources Information Center

    Martin, Tammy Faith

    2012-01-01

    The purpose of this study was to examine principal leadership styles and their influence on school performance as measured by adequate yearly progress at selected Title I schools in South Carolina. The main focus of the research study was to complete descriptive statistics on principal leadership styles in schools that met or did not meet adequate…

  15. Adjustment of geochemical background by robust multivariate statistics

    USGS Publications Warehouse

    Zhou, D.

    1985-01-01

    Conventional analyses of exploration geochemical data assume that the background is a constant or slowly changing value, equivalent to a plane or a smoothly curved surface. However, it is better to regard the geochemical background as a rugged surface, varying with changes in geology and environment. This rugged surface can be estimated from observed geological, geochemical and environmental properties by using multivariate statistics. A method of background adjustment was developed and applied to groundwater and stream sediment reconnaissance data collected from the Hot Springs Quadrangle, South Dakota, as part of the National Uranium Resource Evaluation (NURE) program. Source-rock lithology appears to be a dominant factor controlling the chemical composition of groundwater or stream sediments. The most efficacious adjustment procedure is to regress uranium concentration on selected geochemical and environmental variables for each lithologic unit, and then to delineate anomalies by a common threshold set as a multiple of the standard deviation of the combined residuals. Robust versions of regression and RQ-mode principal components analysis were used rather than ordinary techniques to guard against distortion caused by outliers. Anomalies delineated by this background adjustment procedure correspond with uranium prospects much better than do anomalies delineated by conventional procedures. The procedure should be applicable to geochemical exploration at different scales for other metals.
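
    A minimal Python sketch of the background-adjustment idea described above: a robust regression of uranium on covariates within one lithologic unit, with anomalies flagged where residuals exceed a multiple of the residual standard deviation. scikit-learn's Huber regressor stands in for the robust procedure actually used, and all data are simulated:

      import numpy as np
      from sklearn.linear_model import HuberRegressor

      rng = np.random.default_rng(12)
      n = 300
      covariates = rng.normal(size=(n, 4))     # geochemical/environmental variables
      u = 1.0 + covariates @ np.array([0.5, -0.2, 0.1, 0.3]) \
          + rng.normal(scale=0.2, size=n)      # "background" uranium for one unit
      u[:10] += 3.0                            # a few genuine anomalies

      model = HuberRegressor().fit(covariates, u)   # robust to the outliers
      residuals = u - model.predict(covariates)
      threshold = 2.5 * residuals.std()             # multiple of residual std. dev.
      print("flagged anomalies:", np.flatnonzero(residuals > threshold))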

  16. Nonlinear multivariate and time series analysis by neural network methods

    NASA Astrophysics Data System (ADS)

    Hsieh, William W.

    2004-03-01

    Methods in multivariate statistical analysis are essential for working with large amounts of geophysical data, data from observational arrays, from satellites, or from numerical model output. In classical multivariate statistical analysis, there is a hierarchy of methods, starting with linear regression at the base, followed by principal component analysis (PCA) and finally canonical correlation analysis (CCA). A multivariate time series method, the singular spectrum analysis (SSA), has been a fruitful extension of the PCA technique. The common drawback of these classical methods is that only linear structures can be correctly extracted from the data. Since the late 1980s, neural network methods have become popular for performing nonlinear regression and classification. More recently, neural network methods have been extended to perform nonlinear PCA (NLPCA), nonlinear CCA (NLCCA), and nonlinear SSA (NLSSA). This paper presents a unified view of the NLPCA, NLCCA, and NLSSA techniques and their applications to various data sets of the atmosphere and the ocean (especially for the El Niño-Southern Oscillation and the stratospheric quasi-biennial oscillation). These data sets reveal that the linear methods are often too simplistic to describe real-world systems, with a tendency to scatter a single oscillatory phenomenon into numerous unphysical modes or higher harmonics, which can be largely alleviated in the new nonlinear paradigm.
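
    A minimal Python sketch of the NLPCA idea: an autoencoder neural network with a one-node bottleneck, trained to reproduce its input, so that the bottleneck activation plays the role of a nonlinear first principal component. scikit-learn's MLPRegressor is used here as a stand-in autoencoder on toy data; this is illustrative only, not the paper's implementation:

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(13)
      theta = rng.uniform(-np.pi, np.pi, size=400)
      X = np.column_stack([np.cos(theta), np.sin(2 * theta)])  # curved 1-D structure
      X += rng.normal(scale=0.05, size=X.shape)

      # Encoder (8 units) -> 1-node bottleneck -> decoder (8 units) -> output.
      ae = MLPRegressor(hidden_layer_sizes=(8, 1, 8), activation="tanh",
                        max_iter=5000, random_state=0).fit(X, X)

      def bottleneck(x):
          """Nonlinear PC score: activation of the 1-node middle layer."""
          h = np.tanh(x @ ae.coefs_[0] + ae.intercepts_[0])
          return np.tanh(h @ ae.coefs_[1] + ae.intercepts_[1]).ravel()

      print("reconstruction R^2:", round(ae.score(X, X), 3))
      print("first five nonlinear PC scores:", np.round(bottleneck(X[:5]), 3))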

  17. Data preparation techniques for a perinatal psychiatric study based on linked data.

    PubMed

    Xu, Fenglian; Hilder, Lisa; Austin, Marie-Paule; Sullivan, Elizabeth A

    2012-06-08

    In recent years there has been an increase in the use of population-based linked data. However, there is little literature that describes methods of linked data preparation. This paper describes the method for merging data, calculating a statistical variable (SV), recoding psychiatric diagnoses and summarizing hospital admissions for a perinatal psychiatric study. The data preparation techniques described in this paper are based on linked birth data from the New South Wales (NSW) Midwives Data Collection (MDC), the Register of Congenital Conditions (RCC), the Admitted Patient Data Collection (APDC) and the Pharmaceutical Drugs of Addiction System (PHDAS). The master dataset is the meaningfully linked data set that includes all, or the major, study data collections. The master dataset can be used to improve data quality and calculate the SV, and can be tailored for different analyses. To identify hospital admissions in the periods before pregnancy, during pregnancy and after birth, a statistical variable of time interval (SVTI) needs to be calculated. The methods and SPSS syntax for building a master dataset, calculating the SVTI, recoding the principal diagnoses of mental illness and summarizing hospital admissions are described.
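
    A small Python/pandas sketch of the SVTI idea: classifying each linked admission as before pregnancy, during pregnancy or after birth from its date interval relative to the birth record. The column names here are invented for illustration, not the NSW collections':

      import pandas as pd

      births = pd.DataFrame({"mother_id": [1, 2],
                             "birth_date": pd.to_datetime(["2010-06-15", "2011-02-01"]),
                             "gestation_weeks": [40, 38]})
      admissions = pd.DataFrame({"mother_id": [1, 1, 2],
                                 "admit_date": pd.to_datetime(
                                     ["2009-01-10", "2010-03-05", "2011-05-20"])})

      merged = admissions.merge(births, on="mother_id")        # master-dataset merge
      conception = merged["birth_date"] - pd.to_timedelta(
          merged["gestation_weeks"] * 7, unit="D")

      # The SVTI itself: admission timing relative to the birth, in days.
      merged["svti_days"] = (merged["admit_date"] - merged["birth_date"]).dt.days
      merged["period"] = "after birth"
      merged.loc[merged["admit_date"] < merged["birth_date"], "period"] = "during pregnancy"
      merged.loc[merged["admit_date"] < conception, "period"] = "before pregnancy"
      print(merged[["mother_id", "admit_date", "svti_days", "period"]])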

  18. Measurement and Modelling: Sequential Use of Analytical Techniques in a Study of Risk-Taking in Decision-Making by School Principals

    ERIC Educational Resources Information Center

    Trimmer, Karen

    2016-01-01

    This paper investigates reasoned risk-taking in decision-making by school principals using a methodology that combines sequential use of psychometric and traditional measurement techniques. Risk-taking is defined as when decisions are made that are not compliant with the regulatory framework, the primary governance mechanism for public schools in…

  19. An application of principal component analysis to the clavicle and clavicle fixation devices.

    PubMed

    Daruwalla, Zubin J; Courtis, Patrick; Fitzpatrick, Clare; Fitzpatrick, David; Mullett, Hannan

    2010-03-26

    Principal component analysis (PCA) enables the building of statistical shape models of bones and joints. This has been used in conjunction with computer assisted surgery in the past. However, PCA of the clavicle has not been performed. Using PCA, we present a novel method that examines the major modes of size and three-dimensional shape variation in male and female clavicles and suggests a method of grouping the clavicle into size and shape categories. Twenty-one high-resolution computerized tomography scans of the clavicle were reconstructed and analyzed using a specifically developed statistical software package. After performing statistical shape analysis, PCA was applied to study the factors that account for anatomical variation. The first principal component representing size accounted for 70.5 percent of anatomical variation. The addition of a further three principal components accounted for almost 87 percent. Using statistical shape analysis, clavicles in males have a greater lateral depth and are longer, wider and thicker than in females. However, the sternal angle in females is larger than in males. PCA confirmed these differences between genders but also noted that men exhibit greater variance and classified clavicles into five morphological groups. This unique approach is the first that standardizes a clavicular orientation. It provides information that is useful to both the biomedical engineer and the clinician. Other applications include implant design with regard to modifying current or designing future clavicle fixation devices. Our findings support the need for further development of clavicle fixation devices and the questioning of whether gender-specific devices are necessary.
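
    The modelling step reported here (a few modes capturing ~87% of variation across 21 scans) follows the standard shape-PCA recipe: flatten each aligned bone surface into a row of coordinates, fit PCA, and count the modes needed for a target share of variance. A minimal sketch on random stand-in data:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical input: 21 clavicles, each represented by p aligned
# surface points, flattened into a (21, 3p) coordinate matrix.
rng = np.random.default_rng(1)
shapes = rng.standard_normal((21, 3 * 200))

pca = PCA().fit(shapes)
cum = np.cumsum(pca.explained_variance_ratio_)
k = int(np.searchsorted(cum, 0.87)) + 1   # modes needed for ~87% variance
scores = pca.transform(shapes)            # per-bone mode weights
```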

  20. Spatio-Temporal Patterns of Barmah Forest Virus Disease in Queensland, Australia

    PubMed Central

    Naish, Suchithra; Hu, Wenbiao; Mengersen, Kerrie; Tong, Shilu

    2011-01-01

    Background Barmah Forest virus (BFV) disease is a common and wide-spread mosquito-borne disease in Australia. This study investigated the spatio-temporal patterns of BFV disease in Queensland, Australia using geographical information system (GIS) tools and geostatistical analysis. Methods/Principal Findings We calculated the incidence rates and standardised incidence rates of BFV disease. Moran's I statistic was used to assess the spatial autocorrelation of BFV incidences. Spatial dynamics of BFV disease were examined using semi-variogram analysis. Interpolation techniques were applied to visualise and display the spatial distribution of BFV disease in statistical local areas (SLAs) throughout Queensland. Mapping of BFV disease by SLAs reveals the presence of substantial spatio-temporal variation over time. Statistically significant differences in BFV incidence rates were identified among age groups (χ² = 7587, df = 7327, p<0.01). There was a significant positive spatial autocorrelation of BFV incidence for all four periods, with the Moran's I statistic ranging from 0.1506 to 0.2901 (p<0.01). Semi-variogram analysis and smoothed maps created from interpolation techniques indicate that the pattern of spatial autocorrelation was not homogeneous across the state. Conclusions/Significance This is the first study to examine spatial and temporal variation in the incidence rates of BFV disease across Queensland using GIS and geostatistics. BFV transmission varied with age and gender, which may be due to exposure rates or behavioural risk factors. There are differences in the spatio-temporal patterns of BFV disease which may be related to local socio-ecological and environmental factors. These research findings may have implications for BFV disease control and prevention programs in Queensland. PMID:22022430
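
    Moran's I, the spatial autocorrelation statistic used here, is straightforward to compute from a vector of area-level incidence values and a spatial weights matrix; a small generic sketch (not the authors' code):

```python
import numpy as np

def morans_i(x, W):
    """Moran's I for values x under a spatial weights matrix W
    (W[i, j] > 0 when areas i and j are neighbours)."""
    z = x - x.mean()
    n, s0 = len(x), W.sum()
    return (n / s0) * (z @ W @ z) / (z @ z)
```

    Values near +1 indicate clustering of similar incidence rates in neighbouring SLAs, values near 0 spatial randomness.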

  1. The Trauma of Adolescent Suicide: A Time for Special Leadership by Principals.

    ERIC Educational Resources Information Center

    Dempsey, Richard A.

    This monograph provides principals and school officials with information about coping with adolescent suicide. Section 1, "Introduction," discusses the uncomfortable nature of the topic, cites statistics, and recommends that preventive programs be developed. Section 2, "Causes of Suicide," analyzes stress and depression among youth and suggests…

  2. 50 CFR 300.183 - Permit holder reporting and recordkeeping requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... person required to obtain a trade permit under § 300.182 retains, at his/her principal place of business... his/her principal place of business, a copy of each biweekly report and all supporting records for a... regulated under this subpart, biweekly reports, statistical documents, catch documents, re-export...

  3. Connecting Principal Leadership, Teacher Collaboration, and Student Achievement

    ERIC Educational Resources Information Center

    Goddard, Yvonne L.; Miller, Robert; Larsen, Ross; Goddard, Roger; Madsen, Jean; Schroeder, Patricia

    2010-01-01

    The purpose of this paper was to test the relationship between principal leadership and teacher collaboration around instructional improvement to determine whether these measures were statistically related and whether, together, they were associated with academic achievement in elementary schools. Data were obtained from 1,600 teachers in 96…

  4. Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Udey, Ruth Norma

    Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.

  5. Strain Transient Detection Techniques: A Comparison of Source Parameter Inversions of Signals Isolated through Principal Component Analysis (PCA), Non-Linear PCA, and Rotated PCA

    NASA Astrophysics Data System (ADS)

    Lipovsky, B.; Funning, G. J.

    2009-12-01

    We compare several techniques for the analysis of geodetic time series with the ultimate aim of characterizing the physical processes represented therein. We compare three methods for the analysis of these data: Principal Component Analysis (PCA), Non-Linear PCA (NLPCA), and Rotated PCA (RPCA). We evaluate each method by its ability to isolate signals which may be any combination of low amplitude (near noise level), temporally transient, unaccompanied by seismic emissions, and small scale with respect to the spatial domain. PCA is a powerful tool for extracting structure from large datasets, traditionally realized either through the solution of an eigenvalue problem or through iterative methods. PCA is a transformation of the coordinate system of our data such that the new "principal" data axes retain maximal variance and minimal reconstruction error (Pearson, 1901; Hotelling, 1933). RPCA is achieved by an orthogonal transformation of the principal axes determined in PCA. In the analysis of meteorological data sets, RPCA has been seen to overcome domain shape dependencies, correct for sampling errors, and to determine principal axes which more closely represent physical processes (e.g., Richman, 1986). NLPCA generalizes PCA such that principal axes are replaced by principal curves (e.g., Hsieh 2004). We achieve NLPCA through an auto-associative feed-forward neural network (Scholz, 2005). We show the geophysical relevance of these techniques by application of each to a synthetic data set. Results are compared by inverting principal axes to determine deformation source parameters. Temporal variability in source parameters, estimated by each method, is also compared.
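
    RPCA, the orthogonal rotation of the leading principal axes described above, is most commonly implemented with the varimax criterion; the compact sketch below assumes varimax, since the abstract does not specify the rotation criterion used.

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonally rotate PCA loadings to maximise the varimax
    criterion, a common route to 'rotated' principal components."""
    p, k = loadings.shape
    R = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L * (L**2).sum(axis=0)))
        R = u @ vt                     # best orthogonal rotation update
        if s.sum() < var * (1 + tol):  # stop when the criterion stalls
            break
        var = s.sum()
    return loadings @ R
```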

  6. Randomized Controlled Trial of a Special Acupuncture Technique for Pain after Thoracotomy

    PubMed Central

    Deng, Gary; Rusch, Valerie; Vickers, Andrew; Malhortra, Vivek; Ginex, Pamela; Downey, Robert; Bains, Manjit; Park, Bernard; Rizk, Nabil; Flores, Raja; Yeung, Simon; Cassileth, Barrie

    2009-01-01

    Objective To determine whether an acupuncture technique specially developed for a surgical oncology population (intervention) reduces pain or analgesic use after thoracotomy compared to a sham acupuncture technique (control). Methods One hundred and sixty-two cancer patients undergoing thoracotomy were randomized to group A) preoperative implantation of small intradermal needles which were retained for 4 weeks or B) preoperative placement of sham needles on the same schedule. Numerical Rating Scale (NRS) of pain and total opioid use were evaluated during the in-patient stay; Brief Pain Inventory (BPI) and Medication Quantification Scale (MQS) were evaluated after discharge up to 3 months after the surgery. Results The principal analysis, a comparison of BPI pain intensity scores at the 30 day follow-up, showed no significant difference between the intervention and control group. Pain scores were marginally higher in the intervention group (difference 0.05; 95% C.I.: -0.64 to 0.74; p=0.9). There were also no statistically significant differences between groups for secondary endpoints, including chronic pain assessments at 60 and 90 days, in-patient pain, and medication use in hospital and after discharge. Conclusion A special acupuncture technique as provided in this study did not reduce pain or use of pain medication after thoracotomy more than a sham technique. PMID:19114190

  7. The Use of Multi-Component Statistical Techniques in Understanding Subduction Zone Arc Granitic Geochemical Data Sets

    NASA Astrophysics Data System (ADS)

    Pompe, L.; Clausen, B. L.; Morton, D. M.

    2015-12-01

    Multi-component statistical techniques and GIS visualization are emerging trends in understanding large data sets. Our research applies these techniques to a large igneous geochemical data set from southern California to better understand magmatic and plate tectonic processes. A set of 480 granitic samples collected by Baird from this area were analyzed for 39 geochemical elements. Of these samples, 287 are from the Peninsular Ranges Batholith (PRB) and 164 from part of the Transverse Ranges (TR). Principal component analysis (PCA) summarized the 39 variables into 3 principal components (PC) by matrix multiplication, and for the PRB these are interpreted as follows: PC1 with about 30% of the variation included mainly compatible elements and SiO2 and indicates extent of differentiation; PC2 with about 20% of the variation included HFS elements and may indicate crustal contamination as usually identified by Sri; PC3 with about 20% of the variation included mainly HRE elements and may indicate magma source depth as often displayed using REE spider diagrams and possibly Sr/Y. Several elements did not fit well in any of the three components: Cr, Ni, U, and Na2O. For the PRB, the PC1 correlation with SiO2 was r=-0.85, the PC2 correlation with Sri was r=0.80, and the PC3 correlation with Gd/Yb was r=-0.76 and with Sr/Y was r=-0.66. Extending this method to the TR, correlations were r=-0.85, -0.21, -0.06, and -0.64, respectively. A similar extent of correlation for both areas was visually evident using GIS interpolation. PC1 seems to do well at indicating differentiation index for both the PRB and TR and correlates very well with SiO2, Al2O3, MgO, FeO*, CaO, K2O, Sc, V, and Co, but poorly with Na2O and Cr. If the crustal component is represented by Sri, PC2 correlates well and less expensively with this indicator in the PRB, but not in the TR. Source depth has been related to the slope on REE spidergrams, and PC3 based on only the HREE and using the Sr/Y ratios gives a reasonable correlation for both PRB and TR, but the Gd/Yb ratio gives a reasonable correlation for only the PRB. The PRB data provide reasonable correlation between principal components and standard geochemical indicators, perhaps because of the well-recognized monotonic variation from SW to NE. Data sets from the TR give similar results in some cases, but poor correlation in others.
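
    The workflow above reduces to a short script in outline: fit PCA to the standardised element matrix, then correlate each component's sample scores with external geochemical indicators such as SiO2 or Sri. The sketch below uses hypothetical arrays and function names; it is the generic pattern, not the authors' code.

```python
import numpy as np
from sklearn.decomposition import PCA

def pc_indicator_correlations(X, indicators, n_pc=3):
    """X: (samples, elements) concentration matrix; indicators: dict
    mapping a name (e.g. 'SiO2', 'Sri') to one value per sample.
    Returns r between each PC score and each indicator."""
    X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardise elements
    scores = PCA(n_components=n_pc).fit_transform(X)  # PC scores per sample
    return {name: [np.corrcoef(scores[:, j], v)[0, 1] for j in range(n_pc)]
            for name, v in indicators.items()}
```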

  8. Sparse approximation of currents for statistics on curves and surfaces.

    PubMed

    Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas

    2008-01-01

    Computing, processing, and visualizing statistics on shapes like curves or surfaces is a real challenge with many applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids the feature-based approach as well as point-correspondence methods. This framework has been proved to be powerful for registering brain surfaces or measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registrations efficiently, new numerical schemes are required to process groupwise statistics due to increasing complexity as the size of the database grows. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We propose therefore to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.

  9. Using radar imagery for crop discrimination: a statistical and conditional probability study

    USGS Publications Warehouse

    Haralick, R.M.; Caspall, F.; Simonett, D.S.

    1970-01-01

    A number of the constraints with which remote sensing must contend in crop studies are outlined. They include sensor, identification accuracy, and congruencing constraints; the nature of the answers demanded of the sensor system; and the complex temporal variances of crops in large areas. Attention is then focused on several methods which may be used in the statistical analysis of multidimensional remote sensing data. Crop discrimination for radar K-band imagery is investigated by three methods. The first one uses a Bayes decision rule, the second a nearest-neighbor spatial conditional probability approach, and the third the standard statistical techniques of cluster analysis and principal axes representation. Results indicate that crop type and percent of cover significantly affect the strength of the radar return signal. Sugar beets, corn, and very bare ground are easily distinguishable; sorghum, alfalfa, and young wheat are harder to distinguish. Distinguishability will be improved if the imagery is examined in time sequence so that changes between times of planting, maturation, and harvest provide additional discriminant tools. A comparison between radar and photography indicates that radar performed surprisingly well in crop discrimination in western Kansas and warrants further study.
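
    The first of the three methods, a Bayes decision rule over radar returns, can be sketched with a Gaussian naive-Bayes classifier: class-conditional densities are fit per crop type and the posterior decides the label. The feature values and classes below are invented for illustration.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Hypothetical features: mean return strength in two K-band channels,
# one row per training field.
X_train = np.array([[0.82, 0.40], [0.78, 0.38],   # sugar beets
                    [0.60, 0.55], [0.62, 0.50],   # corn
                    [0.30, 0.20], [0.35, 0.18]])  # bare ground
y_train = np.array(['sugar beets', 'sugar beets', 'corn', 'corn',
                    'bare ground', 'bare ground'])

clf = GaussianNB().fit(X_train, y_train)    # Bayes rule with Gaussian
print(clf.predict([[0.58, 0.50]]))          # class-conditional densities
print(clf.predict_proba([[0.58, 0.50]]))    # posterior probabilities
```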

  10. Short Shrift to Long Lists: An Alternative Approach to the Development of Performance Standards for School Principals.

    ERIC Educational Resources Information Center

    Louden, William; Wildy, Helen

    1999-01-01

    Describes examples of standards frameworks for principals' work operant in three countries and describes an alternative approach based on interviewing 40 Australian principals. By combining qualitative case studies with probabilistic measurement techniques, the alternative approach provides contextually rich descriptions of growth in performance…

  11. HOS network-based classification of power quality events via regression algorithms

    NASA Astrophysics Data System (ADS)

    Palomares Salas, José Carlos; González de la Rosa, Juan José; Sierra Fernández, José María; Pérez, Agustín Agüera

    2015-12-01

    This work compares seven regression algorithms implemented in artificial neural networks (ANNs) supported by 14 power-quality features based on higher-order statistics. Combining time and frequency domain estimators to deal with non-stationary measurement sequences, the final goal of the system is implementation in the future smart grid to guarantee compatibility between all connected equipment. The principal results are based on spectral kurtosis measurements, which easily adapt to the impulsive nature of power quality events. These results verify that the proposed technique is capable of offering interesting results for power quality (PQ) disturbance classification. The best results are obtained using radial basis networks, generalized regression, and multilayer perceptron, mainly due to the non-linear nature of the data.
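
    Spectral kurtosis, the key feature family here, measures the impulsiveness of a signal per frequency bin. One common STFT-based estimate (one of several definitions in the literature, offered here as a generic sketch rather than the authors' estimator) is:

```python
import numpy as np
from scipy.signal import stft

def spectral_kurtosis(x, fs, nperseg=256):
    """Spectral kurtosis per frequency bin: the normalised fourth
    moment of |X(f, t)| minus 2, so a stationary Gaussian signal
    scores ~0 and impulsive disturbances produce peaks."""
    f, _, Z = stft(x, fs=fs, nperseg=nperseg)
    m2 = np.mean(np.abs(Z)**2, axis=1)
    m4 = np.mean(np.abs(Z)**4, axis=1)
    return f, m4 / m2**2 - 2
```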

  12. Digital enhancement of multispectral MSS data for maximum image visibility

    NASA Technical Reports Server (NTRS)

    Algazi, V. R.

    1973-01-01

    A systematic approach to the enhancement of images has been developed. This approach exploits two principal features involved in the observation of images: the properties of human vision and the statistics of the images being observed. The rationale of the enhancement procedure is as follows: in the observation of some features of interest in an image, the range of objective luminance-chrominance values being displayed is generally limited and does not use the whole perceptual range of vision of the observer. The purpose of the enhancement technique is to expand and distort in a systematic way the grey scale values of each of the multispectral bands making up a color composite, to enhance the average visibility of the features being observed.
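
    A simple, generic instance of expanding the grey-scale values of one band is a percentile contrast stretch; this sketch is illustrative only and far cruder than the perceptually motivated mapping developed in the paper.

```python
import numpy as np

def stretch_band(band, lo_pct=2, hi_pct=98):
    """Expand one spectral band's grey levels so the luminance range of
    the features of interest fills the full display range."""
    lo, hi = np.percentile(band, [lo_pct, hi_pct])
    out = np.clip((band - lo) / (hi - lo), 0, 1)
    return (255 * out).astype(np.uint8)
```

    Applying the stretch independently to each of the multispectral bands before compositing expands the limited objective luminance range toward the observer's full perceptual range, which is the rationale stated above.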

  13. Cloud Statistics for NASA Climate Change Studies

    NASA Technical Reports Server (NTRS)

    Wylie, Donald P.

    1999-01-01

    The Principal Investigator participated in two field experiments and developed a global data set on cirrus cloud frequency and optical depth to aid the development of numerical models of climate. Four papers were published under this grant. The accomplishments are summarized: (1) In SUCCESS (SUbsonic aircraft: Contrail & Cloud Effects Special Study) the Principal Investigator aided weather forecasters in the start of the field program. A paper also was published on the clouds studied in SUCCESS and the use of the satellite stereographic technique to distinguish cloud forms and heights of clouds. (2) In SHEBA (Surface Heat Budget in the Arctic) FIRE/ACE (Arctic Cloud Experiment) the Principal Investigator provided daily weather and cloud forecasts for four research aircraft crews, NASA's ER-2, UCAR's C-130, University of Washington's Convair 580, and the Canadian Atmospheric Environment Service's Convair 580. Approximately 105 forecasts were written. The Principal Investigator also made daily weather summaries with calculations of air trajectories for 54 flight days in the experiment. The trajectories show where the air sampled during the flights came from and will be used in future publications to discuss the origin and history of the air and clouds sampled by the aircraft. A paper discussing how well the FIRE/ACE data represent normal climatic conditions in the arctic is being prepared. (3) The Principal Investigator's web page became the source of information for weather forecasting by the scientists on the SHEBA ship. (4) Global cirrus frequency and optical depth: a continuing analysis of global cloud cover and frequency distribution is being made from the NOAA polar orbiting weather satellites. This analysis is sensitive to cirrus clouds because of the radiative channels used. During this grant three papers were published which describe cloud frequencies and their optical properties and compare the Wisconsin FM Cloud Analysis to other global cloud data such as the International Satellite Cloud Climatology Program (ISCCP) and the Stratospheric Aerosol and Gas Experiment (SAGE). A summary of eight years of HIRS data will be published in late 1998. Important findings from this study are: 1) cirrus clouds cover most of the earth, 2) they are found about 40% of the time globally, 3) in the tropics cirrus cloud frequencies are even higher, from 80-100%, 4) there is slight evidence that cirrus cloud cover is increasing in the northern hemisphere at about 0.5% per year, and 5) cirrus clouds have an average infrared transmittance of about 40% of the terrestrial radiation. (5) Global Cloud Frequency Statistics published on the Principal Investigator's web page have been used in the planning of the future CRYSTAL experiment and for refinements of a global numerical model operated at Colorado State University.

  14. Metrological approaches to organic chemical purity: primary reference materials for vitamin D metabolites.

    PubMed

    Nelson, Michael A; Bedner, Mary; Lang, Brian E; Toman, Blaza; Lippa, Katrice A

    2015-11-01

    Given the critical role of pure, organic compound primary reference standards used to characterize and certify chemical Certified Reference Materials (CRMs), it is essential that associated mass purity assessments be fit-for-purpose, represented by an appropriate uncertainty interval, and metrologically sound. The mass fraction purities (% g/g) of 25-hydroxyvitamin D (25(OH)D) reference standards used to produce and certify values for clinical vitamin D metabolite CRMs were investigated by multiple orthogonal quantitative measurement techniques. Quantitative (1)H-nuclear magnetic resonance spectroscopy (qNMR) was performed to establish traceability of these materials to the International System of Units (SI) and to directly assess the principal analyte species. The 25(OH)D standards contained volatile and water impurities, as well as structurally-related impurities that are difficult to observe by chromatographic methods or to distinguish from the principal 25(OH)D species by one-dimensional NMR. These impurities have the potential to introduce significant biases to purity investigations in which a limited number of measurands are quantified. Combining complementary information from multiple analytical methods, using both direct and indirect measurement techniques, enabled mitigation of these biases. Purities of 25(OH)D reference standards and associated uncertainties were determined using frequentist and Bayesian statistical models to combine data acquired via qNMR, liquid chromatography with UV absorbance and atmospheric pressure chemical ionization mass spectrometric detection (LC-UV, LC-APCI-MS), thermogravimetric analysis (TGA), and Karl Fischer (KF) titration.

  15. Principal Component Analysis: Resources for an Essential Application of Linear Algebra

    ERIC Educational Resources Information Center

    Pankavich, Stephen; Swanson, Rebecca

    2015-01-01

    Principal Component Analysis (PCA) is a highly useful topic within an introductory Linear Algebra course, especially since it can be used to incorporate a number of applied projects. This method represents an essential application and extension of the Spectral Theorem and is commonly used within a variety of fields, including statistics,…

  16. Applications of Nonlinear Principal Components Analysis to Behavioral Data.

    ERIC Educational Resources Information Center

    Hicks, Marilyn Maginley

    1981-01-01

    An empirical investigation of the statistical procedure entitled nonlinear principal components analysis was conducted on a known equation and on measurement data in order to demonstrate the procedure and examine its potential usefulness. This method was suggested by R. Gnanadesikan and based on an early paper of Karl Pearson. (Author/AL)

  17. What Are the Characteristics of Principals Identified As Effective by Teachers?

    ERIC Educational Resources Information Center

    Fowler, William J., Jr.

    This exploratory study investigated which characteristics of a principal are identified as effective by teachers in the same school setting. The data were obtained from the Schools and Staffing Survey of 1988, from the National Center for Education Statistics (NCES). The Teacher Questionnaire of the Schools and Staffing Survey (SASS) questioned…

  18. Evaluation of drinking quality of groundwater through multivariate techniques in urban area.

    PubMed

    Das, Madhumita; Kumar, A; Mohapatra, M; Muduli, S D

    2010-07-01

    Groundwater is a major source of drinking water in urban areas. Because of the growing threat of deteriorating water quality due to urbanization and development, monitoring water quality is a prerequisite to ensure its suitability for drinking. However, analysing a large number of properties and evaluating water quality parameter by parameter is not feasible at regular intervals. Multivariate techniques can condense the data into a reasonably manageable set without much loss of information. In this study, using principal component analysis, 11 relevant properties of 58 water samples were grouped into three statistical factors. Discriminant analysis identified "pH influence" as the most distinguished factor and pH, Fe, and NO₃⁻ as the most discriminating variables, which could be treated as water quality indicators. These were utilized to classify the sampling sites into homogeneous clusters that reflect the location-wise importance of specific indicators for monitoring drinking water quality across the whole study area.

  19. ELICIT: An alternative imprecise weight elicitation technique for use in multi-criteria decision analysis for healthcare

    PubMed Central

    Diaby, Vakaramoko; Sanogo, Vassiki; Moussa, Kouame Richard

    2015-01-01

    Objective In this paper, the readers are introduced to ELICIT, an imprecise weight elicitation technique for multicriteria decision analysis for healthcare. Methods The application of ELICIT consists of two steps: the rank ordering of evaluation criteria based on decision-makers' (DMs) preferences using principal component analysis; and the estimation of criteria weights and their descriptive statistics using the variable interdependent analysis and the Monte Carlo method. The application of ELICIT is illustrated with a hypothetical case study involving the elicitation of weights for five criteria used to select the best device for eye surgery. Results The criteria were ranked from 1–5, based on a strict preference relationship established by the DMs. For each criterion, the deterministic weight was estimated as well as the standard deviation and 95% credibility interval. Conclusions ELICIT is appropriate in situations where only ordinal DMs' preferences are available to elicit decision criteria weights. PMID:26361235
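
    One way to realise the Monte Carlo step under a strict criterion ranking, shown here as an assumption rather than ELICIT's exact algorithm, is to sample weights uniformly on the simplex, keep each draw's components in rank order, and then summarise the draws:

```python
import numpy as np

def elicit_weights(n_criteria=5, n_draws=100_000, seed=0):
    """Draw random weight vectors on the simplex, enforce the strict
    ranking w1 >= w2 >= ... >= wk (criterion 1 most important), and
    summarise each weight with its mean and a 95% credibility interval."""
    rng = np.random.default_rng(seed)
    w = rng.dirichlet(np.ones(n_criteria), size=n_draws)
    w = -np.sort(-w, axis=1)          # sort each draw in descending order
    mean = w.mean(axis=0)
    lo, hi = np.percentile(w, [2.5, 97.5], axis=0)
    return mean, lo, hi
```

    Sorting each draw is equivalent to sampling uniformly from the subset of the simplex consistent with the elicited ordinal ranking, which is what makes the elicitation "imprecise" yet rank-respecting.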

  20. The Influence of the Variety, Vineyard, and Vintage on the Romanian White Wines Quality

    PubMed Central

    Hosu, Anamaria; Floare-Avram, Veronica; Feher, Ioana; Inceu, Mihai

    2016-01-01

    Wine is one of the most widely consumed drinks in the world and is subject to falsification or adulteration regarding variety, vintage, and geographical region. In this study, the influence of different characteristics of wines (type, production year, and origin) on the total phenolic content, total flavonoids content, antioxidant activity, total sugars content, pH, and 18O/16O isotopic ratio was investigated. The differentiation of selected wines on the basis of tested parameters was investigated using chemometric techniques, such as analysis of variance, cluster analysis, and principal component analysis. The experimental results are in agreement with other reported outcomes and allow us to conclude that variety and vineyard have the major influence on the studied parameters, although statistical interaction effects between year and vineyard, and between year and variety, are also observed in some cases. The obtained results have demonstrated that these parameters, together with chemometric techniques, show a significant potential to be used for discrimination of white wines. PMID:27840767

  1. A computer analysis of ERTS data of the Lake Gregory area of South Australia with particular emphasis on its role in terrain classification for engineering. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Lodwick, G. D. (Principal Investigator)

    1976-01-01

    A digital computer and multivariate statistical techniques were used to analyze 4-band multispectral data. A representation of the original data for each of the four bands allows a certain degree of terrain interpretation; however, variations in appearance of sites within and between bands, without additional criteria for deciding which representation should be preferred, create difficulties for classification. Investigation of the video data groups produced by principal components analysis and cluster analysis techniques shows that effective correlations with classifications of terrain produced by conventional methods could be carried out. The analyses also highlighted underlying relationships between the various elements. The approach used allows large areas (185 km by 185 km) to be classified into fundamental units within a matter of hours and can be applied to those parts of the Earth where facilities for conventional studies are poor or lacking.

  2. A hybrid LIBS-Raman system combined with chemometrics: an efficient tool for plastic identification and sorting.

    PubMed

    Shameem, K M Muhammed; Choudhari, Khoobaram S; Bankapur, Aseefhali; Kulkarni, Suresh D; Unnikrishnan, V K; George, Sajan D; Kartha, V B; Santhosh, C

    2017-05-01

    Classification of plastics is of great importance in the recycling industry as the volume of littered plastic waste increases day by day as a result of its extensive use. In this paper, we demonstrate the efficacy of a combined laser-induced breakdown spectroscopy (LIBS)-Raman system for the rapid identification and classification of post-consumer plastics. The atomic information and molecular information of polyethylene terephthalate, polyethylene, polypropylene, and polystyrene were studied using plasma emission spectra and scattered signal obtained in the LIBS and Raman technique, respectively. The collected spectral features of the samples were analyzed using statistical tools (principal component analysis, Mahalanobis distance) to categorize the plastics. The analyses of the data clearly show that elemental information and molecular information obtained from these techniques are efficient for classification of plastics. In addition, the molecular information collected via Raman spectroscopy exhibits clearly distinct features for the transparent plastics (100% discrimination), whereas the LIBS technique shows better spectral feature differences for the colored samples. The study shows that the information obtained from these complementary techniques allows the complete classification of the plastic samples, irrespective of the color or additives. This work further shows that the potential limitations of either technique for sample identification can be overcome by their complementarity.
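
    The chemometric step, PCA followed by Mahalanobis distance, can be sketched as follows: project the spectral features into a low-dimensional PCA space and assign a query spectrum to the class whose centroid is nearest in Mahalanobis distance. All inputs are hypothetical, and a pooled covariance is an assumption made here for brevity.

```python
import numpy as np
from sklearn.decomposition import PCA

def mahalanobis_classify(spectra, labels, query, n_pc=5):
    """spectra: (samples, channels); labels: class per sample;
    query: one spectrum. Returns the nearest class and its distance."""
    pca = PCA(n_components=n_pc).fit(spectra)
    Z = pca.transform(spectra)
    zq = pca.transform(query[None, :])[0]
    cov_inv = np.linalg.pinv(np.cov(Z, rowvar=False))  # pooled covariance
    best, best_d = None, np.inf
    for cls in np.unique(labels):
        mu = Z[labels == cls].mean(axis=0)
        d = np.sqrt((zq - mu) @ cov_inv @ (zq - mu))
        if d < best_d:
            best, best_d = cls, d
    return best, best_d
```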

  3. A reduction in ag/residential signature conflict using principal components analysis of LANDSAT temporal data

    NASA Technical Reports Server (NTRS)

    Williams, D. L.; Borden, F. Y.

    1977-01-01

    Methods to accurately delineate the types of land cover in the urban-rural transition zone of metropolitan areas were considered. The application of principal components analysis to multidate LANDSAT imagery was investigated as a means of reducing the overlap between residential and agricultural spectral signatures. The statistical concepts of principal components analysis were discussed, as well as the results of this analysis when applied to multidate LANDSAT imagery of the Washington, D.C. metropolitan area.

  4. The influence of foot hyperpronation on pelvic biomechanics during stance phase of the gait: A biomechanical simulation study.

    PubMed

    Yazdani, Farzaneh; Razeghi, Mohsen; Karimi, Mohammad Taghi; Raeisi Shahraki, Hadi; Salimi Bani, Milad

    2018-05-01

    Despite the theoretical link between foot hyperpronation and biomechanical dysfunction of the pelvis, the literature lacks evidence that confirms this assumption in truly hyperpronated feet subjects during gait. Changes in the kinematic pattern of the pelvic segment were assessed in 15 persons with hyperpronated feet and compared to a control group of 15 persons with normally aligned feet during the stance phase of gait based on biomechanical musculoskeletal simulation. Kinematic and kinetic data were collected while participants walked at a comfortable self-selected speed. A generic OpenSim musculoskeletal model with 23 degrees of freedom and 92 muscles was scaled for each participant. OpenSim inverse kinematic analysis was applied to calculate segment angles in the sagittal, frontal and horizontal planes. Principal component analysis was employed as a data reduction technique, as well as a computational tool to obtain principal component scores. Independent-sample t-test was used to detect group differences. The difference between groups in scores for the first principal component in the sagittal plane was statistically significant (p = 0.01; effect size = 1.06), but differences between principal component scores in the frontal and horizontal planes were not significant. The hyperpronation group had greater anterior pelvic tilt during 20%-80% of the stance phase. In conclusion, in the persons with hyperpronation whom we studied, the role of the pelvic segment was mainly to maintain postural balance in the sagittal plane by increasing anterior pelvic inclination. Since anterior pelvic tilt may be associated with low back symptoms, the evaluation of foot posture should be considered in assessing patients with low back and pelvic dysfunction.
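
    A minimal sketch of this use of PCA as a waveform-reduction step: each subject's time-normalised pelvic angle curve becomes one row, PC scores are extracted, and group differences in the first score are tested with an independent-sample t-test plus an effect size. Array names and shapes are assumptions.

```python
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

def compare_groups(waveforms, group):
    """waveforms: (subjects, 101) pelvic-tilt angle over the stance
    phase (0-100%); group: 0 = control, 1 = hyperpronated."""
    scores = PCA(n_components=3).fit_transform(waveforms)
    pc1 = scores[:, 0]
    a, b = pc1[group == 1], pc1[group == 0]
    t, p = stats.ttest_ind(a, b)
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    effect = (a.mean() - b.mean()) / pooled_sd   # Cohen's d
    return t, p, effect
```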

  5. Instructional leadership in elementary science: How are school leaders positioned to lead in a next generation science standards era?

    NASA Astrophysics Data System (ADS)

    Winn, Kathleen Mary

    The Next Generation Science Standards (NGSS) are the newest K-12 science content standards created by a coalition of educators, scientists, and researchers available for adoption by states and schools. Principals are important actors during policy implementation especially since principals are charged with assuming the role of an instructional leader for their teachers in all subject areas. Science poses a unique challenge to the elementary curricular landscape because traditionally, elementary teachers report low levels of self-efficacy in the subject. Support in this area therefore becomes important for a successful integration of a new science education agenda. This study analyzed self-reported survey data from public elementary principals (N=667) to address the following three research questions: (1) What type of science backgrounds do elementary principals have? (2) What indicators predict if elementary principals will engage in instructional leadership behaviors in science? (3) Does self-efficacy mediate the relationship between science background and a capacity for instructional leadership in science? The survey data were analyzed quantitatively. Descriptive statistics address the first research question and inferential statistics (hierarchical regression analysis and a mediation analysis) answer the second and third research questions. The sample data show that about 21% of elementary principals have a formal science degree and 26% have a degree in a STEM field. Most principals have not had recent experience teaching science, nor were they ever exclusively science teachers. The analyses suggest that demographic, experiential, and self-efficacy variables predict instructional leadership practices in science.

  6. Modified neural networks for rapid recovery of tokamak plasma parameters for real time control

    NASA Astrophysics Data System (ADS)

    Sengupta, A.; Ranjan, P.

    2002-07-01

    Two modified neural network techniques are used for the identification of the equilibrium plasma parameters of the Superconducting Steady State Tokamak I from external magnetic measurements. This is expected to ultimately assist in real-time plasma control. Unlike the conventional network structure, in which a single network with the optimum number of processing elements calculates the outputs, one of the methods here uses a multinetwork system connected in parallel to perform the calculations. This network is called the double neural network. The accuracy of the recovered parameters is clearly higher than with the conventional network. The other type of neural network used here is based on statistical function parametrization combined with a neural network. The principal component transformation removes linear dependences from the measurements and a dimensional reduction process reduces the dimensionality of the input space. This reduced and transformed input set, rather than the entire set, is fed into the neural network input. This is known as the principal component transformation-based neural network. The accuracy of the recovered parameters in the latter type of modified network is found to be a further improvement over the accuracy of the double neural network. This result differs from that obtained in an earlier work where the double neural network showed better performance. The conventional network and the function parametrization methods have also been used for comparison. The conventional network has been used for an optimization of the set of magnetic diagnostics. The effective set of sensors, as assessed by this network, is compared with that of the principal component based network. Fault tolerance of the neural networks has been tested. The double neural network showed the maximum resistance to faults in the diagnostics, while the principal component based network performed poorly. Finally, the processing times of the methods have been compared. The double network and the principal component network involve the minimum computation time, although the conventional network also performs well enough to be used in real time.
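
    In modern terms, the principal component transformation-based network is a preprocessing pipeline: decorrelate and reduce the magnetic measurements, then feed the reduced set to the network. A hedged scikit-learn sketch (layer sizes, component count, and array names are invented, not those of the paper):

```python
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# PCA removes linear dependences among the magnetic measurements and
# reduces input dimensionality before the neural network sees them.
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=10),
    MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
)

# Hypothetical usage: rows of B_measured are magnetic sensor readings,
# rows of plasma_params the corresponding equilibrium parameters.
# model.fit(B_measured, plasma_params)
# estimates = model.predict(B_new)
```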

  7. Instantiation and registration of statistical shape models of the femur and pelvis using 3D ultrasound imaging.

    PubMed

    Barratt, Dean C; Chan, Carolyn S K; Edwards, Philip J; Penney, Graeme P; Slomczykowski, Mike; Carter, Timothy J; Hawkes, David J

    2008-06-01

    Statistical shape modelling potentially provides a powerful tool for generating patient-specific, 3D representations of bony anatomy for computer-aided orthopaedic surgery (CAOS) without the need for a preoperative CT scan. Furthermore, freehand 3D ultrasound (US) provides a non-invasive method for digitising bone surfaces in the operating theatre that enables a much greater region to be sampled compared with conventional direct-contact (i.e., pointer-based) digitisation techniques. In this paper, we describe how these approaches can be combined to simultaneously generate and register a patient-specific model of the femur and pelvis to the patient during surgery. In our implementation, a statistical deformation model (SDM) was constructed for the femur and pelvis by performing a principal component analysis on the B-spline control points that parameterise the freeform deformations required to non-rigidly register a training set of CT scans to a carefully segmented template CT scan. The segmented template bone surface, represented by a triangulated surface mesh, is instantiated and registered to a cloud of US-derived surface points using an iterative scheme in which the weights corresponding to the first five principal modes of variation of the SDM are optimised in addition to the rigid-body parameters. The accuracy of the method was evaluated using clinically realistic data obtained on three intact human cadavers (three whole pelves and six femurs). For each bone, a high-resolution CT scan and rigid-body registration transformation, calculated using bone-implanted fiducial markers, served as the gold standard bone geometry and registration transformation, respectively. After aligning the final instantiated model and CT-derived surfaces using the iterative closest point (ICP) algorithm, the average root-mean-square distance between the surfaces was 3.5mm over the whole bone and 3.7mm in the region of surgical interest. The corresponding distances after aligning the surfaces using the marker-based registration transformation were 4.6 and 4.5mm, respectively. We conclude that despite limitations on the regions of bone accessible using US imaging, this technique has potential as a cost-effective and non-invasive method to enable surgical navigation during CAOS procedures, without the additional radiation dose associated with performing a preoperative CT scan or intraoperative fluoroscopic imaging. However, further development is required to investigate errors using error measures relevant to specific surgical procedures.
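
    The final ICP alignment step mentioned above can be sketched in a few lines: alternate nearest-neighbour matching against the target surface with a closed-form (Kabsch) rigid fit. This is a generic rigid ICP, not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, n_iter=50):
    """Rigidly align an (N, 3) source point cloud to an (M, 3) target:
    match each source point to its nearest target point, then solve the
    optimal rotation/translation in closed form, and repeat."""
    src = source.copy()
    tree = cKDTree(target)
    for _ in range(n_iter):
        nn = target[tree.query(src)[1]]              # closest target points
        mu_s, mu_t = src.mean(axis=0), nn.mean(axis=0)
        U, _, Vt = np.linalg.svd((src - mu_s).T @ (nn - mu_t))
        D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T                           # reflection-safe Kabsch
        src = (src - mu_s) @ R.T + mu_t
    return src
```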

  8. The Views and Opinions of School Principals and Teachers on Positive Education

    ERIC Educational Resources Information Center

    Bas, Asli Uz; Firat, Necla Sahin

    2017-01-01

    The purpose of this study is to assess the views and opinions of school principals and teachers on positive education. The sample of the study includes 8 school principals and 12 teachers who attend different public schools in Izmir, Turkey. Data is collected through semi-structured interview technique. Findings show that majority of the…

  9. A Content Analysis of Themes That Emerge from School Principals' Web2.0 Conversations

    ERIC Educational Resources Information Center

    Manning, Rory

    2011-01-01

    The purpose of this qualitative study was to analyze the self-initiated conversations held by school principals on web2.0 platforms, such as blogs, through the lens of current leadership standards. The online writings of thirteen school principals were analyzed using grounded theory techniques (Strauss and Corbin, 1998) to elucidate emerging…

  10. [The application of the multidimensional statistical methods in the evaluation of the influence of atmospheric pollution on the population's health].

    PubMed

    Surzhikov, V D; Surzhikov, D V

    2014-01-01

    The search for and measurement of causal relationships between exposure to air pollution and the health state of the population are based on systems analysis and risk assessment, to improve the quality of research. For this purpose, modern statistical analysis is applied, using tests of independence, principal component analysis, and discriminant function analysis. As a result of the analysis, four principal components were extracted from the set of atmospheric pollutants: for diseases of the circulatory system, the main principal component is associated with concentrations of suspended solids, nitrogen dioxide, carbon monoxide, and hydrogen fluoride; for respiratory diseases, the main principal component is closely associated with suspended solids, sulfur dioxide, nitrogen dioxide, and carbon black. The discriminant function was shown to be usable as a measure of the level of air pollution.

  11. The Higher Education System in Israel: Statistical Abstract and Analysis.

    ERIC Educational Resources Information Center

    Herskovic, Shlomo

    This edition of a statistical abstract published every few years on the higher education system in Israel presents the most recent data available through 1990-91. The data were gathered through the cooperation of the Central Bureau of Statistics and institutions of higher education. Chapter 1 presents a summary of principal findings covering the…

  12. Statistical modeling of interfractional tissue deformation and its application in radiation therapy planning

    NASA Astrophysics Data System (ADS)

    Vile, Douglas J.

    In radiation therapy, interfraction organ motion introduces a level of geometric uncertainty into the planning process. Plans, which are typically based upon a single instance of anatomy, must be robust against daily anatomical variations. For this problem, a model of the magnitude, direction, and likelihood of deformation is useful. In this thesis, principal component analysis (PCA) is used to statistically model the 3D organ motion for 19 prostate cancer patients, each with 8-13 fractional computed tomography (CT) images. Deformable image registration and the resultant displacement vector fields (DVFs) are used to quantify the interfraction systematic and random motion. By applying the PCA technique to the random DVFs, principal modes of random tissue deformation were determined for each patient, and a method for sampling synthetic random DVFs was developed. The PCA model was then extended to describe the principal modes of systematic and random organ motion for the population of patients. A leave-one-out study tested both the systematic and random motion model's ability to represent PCA training set DVFs. The random and systematic DVF PCA models allowed the reconstruction of these data with absolute mean errors between 0.5-0.9 mm and 1-2 mm, respectively. To the best of the author's knowledge, this study is the first successful effort to build a fully 3D statistical PCA model of systematic tissue deformation in a population of patients. By sampling synthetic systematic and random errors, organ occupancy maps were created for bony and prostate-centroid patient setup processes. By thresholding these maps, a PCA-based planning target volume (PTV) was created and tested against conventional margin recipes (van Herk for bony alignment and 5 mm fixed [3 mm posterior] margin for centroid alignment) in a virtual clinical trial for low-risk prostate cancer. Deformably accumulated delivered dose served as a surrogate for clinical outcome. For the bony landmark setup subtrial, the PCA PTV significantly (p<0.05) reduced D30, D20, and D5 to bladder and D50 to rectum, while increasing rectal D20 and D5. For the centroid-aligned setup, the PCA PTV significantly reduced all bladder DVH metrics and trended to lower rectal toxicity metrics. All PTVs covered the prostate with the prescription dose.
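
    The sampling step, drawing synthetic DVFs from the PCA model, amounts to adding the mean deformation to principal modes weighted by Gaussian coefficients whose variances are the modes' eigenvalues. A sketch with hypothetical flattened DVFs (one row per fraction), not the thesis code:

```python
import numpy as np
from sklearn.decomposition import PCA

def sample_synthetic_dvfs(dvfs, n_modes=5, n_samples=10, seed=0):
    """dvfs: (fractions, 3 * n_voxels) flattened displacement fields.
    Draw synthetic DVFs with coefficients c_i ~ N(0, lambda_i), so the
    simulated deformations follow the patient's principal modes."""
    pca = PCA(n_components=n_modes).fit(dvfs)
    rng = np.random.default_rng(seed)
    c = rng.standard_normal((n_samples, n_modes)) \
        * np.sqrt(pca.explained_variance_)        # scale by eigenvalues
    return pca.mean_ + c @ pca.components_
```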

  13. XCOM intrinsic dimensionality for low-Z elements at diagnostic energies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bornefalk, Hans

    2012-02-15

    Purpose: To determine the intrinsic dimensionality of linear attenuation coefficients (LACs) from XCOM for elements with low atomic number (Z = 1-20) at diagnostic x-ray energies (25-120 keV). H₀^q, the hypothesis that the space of LACs is spanned by q bases, is tested for various q-values. Methods: Principal component analysis is first applied and the LACs are projected onto the first q principal component bases. The residuals of the model values vs XCOM data are determined for all energies and atomic numbers. Heteroscedasticity invalidates the prerequisite of i.i.d. errors necessary for bootstrapping residuals. Instead, wild bootstrap is applied, which, by not mixing residuals, allows the effect of the non-i.i.d. residuals to be reflected in the result. Credible regions for the eigenvalues of the correlation matrix for the bootstrapped LAC data are determined. If subsequent credible regions for the eigenvalues overlap, the corresponding principal component is not considered to represent true data structure but noise. If this happens for eigenvalues l and l + 1, for any l ≤ q, H₀^q is rejected. Results: The largest value of q for which H₀^q is nonrejectable at the 5%-level is q = 4. This indicates that the statistically significant intrinsic dimensionality of low-Z XCOM data at diagnostic energies is four. Conclusions: The method presented allows determination of the statistically significant dimensionality of any noisy linear subspace. Knowledge of such significant dimensionality is of interest for any method making assumptions on intrinsic dimensionality and evaluating results on noisy reference data. For LACs, knowledge of the low-Z dimensionality might be relevant when parametrization schemes are tuned to XCOM data. For x-ray imaging techniques based on the basis decomposition method (Alvarez and Macovski, Phys. Med. Biol. 21, 733-744, 1976), an underlying dimensionality of two is commonly assigned to the LAC of human tissue at diagnostic energies. The finding of a higher statistically significant dimensionality thus raises the question whether a higher assumed model dimensionality (now feasible with the advent of multibin x-ray systems) might also be practically relevant, i.e., if better tissue characterization results can be obtained.
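
    The wild bootstrap used here keeps each residual attached to its own design point and only reweights it, so heteroscedastic error structure survives resampling. A generic sketch (Rademacher sign-flip weights are one standard choice, assumed here):

```python
import numpy as np

def wild_bootstrap_replicates(fitted, residuals, n_boot=1000, seed=0):
    """Build bootstrap replicates y* = fitted + v * residual, where each
    residual stays in place and v is an independent Rademacher weight,
    so residuals are never mixed across data points."""
    rng = np.random.default_rng(seed)
    signs = rng.choice([-1.0, 1.0], size=(n_boot, residuals.size))
    return fitted + signs * residuals
```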

  14. Assessing Statistical Competencies in Clinical and Translational Science Education: One Size Does Not Fit All

    PubMed Central

    Lindsell, Christopher J.; Welty, Leah J.; Mazumdar, Madhu; Thurston, Sally W.; Rahbar, Mohammad H.; Carter, Rickey E.; Pollock, Bradley H.; Cucchiara, Andrew J.; Kopras, Elizabeth J.; Jovanovic, Borko D.; Enders, Felicity T.

    2014-01-01

    Introduction Statistics is an essential training component for a career in clinical and translational science (CTS). Given the increasing complexity of statistics, learners may have difficulty selecting appropriate courses. Our question was: what depth of statistical knowledge do different CTS learners require? Methods For three types of CTS learners (principal investigator, co‐investigator, informed reader of the literature), each with different backgrounds in research (no previous research experience, reader of the research literature, previous research experience), 18 experts in biostatistics, epidemiology, and research design proposed levels for 21 statistical competencies. Results Statistical competencies were categorized as fundamental, intermediate, or specialized. CTS learners who intend to become independent principal investigators require more specialized training, while those intending to become informed consumers of the medical literature require more fundamental education. For most competencies, less training was proposed for those with more research background. Discussion When selecting statistical coursework, the learner's research background and career goal should guide the decision. Some statistical competencies are considered to be more important than others. Baseline knowledge assessments may help learners identify appropriate coursework. Conclusion Rather than one size fits all, tailoring education to baseline knowledge, learner background, and future goals increases learning potential while minimizing classroom time. PMID:25212569

  15. Characteristics of Teachers Nominated for an Accelerated Principal Preparation Program

    ERIC Educational Resources Information Center

    Rios, Steve J.; Reyes-Guerra, Daniel

    2012-01-01

    This article reports the initial evaluation results of a new accelerated, job-embedded principal preparation program funded by a Race to the Top Grant (U.S. Department of Education, 2012a) in Florida. Descriptive statistics, t-tests, and chi-square analyses were used to describe the characteristics of a group of potential applicants nominated to…

  16. Public High School Principals' Perceptions of Academic Reform. Center for Education Statistics Bulletin.

    ERIC Educational Resources Information Center

    Center for Education Statistics (ED/OERI), Washington, DC.

    A survey of public high school principals asked which policies, programs, and practices designed to improve learning were currently in operation at their schools, and whether these policies were instituted or substantially strengthened in the past 5 years. These policies reflect the school-level recommendations for education reform made in "A…

  17. Adolescent Suicide. The Trauma of Adolescent Suicide. A Time for Special Leadership by Principals.

    ERIC Educational Resources Information Center

    Dempsey, Richard A.

    This monograph was written to help principals and other school personnel explore the issues of adolescent suicide and its prevention. Chapter 1 presents statistics on the incidence of adolescent suicides and suicide attempts. Chapter 2, Causes of Suicide, reviews developmental tasks of adolescence, lists several contributors to adolescent suicide,…

  18. A Correlational Study of Principals' Leadership Style and Teacher Absenteeism

    ERIC Educational Resources Information Center

    Carter, Jason

    2010-01-01

    The purpose of this study was to determine whether McGregor's Theory X and Theory Y, gender, age, and years of experience of principals form a composite explaining the variation in teacher absences. It sought to determine whether all or any of these variables would be statistically significant in explaining the variance in absences for teachers.…

  19. Variable Neighborhood Search Heuristics for Selecting a Subset of Variables in Principal Component Analysis

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Singh, Renu; Steinley, Douglas

    2009-01-01

    The selection of a subset of variables from a pool of candidates is an important problem in several areas of multivariate statistics. Within the context of principal component analysis (PCA), a number of authors have argued that subset selection is crucial for identifying those variables that are required for correct interpretation of the…

  20. A Label-Free Fluorescent Array Sensor Utilizing Liposome Encapsulating Calcein for Discriminating Target Proteins by Principal Component Analysis

    PubMed Central

    Imamura, Ryota; Murata, Naoki; Shimanouchi, Toshinori; Yamashita, Kaoru; Fukuzawa, Masayuki; Noda, Minoru

    2017-01-01

    A new fluorescent arrayed biosensor has been developed to discriminate species and concentrations of target proteins by using plural different phospholipid liposome species encapsulating fluorescent molecules, utilizing differences in permeation of the fluorescent molecules through the membrane to modulate liposome-target protein interactions. This approach offers a fundamentally new label-free fluorescent sensor, in contrast to the common technique of fluorescent array sensors that require labeling. We confirmed a high output intensity of fluorescence emission, related to the characteristics and concentrations of the fluorescent molecules, when they leak from inside the liposomes through the perturbed lipid membrane. After taking an array image of the fluorescence emission from the sensor using a CMOS imager, the output intensities of the fluorescence were analyzed by a principal component analysis (PCA) statistical method. It is found from PCA plots that different protein species at several concentrations were successfully discriminated by using the different lipid membranes, with a high cumulative contribution ratio. We also confirmed that the accuracy of the discrimination by the array sensor with a single shot is higher than that of a single sensor with multiple shots. PMID:28714873

  2. High-dimensional inference with the generalized Hopfield model: principal component analysis and corrections.

    PubMed

    Cocco, S; Monasson, R; Sessak, V

    2011-05-01

    We consider the problem of inferring the interactions between a set of N binary variables from the knowledge of their frequencies and pairwise correlations. The inference framework is based on the Hopfield model, a special case of the Ising model where the interaction matrix is defined through a set of patterns in the variable space, and is of rank much smaller than N. We show that maximum likelihood inference is deeply related to principal component analysis when the amplitude of the pattern components ξ is negligible compared to √N. Using techniques from statistical mechanics, we calculate the corrections to the patterns to the first order in ξ/√N. We stress the need to generalize the Hopfield model and include both attractive and repulsive patterns in order to correctly infer networks with sparse and strong interactions. We present a simple geometrical criterion to decide how many attractive and repulsive patterns should be considered as a function of the sampling noise. We moreover discuss how many sampled configurations are required for a good inference, as a function of the system size N and of the amplitude ξ. The inference approach is illustrated on synthetic and biological data.
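
    A minimal Python sketch of the zeroth-order (PCA) limit discussed above: patterns are estimated from the top eigenvectors of the sample correlation matrix of binary data. The sampler below and the omission of the first-order ξ/√N corrections and of the attractive/repulsive selection criterion are simplifications for illustration, not the paper's procedure.

      import numpy as np

      rng = np.random.default_rng(1)
      N, B, rank = 20, 5000, 2
      xi_true = rng.choice([-1.0, 1.0], size=(rank, N)) / np.sqrt(N)

      # crude sampler (illustration only): spins driven by shared pattern
      # activations z, which induces a low-rank correlation structure
      z = rng.choice([-1.0, 1.0], size=(B, rank))
      h = z @ (xi_true * np.sqrt(N))
      S = np.where(rng.random((B, N)) < 1 / (1 + np.exp(-2 * h)), 1.0, -1.0)

      C = np.corrcoef(S, rowvar=False)          # pairwise correlations
      w, V = np.linalg.eigh(C)                  # eigenvalues in ascending order
      # PCA approximation: top eigenvectors ~ attractive patterns (up to sign)
      xi_hat = V[:, -rank:].T
      overlap = np.abs(xi_hat @ xi_true.T)      # compare with ground truth
      print(np.round(overlap, 2))               # near-1 entries (in some order)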

  3. A first application of independent component analysis to extracting structure from stock returns.

    PubMed

    Back, A D; Weigend, A S

    1997-08-01

    This paper explores the application of a signal processing technique known as independent component analysis (ICA) or blind source separation to multivariate financial time series such as a portfolio of stocks. The key idea of ICA is to linearly map the observed multivariate time series into a new space of statistically independent components (ICs). We apply ICA to three years of daily returns of the 28 largest Japanese stocks and compare the results with those obtained using principal component analysis. The results indicate that the estimated ICs fall into two categories, (i) infrequent large shocks (responsible for the major changes in the stock prices), and (ii) frequent smaller fluctuations (contributing little to the overall level of the stocks). We show that the overall stock price can be reconstructed surprisingly well by using a small number of thresholded weighted ICs. In contrast, when using shocks derived from principal components instead of independent components, the reconstructed price is less similar to the original one. ICA is shown to be a potentially powerful method of analyzing and understanding driving mechanisms in financial time series. The application to portfolio optimization is described in Chin and Weigend (1998).
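
    A minimal Python sketch of the reconstruction idea described above, assuming synthetic heavy-tailed returns in place of the 28 Japanese stocks: map returns to independent components with FastICA, zero out all but the largest shocks, and rebuild a cumulative price path.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(2)
      T, n_stocks, n_ics = 750, 6, 4
      returns = rng.standard_t(df=3, size=(T, n_stocks)) * 0.01

      ica = FastICA(n_components=n_ics, random_state=0)
      S = ica.fit_transform(returns)              # (T, n_ics) independent components
      # threshold: keep only the largest shocks in each IC
      S_big = np.where(np.abs(S) > 2.5 * S.std(axis=0), S, 0.0)
      recon = S_big @ ica.mixing_.T + ica.mean_   # back to return space
      price = np.cumsum(recon[:, 0])              # reconstructed path, stock 0
      print(np.corrcoef(price, np.cumsum(returns[:, 0]))[0, 1])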

  4. Screening of oil sources by using comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry and multivariate statistical analysis.

    PubMed

    Zhang, Wanfeng; Zhu, Shukui; He, Sheng; Wang, Yanxin

    2015-02-06

    Using comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GC×GC/TOFMS), volatile and semi-volatile organic compounds in crude oil samples from different reservoirs or regions were analyzed to develop a molecular fingerprint database. Based on the GC×GC/TOFMS fingerprints of crude oils, principal component analysis (PCA) and cluster analysis were used to distinguish the oil sources and find biomarkers. In a supervised step, the geological characteristics of the crude oils, including thermal maturity and sedimentary environment, are assigned to the principal components. The results show that the tri-aromatic steroid (TAS) series are suitable marker compounds for oil screening, and the relative abundances of individual TAS compounds correlate strongly with oil sources. To correct for external factors other than oil source, the variables were defined as content ratios of selected target compounds, and 13 parameters were proposed for the screening of oil sources. With the developed model, the crude oils were easily discriminated, and the result is in good agreement with the practical geological setting. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Derivation of Boundary Manikins: A Principal Component Analysis

    NASA Technical Reports Server (NTRS)

    Young, Karen; Margerum, Sarah; Barr, Abbe; Ferrer, Mike A.; Rajulu, Sudhakar

    2008-01-01

    When designing any human-system interface, it is critical to provide realistic anthropometry to properly represent how a person fits within a given space. This study aimed to identify a minimum number of boundary manikins, i.e., representative models of subjects' anthropometry from a target population, which would realistically represent the population. The boundary manikin anthropometry was derived using Principal Component Analysis (PCA). PCA is a statistical approach that reduces a multi-dimensional dataset using eigenvectors and eigenvalues. The measurements used in the PCA were identified as those critical for suit and cockpit design. The PCA yielded a total of 26 manikins per gender, together with their anthropometry from the target population. Reduction techniques were implemented to reduce this number further, with a final result of 20 female and 22 male subjects. The anthropometry of the boundary manikins was then used to create 3D digital models (to be discussed in subsequent papers) intended for use by designers to test components of their space suit design, to verify that the requirements specified in the Human Systems Integration Requirements (HSIR) document are met. The end goal is to allow designers to generate suits which accommodate the diverse anthropometry of the user population.
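
    A minimal Python sketch of the boundary-case idea, under invented measurements and population: encode each subject by key dimensions, keep the leading principal components, and read boundary "manikins" off the extremes (here ±2 SD) of the component axes.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(3)
      # columns: stature, sitting height, chest circumference (mm), illustrative
      base = rng.normal(size=(500, 3))
      anthro = base @ np.array([[60, 30, 25], [5, 25, 5], [3, 2, 40.0]]) + [1700, 900, 1000]

      pca = PCA(n_components=2)
      scores = pca.fit_transform(anthro)
      sd = scores.std(axis=0)
      # corners of the +/-2 SD box in PC space, mapped back to measurements
      corners = np.array([[sx, sy] for sx in (-2 * sd[0], 2 * sd[0])
                                    for sy in (-2 * sd[1], 2 * sd[1])])
      manikins = pca.inverse_transform(corners)
      print(np.round(manikins))    # four boundary manikins' anthropometry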

  6. Characterizing Variability of Modular Brain Connectivity with Constrained Principal Component Analysis

    PubMed Central

    Hirayama, Jun-ichiro; Hyvärinen, Aapo; Kiviniemi, Vesa; Kawanabe, Motoaki; Yamashita, Okito

    2016-01-01

    Characterizing the variability of resting-state functional brain connectivity across subjects and/or over time has recently attracted much attention. Principal component analysis (PCA) serves as a fundamental statistical technique for such analyses. However, performing PCA on high-dimensional connectivity matrices yields complicated “eigenconnectivity” patterns, for which systematic interpretation is a challenging issue. Here, we overcome this issue with a novel constrained PCA method for connectivity matrices by extending the idea of the previously proposed orthogonal connectivity factorization method. Our new method, modular connectivity factorization (MCF), explicitly introduces the modularity of brain networks as a parametric constraint on eigenconnectivity matrices. In particular, MCF analyzes the variability in both intra- and inter-module connectivities, simultaneously finding network modules in a principled, data-driven manner. The parametric constraint provides a compact module-based visualization scheme with which the result can be intuitively interpreted. We develop an optimization algorithm to solve the constrained PCA problem and validate our method in simulation studies and with a resting-state functional connectivity MRI dataset of 986 subjects. The results show that the proposed MCF method successfully reveals the underlying modular eigenconnectivity patterns in more general situations and is a promising alternative to existing methods. PMID:28002474

  7. Early forest fire detection using principal component analysis of infrared video

    NASA Astrophysics Data System (ADS)

    Saghri, John A.; Radjabi, Ryan; Jacobs, John T.

    2011-09-01

    A land-based early forest fire detection scheme which exploits the infrared (IR) temporal signature of a fire plume is described. Unlike common land-based and/or satellite-based techniques which rely on measurement and discrimination of a fire plume directly from its infrared and/or visible reflectance imagery, this scheme is based on exploitation of the fire plume's temporal signature, i.e., temperature fluctuations over the observation period. The method is simple and relatively inexpensive to implement, and the false alarm rate is expected to be lower than that of existing methods. Land-based infrared (IR) cameras are installed in a step-stare-mode configuration in potential fire-prone areas. The sequence of IR video frames from each camera is digitally processed to determine if there is a fire within the camera's field of view (FOV). The process involves applying a principal component transformation (PCT) to each nonoverlapping sequence of video frames from the camera to produce a corresponding sequence of temporally-uncorrelated principal component (PC) images. Since pixels that form a fire plume exhibit statistically similar temporal variation (i.e., have a unique temporal signature), PCT conveniently renders the footprint/trace of the fire plume in low-order PC images. The PC image which best reveals the trace of the fire plume is then selected and spatially filtered via simple threshold and median filter operations to remove background clutter, such as traces of tree branches moving in the wind.
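
    A minimal Python sketch of this pipeline on a synthetic frame stack: PCA across time, selection of a low-order component image, then threshold and median filtering. The "fire" is a flickering patch added to noise; frame sizes and thresholds are illustrative.

      import numpy as np
      from scipy.ndimage import median_filter
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(4)
      T, H, W = 64, 32, 32
      frames = rng.normal(0, 1, size=(T, H, W))
      frames[:, 10:16, 12:18] += 5 * rng.random(T)[:, None, None]  # correlated flicker

      X = frames.reshape(T, H * W).T               # pixels as samples, frames as variables
      pc_imgs = PCA(n_components=3).fit_transform(X).T.reshape(3, H, W)

      pc = pc_imgs[0]                              # low-order PC carrying the plume trace
      mask = np.abs(pc) > 3 * np.abs(pc).std()     # simple threshold
      mask = median_filter(mask.astype(np.uint8), size=3)  # remove clutter speckle
      print(mask.sum(), "pixels flagged")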

  8. Structural damage detection in wind turbine blades based on time series representations of dynamic responses

    NASA Astrophysics Data System (ADS)

    Hoell, Simon; Omenzetter, Piotr

    2015-03-01

    The development of large wind turbines that harvest energy more efficiently is a consequence of the increasing demand for renewables in the world. To optimize the potential energy output, light and flexible wind turbine blades (WTBs) are designed. However, the higher flexibility and lower buckling capacity adversely affect the long-term safety and reliability of WTBs, and the increased operation and maintenance costs reduce the expected revenue. Effective structural health monitoring techniques can help to counteract this by limiting inspection efforts and avoiding unplanned maintenance actions. Vibration-based methods deserve attention due to their moderate instrumentation efforts and applicability to in-service measurements. The present paper proposes the use of cross-correlations (CCs) of acceleration responses between sensors at different locations for structural damage detection in WTBs. CCs have previously been applied successfully to damage detection in numerical and experimental beam structures, but utilizing only single lags between the signals. The present approach uses vectors of CC coefficients for multiple lags between measurements of two selected sensors, taken from multiple possible combinations of sensors. To reduce the dimensionality of the damage sensitive feature (DSF) vectors, principal component analysis is performed. The optimal number of principal components (PCs) is chosen with respect to a statistical threshold. Finally, the detection phase uses the selected PCs of the healthy structure to calculate scores from a current DSF vector, and statistical hypothesis testing is performed to make a decision about the current structural state. The method is applied to laboratory experiments conducted on a small WTB with non-destructive damage scenarios.
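
    A minimal Python sketch of the DSF construction, assuming two synthetic acceleration channels sharing a lagged common response: cross-correlation coefficients at multiple lags form the feature vector, PCA fitted on healthy measurements reduces it, and a Hotelling-style score is tested against the healthy distribution.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(5)

      def cc_vector(a, b, max_lag=20):
          a = (a - a.mean()) / a.std()
          b = (b - b.mean()) / b.std()
          return np.array([np.mean(a[lag:] * b[:len(b) - lag]) for lag in range(max_lag)])

      def measure(extra_noise=0.0):
          s = rng.normal(size=2053)                  # shared structural response
          a = s[:2048] + 0.5 * rng.normal(size=2048)                 # sensor 1
          b = s[5:] + (0.5 + extra_noise) * rng.normal(size=2048)    # sensor 2, lagged
          return cc_vector(a, b)

      healthy = np.array([measure() for _ in range(100)])
      pca = PCA(n_components=5).fit(healthy)         # reduce DSF dimensionality
      var = pca.transform(healthy).var(axis=0)

      def t2(x):                                     # Hotelling-style score
          s = pca.transform(x[None, :])[0]
          return float(np.sum(s ** 2 / var))

      threshold = np.quantile([t2(h) for h in healthy], 0.99)
      print(t2(measure()) > threshold)                 # healthy case: usually False
      print(t2(measure(extra_noise=1.5)) > threshold)  # "damaged" case: True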

  9. ASCS online fault detection and isolation based on an improved MPCA

    NASA Astrophysics Data System (ADS)

    Peng, Jianxin; Liu, Haiou; Hu, Yuhui; Xi, Junqiang; Chen, Huiyan

    2014-09-01

    Multi-way principal component analysis (MPCA) has received considerable attention and been widely used in process monitoring. A traditional MPCA algorithm unfolds multiple batches of historical data into a two-dimensional matrix and cuts the matrix along the time axis to form subspaces. However, low efficiency of the subspaces and difficult fault isolation are common disadvantages of the principal component model. This paper presents a new subspace construction method based on a kernel density estimation function that can effectively reduce the amount of storage required for the subspace information. The MPCA model and the knowledge base are built on the new subspace. Fault detection and isolation with the squared prediction error (SPE) statistic and the Hotelling (T²) statistic are then realized in process monitoring. When a fault occurs, fault isolation based on the SPE statistic is achieved by residual contribution analysis of the different variables. For fault isolation based on the T² statistic, the relationship between the statistic indicator and the state variables is constructed, and constraint conditions are presented to check the validity of the fault isolation. Then, to improve the robustness of fault isolation to unexpected disturbances, a statistical method is adopted to relate single subspaces to multiple subspaces and thereby increase the correct rate of fault isolation. Finally, fault detection and isolation based on the improved MPCA is used to monitor the automatic shift control system (ASCS), demonstrating the correctness and effectiveness of the algorithm. The research proposes a new subspace construction method that reduces the required storage capacity and improves the robustness of the principal component model, and it establishes the relationship between the state variables and the fault detection indicators for fault isolation.
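
    A minimal Python sketch of the two monitoring statistics named above, on synthetic correlated process data: Hotelling's T² in the retained principal subspace and the squared prediction error (SPE, or Q) in the residual subspace, with per-variable residual contributions pointing at the faulty sensor.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(6)
      X_train = rng.normal(size=(500, 10)) @ rng.normal(size=(10, 10))  # correlated vars
      pca = PCA(n_components=3).fit(X_train)

      def monitor(x):
          t = pca.transform(x[None, :])[0]
          t2 = np.sum(t ** 2 / pca.explained_variance_)        # Hotelling T^2
          x_hat = pca.inverse_transform(t[None, :])[0]
          spe = np.sum((x - x_hat) ** 2)                       # squared prediction error
          return t2, spe

      x_ok = X_train[0]
      x_fault = x_ok + 5.0 * np.eye(10)[4]    # step fault on variable 4
      print("normal:", monitor(x_ok))
      print("fault :", monitor(x_fault))
      # the per-variable residuals (x - x_hat)**2 point to the faulty
      # sensor, which is the SPE-based isolation idea described above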

  10. Principals Leadership Styles and Gender Influence on Teachers Morale in Public Secondary Schools

    ERIC Educational Resources Information Center

    Eboka, Obiajulu Chinyelum

    2016-01-01

    The study investigated the perception of teachers on the influence of principals' leadership styles and gender on teacher morale. Four research questions and four research hypotheses guided the study. An ex-post facto research design was adopted in the study. Through the simple random sampling technique a total of 72 principals and 2,506 in 72…

  11. Prenotification, Incentives, and Survey Modality: An Experimental Test of Methods to Increase Survey Response Rates of School Principals

    ERIC Educational Resources Information Center

    Jacob, Robin Tepper; Jacob, Brian

    2012-01-01

    Teacher and principal surveys are among the most common data collection techniques employed in education research. Yet there is remarkably little research on survey methods in education, or about the most cost-effective way to raise response rates among teachers and principals. In an effort to explore various methods for increasing survey response…

  12. An integrated approach to the use of Landsat TM data for gold exploration in west central Nevada

    NASA Technical Reports Server (NTRS)

    Mouat, D. A.; Myers, J. S.; Miller, N. L.

    1987-01-01

    This paper represents an integration of several Landsat TM image processing techniques with other data to discriminate the lithologies and associated areas of hydrothermal alteration in the vicinity of the Paradise Peak gold mine in west central Nevada. A microprocessor-based image processing system and an IDIMS system were used to analyze data from a 512 X 512 window of a Landsat-5 TM scene collected on June 30, 1984. Image processing techniques included simple band composites, band ratio composites, principal components composites, and baseline-based composites. These techniques were chosen for their ability to discriminate the spectral characteristics of the products of hydrothermal alteration as well as of the associated regional lithologies. The simple band composite, the ratio composite, the two principal components composites, and the baseline-based composites can each separately delineate the principal areas of alteration. Combined, they provide a very powerful exploration tool.

  13. Impervious surfaces mapping using high resolution satellite imagery

    NASA Astrophysics Data System (ADS)

    Shirmeen, Tahmina

    In recent years, impervious surfaces have emerged not only as an indicator of the degree of urbanization, but also as an indicator of environmental quality. As impervious surface area increases, storm water runoff increases in velocity, quantity, temperature and pollution load. Any of these attributes can contribute to the degradation of natural hydrology and water quality. Various image processing techniques have been used to identify impervious surfaces; however, most existing impervious surface mapping tools use moderate resolution imagery. In this project, the potential of standard image processing techniques to generate impervious surface data for change detection analysis using high-resolution satellite imagery was evaluated. The city of Oxford, MS was selected as the study site. Standard image processing techniques, including the Normalized Difference Vegetation Index (NDVI), Principal Component Analysis (PCA), a combination of NDVI and PCA, and image classification algorithms, were used to generate impervious surfaces from multispectral IKONOS and QuickBird imagery acquired in both leaf-on and leaf-off conditions. Accuracy assessments were performed, using truth data generated by manual classification, with Kappa statistics and zonal statistics to select the most appropriate image processing techniques for impervious surface mapping. The performance of the selected image processing techniques was enhanced by incorporating the Soil Brightness Index (SBI) and Greenness Index (GI) derived from Tasseled Cap Transformed (TCT) IKONOS and QuickBird imagery. A time series of impervious surfaces for the time frame between 2001 and 2007 was generated using the refined image processing techniques to analyze the changes in IS in Oxford. It was found that NDVI and the combined NDVI-PCA methods are the most suitable image processing techniques for mapping impervious surfaces in leaf-off and leaf-on conditions, respectively, using high resolution multispectral imagery. It was also found that IS data generated by these techniques can be refined by removing conflicting dry soil patches using the SBI and GI obtained from TCT of the same imagery used for IS data generation. The change detection analysis of the IS time series shows that Oxford experienced its major changes in IS from 2001 to 2004 and from 2006 to 2007.
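
    A minimal Python sketch of the NDVI screening step used above, assuming the usual 4-band (blue, green, red, NIR) layout of IKONOS/QuickBird-like imagery and an illustrative threshold.

      import numpy as np

      rng = np.random.default_rng(7)
      img = rng.random((4, 100, 100)).astype(np.float64)   # synthetic 4-band scene
      red, nir = img[2], img[3]

      ndvi = (nir - red) / (nir + red + 1e-9)              # avoid divide-by-zero
      non_vegetated = ndvi < 0.2                           # illustrative threshold
      print("candidate impervious fraction:", non_vegetated.mean())
      # the combined NDVI-PCA method additionally screens the
      # non-vegetated mask with principal component scores of all bands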

  14. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...

  15. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...

  16. A study on the use of Gumbel approximation with the Bernoulli spatial scan statistic.

    PubMed

    Read, S; Bath, P A; Willett, P; Maheswaran, R

    2013-08-30

    The Bernoulli version of the spatial scan statistic is a well established method of detecting localised spatial clusters in binary labelled point data, a typical application being the epidemiological case-control study. A recent study suggests the inferential accuracy of several versions of the spatial scan statistic (principally the Poisson version) can be improved, at little computational cost, by using the Gumbel distribution, a method now available in SaTScan(TM) (www.satscan.org). We study in detail the effect of this technique when applied to the Bernoulli version and demonstrate that it is highly effective, albeit with some increase in false alarm rates at certain significance thresholds. We explain how this increase is due to the discrete nature of the Bernoulli spatial scan statistic and demonstrate that it can affect even small p-values. Despite this, we argue that the Gumbel method is actually preferable for very small p-values. Furthermore, we extend previous research by running benchmark trials on 12 000 synthetic datasets, thus demonstrating that the overall detection capability of the Bernoulli version (i.e. ratio of power to false alarm rate) is not noticeably affected by the use of the Gumbel method. We also provide an example application of the Gumbel method using data on hospital admissions for chronic obstructive pulmonary disease. Copyright © 2013 John Wiley & Sons, Ltd.
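
    A minimal Python sketch of the Gumbel trick studied above: fit a Gumbel distribution to Monte Carlo replicates of the maximum statistic and read a smooth p-value off its tail, instead of the coarse rank-based p-value limited by the number of replicates. The replicates and the observed value are synthetic stand-ins.

      import numpy as np
      from scipy.stats import gumbel_r

      rng = np.random.default_rng(8)
      # stand-in for 999 maxima of the scan statistic under the null
      null_max = rng.normal(size=(999, 50)).max(axis=1)

      loc, scale = gumbel_r.fit(null_max)
      observed = 4.2                                   # illustrative observed maximum
      p_gumbel = gumbel_r.sf(observed, loc, scale)     # smooth tail p-value
      p_rank = (1 + np.sum(null_max >= observed)) / (1 + len(null_max))
      print(p_gumbel, p_rank)   # rank p cannot go below 1/1000; Gumbel can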

  17. Statistical Inference for Porous Materials using Persistent Homology.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moon, Chul; Heath, Jason E.; Mitchell, Scott A.

    2017-12-01

    We propose a porous materials analysis pipeline using persistent homology. We first compute persistent homology of binarized 3D images of sampled material subvolumes. For each image we compute sets of homology intervals, which are represented as summary graphics called persistence diagrams. We convert persistence diagrams into image vectors in order to analyze the similarity of the homology of the material images using the mature tools for image analysis. Each image is treated as a vector and we compute its principal components to extract features. We fit a statistical model using the loadings of principal components to estimate material porosity, permeability, anisotropy, and tortuosity. We also propose an adaptive version of the structural similarity index (SSIM), a similarity metric for images, as a measure to determine the statistical representative elementary volumes (sREV) for persistent homology. Thus we provide a capability for making a statistical inference of the fluid flow and transport properties of porous materials based on their geometry and connectivity.
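
    A minimal Python sketch of the vectorization-and-regression chain described above, assuming precomputed persistence diagrams: each diagram is turned into a "persistence image" vector by spreading Gaussian mass on a birth-persistence grid, then PCA loadings feed a linear model. Diagrams and the response variable are random stand-ins, not micro-CT data.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(16)

      def persistence_image(diagram, grid=20, sigma=0.05):
          birth, death = diagram[:, 0], diagram[:, 1]
          pers = death - birth
          xs = np.linspace(0, 1, grid)
          img = np.zeros((grid, grid))
          for b, p in zip(birth, pers):
              gx = np.exp(-(xs - b) ** 2 / (2 * sigma ** 2))
              gy = np.exp(-(xs - p) ** 2 / (2 * sigma ** 2))
              img += p * np.outer(gy, gx)      # weight features by persistence
          return img.ravel()

      diagrams = [np.sort(rng.random((30, 2)), axis=1) for _ in range(80)]
      X = np.array([persistence_image(d) for d in diagrams])
      porosity = rng.random(80)                # stand-in response variable

      scores = PCA(n_components=5).fit_transform(X)
      model = LinearRegression().fit(scores, porosity)
      print("R^2 on training data:", round(model.score(scores, porosity), 3))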

  18. Spatial characterization of dissolved trace elements and heavy metals in the upper Han River (China) using multivariate statistical techniques.

    PubMed

    Li, Siyue; Zhang, Quanfa

    2010-04-15

    A data matrix (4032 observations), obtained during a 2-year monitoring period (2005-2006) from 42 sites in the upper Han River, is subjected to various multivariate statistical techniques, including cluster analysis, principal component analysis (PCA), factor analysis (FA), correlation analysis and analysis of variance, to determine the spatial characterization of dissolved trace elements and heavy metals. Our results indicate that waters in the upper Han River are primarily polluted by Al, As, Cd, Pb, Sb and Se, and the potential pollutants include Ba, Cr, Hg, Mn and Ni. The spatial distribution of trace metals indicates that the polluted sections are mainly concentrated in the Danjiang, the Danjiangkou Reservoir catchment and the Hanzhong Plain, with the most contaminated river in the Hanzhong Plain. Q-mode clustering depends on the geographical location of the sampling sites and groups the 42 sites into four clusters pertaining to water quality, i.e., the Danjiang, the Danjiangkou Reservoir region (lower catchment), the upper catchment and one river in the headwaters. The headwaters, the Danjiang and lower catchment, and the upper catchment correspond to highly polluted, moderately polluted and relatively lightly polluted regions, respectively. Additionally, PCA/FA and correlation analysis demonstrate that Al, Cd, Mn, Ni, Fe, Si and Sr are controlled by natural sources, whereas the other metals appear to be primarily controlled by anthropogenic origins, though geogenic sources also contribute. 2009 Elsevier B.V. All rights reserved.

  19. Facial anthropometric differences among gender, ethnicity, and age groups.

    PubMed

    Zhuang, Ziqing; Landsittel, Douglas; Benson, Stacey; Roberge, Raymond; Shaffer, Ronald

    2010-06-01

    The impact of race/ethnicity upon facial anthropometric data in the US workforce, and on the development of personal protective equipment, has not been investigated to any significant degree. The growth of minority populations in the US workforce has increased the need to investigate differences in facial dimensions among these workers. The objective of this study was to determine the face shape and size differences among race and age groups from the National Institute for Occupational Safety and Health survey of 3997 US civilian workers. Survey participants were divided into two gender groups, four racial/ethnic groups, and three age groups. Measurements of height, weight, neck circumference, and 18 facial dimensions were collected using traditional anthropometric techniques. A multivariate analysis of the data was performed using Principal Component Analysis. An exploratory analysis of the effect that different demographic factors had on anthropometric features was carried out via a linear model. The 21 anthropometric measurements, body mass index, and the first and second principal component scores were dependent variables, while gender, ethnicity, age, occupation, weight, and height served as independent variables. Gender contributes significantly to size for 19 of 24 dependent variables. African-Americans have statistically shorter, wider, and shallower noses than Caucasians. Hispanic workers have 14 facial features that are significantly larger than those of Caucasians, while their nose protrusion, height, and head length are significantly shorter. The other ethnic group was composed primarily of Asian subjects and has statistically different dimensions from Caucasians for 16 anthropometric values. Nineteen anthropometric values for subjects at least 45 years of age are statistically different from those measured for subjects between 18 and 29 years of age. Workers employed in manufacturing, fire fighting, healthcare, law enforcement, and other occupational groups have facial features that differ significantly from those in construction. Statistically significant differences in facial anthropometric dimensions (P < 0.05) were noted between males and females, among all racial/ethnic groups, and between subjects who were at least 45 years old and workers between 18 and 29 years of age. These findings could be important to the design and manufacture of respirators, as well as to employers responsible for supplying respiratory protective equipment to their employees.

  20. The MIND PALACE: A Multi-Spectral Imaging and Spectroscopy Database for Planetary Science

    NASA Astrophysics Data System (ADS)

    Eshelman, E.; Doloboff, I.; Hara, E. K.; Uckert, K.; Sapers, H. M.; Abbey, W.; Beegle, L. W.; Bhartia, R.

    2017-12-01

    The Multi-Instrument Database (MIND) is the web-based home to a well-characterized set of analytical data collected by a suite of deep-UV fluorescence/Raman instruments built at the Jet Propulsion Laboratory (JPL). Samples derive from a growing body of planetary surface analogs, mineral and microbial standards, meteorites, spacecraft materials, and other astrobiologically relevant materials. In addition to deep-UV spectroscopy, datasets stored in MIND are obtained from a variety of analytical techniques obtained over multiple spatial and spectral scales including electron microscopy, optical microscopy, infrared spectroscopy, X-ray fluorescence, and direct fluorescence imaging. Multivariate statistical analysis techniques, primarily Principal Component Analysis (PCA), are used to guide interpretation of these large multi-analytical spectral datasets. Spatial co-referencing of integrated spectral/visual maps is performed using QGIS (geographic information system software). Georeferencing techniques transform individual instrument data maps into a layered co-registered data cube for analysis across spectral and spatial scales. The body of data in MIND is intended to serve as a permanent, reliable, and expanding database of deep-UV spectroscopy datasets generated by this unique suite of JPL-based instruments on samples of broad planetary science interest.

  1. Relating N2O emissions during biological nitrogen removal with operating conditions using multivariate statistical techniques.

    PubMed

    Vasilaki, V; Volcke, E I P; Nandi, A K; van Loosdrecht, M C M; Katsou, E

    2018-04-26

    Multivariate statistical analysis was applied to investigate the dependencies and underlying patterns between N2O emissions and online operational variables (dissolved oxygen and nitrogen component concentrations, temperature and influent flow-rate) during biological nitrogen removal from wastewater. The system under study was a full-scale reactor, for which hourly sensor data were available. The 15-month long monitoring campaign was divided into 10 sub-periods based on the profile of N2O emissions, using Binary Segmentation. The dependencies between operating variables and N2O emissions fluctuated according to Spearman's rank correlation. The correlation between N2O emissions and nitrite concentrations ranged between 0.51 and 0.78. Correlation >0.7 between N2O emissions and nitrate concentrations was observed in sub-periods with average temperature lower than 12 °C. Hierarchical k-means clustering and principal component analysis linked N2O emission peaks with precipitation events and ammonium concentrations higher than 2 mg/L, especially in sub-periods characterized by low N2O fluxes. Additionally, the highest ranges of measured N2O fluxes belonged to clusters corresponding with NO3-N concentration less than 1 mg/L in the upstream plug-flow reactor (middle of oxic zone), indicating slow nitrification rates. The results showed that the range of N2O emissions partially depends on the prior behavior of the system. The principal component analysis validated the findings from the clustering analysis and showed that ammonium, nitrate, nitrite and temperature explained a considerable percentage of the variance in the system for the majority of the sub-periods. The applied statistical methods linked the different ranges of emissions with the system variables, provided insights on the effect of operating conditions on N2O emissions in each sub-period and can be integrated into N2O emissions data processing at wastewater treatment plants. Copyright © 2018. Published by Elsevier Ltd.
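
    A minimal Python sketch of two steps from this analysis, on synthetic sensor series: Spearman rank correlation between N2O flux and an operating variable within a sub-period, and k-means clustering of operating conditions. The variables and numbers are invented.

      import numpy as np
      from scipy.stats import spearmanr
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(9)
      hours = 24 * 30
      nitrite = np.abs(rng.normal(0.5, 0.3, hours))
      n2o = 0.8 * nitrite + rng.normal(0, 0.1, hours)      # monotone dependence

      rho, p = spearmanr(nitrite, n2o)
      print(f"Spearman rho = {rho:.2f} (p = {p:.1e})")

      features = np.column_stack([nitrite, n2o, rng.normal(12, 2, hours)])  # + temp
      clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
      for k in range(3):                                   # mean conditions per cluster
          print(k, features[clusters == k].mean(axis=0).round(2))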

  2. A study of the comparative effects of various means of motion cueing during a simulated compensatory tracking task

    NASA Technical Reports Server (NTRS)

    Mckissick, B. T.; Ashworth, B. R.; Parrish, R. V.; Martin, D. J., Jr.

    1980-01-01

    NASA's Langley Research Center conducted a simulation experiment to ascertain the comparative effects of motion cues (combinations of platform motion and g-seat normal acceleration cues) on compensatory tracking performance. In the experiment, a full six-degree-of-freedom YF-16 model was used as the simulated pursuit aircraft. The Langley Visual Motion Simulator (with in-house developed wash-out), and a Langley developed g-seat were principal components of the simulation. The results of the experiment were examined utilizing univariate and multivariate techniques. The statistical analyses demonstrate that the platform motion and g-seat cues provide additional information to the pilot that allows substantial reduction of lateral tracking error. Also, the analyses show that the g-seat cue helps reduce vertical error.

  3. Assessment of physicochemical and antioxidant characteristics of Quercus pyrenaica honeydew honeys.

    PubMed

    Shantal Rodríguez Flores, M; Escuredo, Olga; Carmen Seijo, M

    2015-01-01

    Consumers are exhibiting increasing interest in honeydew honey, principally due to its functional properties. Some plants can be sources of honeydew honey, but in north-western Spain, this honey type only comes from Quercus pyrenaica. In the present study, the melissopalynological and physicochemical characteristics and the antioxidant properties of 32 honeydew honey samples are described. Q. pyrenaica honeydew honey was defined by its colour, high pH, phenols and flavonoids. Multivariate statistical techniques were used to analyse the influence of the production year on the honey's physicochemical parameters and polyphenol content. Differences among the honey samples were found, showing that weather affected the physicochemical composition of the honey samples. Optimal conditions for oak growth favoured the production of honeydew honey. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Increased-resolution OCT thickness mapping of the human macula: a statistically based registration.

    PubMed

    Bernardes, Rui; Santos, Torcato; Cunha-Vaz, José

    2008-05-01

    To describe the development of a technique that enhances spatial resolution of retinal thickness maps of the Stratus OCT (Carl Zeiss Meditec, Inc., Dublin, CA). A retinal thickness atlas (RT-atlas) template was calculated, and a macular coordinate system was established, to pursue this objective. The RT-atlas was developed from principal component analysis of retinal thickness analyzer (RTA) maps acquired from healthy volunteers. The Stratus OCT radial thickness measurements were registered on the RT-atlas, from which an improved macular thickness map was calculated. Thereafter, Stratus OCT circular scans were registered on the previously calculated map to enhance spatial resolution. The developed technique was applied to Stratus OCT thickness data from healthy volunteers and from patients with diabetic retinopathy (DR) or age-related macular degeneration (AMD). Results showed that for normal, or close to normal, macular thickness maps from healthy volunteers and patients with DR, this technique can be an important aid in determining retinal thickness. Efforts are under way to improve the registration of retinal thickness data in patients with AMD. The developed technique enhances the evaluation of data acquired by the Stratus OCT, helping the detection of early retinal thickness abnormalities. Moreover, a normative database of retinal thickness measurements gained from this technique, as referenced to the macular coordinate system, can be created without errors induced by missed fixation and eye tilt.

  5. The Influence Function of Principal Component Analysis by Self-Organizing Rule.

    PubMed

    Higuchi; Eguchi

    1998-07-28

    This article is concerned with a neural network approach to principal component analysis (PCA). An algorithm for PCA by the self-organizing rule has been proposed and its robustness observed through the simulation study by Xu and Yuille (1995). In this article, the robustness of the algorithm against outliers is investigated by using the theory of influence function. The influence function of the principal component vector is given in an explicit form. Through this expression, the method is shown to be robust against any directions orthogonal to the principal component vector. In addition, a statistic generated by the self-organizing rule is proposed to assess the influence of data in PCA.
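
    For concreteness, here is a minimal Python sketch of a self-organizing PCA rule of the kind analyzed in this line of work: Oja's single-unit update, whose weight vector converges to the first principal component direction. This is a generic textbook rule, not necessarily the exact algorithm studied in the paper.

      import numpy as np

      rng = np.random.default_rng(10)
      C = np.array([[3.0, 1.0], [1.0, 2.0]])
      X = rng.multivariate_normal([0, 0], C, size=20000)

      w = rng.normal(size=2)
      eta = 0.01
      for x in X:
          y = w @ x
          w += eta * y * (x - y * w)            # Oja's self-organizing update

      top_eigvec = np.linalg.eigh(C)[1][:, -1]  # analytic first PC for comparison
      print(w / np.linalg.norm(w), top_eigvec)  # should agree up to sign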

  6. A Study of the Effect of Secondary School Leadership Styles on Student Achievement in Selected Secondary School in Louisiana

    ERIC Educational Resources Information Center

    Harris, Cydnie Ellen Smith

    2012-01-01

    The effect of the leadership style of the secondary school principal on student achievement in select public schools in Louisiana was examined in this study. The null hypothesis was that there was no statistically significant relationship between principal leadership style and student academic achievement. The researcher submitted the LEAD-Self…

  7. Principals' Time, Tasks, and Professional Development: An Analysis of Schools and Staffing Survey Data. REL 2017-201

    ERIC Educational Resources Information Center

    Lavigne, Heather J.; Shakman, Karen; Zweig, Jacqueline; Greller, Sara L.

    2016-01-01

    This study describes how principals reported spending their time and what professional development they reported participating in, based on data collected through the Schools and Staffing Survey by the National Center for Education Statistics during the 2011/12 school year. The study analyzes schools by grade level, poverty level, and within…

  8. The Relationship between Knowledge Management and Organizational Learning with the Effectiveness of Ordinary and Smart Secondary School Principals

    ERIC Educational Resources Information Center

    Khammar, Zahra; Heidarzadegan, Alireza; Balaghat, Seyed Reza; Salehi, Hadi

    2013-01-01

    This study aimed to investigate the relationship between knowledge management and organizational learning with the effectiveness of ordinary and smart high school principals in Zahedan Pre-province. The statistical population of this research comprises 1350 male and female teachers teaching in ordinary and smart high schools, of whom 300…

  9. 76 FR 27563 - Margin and Capital Requirements for Covered Swap Entities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-11

    .... Board: Sean D. Campbell, Deputy Associate Director, Division of Research and Statistics, (202) 452-3761, Michael Gibson, Senior Associate Director, Division of Research and Statistics, (202) 452- 2495, or Jeremy..., DC 20429. FHFA: Robert Collender, Principal Policy Analyst, Office of Policy Analysis and Research...

  10. Information theoretic partitioning and confidence based weight assignment for multi-classifier decision level fusion in hyperspectral target recognition applications

    NASA Astrophysics Data System (ADS)

    Prasad, S.; Bruce, L. M.

    2007-04-01

    There is a growing interest in using multiple sources for automatic target recognition (ATR) applications. One approach is to take multiple, independent observations of a phenomenon and perform a feature level or a decision level fusion for ATR. This paper proposes a method to utilize these types of multi-source fusion techniques to exploit hyperspectral data when only a small number of training pixels are available. Conventional hyperspectral image based ATR techniques project the high dimensional reflectance signature onto a lower dimensional subspace using techniques such as Principal Components Analysis (PCA), Fisher's linear discriminant analysis (LDA), subspace LDA and stepwise LDA. While some of these techniques attempt to solve the curse of dimensionality, or small sample size problem, these are not necessarily optimal projections. In this paper, we present a divide and conquer approach to address the small sample size problem. The hyperspectral space is partitioned into contiguous subspaces such that the discriminative information within each subspace is maximized, and the statistical dependence between subspaces is minimized. We then treat each subspace as a separate source in a multi-source multi-classifier setup and test various decision fusion schemes to determine their efficacy. Unlike previous approaches which use correlation between variables for band grouping, we study the efficacy of higher order statistical information (using average mutual information) for a bottom up band grouping. We also propose a confidence measure based decision fusion technique, where the weights associated with various classifiers are based on their confidence in recognizing the training data. To this end, training accuracies of all classifiers are used for weight assignment in the fusion process of test pixels. The proposed methods are tested using hyperspectral data with known ground truth, such that the efficacy can be quantitatively measured in terms of target recognition accuracies.
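
    A minimal Python sketch of the confidence-weighted fusion idea described above, on synthetic data: split the bands into contiguous groups, train one classifier per group, and fuse their soft votes with weights taken from each classifier's training accuracy. The group count, classifier choice and data are illustrative assumptions.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(17)
      n, n_bands, n_groups = 120, 60, 4
      X = rng.normal(size=(n, n_bands))
      y = rng.integers(0, 2, n)
      X[y == 1, :10] += 1.0                        # discriminative info in early bands

      groups = np.array_split(np.arange(n_bands), n_groups)   # contiguous subspaces
      clfs, weights = [], []
      for g in groups:
          clf = LinearDiscriminantAnalysis().fit(X[:, g], y)
          clfs.append(clf)
          weights.append(clf.score(X[:, g], y))    # training accuracy as confidence

      votes = np.array([w * clf.predict_proba(X[:, g])
                        for w, clf, g in zip(weights, clfs, groups)])
      fused = votes.sum(axis=0).argmax(axis=1)     # weighted soft vote
      print("fused training accuracy:", (fused == y).mean())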

  11. Hyperspectral imaging coupled with chemometric analysis for non-invasive differentiation of black pens

    NASA Astrophysics Data System (ADS)

    Chlebda, Damian K.; Majda, Alicja; Łojewski, Tomasz; Łojewska, Joanna

    2016-11-01

    Differentiation of the written text can be performed with a non-invasive and non-contact tool that connects conventional imaging methods with spectroscopy. Hyperspectral imaging (HSI) is a relatively new and rapid analytical technique that can be applied in forensic science disciplines. It allows an image of the sample to be acquired, with full spectral information within every pixel. For this paper, HSI and three statistical methods (hierarchical cluster analysis, principal component analysis, and spectral angle mapper) were used to distinguish between traces of modern black gel pen inks. Non-invasiveness and high efficiency are among the unquestionable advantages of ink differentiation using HSI. It is also less time-consuming than traditional methods such as chromatography. In this study, a set of 45 modern gel pen ink marks deposited on a paper sheet were registered. The spectral characteristics embodied in every pixel were extracted from an image and analysed using statistical methods, externally and directly on the hypercube. As a result, different black gel inks deposited on paper can be distinguished and classified into several groups, in a non-invasive manner.
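
    A minimal Python sketch of the spectral angle mapper (SAM) step named above: each pixel spectrum is assigned to the reference ink spectrum with the smallest spectral angle. The spectra are random stand-ins for hypercube pixels.

      import numpy as np

      rng = np.random.default_rng(11)
      n_bands = 120
      refs = rng.random((3, n_bands))                 # 3 reference ink spectra
      pixels = refs[rng.integers(0, 3, 500)] + rng.normal(0, 0.05, (500, n_bands))

      def spectral_angle(a, b):
          cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
          return np.arccos(np.clip(cos, -1.0, 1.0))   # radians

      angles = np.array([[spectral_angle(p, r) for r in refs] for p in pixels])
      assigned = angles.argmin(axis=1)                # smallest angle wins
      print(np.bincount(assigned))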

  12. A Dimensionally Reduced Clustering Methodology for Heterogeneous Occupational Medicine Data Mining.

    PubMed

    Saâdaoui, Foued; Bertrand, Pierre R; Boudet, Gil; Rouffiac, Karine; Dutheil, Frédéric; Chamoux, Alain

    2015-10-01

    Clustering is a set of statistical learning techniques aimed at finding, in heterogeneous data, structured partitions that group homogeneous observations into clusters. Clustering has been successfully applied in several fields, such as medicine, biology, finance, and economics. In this paper, we introduce the notion of clustering in multifactorial data analysis problems. A case study is conducted for an occupational medicine problem with the purpose of analyzing patterns in a population of 813 individuals. To reduce the dimensionality of the data set, we base our approach on Principal Component Analysis (PCA), the statistical tool most commonly used in factorial analysis. However, problems in nature, especially in medicine, are often based on heterogeneous qualitative-quantitative measurements, whereas PCA only processes quantitative ones. Moreover, qualitative data are originally unobservable quantitative responses that are usually binary-coded. Hence, we propose a new set of strategies allowing quantitative and qualitative data to be handled simultaneously. The principle of this approach is to project the qualitative variables onto the subspaces spanned by the quantitative ones. Subsequently, an optimal model is allocated to the resulting PCA-regressed subspaces.

  13. Data Analysis & Statistical Methods for Command File Errors

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Waggoner, Bruce; Bryant, Larry

    2014-01-01

    This paper explains current work on modeling for managing the risk of command file errors. It is focused on analyzing actual data from a JPL spaceflight mission to build models for evaluating and predicting error rates as a function of several key variables. We constructed a rich dataset by considering the number of errors and the number of files radiated, including the number of commands and blocks in each file, as well as subjective estimates of workload and operational novelty. We have assessed these data using different curve fitting and distribution fitting techniques, such as multiple regression analysis and maximum likelihood estimation, to see how much of the variability in the error rates can be explained. We have also used goodness-of-fit testing strategies and principal component analysis to further assess our data. Finally, we constructed a model of expected error rates based on what these statistics bore out as critical drivers of the error rate. This model allows project management to evaluate the error rate against a theoretically expected rate as well as anticipate future error rates.

  14. A Generic multi-dimensional feature extraction method using multiobjective genetic programming.

    PubMed

    Zhang, Yang; Rockett, Peter I

    2009-01-01

    In this paper, we present a generic feature extraction method for pattern classification using multiobjective genetic programming. This not only evolves the (near-)optimal set of mappings from a pattern space to a multi-dimensional decision space, but also simultaneously optimizes the dimensionality of that decision space. The presented framework evolves vector-to-vector feature extractors that maximize class separability. We demonstrate the efficacy of our approach by making statistically-founded comparisons with a wide variety of established classifier paradigms over a range of datasets and find that for most of the pairwise comparisons, our evolutionary method delivers statistically smaller misclassification errors. At very worst, our method displays no statistical difference in a few pairwise comparisons with established classifier/dataset combinations; crucially, none of the misclassification results produced by our method is worse than any comparator classifier. Although principally focused on feature extraction, feature selection is also performed as an implicit side effect; we show that both feature extraction and selection are important to the success of our technique. The presented method has the practical consequence of obviating the need to exhaustively evaluate a large family of conventional classifiers when faced with a new pattern recognition problem in order to attain a good classification accuracy.

  15. A new statistical PCA-ICA algorithm for location of R-peaks in ECG.

    PubMed

    Chawla, M P S; Verma, H K; Kumar, Vinod

    2008-09-16

    The success of ICA in separating independent components from a mixture depends on the properties of the electrocardiogram (ECG) recordings. This paper discusses some of the conditions of independent component analysis (ICA) that could affect the reliability of the separation, and evaluates issues related to the properties of the signals and the number of sources. Principal component analysis (PCA) scatter plots are plotted to indicate the diagnostic features in the presence and absence of baseline wander when interpreting the ECG signals. In this analysis, a newly developed statistical algorithm by the authors, based on the use of combined PCA-ICA for two correlated channels of 12-channel ECG data, is proposed. The ICA technique has been successfully implemented in identifying and removing noise and artifacts from ECG signals. Cleaned ECG signals are obtained using statistical measures such as kurtosis and variance of variance after ICA processing. The paper also deals with the detection of QRS complexes in electrocardiograms using the combined PCA-ICA algorithm. The efficacy of the combined PCA-ICA algorithm lies in the fact that the location of the R-peaks is bounded from above and below by the location of the cross-over points, hence none of the peaks are ignored or missed.
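
    A minimal Python sketch of one ingredient of this pipeline: rank the separated components by kurtosis, since spiky QRS-bearing sources are strongly non-Gaussian while wander and noise are closer to Gaussian. The two-channel "ECG" below is synthetic, not the 12-channel data used in the paper.

      import numpy as np
      from scipy.stats import kurtosis
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(12)
      t = np.arange(5000)
      spikes = np.zeros_like(t, dtype=float)
      spikes[::350] = 1.0                              # R-peak-like impulse train
      baseline = np.sin(2 * np.pi * t / 2000)          # baseline wander
      mix = np.column_stack([0.9 * spikes + 0.4 * baseline,
                             0.5 * spikes + 0.8 * baseline]) + rng.normal(0, 0.02, (5000, 2))

      S = FastICA(n_components=2, random_state=0).fit_transform(mix)
      k = kurtosis(S, axis=0)
      qrs_ic = S[:, np.argmax(k)]                      # spiky component = QRS source
      print("kurtosis per IC:", np.round(k, 1))
      print("peaks found:", np.sum(np.abs(qrs_ic) > 4 * qrs_ic.std()))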

  16. A new methodology based on functional principal component analysis to study postural stability post-stroke.

    PubMed

    Sánchez-Sánchez, M Luz; Belda-Lois, Juan-Manuel; Mena-Del Horno, Silvia; Viosca-Herrero, Enrique; Igual-Camacho, Celedonia; Gisbert-Morant, Beatriz

    2018-05-05

    A major goal in stroke rehabilitation is the establishment of more effective physical therapy techniques to recover postural stability. Functional Principal Component Analysis provides greater insight into recovery trends. However, when missing values exist, obtaining functional data presents some difficulties. The purpose of this study was to reveal an alternative technique for obtaining the Functional Principal Components without requiring the conversion to functional data beforehand and to investigate this methodology to determine the effect of specific physical therapy techniques in balance recovery trends in elderly subjects with hemiplegia post-stroke. A randomized controlled pilot trial was developed. Thirty inpatients post-stroke were included. Control and target groups were treated with the same conventional physical therapy protocol based on functional criteria, but specific techniques were added to the target group depending on the subjects' functional level. Postural stability during standing was quantified by posturography. The assessments were performed once a month from the moment the participants were able to stand up to six months post-stroke. The target group showed a significant improvement in postural control recovery trend six months after stroke that was not present in the control group. Some of the assessed parameters revealed significant differences between treatment groups (P < 0.05). The proposed methodology allows Functional Principal Component Analysis to be performed when data is scarce. Moreover, it allowed the dynamics of recovery of two different treatment groups to be determined, showing that the techniques added in the target group increased postural stability compared to the base protocol. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. A statistically derived index for classifying East Coast fever reactions in cattle challenged with Theileria parva under experimental conditions.

    PubMed

    Rowlands, G J; Musoke, A J; Morzaria, S P; Nagda, S M; Ballingall, K T; McKeever, D J

    2000-04-01

    A statistically derived disease reaction index, based on parasitological, clinical and haematological measurements observed in 309 Boran cattle aged 5 to 8 months following laboratory challenge with Theileria parva, is described. Principal component analysis was applied to 13 measures, including the first appearance of schizonts, first appearance of piroplasms and first occurrence of pyrexia, together with the duration and severity of these symptoms, and white blood cell count. The first principal component, which was based on approximately equal contributions of the 13 variables, provided the definition of the disease reaction index on a scale of 0-10. As well as providing a more objective measure of the severity of the reaction, the continuous nature of the index score enables more powerful statistical analysis of the data than was previously possible with the clinically derived categories of non-, mild, moderate and severe reactions.
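
    A minimal Python sketch of the index construction: standardize the variables, take the first principal component, and rescale it to 0-10. The 13 clinical/parasitological variables are simulated here, not the study's measurements.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(13)
      severity = rng.random(309)                        # latent disease severity
      X = severity[:, None] * rng.random(13) + rng.normal(0, 0.2, (309, 13))

      pc1 = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(X))[:, 0]
      index = 10 * (pc1 - pc1.min()) / (pc1.max() - pc1.min())   # rescale to 0-10
      print(np.round(np.percentile(index, [0, 25, 50, 75, 100]), 1))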

  18. La estadistica en el planeamiento educativo (Statistics in Educational Planning).

    ERIC Educational Resources Information Center

    Leon Pacheco, Tomas

    1971-01-01

    This document is an English-language abstract (approximately 1500 words) summarizing the author's definitions of the principal physical and human characteristics of elementary and secondary education as presently constituted in Mexico so that school personnel may comply with Mexican regulations that force them to supply educational statistics. For…

  19. Similarities between principal components of protein dynamics and random diffusion

    NASA Astrophysics Data System (ADS)

    Hess, Berk

    2000-12-01

    Principal component analysis, also called essential dynamics, is a powerful tool for finding global, correlated motions in atomic simulations of macromolecules. It has become an established technique for analyzing molecular dynamics simulations of proteins. The first few principal components of simulations of large proteins often resemble cosines. We derive the principal components for high-dimensional random diffusion, which are almost perfect cosines. This resemblance between protein simulations and noise implies that for many proteins the time scales of current simulations are too short to obtain convergence of collective motions.
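
    A minimal Python sketch reproducing this observation on synthetic data: the first principal component of a high-dimensional random walk is very close to a half-period cosine, which is why cosine-like PCs in a protein simulation may signal unconverged sampling rather than real collective motion.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(14)
      T, D = 2000, 100
      walk = np.cumsum(rng.normal(size=(T, D)), axis=0)    # random diffusion

      pc1 = PCA(n_components=1).fit_transform(walk)[:, 0]
      half_cosine = np.cos(np.pi * np.arange(T) / T)       # one half-period
      r = np.corrcoef(pc1, half_cosine)[0, 1]
      print(f"correlation with cosine: {abs(r):.3f}")      # close to 1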

  20. XXI century projections of wind-wave conditions and sea-level rise in the Black sea

    NASA Astrophysics Data System (ADS)

    Polonsky, A.; Garmashov, A.; Fomin, V.; Valchev, N.; Trifonova, E.

    2012-04-01

    Projection of regional climate changes for the XXI century is one of the priorities of the EC environmental programme. Potential worsening of wave statistics, sea level rise and extreme surges are the principal negative consequences of climate change for the marine environment. The main purpose of this presentation is therefore to discuss these issues for the Black Sea region (with a strong focus on the south-west subregion, because maximum wave heights exceeding 10 m occur just here) using the output of several global coupled models (GCMs) for the XXI century, wave simulation, long-term observations of sea level and statistical techniques. First of all, we tried to choose the best coupled model(s) simulating Black Sea climate change and variability using the control experiments for the 20th century. The principal result is as follows: no single model adequately simulates even one atmospheric parameter for all seasons. Therefore, for the climate projection, we considered different outputs from various models. When it was possible, we also calculated the ensemble mean projection for the selected model(s) and emission scenarios. To calculate the wave projection, we used the output of the SWAN model forced by the GCM wind projection for 2010 to 2100. To estimate the sea level rise in the XXI century and future surge statistics, we extrapolated the observed sea level rise tendencies and used the statistical relation between wave heights and sea level together with the wave scenarios. Results show that, in general, the climate change in the XXI century does not lead to catastrophic changes in Black Sea wind-wave statistics, including the extreme waves in the S-W Black Sea. The typical atmospheric pattern leading to intense storms in the S-W Black Sea is characterized by a persistent anticyclonic area to the north of the Black Sea and cyclonic conditions in the southern Black Sea region. Such a pressure pattern causes persistent and strong eastern or north-eastern winds which generate high waves in the S-W Black Sea. The climate projections show that the frequency of such an atmospheric pattern will not substantially increase. The recent probability of extreme wave heights (exceeding 8 to 10 m) in the S-W Black Sea (~1 occurrence per 10 years) will not be much worse in the XXI century. A similar conclusion holds for the storm surges along the Bulgarian coastline. The expected sea level rise in the Black Sea basin for the XXI century due to regional climate changes is about 2 mm per year (±50%). However, some Black Sea subregions (such as the Odessa and Varna bays) are characterized by a fivefold sea level rise because of local land subsidence, and this geomorphologic effect is the most dangerous local consequence for the sustainable development and management of the coastal zone in such subregions. This study was supported by the EC project "THESEUS".

  1. Principal component analysis and neurocomputing-based models for total ozone concentration over different urban regions of India

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, Goutami; Chattopadhyay, Surajit; Chakraborthy, Parthasarathi

    2012-07-01

    The present study deals with daily total ozone concentration time series over four metropolitan cities of India, namely Kolkata, Mumbai, Chennai, and New Delhi, in a multivariate environment. Using the Kaiser-Meyer-Olkin measure, it is established that the data set under consideration is suitable for principal component analysis. Subsequently, by introducing a rotated component matrix for the principal components, the predictors suitable for generating an artificial neural network (ANN) for daily total ozone prediction are identified; multicollinearity is removed in this way. ANN models in the form of multilayer perceptrons trained through backpropagation learning are generated for all of the study zones, and the model outcomes are assessed statistically. Measuring various statistics such as Pearson correlation coefficients, Willmott's indices, percentage errors of prediction, and mean absolute errors, it is observed that the proposed ANN model generates very good predictions for Mumbai and Kolkata. The results are supported by the linearly distributed coordinates in the scatterplots.
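
    A minimal Python sketch of this modelling chain on synthetic data: PCA-derived predictors (which removes multicollinearity) feeding a multilayer perceptron trained by backpropagation, assessed with the Pearson correlation and mean absolute error. The series, layer sizes and component counts are illustrative assumptions.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(18)
      n = 1000
      latent = rng.normal(size=(n, 3))                     # underlying drivers
      X_raw = latent @ rng.normal(size=(3, 8)) + 0.1 * rng.normal(size=(n, 8))
      ozone = latent @ [2.0, -1.0, 0.5] + rng.normal(0, 0.3, n)

      X = PCA(n_components=3).fit_transform(X_raw)         # decorrelated predictors
      X_tr, X_te, y_tr, y_te = train_test_split(X, ozone, random_state=0)
      mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                         random_state=0).fit(X_tr, y_tr)
      pred = mlp.predict(X_te)
      print("Pearson r:", np.corrcoef(pred, y_te)[0, 1].round(3))
      print("MAE:", np.abs(pred - y_te).mean().round(3))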

  2. Determination, speciation and distribution of mercury in soil in the surroundings of a former chlor-alkali plant: assessment of sequential extraction procedure and analytical technique

    PubMed Central

    2013-01-01

    Background The paper presents the evaluation of soil contamination with total, water-available, mobile, semi-mobile and non-mobile Hg fractions in the surroundings of a former chlor-alkali plant in connection with several chemical soil characteristics. Principal Component Analysis and Cluster Analysis were used to evaluate the chemical composition variability of soil and factors influencing the fate of Hg in such areas. The sequential extraction EPA 3200-Method and the determination technique based on capacitively coupled microplasma optical emission spectrometry were checked. Results A case study was conducted in the Turda town, Romania. The results revealed a high contamination with Hg in the area of the former chlor-alkali plant and waste landfills, where soils were categorized as hazardous waste. The weight of the Hg fractions decreased in the order semi-mobile > non-mobile > mobile > water leachable. Principal Component Analysis revealed 7 factors describing chemical composition variability of soil, of which 3 attributed to Hg species. Total Hg, semi-mobile, non-mobile and mobile fractions were observed to have a strong influence, while the water leachable fraction a weak influence. The two-dimensional plot of PCs highlighted 3 groups of sites according to the Hg contamination factor. The statistical approach has shown that the Hg fate in soil is dependent on pH, content of organic matter, Ca, Fe, Mn, Cu and SO42- rather than natural components, such as aluminosilicates. Cluster analysis of soil characteristics revealed 3 clusters, one of which including Hg species. Soil contamination with Cu as sulfate and Zn as nitrate was also observed. Conclusions The approach based on speciation and statistical interpretation of data developed in this study could be useful in the investigation of other chlor-alkali contaminated areas. According to the Bland and Altman test the 3-step sequential extraction scheme is suitable for Hg speciation in soil, while the used determination method of Hg is appropriate. PMID:24252185

  3. EVALUATION OF ACID DEPOSITION MODELS USING PRINCIPAL COMPONENT SPACES

    EPA Science Inventory

An analytical technique involving principal components analysis is proposed for use in the evaluation of acid deposition models. Relationships among model predictions are compared to those among measured data, rather than the more common one-to-one comparison of predictions to mea...

  4. Leadership and Personality: Is There a Relationship between Self-Assessed Leadership Traits and Self-Assessed Personality Traits of Female Elementary School Principals in the Hampton Roads Area of Virginia?

    ERIC Educational Resources Information Center

    Ireland, Lakisha Nicole

    2017-01-01

    This study attempted to determine if there were statistically significant relationships between leadership traits and personality traits of female elementary school principals who serve in school districts located within the Hampton Roads area of Virginia. This study examined randomly selected participants from three school divisions. These…

  5. "A Compliment Is All I Need"--Teachers Telling Principals How to Promote Their Staff's Self-Efficacy

    ERIC Educational Resources Information Center

    Kass, Efrat

    2013-01-01

    The purpose of the present study is to compare the perceptions of teachers representing opposite ends of the self-efficacy spectrum regarding the effects of the principal's behavior on their professional self-efficacy. In the first quantitative stage, a statistical procedure was conducted to identify the two groups of teachers: a group of 16…

  6. Discrimination of soft tissues using laser-induced breakdown spectroscopy in combination with k nearest neighbors (kNN) and support vector machine (SVM) classifiers

    NASA Astrophysics Data System (ADS)

    Li, Xiaohui; Yang, Sibo; Fan, Rongwei; Yu, Xin; Chen, Deying

    2018-06-01

    In this paper, discrimination of soft tissues using laser-induced breakdown spectroscopy (LIBS) in combination with multivariate statistical methods is presented. Fresh pork fat, skin, ham, loin and tenderloin muscle tissues are manually cut into slices and ablated using a 1064 nm pulsed Nd:YAG laser. Discrimination analyses between fat, skin and muscle tissues, and further between highly similar ham, loin and tenderloin muscle tissues, are performed based on the LIBS spectra in combination with multivariate statistical methods, including principal component analysis (PCA), k nearest neighbors (kNN) classification, and support vector machine (SVM) classification. Performances of the discrimination models, including accuracy, sensitivity and specificity, are evaluated using 10-fold cross validation. The classification models are optimized to achieve best discrimination performances. The fat, skin and muscle tissues can be definitely discriminated using both kNN and SVM classifiers, with accuracy of over 99.83%, sensitivity of over 0.995 and specificity of over 0.998. The highly similar ham, loin and tenderloin muscle tissues can also be discriminated with acceptable performances. The best performances are achieved with SVM classifier using Gaussian kernel function, with accuracy of 76.84%, sensitivity of over 0.742 and specificity of over 0.869. The results show that the LIBS technique assisted with multivariate statistical methods could be a powerful tool for online discrimination of soft tissues, even for tissues of high similarity, such as muscles from different parts of the animal body. This technique could be used for discrimination of tissues suffering minor clinical changes, thus may advance the diagnosis of early lesions and abnormalities.
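
    A hedged sketch of the discrimination workflow named above (PCA features, kNN and SVM classifiers, 10-fold cross validation), assuming scikit-learn; the spectra, labels, and hyperparameters are illustrative stand-ins for the LIBS data:

    ```python
    # PCA reduces the spectra to a few components, then kNN and an RBF-kernel
    # SVM are compared with 10-fold cross validation; all data are synthetic.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(150, 200))                 # placeholder LIBS spectra
    y = np.repeat(["fat", "skin", "muscle"], 50)    # placeholder tissue labels

    for name, clf in [("kNN", KNeighborsClassifier(n_neighbors=5)),
                      ("SVM-RBF", SVC(kernel="rbf", C=10, gamma="scale"))]:
        pipe = make_pipeline(PCA(n_components=10), clf)
        acc = cross_val_score(pipe, X, y, cv=10)    # 10-fold cross validation
        print(name, "accuracy: %.3f +/- %.3f" % (acc.mean(), acc.std()))
    ```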

  7. A new validation technique for estimations of body segment inertia tensors: Principal axes of inertia do matter.

    PubMed

    Rossi, Marcel M; Alderson, Jacqueline; El-Sallam, Amar; Dowling, James; Reinbolt, Jeffrey; Donnelly, Cyril J

    2016-12-08

The aims of this study were to: (i) establish a new criterion method to validate inertia tensor estimates by setting the experimental angular velocity data of an airborne object as ground truth against simulations run with the estimated tensors, and (ii) test the sensitivity of the simulations to changes in the inertia tensor components. A rigid steel cylinder was covered with reflective kinematic markers and projected through a calibrated motion capture volume. Simulations of the airborne motion were run with two models, using inertia tensors estimated with a geometric formula or with the compound pendulum technique. The deviation angles between experimental (ground truth) and simulated angular velocity vectors and the root mean squared deviation angle were computed for every simulation. Monte Carlo analyses were performed to assess the sensitivity of the simulations to changes in the magnitude of the principal moments of inertia within ±10% and to changes in the orientation of the principal axes of inertia within ±10° (of the geometric-based inertia tensor). Root mean squared deviation angles ranged between 2.9° and 4.3° for the inertia tensor estimated geometrically, and between 11.7° and 15.2° for the compound pendulum values. Errors up to 10% in the magnitude of the principal moments of inertia yielded root mean squared deviation angles ranging between 3.2° and 6.6°, and between 5.5° and 7.9° when lumped with errors of 10° in the orientation of the principal axes of inertia. The proposed technique can effectively validate inertia tensors from novel estimation methods of body segment inertial parameters. The orientation of the principal axes of inertia should not be neglected when modelling human/animal mechanics. Copyright © 2016 Elsevier Ltd. All rights reserved.
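
    A brief sketch of the Monte Carlo perturbation idea, assuming NumPy and SciPy; the baseline principal moments below are invented for illustration, not taken from the paper:

    ```python
    # Perturb the principal moments within +/-10% and the principal-axis
    # orientation within +/-10 degrees, then rebuild the tensor as
    # I = R diag(I1, I2, I3) R^T.
    import numpy as np
    from scipy.spatial.transform import Rotation

    rng = np.random.default_rng(2)
    I_principal = np.array([0.010, 0.010, 0.002])   # placeholder moments, kg m^2

    def perturbed_tensor():
        moments = I_principal * (1 + rng.uniform(-0.10, 0.10, size=3))
        angles = rng.uniform(-10.0, 10.0, size=3)   # axis misalignment, degrees
        R = Rotation.from_euler("xyz", angles, degrees=True).as_matrix()
        return R @ np.diag(moments) @ R.T

    samples = np.stack([perturbed_tensor() for _ in range(1000)])
    print("mean perturbed tensor:\n", samples.mean(axis=0))
    ```

    Each perturbed tensor would then drive one airborne-motion simulation, with the deviation from the ground-truth angular velocity summarizing the sensitivity.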

  8. Seasonal rationalization of river water quality sampling locations: a comparative study of the modified Sanders and multivariate statistical approaches.

    PubMed

    Varekar, Vikas; Karmakar, Subhankar; Jha, Ramakar

    2016-02-01

The design of surface water quality sampling locations is a crucial decision-making process in the rationalization of a monitoring network. The quantity, quality, and types of available data (watershed characteristics and water quality data) may affect the selection of an appropriate design methodology. The modified Sanders approach and multivariate statistical techniques [particularly factor analysis (FA)/principal component analysis (PCA)] are well-accepted and widely used techniques for the design of sampling locations. However, their performance may vary significantly with the quantity, quality, and types of available data. In this paper, an attempt has been made to evaluate the performance of these techniques by accounting for the effect of seasonal variation under a situation of limited water quality data but extensive watershed characteristics information, as continuous and consistent river water quality data are usually difficult to obtain, whereas watershed information may be made available through the application of geospatial techniques. A case study of the Kali River, Western Uttar Pradesh, India, is selected for the analysis. The monitoring was carried out at 16 sampling locations. The discrete and diffuse pollution loads at different sampling sites were estimated and accounted for using the modified Sanders approach, whereas the monitored physical and chemical water quality parameters were utilized as inputs for FA/PCA. The optimum numbers of sampling locations designed by the modified Sanders approach for the monsoon and non-monsoon seasons are eight and seven, while those for FA/PCA are eleven and nine, respectively. Little variation in the number and locations of the designed sampling sites was obtained by the two techniques, which shows the stability of the results. A geospatial analysis has also been carried out to check the significance of the designed sampling locations with respect to the river basin characteristics and land use of the study area. Both methods are efficient; however, the modified Sanders approach outperforms FA/PCA when limited water quality data but extensive watershed information are available. The available water quality dataset is limited, and the FA/PCA-based approach fails to identify monitoring locations with higher variation, as these multivariate statistical approaches are data-driven. The priority/hierarchy and number of sampling sites designed by the modified Sanders approach are well justified by the land use practices and observed river basin characteristics of the study area.

  9. Microwave Technique for Detecting and Locating Concealed Weapons

    DOT National Transportation Integrated Search

    1971-12-01

    The subject of this report is the evaluation of a microwave technique for detecting and locating weapons concealed under clothing. The principal features of this technique are: persons subjected to search are not exposed to 'objectional' microwave ra...

  10. Biennial Survey of Education in the United States, 1936-1938. Bulletin, 1940, No. 2. Chapter I: Statistical Summary of Education, 1937-38

    ERIC Educational Resources Information Center

    Foster, Emily M.

    1942-01-01

    The U.S. Office of Education is required by law to collect statistics to show the condition and progress of education. Statistics can be made available, on a national scale, to the extent that school administrators, principals, and college officials cooperate on a voluntary basis with the Office of Education in making the facts available. This…

  11. An Empirical Cumulus Parameterization Scheme for a Global Spectral Model

    NASA Technical Reports Server (NTRS)

    Rajendran, K.; Krishnamurti, T. N.; Misra, V.; Tao, W.-K.

    2004-01-01

Realistic vertical heating and drying profiles in a cumulus scheme are important for obtaining accurate weather forecasts. A new empirical cumulus parameterization scheme, based on a procedure to improve the vertical distribution of heating and moistening over the tropics, is developed. The empirical cumulus parameterization scheme (ECPS) utilizes profiles of Tropical Rainfall Measuring Mission (TRMM) based heating and moistening derived from the European Centre for Medium-Range Weather Forecasts (ECMWF) analysis. A dimension reduction through rotated principal component analysis (RPCA) is performed on the vertical profiles of heating (Q1) and drying (Q2) over the convective regions of the tropics to obtain the dominant modes of variability. Analysis suggests that most of the variance associated with the observed profiles can be explained by retaining the first three modes. The ECPS then applies a statistical approach in which Q1 and Q2 are expressed as linear combinations of the first three dominant principal components, which distinctly explain variance in the troposphere as a function of the prevalent large-scale dynamics. The principal component (PC) score, which quantifies the contribution of each PC to the corresponding loading profile, is estimated through a multiple screening regression method that yields the PC score as a function of the large-scale variables. The profiles of Q1 and Q2 thus obtained are found to match well with the observed profiles. The impact of the ECPS is investigated in a series of short range (1-3 day) prediction experiments using the Florida State University global spectral model (FSUGSM, T126L14). Comparisons between short range ECPS forecasts and those with the modified Kuo scheme show a marked improvement in skill with the ECPS. This improvement emphasizes the importance of incorporating realistic vertical distributions of heating and drying in the model cumulus scheme, and suggests that, in the absence of explicit models for convection, the proposed statistical scheme improves the modeling of the vertical distribution of heating and moistening in areas of deep convection.
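
    An illustrative sketch of the core reconstruction step, expressing profiles as the mean plus a linear combination of the three leading principal components; the profile matrix is synthetic and the regression of PC scores on large-scale variables is omitted:

    ```python
    # Approximate heating profiles Q1 as mean + sum of three leading PC modes;
    # the 14-level profile data below are synthetic stand-ins.
    import numpy as np

    rng = np.random.default_rng(3)
    profiles = rng.normal(size=(1000, 14))      # placeholder Q1 profiles

    mean = profiles.mean(axis=0)
    anom = profiles - mean
    _, _, vt = np.linalg.svd(anom, full_matrices=False)
    loadings = vt[:3]                           # first three dominant modes

    scores = anom @ loadings.T                  # PC scores for each sample
    q1_hat = mean + scores @ loadings           # reconstruction from 3 modes
    frac = 1 - ((profiles - q1_hat) ** 2).sum() / (anom ** 2).sum()
    print("fraction of variance explained:", frac)
    ```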

  12. Integrated Approaches On Archaeo-Geophysical Data

    NASA Astrophysics Data System (ADS)

    Kucukdemirci, M.; Piro, S.; Zamuner, D.; Ozer, E.

    2015-12-01

Key words: Ground Penetrating Radar (GPR), Magnetometry, Geophysical Data Integration, Principal Component Analysis (PCA), Aizanoi Archaeological Site. Geophysical data integration methods are commonly divided into two classes: qualitative and quantitative approaches. This work focused on the application of quantitative integration approaches, which involve mathematical and statistical integration techniques, to the archaeo-geophysical data obtained at the Aizanoi Archaeological Site, Turkey. Two geophysical methods, Ground Penetrating Radar (GPR) and magnetometry, were applied for archaeological prospection of the selected site. After basic data processing for each geophysical method, the mathematical approaches of sums and products and the statistical approach of Principal Component Analysis (PCA) were applied for the integration. These integration approaches were first tested on synthetic digital images before application to field data. The same approaches were then applied to 2D magnetic maps and 2D GPR time slices obtained on the same unit grids at the archaeological site. Initially, the geophysical data were examined individually by reference to archaeological maps and information obtained from archaeologists, and some important structures such as possible walls, roads and relics were identified. The results of all integration approaches provided important and distinct details about the anomalies related to archaeological features; the integrated images can thus provide complementary information about the archaeological relics under the ground. Acknowledgements: The authors would like to thank the Scientific and Technological Research Council of Turkey (TUBITAK) Fellowship for Visiting Scientists Programme for its support, the Istanbul University Scientific Research Project Fund (Project No: 12302), and the archaeologist team of the Aizanoi Archaeological Site for their support during the field work.
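
    A minimal sketch of the quantitative integration approaches named above (sums, products, and PCA) applied to two co-registered grids; the magnetic and GPR grids are synthetic placeholders:

    ```python
    # Fuse a magnetic map and a GPR time slice after z-scoring each layer;
    # PCA across the two layers yields a dominant-component image.
    import numpy as np

    rng = np.random.default_rng(4)
    mag = rng.normal(size=(64, 64))      # placeholder magnetometry grid
    gpr = rng.normal(size=(64, 64))      # placeholder GPR amplitude slice

    def zscore(a):
        return (a - a.mean()) / a.std()

    m, g = zscore(mag), zscore(gpr)
    fused_sum = m + g                    # mathematical integration: sum
    fused_prod = m * g                   # mathematical integration: product

    stack = np.column_stack([m.ravel(), g.ravel()])
    cov = np.cov(stack, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    pc1 = (stack @ eigvecs[:, -1]).reshape(mag.shape)  # dominant component
    print(fused_sum.shape, fused_prod.shape, pc1.shape)
    ```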

  13. Parallel auto-correlative statistics with VTK.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2013-08-01

This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.

  14. Noninvasive prostate cancer screening based on serum surface-enhanced Raman spectroscopy and support vector machine

    NASA Astrophysics Data System (ADS)

    Li, Shaoxin; Zhang, Yanjiao; Xu, Junfa; Li, Linfang; Zeng, Qiuyao; Lin, Lin; Guo, Zhouyi; Liu, Zhiming; Xiong, Honglian; Liu, Songhao

    2014-09-01

This study aims to present a noninvasive prostate cancer screening method using serum surface-enhanced Raman scattering (SERS) and support vector machine (SVM) techniques on peripheral blood samples. SERS measurements are performed with silver nanoparticles using serum samples from 93 prostate cancer patients and 68 healthy volunteers. Three types of kernel functions, including linear, polynomial, and Gaussian radial basis function (RBF), are employed to build SVM diagnostic models for classifying the measured SERS spectra. To comparatively evaluate the performance of the SVM classification models, the standard multivariate statistical analysis method of principal component analysis (PCA) is also applied to classify the same datasets. The results show that the RBF kernel SVM diagnostic model achieves a diagnostic accuracy of 98.1%, superior to the 91.3% obtained with the PCA method. Receiver operating characteristic curves of the diagnostic models further confirm these results. This study demonstrates that label-free serum SERS analysis combined with an SVM diagnostic algorithm has great potential for noninvasive prostate cancer screening.
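
    A hedged sketch of an RBF-kernel SVM with ROC evaluation, assuming scikit-learn; the class sizes mirror the sample counts above, but the spectra themselves are synthetic:

    ```python
    # Train an RBF-kernel SVM on placeholder SERS spectra and report
    # accuracy and ROC AUC on a held-out split.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(5)
    X = rng.normal(size=(161, 300))           # placeholder SERS spectra
    y = np.r_[np.ones(93), np.zeros(68)]      # 93 patients, 68 controls

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    svm = SVC(kernel="rbf", probability=True, random_state=0).fit(X_tr, y_tr)
    scores = svm.predict_proba(X_te)[:, 1]
    print("accuracy:", svm.score(X_te, y_te))
    print("ROC AUC :", roc_auc_score(y_te, scores))
    ```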

  15. Reconstructing Past Admixture Processes from Local Genomic Ancestry Using Wavelet Transformation

    PubMed Central

    Sanderson, Jean; Sudoyo, Herawati; Karafet, Tatiana M.; Hammer, Michael F.; Cox, Murray P.

    2015-01-01

Admixture between long-separated populations is a defining feature of the genomes of many species. The mosaic block structure of admixed genomes can provide information about past contact events, including the time and extent of admixture. Here, we describe an improved wavelet-based technique that better characterizes ancestry block structure from observed genomic patterns. Principal components analysis is first applied to genomic data to identify the primary population structure, followed by wavelet decomposition to develop a new characterization of local ancestry information along the chromosomes. For testing purposes, this method is applied to human genome-wide genotype data from Indonesia, as well as virtual genetic data generated using genome-scale sequential coalescent simulations under a wide range of admixture scenarios. Time of admixture is inferred using an approximate Bayesian computation framework, providing robust estimates of both admixture times and their associated levels of uncertainty. Crucially, we demonstrate that this revised wavelet approach, which we have released as the R package adwave, provides improved statistical power over existing wavelet-based techniques and can be used to address a broad range of admixture questions. PMID:25852078
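
    The released implementation is the R package adwave; purely to illustrate the wavelet step, here is a Python sketch assuming the PyWavelets package, with a synthetic local-ancestry signal:

    ```python
    # Multilevel wavelet decomposition of a local-ancestry signal along a
    # chromosome; energy per scale summarizes block structure (long blocks
    # from recent admixture load on coarse scales, short blocks on fine ones).
    import numpy as np
    import pywt

    rng = np.random.default_rng(6)
    signal = np.r_[np.ones(128), np.zeros(64), np.ones(64)] + \
             rng.normal(scale=0.1, size=256)   # placeholder ancestry blocks

    coeffs = pywt.wavedec(signal, "haar", level=6)
    for level, c in enumerate(coeffs[1:], start=1):   # detail coefficients
        print("detail level", level, "energy %.2f" % float(np.sum(c ** 2)))
    ```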

  16. Detection and discrimination of microorganisms on various substrates with quantum cascade laser spectroscopy

    NASA Astrophysics Data System (ADS)

    Padilla-Jiménez, Amira C.; Ortiz-Rivera, William; Rios-Velazquez, Carlos; Vazquez-Ayala, Iris; Hernández-Rivera, Samuel P.

    2014-06-01

Investigations focusing on devising rapid and accurate methods for developing signatures for the detection, identification, and discrimination of microorganisms that could be used as biological warfare agents have recently increased significantly. Quantum cascade laser (QCL)-based spectroscopic systems have revolutionized many areas of defense and security, including this area of research. In this contribution, infrared spectroscopy detection based on QCL was used to obtain the mid-infrared (MIR) spectral signatures of Bacillus thuringiensis, Escherichia coli, and Staphylococcus epidermidis. These bacteria were used as microorganisms that faithfully simulate biothreats (biosimulants). The experiments were conducted in reflection mode with biosimulants deposited on various substrates including cardboard, glass, travel bags, wood, and stainless steel. Chemometrics multivariate statistical routines, such as principal component regression and partial least squares coupled to discriminant analysis, were used to analyze the MIR spectra. Overall, the investigated infrared vibrational techniques were useful for detecting the target microorganisms on the studied substrates, and the multivariate data analysis techniques proved to be very efficient for classifying the bacteria and discriminating them in the presence of highly IR-interfering media.

  17. Assessing the varietal origin of extra-virgin olive oil using liquid chromatography fingerprints of phenolic compound, data fusion and chemometrics.

    PubMed

    Bajoub, Aadil; Medina-Rodríguez, Santiago; Gómez-Romero, María; Ajal, El Amine; Bagur-González, María Gracia; Fernández-Gutiérrez, Alberto; Carrasco-Pancorbo, Alegría

    2017-01-15

High Performance Liquid Chromatography (HPLC) with diode array (DAD) and fluorescence (FLD) detection was used to acquire the fingerprints of the phenolic fraction of monovarietal extra-virgin olive oils (extra-VOOs) collected over three consecutive crop seasons (2011/2012-2013/2014). The chromatographic fingerprints of 140 extra-VOO samples, processed from olive fruits of seven olive varieties, were recorded and statistically treated for varietal authentication purposes. First, the DAD and FLD chromatographic-fingerprint datasets were processed separately and, subsequently, were joined using "low-level" and "mid-level" data fusion methods. After a preliminary examination by principal component analysis (PCA), three supervised pattern recognition techniques, Partial Least Squares Discriminant Analysis (PLS-DA), Soft Independent Modeling of Class Analogies (SIMCA) and k-Nearest Neighbors (k-NN), were applied to the four chromatographic-fingerprinting matrices. The classification models built were very sensitive and selective, showing good recognition and prediction abilities. The combination of chromatographic dataset and chemometric technique allowing the most accurate classification for each monovarietal extra-VOO was highlighted. Copyright © 2016 Elsevier Ltd. All rights reserved.
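
    A sketch of low-level fusion followed by PLS-DA, assuming scikit-learn; PLS-DA is emulated here as PLS regression on one-hot class labels, and the DAD/FLD matrices are placeholders:

    ```python
    # Low-level fusion: autoscale each detector block and concatenate sample-
    # wise; PLS-DA then assigns each sample the class with the highest score.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(7)
    dad = rng.normal(size=(140, 400))     # placeholder DAD fingerprints
    fld = rng.normal(size=(140, 400))     # placeholder FLD fingerprints
    y = rng.integers(0, 7, size=140)      # 7 olive varieties

    X = np.hstack([StandardScaler().fit_transform(dad),
                   StandardScaler().fit_transform(fld)])
    Y = np.eye(7)[y]                      # one-hot encoding of varieties

    pls = PLSRegression(n_components=10).fit(X, Y)
    pred = pls.predict(X).argmax(axis=1)  # class with highest predicted score
    print("training accuracy: %.2f" % (pred == y).mean())
    ```

    Mid-level fusion would instead concatenate features extracted from each block (e.g., PCA scores) before the classifier; the classification step is unchanged.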

  18. Evaluation of SLAR and simulated thematic mapper MSS data for forest cover mapping using computer-aided analysis techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.; Dean, M. E.; Knowlton, D. J.; Latty, R. S.

    1982-01-01

Kershaw County, South Carolina, was selected as the study site for analyzing simulated thematic mapper MSS data and dual-polarized X-band synthetic aperture radar (SAR) data. The impact of the improved spatial and spectral characteristics of the LANDSAT D thematic mapper data on computer-aided analysis for forest cover type mapping was examined, as well as the value of synthetic aperture radar data for differentiating forest and other cover types. The utility of pattern recognition techniques for analyzing SAR data was assessed. Topics covered include: (1) collection of TMS and reference data; (2) reformatting, geometric and radiometric rectification, and spatial resolution degradation of TMS data; (3) development of training statistics and test data sets; (4) evaluation of different numbers and combinations of wavelength bands on classification performance; (5) comparison among three classification algorithms; and (6) the effectiveness of the principal component transformation in data analysis. The collection, digitization, reformatting, and geometric adjustment of SAR data are also discussed. Image interpretation results and classification results are presented.

  19. Forest statistics for Southeast Georgia, 1996

    Treesearch

    Michael T. Thompson; Raymond M. Sheffield

    1997-01-01

    This report highlights the principal findings of the seventh forest survey of Southeast Georgia. Field work began in November 1995 and was completed in November 1996. Six previous surveys, completed in 1934, 1952, 1960, 1971, 1981, and 1988 provide statistics for measuring changes and trends over the past 62 years. This report primarily emphasizes the changes and...

  20. Visualization of time series statistical data by shape analysis (GDP ratio changes among Asia countries)

    NASA Astrophysics Data System (ADS)

    Shirota, Yukari; Hashimoto, Takako; Fitri Sari, Riri

    2018-03-01

Visualizing time series statistical big data has become very significant. In the paper we discuss a new analysis method, called "statistical shape analysis" or "geometry driven statistics", applied to time series statistical data in economics. We analyse the changes in agriculture, value added and industry, value added (as percentages of GDP) from 2000 to 2010 in Asia. We handle the data as a set of landmarks on a two-dimensional image and examine the deformation using principal components. The point of the analysis method is that the principal components of a given landmark configuration are the eigenvectors of its bending energy matrix. The local deformation can be expressed as a set of non-affine transformations, which give information about the local differences between 2000 and 2010. Because a non-affine transformation can be decomposed into a set of partial warps, we present the partial warps visually. Statistical shape analysis is widely used in biology, but no application in economics can be found; in the paper, we investigate its potential for analysing economic data.
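
    A hedged sketch of the bending energy computation under the standard thin-plate spline formulation (an assumption; the paper's exact construction may differ): the bending energy matrix is the upper-left n x n block of the inverted TPS system matrix, and its eigenvectors give the principal warps. Landmarks below are placeholders:

    ```python
    # Build the thin-plate spline system L = [[K, P], [P^T, 0]] with
    # K_ij = U(r_ij), U(r) = r^2 log r^2, and take the upper-left n x n
    # block of L^-1 as the bending energy matrix.
    import numpy as np

    rng = np.random.default_rng(8)
    pts = rng.uniform(size=(10, 2))                  # placeholder landmarks

    def bending_energy_matrix(p):
        n = len(p)
        r2 = ((p[:, None, :] - p[None, :, :]) ** 2).sum(-1)
        with np.errstate(divide="ignore", invalid="ignore"):
            K = np.where(r2 > 0, r2 * np.log(r2), 0.0)
        P = np.hstack([np.ones((n, 1)), p])
        L = np.zeros((n + 3, n + 3))
        L[:n, :n], L[:n, n:], L[n:, :n] = K, P, P.T
        return np.linalg.inv(L)[:n, :n]

    B = bending_energy_matrix(pts)
    eigvals, warps = np.linalg.eigh(B)               # principal warps
    print("three largest bending eigenvalues:", np.round(eigvals[-3:], 3))
    ```

    Three eigenvalues are (numerically) zero, corresponding to the affine part of the deformation; the remaining eigenvectors are the non-affine partial warps that the paper visualizes.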

  1. Cryptic or pseudocryptic: can morphological methods inform copepod taxonomy? An analysis of publications and a case study of the Eurytemora affinis species complex

    PubMed Central

    Lajus, Dmitry; Sukhikh, Natalia; Alekseev, Victor

    2015-01-01

Interest in cryptic species has increased significantly with current progress in genetic methods. The large number of cryptic species suggests that the resolution of traditional morphological techniques may be insufficient for taxonomic research. However, some species now considered cryptic may, in fact, be designated pseudocryptic after close morphological examination. Thus the "cryptic or pseudocryptic" dilemma speaks to the resolution of morphological analysis and its utility for identifying species. We address this dilemma first by systematically reviewing data published from 1980 to 2013 on cryptic species of Copepoda, and then by performing an in-depth morphological study of the former Eurytemora affinis complex of cryptic species. Analyzing the published data showed that, in 5 of 24 revisions eligible for systematic review, cryptic species assignment was based solely on the genetic variation of forms, without detailed morphological analysis to confirm the assignment. Therefore, some newly described cryptic species might be designated pseudocryptic under more detailed morphological analysis, as happened with the Eurytemora affinis complex. Recent genetic analyses of the complex found high levels of heterogeneity without morphological differences, so the complex was argued to be cryptic. However, subsequent detailed morphological analyses allowed a number of valid species to be described. Our study of this species complex, using in-depth statistical analyses not usually applied when describing new species, confirmed considerable differences between the former cryptic species. In particular, fluctuating asymmetry (FA), the random variation of left and right structures, was significantly different between forms and provided independent information about their status. Our work showed that multivariate statistical approaches, such as principal component analysis, can be powerful techniques for the morphological discrimination of cryptic taxa. Despite increasing cryptic species designations, morphological techniques have great potential in determining copepod taxonomy. PMID:26120427

  2. Detection of cervical lesions by multivariate analysis of diffuse reflectance spectra: a clinical study.

    PubMed

    Prabitha, Vasumathi Gopala; Suchetha, Sambasivan; Jayanthi, Jayaraj Lalitha; Baiju, Kamalasanan Vijayakumary; Rema, Prabhakaran; Anuraj, Koyippurath; Mathews, Anita; Sebastian, Paul; Subhash, Narayanan

    2016-01-01

Diffuse reflectance (DR) spectroscopy is a non-invasive, real-time, and cost-effective tool for early detection of malignant changes in squamous epithelial tissues. The present study aims to evaluate the diagnostic power of diffuse reflectance spectroscopy for non-invasive discrimination of cervical lesions in vivo. A clinical trial was carried out on 48 sites in 34 patients by recording DR spectra using a point-monitoring device with white light illumination. The acquired data were analyzed and classified using multivariate statistical analysis based on principal component analysis (PCA) and linear discriminant analysis (LDA). Diagnostic accuracies were validated using random number generators. Receiver operating characteristic (ROC) curves were plotted to evaluate the discriminating power of the proposed statistical technique. An algorithm was developed and used to discriminate non-diseased (normal) sites from diseased (abnormal) sites with a sensitivity of 72 % and a specificity of 87 %. While low-grade squamous intraepithelial lesion (LSIL) could be discriminated from normal with a sensitivity of 56 % and specificity of 80 %, and high-grade squamous intraepithelial lesion (HSIL) from normal with a sensitivity of 89 % and specificity of 97 %, LSIL could be discriminated from HSIL with 100 % sensitivity and specificity. The areas under the ROC curves were 0.993 (95 % confidence interval (CI) 0.0 to 1) and 1 (95 % CI 1) for the discrimination of HSIL from normal and HSIL from LSIL, respectively. The results of the study show that DR spectroscopy could be used along with multivariate analytical techniques as a non-invasive technique to monitor cervical disease status in real time.
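
    A minimal sketch of the PCA + LDA classification with sensitivity and specificity computed from a confusion matrix, assuming scikit-learn; spectra and labels are synthetic placeholders:

    ```python
    # PCA compresses the spectra, LDA classifies, and cross-validated
    # predictions yield sensitivity and specificity.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(9)
    X = rng.normal(size=(48, 120))        # placeholder DR spectra (48 sites)
    y = rng.integers(0, 2, size=48)       # 0 = normal, 1 = abnormal

    pipe = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
    pred = cross_val_predict(pipe, X, y, cv=4)

    tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
    print("sensitivity:", tp / (tp + fn))
    print("specificity:", tn / (tn + fp))
    ```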

  3. Robotic Rock Classification

    NASA Technical Reports Server (NTRS)

    Hebert, Martial

    1999-01-01

This report describes a three-month research program undertaken jointly by the Robotics Institute at Carnegie Mellon University and Ames Research Center as part of Ames' Joint Research Initiative (JRI). The work was conducted at the Ames Research Center by Mr. Liam Pedersen, a graduate student in the CMU Ph.D. program in Robotics, under the supervision of Dr. Ted Roush at the Space Science Division of the Ames Research Center from May 15, 1999 to August 15, 1999. Dr. Martial Hebert is Mr. Pedersen's research adviser at CMU and is the Principal Investigator of this Grant. The goal of this project is to investigate and implement methods suitable for a robotic rover to autonomously identify rocks and minerals in its vicinity, and to statistically characterize the local geological environment. Although the primary sensors for these tasks are a reflection spectrometer and color camera, the goal is to create a framework under which data from multiple sensors, and multiple readings on the same object, can be combined in a principled manner. Furthermore, it is envisioned that knowledge of the local area, either a priori or gathered by the robot, will be used to improve classification accuracy. The key results obtained during this project are: the continuation of the development of a rock classifier; the development of theoretical statistical methods; the development of methods for evaluating and selecting sensors; and experimentation with data mining techniques on the Ames spectral library. The results of this work are being applied at CMU, in particular in the context of the Winter 99 Antarctica expedition, in which the classification techniques will be used on the Nomad robot. Conversely, the software developed based on those techniques will continue to be made available to NASA Ames, and the data collected from the Nomad experiments will also be made available.

  4. Trends in Public and Private School Principal Demographics and Qualifications: 1987-88 to 2011-12. Stats in Brief. NCES 2016-189

    ERIC Educational Resources Information Center

    Hill, Jason; Ottem, Randolph; DeRoche, John

    2016-01-01

    Using data from seven administrations of the Schools and Staffing Survey (SASS), this Statistics in Brief examines trends in public and private school principal demographics, experience, and compensation over 25 years, from 1987-88 through 2011-12. Data are drawn from the 1987-88, 1990-91, 1993-94, 1999-2000, 2003-04, 2007-08, and 2011-12 survey…

  5. Spatio-temporal variability of hydro-chemical characteristics of coastal waters of Gulf of Mannar Marine Biosphere Reserve (GoMMBR), South India

    NASA Astrophysics Data System (ADS)

    Kathiravan, K.; Natesan, Usha; Vishnunath, R.

    2017-03-01

The intention of this study was to appraise the spatial and temporal variations in the physico-chemical parameters of the coastal waters of Rameswaram Island, Gulf of Mannar Marine Biosphere Reserve, south India, using multivariate statistical techniques such as cluster analysis, factor analysis and principal component analysis. Spatio-temporal variations among the physico-chemical parameters are observed in the coastal waters of the Gulf of Mannar, especially during the northeast monsoon and post-monsoon seasons. It is inferred that the high loadings of pH, temperature, suspended particulate matter, salinity, dissolved oxygen, biochemical oxygen demand, chlorophyll a, and nutrient species of nitrogen and phosphorus strongly determine the discrimination of coastal water quality. The results highlight the important role of monsoonal variations in determining the coastal water quality around Rameswaram Island.

  6. Assessing the determinants of evolutionary rates in the presence of noise.

    PubMed

    Plotkin, Joshua B; Fraser, Hunter B

    2007-05-01

    Although protein sequences are known to evolve at vastly different rates, little is known about what determines their rate of evolution. However, a recent study using principal component regression (PCR) has concluded that evolutionary rates in yeast are primarily governed by a single determinant related to translation frequency. Here, we demonstrate that noise in biological data can confound PCRs, leading to spurious conclusions. When equalizing noise levels across 7 predictor variables used in previous studies, we find no evidence that protein evolution is dominated by a single determinant. Our results indicate that a variety of factors--including expression level, gene dispensability, and protein-protein interactions--may independently affect evolutionary rates in yeast. More accurate measurements or more sophisticated statistical techniques will be required to determine which one, if any, of these factors dominates protein evolution.

  7. The hoard of Beçin—non-destructive analysis of the silver coins

    NASA Astrophysics Data System (ADS)

    Rodrigues, M.; Schreiner, M.; Mäder, M.; Melcher, M.; Guerra, M.; Salomon, J.; Radtke, M.; Alram, M.; Schindel, N.

    2010-05-01

We report the results of an analytical investigation of 416 silver-copper coins from the Ottoman Empire (end of the 16th and beginning of the 17th centuries), using synchrotron micro X-ray fluorescence analysis (SRXRF). In the past, analyses had already been conducted with energy dispersive X-ray fluorescence analysis (EDXRF), scanning electron microscopy with energy dispersive X-ray spectrometry (SEM/EDX) and proton induced X-ray emission spectroscopy (PIXE). With this combination of techniques it was possible to confirm the fineness of the coinage as well as to study the provenance of the alloy used for the coins. For the interpretation of the data, statistical analysis (principal component analysis, PCA) was performed. A definite local assignment was explored, and significant clustering was obtained with respect to the minor and trace elements composing the coin alloys.

  8. Resonance Raman spectroscopy for human cancer detection of key molecules with clinical diagnosis

    NASA Astrophysics Data System (ADS)

    Zhou, Yan; Liu, Cheng-hui; Li, Jiyou; Zhou, Lixin; He, Jingsheng; Sun, Yi; Pu, Yang; Zhu, Ke; Liu, Yulong; Li, Qingbo; Cheng, Gangge; Alfano, Robert R.

    2013-03-01

Resonance Raman (RR) spectroscopy has the potential to reveal the differences between cancerous and normal breast and brain tissues in vitro. These differences, caused by changes of specific biomolecules in the tissues, were displayed as resonance-enhanced vibrational fingerprints. It was observed that reduced collagen content and changes in the number of methyl groups may indicate sub-methylation of DNA in cancer cells. Statistical theoretical models based on Bayesian analysis, principal component analysis (PCA) and support vector machines (SVM) were used for distinguishing cancerous from normal tissue based on the RR spectral data of breast and meninges tissues, yielding diagnostic sensitivities of 80% and 90.9% and specificities of 100% and 100%, respectively. The results demonstrated that the RR spectroscopic technique could be applied as a clinical optical pathology tool with high accuracy and reliability.

  9. A comparison of linear approaches to filter out environmental effects in structural health monitoring

    NASA Astrophysics Data System (ADS)

    Deraemaeker, A.; Worden, K.

    2018-05-01

This paper discusses the possibility of using the Mahalanobis squared-distance to perform robust novelty detection in the presence of important environmental variability in a multivariate feature vector. By performing an eigenvalue decomposition of the covariance matrix used to compute that distance, it is shown that the Mahalanobis squared-distance can be written as the sum of independent terms which result from a transformation from the feature vector space to a space of independent variables. In general, especially when the size of the feature vector is large, there are dominant eigenvalues and eigenvectors associated with the covariance matrix, so that a set of principal components can be defined. Because the associated eigenvalues are high, their contribution to the Mahalanobis squared-distance is low, while the contribution of the other components is high due to the low value of the associated eigenvalues. This analysis shows that the Mahalanobis distance naturally filters out the variability in the training data. This property can be used to remove the effect of the environment in damage detection, in much the same way as two other established techniques, principal component analysis and factor analysis. The three techniques are compared here using real experimental data from a wooden bridge, for which the feature vector consists of eigenfrequencies and mode shapes collected under changing environmental conditions, as well as damaged conditions simulated with an added mass. The results confirm the similarity between the three techniques and their ability to filter out environmental effects, while keeping a high sensitivity to structural changes. The results also show that even after filtering out the environmental effects, the normality assumption cannot be made for the residual feature vector. An alternative based on extreme value statistics is demonstrated here, which results in a much better threshold that avoids false positives in the training data, while allowing detection of all damaged cases.
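
    A short numerical check of the decomposition described above, assuming NumPy; it verifies that the Mahalanobis squared-distance equals a sum of per-eigenvector terms weighted by the inverse eigenvalues, so high-variance (environmental) directions contribute little:

    ```python
    # Direct Mahalanobis squared-distance versus the eigendecomposition form
    # sum_i (v_i^T (x - mu))^2 / lambda_i; data are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(10)
    train = rng.normal(size=(500, 6)) * np.array([5, 3, 1, 1, 0.5, 0.2])
    mu = train.mean(axis=0)
    cov = np.cov(train, rowvar=False)

    eigvals, eigvecs = np.linalg.eigh(cov)
    x = rng.normal(size=6)                         # a new feature vector

    d2_direct = (x - mu) @ np.linalg.inv(cov) @ (x - mu)
    proj = eigvecs.T @ (x - mu)                    # projections on eigenvectors
    d2_eig = np.sum(proj ** 2 / eigvals)
    print(d2_direct, d2_eig)                       # identical up to rounding
    ```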

  10. Detection of micro solder balls using active thermography and probabilistic neural network

    NASA Astrophysics Data System (ADS)

    He, Zhenzhi; Wei, Li; Shao, Minghui; Lu, Xingning

    2017-03-01

Micro solder balls/bumps have been widely used in electronic packaging. It has been challenging to inspect these structures, as the solder balls/bumps are often embedded between the components and substrates, especially in flip-chip packaging. In this paper, a detection method for micro solder balls/bumps based on active thermography and a probabilistic neural network is investigated. A VH680 infrared imager is used to capture the thermal image of the test vehicle, SFA10 packages. The temperature curves are processed using a moving average technique to remove peak noise, and principal component analysis (PCA) is adopted to reconstruct the thermal images. The missed solder balls can be recognized explicitly in the second principal component image. A probabilistic neural network (PNN) is then established to identify the defective bumps intelligently. The hot spots corresponding to the solder balls are segmented from the PCA-reconstructed image, and statistical parameters are calculated. To characterize the thermal properties of the solder bumps quantitatively, three representative features are selected and used as the input vector in PNN clustering. The results show that the actual outputs and the expected outputs are consistent in the identification of the missed solder balls, and all the bumps were recognized accurately, which demonstrates the viability of the PNN for effective defect inspection in high-density microelectronic packaging.
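
    A minimal probabilistic neural network (Parzen-window) classifier sketch, assuming NumPy; the three bump features, class sizes and smoothing parameter are illustrative, not the paper's values:

    ```python
    # Each class density is a mean of Gaussian kernels over its training
    # feature vectors; a test vector gets the class of highest density.
    import numpy as np

    rng = np.random.default_rng(11)
    X_train = rng.normal(size=(40, 3))                    # placeholder bump features
    y_train = np.r_[np.zeros(20, int), np.ones(20, int)]  # 0 good, 1 defective
    X_train[y_train == 1] += 2.0                          # shift defective class

    def pnn_predict(x, X, y, sigma=0.5):
        scores = []
        for c in np.unique(y):
            d2 = ((X[y == c] - x) ** 2).sum(axis=1)
            scores.append(np.mean(np.exp(-d2 / (2 * sigma ** 2))))
        return int(np.argmax(scores))

    x_test = np.array([2.1, 1.9, 2.2])
    print("predicted class:", pnn_predict(x_test, X_train, y_train))
    ```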

  11. Phospholipid Fatty Acid Analysis: Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Findlay, R. H.

    2008-12-01

With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME), and the separation, identification and quantification of the FAME by gas chromatography. Early laboratory and field studies focused on correlating individual fatty acids to particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate, and the use of solid phase extraction technology. Improvements in the field of gas chromatography also increased the accessibility of the technique, and it has been widely applied to water, sediment, soil and aerosol samples. Whole cell fatty acid analysis, a related but distinct technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990s, the application of multivariate statistical analysis, first cluster analysis and then principal component analysis, further improved the usefulness of the technique and allowed the development of a functional group approach to the interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neural networks. Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinants of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM, and the technique promises to be a useful tool for assigning ecological function to microbial populations.

  12. [Discrimination of types of polyacrylamide based on near infrared spectroscopy coupled with least square support vector machine].

    PubMed

    Zhang, Hong-Guang; Yang, Qin-Min; Lu, Jian-Gang

    2014-04-01

In this paper, a novel discrimination methodology based on near infrared spectroscopic analysis and the least square support vector machine is proposed for rapid and nondestructive discrimination of different types of Polyacrylamide. The diffuse reflectance spectra of samples of Non-ionic Polyacrylamide, Anionic Polyacrylamide and Cationic Polyacrylamide were measured. Principal component analysis was then applied to reduce the dimension of the spectral data and extract the principal components. The first three principal components were used for cluster analysis of the three different types of Polyacrylamide, and were also used as inputs of the least square support vector machine model. The optimization of the model parameters and of the number of principal components used as inputs was performed through cross validation based on grid search. 60 samples of each type of Polyacrylamide were collected, for a total of 180 samples. 135 samples, 45 for each type, were randomly split into a training set to build the calibration model, and the remaining 45 samples were used as a test set to evaluate the performance of the developed model. In addition, 5 Cationic Polyacrylamide samples and 5 Anionic Polyacrylamide samples adulterated with different proportions of Non-ionic Polyacrylamide were prepared to show the feasibility of the proposed method for discriminating adulterated Polyacrylamide samples. The prediction error threshold for each type of Polyacrylamide was determined by an F statistical significance test based on the cross-validation prediction error of the training set of the corresponding type. The discrimination accuracy of the built model was 100% on the test set. All 10 adulterated samples were also accurately discriminated as adulterated. The overall results demonstrate that the proposed method can rapidly and nondestructively discriminate the different types of Polyacrylamide as well as adulterated samples, and offers a new approach to discriminating the types of Polyacrylamide.

  13. Non-Markovian near-infrared Q branch of HCl diluted in liquid Ar.

    PubMed

    Padilla, Antonio; Pérez, Justo

    2013-08-28

By using a non-Markovian spectral theory based on the Kubo cumulant expansion technique, we have qualitatively studied the infrared Q branch observed in the fundamental absorption band of HCl diluted in liquid Ar. The statistical parameters of the anisotropic interaction present in this spectral theory were calculated by means of molecular dynamics techniques, and it was found that the values of the anisotropic correlation times are significantly greater (by a factor of two) than those previously obtained by fitting procedures or microscopic cell models. This fact is decisive for the observation in the theoretical spectral band of a central Q resonance, which is absent in the many previous studies carried out with the usual theories based on Kubo cumulant expansion techniques. Although the theory used in this work only allows a qualitative study of the Q branch, we can employ it to study the unknown characteristics of the Q resonance, which are difficult to obtain with the recently developed quantum simulation techniques. For example, in this study we have found that the Q branch is basically a non-Markovian (or memory) effect produced by spectral line interferences, where the PR interferential profile basically determines the Q branch spectral shape. Furthermore, we have found that the Q resonance is principally generated by the first rotational states of the first two vibrational levels, those most affected by the action of the solvent.

  14. Consistent Principal Component Modes from Molecular Dynamics Simulations of Proteins.

    PubMed

    Cossio-Pérez, Rodrigo; Palma, Juliana; Pierdominici-Sottile, Gustavo

    2017-04-24

    Principal component analysis is a technique widely used for studying the movements of proteins using data collected from molecular dynamics simulations. In spite of its extensive use, the technique has a serious drawback: equivalent simulations do not afford the same PC-modes. In this article, we show that concatenating equivalent trajectories and calculating the PC-modes from the concatenated one significantly enhances the reproducibility of the results. Moreover, the consistency of the modes can be systematically improved by adding more individual trajectories to the concatenated one.
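
    An illustrative numerical experiment of the concatenation effect described above, assuming NumPy; random draws with nearly degenerate leading modes stand in for aligned MD trajectories:

    ```python
    # With nearly equal top eigenvalues, the first PC mode of a single short
    # trajectory is poorly reproducible; concatenating trajectories before
    # PCA makes the first modes of independent estimates agree more closely.
    import numpy as np

    rng = np.random.default_rng(12)
    cov = np.diag([4.0, 3.8, 1.0, 0.5])        # two nearly equal leading modes

    def sample_traj(n=200):
        return rng.multivariate_normal(np.zeros(4), cov, size=n)

    def first_mode(traj):
        anom = traj - traj.mean(axis=0)
        _, _, vt = np.linalg.svd(anom, full_matrices=False)
        return vt[0]

    t1, t2, t3, t4 = (sample_traj() for _ in range(4))
    print("overlap, single runs :",
          abs(first_mode(t1) @ first_mode(t2)))
    print("overlap, concatenated:",
          abs(first_mode(np.vstack([t1, t2])) @ first_mode(np.vstack([t3, t4]))))
    ```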

  15. Applied Remote Sensing Program (ARSP)

    NASA Technical Reports Server (NTRS)

    Johnson, J. D.; Foster, K. E.; Mouat, D. A.; Miller, D. A.; Conn, J. S.

    1976-01-01

The activities and accomplishments of the Applied Remote Sensing Program during FY 1975-1976 are reported. The principal objective of the Applied Remote Sensing Program continues to be the design of projects that have specific decision-making impacts as a principal goal. These projects are carried out in cooperation and collaboration with local, state and federal agencies whose responsibilities lie with planning, zoning and environmental monitoring and/or assessment, through the application of remote sensing techniques. The end result of the projects is the use by the involved agencies of remote sensing techniques in problem solving.

  16. Analysis of the principal component algorithm in phase-shifting interferometry.

    PubMed

    Vargas, J; Quiroga, J Antonio; Belenguer, T

    2011-06-15

We recently presented a new asynchronous demodulation method for phase-sampling interferometry. The method is based on the principal component analysis (PCA) technique. In that work, the PCA method was derived heuristically. In this work, we present an in-depth analysis of the PCA demodulation method.
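
    A hedged sketch of PCA demodulation as commonly formulated (an assumption; the details of the authors' derivation are in the paper): the first two principal components of the mean-subtracted interferogram sequence approximate the quadrature signals, and the wrapped phase follows from arctan2, up to sign and a constant offset. The fringe pattern below is synthetic:

    ```python
    # Stack phase-shifted interferograms, remove the temporal mean (the
    # background term), and take the first two spatial principal components
    # as ~ b*cos(phi) and ~ b*sin(phi).
    import numpy as np

    y, x = np.mgrid[0:64, 0:64]
    phi = 0.05 * (x - 32) ** 2 / 32 + 0.2 * y      # synthetic test phase
    shifts = np.array([0.0, 1.1, 2.3, 3.1, 4.4])   # unknown, uneven phase steps

    frames = np.stack([1 + 0.8 * np.cos(phi + d) for d in shifts])
    data = frames.reshape(len(shifts), -1)
    data = data - data.mean(axis=0)                # remove the background term

    u, s, vt = np.linalg.svd(data, full_matrices=False)
    pc1 = vt[0].reshape(phi.shape)                 # ~ b cos(phi)
    pc2 = vt[1].reshape(phi.shape)                 # ~ b sin(phi)
    phase = np.arctan2(pc2, pc1)                   # wrapped phase estimate
    print("phase map range:", phase.min(), phase.max())
    ```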

  17. Building Leadership Skills with Evaluation Techniques

    ERIC Educational Resources Information Center

    Hutton, Dawn

    2009-01-01

    The career of educational administrator does not lend itself well to allowing new principals to practice their skills. Because of the element of responsibility for administrators, most principals are reluctant to relinquish power to interns in fear of their failure. As a result, educational leadership programs struggle to find appropriate methods…

  18. Psychometric Measurement Models and Artificial Neural Networks

    ERIC Educational Resources Information Center

    Sese, Albert; Palmer, Alfonso L.; Montano, Juan J.

    2004-01-01

The study of measurement models in psychometrics by means of dimensionality reduction techniques such as Principal Components Analysis (PCA) is a very common practice. In recent times, an upsurge of interest in the study of artificial neural networks capable of computing a principal component extraction has been observed. Despite this interest, the…

  19. "Circumstance and Proper Timing": Context and the Construction of a Standards Framework for School Principals' Performance.

    ERIC Educational Resources Information Center

    Louden, William; Wildy, Helen

    1999-01-01

    Professional standards for school principals typically describe an ideal performance in a generalized context. This article describes an alternative method of developing a standards framework, combining qualitative vignettes with probabilistic measurement techniques to provide essential or ideal performance qualities with contextually rich…

  20. Analysis of river pollution data from low-flow period by means of multivariate techniques: a case study from the oil-shale industry region, northeastern Estonia.

    PubMed

    Truu, Jaak; Heinaru, Eeva; Talpsep, Ene; Heinaru, Ain

    2002-01-01

    The oil-shale industry has created serious pollution problems in northeastern Estonia. Untreated, phenol-rich leachate from semi-coke mounds formed as a by-product of oil-shale processing is discharged into the Baltic Sea via channels and rivers. An exploratory analysis of water chemical and microbiological data sets from the low-flow period was carried out using different multivariate analysis techniques. Principal component analysis allowed us to distinguish different locations in the river system. The riverine microbial community response to water chemical parameters was assessed by co-inertia analysis. Water pH, COD and total nitrogen were negatively related to the number of biodegradative bacteria, while oxygen concentration promoted the abundance of these bacteria. The results demonstrate the utility of multivariate statistical techniques as tools for estimating the magnitude and extent of pollution based on river water chemical and microbiological parameters. An evaluation of river chemical and microbiological data suggests that the ambient natural attenuation mechanisms only partly eliminate pollutants from river water, and that a sufficient reduction of more recalcitrant compounds could be achieved through the reduction of wastewater discharge from the oil-shale chemical industry into the rivers.

  1. Modern approaches to the treatment of human infertility through assisted reproduction.

    PubMed

    Fernández Pelegrina, R; Kessler, A G; Rawlins, R G

    1991-08-01

Medical statistics from the United States show that approximately 15 percent of all couples of reproductive age are unable to conceive naturally. In recent years, the number of couples with reproductive problems has increased, principally due to changes in life style and delayed childbearing. Only 13 years after the birth of the first "test tube baby", advances in the field of human reproduction have created a wide range of alternatives to help infertile couples conceive a healthy infant. Together, these techniques are called Assisted Reproductive Technology (ART) and include: in vitro fertilization (IVF), intratubal transfer of gametes (GIFT), intratubal transfer of zygotes (ZIFT), tubal transfer of preimplantation embryos (TET), gamete or embryo donation, cryopreservation, and micromanipulation. The application of these techniques is presented here. While much remains to be learned, the ability to fertilize ova in vitro and sustain early embryonic life outside the body is now a reality. Contrary to the idea that these techniques create life in vitro, they simply remove barriers caused by different forms of infertility which impede the creation of life. More than 30,000 infants have now been produced world-wide through ART. In the future, new developments in the field of assisted reproduction promise to bring new hope to the growing numbers of infertile couples around the world.

  2. Forest statistics for Central Georgia, 1982

    Treesearch

    Raymond M. Sheffield; John B. Tansey

    1982-01-01

    This report highlights the principal findings of the fifth forest survey of Central Georgia. Fieldwork began in October 1981 and was completed in June 1982. Four previous surveys, completed in 1936, 1952, 1961, and 1972, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes and trends since...

  3. Forest statistics for North Central Georgia, 1998

    Treesearch

    Michael T. Thompson

    1998-01-01

This report highlights the principal findings of the seventh forest survey of North Central Georgia. Field work began in June 1997 and was completed in November 1997. Six previous surveys, completed in 1936, 1953, 1961, 1972, 1983, and 1989, provide statistics for measuring changes and trends over the past 61 years. This report primarily emphasizes the changes and...

  4. Forest statistics for South Florida, 1995

    Treesearch

    Michael T. Thompson

    1996-01-01

    This report highlights the principal findings of the seventh forest survey of South Florida. Field work began in September 1994 and was completed in November 1994. Six previous surveys, completed in 1936, 1949, 1959, 1970, 1980, and 1988 provide statistics for measuring changes and trends over the past 59 years. This report primarily emphasizes the changes and trends...

  5. Forest statistics for Central Georgia, 1997

    Treesearch

    Michael T. Thompson

    1998-01-01

    This report highlights the principal findings of the seventh forest survey of Central Georgia. Field work began in November 1996 and was completed in August 1997. Six previous surveys, completed in 1936, 1952, 1961, 1972, 1982, and 1989 provide statistics for measuring changes and trends over the past 61 years. This report primarily emphasizes the changes and trends...

  6. Forest statistics for Central Florida - 1995

    Treesearch

    Mark J. Brown

    1996-01-01

This report highlights the principal findings of the seventh forest survey of Central Florida. Field work began in February 1995 and was completed in May 1995. Six previous surveys, completed in 1936, 1949, 1959, 1970, 1980, and 1988, provide statistics for measuring changes and trends over the past 59 years. This report primarily emphasizes the changes and trends since...

  7. Forest statistics for Southwest Georgia, 1996

    Treesearch

    Raymond M. Sheffield; Michael T. Thompson

    1997-01-01

    This report highlights the principal findings of the seventh forest survey of Southwest Georgia. Field work began in June 1995 and was completed in November 1995. Six previous surveys, completed in 1934, 1951, 1960, 1971, 1981, and 1988 provide statistics for measuring changes and trends over the past 62 years. This report primarily emphasizes the changes and trends...

  8. Forest statistics for Northeast Florida, 1980

    Treesearch

    Raymond M. Sheffield

    1981-01-01

    This report highlights the principal findings of the fifth forest survey of Northeast Florida. Fieldwork began in June 1979 and was completed in December 1979. Four previous surveys, completed in 1934, 1949, 1959, and 1970, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes and trends since...

  9. Indicators of School Crime and Safety: 2017. NCES 2018-036/NCJ 251413

    ERIC Educational Resources Information Center

    Zhang, Anlan; Wang, Ke; Zhang, Jizhi; Kemp, Jana; Diliberti, Melissa; Oudekerk, Barbara A.

    2018-01-01

    A joint effort by the National Center for Education Statistics and the Bureau of Justice Statistics, this annual report examines crime occurring in schools and colleges. This report presents data on crime at school from the perspectives of students, teachers, principals, and the general population from an array of sources--the National Crime…

  10. Forest statistics for Virginia, 1992

    Treesearch

    Tony G. Johnson

    1992-01-01

    This report highlights the principal findings of the sixth forest survey of Virginia. Field work began in October 1990 and was completed in January 1992. Five previous surveys, completed in 1940, 1957, 1966, 1977, and 1986, provide statistics for measuring changes and trends over the past 52 years. The primary emphasis in this report is on the changes and trends since...

  11. Forest statistics for the Northern Piedmont of Virginia 1976

    Treesearch

    Raymond M. Sheffield

    1976-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Northern Piedmont of Virginia. The inventory was started in March 1976 and completed in August 1976. Three previous inventories, completed in 1940, 1957, and 1965, provide statistics for measuring changes and trends over the past 36 years. In this report, the primary...

  12. Forest statistics for the Coastal Plain of Virginia, 1976

    Treesearch

    Noel D. Cost

    1976-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Coastal Plain of Virginia. The inventory was started in February 1975 and completed in November 1975. Three previous inventories, completed in 1940, 1956, and 1966, provide statistics for measuring changes and trends over the past 36 years. In this report, the primary...

  13. Forest statistics for Northeast Florida, 1987

    Treesearch

    Mark J. Brown

    1987-01-01

    This report highlights the principal findings of the sixth forest survey of Northeast Florida. Field work began in January 1987 and was completed in July 1987. Five previous surveys, completed in 1934, 1949, 1959, 1970, and 1980, provide statistics for measuring changes and trends over the past 53 years. The primary emphasis in this report is on the changes and trends...

  14. Forest statistics for Northwest Florida, 1979

    Treesearch

    Raymond M. Sheffield

    1979-01-01

    This report highlights the principal findings of the fifth forest survey of Northwest Florida. Fieldwork began in September 1978 and was completed in June 1979. Four previous surveys, completed in 1934, 1949, 1959, and 1969, provide statistics for measuring changes and trends over the past 45 years. The primary emphasis in this report is on the changes and trends since...

  15. Forest statistics for Northeast Florida, 1995

    Treesearch

    Raymond M. Sheffield

    1995-01-01

    This report highlights the principal findings of the seventh forest survey of Northeast Florida. Field work began in April 1994 and was completed in May 1995. Six previous surveys, completed in 1934, 1949, 1959, 1970, 1980, and 1987, provide statistics for measuring changes and trends over the past 61 years. The primary emphasis in this report is on the changes and...

  16. Forest statistics for North Georgia, 1983

    Treesearch

    John B. Tansey

    1983-01-01

    This report highlights the principal findings of the fifth forest survey of North Georgia. Fieldwork began in September 1982 and was completed in January 1983. Four previous surveys, completed in 1936, 1953, 1961, and 1972, provide statistics for measuring changes and trends over the past 47 years. The primary emphasis in this report is on the changes and trends since...

  17. Forest statistics for North Carolina, 1984

    Treesearch

    William A. Bechtold

    1984-01-01

    This report highlights the principal findings of the fifth forest survey of North Carolina. Fieldwork began in November 1982 and was completed in September 1984. Four previous surveys, completed in 1938, 1956, 1964, and 1974, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes and trends...

  18. Forest statistics for North Central Georgia, 1983

    Treesearch

    John B. Tansey

    1983-01-01

    This report highlights the principal findings of the fifth forest survey of North Central Georgia. Fieldwork began in May 1982 and was completed in September 1982. Four previous surveys, completed in 1936, 1953, 1961, and 1972, provide statistics for measuring changes and trends over the past 47 years. The primary emphasis in this report is on the changes and trends...

  19. Forest statistics for South Carolina, 1978

    Treesearch

    Raymond M. Sheffield

    1978-01-01

    This report highlights the principal findings of the fifth inventory of South Carolina's forests. Fieldwork began in April 1977 and was completed in August 1978. Four previous statewide inventories, completed in 1936, 1947, 1958, and 1968, provide statistics for measuring changes and trends over the past 42 years. The primary emphasis in this report is on the...

  20. Forest statistics for Southeast Georgia, 1981

    Treesearch

    Raymond M. Sheffield

    1982-01-01

    This report highlights the principal findings of the fifth forest survey of Southeast Georgia. Fieldwork began in November 1980 and was completed in October 1981. Four previous surveys, completed in 1934, 1952, 1960, and 1971, provide statistics for measuring changes and trends over the past 47 years. The primary emphasis in this report is on the changes and trends...

  1. Forest statistics for the Southern Piedmont of Virginia 1976

    Treesearch

    Raymond M. Sheffield

    1976-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Southern Piedmont of Virginia. The inventory was started in February 1975 and completed in November 1975. Three previous inventories, completed in 1940, 1956, and 1966, provide statistics for measuring changes and trends over the past 36 years. In this report, the...

  2. Data Sharing and the Development of the Cleveland Clinic Statistical Education Dataset Repository

    ERIC Educational Resources Information Center

    Nowacki, Amy S.

    2013-01-01

    Examples are highly sought by both students and teachers. This is particularly true as many statistical instructors aim to engage their students and increase active participation. While simulated datasets are functional, they lack real perspective and the intricacies of actual data. In order to obtain real datasets, the principal investigator of a…

  3. Forest statistics for Virginia, 1986

    Treesearch

    Mark J. Brown

    1986-01-01

    This report highlights the principal findings of the fifth forest survey of Virginia. Fieldwork began in September 1984 and was completed in November 1985. Four previous surveys, completed in 1940, 1957, 1966, and 1977, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes and trends since 1977...

  4. Forest statistics for Southwest Georgia, 1981

    Treesearch

    Raymond M. Sheffield

    1981-01-01

    This report highlights the principal findings of the fifth forest survey of Southwest Georgia. Fieldwork began in May 1980 and was completed in November 1980. Four previous surveys, completed in 1934, 1951, 1960, and 1971, provide statistics for measuring changes and trends over the past 47 years. The primary emphasis in this report is on the changes and trends since 1971...

  5. Forest statistics for the Northern Mountain region of Virginia 1977

    Treesearch

    Raymond M. Sheffield

    1977-01-01

    This report highlights the principal findings of the fourth inventory of timber resources in the Northern Mountain Region of Virginia. The inventory was started in August 1976 and completed in December 1976. Three previous inventories, completed in 1940, 1957 and 1966, provide statistics for measuring changes and trends over the past 37 years. In this report, the...

  6. Forest statistics for Central Florida - 1980

    Treesearch

    Raymond M. Sheffield

    1981-01-01

    This report highlights the principal findings of the fifth forest survey of Central Florida. Fieldwork began in December 1979 and was completed in March 1980. Four previous surveys, completed in 1936, 1949, 1959, and 1970, provide statistics for measuring changes and trends over the past 44 years. The primary emphasis in this report is on the changes and trends since...

  7. Forest statistics for South Carolina, 1986

    Treesearch

    John B. Tansey

    1986-01-01

    This report highlights the principal findings of the sixth forest survey in South Carolina. Fieldwork began in November 1985 and was completed in September 1986. Five previous surveys, completed in 1936, 1947, 1958, 1968, and 1978, provide statistics for measuring changes and trends over the past 50 years. The primary emphasis in this report is on the changes and...

  8. Forest statistics for Southwest Georgia, 1988

    Treesearch

    Michael T. Thompson

    1988-01-01

    This report highlights the principal findings of the sixth forest survey in southwest Georgia. Field work began in October 1987 and was completed in January 1988. Five previous surveys, completed in 1934, 1951, 1960, 1971, and 1981, provide statistics for measuring changes and trends over the past 54 years. The primary emphasis in this report is on the changes and...

  9. Forest statistics for Central Florida - 1988

    Treesearch

    Mark J. Brown

    1988-01-01

    This report highlights the principal findings of the sixth forest survey of Central Florida. Field work began in July 1987 and was completed in September 1987. Five previous surveys, completed in 1936, 1949, 1959, 1970, and 1980, provide statistics for measuring changes and trends over the past 52 years. The primary emphasis in this report is on the changes and trends...

  10. Intermittency of principal stress directions within Arctic sea ice.

    PubMed

    Weiss, Jérôme

    2008-05-01

    The brittle deformation of Arctic sea ice is not only characterized by strong spatial heterogeneity as well as intermittency of stress and strain-rate amplitudes, but also by an intermittency of principal stress directions, with power law statistics of angular fluctuations, long-range correlations in time, and multifractal scaling. This intermittency is much more pronounced than that of wind directions, i.e., is not a direct inheritance of the turbulent forcing.

  11. BAYESIAN SEMI-BLIND COMPONENT SEPARATION FOR FOREGROUND REMOVAL IN INTERFEROMETRIC 21 cm OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Le; Timbie, Peter T.; Bunn, Emory F.

    In this paper, we present a new Bayesian semi-blind approach for foreground removal in observations of the 21 cm signal measured by interferometers. The technique, which we call HI Expectation-Maximization Independent Component Analysis (HIEMICA), is an extension of the Independent Component Analysis technique developed for two-dimensional (2D) cosmic microwave background maps to three-dimensional (3D) 21 cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21 cm signal, as well as the 2D angular power spectrum and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21 cm intensity mapping observations under idealized assumptions of instrumental effects. We also discuss the impact when the noise properties are not known completely. As a first step toward solving the 21 cm power spectrum analysis problem, we compare the semi-blind HIEMICA technique to the commonly used Principal Component Analysis. Under the same idealized circumstances, the proposed technique provides significantly improved recovery of the power spectrum. This technique can be applied in a straightforward manner to all 21 cm interferometric observations, including epoch of reionization measurements, and can be extended to single-dish observations as well.
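
    To make the component-separation idea concrete, the following minimal Python sketch applies scikit-learn's FastICA to hypothetical mock multi-frequency data; it is not the authors' HIEMICA code, and the array shapes, spectra, and component counts are illustrative assumptions.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      n_freq, n_pix = 64, 2048                    # hypothetical channels x sky pixels
      freqs = np.linspace(100.0, 200.0, n_freq)   # illustrative frequency axis
      # mock data: a spectrally smooth foreground plus a weak uncorrelated signal
      foreground = np.outer((freqs / 150.0) ** -2.5, rng.normal(size=n_pix))
      signal = 0.01 * rng.normal(size=(n_freq, n_pix))
      data = foreground + signal

      ica = FastICA(n_components=4, random_state=0)
      sources = ica.fit_transform(data.T)         # per-pixel source amplitudes
      mixing = ica.mixing_                        # per-component frequency spectra

      # flag foreground components by the smoothness of their spectra,
      # then subtract their contribution to estimate the residual signal
      roughness = np.sum(np.diff(mixing, axis=0) ** 2, axis=0)
      fg = roughness.argsort()[:2]                # the two smoothest components
      fg_model = sources[:, fg] @ mixing[:, fg].T + ica.mean_
      residual = data.T - fg_model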

  12. Cognitive load, emotion, and performance in high-fidelity simulation among beginning nursing students: a pilot study.

    PubMed

    Schlairet, Maura C; Schlairet, Timothy James; Sauls, Denise H; Bellflowers, Lois

    2015-03-01

    Establishing the impact of the high-fidelity simulation environment on student performance, as well as identifying factors that could predict learning, would refine simulation outcome expectations among educators. The purpose of this quasi-experimental pilot study was to explore the impact of simulation on emotion and cognitive load among beginning nursing students. Forty baccalaureate nursing students participated in teaching simulations, rated their emotional state and cognitive load, and completed evaluation simulations. Two principal components of emotion were identified, representing the pleasant activation and pleasant deactivation components of affect. Mean rating of cognitive load following simulation was high. Linear regression identified slight but statistically nonsignificant positive associations between principal components of emotion and cognitive load. Logistic regression identified a negative but statistically nonsignificant effect of cognitive load on assessment performance. Among lower ability students, a more pronounced effect of cognitive load on assessment performance was observed; this also was statistically nonsignificant. Copyright 2015, SLACK Incorporated.

  13. Non-rigid image registration using a statistical spline deformation model.

    PubMed

    Loeckx, Dirk; Maes, Frederik; Vandermeulen, Dirk; Suetens, Paul

    2003-07-01

    We propose a statistical spline deformation model (SSDM) as a method to solve non-rigid image registration. Within this model, the deformation is expressed using a statistically trained B-spline deformation mesh. The model is trained by principal component analysis of a training set. This approach makes it possible to reduce the number of degrees of freedom needed for non-rigid registration by retaining only the most significant modes of variation observed in the training set. User-defined transformation components, like affine modes, are merged with the principal components into a unified framework. Optimization proceeds along the transformation components rather than along the individual spline coefficients. The concept of SSDMs is applied to the temporal registration of thorax CR-images using pattern intensity as the registration measure. Our results show that, using 30 training pairs, a reduction of 33% is possible in the number of degrees of freedom without deterioration of the result. The same accuracy as without SSDMs is still achieved after a reduction of up to 66% of the degrees of freedom.
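
    A minimal Python sketch of the training step, assuming hypothetical B-spline meshes flattened to coefficient vectors (the mesh size and variance cut-off are illustrative, not the paper's values):

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(1)
      # hypothetical training set: 30 registered image pairs, each described
      # by a 16x16 B-spline control mesh with 2D displacements, flattened
      n_pairs, n_coeffs = 30, 16 * 16 * 2
      meshes = rng.normal(size=(n_pairs, n_coeffs))

      pca = PCA(n_components=0.95)   # retain only the most significant modes
      pca.fit(meshes)
      print(pca.n_components_, "modes instead of", n_coeffs, "spline coefficients")

      # a new registration would then be optimised over the mode weights w,
      # with deformation = pca.mean_ + w @ pca.components_ (plus affine modes)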

  14. The Seasonality and Ecology of the Anopheles gambiae complex (Diptera: Culicidae) in Liberia Using Molecular Identification.

    PubMed

    Fahmy, N T; Villinski, J T; Bolay, F; Stoops, C A; Tageldin, R A; Fakoli, L; Okasha, O; Obenauer, P J; Diclaro, J W

    2015-05-01

    Members of the Anopheles gambiae sensu lato (Giles) complex define a group of seven morphologically indistinguishable species, including the principal malaria vectors in Sub-Saharan Africa. Members of this complex differ in behavior and ability to transmit malaria; hence, precise identification of member species is critical to monitoring and evaluating malaria threat levels. We collected mosquitoes from five counties in Liberia every other month from May 2011 until May 2012, using various trapping techniques. A. gambiae complex members were identified using molecular techniques based on differences in the ribosomal DNA (rDNA) region between species and the molecular forms (S and M) of A. gambiae sensu stricto (s.s.) specimens. In total, 1,696 A. gambiae mosquitoes were collected and identified. DNA was extracted from the legs of each specimen, with species identification determined by multiplex polymerase chain reaction using specific primers. The molecular forms (M or S) of A. gambiae s.s. were determined by restriction fragment length polymorphism. Bivariate and multivariate logistic regression models identified environmental variables associated with genomic differentiation. Our results indicate widespread occurrence of A. gambiae s.s., the principal malaria vector in the complex, although two Anopheles melas Theobald/A. merus Donitz mosquitoes were detected. We found that 72.6, 25.5, and 1.9% of A. gambiae s.s. specimens were S, M, and hybrid forms, respectively. Statistical analysis indicates that the S form was more likely to be found in rural areas, during rainy seasons, and in indoor catches. This information will enhance vector control efforts in Liberia. Published by Oxford University Press on behalf of Entomological Society of America 2015. This work is written by US Government employees and is in the public domain in the US.

  15. The best motivator priorities parents choose via analytical hierarchy process

    NASA Astrophysics Data System (ADS)

    Farah, R. N.; Latha, P.

    2015-05-01

    Motivation is probably the most important factor that educators can target in order to improve learning. Numerous cross-disciplinary theories have been postulated to explain motivation. While each of these theories has some truth, no single theory seems to adequately explain all human motivation. The fact is that human beings in general, and pupils in particular, are complex creatures with complex needs and desires. In this paper, the Analytic Hierarchy Process (AHP) is proposed as a solution to a large, dynamic and complex real-world multi-criteria decision-making problem: selecting the most suitable motivator when parents choose a school for their children. Data were analyzed using SPSS 17.0 ("Statistical Package for Social Science") software. Both descriptive and inferential statistics were used. Descriptive statistics identified the demographic factors of the respondent pupils and parents. Inferential testing determined the pupils' and parents' highest motivator priorities, and AHP was used to rank the criteria considered by parents: school principals, teachers, pupils and parents. The moderating factors are schools selected on the basis of "Standard Kualiti Pendidikan Malaysia" (SKPM) in Ampang. Inferential statistics, such as one-way ANOVA, were used to test significance, and the data were used to calculate the AHP weightings. School principals were found to be the best motivator for parents in choosing a school for their children, followed by teachers, parents and pupils.
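
    For readers unfamiliar with AHP, the Python sketch below computes priority weights as the principal eigenvector of a pairwise comparison matrix, the core calculation behind the weightings mentioned above; the matrix entries are invented for illustration and do not reproduce the paper's judgements.

      import numpy as np

      # hypothetical Saaty-scale pairwise comparisons over the four criteria
      # named in the abstract: principals, teachers, pupils, parents
      A = np.array([
          [1.0, 3.0, 5.0, 4.0],
          [1/3, 1.0, 3.0, 2.0],
          [1/5, 1/3, 1.0, 0.5],
          [1/4, 0.5, 2.0, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)            # principal eigenvalue
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                           # priority weights, sum to 1

      # consistency ratio CI/RI, with RI = 0.90 for a 4x4 matrix (Saaty)
      n = A.shape[0]
      CR = ((eigvals.real[k] - n) / (n - 1)) / 0.90
      print(dict(zip(["principals", "teachers", "pupils", "parents"],
                     w.round(3))), round(CR, 3))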

  16. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    PubMed

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

    A microsponge drug delivery system (MDDC) was prepared by a double emulsion-solvent-diffusion technique using rotor-stator homogenization. The quality by design (QbD) concept was implemented for the development of MDDC with the potential to be incorporated into a semisolid dosage form (gel). The quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified accordingly. Critical material attributes (CMA) and critical process parameters (CPP) were identified using a quality risk management (QRM) tool: failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis, along with literature data and product and process knowledge and understanding. FMECA identified the amounts of ethylcellulose, chitosan, acetone, dichloromethane, Span 80 and Tween 80 and the water ratio in the primary/multiple emulsions as CMA, and the rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between the identified CPP and particle size as CQA was described in the design space using a design of experiments - one-factor response surface method. The results obtained from the statistically designed experiments enabled the establishment of mathematical models and equations that were used for detailed characterization of the influence of the identified CPP upon MDDC particle size and particle size distribution and for their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Fresh Biomass Estimation in Heterogeneous Grassland Using Hyperspectral Measurements and Multivariate Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Darvishzadeh, R.; Skidmore, A. K.; Mirzaie, M.; Atzberger, C.; Schlerf, M.

    2014-12-01

    Accurate estimation of grassland biomass at peak productivity can provide crucial information regarding the functioning and productivity of rangelands. Hyperspectral remote sensing has proved to be valuable for the estimation of vegetation biophysical parameters such as biomass using different statistical techniques. However, in statistical analysis of hyperspectral data, multicollinearity is a common problem due to the large number of correlated hyperspectral reflectance measurements. The aim of this study was to examine the prospect of above-ground biomass estimation in a heterogeneous Mediterranean rangeland employing multivariate calibration methods. Canopy spectral measurements were made in the field using a GER 3700 spectroradiometer, along with concomitant in situ measurements of above-ground biomass for 170 sample plots. Multivariate calibrations including partial least squares regression (PLSR), principal component regression (PCR), and least-squares support vector machines (LS-SVM) were used to estimate the above-ground biomass. The prediction accuracy of the multivariate calibration methods was assessed using cross-validated R2 and RMSE. The best model performance was obtained using LS-SVM and then PLSR, both calibrated with the first-derivative reflectance dataset, with R2cv = 0.88 and 0.86 and RMSEcv = 1.15 and 1.07, respectively. The weakest prediction accuracy appeared when PCR was used (R2cv = 0.31 and RMSEcv = 2.48). The obtained results highlight the importance of multivariate calibration methods for biomass estimation when hyperspectral data are used.
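
    As a hedged illustration of one of the calibrations, the sketch below runs PLSR with cross-validated R2 and RMSE on synthetic stand-in spectra; the band count, component number, and derivative step are assumptions, not the study's settings.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(2)
      n_plots, n_bands = 170, 600          # 170 plots, hypothetical band count
      X = rng.normal(size=(n_plots, n_bands)).cumsum(axis=1)      # stand-in spectra
      y = 0.01 * X[:, 120] + rng.normal(scale=0.5, size=n_plots)  # stand-in biomass

      X_deriv = np.gradient(X, axis=1)     # first-derivative reflectance

      pls = PLSRegression(n_components=10)
      y_hat = cross_val_predict(pls, X_deriv, y, cv=10).ravel()
      ss_res = np.sum((y - y_hat) ** 2)
      r2_cv = 1 - ss_res / np.sum((y - y.mean()) ** 2)
      rmse_cv = np.sqrt(ss_res / len(y))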

  18. Statistical downscaling rainfall using artificial neural network: significantly wetter Bangkok?

    NASA Astrophysics Data System (ADS)

    Vu, Minh Tue; Aribarg, Thannob; Supratid, Siriporn; Raghavan, Srivatsan V.; Liong, Shie-Yui

    2016-11-01

    Artificial neural network (ANN) is an established technique with a flexible mathematical structure that is capable of identifying complex nonlinear relationships between input and output data. The present study utilizes ANN as a method of statistically downscaling global climate models (GCMs) during the rainy season at meteorological site locations in Bangkok, Thailand. The study illustrates the application of a feed-forward back-propagation network using large-scale predictor variables derived from both the ERA-Interim reanalysis data and present-day/future GCM data. The predictors are first selected over different grid boxes surrounding the Bangkok region and then screened by using principal component analysis (PCA) to filter the best-correlated predictors for ANN training. The reanalysis-downscaled results for the present-day climate show good agreement against station precipitation, with a correlation coefficient of 0.8 and a Nash-Sutcliffe efficiency of 0.65. The final downscaled results for four GCMs show an increasing trend of rainy-season precipitation over Bangkok by the end of the twenty-first century. The extreme values of precipitation determined using statistical indices show strong increases in wetness. These findings will be useful for policy makers in pondering adaptation measures for flooding, such as whether the current drainage network system is sufficient to meet the changing climate, and in planning a range of related adaptation/mitigation measures.
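
    A minimal sketch of the downscaling chain (PCA screening followed by a feed-forward network), with invented predictor and rainfall arrays standing in for the reanalysis and station data:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      n_days, n_predictors = 3000, 40     # hypothetical grid-box predictors
      X = rng.normal(size=(n_days, n_predictors))
      rain = np.maximum(0.0, X[:, :5].sum(axis=1) + rng.normal(size=n_days))

      model = make_pipeline(
          StandardScaler(),
          PCA(n_components=0.95),         # keep 95% of predictor variance
          MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
      )
      model.fit(X[:2000], rain[:2000])
      corr = np.corrcoef(model.predict(X[2000:]), rain[2000:])[0, 1]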

  19. Estimations of ABL fluxes and other turbulence parameters from Doppler lidar data

    NASA Technical Reports Server (NTRS)

    Gal-Chen, Tzvi; Xu, Mei; Eberhard, Wynn

    1989-01-01

    Techniques for extracting boundary layer parameters from measurements of a short-pulse CO2 Doppler lidar are described. The measurements are those collected during the First International Satellite Land Surface Climatology Project (ISLSCP) Field Experiment (FIFE). By continuously operating the lidar for about an hour, stable statistics of the radial velocities can be extracted. Assuming that the turbulence is horizontally homogeneous, the mean wind, its standard deviations, and the momentum fluxes were estimated. Spectral analysis of the radial velocities is also performed, from which, by examining the amplitude of the power spectrum in the inertial range, the kinetic energy dissipation was deduced. Finally, using the statistical form of the Navier-Stokes equations, the surface heat flux is derived as the residual balance between the vertical gradient of the third moment of the vertical velocity and the kinetic energy dissipation. Combining many measurements would normally reduce the error, provided that it is unbiased and uncorrelated. The nature of some of the algorithms, however, is such that biased and correlated errors may be generated even though the raw measurements are not. Data processing procedures were developed that eliminate bias and minimize error correlation. Once bias and error correlations are accounted for, the large sample size is shown to reduce the errors substantially. The principal features of the derived turbulence statistics for two case studies are presented.

  20. Energy resolution improvement of CdTe detectors by using the principal component analysis technique

    NASA Astrophysics Data System (ADS)

    Alharbi, T.

    2018-02-01

    In this paper, we report on the application of the Principal Component Analysis (PCA) technique for the improvement of the γ-ray energy resolution of CdTe detectors. The PCA technique is used to estimate the amount of charge trapping reflected in the shape of each detector pulse, thereby correcting for the charge-trapping effect. The details of the method are described and the results obtained with a CdTe detector are shown. We have achieved an energy resolution of 1.8% (FWHM) at 662 keV, with full detection efficiency, from a 1 mm thick CdTe detector that gives an energy resolution of 4.5% (FWHM) with the standard pulse-processing method.
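
    The following sketch shows one way a PCA-based charge-trapping correction can work in principle, on simulated pulses; the pulse model, the score regression, and all constants are assumptions for illustration, not the paper's processing chain.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(4)
      n_pulses, n_samples = 5000, 256
      t = np.arange(n_samples)
      trap = rng.uniform(0, 1, n_pulses)                 # unknown trapping per event
      rise = 1 - np.exp(-t / (20 + 40 * trap[:, None]))  # slower rise with trapping
      pulses = (1 - 0.1 * trap[:, None]) * rise \
               + 0.01 * rng.normal(size=(n_pulses, n_samples))

      amplitude = pulses[:, -32:].mean(axis=1)      # uncorrected pulse height
      norm = pulses / amplitude[:, None]            # remove amplitude, keep shape
      score = PCA(n_components=1).fit_transform(norm).ravel()  # shape parameter

      # regress pulse height on the shape score and remove the trend
      slope, _ = np.polyfit(score, amplitude, 1)
      corrected = amplitude - slope * score         # trapping-corrected height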

  1. Interactive Communication: A Few Research Answers for a Technological Explosion.

    ERIC Educational Resources Information Center

    Chapanis, Alphonse

    The techniques, procedures, and principal findings of 15 different experiments in a research program on interactive communication are summarized in this paper. Among the principal findings reported are that: problems are solved faster in communication modes that have a voice channel than in those that do not have a voice channel, modes of…

  2. Strategies for Solidarity Education at Catholic Schools in Chile: Approximations and Descriptions from the Perspectives of School Principals

    ERIC Educational Resources Information Center

    Santana Lopez, Alejandra Isabel; Hernandez Mary, Natalia

    2013-01-01

    This research project sought to learn how solidarity education is manifested in Chilean Catholic schools, considering the perspectives of school principals, programme directors and pastoral teams. Eleven Chilean schools were studied and the information gathering techniques applied included: a questionnaire, semi-structured individual interviews…

  3. Principals' and Teachers' Reports of Successful Teaching Strategies with Children with High-Functioning Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Stokes, Mark A.; Thomson, Mary; Macmillan, Caitlin A.; Pecora, Laura; Dymond, Sarah R.; Donaldson, Emma

    2017-01-01

    Little research has been conducted on the evidence base for educational interventions implemented by teachers targeting students with high-functioning autism spectrum disorder (HFASD). Research examining particular techniques perceived as effective may facilitate guidelines for the application of evidence-based practices. A principal and teacher…

  4. Principal Evaluation--Linking Individual and Building-Level Progress: Making the Connections and Embracing the Tensions

    ERIC Educational Resources Information Center

    Zepeda, Sally J.; Lanoue, Philip D.; Price, Noris F.; Jimenez, Albert M.

    2014-01-01

    The article examines the tensions one superintendent in the USA experienced as he evaluated principals in a high-stakes environment that had undergone numerous transformations at the central office. Using qualitative methods, primarily shadowing techniques, observations and debriefing, the following tensions emerged and were examined in light of…

  5. Job Satisfaction: Factor Analysis of Greek Primary School Principals' Perceptions

    ERIC Educational Resources Information Center

    Saiti, Anna; Fassoulis, Konstantinos

    2012-01-01

    Purpose: The purpose of this paper is to investigate the factors that affect the level of job satisfaction that school principals experience and, based on the findings, to suggest policies or techniques for improving it. Design/methodology/approach: Questionnaires were administered to 180 primary school heads in 13 prefectures--one from each of…

  6. [Applications of three-dimensional fluorescence spectrum of dissolved organic matter to identification of red tide algae].

    PubMed

    Lü, Gui-Cai; Zhao, Wei-Hong; Wang, Jiang-Tao

    2011-01-01

    The identification techniques for 10 species of red tide algae often found in the coastal areas of China were developed by combining the three-dimensional fluorescence spectra of fluorescent dissolved organic matter (FDOM) from the cultured red tide algae with principal component analysis. Based on the results of the principal component analysis, the first principal component loading spectrum of the three-dimensional fluorescence spectrum was chosen as the identification characteristic spectrum for red tide algae, and the phytoplankton fluorescence characteristic spectrum band was established. The 10 algae species were then tested using Bayesian discriminant analysis, with a correct identification rate of more than 92% for Pyrrophyta at the species level and more than 75% for Bacillariophyta at the genus level, within which the correct identification rates were more than 90% for Phaeodactylum and Chaetoceros. The results showed that the identification techniques for the 10 species of red tide algae, based on the three-dimensional fluorescence spectra of FDOM from the cultured algae and principal component analysis, could work well.
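
    A schematic Python analogue of the pipeline, with hypothetical excitation-emission matrices; scikit-learn's LinearDiscriminantAnalysis stands in for the Bayesian discriminant step (both assign a sample to the class with the highest posterior probability under Gaussian assumptions), and all sizes are invented.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(5)
      n_samples, n_species = 200, 10
      # hypothetical EEMs: 20 excitation x 30 emission wavelengths, flattened
      eems = rng.normal(size=(n_samples, 20 * 30))
      species = rng.integers(0, n_species, n_samples)
      eems += np.eye(n_species)[species] @ rng.normal(size=(n_species, 600))

      pca = PCA(n_components=10).fit(eems)
      loading1 = pca.components_[0].reshape(20, 30)  # first-PC loading spectrum
      scores = pca.transform(eems)

      acc = cross_val_score(LinearDiscriminantAnalysis(), scores,
                            species, cv=5).mean()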

  7. Multi-segmental movements as a function of experience in karate.

    PubMed

    Zago, Matteo; Codari, Marina; Iaia, F Marcello; Sforza, Chiarella

    2017-08-01

    Karate is a martial art whose assessment partly depends on subjective scoring of complex movements. Principal component analysis (PCA)-based methods can identify the fundamental synergies (principal movements) of the motor system, providing a quantitative global analysis of technique. In this study, we aimed to describe the fundamental multi-joint synergies of a karate performance, under the hypothesis that they are skill-dependent, and to estimate each karateka's experience level, expressed as years of practice. A motion capture system recorded traditional karate techniques of 10 professional and amateur karateka. At any time point, the 3D coordinates of body markers produced posture vectors that were normalised, concatenated from all karateka and submitted to a first PCA. Five principal movements described both gross movement synergies and individual differences. A second PCA followed by linear regression estimated the years of practice using principal movements (eigenpostures and weighting curves) and centre-of-mass kinematics (error: 3.71 years; R2 = 0.91, P ≪ 0.001). Principal movements and eigenpostures varied among different karateka and as functions of experience. This approach provides a framework for developing visual tools for the analysis of motor synergies in karate, allowing detection of the multi-joint motor patterns that should be restored after an injury, or specifically trained to increase performance.
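
    A minimal sketch of the first PCA (extracting eigenpostures and their weighting curves) on stand-in marker trajectories; the marker and frame counts are assumptions:

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(6)
      n_frames, n_markers = 1000, 30
      # stand-in motion capture: 3D marker coordinates, one posture per frame
      postures = rng.normal(size=(n_frames, 3 * n_markers)).cumsum(axis=0)
      postures -= postures.mean(axis=0)          # centre the posture vectors

      pca = PCA(n_components=5)
      weights = pca.fit_transform(postures)      # time-varying weighting curves
      eigenpostures = pca.components_            # one principal movement per row
      explained = pca.explained_variance_ratio_  # share of variance per movement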

  8. Atomic-scale phase composition through multivariate statistical analysis of atom probe tomography data.

    PubMed

    Keenan, Michael R; Smentkowski, Vincent S; Ulfig, Robert M; Oltman, Edward; Larson, David J; Kelly, Thomas F

    2011-06-01

    We demonstrate for the first time that multivariate statistical analysis techniques can be applied to atom probe tomography data to estimate the chemical composition of a sample at the full spatial resolution of the atom probe in three dimensions. Whereas the raw atom probe data provide the specific identity of an atom at a precise location, the multivariate results can be interpreted in terms of the probabilities that an atom representing a particular chemical phase is situated there. When aggregated to the size scale of a single atom (∼0.2 nm), atom probe spectral-image datasets are huge and extremely sparse. In fact, the average spectrum will have somewhat less than one total count per spectrum due to imperfect detection efficiency. These conditions, under which the variance in the data is completely dominated by counting noise, test the limits of multivariate analysis, and an extensive discussion of how to extract the chemical information is presented. Efficient numerical approaches to performing principal component analysis (PCA) on these datasets, which may number hundreds of millions of individual spectra, are put forward, and it is shown that PCA can be computed in a few seconds on a typical laptop computer.

  9. Tools based on multivariate statistical analysis for classification of soil and groundwater in Apulian agricultural sites.

    PubMed

    Ielpo, Pierina; Leardi, Riccardo; Pappagallo, Giuseppe; Uricchio, Vito Felice

    2017-06-01

    In this paper, the results obtained from multivariate statistical techniques such as PCA (principal component analysis) and LDA (linear discriminant analysis) applied to a wide soil data set are presented. The results have been compared with those obtained on a groundwater data set, whose samples were collected together with the soil ones within the project "Improvement of the Regional Agro-meteorological Monitoring Network (2004-2007)". LDA, applied to the soil data, distinguished the geographical origin of a sample between two macroareas, Bari and Foggia provinces vs Brindisi, Lecce and Taranto provinces, with a percentage of correct predictions in cross-validation of 87%. In the case of the groundwater data set, the best classification was obtained when the samples were grouped into three macroareas, Foggia province, Bari province, and Brindisi, Lecce and Taranto provinces, reaching a percentage of correct predictions in cross-validation of 84%. The obtained information can be very useful in supporting soil and water resource management, such as the reduction of water consumption and the reduction of energy and chemical (nutrients and pesticides) inputs in agriculture.
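
    To illustrate the classification step, the sketch below cross-validates an LDA classifier on an invented soil-chemistry table with a built-in two-macroarea separation; nothing here reproduces the project's data.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(7)
      X = rng.normal(size=(300, 12))        # samples x hypothetical analytes
      macroarea = rng.integers(0, 2, 300)   # 0: Bari/Foggia, 1: Brindisi/Lecce/Taranto
      X[macroarea == 1, :3] += 1.0          # built-in group separation

      lda = LinearDiscriminantAnalysis()
      acc = cross_val_score(lda, X, macroarea, cv=10).mean()  # cf. the 87% reported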

  10. Motion Control of Drives for Prosthetic Hand Using Continuous Myoelectric Signals

    NASA Astrophysics Data System (ADS)

    Purushothaman, Geethanjali; Ray, Kalyan Kumar

    2016-03-01

    In this paper the authors present motion control of a prosthetic hand through continuous myoelectric signal acquisition, classification and actuation of the prosthetic drive. Four-channel continuous electromyogram (EMG) signals, also known as myoelectric signals (MES), are acquired from able-bodied subjects to classify six unique movements of the hand and wrist, viz., hand open (HO), hand close (HC), wrist flexion (WF), wrist extension (WE), ulnar deviation (UD) and radial deviation (RD). The classification technique involves extracting features/patterns through statistical time-domain (TD) parameters and/or autoregressive (AR) coefficients, which are reduced using principal component analysis (PCA). The reduced statistical TD features and/or AR coefficients are used to classify the signal patterns through k-nearest-neighbour (kNN) as well as neural network (NN) classifiers, and the performance of the two classifiers is compared. The comparison clearly shows that the kNN classifier identifies the intended motion hidden in the myoelectric signals better than the NN classifier. Once the classifier identifies the intended motion, the signal is amplified to actuate three low-power DC motors to perform the above-mentioned movements.
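
    A hedged sketch of the feature-extraction and classification chain (common TD features, PCA reduction, kNN) on synthetic windows; the window size, feature set, and neighbour count are assumptions.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline

      def td_features(w):
          """Common time-domain features for one analysis window."""
          mav = np.mean(np.abs(w))                              # mean absolute value
          wl = np.sum(np.abs(np.diff(w)))                       # waveform length
          zc = np.sum(np.signbit(w[:-1]) != np.signbit(w[1:]))  # zero crossings
          ssc = np.sum(np.diff(np.sign(np.diff(w))) != 0)       # slope sign changes
          return [mav, wl, zc, ssc]

      rng = np.random.default_rng(8)
      n_windows, n_channels, win_len = 600, 4, 200
      emg = rng.normal(size=(n_windows, n_channels, win_len))  # stand-in EMG
      labels = rng.integers(0, 6, n_windows)                   # six hand/wrist motions

      X = np.array([[f for ch in w for f in td_features(ch)] for w in emg])
      clf = make_pipeline(PCA(n_components=8), KNeighborsClassifier(n_neighbors=5))
      acc = cross_val_score(clf, X, labels, cv=5).mean()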

  11. Sequential analysis of hydrochemical data for watershed characterization.

    PubMed

    Thyne, Geoffrey; Güler, Cüneyt; Poeter, Eileen

    2004-01-01

    A methodology for characterizing the hydrogeology of watersheds using hydrochemical data, combining statistical, geochemical, and spatial techniques, is presented. Surface water and ground water base flow and spring runoff samples (180 total) from a single watershed are first classified using hierarchical cluster analysis. The statistical clusters are analyzed for spatial coherence, confirming that the clusters have a geological basis corresponding to topographic flowpaths and showing that the fractured rock aquifer behaves as an equivalent porous medium on the watershed scale. Then principal component analysis (PCA) is used to determine the sources of variation between parameters. PCA shows that the variations within the dataset are related to variations in calcium, magnesium, SO4, and HCO3, which are derived from natural weathering reactions, and in pH, NO3, and chloride, which indicate anthropogenic impact. PHREEQC modeling is used to quantitatively describe the natural hydrochemical evolution of the watershed and aid in discrimination of samples that have an anthropogenic component. Finally, the seasonal changes in the water chemistry of individual sites were analyzed to better characterize the spatial variability of vertical hydraulic conductivity. The integrated result provides a method to characterize the hydrogeology of the watershed that fully utilizes traditional data.
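
    The first two steps of the sequence (hierarchical clustering, then PCA loadings) can be sketched as follows, with an invented ion table standing in for the watershed samples:

      import numpy as np
      import pandas as pd
      from scipy.cluster.hierarchy import linkage, fcluster
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(9)
      ions = ["Ca", "Mg", "Na", "K", "HCO3", "SO4", "Cl", "NO3"]
      data = pd.DataFrame(np.abs(rng.normal(size=(180, len(ions)))), columns=ions)

      Z = StandardScaler().fit_transform(data)
      groups = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")

      pca = PCA(n_components=2).fit(Z)
      loadings = pd.DataFrame(pca.components_.T, index=ions, columns=["PC1", "PC2"])
      # high loadings on Ca/Mg/SO4/HCO3 would flag weathering; NO3/Cl anthropogenic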

  12. A Methodology for the Parametric Reconstruction of Non-Steady and Noisy Meteorological Time Series

    NASA Astrophysics Data System (ADS)

    Rovira, F.; Palau, J. L.; Millán, M.

    2009-09-01

    Climatic and meteorological time series often show some persistence (in time) in the variability of certain features. One could regard annual, seasonal and diurnal time variability as trivial persistence in the variability of some meteorological magnitudes (e.g., global radiation, air temperature above the surface, etc.). In these cases, the traditional Fourier transform into frequency space will show the principal harmonics as the components with the largest amplitude. Nevertheless, meteorological measurements often show other, non-steady (in time) variability. Some fluctuations in measurements (at different time scales) are driven by processes that prevail on some days (or months) of the year but disappear on others. By decomposing a time series into time-frequency space through the continuous wavelet transformation, one is able to determine both the dominant modes of variability and how those modes vary in time. This study is based on a numerical methodology for analysing non-steady principal harmonics in noisy meteorological time series. This methodology combines the continuous wavelet transform with the development of a parametric model that includes the time evolution of the principal and most statistically significant harmonics of the original time series. The parameterisation scheme proposed in this study consists of reproducing the original time series by means of a statistically significant finite sum of sinusoidal signals (waves), each defined by the three usual parameters: amplitude, frequency and phase. To ensure the statistical significance of the parametric reconstruction of the original signal, we propose a standard Student's t analysis of the confidence level of the amplitude in the parametric spectrum for the different wave components. Once the level of significance of the different wave components is assured, the statistically significant principal harmonics (in time) of the original time series can be obtained by using the Fourier transform of the modelled signal. Acknowledgements: The CEAM Foundation is supported by the Generalitat Valenciana and BANCAIXA (València, Spain). This study has been partially funded by the European Commission (FP VI, Integrated Project CIRCE - No. 036961) and by the Ministerio de Ciencia e Innovación, research projects "TRANSREG" (CGL2007-65359/CLI) and "GRACCIE" (CSD2007-00067, Program CONSOLIDER-INGENIO 2010).
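
    A minimal sketch of the wavelet step, using the PyWavelets package as a stand-in implementation: a harmonic present only in part of the record shows up as a time-localised ridge in the scalogram, which would then seed the parametric sum of sinusoids (amplitude, frequency, phase). The sampling rate and the synthetic series are assumptions.

      import numpy as np
      import pywt

      fs = 24.0                        # samples per day (hourly, hypothetical)
      t = np.arange(0.0, 365.0, 1 / fs)
      rng = np.random.default_rng(10)
      # a diurnal harmonic present only mid-year, plus noise
      x = np.sin(2 * np.pi * t) * ((t > 120) & (t < 270)) \
          + 0.3 * rng.normal(size=t.size)

      scales = np.arange(4, 128)
      coeffs, freqs = pywt.cwt(x, scales, "morl", sampling_period=1 / fs)
      power = np.abs(coeffs) ** 2      # time-frequency map of the series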

  13. Accelerometer Data Analysis and Presentation Techniques

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
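
    A few of the quantities listed above can be sketched in Python with SciPy; the sampling rate, interval length, and data are invented, and the actual OARE/SAMS processing parameters are not reproduced here.

      import numpy as np
      from scipy import signal

      rng = np.random.default_rng(11)
      fs = 250.0                                     # Hz, hypothetical sample rate
      accel = 1e-3 * rng.normal(size=int(600 * fs))  # ten minutes of one axis

      # interval RMS acceleration versus time (one value per 10 s interval)
      n = int(10 * fs)
      rms = np.sqrt((accel[: len(accel) // n * n].reshape(-1, n) ** 2).mean(axis=1))

      # power spectral density versus frequency (Welch estimate)
      f, psd = signal.welch(accel, fs=fs, nperseg=4096)

      # power spectral density versus frequency versus time (spectrogram)
      f_s, t_s, sxx = signal.spectrogram(accel, fs=fs, nperseg=4096)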

  14. Radar fall detection using principal component analysis

    NASA Astrophysics Data System (ADS)

    Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem

    2016-05-01

    Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis for fall detection, wherein eigen images of observed motions are employed for classification. Using real data, we demonstrate that the PCA based technique provides performance improvement over the conventional feature extraction methods.
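
    A toy version of the eigen-image idea: flatten each observed motion's time-frequency image, learn a PCA basis, and classify in the reduced space. The image size, component count, and nearest-neighbour classifier are assumptions for illustration, not the paper's configuration.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier

      rng = np.random.default_rng(12)
      events = rng.normal(size=(200, 64 * 64))   # flattened 64x64 spectrograms
      is_fall = rng.integers(0, 2, 200)          # stand-in labels: fall vs non-fall

      pca = PCA(n_components=20)                 # eigen images span the motions
      scores = pca.fit_transform(events)
      clf = KNeighborsClassifier(n_neighbors=3).fit(scores[:150], is_fall[:150])
      accuracy = clf.score(pca.transform(events[150:]), is_fall[150:])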

  15. Forest statistics for the Southern Piedmont of Virginia, 1991

    Treesearch

    Tony G. Johnson

    1991-01-01

    This report highlights the principal findings of the sixth forest survey of the Southern Piedmont of Virginia. Field work began in March 1991 and was completed in June 1991. Five previous surveys, completed in 1940, 1957, 1965, 1976, and 1985, provide statistics for measuring changes and trends over the past 51 years. The primary emphasis in this report is on the...

  16. Forest statistics for the Northern Mountains of Virginia, 1992

    Treesearch

    Tony G. Johnson

    1992-01-01

    This report highlights the principal findings of the sixth forest survey of the Northern Mountains of Virginia. Field work began in September 1991 and was completed in November 1991. Five previous surveys, completed in 1940, 1957, 1966, 1977, and 1986, provide statistics for measuring changes and trends over the past 52 years. The primary emphasis in this report is on...

  17. Forest statistics for the Southern Coastal Plain of North Carolina, 1990

    Treesearch

    Tony G. Johnson

    1990-01-01

    This report highlights the principal findings of the sixth forest survey of the Southern Coastal Plain of North Carolina. Field work began in April 1989 and was completed in September 1989. Five previous surveys, completed in 1937, 1952, 1962, 1973, and 1983, provide statistics for measuring changes and trends over the past 53 years. The primary emphasis in this report...

  18. Forest statistics for the mountains of North Carolina, 1984

    Treesearch

    Gerald C. Craver

    1985-01-01

    This report highlights the principal findings of the fifth forest survey in the Mountains of North Carolina. Fieldwork began in April 1984 and was completed in September 1984. Four previous surveys, completed in 1938, 1955, 1964, and 1974, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes...

  19. Forest statistics for North Central Georgia, 1989

    Treesearch

    Tony G. Johnson

    1989-01-01

    This report highlights the principal findings of the sixth forest survey in North Central Georgia. Field work began in February 1989 and was completed in April 1989. Five previous surveys, completed in 1936, 1953, 1961, 1972, and 1983, provide statistics for measuring changes and trends over the past 53 years. The primary emphasis in this report is on the changes and...

  20. Forest statistics for the Piedmont of North Carolina 1975

    Treesearch

    Richard L. Welch

    1975-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Piedmont of North Carolina. The inventory was started in May 1974 and completed in January 1975. Three previous inventories, completed in 1937, 1956, and 1964, provide statistics for measuring changes and trends over the past 38 years. In this report, the primary...

  1. Forest statistics for the Northern Coastal Plain of North Carolina, 1984

    Treesearch

    Edgar L. Davenport

    1984-01-01

    This report highlights the principal findings of the fifth forest inventory in the Northern Coastal Plain of North Carolina. Fieldwork began in June 1983 and was completed in December 1983. Four previous surveys, completed in 1937, 1955, 1963, and 1974, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on...

  2. Forest statistics for the Southern Coastal Plain of North Carolina, 1983

    Treesearch

    John B. Tansey

    1984-01-01

    This report highlights the principal findings of the fifth forest survey in the southern Coastal Plain of North Carolina. Fieldwork began in November 1982 and was completed in June 1983. Four previous surveys, completed in 1938, 1952, 1962, and 1973, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on...

  3. Forest statistics for Florida, 1980

    Treesearch

    William A. Bechtold; Raymond M. Sheffield

    1981-01-01

    This report highlights the principal findings of the fifth inventory of Florida’s forests. Fieldwork began in September 1978 and was completed in May 1980. Four previous surveys, completed in 1936, 1949, 1959, and 1970, provide statistics for measuring changes and trends over the past 44 years. The primary emphasis in this report is on the changes and trends since 1970...

  4. Forest statistics for the Piedmont of South Carolina 1977

    Treesearch

    Nolan L. Snyder

    1977-01-01

    This report highlights the principal findings of the fifth inventory of the timber resource in the Piedmont of South Carolina. The inventory was started in April 1977 and completed in September 1977. Four previous inventories, completed in 1936, 1947, 1958, and 1967, provide statistics for measuring changes and trends over the past 41 years. In this report, the primary...

  5. Forest statistics for the Southern Coastal Plain of South Carolina 1978

    Treesearch

    Raymond M. Sheffield; Joanne Hutchison

    1978-01-01

    This report highlights the principal findings of the fifth forest inventory of the Southern Coastal Plain of South Carolina. Fieldwork began in April 1978 and was completed in August 1978. Four previous inventories, completed in 1934, 1947, 1958, and 1968, provide statistics for measuring changes and trends over the past 44 years. The primary emphasis in this report is...

  6. Forest statistics for the Northern Piedmont of Virginia, 1986

    Treesearch

    Mark J. Brown

    1986-01-01

    This report highlights the principal findings of the fifth forest survey in the Northern Piedmont of Virginia. Fieldwork began in July 1985 and was completed in September 1985. Four previous surveys, completed in 1940, 1957, 1965, and 1976, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes...

  7. Forest statistics for the Piedmont of North Carolina, 1984

    Treesearch

    Cecil C. Hutchins

    1984-01-01

    This report highlights the principal findings of the fifth forest survey in the Piedmont of North Carolina. Fieldwork began in December 1983 and was completed in August 1984. Four previous surveys, completed in 1937, 1956, 1964, and 1975, provide statistics for measuring changes and trends over the past 47 years. The primary emphasis in this report is on the changes...

  8. Forest statistics for the Northern Piedmont of Virginia, 1992

    Treesearch

    Michael T. Thompson

    1992-01-01

    This report highlights the principal findings of the sixth forest survey of the Northern Piedmont of Virginia. Field work began in June 1991 and was completed in September 1991. Five previous surveys, completed in 1940, 1957, 1965, 1976, and 1986, provide statistics for measuring changes and trends over the past 52 years. The primary emphasis in this report is on the...

  9. Forest statistics for the Southern Coastal Plain of South Carolina, 1987

    Treesearch

    John B. Tansey

    1987-01-01

    This report highlights the principal findings of the sixth forest survey in the Southern Coastal Plain of South Carolina. Fieldwork began in June 1986 and was completed in September 1986. Five previous surveys, completed in 1934, 1947, 1958, 1968, and 1978, provide statistics for measuring changes and trends over the past 53 years. The primary emphasis in this report...

  10. Forest statistics for South Florida, 1980

    Treesearch

    Raymond M. Sheffield; William A. Bechtold

    1981-01-01

    This report highlights the principal findings of the fifth inventory of Florida’s forests. Fieldwork began in September 1978 and was completed in May 1980. Four previous surveys, completed in 1936, 1949, 1959, and 1970, provide statistics for measuring changes and trends over the past 44 years. The primary emphasis in this report is on the changes and trends since 1970...

  11. Forest statistics for the Northern Coastal plain of North Carolina 1974

    Treesearch

    Richard L. Welch; Herbert A. Knight

    1974-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Northern Coastal Plain of North Carolina. The inventory was started in July 1973 and completed in May 1974. Three previous inventories, completed in 1937, 1955, and 1963, provide statistics for measuring changes and trends over the past 37 years. In this report, the...

  12. Forest statistics for the Coastal Plain of Virginia, 1991

    Treesearch

    Michael T. Thompson

    1991-01-01

    This report highlights the principal findings of the sixth forest survey of the Coastal Plain of Virginia. Field work began in October 1990 and was completed in March 1991. Five previous surveys, completed in 1940, 1956, 1966, 1976, and 1985, provide statistics for measuring changes and trends over the past 51 years. The primary emphasis in this report is on the...

  13. Forest statistics for the mountain region of North Carolina 1974

    Treesearch

    Noel D. Cost

    1974-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Mountain Region of North Carolina. The inventory was started in May 1974 and completed in September 1974. Three previous inventories, completed in 1938, 1955, and 1964, provide statistics for measuring changes and trends over the past 36 years. In this report, the primary...

  14. Forest statistics for the mountains of North Carolina, 1990

    Treesearch

    Tony G. Johnson

    1991-01-01

    This report highlights the principal findings of the sixth forest survey of the Mountains of North Carolina. Field work began in August 1990 and was completed in November 1990. Five previous surveys, completed in 1938, 1955, 1964, 1974, and 1984, provide statistics for measuring changes and trends over the past 52 years. The primary emphasis in this report is on the...

  15. Forest statistics for the Southern Mountain region of Virginia, 1977

    Treesearch

    Raymond M. Sheffield

    1977-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Southern Mountain Region of Virginia. The inventory was started in December 1976 and completed in March 1977. Three previous inventories, completed in 1940, 1957, and 1966, provide statistics for measuring changes and trends over the past 37 years. In this report, the...

  16. Forest statistics for the Coastal Plain of Virginia, 1985

    Treesearch

    Mark J. Brown; Gerald C. Craver

    1985-01-01

    This report highlights the principal findings of the fifth forest survey in the Coastal Plain of Virginia. Fieldwork began in September 1984 and was completed in February 1985. Four previous surveys, completed in 1940, 1956, 1966, and 1976, provide statistics for measuring changes and trends over the past 45 years. The primary emphasis in this report is on the changes...

  17. Assessment of trace elements levels in patients with Type 2 diabetes using multivariate statistical analysis.

    PubMed

    Badran, M; Morsy, R; Soliman, H; Elnimr, T

    2016-01-01

    The metabolism of trace elements has been reported to play specific roles in the pathogenesis and progress of diabetes mellitus. Given the continuous increase in the population of patients with Type 2 diabetes (T2D), this study aims to assess the levels and inter-relationships of fasting blood glucose (FBG) and serum trace elements in Type 2 diabetic patients. This study was conducted on 40 Egyptian Type 2 diabetic patients and 36 healthy volunteers (Hospital of Tanta University, Tanta, Egypt). The blood serum was digested and then used to determine the levels of 24 trace elements using inductively coupled plasma mass spectrometry (ICP-MS). Multivariate statistical analyses based on correlation coefficients, cluster analysis (CA) and principal component analysis (PCA) were used to analyse the data. The results exhibited significant changes in FBG and in the levels of eight trace elements (Zn, Cu, Se, Fe, Mn, Cr, Mg, and As) in the blood serum of Type 2 diabetic patients relative to those of healthy controls. The multivariate statistical techniques reduced the number of experimental variables and grouped the trace elements in patients into three clusters. The application of PCA revealed a distinct difference in the associations of trace elements and their clustering patterns between the control and patient groups, in particular for Mg, Fe, Cu, and Zn, which appeared to be the factors most closely related to Type 2 diabetes. Therefore, on the basis of this study, the contributors to trace element content in Type 2 diabetic patients can be determined and specified through correlation analysis and multivariate statistics, confirming that the alteration of some essential trace metals may play a role in the development of diabetes mellitus. Copyright © 2015 Elsevier GmbH. All rights reserved.

  18. Occurrence and transport of pesticides and alkylphenols in water samples along the Ebro River Basin

    NASA Astrophysics Data System (ADS)

    Navarro, Alícia; Tauler, Romà; Lacorte, Sílvia; Barceló, Damià

    2010-03-01

    Summary: We report the temporal and geographical variations of a set of 30 pesticides (including triazines, organophosphorus compounds, and acetanilides) and industrial compounds in surface waters along the Ebro River during the period 2004-2006. Using descriptive statistics, we found that the compounds of industrial origin (tributylphosphate, octylphenol and nonylphenol) appeared in over 60% of the samples analyzed and at very high concentrations, while pesticides had a point-source origin in the Ebro delta area and overall low levels, between 0.005 and 2.575 μg L⁻¹. Correlations among pollutants and their distributions were studied using Principal Component Analysis (PCA), a multivariate exploratory data analysis technique which permitted us to discern between agricultural and industrial sources of contamination. Over the three-year period, a seasonal trend revealed the highest concentrations of pesticides during the spring-summer period, following pesticide application.

  19. Two worlds collide: Image analysis methods for quantifying structural variation in cluster molecular dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steenbergen, K. G., E-mail: kgsteen@gmail.com; Gaston, N.

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.
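
    As a concrete illustration of "PCA on atomic positions characterizes geometric shape": diagonalizing the covariance of the centered coordinates at one time step gives three principal extents, whose (in)equality quantifies sphericity versus elongation. This is a minimal numpy sketch in that spirit, not the authors' code.

    ```python
    # PCA of raw atomic positions for one MD frame: the eigenvalues of the 3x3
    # covariance (gyration-like) tensor are the squared principal extents.
    import numpy as np

    def shape_eigenvalues(positions):
        """positions: (n_atoms, 3). Returns eigenvalues sorted largest-first."""
        centered = positions - positions.mean(axis=0)   # remove center of mass
        cov = centered.T @ centered / len(centered)
        return np.linalg.eigvalsh(cov)[::-1]            # near-equal values -> spherical

    rng = np.random.default_rng(1)
    frame = rng.normal(scale=[3.0, 1.0, 1.0], size=(55, 3))  # a prolate 55-atom cluster
    print(shape_eigenvalues(frame))    # one dominant eigenvalue signals elongation
    ```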

  20. Identifying Symptom Patterns in People Living With HIV Disease.

    PubMed

    Wilson, Natalie L; Azuero, Andres; Vance, David E; Richman, Joshua S; Moneyham, Linda D; Raper, James L; Heath, Sonya L; Kempf, Mirjam-Colette

    2016-01-01

    Symptoms guide disease management, and patients frequently report HIV-related symptoms, but HIV symptom patterns reported by patients have not been described in the era of improved antiretroviral treatment. The objectives of our study were to investigate the prevalence and burden of symptoms in people living with HIV and attending an outpatient clinic. The prevalence, burden, and bothersomeness of symptoms reported by patients in routine clinic visits during 2011 were assessed using the 20-item HIV Symptom Index. Principal component analysis was used to identify symptom clusters, and relationships between groups were examined using appropriate statistical techniques. Two main clusters were identified. The most prevalent and bothersome symptoms were muscle aches/joint pain, fatigue, and poor sleep. A third of patients had seven or more symptoms, including the most burdensome symptoms. Even with improved antiretroviral drug side-effect profiles, symptom prevalence and burden, independent of HIV viral load and CD4+ T cell count, are high.

  1. Identifying Symptom Patterns in People Living With HIV Disease

    PubMed Central

    Wilson, Natalie L.; Azuero, Andres; Vance, David E.; Richman, Joshua S.; Moneyham, Linda D.; Raper, James L.; Heath, Sonya L.; Kempf, Mirjam-Colette

    2016-01-01

    Symptoms guide disease management, and patients frequently report HIV-related symptoms, but HIV symptom patterns reported by patients have not been described in the era of improved antiretroviral treatment. The objectives of our study were to investigate the prevalence and burden of symptoms in people living with HIV and attending an outpatient clinic. The prevalence, burden, and bothersomeness of symptoms reported by patients in routine clinic visits during 2011 were assessed using the 20-item HIV Symptom Index. Principal component analysis was used to identify symptom clusters, and relationships between groups were examined using appropriate statistical techniques. Two main clusters were identified. The most prevalent and bothersome symptoms were muscle aches/joint pain, fatigue, and poor sleep. A third of patients had seven or more symptoms, including the most burdensome symptoms. Even with improved antiretroviral drug side-effect profiles, symptom prevalence and burden, independent of HIV viral load and CD4+ T cell count, are high. PMID:26790340

  2. Intelligent guidance and control for wind shear encounter

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1988-01-01

    The principal objective is to develop methods for assessing the likelihood of wind shear encounter, for deciding what flight path to pursue, and for using the aircraft's full potential for combating wind shear. This study requires the definition of both deterministic and statistical techniques for fusing internal and external information, for making go/no-go decisions, and for generating commands to the aircraft's cockpit displays and autopilot for both manually controlled and automatic flight. The program has begun with the development of a real-time expert system for pilot aiding that is based on the results of the FAA Windshear Training Aids Program. A two-volume manual that presents an overview, pilot guide, training program, and substantiating data provides guidelines for this initial development. The Expert System to Avoid Wind Shear (ESAWS) currently contains over 140 rules and is coded in the LISP programming language for implementation on a Symbolics 3670 LISP machine.

  3. Drop coating deposition Raman spectroscopy of blood plasma for the detection of colorectal cancer

    NASA Astrophysics Data System (ADS)

    Li, Pengpeng; Chen, Changshui; Deng, Xiaoyuan; Mao, Hua; Jin, Shaoqin

    2015-03-01

    We have recently applied the technique of drop coating deposition Raman (DCDR) spectroscopy to colorectal cancer (CRC) detection using blood plasma. The aim of this study was to develop a more convenient and stable blood-plasma-based method for noninvasive CRC detection. Significant differences are observed between the DCDR spectra of cancer plasma (n=75, from 15 CRC patients) and healthy plasma (n=105, from 21 volunteers), particularly in spectral features related to proteins, nucleic acids, and β-carotene. Principal component analysis (PCA) and linear discriminant analysis (LDA), together with leave-one-out cross-validation, were applied to the DCDR spectra and yielded a sensitivity of 100% (75/75) and a specificity of 98.1% (103/105) for the detection of CRC. This study demonstrates that DCDR spectroscopy of blood plasma combined with multivariate statistical algorithms has potential for the noninvasive detection of CRC.
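
    A hedged sketch of the PCA -> LDA -> leave-one-out pipeline described above, with synthetic spectra in place of the DCDR data; the dimensions and the 10-component choice are illustrative assumptions.

    ```python
    # PCA for dimensionality reduction, LDA for classification, leave-one-out CV.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(0)
    spectra = np.vstack([rng.normal(0.0, 1.0, (105, 300)),   # "healthy" spectra
                         rng.normal(0.3, 1.0, (75, 300))])   # "cancer" spectra
    labels = np.r_[np.zeros(105), np.ones(75)]

    clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
    acc = cross_val_score(clf, spectra, labels, cv=LeaveOneOut())
    print(f"LOOCV accuracy: {acc.mean():.3f}")
    ```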

  4. Microorganisms detection on substrates using QCL spectroscopy

    NASA Astrophysics Data System (ADS)

    Padilla-Jiménez, Amira C.; Ortiz-Rivera, William; Castro-Suarez, John R.; Ríos-Velázquez, Carlos; Vázquez-Ayala, Iris; Hernández-Rivera, Samuel P.

    2013-05-01

    Recent investigations have focused on the improvement of rapid and accurate methods to develop spectroscopic markers of the compounds constituting microorganisms that are considered biological threats. Quantum cascade laser (QCL) systems have revolutionized many areas of research and development in defense and security applications, including this area of research. Infrared spectroscopic detection based on QCLs was employed to acquire mid-infrared (MIR) spectral signatures of Bacillus thuringiensis (Bt), Escherichia coli (Ec) and Staphylococcus epidermidis (Se), which were used as simulants of biological threat agents. The experiments were carried out in reflection mode on various substrates such as cardboard, glass, travel baggage, wood and stainless steel. Chemometric statistical routines such as principal component analysis (PCA) regression and partial least squares-discriminant analysis (PLS-DA) were applied to the recorded MIR spectra. The results show that the infrared vibrational techniques investigated are useful for the classification/detection of the target microorganisms on the types of substrates studied.
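
    PLS-DA, named in the record above, is commonly implemented as a PLS regression onto one-hot class labels; this is a sketch under that assumption (the paper does not give its code), with synthetic spectra standing in for the three organisms.

    ```python
    # PLS-DA: partial least squares regression on one-hot labels, class = argmax.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(90, 200))          # synthetic MIR reflectance spectra
    y = np.repeat([0, 1, 2], 30)            # Bt / Ec / Se classes
    Y = np.eye(3)[y]                        # one-hot encoding for PLS-DA

    pls = PLSRegression(n_components=8).fit(X, Y)
    pred = pls.predict(X).argmax(axis=1)    # pick the largest predicted response
    print("training accuracy:", (pred == y).mean())
    ```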

  5. Two worlds collide: image analysis methods for quantifying structural variation in cluster molecular dynamics.

    PubMed

    Steenbergen, K G; Gaston, N

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.

  6. 77 FR 59004 - Membership of the Senior Executive Service Standing Performance Review Boards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-25

    ..., MELODEE PRINCIPAL DEPUTY ADMINISTRATOR, OFFICE OF JUVENILE JUSTICE AND DELINQUENCY PREVENTION. FEUCHT.... SABOL, WILLIAM DEPUTY DIRECTOR, BUREAU OF JUSTICE STATISTICS. RIDGEWAY, GREG DEPUTY DIRECTOR, NATIONAL...

  7. Digital spatial data for predicted nitrate and arsenic concentrations in basin-fill aquifers of the Southwest Principal Aquifers study area

    USGS Publications Warehouse

    McKinney, Tim S.; Anning, David W.

    2012-01-01

    This product "Digital spatial data for predicted nitrate and arsenic concentrations in basin-fill aquifers of the Southwest Principal Aquifers study area" is a 1:250,000-scale vector spatial dataset developed as part of a regional Southwest Principal Aquifers (SWPA) study (Anning and others, 2012). The study examined the vulnerability of basin-fill aquifers in the southwestern United States to nitrate contamination and arsenic enrichment. Statistical models were developed by using the random forest classifier algorithm to predict concentrations of nitrate and arsenic across a model grid that represents local- and basin-scale measures of source, aquifer susceptibility, and geochemical conditions.

  8. A novel principal component analysis for spatially misaligned multivariate air pollution data.

    PubMed

    Jandarov, Roman A; Sheppard, Lianne A; Sampson, Paul D; Szpiro, Adam A

    2017-01-01

    We propose novel methods for predictive (sparse) PCA with spatially misaligned data. These methods identify principal component loading vectors that explain as much variability in the observed data as possible, while also ensuring the corresponding principal component scores can be predicted accurately by means of spatial statistics at locations where air pollution measurements are not available. This will make it possible to identify important mixtures of air pollutants and to quantify their health effects in cohort studies, where currently available methods cannot be used. We demonstrate the utility of predictive (sparse) PCA in simulated data and apply the approach to annual averages of particulate matter speciation data from national Environmental Protection Agency (EPA) regulatory monitors.

  9. Depth resolved compositional analysis of aluminium oxide thin film using non-destructive soft x-ray reflectivity technique

    NASA Astrophysics Data System (ADS)

    Sinha, Mangalika; Modi, Mohammed H.

    2017-10-01

    An in-depth compositional analysis of a 240 Å thick aluminium oxide thin film has been carried out using soft x-ray reflectivity (SXR) and x-ray photoelectron spectroscopy (XPS) techniques. The compositional details of the film are estimated by modelling the optical index profile obtained from the SXR measurements over the 60-200 Å wavelength region. The SXR measurements were carried out at the Indus-1 reflectivity beamline. The method suggests that the principal film region comprises Al2O3 and AlOx (x = 1.6) phases, whereas the interface region comprises a mixture of SiO2 and AlOx (x = 1.6). The soft x-ray reflectivity technique combined with XPS measurements explains the compositional details of the principal layer. Since the interface region cannot be analyzed non-destructively with XPS, the SXR technique is a powerful tool for nondestructive compositional analysis of the interface region.

  10. Perceptions of Secondary School Principals on Management of Cultural Diversity in Spain. The Challenge of Educational Leadership

    ERIC Educational Resources Information Center

    Gómez-Hurtado, Inmaculada; González-Falcón, Inmaculada; Coronel, José M.

    2018-01-01

    The aim of this article is to examine how school principals perceive cultural diversity and management. To this end, qualitative research was carried out for one semester in four secondary schools in Andalusia (Spain). Through interviews and discussion groups, triangulated with other qualitative research techniques, we explored the mindset and…

  11. The Principal's Companion: Strategies and Hints To Make the Job Easier. Second Edition.

    ERIC Educational Resources Information Center

    Robbins, Pam; Alvy, Harvey B.

    Despite the administrative leadership that most principals receive in university courses, their most useful learning occurs once they are on the job. The new knowledge--much of it the result of trial and error--is gained in relative isolation. This second edition provides ideas, approaches, strategies, resources, tools, techniques, and reflective…

  12. The role of the elementary school principal — as perceived by Israeli principals an attempt at role analysis

    NASA Astrophysics Data System (ADS)

    Kremer, Lya

    1983-03-01

    The study aimed to define the role of principalship, necessary because of the conflicting loyalties and responsibilities inherent in this role. Four 'facets' were used to classify areas of principals' activities, from which a questionnaire was constructed. Ninety principals, representing most of those in the northern part of Israel, participated in the project. On the basis of the questionnaire, they related their role activities to the degree of autonomy they felt they were granted. The findings identify the main factors in the role of the principal and show that principals' activities may be defined according to (a) the degree to which they have been planned in advance or carried out ad hoc; (b) their professional or bureaucratic character; (c) the goals which they are meant to serve and whether their aim is task/product- or person-orientated. In all of these areas the principal can act with varying degrees of autonomy. The techniques employed in obtaining and analyzing the results offer a means of creating and comparing principals' profiles, hence also of indicating the position of the principal in the more general system; producing a typology of administrative styles; and identifying personal styles of principalship.

  13. Developing a complex independent component analysis technique to extract non-stationary patterns from geophysical time-series

    NASA Astrophysics Data System (ADS)

    Forootan, Ehsan; Kusche, Jürgen

    2016-04-01

    Geodetic/geophysical observations, such as time series of global terrestrial water storage change or of sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time-series approaches. In recent decades, decomposition techniques have attracted increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and, more recently, independent component analysis (ICA) are common techniques for extracting statistically orthogonal (uncorrelated) and independent modes, respectively, that represent the maximum variance of the observations. PCA and ICA can be classified as stationary signal decomposition techniques, since they are based on decomposing the auto-covariance matrix or on diagonalizing higher (than two) order statistical tensors built from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables, even after removing cyclic components such as the seasonal cycles. In this paper, we present a new decomposition method, complex independent component analysis (CICA; Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation, so that the complex time series contain the observed values in their real part and the temporal rate of variability in their imaginary part; (ii) apply an ICA algorithm based on diagonalization of fourth-order cumulants to decompose the complex data set from (i); and (iii) recognize dominant non-stationary patterns as independent complex patterns that can be used to represent the amplitude and phase propagation in space and time. We present the results of CICA for simulated and real cases, e.g., for quantifying the impact of large-scale ocean-atmosphere interaction on global mass changes. References: Forootan (2014) Statistical signal decomposition techniques for analyzing time-variable satellite gravimetry data, PhD thesis, University of Bonn, http://hss.ulb.uni-bonn.de/2014/3766/3766.htm; Forootan and Kusche (2012) Separation of global time-variable gravity signals into maximally independent components, Journal of Geodesy 86(7), 477-497, doi:10.1007/s00190-011-0532-5.
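
    Step (i) above can be sketched with an analytic signal: the Hilbert transform supplies an imaginary part that is the 90°-phase-shifted series, a proxy for the temporal rate of variability. A complex PCA (via SVD) then serves as a simplified stand-in for the decomposition; the paper's fourth-order-cumulant complex ICA is substantially more involved. All data below are synthetic.

    ```python
    # Build the complex data set via a Hilbert transform, then complex PCA as a
    # simplified stand-in for CICA.
    import numpy as np
    from scipy.signal import hilbert

    rng = np.random.default_rng(0)
    t = np.arange(240)                                 # e.g. 20 years, monthly samples
    data = np.sin(2 * np.pi * t / 12)[:, None] + 0.1 * rng.normal(size=(240, 50))

    analytic = hilbert(data, axis=0)    # real: observations; imag: phase-shifted series
    centered = analytic - analytic.mean(axis=0)
    U, s, Vh = np.linalg.svd(centered, full_matrices=False)

    # Complex spatial patterns: amplitude encodes strength, phase encodes propagation.
    print("variance in mode 1:", round(float(s[0] ** 2 / (s ** 2).sum()), 3))
    print("mode-1 phases (first 5 points):", np.angle(Vh[0, :5]).round(2))
    ```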

  14. Evaluators' Perceptions of Teachers' Use of Behavior Alteration Techniques.

    ERIC Educational Resources Information Center

    Allen, Terre; Edwards, Renee

    1988-01-01

    Examines which message-based behavior alteration techniques (BATs) teacher evaluators perceive as commonly used by good, average, and poor teachers. Reports that principals equate reward-type messages with effective teaching and punishment-type messages with ineffective teaching. (MM)

  15. Analysis of satellite data on energetic particles of ionospheric origin

    NASA Technical Reports Server (NTRS)

    Sharp, R. D.; Johnson, R. G.; Shelley, E. G.

    1976-01-01

    The principal result of this program has been the completion of a detailed statistical study of the properties of precipitating O(+) and H(+) ions during two principal magnetic storms. The results of the analysis of selected data from the ion mass spectrometer experiment on satellites are given, with emphasis on the morphology of the O(+) ions of ionospheric origin with energies in the 0.7 ≤ E ≤ 12 keV range that were discovered with this experiment.

  16. Reconstruction Error and Principal Component Based Anomaly Detection in Hyperspectral Imagery

    DTIC Science & Technology

    2014-03-27

    Excerpts: ... (2003), and (Jackson D. A., 1993). In 1933, Hotelling (Hotelling, 1933), who coined the term 'principal components,' surmised that there was a ... goodness of fit and multivariate quality control with the statistic Q_i = (X_i − X̂_i)(X_i − X̂_i)^T, where X_i and X̂_i are the observed and reconstructed 1×p vectors, and where, under the ... sparsely targeted scenes through SNR or other methods. 5) Customize sorting and histogram construction methods in Multiple PCA to avoid redundancy.
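
    The Q statistic quoted above is the squared residual after projecting an observation onto the principal subspace. A hedged numpy sketch of that reconstruction-error anomaly detector follows; the five-component subspace and the percentile threshold are illustrative choices, not values from the report.

    ```python
    # PCA reconstruction-error (Q statistic) anomaly detection on synthetic pixels.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(10000, 30))            # pixels x spectral bands
    X[0] += 5 * rng.normal(size=30)             # implant one anomalous pixel

    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:5].T                                # top-5 principal components
    X_hat = Xc @ P @ P.T                        # reconstruction in the PC subspace
    Q = np.sum((Xc - X_hat) ** 2, axis=1)       # Q_i = (x_i - x̂_i)(x_i - x̂_i)^T
    print("anomalous pixels:", np.where(Q > np.percentile(Q, 99.9))[0])
    ```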

  17. Impact of multicollinearity on small sample hydrologic regression models

    NASA Astrophysics Data System (ADS)

    Kroll, Charles N.; Song, Peter

    2013-06-01

    Often hydrologic regression models are developed with ordinary least squares (OLS) procedures. The use of OLS with highly correlated explanatory variables produces multicollinearity, which creates highly sensitive parameter estimators with inflated variances and improper model selection. It is not clear how best to address multicollinearity in hydrologic regression models. Here a Monte Carlo simulation is developed to compare four techniques for addressing multicollinearity: OLS, OLS with variance inflation factor screening (VIF), principal component regression (PCR), and partial least squares regression (PLS). The performance of these four techniques was observed for varying sample sizes, correlation coefficients between the explanatory variables, and model error variances consistent with hydrologic regional regression models. The negative effects of multicollinearity are magnified at smaller sample sizes, higher correlations between the variables, and larger model error variances (smaller R²). The Monte Carlo simulation indicates that if the true model is known, multicollinearity is present, and the estimation and statistical testing of regression parameters are of interest, then PCR or PLS should be employed. If the model is unknown, or if the interest is solely in model predictions, it is recommended that OLS be employed, since using more complicated techniques did not produce any improvement in model performance. A leave-one-out cross-validation case study was also performed using low-streamflow data sets from the eastern United States. Results indicate that OLS with stepwise selection generally produces models across study regions, with varying levels of multicollinearity, that are as good as biased regression techniques such as PCR and PLS.
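
    The OLS-versus-PCR contrast in the abstract can be reproduced in miniature: with two nearly collinear regressors and a small sample, the variance of the OLS coefficient balloons while the (biased) PCR coefficient stays stable. The sample size, correlation, and replication count below are illustrative, not the study's design.

    ```python
    # Mini Monte Carlo: coefficient variability under multicollinearity, OLS vs PCR.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    ols_b1, pcr_b1 = [], []
    for _ in range(500):
        x1 = rng.normal(size=20)
        x2 = 0.95 * x1 + 0.05 * rng.normal(size=20)   # highly correlated pair
        X, y = np.c_[x1, x2], x1 + x2 + rng.normal(size=20)

        ols_b1.append(LinearRegression().fit(X, y).coef_[0])

        pca = PCA(n_components=1).fit(X)              # keep one principal component
        gamma = LinearRegression().fit(pca.transform(X), y).coef_
        pcr_b1.append((pca.components_.T @ gamma)[0]) # map back to original variable

    print("std of OLS beta1:", np.std(ols_b1).round(2))   # inflated by collinearity
    print("std of PCR beta1:", np.std(pcr_b1).round(2))   # biased but stable
    ```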

  18. Forest statistics for the Southern Coastal Plain of North Carolina 1973

    Treesearch

    Noel D. Cost

    1973-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Southern Coastal Plain of North Carolina. The inventory was started in November 1972 and completed in August 1973. Three previous inventories, completed in 1937, 1952, and 1962, provide statistics for measuring changes and trends over the past 36 years. In this...

  19. Forest statistics for the Northern Coastal Plain of South Carolina, 1986

    Treesearch

    John B. Tansey

    1987-01-01

    This report highlights the principal findings of the sixth forest survey in the Northern Coastal Plain of South Carolina. Fieldwork began in April 1986 and was completed in July 1986. Five previous surveys, completed in 1936, 1947, 1958, 1968, and 1978, provide statistics for measuring changes and trends over the past 50 years. The primary emphasis in this report is on...

  20. 75 FR 61136 - Notice of Proposed Information Collection Requests

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-04

    ... EdFacts data as well as data from surveys of school principals and special education designees about their school improvement practices. The study will use descriptive statistics and regression analysis to...

  1. Interpersonal differentiation within depression diagnosis: relating interpersonal subgroups to symptom load and the quality of the early therapeutic alliance.

    PubMed

    Grosse Holtforth, Martin; Altenstein, David; Krieger, Tobias; Flückiger, Christoph; Wright, Aidan G C; Caspar, Franz

    2014-01-01

    We examined interpersonal problems in psychotherapy outpatients with a principal diagnosis of a depressive disorder in routine care (n=361). These patients were compared to a normative non-clinical sample and to outpatients with other principal diagnoses (n=959). Furthermore, these patients were statistically assigned to interpersonally defined subgroups that were compared regarding symptoms and the quality of the early alliance. The sample of depressive patients reported higher levels of interpersonal problems than the normative sample and the sample of outpatients without a principal diagnosis of depression. Latent Class Analysis identified eight distinct interpersonal subgroups, which differed regarding self-reported symptom load and the quality of the early alliance. However, therapists' alliance ratings did not differentiate between the groups. This interpersonal differentiation within the group of patients with a principal diagnosis of depression may add to a personalized psychotherapy based on interpersonal profiles.

  2. Application of principal component analysis (PCA) as a sensory assessment tool for fermented food products.

    PubMed

    Ghosh, Debasree; Chattopadhyay, Parimal

    2012-06-01

    The objective of the work was to use the method of quantitative descriptive analysis (QDA) to describe the sensory attributes of fermented food products prepared with the incorporation of lactic cultures. Panellists were selected and trained to evaluate various attributes, especially color and appearance, body texture, flavor, overall acceptability and acidity, of fermented food products such as cow milk curd and soymilk curd, idli, sauerkraut and probiotic ice cream. Principal component analysis (PCA) identified six significant principal components that accounted for more than 90% of the variance in the sensory attribute data. Overall product quality was modelled as a function of the principal components using multiple least squares regression (R² = 0.8). The results from PCA were statistically analyzed by analysis of variance (ANOVA). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the attributes of fermented food products that are important for consumer acceptability.

  3. Multimodal fiber-probe spectroscopy for the diagnostics and classification of bladder tumors

    NASA Astrophysics Data System (ADS)

    Anand, Suresh; Cicchi, Riccardo; Fantechi, Riccardo; Gacci, Mauro; Nesi, Gabriella; Carini, Marco; Pavone, Francesco S.

    2017-02-01

    The gold standard for the detection of bladder cancer is white light cystoscopy, followed by an invasive biopsy and pathological examination. Tissue pathology is time consuming and often prone to sampling errors. Recently, optical spectroscopy techniques have emerged as promising tools for the detection of neoplasia. The specific goal of this study is to evaluate the application of combined autofluorescence (excited at 378 nm and 445 nm wavelengths) and diffuse reflectance spectroscopy to discriminate normal bladder tissue from tumors of different grades. The fluorescence spectra at both excitation wavelengths showed increased spectral intensity in tumors with respect to normal tissues. Reflectance data indicated increased reflectance in the 610-700 nm wavelength range for different grades of tumors compared to normal tissues. The spectral data were further analyzed using principal component analysis to evaluate the sensitivity and specificity for diagnosing tumors. The spectral differences observed between various grades of tumors provide a strong basis for future evaluation in a larger patient population to achieve statistical significance. This study indicates that a combined spectroscopic strategy, incorporating fluorescence and reflectance spectroscopy, could improve the capability for diagnosing bladder tumors as well as for differentiating tumors of different grades.

  4. [Periodontal guided tissue regeneration with a rubber dam: short term clinical study].

    PubMed

    D'Archivio, D; Di Placido, G; Tumini, V; Paolantonio, M

    1998-03-01

    Guided regeneration of periodontal tissues has been demonstrated to be a therapeutic technique with predictable results. It has been observed that different materials, used as regenerative membranes, offer very similar results. Unconventional materials, too, like the rubber dam, seem to be useful in the guided tissue regeneration technique. The aim of the present study was to comparatively evaluate the effectiveness of Gore-Tex and rubber-dam membranes in the therapy of intra-osseous periodontal defects. Six patients, each with two similar intra-osseous defects, participated in the study; one defect was treated during the surgical intervention with a Gore-Tex membrane, while the other received a fragment of sterile rubber dam. The principal clinical parameters of periodontal health (probing depth, PD, and attachment loss, AL) were evaluated in both defects before and 6 months after periodontal surgery. The results showed no statistically significant differences (p > 0.05) in the healing of intra-osseous defects treated with rubber dam versus Gore-Tex. The conclusion is drawn that the rubber dam can represent a valid and cheap alternative to the materials traditionally used in regenerative surgery of the periodontal tissues.

  5. Automated texture-based identification of ovarian cancer in confocal microendoscope images

    NASA Astrophysics Data System (ADS)

    Srivastava, Saurabh; Rodriguez, Jeffrey J.; Rouse, Andrew R.; Brewer, Molly A.; Gmitro, Arthur F.

    2005-03-01

    The fluorescence confocal microendoscope provides high-resolution, in-vivo imaging of cellular pathology during optical biopsy. There are indications that the examination of human ovaries with this instrument has diagnostic implications for the early detection of ovarian cancer. The purpose of this study was to develop a computer-aided system to facilitate the identification of ovarian cancer from digital images captured with the confocal microendoscope system. To achieve this goal, we modeled the cellular-level structure present in these images as texture and extracted features based on first-order statistics, spatial gray-level dependence matrices, and spatial-frequency content. Selection of the best features for classification was performed using traditional feature selection techniques including stepwise discriminant analysis, forward sequential search, a non-parametric method, principal component analysis, and a heuristic technique that combines the results of these methods. The best set of features selected was used for classification, and performance of various machine classifiers was compared by analyzing the areas under their receiver operating characteristic curves. The results show that it is possible to automatically identify patients with ovarian cancer based on texture features extracted from confocal microendoscope images and that the machine performance is superior to that of the human observer.

  6. Multi-scale pixel-based image fusion using multivariate empirical mode decomposition.

    PubMed

    Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P; McDonald-Maier, Klaus D

    2015-05-08

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences.
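
    For context, the classical PCA fusion rule that the authors benchmark against weights each source image by the components of the leading eigenvector of the 2x2 pixel covariance. This sketch shows that baseline only (not MEMD itself), on synthetic exposures.

    ```python
    # Classical PCA image fusion: eigenvector-weighted average of two sources.
    import numpy as np

    def pca_fuse(img1, img2):
        stack = np.stack([img1.ravel(), img2.ravel()])  # 2 x n_pixels
        eigvals, eigvecs = np.linalg.eigh(np.cov(stack))
        w = np.abs(eigvecs[:, -1])                      # leading eigenvector
        w = w / w.sum()                                 # normalized fusion weights
        return w[0] * img1 + w[1] * img2

    rng = np.random.default_rng(0)
    under = rng.uniform(0.0, 0.4, size=(64, 64))        # under-exposed frame
    over = np.clip(3.0 * under + 0.2, 0.0, 1.0)         # over-exposed frame
    fused = pca_fuse(under, over)
    print(fused.shape, float(fused.min()), float(fused.max()))
    ```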

  7. Multi-Scale Pixel-Based Image Fusion Using Multivariate Empirical Mode Decomposition

    PubMed Central

    Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P.; McDonald-Maier, Klaus D.

    2015-01-01

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences. PMID:26007714

  8. Automatic age and gender classification using supervised appearance model

    NASA Astrophysics Data System (ADS)

    Bukar, Ali Maina; Ugail, Hassan; Connah, David

    2016-11-01

    Age and gender classification are two important problems that have recently gained popularity in the research community due to their wide range of applications. Research has shown that both age and gender information are encoded in the face shape and texture; hence the active appearance model (AAM), a statistical model that captures shape and texture variations, has been one of the most widely used feature extraction techniques for the aforementioned problems. However, AAM suffers from some drawbacks, especially when used for classification. This is primarily because principal component analysis (PCA), which is at the core of the model, works in an unsupervised manner, i.e., PCA dimensionality reduction does not take into account how the predictor variables relate to the response (class labels). Rather, it explores only the underlying structure of the predictor variables; thus, it is no surprise if PCA discards valuable parts of the data that represent discriminatory features. Toward this end, we propose a supervised appearance model (sAM) that improves on AAM by replacing PCA with partial least-squares regression. This feature extraction technique is then used for the problems of age and gender classification. Our experiments show that sAM has better predictive power than the conventional AAM.

  9. The spectral analysis of fuel oils using terahertz radiation and chemometric methods

    NASA Astrophysics Data System (ADS)

    Zhan, Honglei; Zhao, Kun; Zhao, Hui; Li, Qian; Zhu, Shouming; Xiao, Lizhi

    2016-10-01

    The combustion characteristics of fuel oils are closely related to both engine efficiency and pollutant emissions, and the analysis of oils and their additives is thus important. These oils and additives have been found to generate distinct responses to terahertz (THz) radiation as the result of various molecular vibrational modes. In the present work, THz spectroscopy was employed to identify a number of oils, including lubricants, gasoline and diesel, with different additives. The identities of dozens of these oils could be readily established using statistical models based on principal component analysis. The THz spectra of gasoline, diesel, sulfur and methyl methacrylate (MMA) were acquired and linear fits were obtained. By using chemometric methods, including back-propagation artificial neural network and support vector machine techniques, typical concentrations of sulfur in gasoline (at the ppm level) could be detected, together with MMA in diesel below 0.5%. The absorption characteristics of the oil additives were also assessed using 2D correlation spectroscopy, and several hidden absorption peaks were discovered. The technique discussed herein should provide a useful new means of analyzing fuel oils with various additives and impurities in a non-destructive manner and will therefore benefit the field of chemical detection and identification.

  10. Evaluation of human serum of severe rheumatoid arthritis by confocal Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Carvalho, C. S.; Raniero, L.; Santo, A. M. E.; Pinheiro, M. M.; Andrade, L. E. C.; Cardoso, M. A. G.; Junior, J. S.; Martin, A. A.

    2010-02-01

    Rheumatoid arthritis is a chronic, recurrent, systemic inflammatory disease, initiated by autoantibodies and maintained by cellular inflammatory mechanisms. Evaluating this disease to enable early diagnosis requires the combination of several tools, such as clinical and physical examination and a thorough medical history; however, there is no satisfactory consensus, owing to its complexity. In the present work, confocal Raman spectroscopy was used to evaluate the biochemical composition of the blood serum of 40 volunteers: 24 patients with rheumatoid arthritis presenting clinical signs and symptoms, and 16 healthy donors. Latex agglutination (on polystyrene coated with human immunoglobulin G) and C-reactive protein testing were performed to rule out possible false-negative results within the groups, facilitating the statistical interpretation and validation of the technique. This study aimed to verify changes in the characteristic Raman peaks of biomolecules such as immunoglobulins, amides, and proteins. The results were highly significant, with a good separation between the groups. Discriminant analysis was performed on the principal components and correctly identified 92% of the donors. Based on these results, we observed the autoimmune behavior of arthritis, evident in certain spectral regions that characterize the serological differences between the groups.

  11. A hybrid sensing approach for pure and adulterated honey classification.

    PubMed

    Subari, Norazian; Mohamad Saleh, Junita; Md Shakaff, Ali Yeon; Zakaria, Ammar

    2012-10-17

    This paper presents a comparison between single-modality and fused data for classifying Tualang honey as pure or adulterated, using Linear Discriminant Analysis (LDA) and Principal Component Analysis (PCA) statistical classification approaches. Ten different brands of certified pure Tualang honey were obtained throughout peninsular Malaysia and Sumatera, Indonesia. Various concentrations of two types of sugar solution (beet and cane sugar) were used in this investigation to create honey samples with 20%, 40%, 60% and 80% adulteration concentrations. Honey data extracted from an electronic nose (e-nose) and Fourier Transform Infrared Spectroscopy (FTIR) were gathered, analyzed and compared on the basis of the fusion methods. Visual observation of the classification plots revealed that the PCA approach was able to distinguish pure and adulterated honey samples better than the LDA technique. Overall, the validated classification results based on FTIR data (88.0%) gave higher classification accuracy than e-nose data (76.5%) using the LDA technique. Honey classification based on normalized low-level and intermediate-level FTIR and e-nose fusion data scored classification accuracies of 92.2% and 88.7%, respectively, using the stepwise LDA method. The results suggest that pure and adulterated honey samples are better classified using FTIR and e-nose fusion data than single-modality data.

  12. Spectral region optimization for Raman-based optical biopsy of inflammatory lesions.

    PubMed

    de Carvalho, Luis Felipe das Chagas E Silva; Bitar, Renata Andrade; Arisawa, Emília Angela Loschiavo; Brandão, Adriana Aigotti Haberbeck; Honório, Kathia Maria; Cabral, Luiz Antônio Guimarães; Martin, Airton Abrahão; Martinho, Herculano da Silva; Almeida, Janete Dias

    2010-08-01

    The biochemical alterations between inflammatory fibrous hyperplasia (IFH) and normal tissues of the buccal mucosa were probed using the FT-Raman spectroscopy technique. The aim was to find the minimal set of Raman bands that would furnish the best discrimination. Raman-based optical biopsy is a widely recognized potential technique for noninvasive real-time diagnosis. However, few studies have been devoted to the discrimination of very common subtle or early pathologic states, such as the inflammatory processes that are always present on, for example, cancer lesion borders. Seventy spectra of IFH from 14 patients were compared with 30 spectra of normal tissues from six patients. The statistical analysis was performed with principal component analysis and soft independent modeling of class analogy, cross-validated with leave-one-out methods. Bands close to 574, 1,100, 1,250 to 1,350, and 1,500 cm⁻¹ (mainly amino acid and collagen bands) showed the main intragroup variations, which are due to the acanthosis process in the IFH epithelium. The 1,200 (C-C aromatic/DNA), 1,350 (CH₂ bending/collagen I), and 1,730 cm⁻¹ (collagen III) regions presented the main intergroup variations. This finding was interpreted as originating in an extracellular-matrix degeneration process occurring in the inflammatory tissues. The statistical analysis results indicated that the best discrimination capability (sensitivity of 95% and specificity of 100%) was obtained using the 530-580 cm⁻¹ spectral region. The existence of this narrow spectral window enabling normal versus inflammatory diagnosis also has useful implications for an in vivo dispersive Raman setup for clinical applications.

  13. Automatic small bowel tumor diagnosis by using multi-scale wavelet-based analysis in wireless capsule endoscopy images.

    PubMed

    Barbosa, Daniel C; Roupar, Dalila B; Ramos, Jaime C; Tavares, Adriano C; Lima, Carlos S

    2012-01-11

    Wireless capsule endoscopy has been introduced as an innovative, non-invasive diagnostic technique for evaluation of the gastrointestinal tract, reaching places where conventional endoscopy is unable to go. However, the output of this technique is an 8-hour video, whose analysis by the expert physician is very time consuming. Thus, a computer-assisted diagnosis tool to help physicians evaluate CE exams faster and more accurately is an important technical challenge and an excellent economic opportunity. The set of features proposed in this paper to encode textural information is based on statistical modeling of second-order textural measures extracted from co-occurrence matrices. To cope with both joint and marginal non-Gaussianity of the second-order textural measures, higher-order moments are used. These statistical moments are taken from the two-dimensional color-scale feature space, where two different scales are considered. Second- and higher-order moments of the textural measures are computed from co-occurrence matrices of images synthesized by the inverse wavelet transform of the wavelet transform containing only the selected scales for the three color channels. The dimensionality of the data is reduced using principal component analysis. The proposed textural features are then used as the input of a classifier based on artificial neural networks. Classification performances of 93.1% specificity and 93.9% sensitivity are achieved on real data. These promising results open the path towards a deeper study of the applicability of this algorithm in computer-aided diagnosis systems to assist physicians in their clinical practice.
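
    A simplified sketch of the pipeline shape, second-order co-occurrence texture measures, PCA, then a small neural-network classifier, using scikit-image's gray-level co-occurrence utilities (graycomatrix/graycoprops in recent versions). The wavelet-scale selection and higher-order moments of the paper are omitted, and the data are synthetic.

    ```python
    # Co-occurrence texture features -> PCA -> neural-network classifier.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline

    def texture_features(img):
        """img: 2-D uint8 frame. Returns second-order texture descriptors."""
        glcm = graycomatrix(img, distances=[1, 2], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        props = ["contrast", "homogeneity", "energy", "correlation"]
        return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

    rng = np.random.default_rng(0)
    frames = rng.integers(0, 256, size=(60, 32, 32), dtype=np.uint8)
    X = np.array([texture_features(f) for f in frames])
    y = rng.integers(0, 2, size=60)           # normal vs tumor (synthetic labels)

    clf = make_pipeline(PCA(n_components=5),
                        MLPClassifier(max_iter=2000, random_state=0)).fit(X, y)
    print("training accuracy:", clf.score(X, y))
    ```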

  14. Investigating elementary principals' science beliefs and knowledge and its relationship to students' science outcomes

    NASA Astrophysics Data System (ADS)

    Khan, Uzma Zafar

    The aim of this quantitative study was to investigate elementary principals' beliefs about reformed science teaching and learning, science subject matter knowledge, and how these factors relate to fourth grade students' superior science outcomes. Online survey methodology was used for data collection and included a demographic questionnaire and two survey instruments: the K-4 Physical Science Misconceptions Oriented Science Assessment Resources for Teachers (MOSART) and the Beliefs About Reformed Science Teaching and Learning (BARSTL). Hierarchical multiple regression analysis was used to assess the separate and collective contributions of background variables such as principals' personal and school characteristics, principals' science teaching and learning beliefs, and principals' science knowledge on students' superior science outcomes. Mediation analysis was also used to explore whether principals' science knowledge mediated the relationship between their beliefs about science teaching and learning and students' science outcomes. Findings indicated that principals' science beliefs and knowledge do not contribute to predicting students' superior science scores. Fifty-two percent of the variance in percentage of students with superior science scores was explained by school characteristics with free or reduced price lunch and school type as the only significant individual predictors. Furthermore, principals' science knowledge did not mediate the relationship between their science beliefs and students' science outcomes. There was no statistically significant variation among the variables. The data failed to support the proposed mediation model of the study. Implications for future research are discussed.

  15. Risk prediction for myocardial infarction via generalized functional regression models.

    PubMed

    Ieva, Francesca; Paganoni, Anna M

    2016-08-01

    In this paper, we propose a generalized functional linear regression model for a binary outcome indicating the presence/absence of a cardiac disease, with multivariate functional data among the relevant predictors. In particular, the motivating aim is the analysis of electrocardiographic traces of patients whose pre-hospital electrocardiogram (ECG) has been sent to the 118 Dispatch Center of Milan (the Italian toll-free number for emergencies) by life support personnel of the basic rescue units. The statistical analysis starts with a preprocessing of the ECGs, treated as multivariate functional data. The signals are reconstructed from noisy observations. The biological variability is then removed by a nonlinear registration procedure based on landmarks. Then, in order to perform a data-driven dimensional reduction, a multivariate functional principal component analysis is carried out on the variance-covariance matrix of the reconstructed and registered ECGs and their first derivatives. We use the scores of the principal component decomposition as covariates in a generalized linear model to predict the presence of the disease in a new patient. Hence, a new semi-automatic diagnostic procedure is proposed to estimate the risk of infarction (in the case of interest, the probability of being affected by Left Bundle Branch Block). The performance of this classification method is evaluated and compared with other methods proposed in the literature. Finally, the robustness of the procedure is checked via leave-j-out techniques.

  16. Critical Analysis of Cluster Models and Exchange-Correlation Functionals for Calculating Magnetic Shielding in Molecular Solids.

    PubMed

    Holmes, Sean T; Iuliucci, Robbie J; Mueller, Karl T; Dybowski, Cecil

    2015-11-10

    Calculations of the principal components of magnetic-shielding tensors in crystalline solids require the inclusion of the effects of lattice structure on the local electronic environment to obtain significant agreement with experimental NMR measurements. We assess periodic (GIPAW) and GIAO/symmetry-adapted cluster (SAC) models for computing magnetic-shielding tensors by calculations on a test set containing 72 insulating molecular solids, with a total of 393 principal components of chemical-shift tensors from 13C, 15N, 19F, and 31P sites. When clusters are carefully designed to represent the local solid-state environment and when periodic calculations include sufficient variability, both methods predict magnetic-shielding tensors that agree well with experimental chemical-shift values, demonstrating the correspondence of the two computational techniques. At the basis-set limit, we find that the small differences in the computed values have no statistical significance for three of the four nuclides considered. Subsequently, we explore the effects of additional DFT methods available only with the GIAO/cluster approach, particularly the use of hybrid-GGA functionals, meta-GGA functionals, and hybrid meta-GGA functionals that demonstrate improved agreement in calculations on symmetry-adapted clusters. We demonstrate that meta-GGA functionals improve computed NMR parameters over those obtained by GGA functionals in all cases, and that hybrid functionals improve computed results over the respective pure DFT functional for all nuclides except 15N.

  17. A gap-filling model for eddy covariance latent heat flux: Estimating evapotranspiration of a subtropical seasonal evergreen broad-leaved forest as an example

    NASA Astrophysics Data System (ADS)

    Chen, Yi-Ying; Chu, Chia-Ren; Li, Ming-Hsu

    2012-10-01

    Summary: In this paper we present a semi-parametric multivariate gap-filling model for tower-based measurements of latent heat flux (LE). Two statistical techniques, principal component analysis (PCA) and a nonlinear interpolation approach, were integrated into this LE gap-filling model. PCA was first used to resolve the multicollinear relationships among various environmental variables, including radiation, soil moisture deficit, leaf area index, wind speed, etc. Two nonlinear interpolation methods, multiple regression (MRS) and K-nearest neighbors (KNN), were examined with randomly selected flux gaps for both clear-sky and nighttime/cloudy data for incorporation into this LE gap-filling model. Experimental results indicated that the KNN interpolation approach provides consistent LE estimates, while MRS overestimates during nighttime/cloudy conditions. Rather than using empirical regression parameters, the KNN approach resolves the nonlinear relationship between the gap-filled LE flux and the principal components, with adaptive K values under different atmospheric states. The developed LE gap-filling model (PCA with KNN) achieves an RMSE of 2.4 W m⁻² (~0.09 mm day⁻¹) at a weekly time scale when 40% artificial flux gaps are added to the original dataset. Annual evapotranspiration at this study site was estimated at 736 mm (1803 MJ) and 728 mm (1785 MJ) for the years 2008 and 2009, respectively.
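
    A sketch of the PCA-plus-KNN idea in the abstract: decorrelate the environmental drivers with PCA, then fill each gap from the K nearest time steps in PC space. Variable names and the synthetic relationship are placeholders, not the study's data.

    ```python
    # PCA + K-nearest-neighbour gap filling for a flux time series.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsRegressor

    rng = np.random.default_rng(0)
    drivers = rng.normal(size=(2000, 6))    # radiation, soil moisture deficit, LAI...
    le = 2 * drivers[:, 0] + drivers[:, 1] ** 2 + rng.normal(scale=0.3, size=2000)

    gaps = rng.random(2000) < 0.4           # 40% artificial flux gaps
    pcs = PCA(n_components=3).fit_transform(drivers)

    knn = KNeighborsRegressor(n_neighbors=10).fit(pcs[~gaps], le[~gaps])
    filled = knn.predict(pcs[gaps])
    print(f"gap-filling RMSE: {np.sqrt(np.mean((filled - le[gaps]) ** 2)):.2f}")
    ```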

  18. FY71 Engineering Report on Surveillance Techniques for Civil Aviation Security

    DOT National Transportation Integrated Search

    1971-11-01

    This document discusses the work performed by the TSC task group on surveillance techniques in FY71. The principal section is devoted to the technical description, classification and evaluation of commercial metal detectors for concealed weapons. It ...

  19. Characterization of palmprints by wavelet signatures via directional context modeling.

    PubMed

    Zhang, Lei; Zhang, David

    2004-06-01

    The palmprint is one of the most reliable physiological characteristics that can be used to distinguish between individuals. Current palmprint-based systems are more user friendly, more cost effective, and require fewer data signatures than traditional fingerprint-based identification systems. The principal lines and wrinkles captured in a low-resolution palmprint image provide more than enough information to uniquely identify an individual. This paper presents a palmprint identification scheme that characterizes a palmprint using a set of statistical signatures. The palmprint is first transformed into the wavelet domain, and the directional context of each wavelet subband is defined and computed in order to collect the predominant coefficients of its principal lines and wrinkles. A set of statistical signatures, which includes gravity center, density, spatial dispersivity and energy, is then defined to characterize the palmprint with the selected directional context values. A classification and identification scheme based on these signatures is subsequently developed. This scheme exploits the features of principal lines and prominent wrinkles sufficiently and achieves satisfactory results. Compared with the line-segments-matching or interesting-points-matching based palmprint verification schemes, the proposed scheme uses a much smaller amount of data signatures. It also provides a convenient classification strategy and more accurate identification.
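
    To make "statistical signatures in the wavelet domain" concrete, here is a minimal PyWavelets sketch that computes energy- and density-style signatures per detail subband. The paper's directional-context weighting and its exact signature definitions are omitted, so this conveys only the flavor of the approach.

    ```python
    # Energy/density signatures of wavelet detail subbands of a palmprint image.
    import numpy as np
    import pywt

    def wavelet_signatures(img, wavelet="db2", level=3):
        coeffs = pywt.wavedec2(img, wavelet, level=level)
        sigs = []
        for detail in coeffs[1:]:           # (horizontal, vertical, diagonal) bands
            for band in detail:
                energy = float(np.sum(band ** 2))
                density = float(np.mean(np.abs(band) > np.abs(band).std()))
                sigs.extend([energy, density])
        return np.array(sigs)               # 3 levels x 3 bands x 2 signatures

    rng = np.random.default_rng(0)
    palm = rng.normal(size=(128, 128))      # stand-in for a low-resolution palmprint
    print(wavelet_signatures(palm).shape)
    ```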

  20. Sparse PCA with Oracle Property.

    PubMed

    Gu, Quanquan; Wang, Zhaoran; Liu, Han

    In this paper, we study the estimation of the k-dimensional sparse principal subspace of a covariance matrix Σ in the high-dimensional setting. We aim to recover the oracle principal subspace solution, i.e., the principal subspace estimator obtained assuming the true support is known a priori. To this end, we propose a family of estimators based on the semidefinite relaxation of sparse PCA with novel regularizations. In particular, under a weak assumption on the magnitude of the population projection matrix, one estimator within this family exactly recovers the true support with high probability, has exact rank k, and attains an s/n statistical rate of convergence, with s being the subspace sparsity level and n the sample size. Compared to existing support recovery results for sparse PCA, our approach does not hinge on the spiked covariance model or the limited correlation condition. As a complement to the first estimator, which enjoys the oracle property, we prove that another estimator within the family achieves a sharper statistical rate of convergence than the standard semidefinite relaxation of sparse PCA, even when the previous assumption on the magnitude of the projection matrix is violated. We validate the theoretical results by numerical experiments on synthetic datasets.
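
    For a hands-on contrast, scikit-learn's SparsePCA (an l1-penalized relaxation, not the semidefinite estimator proposed above) also drives most loadings to exactly zero, exposing a support; this sketch plants a 5-sparse leading direction and checks recovery. The alpha value is an illustrative guess.

    ```python
    # l1-penalized sparse PCA recovering a planted sparse leading direction.
    import numpy as np
    from sklearn.decomposition import SparsePCA

    rng = np.random.default_rng(0)
    n, d, s = 200, 50, 5
    leading = np.zeros(d)
    leading[:s] = 1 / np.sqrt(s)                    # sparse true direction
    X = 3 * rng.normal(size=(n, 1)) @ leading[None, :] + rng.normal(size=(n, d))

    spca = SparsePCA(n_components=1, alpha=1.0, random_state=0).fit(X)
    print("recovered support:", np.flatnonzero(spca.components_[0]))  # ideally 0..4
    ```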

  1. Sparse PCA with Oracle Property

    PubMed Central

    Gu, Quanquan; Wang, Zhaoran; Liu, Han

    2014-01-01

    In this paper, we study the estimation of the k-dimensional sparse principal subspace of covariance matrix Σ in the high-dimensional setting. We aim to recover the oracle principal subspace solution, i.e., the principal subspace estimator obtained assuming the true support is known a priori. To this end, we propose a family of estimators based on the semidefinite relaxation of sparse PCA with novel regularizations. In particular, under a weak assumption on the magnitude of the population projection matrix, one estimator within this family exactly recovers the true support with high probability, has exact rank-k, and attains a s/n statistical rate of convergence with s being the subspace sparsity level and n the sample size. Compared to existing support recovery results for sparse PCA, our approach does not hinge on the spiked covariance model or the limited correlation condition. As a complement to the first estimator that enjoys the oracle property, we prove that, another estimator within the family achieves a sharper statistical rate of convergence than the standard semidefinite relaxation of sparse PCA, even when the previous assumption on the magnitude of the projection matrix is violated. We validate the theoretical results by numerical experiments on synthetic datasets. PMID:25684971

  2. Principals' Supervisory Techniques as Correlates of Teachers' Job Performance in Secondary Schools in Ebonyi State, Nigeria

    ERIC Educational Resources Information Center

    Chidi, Nnebedum; Victor, Akinfolarin Akinwale

    2017-01-01

    The persistent and prolonged pitiable state of teachers' job performance, leading to poor academic achievement among secondary school students in Ebonyi State, has become a source of concern and worry among stakeholders and parents. This may be because instructional supervision is not regularly performed by the principals in order to provide…

  3. In the Shadow/from the Shadow: The Principal as a Reflective Practitioner in Trinidad and Tobago

    ERIC Educational Resources Information Center

    Bristol, Laurette; Esnard, Talia; Brown, Launcelot

    2015-01-01

    This case highlights a school principal's leading practice as she worked to transform the social and educational status of students, teachers, and community in a small urban primary school. We employ shadowing, a technique popularized in work-based education and photography, as reflective and research tools. Teaching notes provide insight into the…

  4. Implementation of Quality Assurance Standards and Principals' Administrative Effectiveness in Public Secondary Schools in Edo and Delta States

    ERIC Educational Resources Information Center

    Momoh, U.; Osagiobare, Emmanuel Osamiro

    2015-01-01

    The study investigated principals' implementation of quality assurance standards and administrative effectiveness in public secondary schools in Edo and Delta States. To guide the study, four research questions and hypotheses were raised. Descriptive research design was adopted for the study and the simple random sampling technique was used to…

  5. Monitoring the Microgravity Environment Quality On-Board the International Space Station Using Soft Computing Techniques

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Lin, Paul P.

    2001-01-01

    This paper presents an artificial intelligence monitoring system developed by the NASA Glenn Principal Investigator Microgravity Services project to help the principal investigator teams identify the primary vibratory disturbance sources that are active, at any moment in time, on-board the International Space Station, which might impact the microgravity environment their experiments are exposed to. From the Principal Investigator Microgravity Services' web site, the principal investigator teams can monitor via a graphical display, in near real time, which event(s) is/are on, such as crew activities, pumps, fans, centrifuges, compressor, crew exercise, platform structural modes, etc., and decide whether or not to run their experiments based on the acceleration environment associated with a specific event. This monitoring system is focused primarily on detecting the vibratory disturbance sources, but could be used as well to detect some of the transient disturbance sources, depending on the events' duration. The system has built-in capability to detect both known and unknown vibratory disturbance sources. Several soft computing techniques such as Kohonen's Self-Organizing Feature Map, Learning Vector Quantization, Back-Propagation Neural Networks, and Fuzzy Logic were used to design the system.

  6. Principal Curves on Riemannian Manifolds.

    PubMed

    Hauberg, Soren

    2016-09-01

    Euclidean statistics are often generalized to Riemannian manifolds by replacing straight-line interpolations with geodesic ones. While these Riemannian models are familiar-looking, they are restricted by the inflexibility of geodesics, and they rely on constructions which are optimal only in Euclidean domains. We consider extensions of Principal Component Analysis (PCA) to Riemannian manifolds. Classic Riemannian approaches seek a geodesic curve passing through the mean that optimizes a criterion of interest. The requirements that the solution be geodesic and pass through the mean tend to imply that the methods only work well when the manifold is mostly flat within the support of the generating distribution. We argue that instead of generalizing linear Euclidean models, it is more fruitful to generalize non-linear Euclidean models. Specifically, we extend the classic principal curves of Hastie & Stuetzle to data residing on a complete Riemannian manifold. We show that for elliptical distributions in the tangent space of spaces of constant curvature, the standard principal geodesic is a principal curve. The proposed model is simple to compute and avoids many of the pitfalls of traditional geodesic approaches. We empirically demonstrate the effectiveness of the Riemannian principal curves on several manifolds and datasets.

  7. Crashes & Fatalities Related To Driver Drowsiness/Fatigue

    DOT National Transportation Integrated Search

    1994-11-01

    This report summarizes recent national statistics on the incidence and characteristics of crashes involving driver fatigue, drowsiness, or "asleep-at-the-wheel." For the purposes of this report, these terms are considered synonymous. Principal data...

  8. Guide to reporting highway statistics

    DOT National Transportation Integrated Search

    1983-11-01

    Previous analyses conducted by the Federal Highway Administration (FHWA) are used to project year-by-year economic impacts of changes in highway performance out to 1995. In the principal scenario examined, highway performance is allowed to deterior...

  9. Instrumental Response Model and Detrending for the Dark Energy Camera

    DOE PAGES

    Bernstein, G. M.; Abbott, T. M. C.; Desai, S.; ...

    2017-09-14

    We describe the model for mapping from sky brightness to the digital output of the Dark Energy Camera (DECam) and the algorithms adopted by the Dark Energy Survey (DES) for inverting this model to obtain photometric measures of celestial objects from the raw camera output. This calibration aims for fluxes that are uniform across the camera field of view and across the full angular and temporal span of the DES observations, approaching the accuracy limits set by shot noise for the full dynamic range of DES observations. The DES pipeline incorporates several substantive advances over standard detrending techniques, including principal-components-based sky and fringe subtraction; correction of the "brighter-fatter" nonlinearity; use of internal consistency in on-sky observations to disentangle the influences of quantum efficiency, pixel-size variations, and scattered light in the dome flats; and pixel-by-pixel characterization of instrument spectral response, through combination of internal-consistency constraints with auxiliary calibration data. This article provides conceptual derivations of the detrending/calibration steps, and the procedures for obtaining the necessary calibration data. Other publications will describe the implementation of these concepts for the DES operational pipeline, the detailed methods, and the validation that the techniques can bring DECam photometry and astrometry within ≈2 mmag and ≈3 mas, respectively, of fundamental atmospheric and statistical limits. In conclusion, the DES techniques should be broadly applicable to wide-field imagers.
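
    The principal-components sky subtraction mentioned above reduces to straightforward linear algebra. A minimal sketch follows, assuming a stack of (object-masked) sky frames; real pipelines add masking, fringe frames and robust fitting.

      # Minimal sketch of principal-components sky subtraction: build sky PCs
      # from a stack of sky frames via SVD and remove their best-fit
      # combination from a science frame.
      import numpy as np

      rng = np.random.default_rng(2)
      npix, nexp = 4096, 40
      sky_stack = rng.normal(1000.0, 5.0, size=(nexp, npix))   # flattened sky frames

      mean_sky = sky_stack.mean(axis=0)
      _, _, vt = np.linalg.svd(sky_stack - mean_sky, full_matrices=False)
      pcs = vt[:4]                                # leading sky principal components

      science = mean_sky + 2.0 * pcs[0] + rng.normal(0.0, 5.0, npix)
      coeffs, *_ = np.linalg.lstsq(pcs.T, science - mean_sky, rcond=None)
      sky_model = mean_sky + coeffs @ pcs
      residual = science - sky_model
      print("rms after subtraction:", residual.std())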

  11. Statistical discrimination of footwear: a method for the comparison of accidentals on shoe outsoles inspired by facial recognition techniques.

    PubMed

    Petraco, Nicholas D K; Gambino, Carol; Kubic, Thomas A; Olivio, Dayhana; Petraco, Nicholas

    2010-01-01

    In the field of forensic footwear examination, it is a widely held belief that patterns of accidental marks found on footwear and footwear impressions possess a high degree of "uniqueness." This belief, however, has not been thoroughly studied in a numerical way using controlled experiments. As a result, this form of valuable physical evidence has been the subject of admissibility challenges. In this study, we apply statistical techniques used in facial pattern recognition to a minimal set of information gleaned from accidental patterns. That is, in order to maximize the amount of potential similarity between patterns, we use only the coordinate locations of accidental marks (on the top portion of a footwear impression) to characterize the entire pattern. This allows us to numerically gauge how similar two patterns are to one another in a worst-case scenario, i.e., in the absence of the wealth of information normally available to the footwear examiner, such as accidental mark size and shape. The patterns were recorded from the top portion of the shoe soles (i.e., not the heel) of five shoe pairs. All shoes were the same make and model, and all were worn by the same person for a period of 30 days. We found that in 20-30 dimensional principal component (PC) space (99.5% variance retained), patterns from the same shoe, even at different points in time, tended to cluster closer to each other than patterns from different shoes. Correct shoe identification rates using maximum likelihood linear classification analysis and the hold-one-out procedure ranged from 81% to 100%. Three-dimensional PC plots, though retaining little of the variance, were also made and generally corroborated the findings in the much higher-dimensional PC space. This study is intended to be a starting point for future research to build statistical models of the formation and evolution of accidental patterns.
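
    The pipeline can be sketched in a few lines. In the sketch below, each pattern's mark coordinates are rasterized onto a fixed grid before PCA and linear discriminant classification under leave-one-out; the grid encoding is our assumption, as the study worked from registered mark coordinates directly.

      # Sketch of the accidental-pattern comparison: encode mark locations on a
      # grid, reduce with PCA, score shoes with LDA under hold-one-out.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import LeaveOneOut, cross_val_score
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(3)
      n_shoes, reps, grid = 5, 8, 20
      X, y = [], []
      for shoe in range(n_shoes):
          marks = rng.uniform(0, 1, size=(30, 2))          # this shoe's accidentals
          for _ in range(reps):                            # noisy re-impressions
              jitter = marks + rng.normal(0, 0.02, marks.shape)
              img, _, _ = np.histogram2d(jitter[:, 0], jitter[:, 1],
                                         bins=grid, range=[[0, 1], [0, 1]])
              X.append(img.ravel()); y.append(shoe)
      X, y = np.asarray(X), np.asarray(y)

      clf = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
      acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
      print(f"hold-one-out accuracy: {acc:.2f}")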

  12. Improving the sampling strategy of the Joint Danube Survey 3 (2013) by means of multivariate statistical techniques applied on selected physico-chemical and biological data.

    PubMed

    Hamchevici, Carmen; Udrea, Ion

    2013-11-01

    The concept of the basin-wide Joint Danube Survey (JDS) was launched by the International Commission for the Protection of the Danube River (ICPDR) as a tool for investigative monitoring under the Water Framework Directive (WFD), with a frequency of 6 years. The first JDS was carried out in 2001, and its success in providing key information for characterisation of the Danube River Basin District as required by the WFD led to the organisation of the second JDS in 2007, which was the world's biggest river research expedition in that year. The present paper presents an approach for improving the survey strategy for the next planned survey, JDS3 (2013), by means of several multivariate statistical techniques. In order to design the optimum structure in terms of parameters and sampling sites, principal component analysis (PCA), factor analysis (FA) and cluster analysis were applied to JDS2 data for 13 selected physico-chemical elements and one biological element measured at 78 sampling sites located on the main course of the Danube. Results from PCA/FA showed that most of the dataset variance (above 75%) was explained by five varifactors loaded with 8 out of 14 variables: physical (transparency and total suspended solids), relevant nutrients (N-nitrates and P-orthophosphates), feedback effects of primary production (pH, alkalinity and dissolved oxygen) and algal biomass. Taking into account the representation of the factor scores given by FA versus sampling sites and the major groups generated by the clustering procedure, the spatial network of the next survey could be carefully tailored, allowing the number of sampling sites to be reduced by more than 30%. A target-oriented sampling strategy based on the selected multivariate statistics can thus strongly reduce the dimensionality of the original data, and the corresponding costs, without loss of information.
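
    The survey-design workflow can be illustrated compactly: standardize the site-by-parameter matrix, extract components, and cluster sites so redundant stations can be dropped. The sketch below uses synthetic data and plain PCA plus Ward clustering; the paper's varimax rotation step is omitted.

      # Hedged sketch of the workflow: standardize, extract components,
      # cluster sites. Data are synthetic stand-ins for 78 Danube stations.
      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(4)
      sites, params = 78, 14
      data = rng.normal(size=(sites, params))

      Z = StandardScaler().fit_transform(data)
      pca = PCA(n_components=5).fit(Z)            # five varifactors in the paper
      print("variance explained:", pca.explained_variance_ratio_.sum())

      scores = pca.transform(Z)
      groups = fcluster(linkage(scores, method="ward"), t=4, criterion="maxclust")
      print("sites per cluster:", np.bincount(groups)[1:])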

  13. Automated classification of single airborne particles from two-dimensional angle-resolved optical scattering (TAOS) patterns by non-linear filtering

    NASA Astrophysics Data System (ADS)

    Crosta, Giovanni Franco; Pan, Yong-Le; Aptowicz, Kevin B.; Casati, Caterina; Pinnick, Ronald G.; Chang, Richard K.; Videen, Gorden W.

    2013-12-01

    Measurement of two-dimensional angle-resolved optical scattering (TAOS) patterns is an attractive technique for detecting and characterizing micron-sized airborne particles. In general, the interpretation of these patterns and the retrieval of the particle refractive index, shape or size alone, are difficult problems. By reformulating the problem in statistical learning terms, a solution is proposed herewith: rather than identifying airborne particles from their scattering patterns, TAOS patterns themselves are classified through a learning machine, where feature extraction interacts with multivariate statistical analysis. Feature extraction relies on spectrum enhancement, which includes the discrete cosine transform and non-linear operations. Multivariate statistical analysis includes computation of the principal components and supervised training, based on the maximization of a suitable figure of merit. All algorithms have been combined together to analyze TAOS patterns, organize feature vectors, design classification experiments, carry out supervised training, assign unknown patterns to classes, and fuse information from different training and recognition experiments. The algorithms have been tested on a data set with more than 3000 TAOS patterns. The parameters that control the algorithms at different stages have been allowed to vary within suitable bounds and are optimized to some extent. Classification has been targeted at discriminating aerosolized Bacillus subtilis particles, a simulant of anthrax, from atmospheric aerosol particles and interfering particles, like diesel soot. By assuming that all training and recognition patterns come from the respective reference materials only, the most satisfactory classification result corresponds to 20% false negatives from B. subtilis particles and <11% false positives from all other aerosol particles. The most effective operations have consisted of thresholding TAOS patterns in order to reject defective ones, and forming training sets from three or four pattern classes. The presented automated classification method may be adapted for real-time operation, capable of detecting and characterizing micron-sized airborne particles.
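
    A minimal sketch of the feature-extraction stage follows: a 2-D discrete cosine transform with a log nonlinearity enhances each pattern's spectrum, and PCA compresses the result. The exact enhancement operators and figure-of-merit training of the paper are not reproduced.

      # Sketch of spectrum enhancement (2-D DCT + log nonlinearity) followed
      # by PCA feature compression; TAOS patterns are synthetic stand-ins.
      import numpy as np
      from scipy.fft import dctn
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(5)
      patterns = rng.random((300, 64, 64))        # stand-in TAOS patterns

      features = []
      for p in patterns:
          spec = np.log1p(np.abs(dctn(p, norm="ortho")))   # enhanced spectrum
          features.append(spec[:16, :16].ravel())          # keep low-order terms
      features = np.asarray(features)

      scores = PCA(n_components=10).fit_transform(features)
      print("feature vectors:", scores.shape)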

  14. Factor Analysis and Counseling Research

    ERIC Educational Resources Information Center

    Weiss, David J.

    1970-01-01

    Topics discussed include factor analysis versus cluster analysis, analysis of Q correlation matrices, ipsativity and factor analysis, and tests for the significance of a correlation matrix prior to application of factor analytic techniques. Techniques for factor extraction discussed include principal components, canonical factor analysis, alpha…

  15. Mapping brain activity in gradient-echo functional MRI using principal component analysis

    NASA Astrophysics Data System (ADS)

    Khosla, Deepak; Singh, Manbir; Don, Manuel

    1997-05-01

    The detection of sites of brain activation in functional MRI has been a topic of immense research interest, and many techniques have been proposed to this end. Recently, principal component analysis (PCA) has been applied to extract the activated regions and their time course of activation. This method is based on the assumption that the activation is orthogonal to other signal variations such as brain motion, physiological oscillations and other uncorrelated noises. A distinct advantage of this method is that it does not require any knowledge of the time course of the true stimulus paradigm. This technique is well suited to EPI image sequences where the sampling rate is high enough to capture the effects of physiological oscillations. In this work, we propose and apply two PCA-based methods to conventional gradient-echo images and investigate their usefulness as tools to extract reliable information on brain activation. The first method is a conventional technique in which a single image sequence with alternating on and off stages is subjected to a principal component analysis. The second method is a PCA-based approach called the common spatial factor analysis (CSF) technique. As the name suggests, this method relies on common spatial factors between the above fMRI image sequence and a background fMRI. We have applied these methods to identify active brain areas during visual stimulation and motor tasks. The results from these methods are compared to those obtained by using the standard cross-correlation technique. We found good agreement in the areas identified as active across all three techniques. The results suggest that the PCA and CSF methods have good potential for detecting the true stimulus-correlated changes in the presence of other interfering signals.
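
    The core of the first method is an SVD of the voxels-by-time matrix. A minimal sketch on synthetic block-design data, with no motion or physiological structure modeled:

      # Sketch of PCA activation mapping: decompose a voxels-by-time matrix
      # and compare component time courses with the on/off block design.
      import numpy as np

      rng = np.random.default_rng(6)
      n_vox, n_t = 1000, 60
      paradigm = np.tile([0.0] * 5 + [1.0] * 5, 6)          # on/off blocks
      data = rng.normal(size=(n_vox, n_t))
      active = rng.choice(n_vox, 50, replace=False)
      data[active] += 2.0 * paradigm                        # activated voxels

      centered = data - data.mean(axis=1, keepdims=True)
      u, s, vt = np.linalg.svd(centered, full_matrices=False)
      r = np.corrcoef(vt[0], paradigm)[0, 1]                # PC1 vs. paradigm
      print(f"correlation of first PC time course with task: {abs(r):.2f}")
      print("strongest spatial loadings:", np.argsort(-np.abs(u[:, 0]))[:5])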

  16. [A study of Boletus bicolor from different areas using Fourier transform infrared spectrometry].

    PubMed

    Zhou, Zai-Jin; Liu, Gang; Ren, Xian-Pei

    2010-04-01

    It is hard to differentiate the same species of wild-growing mushrooms from different areas by macromorphological features. In this paper, Fourier transform infrared (FTIR) spectroscopy combined with principal component analysis was used to identify 58 samples of Boletus bicolor from five different areas. Based on the fingerprint infrared spectra of the Boletus bicolor samples, principal component analysis was conducted on the 58 spectra in the range of 1350-750 cm⁻¹ using the statistical software SPSS 13.0. The accumulated contribution of the first three principal components accounts for 88.87% of the variance, so these components include almost all the information in the samples. The two-dimensional projection plot of the first and second principal components shows satisfactory clustering for the classification and discrimination of Boletus bicolor; all samples were divided into five groups with a classification accuracy of 98.3%. The study demonstrated that wild-growing Boletus bicolor from different areas can be identified at the species level by FTIR spectra combined with principal component analysis.

  17. Multivariate Statistical Approach Applied to Sediment Source Tracking Through Quantification and Mineral Identification, Cheyenne River, South Dakota

    NASA Astrophysics Data System (ADS)

    Valder, J.; Kenner, S.; Long, A.

    2008-12-01

    Portions of the Cheyenne River are characterized as impaired by the U.S. Environmental Protection Agency because of water-quality exceedances. The Cheyenne River watershed includes the Black Hills National Forest and part of the Badlands National Park. Preliminary analysis indicates that the Badlands National Park is a major contributor to the exceedances of the water-quality constituents for total dissolved solids and total suspended solids. Water-quality data have been collected continuously since 2007, and in the second year of collection (2008), monthly grab and passive sediment samplers are being used to collect total suspended sediment and total dissolved solids in both base-flow and runoff-event conditions. In addition, sediment samples from the river channel, including bed, bank, and floodplain, have been collected. These samples are being analyzed at the South Dakota School of Mines and Technology's X-Ray Diffraction Lab to quantify the mineralogy of the sediments. A multivariate statistical approach (including principal components, least squares, and maximum likelihood techniques) is applied to the mineral percentages that were characterized for each site to identify the contributing source areas that are causing exceedances of sediment transport in the Cheyenne River watershed. Results of the multivariate analysis demonstrate the likely sources of solids found in the Cheyenne River samples. A further refinement of the methods is in progress that utilizes a conceptual model which, when applied with the multivariate statistical approach, provides a better estimate for sediment sources.
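
    The least-squares part of such source tracking amounts to unmixing a sample's mineral percentages into non-negative source contributions. A minimal sketch with invented source signatures:

      # Sketch of non-negative least-squares unmixing: express a river sample's
      # mineral percentages as a mixture of candidate source signatures.
      import numpy as np
      from scipy.optimize import nnls

      # columns = candidate sources, rows = mineral percentages (invented)
      sources = np.array([[60.0, 10.0, 30.0],    # quartz
                          [25.0, 20.0, 50.0],    # smectite
                          [10.0, 60.0, 15.0],    # calcite
                          [ 5.0, 10.0,  5.0]])   # other
      sample = np.array([40.0, 33.0, 22.0, 5.0]) # mixed river sediment

      weights, resid = nnls(sources, sample)
      fractions = weights / weights.sum()
      print("estimated source fractions:", np.round(fractions, 2))
      print("residual norm:", round(resid, 3))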

  18. Research in Network Management Techniques for Tactical Data Communications Networks.

    DTIC Science & Technology

    1982-09-01

    COMPUTER COMMUNICATIONS, US ARMY (CECOM), September 1980 to August 1982. Principal Investigators: Robert Boorstyn, Aaron Kershenbaum, Basil Maglaris, Philip Sarachik. TABLE OF CONTENTS: Summary of Report; Personnel; Activities; Research Reports; A. Packet Radio Networks; A.1 Throughput Analysis of Multihop Packet

  19. ERTS-B imagery interpretation techniques in the Tennessee Valley

    NASA Technical Reports Server (NTRS)

    Gonzalez, R. C. (Principal Investigator)

    1973-01-01

    There are no author-identified significant results in this report. The proposed investigation is a continuation of an ERTS-1 project. The principal missions are to provide the principal support on computer and image-processing problems for the multidisciplinary ERTS effort of the University of Tennessee, and to carry out research on improved methods for the computer processing, enhancement, and recognition of ERTS imagery.

  20. Climate drivers on malaria transmission in Arunachal Pradesh, India.

    PubMed

    Upadhyayula, Suryanaryana Murty; Mutheneni, Srinivasa Rao; Chenna, Sumana; Parasaram, Vaideesh; Kadiri, Madhusudhan Rao

    2015-01-01

    The present study was conducted during the years 2006 to 2012 and provides information on the prevalence of malaria and its regulation by various climatic factors in the East Siang district of Arunachal Pradesh, India. Correlation analysis, principal component analysis and Hotelling's T² statistics were used to understand the effect of weather variables on malaria transmission. The epidemiological study shows that malaria prevalence is mostly caused by the parasite Plasmodium vivax, followed by Plasmodium falciparum. The intensity of malaria cases declined gradually from 2006 to 2012. Malaria transmission was higher during the rainy season than in the summer and winter seasons. Further, analysis with principal component analysis and Hotelling's T² statistics revealed that climatic variables such as temperature and rainfall are the factors most influencing the high rate of malaria transmission in the East Siang district of Arunachal Pradesh.
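
    Hotelling's T² monitoring of multivariate climate records can be sketched briefly: months whose T² exceeds an F-based control limit are flagged as unusual. The synthetic monthly variables below are assumptions standing in for the study's data.

      # Sketch of Hotelling's T-squared screening on monthly climate vectors.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      n, p = 84, 3                                   # 7 years of monthly records
      X = rng.normal(size=(n, p))                    # temp, rainfall, humidity

      mean = X.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
      d = X - mean
      t2 = np.einsum("ij,jk,ik->i", d, cov_inv, d)   # T^2 for each month

      # F-distribution based upper control limit at alpha = 0.01
      alpha = 0.01
      limit = (p * (n - 1) / (n - p)) * stats.f.ppf(1 - alpha, p, n - p)
      print("flagged months:", np.flatnonzero(t2 > limit))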

  1. Visual enhancement of images of natural resources: Applications in geology

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Neto, G.; Araujo, E. O.; Mascarenhas, N. D. A.; Desouza, R. C. M.

    1980-01-01

    The principal components technique for use in multispectral scanner LANDSAT data processing results in optimum dimensionality reduction. A powerful tool for MSS image enhancement, the method provides a maximum impression of terrain ruggedness; this makes the technique well suited for geological analysis.

  2. A Fast and Sensitive New Satellite SO2 Retrieval Algorithm based on Principal Component Analysis: Application to the Ozone Monitoring Instrument

    NASA Technical Reports Server (NTRS)

    Li, Can; Joiner, Joanna; Krotkov, A.; Bhartia, Pawan K.

    2013-01-01

    We describe a new algorithm to retrieve SO2 from satellite-measured hyperspectral radiances. We employ the principal component analysis technique in regions with no significant SO2 to capture radiance variability caused by both physical processes (e.g., Rayleigh and Raman scattering and ozone absorption) and measurement artifacts. We use the resulting principal components and SO2 Jacobians calculated with a radiative transfer model to directly estimate SO2 vertical column density in one step. Application to the Ozone Monitoring Instrument (OMI) radiance spectra in 310.5-340 nm demonstrates that this approach can greatly reduce biases in the operational OMI product and decrease the noise by a factor of 2, providing greater sensitivity to anthropogenic emissions. The new algorithm is fast, eliminates the need for instrument-specific radiance correction schemes, and can be easily adapted to other sensors. These attributes make it a promising technique for producing long-term, consistent SO2 records for air quality and climate research.
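
    The one-step fit can be sketched as a linear regression of each measured spectrum onto background principal components plus the SO2 Jacobian; the Jacobian coefficient is then the column estimate. Spectra and Jacobian below are synthetic assumptions, not OMI data.

      # Sketch of the one-step PCA retrieval: background PCs from SO2-free
      # scenes plus an SO2 Jacobian, solved jointly by least squares.
      import numpy as np

      rng = np.random.default_rng(8)
      n_wave, n_train = 120, 500
      background = rng.normal(size=(n_train, n_wave))        # SO2-free radiances

      mean_bg = background.mean(axis=0)
      _, _, vt = np.linalg.svd(background - mean_bg, full_matrices=False)
      pcs = vt[:8]                                           # background PCs

      jacobian = np.sin(np.linspace(0, 6 * np.pi, n_wave))   # fake dI/dSO2
      true_column = 1.7
      measured = mean_bg + true_column * jacobian + 0.3 * pcs[1]

      basis = np.vstack([pcs, jacobian]).T                   # (n_wave, 9)
      coeffs, *_ = np.linalg.lstsq(basis, measured - mean_bg, rcond=None)
      print(f"retrieved SO2 column: {coeffs[-1]:.2f} (true {true_column})")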

  3. Preliminary study on the effectiveness of short group cognitive behavioral therapy (GCBT) on Indonesian older adults.

    PubMed

    Utoyo, Dharmayati Bambang; Lubis, Dharmayati Utoyo; Jaya, Edo Sebastian; Arjadi, Retha; Hanum, Lathifah; Astri, Kresna; Putri, Maha Decha Dwi

    2013-01-01

    This research aims to develop evidence-based, affordable psychological therapy for Indonesian older adults. Affordable psychological therapy is important because virtually no managed care or health insurance covers psychological therapy in Indonesia. Multicomponent group cognitive behavior therapy (GCBT) was chosen as a starting point due to its extensive evidence base, short sessions, and success for a wide range of psychological problems. The group format was chosen to address both the economic and the cultural context of Indonesia. The developed treatment was then tested on psychological problems common in the older adult population (anxiety, chronic pain, depression, and insomnia). The treatment consists of 8 sessions, meeting twice a week for 2.5 hours. There are similarities and differences among the techniques used in the treatment of the different psychological problems. The final participants were 38 older adults divided into treatment groups: 8 participants joined the anxiety treatment, 10 the chronic pain treatment, 10 the depression treatment, and 10 the insomnia treatment. The research design was pre-test/post-test with within-group analysis. We used a principal outcome measure specific to each treatment group, as well as additional outcome measures. Overall, the results show statistically significant change with large effect sizes for the principal outcome measures. The results for the additional measures vary from slight improvement with small effect sizes to statistically significant improvement with large effect sizes. These results indicate that short multicomponent GCBT is effective in alleviating various common psychological problems in Indonesian older adults, and that multicomponent GCBT may therefore be a good starting point for developing effective and affordable psychological therapy for this population. Lastly, the results add to the accumulating body of evidence on the effectiveness of multicomponent GCBT outside the Western context.

  5. Detection and Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of GPS Time Series

    NASA Astrophysics Data System (ADS)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2014-12-01

    A critical point in the analysis of ground displacement time series is the development of data-driven methods that make it possible to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of principal components, but it also has some deficiencies. Indeed, PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem, i.e., in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. The uncorrelatedness condition is usually not strong enough, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. First, we use vbICA on synthetic data that simulate a seismic cycle (interseismic + coseismic + postseismic + seasonal + noise) and study the ability of the algorithm to recover the original (known) sources of deformation. Secondly, we apply vbICA to different tectonically active scenarios, such as earthquakes in central and northern Italy, as well as slow slip events in Cascadia.
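
    The BSS step can be illustrated with FastICA from scikit-learn; note that the paper's vbICA models source pdfs as Gaussian mixtures and is not reproduced here. The synthetic "GPS" series below mix a seasonal term with a postseismic-like transient.

      # Illustration of blind source separation on synthetic GPS-like series
      # using FastICA (not the paper's variational Bayesian ICA).
      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(9)
      t = np.linspace(0, 4, 400)                         # years
      seasonal = np.sin(2 * np.pi * t)
      transient = np.where(t > 2, 1 - np.exp(-(t - 2) / 0.3), 0.0)
      S = np.column_stack([seasonal, transient])

      mixing = rng.normal(size=(2, 20))                  # 20 GPS components
      X = S @ mixing + 0.05 * rng.normal(size=(400, 20))

      ica = FastICA(n_components=2, random_state=0)
      recovered = ica.fit_transform(X)                   # estimated sources
      print("recovered sources shape:", recovered.shape)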

  6. Biomedical visual data analysis to build an intelligent diagnostic decision support system in medical genetics.

    PubMed

    Kuru, Kaya; Niranjan, Mahesan; Tunca, Yusuf; Osvank, Erhan; Azim, Tayyaba

    2014-10-01

    In general, medical geneticists aim to pre-diagnose underlying syndromes based on facial features before performing cytological or molecular analyses where a genotype-phenotype interrelation is possible. However, determining correct genotype-phenotype interrelationships among many syndromes is tedious and labor-intensive, especially for extremely rare syndromes. Thus, a computer-aided system for pre-diagnosis can facilitate effective and efficient decision support, particularly when few similar cases are available, or in remote rural districts where diagnostic knowledge of syndromes is not readily available. The proposed methodology, a visual diagnostic decision support system (visual diagnostic DSS), employs machine learning (ML) algorithms and digital image processing techniques in a hybrid approach for automated diagnosis in medical genetics. This approach uses facial features in reference images of disorders to identify visual genotype-phenotype interrelationships. Our statistical method describes facial image data as principal component features and diagnoses syndromes using these features. The proposed system was trained using a real dataset of previously published face images of subjects with syndromes, which provided accurate diagnostic information. The method was tested using a leave-one-out cross-validation scheme with 15 different syndromes, each comprising 5-9 cases, i.e., 92 cases in total. An accuracy rate of 83% was achieved using this automated diagnosis technique, which was statistically significant (p<0.01). Furthermore, the sensitivity and specificity values were 0.857 and 0.870, respectively. Our results show that the accurate classification of syndromes is feasible using ML techniques. Thus, a large number of syndromes with characteristic facial anomaly patterns could be diagnosed with diagnostic DSSs similar to that described in the present study, thereby demonstrating the benefits of using hybrid image processing and ML-based computer-aided diagnostics for identifying facial phenotypes. Copyright © 2014. Published by Elsevier B.V.

  7. A study of machine learning regression methods for major elemental analysis of rocks using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Boucher, Thomas F.; Ozanne, Marie V.; Carmosino, Marco L.; Dyar, M. Darby; Mahadevan, Sridhar; Breves, Elly A.; Lepore, Kate H.; Clegg, Samuel M.

    2015-05-01

    The ChemCam instrument on the Mars Curiosity rover is generating thousands of LIBS spectra and bringing interest in this technique to public attention. The key to interpreting Mars or any other type of LIBS data is calibrations that relate laboratory standards to unknowns examined in other settings and enable predictions of chemical composition. Here, LIBS spectral data are analyzed using linear regression methods including partial least squares (PLS-1 and PLS-2), principal component regression (PCR), least absolute shrinkage and selection operator (lasso), elastic net, and linear support vector regression (SVR-Lin). These were compared against results from nonlinear regression methods including kernel principal component regression (K-PCR), polynomial kernel support vector regression (SVR-Py) and k-nearest neighbor (kNN) regression to discern the most effective models for interpreting chemical abundances from LIBS spectra of geological samples. The results were evaluated for 100 samples analyzed with 50 laser pulses at each of five locations averaged together. Wilcoxon signed-rank tests were employed to evaluate the statistical significance of differences among the nine models using their predicted residual sum of squares (PRESS) to make comparisons. For MgO, SiO2, Fe2O3, CaO, and MnO, the sparse models outperform all the others except for linear SVR, while for Na2O, K2O, TiO2, and P2O5, the sparse methods produce inferior results, likely because their emission lines in this energy range have lower transition probabilities. The strong performance of the sparse methods in this study suggests that use of dimensionality-reduction techniques as a preprocessing step may improve the performance of the linear models. Nonlinear methods tend to overfit the data and predict less accurately, while the linear methods proved to be more generalizable with better predictive performance. These results are attributed to the high dimensionality of the data (6144 channels) relative to the small number of samples studied. The best-performing models were SVR-Lin for SiO2, MgO, Fe2O3, and Na2O, lasso for Al2O3, elastic net for MnO, and PLS-1 for CaO, TiO2, and K2O. Although these differences in model performance between methods were identified, most of the models produce comparable results when p ≤ 0.05, and all techniques except kNN produced statistically indistinguishable results. It is likely that a combination of models could be used together to yield a lower total error of prediction, depending on the requirements of the user.
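
    A cross-validated comparison of this kind is straightforward to set up. The sketch below uses synthetic LIBS-like data (many channels, few samples) and only three of the paper's nine methods; model hyperparameters are illustrative assumptions.

      # Sketch of the model comparison: PLS, lasso and linear SVR scored by
      # cross-validated RMSE on synthetic high-dimensional spectra.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.linear_model import Lasso
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import LinearSVR

      rng = np.random.default_rng(10)
      n, channels = 100, 2000                       # samples vs. spectral channels
      X = rng.normal(size=(n, channels))
      w = np.zeros(channels); w[::200] = 1.0        # few informative lines
      y = X @ w + 0.1 * rng.normal(size=n)          # e.g., a wt% oxide abundance

      models = {
          "PLS-1": PLSRegression(n_components=10),
          "lasso": make_pipeline(StandardScaler(), Lasso(alpha=0.05, max_iter=5000)),
          "SVR-Lin": make_pipeline(StandardScaler(), LinearSVR(C=1.0, max_iter=5000)),
      }
      for name, model in models.items():
          rmse = -cross_val_score(model, X, y, cv=5,
                                  scoring="neg_root_mean_squared_error").mean()
          print(f"{name}: CV RMSE = {rmse:.3f}")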

  8. Pattern Discovery in Biomolecular Data – Tools, Techniques, and Applications | Center for Cancer Research

    Cancer.gov

    Finding patterns in biomolecular data, particularly in DNA and RNA, is at the center of modern biological research. These data are complex and growing rapidly, so the search for patterns requires increasingly sophisticated computer methods. This book provides a summary of principal techniques. Each chapter describes techniques that are drawn from many fields, including graph

  9. The U.S. geological survey rass-statpac system for management and statistical reduction of geochemical data

    USGS Publications Warehouse

    VanTrump, G.; Miesch, A.T.

    1977-01-01

    RASS is an acronym for Rock Analysis Storage System and STATPAC, for Statistical Package. The RASS and STATPAC computer programs are integrated into the RASS-STATPAC system for the management and statistical reduction of geochemical data. The system, in its present form, has been in use for more than 9 yr by scores of U.S. Geological Survey geologists, geochemists, and other scientists engaged in a broad range of geologic and geochemical investigations. The principal advantage of the system is the flexibility afforded the user both in data searches and retrievals and in the manner of statistical treatment of data. The statistical programs provide for most types of statistical reduction normally used in geochemistry and petrology, but also contain bridges to other program systems for statistical processing and automatic plotting. © 1977.

  10. Potential effect of diaper and cotton ball contamination on NMR- and LC/MS-based metabonomics studies of urine from newborn babies.

    PubMed

    Goodpaster, Aaron M; Ramadas, Eshwar H; Kennedy, Michael A

    2011-02-01

    Nuclear magnetic resonance (NMR) and liquid chromatography/mass spectrometry (LC/MS) based metabonomics screening of urine has great potential for discovery of biomarkers for diseases that afflict newborn and preterm infants. However, urine collection from newborn infants presents a potential confounding problem due to the possibility that contaminants might leach from materials used for urine collection and influence statistical analysis of metabonomics data. In this manuscript, we have analyzed diaper and cotton ball contamination using synthetic urine to assess its potential to influence the outcome of NMR- and LC/MS-based metabonomics studies of human infant urine. Eight diaper brands were examined using the "diaper plus cotton ball" technique. Data were analyzed using conventional principal components analysis, as well as a statistical significance algorithm developed for, and applied to, NMR data. Results showed most diaper brands had distinct contaminant profiles that could potentially influence NMR- and LC/MS-based metabonomics studies. On the basis of this study, it is recommended that diaper and cotton ball brands be characterized using metabonomics methodologies prior to initiating a metabonomics study to ensure that contaminant profiles are minimal or manageable and that the same diaper and cotton ball brands be used throughout a study to minimize variation.

  11. Color quality of pigments in cochineals (Dactylopius coccus Costa). Geographical origin characterization using multivariate statistical analysis.

    PubMed

    Méndez, Jesús; González, Mónica; Lobo, M Gloria; Carnero, Aurelio

    2004-03-10

    The commercial value of a cochineal (Dactylopius coccus Costa) sample is associated with its color quality. Because the cochineal is a legal food colorant, its color quality is generally understood as its pigment content; simply put, the higher this content, the more valuable the sample is to the market. In an effort to devise a way to measure the color quality of a cochineal, the present study evaluates different parameters of color measurement such as the chromatic attributes (L* and a*), percentage of carminic acid, tint determination, and the chromatographic profile of pigments. Tint determination did not achieve this objective because this parameter does not correlate with carminic acid content. On the other hand, carminic acid showed a highly significant correlation (r = -0.922, p = 0.000) with L* values determined from powdered cochineal samples. Combining the information from the spectrophotometric determination of carminic acid with the pigment profile acquired by liquid chromatography (LC), and with the composition of the red and yellow pigment groups, also acquired by LC, enables greater accuracy in judging the quality of the final sample. As a result of this study, it was possible to separate cochineal samples according to geographical origin using two statistical techniques: cluster analysis and principal component analysis.

  12. Accurate Structural Correlations from Maximum Likelihood Superpositions

    PubMed Central

    Theobald, Douglas L; Wuttke, Deborah S

    2008-01-01

    The cores of globular proteins are densely packed, resulting in complicated networks of structural interactions. These interactions in turn give rise to dynamic structural correlations over a wide range of time scales. Accurate analysis of these complex correlations is crucial for understanding biomolecular mechanisms and for relating structure to function. Here we report a highly accurate technique for inferring the major modes of structural correlation in macromolecules using likelihood-based statistical analysis of sets of structures. This method is generally applicable to any ensemble of related molecules, including families of nuclear magnetic resonance (NMR) models, different crystal forms of a protein, and structural alignments of homologous proteins, as well as molecular dynamics trajectories. Dominant modes of structural correlation are determined using principal components analysis (PCA) of the maximum likelihood estimate of the correlation matrix. The correlations we identify are inherently independent of the statistical uncertainty and dynamic heterogeneity associated with the structural coordinates. We additionally present an easily interpretable method (“PCA plots”) for displaying these positional correlations by color-coding them onto a macromolecular structure. Maximum likelihood PCA of structural superpositions, and the structural PCA plots that illustrate the results, will facilitate the accurate determination of dynamic structural correlations analyzed in diverse fields of structural biology. PMID:18282091

  13. Fault Detection and Diagnosis In Hall-Héroult Cells Based on Individual Anode Current Measurements Using Dynamic Kernel PCA

    NASA Astrophysics Data System (ADS)

    Yao, Yuchen; Bao, Jie; Skyllas-Kazacos, Maria; Welch, Barry J.; Akhmetov, Sergey

    2018-04-01

    Individual anode current signals in aluminum reduction cells provide localized cell conditions in the vicinity of each anode, which contain more information than the conventionally measured cell voltage and line current. One common use of this measurement is to identify process faults that can cause significant changes in the anode current signals. While this method is simple and direct, it ignores the interactions between anode currents and other important process variables. This paper presents an approach that applies multivariate statistical analysis techniques to individual anode currents and other process operating data, for the detection and diagnosis of local process abnormalities in aluminum reduction cells. Specifically, since the Hall-Héroult process is time-varying with its process variables dynamically and nonlinearly correlated, dynamic kernel principal component analysis with moving windows is used. The cell is discretized into a number of subsystems, with each subsystem representing one anode and cell conditions in its vicinity. The fault associated with each subsystem is identified based on multivariate statistical control charts. The results show that the proposed approach is able to not only effectively pinpoint the problematic areas in the cell, but also assess the effect of the fault on different parts of the cell.
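
    A compressed sketch of the dynamic kernel PCA monitoring idea follows: stack time-lagged copies of the process variables, fit kernel PCA on a window of normal operation, and track a T²-like statistic on the kernel scores. Control limits, moving-window updating and fault isolation logic are omitted, and all parameters are illustrative assumptions.

      # Sketch of dynamic kernel PCA monitoring on synthetic anode-current data.
      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.preprocessing import StandardScaler

      def lagged(X, lags=2):
          """Augment each row with `lags` previous rows (dynamic extension)."""
          rows = [X[lags - k : len(X) - k] for k in range(lags + 1)]
          return np.hstack(rows)

      rng = np.random.default_rng(11)
      normal = rng.normal(size=(300, 6))             # anode currents, etc.
      faulty = normal.copy(); faulty[200:, 0] += 3.0 # step fault on one anode

      window = lagged(normal, lags=2)
      scaler = StandardScaler().fit(window)
      kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1)
      train_scores = kpca.fit_transform(scaler.transform(window))
      var = train_scores.var(axis=0)

      test_scores = kpca.transform(scaler.transform(lagged(faulty, lags=2)))
      t2 = ((test_scores ** 2) / var).sum(axis=1)    # T^2-like statistic
      print("max T2 before/after fault:", t2[:150].max(), t2[200:].max())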

  14. Spatial and temporal variation of water quality of a segment of Marikina River using multivariate statistical methods.

    PubMed

    Chounlamany, Vanseng; Tanchuling, Maria Antonia; Inoue, Takanobu

    2017-09-01

    Payatas landfill in Quezon City, Philippines, releases leachate to the Marikina River through a creek. Multivariate statistical techniques were applied to study temporal and spatial variations in water quality of a segment of the Marikina River. The data set included 12 physico-chemical parameters for five monitoring stations over a year. Cluster analysis grouped the monitoring stations into four clusters and identified January-May as the dry season and June-September as the wet season. Principal components analysis showed that three latent factors are responsible for the data set, explaining 83% of its total variance. The chemical oxygen demand, biochemical oxygen demand, total dissolved solids, Cl⁻ and PO₄³⁻ are influenced by anthropogenic impact/eutrophication pollution from point sources. Total suspended solids, turbidity and SO₄²⁻ are influenced by rain and soil erosion. The highest state of pollution is at the Payatas creek outfall from March to May, whereas at downstream stations it is in May. The current study indicates that the river monitoring requires only four stations, nine water quality parameters and testing over three specific months of the year. The findings of this study imply that Payatas landfill requires a proper leachate collection and treatment system to reduce its impact on the Marikina River.

  15. Benthic foraminifera and trace element distribution: a case-study from the heavily polluted lagoon of Venice (Italy).

    PubMed

    Coccioni, Rodolfo; Frontalini, Fabrizio; Marsili, Andrea; Mana, Davide

    2009-01-01

    Living benthic foraminiferal assemblages were studied in surface samples collected from the lagoon of Venice (Italy) in order to investigate the relationship between these sensitive microorganisms and trace element pollution. Geochemical analysis of sediments shows that the lagoon is affected by trace element pollution (Cd, Cu, Ni, Pb, Zn and Hg), with the highest concentrations in its inner part, which corresponds to the Porto Marghera industrial area. The biocenoses are largely dominated by Ammonia tepida, Haynesina germanica and Cribroelphidium oceanensis and, subordinately, by Aubignyna perlucida, Ammonia parkinsoniana and Bolivina striatula. Biotic and abiotic factors were statistically analyzed with the multivariate techniques of cluster analysis and principal component analysis. The statistical analysis reveals a strong relationship between trace elements (in particular Mn, Pb and Hg) and the occurrence of abnormalities in foraminiferal tests. Remarkably, greater proportions of abnormal specimens are usually found at stations located close to the most heavily polluted industrial zone of Porto Marghera. This paper shows that benthic foraminifera can be used as useful and relatively speedy and inexpensive bio-indicators in monitoring the health quality of the lagoon of Venice. It also provides a basis for future investigations aimed at unraveling the benthic foraminiferal response to human-induced pollution in marine and transitional marine environments.

  16. New generation of hydraulic pedotransfer functions for Europe

    PubMed Central

    Tóth, B; Weynants, M; Nemes, A; Makó, A; Bilas, G; Tóth, G

    2015-01-01

    A range of continental-scale soil datasets exists in Europe with different spatial representation and based on different principles. We developed comprehensive pedotransfer functions (PTFs) for applications principally on spatial datasets with continental coverage. The PTF development included the prediction of soil water retention at various matric potentials and prediction of parameters to characterize soil moisture retention and the hydraulic conductivity curve (MRC and HCC) of European soils. We developed PTFs with a hierarchical approach, determined by the input requirements. The PTFs were derived by using three statistical methods: (i) linear regression where there were quantitative input variables, (ii) a regression tree for qualitative, quantitative and mixed types of information and (iii) mean statistics of developer-defined soil groups (class PTF) when only qualitative input parameters were available. Data of the recently established European Hydropedological Data Inventory (EU-HYDI), which holds the most comprehensive geographical and thematic coverage of hydro-pedological data in Europe, were used to train and test the PTFs. The applied modelling techniques and the EU-HYDI allowed the development of hydraulic PTFs that are more reliable and applicable for a greater variety of input parameters than those previously available for Europe. Therefore the new set of PTFs offers tailored advanced tools for a wide range of applications in the continent. PMID:25866465

  17. Statistical downscaling modeling with quantile regression using lasso to estimate extreme rainfall

    NASA Astrophysics Data System (ADS)

    Santri, Dewi; Wigena, Aji Hamim; Djuraidah, Anik

    2016-02-01

    Rainfall is one of the climatic elements with high diversity, and extreme rainfall in particular has many negative impacts, so methods are needed to minimize the damage that may occur. So far, global circulation models (GCMs) are the best tools for forecasting global climate change, including extreme rainfall. Statistical downscaling (SD) is a technique for developing the relationship between GCM output, as global-scale independent variables, and rainfall, as a local-scale response variable. GCM output is difficult to use directly against observations because it is high dimensional and its variables are multicollinear. Common methods used to handle this problem are principal components analysis (PCA) and partial least squares regression; a newer alternative is the lasso. The lasso has the advantage of simultaneously controlling the variance of the fitted coefficients and performing automatic variable selection. Quantile regression is a method that can be used to detect extreme rainfall at the dry and wet extremes. The objective of this study is to model SD using quantile regression with the lasso to predict extreme rainfall in Indramayu. The results showed that extreme rainfall (extreme wet in January, February and December) in Indramayu could be predicted properly by the model at the 90th quantile.
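
    L1-penalized quantile regression at the 90th percentile can be sketched with scikit-learn's QuantileRegressor, which minimizes the pinball loss with an L1 penalty (a lasso-style quantile model). The simulated GCM predictors and penalty strength below are assumptions.

      # Sketch of lasso-penalized quantile regression for extreme rainfall.
      import numpy as np
      from sklearn.linear_model import QuantileRegressor

      rng = np.random.default_rng(12)
      n, p = 240, 25                               # months x GCM grid predictors
      X = rng.normal(size=(n, p))
      rain = 5 + X[:, 0] + 0.5 * X[:, 1] + rng.gumbel(0, 2, size=n)

      model = QuantileRegressor(quantile=0.9, alpha=0.1, solver="highs").fit(X, rain)
      selected = np.flatnonzero(np.abs(model.coef_) > 1e-8)
      print("predictors kept by the lasso penalty:", selected)
      print("predicted 90th-percentile rainfall (first month):",
            round(model.predict(X[:1])[0], 2))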

  18. [The social vulnerability index regarding Medellín's disabled population].

    PubMed

    Cardona-Arango, Doris; Agudelo-Martínez, Alejandra; Restrepo-Molina, Lucas; Segura-Cardona, Angela M

    2014-01-01

    Constructing a social vulnerability index (SVI) for Medellín's disabled population during 2008 was aimed at determining the areas reducing this population's opportunities to use its tangible and intangible assets, thereby impairing quality of life. This descriptive cross-sectional study drew on a secondary source of information regarding people having some kind of limitation recorded in the 2008 Quality of Life Survey. Physical, human and social variables were grouped when constructing the SVI; principal component analysis was used to determine degrees of vulnerability, defined by the number of negative factors identified (high category = 4 or 5, medium = 2 or 3 and low = 1 or none). Such classification led to identifying non-causal relationships with demographic variables through Mann-Whitney, Chi-square and Kruskal-Wallis tests (5.0% statistical significance level); multinomial logistic regression was used to calculate adjusted epidemiological measures, such as odds ratios and confidence intervals. A degree of medium vulnerability predominated among disabled people living in Medellín (60.3%), followed by low vulnerability (28.7%) and high vulnerability (11.0%). The proposed SVI classified the city's communes according to high, medium or low vulnerability, supported by the use of statistical and spatial location techniques.

  19. Principals' Values in School Administration

    ERIC Educational Resources Information Center

    Aslanargun, Engin

    2012-01-01

    School administration is value driven area depending on the emotions, cultures, and human values as well as technique and structure. Over the long years, educational administration throughout the world have experienced the influence of logical positivism that is based on rational techniques more than philosophical consideration, ignored values and…

  20. 42 CFR 476.74 - General requirements for the assumption of review.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... inspection at its principal business office— (1) A copy of each agreement with Medicare fiscal intermediaries... by CMS, a QIO is responsible for compiling statistics based on the criteria contained in § 405.332 of...

  1. Online signature recognition using principal component analysis and artificial neural network

    NASA Astrophysics Data System (ADS)

    Hwang, Seung-Jun; Park, Seung-Je; Baek, Joong-Hwan

    2016-12-01

    In this paper, we propose an algorithm for on-line signature recognition using the fingertip position in the air, taken from depth images acquired by a Kinect. We extract 10 statistical features from each of the X, Y and Z axes, which are invariant to shifting and scaling of the signature trajectories in three-dimensional space. An artificial neural network is adopted to solve the complex signature classification problem. The 30-dimensional features are converted into 10 principal components using principal component analysis, which retain 99.02% of the total variance. We implement the proposed algorithm and test it on actual on-line signatures. In experiments, the proposed method successfully classified 15 different on-line signatures, achieving a recognition rate of 98.47% using only the 10 principal components.

  2. Psychometric evaluation of the Persian version of the Templer's Death Anxiety Scale in cancer patients.

    PubMed

    Soleimani, Mohammad Ali; Yaghoobzadeh, Ameneh; Bahrami, Nasim; Sharif, Saeed Pahlevan; Sharif Nia, Hamid

    2016-10-01

    In this study, 398 Iranian cancer patients completed the 15-item Templer's Death Anxiety Scale (TDAS). Tests of internal consistency, principal components analysis, and confirmatory factor analysis were conducted to assess the internal consistency and factorial validity of the Persian TDAS. The construct reliability statistic and average variance extracted were also calculated to measure construct reliability, convergent validity, and discriminant validity. Principal components analysis indicated a 3-component solution, which was generally supported in the confirmatory analysis. However, acceptable cutoffs for construct reliability, convergent validity, and discriminant validity were not fulfilled for the three subscales that were derived from the principal component analysis. This study demonstrated both the advantages and potential limitations of using the TDAS with Persian-speaking cancer patients.
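
    For readers unfamiliar with these psychometric statistics, the sketch below computes Cronbach's alpha (internal consistency) and a 3-component PCA for a 15-item scale; the response matrix is simulated, since the Persian TDAS data are not public, and the alpha formula is the standard k/(k-1) * (1 - sum of item variances / variance of total score).

        import numpy as np
        from sklearn.decomposition import PCA

        def cronbach_alpha(items):
            # items: (n_respondents, n_items) matrix of Likert-type responses
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        rng = np.random.default_rng(2)
        responses = rng.integers(1, 6, size=(398, 15)).astype(float)  # 398 patients x 15 items
        print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")

        pca = PCA(n_components=3).fit(responses)   # 3-component solution, as in the record
        print("variance explained:", pca.explained_variance_ratio_.round(3))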

  3. Survey of statistical techniques used in validation studies of air pollution prediction models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bornstein, R D; Anderson, S F

    1979-03-01

    Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.

  4. Portable XRF and principal component analysis for bill characterization in forensic science.

    PubMed

    Appoloni, C R; Melquiades, F L

    2014-02-01

    Several modern techniques have been applied to prevent counterfeiting of money bills. The objective of this study was to demonstrate the potential of the Portable X-ray Fluorescence (PXRF) technique and the multivariate analysis method of Principal Component Analysis (PCA) for the classification of bills for use in forensic science. Bills of Dollar, Euro and Real (Brazilian currency) were measured directly at different colored regions, without any previous preparation. Spectra interpretation allowed the identification of Ca, Ti, Fe, Cu, Sr, Y, Zr and Pb. PCA separated the bills into three groups and into subgroups among the Brazilian currency. In conclusion, the samples were classified according to their origin, identifying the elements responsible for the differentiation and the basic pigment composition. PXRF combined with multivariate discriminant methods is a promising technique for rapid and nondestructive identification of false bills in forensic science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. School food policies and practices: a state-wide survey of secondary school principals.

    PubMed

    French, Simone A; Story, Mary; Fulkerson, Jayne A

    2002-12-01

    To describe food-related policies and practices in secondary schools in Minnesota. Mailed anonymous survey including questions about the secondary school food environment and food-related practices and policies. Members of a statewide professional organization for secondary school principals (n = 610; response rate: 463/610 = 75%). Of the 463 surveys returned, 336 met the eligibility criteria (current position was either principal or assistant principal and school included at least one of the grades 9 through 12). Descriptive statistics examined the prevalence of specific policies and practices. Chi-square analysis examined associations between policies and practices and school variables. Among principals, 65% believed it was important to have a nutrition policy for the high school; however, only 32% reported a policy at their school. Principals reported positive attitudes about providing a healthful school food environment, but 98% of the schools had soft drink vending machines and 77% had contracts with soft drink companies. Food sold at school fundraisers was most often candy, fruit, and cookies. Dietetics professionals who work in secondary school settings should collaborate with other key school staff members and parents to develop and implement a comprehensive school nutrition policy. Such a policy could foster a school food environment that is supportive of healthful food choices among youth.

  6. Simultaneous spectrophotometric determination of glimepiride and pioglitazone in binary mixture and combined dosage form using chemometric-assisted techniques

    NASA Astrophysics Data System (ADS)

    El-Zaher, Asmaa A.; Elkady, Ehab F.; Elwy, Hanan M.; Saleh, Mahmoud Abo El Makarim

    2017-07-01

    In the present work, pioglitazone and glimepiride, two widely used antidiabetics, were simultaneously determined by a chemometric-assisted UV-spectrophotometric method applied to a binary synthetic mixture and a pharmaceutical preparation containing both drugs. Three chemometric techniques - concentration residual augmented classical least-squares (CRACLS), principal component regression (PCR), and partial least-squares (PLS) - were implemented using synthetic mixtures containing the two drugs in acetonitrile. The absorbance data matrix corresponding to the concentration data matrix was obtained by measuring absorbances between 215 and 235 nm at intervals of Δλ = 0.4 nm in the zero-order spectra. Calibration or regression models were then built from the absorbance and concentration data matrices to predict the unknown concentrations of pioglitazone and glimepiride in their mixtures. The described techniques were validated by analyzing synthetic mixtures containing the two drugs, showing good mean recovery values between 98 and 100%. In addition, the accuracy and precision of the three methods were assured by recovery values between 98 and 102% and R.S.D.% <0.6 for intra-day precision and <1.2 for inter-day precision. The proposed chemometric techniques were successfully applied to a pharmaceutical preparation containing a combination of pioglitazone and glimepiride in the ratio of 30:4, showing good recovery values. Finally, statistical analysis was carried out to add value to the verification of the proposed methods, both by an intrinsic comparison between the three chemometric techniques and by comparing values from the present methods with those obtained by implementing reference pharmacopeial methods for each of pioglitazone and glimepiride.
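
    The two regression-style calibrations named in this record, PCR and PLS, can be sketched as follows with scikit-learn; the absorbance matrix A (51 wavelengths from 215 to 235 nm at 0.4 nm steps) and the two-column concentration matrix C are random placeholders, and CRACLS is omitted because it has no standard library implementation.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(3)
        A = rng.random(size=(40, 51))    # 40 calibration mixtures x 51 absorbances
        C = rng.random(size=(40, 2))     # concentrations: pioglitazone, glimepiride

        pcr = make_pipeline(PCA(n_components=3), LinearRegression()).fit(A, C)
        pls = PLSRegression(n_components=3).fit(A, C)

        unknown = rng.random(size=(1, 51))           # spectrum of an unknown mixture
        print("PCR estimate:", pcr.predict(unknown))
        print("PLS estimate:", pls.predict(unknown))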

  7. Estimating the palliative effect of percutaneous endoscopic gastrostomy in an observational registry using principal stratification and generalized propensity scores

    PubMed Central

    Mishra-Kalyani, Pallavi S.; Johnson, Brent A.; Glass, Jonathan D.; Long, Qi

    2016-01-01

    Clinical disease registries offer a rich collection of valuable patient information but also pose challenges that require special care and attention in statistical analyses. The goal of this paper is to propose a statistical framework that allows for estimating the effect of surgical insertion of a percutaneous endoscopic gastrostomy (PEG) tube for patients living with amyotrophic lateral sclerosis (ALS) using data from a clinical registry. Although all ALS patients are informed about PEG, only some patients agree to the procedure, which leads to the potential for selection bias. Assessing the effect of PEG is further complicated by the aggressively fatal disease, such that time to death competes directly with both the opportunity to receive PEG and clinical outcome measurements. Our proposed methodology handles the “censoring by death” phenomenon through principal stratification and selection bias for PEG treatment through generalized propensity scores. We develop a fully Bayesian modeling approach to estimate the survivor average causal effect (SACE) of PEG on BMI, a surrogate outcome measure of nutrition and quality of life. The use of propensity score methods within the principal stratification framework demonstrates a significant and positive effect of PEG treatment, particularly when time of treatment is included in the treatment definition. PMID:27640365
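
    The full Bayesian principal-stratification model in this record is beyond a short example, but its propensity-score ingredient can be illustrated in a few lines: below, treatment probabilities are estimated with a logistic regression and used as inverse-probability weights for a naive weighted BMI comparison. All variables are simulated placeholders, and this is emphatically not the authors' SACE estimator.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)
        X = rng.normal(size=(300, 4))              # baseline covariates (illustrative)
        peg = rng.integers(0, 2, size=300)         # 1 = received PEG, 0 = did not
        bmi = rng.normal(22.0, 3.0, size=300)      # outcome surrogate

        ps = LogisticRegression(max_iter=1000).fit(X, peg).predict_proba(X)[:, 1]
        w = np.where(peg == 1, 1.0 / ps, 1.0 / (1.0 - ps))   # inverse-probability weights
        effect = (np.average(bmi[peg == 1], weights=w[peg == 1])
                  - np.average(bmi[peg == 0], weights=w[peg == 0]))
        print(f"weighted BMI difference: {effect:.2f}")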

  8. Estimating the palliative effect of percutaneous endoscopic gastrostomy in an observational registry using principal stratification and generalized propensity scores

    NASA Astrophysics Data System (ADS)

    Mishra-Kalyani, Pallavi S.; Johnson, Brent A.; Glass, Jonathan D.; Long, Qi

    2016-09-01

    Clinical disease registries offer a rich collection of valuable patient information but also pose challenges that require special care and attention in statistical analyses. The goal of this paper is to propose a statistical framework that allows for estimating the effect of surgical insertion of a percutaneous endoscopic gastrostomy (PEG) tube for patients living with amyotrophic lateral sclerosis (ALS) using data from a clinical registry. Although all ALS patients are informed about PEG, only some patients agree to the procedure, which leads to the potential for selection bias. Assessing the effect of PEG is further complicated by the aggressively fatal disease, such that time to death competes directly with both the opportunity to receive PEG and clinical outcome measurements. Our proposed methodology handles the “censoring by death” phenomenon through principal stratification and selection bias for PEG treatment through generalized propensity scores. We develop a fully Bayesian modeling approach to estimate the survivor average causal effect (SACE) of PEG on BMI, a surrogate outcome measure of nutrition and quality of life. The use of propensity score methods within the principal stratification framework demonstrates a significant and positive effect of PEG treatment, particularly when time of treatment is included in the treatment definition.

  9. Covariate selection with iterative principal component analysis for predicting physical

    USDA-ARS?s Scientific Manuscript database

    Local and regional soil data can be improved by coupling new digital soil mapping techniques with high resolution remote sensing products to quantify both spatial and absolute variation of soil properties. The objective of this research was to advance data-driven digital soil mapping techniques for ...

  10. Technique and interpretation in tree seed radiography

    Treesearch

    Howard B. Kriebel

    1966-01-01

    The study of internal seed structure by radiography requires techniques which will give good definition. To establish the best procedures, we conducted a series of experiments in which we manipulated the principal controllable variables affecting the quality of X-radiographs: namely, focus-to-film distance, film speed (grain), exposure time, kilovoltage, and...

  11. Further Research Using a Psychological Diary Technique to Investigate Psychosomatic Relationships.

    ERIC Educational Resources Information Center

    Robbins, Paul R.; Tanck, Roland H.

    1982-01-01

    Reported further data using a psychological diary technique designed to monitor emotional states over time. The principal factors identified were interpersonal stress, depression-isolation, and physical complaints. Items in both the interpersonal stress and depression-isolation factors tended to be related positively to physical complaints…

  12. Nonmedical influences on medical decision making: an experimental technique using videotapes, factorial design, and survey sampling.

    PubMed Central

    Feldman, H A; McKinlay, J B; Potter, D A; Freund, K M; Burns, R B; Moskowitz, M A; Kasten, L E

    1997-01-01

    OBJECTIVE: To study nonmedical influences on the doctor-patient interaction. A technique using simulated patients and "real" doctors is described. DATA SOURCES: A random sample of physicians, stratified on such characteristics as demographics, specialty, or experience, and selected from commercial and professional listings. STUDY DESIGN: A medical appointment is depicted on videotape by professional actors. The patient's presenting complaint (e.g., chest pain) allows a range of valid interpretation. Several alternative versions are taped, featuring the same script with patient-actors of different age, sex, race, or other characteristics. Fractional factorial design is used to select a balanced subset of patient characteristics, reducing costs without biasing the outcome. DATA COLLECTION: Each physician is shown one version of the videotape appointment and is asked to describe how he or she would diagnose or treat such a patient. PRINCIPAL FINDINGS: Two studies using this technique have been completed to date, one involving chest pain and dyspnea and the other involving breast cancer. The factorial design provided sufficient power, despite limited sample size, to demonstrate with statistical significance various influences of the experimental and stratification variables, including the patient's gender and age and the physician's experience. Persistent recruitment produced a high response rate, minimizing selection bias and enhancing validity. CONCLUSION: These techniques permit us to determine, with a degree of control unattainable in observational studies, whether medical decisions as described by actual physicians and drawn from a demographic or professional group of interest, are influenced by a prescribed set of nonmedical factors. PMID:9240285

  13. Qualitative Assessments via Infrared Vision of Sub-surface Defects Present Beneath Decorative Surface Coatings

    NASA Astrophysics Data System (ADS)

    Sfarra, Stefano; Fernandes, Henrique C.; López, Fernando; Ibarra-Castanedo, Clemente; Zhang, Hai; Maldague, Xavier

    2018-01-01

    In this work, the potential of infrared vision for exploring sub-superficial defects in polychromatic statues was investigated. In particular, it was possible to understand how the reflector effect of the exterior golden layers could be minimized by applying advanced statistical algorithms to thermal images. Since this noble metal is present as an external coating in both artworks, an in-depth discussion concerning its physicochemical properties is also added. In this context, the principal component thermography technique and the more recent partial least squares thermography technique were used on three different datasets recorded using long thermal stimuli. The main images were compared both to phasegrams and to the thermographic signal reconstruction results in order to have a clear outline of the situation to be debated. Care was taken that view-factor effects on radiation transfer, linked to specular reflections from the surface, did not inadvertently highlight spurious features; to this end, the raw thermograms were analyzed one by one. Reflectograms were used to pinpoint emissivity variations due to, e.g., possible repainting. The paper concludes that, as can be understood from a physical point of view, the near-infrared reflectography technique is able to examine the state of conservation of the upper layers in cultural heritage objects, while the infrared thermography technique explores them more in depth. The thesis statement is based on the thermal and nonthermal parts of the infrared region, indicating what can be detected by heating the surface and what can be visualized by illuminating the surface, bearing in mind the nature of the external coating.
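
    Principal component thermography, one of the two techniques named above, is essentially a singular value decomposition of the mean-removed thermal image sequence; a minimal sketch under that reading, with a random stand-in for the recorded dataset, is shown below.

        import numpy as np

        def pct(frames, n_components=5):
            """frames: (n_time, height, width) thermal sequence -> component images."""
            n_t, h, w = frames.shape
            A = frames.reshape(n_t, h * w).astype(float)
            A -= A.mean(axis=0)                    # remove each pixel's temporal mean
            A /= A.std(axis=0) + 1e-12             # standardize each pixel's time series
            U, S, Vt = np.linalg.svd(A, full_matrices=False)
            return Vt[:n_components].reshape(n_components, h, w)

        frames = np.random.default_rng(5).normal(size=(200, 64, 64))  # placeholder recording
        eofs = pct(frames)              # empirical orthogonal function images
        print(eofs.shape)               # (5, 64, 64)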

  14. Statistical error model for a solar electric propulsion thrust subsystem

    NASA Technical Reports Server (NTRS)

    Bantell, M. H.

    1973-01-01

    The solar electric propulsion thrust subsystem statistical error model was developed as a tool for investigating the effects of thrust subsystem parameter uncertainties on navigation accuracy. The model is currently being used to evaluate the impact of electric engine parameter uncertainties on navigation system performance for a baseline mission to Encke's Comet in the 1980s. The data given represent the next generation in statistical error modeling for low-thrust applications. Principal improvements include the representation of thrust uncertainties and random process modeling in terms of random parametric variations in the thrust vector process for a multi-engine configuration.

  15. Built-in-Test Verification Techniques

    DTIC Science & Technology

    1987-02-01

    This report documents the results of the effort for the Rome Air Development Center Contract F30602-84-C-0021, BIT Verification Techniques. The work was...Richard Spillman of Spillman Research Associates. The principal investigators were Mike Partridge and subsequently Jeffrey Albert. The contract was...two-year effort to develop techniques for Built-In Test (BIT) verification. The objective of the contract was to develop specifications and technical

  16. Intelligent Transportation Systems (ITS) logical architecture : volume 3 : data dictionary

    DOT National Transportation Integrated Search

    1982-01-01

    A Guide to Reporting Highway Statistics is a principal part of Federal Highway Administration's comprehensive highway information collection effort. This Guide has two objectives: 1) To serve as a reference to the reporting system that the Federal Hi...

  17. Dimensionality of Community Satisfaction.

    ERIC Educational Resources Information Center

    Williams, R. Gary; Knop, Edward

    Using factor analysis (both principal factor solutions and rotated factor solutions) to search for underlying statistical commonality among 12 indicators of community satisfaction in 6 Colorado communities, the research explores various dimensions of community satisfaction that may be important across different communities. The communities under…

  18. Urban household energy use in Thailand

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tyler, S.R.

    Changes in household fuel and electricity use that accompany urbanization in Third World countries bear large economic and environmental costs. The processes driving the fuel transition, and the policy mechanisms by which it can be influenced, need to be better understood for the sake of forecasting and planning, especially in the case of electricity demand. This study examines patterns of household fuel use and electrical appliance utilization in Bangkok, Chieng Mai and Ayutthaya, Thailand, based on the results of a household energy survey. Survey data are statistically analyzed using a variety of multiple regression techniques to evaluate the relative influence of various household and fuel characteristics on fuel and appliance choice. Results suggest that changes to the value of women's time in urban households, as women become increasingly active in the labor force, have a major influence on patterns of household energy use. The use of the home for small-scale commercial activities, particularly food preparation, also has a significant influence on fuel choice. In general, household income does not prove to be an important factor in fuel and appliance selection in these cities, although income is closely related to total electricity use. The electricity use of individual household appliances is also analyzed using statistical techniques as well as limited direct metering. The technology of appliance production in Thailand is evaluated through interviews with manufacturers and comparisons of product performance. These data are used to develop policy recommendations for improving the efficiency of electrical appliances in Thailand by relying principally on the dynamism of the consumer goods market, rather than direct regulation. The annual electricity savings from the recommended program for fostering rapid adoption of efficient technologies are estimated to reach 1800 GWh by the year 2005 for urban households alone.

  19. Head-and-face shape variations of U.S. civilian workers

    PubMed Central

    Zhuang, Ziqing; Shu, Chang; Xi, Pengcheng; Bergman, Michael; Joseph, Michael

    2016-01-01

    The objective of this study was to quantify head-and-face shape variations of U.S. civilian workers using modern methods of shape analysis. The purpose of this study was based on previously highlighted changes in U.S. civilian worker head-and-face shape over the last few decades – touting the need for new and better fitting respirators – as well as the study's usefulness in designing more effective personal protective equipment (PPE) – specifically in the field of respirator design. The raw scan three-dimensional (3D) data for 1169 subjects were parameterized using geometry processing techniques. This process allowed the individual scans to be put in correspondence with each other in such a way that statistical shape analysis could be performed on a dense set of 3D points. This process also cleaned up the original scan data such that the noise was reduced and holes were filled in. The next step, statistical analysis of the variability of the head-and-face shape in the 3D database, was conducted using Principal Component Analysis (PCA) techniques. Through these analyses, it was shown that the space of the head-and-face shape was spanned by a small number of basis vectors. Less than 50 components explained more than 90% of the variability. Furthermore, the main mode of variations could be visualized through animating the shape changes along the PCA axes with computer software in executable form for Windows XP. The results from this study in turn could feed back into respirator design to achieve safer, more efficient product style and sizing. Future study is needed to determine the overall utility of the point cloud-based approach for the quantification of facial morphology variation and its relationship to respirator performance. PMID:23399025
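
    The dimensionality-reduction step reported here (fewer than 50 components for over 90% of the variability) corresponds to a cumulative explained-variance criterion on PCA of the registered point clouds; below is a sketch with synthetic data in place of the 1169 parameterized scans.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(6)
        # each row: one subject's registered scan, flattened to n_points * 3 coordinates
        scans = rng.normal(size=(1169, 3000))

        pca = PCA().fit(scans)
        cumulative = np.cumsum(pca.explained_variance_ratio_)
        k = int(np.searchsorted(cumulative, 0.90)) + 1   # components for 90% of variance
        print(f"{k} components explain 90% of the shape variability")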

  20. Head-and-face shape variations of U.S. civilian workers.

    PubMed

    Zhuang, Ziqing; Shu, Chang; Xi, Pengcheng; Bergman, Michael; Joseph, Michael

    2013-09-01

    The objective of this study was to quantify head-and-face shape variations of U.S. civilian workers using modern methods of shape analysis. The purpose of this study was based on previously highlighted changes in U.S. civilian worker head-and-face shape over the last few decades - touting the need for new and better fitting respirators - as well as the study's usefulness in designing more effective personal protective equipment (PPE) - specifically in the field of respirator design. The raw scan three-dimensional (3D) data for 1169 subjects were parameterized using geometry processing techniques. This process allowed the individual scans to be put in correspondence with each other in such a way that statistical shape analysis could be performed on a dense set of 3D points. This process also cleaned up the original scan data such that the noise was reduced and holes were filled in. The next step, statistical analysis of the variability of the head-and-face shape in the 3D database, was conducted using Principal Component Analysis (PCA) techniques. Through these analyses, it was shown that the space of the head-and-face shape was spanned by a small number of basis vectors. Less than 50 components explained more than 90% of the variability. Furthermore, the main mode of variations could be visualized through animating the shape changes along the PCA axes with computer software in executable form for Windows XP. The results from this study in turn could feed back into respirator design to achieve safer, more efficient product style and sizing. Future study is needed to determine the overall utility of the point cloud-based approach for the quantification of facial morphology variation and its relationship to respirator performance. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  1. A statistical model of catheter motion from interventional x-ray images: application to image-based gating

    NASA Astrophysics Data System (ADS)

    Panayiotou, M.; King, A. P.; Ma, Y.; Housden, R. J.; Rinaldi, C. A.; Gill, J.; Cooklin, M.; O'Neill, M.; Rhode, K. S.

    2013-11-01

    The motion and deformation of catheters that lie inside cardiac structures can provide valuable information about the motion of the heart. In this paper we describe the formation of a novel statistical model of the motion of a coronary sinus (CS) catheter based on principal component analysis of tracked electrode locations from standard mono-plane x-ray fluoroscopy images. We demonstrate the application of our model for retrospective cardiac and respiratory gating of normal dose x-ray fluoroscopy images, and demonstrate how a modification of the technique allows application to very low dose scenarios. We validated our method on ten mono-plane imaging sequences comprising a total of 610 frames from ten different patients undergoing radiofrequency ablation for the treatment of atrial fibrillation. For normal dose images we established systole, end-inspiration and end-expiration gating with success rates of 100%, 92.1% and 86.9%, respectively. For very low dose applications, the method was tested on the same ten mono-plane x-ray fluoroscopy sequences without noise and with added noise at signal to noise ratio (SNR) values of √50, √10, √8, √6, √5, √2 and √1 to simulate the image quality of increasingly lower dose x-ray images. The method was able to detect the CS catheter even in the lowest SNR images with median errors not exceeding 2.6 mm per electrode. Furthermore, gating success rates of 100%, 71.4% and 85.7% were achieved at the low SNR value of √2, representing a dose reduction of more than 25 times. Thus, the technique has the potential to extract useful information whilst substantially reducing the radiation exposure.

  2. Sequential projection pursuit for optimised vibration-based damage detection in an experimental wind turbine blade

    NASA Astrophysics Data System (ADS)

    Hoell, Simon; Omenzetter, Piotr

    2018-02-01

    To advance the concept of smart structures in large systems, such as wind turbines (WTs), it is desirable to be able to detect structural damage early while using minimal instrumentation. Data-driven vibration-based damage detection methods can be competitive in that respect because global vibrational responses encompass the entire structure. Multivariate damage sensitive features (DSFs) extracted from acceleration responses enable changes in a structure to be detected via statistical methods. However, even though such DSFs contain information about the structural state, they may not be optimised for the damage detection task. This paper addresses the shortcoming by exploring a DSF projection technique specialised for statistical structural damage detection. High dimensional initial DSFs are projected onto a low-dimensional space for improved damage detection performance and a simultaneous reduction of the computational burden. The technique is based on sequential projection pursuit, where the projection vectors are optimised one by one using an advanced evolutionary strategy. The approach is applied to laboratory experiments with a small-scale WT blade under wind-like excitations. Autocorrelation function coefficients calculated from acceleration signals are employed as DSFs. The optimal numbers of projection vectors are identified with the help of a fast forward selection procedure. To benchmark the proposed method, selections of original DSFs as well as principal component analysis scores from these features are additionally investigated. The optimised DSFs are tested for damage detection on previously unseen data from the healthy state and a wide range of damage scenarios. It is demonstrated that using selected subsets of the initial and transformed DSFs improves damage detectability compared to the full set of features. Furthermore, superior results can be achieved by projecting autocorrelation coefficients onto just a single optimised projection vector.

  3. Statistical modeling and MAP estimation for body fat quantification with MRI ratio imaging

    NASA Astrophysics Data System (ADS)

    Wong, Wilbur C. K.; Johnson, David H.; Wilson, David L.

    2008-03-01

    We are developing small animal imaging techniques to characterize the kinetics of lipid accumulation/reduction of fat depots in response to genetic/dietary factors associated with obesity and metabolic syndromes. Recently, we developed an MR ratio imaging technique that approximately yields lipid/(lipid + water). In this work, we develop a statistical model for the ratio distribution that explicitly includes a partial volume (PV) fraction of fat and a mixture of a Rician and multiple Gaussians. Monte Carlo hypothesis testing showed that our model was valid over a wide range of coefficient of variation of the denominator distribution (c.v.: 0-0.20) and correlation coefficient among the numerator and denominator (ρ: 0-0.95), which cover the typical values that we found in MRI data sets (c.v.: 0.027-0.063, ρ: 0.50-0.75). Then a maximum a posteriori (MAP) estimate for the fat percentage per voxel is proposed. Using a digital phantom with many PV voxels, we found that ratio values were not linearly related to PV fat content and that our method accurately described the histogram. In addition, the new method estimated the ground truth within +1.6% vs. +43% for an approach using an uncorrected ratio image, when we simply threshold the ratio image. On the six genetically obese rat data sets, the MAP estimate gave total fat volumes of 279 ± 45 mL, values 21% smaller than those from the uncorrected ratio images, principally due to the non-linear PV effect. We conclude that our algorithm can increase the accuracy of fat volume quantification even in regions having many PV voxels, e.g. ectopic fat depots.

  4. Multivariate statistical analysis of wildfires in Portugal

    NASA Astrophysics Data System (ADS)

    Costa, Ricardo; Caramelo, Liliana; Pereira, Mário

    2013-04-01

    Several studies demonstrate that wildfires in Portugal present high temporal and spatial variability as well as cluster behavior (Pereira et al., 2005, 2011). This study aims to contribute to the characterization of the fire regime in Portugal through multivariate statistical analysis of the time series of the number of fires and area burned in Portugal during the 1980-2009 period. The data used in the analysis is an extended version of the Rural Fire Portuguese Database (PRFD) (Pereira et al., 2011), provided by the National Forest Authority (Autoridade Florestal Nacional, AFN), the Portuguese Forest Service, which includes information for more than 500,000 fire records. There are many advanced techniques for examining relationships among multiple time series simultaneously (e.g., canonical correlation analysis, principal components analysis, factor analysis, path analysis, multiple analyses of variance, clustering systems). This study compares and discusses the results obtained with these different techniques. Pereira, M.G., Trigo, R.M., DaCamara, C.C., Pereira, J.M.C., Leite, S.M., 2005: "Synoptic patterns associated with large summer forest fires in Portugal". Agricultural and Forest Meteorology. 129, 11-25. Pereira, M. G., Malamud, B. D., Trigo, R. M., and Alves, P. I.: The history and characteristics of the 1980-2005 Portuguese rural fire database, Nat. Hazards Earth Syst. Sci., 11, 3343-3358, doi:10.5194/nhess-11-3343-2011, 2011. This work is supported by European Union Funds (FEDER/COMPETE - Operational Competitiveness Programme) and by national funds (FCT - Portuguese Foundation for Science and Technology) under the project FCOMP-01-0124-FEDER-022692, the project FLAIR (PTDC/AAC-AMB/104702/2008) and the EU 7th Framework Program through FUME (contract number 243888).

  5. Surface-enhanced Raman spectroscopy introduced into the International Standard Organization (ISO) regulations as an alternative method for detection and identification of pathogens in the food industry.

    PubMed

    Witkowska, Evelin; Korsak, Dorota; Kowalska, Aneta; Księżopolska-Gocalska, Monika; Niedziółka-Jönsson, Joanna; Roźniecka, Ewa; Michałowicz, Weronika; Albrycht, Paweł; Podrażka, Marta; Hołyst, Robert; Waluk, Jacek; Kamińska, Agnieszka

    2017-02-01

    We show that surface-enhanced Raman spectroscopy (SERS) coupled with principal component analysis (PCA) can serve as a fast, reliable, and easy method for the detection and identification of food-borne bacteria, namely Salmonella spp., Listeria monocytogenes, and Cronobacter spp., in different types of food matrices (salmon, eggs, powdered infant formula milk, and mixed herbs, respectively). The main aim of this work was to introduce the SERS technique into three ISO (6579:2002; 11290-1:1996/A1:2004; 22964:2006) standard procedures required for the detection of these bacteria in food. Our study demonstrates that the SERS technique is effective in distinguishing very closely related bacteria within a genus grown on solid and liquid media. The advantages of the proposed ISO-SERS method for bacteria identification include simplicity and a reduced time of analysis, from the almost 144 h required by standard methods to 48 h for the SERS-based approach. Additionally, PCA allows one to perform statistical classification of the studied bacteria and to identify the spectrum of an unknown sample. The calculated first and second principal components (PC-1, PC-2) account for 96, 98, and 90% of the total variance in the spectra and enable one to identify Salmonella spp., L. monocytogenes, and Cronobacter spp., respectively. Moreover, the study demonstrates the possibility of simultaneous detection of the analyzed food-borne bacteria in a single sample test (98% of PC-1 and PC-2), with the data set splitting into three separate clusters corresponding to the three studied bacteria species. The studies described in this paper suggest that SERS represents an alternative to standard microorganism diagnostic procedures. Graphical Abstract: A new SERS strategy for the detection and identification of food-borne bacteria, namely S. enterica, L. monocytogenes, and C. sakazakii, in selected food matrices.

  6. Quarry identification of historical building materials by means of laser induced breakdown spectroscopy, X-ray fluorescence and chemometric analysis

    NASA Astrophysics Data System (ADS)

    Colao, F.; Fantoni, R.; Ortiz, P.; Vazquez, M. A.; Martin, J. M.; Ortiz, R.; Idris, N.

    2010-08-01

    To characterize historical building materials according to the geographic origin of the quarries from which they were mined, the relative contents of major and trace elements were determined by means of Laser Induced Breakdown Spectroscopy (LIBS) and X-ray Fluorescence (XRF) techniques. 48 different specimens were studied, and the entire sample set was divided into two groups: the first, used as the reference set, was composed of samples mined from eight different quarries located in Seville province; the second was composed of specimens of unknown provenance collected from several historical buildings and churches in the city of Seville. Data reduction and analysis of the laser induced breakdown spectroscopy and X-ray fluorescence measurements were performed using a multivariate statistical approach, namely Linear Discriminant Analysis (LDA), Principal Component Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA). A clear separation among reference materials mined from different quarries was observed in Principal Component (PC) score plots; a supervised soft independent modeling of class analogy classification was then trained and run, aiming to assess the provenance of unknown samples according to their elemental content. The obtained results were compared with the provenance assignments made on the basis of petrographical description. This work gives experimental evidence that laser induced breakdown spectroscopy measurements on a relatively small set of elements are a fast and effective method for the purpose of origin identification.

  7. Detection of Cell Wall Chemical Variation in Zea Mays Mutants Using Near-Infrared Spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buyck, N.; Thomas, S.

    Corn stover is regarded as the prime candidate feedstock material for commercial biomass conversion in the United States. Variations in the chemical composition of Zea mays cell walls can affect biomass conversion process yields and economics. Mutant lines were constructed by activating a Mu transposon system. The cell wall chemical composition of 48 mutant families was characterized using near-infrared (NIR) spectroscopy. NIR data were analyzed using a multivariate statistical analysis technique called Principal Component Analysis (PCA). PCA of the NIR data from 349 maize leaf samples reveals 57 individuals as outliers on one or more of six Principal Components (PCs) at the 95% confidence interval. Of these, 19 individuals from 16 families are outliers on either PC3 (9% of the variation) or PC6 (1% of the variation), the two PCs that contain information about cell wall polymers. Those individuals for which altered cell wall chemistry is confirmed with wet chemical analysis will then be subjected to fermentation analysis to determine whether or not biomass conversion process kinetics, yields and/or economics are significantly affected. Those mutants that provide indications for a decrease in process cost will be pursued further to identify the gene(s) responsible for the observed changes in cell wall composition and associated changes in process economics. These genes will eventually be incorporated into maize breeding programs directed at the development of a truly dual use crop.

  8. Field Validity and Feasibility of Four Techniques for the Detection of Trichuris in Simians: A Model for Monitoring Drug Efficacy in Public Health?

    PubMed Central

    Levecke, Bruno; De Wilde, Nathalie; Vandenhoute, Els; Vercruysse, Jozef

    2009-01-01

    Background: Soil-transmitted helminths, such as Trichuris trichiura, are of major concern in public health. Current efforts to control these helminth infections involve periodic mass treatment in endemic areas. Since these large-scale interventions are likely to intensify, monitoring drug efficacy will become indispensable. However, studies comparing detection techniques based on sensitivity, fecal egg counts (FEC), feasibility for mass diagnosis, and drug efficacy estimates are scarce. Methodology/Principal Findings: In the present study, the ether-based concentration, Parasep Solvent Free (SF), McMaster, and FLOTAC techniques were compared based on both validity and feasibility for the detection of Trichuris eggs in 100 fecal samples of nonhuman primates. In addition, the drug efficacy estimates of the quantitative techniques were examined using a statistical simulation. Trichuris eggs were found in 47% of the samples. FLOTAC was the most sensitive technique (100%), followed by the Parasep SF (83.0% [95% confidence interval (CI): 82.4–83.6%]) and the ether-based concentration technique (76.6% [95% CI: 75.8–77.3%]). McMaster was the least sensitive (61.7% [95% CI: 60.7–62.6%]) and failed to detect low FEC. The quantitative comparison revealed a positive correlation between the four techniques (Rs = 0.85–0.93; p<0.0001). However, the ether-based concentration technique and the Parasep SF detected significantly fewer eggs than both the McMaster and the FLOTAC (p<0.0083). Overall, the McMaster was the most feasible technique (3.9 min/sample for preparing, reading and cleaning of the apparatus), followed by the ether-based concentration technique (7.7 min/sample) and the FLOTAC (9.8 min/sample). Parasep SF was the least feasible (17.7 min/sample). The simulation revealed that sensitivity is less important for monitoring drug efficacy and that both FLOTAC and McMaster were reliable estimators. Conclusions/Significance: The results of this study demonstrated that McMaster is a promising technique when making use of FEC to monitor drug efficacy in Trichuris. PMID:19172171

  9. Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.

    PubMed

    Saccenti, Edoardo; Timmerman, Marieke E

    2017-03-01

    Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
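
    A compact implementation of Horn's parallel analysis, as discussed in this record, compares sample eigenvalues against a quantile of eigenvalues from random data of the same size; the covariance-matrix variant below assumes the input has already been standardized.

        import numpy as np

        def parallel_analysis(X, n_sim=200, quantile=0.95, seed=0):
            """Number of components whose eigenvalues beat the random-data threshold."""
            rng = np.random.default_rng(seed)
            n, p = X.shape
            obs = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]
            sims = np.empty((n_sim, p))
            for i in range(n_sim):
                R = rng.normal(size=(n, p))
                sims[i] = np.linalg.eigvalsh(np.cov(R, rowvar=False))[::-1]
            threshold = np.quantile(sims, quantile, axis=0)
            return int(np.sum(obs > threshold))

        X = np.random.default_rng(1).normal(size=(500, 12))  # placeholder data matrix
        print("retained components:", parallel_analysis(X))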

  10. Automated Assessment of Child Vocalization Development Using LENA.

    PubMed

    Richards, Jeffrey A; Xu, Dongxin; Gilkerson, Jill; Yapanel, Umit; Gray, Sharmistha; Paul, Terrance

    2017-07-12

    To produce a novel, efficient measure of children's expressive vocal development on the basis of automatic vocalization assessment (AVA), child vocalizations were automatically identified and extracted from audio recordings using Language Environment Analysis (LENA) System technology. Assessment was based on full-day audio recordings collected in a child's unrestricted, natural language environment. AVA estimates were derived using automatic speech recognition modeling techniques to categorize and quantify the sounds in child vocalizations (e.g., protophones and phonemes). These were expressed as phone and biphone frequencies, reduced to principal components, and inputted to age-based multiple linear regression models to predict independently collected criterion-expressive language scores. From these models, we generated vocal development AVA estimates as age-standardized scores and development age estimates. AVA estimates demonstrated strong statistical reliability and validity when compared with standard criterion expressive language assessments. Automated analysis of child vocalizations extracted from full-day recordings in natural settings offers a novel and efficient means to assess children's expressive vocal development. More research remains to identify specific mechanisms of operation.

  11. Attenuated total reflectance-FT-IR spectroscopy for gunshot residue analysis: potential for ammunition determination.

    PubMed

    Bueno, Justin; Sikirzhytski, Vitali; Lednev, Igor K

    2013-08-06

    The ability to link a suspect to a particular shooting incident is a principal task for many forensic investigators. Here, we attempt to achieve this goal by analysis of gunshot residue (GSR) through the use of attenuated total reflectance (ATR) Fourier transform infrared spectroscopy (FT-IR) combined with statistical analysis. The firearm discharge process is analogous to a complex chemical process. Therefore, the products of this process (GSR) will vary based upon numerous factors, including the specific combination of the firearm and ammunition which was discharged. Differentiation of FT-IR data, collected from GSR particles originating from three different firearm-ammunition combinations (0.38 in., 0.40 in., and 9 mm calibers), was achieved using projection to latent structures discriminant analysis (PLS-DA). The technique was cross-validated (leave-one-out), both internally and externally. External validation was achieved via assignment (caliber identification) of unknown FT-IR spectra from unknown GSR particles. The results demonstrate great potential for ATR-FT-IR spectroscopic analysis of GSR for forensic purposes.
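
    PLS-DA, the classifier used in this record, is commonly implemented as a PLS regression against a one-hot dummy matrix of class labels; below is a sketch along those lines with simulated spectra, where the caliber labels and spectral dimensions are placeholders.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(7)
        spectra = rng.random(size=(90, 600))       # ATR-FT-IR spectra, 600 wavenumbers
        caliber = rng.integers(0, 3, size=90)      # 0: 0.38 in., 1: 0.40 in., 2: 9 mm
        Y = np.eye(3)[caliber]                     # one-hot dummy matrix for PLS-DA

        pls = PLSRegression(n_components=5).fit(spectra, Y)
        predicted = pls.predict(spectra).argmax(axis=1)   # class with the largest score
        print("training accuracy:", float((predicted == caliber).mean()))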

  12. An expert system for wind shear avoidance

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.; Stratton, D. Alexander

    1990-01-01

    The principal objectives are to develop methods for assessing the likelihood of wind shear encounter (based on real-time information in the cockpit), for deciding what flight path to pursue (e.g., takeoff abort, landing go-around, or normal climbout or glide slope), and for using the aircraft's full potential for combating wind shear. This study requires the definition of both deterministic and statistical techniques for fusing internal and external information, for making go/no-go decisions, and for generating commands to the aircraft's autopilot and flight directors for both automatic and manually controlled flight. The expert system for pilot aiding is based on the results of the FAA Windshear Training Aids Program, a two-volume manual that presents an overview, pilot guide, training program, and substantiating data that provides guidelines for this initial development. The Windshear Safety Advisor expert system currently contains over 140 rules and is coded in the LISP programming language for implementation on a Symbolics 3670 LISP Machine.

  13. Automotive System for Remote Surface Classification.

    PubMed

    Bystrov, Aleksandr; Hoare, Edward; Tran, Thuy-Yung; Clarke, Nigel; Gashinova, Marina; Cherniakov, Mikhail

    2017-04-01

    In this paper we discuss a novel approach to road surface recognition, based on the analysis of backscattered microwave and ultrasonic signals. The novelty of our method lies in sonar and polarimetric radar data fusion, the extraction of features for separate swathes of the illuminated surface (segmentation), and the use of a multi-stage artificial neural network for surface classification. The developed system consists of a 24 GHz radar and a 40 kHz ultrasonic sensor. The features are extracted from the backscattered signals, and the procedures of principal component analysis and supervised classification are then applied to the feature data. Special attention is paid to the multi-stage artificial neural network, which allows an overall increase in classification accuracy. The proposed technique was tested for the recognition of a large number of real surfaces in different weather conditions, with an average correct classification accuracy of 95%. The obtained results thereby demonstrate that the proposed system architecture and statistical methods allow for reliable discrimination of various road surfaces in real conditions.

  14. Detection of Leukemia with Blood Samples Using Raman Spectroscopy and Multivariate Analysis

    NASA Astrophysics Data System (ADS)

    Martínez-Espinosa, J. C.; González-Solís, J. L.; Frausto-Reyes, C.; Miranda-Beltrán, M. L.; Soria-Fregoso, C.; Medina-Valtierra, J.

    2009-06-01

    The use of Raman spectroscopy to analyze blood biochemistry and hence distinguish between normal and abnormal blood was investigated. Blood samples were obtained from 6 patients who were clinically diagnosed with leukemia and 6 healthy volunteers. The imprint was put under the microscope and several points were chosen for Raman measurement. All the spectra were collected with a confocal Raman micro-spectrometer (Renishaw) using a NIR 830 nm laser. It is shown that the serum samples from patients with leukemia and from the control group can be discriminated when the multivariate statistical methods of principal component analysis (PCA) and linear discriminant analysis (LDA) are applied to their Raman spectra. The ratios of some band intensities were analyzed; some band ratios were significant and corresponded to proteins, phospholipids, and polysaccharides. The preliminary results suggest that Raman spectroscopy could be a new technique for studying the degree of damage to the bone marrow using just blood samples instead of biopsies, a treatment very painful for patients.

  15. Insights on the Spectral Signatures of Stellar Activity and Planets from PCA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Allen B.; Fischer, Debra A.; Cisewski, Jessi

    Photospheric velocities and stellar activity features such as spots and faculae produce measurable radial velocity signals that currently obscure the detection of sub-meter-per-second planetary signals. However, photospheric velocities are imprinted differently in a high-resolution spectrum than are Keplerian Doppler shifts. Photospheric activity produces subtle differences in the shapes of absorption lines due to differences in how temperature or pressure affects the atomic transitions. In contrast, Keplerian Doppler shifts affect every spectral line in the same way. With a high enough signal-to-noise (S/N) and resolution, statistical techniques can exploit differences in spectra to disentangle the photospheric velocities and detect lower-amplitude exoplanet signals. We use simulated disk-integrated time-series spectra and principal component analysis (PCA) to show that photospheric signals introduce spectral line variability that is distinct from that of Doppler shifts. We quantify the impact of instrumental resolution and S/N for this work.

  16. Automotive System for Remote Surface Classification

    PubMed Central

    Bystrov, Aleksandr; Hoare, Edward; Tran, Thuy-Yung; Clarke, Nigel; Gashinova, Marina; Cherniakov, Mikhail

    2017-01-01

    In this paper we discuss a novel approach to road surface recognition, based on the analysis of backscattered microwave and ultrasonic signals. The novelty of our method lies in sonar and polarimetric radar data fusion, the extraction of features for separate swathes of the illuminated surface (segmentation), and the use of a multi-stage artificial neural network for surface classification. The developed system consists of a 24 GHz radar and a 40 kHz ultrasonic sensor. The features are extracted from the backscattered signals, and the procedures of principal component analysis and supervised classification are then applied to the feature data. Special attention is paid to the multi-stage artificial neural network, which allows an overall increase in classification accuracy. The proposed technique was tested for the recognition of a large number of real surfaces in different weather conditions, with an average correct classification accuracy of 95%. The obtained results thereby demonstrate that the proposed system architecture and statistical methods allow for reliable discrimination of various road surfaces in real conditions. PMID:28368297

  17. Composition and Chemical Variability of Enantia polycarpa Engler & Diels Leaf Essential Oil from Côte d'Ivoire.

    PubMed

    Yapi, Thierry Acafou; Ouattara, Zana Adama; Boti, Jean Brice; Tonzibo, Zanahi Félix; Paoli, Mathieu; Bighelli, Ange; Casanova, Joseph; Tomi, Félix

    2018-05-13

    The composition of Enantia polycarpa Engler & Diels leaf essential oil has been investigated for the first time using a combination of chromatographic and spectroscopic techniques. The compositions of 52 leaf essential oil samples have been subjected to statistical analysis, hierarchical cluster analysis (HCA) and principal component analysis (PCA). Four groups were differentiated, whose compositions were dominated by β-elemene and germacrene B (Group III, 22/52 samples); germacrene D (Group I, 16/52 samples); β-cubebene (Group IV, 8/52 samples); and germacrene B and germacrene D (Group II, 6/52 samples). Special attention was paid to the quantification of the thermolabile components germacrene A, germacrene B and germacrene C, as well as that of their rearranged compounds, β-elemene, γ-elemene and δ-elemene. 13 C NMR data of β-cubebene have been provided. This article is protected by copyright. All rights reserved.
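
    The HCA-plus-PCA workflow used in this record can be sketched briefly: Ward-linkage clustering cuts the 52 composition profiles into four groups, and a two-component PCA provides coordinates for visualizing the separation. The composition matrix below is a random placeholder for the essential oil data.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(8)
        compositions = rng.dirichlet(np.ones(30), size=52)  # 52 oils x 30 component fractions

        Z = linkage(compositions, method="ward")            # hierarchical cluster analysis
        groups = fcluster(Z, t=4, criterion="maxclust")     # cut the dendrogram into 4 groups

        scores = PCA(n_components=2).fit_transform(compositions)
        # plotting `scores` colored by `groups` shows the chemotype separation
        print(np.bincount(groups)[1:])                      # samples per group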

  18. Investigating sub-2 μm particle stationary phase supercritical fluid chromatography coupled to mass spectrometry for chemical profiling of chamomile extracts.

    PubMed

    Jones, Michael D; Avula, Bharathi; Wang, Yan-Hong; Lu, Lu; Zhao, Jianping; Avonto, Cristina; Isaac, Giorgis; Meeker, Larry; Yu, Kate; Legido-Quigley, Cristina; Smith, Norman; Khan, Ikhlas A

    2014-10-17

    Roman and German chamomile are widely used throughout the world. Chamomiles contain a wide variety of active constituents including sesquiterpene lactones. Various extraction techniques were performed on these two types of chamomile. A packed-column supercritical fluid chromatography-mass spectrometry method was designed for the identification of sesquiterpenes and other constituents from chamomile extracts with no derivatization step prior to analysis. Mass spectrometry detection was achieved by using electrospray ionization. All of the compounds of interest were separated within 15 min. The chamomile extracts were analyzed and compared for similarities and distinct differences. Multivariate statistical analysis including principal component analysis and orthogonal partial least squares-discriminant analysis (OPLS-DA) were used to differentiate between the chamomile samples. German chamomile samples confirmed the presence of cis- and trans-tonghaosu, chrysosplenols, apigenin diglucoside whereas Roman chamomile samples confirmed the presence of apigenin, nobilin, 1,10-epioxynobilin, and hydroxyisonobilin. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. ToF-SIMS PCA analysis of Myrtus communis L.

    NASA Astrophysics Data System (ADS)

    Piras, F. M.; Dettori, M. F.; Magnani, A.

    2009-06-01

    Nowadays there is growing interest among researchers in the application of sophisticated analytical techniques, in conjunction with statistical data analysis methods, to the characterization of natural products to assure their authenticity and quality, and in the possibility of direct analysis of food to obtain maximum information. In this work, time-of-flight secondary ion mass spectrometry (ToF-SIMS) in conjunction with principal components analysis (PCA) is applied to study the chemical composition and variability of Sardinian myrtle (Myrtus communis L.) through the analysis of both berry alcoholic extracts and the berry epicarp. ToF-SIMS spectra of the berry epicarp show that the epicuticular waxes consist mainly of carboxylic acids with chain lengths ranging from C20 to C30, or identical species formed from the fragmentation of long-chain esters. PCA of ToF-SIMS data from the myrtle berry epicarp distinguishes two groups characterized by a different surface concentration of triacontanoic acid. Variability in anthocyanin, flavonol, α-tocopherol, and myrtucommulone contents is shown by ToF-SIMS PCA analysis of the myrtle berry alcoholic extracts.

  20. Dynamic performance of an aero-assist spacecraft - AFE

    NASA Technical Reports Server (NTRS)

    Chang, Ho-Pen; French, Raymond A.

    1992-01-01

    Dynamic performance of the Aero-assist Flight Experiment (AFE) spacecraft was investigated using a high-fidelity 6-DOF simulation model. The baseline guidance logic, control logic, and strapdown navigation system to be used on the AFE spacecraft are also modeled in the 6-DOF simulation. During the AFE mission, uncertainties in the environment and the spacecraft are described by an error space which includes both correlated and uncorrelated error sources. The principal error sources modeled in this study include navigation errors, initial state vector errors, atmospheric variations, aerodynamic uncertainties, center-of-gravity offsets, and weight uncertainties. The impact of the perturbations on the spacecraft performance is investigated using repetitive Monte Carlo statistical techniques. During the Solid Rocket Motor (SRM) deorbit phase, a target flight path angle of -4.76 deg at entry interface (EI) offers a very high probability of avoiding SRM casing skip-out from the atmosphere. Generally speaking, the baseline designs of the guidance, navigation, and control systems satisfy most of the science and mission requirements.
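
    A Monte Carlo dispersion analysis of the kind described here repeatedly perturbs the error sources and tabulates the statistics of the outcome; the toy sketch below perturbs an entry flight path angle of -4.76 deg with illustrative (not mission-documented) 1-sigma error contributions and a hypothetical skip-out boundary.

        import numpy as np

        rng = np.random.default_rng(9)
        n_runs = 10000
        nav = rng.normal(0.0, 0.05, n_runs)     # navigation error, deg (assumed sigma)
        atm = rng.normal(0.0, 0.08, n_runs)     # atmospheric variation, deg (assumed sigma)
        aero = rng.normal(0.0, 0.03, n_runs)    # aerodynamic uncertainty, deg (assumed sigma)

        gamma_ei = -4.76 + nav + atm + aero     # perturbed flight path angle at EI
        skip_out = gamma_ei > -4.5              # hypothetical skip-out boundary
        print(f"P(skip-out) ~ {skip_out.mean():.4f}")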

  1. Chemical characterization of polycyclic aromatic hydrocarbons (PAHs) in 2013 Rayong oil spill-affected coastal areas of Thailand.

    PubMed

    Pongpiachan, S; Hattayanone, M; Tipmanee, D; Suttinun, O; Khumsup, C; Kittikoon, I; Hirunyatrakul, P

    2018-02-01

    Among Southeast Asian countries, Thailand has undergone exceptionally rapid urbanization, motorization, and industrialization. Chonburi and Rayong are the two provinces that make up the "eastern seaboard" industrial zone, an emerging economic region that plays a key role in Thailand's economy. The 2013 Rayong oil spill not only damaged the coastal and maritime environment but also undermined trust in the overall safety system and negatively affected investor confidence. In this study, 69 coastal soil samples collected around Koh Samed Island were chemically extracted and analyzed for 15 PAHs using a Shimadzu GCMS-QP2010 Ultra system comprising a high-speed performance system with ASSP function. Numerous diagnostic binary ratios were applied to identify potential sources of PAHs. Advanced statistical techniques, such as hierarchical cluster analysis coupled with principal component analysis, were also employed for further source identification. Copyright © 2017 Elsevier Ltd. All rights reserved.
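
    For illustration, here is one commonly used diagnostic binary ratio, Fluoranthene/(Fluoranthene + Pyrene); the cut-offs follow the widely cited Yunker et al. (2002) scheme and are not values taken from this paper.

        # Sketch of a single PAH diagnostic binary ratio and its conventional
        # source interpretation (Yunker et al. 2002 cut-offs, illustrative only).
        def flt_pyr_source(flt_ng_g: float, pyr_ng_g: float) -> str:
            ratio = flt_ng_g / (flt_ng_g + pyr_ng_g)
            if ratio < 0.4:
                return "petrogenic (unburned petroleum, e.g. spilled oil)"
            elif ratio <= 0.5:
                return "petroleum combustion"
            return "grass/wood/coal combustion"

        print(flt_pyr_source(12.0, 48.0))   # ratio 0.2 -> petrogenic signature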

  2. Untargeted metabolomics reveals specific withanolides and fatty acyl glycoside as tentative metabolites to differentiate organic and conventional Physalis peruviana fruits.

    PubMed

    Llano, Sandra M; Muñoz-Jiménez, Ana M; Jiménez-Cartagena, Claudio; Londoño-Londoño, Julián; Medina, Sonia

    2018-04-01

    Agronomic production systems may affect the levels of food metabolites. Metabolomics approaches have been applied as a useful tool for characterizing the fruit metabolome. In this study, metabolomics techniques were used to assess the differences in phytochemical composition between goldenberry samples produced by organic and conventional systems. To verify that the organic samples were free of pesticides, individual pesticides were analyzed. Principal component analysis showed a clear separation of goldenberry samples from the two farming systems. In targeted metabolomics assays, in which carotenoids and ascorbic acid were analyzed, no statistical differences between the two crops were found. Conversely, untargeted metabolomics allowed us to identify two withanolides and one fatty acyl glycoside as tentative metabolites for differentiating goldenberry fruits, with organic fruits containing higher amounts of these compounds than conventional samples. Hence, untargeted metabolomics technology could be suitable for investigating differences in phytochemicals under different agricultural management practices and for authenticating organic products. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. The Efficacy of Galaxy Shape Parameters in Photometric Redshift Estimation: A Neural Network Approach

    NASA Astrophysics Data System (ADS)

    Singal, J.; Shmakova, M.; Gerke, B.; Griffith, R. L.; Lotz, J.

    2011-05-01

    We present a determination of the effects of including galaxy morphological parameters in photometric redshift estimation with an artificial neural network method. Neural networks, which recognize patterns in the information content of data in an unbiased way, can be a useful estimator of the additional information contained in extra parameters, such as those describing morphology, if the input data are treated on an equal footing. We use imaging and five-band photometric magnitudes from the All-wavelength Extended Groth Strip International Survey (AEGIS). It is shown that certain principal components of the morphology information are correlated with galaxy type. However, we find that, for the data used, the inclusion of morphological information does not have a statistically significant benefit for photometric redshift estimation with the techniques employed here. The inclusion of these parameters may result in a tradeoff between extra information and additional noise, with the additional noise becoming more dominant as more parameters are added.
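
    A hedged sketch of the comparison design only, on random placeholder data (so, unlike the real survey, no benefit can appear): train identical MLPs with and without extra morphology columns and compare RMS scatter.

        # Sketch: does adding morphology columns improve neural-net photo-z scatter?
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        mags = rng.normal(22, 1.5, (2000, 5))          # five-band magnitudes
        z = 0.1 * (mags[:, 0] - mags[:, 4]) + 0.5 + rng.normal(0, 0.05, 2000)
        morph = rng.normal(0, 1, (2000, 3))            # placeholder morphology params

        for X in (mags, np.hstack([mags, morph])):
            Xtr, Xte, ztr, zte = train_test_split(X, z, random_state=0)
            mlp = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000,
                               random_state=0).fit(Xtr, ztr)
            rms = np.sqrt(np.mean((mlp.predict(Xte) - zte) ** 2))
            print(X.shape[1], "inputs -> RMS", round(rms, 4))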

  4. IA-Regional-Radio - Social Network for Radio Recommendation

    NASA Astrophysics Data System (ADS)

    Dziczkowski, Grzegorz; Bougueroua, Lamine; Wegrzyn-Wolska, Katarzyna

    This chapter describes the functions of a system proposed for music hit recommendation from a social network database. The system carries out the automatic collection, evaluation, and rating of music reviewers, and allows listeners to rate musical hits; recommendations deduced from listeners' profiles are delivered in the form of regional Internet radio. First, the system searches for and retrieves probable music reviews from the Internet. Subsequently, it evaluates and rates those reviews. From the resulting list of music hits, the system allows rating directly from our application. Finally, the system automatically creates the playlist broadcast each day depending on the region, the season, the time of day, and the age of the listeners. Our system uses linguistic and statistical methods for classifying music opinions, and data mining techniques for the recommendation component needed to create the playlist. The principal task is the creation of a popular intelligent radio that adapts to listeners' age and region - IA-Regional-Radio.

  5. Determination of the authenticity of plastron-derived functional foods based on amino acid profiles analysed by MEKC.

    PubMed

    Li, Lin-Qiu; Baibado, Joewel T; Shen, Qing; Cheung, Hon-Yeung

    2017-12-01

    Plastron is a nutritive and highly valued functional food. Because its supply is limited while demand is enormous, functional foods supposed to contain plastron may be adulterated with substitutes. This paper reports a novel and simple method for determining the authenticity of plastron-derived functional foods based on comparison of the amino acid (AA) profiles of plastron and its possible substitutes. By applying micellar electrokinetic chromatography (MEKC), 18 common AAs, along with 2 special AAs - hydroxyproline (Hyp) and hydroxylysine (Hyl) - were detected in all plastron samples. Since chicken, egg, fish, milk, pork, nail, and hair lacked Hyp and Hyl, plastron could be easily distinguished from them. For substitutes containing collagen, a statistical analysis technique - principal component analysis (PCA) - was adopted, and plastron was again successfully distinguished. When the proposed method was applied to authenticate turtle shell glue on the market, fake products were commonly found. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. A bibliography of planetary geology principal investigators and their associates, 1976-1978

    NASA Technical Reports Server (NTRS)

    1978-01-01

    This bibliography cites publications submitted by 484 principal investigators and their associates who were supported through NASA's Office of Space Sciences Planetary Geology Program. Subject classifications include: solar system formation, comets, and asteroids; planetary satellites; planetary interiors; geological and geochemical constraints on planetary evolution; impact crater studies; volcanism; eolian studies; fluvial studies; Mars geological mapping; Mercury geological mapping; planetary cartography; and instrument development and techniques. An author/editor index is provided.

  7. Evaluation of statistical treatments of left-censored environmental data using coincident uncensored data sets: I. Summary statistics

    USGS Publications Warehouse

    Antweiler, Ronald C.; Taylor, Howard E.

    2008-01-01

    The main classes of statistical treatment of below-detection-limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning either one-half the detection limit value or a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data - including numbers below the detection limit - was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except substitution of zero or the detection limit value for censored data.
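
    The following numpy sketch, on invented lognormal data with a single detection limit, contrasts the two substitution methods with a simplified Kaplan-Meier mean obtained through the standard "flip" trick (negation turns left censoring into right censoring); ties at the detection limit are handled naively.

        # Illustrative sketch of substitution vs. Kaplan-Meier for left-censored
        # data; values and the detection limit are invented.
        import numpy as np

        rng = np.random.default_rng(3)
        true = rng.lognormal(0.0, 1.0, 200)
        DL = 0.8
        censored = true < DL                      # results reported only as "< DL"
        reported = np.where(censored, DL, true)

        half_dl = np.where(censored, DL / 2.0, reported)                   # DL/2 substitution
        rand_sub = np.where(censored, rng.uniform(0, DL, 200), reported)   # random substitution

        # Kaplan-Meier via the flip trick: negate, apply the product-limit
        # estimator for right censoring, then flip the mean back.
        t = -reported
        order = np.argsort(t)
        t, obs = t[order], ~censored[order]       # obs=True -> exact observation
        at_risk = len(t) - np.arange(len(t))
        surv = np.cumprod(np.where(obs, 1.0 - 1.0 / at_risk, 1.0))
        km_mean = -(t[0] + np.sum(surv[:-1] * np.diff(t)))

        print(true.mean(), half_dl.mean(), rand_sub.mean(), km_mean)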

  8. Use of principal-component, correlation, and stepwise multiple-regression analyses to investigate selected physical and hydraulic properties of carbonate-rock aquifers

    USGS Publications Warehouse

    Brown, C. Erwin

    1993-01-01

    Correlation analysis, in conjunction with principal-component and multiple-regression analyses, was applied to laboratory chemical and petrographic data to assess the usefulness of these techniques in evaluating selected physical and hydraulic properties of carbonate-rock aquifers in central Pennsylvania. Correlation and principal-component analyses were used to establish relations and associations among variables, to determine the dimensions of property variation of samples, and to filter out variables containing similar information. Principal-component and correlation analyses showed that porosity is related to other measured variables and that permeability is most related to porosity and grain size. Four principal components were found to be significant in explaining the variance of the data. Stepwise multiple-regression analysis was used to see how well the measured variables could predict porosity and (or) permeability for this suite of rocks. The variation in permeability and porosity is not totally predicted by the other variables, but the regression is significant at the 5% significance level. © 1993.
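
    A generic forward-stepwise sketch follows (a stand-in, not the USGS procedure, and using R² gain rather than the paper's 5% significance testing) on synthetic predictors, two of which actually drive the response.

        # Sketch of forward stepwise selection: greedily add the predictor that
        # most improves R^2.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        def forward_stepwise(X, y, n_keep):
            chosen, remaining = [], list(range(X.shape[1]))
            for _ in range(n_keep):
                r2 = {j: LinearRegression().fit(X[:, chosen + [j]], y)
                                           .score(X[:, chosen + [j]], y)
                      for j in remaining}
                best = max(r2, key=r2.get)
                chosen.append(best); remaining.remove(best)
            return chosen

        rng = np.random.default_rng(5)
        X = rng.normal(size=(60, 6))        # e.g. porosity, grain size, chemistry
        y = 2 * X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.3, 60)  # 'permeability'
        print(forward_stepwise(X, y, 2))    # expected: columns 0 and 3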

  9. Measurement of 13C chemical shift tensor principal values with a magic-angle turning experiment.

    PubMed

    Hu, J Z; Orendt, A M; Alderman, D W; Pugmire, R J; Ye, C; Grant, D M

    1994-08-01

    The magic-angle turning (MAT) experiment introduced by Gan is developed into a powerful and routine method for measuring the principal values of 13C chemical shift tensors in powdered solids. A large-volume MAT probe with stable rotation frequencies down to 22 Hz is described. A triple-echo MAT pulse sequence is introduced to improve the quality of the two-dimensional baseplane. It is shown that measurements of the principal values of chemical shift tensors in complex compounds can be enhanced by using either short contact times or dipolar dephasing pulse sequences to isolate the powder patterns from protonated or non-protonated carbons, respectively. A model compound, 1,2,3-trimethoxybenzene, is used to demonstrate these techniques, and the 13C principal values in 2,3-dimethylnaphthalene and Pocahontas coal are reported as typical examples.

  10. Characterization of Surface Water and Groundwater Quality in the Lower Tano River Basin Using Statistical and Isotopic Approach.

    NASA Astrophysics Data System (ADS)

    Edjah, Adwoba; Stenni, Barbara; Cozzi, Giulio; Turetta, Clara; Dreossi, Giuliano; Tetteh Akiti, Thomas; Yidana, Sandow

    2017-04-01

    This research is part of a PhD project, "Hydrogeological Assessment of the Lower Tano River Basin for Sustainable Economic Usage, Ghana, West Africa". In this study, surface water and groundwater quality in the Lower Tano river basin was investigated, based on selected sampling sites associated with mining activities and the development of oil and gas. A statistical approach was applied to characterize the quality of surface water and groundwater, and water stable isotopes, natural tracers of the hydrological cycle, were used to investigate the origin of groundwater recharge in the basin. The study revealed that Pb and Ni values of the surface water and groundwater samples exceeded the WHO standards for drinking water. In addition, a water quality index (WQI) based on physicochemical parameters (EC, TDS, pH) and major ions (Ca2+, Na+, Mg2+, HCO3-, NO3-, Cl-, SO42-, K+) indicated good quality water for 60% of the sampled surface water and groundwater. Other indices, such as the heavy metal pollution index (HPI), degree of contamination (Cd), and heavy metal evaluation index (HEI), based on trace element concentrations in the water samples, revealed that 90% of the surface water and groundwater samples show a high level of pollution. Principal component analysis (PCA) also suggests that water quality in the basin is likely affected by rock-water interaction and anthropogenic activities (sea water intrusion); this was confirmed by further statistical analysis (cluster analysis and correlation matrix) of the water quality parameters. The spatial distribution of water quality parameters, trace elements, and the statistical results was mapped with a geographical information system (GIS). In addition, isotopic analysis revealed that most of the surface water and groundwater is of meteoric origin with little or no isotopic variation. The outcomes of this research are expected to form a baseline for water quality management decisions in the Lower Tano river basin. Keywords: Water stable isotopes, Trace elements, Multivariate statistics, Evaluation indices, Lower Tano river basin.
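
    A hedged sketch of a weighted-arithmetic WQI follows; the abstract does not state which WQI variant was used, and the parameters, limits, and weighting scheme below are illustrative assumptions only.

        # Sketch of a weighted-arithmetic water quality index: unit weights are
        # inversely proportional to each permissible limit, and each sub-index is
        # the observed concentration as a percentage of its limit.
        def wqi(conc: dict, standard: dict) -> float:
            w = {p: 1.0 / standard[p] for p in conc}
            q = {p: 100.0 * conc[p] / standard[p] for p in conc}
            return sum(w[p] * q[p] for p in conc) / sum(w.values())

        sample = {"TDS": 450.0, "NO3": 20.0, "Cl": 110.0}   # mg/L, invented values
        limits = {"TDS": 500.0, "NO3": 45.0, "Cl": 250.0}   # mg/L, WHO-style limits
        print(round(wqi(sample, limits), 1))   # below 100 suggests acceptable quality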

  11. Improved detection of highly energetic materials traces on surfaces by standoff laser-induced thermal emission incorporating neural networks

    NASA Astrophysics Data System (ADS)

    Figueroa-Navedo, Amanda; Galán-Freyle, Nataly Y.; Pacheco-Londoño, Leonardo C.; Hernández-Rivera, Samuel P.

    2013-05-01

    Terrorists conceal highly energetic materials (HEM) as Improvised Explosive Devices (IED) in various types of materials such as PVC, wood, Teflon, aluminum, acrylic, carton, and rubber to disguise them from detection equipment used by military and security agency personnel. Infrared emissions (IREs) of substrates, with and without HEM, were measured to generate models for detection and discrimination. Multivariate analysis techniques such as principal component analysis (PCA), soft independent modeling by class analogy (SIMCA), partial least squares-discriminant analysis (PLS-DA), support vector machines (SVM), and neural networks (NN) were employed to generate models in which the emission of IR light from heated samples was stimulated using a CO2 laser, giving rise to laser-induced thermal emission (LITE) of HEMs. Traces of a specific target threat explosive, PETN, at surface concentrations of 10 to 300 µg/cm2 were studied on the surfaces mentioned. The custom-built experimental setup positioned the CO2 heating laser with a telescope, with minimal loss reported in the reflective optics, collecting mid-IR spectra at a distance of 4 m with 32 scans at 10 s. SVM-DA proved the best statistical technique, with a discrimination performance of 97%; PLS-DA accurately predicted over 94% and NN 88%.

  12. [A method of measuring presampled modulation transfer function using a rationalized approximation of geometrical edge slope].

    PubMed

    Honda, Michitaka

    2014-04-01

    Several improvements were implemented in the edge method for measuring the presampled modulation transfer function (MTF). First, a new technique for estimating the edge angle was developed by applying a principal component analysis algorithm; the error in the estimation was statistically confirmed to be less than 0.01 even in the presence of quantum noise. Secondly, the geometrical edge slope was approximated by a rational number, making it possible to obtain an oversampled edge response function (ESF) with equal intervals. Thirdly, the final MTF was estimated as the average of multiple MTFs calculated for local areas; this averaging operation eliminates the errors caused by the rational-number approximation. Computer-simulated images were used to evaluate the accuracy of the method. The relative error between the estimated MTF and the theoretical MTF at the Nyquist frequency was less than 0.5% when the MTF was expressed as a sinc function. Good agreement was also observed for MTFs representing an indirect detector and a phase-contrast detector. The high accuracy of the MTF estimation was confirmed even for edge angles of around 10 degrees, which suggests the potential for simplifying the measurement conditions. The proposed method could be incorporated into an automated measurement technique using a software application.
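
    The edge-angle step alone can be illustrated as follows: treat the (row, column) coordinates of detected edge pixels as a point cloud and take the first principal axis, obtained here via SVD, as the edge direction. The synthetic edge and its noise level are assumptions; the paper's full ESF/MTF pipeline is not reproduced.

        # Sketch: edge-angle estimation by PCA of edge-pixel coordinates.
        import numpy as np

        rng = np.random.default_rng(9)
        true_angle = np.deg2rad(2.5)                # slightly slanted edge
        rows = np.arange(256, dtype=float)
        cols = 128 + rows * np.tan(true_angle) + rng.normal(0, 0.2, 256)
        pts = np.column_stack([rows, cols])

        pts -= pts.mean(axis=0)                     # center, then PCA via SVD
        _, _, vt = np.linalg.svd(pts, full_matrices=False)
        direction = vt[0]                           # first principal component
        if direction[0] < 0:                        # fix the sign ambiguity
            direction = -direction
        est = np.arctan2(direction[1], direction[0])
        print(np.rad2deg(est))                      # ~2.5 degrees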

  13. On the forecasting the unfavorable periods in the technosphere by the space weather factors

    NASA Astrophysics Data System (ADS)

    Lyakhov, N. N.

    2002-12-01

    Considerable progress has been made in recent years in forecasting geomagnetic disturbances, with the necessary lead time, from solar activity phenomena. The possible relationship between violations of traffic safety terms (VTS) on the East Siberian Railway during 1986-1999 and space weather factors was investigated; the overall number of cases under consideration is 11,575. Correlation and spectral analysis showed that the statistics of VTS are not random and that their character is probably driven by space weather factors. A principal difference is noted between the rhythms of VTS arising from purely technical causes (MECH; failures in mechanical systems) and those of VTS caused by erroneous personnel actions (MAN). An increase in the number of sudden storm commencements raises the probability of mistaken operator actions, whereas the probability of failures in mechanical systems increases with the number of quiet geomagnetic conditions. This, in turn, dictates different approaches to the ordered series of MECH and MAN data when forecasting unfavourable periods, i.e., periods of increased risk of wrong decisions by participants in the technological process. Advances in geomagnetic forecasting techniques have made it possible to begin constructing systems for promptly informing interested organizations about unfavourable space weather factors.

  14. A Hybrid Sensing Approach for Pure and Adulterated Honey Classification

    PubMed Central

    Subari, Norazian; Saleh, Junita Mohamad; Shakaff, Ali Yeon Md; Zakaria, Ammar

    2012-01-01

    This paper presents a comparison between data from single-modality and fusion methods to classify Tualang honey as pure or adulterated using Linear Discriminant Analysis (LDA) and Principal Component Analysis (PCA) statistical classification approaches. Ten different brands of certified pure Tualang honey were obtained throughout peninsular Malaysia and Sumatera, Indonesia. Various concentrations of two types of sugar solution (beet and cane sugar) were used to create honey samples of 20%, 40%, 60% and 80% adulteration concentration. Honey data extracted from an electronic nose (e-nose) and Fourier Transform Infrared Spectroscopy (FTIR) were gathered, analyzed, and compared based on fusion methods. Visual observation of the classification plots revealed that the PCA approach was able to distinguish pure from adulterated honey samples better than the LDA technique. Overall, the validated classification results based on FTIR data (88.0%) gave higher classification accuracy than e-nose data (76.5%) using the LDA technique. Honey classification based on normalized low-level and intermediate-level FTIR and e-nose fusion data scored classification accuracies of 92.2% and 88.7%, respectively, using the stepwise LDA method. The results suggested that pure and adulterated honey samples were better classified using FTIR and e-nose fusion data than single-modality data. PMID:23202033
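
    A sketch of the low-level fusion idea on synthetic stand-ins for e-nose and FTIR features: scale each modality, concatenate the feature vectors, and cross-validate an LDA classifier. Scaling inside the cross-validation folds would be more rigorous; this is a simplification.

        # Sketch: low-level data fusion (concatenate normalized modalities) + LDA.
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(11)
        y = np.repeat([0, 1], 50)                       # 0 = pure, 1 = adulterated
        enose = rng.normal(y[:, None] * 0.4, 1.0, (100, 8))    # 8 sensor features
        ftir = rng.normal(y[:, None] * 0.6, 1.0, (100, 40))    # 40 spectral features

        fused = np.hstack([StandardScaler().fit_transform(enose),
                           StandardScaler().fit_transform(ftir)])
        acc = cross_val_score(LinearDiscriminantAnalysis(), fused, y, cv=5).mean()
        print(f"fused LDA accuracy: {acc:.2f}")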

  15. Estimation of surface curvature from full-field shape data using principal component analysis

    NASA Astrophysics Data System (ADS)

    Sharma, Sameer; Vinuchakravarthy, S.; Subramanian, S. J.

    2017-01-01

    Three-dimensional digital image correlation (3D-DIC) is a popular image-based experimental technique for estimating the surface shape, displacements, and strains of deforming objects. In this technique, a calibrated stereo rig is used to obtain and stereo-match pairs of images of the object of interest, from which the shapes of the imaged surface are then computed using the calibration parameters of the rig. Displacements are obtained by performing an additional temporal correlation of the shapes obtained at various stages of deformation, and strains by smoothing and numerically differentiating the displacement data. Since strains are of primary importance in solid mechanics, significant effort has been put into computing strains from the measured displacement fields; however, much less attention has been paid to date to computing curvature from the measured 3D surfaces. In this work, we address this gap by proposing a new method of computing curvature from full-field shape measurements using principal component analysis (PCA), along the lines of a similar method recently proposed to measure strains (Grama and Subramanian 2014 Exp. Mech. 54 913-33). PCA is a multivariate analysis tool that is widely used to reveal relationships between a large number of variables, reduce dimensionality, and achieve significant denoising. This technique is applied here to identify the dominant principal components in the shape fields measured by 3D-DIC; these principal components are then differentiated systematically to obtain the first and second fundamental forms used in the curvature calculation. The proposed method is first verified using synthetically generated noisy surfaces and then validated experimentally on real-world objects with known ground-truth curvatures.
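
    As a worked version of the curvature step, the sketch below computes Gaussian and mean curvature of a height field from the first and second fundamental forms, with plain finite differences standing in for the paper's PCA-smoothed derivatives; a paraboloid gives known apex values (K = 4, H = 2).

        # Sketch: curvature of a height field z(x, y) from the fundamental forms.
        import numpy as np

        x = y = np.linspace(-1, 1, 201)
        h = x[1] - x[0]
        X, Y = np.meshgrid(x, y, indexing="ij")
        Z = X**2 + Y**2                        # paraboloid test surface

        Zx, Zy = np.gradient(Z, h, h)          # first derivatives
        Zxx, Zxy = np.gradient(Zx, h, h)       # second derivatives
        _, Zyy = np.gradient(Zy, h, h)

        E, F, G = 1 + Zx**2, Zx * Zy, 1 + Zy**2         # first fundamental form
        w = np.sqrt(1 + Zx**2 + Zy**2)
        L, M, N = Zxx / w, Zxy / w, Zyy / w             # second fundamental form
        K = (L * N - M**2) / (E * G - F**2)             # Gaussian curvature
        H = (E * N - 2 * F * M + G * L) / (2 * (E * G - F**2))  # mean curvature
        print(K[100, 100], H[100, 100])                 # at the apex: K = 4, H = 2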

  16. MORPHOLOGICAL VARIATION IN HATCHLING AMERICAN ALLIGATORS (ALLIGATOR MISSISSIPPIENSIS) FROM THREE FLORIDA LAKES

    EPA Science Inventory

    Morphological variation of 508 hatchling alligators from three lakes in north central Florida (Lakes Woodruff, Apopka, and Orange) was analyzed using multivariate statistics. Morphological variation was found among clutches as well as among lakes. Principal components analysis wa...

  17. SPATIAL STATISTICS AND ECONOMETRICS FOR MODELS IN FISHERIES ECONOMICS. (R828012)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  18. SOME STATISTICAL TOOLS FOR EVALUATING COMPUTER SIMULATIONS: A DATA ANALYSIS. (R825381)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  19. 39 CFR 3001.31 - Evidence.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... methods employed in statistical compilations. The principal title of each exhibit should state what it... furnished: (i) Market research. (a) The following data and information shall be provided: (1) A clear and detailed description of the sample, observational, and data preparation designs, including definitions of...

  20. 39 CFR 3001.31 - Evidence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... item of information used and the methods employed in statistical compilations. The principal title of... furnished: (i) Market research. (a) The following data and information shall be provided: (1) A clear and detailed description of the sample, observational, and data preparation designs, including definitions of...
