Salvatore, Stefania; Bramness, Jørgen G; Røislien, Jo
2016-07-12
Wastewater-based epidemiology (WBE) is a novel approach in drug use epidemiology which aims to monitor the extent of use of various drugs in a community. In this study, we investigate functional principal component analysis (FPCA) as a tool for analysing WBE data and compare it to traditional principal component analysis (PCA) and to wavelet principal component analysis (WPCA), which is more flexible temporally. We analysed temporal wastewater data from 42 European cities collected daily over one week in March 2013. The main temporal features of ecstasy (MDMA) were extracted with FPCA, using both Fourier and B-spline basis functions with three different smoothing parameters, along with PCA and WPCA with different mother wavelets and shrinkage rules. The stability of FPCA was explored through bootstrapping and analysis of sensitivity to missing data. The first three principal components (PCs), functional principal components (FPCs) and wavelet principal components (WPCs) explained 87.5-99.6% of the temporal variation between cities, depending on the choice of basis and smoothing. The extracted temporal features from PCA, FPCA and WPCA were consistent. FPCA using a Fourier basis and common-optimal smoothing was the most stable and least sensitive to missing data. FPCA is a flexible and analytically tractable method for analysing temporal changes in wastewater data, and is robust to missing data. WPCA did not reveal any rapid temporal changes in the data not captured by FPCA. Overall, the results suggest FPCA with Fourier basis functions and a common-optimal smoothing parameter as the most accurate approach when analysing WBE data.
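As a rough illustration of the basis-expansion FPCA workflow described above (a minimal numpy sketch, not the authors' implementation; the city count, Fourier basis size and synthetic drug loads are placeholders, and PCA on the coefficients only approximates FPCA exactly when the basis is orthonormal):

    import numpy as np

    rng = np.random.default_rng(0)
    n_cities, n_days = 42, 7
    t = np.arange(n_days)                                   # day index 0..6
    loads = rng.lognormal(mean=1.0, sigma=0.5, size=(n_cities, n_days))  # synthetic daily drug loads

    # Fourier basis over a weekly period: constant term plus the first two harmonics
    period = 7.0
    B = np.column_stack([np.ones(n_days)] +
                        [f(2 * np.pi * k * t / period) for k in (1, 2) for f in (np.sin, np.cos)])

    # Least-squares fit of each city's curve in the basis (a crude stand-in for penalised smoothing)
    coefs, *_ = np.linalg.lstsq(B, loads.T, rcond=None)     # shape (n_basis, n_cities)
    coefs = coefs.T

    # Eigendecomposition of the centred coefficient covariance approximates FPCA in this basis
    C = coefs - coefs.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(C, rowvar=False))
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    explained = eigvals / eigvals.sum()
    fpc_curves = B @ eigvecs[:, :3]        # first three FPCs evaluated on the daily grid
    scores = C @ eigvecs[:, :3]            # city scores on the first three FPCs
    print("variance explained by first three components:", explained[:3].sum())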
Analyzing coastal environments by means of functional data analysis
NASA Astrophysics Data System (ADS)
Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.
2017-07-01
Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful to describe variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with CA (Cluster Analysis). The first method, the point density function (PDF), was employed after fitting a log-normal distribution to each PSD and summarizing each of the density functions by its mean, sorting, skewness and kurtosis. The second applied a centered log-ratio (clr) transformation to the original data. PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, as a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
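A minimal sketch of the second vector approach mentioned above (centred log-ratio transform, PCA, then cluster analysis on the retained component scores), assuming synthetic compositional data, an arbitrary 90% variance cut-off and an illustrative cluster count:

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    # Synthetic particle-size distributions: each row is a composition over grain-size bins (sums to 1)
    psd = rng.dirichlet(alpha=np.linspace(1, 5, 30), size=120)

    # Centred log-ratio (clr) transform, applied row-wise
    logs = np.log(psd)
    clr = logs - logs.mean(axis=1, keepdims=True)

    # PCA on the clr-transformed data via SVD of the centred matrix
    clr_c = clr - clr.mean(axis=0)
    U, s, Vt = np.linalg.svd(clr_c, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    n_keep = int(np.searchsorted(np.cumsum(explained), 0.90)) + 1   # keep ~90% of the variance
    scores = clr_c @ Vt[:n_keep].T

    # Cluster analysis on the retained principal component scores
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
    print(n_keep, np.bincount(labels))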
Choi, Ji Yeh; Hwang, Heungsun; Yamamoto, Michio; Jung, Kwanghee; Woodward, Todd S
2017-06-01
Functional principal component analysis (FPCA) and functional multiple-set canonical correlation analysis (FMCCA) are data reduction techniques for functional data that are collected in the form of smooth curves or functions over a continuum such as time or space. In FPCA, low-dimensional components are extracted from a single functional dataset such that they explain the most variance of the dataset, whereas in FMCCA, low-dimensional components are obtained from each of multiple functional datasets in such a way that the associations among the components are maximized across the different sets. In this paper, we propose a unified approach to FPCA and FMCCA. The proposed approach subsumes both techniques as special cases. Furthermore, it permits a compromise between the techniques, such that components are obtained from each set of functional data to maximize their associations across different datasets, while accounting for the variance of the data well. We propose a single optimization criterion for the proposed approach, and develop an alternating regularized least squares algorithm to minimize the criterion in combination with basis function approximations to functions. We conduct a simulation study to investigate the performance of the proposed approach based on synthetic data. We also apply the approach for the analysis of multiple-subject functional magnetic resonance imaging data to obtain low-dimensional components of blood-oxygen level-dependent signal changes of the brain over time, which are highly correlated across the subjects as well as representative of the data. The extracted components are used to identify networks of neural activity that are commonly activated across the subjects while carrying out a working memory task.
The variance needed to accurately describe jump height from vertical ground reaction force data.
Richter, Chris; McGuinness, Kevin; O'Connor, Noel E; Moran, Kieran
2014-12-01
In functional principal component analysis (fPCA), a threshold is chosen to define the number of retained principal components, which corresponds to the amount of preserved information. A variety of thresholds have been used in previous studies and the chosen threshold is often not evaluated. The aim of this study is to identify the optimal threshold that preserves the information needed to describe jump height accurately using vertical ground reaction force (vGRF) curves. To find an optimal threshold, a neural network was used to predict jump height from vGRF curve measures generated using different fPCA thresholds. The findings indicate that a threshold from 99% to 99.9% (6-11 principal components) is optimal for describing jump height, as these thresholds generated significantly lower jump height prediction errors than other thresholds.
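The thresholding step itself can be sketched as below, operating on discretised curves with plain PCA rather than a full smoothed fPCA; the synthetic curves and the thresholds are illustrative only:

    import numpy as np

    def n_components_for_threshold(curves, threshold=0.99):
        """Smallest number of principal components whose cumulative
        explained variance reaches `threshold` (e.g. 0.99 or 0.999)."""
        X = curves - curves.mean(axis=0)
        s = np.linalg.svd(X, compute_uv=False)
        ratio = np.cumsum(s**2) / np.sum(s**2)
        return int(np.searchsorted(ratio, threshold)) + 1

    rng = np.random.default_rng(2)
    vgrf = rng.standard_normal((50, 200)).cumsum(axis=1)   # stand-in for 50 vGRF curves
    for thr in (0.90, 0.95, 0.99, 0.999):
        print(thr, n_components_for_threshold(vgrf, thr))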
A multi-dimensional functional principal components analysis of EEG data.
Hasenstab, Kyle; Scheffler, Aaron; Telesca, Donatello; Sugar, Catherine A; Jeste, Shafali; DiStefano, Charlotte; Şentürk, Damla
2017-09-01
The electroencephalography (EEG) data created in event-related potential (ERP) experiments have a complex high-dimensional structure. Each stimulus presentation, or trial, generates an ERP waveform which is an instance of functional data. The experiments are made up of sequences of multiple trials, resulting in longitudinal functional data and moreover, responses are recorded at multiple electrodes on the scalp, adding an electrode dimension. Traditional EEG analyses involve multiple simplifications of this structure to increase the signal-to-noise ratio, effectively collapsing the functional and longitudinal components by identifying key features of the ERPs and averaging them across trials. Motivated by an implicit learning paradigm used in autism research in which the functional, longitudinal, and electrode components all have critical interpretations, we propose a multidimensional functional principal components analysis (MD-FPCA) technique which does not collapse any of the dimensions of the ERP data. The proposed decomposition is based on separation of the total variation into subject and subunit level variation which are further decomposed in a two-stage functional principal components analysis. The proposed methodology is shown to be useful for modeling longitudinal trends in the ERP functions, leading to novel insights into the learning patterns of children with Autism Spectrum Disorder (ASD) and their typically developing peers as well as comparisons between the two groups. Finite sample properties of MD-FPCA are further studied via extensive simulations.
Szczesniak, Rhonda D; Li, Dan; Duan, Leo L; Altaye, Mekibib; Miodovnik, Menachem; Khoury, Jane C
2016-11-01
Objective To identify phenotypes of type 1 diabetes control and associations with maternal/neonatal characteristics based on blood pressure (BP), glucose, and insulin curves during gestation, using a novel functional data analysis approach that accounts for sparse longitudinal patterns of medical monitoring during pregnancy. Methods We performed a retrospective longitudinal cohort study of women with type 1 diabetes whose BP, glucose, and insulin requirements were monitored throughout gestation as part of a program-project grant. Scores from sparse functional principal component analysis (fPCA) were used to classify gestational profiles according to the degree of control for each monitored measure. Phenotypes created using fPCA were compared with respect to maternal and neonatal characteristics and outcome. Results Most of the gestational profile variation in the monitored measures was explained by the first principal component (82-94%). Profiles clustered into three subgroups of high, moderate, or low heterogeneity, relative to the overall mean response. Phenotypes were associated with baseline characteristics, longitudinal changes in glycohemoglobin A1 and weight, and with pregnancy-related outcomes. Conclusion Three distinct longitudinal patterns of glucose, insulin, and BP control were found. By identifying these phenotypes, interventions can be targeted for subgroups at highest risk for compromised outcome, to optimize diabetes management during pregnancy.
NASA Astrophysics Data System (ADS)
He, Shiyuan; Wang, Lifan; Huang, Jianhua Z.
2018-04-01
With growing data from ongoing and future supernova surveys, it is possible to empirically quantify the shapes of SNIa light curves in more detail, and to quantitatively relate the shape parameters with the intrinsic properties of SNIa. Building such relationships is critical in controlling systematic errors associated with supernova cosmology. Based on a collection of well-observed SNIa samples accumulated in the past years, we construct an empirical SNIa light curve model using a statistical method called the functional principal component analysis (FPCA) for sparse and irregularly sampled functional data. Using this method, the entire light curve of an SNIa is represented by a linear combination of principal component functions, and the SNIa is represented by a few numbers called “principal component scores.” These scores are used to establish relations between light curve shapes and physical quantities such as intrinsic color, interstellar dust reddening, spectral line strength, and spectral classes. These relations allow for descriptions of some critical physical quantities based purely on light curve shape parameters. Our study shows that some important spectral feature information is being encoded in the broad band light curves; for instance, we find that the light curve shapes are correlated with the velocity and velocity gradient of the Si II λ6355 line. This is important for supernova surveys (e.g., LSST and WFIRST). Moreover, the FPCA light curve model is used to construct the entire light curve shape, which in turn is used in a functional linear form to adjust intrinsic luminosity when fitting distance models.
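The "linear combination of principal component functions" referred to above is the standard truncated Karhunen-Loève expansion used in FPCA (generic notation, not taken from the paper):

    X_i(t) \approx \mu(t) + \sum_{k=1}^{K} \xi_{ik}\, \phi_k(t),
    \qquad
    \xi_{ik} = \int \bigl( X_i(t) - \mu(t) \bigr)\, \phi_k(t)\, dt,

where mu(t) is the mean light curve, the phi_k are the principal component functions, and the xi_ik are the principal component scores for supernova i that are then related to intrinsic color, reddening and spectral properties.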
Quantifying Individual Brain Connectivity with Functional Principal Component Analysis for Networks.
Petersen, Alexander; Zhao, Jianyang; Carmichael, Owen; Müller, Hans-Georg
2016-09-01
In typical functional connectivity studies, connections between voxels or regions in the brain are represented as edges in a network. Networks for different subjects are constructed at a given graph density and are summarized by some network measure such as path length. Examining these summary measures for many density values yields samples of connectivity curves, one for each individual. This has led to the adoption of basic tools of functional data analysis, most commonly to compare control and disease groups through the average curves in each group. Such group differences, however, neglect the variability in the sample of connectivity curves. In this article, the use of functional principal component analysis (FPCA) is demonstrated to enrich functional connectivity studies by providing increased power and flexibility for statistical inference. Specifically, individual connectivity curves are related to individual characteristics such as age and measures of cognitive function, thus providing a tool to relate brain connectivity with these variables at the individual level. This individual level analysis opens a new perspective that goes beyond previous group level comparisons. Using a large data set of resting-state functional magnetic resonance imaging scans, relationships between connectivity and two measures of cognitive function-episodic memory and executive function-were investigated. The group-based approach was implemented by dichotomizing the continuous cognitive variable and testing for group differences, resulting in no statistically significant findings. To demonstrate the new approach, FPCA was implemented, followed by linear regression models with cognitive scores as responses, identifying significant associations of connectivity in the right middle temporal region with both cognitive scores.
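A minimal sketch of the scores-then-regression step described above, using plain PCA on discretised curves and ordinary least squares; the synthetic connectivity curves and cognitive scores are placeholders, not the study data:

    import numpy as np

    rng = np.random.default_rng(3)
    n_subj, n_density = 100, 50
    curves = rng.standard_normal((n_subj, n_density)).cumsum(axis=1)   # connectivity vs. graph density
    memory = rng.standard_normal(n_subj)                               # stand-in cognitive score

    # FPC-like scores from a PCA of the discretised connectivity curves
    Xc = curves - curves.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:3].T                       # scores on the first three components

    # Linear regression: cognitive score ~ FPC scores
    design = np.column_stack([np.ones(n_subj), scores])
    beta, *_ = np.linalg.lstsq(design, memory, rcond=None)
    resid = memory - design @ beta
    print("coefficients:", beta, " residual SD:", resid.std(ddof=design.shape[1]))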
A robust functional-data-analysis method for data recovery in multichannel sensor systems.
Sun, Jian; Liao, Haitao; Upadhyaya, Belle R
2014-08-01
Multichannel sensor systems are widely used in condition monitoring for effective failure prevention of critical equipment or processes. However, loss of sensor readings due to malfunctions of sensors and/or communication has long been a hurdle to reliable operations of such integrated systems. Moreover, asynchronous data sampling and/or limited data transmission are usually seen in multiple sensor channels. To reliably perform fault diagnosis and prognosis in such operating environments, a data recovery method based on functional principal component analysis (FPCA) can be utilized. However, traditional FPCA methods are not robust to outliers and their capabilities are limited in recovering signals with strongly skewed distributions (i.e., lack of symmetry). This paper provides a robust data-recovery method based on functional data analysis to enhance the reliability of multichannel sensor systems. The method not only considers the possibly skewed distribution of each channel of signal trajectories, but is also capable of recovering missing data for both individual and correlated sensor channels with asynchronous data that may be sparse as well. In particular, grand median functions, rather than classical grand mean functions, are utilized for robust smoothing of sensor signals. Furthermore, the relationship between the functional scores of two correlated signals is modeled using multivariate functional regression to enhance the overall data-recovery capability. An experimental flow-control loop that mimics the operation of coolant-flow loop in a multimodular integral pressurized water reactor is used to demonstrate the effectiveness and adaptability of the proposed data-recovery method. The computational results illustrate that the proposed method is robust to outliers and more capable than the existing FPCA-based method in terms of the accuracy in recovering strongly skewed signals. In addition, turbofan engine data are also analyzed to verify the capability of the proposed method in recovering non-skewed signals.
Sutovska, Martina; Kocmalova, Michaela; Joskova, Marta; Adamkov, Marian; Franova, Sona
2015-04-01
Previously, the therapeutic potency of a CRAC channel blocker was evidenced by a significant decrease in airway smooth muscle hyperreactivity and by antitussive and anti-inflammatory effects. The major role of the respiratory epithelium in asthma pathogenesis was highlighted only recently, and CRAC channels were proposed as the most significant route of Ca2+ entry into epithelial cells. The aim of the study was to analyse the impact of a long-term administered CRAC channel blocker on the airway epithelium, e.g. cytokine production and ciliary beat frequency (CBF), using an animal model of allergic asthma. Ovalbumin-induced allergic airway inflammation of guinea pigs was followed by long-term (14-day) therapy with the CRAC blocker (3-fluoropyridine-4-carboxylic acid, FPCA). The influence of long-term therapy on cytokines (IL-4, IL-5 and IL-13) in BALF and in plasma, immunohistochemical staining of pulmonary tissue (c-Fos positivity) and CBF in vitro were used for analysis. A decrease in cytokine levels and in c-Fos positivity confirmed an anti-inflammatory effect of long-term administered FPCA. Cytokine levels in BALF and the distribution of c-Fos positivity suggested that FPCA was a more potent inhibitor of respiratory epithelium secretory functions than budesonide. FPCA and budesonide reduced CBF only insignificantly. All findings supported CRAC channels as a promising target in the new strategy of antiasthmatic treatment.
Kendrick, Sarah K; Zheng, Qi; Garbett, Nichola C; Brock, Guy N
2017-01-01
Differential scanning calorimetry (DSC) is used to determine thermally-induced conformational changes of biomolecules within a blood plasma sample. Recent research has indicated that DSC curves (or thermograms) may have different characteristics based on disease status and, thus, may be useful as a monitoring and diagnostic tool for some diseases. Since thermograms are curves measured over a range of temperature values, they are considered functional data. In this paper we apply functional data analysis techniques to analyze DSC data from individuals in the Lupus Family Registry and Repository (LFRR). The aim was to assess the effect of lupus disease status as well as additional covariates on the thermogram profiles, and use FDA methods to create models for classifying lupus vs. control patients on the basis of the thermogram curves. Thermograms were collected for 300 lupus patients and 300 controls without lupus who were matched with diseased individuals based on sex, race, and age. First, functional regression with a functional response (DSC) and categorical predictor (disease status) was used to determine how thermogram curve structure varied according to disease status and other covariates including sex, race, and year of birth. Next, functional logistic regression with disease status as the response and functional principal component analysis (FPCA) scores as the predictors was used to model the effect of thermogram structure on disease status prediction. The prediction accuracy for patients with osteoarthritis (OA) and rheumatoid arthritis (RA) but without lupus was also calculated to determine the ability of the classifier to differentiate between lupus and other diseases. Data were divided 1000 times into separate 2/3 training and 1/3 test data for evaluation of predictions. Finally, derivatives of thermogram curves were included in the models to determine whether they aided in prediction of disease status. Functional regression with thermogram as a functional response and disease status as predictor showed a clear separation in thermogram curve structure between cases and controls. The logistic regression model with FPCA scores as the predictors gave the most accurate results, with a mean correct classification rate of 79.22%, a mean sensitivity of 79.70%, and a mean specificity of 81.48%. The model correctly classified OA and RA patients without lupus as controls at a rate of 75.92% on average, with a mean sensitivity of 79.70% and specificity of 77.6%. Regression models including FPCA scores for derivative curves did not perform as well, nor did regression models including covariates. Changes in thermograms observed in the disease state likely reflect covalent modifications of plasma proteins or changes in large protein-protein interacting networks resulting in the stabilization of plasma proteins towards thermal denaturation. By relating functional principal components from thermograms to disease status, our FPCA model provides results that are more easily interpretable compared to prior studies. Further, the model could also potentially be coupled with other biomarkers to improve diagnostic classification for lupus.
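A small sketch of this kind of classification pipeline (principal component scores as predictors in logistic regression, evaluated over repeated 2/3-1/3 splits), using scikit-learn on synthetic stand-in thermograms; it is not the authors' code, and the random labels will of course give only chance-level accuracy:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)
    n, n_temp = 600, 120
    thermograms = rng.standard_normal((n, n_temp)).cumsum(axis=1)  # stand-in DSC curves
    status = rng.integers(0, 2, size=n)                            # 0 = control, 1 = case (synthetic)

    pca = PCA(n_components=5)
    scores = pca.fit_transform(thermograms)    # FPC-like scores used as predictors

    accs = []
    for seed in range(100):                    # repeated 2/3 training / 1/3 test splits
        Xtr, Xte, ytr, yte = train_test_split(scores, status, test_size=1/3, random_state=seed)
        clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
        accs.append(clf.score(Xte, yte))
    print("mean correct classification rate:", np.mean(accs))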
Ising model of cardiac thin filament activation with nearest-neighbor cooperative interactions
NASA Technical Reports Server (NTRS)
Rice, John Jeremy; Stolovitzky, Gustavo; Tu, Yuhai; de Tombe, Pieter P.; Bers, D. M. (Principal Investigator)
2003-01-01
We have developed a model of cardiac thin filament activation using an Ising model approach from equilibrium statistical physics. This model explicitly represents nearest-neighbor interactions between 26 troponin/tropomyosin units along a one-dimensional array that represents the cardiac thin filament. With transition rates chosen to match experimental data, the results show that the resulting force-pCa (F-pCa) relations are similar to Hill functions with asymmetries, as seen in experimental data. Specifically, Hill plots showing (log(F/(1-F)) vs. log [Ca]) reveal a steeper slope below the half activation point (Ca(50)) compared with above. Parameter variation studies show interplay of parameters that affect the apparent cooperativity and asymmetry in the F-pCa relations. The model also predicts that Ca binding is uncooperative for low [Ca], becomes steeper near Ca(50), and becomes uncooperative again at higher [Ca]. The steepness near Ca(50) mirrors the steep F-pCa as a result of thermodynamic considerations. The model also predicts that the correlation between troponin/tropomyosin units along the one-dimensional array quickly decays at high and low [Ca], but near Ca(50), high correlation occurs across the whole array. This work provides a simple model that can account for the steepness and shape of F-pCa relations that other models fail to reproduce.
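The Hill plots referred to above rest on the standard Hill relation (general background, not a formula quoted from the paper):

    F = \frac{[\mathrm{Ca}]^{\,n}}{[\mathrm{Ca}]^{\,n} + \mathrm{Ca}_{50}^{\,n}}
    \qquad\Longrightarrow\qquad
    \log\frac{F}{1-F} = n\,\bigl(\log[\mathrm{Ca}] - \log \mathrm{Ca}_{50}\bigr),

so the local slope of the Hill plot is the apparent Hill coefficient n, and the asymmetry described above corresponds to a larger apparent n below Ca(50) than above it.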
Interpretable functional principal component analysis.
Lin, Zhenhua; Wang, Liangliang; Cao, Jiguo
2016-09-01
Functional principal component analysis (FPCA) is a popular approach to explore major sources of variation in a sample of random curves. These major sources of variation are represented by functional principal components (FPCs). The intervals where the values of FPCs are significant are interpreted as where sample curves have major variations. However, these intervals are often hard for naïve users to identify, because of the vague definition of "significant values". In this article, we develop a novel penalty-based method to derive FPCs that are only nonzero precisely in the intervals where the values of FPCs are significant, whence the derived FPCs possess better interpretability than the FPCs derived from existing methods. To compute the proposed FPCs, we devise an efficient algorithm based on projection deflation techniques. We show that the proposed interpretable FPCs are strongly consistent and asymptotically normal under mild conditions. Simulation studies confirm that with a competitive performance in explaining variations of sample curves, the proposed FPCs are more interpretable than the traditional counterparts. This advantage is demonstrated by analyzing two real datasets, namely, electroencephalography data and Canadian weather data.
Lauer, Richard T.; Keshner, Emily A.
2011-01-01
The effect of continuous visual flow on the ability to regain and maintain postural orientation was examined. Fourteen young (20–39 years old) and 14 older women (60–79 years old) stood quietly during 3° (30°/s) dorsiflexion tilt of the support surface combined with 30° and 45°/s upward or downward pitch rotations of the visual field. The support surface was held tilted for 30 s and then returned to neutral over a 30-s period while the visual field continued to rotate. Segmental displacement and bilateral tibialis anterior and gastrocnemius muscle EMG responses were recorded. Continuous wavelet transforms were calculated for each muscle EMG response. An instantaneous mean frequency curve (IMNF) of muscle activity, center of mass (COM), center of pressure (COP), and angular excursion at the hip and ankle were used in a functional principal component analysis (fPCA). Functional component weights were calculated and compared with mixed model repeated measures ANOVAs. The fPCA revealed greatest mathematical differences in COM and COP responses between groups or conditions during the period that the platform transitioned from the sustained tilt to a return to neutral position. Muscle EMG responses differed most in the period following support surface tilt indicating that muscle activity increased to support stabilization against the visual flow. Older women exhibited significantly larger COM and COP responses in the direction of visual field motion and less muscle modulation when the platform returned to neutral than younger women. Results on a Rod and Frame test indicated that older women were significantly more visually dependent than the younger women. We concluded that a stiffer body combined with heightened visual sensitivity in older women critically interferes with their ability to counteract posturally destabilizing environments. PMID:21479659
Salvatore, Stefania; Røislien, Jo; Baz-Lomba, Jose A; Bramness, Jørgen G
2017-03-01
Wastewater-based epidemiology is an alternative method for estimating the collective drug use in a community. We applied functional data analysis, a statistical framework developed for analysing curve data, to investigate weekly temporal patterns in wastewater measurements of three prescription drugs with known abuse potential: methadone, oxazepam and methylphenidate, comparing them to positive and negative control drugs. Sewage samples were collected in February 2014 from a wastewater treatment plant in Oslo, Norway. The weekly pattern of each drug was extracted by fitting generalized additive models, using trigonometric functions to model the cyclic behaviour. From the weekly component, the main temporal features were then extracted using functional principal component analysis. Results are presented through the functional principal components (FPCs) and corresponding FPC scores. Clinically, the most important weekly feature of the wastewater-based epidemiology data was the second FPC, representing the difference between the average midweek level and a peak during the weekend, and hence possible recreational use of a drug at the weekend. Estimated scores on this FPC indicated recreational use of methylphenidate, with a high weekend peak, but not of methadone or oxazepam. The functional principal component analysis uncovered clinically important temporal features of the weekly patterns of the use of prescription drugs detected from wastewater analysis. This may be used as a post-marketing surveillance method to monitor prescription drugs with abuse potential.
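A minimal sketch of modelling a weekly cycle with trigonometric terms, here as a plain harmonic regression rather than the full generalized additive model used in the study; the synthetic data, harmonic count and Monday-start weekday indexing are assumptions:

    import numpy as np

    rng = np.random.default_rng(5)
    days = np.arange(28)                               # four weeks of daily measurements
    signal = 10 + 3 * np.sin(2 * np.pi * (days % 7) / 7) + rng.normal(0, 0.5, size=days.size)

    # Design matrix with the first two weekly harmonics (period = 7 days)
    period = 7.0
    X = np.column_stack([np.ones_like(days, dtype=float)] +
                        [f(2 * np.pi * k * days / period) for k in (1, 2) for f in (np.sin, np.cos)])

    beta, *_ = np.linalg.lstsq(X, signal, rcond=None)
    weekly = X[:7, 1:] @ beta[1:]                      # fitted weekly component over one week
    # Assuming day 0 is a Monday: indices 5-6 are the weekend, 1-3 the midweek days
    weekend_minus_midweek = weekly[5:7].mean() - weekly[1:4].mean()
    print("estimated weekend-vs-midweek difference:", weekend_minus_midweek)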
Lin, Nan; Jiang, Junhai; Guo, Shicheng; Xiong, Momiao
2015-01-01
Due to advances in sensor technology, growing volumes of large medical image data make it possible to visualize anatomical changes in biological tissues. As a consequence, the medical images have the potential to enhance the diagnosis of disease, the prediction of clinical outcomes and the characterization of disease progression. At the same time, the growing data dimensions pose great methodological and computational challenges for the representation and selection of features in image cluster analysis. To address these challenges, we first extend functional principal component analysis (FPCA) from one dimension to two dimensions to fully capture the spatial variation of the image signals. The image signals contain a large number of redundant features which provide no additional information for clustering analysis. The widely used methods for removing the irrelevant features are sparse clustering algorithms using a lasso-type penalty to select the features. However, the accuracy of clustering using a lasso-type penalty depends on the selection of the penalty parameters and the threshold value. In practice, they are difficult to determine. Recently, randomized algorithms have received a great deal of attention in big data analysis. This paper presents a randomized algorithm for accurate feature selection in image clustering analysis. The proposed method is applied to both the liver and kidney cancer histology image data from the TCGA database. The results demonstrate that the randomized feature selection method coupled with functional principal component analysis substantially outperforms the current sparse clustering algorithms in image cluster analysis. PMID:26196383
Wan, Caichao; Li, Jian
2017-04-01
Eco-friendly cellulose-derived carbon aerogels (CDCA) were employed as a porous substrate to integrate with α-Fe2O3 and polypyrrole (PPy) via pyrolysis and vapor-phase polymerization. SEM and TEM observations show that the wrinkled PPy sheets and the α-Fe2O3 nanoparticles were well dispersed in CDCA. The strong interactions (such as hydrogen bonding) between the substrate and the nanomaterials were demonstrated by FTIR and XPS analysis. When utilized as electromagnetic interference (EMI) shielding materials, the α-Fe2O3/PPy/CDCA (FPCA) composite has the highest total shielding effectiveness (SEtotal) of 39.4 dB, about 2.0, 2.9, and 1.3 times that of the acid-treated CDCA (19.3 dB), PPy (13.6 dB), and α-Fe2O3/CDCA (29.3 dB), respectively. Moreover, the shielding effectiveness due to absorption accounts for 78.2%-84.2% of SEtotal for FPCA, indicative of the absorption-dominant shielding mechanism contributing to alleviating secondary radiation. These features make the composite a useful alternative candidate for EMI shielding.
The construction of control chart for PM10 functional data
NASA Astrophysics Data System (ADS)
Shaadan, Norshahida; Jemain, Abdul Aziz; Deni, Sayang Mohd
2014-06-01
In this paper, a statistical procedure to construct a control chart for monitoring air quality (PM10) using functional data is proposed. A set of daily indices that represent the daily PM10 curves was obtained using Functional Principal Component Analysis (FPCA). By means of an iterative charting procedure, a reference data set that represented a stable PM10 process was obtained. These data were then used as a reference for monitoring future data. The application of the procedure was conducted using a seven-year (2004-2010) period of recorded data from the Klang air quality monitoring station located in the Klang Valley region of Peninsular Malaysia. The study showed that the control chart provided a useful visualization tool for monitoring air quality and was capable of detecting abnormalities in the process. In the case of the Klang station, the results showed that, with reference to 2004-2008, the air quality (PM10) in 2010 was better than that in 2009.
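A hedged sketch of turning first-component daily scores into a Shewhart-style control chart with limits estimated from a reference period; the synthetic PM10 curves, reference window and 3-sigma limits are illustrative and do not reproduce the paper's iterative charting procedure:

    import numpy as np

    rng = np.random.default_rng(6)
    n_days, n_hours = 365, 24
    pm10 = rng.lognormal(mean=3.5, sigma=0.3, size=(n_days, n_hours))  # stand-in daily PM10 curves

    # Daily index: score on the first principal component of the daily curves
    Xc = pm10 - pm10.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    index = Xc @ Vt[0]

    # Control limits from a reference (in-control) period, applied to later days
    ref, future = index[:300], index[300:]
    centre, sigma = ref.mean(), ref.std(ddof=1)
    ucl, lcl = centre + 3 * sigma, centre - 3 * sigma
    out_of_control = np.where((future > ucl) | (future < lcl))[0] + 300
    print("days flagged:", out_of_control)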
Application of robust face recognition in video surveillance systems
NASA Astrophysics Data System (ADS)
Zhang, De-xin; An, Peng; Zhang, Hao-xiang
2018-03-01
In this paper, we propose a video searching system that uses face recognition as the search indexing feature. As the use of video cameras has increased greatly in recent years, face recognition is a natural fit for searching for targeted individuals within the vast amount of video data. However, the performance of such searches depends on the quality of face images recorded in the video signals. Since surveillance cameras record subjects without fixed poses, face occlusion is very common in everyday video. The proposed system builds a model for occluded faces using fuzzy principal component analysis (FPCA), and reconstructs the human faces with the available information. Experimental results show that the system processes real-life videos very efficiently and is robust to various kinds of face occlusions. Hence it can relieve human reviewers from continuously watching the monitors and greatly enhances efficiency as well. The proposed system has been installed and applied in various environments and has already demonstrated its power by helping to solve real cases.
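The reconstruction idea can be illustrated with a plain (non-fuzzy) PCA sketch: project a probe face onto the leading eigenfaces and rebuild it from the scores. The fuzzy weighting that makes FPCA robust to occluded pixels is omitted here, and the data and dimensions are synthetic:

    import numpy as np

    rng = np.random.default_rng(7)
    n_faces, n_pixels = 200, 32 * 32
    faces = rng.standard_normal((n_faces, n_pixels))        # stand-in training face vectors

    mean_face = faces.mean(axis=0)
    Xc = faces - mean_face
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    eigenfaces = Vt[:40]                                    # leading 40 eigenfaces

    # Reconstruct a (crudely) occluded probe face from its projection onto the eigenfaces;
    # a fuzzy/robust variant would down-weight the unreliable pixels instead of using them as-is
    probe = faces[0].copy()
    probe[:200] = 0.0                                       # zero out the first 200 pixels
    coeffs = eigenfaces @ (probe - mean_face)
    reconstruction = mean_face + eigenfaces.T @ coeffs
    print("reconstruction error:", np.linalg.norm(reconstruction - faces[0]))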
Code of Federal Regulations, 2013 CFR
2013-07-01
... education initiative, establish a means to inform absent uniformed services members of absentee voting... Post Card Application (FPCA)” or National Mail Voter Registration Form. (C) Ensure that voting... registration and absentee ballot procedures. This can be met by providing the applicant with the SF 76, SF 186...
Code of Federal Regulations, 2014 CFR
2014-07-01
... education initiative, establish a means to inform absent uniformed services members of absentee voting... Post Card Application (FPCA)” or National Mail Voter Registration Form. (C) Ensure that voting... registration and absentee ballot procedures. This can be met by providing the applicant with the SF 76, SF 186...
Salvatore, Stefania; Bramness, Jørgen Gustav; Reid, Malcolm J; Thomas, Kevin Victor; Harman, Christopher; Røislien, Jo
2015-01-01
Wastewater-based epidemiology (WBE) is a new methodology for estimating the drug load in a population. Simple summary statistics and specification tests have typically been used to analyze WBE data, comparing differences between weekday and weekend loads. Such standard statistical methods may, however, overlook important nuanced information in the data. In this study, we apply functional data analysis (FDA) to WBE data and compare the results to those obtained from more traditional summary measures. We analysed temporal WBE data from 42 European cities, using sewage samples collected daily for one week in March 2013. For each city, the main temporal features of two selected drugs were extracted using functional principal component (FPC) analysis, along with simpler measures such as the area under the curve (AUC). The individual cities' scores on each of the temporal FPCs were then used as outcome variables in multiple linear regression analysis with various city and country characteristics as predictors. The results were compared to those of functional analysis of variance (FANOVA). The first three FPCs explained more than 99% of the temporal variation. The first component (FPC1) represented the level of the drug load, while the second and third temporal components represented the level and the timing of a weekend peak. AUC was highly correlated with FPC1, but other temporal characteristics were not captured by the simple summary measures. FANOVA was less flexible than the FPCA-based regression, even though it showed concordant results. Geographical location was the main predictor for the general level of the drug load. FDA of WBE data extracts more detailed information about drug load patterns during the week which are not identified by more traditional statistical methods. Results also suggest that regression based on FPC results is a valuable addition to FANOVA for estimating associations between temporal patterns and covariate information.
NASA Astrophysics Data System (ADS)
Curceac, S.; Ternynck, C.; Ouarda, T.
2015-12-01
Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting based on the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) information criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
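The functional nonparametric kernel regression estimator mentioned above has the standard Nadaraya-Watson form (generic notation; the semi-metric d and bandwidth h correspond to the choices discussed in the abstract):

    \hat{r}(\chi) = \frac{\sum_{i=1}^{n} Y_i \, K\!\bigl( d(\chi, \mathcal{X}_i)/h \bigr)}{\sum_{i=1}^{n} K\!\bigl( d(\chi, \mathcal{X}_i)/h \bigr)},

where the X_i are the observed daily curves, the Y_i the associated responses, K an (asymmetric) kernel, d a semi-metric based on derivatives or FPCA, and h a bandwidth chosen by cross-validation.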
Applications of functional data analysis: A systematic review.
Ullah, Shahid; Finch, Caroline F
2013-03-19
Functional data analysis (FDA) is increasingly being used to better analyze, model and predict time series data. Key aspects of FDA include the choice of smoothing technique, data reduction, adjustment for clustering, functional linear modeling and forecasting methods. A systematic review using 11 electronic databases was conducted to identify FDA application studies published in the peer-review literature during 1995-2010. Papers reporting methodological considerations only were excluded, as were non-English articles. In total, 84 FDA application articles were identified; 75.0% of the reviewed articles have been published since 2005. Application of FDA has appeared in a large number of publications across various fields of sciences; the majority is related to biomedicine applications (21.4%). Overall, 72 studies (85.7%) provided information about the type of smoothing techniques used, with B-spline smoothing (29.8%) being the most popular. Functional principal component analysis (FPCA) for extracting information from functional data was reported in 51 (60.7%) studies. One-quarter (25.0%) of the published studies used functional linear models to describe relationships between explanatory and outcome variables and only 8.3% used FDA for forecasting time series data. Despite its clear benefits for analyzing time series data, full appreciation of the key features and value of FDA have been limited to date, though the applications show its relevance to many public health and biomedical problems. Wider application of FDA to all studies involving correlated measurements should allow better modeling of, and predictions from, such data in the future especially as FDA makes no a priori age and time effects assumptions.
Non-preconditioned conjugate gradient on cell and FPCA-based hybrid supercomputer nodes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dubois, David H; Dubois, Andrew J; Boorman, Thomas M
2009-03-10
This work presents a detailed implementation of a double precision, Non-Preconditioned, Conjugate Gradient algorithm on a Roadrunner heterogeneous supercomputer node. These nodes utilize the Cell Broadband Engine Architecture™ in conjunction with x86 Opteron™ processors from AMD. We implement a common Conjugate Gradient algorithm, on a variety of systems, to compare and contrast performance. Implementation results are presented for the Roadrunner hybrid supercomputer, SRC Computers, Inc. MAPStation SRC-6 FPGA-enhanced hybrid supercomputer, and AMD Opteron only. In all hybrid implementations wall clock time is measured, including all transfer overhead and compute timings.
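For reference, the non-preconditioned conjugate gradient iteration itself can be written in a few lines of numpy (a generic textbook version, not the Cell/Opteron implementation described above):

    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
        """Solve A x = b for symmetric positive-definite A by non-preconditioned CG."""
        n = b.size
        max_iter = max_iter or n
        x = np.zeros(n)
        r = b - A @ x
        p = r.copy()
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p
            rs_old = rs_new
        return x

    # Small symmetric positive-definite test system
    rng = np.random.default_rng(8)
    M = rng.standard_normal((100, 100))
    A = M @ M.T + 100 * np.eye(100)
    b = rng.standard_normal(100)
    x = conjugate_gradient(A, b)
    print("residual norm:", np.linalg.norm(A @ x - b))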
An Integrated Approach to Parameter Learning in Infinite-Dimensional Space
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyd, Zachary M.; Wendelberger, Joanne Roth
The availability of sophisticated modern physics codes has greatly extended the ability of domain scientists to understand the processes underlying their observations of complicated processes, but it has also introduced the curse of dimensionality via the many user-set parameters available to tune. Many of these parameters are naturally expressed as functional data, such as initial temperature distributions, equations of state, and controls. Thus, when attempting to find parameters that match observed data, being able to navigate parameter-space becomes highly non-trivial, especially considering that accurate simulations can be expensive both in terms of time and money. Existing solutions include batch-parallel simulations; high-dimensional, derivative-free optimization; and expert guessing, all of which make some contribution to solving the problem but do not completely resolve the issue. In this work, we explore the possibility of coupling together all three of the techniques just described by designing user-guided, batch-parallel optimization schemes. Our motivating example is a neutron diffusion partial differential equation where the time-varying multiplication factor serves as the unknown control parameter to be learned. We find that a simple, batch-parallelizable, random-walk scheme is able to make some progress on the problem but does not by itself produce satisfactory results. After reducing the dimensionality of the problem using functional principal component analysis (fPCA), we are able to track the progress of the solver in a visually simple way as well as view the associated principal components. This allows a human to make reasonable guesses about which points in the state space the random walker should try next. Thus, by combining the random walker's ability to find descent directions with the human's understanding of the underlying physics, it is possible to use expensive simulations more efficiently and more quickly arrive at the desired parameter set.
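A toy version of a batch-parallelizable, greedy random-walk search over a low-dimensional (e.g. fPCA-score) parameter space; the objective function, step size and batch size are stand-ins for the expensive physics simulations and omit the user-guided component described above:

    import numpy as np

    def objective(z):
        """Stand-in misfit between simulated and observed data for reduced parameters z.
        In a real setting this would run the physics code on the control curve
        rebuilt from the fPCA components."""
        return np.sum((z - np.array([1.0, -0.5, 0.25]))**2)

    rng = np.random.default_rng(11)
    z = np.zeros(3)                     # current point in the reduced (fPCA-score) space
    best = objective(z)
    for _ in range(200):
        candidates = z + 0.1 * rng.standard_normal((8, 3))   # a "batch" of random-walk proposals
        values = np.array([objective(c) for c in candidates])
        if values.min() < best:                              # keep the best improving proposal
            best, z = values.min(), candidates[np.argmin(values)]
    print("best parameters found:", z, "objective:", best)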
NASA Technical Reports Server (NTRS)
Schmeckpeper, K. R.
1987-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode. Of the 1671 failure modes analyzed, 9 single failures were determined to result in loss of crew or vehicle. Three single failures unique to intact abort were determined to result in possible loss of the crew or vehicle. A possible loss of mission could result if any of 136 single failures occurred. Six of the criticality 1/1 failures are in two rotary and two pushbutton switches that control External Tank and Solid Rocket Booster separation. The other 6 criticality 1/1 failures are fuses, one each per Aft Power Control Assembly (APCA) 4, 5, and 6 and one each per Forward Power Control Assembly (FPCA) 1, 2, and 3, that supply power to certain Main Propulsion System (MPS) valves and Forward Reaction Control System (RCS) circuits.
Inventories of GRIB2 analysis files (truncated field listings of U-/V-component of wind, absolute vorticity and geopotential height analyses at various levels): gfs.t06z.pgrb2.0p25.anl, gfs.t06z.pgrb2.0p50.anl, sref_em.t03z.pgrb212.ctl.grib2, sref_nmm.t03z.pgrb132.ctl.grib2, sref_nmm.t03z.pgrb221.ctl.grib2, sref_em.t03z.pgrb132.ctl.grib2, sref_nmm.t03z.pgrb243.ctl.grib2, sref_em.t03z.pgrb243.ctl.grib2, sref_em.t03z.pgrb221.ctl.grib2, sref_nmm.t03z.pgrb212.ctl.grib2, sref_nmm.t03z.pgrb216.ctl.grib2, sref_em.t03z.pgrb216.ctl.grib2
Using Structural Equation Modeling To Fit Models Incorporating Principal Components.
ERIC Educational Resources Information Center
Dolan, Conor; Bechger, Timo; Molenaar, Peter
1999-01-01
Considers models incorporating principal components from the perspectives of structural-equation modeling. These models include the following: (1) the principal-component analysis of patterned matrices; (2) multiple analysis of variance based on principal components; and (3) multigroup principal-components analysis. Discusses fitting these models…
Principal component regression analysis with SPSS.
Liu, R X; Kuang, J; Gong, Q; Hou, X L
2003-06-01
The paper introduces the indices used for multicollinearity diagnosis, the basic principle of principal component regression, and the method for determining the 'best' equation. An example is used to describe how to perform principal component regression analysis with SPSS 10.0, covering all calculation steps of the principal component regression and the operation of the linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. Principal component regression can be used to overcome the disturbance caused by multicollinearity, and carrying it out with SPSS makes the analysis simpler, faster and accurate.
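Independent of SPSS, the same computation can be sketched with numpy on synthetic collinear data (the number of retained components and the data are illustrative only):

    import numpy as np

    rng = np.random.default_rng(9)
    n = 200
    x1 = rng.standard_normal(n)
    x2 = x1 + rng.normal(0, 0.05, n)            # nearly collinear with x1
    x3 = rng.standard_normal(n)
    X = np.column_stack([x1, x2, x3])
    y = 2 * x1 + 0.5 * x3 + rng.normal(0, 0.3, n)

    # Principal components of the standardised predictors
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
    order = np.argsort(eigvals)[::-1]
    eigvecs = eigvecs[:, order]

    # Regress y on the first two components, then map back to the original predictors
    T = Z @ eigvecs[:, :2]
    gamma, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), T]), y, rcond=None)
    beta_std = eigvecs[:, :2] @ gamma[1:]       # coefficients on the standardised scale
    print("PCR coefficients (standardised predictors):", beta_std)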
Inventories of SREF ensemble GRIB2 files (truncated field listings of U-/V-component of wind ensemble means and spreads at 10 m above ground and at various pressure levels): sref.t03z.pgrb216.mean_3hrly.grib2, sref.t03z.pgrb212.spread_3hrly.grib2, sref.t03z.pgrb243.mean_3hrly.grib2, sref.t03z.pgrb216.spread_3hrly.grib2, sref.t03z.pgrb243.spread_3hrly.grib2, sref.t03z.pgrb212.mean_3hrly.grib2, sref.t03z.pgrb132.spread_3hrly.grib2, sref.t03z.pgrb132.mean_3hrly.grib2
How Many Separable Sources? Model Selection In Independent Components Analysis
Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen
2015-01-01
Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988
Component Analyses Using Single-Subject Experimental Designs: A Review
ERIC Educational Resources Information Center
Ward-Horner, John; Sturmey, Peter
2010-01-01
A component analysis is a systematic assessment of 2 or more independent variables or components that comprise a treatment package. Component analyses are important for the analysis of behavior; however, previous research provides only cursory descriptions of the topic. Therefore, in this review the definition of "component analysis" is discussed,…
Analysis of truss, beam, frame, and membrane components. [composite structures
NASA Technical Reports Server (NTRS)
Knoell, A. C.; Robinson, E. Y.
1975-01-01
Truss components are considered, taking into account composite truss structures, truss analysis, column members, and truss joints. Beam components are discussed, giving attention to composite beams, laminated beams, and sandwich beams. Composite frame components and composite membrane components are examined. A description is given of examples of flat membrane components and examples of curved membrane elements. It is pointed out that composite structural design and analysis is a highly interactive, iterative procedure which does not lend itself readily to characterization by design or analysis function only.
Wavelet decomposition based principal component analysis for face recognition using MATLAB
NASA Astrophysics Data System (ADS)
Sharma, Mahesh Kumar; Sharma, Shashikant; Leeprechanon, Nopbhorn; Ranjan, Aashish
2016-03-01
For the realization of face recognition systems, both static and real-time, algorithms such as principal component analysis, independent component analysis, linear discriminant analysis, neural networks and genetic algorithms have been used for decades. This paper discusses a wavelet decomposition based principal component analysis approach to face recognition. Principal component analysis is chosen over other algorithms due to its relative simplicity, efficiency, and robustness. Face recognition refers to identifying a person from facial features, and it bears some resemblance to factor analysis, i.e., the extraction of the principal components of an image. Principal component analysis is subject to some drawbacks, mainly poor discriminatory power and the large computational load of finding eigenvectors. These drawbacks can be greatly reduced by combining wavelet transform decomposition for feature extraction with principal component analysis for pattern representation and classification, analyzing the facial images in both the space and frequency domains. The experimental results suggest that this face recognition method yields a significant improvement in recognition rate as well as better computational efficiency.
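A minimal sketch of the wavelet-plus-PCA pipeline described above, assuming PyWavelets and scikit-learn are available; the face array, labels, and parameter choices (Haar wavelet, two decomposition levels, 20 components) are placeholders, not values from the paper:

    import numpy as np
    import pywt
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier

    def dwt_features(img, wavelet="haar", level=2):
        """Return the flattened low-frequency (approximation) sub-band of a 2-D DWT."""
        coeffs = pywt.wavedec2(img, wavelet, level=level)
        return coeffs[0].ravel()

    # Hypothetical data: 40 grayscale face images (64x64), 10 subjects x 4 images
    faces = np.random.rand(40, 64, 64)
    labels = np.repeat(np.arange(10), 4)

    X = np.array([dwt_features(f) for f in faces])   # wavelet feature extraction
    pca = PCA(n_components=20, whiten=True).fit(X)   # eigenface-style projection
    Z = pca.transform(X)

    clf = KNeighborsClassifier(n_neighbors=1).fit(Z, labels)
    print("training accuracy:", clf.score(Z, labels))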
Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars
2016-04-12
A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
NASA Astrophysics Data System (ADS)
Nagai, Toshiki; Mitsutake, Ayori; Takano, Hiroshi
2013-02-01
A new relaxation mode analysis method, which is referred to as the principal component relaxation mode analysis method, has been proposed to handle a large number of degrees of freedom of protein systems. In this method, principal component analysis is carried out first and then relaxation mode analysis is applied to a small number of principal components with large fluctuations. To reduce the contribution of fast relaxation modes in these principal components efficiently, we have also proposed a relaxation mode analysis method using multiple evolution times. The principal component relaxation mode analysis method using two evolution times has been applied to an all-atom molecular dynamics simulation of human lysozyme in aqueous solution. Slow relaxation modes and corresponding relaxation times have been appropriately estimated, demonstrating that the method is applicable to protein systems.
ERIC Educational Resources Information Center
Grochowalski, Joseph H.
2015-01-01
Component Universe Score Profile analysis (CUSP) is introduced in this paper as a psychometric alternative to multivariate profile analysis. The theoretical foundations of CUSP analysis are reviewed, which include multivariate generalizability theory and constrained principal components analysis. Because CUSP is a combination of generalizability…
Key components of financial-analysis education for clinical nurses.
Lim, Ji Young; Noh, Wonjung
2015-09-01
In this study, we identified key components of financial-analysis education for clinical nurses. We used a literature review, focus group discussions, and a content validity index survey to develop key components of financial-analysis education. First, a wide range of references were reviewed, and 55 financial-analysis education components were gathered. Second, two focus group discussions were performed; the participants were 11 nurses who had worked for more than 3 years in a hospital, and nine components were agreed upon. Third, 12 professionals, including professors, a nurse executive, nurse managers, and an accountant, participated in the content validity index survey. Finally, six key components of financial-analysis education were selected. These key components were as follows: understanding the need for financial analysis, introduction to financial analysis, reading and implementing balance sheets, reading and implementing income statements, understanding the concepts of financial ratios, and interpretation and practice of financial ratio analysis. The results of this study will be used to develop an education program to increase financial-management competency among clinical nurses. © 2015 Wiley Publishing Asia Pty Ltd.
Model reduction by weighted Component Cost Analysis
NASA Technical Reports Server (NTRS)
Kim, Jae H.; Skelton, Robert E.
1990-01-01
Component Cost Analysis considers any given system driven by a white noise process as an interconnection of different components, and assigns a metric called 'component cost' to each component. These component costs measure the contribution of each component to a predefined quadratic cost function. A reduced-order model of the given system may be obtained by deleting those components that have the smallest component costs. The theory of Component Cost Analysis is extended to include finite-bandwidth colored noises. The results also apply when actuators have dynamics of their own. Closed-form analytical expressions of component costs are also derived for a mechanical system described by its modal data. This is very useful for computing the modal costs of very high-order systems. A numerical example for the MINIMAST system is presented.
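A rough numerical sketch of the basic white-noise component-cost idea for a stable linear state-space model, using NumPy and SciPy; the system matrices, output weighting, and the per-state cost definition used here are illustrative assumptions rather than the paper's exact formulation:

    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    # Hypothetical stable system  x' = A x + B w,  y = C x,  w ~ white noise
    A = np.array([[-1.0, 0.5, 0.0],
                  [ 0.0,-2.0, 0.3],
                  [ 0.0, 0.0,-3.0]])
    B = np.array([[1.0], [0.5], [0.2]])
    C = np.eye(3)
    Q = np.eye(3)                       # output weighting in V = E[y' Q y]

    # Steady-state state covariance X solves  A X + X A' + B B' = 0
    X = solve_continuous_lyapunov(A, -B @ B.T)

    # Per-state "component costs": diagonal contributions to the total cost
    Qbar = C.T @ Q @ C
    component_costs = np.diag(X @ Qbar)
    total_cost = np.trace(X @ Qbar)

    print("component costs:", component_costs)
    print("candidates for deletion (smallest cost first):", np.argsort(component_costs))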
ERIC Educational Resources Information Center
Adachi, Kohei
2009-01-01
In component analysis solutions, post-multiplying a component score matrix by a nonsingular matrix can be compensated by applying its inverse to the corresponding loading matrix. To eliminate this indeterminacy on nonsingular transformation, we propose Joint Procrustes Analysis (JPA) in which component score and loading matrices are simultaneously…
The Relation between Factor Score Estimates, Image Scores, and Principal Component Scores
ERIC Educational Resources Information Center
Velicer, Wayne F.
1976-01-01
Investigates the relation between factor score estimates, principal component scores, and image scores. The three methods compared are maximum likelihood factor analysis, principal component analysis, and a variant of rescaled image analysis. (RC)
Computing Lives And Reliabilities Of Turboprop Transmissions
NASA Technical Reports Server (NTRS)
Coy, J. J.; Savage, M.; Radil, K. C.; Lewicki, D. G.
1991-01-01
Computer program PSHFT calculates lifetimes of a variety of aircraft transmissions. Consists of main program, series of subroutines applying to specific configurations, generic subroutines for analysis of properties of components, subroutines for analysis of system, and common block. Main program selects routines used in analysis and causes them to operate in desired sequence. Series of configuration-specific subroutines put in configuration data, perform force and life analyses for components (with help of generic component-property-analysis subroutines), fill property array, call up system-analysis routines, and finally print out results of analysis for system and components. Written in FORTRAN 77(IV).
Componential distribution analysis of food using near infrared ray image
NASA Astrophysics Data System (ADS)
Yamauchi, Hiroki; Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko; Ohba, Kimie
2008-11-01
The components of food related to "deliciousness" are usually evaluated by componential analysis. The component content and the types of components in the food are determined by this analysis. However, componential analysis is not able to analyze measurements in detail, and the measurement is time consuming. We propose a method to measure the two-dimensional distribution of a component in food using near infrared (IR) images. The advantage of our method is that it can visualize components that are otherwise invisible. Many components in food have characteristics such as absorption and reflection of light in the IR range. The component content is measured using the subtraction of images taken at two near-IR wavelengths. In this paper, we describe a method to measure food components using near-IR image processing, and we show an application that visualizes saccharose in a pumpkin.
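A minimal sketch of the two-wavelength subtraction idea, with synthetic arrays standing in for co-registered near-IR images (NumPy assumed; the wavelengths, sign convention, and image sizes are placeholders):

    import numpy as np

    # Two co-registered near-IR images of the same food sample: one at an
    # absorption band of the target component, one at a nearby reference band.
    img_absorption = np.random.rand(256, 256)   # e.g. band where saccharose absorbs
    img_reference  = np.random.rand(256, 256)   # nearby band with little absorption

    # Simple two-wavelength difference map: larger values suggest more of the
    # absorbing component at that pixel (sign depends on the optical setup).
    component_map = img_reference - img_absorption

    # Normalise to [0, 1] for display as a distribution image
    component_map -= component_map.min()
    component_map /= np.ptp(component_map)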
Jesse, Stephen; Kalinin, Sergei V
2009-02-25
An approach for the analysis of multi-dimensional, spectroscopic-imaging data based on principal component analysis (PCA) is explored. PCA selects and ranks relevant response components based on variance within the data. It is shown that for examples with small relative variations between spectra, the first few PCA components closely coincide with results obtained using model fitting, and this is achieved at rates approximately four orders of magnitude faster. For cases with strong response variations, PCA allows an effective approach to rapidly process, de-noise, and compress data. The prospects for PCA combined with correlation function analysis of component maps as a universal tool for data analysis and representation in microscopy are discussed.
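A small sketch of the PCA-based processing of a spectroscopic-imaging cube described above, using scikit-learn; the cube dimensions, number of retained components, and random data are assumptions for illustration:

    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical spectroscopic-imaging cube: 64 x 64 pixels, 500-point spectra
    cube = np.random.rand(64, 64, 500)
    X = cube.reshape(-1, 500)                  # one spectrum per row

    pca = PCA(n_components=5)                  # keep the few highest-variance components
    scores = pca.fit_transform(X)              # component scores per pixel
    denoised = pca.inverse_transform(scores)   # low-rank, de-noised, compressed reconstruction

    maps = scores.reshape(64, 64, -1)          # spatial map of each PCA component
    print("explained variance ratios:", pca.explained_variance_ratio_)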
Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.
Saccenti, Edoardo; Timmerman, Marieke E
2017-03-01
Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
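A minimal NumPy sketch of Horn's parallel analysis as discussed above (correlation-matrix version, 95th-percentile criterion); the data, number of simulations, and quantile are illustrative choices, not values from the paper:

    import numpy as np

    def parallel_analysis(X, n_sim=200, quantile=0.95, seed=0):
        """Horn's parallel analysis for PCA of a correlation matrix.

        Retain components whose sample eigenvalues exceed the chosen quantile of
        eigenvalues obtained from random normal data of the same size.
        """
        rng = np.random.default_rng(seed)
        n, p = X.shape
        obs_eig = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]

        sim_eig = np.empty((n_sim, p))
        for s in range(n_sim):
            R = rng.standard_normal((n, p))
            sim_eig[s] = np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]

        threshold = np.quantile(sim_eig, quantile, axis=0)
        return int(np.sum(obs_eig > threshold)), obs_eig, threshold

    # Example with synthetic data: 300 observations, 10 variables
    X = np.random.default_rng(1).standard_normal((300, 10))
    k, obs, thr = parallel_analysis(X)
    print("components retained:", k)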
Exergo-Economic Analysis of an Experimental Aircraft Turboprop Engine Under Low Torque Condition
NASA Astrophysics Data System (ADS)
Atilgan, Ramazan; Turan, Onder; Aydin, Hakan
Exergo-economic analysis is a unique combination of exergy analysis and cost analysis conducted at the component level. In exergo-economic analysis, the cost of each exergy stream is determined: the inlet and outlet exergy streams of each component are associated with a monetary cost. This is essential to detect cost-ineffective processes and to identify technical options that could improve the cost effectiveness of the overall energy system. In this study, exergo-economic analysis is applied to an aircraft turboprop engine. The analysis is based on experimental values at a low torque condition (240 N m). The main components of the investigated turboprop engine are the compressor, the combustor, the gas generator turbine, the free power turbine and the exhaust. Cost balance equations have been formed for each component individually, and exergo-economic parameters, including cost rates and unit exergy costs, have been calculated for each component.
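The component-level cost balance underlying such an analysis is commonly written in the following generic (SPECO-style) form; the notation here is assumed for illustration and is not quoted from the paper. For component k,

    \dot{C}_{out,k} + \dot{C}_{w,k} = \dot{C}_{in,k} + \dot{C}_{q,k} + \dot{Z}_k, \qquad \dot{C}_j = c_j \, \dot{E}_j,

where \dot{C}_j is the cost rate of exergy stream j, c_j its unit exergy cost, \dot{E}_j its exergy rate, \dot{C}_{w,k} and \dot{C}_{q,k} are the cost rates associated with work and heat transfer, and \dot{Z}_k is the capital plus operating-and-maintenance cost rate of component k.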
An Evaluation of the Effects of Variable Sampling on Component, Image, and Factor Analysis.
ERIC Educational Resources Information Center
Velicer, Wayne F.; Fava, Joseph L.
1987-01-01
Principal component analysis, image component analysis, and maximum likelihood factor analysis were compared to assess the effects of variable sampling. Results with respect to degree of saturation and average number of variables per factor were clear and dramatic. Differential effects on boundary cases and nonconvergence problems were also found.…
Generalized Structured Component Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun; Takane, Yoshio
2004-01-01
We propose an alternative method to partial least squares for path analysis with components, called generalized structured component analysis. The proposed method replaces factors by exact linear combinations of observed variables. It employs a well-defined least squares criterion to estimate model parameters. As a result, the proposed method…
Independent component analysis decomposition of hospital emergency department throughput measures
NASA Astrophysics Data System (ADS)
He, Qiang; Chu, Henry
2016-05-01
We present a method adapted from medical sensor data analysis, viz. independent component analysis of electroencephalography data, to health system analysis. Timely and effective care in a hospital emergency department is measured by throughput measures such as median times patients spent before they were admitted as an inpatient, before they were sent home, before they were seen by a healthcare professional. We consider a set of five such measures collected at 3,086 hospitals distributed across the U.S. One model of the performance of an emergency department is that these correlated throughput measures are linear combinations of some underlying sources. The independent component analysis decomposition of the data set can thus be viewed as transforming a set of performance measures collected at a site to a collection of outputs of spatial filters applied to the whole multi-measure data. We compare the independent component sources with the output of the conventional principal component analysis to show that the independent components are more suitable for understanding the data sets through visualizations.
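A minimal sketch of an ICA decomposition of a hospitals-by-measures matrix like the one described above, using scikit-learn; the random data stand in for the real throughput measures, and the number of components is an assumption:

    import numpy as np
    from sklearn.decomposition import FastICA, PCA

    # Hypothetical stand-in for the throughput data: 3,086 hospitals x 5 measures
    rng = np.random.default_rng(0)
    X = rng.standard_normal((3086, 5))

    ica = FastICA(n_components=5, random_state=0)
    S = ica.fit_transform(X)          # independent "source" values per hospital
    A = ica.mixing_                   # how each measure loads on each source

    pca_scores = PCA(n_components=5).fit_transform(X)   # for comparison

    print("ICA mixing matrix shape:", A.shape)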
40 CFR 1033.645 - Non-OEM component certification program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... needs of your component. (iv) An engineering analysis (including test data in some cases) demonstrating to us that your component will not cause emissions to increase. The analysis must address both low-hour and end-of-useful life emissions. The amount of information required for this analysis is less...
40 CFR 1033.645 - Non-OEM component certification program.
Code of Federal Regulations, 2011 CFR
2011-07-01
... needs of your component. (iv) An engineering analysis (including test data in some cases) demonstrating to us that your component will not cause emissions to increase. The analysis must address both low-hour and end-of-useful life emissions. The amount of information required for this analysis is less...
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2000-01-01
The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that the Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.
ERIC Educational Resources Information Center
McCormick, Ernest J.; And Others
The study deals with the job component method of establishing compensation rates. The basic job analysis questionnaire used in the study was the Position Analysis Questionnaire (PAQ) (Form B). On the basis of a principal components analysis of PAQ data for a large sample (2,688) of jobs, a number of principal components (job dimensions) were…
Stress Analysis of B-52B and B-52H Air-Launching Systems Failure-Critical Structural Components
NASA Technical Reports Server (NTRS)
Ko, William L.
2005-01-01
The operational life analysis of any airborne failure-critical structural component requires the stress-load equation, which relates the applied load to the maximum tangential tensile stress at the critical stress point. The failure-critical structural components identified are the B-52B Pegasus pylon adapter shackles, B-52B Pegasus pylon hooks, B-52H airplane pylon hooks, B-52H airplane front fittings, B-52H airplane rear pylon fitting, and the B-52H airplane pylon lower sway brace. Finite-element stress analysis was performed on the said structural components, and the critical stress point was located and the stress-load equation was established for each failure-critical structural component. The ultimate load, yield load, and proof load needed for operational life analysis were established for each failure-critical structural component.
Dascălu, Cristina Gena; Antohe, Magda Ecaterina
2009-01-01
Based on the analysis of eigenvalues and eigenvectors, principal component analysis aims to identify the subspace of the main components from a set of parameters, which is sufficient to characterize the whole set of parameters. Interpreting the data under analysis as a cloud of points, we find through geometrical transformations the directions along which the cloud's dispersion is maximal--the lines that pass through the cloud's center of gravity and have a maximal density of points around them (by defining an appropriate criterion function and minimizing it). This method can be successfully used to simplify the statistical analysis of questionnaires, because it helps us select from a set of items only the most relevant ones, those that cover the variations of the whole set of data. For instance, in the presented sample we started from a questionnaire with 28 items and, applying principal component analysis, we identified 7 principal components--or main items--a fact that simplifies the further statistical analysis of the data significantly.
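A minimal sketch of this eigenvalue/eigenvector route to item reduction, with simulated questionnaire responses (NumPy assumed; the sample size, response scale, and Kaiser cut-off are illustrative choices, not taken from the study):

    import numpy as np

    rng = np.random.default_rng(0)
    responses = rng.integers(1, 6, size=(200, 28)).astype(float)   # 200 respondents, 28 items

    R = np.corrcoef(responses, rowvar=False)        # 28 x 28 correlation matrix
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]               # sort by decreasing variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    explained = eigvals / eigvals.sum()
    n_keep = int(np.sum(eigvals > 1.0))             # Kaiser rule as a simple cut-off
    print("components kept:", n_keep, "cumulative variance:", explained[:n_keep].sum())

    Z = (responses - responses.mean(0)) / responses.std(0)
    scores = Z @ eigvecs[:, :n_keep]                # component scores per respondent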
An Introductory Application of Principal Components to Cricket Data
ERIC Educational Resources Information Center
Manage, Ananda B. W.; Scariano, Stephen M.
2013-01-01
Principal Component Analysis is widely used in applied multivariate data analysis, and this article shows how to motivate student interest in this topic using cricket sports data. Here, principal component analysis is successfully used to rank the cricket batsmen and bowlers who played in the 2012 Indian Premier League (IPL) competition. In…
Meta-Analysis of Mathematic Basic-Fact Fluency Interventions: A Component Analysis
ERIC Educational Resources Information Center
Codding, Robin S.; Burns, Matthew K.; Lukito, Gracia
2011-01-01
Mathematics fluency is a critical component of mathematics learning yet few attempts have been made to synthesize this research base. Seventeen single-case design studies with 55 participants were reviewed using meta-analytic procedures. A component analysis of practice elements was conducted and treatment intensity and feasibility were examined.…
Least Principal Components Analysis (LPCA): An Alternative to Regression Analysis.
ERIC Educational Resources Information Center
Olson, Jeffery E.
Often, all of the variables in a model are latent, random, or subject to measurement error, or there is not an obvious dependent variable. When any of these conditions exist, an appropriate method for estimating the linear relationships among the variables is Least Principal Components Analysis. Least Principal Components are robust, consistent,…
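A minimal sketch of the idea behind least principal components, assuming it amounts to using the eigenvector with the smallest eigenvalue of the covariance matrix to estimate a linear relation among error-laden variables (an orthogonal-regression flavour); the data and the exact relation are synthetic:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    u, v = rng.standard_normal(n), rng.standard_normal(n)

    # Three observed variables with one (noisy) linear dependency: x3 ~ x1 + x2
    X = np.column_stack([u, v, u + v]) + 0.05 * rng.standard_normal((n, 3))

    Xc = X - X.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))

    # The eigenvector with the SMALLEST eigenvalue defines the fitted relation
    # a1*x1 + a2*x2 + a3*x3 ~ const, treating all variables symmetrically.
    a = eigvecs[:, 0]
    print("estimated relation coefficients:", a / a[0])   # ~ [1, 1, -1] up to scale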
Feng, Xiao-Liang; He, Yun-biao; Liang, Yi-Zeng; Wang, Yu-Lin; Huang, Lan-Fang; Xie, Jian-Wei
2013-01-01
Gas chromatography-mass spectrometry and multivariate curve resolution were applied to the differential analysis of the volatile components in Agrimonia eupatoria specimens from different plant parts. After extraction by water distillation, the volatile components in Agrimonia eupatoria leaves and roots were detected by GC-MS. Qualitative and quantitative analysis of the volatile components in the main root of Agrimonia eupatoria was then completed with the help of subwindow factor analysis, which resolves the two-dimensional original data into mass spectra and chromatograms. Of the 87 constituents separated in the total ion chromatogram of the volatile components, 68 were identified and quantified, accounting for about 87.03% of the total content. The common peaks in the leaf samples were then extracted with the orthogonal projection resolution method. Among the components determined, 52 coexisted in the studied samples, although the relative content of each component differed to some extent. The results showed fair consistency in the GC-MS fingerprints. This was the first application of the orthogonal projection method to comparing different plant parts of Agrimonia eupatoria, and it reduced both the burden and the subjectivity of the qualitative analysis. The results proved the combined approach powerful for the analysis of complex Agrimonia eupatoria samples, and the developed method can be used for further study and quality control of Agrimonia eupatoria. PMID:24286016
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2001-01-01
The Independent Component Analysis is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has been used recently for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e. an exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. Unlike other Rotation Techniques (RT), this rotation uses no localization criterion; only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution appears able to resolve the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
NASA Technical Reports Server (NTRS)
Rajagopal, K. R.
1992-01-01
The technical effort and computer code development are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis. Volume 2 is a summary of critical SSME components.
Independent Orbiter Assessment (IOA): Weibull analysis report
NASA Technical Reports Server (NTRS)
Raffaelli, Gary G.
1987-01-01
The Auxiliary Power Unit (APU) and Hydraulic Power Unit (HPU) Space Shuttle Subsystems were reviewed as candidates for demonstrating the Weibull analysis methodology. Three hardware components were identified as analysis candidates: the turbine wheel, the gearbox, and the gas generator. Detailed review of subsystem level wearout and failure history revealed the lack of actual component failure data. In addition, component wearout data were not readily available or would require a separate data accumulation effort by the vendor. Without adequate component history data being available, the Weibull analysis methodology application to the APU and HPU subsystem group was terminated.
An Analysis of the Organizational Structure of Redstone Test Center's Environmental and Components Test Directorate
Naval Postgraduate School, Monterey, California (Joint Applied Project)
2016-09-01
This report provides an analysis of the organizational structure of Redstone Test Center's Environmental and Components Test Directorate, with specific regard to...
NASA Astrophysics Data System (ADS)
Yang, Yang; Peng, Zhike; Dong, Xingjian; Zhang, Wenming; Clifton, David A.
2018-03-01
A challenge in analysing non-stationary multi-component signals is to isolate nonlinearly time-varying components, especially when they overlap in the time-frequency plane. In this paper, a framework integrating time-frequency-analysis-based demodulation with a non-parametric Gaussian latent feature model is proposed to isolate and recover the components of such signals. The former aims to remove high-order frequency modulation (FM) so that the latter is able to infer the demodulated components while simultaneously discovering the number of target components. The proposed method is effective in isolating multiple components that share the same FM behavior. In addition, the results show that the proposed method is superior to a generalised-demodulation method with singular-value decomposition, a parametric time-frequency analysis method with filtering, and an empirical mode decomposition-based method in recovering the amplitude and phase of superimposed components.
Ranking and averaging independent component analysis by reproducibility (RAICAR).
Yang, Zhi; LaConte, Stephen; Weng, Xuchu; Hu, Xiaoping
2008-06-01
Independent component analysis (ICA) is a data-driven approach that has exhibited great utility for functional magnetic resonance imaging (fMRI). Standard ICA implementations, however, do not provide the number and relative importance of the resulting components. In addition, ICA algorithms utilizing gradient-based optimization give decompositions that are dependent on initialization values, which can lead to dramatically different results. In this work, a new method, RAICAR (Ranking and Averaging Independent Component Analysis by Reproducibility), is introduced to address these issues for spatial ICA applied to fMRI. RAICAR utilizes repeated ICA realizations and relies on the reproducibility between them to rank and select components. Different realizations are aligned based on correlations, leading to aligned components. Each component is ranked and thresholded based on between-realization correlations. Furthermore, different realizations of each aligned component are selectively averaged to generate the final estimate of the given component. Reliability and accuracy of this method are demonstrated with both simulated and experimental fMRI data. Copyright 2007 Wiley-Liss, Inc.
NASA Astrophysics Data System (ADS)
Meksiarun, Phiranuphon; Ishigaki, Mika; Huck-Pezzei, Verena A. C.; Huck, Christian W.; Wongravee, Kanet; Sato, Hidetoshi; Ozaki, Yukihiro
2017-03-01
This study aimed to extract the paraffin component from paraffin-embedded oral cancer tissue spectra using three multivariate analysis (MVA) methods: Independent Component Analysis (ICA), Partial Least Squares (PLS) and Independent Component-Partial Least Squares (IC-PLS). The estimated paraffin components were used to remove the contribution of paraffin from the tissue spectra. The three methods were compared in terms of the efficiency of paraffin removal and the ability to retain tissue information. It was found that ICA, PLS and IC-PLS could remove the paraffin component from the spectra at almost the same level, while Principal Component Analysis (PCA) could not. In terms of retaining cancer tissue spectral integrity, the effects of PLS and IC-PLS on the non-paraffin region were significantly smaller than those of ICA, where cancer tissue spectral areas were deteriorated. The paraffin-removed spectra were used to construct Raman images of oral cancer tissue and were compared with Hematoxylin and Eosin (H&E) stained tissue for verification. This study has demonstrated the capability of Raman spectroscopy together with multivariate analysis methods as a diagnostic tool for paraffin-embedded tissue sections.
Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.
2005-01-01
An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/ Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
Donato, Gianluca; Bartlett, Marian Stewart; Hager, Joseph C.; Ekman, Paul; Sejnowski, Terrence J.
2010-01-01
The Facial Action Coding System (FACS) [23] is an objective method for quantifying facial movement in terms of component actions. This system is widely used in behavioral investigations of emotion, cognitive processes, and social interaction. The coding is presently performed by highly trained human experts. This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. These techniques include analysis of facial motion through estimation of optical flow; holistic spatial analysis, such as principal component analysis, independent component analysis, local feature analysis, and linear discriminant analysis; and methods based on the outputs of local filters, such as Gabor wavelet representations and local principal components. Performance of these systems is compared to naive and expert human subjects. Best performances were obtained using the Gabor wavelet representation and the independent component representation, both of which achieved 96 percent accuracy for classifying 12 facial actions of the upper and lower face. The results provide converging evidence for the importance of using local filters, high spatial frequencies, and statistical independence for classifying facial actions. PMID:21188284
Dong, Jianghu J; Wang, Liangliang; Gill, Jagbir; Cao, Jiguo
2017-01-01
This article is motivated by some longitudinal clinical data of kidney transplant recipients, where kidney function progression is recorded as the estimated glomerular filtration rates at multiple time points post kidney transplantation. We propose to use the functional principal component analysis method to explore the major source of variations of glomerular filtration rate curves. We find that the estimated functional principal component scores can be used to cluster glomerular filtration rate curves. Ordering functional principal component scores can detect abnormal glomerular filtration rate curves. Finally, functional principal component analysis can effectively estimate missing glomerular filtration rate values and predict future glomerular filtration rate values.
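A NumPy-only sketch of FPCA for curves observed on a common dense grid, loosely in the spirit of the analysis above; the simulated eGFR-like curves, grid, and discretised-covariance approach are assumptions (the clinical data are sparse and irregular, which requires more machinery than shown here):

    import numpy as np

    rng = np.random.default_rng(0)
    n_patients, n_times = 100, 24                 # hypothetical monthly eGFR values
    t = np.linspace(0, 2, n_times)                # two years post-transplant

    # Simulated eGFR-like curves: a mean trend plus two dominant modes of variation
    mean_curve = 60 - 5 * t
    curves = (mean_curve
              + rng.standard_normal((n_patients, 1)) * 8 * np.ones(n_times)   # level shift
              + rng.standard_normal((n_patients, 1)) * 4 * (t - t.mean())     # slope change
              + rng.standard_normal((n_patients, n_times)))                   # noise

    centered = curves - curves.mean(axis=0)
    dt = t[1] - t[0]

    # Discretised covariance operator; its eigenvectors approximate the FPCs
    cov = centered.T @ centered / (n_patients - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]

    fpc = eigvecs[:, order] / np.sqrt(dt)         # approx. unit-L2-norm eigenfunctions
    scores = centered @ fpc * dt                  # FPC scores per patient

    explained = eigvals[order] / eigvals.sum()
    print("variance explained by first two FPCs:", explained[:2].sum())

The FPC scores could then be clustered or ordered to flag atypical curves, as the abstract describes.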
Design component method for sensitivity analysis of built-up structures
NASA Technical Reports Server (NTRS)
Choi, Kyung K.; Seong, Hwai G.
1986-01-01
A 'design component method' that provides a unified and systematic organization of design sensitivity analysis for built-up structures is developed and implemented. Both conventional design variables, such as thickness and cross-sectional area, and shape design variables of components of built-up structures are considered. It is shown that design of components of built-up structures can be characterized and system design sensitivity expressions obtained by simply adding contributions from each component. The method leads to a systematic organization of computations for design sensitivity analysis that is similar to the way in which computations are organized within a finite element code.
Probabilistic Design and Analysis Framework
NASA Technical Reports Server (NTRS)
Strack, William C.; Nagpal, Vinod K.
2010-01-01
PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries and provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
Peterson, Leif E
2002-01-01
CLUSFAVOR (CLUSter and Factor Analysis with Varimax Orthogonal Rotation) 5.0 is a Windows-based computer program for hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles. CLUSFAVOR 5.0 standardizes input data; sorts data according to gene-specific coefficient of variation, standard deviation, average and total expression, and Shannon entropy; performs hierarchical cluster analysis using nearest-neighbor, unweighted pair-group method using arithmetic averages (UPGMA), or furthest-neighbor joining methods, and Euclidean, correlation, or jack-knife distances; and performs principal-component analysis. PMID:12184816
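A small sketch of the two analysis steps named above (UPGMA hierarchical clustering and PCA), done with SciPy and scikit-learn rather than CLUSFAVOR itself; the expression matrix, standardization, distance metric, and cluster count are illustrative assumptions:

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA

    # Hypothetical expression matrix: 500 genes x 12 arrays (log-ratios)
    rng = np.random.default_rng(0)
    expr = rng.standard_normal((500, 12))

    # Standardize each gene profile before clustering
    Z = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)

    # UPGMA (average-linkage) clustering with a correlation distance
    tree = linkage(Z, method="average", metric="correlation")
    clusters = fcluster(tree, t=10, criterion="maxclust")

    # Principal-component analysis of the same standardized profiles
    pcs = PCA(n_components=3).fit_transform(Z)
    print("cluster sizes:", np.bincount(clusters)[1:])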
Research on criticality analysis method of CNC machine tools components under fault rate correlation
NASA Astrophysics Data System (ADS)
Gui-xiang, Shen; Xian-zhuo, Zhao; Zhang, Ying-zhi; Chen-yu, Han
2018-02-01
In order to determine the key components of CNC machine tools under fault rate correlation, a system component criticality analysis method is proposed. Based on fault mechanism analysis, the component fault relations are determined, and an adjacency matrix is introduced to describe them. The fault structure relations are then arranged hierarchically using the interpretive structural model (ISM). Assuming that the propagation of faults obeys a Markov process, the fault association matrix is described and transformed, and the PageRank algorithm is used to determine the relative influence values; combined with the component fault rates under time correlation, a comprehensive fault rate is obtained. Based on the fault mode frequency and fault influence, the criticality of the components under fault rate correlation is determined, and the key components are identified to provide a sound basis for formulating reliability assurance measures. Finally, taking machining centers as an example, the effectiveness of the method is verified.
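A minimal sketch of one step of such an approach: encoding assumed fault-propagation relations as a directed graph and using PageRank for relative influence values (NetworkX assumed; the component names, edges, and damping factor are hypothetical, and the combination with fault rates is not shown):

    import networkx as nx

    # Hypothetical fault-propagation relations between CNC machine-tool components:
    # an edge (a, b) means a fault in `a` tends to induce faults in `b`.
    edges = [
        ("spindle", "tool_magazine"),
        ("hydraulic_system", "spindle"),
        ("hydraulic_system", "turret"),
        ("electrical_system", "spindle"),
        ("electrical_system", "hydraulic_system"),
        ("turret", "tool_magazine"),
    ]
    G = nx.DiGraph(edges)

    # PageRank gives a relative influence value for each component; combining it
    # with the component fault rates (not shown) would yield a criticality ranking.
    influence = nx.pagerank(G, alpha=0.85)
    for comp, score in sorted(influence.items(), key=lambda kv: -kv[1]):
        print(f"{comp:20s} {score:.3f}")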
EXTRACTING PRINCIPLE COMPONENTS FOR DISCRIMINANT ANALYSIS OF FMRI IMAGES.
Liu, Jingyu; Xu, Lai; Caprihan, Arvind; Calhoun, Vince D
2008-05-12
This paper presents an approach for selecting optimal components for discriminant analysis. Such an approach is useful when further detailed analyses for discrimination or characterization require dimensionality reduction. Our approach can accommodate a categorical variable such as diagnosis (e.g. schizophrenic patient or healthy control), or a continuous variable such as severity of the disorder. This information is utilized as a reference for measuring a component's discriminant power after principal component decomposition. After sorting the components according to their discriminant power, we extract the best components for discriminant analysis. An application of our reference selection approach is shown using a functional magnetic resonance imaging data set in which the sample size is much smaller than the dimensionality. The results show that the reference selection approach provides an improved discriminant component set compared to other approaches. Our approach is general and provides a solid foundation for further discrimination and classification studies.
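A minimal sketch of the reference-based ranking idea: decompose with PCA, score each component by its (squared) correlation with a reference variable, and keep the top-ranked components. Data, dimensions, and the correlation-based discriminant-power measure are assumptions for illustration, not the paper's exact criterion:

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    n_subjects, n_voxels = 40, 5000
    X = rng.standard_normal((n_subjects, n_voxels))      # hypothetical fMRI features
    y = rng.integers(0, 2, n_subjects)                   # e.g. patient vs. control

    scores = PCA(n_components=n_subjects - 1).fit_transform(X)

    # Discriminant power of each component: squared correlation with the reference
    r2 = np.array([np.corrcoef(scores[:, k], y)[0, 1] ** 2
                   for k in range(scores.shape[1])])
    ranked = np.argsort(r2)[::-1]

    selected = scores[:, ranked[:5]]                     # best components for further analysis
    print("top components:", ranked[:5], "r^2:", r2[ranked[:5]].round(3))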
NASA Technical Reports Server (NTRS)
Klein, L. R.
1974-01-01
The free vibrations of elastic structures of arbitrary complexity were analyzed in terms of their component modes. The method was based upon the use of the normal unconstrained modes of the components in a Rayleigh-Ritz analysis. The continuity conditions were enforced by means of Lagrange Multipliers. Examples of the structures considered are: (1) beams with nonuniform properties; (2) airplane structures with high or low aspect ratio lifting surface components; (3) the oblique wing airplane; and (4) plate structures. The method was also applied to the analysis of modal damping of linear elastic structures. Convergence of the method versus the number of modes per component and/or the number of components is discussed and compared to more conventional approaches, ad-hoc methods, and experimental results.
A Cost-Utility Model of Care for Peristomal Skin Complications
Inglese, Gary; Manson, Andrea; Townshend, Arden
2016-01-01
PURPOSE: The aim of this study was to evaluate the economic and humanistic implications of using ostomy components to prevent subsequent peristomal skin complications (PSCs) in individuals who experience an initial, leakage-related PSC event. DESIGN: Cost-utility analysis. METHODS: We developed a simple decision model to consider, from a payer's perspective, PSCs managed with and without the use of ostomy components over 1 year. The model evaluated the extent to which outcomes associated with the use of ostomy components (PSC events avoided; quality-adjusted life days gained) offset the costs associated with their use. RESULTS: Our base case analysis of 1000 hypothetical individuals over 1 year assumes that using ostomy components following a first PSC reduces recurrent events versus PSC management without components. In this analysis, component acquisition costs were largely offset by lower resource use for ostomy supplies (barriers; pouches) and lower clinical utilization to manage PSCs. The overall annual average resource use for individuals using components was about 6.3% ($139) higher versus individuals not using components. Each PSC event avoided yielded, on average, 8 additional quality-adjusted life days over 1 year. CONCLUSIONS: In our analysis, (1) acquisition costs for ostomy components were offset in whole or in part by the use of fewer ostomy supplies to manage PSCs and (2) use of ostomy components to prevent PSCs produced better outcomes (fewer repeat PSC events; more health-related quality-adjusted life days) over 1 year compared to not using components. PMID:26633166
Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matsuo, Yukinori, E-mail: ymatsuo@kuhp.kyoto-u.ac.
Purpose: The purpose of this technical note is to introduce variance component analysis to the estimation of systematic and random components in setup error of radiotherapy. Methods: Balanced data according to the one-factor random effect model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and their confidence intervals (CIs) for systematic and random errors and the population mean of setup errors. The conventional method overestimates systematic error, especially in hypofractionated settings. The CI for systematic error becomes much wider than that for random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications to setup error analysis in radiotherapy.
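A minimal sketch of the balanced one-factor random-effects (ANOVA) computation for setup errors, with simulated data; the patient/fraction counts and true parameter values are placeholders, and only point estimates (not the confidence intervals discussed in the note) are shown:

    import numpy as np

    rng = np.random.default_rng(0)
    n_patients, n_fractions = 20, 5                       # balanced design

    # Simulated setup errors (mm): patient-specific systematic shift + random error
    true_Sigma, true_sigma, mu = 2.0, 1.5, 0.5
    shifts = mu + true_Sigma * rng.standard_normal(n_patients)
    errors = shifts[:, None] + true_sigma * rng.standard_normal((n_patients, n_fractions))

    patient_means = errors.mean(axis=1)
    grand_mean = errors.mean()

    ms_between = n_fractions * np.sum((patient_means - grand_mean) ** 2) / (n_patients - 1)
    ms_within = np.sum((errors - patient_means[:, None]) ** 2) / (n_patients * (n_fractions - 1))

    sigma_random = np.sqrt(ms_within)
    Sigma_systematic = np.sqrt(max(ms_between - ms_within, 0.0) / n_fractions)

    print(f"population mean M = {grand_mean:.2f} mm")
    print(f"systematic Sigma = {Sigma_systematic:.2f} mm, random sigma = {sigma_random:.2f} mm")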
Zeng, Rui; Fu, Juan; Wu, La-Bin; Huang, Lin-Fang
2013-07-01
To analyze the components of Citrus reticulata and salt-processed C. reticulata by ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-Q-TOF/MS), and to compare the changes in components before and after salt processing. Principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were adopted to analyze the differences in fingerprints between crude and processed C. reticulata, showing increased contents of eriocitrin, limonin, nomilin and obacunone in salt-processed C. reticulata. Potential chemical markers were identified as limonin, obacunone and nomilin, which could be used as index components for distinguishing crude from processed C. reticulata.
NASA Technical Reports Server (NTRS)
Williams, D. L.; Borden, F. Y.
1977-01-01
Methods to accurately delineate the types of land cover in the urban-rural transition zone of metropolitan areas were considered. The application of principal components analysis to multidate LANDSAT imagery was investigated as a means of reducing the overlap between residential and agricultural spectral signatures. The statistical concepts of principal components analysis were discussed, as well as the results of this analysis when applied to multidate LANDSAT imagery of the Washington, D.C. metropolitan area.
NASGRO(registered trademark): Fracture Mechanics and Fatigue Crack Growth Analysis Software
NASA Technical Reports Server (NTRS)
Forman, Royce; Shivakumar, V.; Mettu, Sambi; Beek, Joachim; Williams, Leonard; Yeh, Feng; McClung, Craig; Cardinal, Joe
2004-01-01
This viewgraph presentation describes NASGRO, which is a fracture mechanics and fatigue crack growth analysis software package that is used to reduce risk of fracture in Space Shuttles. The contents include: 1) Consequences of Fracture; 2) NASA Fracture Control Requirements; 3) NASGRO Reduces Risk; 4) NASGRO Use Inside NASA; 5) NASGRO Components: Crack Growth Module; 6) NASGRO Components:Material Property Module; 7) Typical NASGRO analysis: Crack growth or component life calculation; and 8) NASGRO Sample Application: Orbiter feedline flowliner crack analysis.
Component Analysis of Remanent Magnetization Curves: A Revisit with a New Model Distribution
NASA Astrophysics Data System (ADS)
Zhao, X.; Suganuma, Y.; Fujii, M.
2017-12-01
Geological samples often consist of several magnetic components that have distinct origins. As the magnetic components are often indicative of the underlying geological and environmental processes, it is desirable to identify individual components to extract the associated information. This component analysis can be achieved using the so-called unmixing method, which fits a mixture model of certain end-member model distributions to the measured remanent magnetization curve. Earlier studies have used the lognormal, skew generalized Gaussian and skewed Gaussian distributions as the end-member model distribution, and performed the unmixing on the gradient of the remanent magnetization curve. However, gradient curves are sensitive to measurement noise, as differentiation of the measured curve amplifies noise, which can deteriorate the component analysis. Although smoothing or filtering can be applied to reduce the noise before differentiation, their potential to bias the component analysis is rarely addressed. In this study, we investigated a new model function that can be applied directly to the remanent magnetization curves and therefore avoids the differentiation. The new model function provides a more flexible shape than the lognormal distribution, which is a merit when modeling the coercivity distribution of complex magnetic components. We applied the unmixing method to both model and measured data, and compared the results with those obtained using other model distributions to better understand their interchangeability, applicability and limitations. The analyses of model data suggest that unmixing methods are inherently sensitive to noise, especially when the number of components exceeds two. It is therefore recommended to verify the reliability of a component analysis by running multiple analyses with synthetic noise. Marine sediments and seafloor rocks were analyzed with the new model distribution. Given the same number of components, the new model distribution provides closer fits than the lognormal distribution, as evidenced by reduced residuals. Moreover, the new unmixing protocol is automated, so that users are freed from the labor of providing initial guesses for the parameters, which also helps reduce the subjectivity of the component analysis.
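A minimal sketch of coercivity-distribution unmixing by nonlinear least squares, using lognormal end-members as a stand-in (the abstract's new model distribution is not specified here); the synthetic gradient curve, field grid, noise level, and initial guesses are all assumptions:

    import numpy as np
    from scipy.optimize import curve_fit

    def lognormal_pdf(B, mu, sigma):
        return np.exp(-(np.log(B) - mu) ** 2 / (2 * sigma ** 2)) / (B * sigma * np.sqrt(2 * np.pi))

    def two_component(B, m1, mu1, s1, m2, mu2, s2):
        """Mixture of two end-member coercivity distributions (lognormal stand-ins)."""
        return m1 * lognormal_pdf(B, mu1, s1) + m2 * lognormal_pdf(B, mu2, s2)

    # Synthetic gradient-of-remanence curve: a soft and a hard component plus noise
    B = np.linspace(1, 300, 150)                        # field steps in mT
    truth = two_component(B, 1.0, np.log(30), 0.4, 0.5, np.log(120), 0.3)
    data = truth + 0.0005 * np.random.default_rng(0).standard_normal(B.size)

    p0 = [1.0, np.log(25), 0.5, 0.5, np.log(100), 0.5]  # initial guesses
    params, cov = curve_fit(two_component, B, data, p0=p0)
    print("fitted component parameters:", np.round(params, 3))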
Text analysis devices, articles of manufacture, and text analysis methods
Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C
2015-03-31
Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes a display configured to depict visible images, and processing circuitry coupled with the display and wherein the processing circuitry is configured to access a first vector of a text item and which comprises a plurality of components, to access a second vector of the text item and which comprises a plurality of components, to weight the components of the first vector providing a plurality of weighted values, to weight the components of the second vector providing a plurality of weighted values, and to combine the weighted values of the first vector with the weighted values of the second vector to provide a third vector.
Soleimani, Mohammad Ali; Yaghoobzadeh, Ameneh; Bahrami, Nasim; Sharif, Saeed Pahlevan; Sharif Nia, Hamid
2016-10-01
In this study, 398 Iranian cancer patients completed the 15-item Templer's Death Anxiety Scale (TDAS). Tests of internal consistency, principal components analysis, and confirmatory factor analysis were conducted to assess the internal consistency and factorial validity of the Persian TDAS. The construct reliability statistic and average variance extracted were also calculated to measure construct reliability, convergent validity, and discriminant validity. Principal components analysis indicated a 3-component solution, which was generally supported in the confirmatory analysis. However, acceptable cutoffs for construct reliability, convergent validity, and discriminant validity were not fulfilled for the three subscales that were derived from the principal component analysis. This study demonstrated both the advantages and potential limitations of using the TDAS with Persian-speaking cancer patients.
Constrained Principal Component Analysis: Various Applications.
ERIC Educational Resources Information Center
Hunter, Michael; Takane, Yoshio
2002-01-01
Provides example applications of constrained principal component analysis (CPCA) that illustrate the method on a variety of contexts common to psychological research. Two new analyses, decompositions into finer components and fitting higher order structures, are presented, followed by an illustration of CPCA on contingency tables and the CPCA of…
NASA Technical Reports Server (NTRS)
Todling, Ricardo; Diniz, F. L. R.; Takacs, L. L.; Suarez, M. J.
2018-01-01
Many hybrid data assimilation systems currently used for NWP employ some form of dual-analysis approach. Typically, a hybrid variational analysis is responsible for creating initial conditions for high-resolution forecasts, and an ensemble analysis system is responsible for creating the sample perturbations used to form the flow-dependent part of the background error covariance required in the hybrid analysis component. In many of these, the two analysis components employ different methodologies, e.g., variational and ensemble Kalman filter. In such cases, it is not uncommon for observations to be treated rather differently between the two analysis components; recentering of the ensemble analysis around the hybrid analysis is used to compensate for such differences. Furthermore, in many cases the hybrid variational high-resolution system implements some type of four-dimensional approach, whereas the underlying ensemble system relies on a three-dimensional approach, which again introduces discrepancies into the overall system. Connected to these is the expectation that one can reliably estimate observation impact on forecasts issued from hybrid analyses by using an ensemble approach based on the underlying ensemble strategy of dual-analysis systems. The mere realization that the ensemble analysis makes substantially different use of observations than its hybrid counterpart should be evidence enough of the implausibility of such an expectation. This presentation assembles numerous pieces of anecdotal evidence to illustrate that hybrid dual-analysis systems must, at the very minimum, strive for consistent use of the observations in both analysis sub-components. More simply, this work suggests that hybrid systems can reliably be constructed without the need for a dual-analysis approach. In practice, the idea of relying on a single analysis system is appealing from a cost and maintenance perspective. More generally, single-analysis systems avoid contradictions such as having to rely on one sub-component to generate performance diagnostics for another, possibly not fully consistent, component.
Personal Computer Transport Analysis Program
NASA Technical Reports Server (NTRS)
DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter
2012-01-01
The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vector has been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
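The solution-vector idea described above can be illustrated with a schematic sketch. The classes, names and the toy relaxation update below are hypothetical illustrations, not PCTAP code or its API; the point is only the ordering of components by inlet dependency and the per-time-step sweep through that ordered list.

class Component:
    def __init__(self, name, upstream=None):
        self.name = name
        self.upstream = upstream      # component feeding this one's inlet
        self.outlet_temp = 20.0       # deg C, arbitrary initial state

    def outlet(self, dt):
        # Toy update: relax toward the upstream outlet temperature.
        inlet = self.upstream.outlet_temp if self.upstream else self.outlet_temp
        self.outlet_temp += 0.1 * (inlet - self.outlet_temp) * dt

def build_solution_vector(components):
    # Order components so each appears after the component feeding its inlet.
    ordered, placed = [], set()
    while len(ordered) < len(components):
        for c in components:
            if c.name not in placed and (c.upstream is None or c.upstream.name in placed):
                ordered.append(c)
                placed.add(c.name)
    return ordered

tank = Component("tank")
tank.outlet_temp = 60.0
tube = Component("tube", upstream=tank)
coldplate = Component("coldplate", upstream=tube)
solution_vector = build_solution_vector([coldplate, tube, tank])   # any input order

for step in range(100):               # march the transient
    for comp in solution_vector:      # inlet-dependency order, once per time step
        comp.outlet(dt=1.0)

print([c.name for c in solution_vector], round(coldplate.outlet_temp, 2))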
Zhang, Shan; Xu, Lu; Liu, Yang-Xi; Fu, Hai-Yan; Xiao, Zuo-Bing; She, Yuan-Bin
2018-04-01
E-jiao (Colla Corii Asini, CCA) has been widely used as a healthy food and Chinese medicine. Although authentic CCA is characterized by its typical sweet and neutral fragrance, its aroma components have been rarely investigated. This work investigated the aroma-active components and antioxidant activity of 19 CCAs from different geographical origins. CCA extracts obtained by simultaneous distillation and extraction were analyzed by gas chromatography-mass spectrometry (GC-MS), gas chromatography-olfactometry (GC-O) and sensory analysis. The antioxidant activity of CCAs was determined by ABTS and DPPH assays. A total of 65 volatile compounds were identified and quantified by GC-MS and 23 aroma-active compounds were identified by GC-O and aroma extract dilution analysis. The most powerful aroma-active compounds were identified based on the flavor dilution factor and their contents were compared among the 19 CCAs. Principal component analysis of the 23 aroma-active components showed 3 significant clusters. Canonical correlation analysis between antioxidant assays and the 23 aroma-active compounds indicates strong correlation (r = 0.9776, p = 0.0281). Analysis of aroma-active components shows potential for quality evaluation and discrimination of CCAs from different geographical origins.
Minami, Keiichiro; Miyata, Kazunori; Otani, Atsushi; Tokunaga, Tadatoshi; Tokuda, Shouta; Amano, Shiro
2018-05-01
To determine the steep increase in corneal irregularity induced by pterygium advancement. A total of 456 eyes from 456 consecutive patients with primary pterygia were examined for corneal topography and pterygium advancement with respect to the corneal diameter. Corneal irregularity induced by pterygium advancement was evaluated by Fourier harmonic analyses of the topographic data, modified for a series of analysis diameters from 1 mm to 6 mm. The onset of steep increases in the asymmetry or higher-order irregularity components (inflection points) was determined using segmented regression analysis for each analysis diameter. The pterygium advancement ranged from 2% to 57%, with a mean of 22.0%. Both components showed steep increases beyond the inflection points. The inflection points of the higher-order irregularity component varied with the analysis diameter (14.0%-30.6%), whereas those of the asymmetry component did not (35.5%-36.8%). For the former component, the values at the inflection points ranged from 0.16 to 0.25 D. The Fourier harmonic analyses over a series of analysis diameters revealed that the higher-order irregularity component increased with pterygium advancement. The analysis results confirmed the precedence of corneal irregularity due to pterygium advancement.
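For readers unfamiliar with segmented regression, the following sketch fits a two-segment (piecewise linear) model with an unknown breakpoint to synthetic irregularity-versus-advancement data. The numbers and the continuous two-line model are invented for the sketch and are not the study's data or its exact fitting procedure.

import numpy as np
from scipy.optimize import curve_fit

def segmented(x, x0, y0, k1, k2):
    # Two straight lines joined continuously at the breakpoint x0.
    return np.where(x < x0, y0 + k1 * (x - x0), y0 + k2 * (x - x0))

rng = np.random.default_rng(1)
advancement = rng.uniform(2, 57, 200)                    # percent of corneal diameter
irregularity = segmented(advancement, 36.0, 0.20, 0.0005, 0.02)
irregularity = irregularity + rng.normal(0, 0.01, advancement.size)   # dioptres, noisy

popt, _ = curve_fit(segmented, advancement, irregularity, p0=[30.0, 0.2, 0.001, 0.01])
print("estimated inflection point: %.1f%% advancement" % popt[0])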
Factor Analysis via Components Analysis
ERIC Educational Resources Information Center
Bentler, Peter M.; de Leeuw, Jan
2011-01-01
When the factor analysis model holds, component loadings are linear combinations of factor loadings, and vice versa. This interrelation permits us to define new optimization criteria and estimation methods for exploratory factor analysis. Although this article is primarily conceptual in nature, an illustrative example and a small simulation show…
NASA Technical Reports Server (NTRS)
Sopher, R.; Hallock, D. W.
1985-01-01
A time history analysis for rotorcraft dynamics based on dynamical substructures, and nonstructural mathematical and aerodynamic components is described. The analysis is applied to predict helicopter ground resonance and response to rotor damage. Other applications illustrate the stability and steady vibratory response of stopped and gimballed rotors, representative of new technology. Desirable attributes expected from modern codes are realized, although the analysis does not employ a complete set of techniques identified for advanced software. The analysis is able to handle a comprehensive set of steady state and stability problems with a small library of components.
Separation of β-amyloid binding and white matter uptake of 18F-flutemetamol using spectral analysis
Heurling, Kerstin; Buckley, Christopher; Vandenberghe, Rik; Laere, Koen Van; Lubberink, Mark
2015-01-01
The kinetic components of the β-amyloid ligand 18F-flutemetamol binding in grey and white matter were investigated through spectral analysis, and a method developed for creation of parametric images separating grey and white matter uptake. Tracer uptake in grey and white matter and cerebellar cortex was analyzed through spectral analysis in six subjects, with (n=4) or without (n=2) apparent β-amyloid deposition, having undergone dynamic 18F-flutemetamol scanning with arterial blood sampling. The spectra were divided into three components: slow, intermediate and fast basis function rates. The contribution of each of the components to total volume of distribution (VT) was assessed for different tissue types. The slow component dominated in white matter (average 90%), had a higher contribution to grey matter VT in subjects with β-amyloid deposition (average 44%) than without (average 6%) and was absent in cerebellar cortex, attributing the slow component of 18F-flutemetamol uptake in grey matter to β-amyloid binding. Parametric images of voxel-based spectral analysis were created for VT, the slow component and images segmented based on the slow component contribution; confirming that grey matter and white matter uptake can be discriminated on voxel-level using a threshold for the contribution from the slow component to VT. PMID:26550542
NASA Astrophysics Data System (ADS)
Hristian, L.; Ostafe, M. M.; Manea, L. R.; Apostol, L. L.
2017-06-01
This work examined the classification of combed wool fabrics intended for outer clothing in terms of durability and physiological comfort indices, using Principal Component Analysis (PCA). PCA, as applied in this study, is a descriptive method for multivariate (multi-dimensional) data that aims to reduce, in a controlled way, the number of variables (columns) of the data matrix to as few as two or three. Thus, for each group/assortment of fabrics, the nine inter-correlated variables are to be replaced by only two or three new variables, called components. The goal of PCA is to extract the smallest number of components that recover most of the total information contained in the initial data.
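A minimal sketch of this reduction, assuming synthetic stand-ins for the nine inter-correlated quality indices rather than the study's durability and comfort measurements:

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
latent = rng.normal(size=(40, 2))                        # two hidden "causes"
loadings = rng.normal(size=(2, 9))
X = latent @ loadings + 0.1 * rng.normal(size=(40, 9))   # 40 fabrics x 9 indices

X_std = StandardScaler().fit_transform(X)                # standardise before PCA
pca = PCA(n_components=3).fit(X_std)
scores = pca.transform(X_std)                            # the new "component" variables
print("variance explained by 3 components:", np.round(pca.explained_variance_ratio_, 3))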
Cyber Power Potential of the Army’s Reserve Component
2017-01-01
and could extend logically to include electric power, water, food, railway, gas pipelines, and so forth. One consideration to note is that in cases... Contents include Chapter Four, Army Reserve Component Cyber Inventory Analysis: Background and Analytical Framework; Army Reserve Component Cyber Inventory Analysis, 2015.
Generalized Structured Component Analysis with Latent Interactions
ERIC Educational Resources Information Center
Hwang, Heungsun; Ho, Moon-Ho Ringo; Lee, Jonathan
2010-01-01
Generalized structured component analysis (GSCA) is a component-based approach to structural equation modeling. In practice, researchers may often be interested in examining the interaction effects of latent variables. However, GSCA has been geared only for the specification and testing of the main effects of variables. Thus, an extension of GSCA…
Nonlinear Principal Components Analysis: Introduction and Application
ERIC Educational Resources Information Center
Linting, Marielle; Meulman, Jacqueline J.; Groenen, Patrick J. F.; van der Koojj, Anita J.
2007-01-01
The authors provide a didactic treatment of nonlinear (categorical) principal components analysis (PCA). This method is the nonlinear equivalent of standard PCA and reduces the observed variables to a number of uncorrelated principal components. The most important advantages of nonlinear over linear PCA are that it incorporates nominal and ordinal…
USDA-ARS?s Scientific Manuscript database
Selective principal component regression analysis (SPCR) uses a subset of the original image bands for principal component transformation and regression. For optimal band selection before the transformation, this paper used genetic algorithms (GA). In this case, the GA process used the regression co...
Regularized Generalized Structured Component Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun
2009-01-01
Generalized structured component analysis (GSCA) has been proposed as a component-based approach to structural equation modeling. In practice, GSCA may suffer from multi-collinearity, i.e., high correlations among exogenous variables. GSCA has yet no remedy for this problem. Thus, a regularized extension of GSCA is proposed that integrates a ridge…
Why Does Behavioral Instruction Work? A Component Analysis of Performance and Motivational Outcomes.
ERIC Educational Resources Information Center
Omelich, Carol L.; Covington, Martin V.
Two fundamental components of behavioral instruction were investigated: the reported testing feature and absolute performance standards. The component analysis was conducted by offering an undergraduate psychology course simultaneously along two dimensions: grading systems and number of study/test cycles. The 425 college student subjects were…
Liu, Hui-lin; Wan, Xia; Yang, Gong-huan
2013-02-01
To explore the relationship between the strength of tobacco control and the effectiveness of creating smoke-free hospitals, and to summarize the main factors that affect smoke-free hospital programs. A total of 210 hospitals from 7 provinces/municipalities directly under the central government were enrolled in this study using a stratified random sampling method. Principal component analysis and regression analysis were conducted to analyze the strength of tobacco control and the effectiveness of creating smoke-free hospitals. Two principal components were extracted from the strength-of-tobacco-control index, reflecting, respectively, tobacco control policies and efforts, and the willingness and leadership of hospital managers regarding tobacco control. The regression analysis indicated that only the first principal component was significantly correlated with progress in creating smoke-free hospitals (P<0.001), i.e. hospitals with higher scores on the first principal component had better achievements in smoke-free environment creation. Tobacco control policies and efforts are critical in creating smoke-free hospitals. Principal component analysis provides a comprehensive and objective tool for evaluating the creation of smoke-free hospitals.
Zhao, Xiao-Mei; Pu, Shi-Biao; Zhao, Qing-Guo; Gong, Man; Wang, Jia-Bo; Ma, Zhi-Jie; Xiao, Xiao-He; Zhao, Kui-Jun
2016-08-01
In this paper, spectrum-effect correlation analysis was used to explore the main components of Tripterygium wilfordii responsible for liver toxicity and to provide a reference for improving the quality control of T. wilfordii. The Chinese medicine T. wilfordii was taken as the study object; LC-Q-TOF-MS was used to characterize the chemical components in T. wilfordii samples from different areas, and the main components were initially identified with reference to the literature. With normal human hepatocytes (LO2 cell line) as the carrier, acetaminophen as the positive control, and cell inhibition rate as the test index, simple correlation analysis and multivariate linear correlation analysis were used to screen the main components of T. wilfordii for liver toxicity. As a result, 10 main components were identified, and the spectrum-effect correlation analysis showed that triptolide may be the toxic component, consistent with previous reports in the traditional literature. The multivariate linear correlation analysis also indicated that tripterine and demethylzeylasteral may contribute substantially to liver toxicity. T. wilfordii samples of different varieties or origins showed large differences in quality: samples from southwest China showed lower liver toxicity, while those from Hunan and Anhui provinces showed higher liver toxicity. This study provides data support for the further rational use of T. wilfordii and for research on its hepatotoxic ingredients. Copyright© by the Chinese Pharmaceutical Association.
Relaxation mode analysis of a peptide system: comparison with principal component analysis.
Mitsutake, Ayori; Iijima, Hiromitsu; Takano, Hiroshi
2011-10-28
This article reports the first attempt to apply the relaxation mode analysis method to a simulation of a biomolecular system. In biomolecular systems, the principal component analysis is a well-known method for analyzing the static properties of fluctuations of structures obtained by a simulation and classifying the structures into some groups. On the other hand, the relaxation mode analysis has been used to analyze the dynamic properties of homopolymer systems. In this article, a long Monte Carlo simulation of Met-enkephalin in gas phase has been performed. The results are analyzed by the principal component analysis and relaxation mode analysis methods. We compare the results of both methods and show the effectiveness of the relaxation mode analysis.
RSA prediction of high failure rate for the uncoated Interax TKA confirmed by meta-analysis.
Pijls, Bart G; Nieuwenhuijse, Marc J; Schoones, Jan W; Middeldorp, Saskia; Valstar, Edward R; Nelissen, Rob G H H
2012-04-01
In a previous radiostereometric (RSA) trial the uncoated, uncemented, Interax tibial components showed excessive migration within 2 years compared to HA-coated and cemented tibial components. It was predicted that this type of fixation would have a high failure rate. The purpose of this systematic review and meta-analysis was to investigate whether this RSA prediction was correct. We performed a systematic review and meta-analysis to determine the revision rate for aseptic loosening of the uncoated and cemented Interax tibial components. 3 studies were included, involving 349 Interax total knee arthroplasties (TKAs) for the comparison of uncoated and cemented fixation. There were 30 revisions: 27 uncoated and 3 cemented components. There was a 3-times higher revision rate for the uncoated Interax components than that for cemented Interax components (OR = 3; 95% CI: 1.4-7.2). This meta-analysis confirms the prediction of a previous RSA trial. The uncoated Interax components showed the highest migration and turned out to have the highest revision rate for aseptic loosening. RSA appears to enable efficient detection of an inferior design as early as 2 years postoperatively in a small group of patients.
Hooper, R.P.; Peters, N.E.
1989-01-01
A principal-components analysis was performed on the major solutes in wet deposition collected from 194 stations in the United States and its territories. Approximately 90% of the components derived could be interpreted as falling into one of three categories - acid, salt, or an agricultural/soil association. The total mass, or the mass of any one solute, was apportioned among these components by multiple linear regression techniques. The use of multisolute components for determining trends or spatial distribution represents a substantial improvement over single-solute analysis in that these components are more directly related to the sources of the deposition. The geographic patterns displayed by the components in this analysis indicate a far more important role for acid deposition in the Southeast and intermountain regions of the United States than would be indicated by maps of sulfate or nitrate deposition alone. In the Northeast and Midwest, the acid component is not declining at most stations, as would be expected from trends in sulfate deposition, but is holding constant or increasing. This is due, in part, to a decline in the agriculture/soil factor throughout this region, which would help to neutralize the acidity.
Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria
NASA Astrophysics Data System (ADS)
Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong
2017-08-01
In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for the reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. Comparison with Monte Carlo simulation demonstrates that the proposed methodology provides an accurate, convergent and computationally efficient approach for reliability-analysis-based finite element modeling in engineering practice.
Independent component analysis for automatic note extraction from musical trills
NASA Astrophysics Data System (ADS)
Brown, Judith C.; Smaragdis, Paris
2004-05-01
The method of principal component analysis, which is based on second-order statistics (or linear independence), has long been used for redundancy reduction of audio data. The more recent technique of independent component analysis, enforcing much stricter statistical criteria based on higher-order statistical independence, is introduced and shown to be far superior in separating independent musical sources. This theory has been applied to piano trills and a database of trill rates was assembled from experiments with a computer-driven piano, recordings of a professional pianist, and commercially available compact disks. The method of independent component analysis has thus been shown to be an outstanding, effective means of automatically extracting interesting musical information from a sea of redundant data.
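The contrast drawn above can be reproduced on a toy problem: two statistically independent signals are mixed, FastICA recovers them, and PCA merely decorrelates the mixtures. The sinusoid/sawtooth sources and the mixing matrix below are invented stand-ins, not piano recordings.

import numpy as np
from sklearn.decomposition import PCA, FastICA

t = np.linspace(0, 1, 4000)
s1 = np.sin(2 * np.pi * 440 * t)                 # "note" 1: sinusoid
s2 = 2 * (t * 523 % 1) - 1                       # "note" 2: sawtooth
S = np.c_[s1, s2]
A = np.array([[1.0, 0.6], [0.4, 1.0]])           # mixing matrix
X = S @ A.T                                      # two observed mixtures

ica_est = FastICA(n_components=2, random_state=0).fit_transform(X)
pca_est = PCA(n_components=2).fit_transform(X)

def match(est, src):
    # Best absolute correlation of either estimated column with a source.
    return max(abs(np.corrcoef(est[:, j], src)[0, 1]) for j in range(2))

print("ICA recovery of source 1:", round(match(ica_est, s1), 3))
print("PCA recovery of source 1:", round(match(pca_est, s1), 3))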
Finite element analysis of helicopter structures
NASA Technical Reports Server (NTRS)
Rich, M. J.
1978-01-01
Application of the finite element analysis is now being expanded to three dimensional analysis of mechanical components. Examples are presented for airframe, mechanical components, and composite structure calculations. Data are detailed on the increase of model size, computer usage, and the effect on reducing stress analysis costs. Future applications for use of finite element analysis for helicopter structures are projected.
CO Component Estimation Based on the Independent Component Analysis
NASA Astrophysics Data System (ADS)
Ichiki, Kiyotomo; Kaji, Ryohei; Yamamoto, Hiroaki; Takeuchi, Tsutomu T.; Fukui, Yasuo
2014-01-01
Fast Independent Component Analysis (FastICA) is a component separation algorithm based on the levels of non-Gaussianity. Here we apply FastICA to the component separation problem of the microwave background, including carbon monoxide (CO) line emissions that are found to contaminate the PLANCK High Frequency Instrument (HFI) data. Specifically, we prepare 100 GHz, 143 GHz, and 217 GHz mock microwave sky maps, which include galactic thermal dust, NANTEN CO line, and the cosmic microwave background (CMB) emissions, and then estimate the independent components based on the kurtosis. We find that FastICA can successfully estimate the CO component as the first independent component in our deflection algorithm because its distribution has the largest degree of non-Gaussianity among the components. Thus, FastICA can be a promising technique to extract CO-like components without prior assumptions about their distributions and frequency dependences.
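A hedged sketch of this kind of separation, assuming invented mixing coefficients and synthetic stand-ins for the CMB, CO and dust fields (not PLANCK or NANTEN data): three mock frequency maps are mixed from the fields, FastICA recovers independent components, and the components are ranked by kurtosis, with the sparsest (CO-like) field expected to show the largest value.

import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
npix = 50_000
cmb = rng.normal(size=npix)                                  # nearly Gaussian field
co = rng.exponential(size=npix) * (rng.random(npix) < 0.05)  # sparse, strongly non-Gaussian
dust = rng.gamma(2.0, size=npix)                             # mildly non-Gaussian

mixing = np.array([[1.0, 0.80, 0.3],    # "100 GHz"
                   [1.0, 0.10, 0.5],    # "143 GHz"
                   [1.0, 0.05, 1.2]])   # "217 GHz"
maps = np.vstack([cmb, co, dust]).T @ mixing.T    # npix x 3 "observed" maps

components = FastICA(n_components=3, random_state=0).fit_transform(maps)
print("kurtosis per component (largest should be the CO-like one):",
      np.round(kurtosis(components, axis=0), 2))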
Ramli, Saifullah; Ismail, Noryati; Alkarkhi, Abbas Fadhl Mubarek; Easa, Azhar Mat
2010-08-01
Banana peel flour (BPF) prepared from green or ripe Cavendish and Dream banana fruits were assessed for their total starch (TS), digestible starch (DS), resistant starch (RS), total dietary fibre (TDF), soluble dietary fibre (SDF) and insoluble dietary fibre (IDF). Principal component analysis (PCA) identified that only 1 component was responsible for 93.74% of the total variance in the starch and dietary fibre components that differentiated ripe and green banana flours. Cluster analysis (CA) applied to similar data obtained two statistically significant clusters (green and ripe bananas) to indicate difference in behaviours according to the stages of ripeness based on starch and dietary fibre components. We concluded that the starch and dietary fibre components could be used to discriminate between flours prepared from peels obtained from fruits of different ripeness. The results were also suggestive of the potential of green and ripe BPF as functional ingredients in food.
NASA Astrophysics Data System (ADS)
Dafu, Shen; Leihong, Zhang; Dong, Liang; Bei, Li; Yi, Kang
2017-07-01
The purpose of this study is to improve reconstruction precision and better reproduce the color of spectral image surfaces. A new spectral reflectance reconstruction algorithm based on iterative thresholding combined with a weighted principal component space is presented, in which the principal components weighted by visual features serve as the sparse basis. Different numbers of color cards are selected as training samples, a multispectral image is the testing sample, and the color differences of the reconstructions are compared. The channel response values are obtained with a Mega Vision high-accuracy, multi-channel imaging system. The results show that spectral reconstruction based on the weighted principal component space outperforms that based on the traditional principal component space. The color difference obtained using the compressive-sensing algorithm with weighted principal component analysis is therefore smaller than that obtained with traditional principal component analysis, and better reconstructed color consistency with human vision is achieved.
EXTRACTING PRINCIPLE COMPONENTS FOR DISCRIMINANT ANALYSIS OF FMRI IMAGES
Liu, Jingyu; Xu, Lai; Caprihan, Arvind; Calhoun, Vince D.
2009-01-01
This paper presents an approach for selecting optimal components for discriminant analysis. Such an approach is useful when further detailed analyses for discrimination or characterization require dimensionality reduction. Our approach can accommodate a categorical variable such as diagnosis (e.g. schizophrenic patient or healthy control), or a continuous variable such as severity of the disorder. This information is utilized as a reference for measuring a component's discriminant power after principal component decomposition. After sorting the components according to their discriminant power, we extract the best components for discriminant analysis. An application of our reference selection approach is shown using a functional magnetic resonance imaging data set in which the sample size is much less than the dimensionality. The results show that the reference selection approach provides an improved discriminant component set as compared to other approaches. Our approach is general and provides a solid foundation for further discrimination and classification studies. PMID:20582334
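A minimal sketch of this reference-based selection, assuming synthetic data and point-biserial correlation as the discriminant-power score (the paper's exact measure is not reproduced here): decompose with PCA, score each component against the reference label, and keep the top-ranked components.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n, p = 40, 500                          # few subjects, many voxels/features
labels = np.repeat([0, 1], n // 2)      # reference variable (e.g. diagnosis)
X = rng.normal(size=(n, p))
X[labels == 1, :20] += 0.8              # group difference hidden in a few features

scores = PCA(n_components=10).fit_transform(X)
power = np.array([abs(np.corrcoef(scores[:, j], labels)[0, 1]) for j in range(10)])
order = np.argsort(power)[::-1]         # components sorted by discriminant power
print("best components:", order[:3], "discriminant power:", np.round(power[order[:3]], 2))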
System diagnostics using qualitative analysis and component functional classification
Reifman, J.; Wei, T.Y.C.
1993-11-23
A method for detecting and identifying faulty component candidates during off-normal operations of nuclear power plants involves the qualitative analysis of macroscopic imbalances in the conservation equations of mass, energy and momentum in thermal-hydraulic control volumes associated with one or more plant components and the functional classification of components. The qualitative analysis of mass and energy is performed through the associated equations of state, while imbalances in momentum are obtained by tracking mass flow rates which are incorporated into a first knowledge base. The plant components are functionally classified, according to their type, as sources or sinks of mass, energy and momentum, depending upon which of the three balance equations is most strongly affected by a faulty component which is incorporated into a second knowledge base. Information describing the connections among the components of the system forms a third knowledge base. The method is particularly adapted for use in a diagnostic expert system to detect and identify faulty component candidates in the presence of component failures and is not limited to use in a nuclear power plant, but may be used with virtually any type of thermal-hydraulic operating system. 5 figures.
System diagnostics using qualitative analysis and component functional classification
Reifman, Jaques; Wei, Thomas Y. C.
1993-01-01
A method for detecting and identifying faulty component candidates during off-normal operations of nuclear power plants involves the qualitative analysis of macroscopic imbalances in the conservation equations of mass, energy and momentum in thermal-hydraulic control volumes associated with one or more plant components and the functional classification of components. The qualitative analysis of mass and energy is performed through the associated equations of state, while imbalances in momentum are obtained by tracking mass flow rates which are incorporated into a first knowledge base. The plant components are functionally classified, according to their type, as sources or sinks of mass, energy and momentum, depending upon which of the three balance equations is most strongly affected by a faulty component which is incorporated into a second knowledge base. Information describing the connections among the components of the system forms a third knowledge base. The method is particularly adapted for use in a diagnostic expert system to detect and identify faulty component candidates in the presence of component failures and is not limited to use in a nuclear power plant, but may be used with virtually any type of thermal-hydraulic operating system.
NASA Astrophysics Data System (ADS)
Wojciechowski, Adam
2017-04-01
In order to assess ecodiversity understood as a comprehensive natural landscape factor (Jedicke 2001), it is necessary to apply research methods which recognize the environment in a holistic way. Principal component analysis may be considered one such method, as it allows the main factors determining landscape diversity to be distinguished on the one hand, and regularities shaping the relationships between various elements of the environment under study to be discovered on the other. The procedure adopted to assess ecodiversity with the use of principal component analysis involves: a) determining and selecting appropriate factors of the assessed environment qualities (hypsometric, geological, hydrographic, plant, and others); b) calculating the absolute value of individual qualities for the basic areas under analysis (e.g. river length, forest area, altitude differences, etc.); c) principal components analysis and obtaining factor maps (maps of selected components); d) generating a resultant, detailed map and isolating several classes of ecodiversity. An assessment of ecodiversity with the use of principal component analysis was conducted in a test area of 299.67 km2 in Debnica Kaszubska commune. The whole commune is situated in the Weichselian glaciation area of high hypsometric and morphological diversity as well as high geo- and biodiversity. The analysis was based on topographical maps of the commune area at a scale of 1:25000 and maps of forest habitats. Consequently, nine factors reflecting basic environment elements were calculated: maximum height (m), minimum height (m), average height (m), the length of watercourses (km), the area of water reservoirs (m2), total forest area (ha), coniferous forest habitats area (ha), deciduous forest habitats area (ha), alder habitats area (ha). The values for the individual factors were analysed for 358 grid cells of 1 km2. Based on the principal components analysis, four major factors affecting commune ecodiversity were distinguished: a hypsometric component (PC1), a deciduous forest habitats component (PC2), a river valleys and alder habitats component (PC3), and a lakes component (PC4). The distinguished factors characterise the natural qualities of the postglacial area and reflect well the role of the four most important groups of environment components in shaping the ecodiversity of the area under study. The map of ecodiversity of Debnica Kaszubska commune was created on the basis of the first four principal component scores, and five classes of diversity were then isolated: very low, low, average, high and very high. As a result of the assessment, five commune regions of very high ecodiversity were delineated. These regions are also very attractive for tourists and valuable in terms of their rich nature, which includes protected areas such as the Slupia Valley Landscape Park. The suggested method of ecodiversity assessment with the use of principal component analysis may constitute an alternative methodological proposition to other research methods used so far. Literature: Jedicke E., 2001. Biodiversität, Geodiversität, Ökodiversität. Kriterien zur Analyse der Landschaftsstruktur - ein konzeptioneller Diskussionsbeitrag. Naturschutz und Landschaftsplanung, 33(2/3), 59-68.
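A minimal sketch of steps b) to d), assuming random stand-ins for the nine per-cell factors and an invented way of combining the leading component scores into a single index with equal-frequency class breaks (the actual resultant-map construction is not detailed in the abstract):

import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
factors = ["h_max", "h_min", "h_mean", "river_len", "lake_area",
           "forest", "conifer", "deciduous", "alder"]
cells = pd.DataFrame(rng.gamma(2.0, size=(358, 9)), columns=factors)   # 358 grid cells

pca = PCA(n_components=4)
scores = pca.fit_transform(StandardScaler().fit_transform(cells))      # PC1..PC4 per cell

# Combine the leading component scores into one index and cut it into five
# classes (very low ... very high); equal-frequency bins are used here.
index = scores @ pca.explained_variance_ratio_
classes = pd.qcut(index, 5, labels=["very low", "low", "average", "high", "very high"])
print(classes.value_counts())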
Energy efficient engine component development and integration program
NASA Technical Reports Server (NTRS)
1981-01-01
Accomplishments in the Energy Efficient Engine Component Development and Integration program during the period of April 1, 1981 through September 30, 1981 are discussed. The major topics considered are: (1) propulsion system analysis, design, and integration; (2) engine component analysis, design, and development; (3) core engine tests; and (4) integrated core/low spool testing.
A Note on McDonald's Generalization of Principal Components Analysis
ERIC Educational Resources Information Center
Shine, Lester C., II
1972-01-01
It is shown that McDonald's generalization of Classical Principal Components Analysis to groups of variables maximally channels the total variance of the original variables through the groups of variables acting as groups. An equation is obtained for determining the vectors of correlations of the L2 components with the original variables.…
14 CFR Appendix E to Part 417 - Flight Termination System Testing and Analysis
Code of Federal Regulations, 2012 CFR
2012-01-01
... contains requirements for tests and analyses that apply to all flight termination systems and the... termination system components that satisfy the requirements of this appendix. (b) Component tests and analyses. A component must satisfy each test or analysis required by any table of this appendix to demonstrate...
14 CFR Appendix E to Part 417 - Flight Termination System Testing and Analysis
Code of Federal Regulations, 2010 CFR
2010-01-01
... contains requirements for tests and analyses that apply to all flight termination systems and the... termination system components that satisfy the requirements of this appendix. (b) Component tests and analyses. A component must satisfy each test or analysis required by any table of this appendix to demonstrate...
14 CFR Appendix E to Part 417 - Flight Termination System Testing and Analysis
Code of Federal Regulations, 2013 CFR
2013-01-01
... contains requirements for tests and analyses that apply to all flight termination systems and the... termination system components that satisfy the requirements of this appendix. (b) Component tests and analyses. A component must satisfy each test or analysis required by any table of this appendix to demonstrate...
14 CFR Appendix E to Part 417 - Flight Termination System Testing and Analysis
Code of Federal Regulations, 2014 CFR
2014-01-01
... contains requirements for tests and analyses that apply to all flight termination systems and the... termination system components that satisfy the requirements of this appendix. (b) Component tests and analyses. A component must satisfy each test or analysis required by any table of this appendix to demonstrate...
14 CFR Appendix E to Part 417 - Flight Termination System Testing and Analysis
Code of Federal Regulations, 2011 CFR
2011-01-01
... contains requirements for tests and analyses that apply to all flight termination systems and the... termination system components that satisfy the requirements of this appendix. (b) Component tests and analyses. A component must satisfy each test or analysis required by any table of this appendix to demonstrate...
Yang, Guang; Zhao, Xin; Wen, Jun; Zhou, Tingting; Fan, Guorong
2017-04-01
An analytical approach including fingerprinting, quantitative analysis and rapid screening of anti-oxidative components was established and successfully applied to the comprehensive quality control of Rhizoma Smilacis Glabrae (RSG), a well-known Traditional Chinese Medicine used both as a medicine and a food. Thirteen components were tentatively identified based on their retention behavior, UV absorption and MS fragmentation patterns. Chemometric analysis based on coulometric array data was performed to evaluate the similarity and variation among fifteen batches. Eight discriminating components were quantified using single-compound calibration. The unit responses of these components in coulometric array detection were calculated and compared with those of several compounds reported to possess antioxidant activity, and four of them were tentatively identified as the main contributors to the total anti-oxidative activity. The main advantage of the proposed approach is that it realizes simultaneous fingerprinting, quantitative analysis and screening of anti-oxidative components, providing comprehensive information for the quality assessment of RSG. Copyright © 2017 Elsevier B.V. All rights reserved.
Ekdahl, Anja; Johansson, Maria C; Ahnoff, Martin
2013-04-01
Matrix effects on electrospray ionization were investigated for plasma samples analysed by hydrophilic interaction chromatography (HILIC) in gradient elution mode, and HILIC columns of different chemistries were tested for the separation of plasma components and model analytes. By combining mass spectral data with post-column infusion traces, the following components of protein-precipitated plasma were identified and found to have a significant effect on ionization: urea, creatinine, phosphocholine, lysophosphocholine, sphingomyelin, sodium ion, chloride ion, choline and proline betaine. The observed effect on ionization depended on both the matrix component and the analyte. The separation of the identified plasma components and model analytes on eight columns was compared using pair-wise linear correlation analysis and principal component analysis (PCA). Large changes in selectivity could be obtained by changing the column, while smaller changes were seen when the mobile phase buffer was changed from ammonium formate pH 3.0 to ammonium acetate pH 4.5. While the results from PCA and linear correlation analysis were largely in accord, linear correlation analysis was judged to be more straightforward to conduct and interpret.
Multivariate analysis for scanning tunneling spectroscopy data
NASA Astrophysics Data System (ADS)
Yamanishi, Junsuke; Iwase, Shigeru; Ishida, Nobuyuki; Fujita, Daisuke
2018-01-01
We applied principal component analysis (PCA) to two-dimensional tunneling spectroscopy (2DTS) data obtained on a Si(111)-(7 × 7) surface to explore the effectiveness of multivariate analysis for interpreting 2DTS data. We demonstrated that several components that originated mainly from specific atoms at the Si(111)-(7 × 7) surface can be extracted by PCA. Furthermore, we showed that hidden components in the tunneling spectra can be decomposed (peak separation), which is difficult to achieve with normal 2DTS analysis without the support of theoretical calculations. Our analysis showed that multivariate analysis can be an additional powerful way to analyze 2DTS data and extract hidden information from a large amount of spectroscopic data.
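The kind of decomposition described above can be illustrated on synthetic spectra: each grid point carries a tunneling spectrum built from two overlapping Gaussian features with site-dependent weights, and the leading PCA components expose the blended features. The peak positions, widths and weights are invented for the sketch.

import numpy as np
from sklearn.decomposition import PCA

E = np.linspace(-2, 2, 200)                                 # bias/energy axis
peak = lambda c, w: np.exp(-(E - c) ** 2 / (2 * w ** 2))    # Gaussian spectral feature

rng = np.random.default_rng(0)
n_sites = 300
w1, w2 = rng.random(n_sites), rng.random(n_sites)           # site-dependent weights
spectra = (np.outer(w1, peak(-0.4, 0.15)) +                 # two overlapping features
           np.outer(w2, peak(0.1, 0.25)) +
           0.02 * rng.normal(size=(n_sites, E.size)))

pca = PCA(n_components=3).fit(spectra)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
# pca.components_[0] and pca.components_[1] approximate the two underlying peak
# shapes (up to sign and mixing), which is the "peak separation" referred to above.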
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
A series of interdisciplinary modeling and analysis techniques that were specialized to address three specific hot section components are presented. These techniques will incorporate data as well as theoretical methods from many diverse areas including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate, and unified approach to analyzing combustor burner liners, hollow air cooled turbine blades, and air cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress and strain histories throughout a complete flight mission.
Puniya, Bhanwar Lal; Allen, Laura; Hochfelder, Colleen; Majumder, Mahbubul; Helikar, Tomáš
2016-01-01
Dysregulation in signal transduction pathways can lead to a variety of complex disorders, including cancer. Computational approaches such as network analysis are important tools to understand system dynamics as well as to identify critical components that could be further explored as therapeutic targets. Here, we performed perturbation analysis of a large-scale signal transduction model in extracellular environments that stimulate cell death, growth, motility, and quiescence. Each of the model’s components was perturbed under both loss-of-function and gain-of-function mutations. Using 1,300 simulations under both types of perturbations across various extracellular conditions, we identified the most and least influential components based on the magnitude of their influence on the rest of the system. Based on the premise that the most influential components might serve as better drug targets, we characterized them for biological functions, housekeeping genes, essential genes, and druggable proteins. The most influential components under all environmental conditions were enriched with several biological processes. The inositol pathway was found as most influential under inactivating perturbations, whereas the kinase and small lung cancer pathways were identified as the most influential under activating perturbations. The most influential components were enriched with essential genes and druggable proteins. Moreover, known cancer drug targets were also classified in influential components based on the affected components in the network. Additionally, the systemic perturbation analysis of the model revealed a network motif of most influential components which affect each other. Furthermore, our analysis predicted novel combinations of cancer drug targets with various effects on other most influential components. We found that the combinatorial perturbation consisting of PI3K inactivation and overactivation of IP3R1 can lead to increased activity levels of apoptosis-related components and tumor-suppressor genes, suggesting that this combinatorial perturbation may lead to a better target for decreasing cell proliferation and inducing apoptosis. Finally, our approach shows a potential to identify and prioritize therapeutic targets through systemic perturbation analysis of large-scale computational models of signal transduction. Although some components of the presented computational results have been validated against independent gene expression data sets, more laboratory experiments are warranted to more comprehensively validate the presented results. PMID:26904540
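The systematic perturbation screen described above can be sketched, in a heavily simplified form, on a toy Boolean network: each node is fixed OFF (loss of function) or ON (gain of function), the steady state is recomputed, and nodes are ranked by how many other nodes change. The wiring, update rules and scoring below are invented and many orders of magnitude smaller than the published model.

import itertools

rules = {   # node: Boolean update rule evaluated on the current state dict
    "GF":   lambda s: s["GF"],           # external growth-factor input
    "RAS":  lambda s: s["GF"],
    "PI3K": lambda s: s["RAS"],
    "AKT":  lambda s: s["PI3K"],
    "APOP": lambda s: not s["AKT"],      # apoptosis switched on when AKT is off
}

def steady_state(fixed=None, steps=20):
    state = {n: False for n in rules}
    state["GF"] = True
    for _ in range(steps):
        state = {n: bool(f(state)) for n, f in rules.items()}
        if fixed:
            state.update(fixed)          # hold the perturbed node at its forced value
    return state

baseline = steady_state()
influence = {}
for node, value in itertools.product(rules, [False, True]):   # LoF and GoF screens
    perturbed = steady_state(fixed={node: value})
    influence[(node, value)] = sum(baseline[n] != perturbed[n] for n in rules if n != node)

for (node, value), changed in sorted(influence.items(), key=lambda kv: -kv[1]):
    print(f"{node:5s} fixed to {value!s:5s}: {changed} other nodes change")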
A Comparison of Component and Factor Patterns: A Monte Carlo Approach.
ERIC Educational Resources Information Center
Velicer, Wayne F.; And Others
1982-01-01
Factor analysis, image analysis, and principal component analysis are compared with respect to the factor patterns they would produce under various conditions. The general conclusion that is reached is that the three methods produce results that are equivalent. (Author/JKS)
Real-space analysis of radiation-induced specific changes with independent component analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borek, Dominika; Bromberg, Raquel; Hattne, Johan
A method of analysis is presented that allows for the separation of specific radiation-induced changes into distinct components in real space. The method relies on independent component analysis (ICA) and can be effectively applied to electron density maps and other types of maps, provided that they can be represented as sets of numbers on a grid. Here, for glucose isomerase crystals, ICA was used in a proof-of-concept analysis to separate temperature-dependent and temperature-independent components of specific radiation-induced changes for data sets acquired from multiple crystals across multiple temperatures. ICA identified two components, with the temperature-independent component being responsible for the majority of specific radiation-induced changes at temperatures below 130 K. The patterns of specific temperature-independent radiation-induced changes suggest a contribution from the tunnelling of electron holes as a possible explanation. In the second case, where a group of 22 data sets was collected on a single thaumatin crystal, ICA was used in another type of analysis to separate specific radiation-induced effects happening on different exposure-level scales. Here, ICA identified two components of specific radiation-induced changes that likely result from radiation-induced chemical reactions progressing with different rates at different locations in the structure. In addition, ICA unexpectedly identified the radiation-damage state corresponding to reduced disulfide bridges rather than the zero-dose extrapolated state as the highest contrast structure. The application of ICA to the analysis of specific radiation-induced changes in real space and the data pre-processing for ICA that relies on singular value decomposition, which was used previously in data space to validate a two-component physical model of X-ray radiation-induced changes, are discussed in detail. This work lays a foundation for a better understanding of protein-specific radiation chemistries and provides a framework for analysing effects of specific radiation damage in crystallographic and cryo-EM experiments.
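A conceptual sketch, not the published pipeline: a stack of synthetic difference maps is treated as a voxels-by-datasets matrix and FastICA is asked to pull apart two spatially independent patterns whose amplitudes depend differently on dose and temperature. The Laplacian-distributed patterns, dose/temperature schedules and noise level are all invented.

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
nvox = 20_000
pattern_dose = rng.laplace(size=nvox)     # e.g. sites following accumulated dose
pattern_temp = rng.laplace(size=nvox)     # sites that only react above some temperature

doses = np.linspace(0.1, 1.0, 12)                     # per-data-set accumulated dose
warm = np.r_[np.zeros(6), np.ones(6)]                 # 6 cold + 6 warmer data sets
maps = (np.outer(doses, pattern_dose)
        + np.outer(warm, pattern_temp)
        + 0.05 * rng.normal(size=(12, nvox)))         # 12 difference maps

# Treat voxels as samples: each recovered column is a real-space component map.
components = FastICA(n_components=2, random_state=0).fit_transform(maps.T)
for j in range(2):
    cd = abs(np.corrcoef(components[:, j], pattern_dose)[0, 1])
    ct = abs(np.corrcoef(components[:, j], pattern_temp)[0, 1])
    print(f"component {j}: |corr| with dose pattern {cd:.2f}, with temperature pattern {ct:.2f}")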
NASA Astrophysics Data System (ADS)
Mahmoudishadi, S.; Malian, A.; Hosseinali, F.
2017-09-01
Image processing techniques in the transform domain are employed as analysis tools for enhancing the detection of mineral deposits. Decomposing the image into meaningful components increases the probability of detecting mineralization. In this study, the performance of Principal Component Analysis (PCA) and Independent Component Analysis (ICA) was evaluated for the visible and near-infrared (VNIR) and shortwave infrared (SWIR) subsystems of ASTER data. Ardestan lies in part of the Central Iranian Volcanic Belt, which hosts many well-known porphyry copper deposits. This research investigated the propylitic and argillic alteration zones and the outer mineralogy zone in part of the Ardestan region. The two approaches were applied to discriminate alteration zones from the igneous bedrock using the major absorption features of indicator minerals from the alteration and mineralogy zones within the spectral range of the ASTER bands. Selected principal components (PC2, PC3 and PC6) were used to identify pyrite and the argillic and propylitic zones, which are distinguished from the igneous bedrock in an RGB color composite image. According to the eigenvalues, components 2, 3 and 6 account for 4.26%, 0.9% and 0.09% of the total variance of the data for the Ardestan scene, respectively. For discriminating the alteration and mineralogy zones of the porphyry copper deposit from the bedrock, the corresponding ICA independent components (IC2, IC3 and IC6) separate these zones more accurately than the noisier PCA bands. The ICA results also conform to the locations of the lithological units of the Ardestan region.
Surzhikov, V D; Surzhikov, D V
2014-01-01
The search for and measurement of causal relationships between exposure to air pollution and the health of the population are based on system analysis and risk assessment, with the aim of improving the quality of such research. For this purpose, modern statistical analysis was applied, using tests of independence, principal component analysis and discriminant function analysis. As a result of the analysis, four main components were separated from the full set of atmospheric pollutants: for diseases of the circulatory system, the main principal component was associated with concentrations of suspended solids, nitrogen dioxide, carbon monoxide and hydrogen fluoride; for respiratory diseases, the main principal component was closely associated with suspended solids, sulfur dioxide, nitrogen dioxide and charcoal black. The discriminant function was shown to be usable as a measure of the level of air pollution.
Priority of VHS Development Based in Potential Area using Principal Component Analysis
NASA Astrophysics Data System (ADS)
Meirawan, D.; Ana, A.; Saripudin, S.
2018-02-01
The current condition of VHS is still inadequate in quality, quantity and relevance. The purpose of this research is to analyse the development of VHS based on the development of regional potential, using principal component analysis (PCA), in Bandung, Indonesia. This study used a descriptive analysis of secondary data reduced to its principal components. The method used is Principal Component Analysis (PCA), implemented with Minitab statistical software. The results indicate that the areas with the lowest scores are the priorities for the construction and development of VHS, with programmes of majors chosen in accordance with the development of regional potential. Based on the PCA scores, the main priority for VHS development in Bandung is Saguling, which has the lowest PCA value (416.92) in area 1, followed by Cihampelas with the lowest PCA value in region 2 and Padalarang with the lowest PCA value.
Ghosh, Debasree; Chattopadhyay, Parimal
2012-06-01
The objective of the work was to use the method of quantitative descriptive analysis (QDA) to describe the sensory attributes of fermented food products prepared with the incorporation of lactic cultures. Panellists were selected and trained to evaluate various attributes, especially color and appearance, body texture, flavor, overall acceptability and acidity, of fermented food products such as cow milk curd, soymilk curd, idli, sauerkraut and probiotic ice cream. Principal component analysis (PCA) identified six significant principal components that accounted for more than 90% of the variance in the sensory attribute data. Overall product quality was modelled as a function of the principal components using multiple least squares regression (R2 = 0.8). The results from PCA were further analyzed by analysis of variance (ANOVA). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the fermented food product attributes that are important for consumer acceptability.
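A minimal sketch of the PCA-plus-regression modelling step, assuming randomly generated stand-ins for the panel ratings and an invented relation between attributes and overall quality:

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_samples, n_attributes = 60, 12                  # products x sensory attributes
X = rng.normal(size=(n_samples, n_attributes))    # panel ratings (stand-ins)
overall = X[:, :3].sum(axis=1) + 0.5 * rng.normal(size=n_samples)   # invented relation

scores = PCA(n_components=6).fit_transform(X)     # six principal components
model = LinearRegression().fit(scores, overall)   # multiple least squares on the scores
print("R^2 of overall quality on six components:", round(model.score(scores, overall), 2))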
Combination of PCA and LORETA for sources analysis of ERP data: an emotional processing study
NASA Astrophysics Data System (ADS)
Hu, Jin; Tian, Jie; Yang, Lei; Pan, Xiaohong; Liu, Jiangang
2006-03-01
The purpose of this paper is to study spatiotemporal patterns of neuronal activity in emotional processing by analysis of ERP data. 108 pictures (categorized as positive, negative and neutral) were presented to 24 healthy, right-handed subjects while 128-channel EEG data were recorded. A two-step analysis was applied to the ERP data. First, principal component analysis was performed to obtain significant ERP components. LORETA was then applied to each component to localize its brain sources. The first six principal components were extracted, each of which showed a different spatiotemporal pattern of neuronal activity. The results agree with other studies of emotion using fMRI or PET. The combination of PCA and LORETA can be used to analyze spatiotemporal patterns in ERP data during emotional processing.
Multibody model reduction by component mode synthesis and component cost analysis
NASA Technical Reports Server (NTRS)
Spanos, J. T.; Mingori, D. L.
1990-01-01
The classical assumed-modes method is widely used in modeling the dynamics of flexible multibody systems. According to the method, the elastic deformation of each component in the system is expanded in a series of spatial and temporal functions known as modes and modal coordinates, respectively. This paper focuses on the selection of component modes used in the assumed-modes expansion. A two-stage component modal reduction method is proposed combining Component Mode Synthesis (CMS) with Component Cost Analysis (CCA). First, each component model is truncated such that the contribution of the high frequency subsystem to the static response is preserved. Second, a new CMS procedure is employed to assemble the system model and CCA is used to further truncate component modes in accordance with their contribution to a quadratic cost function of the system output. The proposed method is demonstrated with a simple example of a flexible two-body system.
Component Cost Analysis of Large Scale Systems
NASA Technical Reports Server (NTRS)
Skelton, R. E.; Yousuff, A.
1982-01-01
The ideas of cost decomposition are summarized to aid in the determination of the relative cost (or 'price') of each component of a linear dynamic system using quadratic performance criteria. In addition to the insights into system behavior that are afforded by such a component cost analysis (CCA), these CCA ideas naturally lead to a theory for cost-equivalent realizations.
78 FR 8150 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-05
... three components: the ``Design and Implementation Study,'' the ``Performance Analysis Study,'' and the...- Component Evaluation--Data Collection Related to the Performance Analysis Study and the Impact and the In-depth Implementation Study. OMB No.: 0970-0398 Description: The Office of Data Analysis, Research, and...
Proportional counter device for detecting electronegative species in an air sample
Allman, Steve L.; Chen, Fang C.; Chen, Chung-Hsuan
1994-01-01
Apparatus for detecting an electronegative species comprises an analysis chamber, an inlet communicating with the analysis chamber for admitting a sample containing the electronegative species and an ionizable component, a radioactive source within the analysis chamber for emitting radioactive energy for ionizing a component of the sample, a proportional electron detector within the analysis chamber for detecting electrons emitted from the ionized component, and a circuit for measuring the electrons and determining the presence of the electronegative species by detecting a reduction in the number of available electrons due to capture of electrons by the electronegative species.
Proportional counter device for detecting electronegative species in an air sample
Allman, S.L.; Chen, F.C.; Chen, C.H.
1994-03-08
Apparatus for detecting an electronegative species comprises an analysis chamber, an inlet communicating with the analysis chamber for admitting a sample containing the electronegative species and an ionizable component, a radioactive source within the analysis chamber for emitting radioactive energy for ionizing a component of the sample, a proportional electron detector within the analysis chamber for detecting electrons emitted from the ionized component, and a circuit for measuring the electrons and determining the presence of the electronegative species by detecting a reduction in the number of available electrons due to capture of electrons by the electronegative species. 2 figures.
Verma, Priyanka; Kumar, Manoj; Mishra, Girish; Sahoo, Dinabandhu
2017-02-01
In the present study, thirty seaweeds from Indian coasts were analyzed in a bioprospecting survey of their biochemical components, including pigments, fatty acids and ash content. Multivariate analysis of the biochemical components and fatty acids was carried out using Principal Component Analysis (PCA) and Agglomerative Hierarchical Clustering (AHC) to reveal chemotaxonomic relationships among the seaweeds. The overall analysis suggests that these seaweeds have multi-functional properties and can be utilized as a promising bioresource of proteins, lipids, pigments and carbohydrates for the food/feed and biofuel industries. Copyright © 2016. Published by Elsevier Ltd.
Multi-component separation and analysis of bat echolocation calls.
DiCecco, John; Gaudette, Jason E; Simmons, James A
2013-01-01
The vast majority of animal vocalizations contain multiple frequency modulated (FM) components with varying amounts of non-linear modulation and harmonic instability. This is especially true of biosonar sounds where precise time-frequency templates are essential for neural information processing of echoes. Understanding the dynamic waveform design by bats and other echolocating animals may help to improve the efficacy of man-made sonar through biomimetic design. Bats are known to adapt their call structure based on the echolocation task, proximity to nearby objects, and density of acoustic clutter. To interpret the significance of these changes, a method was developed for component separation and analysis of biosonar waveforms. Techniques for imaging in the time-frequency plane are typically limited due to the uncertainty principle and interference cross terms. This problem is addressed by extending the use of the fractional Fourier transform to isolate each non-linear component for separate analysis. Once separated, empirical mode decomposition can be used to further examine each component. The Hilbert transform may then successfully extract detailed time-frequency information from each isolated component. This multi-component analysis method is applied to the sonar signals of four species of bats recorded in-flight by radiotelemetry along with a comparison of other common time-frequency representations.
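As one concrete illustration of the final step in the pipeline described above, the sketch below applies a Hilbert transform to a synthetic FM sweep standing in for an already separated biosonar component. The sampling rate and sweep parameters are arbitrary assumptions, not values from the study, and the fractional Fourier and empirical mode decomposition stages are omitted.

```python
# Hedged sketch: instantaneous amplitude and frequency of a single, already
# isolated FM component via the Hilbert transform (synthetic downward sweep).
import numpy as np
from scipy.signal import hilbert

fs = 250_000                                   # assumed sampling rate (Hz)
t = np.arange(0, 0.003, 1 / fs)
component = np.cos(2 * np.pi * (60_000 * t - 5e6 * t ** 2))   # synthetic FM sweep

analytic = hilbert(component)
amplitude = np.abs(analytic)                   # instantaneous envelope
inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
print(round(inst_freq[0]), round(inst_freq[-1]))   # sweep start/end frequency (Hz)
```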
NASA Technical Reports Server (NTRS)
1992-01-01
The technical effort and computer code developed during the first year are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis.
32 CFR 651.34 - EA components.
Code of Federal Regulations, 2011 CFR
2011-07-01
... and comparison of impacts should provide sufficient analysis to reach a conclusion regarding the... ENVIRONMENTAL ANALYSIS OF ARMY ACTIONS (AR 200-2) Environmental Assessment § 651.34 EA components. EAs should be...
32 CFR 651.34 - EA components.
Code of Federal Regulations, 2010 CFR
2010-07-01
... and comparison of impacts should provide sufficient analysis to reach a conclusion regarding the... ENVIRONMENTAL ANALYSIS OF ARMY ACTIONS (AR 200-2) Environmental Assessment § 651.34 EA components. EAs should be...
Yoo, Minjae; Shin, Jimin; Kim, Hyunmin; Kim, Jihye; Kang, Jaewoo; Tan, Aik Choon
2018-04-04
Traditional Chinese Medicine (TCM) has been practiced for thousands of years in China and other Asian countries for treating various symptoms and diseases. However, the underlying molecular mechanisms of TCM are poorly understood, partly due to the "multi-component, multi-target" nature of TCM. To uncover the molecular mechanisms of TCM, we performed comprehensive gene expression analysis using the Connectivity Map. We interrogated gene expression signatures obtained from 102 TCM components using the next-generation Connectivity Map (CMap) resource. We performed systematic data mining and analysis of the mechanisms of action (MoA) of these TCM components based on the CMap results. We clustered the 102 TCM components into four groups based on their MoAs using the next-generation CMap resource. We performed gene set enrichment analysis on these components to provide additional support for the proposed molecular mechanisms, and we provide literature evidence to validate the MoAs identified through this bioinformatics analysis. Finally, we developed the Traditional Chinese Medicine Drug Repurposing Hub (TCMHub), a connectivity map resource to facilitate the elucidation of TCM MoA for drug repurposing research. TCMHub is freely available at http://tanlab.ucdenver.edu/TCMHub. Molecular mechanisms of TCM could be uncovered by using gene expression signatures and the Connectivity Map. Through this analysis, we identified that many of the TCM components possess diverse MoAs, which may explain the applications of TCM in treating various symptoms and diseases. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Shiokari, T.
1975-01-01
The feasibility and cost savings of using flight-proven components in designing spacecraft were investigated. The components analyzed were (1) large space telescope, (2) stratospheric aerosol and gas equipment, (3) mapping mission, (4) solar maximum mission, and (5) Tiros-N. It is concluded that flight-proven hardware can be used with not-too-extensive modification, and significant savings can be realized. The cost savings for each component are presented.
Vibration signature analysis of multistage gear transmission
NASA Technical Reports Server (NTRS)
Choy, F. K.; Tu, Y. K.; Savage, M.; Townsend, D. P.
1989-01-01
An analysis is presented for multistage multimesh gear transmission systems. The analysis predicts the overall system dynamics and the transmissibility to the gear box or the enclosed structure. The modal synthesis approach of the analysis treats the uncoupled lateral/torsional modal characteristics of each stage or component independently. The vibration signature analysis evaluates the global dynamic coupling in the system. The method synthesizes the interaction of each modal component or stage with the nonlinear gear mesh dynamics and the modal support geometry characteristics. The analysis simulates transient and steady state vibration events to determine the resulting torque variations, speed changes, rotor imbalances, and support gear box motion excitations. A vibration signature analysis examines the overall dynamic characteristics of the system, and the individual modal component responses. The gear box vibration analysis also examines the spectral characteristics of the support system.
Reliability Evaluation of Machine Center Components Based on Cascading Failure Analysis
NASA Astrophysics Data System (ADS)
Zhang, Ying-Zhi; Liu, Jin-Tong; Shen, Gui-Xiang; Long, Zhe; Sun, Shu-Guang
2017-07-01
In order to rectify the problems in the traditional reliability evaluation of machine center components, where the component reliability model exhibits deviation and the evaluation result is low because failure propagation is overlooked, a new reliability evaluation method based on cascading failure analysis and failure influenced degree assessment is proposed. A directed graph model of cascading failure among components is established according to cascading failure mechanism analysis and graph theory. The failure influenced degrees of the system components are assessed using the adjacency matrix and its transposition, combined with the PageRank algorithm. Based on the comprehensive failure probability function and the total probability formula, the inherent failure probability function is determined to realize the reliability evaluation of the system components. Finally, the method is applied to a machine center, which shows the following: 1) the reliability evaluation values of the proposed method are at least 2.5% higher than those of the traditional method; 2) the difference between the comprehensive and inherent reliability of a system component presents a positive correlation with the failure influenced degree of that component, which provides a theoretical basis for reliability allocation in machine center systems.
SCGICAR: Spatial concatenation based group ICA with reference for fMRI data analysis.
Shi, Yuhu; Zeng, Weiming; Wang, Nizhuan
2017-09-01
With the rapid development of big data, functional magnetic resonance imaging (fMRI) data analysis of multiple subjects is becoming more and more important. As a blind source separation technique, group independent component analysis (GICA) has been widely applied to multi-subject fMRI data analysis. However, spatially concatenated GICA is rarely used compared with temporally concatenated GICA because of its disadvantages. In this paper, to overcome these issues and to exploit the fact that the ability of GICA for fMRI data analysis can be improved by adding a priori information, we propose a novel spatial concatenation based GICA with reference (SCGICAR) method that takes advantage of priori information extracted from the group subjects; a multi-objective optimization strategy is then used to implement this method. Finally, the post-processing steps of principal component analysis and anti-reconstruction are used to obtain the group spatial component and the individual temporal component within the group, respectively. The experimental results show that the proposed SCGICAR method performs better on both single-subject and multi-subject fMRI data analysis than classical methods. It not only detects more accurate spatial and temporal components for each subject of the group, but also obtains a better group component in both the temporal and spatial domains. These results demonstrate that the proposed SCGICAR method has its own advantages in comparison with classical methods, and it can better reflect the commonness of subjects in the group. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was implemented into a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
NASA Astrophysics Data System (ADS)
Li, Jiangtong; Luo, Yongdao; Dai, Honglin
2018-01-01
Water is the source of life and the essential foundation of all life. With the development of industrialization, water pollution has become more and more frequent, directly affecting human survival and development. Water quality detection is one of the necessary measures to protect water resources. Ultraviolet (UV) spectral analysis is an important research method in the field of water quality detection, in which partial least squares regression (PLSR) has become the predominant technique; in some special cases, however, PLSR produces considerable errors. To solve this problem, the traditional principal component regression (PCR) method was improved in this paper by using the principle of PLSR. The experimental results show that for some special experimental data sets the improved PCR performs better than PLSR. PCR and PLSR are the focus of this paper. First, principal component analysis (PCA) is performed in MATLAB to reduce the dimensionality of the spectral data; on the basis of a large number of experiments, the optimized principal components, which carry most of the original data information, are extracted using the principle of PLSR. Second, linear regression on the principal components is carried out with the Statistical Package for the Social Sciences (SPSS), from which the coefficients and relations of the principal components are obtained. Finally, the same water spectral data set is calculated by PLSR and by the improved PCR, and the two results are analyzed and compared: the improved PCR and PLSR are similar for most data, but the improved PCR is better than PLSR for data near the detection limit. Both PLSR and the improved PCR can be used in UV spectral analysis of water, but for data near the detection limit the improved PCR gives better results than PLSR.
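A minimal sketch of the comparison described above is given below, contrasting principal component regression (PCA followed by linear regression) with PLSR on a simulated absorbance matrix. The data, the number of components and the scikit-learn implementation are all assumptions for illustration; the paper's PLSR-guided component selection step is not reproduced.

```python
# Hedged sketch: principal component regression (PCR) vs. partial least squares
# regression (PLSR) on simulated UV absorbance spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
spectra = rng.normal(size=(60, 200))                           # 60 samples x 200 wavelengths
conc = 0.5 * spectra[:, 50] + rng.normal(scale=0.05, size=60)  # synthetic concentration

pcr = make_pipeline(PCA(n_components=5), LinearRegression()).fit(spectra, conc)
pls = PLSRegression(n_components=5).fit(spectra, conc)

print("PCR R^2:", round(pcr.score(spectra, conc), 3))
print("PLSR R^2:", round(pls.score(spectra, conc), 3))
```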
Liu, Xiang; Guo, Ling-Peng; Zhang, Fei-Yun; Ma, Jie; Mu, Shu-Yong; Zhao, Xin; Li, Lan-Hai
2015-02-01
Eight physical and chemical indicators related to water quality were monitored at nineteen sampling sites along the Kunes River at the end of the snowmelt season in spring. To investigate the spatial distribution characteristics of the water's physical and chemical properties, cluster analysis (CA), discriminant analysis (DA) and principal component analysis (PCA) were employed. The result of cluster analysis showed that the Kunes River could be divided into three reaches according to the similarities of water physical and chemical properties among sampling sites, representing the upstream, midstream and downstream of the river, respectively; the result of discriminant analysis demonstrated that the reliability of this classification was high, and DO, Cl⁻ and BOD5 were the significant indexes leading to the classification; three principal components were extracted on the basis of the principal component analysis, with a cumulative variance contribution of 86.90%. The result of principal component analysis also indicated that the water's physical and chemical properties were mostly affected by EC, ORP, NO3⁻-N, NH4⁺-N, Cl⁻ and BOD5. The sorted principal component scores at each sampling site showed that the water quality was mainly influenced by DO upstream, by pH midstream, and by the rest of the indicators downstream. The order of comprehensive scores for the principal components revealed that the water quality degraded from upstream to downstream, i.e., the upstream had the best water quality, followed by the midstream, while the water quality downstream was the worst. This result corresponded exactly to the three reaches classified using cluster analysis. Anthropogenic activity and the accumulation of pollutants along the river were probably the main reasons for this spatial difference.
ERIC Educational Resources Information Center
Florida State Univ., Tallahassee. Program of Vocational Education.
Part of a system by which local education agency (LEA) personnel may evaluate secondary and postsecondary vocational education programs, this fifth of eight components focuses on an analysis of the utilization of community resources. Utilization of the component is designed to open communication channels among all segments of the community so that…
ERIC Educational Resources Information Center
Mellard, Daryl F.; Anthony, Jason L.; Woods, Kari L.
2012-01-01
This study extends the literature on the component skills involved in oral reading fluency. Dominance analysis was applied to assess the relative importance of seven reading-related component skills in the prediction of the oral reading fluency of 272 adult literacy learners. The best predictors of oral reading fluency when text difficulty was…
ERIC Educational Resources Information Center
Kronenberger, William G.; Thompson, Robert J., Jr.; Morrow, Catherine
1997-01-01
A principal components analysis of the Family Environment Scale (FES) (R. Moos and B. Moos, 1994) was performed using 113 undergraduates. Research supported 3 broad components encompassing the 10 FES subscales. These results supported previous research and the generalization of the FES to college samples. (SLD)
Genetic association of impulsivity in young adults: a multivariate study
Khadka, S; Narayanan, B; Meda, S A; Gelernter, J; Han, S; Sawyer, B; Aslanzadeh, F; Stevens, M C; Hawkins, K A; Anticevic, A; Potenza, M N; Pearlson, G D
2014-01-01
Impulsivity is a heritable, multifaceted construct with clinically relevant links to multiple psychopathologies. We assessed impulsivity in young adult (N~2100) participants in a longitudinal study, using self-report questionnaires and computer-based behavioral tasks. Analysis was restricted to the subset (N=426) who underwent genotyping. Multivariate association between impulsivity measures and single-nucleotide polymorphism data was implemented using parallel independent component analysis (Para-ICA). Pathways associated with multiple genes in components that correlated significantly with impulsivity phenotypes were then identified using a pathway enrichment analysis. Para-ICA revealed two significantly correlated genotype–phenotype component pairs. One impulsivity component included the reward responsiveness subscale and behavioral inhibition scale of the Behavioral-Inhibition System/Behavioral-Activation System scale, and the second impulsivity component included the non-planning subscale of the Barratt Impulsiveness Scale and the Experiential Discounting Task. Pathway analysis identified processes related to neurogenesis, nervous system signal generation/amplification, neurotransmission and immune response. We identified various genes and gene regulatory pathways associated with empirically derived impulsivity components. Our study suggests that gene networks implicated previously in brain development, neurotransmission and immune response are related to impulsive tendencies and behaviors. PMID:25268255
Principal Component Clustering Approach to Teaching Quality Discriminant Analysis
ERIC Educational Resources Information Center
Xian, Sidong; Xia, Haibo; Yin, Yubo; Zhai, Zhansheng; Shang, Yan
2016-01-01
Teaching quality is the lifeline of higher education. Many universities have made effective achievements in evaluating teaching quality. In this paper, we establish a students' evaluation of teaching (SET) discriminant analysis model and algorithm based on principal component clustering analysis. Additionally, we classify the SET…
Analysis of the principal component algorithm in phase-shifting interferometry.
Vargas, J; Quiroga, J Antonio; Belenguer, T
2011-06-15
We recently presented a new asynchronous demodulation method for phase-sampling interferometry. The method is based on the principal component analysis (PCA) technique. In the former work, the PCA method was derived heuristically. In this work, we present an in-depth analysis of the PCA demodulation method.
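For orientation, the sketch below shows one common way a PCA-based demodulation of a phase-shifted fringe stack can be written: subtract the temporal mean, take the two leading spatial principal components via an SVD, and recover the wrapped phase from their ratio. The fringe data are simulated and the code is a generic sketch of this class of method, not the authors' implementation.

```python
# Hedged sketch: PCA-style demodulation of a stack of randomly phase-shifted
# fringe patterns (simulated); phase is recovered up to sign and offset.
import numpy as np

rng = np.random.default_rng(2)
h, w, n_frames = 64, 64, 8
yy, xx = np.mgrid[0:h, 0:w]
true_phase = 0.10 * (xx - w / 2) + 0.05 * (yy - h / 2)
shifts = rng.uniform(0, 2 * np.pi, n_frames)                  # unknown phase steps
frames = np.stack([1 + np.cos(true_phase + d) for d in shifts])

data = frames.reshape(n_frames, -1)
data = data - data.mean(axis=0)                               # remove temporal mean per pixel
_, _, vt = np.linalg.svd(data, full_matrices=False)           # rows of vt = spatial PCs
wrapped = np.arctan2(vt[1], vt[0]).reshape(h, w)              # wrapped phase estimate
print(wrapped.shape, float(wrapped.min()), float(wrapped.max()))
```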
NASA Astrophysics Data System (ADS)
Biswal, Milan; Mishra, Srikanta
2018-05-01
The limited information on the origin and nature of stimulus frequency otoacoustic emissions (SFOAEs) necessitates a thorough reexamination of SFOAE analysis procedures; this will lead to a better understanding of the generation of SFOAEs. The SFOAE response waveform in the time domain can be interpreted as a summation of amplitude-modulated and frequency-modulated component waveforms, and the efficiency of a technique in segregating these components is critical for describing the nature of SFOAEs. Recent advancements in robust time-frequency analysis algorithms claim more accurate extraction of these components from composite signals buried in noise; however, their potential has not been fully explored for SFOAE analysis. Insensitivity to distinct information, inherent to the nature of these analysis techniques, may affect the scientific conclusions. This paper attempts to bridge this gap in the literature by evaluating the performance of three linear time-frequency analysis algorithms, the short-time Fourier transform (STFT), the continuous wavelet transform (CWT) and the S-transform (ST), and two nonlinear algorithms, the Hilbert-Huang transform (HHT) and the synchrosqueezed wavelet transform (SWT). We revisit the extraction of constituent components and the estimation of their magnitude and delay by carefully evaluating the impact of variation in analysis parameters. The performance of HHT and SWT, from the perspective of time-frequency filtering and delay estimation, was found to be relatively less efficient for analyzing SFOAEs. The intrinsic mode functions of the HHT do not completely characterize the reflection components, and hence IMF-based filtering alone is not recommended for segregating the principal emission from multiple reflection components. We found the STFT, CWT and ST to be suitable for canceling multiple internal reflection components with only marginal alteration of the SFOAE.
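As a small illustration of the linear analyses named above, the sketch below computes a short-time Fourier transform of a synthetic frequency-modulated signal and reads off the dominant frequency in each frame. The chirp parameters and window length are assumptions, and the signal is not an OAE recording.

```python
# Hedged sketch: STFT of a synthetic FM signal and a crude ridge estimate
# (dominant frequency per analysis frame).
import numpy as np
from scipy.signal import stft

fs = 8000
t = np.arange(0, 0.5, 1 / fs)
rng = np.random.default_rng(3)
x = np.cos(2 * np.pi * (500 * t + 400 * t ** 2)) + 0.1 * rng.normal(size=t.size)

f, frames, Z = stft(x, fs=fs, nperseg=256)
ridge = f[np.abs(Z).argmax(axis=0)]           # dominant frequency in each frame (Hz)
print(ridge[:5].round(1), ridge[-5:].round(1))
```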
Selection of independent components based on cortical mapping of electromagnetic activity
NASA Astrophysics Data System (ADS)
Chan, Hui-Ling; Chen, Yong-Sheng; Chen, Li-Fen
2012-10-01
Independent component analysis (ICA) has been widely used to attenuate interference caused by noise components from the electromagnetic recordings of brain activity. However, the scalp topographies and associated temporal waveforms provided by ICA may be insufficient to distinguish functional components from artifactual ones. In this work, we proposed two component selection methods, both of which first estimate the cortical distribution of the brain activity for each component, and then determine the functional components based on the parcellation of brain activity mapped onto the cortical surface. Among all independent components, the first method can identify the dominant components, which have strong activity in the selected dominant brain regions, whereas the second method can identify those inter-regional associating components, which have similar component spectra between a pair of regions. For a targeted region, its component spectrum enumerates the amplitudes of its parceled brain activity across all components. The selected functional components can be remixed to reconstruct the focused electromagnetic signals for further analysis, such as source estimation. Moreover, the inter-regional associating components can be used to estimate the functional brain network. The accuracy of the cortical activation estimation was evaluated on the data from simulation studies, whereas the usefulness and feasibility of the component selection methods were demonstrated on the magnetoencephalography data recorded from a gender discrimination study.
Integrated fluorescence analysis system
Buican, Tudor N.; Yoshida, Thomas M.
1992-01-01
An integrated fluorescence analysis system enables a component part of a sample to be virtually sorted within a sample volume after a spectrum of the component part has been identified from a fluorescence spectrum of the entire sample in a flow cytometer. Birefringent optics enables the entire spectrum to be resolved into a set of numbers representing the intensity of spectral components of the spectrum. One or more spectral components are selected to program a scanning laser microscope, preferably a confocal microscope, whereby the spectrum from individual pixels or voxels in the sample can be compared. Individual pixels or voxels containing the selected spectral components are identified and an image may be formed to show the morphology of the sample with respect to only those components having the selected spectral components. There is no need for any physical sorting of the sample components to obtain the morphological information.
Amitsuka, Takahiko; Okamura, Maya; Mukuta, Kei; Shiibashi, Hiroko; Haraguchi, Kenji; Saito, Tsukasa; Inoue, Kazuo; Fushiki, Tohru
2017-08-01
Katsuodashi, a dried bonito broth, is a basic and indispensable element of Japanese cuisine and contains taste-active components and a unique aroma. We previously reported that this unique aroma contributes to the preference and reinforcement effect associated with dried bonito. This study aims to elucidate the contribution of the aromatic components of Katsuobushi to preference formation and the reinforcement effect. Volatile components obtained from dried bonito were fractionated and the fractions were subjected to a two-bottle choice test. The fractionation test suggested that the component responsible for the preference is not a single compound but comprises multiple components. In the GC-MS analysis/reconstruction test, a solution whose aromatic flavor was narrowed down to 125 compounds was preferred and also had a reinforcement effect. Moreover, GC-MS-olfactometry analysis narrowed the candidate components down to 28 of the 125. Mice showed preference for the test solution with aromatic flavor reconstructed from the 28 components but did not show reinforcement behavior.
Peleato, Nicolás M; Andrews, Robert C
2015-01-01
This work investigated the application of several fluorescence excitation-emission matrix analysis methods as natural organic matter (NOM) indicators for use in predicting the formation of trihalomethanes (THMs) and haloacetic acids (HAAs). Waters from four different sources (two rivers and two lakes) were subjected to jar testing followed by 24-h disinfection by-product formation tests using chlorine. NOM was quantified using three common measures: dissolved organic carbon, ultraviolet absorbance at 254 nm, and specific ultraviolet absorbance, as well as by principal component analysis, peak picking, and parallel factor analysis of fluorescence spectra. Based on multi-linear modeling of THMs and HAAs, principal component (PC) scores resulted in the lowest mean squared prediction error of cross-folded test sets (THMs: 43.7 (μg/L)², HAAs: 233.3 (μg/L)²). Inclusion of principal components representative of protein-like material significantly decreased prediction error for both THMs and HAAs. Parallel factor analysis did not identify a protein-like component and resulted in prediction errors similar to traditional NOM surrogates as well as fluorescence peak picking. These results support the value of fluorescence excitation-emission matrix-principal component analysis as a suitable NOM indicator in predicting the formation of THMs and HAAs for the water sources studied. Copyright © 2014. Published by Elsevier B.V.
Probabilistic structural analysis methods for space propulsion system components
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.
NASA Astrophysics Data System (ADS)
Chen, Si; Jiang, Hailong; Cao, Yan; Wang, Yun; Hu, Ziheng; Zhu, Zhenyu; Chai, Yifeng
2016-04-01
Simultaneously identifying the molecular targets responsible for the beneficial effects of active small-molecule compounds is an important and currently unmet challenge. In this study, we first proposed a network analysis integrating data from network pharmacology and metabolomics to simultaneously identify targets of active components in sini decoction (SND) against heart failure. To begin with, 48 potential active components of SND against heart failure were predicted by serum pharmacochemistry, text mining and similarity matching. We then employed network pharmacology, including text mining and molecular docking, to identify the potential targets of these components. The key enriched processes, pathways and related diseases of these target proteins were analyzed with the STRING database. Finally, network analysis was conducted to identify the most probable targets of the components in SND. Among the 25 targets predicted by network analysis, tumor necrosis factor α (TNF-α) was experimentally validated for the first time at the molecular and cellular levels. The results indicated that hypaconitine, mesaconitine, higenamine and quercetin in SND can directly bind to TNF-α, reduce TNF-α-mediated cytotoxicity in L929 cells and exert anti-apoptotic effects on myocardial cells. We envisage that network analysis will also be useful in the target identification of bioactive compounds.
Chen, Si; Jiang, Hailong; Cao, Yan; Wang, Yun; Hu, Ziheng; Zhu, Zhenyu; Chai, Yifeng
2016-01-01
Simultaneously identifying the molecular targets responsible for the beneficial effects of active small-molecule compounds is an important and currently unmet challenge. In this study, we first proposed a network analysis integrating data from network pharmacology and metabolomics to simultaneously identify targets of active components in sini decoction (SND) against heart failure. To begin with, 48 potential active components of SND against heart failure were predicted by serum pharmacochemistry, text mining and similarity matching. We then employed network pharmacology, including text mining and molecular docking, to identify the potential targets of these components. The key enriched processes, pathways and related diseases of these target proteins were analyzed with the STRING database. Finally, network analysis was conducted to identify the most probable targets of the components in SND. Among the 25 targets predicted by network analysis, tumor necrosis factor α (TNF-α) was experimentally validated for the first time at the molecular and cellular levels. The results indicated that hypaconitine, mesaconitine, higenamine and quercetin in SND can directly bind to TNF-α, reduce TNF-α-mediated cytotoxicity in L929 cells and exert anti-apoptotic effects on myocardial cells. We envisage that network analysis will also be useful in the target identification of bioactive compounds. PMID:27095146
Engine structures analysis software: Component Specific Modeling (COSMO)
NASA Astrophysics Data System (ADS)
McKnight, R. L.; Maffeo, R. J.; Schwartz, S.
1994-08-01
A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.
OCSEGen: Open Components and Systems Environment Generator
NASA Technical Reports Server (NTRS)
Tkachuk, Oksana
2014-01-01
To analyze a large system, one often needs to break it into smaller components. To analyze a component or unit under analysis, one needs to model its context of execution, called environment, which represents the components with which the unit interacts. Environment generation is a challenging problem, because the environment needs to be general enough to uncover unit errors, yet precise enough to make the analysis tractable. In this paper, we present a tool for automated environment generation for open components and systems. The tool, called OCSEGen, is implemented on top of the Soot framework. We present the tool's current support and discuss its possible future extensions.
Engine Structures Analysis Software: Component Specific Modeling (COSMO)
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Maffeo, R. J.; Schwartz, S.
1994-01-01
A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.
Check-Standard Testing Across Multiple Transonic Wind Tunnels with the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
Deloach, Richard
2012-01-01
This paper reports the results of an analysis of wind tunnel data acquired in support of the Facility Analysis Verification & Operational Reliability (FAVOR) project. The analysis uses methods referred to collectively at Langley Research Center as the Modern Design of Experiments (MDOE). These methods quantify the total variance in a sample of wind tunnel data and partition it into explained and unexplained components. The unexplained component is further partitioned into random and systematic components. This analysis was performed on data acquired in similar wind tunnel tests executed in four different U.S. transonic facilities. The measurement environment of each facility was quantified and compared.
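The core variance-partitioning idea can be sketched as below: fit a simple model to check-standard data and split the total variance into explained and unexplained parts. The data and the single-regressor model are placeholders, and the further split of the unexplained part into random and systematic components requires replicate and block structure not simulated here.

```python
# Hedged sketch: partition total variance of a measured coefficient into
# explained (model) and unexplained (residual) components.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
alpha = rng.uniform(-4, 4, 80).reshape(-1, 1)             # e.g. angle of attack (placeholder)
cl = 0.1 * alpha.ravel() + rng.normal(scale=0.02, size=80)

model = LinearRegression().fit(alpha, cl)
residual = cl - model.predict(alpha)
explained_fraction = 1 - np.var(residual) / np.var(cl)
print("explained fraction of total variance:", round(explained_fraction, 3))
```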
NASA Astrophysics Data System (ADS)
Kistenev, Yu. V.; Shapovalov, A. V.; Borisov, A. V.; Vrazhnov, D. A.; Nikolaev, V. V.; Nikiforova, O. Y.
2015-12-01
The results of numerical simulation of application principal component analysis to absorption spectra of breath air of patients with pulmonary diseases are presented. Various methods of experimental data preprocessing are analyzed.
Gu, Chaochao; Gao, Pin; Yang, Fan; An, Dongxuan; Munir, Mariya; Jia, Hanzhong; Xue, Gang; Ma, Chunyan
2017-05-01
The presence of antibiotic residues in the environment has been regarded as an emerging concern due to their potential adverse environmental consequences such as antibiotic resistance. However, the interaction between antibiotics and extracellular polymeric substances (EPSs) of biofilms in wastewater treatment systems is not entirely clear. In this study, the effect of ciprofloxacin (CIP) antibiotic on biofilm EPS matrix was investigated and characterized using fluorescence excitation-emission matrix (EEM) and parallel factor (PARAFAC) analysis. Physicochemical analysis showed that the proteins were the major EPS fraction, and their contents increased gradually with an increase in CIP concentration (0-300 μg/L). Based on the characterization of biofilm tightly bound EPS (TB-EPS) by EEM, three fluorescent components were identified by PARAFAC analysis. Component C1 was associated with protein-like substances, and components C2 and C3 belonged to humic-like substances. Component C1 exhibited an increasing trend as the CIP addition increased. Pearson's correlation results showed that CIP correlated significantly with the protein contents and component C1, while strong correlations were also found among UV254, dissolved organic carbon, humic acids, and component C3. A combined use of EEM-PARAFAC analysis and chemical measurements was demonstrated as a favorable approach for the characterization of variations in biofilm EPS in the presence of CIP antibiotic.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradonjic, Milan; Hagberg, Aric; Hengartner, Nick
We analyze component evolution in general random intersection graphs (RIGs) and give conditions on the existence and uniqueness of the giant component. Our techniques generalize the existing methods for the analysis of component evolution in RIGs. That is, we analyze survival and extinction properties of a dependent, inhomogeneous Galton-Watson branching process on general RIGs. Our analysis relies on bounding the branching processes and inherits the fundamental concepts from the study of component evolution in Erdos-Renyi graphs. The main challenge comes from the underlying structure of RIGs, where the number of offspring follows a binomial distribution with a different number of nodes and a different rate at each step of the evolution. RIGs can be interpreted as a model for large randomly formed non-metric data sets. Besides the mathematical analysis of component evolution, which we provide in this work, we perceive RIGs as an important random structure that has already found applications in social networks, epidemic networks, blog readership, and wireless sensor networks.
[Discrimination of Red Tide algae by fluorescence spectra and principal component analysis].
Su, Rong-guo; Hu, Xu-peng; Zhang, Chuan-song; Wang, Xiu-lin
2007-07-01
Fluorescence discrimination technology for 11 species of Red Tide algae at the genus level was constructed by principal component analysis and non-negative least squares. Rayleigh and Raman scattering peaks of the 3D fluorescence spectra were eliminated by the Delaunay triangulation method. According to the results of Fisher linear discrimination, the first and second principal component scores of the 3D fluorescence spectra were chosen as discriminant features and the feature base was established. The 11 algae species were tested, and more than 85% of samples were accurately discriminated; for Prorocentrum donghaiense, Skeletonema costatum and Gymnodinium sp., which have frequently caused red tides in the East China Sea, more than 95% of samples were correctly discriminated. The results showed that the genus-level discriminant features of the 3D fluorescence spectra of Red Tide algae given by principal component analysis work well.
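A schematic version of the workflow implied above is sketched below: PCA scores of reference spectra serve as discriminant features, and non-negative least squares is shown as one way to attribute a sample to reference spectra. All spectra are random placeholders, the 11 "genera" are synthetic, and the preprocessing (Delaunay scatter removal, Fisher discrimination) is not reproduced.

```python
# Hedged sketch: PCA scores as discriminant features for reference fluorescence
# spectra, plus a non-negative least squares fit of a mixed sample.
import numpy as np
from scipy.optimize import nnls
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
reference = rng.random((11, 300))                       # 11 genera x flattened EEM spectrum
sample = reference[[2, 5]].mean(axis=0) + 0.01 * rng.normal(size=300)

scores = PCA(n_components=2).fit_transform(reference)   # first two PC scores per genus
weights, residual = nnls(reference.T, sample)           # non-negative mixing weights
print(scores.shape, weights.round(2))
```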
Krueger, Alexander P; Singh, Gurpal; Beil, Frank Timo; Feuerstein, Bernd; Ruether, Wolfgang; Lohmann, Christoph H
2014-05-01
Ceramic components in total knee arthroplasty (TKA) are evolving. We analyze the first case of BIOLOX delta ceramic femoral component fracture. A longitudinal midline fracture in the patellar groove was present, with an intact cement mantle and no bony defects. Fractographic analysis with laser scanning microscopy and white light interferometry showed no evidence of arrest lines, hackles, wake hackles, material flaws, fatigue or crack propagation. Analysis of periprosthetic tissues with Fourier-transform infrared (FT-IR) microscopy, contact radiography, histology, and subsequent digestion and high-speed centrifugation did not show ceramic debris. A macrophage-dominated response was present around polyethylene debris. We conclude that ceramic femoral component failure in this case was related to a traumatic event. Further research is needed to determine the suitability of ceramic components in TKA. Copyright © 2014 Elsevier Inc. All rights reserved.
Kakio, Tomoko; Nagase, Hitomi; Takaoka, Takashi; Yoshida, Naoko; Hirakawa, Junichi; Macha, Susan; Hiroshima, Takashi; Ikeda, Yukihiro; Tsuboi, Hirohito; Kimura, Kazuko
2018-06-01
The World Health Organization has warned that substandard and falsified medical products (SFs) can harm patients and fail to treat the diseases for which they were intended, and they affect every region of the world, leading to loss of confidence in medicines, health-care providers, and health systems. Therefore, the development of analytical procedures to detect SFs is extremely important. In this study, we investigated the quality of pharmaceutical tablets containing the antihypertensive candesartan cilexetil, collected in China, Indonesia, Japan, and Myanmar, using the Japanese pharmacopeial analytical procedures for quality control, together with principal component analysis (PCA) of Raman spectra obtained with a handheld Raman spectrometer. Some samples showed delayed dissolution and failed to meet the pharmacopeial specification, whereas others failed the assay test; these products appeared to be substandard. Principal component analysis showed that all Raman spectra could be explained in terms of two components: the amount of the active pharmaceutical ingredient and the kinds of excipients. The PCA score plot indicated that one substandard product and the falsified tablets have similar principal components in their Raman spectra, in contrast to the authentic products. The locations of samples within the PCA score plot varied according to the source country, suggesting that manufacturers in different countries use different excipients. Our results indicate that the handheld Raman device will be useful for the detection of SFs in the field, and principal component analysis of the Raman data clarifies the difference in chemical properties between good-quality products and the SFs that circulate in the Asian market.
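The screening idea can be illustrated with the hedged sketch below: project Raman spectra onto their first two principal components so that tablets with a reduced active-ingredient signal separate from authentic ones in the score plot. The "spectra" are simulated Gaussian bands, not measurements from the study.

```python
# Hedged sketch: PCA score plot separating simulated 'authentic' and
# 'substandard' Raman spectra (reduced API band intensity).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
shift = np.arange(600)
band = np.exp(-((shift - 300) ** 2) / 2e3)              # stand-in API Raman band
authentic = band + 0.02 * rng.normal(size=(20, 600))
substandard = 0.5 * band + 0.02 * rng.normal(size=(5, 600))

scores = PCA(n_components=2).fit_transform(np.vstack([authentic, substandard]))
print("authentic PC1 mean:", scores[:20, 0].mean().round(2),
      "| substandard PC1 mean:", scores[20:, 0].mean().round(2))
```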
A 5-year (2002-2006) simulation of CMAQ covering the eastern United States is evaluated using principal component analysis in order to identify and characterize statistically significant patterns of model bias. Such analysis is useful in that it can identify areas of poor model ...
Accuracy of the Parallel Analysis Procedure with Polychoric Correlations
ERIC Educational Resources Information Center
Cho, Sun-Joo; Li, Feiming; Bandalos, Deborah
2009-01-01
The purpose of this study was to investigate the application of the parallel analysis (PA) method for choosing the number of factors in component analysis for situations in which data are dichotomous or ordinal. Although polychoric correlations are sometimes used as input for component analyses, the random data matrices generated for use in PA…
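For reference, a bare-bones parallel analysis on Pearson correlations is sketched below: components are retained while their observed eigenvalues exceed the average eigenvalues obtained from random data of the same size. The polychoric variant discussed in the abstract would replace the Pearson correlation matrix with a polychoric estimate, which is not implemented here.

```python
# Hedged sketch of Horn's parallel analysis with Pearson correlations.
import numpy as np

def parallel_analysis(X, n_random=100, seed=0):
    """Number of components whose eigenvalues exceed the random-data average."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
    rand = np.zeros(p)
    for _ in range(n_random):
        R = rng.normal(size=(n, p))
        rand += np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]
    return int(np.sum(obs > rand / n_random))

X = np.random.default_rng(7).normal(size=(200, 10))
X[:, :3] += X[:, [0]]                      # induce one correlated block
print("components to retain:", parallel_analysis(X))
```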
10 CFR 52.79 - Contents of applications; technical information in final safety analysis report.
Code of Federal Regulations, 2010 CFR
2010-01-01
... assurance program will be implemented; (26) The applicant's organizational structure, allocations or... presents a safety analysis of the structures, systems, and components of the facility as a whole. The final... contain an analysis and evaluation of the major structures, systems, and components of the facility that...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2014 CFR
2014-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2013 CFR
2013-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2011 CFR
2011-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2010 CFR
2010-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2012 CFR
2012-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
NASA Astrophysics Data System (ADS)
Palaniswamy, Hariharasudhan; Kanthadai, Narayan; Roy, Subir; Beauchesne, Erwan
2011-08-01
Crash, NVH (noise, vibration, harshness), and durability analyses are commonly deployed in structural CAE analysis for the mechanical design of components, especially in the automotive industry. Components manufactured by stamping constitute a major portion of the automotive structure. In CAE analysis they are modeled at a nominal state with uniform thickness and no residual stresses and strains. However, in reality the stamped components have non-uniformly distributed thickness and residual stresses and strains resulting from stamping. It is essential to consider the stamping information in CAE analysis to accurately model the behavior of sheet metal structures under different loading conditions. Especially with the current emphasis on weight reduction by replacing conventional steels with aluminum and advanced high-strength steels, it is imperative to avoid overdesign. Considering this growing need in industry, a highly automated and robust method has been integrated within Altair Hyperworks® to initialize sheet metal components in CAE models with stamping data. This paper demonstrates this new feature and the influence of stamping data for a full car frontal crash analysis.
NASA Astrophysics Data System (ADS)
Ji, Yi; Sun, Shanlin; Xie, Hong-Bo
2017-06-01
Discrete wavelet transform (WT) followed by principal component analysis (PCA) has been a powerful approach for the analysis of biomedical signals. Wavelet coefficients at various scales and channels are usually transformed into a one-dimensional array, causing issues such as the curse of dimensionality and the small sample size problem. In addition, the lack of time-shift invariance of WT coefficients can be modeled as noise and degrades classifier performance. In this study, we present a stationary wavelet-based two-directional two-dimensional principal component analysis (SW2D2PCA) method for the efficient and effective extraction of essential feature information from signals. Time-invariant multi-scale matrices are constructed in the first step. Two-directional two-dimensional principal component analysis then operates on the multi-scale matrices to reduce the dimension, rather than on vectors as in conventional PCA. Results are presented from an experiment to classify eight hand motions using 4-channel electromyographic (EMG) signals recorded from healthy subjects and amputees, which illustrates the efficiency and effectiveness of the proposed method for biomedical signal analysis.
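The two-directional 2D-PCA step at the heart of the method can be sketched generically as below, operating on random matrices standing in for the time-invariant multi-scale wavelet matrices. The matrix sizes and the numbers of retained eigenvectors are arbitrary assumptions, and the stationary wavelet construction itself is omitted.

```python
# Hedged sketch of two-directional two-dimensional PCA: project each sample
# matrix onto leading eigenvectors of the row and column covariance matrices.
import numpy as np

rng = np.random.default_rng(8)
samples = rng.normal(size=(50, 16, 32))                  # 50 samples of 16x32 matrices
centered = samples - samples.mean(axis=0)

col_cov = np.mean([a.T @ a for a in centered], axis=0)   # 32x32 column covariance
row_cov = np.mean([a @ a.T for a in centered], axis=0)   # 16x16 row covariance
U = np.linalg.eigh(row_cov)[1][:, -4:]                   # 4 leading row eigenvectors
V = np.linalg.eigh(col_cov)[1][:, -6:]                   # 6 leading column eigenvectors

features = np.stack([U.T @ a @ V for a in samples])      # 50 x 4 x 6 reduced features
print(features.shape)
```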
NASA Technical Reports Server (NTRS)
Nakazawa, S.
1988-01-01
This annual status report presents the results of work performed during the fourth year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of new computer codes permitting more accurate and efficient 3-D analysis of selected hot section components, i.e., combustor liners, turbine blades and turbine vanes. The computer codes embody a progression of math models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components. Volume 1 of this report discusses the special finite element models developed during the fourth year of the contract.
Mixture modelling for cluster analysis.
McLachlan, G J; Chang, S U
2004-10-01
Cluster analysis via a finite mixture model approach is considered. With this approach to clustering, the data can be partitioned into a specified number of clusters g by first fitting a mixture model with g components. An outright clustering of the data is then obtained by assigning an observation to the component to which it has the highest estimated posterior probability of belonging; that is, the ith cluster consists of those observations assigned to the ith component (i = 1,..., g). The focus is on the use of mixtures of normal components for the cluster analysis of data that can be regarded as being continuous. But attention is also given to the case of mixed data, where the observations consist of both continuous and discrete variables.
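A minimal sketch of the clustering procedure described above, using a two-component Gaussian mixture on synthetic continuous data and assigning each observation to the component with the highest posterior probability:

```python
# Hedged sketch: model-based clustering with a g-component Gaussian mixture.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(9)
data = np.vstack([rng.normal(0, 1, size=(100, 2)),
                  rng.normal(4, 1, size=(100, 2))])      # two synthetic clusters

gmm = GaussianMixture(n_components=2, random_state=0).fit(data)
labels = gmm.predict(data)                               # argmax posterior probability
posteriors = gmm.predict_proba(data)
print(np.bincount(labels), posteriors.shape)
```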
Principal components analysis in clinical studies.
Zhang, Zhongheng; Castelló, Adela
2017-09-01
In multivariate analysis, independent variables are usually correlated with each other, which can introduce multicollinearity into regression models. One approach to solving this problem is to apply principal components analysis (PCA) to these variables. This method uses an orthogonal transformation to represent sets of potentially correlated variables with principal components (PCs) that are linearly uncorrelated. PCs are ordered so that the first PC has the largest possible variance, and only some components are selected to represent the correlated variables. As a result, the dimension of the variable space is reduced. This tutorial illustrates how to perform PCA in the R environment; the example is a simulated dataset in which two PCs are responsible for the majority of the variance in the data. Furthermore, the visualization of PCA is highlighted.
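The tutorial above works in the R environment; as a rough cross-language sketch of the same idea, the Python snippet below replaces two highly collinear predictors and one independent predictor with their leading principal components before regression. The data are simulated and the snippet is not the tutorial's code.

```python
# Hedged sketch: PCA to remove multicollinearity before regression.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(10)
x1 = rng.normal(size=200)
X = np.column_stack([x1, x1 + 0.05 * rng.normal(size=200), rng.normal(size=200)])
y = 2 * x1 + rng.normal(scale=0.5, size=200)

Z = StandardScaler().fit_transform(X)                    # standardize before PCA
pcs = PCA(n_components=2).fit_transform(Z)               # uncorrelated regressors
print("R^2:", round(LinearRegression().fit(pcs, y).score(pcs, y), 2))
```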
Arthropod surveillance programs: Basic components, strategies, and analysis
USDA-ARS?s Scientific Manuscript database
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthro...
Generalized Structured Component Analysis with Uniqueness Terms for Accommodating Measurement Error
Hwang, Heungsun; Takane, Yoshio; Jung, Kwanghee
2017-01-01
Generalized structured component analysis (GSCA) is a component-based approach to structural equation modeling (SEM), where latent variables are approximated by weighted composites of indicators. It has no formal mechanism to incorporate errors in indicators, which in turn renders components prone to the errors as well. We propose to extend GSCA to account for errors in indicators explicitly. This extension, called GSCAM, considers both common and unique parts of indicators, as postulated in common factor analysis, and estimates a weighted composite of indicators with their unique parts removed. Adding such unique parts or uniqueness terms serves to account for measurement errors in indicators in a manner similar to common factor analysis. Simulation studies are conducted to compare parameter recovery of GSCAM and existing methods. These methods are also applied to fit a substantively well-established model to real data. PMID:29270146
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C C
The Computational Electronics and Electromagnetics thrust area serves as the focal point for Engineering R and D activities for developing computer-based design and analysis tools. Representative applications include design of particle accelerator cells and beamline components; design of transmission line components; engineering analysis and design of high-power (optical and microwave) components; photonics and optoelectronics circuit design; electromagnetic susceptibility analysis; and antenna synthesis. The FY-97 effort focuses on development and validation of (1) accelerator design codes; (2) 3-D massively parallel, time-dependent EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; and (5) development of beam control algorithms coupled to beam transport physics codes. These efforts are in association with technology development in the power conversion, nondestructive evaluation, and microtechnology areas. The efforts complement technology development in Lawrence Livermore National programs.
NASA Technical Reports Server (NTRS)
Chatterjee, Sharmista
1993-01-01
Our first goal in this project was to perform a systems analysis of a closed-loop Environmental Control and Life Support System (ECLSS). This pertains to developing a model of an existing real system from which to assess the state or performance of that system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used the simulator tool ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. The second law of thermodynamics is used to determine the amount of irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components that generate the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
Effect of noise in principal component analysis with an application to ozone pollution
NASA Astrophysics Data System (ADS)
Tsakiri, Katerina G.
This thesis analyzes the effect of independent noise in the principal components of k normally distributed random variables defined by a covariance matrix. We prove that the principal components, as well as the canonical variate pairs, determined from the joint distribution of the original sample affected by noise can differ essentially from those determined from the original sample. However, when the differences between the eigenvalues of the original covariance matrix are sufficiently large compared with the level of the noise, the effect of noise on the principal components and canonical variate pairs proves to be negligible. The theoretical results are supported by a simulation study and examples. Moreover, we compare our results on the eigenvalues and eigenvectors in the two-dimensional case with other models examined previously. This theory can be applied in any field for the decomposition of components in multivariate analysis. One application is the detection and prediction of the main atmospheric factor of ozone concentrations, illustrated with data from Albany, New York. Using daily ozone, solar radiation, temperature, wind speed and precipitation data, we determine the main atmospheric factor for the explanation and prediction of ozone concentrations. A methodology is described for decomposing the time series of ozone and other atmospheric variables into a global component, which describes the long-term trend and the seasonal variations, and a synoptic-scale component, which describes the short-term variations. Using canonical correlation analysis, we show that solar radiation is the only main factor, among the atmospheric variables considered here, for the explanation and prediction of the global and synoptic-scale components of ozone. The global components are modeled by a linear regression model, while the synoptic-scale components are modeled by a vector autoregressive model and the Kalman filter. The coefficient of determination, R2, for the prediction of the synoptic-scale ozone component was found to be highest when the synoptic-scale components of the time series for solar radiation and temperature are included. KEY WORDS: multivariate analysis; principal component; canonical variate pairs; eigenvalue; eigenvector; ozone; solar radiation; spectral decomposition; Kalman filter; time series prediction
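The abstract's central claim, that noise leaves the leading principal components nearly unchanged when the gaps between eigenvalues are large relative to the noise level, can be illustrated numerically. The sketch below is a toy example of my own (not the thesis code): it compares the first eigenvector of a covariance matrix estimated before and after adding independent noise.

```python
import numpy as np

rng = np.random.default_rng(1)
# Covariance with well-separated eigenvalues (10, 3, 1) in a random orientation.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
cov = Q @ np.diag([10.0, 3.0, 1.0]) @ Q.T

X = rng.multivariate_normal(np.zeros(3), cov, size=5000)
X_noisy = X + rng.normal(scale=0.3, size=X.shape)   # independent noise, small vs the eigen-gaps

def leading_eigvec(data):
    vals, vecs = np.linalg.eigh(np.cov(data, rowvar=False))
    return vecs[:, -1]                               # eigenvector of the largest eigenvalue

v_clean, v_noisy = leading_eigvec(X), leading_eigvec(X_noisy)
print(abs(v_clean @ v_noisy))                        # close to 1: first PC barely affected
```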
ERIC Educational Resources Information Center
McCormick, Ernest J.; And Others
The Position Analysis Questionnaire (PAQ), a structured job analysis questionnaire that provides for the analysis of individual jobs in terms of each of 187 job elements, was used to establish the job component validity of certain commercially-available vocational aptitude tests. Prior to the general analyses reported here, a statistical analysis…
Phenomenology of mixed states: a principal component analysis study.
Bertschy, G; Gervasoni, N; Favre, S; Liberek, C; Ragama-Pardos, E; Aubry, J-M; Gex-Fabry, M; Dayer, A
2007-12-01
To contribute to the definition of external and internal limits of mixed states and study the place of dysphoric symptoms in the psychopathology of mixed states. One hundred and sixty-five inpatients with major mood episodes were diagnosed as presenting with either pure depression, mixed depression (depression plus at least three manic symptoms), full mixed state (full depression and full mania), mixed mania (mania plus at least three depressive symptoms) or pure mania, using an adapted version of the Mini International Neuropsychiatric Interview (DSM-IV version). They were evaluated using a 33-item inventory of depressive, manic and mixed affective signs and symptoms. Principal component analysis without rotation yielded three components that together explained 43.6% of the variance. The first component (24.3% of the variance) contrasted typical depressive symptoms with typical euphoric, manic symptoms. The second component, labeled 'dysphoria' (13.8%), had strong positive loadings for irritability, distressing sensitivity to light and noise, impulsivity and inner tension. The third component (5.5%) included symptoms of insomnia. Median scores for the first component significantly decreased from the pure depression group to the pure mania group. For the dysphoria component, scores were highest among patients with full mixed states and decreased towards both patients with pure depression and those with pure mania. Principal component analysis revealed that dysphoria represents an important dimension of mixed states.
Bailón, Raquel; Garatachea, Nuria; de la Iglesia, Ignacio; Casajús, Jose Antonio; Laguna, Pablo
2013-07-01
The analysis and interpretation of heart rate variability (HRV) during exercise is challenging not only because of the nonstationary nature of exercise, the time-varying mean heart rate, and the fact that the respiratory frequency exceeds 0.4 Hz, but also because other factors, such as the component centered at the pedaling frequency observed in maximal cycling tests, may confound the interpretation of HRV analysis. The objectives of this study are to test the hypothesis that a component centered at the running stride frequency (SF) appears in the HRV of subjects during maximal treadmill exercise testing, and to study its influence on the interpretation of the low-frequency (LF) and high-frequency (HF) components of HRV during exercise. The HRV of 23 subjects during maximal treadmill exercise testing is analyzed. The instantaneous power of different HRV components is computed from the smoothed pseudo-Wigner-Ville distribution of the modulating signal assumed to carry information from the autonomic nervous system, which is estimated based on the time-varying integral pulse frequency modulation model. Besides the LF and HF components, the appearance of a component centered at the running SF, as well as its aliases, is revealed. The power associated with the SF component and its aliases represents 22±7% (median±median absolute deviation) of the total HRV power in all the subjects. Normalized LF power decreases as the exercise intensity increases, while normalized HF power increases. The power associated with the SF does not change significantly with exercise intensity. Consideration of the running SF component and its aliases is very important in HRV analysis since stride frequency aliases may overlap with LF and HF components.
Nguyen, Phuong H
2007-05-15
Principal component analysis is a powerful method for projecting the multidimensional conformational space of peptides or proteins onto lower dimensional subspaces in which the main conformations are present, making it easier to reveal the structures of molecules from e.g. molecular dynamics simulation trajectories. However, the identification of all conformational states is still difficult if the subspaces consist of more than two dimensions. This is mainly due to the fact that the principal components are not independent of each other, and states in the subspaces cannot be visualized. In this work, we propose a simple and fast scheme that allows one to obtain all conformational states in the subspaces. The basic idea is that instead of directly identifying the states in the subspace spanned by the principal components, we first transform this subspace into another subspace formed by components that are independent of one another. These independent components are obtained from the principal components by employing the independent component analysis method. Because of the independence between components, all states in this new subspace are defined as all possible combinations of the states obtained from each single independent component. This makes the conformational analysis much simpler. We test the performance of the method by analyzing the conformations of the glycine tripeptide and the alanine hexapeptide. The analyses show that our method is simple and quickly reveals all conformational states in the subspaces. The folding pathways between the identified states of the alanine hexapeptide are analyzed and discussed in some detail. 2007 Wiley-Liss, Inc.
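A minimal sketch of the two-step idea described above: project the coordinates onto a low-dimensional principal subspace and then rotate that subspace with ICA so each component can be inspected independently. The array below is a generic stand-in for an MD trajectory, not actual simulation data.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Stand-in for an MD trajectory: n_frames x n_coordinates (hypothetical data).
rng = np.random.default_rng(2)
coords = rng.normal(size=(1000, 30))

# Step 1: project onto a low-dimensional principal subspace.
pcs = PCA(n_components=3).fit_transform(coords)

# Step 2: rotate that subspace into statistically independent components,
# so conformational states can be read off each component separately.
ics = FastICA(n_components=3, random_state=0, max_iter=1000).fit_transform(pcs)

# States found along each independent component (e.g. from 1-D histogram peaks)
# can then be combined to enumerate states in the full subspace.
print(ics.shape)
```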
MULTI-COMPONENT ANALYSIS OF POSITION-VELOCITY CUBES OF THE HH 34 JET
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodriguez-Gonzalez, A.; Esquivel, A.; Raga, A. C.
We present an analysis of Hα spectra of the HH 34 jet with two-dimensional spectral resolution. We carry out multi-Gaussian fits to the spatially resolved line profiles and derive maps of the intensity, radial velocity, and velocity width of each of the components. We find that close to the outflow source we have three components: a high (negative) radial velocity component with a well-collimated, jet-like morphology; an intermediate velocity component with a broader morphology; and a positive radial velocity component with a non-collimated morphology and large linewidth. We suggest that this positive velocity component is associated with jet emission scattered in stationary dust present in the circumstellar environment. Farther away from the outflow source, we find only two components (a high, negative radial velocity component, which has a narrower spatial distribution than an intermediate velocity component). The fitting procedure was carried out with the new AGA-V1 code, which is available online and is described in detail in this paper.
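The multi-Gaussian decomposition of a spatially resolved line profile can be sketched as follows. This is an illustrative fit with scipy (the paper uses its own AGA-V1 code), with a synthetic two-component profile standing in for an observed spectrum.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(v, a1, v1, w1, a2, v2, w2):
    """Sum of two Gaussian velocity components: amplitude, centroid, width for each."""
    return (a1 * np.exp(-0.5 * ((v - v1) / w1) ** 2)
            + a2 * np.exp(-0.5 * ((v - v2) / w2) ** 2))

v = np.linspace(-300, 300, 400)                          # radial velocity axis (km/s)
profile = two_gaussians(v, 1.0, -150, 25, 0.4, 10, 60)   # synthetic line profile
profile += np.random.default_rng(3).normal(scale=0.02, size=v.size)

p0 = [1.0, -100, 30, 0.5, 0, 50]                         # initial guesses
popt, _ = curve_fit(two_gaussians, v, profile, p0=p0)
print(popt)  # recovered intensity, radial velocity, and width of each component
```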
Principal Component and Linkage Analysis of Cardiovascular Risk Traits in the Norfolk Isolate
Cox, Hannah C.; Bellis, Claire; Lea, Rod A.; Quinlan, Sharon; Hughes, Roger; Dyer, Thomas; Charlesworth, Jac; Blangero, John; Griffiths, Lyn R.
2009-01-01
Objective(s): An individual's risk of developing cardiovascular disease (CVD) is influenced by genetic factors. This study focussed on mapping genetic loci for CVD-risk traits in a unique population isolate derived from Norfolk Island. Methods: This investigation focussed on 377 individuals descended from the population founders. Principal component analysis was used to extract orthogonal components from 11 cardiovascular risk traits. Multipoint variance component methods were used to assess genome-wide linkage to the derived factors using SOLAR. A total of 285 of the 377 related individuals were informative for linkage analysis. Results: A total of 4 principal components accounting for 83% of the total variance were derived. Principal component 1 was loaded with body size indicators; principal component 2 with body size, cholesterol and triglyceride levels; principal component 3 with the blood pressures; and principal component 4 with LDL-cholesterol and total cholesterol levels. Suggestive evidence of linkage for principal component 2 (h2 = 0.35) was observed on chromosome 5q35 (LOD = 1.85; p = 0.0008), while peak regions on chromosomes 10p11.2 (LOD = 1.27; p = 0.005) and 12q13 (LOD = 1.63; p = 0.003) were observed to segregate with principal components 1 (h2 = 0.33) and 4 (h2 = 0.42), respectively. Conclusion(s): This study investigated a number of CVD risk traits in a unique isolated population. Findings support the clustering of CVD risk traits and provide interesting evidence of a region on chromosome 5q35 segregating with weight, waist circumference, HDL-c and total triglyceride levels. PMID:19339786
ADAPTION OF NONSTANDARD PIPING COMPONENTS INTO PRESENT DAY SEISMIC CODES
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. T. Clark; M. J. Russell; R. E. Spears
2009-07-01
With spiraling energy demand and flat energy supply, there is a need to extend the life of older nuclear reactors. This sometimes requires that existing systems be evaluated to present-day seismic codes. Older reactors built in the 1960s and early 1970s often used fabricated piping components that were code compliant during their initial construction time period, but are outside the standard parameters of present-day piping codes. There are several approaches available to the analyst in evaluating these non-standard components to modern codes. The simplest approach is to use the flexibility factors and stress indices for similar standard components with the assumption that the non-standard component's flexibility factors and stress indices will be very similar. This approach can require significant engineering judgment. A more rational approach available in Section III of the ASME Boiler and Pressure Vessel Code, which is the subject of this paper, involves calculation of flexibility factors using finite element analysis of the non-standard component. Such analysis allows modeling of geometric and material nonlinearities. Flexibility factors based on these analyses are sensitive to the load magnitudes used in their calculation, load magnitudes that need to be consistent with those produced by the linear system analyses where the flexibility factors are applied. This can lead to iteration, since the magnitude of the loads produced by the linear system analysis depends on the magnitude of the flexibility factors. After the loading applied to the nonstandard component finite element model has been matched to loads produced by the associated linear system model, the component finite element model can then be used to evaluate the performance of the component under the loads with the nonlinear analysis provisions of the Code, should the load levels lead to calculated stresses in excess of Allowable stresses. This paper details the application of component-level finite element modeling to account for geometric and material nonlinear component behavior in a linear elastic piping system model. Note that this technique can be applied to the analysis of B31 piping systems.
Conceptual model of iCAL4LA: Proposing the components using comparative analysis
NASA Astrophysics Data System (ADS)
Ahmad, Siti Zulaiha; Mutalib, Ariffin Abdul
2016-08-01
This paper discusses an on-going study that establishes an initial process for determining the common components of a conceptual model of interactive computer-assisted learning specifically designed for low-achieving children. This group of children needs specific learning support that can be used as alternative learning material in their learning environment. In order to develop the conceptual model, this study extracts the common components from 15 strongly justified computer-assisted learning studies. A comparative analysis was conducted to determine the most appropriate components, using a set of specific indication classifications to prioritize applicability. The results of the extraction process reveal 17 common components for consideration. Later, based on scientific justifications, 16 of them were selected as the proposed components for the model.
The Importance of Engine External's Health
NASA Technical Reports Server (NTRS)
Stoner, Barry L.
2006-01-01
Engine external components include all the fluid-carrying, electron-carrying, and support devices that are needed to operate the propulsion system. These components are varied and include: pumps, valves, actuators, solenoids, sensors, switches, heat exchangers, electrical generators, electrical harnesses, tubes, ducts, clamps and brackets. The failure of any component to perform its intended function will result in a maintenance action, a dispatch delay, or an engine in-flight shutdown. The life of each component, in addition to its basic functional design, is closely tied to its thermal and dynamic environment. Therefore, to reach a mature design life, the component's thermal and dynamic environment must be understood and controlled, which can only be accomplished by attention to design analysis and testing. The purpose of this paper is to review analysis and test techniques toward achieving good component health.
14 CFR 35.43 - Propeller hydraulic components.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Propeller hydraulic components. 35.43... AIRWORTHINESS STANDARDS: PROPELLERS Tests and Inspections § 35.43 Propeller hydraulic components. Applicants must show by test, validated analysis, or both, that propeller components that contain hydraulic...
14 CFR 35.43 - Propeller hydraulic components.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Propeller hydraulic components. 35.43... AIRWORTHINESS STANDARDS: PROPELLERS Tests and Inspections § 35.43 Propeller hydraulic components. Applicants must show by test, validated analysis, or both, that propeller components that contain hydraulic...
14 CFR 35.43 - Propeller hydraulic components.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Propeller hydraulic components. 35.43... AIRWORTHINESS STANDARDS: PROPELLERS Tests and Inspections § 35.43 Propeller hydraulic components. Applicants must show by test, validated analysis, or both, that propeller components that contain hydraulic...
14 CFR 35.43 - Propeller hydraulic components.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Propeller hydraulic components. 35.43... AIRWORTHINESS STANDARDS: PROPELLERS Tests and Inspections § 35.43 Propeller hydraulic components. Applicants must show by test, validated analysis, or both, that propeller components that contain hydraulic...
14 CFR 35.43 - Propeller hydraulic components.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Propeller hydraulic components. 35.43... AIRWORTHINESS STANDARDS: PROPELLERS Tests and Inspections § 35.43 Propeller hydraulic components. Applicants must show by test, validated analysis, or both, that propeller components that contain hydraulic...
Is Stacking Intervention Components Cost-Effective? An Analysis of the Incredible Years Program
ERIC Educational Resources Information Center
Foster, E. Michael; Olchowski, Allison E.; Webster-Stratton, Carolyn H.
2007-01-01
The cost-effectiveness of delivering stacked multiple intervention components for children is compared to that of implementing a single intervention component through an analysis of the Incredible Years Series program. The results suggest that multiple intervention components are more cost-effective than single intervention components.
Zeng, Yanling; Lu, Yang; Chen, Zhao; Tan, Jiawei; Bai, Jie; Li, Pengyue; Wang, Zhixin; Du, Shouying
2018-05-11
Bolbostemma paniculatum is a traditional Chinese medicine (TCM) with various therapeutic effects. Owing to its complex chemical composition, few investigations have acquired a comprehensive understanding of the chemical profile of this herb or explicated the differences between samples collected from different places. In this study, a strategy based on UPLC tandem LTQ-Orbitrap MSn was established for characterizing the chemical components of B. paniculatum. Through a systematic identification strategy, a total of 60 components in B. paniculatum were rapidly separated in 30 min and identified. Then, based on the peak intensities of all the characterized components, principal component analysis (PCA) and hierarchical cluster analysis (HCA) were employed to classify 18 batches of B. paniculatum into four groups, which were highly consistent with the four climate types of their places of origin. Five compounds were finally screened out as chemical markers to discriminate the internal quality of B. paniculatum. As the first study to systematically characterize the chemical components of B. paniculatum by UPLC-MSn, the above results could offer essential data for its pharmacological research, and the current strategy could provide a useful reference for future investigations on the discovery of important chemical constituents in TCM, as well as the establishment of quality control and evaluation methods.
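The batch-classification step, PCA followed by hierarchical clustering on a peak-intensity table, can be sketched as follows; the matrix shape and group count are illustrative placeholders, not the study's actual data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical peak-intensity table: 18 batches x 60 characterized components.
rng = np.random.default_rng(4)
intensities = rng.lognormal(size=(18, 60))

scaled = StandardScaler().fit_transform(intensities)   # autoscale each component
scores = PCA(n_components=3).fit_transform(scaled)     # compress to the leading PCs

tree = linkage(scores, method="ward")                  # hierarchical clustering on PC scores
groups = fcluster(tree, t=4, criterion="maxclust")     # cut the dendrogram into four groups
print(groups)
```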
Motegi, Hiromi; Tsuboi, Yuuri; Saga, Ayako; Kagami, Tomoko; Inoue, Maki; Toki, Hideaki; Minowa, Osamu; Noda, Tetsuo; Kikuchi, Jun
2015-11-04
There is an increasing need to use multivariate statistical methods for understanding biological functions, identifying the mechanisms of diseases, and exploring biomarkers. In addition to classical analyses such as hierarchical cluster analysis, principal component analysis, and partial least squares discriminant analysis, various multivariate strategies, including independent component analysis, non-negative matrix factorization, and multivariate curve resolution, have recently been proposed. However, determining the number of components is problematic. Despite the proposal of several different methods, no satisfactory approach has yet been reported. To resolve this problem, we implemented a new idea: classifying a component as "reliable" or "unreliable" based on the reproducibility of its appearance, regardless of the number of components in the calculation. Using the clustering method for classification, we applied this idea to multivariate curve resolution-alternating least squares (MCR-ALS). Comparisons between conventional and modified methods applied to proton nuclear magnetic resonance ((1)H-NMR) spectral datasets derived from known standard mixtures and biological mixtures (urine and feces of mice) revealed that more plausible results are obtained by the modified method. In particular, clusters containing little information were detected with reliability. This strategy, named "cluster-aided MCR-ALS," will facilitate the attainment of more reliable results in the metabolomics datasets.
Caprihan, A; Pearlson, G D; Calhoun, V D
2008-08-15
Principal component analysis (PCA) is often used to reduce the dimension of data before applying more sophisticated data analysis methods such as non-linear classification algorithms or independent component analysis. This practice is based on selecting the components corresponding to the largest eigenvalues. If the ultimate goal is separation of the data into two groups, then this set of components need not have the most discriminatory power. We measured the distance between two such populations using the Mahalanobis distance and chose the eigenvectors to maximize it, a modified PCA method which we call discriminant PCA (DPCA). DPCA was applied to diffusion tensor-based fractional anisotropy images to distinguish age-matched schizophrenia subjects from healthy controls. The performance of the proposed method was evaluated by the leave-one-out method. We show that for this fractional anisotropy data set the classification error with 60 components was close to the minimum error, and that the Mahalanobis distance was twice as large with DPCA as with PCA. Finally, by masking the discriminant function with the white matter tracts of the Johns Hopkins University atlas, we identified the left superior longitudinal fasciculus as the tract which gave the least classification error. In addition, with six optimally chosen tracts the classification error was zero.
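A rough sketch of the idea as I read it from the abstract (not the authors' code): rank the PCA eigenvectors by the squared between-group difference of the projected means divided by the component variance, which is each component's contribution to the Mahalanobis distance, and keep the most discriminative ones instead of the largest-eigenvalue ones.

```python
import numpy as np

def dpca_ranking(X, y, n_keep=10):
    """Rank principal directions by their contribution to the Mahalanobis
    distance between two groups (labels 0/1) and keep the top n_keep.
    Illustrative reading of discriminant PCA, not the published implementation."""
    Xc = X - X.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    proj = Xc @ eigvecs                                   # scores on each eigenvector
    delta = proj[y == 1].mean(axis=0) - proj[y == 0].mean(axis=0)
    contrib = delta**2 / np.maximum(eigvals, 1e-12)       # per-component Mahalanobis term
    order = np.argsort(contrib)[::-1][:n_keep]            # most discriminative first
    return eigvecs[:, order], contrib[order]

# Toy example: two groups differing along a low-variance direction.
rng = np.random.default_rng(5)
X = rng.normal(size=(200, 50))
y = np.repeat([0, 1], 100)
X[y == 1, -1] += 0.8                                      # group difference hidden in a "small" PC
W, scores = dpca_ranking(X, y, n_keep=5)
print(scores)
```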
Ibrahim, George M; Morgan, Benjamin R; Macdonald, R Loch
2014-03-01
Predictors of outcome after aneurysmal subarachnoid hemorrhage have previously been determined through hypothesis-driven methods that often exclude putative covariates and require a priori knowledge of potential confounders. Here, we apply a data-driven approach, principal component analysis, to identify baseline patient phenotypes that may predict neurological outcomes. Principal component analysis was performed on 120 subjects enrolled in a prospective randomized trial of clazosentan for the prevention of angiographic vasospasm. Correlation matrices were created using a combination of Pearson, polyserial, and polychoric correlations among 46 variables. Scores of significant components (with eigenvalues > 1) were included in multivariate logistic regression models with the incidence of severe angiographic vasospasm, delayed ischemic neurological deficit, and long-term outcome as outcomes of interest. Sixteen significant principal components accounting for 74.6% of the variance were identified. A single component dominated by the patients' initial hemodynamic status, World Federation of Neurosurgical Societies score, neurological injury, and initial neutrophil/leukocyte counts was significantly associated with poor outcome. Two additional components were associated with angiographic vasospasm, of which one was also associated with delayed ischemic neurological deficit. The first was dominated by the aneurysm-securing procedure, subarachnoid clot clearance, and intracerebral hemorrhage, whereas the second had high contributions from markers of anemia and albumin levels. Principal component analysis, a data-driven approach, identified patient phenotypes that are associated with worse neurological outcomes. Such data reduction methods may provide a better approximation of unique patient phenotypes and may inform clinical care as well as patient recruitment into clinical trials. http://www.clinicaltrials.gov. Unique identifier: NCT00111085.
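The modeling pattern described, retaining components with eigenvalues above 1 and entering their scores into a logistic regression, can be sketched generically. The data below are random placeholders, not the trial dataset.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

# Hypothetical baseline data: 120 subjects x 46 variables, plus a binary outcome.
rng = np.random.default_rng(6)
X = rng.normal(size=(120, 46))
outcome = rng.integers(0, 2, size=120)

Xs = StandardScaler().fit_transform(X)
pca = PCA().fit(Xs)
keep = pca.explained_variance_ > 1.0          # Kaiser-style rule: eigenvalues > 1
scores = pca.transform(Xs)[:, keep]           # scores of the retained components

model = LogisticRegression(max_iter=1000).fit(scores, outcome)
print(keep.sum(), model.coef_.shape)          # number of retained PCs, coefficient matrix shape
```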
Hyperspectral functional imaging of the human brain
NASA Astrophysics Data System (ADS)
Toronov, Vladislav; Schelkanova, Irina
2013-03-01
We performed an independent component analysis of hyperspectral functional near-infrared data acquired on humans during exercise and rest. We found that the hyperspectral functional data acquired on the human brain require only two physiologically meaningful components to cover more than 50% of the temporal variance at hundreds of wavelengths. The analysis of the spectra of the independent components showed that these components could be interpreted as the results of changes in the cerebral blood volume and blood flow. We also found significant contributions of water and cytochrome c oxidase to the changes associated with the independent components. Another remarkable effect of ICA was its good performance in filtering the noise in the data.
[Identification of two varieties of Citri Fructus by fingerprint and chemometrics].
Su, Jing-hua; Zhang, Chao; Sun, Lei; Gu, Bing-ren; Ma, Shuang-cheng
2015-06-01
Citri Fructus identification by fingerprint and chemometrics was investigated in this paper. Twenty-three Citri Fructus samples were collected, referring to the two varieties recorded in the Chinese Pharmacopoeia, Citrus wilsonii and C. medica. HPLC chromatograms were obtained. The components were partly identified using reference substances, and then a common pattern was established for chemometric analysis. Similarity analysis, principal component analysis (PCA), partial least squares-discriminant analysis (PLS-DA) and hierarchical cluster analysis heatmaps were applied. The results indicated that C. wilsonii and C. medica could be ideally classified with a common pattern containing twenty-five characteristic peaks. In addition, preliminary pattern recognition verified the chemometric analytical results. Absolute peak area (APA) was used for the relevant quantitative analysis; the results showed differences between the two varieties, which is valuable for further quality control and the selection of characteristic components.
NASA Astrophysics Data System (ADS)
Golub, M. A.; Sisakyan, I. N.; Soĭfer, V. A.; Uvarov, G. V.
1989-04-01
Theoretical and experimental investigations are reported of new mode optical components (elements) which are analogs of sinusoidal phase diffraction gratings with a variable modulation depth. Expressions are derived for the nonlinear predistortion and depth of modulation, which are essential for effective operation of amplitude and phase mode optical components in devices used for analysis and formation of the transverse mode composition of coherent radiation. An estimate is obtained of the energy efficiency of phase and amplitude mode optical components, and a comparison is made with the results of an experimental investigation of a set of phase optical components matched to Gauss-Laguerre modes. It is shown that the improvement in the energy efficiency of phase mode components, compared with amplitude components, is the same as the improvement achieved using a phase diffraction grating, compared with an amplitude grating with the same depth of modulation.
NASA Astrophysics Data System (ADS)
Vidic, Nataša. J.; TenPas, Jeff D.; Verosub, Kenneth L.; Singer, Michael J.
2000-08-01
Magnetic susceptibility variations in the Chinese loess/palaeosol sequences have been used extensively for palaeoclimatic interpretations. The magnetic signal of these sequences must be divided into lithogenic and pedogenic components because the palaeoclimatic record is primarily reflected in the pedogenic component. In this paper we compare two methods for separating the pedogenic and lithogenic components of the magnetic susceptibility signal: the citrate-bicarbonate-dithionite (CBD) extraction procedure, and a mixing analysis. Both methods yield good estimates of the pedogenic component, especially for the palaeosols. The CBD procedure underestimates the lithogenic component and overestimates the pedogenic component. The magnitude of this effect is moderately high in loess layers but almost negligible in palaeosols. The mixing model overestimates the lithogenic component and underestimates the pedogenic component. Both methods can be adjusted to yield better estimates of both components. The lithogenic susceptibility, as determined by either method, suggests that palaeoclimatic interpretations based only on total susceptibility will be in error and that a single estimate of the average lithogenic susceptibility is not an accurate basis for adjusting the total susceptibility. A long-term decline in lithogenic susceptibility with depth in the section suggests more intense or prolonged periods of weathering associated with the formation of the older palaeosols. The CBD procedure provides the most comprehensive information on the magnitude of the components and magnetic mineralogy of loess and palaeosols. However, the mixing analysis provides a sensitive, rapid, and easily applied alternative to the CBD procedure. A combination of the two approaches provides the most powerful and perhaps the most accurate way of separating the magnetic susceptibility components.
Kairov, Ulykbek; Cantini, Laura; Greco, Alessandro; Molkenov, Askhat; Czerwinska, Urszula; Barillot, Emmanuel; Zinovyev, Andrei
2017-09-11
Independent Component Analysis (ICA) is a method that models gene expression data as an action of a set of statistically independent hidden factors. The output of ICA depends on a fundamental parameter: the number of components (factors) to compute. The optimal choice of this parameter, related to determining the effective data dimension, remains an open question in the application of blind source separation techniques to transcriptomic data. Here we address the question of optimizing the number of statistically independent components in the analysis of transcriptomic data for reproducibility of the components in multiple runs of ICA (within the same or within varying effective dimensions) and in multiple independent datasets. To this end, we introduce ranking of independent components based on their stability in multiple ICA computation runs and define a distinguished number of components (Most Stable Transcriptome Dimension, MSTD) corresponding to the point of the qualitative change of the stability profile. Based on a large body of data, we demonstrate that a sufficient number of dimensions is required for biological interpretability of the ICA decomposition and that the most stable components with ranks below MSTD have more chances to be reproduced in independent studies compared to the less stable ones. At the same time, we show that a transcriptomics dataset can be reduced to a relatively high number of dimensions without losing the interpretability of ICA, even though higher dimensions give rise to components driven by small gene sets. We suggest a protocol of ICA application to transcriptomics data with a possibility of prioritizing components with respect to their reproducibility that strengthens the biological interpretation. Computing too few components (much less than MSTD) is not optimal for interpretability of the results. The components ranked within MSTD range have more chances to be reproduced in independent studies.
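The stability ranking described above can be approximated in a simple form: run FastICA several times with different seeds and score each component of a reference run by its best absolute correlation with components from the other runs. This is a simplified sketch of the idea, not the authors' MSTD pipeline, and the expression matrix is a random stand-in.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(7)
X = rng.normal(size=(200, 50))          # stand-in for a samples x genes expression matrix
k = 10                                  # number of components to test

runs = [FastICA(n_components=k, random_state=s, max_iter=1000).fit(X).components_
        for s in range(5)]

ref = runs[0]
stability = np.zeros(k)
for comp in range(k):
    # Best absolute correlation of this reference component with any component
    # from each other run, averaged over the runs.
    matches = [max(abs(np.corrcoef(ref[comp], other[j])[0, 1]) for j in range(k))
               for other in runs[1:]]
    stability[comp] = np.mean(matches)

print(np.argsort(stability)[::-1])      # components ranked from most to least stable
```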
Monakhova, Yulia B; Godelmann, Rolf; Kuballa, Thomas; Mushtakova, Svetlana P; Rutledge, Douglas N
2015-08-15
Discriminant analysis (DA) methods, such as linear discriminant analysis (LDA) or factorial discriminant analysis (FDA), are well-known chemometric approaches for solving classification problems in chemistry. In most applications, principal component analysis (PCA) is used as the first step to generate orthogonal eigenvectors, and the corresponding sample scores are utilized to generate discriminant features for the discrimination. Independent component analysis (ICA) based on the minimization of mutual information can be used as an alternative to PCA as a preprocessing tool for LDA and FDA classification. To illustrate the performance of this ICA/DA methodology, four representative nuclear magnetic resonance (NMR) data sets of wine samples were used. The classification was performed with regard to grape variety, year of vintage and geographical origin. The average increase for ICA/DA in comparison with PCA/DA in the percentage of correct classification varied between 6±1% and 8±2%. The maximum increase in classification efficiency of 11±2% was observed for discrimination of the year of vintage (ICA/FDA) and geographical origin (ICA/LDA). The procedure to determine the number of extracted features (PCs, ICs) for the optimum DA models is discussed. The use of independent components (ICs) instead of principal components (PCs) resulted in improved classification performance of the DA methods. The ICA/LDA method is preferable to ICA/FDA for recognition tasks based on NMR spectroscopic measurements. Copyright © 2015 Elsevier B.V. All rights reserved.
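A minimal sketch of the preprocessing comparison, using FastICA scores in place of PCA scores as inputs to a linear discriminant classifier. The data and the number of retained features are illustrative assumptions, not the wine NMR spectra.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
X = rng.normal(size=(120, 300))            # stand-in for NMR spectra (samples x variables)
y = rng.integers(0, 3, size=120)           # e.g. grape-variety labels

for name, reducer in [("PCA/LDA", PCA(n_components=15)),
                      ("ICA/LDA", FastICA(n_components=15, random_state=0, max_iter=1000))]:
    feats = reducer.fit_transform(X)       # extracted features (PC or IC scores)
    acc = cross_val_score(LinearDiscriminantAnalysis(), feats, y, cv=5).mean()
    print(name, round(acc, 3))
```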
Song, Yuqiao; Liao, Jie; Dong, Junxing; Chen, Li
2015-09-01
The seeds of grapevine (Vitis vinifera) are a byproduct of wine production. To examine the potential value of grape seeds, grape seeds from seven sources were subjected to fingerprinting using direct analysis in real time coupled with time-of-flight mass spectrometry combined with chemometrics. Firstly, we listed all reported components (56 components) from grape seeds and calculated the precise m/z values of the deprotonated ions [M-H](-) . Secondly, the experimental conditions were systematically optimized based on the peak areas of total ion chromatograms of the samples. Thirdly, the seven grape seed samples were examined using the optimized method. Information about 20 grape seed components was utilized to represent characteristic fingerprints. Finally, hierarchical clustering analysis and principal component analysis were performed to analyze the data. Grape seeds from seven different sources were classified into two clusters; hierarchical clustering analysis and principal component analysis yielded similar results. The results of this study lay the foundation for appropriate utilization and exploitation of grape seed samples. Due to the absence of complicated sample preparation methods and chromatographic separation, the method developed in this study represents one of the simplest and least time-consuming methods for grape seed fingerprinting. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nee, K.; Bryan, S.; Levitskaia, T.
The reliability of chemical processes can be greatly improved by implementing inline monitoring systems. Combining multivariate analysis with non-destructive sensors can enhance the process without interfering with the operation. Here, we present hierarchical models using both principal component analysis and partial least squares analysis developed for different chemical components representative of solvent extraction process streams. A training set of 380 samples and an external validation set of 95 samples were prepared, and near-infrared and Raman spectral data as well as conductivity under variable temperature conditions were collected. The results from the models indicate that careful selection of the spectral range is important. By compressing the data through principal component analysis (PCA), we lower the rank of the data set to its most dominant features while maintaining the key principal components to be used in the regression analysis. Within the studied data set, the concentrations of five chemical components were modeled: total nitrate (NO3-), total acid (H+), neodymium (Nd3+), sodium (Na+), and ionic strength (I.S.). The best overall model prediction for each of the species studied used a combined data set comprised of complementary techniques including NIR, Raman, and conductivity. Finally, our study shows that chemometric models are powerful but require a significant amount of carefully analyzed data to capture variations in the chemistry.
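The modeling workflow, compressing spectra with PCA before regressing on the scores, or using partial least squares directly on the spectra, can be sketched with simulated data. The matrix shapes and simulated "concentrations" are placeholders for the solvent-extraction data described above.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(9)
conc = rng.uniform(0, 2, size=(380, 5))          # e.g. nitrate, acid, Nd, Na, ionic strength
pure = rng.normal(size=(5, 500))                 # pure-component "spectra"
spectra = conc @ pure + 0.05 * rng.normal(size=(380, 500))   # Beer-Lambert-style mixtures

X_tr, X_te, y_tr, y_te = train_test_split(spectra, conc, test_size=0.2, random_state=0)

# Principal component regression: compress, then regress on the scores.
pca = PCA(n_components=10).fit(X_tr)
pcr = LinearRegression().fit(pca.transform(X_tr), y_tr)
print("PCR R^2:", pcr.score(pca.transform(X_te), y_te))

# Partial least squares regression on the raw spectra.
pls = PLSRegression(n_components=10).fit(X_tr, y_tr)
print("PLS R^2:", pls.score(X_te, y_te))
```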
PSHFT - COMPUTERIZED LIFE AND RELIABILITY MODELLING FOR TURBOPROP TRANSMISSIONS
NASA Technical Reports Server (NTRS)
Savage, M.
1994-01-01
The computer program PSHFT calculates the life of a variety of aircraft transmissions. A generalized life and reliability model is presented for turboprop and parallel shaft geared prop-fan aircraft transmissions. The transmission life and reliability model is a combination of the individual reliability models for all the bearings and gears in the main load paths. The bearing and gear reliability models are based on the statistical two parameter Weibull failure distribution method and classical fatigue theories. The computer program developed to calculate the transmission model is modular. In its present form, the program can analyze five different transmissions arrangements. Moreover, the program can be easily modified to include additional transmission arrangements. PSHFT uses the properties of a common block two-dimensional array to separate the component and transmission property values from the analysis subroutines. The rows correspond to specific components with the first row containing the values for the entire transmission. Columns contain the values for specific properties. Since the subroutines (which determine the transmission life and dynamic capacity) interface solely with this property array, they are separated from any specific transmission configuration. The system analysis subroutines work in an identical manner for all transmission configurations considered. Thus, other configurations can be added to the program by simply adding component property determination subroutines. PSHFT consists of a main program, a series of configuration specific subroutines, generic component property analysis subroutines, systems analysis subroutines, and a common block. The main program selects the routines to be used in the analysis and sequences their operation. The series of configuration specific subroutines input the configuration data, perform the component force and life analyses (with the help of the generic component property analysis subroutines), fill the property array, call up the system analysis routines, and finally print out the analysis results for the system and components. PSHFT is written in FORTRAN 77 and compiled on a MicroSoft FORTRAN compiler. The program will run on an IBM PC AT compatible with at least 104k bytes of memory. The program was developed in 1988.
ERIC Educational Resources Information Center
Isemonger, Ian; Watanabe, Kaoru
2007-01-01
This study examines the psychometrics of the perceptual component of the Style Analysis Survey (SAS) [Oxford, R.L., 1993a. "Style Analysis Survey (SAS)." University of Alabama, Tuscaloosa, AL]. The study is conducted in the context of questions over another perceptual learning-styles instrument, the "Perceptual Learning Styles Preferences…
Valuing a Protected Tropical Forest: A Case Study in Madagascar
Randall Kramer; Mohan Munasinghe; Narendra Sharma; Evan Mercer; Priya Shyamsundar
1994-01-01
Economic analysis can provide useful information for these difficult decisions. Of course, economic analysis should only constitute one component of the process of deciding whether to create a national park (other components would include sociopolitical and ecological considerations). Traditional economic cost-benefit analysis for national parks, however, is...
Stress analysis of 27% scale model of AH-64 main rotor hub
NASA Technical Reports Server (NTRS)
Hodges, R. V.
1985-01-01
Stress analysis of an AH-64 27% scale model rotor hub was performed. Component loads and stresses were calculated based upon blade root loads and motions. The static and fatigue analysis indicates positive margins of safety in all components checked. Using the format developed here, the hub can be stress checked for future application.
On the Extraction of Components and the Applicability of the Factor Model.
ERIC Educational Resources Information Center
Dziuban, Charles D.; Harris, Chester W.
A reanalysis of Shaycroft's matrix of intercorrelations of 10 test variables plus 4 random variables is discussed. Three different procedures were used in the reanalysis: (1) Image Component Analysis, (2) Uniqueness Rescaling Factor Analysis, and (3) Alpha Factor Analysis. The results of these analyses are presented in tables. It is concluded from…
Intelligence, Surveillance, and Reconnaissance Fusion for Coalition Operations
2008-07-01
classification of the targets of interest. The MMI features extracted in this manner have two properties that provide a sound justification for...are generalizations of well-known feature extraction methods such as Principal Components Analysis (PCA) and Independent Component Analysis (ICA)...augment (without degrading performance) a large class of generic fusion processes. Keywords: ontologies, classifications, feature extraction, feature analysis
Least-dependent-component analysis based on mutual information
NASA Astrophysics Data System (ADS)
Stögbauer, Harald; Kraskov, Alexander; Astakhov, Sergey A.; Grassberger, Peter
2004-12-01
We propose to use precise estimators of mutual information (MI) to find the least dependent components in a linearly mixed signal. On the one hand, this seems to lead to better blind source separation than with any other presently available algorithm. On the other hand, it has the advantage, compared to other implementations of “independent” component analysis (ICA), some of which are based on crude approximations for MI, that the numerical values of the MI can be used for (i) estimating residual dependencies between the output components; (ii) estimating the reliability of the output by comparing the pairwise MIs with those of remixed components; and (iii) clustering the output according to the residual interdependencies. For the MI estimator, we use a recently proposed k-nearest-neighbor-based algorithm. For time sequences, we combine this with delay embedding, in order to take into account nontrivial time correlations. After several tests with artificial data, we apply the resulting MILCA (mutual-information-based least dependent component analysis) algorithm to a real-world dataset, the ECG of a pregnant woman.
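The use of a pairwise mutual-information estimate to check residual dependencies between separated outputs can be sketched with scikit-learn's k-NN-based MI estimator, used here as a stand-in for the estimator employed in MILCA; the mixed signals are a toy example rather than the ECG data.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.feature_selection import mutual_info_regression

# Two roughly independent sources, linearly mixed (toy stand-in for the ECG example).
rng = np.random.default_rng(10)
t = np.linspace(0, 10, 2000)
S = np.c_[np.sin(7 * t), np.sign(np.sin(3 * t))] + 0.05 * rng.normal(size=(2000, 2))
X = S @ np.array([[1.0, 0.6], [0.4, 1.0]])

Y = FastICA(n_components=2, random_state=0, max_iter=1000).fit_transform(X)  # separated outputs

# Residual dependence between the two outputs, via a k-NN mutual information estimate.
mi = mutual_info_regression(Y[:, [0]], Y[:, 1], n_neighbors=4)
print(mi)   # close to zero if the separation succeeded
```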
NASA Astrophysics Data System (ADS)
Ojima, Nobutoshi; Fujiwara, Izumi; Inoue, Yayoi; Tsumura, Norimichi; Nakaguchi, Toshiya; Iwata, Kayoko
2011-03-01
Uneven distribution of skin color is one of the biggest concerns about facial skin appearance. Recently, several techniques to analyze skin color have been introduced that separate skin color information into chromophore components, such as melanin and hemoglobin. However, there are few reports on the quantitative analysis of unevenness of skin color that consider the type of chromophore, clusters of different sizes, and the concentration of each chromophore. We propose a new image analysis and simulation method based on chromophore analysis and spatial frequency analysis. This method is mainly composed of three techniques: independent component analysis (ICA) to extract hemoglobin and melanin chromophores from a single skin color image; an image pyramid technique which decomposes each chromophore into multi-resolution images, which can be used for identifying different sizes of clusters or spatial frequencies; and analysis of the histogram obtained from each multi-resolution image to extract unevenness parameters. As an application of the method, we also introduce an image processing technique to change the unevenness of the melanin component. As a result, the method showed high capability to analyze the unevenness of each skin chromophore: 1) vague unevenness on skin could be discriminated from noticeable pigmentation such as freckles or acne; 2) by analyzing the unevenness parameters obtained from each multi-resolution image for Japanese women, age-related changes were observed in the parameters of the middle spatial frequencies; 3) an image processing system modulating the parameters was proposed to change the unevenness of skin images along the axis of the obtained age-related change in real time.
Yi, YaXiong; Zhang, Yong; Ding, Yue; Lu, Lu; Zhang, Tong; Zhao, Yuan; Xu, XiaoJun; Zhang, YuXin
2016-11-01
We developed a novel quantitative analysis method based on ultra high performance liquid chromatography coupled with diode array detection for the simultaneous determination of the 14 main active components in Yinchenhao decoction. All components were separated on an Agilent SB-C18 column using a gradient solvent system of acetonitrile/0.1% phosphoric acid solution at a flow rate of 0.4 mL/min for 35 min. Subsequently, linearity, precision, repeatability, and accuracy tests were implemented to validate the method. Furthermore, the method was applied to a compositional difference analysis of the 14 components in eight normal-extraction Yinchenhao decoction samples, accompanied by hierarchical clustering analysis and similarity analysis. All samples were divided into three groups based on different contents of the components, demonstrating that the extraction method (decocting, refluxing, or ultrasonication) and the extraction solvent (water or ethanol) affected component differentiation, which should be related to the decoction's clinical applications. The results also indicated that a sample prepared by patients at home by water extraction in a casserole was almost the same as that prepared using a stainless-steel kettle, which is mostly used in pharmaceutical factories. This research should help patients to select the best and most convenient method for preparing Yinchenhao decoction. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
Goldstein, Arthur W; Alpert, Sumner; Beede, William; Kovach, Karl
1949-01-01
In order to understand the operation and the interaction of jet-engine components during engine operation and to determine how component characteristics may be used to compute engine performance, a method to analyze and to estimate performance of such engines was devised and applied to the study of the characteristics of a research turbojet engine built for this investigation. An attempt was made to correlate turbine performance obtained from engine experiments with that obtained by the simpler procedure of separately calibrating the turbine with cold air as a driving fluid in order to investigate the applicability of component calibration. The system of analysis was also applied to prediction of the engine and component performance with assumed modifications of the burner and bearing characteristics, to prediction of component and engine operation during engine acceleration, and to estimates of the performance of the engine and the components when the exhaust gas was used to drive a power turbine.
Design of microstrip components by computer
NASA Technical Reports Server (NTRS)
Cisco, T. C.
1972-01-01
Development of computer programs for component analysis and design aids used in production of microstrip components is discussed. System includes designs for couplers, filters, circulators, transformers, power splitters, diode switches, and attenuators.
Estimating the number of pure chemical components in a mixture by X-ray absorption spectroscopy.
Manceau, Alain; Marcus, Matthew; Lenoir, Thomas
2014-09-01
Principal component analysis (PCA) is a multivariate data analysis approach commonly used in X-ray absorption spectroscopy to estimate the number of pure compounds in multicomponent mixtures. This approach seeks to describe a large number of multicomponent spectra as weighted sums of a smaller number of component spectra. These component spectra are in turn considered to be linear combinations of the spectra from the actual species present in the system from which the experimental spectra were taken. The dimension of the experimental dataset is given by the number of meaningful abstract components, as estimated by the cascade or variance of the eigenvalues (EVs), the factor indicator function (IND), or the F-test on reduced EVs. It is shown on synthetic and real spectral mixtures that the performance of the IND and F-test critically depends on the amount of noise in the data, and may result in considerable underestimation or overestimation of the number of components even for a signal-to-noise (s/n) ratio of the order of 80 (σ = 20) in a XANES dataset. For a given s/n ratio, the accuracy of the component recovery from a random mixture depends on the size of the dataset and number of components, which is not known in advance, and deteriorates for larger datasets because the analysis picks up more noise components. The scree plot of the EVs for the components yields one or two values close to the significant number of components, but the result can be ambiguous and its uncertainty is unknown. A new estimator, NSS-stat, which includes the experimental error to XANES data analysis, is introduced and tested. It is shown that NSS-stat produces superior results compared with the three traditional forms of PCA-based component-number estimation. A graphical user-friendly interface for the calculation of EVs, IND, F-test and NSS-stat from a XANES dataset has been developed under LabVIEW for Windows and is supplied in the supporting information. Its possible application to EXAFS data is discussed, and several XANES and EXAFS datasets are also included for download.
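The eigenvalue-based screening this abstract discusses can be sketched with one common textbook form of Malinowski's factor indicator function, IND(k) = RE(k)/(n - k)^2, where RE(k) is the real error computed from the discarded eigenvalues and n is the smaller matrix dimension. That formula is an assumption of this sketch; the paper's NSS-stat estimator is not reproduced here.

```python
import numpy as np

def malinowski_ind(data):
    """Eigenvalues of D D^T and the factor indicator function IND(k).
    Common textbook form, assumed here; not taken from the paper."""
    r, c = data.shape                                # r spectra, c energy points (r <= c assumed)
    eig = np.linalg.eigvalsh(data @ data.T)[::-1]    # eigenvalues, largest first
    ind = []
    for k in range(1, r):
        re = np.sqrt(eig[k:].sum() / (c * (r - k)))  # real error from the discarded eigenvalues
        ind.append(re / (r - k) ** 2)
    return eig, np.array(ind)

# Toy dataset: 20 mixture spectra built from 3 pure components plus noise.
rng = np.random.default_rng(11)
pure = rng.random((3, 200))
mix = rng.random((20, 3)) @ pure + 0.01 * rng.normal(size=(20, 200))

eig, ind = malinowski_ind(mix)
print(np.argmin(ind) + 1)                            # IND minimum suggests the number of components
```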
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2002-01-01
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
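The shape-augmentation idea, adding the spectral shape of an uncalibrated interferent to a classical least squares model before re-estimating component amounts, can be illustrated roughly as follows. This is a generic sketch under assumed pure-component spectra, showing only the augmentation step rather than the full hybrid classical/inverse procedure described above.

```python
import numpy as np

rng = np.random.default_rng(12)
wavelengths = 300
K = rng.random((2, wavelengths))            # pure-component spectra used in calibration
interferent = rng.random(wavelengths)       # spectral shape absent from the calibration

true_conc = np.array([0.7, 1.3])
sample = true_conc @ K + 0.5 * interferent + 0.01 * rng.normal(size=wavelengths)

# Classical least squares with only the calibrated shapes: biased estimates.
c_cls, *_ = np.linalg.lstsq(K.T, sample, rcond=None)

# Hybrid step: append the known interferent shape and re-estimate.
K_aug = np.vstack([K, interferent])
c_hyb, *_ = np.linalg.lstsq(K_aug.T, sample, rcond=None)

print(c_cls)        # distorted by the unmodeled spectral shape
print(c_hyb[:2])    # closer to the true amounts (0.7, 1.3)
```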
Stress analysis under component relative interference fit
NASA Technical Reports Server (NTRS)
Taylor, C. M.
1978-01-01
Finite-element computer program enables analysis of distortions and stresses occurring in components having relative interference. Program restricts itself to simple elements and axisymmetric loading situations. External inertial and thermal loads may be applied in addition to forces arising from interference conditions.
Principal component analysis of phenolic acid spectra
USDA-ARS?s Scientific Manuscript database
Phenolic acids are common plant metabolites that exhibit bioactive properties and have applications in functional food and animal feed formulations. The ultraviolet (UV) and infrared (IR) spectra of four closely related phenolic acid structures were evaluated by principal component analysis (PCA) to...
Independent Component Analysis of Textures
NASA Technical Reports Server (NTRS)
Manduchi, Roberto; Portilla, Javier
2000-01-01
A common method for texture representation is to use the marginal probability densities over the outputs of a set of multi-orientation, multi-scale filters as a description of the texture. We propose a technique, based on Independent Components Analysis, for choosing the set of filters that yield the most informative marginals, meaning that the product over the marginals most closely approximates the joint probability density function of the filter outputs. The algorithm is implemented using a steerable filter space. Experiments involving both texture classification and synthesis show that compared to Principal Components Analysis, ICA provides superior performance for modeling of natural and synthetic textures.
[HPLC fingerprint of flavonoids in Sophora flavescens and determination of five components].
Ma, Hong-Yan; Zhou, Wan-Shan; Chu, Fu-Jiang; Wang, Dong; Liang, Sheng-Wang; Li, Shao
2013-08-01
A simple and reliable method of high-performance liquid chromatography with photodiode array detection (HPLC-DAD) was developed to evaluate the quality of the traditional Chinese medicine Sophora flavescens by establishing a chromatographic fingerprint and simultaneously determining five flavonoids, including trifolirhizin, maackiain, kushenol I, kurarinone and sophoraflavanone G. The optimal conditions of separation and detection were achieved on an ULTIMATE XB-C18 column (4.6 mm x 250 mm, 5 microm) with a gradient of acetonitrile and water, detected at 295 nm. In the chromatographic fingerprint, 13 peaks were selected as the characteristic peaks to assess the similarities of samples collected from different origins in China according to the similarity evaluation for chromatographic fingerprints of Traditional Chinese Medicine (2004AB), and principal component analysis (PCA) was used in the data analysis. There were significant differences in the fingerprint chromatograms between S. flavescens and S. tonkinensis. Principal component analysis showed that kurarinone and sophoraflavanone G were the most important components. In the quantitative analysis, the five components showed good linearity (R > 0.999) within their linear ranges, and their recoveries were in the range of 96.3%-102.3%. This study indicated that the combination of quantitative analysis and chromatographic fingerprint analysis can be readily utilized as a quality control method for S. flavescens and its related traditional Chinese medicinal preparations.
Lörincz, András; Póczos, Barnabás
2003-06-01
In optimization, the dimension of the problem may severely, sometimes exponentially, increase optimization time. Parametric function approximators (FAPPs) have been suggested to overcome this problem. Here, a novel FAPP, cost component analysis (CCA), is described. In CCA, the search space is resampled according to the Boltzmann distribution generated by the energy landscape. That is, CCA converts the optimization problem into density estimation. The structure of the induced density is searched by independent component analysis (ICA). The advantage of CCA is that each independent ICA component can be optimized separately. In turn, (i) CCA intends to partition the original problem into subproblems, and (ii) separating (partitioning) the original optimization problem into subproblems may aid interpretation. Most importantly, (iii) CCA may give rise to high gains in optimization time. Numerical simulations illustrate the working of the algorithm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilkinson, V.K.; Young, J.M.
1995-07-01
The US Army's Project Manager, Advanced Field Artillery System/Future Armored Resupply Vehicle (PM-AFAS/FARV) is sponsoring the development of technologies that can be applied to the resupply vehicle for the Advanced Field Artillery System. The Engineering Technology Division of the Oak Ridge National Laboratory has proposed adding diagnostics/prognostics systems to four components of the Ammunition Transfer Arm of this vehicle, and a cost-benefit analysis was performed on the diagnostics/prognostics to show the potential savings that may be gained by incorporating these systems onto the vehicle. Possible savings could be in the form of reduced downtime, less unexpected or unnecessary maintenance, fewer regular maintenance checks, and/or lower collateral damage or loss. The diagnostics/prognostics systems are used to (1) help determine component problems, (2) determine the condition of the components, and (3) estimate the remaining life of the monitored components. The four components on the arm that are targeted for diagnostics/prognostics are (1) the electromechanical brakes, (2) the linear actuators, (3) the wheel/roller bearings, and (4) the conveyor drive system. These would be monitored using electrical signature analysis, vibration analysis, or a combination of both. Annual failure rates for the four components were obtained along with specifications for vehicle costs, crews, number of missions, etc. Accident scenarios based on component failures were postulated, and event trees for these scenarios were constructed to estimate the annual loss of the resupply vehicle, crew, or arm, or mission aborts. A levelized cost-benefit analysis was then performed to examine the costs of such failures, both with and without some level of failure reduction due to the diagnostics/prognostics systems. Any savings resulting from using diagnostics/prognostics were calculated.
Liu, Shuqiang; Tan, Zhibin; Li, Pingting; Gao, Xiaoling; Zeng, Yuaner; Wang, Shuling
2016-03-20
A HepG2 cell biospecific extraction method combined with high performance liquid chromatography-electrospray ionization-mass spectrometry (HPLC-ESI-MS) analysis was proposed for screening potential antiatherosclerotic active components in Bupeuri radix, a well-known Traditional Chinese Medicine (TCM). The hypothesis was that when cells are incubated together with the extracts of a TCM, the potential bioactive components in the TCM selectively combine with receptors or channels of the HepG2 cells; the eluate containing the biospecific components bound to HepG2 cells is then identified by HPLC-ESI-MS analysis. The potential bioactive components of Bupeuri radix were investigated using the proposed approach. Five compounds among the saikosaponins of Bupeuri radix were detected as components that selectively combined with HepG2 cells; among these, two potentially bioactive compounds, saikosaponin b1 and saikosaponin b2 (SSb2), were identified by comparison with the chromatograms of standard samples and analysis of their structural cleavage characteristics in the MS data. SSb2 was then used to assess the uptake of DiI-labelled high density lipoprotein (HDL) in HepG2 cells as a measure of antiatherosclerotic activity. The results showed that SSb2 at the indicated concentrations (5, 15, 25, and 40 μM) remarkably increased the uptake of dioctadecylindocarbocyanine-labelled (DiI) HDL in HepG2 cells (vs. the control group, P<0.01). In conclusion, HepG2 biospecific extraction coupled with HPLC-ESI-MS analysis is a rapid, convenient, and reliable method for screening potential bioactive components in TCM, and SSb2 may be a valuable novel drug agent for the treatment of atherosclerosis. Copyright © 2016 Elsevier B.V. All rights reserved.
Giesen, E B W; Ding, M; Dalstra, M; van Eijden, T M G J
2003-09-01
As several morphological parameters of cancellous bone express more or less the same architectural measure, we applied principal components analysis to group these measures and correlated these to the mechanical properties. Cylindrical specimens (n = 24) were obtained in different orientations from embalmed mandibular condyles; the angle of the first principal direction and the axis of the specimen, expressing the orientation of the trabeculae, ranged from 10 degrees to 87 degrees. Morphological parameters were determined by a method based on Archimedes' principle and by micro-CT scanning, and the mechanical properties were obtained by mechanical testing. The principal components analysis was used to obtain a set of independent components to describe the morphology. This set was entered into linear regression analyses for explaining the variance in mechanical properties. The principal components analysis revealed four components: amount of bone, number of trabeculae, trabecular orientation, and miscellaneous. They accounted for about 90% of the variance in the morphological variables. The component loadings indicated that a higher amount of bone was primarily associated with more plate-like trabeculae, and not with more or thicker trabeculae. The trabecular orientation was most determinative (about 50%) in explaining stiffness, strength, and failure energy. The amount of bone was second most determinative and increased the explained variance to about 72%. These results suggest that trabecular orientation and amount of bone are important in explaining the anisotropic mechanical properties of the cancellous bone of the mandibular condyle.
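The analysis pipeline described, PCA of the morphological measures followed by linear regression of a mechanical property on the component scores, can be sketched as follows with simulated placeholder data.

```python
# Illustrative pipeline: PCA to obtain independent morphological components, then
# regression of a mechanical property on the scores. Data are simulated placeholders,
# not the mandibular condyle measurements.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
morphology = rng.standard_normal((24, 8))          # 24 specimens x 8 morphological measures
stiffness = morphology[:, 0] * 2 + morphology[:, 1] + 0.5 * rng.standard_normal(24)

scores = PCA(n_components=4).fit_transform(morphology)   # orthogonal morphological components
model = LinearRegression().fit(scores, stiffness)
print("variance in stiffness explained (R^2):", round(model.score(scores, stiffness), 2))
```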
Transportation of Large Wind Components: A Review of Existing Geospatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mooney, Meghan; Maclaurin, Galen
2016-09-01
This report features the geospatial data component of a larger project evaluating logistical and infrastructure requirements for transporting oversized and overweight (OSOW) wind components. The goal of the larger project was to assess the status and opportunities for improving the infrastructure and regulatory practices necessary to transport wind turbine towers, blades, and nacelles from current and potential manufacturing facilities to end-use markets. The purpose of this report is to summarize existing geospatial data on wind component transportation infrastructure and to provide a data gap analysis, identifying areas for further analysis and data collection.
Respiratory protective device design using control system techniques
NASA Technical Reports Server (NTRS)
Burgess, W. A.; Yankovich, D.
1972-01-01
The feasibility of a control system analysis approach to provide a design base for respiratory protective devices (RPDs) is considered. A system design approach requires that all functions and components of the system be mathematically identified in a model of the RPD. The mathematical notations describe the operation of the components as closely as possible. The individual component mathematical descriptions are then combined to describe the complete RPD. Finally, analysis of the mathematical notation by control system theory is used to derive compensating component values that force the system to operate in a stable and predictable manner.
NDARC NASA Design and Analysis of Rotorcraft. Appendix 5; Theory
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2017-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC: NASA Design and Analysis of Rotorcraft. Appendix 3; Theory
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft - Input, Appendix 2
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tilt-rotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft. Appendix 6; Input
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2017-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne R.
2009-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool intended to support both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility; a hierarchy of models; and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single main-rotor and tail-rotor helicopter; tandem helicopter; coaxial helicopter; and tiltrotors. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC - NASA Design and Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2015-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft Theory Appendix 1
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
ERIC Educational Resources Information Center
Rodrigue, Christine M.
2011-01-01
This paper presents a laboratory exercise used to teach principal components analysis (PCA) as a means of surface zonation. The lab was built around abundance data for 16 oxides and elements collected by the Mars Exploration Rover Spirit in Gusev Crater between Sol 14 and Sol 470. Students used PCA to reduce 15 of these into 3 components, which,…
Sullivan, Karen A; Lurie, Janine K
2017-01-01
The study examined the component structure of the Neurobehavioral Symptom Inventory (NSI) under five different models. The evaluated models comprised the full NSI (NSI-22) and the NSI-20 (NSI minus two orphan items). A civilian nonclinical sample was used. The 575 volunteers were predominantly university students who screened negative for mild TBI. The study design was cross-sectional, with questionnaires administered online. The main measure was the Neurobehavioral Symptom Inventory. Subscale, total and embedded validity scores were derived (the Validity-10, the LOW6, and the NIM5). In both models, the principal components analysis yielded two intercorrelated components (psychological and somatic/sensory) with acceptable internal consistency (alphas > 0.80). In this civilian nonclinical sample, the NSI had two underlying components. These components represent psychological and somatic/sensory neurobehavioral symptoms.
Genome-scale analysis of the high-efficient protein secretion system of Aspergillus oryzae
2014-01-01
Background: The koji mold Aspergillus oryzae is widely used for the production of industrial enzymes due to its particularly high protein secretion capacity and ability to perform post-translational modifications. However, systemic analysis of its secretion system is lacking, largely because of the poorly annotated proteome. Results: Here we defined a functional protein secretory component list of A. oryzae using a previously reported secretory model of S. cerevisiae as a scaffold. Additional secretory components were obtained by BLAST search with the functional components reported in other closely related fungal species such as Aspergillus nidulans and Aspergillus niger. To evaluate the defined component list, we performed transcriptome analysis on three α-amylase over-producing strains with varying levels of secretion capacity. Specifically, secretory components involved in ER-associated processes (including components involved in the regulation of transport between the ER and Golgi) were significantly up-regulated, many of which had not previously been identified in A. oryzae. Furthermore, we defined a complete list of the putative A. oryzae secretome and monitored how it was affected by overproducing amylase. Conclusion: In combination with the transcriptome data, the most complete secretory component list and the putative secretome improved the systemic understanding of the secretory machinery of A. oryzae in response to high levels of protein secretion. The roles of many newly predicted secretory components were experimentally validated, and the enriched component list provides a better platform for driving more mechanistic studies of the protein secretory pathway in this industrially important fungus. PMID:24961398
Genome-scale analysis of the high-efficient protein secretion system of Aspergillus oryzae.
Liu, Lifang; Feizi, Amir; Österlund, Tobias; Hjort, Carsten; Nielsen, Jens
2014-06-24
The koji mold Aspergillus oryzae is widely used for the production of industrial enzymes due to its particularly high protein secretion capacity and ability to perform post-translational modifications. However, systemic analysis of its secretion system is lacking, largely because of the poorly annotated proteome. Here we defined a functional protein secretory component list of A. oryzae using a previously reported secretory model of S. cerevisiae as a scaffold. Additional secretory components were obtained by BLAST search with the functional components reported in other closely related fungal species such as Aspergillus nidulans and Aspergillus niger. To evaluate the defined component list, we performed transcriptome analysis on three α-amylase over-producing strains with varying levels of secretion capacity. Specifically, secretory components involved in ER-associated processes (including components involved in the regulation of transport between the ER and Golgi) were significantly up-regulated, many of which had not previously been identified in A. oryzae. Furthermore, we defined a complete list of the putative A. oryzae secretome and monitored how it was affected by overproducing amylase. In combination with the transcriptome data, the most complete secretory component list and the putative secretome improved the systemic understanding of the secretory machinery of A. oryzae in response to high levels of protein secretion. The roles of many newly predicted secretory components were experimentally validated, and the enriched component list provides a better platform for driving more mechanistic studies of the protein secretory pathway in this industrially important fungus.
How multi segmental patterns deviate in spastic diplegia from typical developed.
Zago, Matteo; Sforza, Chiarella; Bona, Alessia; Cimolin, Veronica; Costici, Pier Francesco; Condoluci, Claudia; Galli, Manuela
2017-10-01
The relationship between gait features and coordination in children with Cerebral Palsy has not yet been sufficiently analyzed. Principal Component Analysis can help in understanding motion patterns by decomposing movement into its fundamental components (Principal Movements). This study aims at quantitatively characterizing the functional connections between multi-joint gait patterns in Cerebral Palsy. 65 children with spastic diplegia aged 10.6 (SD 3.7) years participated in standardized gait analysis trials; 31 typically developing adolescents aged 13.6 (4.4) years were also tested. To determine whether posture affects gait patterns, patients were split into a Crouch and a knee Hyperextension group according to the knee flexion angle during standing. 3D coordinates of hips, knees, ankles, metatarsal joints, pelvis and shoulders were submitted to Principal Component Analysis. Four Principal Movements accounted for 99% of the global variance; components 1-3 explained the major sagittal patterns, components 4-5 referred to movements in the frontal plane and component 6 to additional movement refinements. Dimensionality was higher in patients than in controls (p<0.01), and the Crouch group significantly differed from controls in the application of components 1 and 4-6 (p<0.05), while the knee Hyperextension group differed in components 1-2 and 5 (p<0.05). Compensatory strategies of children with Cerebral Palsy (interactions between main and secondary movement patterns) were objectively determined. Principal Movements can reduce the effort in interpreting gait reports, providing an immediate and quantitative picture of the connections between movement components. Copyright © 2017 Elsevier Ltd. All rights reserved.
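The Principal Movement idea, PCA applied to frame-by-frame stacked 3D landmark coordinates, can be sketched as follows; the marker data here are random placeholders rather than gait recordings.

```python
# Sketch of "Principal Movements": stack the 3D landmark coordinates of each time frame
# into one row vector and run PCA across frames; components then describe coordinated
# multi-segment movement patterns. Marker data are random placeholders.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_frames, n_markers = 101, 10                              # one gait cycle, 10 landmarks
coords = rng.standard_normal((n_frames, n_markers * 3))    # x, y, z per marker, flattened

pca = PCA()
pm_scores = pca.fit_transform(coords)                      # Principal Movement time courses
explained = np.cumsum(pca.explained_variance_ratio_)
print("components needed for 99% of variance:", int(np.searchsorted(explained, 0.99)) + 1)
```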
Butler, Rebecca A.
2014-01-01
Stroke aphasia is a multidimensional disorder in which patient profiles reflect variation along multiple behavioural continua. We present a novel approach to separating the principal aspects of chronic aphasic performance and isolating their neural bases. Principal components analysis was used to extract core factors underlying performance of 31 participants with chronic stroke aphasia on a large, detailed battery of behavioural assessments. The rotated principal components analysis revealed three key factors, which we labelled as phonology, semantic and executive/cognition on the basis of the common elements in the tests that loaded most strongly on each component. The phonology factor explained the most variance, followed by the semantic factor and then the executive-cognition factor. The use of principal components analysis rendered participants’ scores on these three factors orthogonal and therefore ideal for use as simultaneous continuous predictors in a voxel-based correlational methodology analysis of high resolution structural scans. Phonological processing ability was uniquely related to left posterior perisylvian regions including Heschl’s gyrus, posterior middle and superior temporal gyri and superior temporal sulcus, as well as the white matter underlying the posterior superior temporal gyrus. The semantic factor was uniquely related to left anterior middle temporal gyrus and the underlying temporal stem. The executive-cognition factor was not correlated selectively with the structural integrity of any particular region, as might be expected in light of the widely-distributed and multi-functional nature of the regions that support executive functions. The identified phonological and semantic areas align well with those highlighted by other methodologies such as functional neuroimaging and neurostimulation. The use of principal components analysis allowed us to characterize the neural bases of participants’ behavioural performance more robustly and selectively than the use of raw assessment scores or diagnostic classifications because principal components analysis extracts statistically unique, orthogonal behavioural components of interest. As such, in addition to improving our understanding of lesion–symptom mapping in stroke aphasia, the same approach could be used to clarify brain–behaviour relationships in other neurological disorders. PMID:25348632
Improvement of Binary Analysis Components in Automated Malware Analysis Framework
Takeda, Keiji
2017-02-21
This final report (AFRL-AFOSR-JP-TR-2017-0018, Keio University) describes improvement of the binary analysis components in an automated malware analysis framework intended to analyze malicious software (malware) with minimum human interaction. The system autonomously analyzes malware samples by analyzing the malware binary programs.
ERIC Educational Resources Information Center
Marquardt, Lloyd D.; McCormick, Ernest J.
The study involved the use of a structured job analysis instrument called the Position Analysis Questionnaire (PAQ) as the direct basis for the establishment of the job component validity of aptitude tests (that is, a procedure for estimating the aptitude requirements for jobs strictly on the basis of job analysis data). The sample of jobs used…
ERIC Educational Resources Information Center
Kucan, Linda; Palincsar, Annemarie Sullivan
2018-01-01
This investigation focuses on a tool used in a reading methods course to introduce reading specialist candidates to text analysis as a critical component of planning for text-based discussions. Unlike planning that focuses mainly on important text content or information, a text analysis approach focuses both on content and how that content is…
ERIC Educational Resources Information Center
Jung, Kwanghee; Takane, Yoshio; Hwang, Heungsun; Woodward, Todd S.
2012-01-01
We propose a new method of structural equation modeling (SEM) for longitudinal and time series data, named Dynamic GSCA (Generalized Structured Component Analysis). The proposed method extends the original GSCA by incorporating a multivariate autoregressive model to account for the dynamic nature of data taken over time. Dynamic GSCA also…
NAC Off-Vehicle Brake Testing Project
2007-05-01
Disc pads/rotors and drum shoe assemblies/drums - must use vehicle "OEM" brake/hub-end hardware, or ESA... Brake component comparison analysis (primary)*; brake system design analysis; brake system component failure analysis; (*) limited to disc pads... e.g. disc pads/rotors, drum shoe assemblies/drums. Not limited to "OEM" brake/hub-end hardware as there is none! Weight transfer, plumbing,
Wang, Chun-Hua; Zhong, Yi; Zhang, Yan; Liu, Jin-Ping; Wang, Yue-Fei; Jia, Wei-Na; Wang, Guo-Cai; Li, Zheng; Zhu, Yan; Gao, Xiu-Mei
2016-02-01
Chinese medicine is known to treat complex diseases with multiple components and multiple targets. However, the main effective components and their related key targets and functions remain to be identified. Herein, a network analysis method was developed to identify the main effective components and key targets of a Chinese medicine, Lianhua-Qingwen Formula (LQF). The LQF is commonly used for the prevention and treatment of viral influenza in China. It is composed of 11 herbs, gypsum and menthol, with 61 compounds having been identified in our previous work. In this paper, these 61 candidate compounds were used to find their related targets and construct the predicted-target (PT) network. An influenza-related protein-protein interaction (PPI) network was constructed and integrated with the PT network. Then the compound-effective target (CET) network and the compound-ineffective target (CIT) network were extracted, respectively. A novel approach was developed to identify effective components by comparing the CET and CIT networks. As a result, 15 main effective components were identified along with 61 corresponding targets. 7 of these main effective components were further experimentally validated to have antiviral efficacy in vitro. The main effective component-target (MECT) network was then constructed with the main effective components and their key targets. Gene Ontology (GO) analysis of the MECT network predicted key functions, such as modulation of NO production, by the LQF. Interestingly, five effective components were experimentally tested and exhibited inhibitory effects on NO production in LPS-induced RAW 264.7 cells. In summary, we have developed a novel approach to identify the main effective components in the Chinese medicine LQF and experimentally validated some of the predictions.
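A loose sketch of the network step, intersecting each compound's predicted targets with a disease-related PPI network to flag candidate effective components, is shown below using networkx; compound and gene names are invented and do not come from the LQF dataset.

```python
# Hedged sketch of compound-target / PPI network intersection; all names are hypothetical.
import networkx as nx

ppi = nx.Graph([("TNF", "IL6"), ("IL6", "STAT3"), ("NOS2", "NFKB1"), ("NFKB1", "TNF")])

predicted_targets = {                     # hypothetical compound -> predicted targets
    "forsythoside_A": {"TNF", "IL6"},
    "compound_X": {"ABCB1"},              # targets fall outside the disease network
    "glycyrrhizin": {"NFKB1", "NOS2"},
}

# Keep compounds whose targets intersect the disease PPI network ("effective targets").
effective = {c: t & set(ppi) for c, t in predicted_targets.items() if t & set(ppi)}
ranked = sorted(effective.items(), key=lambda kv: len(kv[1]), reverse=True)
print("candidate effective components:", ranked)
```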
Multi-spectrometer calibration transfer based on independent component analysis.
Liu, Yan; Xu, Hao; Xia, Zhenzhen; Gong, Zhiyong
2018-02-26
Calibration transfer is indispensable for practical applications of near infrared (NIR) spectroscopy due to the need for precise and consistent measurements across different spectrometers. In this work, a method for multi-spectrometer calibration transfer is described based on independent component analysis (ICA). A spectral matrix is first obtained by aligning the spectra measured on different spectrometers. Then, by using independent component analysis, the aligned spectral matrix is decomposed into the mixing matrix and the independent components of the different spectrometers. The differing measurements between spectrometers can then be standardized by correcting the coefficients within the independent components. Two NIR datasets of corn and edible oil samples, measured with three and four spectrometers, respectively, were used to test the reliability of this method. The results for both datasets reveal that spectra measured on different spectrometers can be transferred simultaneously, and that partial least squares (PLS) models built with the measurements from one spectrometer can correctly predict the spectra transferred from another.
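One simplified way to sketch the idea, assuming paired samples measured on a "master" and a "slave" instrument, is to decompose the stacked spectra with FastICA and map the slave component weights onto the master's by least squares before re-synthesizing corrected spectra. This is only an illustration of the general ICA-transfer concept, not the published algorithm.

```python
# Hedged sketch of ICA-based spectral standardization; spectra are simulated.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_samples, n_wl = 30, 200
pure = np.abs(rng.standard_normal((5, n_wl)))              # hypothetical constituent spectra
conc = np.abs(rng.standard_normal((n_samples, 5)))
master = conc @ pure + 0.01 * rng.standard_normal((n_samples, n_wl))
slave = 0.9 * master + 0.1 + 0.01 * rng.standard_normal((n_samples, n_wl))  # instrument offset/gain

ica = FastICA(n_components=5, random_state=0, max_iter=1000)
S_all = ica.fit_transform(np.vstack([master, slave]))      # per-spectrum component weights
S_master, S_slave = S_all[:n_samples], S_all[n_samples:]

T, *_ = np.linalg.lstsq(S_slave, S_master, rcond=None)     # correct slave weights toward master
slave_corrected = ica.inverse_transform(S_slave @ T)       # re-synthesize corrected spectra

print("mean abs gap before:", round(float(np.mean(abs(slave - master))), 4),
      " after:", round(float(np.mean(abs(slave_corrected - master))), 4))
```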
Computer analysis of the leaf movements of pinto beans.
Hoshizaki, T; Hamner, K C
1969-07-01
Computer analysis was used for the detection of rhythmic components and the estimation of period length in leaf movement records. The results of this study indicated that spectral analysis can be profitably used to determine rhythmic components in leaf movements. In Pinto bean plants (Phaseolus vulgaris L.) grown for 28 days under continuous light of 750 ft-c and at a constant temperature of 28 degrees, there was only one highly significant rhythmic component in the leaf movements. The period of this rhythm was 27.3 hr. In plants grown at 20 degrees, there were two highly significant rhythmic components: one of 13.8 hr and a much stronger one of 27.3 hr. At 15 degrees, the highly significant rhythmic components were also 27.3 and 13.8 hr in length but were of equal intensity. Random movements less than 9 hr in length became very pronounced at this temperature. At 10 degrees, no significant rhythm was found in the leaf movements. At 5 degrees, the leaf movements ceased within 1 day.
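The spectral-analysis step can be sketched with a periodogram of a simulated hourly leaf-angle record; the dominant peak recovers a circadian-range period close to the one built into the simulation.

```python
# Sketch: periodogram of a simulated hourly leaf-angle series to find rhythmic components.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)
t = np.arange(0, 28 * 24)                                   # 28 days of hourly samples
leaf_angle = np.sin(2 * np.pi * t / 27.3) + 0.3 * rng.standard_normal(t.size)

freqs, power = periodogram(leaf_angle, fs=1.0)              # fs in samples per hour
dominant = 1.0 / freqs[1:][np.argmax(power[1:])]            # skip the zero-frequency bin
print(f"dominant period ~ {dominant:.1f} h")
```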
Comparing sugar components of cereal and pseudocereal flour by GC-MS analysis.
Ačanski, Marijana M; Vujić, Djura N
2014-02-15
Gas chromatography with mass spectrometry was used to carry out a qualitative analysis of the ethanol-soluble flour extracts of different types of cereals (bread wheat and spelt) and pseudocereals (amaranth and buckwheat). TMSI (trimethylsilylimidazole) was used as a reagent for the derivatisation of carbohydrates into trimethylsilyl ethers. All samples were first defatted with hexane. (In our earlier investigations, hexane extracts were used for the analysis of the fatty acids of lipid components.) Many pentose, hexose and disaccharide components were identified using the 73 and 217 Da masses and the Wiley Online Library search. The aim of this paper is not to identify new components, but to compare the sugar components of the tested flour samples of cereals (bread wheat and spelt) and pseudocereals (amaranth and buckwheat). Results were analysed using descriptive statistics (dendrograms and PCA). The results show that this method can be used to distinguish among different types of flour. Copyright © 2013 Elsevier Ltd. All rights reserved.
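The chemometric step, a hierarchical clustering dendrogram plus PCA of the sugar-component table, might look like the following sketch; the four-flour table is invented for illustration.

```python
# Sketch: hierarchical clustering and PCA of a made-up sugar-component table for
# wheat, spelt, amaranth and buckwheat flours (values are illustrative only).
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from sklearn.decomposition import PCA

samples = ["bread_wheat", "spelt", "amaranth", "buckwheat"]
sugars = np.array([[1.2, 0.8, 3.1, 0.4],      # rows: flours; columns: e.g. fructose,
                   [1.1, 0.9, 2.9, 0.5],      # glucose, sucrose, raffinose (hypothetical)
                   [0.6, 0.5, 1.2, 1.8],
                   [0.7, 0.4, 1.4, 1.7]])

Z = linkage(sugars, method="ward")             # cereals vs pseudocereals should separate
print(dendrogram(Z, labels=samples, no_plot=True)["ivl"])   # leaf order of the dendrogram

scores = PCA(n_components=2).fit_transform(sugars)
print(np.round(scores, 2))
```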
Jović, Ozren; Smolić, Tomislav; Primožič, Ines; Hrenar, Tomica
2016-04-19
The aim of this study was to investigate the feasibility of FTIR-ATR spectroscopy coupled with the multivariate numerical methodology for qualitative and quantitative analysis of binary and ternary edible oil mixtures. Four pure oils (extra virgin olive oil, high oleic sunflower oil, rapeseed oil, and sunflower oil), as well as their 54 binary and 108 ternary mixtures, were analyzed using FTIR-ATR spectroscopy in combination with principal component and discriminant analysis, partial least-squares, and principal component regression. It was found that the composition of all 166 samples can be excellently represented using only the first three principal components describing 98.29% of total variance in the selected spectral range (3035-2989, 1170-1140, 1120-1100, 1093-1047, and 930-890 cm(-1)). Factor scores in 3D space spanned by these three principal components form a tetrahedral-like arrangement: pure oils being at the vertices, binary mixtures at the edges, and ternary mixtures on the faces of a tetrahedron. To confirm the validity of results, we applied several cross-validation methods. Quantitative analysis was performed by minimization of root-mean-square error of cross-validation values regarding the spectral range, derivative order, and choice of method (partial least-squares or principal component regression), which resulted in excellent predictions for test sets (R(2) > 0.99 in all cases). Additionally, experimentally more demanding gas chromatography analysis of fatty acid content was carried out for all specimens, confirming the results obtained by FTIR-ATR coupled with principal component analysis. However, FTIR-ATR provided a considerably better model for prediction of mixture composition than gas chromatography, especially for high oleic sunflower oil.
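The model-selection step, tracking the cross-validated error of PLS regression against the number of latent variables, can be sketched as follows on simulated ternary-mixture spectra (not the FTIR-ATR data).

```python
# Sketch: PLS regression of mixture composition on simulated spectra, with
# cross-validated RMSE (RMSECV) tracked against the number of latent variables.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_mix, n_wl = 60, 300
composition = rng.dirichlet(np.ones(3), size=n_mix)         # ternary mixture fractions
pure = np.abs(rng.standard_normal((3, n_wl)))                # hypothetical pure-oil spectra
spectra = composition @ pure + 0.01 * rng.standard_normal((n_mix, n_wl))

for n_lv in (2, 3, 5):
    pred = cross_val_predict(PLSRegression(n_components=n_lv), spectra, composition, cv=10)
    rmsecv = np.sqrt(np.mean((pred - composition) ** 2))
    print(f"{n_lv} latent variables: RMSECV = {rmsecv:.4f}")
```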
Long-life mission reliability for outer planet atmospheric entry probes
NASA Technical Reports Server (NTRS)
Mccall, M. T.; Rouch, L.; Maycock, J. N.
1976-01-01
The results of a literature analysis on the effects of prolonged exposure to deep space environment on the properties of outer planet atmospheric entry probe components are presented. Materials considered included elastomers and plastics, pyrotechnic devices, thermal control components, metal springs and electronic components. The rates of degradation of each component were determined and extrapolation techniques were used to predict the effects of exposure for up to eight years to deep space. Pyrotechnic devices were aged under accelerated conditions to an equivalent of eight years in space and functionally tested. Results of the literature analysis of the selected components and testing of the devices indicated that no severe degradation should be expected during an eight year space mission.
Determining the Number of Components from the Matrix of Partial Correlations
ERIC Educational Resources Information Center
Velicer, Wayne F.
1976-01-01
A method is presented for determining the number of components to retain in a principal components or image components analysis which utilizes a matrix of partial correlations. Advantages and uses of the method are discussed and a comparison of the proposed method with existing methods is presented. (JKS)
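This procedure is commonly known as Velicer's minimum average partial (MAP) criterion. The sketch below follows the standard description (partial out successive principal components and retain the number that minimizes the average squared partial correlation) rather than the original paper's notation; the data are simulated.

```python
# Compact sketch of a minimum-average-partial style criterion for the number of components.
import numpy as np

def velicer_map(X):
    R = np.corrcoef(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(R)
    order = np.argsort(eigval)[::-1]
    eigval, eigvec = eigval[order], eigvec[:, order]
    p = R.shape[0]
    avg_sq = []
    for m in range(p - 1):                        # m components partialled out (m = 0..p-2)
        A = eigvec[:, :m] * np.sqrt(eigval[:m])   # loadings of the first m components
        C = R - A @ A.T                           # partial covariance matrix
        d = np.sqrt(np.diag(C))
        P = C / np.outer(d, d)                    # partial correlation matrix
        off = P[~np.eye(p, dtype=bool)]
        avg_sq.append(np.mean(off ** 2))          # average squared partial correlation
    return int(np.argmin(avg_sq)), avg_sq         # retain the m at the minimum

rng = np.random.default_rng(0)
F = rng.standard_normal((200, 2))                              # two underlying factors
X = F @ rng.standard_normal((2, 6)) + 0.5 * rng.standard_normal((200, 6))
n_keep, curve = velicer_map(X)
print("components to retain:", n_keep)
```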
Multiple Component Event-Related Potential (mcERP) Estimation
NASA Technical Reports Server (NTRS)
Knuth, K. H.; Clanton, S. T.; Shah, A. S.; Truccolo, W. A.; Ding, M.; Bressler, S. L.; Trejo, L. J.; Schroeder, C. E.; Clancy, Daniel (Technical Monitor)
2002-01-01
We show how model-based estimation of the neural sources responsible for transient neuroelectric signals can be improved by the analysis of single trial data. Previously, we showed that a multiple component event-related potential (mcERP) algorithm can extract the responses of individual sources from recordings of a mixture of multiple, possibly interacting, neural ensembles. McERP also estimated single-trial amplitudes and onset latencies, thus allowing more accurate estimation of ongoing neural activity during an experimental trial. The mcERP algorithm is related to infomax independent component analysis (ICA); however, the underlying signal model is more physiologically realistic in that a component is modeled as a stereotypic waveshape varying both in amplitude and onset latency from trial to trial. The result is a model that reflects quantities of interest to the neuroscientist. Here we demonstrate that the mcERP algorithm provides more accurate results than more traditional methods such as factor analysis and the more recent ICA. Whereas factor analysis assumes the sources are orthogonal and ICA assumes the sources are statistically independent, the mcERP algorithm makes no such assumptions thus allowing investigators to examine interactions among components by estimating the properties of single-trial responses.
Computational Fatigue Life Analysis of Carbon Fiber Laminate
NASA Astrophysics Data System (ADS)
Shastry, Shrimukhi G.; Chandrashekara, C. V., Dr.
2018-02-01
In the present scenario, many traditional materials are being replaced by composite materials for their light weight and high strength. Industries such as the automotive and aerospace industries use composite materials for many of their components. Replacing components that are subjected to static or impact loads is less challenging than replacing components that are subjected to dynamic loading. Replacing components with ones made of composite materials demands many stages of parametric study, one of which is the fatigue analysis of the composite material. This paper focuses on the fatigue life analysis of a composite material using computational techniques. A composite plate with a hole at the center is considered for the study. The analysis is carried out on a (0°/90°/90°/90°/90°)s laminate sequence and a (45°/-45°)2s laminate sequence using a computer script, and the life cycles of the two lay-up sequences are compared with each other. It is observed that, for the same material and geometry of the component, cross-ply laminates show better fatigue life than angle-ply laminates.
Morin, R.H.
1997-01-01
Returns from drilling in unconsolidated cobble and sand aquifers commonly do not identify lithologic changes that may be meaningful for hydrogeologic investigations. Vertical resolution of saturated, Quaternary, coarse braided-stream deposits is significantly improved by interpreting natural gamma (G), epithermal neutron (N), and electromagnetically induced resistivity (IR) logs obtained from wells at the Capital Station site in Boise, Idaho. Interpretation of these geophysical logs is simplified because these sediments are derived largely from high-gamma-producing source rocks (granitics of the Boise River drainage), contain few clays, and have undergone little diagenesis. Analysis of G, N, and IR data from these deposits with principal components analysis provides an objective means to determine if units can be recognized within the braided-stream deposits. In particular, performing principal components analysis on G, N, and IR data from eight wells at Capital Station (1) allows the variable system dimensionality to be reduced from three to two by selecting the two eigenvectors with the greatest variance as axes for principal component scatterplots, (2) generates principal components with interpretable physical meanings, (3) distinguishes sand from cobble-dominated units, and (4) provides a means to distinguish between cobble-dominated units.
NASA Astrophysics Data System (ADS)
Khelifa, S.
2014-12-01
Using the wavelet transform and Allan variance, we have analysed the weekly position residual solutions of nine high-latitude DORIS stations in STCD (STation Coordinate Difference) format provided by three Analysis Centres: IGN-JPL (solution ign11wd01), INASAN (solution ina10wd01) and CNES-CLS (solution lca11wd02), in order to compare the spectral characteristics of their residual noise. The temporal correlations between the three solutions, two by two and station by station, for each component (North, East and Vertical) reveal a high correlation in the horizontal components (North and East). For the North component, the correlation average is about 0.88, 0.81 and 0.79 between, respectively, the IGN-INA, IGN-LCA and INA-LCA solutions, and for the East component it is about 0.84, 0.82 and 0.76, respectively. However, the correlations for the Vertical component are moderate, with an average of 0.64, 0.57 and 0.58 for, respectively, the IGN-INA, IGN-LCA and INA-LCA solutions. After removing the trends and seasonal components from the analysed time series, the Allan variance analysis shows that the three solutions are dominated by white noise in all three components (North, East and Vertical). The wavelet transform analysis, using the VisuShrink method with soft thresholding, reveals that the noise level in the LCA solution is lower than in the IGN and INA solutions. Indeed, the standard deviation of the noise for the three components is in the range of 5-11, 5-12 and 4-9 mm in the IGN, INA, and LCA solutions, respectively.
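The wavelet step can be sketched with the PyWavelets package: soft thresholding at the universal (VisuShrink) level applied to a simulated weekly residual series. This is illustrative only, not the DORIS processing itself.

```python
# Hedged sketch of VisuShrink soft-threshold denoising of a simulated residual series,
# using the PyWavelets package (pywt).
import numpy as np
import pywt

rng = np.random.default_rng(0)
weeks = np.arange(256)
residuals = 3 * np.sin(2 * np.pi * weeks / 52) + 5 * rng.standard_normal(weeks.size)  # mm

coeffs = pywt.wavedec(residuals, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745                  # noise level from finest scale
thr = sigma * np.sqrt(2 * np.log(residuals.size))               # universal (VisuShrink) threshold
coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[:residuals.size]

print(f"estimated noise standard deviation ~ {sigma:.1f} mm")
```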
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
Swearingen, Matthew C; DiBartola, Alex C; Dusane, Devendra; Granger, Jeffrey; Stoodley, Paul
2016-10-01
Bacterial biofilms are the main etiological agent of periprosthetic joint infections (PJI); however, it is unclear if biofilms colonize one or multiple components. Because biofilms can colonize a variety of surfaces, we hypothesized that biofilms would be present on all components. 16S ribosomal RNA (rRNA) gene sequencing analysis was used to identify bacteria recovered from individual components and non-absorbable suture material recovered from three PJI total knee revision cases. Bray-Curtis non-metric multidimensional scaling analysis revealed no significant differences in similarity when factoring component, material type, or suture versus non-suture material, but did reveal significant differences in organism profile between patients (P < 0.001) and negative controls (P < 0.001). Confocal microscopy and a novel agar encasement culturing method also confirmed biofilm growth on a subset of components. While 16S sequencing suggested that the microbiology was more complex than revealed by culture, contaminating bacterial DNA generates a risk of false positives. This report highlights that biofilm bacteria may colonize all infected prosthetic components including braided suture material, and provides further evidence that clinical culture can fail to sufficiently identify the full pathogen profile in PJI cases. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Function Invariant and Parameter Scale-Free Transformation Methods
ERIC Educational Resources Information Center
Bentler, P. M.; Wingard, Joseph A.
1977-01-01
A scale-invariant simple structure function of previously studied function components for principal component analysis and factor analysis is defined. First and second partial derivatives are obtained, and Newton-Raphson iterations are utilized. The resulting solutions are locally optimal and subjectively pleasing. (Author/JKS)
Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong
2015-11-17
We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small-volume samples. Pulsed-dc-ESI utilizes a constant high voltage to induce the generation of single-polarity pulsed electrospray remotely. This method significantly boosts sample economy, so that several minutes of MS signal duration can be obtained from a sample of merely picoliter volume. The elongated MS signal duration enables us to collect abundant MS(2) information on components of interest in a small-volume sample for systematic analysis. The method was successfully applied to single-cell metabolomics analysis. We obtained 2-D profiles of metabolites (including exact mass and MS(2) data) from single plant and mammalian cells, covering 1034 components and 656 components for Allium cepa and HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small-volume samples.
Steinhauser, Marco; Hübner, Ronald
2009-10-01
It has been suggested that performance in the Stroop task is influenced by response conflict as well as task conflict. The present study investigated the idea that both conflict types can be isolated by applying ex-Gaussian distribution analysis which decomposes response time into a Gaussian and an exponential component. Two experiments were conducted in which manual versions of a standard Stroop task (Experiment 1) and a separated Stroop task (Experiment 2) were performed under task-switching conditions. Effects of response congruency and stimulus bivalency were used to measure response conflict and task conflict, respectively. Ex-Gaussian analysis revealed that response conflict was mainly observed in the Gaussian component, whereas task conflict was stronger in the exponential component. Moreover, task conflict in the exponential component was selectively enhanced under task-switching conditions. The results suggest that ex-Gaussian analysis can be used as a tool to isolate different conflict types in the Stroop task. PsycINFO Database Record (c) 2009 APA, all rights reserved.
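An ex-Gaussian decomposition of response times can be sketched with scipy's exponentially modified normal distribution (exponnorm); the response times below are simulated rather than taken from the Stroop experiments.

```python
# Sketch: fit an ex-Gaussian (Gaussian + exponential) model to simulated response times.
# scipy parameterizes the distribution as exponnorm(K, loc, scale) with tau = K * scale.
import numpy as np
from scipy.stats import exponnorm

rng = np.random.default_rng(0)
mu, sigma, tau = 0.45, 0.05, 0.15                      # seconds; hypothetical true values
rts = rng.normal(mu, sigma, 500) + rng.exponential(tau, 500)

K, loc, scale = exponnorm.fit(rts)
print(f"mu ~ {loc:.3f} s, sigma ~ {scale:.3f} s, tau ~ {K * scale:.3f} s")
```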
EMD-WVD time-frequency distribution for analysis of multi-component signals
NASA Astrophysics Data System (ADS)
Chai, Yunzi; Zhang, Xudong
2016-10-01
Time-frequency distribution (TFD) is a two-dimensional function that indicates the time-varying frequency content of one-dimensional signals. The Wigner-Ville distribution (WVD) is an important and effective time-frequency analysis method. The WVD can efficiently show the characteristics of a mono-component signal. However, a major drawback is the extra cross-terms that appear when multi-component signals are analyzed with the WVD. In order to eliminate the cross-terms, we first decompose signals into single-frequency components - Intrinsic Mode Functions (IMFs) - using Empirical Mode Decomposition (EMD), and then use the WVD to analyze each IMF. In this paper, we define this new time-frequency distribution as EMD-WVD. The experimental results show that the proposed time-frequency method can solve the cross-term problem effectively and improve the accuracy of WVD time-frequency analysis.
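A minimal sketch of the EMD-WVD idea is given below: decompose the signal into IMFs first, then compute a discrete Wigner-Ville distribution for each IMF separately so that cross-terms between components never arise. It assumes the third-party PyEMD package (distributed as EMD-signal) for the decomposition, and the small wigner_ville helper is a bare-bones discrete implementation rather than the authors' code.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD  # third-party package "EMD-signal"; assumed dependency

def wigner_ville(x):
    """Minimal discrete Wigner-Ville distribution (rows = frequency bins, cols = time)."""
    z = hilbert(x)                 # analytic signal suppresses negative-frequency terms
    n_samples = len(z)
    W = np.zeros((n_samples, n_samples))
    for n in range(n_samples):
        lag = min(n, n_samples - 1 - n)
        tau = np.arange(-lag, lag + 1)
        acf = np.zeros(n_samples, dtype=complex)
        acf[tau % n_samples] = z[n + tau] * np.conj(z[n - tau])  # instantaneous autocorrelation
        W[:, n] = np.fft.fft(acf).real
    return W

fs = 500.0
t = np.arange(0, 2, 1 / fs)
signal = np.sin(2 * np.pi * 30 * t) + np.sin(2 * np.pi * 80 * t)  # two-component test signal

imfs = EMD().emd(signal)                     # decompose into IMFs first ...
tfds = [wigner_ville(imf) for imf in imfs]   # ... then take the WVD of each IMF separately
```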
NASA Technical Reports Server (NTRS)
Nakazawa, S.
1987-01-01
This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of new computer codes that permit more accurate and efficient three-dimensional analysis of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components. This report is presented in two volumes. Volume 1 describes effort performed under Task 4B, Special Finite Element Special Function Models, while Volume 2 concentrates on Task 4C, Advanced Special Functions Models.
Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components
NASA Technical Reports Server (NTRS)
1999-01-01
Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occurs in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.
Non-linear principal component analysis applied to Lorenz models and to North Atlantic SLP
NASA Astrophysics Data System (ADS)
Russo, A.; Trigo, R. M.
2003-04-01
A non-linear generalisation of Principal Component Analysis (PCA), denoted Non-Linear Principal Component Analysis (NLPCA), is introduced and applied to the analysis of three data sets. Non-Linear Principal Component Analysis allows for the detection and characterisation of low-dimensional non-linear structure in multivariate data sets. This method is implemented using a 5-layer feed-forward neural network introduced originally in the chemical engineering literature (Kramer, 1991). The method is described and details of its implementation are addressed. Non-Linear Principal Component Analysis is first applied to a data set sampled from the Lorenz attractor (1963). It is found that the NLPCA approximations are more representative of the data than are the corresponding PCA approximations. The same methodology was applied to the less known Lorenz attractor (1984). However, the results obtained were not as good as those attained with the famous 'Butterfly' attractor. Further work with this model is underway in order to assess if NLPCA techniques can be more representative of the data characteristics than are the corresponding PCA approximations. The application of NLPCA to relatively 'simple' dynamical systems, such as those proposed by Lorenz, is well understood. However, the application of NLPCA to a large climatic data set is much more challenging. Here, we have applied NLPCA to the sea level pressure (SLP) field for the entire North Atlantic area, and the results show a slight increase in the explained variance. Finally, directions for future work are presented.
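A sketch of a Kramer-style autoassociative network is shown below, with scikit-learn's MLPRegressor standing in for the 5-layer architecture (mapping layer, one-node bottleneck, demapping layer). The synthetic parabolic data and the layer sizes are illustrative assumptions, not the configuration used in the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic data lying near a one-dimensional curved manifold (a parabola plus noise)
t = rng.uniform(-1, 1, 500)
X = np.column_stack([t, t ** 2]) + rng.normal(0, 0.05, (500, 2))
Xs = StandardScaler().fit_transform(X)

# Autoassociative network: inputs are reconstructed through a single bottleneck node,
# mimicking the mapping/bottleneck/demapping layers of Kramer-style NLPCA
net = MLPRegressor(hidden_layer_sizes=(10, 1, 10), activation="tanh",
                   max_iter=5000, random_state=0)
net.fit(Xs, Xs)

def nonlinear_pc(net, Xs):
    """Forward-propagate through the first two layers to read off the bottleneck score."""
    a = Xs
    for W, b in list(zip(net.coefs_, net.intercepts_))[:2]:
        a = np.tanh(a @ W + b)
    return a.ravel()

scores = nonlinear_pc(net, Xs)   # the single nonlinear principal component
```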
Noninvasive deep Raman detection with 2D correlation analysis
NASA Astrophysics Data System (ADS)
Kim, Hyung Min; Park, Hyo Sun; Cho, Youngho; Jin, Seung Min; Lee, Kang Taek; Jung, Young Mee; Suh, Yung Doug
2014-07-01
The detection of poisonous chemicals enclosed in daily necessities is essential for homeland security given the increasing threat of terrorism. For the detection of toxic chemicals, we combined a sensitive deep Raman spectroscopic method with 2D correlation analysis. We obtained the Raman spectra of concealed chemicals by employing spatially offset Raman spectroscopy, in which incident line-shaped light experiences multiple scatterings before reaching the inner component and yielding the deep Raman signal. Furthermore, we restored the pure Raman spectrum of each component using 2D correlation spectroscopic analysis with chemical inspection. Using this method, we could elucidate subsurface components under thick powder and packed contents in a bottle.
Lattice Independent Component Analysis for Mobile Robot Localization
NASA Astrophysics Data System (ADS)
Villaverde, Ivan; Fernandez-Gauna, Borja; Zulueta, Ekaitz
This paper introduces an approach to appearance based mobile robot localization using Lattice Independent Component Analysis (LICA). The Endmember Induction Heuristic Algorithm (EIHA) is used to select a set of Strong Lattice Independent (SLI) vectors, which can be assumed to be Affine Independent, and therefore candidates to be the endmembers of the data. Selected endmembers are used to compute the linear unmixing of the robot's acquired images. The resulting mixing coefficients are used as feature vectors for view recognition through classification. We show on a sample path experiment that our approach can recognise the localization of the robot and we compare the results with the Independent Component Analysis (ICA).
Chen, Pei; Jin, Hong-Yu; Sun, Lei; Ma, Shuang-Cheng
2016-09-01
Multi-source analysis of traditional Chinese medicine is key to ensuring its safety and efficacy. Compared with traditional experimental differentiation, chemometric analysis is a simpler strategy to identify traditional Chinese medicines. Multi-component analysis plays an increasingly vital role in the quality control of traditional Chinese medicines. A novel strategy, based on chemometric analysis and quantitative analysis of multiple components, was proposed to easily and effectively control the quality of traditional Chinese medicines such as Chonglou. Ultra-high performance liquid chromatography proved more convenient and efficient. Five species of Chonglou were distinguished by chemometric analysis, and nine saponins, including Chonglou saponins I, II, V, VI, VII, D, and H, as well as dioscin and gracillin, were determined in 18 min. The method is feasible and credible, and enables improved quality control of traditional Chinese medicines and natural products. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Inventory of File sref.t03z.pgrb212_SPC.prob_3hrly.gri
-GWD analysis Zonal Flux of Gravity Wave Stress [prob] prob =1 002 entire atmosphere (considered as a single layer) VUCSH analysis Vertical U-Component Shear [prob] prob =2 004 entire atmosphere (considered as a single layer) VUCSH analysis Vertical U-Component Shear [prob] prob =3 005 surface APCP 0-3
Inventory of File sref.t03z.pgrb216_SPC.prob_3hrly.gri
-GWD analysis Zonal Flux of Gravity Wave Stress [prob] prob =1 002 entire atmosphere (considered as a single layer) VUCSH analysis Vertical U-Component Shear [prob] prob =2 004 entire atmosphere (considered as a single layer) VUCSH analysis Vertical U-Component Shear [prob] prob =3 005 surface APCP 0-3
Inventory of File sref.t03z.pgrb243_SPC.prob_3hrly.gri
-GWD analysis Zonal Flux of Gravity Wave Stress [prob] prob =1 002 entire atmosphere (considered as a single layer) VUCSH analysis Vertical U-Component Shear [prob] prob =2 004 entire atmosphere (considered as a single layer) VUCSH analysis Vertical U-Component Shear [prob] prob =3 005 surface APCP 0-3
49 CFR Appendix B to Part 236 - Risk Assessment Criteria
Code of Federal Regulations, 2012 CFR
2012-10-01
... availability calculations for subsystems and components, Fault Tree Analysis (FTA) of the subsystems, and... upper bound, as estimated with a sensitivity analysis, and the risk value selected must be demonstrated... interconnected subsystems/components? The risk assessment of each safety-critical system (product) must account...
49 CFR Appendix B to Part 236 - Risk Assessment Criteria
Code of Federal Regulations, 2014 CFR
2014-10-01
... availability calculations for subsystems and components, Fault Tree Analysis (FTA) of the subsystems, and... upper bound, as estimated with a sensitivity analysis, and the risk value selected must be demonstrated... interconnected subsystems/components? The risk assessment of each safety-critical system (product) must account...
Component analysis and initial validity of the exercise fear avoidance scale.
Wingo, Brooks C; Baskin, Monica; Ard, Jamy D; Evans, Retta; Roy, Jane; Vogtle, Laura; Grimley, Diane; Snyder, Scott
2013-01-01
To develop the Exercise Fear Avoidance Scale (EFAS) to measure fear of exercise-induced discomfort. We conducted principal component analysis to determine component structure and Cronbach's alpha to assess internal consistency of the EFAS. Relationships between EFAS scores, BMI, physical activity, and pain were analyzed using multivariate regression. The best fit was a 3-component structure: weight-specific fears, cardiorespiratory fears, and musculoskeletal fears. Cronbach's alpha for the EFAS was α=.86. EFAS scores significantly predicted BMI, physical activity, and PDI scores. Psychometric properties of this scale suggest it may be useful for tailoring exercise prescriptions to address fear of exercise-related discomfort.
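The two computations named above, principal component extraction and Cronbach's alpha, can be reproduced on any item-response matrix along the lines of the sketch below; the simulated 12-item responses are placeholders rather than EFAS data.

```python
import numpy as np
from sklearn.decomposition import PCA

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

rng = np.random.default_rng(2)
responses = rng.integers(1, 6, size=(150, 12)).astype(float)  # hypothetical 12-item scale

print("alpha =", round(cronbach_alpha(responses), 3))

pca = PCA(n_components=3)           # a 3-component structure, as in the scale above
scores = pca.fit_transform(responses - responses.mean(axis=0))
print("variance explained:", pca.explained_variance_ratio_)
```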
NASA Technical Reports Server (NTRS)
Wilson, R. B.; Banerjee, P. K.
1987-01-01
This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Sections Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of computer codes that permit more accurate and efficient three-dimensional analyses of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components.
Probabilistic evaluation of SSME structural components
NASA Astrophysics Data System (ADS)
Rajagopal, K. R.; Newell, J. F.; Ho, H.
1991-05-01
The application is described of Composite Load Spectra (CLS) and Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) family of computer codes to the probabilistic structural analysis of four Space Shuttle Main Engine (SSME) space propulsion system components. These components are subjected to environments that are influenced by many random variables. The applications consider a wide breadth of uncertainties encountered in practice, while simultaneously covering a wide area of structural mechanics. This has been done consistent with the primary design requirement for each component. The probabilistic application studies are discussed using finite element models that have been typically used in the past in deterministic analysis studies.
Measurement analysis of two radials with a common-origin point and its application.
Liu, Zhenyao; Yang, Jidong; Zhu, Weiwei; Zhou, Shang; Tan, Xuanping
2017-08-01
In spectral analysis, a chemical component is usually identified by its characteristic spectra, especially the peaks. If two components have overlapping spectral peaks, they are generally considered to be indiscriminate in current analytical chemistry textbooks and related literature. However, if the intensities of the overlapping major spectral peaks are additive, and have different rates of change with respect to variations in the concentration of the individual components, a simple method, named the 'common-origin ray', for the simultaneous determination of two components can be established. Several case studies highlighting its applications are presented. Copyright © 2017 John Wiley & Sons, Ltd.
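The sketch below is not the authors' 'common-origin ray' procedure itself; it only illustrates the underlying premise that when overlapping peak intensities are additive and the two components have different sensitivity slopes, two measurements suffice to solve for both concentrations. All sensitivities and intensities are invented numbers.

```python
import numpy as np

# Hypothetical sensitivity slopes (signal per unit concentration) of components A and B
# at two measurement channels whose peaks overlap
K = np.array([[0.82, 0.31],    # channel 1: response to A, response to B
              [0.45, 0.66]])   # channel 2: response to A, response to B
y = np.array([1.24, 1.05])     # measured total (additive) intensities at the two channels

c_A, c_B = np.linalg.solve(K, y)   # concentrations recovered from the overlapping signals
print(f"c_A = {c_A:.3f}, c_B = {c_B:.3f}")
```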
A first application of independent component analysis to extracting structure from stock returns.
Back, A D; Weigend, A S
1997-08-01
This paper explores the application of a signal processing technique known as independent component analysis (ICA) or blind source separation to multivariate financial time series such as a portfolio of stocks. The key idea of ICA is to linearly map the observed multivariate time series into a new space of statistically independent components (ICs). We apply ICA to three years of daily returns of the 28 largest Japanese stocks and compare the results with those obtained using principal component analysis. The results indicate that the estimated ICs fall into two categories, (i) infrequent large shocks (responsible for the major changes in the stock prices), and (ii) frequent smaller fluctuations (contributing little to the overall level of the stocks). We show that the overall stock price can be reconstructed surprisingly well by using a small number of thresholded weighted ICs. In contrast, when using shocks derived from principal components instead of independent components, the reconstructed price is less similar to the original one. ICA is shown to be a potentially powerful method of analyzing and understanding driving mechanisms in financial time series. The application to portfolio optimization is described in Chin and Weigend (1998).
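A comparable analysis can be sketched with scikit-learn, as below, where FastICA extracts independent components from a returns matrix and PCA is run on the same data for comparison; the simulated returns stand in for the Japanese stock data.

```python
import numpy as np
from sklearn.decomposition import FastICA, PCA

rng = np.random.default_rng(3)

# Simulated daily returns for 28 stocks over 750 trading days (stand-in for real data)
n_days, n_stocks = 750, 28
returns = rng.standard_t(df=4, size=(n_days, n_stocks)) * 0.01  # heavy-tailed returns

ica = FastICA(n_components=4, random_state=0)
ics = ica.fit_transform(returns)        # statistically independent components (n_days x 4)
mixing = ica.mixing_                    # how each IC loads onto each stock

pcs = PCA(n_components=4).fit_transform(returns)  # variance-ordered components for comparison

# Large-magnitude IC values flag the infrequent shocks; thresholding keeps only those
shocks = np.where(np.abs(ics) > 3 * ics.std(axis=0), ics, 0.0)
```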
Reliability analysis of component-level redundant topologies for solid-state fault current limiter
NASA Astrophysics Data System (ADS)
Farhadi, Masoud; Abapour, Mehdi; Mohammadi-Ivatloo, Behnam
2018-04-01
Experience shows that semiconductor switches in power electronics systems are the most vulnerable components. One of the most common ways to solve this reliability challenge is component-level redundant design. There are four possible configurations for the redundant design in component level. This article presents a comparative reliability analysis between different component-level redundant designs for solid-state fault current limiter. The aim of the proposed analysis is to determine the more reliable component-level redundant configuration. The mean time to failure (MTTF) is used as the reliability parameter. Considering both fault types (open circuit and short circuit), the MTTFs of different configurations are calculated. It is demonstrated that more reliable configuration depends on the junction temperature of the semiconductor switches in the steady state. That junction temperature is a function of (i) ambient temperature, (ii) power loss of the semiconductor switch and (iii) thermal resistance of heat sink. Also, results' sensitivity to each parameter is investigated. The results show that in different conditions, various configurations have higher reliability. The experimental results are presented to clarify the theory and feasibility of the proposed approaches. At last, levelised costs of different configurations are analysed for a fair comparison.
A further component analysis for illicit drugs mixtures with THz-TDS
NASA Astrophysics Data System (ADS)
Xiong, Wei; Shen, Jingling; He, Ting; Pan, Rui
2009-07-01
A new method for the quantitative analysis of mixtures of illicit drugs with THz time-domain spectroscopy was proposed and verified experimentally. The traditional method requires the fingerprints of all the pure chemical components. In practice, only the target components in a mixture and their absorption features are known, so it is necessary and important to present a more practical technique for detection and identification. Our new method for the quantitative inspection of mixtures of illicit drugs was developed using derivative spectra. In this method, the ratio of target components in a mixture can be obtained on the assumption that all target components in the mixture and their absorption features are known, while the unknown components are not needed. Methamphetamine and flour, an illicit drug and a common adulterant, were selected for our experiment. The experimental result verified the effectiveness of the method, suggesting that it could be an effective approach for the quantitative identification of illicit drugs. This THz spectroscopy technique is of great significance for real-world applications of quantitative illicit drug analysis, and could be an effective method in the fields of security and pharmaceutical inspection.
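A simplified version of the derivative-spectrum quantification can be sketched as below, with synthetic Gaussian absorption features standing in for the THz fingerprints of the drug and the adulterant; the exact preprocessing used by the authors may differ.

```python
import numpy as np

freq = np.linspace(0.2, 2.5, 500)   # THz frequency axis (arbitrary units)
gauss = lambda c, w: np.exp(-((freq - c) / w) ** 2)

# Hypothetical absorption features of the target component and the adulterant
drug = gauss(1.25, 0.05) + 0.6 * gauss(1.85, 0.07)
flour = 0.4 * gauss(1.0, 0.4)

mixture = 0.3 * drug + 0.7 * flour + np.random.default_rng(4).normal(0, 0.01, freq.size)

# Work with first-derivative spectra to suppress slowly varying baselines
d = lambda s: np.gradient(s, freq)
A = np.column_stack([d(drug), d(flour)])
ratio, *_ = np.linalg.lstsq(A, d(mixture), rcond=None)
print("estimated fractions:", ratio / ratio.sum())
```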
Semi-blind Bayesian inference of CMB map and power spectrum
NASA Astrophysics Data System (ADS)
Vansyngel, Flavien; Wandelt, Benjamin D.; Cardoso, Jean-François; Benabed, Karim
2016-04-01
We present a new blind formulation of the cosmic microwave background (CMB) inference problem. The approach relies on a phenomenological model of the multifrequency microwave sky without the need for physical models of the individual components. For all-sky and high resolution data, it unifies parts of the analysis that had previously been treated separately such as component separation and power spectrum inference. We describe an efficient sampling scheme that fully explores the component separation uncertainties on the inferred CMB products such as maps and/or power spectra. External information about individual components can be incorporated as a prior giving a flexible way to progressively and continuously introduce physical component separation from a maximally blind approach. We connect our Bayesian formalism to existing approaches such as Commander, spectral mismatch independent component analysis (SMICA), and internal linear combination (ILC), and discuss possible future extensions.
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
Liu, Wei; Wang, Dongmei; Hou, Xiaogai; Yang, Yueqin; Xue, Xian; Jia, Qishi; Zhang, Lixia; Zhao, Wei; Yin, Dongxue
2018-05-17
Traditional Chinese medicine (TCM) plays a very important role in the health system of China. The content and activity of active components are the main indexes for evaluating the quality of TCM; however, they may vary with environmental factors in the growing locations. In this study, the effects of environmental factors on the contents of active components and the antioxidant activity of Dasiphora fruticosa from the five main production areas of China were investigated. The contents of tannin, total flavonoid and rutin were determined and varied within the ranges of 7.65-10.69%, 2.30-5.39% and 0.18-0.81%, respectively. Antioxidant activity was determined by DPPH assay, with DPPH IC50 values ranging from 8.791 to 32.534 μg mL-1. In order to further explore the cause of these significant geographical variations, chemometric methods including correlation analysis, principal component analysis, gray correlation analysis, and path analysis were conducted. The results showed that environmental factors had a significant effect on the active component contents and antioxidant activity. Rapidly available phosphorus (RAP) and rapidly available nitrogen (RAN) were common dominant factors, and a significant positive correlation was observed between RAP and active components and antioxidant activity (P<0.05). Owing to their high active component contents and strong antioxidant activity, Bange in Tibet and Geermu in Qinghai Province were selected as favorable growing locations. This article is protected by copyright. All rights reserved.
Yamamoto, Shinya; Bamba, Takeshi; Sano, Atsushi; Kodama, Yukako; Imamura, Miho; Obata, Akio; Fukusaki, Eiichiro
2012-08-01
Soy sauces, produced from different ingredients and brewing processes, have variations in components and quality. Therefore, it is extremely important to comprehend the relationship between the components and the sensory attributes of soy sauces. The current study sought to perform metabolite profiling in order to devise a method of assessing the attributes of soy sauces. Quantitative descriptive analysis (QDA) data for 24 soy sauce samples were obtained from well selected sensory panelists. Metabolite profiles, primarily concerning low-molecular-weight hydrophilic components, were obtained by gas chromatography with time-of-flight mass spectrometry (GC/TOFMS). QDA data for soy sauces were accurately predicted by projection to latent structures (PLS), with metabolite profiles serving as explanatory variables and the QDA data set serving as the response variable. Moreover, analysis of the correlation between the matrices of metabolite profiles and QDA data indicated contributing compounds that were highly correlated with the QDA data. In particular, sugars were indicated to be important components of the tastes of soy sauces. This new approach, which combines metabolite profiling with QDA, is applicable to the analysis of sensory attributes of food resulting from the complex interaction between its components. This approach is effective for identifying important compounds that contribute to the attributes. Copyright © 2012 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
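A minimal sketch of the PLS step is given below using scikit-learn, with random placeholder values for the metabolite matrix and a single sensory attribute; component numbers and variable names are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(5)

# Placeholder data: 24 soy sauce samples x 120 metabolite peaks, and one QDA attribute
metabolites = rng.lognormal(mean=0.0, sigma=1.0, size=(24, 120))
sweetness = metabolites[:, :5].sum(axis=1) + rng.normal(0, 0.5, 24)  # toy "taste" score

pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, metabolites, sweetness, cv=6)  # cross-validated prediction

pls.fit(metabolites, sweetness)
loadings = pls.x_loadings_[:, 0]          # which metabolites drive the first latent variable
top_contributors = np.argsort(np.abs(loadings))[::-1][:10]
```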
NASA Astrophysics Data System (ADS)
Liu, Aoxue; Wang, Jingjuan; Guo, Yizhen; Xiao, Yao; Wang, Yue; Sun, Suqin; Chen, Jianbo
2018-03-01
As a common prescription, Shaoyao-Gancao-Tang (SGT) combines two Chinese herbs in four different proportions, which have different clinical efficacy because of their varying components. In order to investigate the herb-herb interaction mechanisms, we used tri-level infrared macro-fingerprint spectroscopy to evaluate the concentration changes of the active components of the four SGTs in this research. Fourier transform infrared spectroscopy (FT-IR) and second derivative infrared spectroscopy (SD-IR) can recognize the multiple prescriptions directly and simultaneously. 2D-IR spectra enhance the spectral resolution and provide much new information for discriminating the similar complicated samples of SGT. Furthermore, the whole analysis method, from the analysis of the main components to the specific components and the relative contents of the components, may better evaluate the quality of TCM. We concluded that paeoniflorin and glycyrrhizic acid accounted for the highest proportion of active ingredients in SGT-12:1 and the lowest in SGT-12:12, which matched the HPLC-DAD results. It is demonstrated that the method composed of tri-level infrared macro-fingerprint spectroscopy and the whole analysis can be applied for effective, visual and accurate analysis and identification of very complicated and similar mixture systems of traditional Chinese medicine.
Space tug propulsion system failure mode, effects and criticality analysis
NASA Technical Reports Server (NTRS)
Boyd, J. W.; Hardison, E. P.; Heard, C. B.; Orourke, J. C.; Osborne, F.; Wakefield, L. T.
1972-01-01
For purposes of the study, the propulsion system was considered as consisting of the following: (1) main engine system, (2) auxiliary propulsion system, (3) pneumatic system, (4) hydrogen feed, fill, drain and vent system, (5) oxygen feed, fill, drain and vent system, and (6) helium reentry purge system. Each component was critically examined to identify possible failure modes and the subsequent effect on mission success. Each space tug mission consists of three phases: launch to separation from shuttle, separation to redocking, and redocking to landing. The analysis considered the results of failure of a component during each phase of the mission. After the failure modes of each component were tabulated, those components whose failure would result in possible or certain loss of mission or inability to return the Tug to ground were identified as critical components and a criticality number determined for each. The criticality number of a component denotes the number of mission failures in one million missions due to the loss of that component. A total of 68 components were identified as critical with criticality numbers ranging from 1 to 2990.
Singh, Shatrughan; D'Sa, Eurico J; Swenson, Erick M
2010-07-15
Chromophoric dissolved organic matter (CDOM) variability in Barataria Basin, Louisiana, USA, was examined by excitation emission matrix (EEM) fluorescence combined with parallel factor analysis (PARAFAC). CDOM optical properties of absorption and fluorescence at 355 nm along an axial transect (36 stations) during March, April, and May 2008 showed an increasing trend from the marine end member to the upper basin, with mean CDOM absorption of 11.06 ± 5.01, 10.05 ± 4.23, and 11.67 ± 6.03 (m-1) and fluorescence of 0.80 ± 0.37, 0.78 ± 0.39, and 0.75 ± 0.51 (RU), respectively. PARAFAC analysis identified two terrestrial humic-like components (components 1 and 2), one non-humic-like component (component 3), and one soil-derived humic acid-like component (component 4). The spatial variation of the components showed an increasing trend from station 1 (near the mouth of the basin) to station 36 (end member of the bay; upper basin). Deviations from this increasing trend were observed at a bayou channel with very high chlorophyll-a concentrations, especially for component 3 in May 2008, which suggested autochthonous production of CDOM. The variability of the components with salinity indicated conservative mixing along the middle part of the transect. Components 1 and 4 were found to be relatively constant, while components 2 and 3 revealed an inverse relationship over the sampling period. Total organic carbon showed an increasing trend for each of the components. An increase in the humification index and a decrease in the fluorescence index along the transect indicated an increase in terrestrially derived organic matter and reduced microbial activity from the lower to the upper basin. The use of these indices along with the PARAFAC results improved dissolved organic matter characterization in the Barataria Basin. Copyright 2010 Elsevier B.V. All rights reserved.
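A PARAFAC decomposition of an EEM data cube can be sketched with the third-party tensorly package as below; the tensor here is a random placeholder and the API details (and the usual non-negativity constraints) may vary across tensorly versions.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import non_negative_parafac  # assumes the tensorly package

rng = np.random.default_rng(6)

# Placeholder EEM tensor: 36 stations x 40 excitation x 60 emission wavelengths
eem = tl.tensor(rng.random((36, 40, 60)))

cp = non_negative_parafac(eem, rank=4, n_iter_max=500)
station_scores, excitation_loadings, emission_loadings = cp.factors

# station_scores[:, k] tracks how strongly PARAFAC component k appears at each station,
# which is the quantity the spatial trends along the transect are based on
```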
Multivariate Statistical Analysis of MSL APXS Bulk Geochemical Data
NASA Astrophysics Data System (ADS)
Hamilton, V. E.; Edwards, C. S.; Thompson, L. M.; Schmidt, M. E.
2014-12-01
We apply cluster and factor analyses to bulk chemical data of 130 soil and rock samples measured by the Alpha Particle X-ray Spectrometer (APXS) on the Mars Science Laboratory (MSL) rover Curiosity through sol 650. Multivariate approaches such as principal components analysis (PCA), cluster analysis, and factor analysis complement more traditional approaches (e.g., Harker diagrams), with the advantage of simultaneously examining the relationships between multiple variables for large numbers of samples. Principal components analysis has been applied with success to APXS, Pancam, and Mössbauer data from the Mars Exploration Rovers. Factor analysis and cluster analysis have been applied with success to thermal infrared (TIR) spectral data of Mars. Cluster analyses group the input data by similarity, where there are a number of different methods for defining similarity (hierarchical, density, distribution, etc.). For example, without any assumptions about the chemical contributions of surface dust, preliminary hierarchical and K-means cluster analyses clearly distinguish the physically adjacent rock targets Windjana and Stephen as being distinctly different from lithologies observed prior to Curiosity's arrival at The Kimberley. In addition, they are separated from each other, consistent with chemical trends observed in variation diagrams but without requiring assumptions about chemical relationships. We will discuss the variation in cluster analysis results as a function of clustering method and pre-processing (e.g., log transformation, correction for dust cover) and the implications for interpreting chemical data. Factor analysis shares some similarities with PCA, and examines the variability among observed components of a dataset so as to reveal variations attributable to unobserved components. Factor analysis has been used to extract the TIR spectra of components that are typically observed in mixtures and only rarely in isolation; there is the potential for similar results with data from APXS. These techniques offer new ways to understand the chemical relationships between the materials interrogated by Curiosity, and potentially their relation to materials observed by APXS instruments on other landed missions.
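A sketch of such a workflow on a standardized bulk-chemistry table is given below, combining hierarchical (Ward) clustering, K-means, and PCA with scikit-learn and SciPy; the oxide matrix is a random placeholder, not APXS data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)

# Hypothetical bulk-chemistry table: 130 targets x 9 oxides (wt%), APXS-like only in shape
oxides = rng.random((130, 9)) * 50.0
Z = StandardScaler().fit_transform(np.log1p(oxides))   # optional log transform, then scale

# Hierarchical clustering (Ward linkage) and K-means on the same standardized data
tree = linkage(Z, method="ward")
hier_labels = fcluster(tree, t=5, criterion="maxclust")
km_labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(Z)

# PCA for a low-dimensional view of the same relationships
pcs = PCA(n_components=3).fit_transform(Z)
```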
Gruen, Dieter M.; Young, Charles E.; Pellin, Michael J.
1989-01-01
A charged particle spectrometer for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode.
Retest of a Principal Components Analysis of Two Household Environmental Risk Instruments.
Oneal, Gail A; Postma, Julie; Odom-Maryon, Tamara; Butterfield, Patricia
2016-08-01
Household Risk Perception (HRP) and Self-Efficacy in Environmental Risk Reduction (SEERR) instruments were developed for a public health nurse-delivered intervention designed to reduce home-based, environmental health risks among rural, low-income families. The purpose of this study was to test both instruments in a second low-income population that differed geographically and economically from the original sample. Participants (N = 199) were recruited from the Women, Infants, and Children (WIC) program. Paper and pencil surveys were collected at WIC sites by research-trained student nurses. Exploratory principal components analysis (PCA) was conducted, and comparisons were made to the original PCA for the purpose of data reduction. Instruments showed satisfactory Cronbach alpha values for all components. HRP components were reduced from five to four, which explained 70% of variance. The components were labeled sensed risks, unseen risks, severity of risks, and knowledge. In contrast to the original testing, environmental tobacco smoke (ETS) items were not a separate component of the HRP. The SEERR analysis demonstrated four components explaining 71% of variance, with similar patterns of items as in the first study, including a component on ETS, but some differences in item location. Although low-income populations constituted both samples, differences in demographics and risk exposures may have played a role in component and item locations. Findings provided justification for changing or reducing items, and for tailoring the instruments to population-level risks and behaviors. Although analytic refinement will continue, both instruments advance the measurement of environmental health risk perception and self-efficacy. © 2016 Wiley Periodicals, Inc.
Nishio, Jun; Iwasaki, Hiroshi; Nabeshima, Kazuki; Naito, Masatoshi
2015-01-01
Dedifferentiated liposarcoma (DDLS) is a malignant adipocytic tumor showing transition from an atypical lipomatous tumor (ALT)/well-differentiated liposarcoma (WDLS) to a non-lipogenic sarcoma of variable histological grades. We present the immunohistochemical, cytogenetic, and molecular cytogenetic findings of a DDLS arising in the right chest wall of a 76-year-old man. Magnetic resonance imaging exhibited a large mass composed of two components with heterogeneous signal intensities, suggesting the coexistence of a fatty area and another soft tissue component. The grossly heterogeneous mass was histologically composed of an ALT/WDLS component transitioning abruptly into a dedifferentiated component. Immunohistochemistry was positive for murine double-minute 2 (MDM2), cyclin-dependent kinase 4 (CDK4), and p16 in both components, although stronger and more diffuse staining was found in the dedifferentiated area. The MIB-1 labeling index was markedly higher in the dedifferentiated area than in the ALT/WDLS area. Cytogenetic analysis of the ALT/WDLS component revealed the following karyotype: 46,X,-Y,+r. Notably, cytogenetic analysis of the dedifferentiated component revealed a similar but more complex karyotype. Spectral karyotyping demonstrated that the ring chromosome was entirely composed of material from chromosome 12. Interphase fluorescence in situ hybridization analysis revealed amplification of MDM2 and CDK4 in both components. These findings suggest that multiple abnormal clones derived from a single precursor cell would be present in DDLS, with one or more containing supernumerary rings or giant marker chromosomes. Copyright© 2015 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.
Millán, F; Gracia, S; Sánchez-Martín, F M; Angerri, O; Rousaud, F; Villavicencio, H
2011-03-01
To evaluate a new approach to urinary stone analysis according to the combination of the components. A total of 7949 stones were analysed and their main components and combinations of components were classified according to gender and age. Statistical analysis was performed using the chi-square test. Calcium oxalate monohydrate (COM) was the most frequent component in both males (39%) and females (37.4%), followed by calcium oxalate dihydrate (COD) (28%) and uric acid (URI) (14.6%) in males and by phosphate (PHO) (22.2%) and COD (19.6%) in females (p=0.0001). In young people, COD and PHO were the most frequent components in males and females respectively (p=0.0001). In older patients, COM and URI (in that order) were the most frequent components in both genders (p=0.0001). COM is oxalate dependent and is related to diets with a high oxalate content and low water intake. The progressive increase in URI with age is related mainly to overweight and metabolic syndrome. Regarding the combinations of components, the most frequent were COM (26.3%), COD+Apatite (APA) (15.5%), URI (10%) and COM+COD (7.5%) (p=0.0001). This study reports not only the composition of stones but also the main combinations of components according to age and gender. The results prove that stone composition is related to the changes in dietary habits and life-style that occur over a lifetime, and the morphological structure of stones is indicative of the aetiopathogenic mechanisms. Copyright © 2010 AEU. Published by Elsevier Espana. All rights reserved.
Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures
NASA Technical Reports Server (NTRS)
Hartle, M. S.; Mcknight, R. L.; Huang, H.; Holt, R.
1992-01-01
Described here are the accomplishments of a 5-year program to develop a methodology for coupled structural, thermal, electromagnetic analysis tailoring of graded component structures. The capabilities developed over the course of the program are the analyzer module and the tailoring module for the modeling of graded materials. Highlighted accomplishments for the past year include the addition of a buckling analysis capability, the addition of mode shape slope calculation for flutter analysis, verification of the analysis modules using simulated components, and verification of the tailoring module.
The Intercultural Component in Textbooks for Teaching a Service Technical Writing Course
ERIC Educational Resources Information Center
Matveeva, Natalia
2007-01-01
This research article investigates new developments in the representation of the intercultural component in textbooks for a service technical writing course. Through textual analysis, using quantitative and qualitative techniques, I report discourse analysis of 15 technical writing textbooks published during 1993-2006. The theoretical and…
The Evaluation and Research of Multi-Project Programs: Program Component Analysis.
ERIC Educational Resources Information Center
Baker, Eva L.
1977-01-01
It is difficult to base evaluations on concepts irrelevant to state policy making. Evaluation of a multiproject program requires both time and differentiation of method. Data from the California Early Childhood Program illustrate process variables for program component analysis, and research questions for intraprogram comparison. (CP)
76 FR 57982 - Building Energy Codes Cost Analysis
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-19
... DEPARTMENT OF ENERGY Office of Energy Efficiency and Renewable Energy [Docket No. EERE-2011-BT-BC-0046] Building Energy Codes Cost Analysis Correction In notice document 2011-23236 beginning on page... heading ``Table 1. Cash flow components'' should read ``Table 7. Cash flow components''. [FR Doc. C1-2011...
The economics of project analysis: Optimal investment criteria and methods of study
NASA Technical Reports Server (NTRS)
Scriven, M. C.
1979-01-01
Insight is provided toward the development of an optimal program for investment analysis of project proposals offering commercial potential and its components. This involves a critique of economic investment criteria viewed in relation to requirements of engineering economy analysis. An outline for a systems approach to project analysis is given. Application of the Leontief input-output methodology to the analysis of projects involving multiple processes and products is investigated. Effective application of elements of neoclassical economic theory to investment analysis of project components is demonstrated. Patterns of both static and dynamic activity levels are incorporated.
Deep overbite malocclusion: analysis of the underlying components.
El-Dawlatly, Mostafa M; Fayed, Mona M Salah; Mostafa, Yehya A
2012-10-01
A deepbite malocclusion should not be approached as a disease entity; instead, it should be viewed as a clinical manifestation of underlying discrepancies. The aim of this study was to investigate the various skeletal and dental components of deep bite malocclusion, the significance of the contribution of each, and whether there are certain correlations between them. Dental and skeletal measurements were made on lateral cephalometric radiographs and study models of 124 patients with deepbite. These measurements were statistically analyzed. An exaggerated curve of Spee was the greatest shared dental component (78%), significantly higher than any other component (P = 0.0335). A decreased gonial angle was the greatest shared skeletal component (37.1%), highly significant compared with the other components (P = 0.0019). A strong positive correlation was found between the ramus/Frankfort horizontal angle and the gonial angle; weaker correlations were found between various components. An exaggerated curve of Spee and a decreased gonial angle were the greatest contributing components. This analysis of deepbite components could help clinicians design individualized mechanotherapies based on the underlying cause, rather than being biased toward predetermined mechanics when treating patients with a deepbite malocclusion. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetter, Michael; Fuchs, Marcus; Nouidui, Thierry
This paper discusses design decisions for exporting Modelica thermofluid flow components as Functional Mockup Units. The purpose is to provide guidelines that will allow building energy simulation programs and HVAC equipment manufacturers to effectively use FMUs for modeling of HVAC components and systems. We provide an analysis for direct input-output dependencies of such components and discuss how these dependencies can lead to algebraic loops that are formed when connecting thermofluid flow components. Based on this analysis, we provide recommendations that increase the computing efficiency of such components and systems that are formed by connecting multiple components. We explain what code optimizations are lost when providing thermofluid flow components as FMUs rather than Modelica code. We present an implementation of a package for FMU export of such components, explain the rationale for selecting the connector variables of the FMUs and finally provide computing benchmarks for different design choices. It turns out that selecting temperature rather than specific enthalpy as input and output signals does not lead to a measurable increase in computing time, but selecting nine small FMUs rather than a large FMU increases computing time by 70%.
NASA Astrophysics Data System (ADS)
Pacholski, Michaeleen L.
2004-06-01
Principal component analysis (PCA) has been successfully applied to time-of-flight secondary ion mass spectrometry (TOF-SIMS) spectra, images and depth profiles. Although SIMS spectral data sets can be small (in comparison to datasets typically discussed in the literature from other analytical techniques such as gas or liquid chromatography), each spectrum has thousands of ions, resulting in what can be a difficult comparison of samples. Analysis of industrially derived samples means the identities of most surface species are unknown a priori, and samples must be analyzed rapidly to satisfy customer demands. PCA enables rapid assessment of spectral differences (or lack thereof) between samples and identification of chemically different areas on sample surfaces for images. Depth profile analysis helps define interfaces and identify low-level components in the system.
Evaluation of Low-Voltage Distribution Network Index Based on Improved Principal Component Analysis
NASA Astrophysics Data System (ADS)
Fan, Hanlu; Gao, Suzhou; Fan, Wenjie; Zhong, Yinfeng; Zhu, Lei
2018-01-01
In order to evaluate the development level of the low-voltage distribution network objectively and scientifically, a chromatography analysis method is utilized to construct the evaluation index model of the low-voltage distribution network. Based on principal component analysis and the logarithmic distribution characteristic of the index data, a logarithmic centralization method is adopted to improve the principal component analysis algorithm. The algorithm decorrelates and reduces the dimensions of the evaluation model, and the comprehensive score has a better degree of dispersion. Because the comprehensive scores of the courts are concentrated, a clustering method is adopted to analyse them, and a stratified evaluation of the courts is thereby realized. An example is given to verify the objectivity and scientificity of the evaluation method.
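The logarithmic centralization step followed by PCA might look like the sketch below; the index matrix is a placeholder and the weighting of the comprehensive score is one plausible choice, not necessarily the authors' exact formulation.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)

# Placeholder index matrix: 200 distribution-network units x 10 evaluation indexes,
# with right-skewed (roughly log-normal) values as described above
indexes = rng.lognormal(mean=1.0, sigma=0.8, size=(200, 10))

# "Logarithmic centralization": log-transform, then centre each index on its mean
logged = np.log(indexes)
centred = logged - logged.mean(axis=0)

pca = PCA()
scores = pca.fit_transform(centred)
weights = pca.explained_variance_ratio_

# Comprehensive score as a variance-weighted sum of the leading component scores
k = np.searchsorted(np.cumsum(weights), 0.85) + 1   # retain ~85% of the variance
comprehensive = scores[:, :k] @ weights[:k]
```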
In situ X-ray diffraction analysis of (CF x) n batteries: signal extraction by multivariate analysis
Rodriguez, Mark A.; Keenan, Michael R.; Nagasubramanian, Ganesan
2007-11-10
In this study, the (CF x) n cathode reaction during discharge has been investigated using in situ X-ray diffraction (XRD). Mathematical treatment of the in situ XRD data set was performed using multivariate curve resolution with alternating least squares (MCR-ALS), a technique of multivariate analysis. MCR-ALS analysis successfully separated the relatively weak XRD signal intensity due to the chemical reaction from the other inert cell component signals. The resulting dynamic reaction component revealed the loss of (CF x) n cathode signal together with the simultaneous appearance of LiF by-product intensity. Careful examination of the XRD data set revealed an additional dynamic component which may be associated with the formation of an intermediate compound during the discharge process.
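A rough analogue of the curve-resolution step is sketched below using scikit-learn's NMF as a simple non-negativity-constrained stand-in for MCR-ALS; the synthetic diffraction patterns and evolving weights are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(9)

# Synthetic in situ data: 80 diffraction patterns (discharge steps) x 600 two-theta points,
# built from three non-negative "component" patterns with smoothly evolving weights
two_theta = np.linspace(10, 80, 600)
peaks = lambda centers: sum(np.exp(-((two_theta - c) / 0.3) ** 2) for c in centers)
patterns = np.vstack([peaks([26.5, 44.0]), peaks([38.7, 45.1]), peaks([22.0, 33.0])])
t = np.linspace(0, 1, 80)
weights = np.vstack([1 - t, t, 0.3 * np.ones_like(t)]).T
data = weights @ patterns + rng.normal(0, 0.01, (80, 600)).clip(min=0)

# NMF serves here as a simple non-negativity-constrained stand-in for MCR-ALS
model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
C = model.fit_transform(data)     # component "concentration" profiles vs discharge step
S = model.components_             # resolved component patterns vs two-theta
```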
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2004-03-23
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
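The augmentation idea can be illustrated numerically as below: a classical least squares fit with only the calibrated shapes is biased by an uncalibrated interferent, and adding the interferent's spectral shape to the fitting matrix restores the analyte estimates. All spectra are synthetic.

```python
import numpy as np

wn = np.linspace(1000, 1800, 400)                        # wavenumber axis
band = lambda c, w: np.exp(-((wn - c) / w) ** 2)

# Calibrated pure-component spectral shapes (known at calibration time)
S_cal = np.vstack([band(1200, 30), band(1450, 40)])      # analytes A and B

# Sample spectrum also contains an interferent absent from the calibration
interferent = band(1650, 60)
sample = 0.7 * S_cal[0] + 0.2 * S_cal[1] + 0.5 * interferent

# Plain CLS fit: interferent signal is forced onto A and B, biasing the estimates
c_plain, *_ = np.linalg.lstsq(S_cal.T, sample, rcond=None)

# Hybrid step: augment the calibration matrix with the interferent's spectral shape
S_aug = np.vstack([S_cal, interferent])
c_hybrid, *_ = np.linalg.lstsq(S_aug.T, sample, rcond=None)

print("plain CLS estimates :", np.round(c_plain, 3))
print("augmented estimates :", np.round(c_hybrid[:2], 3))
```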
Chemical Mapping of Essential Oils, Flavonoids and Carotenoids in Citrus Peels by Raman Microscopy.
Yang, Ying; Wang, Xiaohe; Zhao, Chengying; Tian, Guifang; Zhang, Hua; Xiao, Hang; He, Lili; Zheng, Jinkai
2017-12-01
Citrus peels, by-products generated in large quantities, are rich in various functional and beneficial components which have wide applications. Chemical analysis of these components in citrus peels is an important step in determining the usefulness of the by-products for further applications. In this study, we explored Raman microscopy for rapid, nondestructive, and in situ chemical mapping of the main functional components of citrus peels. The relative amounts and distributions in different locations (flavedo, albedo, and longitudinal section) of three main functional components (essential oils, carotenoids, and flavonoids) in citrus peels were systematically investigated. The distribution profiles of these components were heterogeneous across the peels and varied between different species of citrus peels. Essential oils were found to exist mainly in the oil glands, while carotenoids occupied the complementary locations. Some flavonoids were observed in the oil glands. This study showed the capability of Raman microscopy for rapid and nondestructive analysis of multiple bio-components without extraction from plants. The information obtained from this study would assist the better production and application of the functional and beneficial components from citrus by-products in an effective and sustainable manner. This study indicated the capability of Raman microscopy for rapid and nondestructive analysis of multiple bioactive components in plant tissues. The information obtained from the study would be valuable for developing an effective and sustainable strategy for utilizing citrus peels in further applications. © 2017 Institute of Food Technologists®.
NASA Astrophysics Data System (ADS)
Jiang, Weiping; Ma, Jun; Li, Zhao; Zhou, Xiaohui; Zhou, Boye
2018-05-01
The analysis of the correlations between the noise in different components of GPS stations is important for obtaining more accurate uncertainties of the velocities of station motion. Previous research into noise in GPS position time series focused mainly on single-component evaluation, which affects the acquisition of precise station positions, the velocity field, and its uncertainty. In this study, before and after removing the common-mode error (CME), we performed one-dimensional linear regression analysis of the noise amplitude vectors in different components of 126 GPS stations in Southern California, using a combination of white noise, flicker noise, and random walk noise. The results show that, on the one hand, there are above-moderate degrees of correlation between the white noise amplitude vectors in all components of the stations before and after removal of the CME, while the correlations between flicker noise amplitude vectors in the horizontal and vertical components are enhanced from uncorrelated to moderately correlated by removing the CME. On the other hand, the significance tests show that all of the obtained linear regression equations, which represent a unique function of the noise amplitude in any two components, are of practical value after removing the CME. According to the noise amplitude estimates in two components and the linear regression equations, more accurate noise amplitudes can be acquired for the two components.
Principal Component Analysis for Enhancement of Infrared Spectra Monitoring
NASA Astrophysics Data System (ADS)
Haney, Ricky Lance
The issue of air quality within the aircraft cabin is receiving increasing attention from both pilot and flight attendant unions. This is due to exposure events caused by poor air quality that in some cases may have contained toxic oil components due to bleed air that flows from outside the aircraft and then through the engines into the aircraft cabin. Significant short and long-term medical issues for aircraft crew have been attributed to exposure. The need for air quality monitoring is especially evident in the fact that currently within an aircraft there are no sensors to monitor the air quality and potentially harmful gas levels (detect-to-warn sensors), much less systems to monitor and purify the air (detect-to-treat sensors) within the aircraft cabin. The specific purpose of this research is to utilize a mathematical technique called principal component analysis (PCA) in conjunction with principal component regression (PCR) and proportionality constant calculations (PCC) to simplify complex, multi-component infrared (IR) spectra data sets into a reduced data set used for determination of the concentrations of the individual components. Use of PCA can significantly simplify data analysis as well as improve the ability to determine concentrations of individual target species in gas mixtures where significant band overlap occurs in the IR spectrum region. Application of this analytical numerical technique to IR spectrum analysis is important in improving performance of commercial sensors that airlines and aircraft manufacturers could potentially use in an aircraft cabin environment for multi-gas component monitoring. The approach of this research is two-fold, consisting of a PCA application to compare simulation and experimental results with the corresponding PCR and PCC to determine quantitatively the component concentrations within a mixture. The experimental data sets consist of both two and three component systems that could potentially be present as air contaminants in an aircraft cabin. In addition, experimental data sets are analyzed for a hydrogen peroxide (H2O2) aqueous solution mixture to determine H2O2 concentrations at various levels that could be produced during use of a vapor phase hydrogen peroxide (VPHP) decontamination system. After the PCA application to two and three component systems, the analysis technique is further expanded to include the monitoring of potential bleed air contaminants from engine oil combustion. Simulation data sets created from database spectra were utilized to predict gas components and concentrations in unknown engine oil samples at high temperatures as well as time-evolved gases from the heating of engine oils.
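A minimal principal component regression sketch in the spirit of the PCA/PCR approach described above is shown below, estimating a target concentration from overlapping synthetic IR bands; the band positions and concentrations are invented.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(10)

wn = np.linspace(800, 3200, 600)
band = lambda c, w: np.exp(-((wn - c) / w) ** 2)
target_shape, interferent_shape = band(1730, 40), band(1650, 120)   # overlapping bands

# Synthetic calibration set: mixtures with known target concentrations
conc = rng.uniform(0, 10, 60)
other = rng.uniform(0, 10, 60)
spectra = np.outer(conc, target_shape) + np.outer(other, interferent_shape)
spectra += rng.normal(0, 0.02, spectra.shape)

pcr = make_pipeline(PCA(n_components=4), LinearRegression())   # principal component regression
pcr.fit(spectra, conc)

unknown = 3.5 * target_shape + 6.0 * interferent_shape
print("estimated concentration:", pcr.predict(unknown[None, :])[0])
```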
A guide to understanding meta-analysis.
Israel, Heidi; Richter, Randy R
2011-07-01
With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis where indicated represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool using both published articles and explanations of components of the technique. We describe what meta-analysis is, what heterogeneity is, and how it affects meta-analysis, effect size, the modeling techniques of meta-analysis, and strengths and weaknesses of meta-analysis. Common components like forest plot interpretation, software that may be used, special cases for meta-analysis, such as subgroup analysis, individual patient data, and meta-regression, and a discussion of criticisms, are included.
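As a small worked example of the pooling and heterogeneity ideas mentioned above, the sketch below computes a fixed-effect (inverse-variance) pooled estimate, its confidence interval, and the I-squared statistic from hypothetical study-level effect sizes.

```python
import numpy as np

# Hypothetical study effect sizes (e.g., standardized mean differences) and standard errors
effects = np.array([0.42, 0.30, 0.55, 0.18, 0.36])
se = np.array([0.12, 0.15, 0.20, 0.10, 0.14])

w = 1.0 / se ** 2                              # inverse-variance weights (fixed-effect model)
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

Q = np.sum(w * (effects - pooled) ** 2)             # Cochran's Q
I2 = max(0.0, (Q - (len(effects) - 1)) / Q) * 100   # percent of variation due to heterogeneity

print(f"pooled effect = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f}), I2 = {I2:.1f}%")
```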
SaaS Platform for Time Series Data Handling
NASA Astrophysics Data System (ADS)
Oplachko, Ekaterina; Rykunov, Stanislav; Ustinin, Mikhail
2018-02-01
The paper is devoted to the description of MathBrain, a cloud-based resource, which works as a "Software as a Service" model. It is designed to maximize the efficiency of the current technology and to provide a tool for time series data handling. The resource provides access to the following analysis methods: direct and inverse Fourier transforms, Principal component analysis and Independent component analysis decompositions, quantitative analysis, magnetoencephalography inverse problem solution in a single dipole model based on multichannel spectral data.
Yi, YaXiong; Zhang, Yong; Ding, Yue; Lu, Lu; Zhang, Tong; Zhao, Yuan; Xu, XiaoJun; Zhang, YuXin
2016-11-01
J. Sep. Sci. 2016, 39, 4147-4157 DOI: 10.1002/jssc.201600284 Yinchenhao decoction (YCHD) is a famous Chinese herbal formula recorded in the Shang Han Lun, which was prescribed by Zhongjing Zhang during 150-219 AD. A novel quantitative analysis method was developed, based on ultrahigh performance liquid chromatography coupled with a diode array detector, for the simultaneous determination of 14 main active components in Yinchenhao decoction. Furthermore, the method has been applied to the compositional difference analysis of the 14 components in eight normal extraction samples of Yinchenhao decoction, with the aid of hierarchical clustering analysis and similarity analysis. The present research could help hospitals, factories and laboratories choose the best way to prepare Yinchenhao decoction with better efficacy. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Parallel line analysis: multifunctional software for the biomedical sciences
NASA Technical Reports Server (NTRS)
Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.
1990-01-01
An easy to use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
Peng, Mingguo; Li, Huajie; Li, Dongdong; Du, Erdeng; Li, Zhihong
2017-06-01
Carbon nanotubes (CNTs) were utilized to adsorb dissolved organic matter (DOM) in micro-polluted water. The characteristics of DOM adsorption on CNTs were investigated based on UV254, TOC, and fluorescence spectrum measurements. Based on PARAFAC (parallel factor) analysis, four fluorescent components were extracted, including one protein-like component (C4) and three humic acid-like components (C1, C2, and C3). The adsorption isotherms, kinetics, and thermodynamics of DOM adsorption on CNTs were further investigated. A Freundlich isotherm model fit the adsorption data well, with high correlation coefficients. As a type of macro-porous and meso-porous adsorbent, CNTs preferably adsorb humic acid-like substances rather than protein-like substances. Increasing the temperature accelerated the adsorption process. The self-organizing map (SOM) analysis further explains the fluorescent properties of the water samples. The results provide a new insight into the adsorption behaviour of DOM fluorescent components on CNTs.
Genome-wide selection components analysis in a fish with male pregnancy.
Flanagan, Sarah P; Jones, Adam G
2017-04-01
A major goal of evolutionary biology is to identify the genome-level targets of natural and sexual selection. With the advent of next-generation sequencing, whole-genome selection components analysis provides a promising avenue in the search for loci affected by selection in nature. Here, we implement a genome-wide selection components analysis in the sex-role-reversed Gulf pipefish, Syngnathus scovelli. Our approach involves a double-digest restriction-site associated DNA sequencing (ddRAD-seq) technique, applied to adult females, nonpregnant males, pregnant males, and their offspring. An FST comparison of allele frequencies among these groups reveals 47 genomic regions putatively experiencing sexual selection, as well as 468 regions showing a signature of differential viability selection between males and females. A complementary likelihood ratio test identifies similar patterns in the data as the FST analysis. Sexual selection and viability selection both tend to favor the rare alleles in the population. Ultimately, we conclude that genome-wide selection components analysis can be a useful tool to complement other approaches in the effort to pinpoint genome-level targets of selection in the wild. © 2017 The Author(s). Evolution © 2017 The Society for the Study of Evolution.
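To make the FST contrast concrete, the sketch below computes a per-locus FST between two groups from biallelic allele frequencies using the simple (H_T - H_S)/H_T form; the frequencies and the choice of estimator are illustrative assumptions, not the study's ddRAD-seq pipeline.

```python
# A minimal sketch of a per-locus F_ST contrast between two groups
# (e.g. pregnant males vs. offspring); allele frequencies are placeholders.
import numpy as np

p_group1 = np.array([0.52, 0.10, 0.81])   # hypothetical allele frequencies, 3 loci
p_group2 = np.array([0.48, 0.35, 0.79])

p_bar = (p_group1 + p_group2) / 2.0
h_s = (2 * p_group1 * (1 - p_group1) + 2 * p_group2 * (1 - p_group2)) / 2.0  # mean within-group heterozygosity
h_t = 2 * p_bar * (1 - p_bar)                                                # total heterozygosity
fst = (h_t - h_s) / h_t
print(np.round(fst, 3))   # loci with unusually high values are candidates for selection
```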
Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C
2016-02-01
Promising development and expansion of the market for cham-cham, a traditional Indian dairy product, is expected in the near future with the organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in the sensory properties of market samples of cham-cham collected from four different locations known for their excellence in cham-cham production, and to find out the attributes that govern much of the variation in the sensory scores of this product, using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed significant (p < 0.05) differences in sensory attributes of cham-cham among the market samples. PCA identified four significant principal components that accounted for 72.4 % of the variation in the sensory data. Factor scores of each of the four principal components, which primarily correspond to sweetness/shape/dryness of interior, surface appearance/surface dryness, rancid and firmness attributes, specify the location of each market sample along each of the axes in 3-D graphs. These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the attributes of cham-cham that contribute most to its sensory acceptability.
Ji, Tao; Su, Shu-Lan; Guo, Sheng; Qian, Da-Wei; Ouyang, Zhen; Duan, Jin-Ao
2016-06-01
Column chromatography was used for the enrichment and separation of flavonoids, alkaloids and polysaccharides from extracts of Morus alba leaves; the glucose oxidase method was used, with sucrose as the substrate, to evaluate the multi-components of M. alba leaves in α-glucosidase inhibitory models; the isobole method, Chou-Talalay combination index analysis and isobolographic analysis were used to evaluate the interaction effects and dose-effect characteristics of paired components, providing a scientific basis for revealing the mechanism by which M. alba leaves regulate blood sugar. The component analysis showed that the flavonoid content was 5.3%, the organic phenolic acid content was 10.8%, the DNJ content was 39.4%, and the polysaccharide content was 18.9%. Activity evaluation results demonstrated that the flavonoids, alkaloids and polysaccharides of M. alba leaves had significant inhibitory effects on α-glucosidase, and the inhibition rate increased with increasing concentration. Alkaloids showed the most significant inhibitory effects among these three components. Both the combination of alkaloids and flavonoids and the combination of alkaloids and polysaccharides demonstrated synergistic effects, whereas the combination of flavonoids and polysaccharides showed no obvious synergistic effect. The results confirm the interaction of multi-components from M. alba leaves in regulating blood sugar, and provide a scientific basis for revealing the effectiveness and mechanism of the multi-components of M. alba leaves in blood sugar regulation. Copyright© by the Chinese Pharmaceutical Association.
NASA Astrophysics Data System (ADS)
Saetchnikov, Vladimir A.; Tcherniavskaia, Elina A.; Saetchnikov, Anton V.; Schweiger, Gustav; Ostendorf, Andreas
2014-05-01
Experimental data are presented on the detection and identification of a variety of biochemical agents, such as proteins, microelements, and antibiotics of different generations, in both single- and multi-component solutions over a wide range of concentrations, analyzed via the light-scattering parameters of a whispering gallery mode optical resonance based sensor. Multiplexing over parameters and components has been realized using a developed fluidic sensor cell, with dielectric microspheres fixed in an adhesive layer, together with data processing. Biochemical components have been identified using the developed network analysis techniques. The approach is demonstrated to be applicable both to single-agent and to multi-component biochemical analysis. A novel technique based on optical resonance in microring structures, plasmon resonance, and identification tools has been developed. To improve the sensitivity of the microring structures, the adhesive-fixed microspheres were pre-treated with a gold nanoparticle solution. Another technique used thin gold film layers deposited on the substrate below the adhesive. Both biomolecule and nanoparticle injections caused considerable changes in the optical resonance spectra. Plasmonic gold layers of optimized thickness also improved the parameters of the optical resonance spectra. Biochemical component identification was also performed by the developed network analysis techniques for both single- and multi-component solutions. The advantages of plasmon-enhanced optical microcavity resonance combined with multiparameter identification tools are thus used to develop a new platform for an ultrasensitive, label-free biomedical sensor.
NASA Technical Reports Server (NTRS)
Gao, Shou-Ting; Ping, Fan; Li, Xiao-Fan; Tao, Wei-Kuo
2004-01-01
Although dry/moist potential vorticity is a useful physical quantity for meteorological analysis, it cannot be applied to the analysis of 2D simulations. A convective vorticity vector (CVV) is introduced in this study to analyze 2D cloud-resolving simulation data associated with 2D tropical convection. The cloud model is forced by the vertical velocity, zonal wind, horizontal advection, and sea surface temperature obtained from the TOGA COARE, and is integrated for a selected 10-day period. The CVV has zonal and vertical components in the 2D x-z frame. Analysis of zonally-averaged and mass-integrated quantities shows that the correlation coefficient between the vertical component of the CVV and the sum of the cloud hydrometeor mixing ratios is 0.81, whereas the correlation coefficient between the zonal component and the sum of the mixing ratios is only 0.18. This indicates that the vertical component of the CVV is closely associated with tropical convection. The tendency equation for the vertical component of the CVV is derived and the zonally-averaged and mass-integrated tendency budgets are analyzed. The tendency of the vertical component of the CVV is determined by the interaction between the vorticity and the zonal gradient of cloud heating. The results demonstrate that the vertical component of the CVV is a cloud-linked parameter and can be used to study tropical convection.
ENVIRONMENTAL ANALYSIS OF GASOLINE BLENDING COMPONENTS THROUGH THEIR LIFE CYCLE
The contributions of three major gasoline blending components (reformate, alkylate and cracked gasoline) to potential environmental impacts are assessed. This study estimates losses of the gasoline blending components due to evaporation and leaks through their life cycle, from pe...
Gürgen, Fikret; Gürgen, Nurgül
2003-01-01
This study proposes an intelligent data analysis approach to investigate and interpret the distinctive factors of diabetes mellitus patients with and without ischemic (non-embolic type) stroke in a small population. The database consists of a total of 16 features collected from 44 diabetic patients. Features include age, gender, duration of diabetes, cholesterol, high density lipoprotein, triglyceride levels, neuropathy, nephropathy, retinopathy, peripheral vascular disease, myocardial infarction rate, glucose level, medication and blood pressure. Metric and non-metric features are distinguished. First, the mean and covariance of the data are estimated and the correlated components are observed. Second, major components are extracted by principal component analysis. Finally, as common examples of local and global classification approach, a k-nearest neighbor and a high-degree polynomial classifier such as multilayer perceptron are employed for classification with all the components and major components case. Macrovascular changes emerged as the principal distinctive factors of ischemic-stroke in diabetes mellitus. Microvascular changes were generally ineffective discriminators. Recommendations were made according to the rules of evidence-based medicine. Briefly, this case study, based on a small population, supports theories of stroke in diabetes mellitus patients and also concludes that the use of intelligent data analysis improves personalized preventive intervention. PMID:12685939
Three-Dimensional Modeling of Aircraft High-Lift Components with Vehicle Sketch Pad
NASA Technical Reports Server (NTRS)
Olson, Erik D.
2016-01-01
Vehicle Sketch Pad (OpenVSP) is a parametric geometry modeler that has been used extensively for conceptual design studies of aircraft, including studies using higher-order analysis. OpenVSP can model flap and slat surfaces using simple shearing of the airfoil coordinates, which is an appropriate level of complexity for lower-order aerodynamic analysis methods. For three-dimensional analysis, however, there is not a built-in method for defining the high-lift components in OpenVSP in a realistic manner, or for controlling their complex motions in a parametric manner that is intuitive to the designer. This paper seeks instead to utilize OpenVSP's existing capabilities, and establish a set of best practices for modeling high-lift components at a level of complexity suitable for higher-order analysis methods. Techniques are described for modeling the flap and slat components as separate three-dimensional surfaces, and for controlling their motion using simple parameters defined in the local hinge-axis frame of reference. To demonstrate the methodology, an OpenVSP model for the Energy-Efficient Transport (EET) AR12 wind-tunnel model has been created, taking advantage of OpenVSP's Advanced Parameter Linking capability to translate the motions of the high-lift components from the hinge-axis coordinate system to a set of transformations in OpenVSP's frame of reference.
Principal component analysis of dynamic fluorescence images for diagnosis of diabetic vasculopathy
NASA Astrophysics Data System (ADS)
Seo, Jihye; An, Yuri; Lee, Jungsul; Ku, Taeyun; Kang, Yujung; Ahn, Chulwoo; Choi, Chulhee
2016-04-01
Indocyanine green (ICG) fluorescence imaging has been clinically used for noninvasive visualizations of vascular structures. We have previously developed a diagnostic system based on dynamic ICG fluorescence imaging for sensitive detection of vascular disorders. However, because high-dimensional raw data were used, the analysis of the ICG dynamics proved difficult. We used principal component analysis (PCA) in this study to extract important elements without significant loss of information. We examined ICG spatiotemporal profiles and identified critical features related to vascular disorders. PCA time courses of the first three components showed a distinct pattern in diabetic patients. Among the major components, the second principal component (PC2) represented arterial-like features. The explained variance of PC2 in diabetic patients was significantly lower than in normal controls. To visualize the spatial pattern of PCs, pixels were mapped with red, green, and blue channels. The PC2 score showed an inverse pattern between normal controls and diabetic patients. We propose that PC2 can be used as a representative bioimaging marker for the screening of vascular diseases. It may also be useful in simple extractions of arterial-like features.
Bremner, P D; Blacklock, C J; Paganga, G; Mullen, W; Rice-Evans, C A; Crozier, A
2000-06-01
After minimal sample preparation, two different HPLC methodologies, one based on a single gradient reversed-phase HPLC step, the other on multiple HPLC runs each optimised for specific components, were used to investigate the composition of flavonoids and phenolic acids in apple and tomato juices. The principal components in apple juice were identified as chlorogenic acid, phloridzin, caffeic acid and p-coumaric acid. Tomato juice was found to contain chlorogenic acid, caffeic acid, p-coumaric acid, naringenin and rutin. The quantitative estimates of the levels of these compounds, obtained with the two HPLC procedures, were very similar, demonstrating that either method can be used to analyse accurately the phenolic components of apple and tomato juices. Chlorogenic acid in tomato juice was the only component not fully resolved in the single run study and the multiple run analysis prior to enzyme treatment. The single run system of analysis is recommended for the initial investigation of plant phenolics and the multiple run approach for analyses where chromatographic resolution requires improvement.
[A study of Boletus bicolor from different areas using Fourier transform infrared spectrometry].
Zhou, Zai-Jin; Liu, Gang; Ren, Xian-Pei
2010-04-01
It is hard to differentiate the same species of wild-growing mushrooms from different areas by macromorphological features. In this paper, Fourier transform infrared (FTIR) spectroscopy combined with principal component analysis was used to identify 58 samples of Boletus bicolor from five different areas. Based on the fingerprint infrared spectra of the Boletus bicolor samples, principal component analysis was conducted on the 58 spectra in the range of 1350-750 cm(-1) using the statistical software SPSS 13.0. The accumulated contribution of the first three principal components accounts for 88.87%, so they include almost all the information in the samples. The two-dimensional projection plot using the first and second principal components showed a satisfactory clustering effect for the classification and discrimination of Boletus bicolor. All Boletus bicolor samples were divided into five groups with a classification accuracy of 98.3%. The study demonstrated that wild-growing Boletus bicolor from different areas can be identified at the species level by FTIR spectra combined with principal component analysis.
Beekman, Alice; Shan, Daxian; Ali, Alana; Dai, Weiguo; Ward-Smith, Stephen; Goldenberg, Merrill
2005-04-01
This study evaluated the effect of the imaginary component of the refractive index on laser diffraction particle size data for pharmaceutical samples. Excipient particles 1-5 microm in diameter (irregular morphology) were measured by laser diffraction. Optical parameters were obtained and verified based on comparison of calculated vs. actual particle volume fraction. Inappropriate imaginary components of the refractive index can lead to inaccurate results, including false peaks in the size distribution. For laser diffraction measurements, obtaining appropriate or "effective" imaginary components of the refractive index was not always straightforward. When the recommended criteria such as the concentration match and the fit of the scattering data gave similar results for very different calculated size distributions, a supplemental technique, microscopy with image analysis, was used to decide between the alternatives. Use of effective optical parameters produced a good match between laser diffraction data and microscopy/image analysis data. The imaginary component of the refractive index can have a major impact on particle size results calculated from laser diffraction data. When performed properly, laser diffraction and microscopy with image analysis can yield comparable results.
NASA Astrophysics Data System (ADS)
Naghibolhosseini, Maryam; Long, Glenis
2011-11-01
The distortion product otoacoustic emission (DPOAE) input/output (I/O) function may provide a tool for evaluating cochlear compression. Hearing loss raises the level of sound that is just audible to a person, which affects cochlear compression and thus the dynamic range of hearing. Although the slope of the I/O function is highly variable when the total DPOAE is used, separating the nonlinear-generator component from the reflection component reduces this variability. We separated the two components using least squares fit (LSF) analysis of logarithmically sweeping tones, and confirmed that the separated generator component provides more consistent I/O functions than the total DPOAE. In this paper we estimated the slope of the I/O functions of the generator components at different sound levels using LSF analysis. An artificial neural network (ANN) was used to estimate psychophysical thresholds from the estimated slopes of the I/O functions. DPOAE I/O functions determined in this way may help to estimate hearing thresholds and cochlear health.
Periodic component analysis as a spatial filter for SSVEP-based brain-computer interface.
Kiran Kumar, G R; Reddy, M Ramasubba
2018-06-08
Traditional spatial filters used for steady-state visual evoked potential (SSVEP) extraction, such as minimum energy combination (MEC), require the estimation of the background electroencephalogram (EEG) noise components. Even though this leads to improved performance in low signal-to-noise ratio (SNR) conditions, it makes such algorithms slow compared to standard detection methods like canonical correlation analysis (CCA) due to the additional computational cost. In this paper, periodic component analysis (πCA) is presented as an alternative spatial filtering approach that extracts the SSVEP component effectively without extensive modelling of the noise. πCA can separate out components corresponding to a given frequency of interest from the background EEG by capturing temporal information, and does not generalize SSVEP based on rigid templates. Data from ten test subjects were used to evaluate the proposed method, and the results demonstrate that periodic component analysis acts as a reliable spatial filter for SSVEP extraction. Statistical tests were performed to validate the results. The experimental results show that πCA provides a significant improvement in accuracy compared to standard CCA and MEC in low SNR conditions, and detection accuracy on par with that of MEC at a lower computational cost. Hence πCA is a reliable and efficient alternative detection algorithm for SSVEP-based brain-computer interfaces (BCI). Copyright © 2018. Published by Elsevier B.V.
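A common formulation of periodic component analysis solves a generalized eigenvalue problem that contrasts the covariance of the one-period-lag difference with the covariance of the raw signal; the sketch below follows that formulation on placeholder multichannel data. The sampling rate, target frequency, and random data are assumptions for illustration, not the paper's implementation or recordings.

```python
# A minimal sketch of a pi-CA style spatial filter for an SSVEP frequency,
# assuming multichannel EEG X with shape (channels, samples).
import numpy as np
from scipy.linalg import eigh

fs = 250.0                     # sampling rate in Hz (assumed)
f_target = 12.0                # stimulation frequency in Hz (assumed)
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 5000))          # placeholder multichannel recording
X -= X.mean(axis=1, keepdims=True)

tau = int(round(fs / f_target))             # lag of one stimulation period in samples
D = X[:, tau:] - X[:, :-tau]                # deviation from exact periodicity

A = D @ D.T / D.shape[1]                    # covariance of the periodicity error
B = X @ X.T / X.shape[1]                    # covariance of the raw signal

# Generalized eigenvectors; the smallest eigenvalue gives the most periodic component
eigvals, eigvecs = eigh(A, B)
w = eigvecs[:, 0]                           # spatial filter emphasizing the SSVEP
ssvep_component = w @ X
```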
Ceramic Matrix Composites for Rotorcraft Engines
NASA Technical Reports Server (NTRS)
Halbig, Michael C.
2011-01-01
Ceramic matrix composite (CMC) components are being developed for turbine engine applications. Compared to metallic components, CMC components offer the benefits of higher temperature capability and lower cooling requirements, which correlate to improved efficiency and reduced emissions. This presentation discusses a technology development effort for overcoming challenges in fabricating a CMC vane for the high pressure turbine. The areas of technology development include small component fabrication, ceramic joining and integration, material and component testing and characterization, and design and analysis of concept components.
Wachowiak, Roman; Strach, Bogna
2006-01-01
The study takes advantage of the presently available effective physicochemical methods (isolation, crystallization, determination of melting point, TLC, GLC and UV spectrophotometry) for an objective and reliable qualitative and quantitative analysis of frequently abused drugs. The authors determined the conditions for qualitative and quantitative analysis of active components of the secured evidence materials containing amphetamine sulphate, methylamphetamine hydrochloride, 3,4-methylenedioxymethamphetamine hydrochloride (MDMA, Ecstasy), as well as delta(9)-tetrahydrocannabinol (delta(9)-THC) as an active component of cannabis (marihuana, hashish). The usefulness of physicochemical tests of evidence materials for opinionating purposes is subject to a detailed forensic toxicological interpretation.
Structural response of SSME turbine blade airfoils
NASA Technical Reports Server (NTRS)
Arya, V. K.; Abdul-Aziz, A.; Thompson, R. L.
1988-01-01
Reusable space propulsion hot gas-path components are required to operate under severe thermal and mechanical loading conditions. These operating conditions produce elevated temperatures and thermal transients which result in significant thermally induced inelastic strains, particularly in the turbopump turbine blades. An inelastic analysis for this component may therefore be necessary. Anisotropic alloys such as MAR M-247 or PWA-1480 are being considered to meet the safety and durability requirements of this component. An anisotropic inelastic structural analysis of an SSME fuel turbopump turbine blade was performed. The thermal loads used resulted from a transient heat transfer analysis of a turbine blade. A comparison of preliminary results from the elastic and inelastic analyses is presented.
Development of STS/Centaur failure probabilities liftoff to Centaur separation
NASA Technical Reports Server (NTRS)
Hudson, J. M.
1982-01-01
The results of an analysis to determine STS/Centaur catastrophic vehicle response probabilities for the phases of vehicle flight from STS liftoff to Centaur separation from the Orbiter are presented. The analysis considers only category one component failure modes as contributors to the vehicle response mode probabilities. The relevant component failure modes are grouped into one of fourteen categories of potential vehicle behavior. By assigning failure rates to each component, for each of its failure modes, the STS/Centaur vehicle response probabilities in each phase of flight can be calculated. The results of this study will be used in a DOE analysis to ascertain the hazard from carrying a nuclear payload on the STS.
Method and apparatus for ceramic analysis
Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr
2003-04-01
The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides analyzing density of a ceramic comprising exciting a component on a surface/subsurface of the ceramic by exposing the material to excitation energy. The method may further include the step of obtaining a measurement of an emitted energy from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.
Structural reliability analysis of laminated CMC components
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.; Palko, Joseph L.; Gyekenyesi, John P.
1991-01-01
For laminated ceramic matrix composite (CMC) materials to realize their full potential in aerospace applications, design methods and protocols are a necessity. The focus here is on the time-independent failure response of these materials, and a reliability analysis associated with the initiation of matrix cracking is presented. A public domain computer algorithm is highlighted that was coupled with the laminate analysis of a finite element code and which serves as a design aid for analyzing structural components made from laminated CMC materials. Issues relevant to the effect of component size are discussed, and a parameter estimation procedure is presented. The estimation procedure allows three parameters to be calculated from a failure population that has an underlying Weibull distribution.
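As a sketch of the kind of Weibull parameter estimation mentioned above, the snippet below fits a three-parameter Weibull distribution (shape, threshold, scale) to a synthetic failure population by maximum likelihood and evaluates a survival probability. The data, units, and use of scipy's generic fitter are assumptions, not the public-domain algorithm described in the abstract.

```python
# A minimal sketch of Weibull parameter estimation from a failure population.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
failure_strength = stats.weibull_min.rvs(c=8.0, scale=300.0, size=30, random_state=rng)  # MPa, synthetic

# Three-parameter fit (shape, threshold/location, scale) by maximum likelihood
shape, loc, scale = stats.weibull_min.fit(failure_strength)
print(f"Weibull modulus (shape) = {shape:.2f}, threshold = {loc:.1f}, scale = {scale:.1f}")

# Probability of survival at a given applied stress under the fitted distribution
stress = 250.0
reliability = stats.weibull_min.sf(stress, shape, loc, scale)
print(f"probability of survival at {stress} MPa: {reliability:.3f}")
```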
Carty, Cara L; Bhattacharjee, Samsiddhi; Haessler, Jeff; Cheng, Iona; Hindorff, Lucia A; Aroda, Vanita; Carlson, Christopher S; Hsu, Chun-Nan; Wilkens, Lynne; Liu, Simin; Selvin, Elizabeth; Jackson, Rebecca; North, Kari E; Peters, Ulrike; Pankow, James S; Chatterjee, Nilanjan; Kooperberg, Charles
2014-08-01
Metabolic syndrome (MetS) refers to the clustering of cardiometabolic risk factors, including dyslipidemia, central adiposity, hypertension, and hyperglycemia, in individuals. Identification of pleiotropic genetic factors associated with MetS traits may shed light on key pathways or mediators underlying MetS. Using the Metabochip array in 15,148 African Americans from the Population Architecture using Genomics and Epidemiology (PAGE) study, we identify susceptibility loci and investigate pleiotropy among genetic variants using a subset-based meta-analysis method, ASsociation-analysis-based-on-subSETs (ASSET). Unlike conventional models that lack power when associations for MetS components are null or have opposite effects, ASSET uses 1-sided tests to detect positive and negative associations for components separately and combines the tests accounting for correlations among components. With ASSET, we identify 27 single nucleotide polymorphisms in 1 glucose and 4 lipid loci (TCF7L2, LPL, APOA5, CETP, and APOC1/APOE/TOMM40) significantly associated with MetS components overall, all P<2.5e-7, the Bonferroni-adjusted P value. Three loci replicate in a Hispanic population, n=5172. A novel African American-specific variant, rs12721054/APOC1, and rs10096633/LPL are associated with ≥3 MetS components. We find additional evidence of pleiotropy for APOE, TOMM40, TCF7L2, and CETP variants, many with opposing effects (eg, the same rs7901695/TCF7L2 allele is associated with increased odds of high glucose and decreased odds of central adiposity). We highlight a method to increase power in large-scale genomic association analyses and report a novel variant associated with all MetS components in African Americans. We also identify pleiotropic associations that may be clinically useful in patient risk profiling and for informing translational research of potential gene targets and medications. © 2014 American Heart Association, Inc.
The Distressed Brain: A Group Blind Source Separation Analysis on Tinnitus
De Ridder, Dirk; Vanneste, Sven; Congedo, Marco
2011-01-01
Background Tinnitus, the perception of a sound without an external sound source, can lead to variable amounts of distress. Methodology In a group of tinnitus patients with variable amounts of tinnitus related distress, as measured by the Tinnitus Questionnaire (TQ), an electroencephalography (EEG) is performed, evaluating the patients' resting state electrical brain activity. This resting state electrical activity is compared with a control group and between patients with low (N = 30) and high distress (N = 25). The groups are homogeneous for tinnitus type, tinnitus duration or tinnitus laterality. A group blind source separation (BSS) analysis is performed using a large normative sample (N = 84), generating seven normative components to which high and low tinnitus patients are compared. A correlation analysis of the obtained normative components' relative power and distress is performed. Furthermore, the functional connectivity as reflected by lagged phase synchronization is analyzed between the brain areas defined by the components. Finally, a group BSS analysis on the Tinnitus group as a whole is performed. Conclusions Tinnitus can be characterized by at least four BSS components, two of which are posterior cingulate based, one based on the subgenual anterior cingulate and one based on the parahippocampus. Only the subgenual component correlates with distress. When performed on a normative sample, group BSS reveals that distress is characterized by two anterior cingulate based components. Spectral analysis of these components demonstrates that distress in tinnitus is related to alpha and beta changes in a network consisting of the subgenual anterior cingulate cortex extending to the pregenual and dorsal anterior cingulate cortex as well as the ventromedial prefrontal cortex/orbitofrontal cortex, insula, and parahippocampus. This network overlaps partially with brain areas implicated in distress in patients suffering from pain, functional somatic syndromes and posttraumatic stress disorder, and might therefore represent a specific distress network. PMID:21998628
Analysis system for characterisation of simple, low-cost microfluidic components
NASA Astrophysics Data System (ADS)
Smith, Suzanne; Naidoo, Thegaran; Nxumalo, Zandile; Land, Kevin; Davies, Emlyn; Fourie, Louis; Marais, Philip; Roux, Pieter
2014-06-01
There is an inherent trade-off between cost and operational integrity of microfluidic components, especially when intended for use in point-of-care devices. We present an analysis system developed to characterise microfluidic components for performing blood cell counting, enabling the balance between function and cost to be established quantitatively. Microfluidic components for sample and reagent introduction, mixing and dispensing of fluids were investigated. A simple inlet port plugging mechanism is used to introduce and dispense a sample of blood, while a reagent is released into the microfluidic system through compression and bursting of a blister pack. Mixing and dispensing of the sample and reagent are facilitated via air actuation. For these microfluidic components to be implemented successfully, a number of aspects need to be characterised for development of an integrated point-of-care device design. The functional components were measured using a microfluidic component analysis system established in-house. Experiments were carried out to determine: 1. the force and speed requirements for sample inlet port plugging and blister pack compression and release using two linear actuators and load cells for plugging the inlet port, compressing the blister pack, and subsequently measuring the resulting forces exerted, 2. the accuracy and repeatability of total volumes of sample and reagent dispensed, and 3. the degree of mixing and dispensing uniformity of the sample and reagent for cell counting analysis. A programmable syringe pump was used for air actuation to facilitate mixing and dispensing of the sample and reagent. Two high speed cameras formed part of the analysis system and allowed for visualisation of the fluidic operations within the microfluidic device. Additional quantitative measures such as microscopy were also used to assess mixing and dilution accuracy, as well as uniformity of fluid dispensing - all of which are important requirements towards the successful implementation of a blood cell counting system.
Research on distributed heterogeneous data PCA algorithm based on cloud platform
NASA Astrophysics Data System (ADS)
Zhang, Jin; Huang, Gang
2018-05-01
Principal component analysis (PCA) of distributed heterogeneous data sets can address the limited scalability of centralized analysis. In order to reduce the generation of intermediate data and error components for distributed heterogeneous data sets, a principal component analysis algorithm for heterogeneous data sets on a cloud platform is proposed. The algorithm performs the eigenvalue computation using Householder tridiagonalization and QR factorization, and calculates the error component of the heterogeneous database associated with the public key to obtain the intermediate data set and the lost information. Experiments on distributed DBM heterogeneous datasets show that the method is feasible and reliable in terms of execution time and accuracy.
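The eigenvalue step named above (Householder tridiagonalization followed by QR iteration) can be illustrated on a single machine; the sketch below applies it to the covariance matrix of a random placeholder data set and compares the result with a standard symmetric eigensolver. The data, matrix size, and unshifted QR iteration are simplifying assumptions, not the distributed cloud implementation.

```python
# A minimal sketch of eigenvalue computation for PCA via tridiagonalization + QR.
import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(0)
data = rng.standard_normal((200, 6))               # placeholder data set
cov = np.cov(data, rowvar=False)

# For a symmetric matrix, the Householder-based Hessenberg reduction yields a
# tridiagonal matrix with the same eigenvalues as the covariance matrix.
T = hessenberg(cov)

# Unshifted QR iteration drives the tridiagonal matrix toward diagonal form;
# production codes use shifts for faster convergence.
A = T.copy()
for _ in range(1000):
    q, r = np.linalg.qr(A)
    A = r @ q
eigvals_qr = np.sort(np.diag(A))[::-1]

# Reference: eigenvalues of the covariance matrix give the principal component variances.
eigvals_ref = np.sort(np.linalg.eigvalsh(cov))[::-1]
print(np.round(eigvals_qr, 4))
print(np.round(eigvals_ref, 4))   # the two sets should agree closely
```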
The Influence Function of Principal Component Analysis by Self-Organizing Rule.
Higuchi; Eguchi
1998-07-28
This article is concerned with a neural network approach to principal component analysis (PCA). An algorithm for PCA by the self-organizing rule has been proposed and its robustness observed through the simulation study by Xu and Yuille (1995). In this article, the robustness of the algorithm against outliers is investigated by using the theory of influence function. The influence function of the principal component vector is given in an explicit form. Through this expression, the method is shown to be robust against any directions orthogonal to the principal component vector. In addition, a statistic generated by the self-organizing rule is proposed to assess the influence of data in PCA.
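The self-organizing rule analyzed above can be illustrated with an Oja-style online update, which converges to the leading principal component direction. The learning rate, synthetic data, and the use of Oja's rule rather than the exact Xu-Yuille update are assumptions for illustration.

```python
# A minimal sketch of extracting the first principal component with an
# Oja-style self-organizing (Hebbian) rule.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data with a dominant direction along the first axis
data = rng.standard_normal((5000, 2)) @ np.array([[3.0, 0.0], [0.0, 1.0]])
data -= data.mean(axis=0)

w = rng.standard_normal(2)
w /= np.linalg.norm(w)
eta = 1e-3                                  # learning rate

for x in data:
    y = w @ x                               # projection onto the current estimate
    w += eta * y * (x - y * w)              # Oja's rule: Hebbian term with decay

# Compare with the leading eigenvector of the sample covariance
eigvals, eigvecs = np.linalg.eigh(np.cov(data, rowvar=False))
leading = eigvecs[:, -1]
print(abs(w @ leading) / np.linalg.norm(w))  # close to 1 when the directions agree
```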
The Influence Of Component Alignment On The Life Of Total Knee Prostheses
NASA Astrophysics Data System (ADS)
Bugariu, Delia; Bereteu, Liviu
2012-12-01
An arthritic knee affects the patient's life by causing pain and limiting movement. If the cartilage and the bone surfaces are severely affected, the natural joint is replaced with an artificial joint. The procedure is called total knee arthroplasty (TKA). Lately, the number of implanted total knee prostheses has grown steadily. An important factor in TKA is the perfect alignment of the total knee prosthesis (TKP) components. Component misalignment can lead to loss of the prosthesis by producing wear particles. The paper proposes a study of the mechanical behavior of a TKP based on numerical analysis using ANSYS software. The numerical analysis considers both the normal and an altered component alignment angle.
Grizzly Usage and Theory Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, B. W.; Backman, M.; Chakraborty, P.
2016-03-01
Grizzly is a multiphysics simulation code for characterizing the behavior of nuclear power plant (NPP) structures, systems and components (SSCs) subjected to a variety of age-related degradation mechanisms. Grizzly simulates both the progression of aging processes and the capacity of aged components to perform safely. This initial beta release of Grizzly includes capabilities for engineering-scale thermo-mechanical analysis of reactor pressure vessels (RPVs). Grizzly will ultimately include capabilities for a wide range of components and materials. Grizzly is in a state of constant development, and future releases will broaden the capabilities of this code for RPV analysis, as well as expand it to address degradation in other critical NPP components.
Analysis and Evaluation of the Characteristic Taste Components in Portobello Mushroom.
Wang, Jinbin; Li, Wen; Li, Zhengpeng; Wu, Wenhui; Tang, Xueming
2018-05-10
To identify the characteristic taste components of the common cultivated brown (Portobello) mushroom, Agaricus bisporus, taste components in the stipe and pileus of Portobello mushrooms harvested at different growth stages were extracted and identified, and principal component analysis (PCA) and the taste active value (TAV) were used to reveal the characteristic taste components during each of the growth stages. In the stipe and pileus, 20 and 14 different principal taste components were identified, respectively; these were considered the principal taste components of Portobello mushroom fruit bodies and included most amino acids and 5'-nucleotides. Some taste components found at high levels, such as lactic acid and citric acid, were not identified as principal taste components through PCA. However, due to their high content, Portobello mushroom could be used as a source of organic acids. The PCA and TAV results revealed that 5'-GMP, glutamic acid, malic acid, alanine, proline, leucine, and aspartic acid were the characteristic taste components of Portobello mushroom fruit bodies. Portobello mushroom was also found to be rich in protein and amino acids, so it might also be useful in the formulation of nutraceuticals and functional foods. The results in this article could provide a theoretical basis for understanding and regulating the synthesis of the characteristic flavor components of Portobello mushroom. © 2018 Institute of Food Technologists®.
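The taste active value used above is simply a component's concentration divided by its taste threshold, with TAV > 1 taken as taste-active; the sketch below applies that rule to hypothetical concentrations and thresholds, which are placeholders rather than the measured values from the study.

```python
# A minimal sketch of taste active value (TAV) screening: TAV = concentration / threshold.
concentrations = {          # mg per 100 g, hypothetical
    "glutamic acid": 120.0,
    "alanine": 45.0,
    "5'-GMP": 30.0,
    "lactic acid": 200.0,
}
thresholds = {              # taste thresholds in the same units, hypothetical
    "glutamic acid": 30.0,
    "alanine": 60.0,
    "5'-GMP": 12.5,
    "lactic acid": 126.0,
}

for name, conc in concentrations.items():
    tav = conc / thresholds[name]
    flag = "taste-active" if tav > 1 else "below threshold"
    print(f"{name:15s} TAV = {tav:4.1f}  ({flag})")
```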
Wang, Jinxu; Tong, Xin; Li, Peibo; Liu, Menghua; Peng, Wei; Cao, Hui; Su, Weiwei
2014-08-08
Shenqi Fuzheng Injection (SFI) is an injectable traditional Chinese herbal formula comprised of two Chinese herbs, Radix codonopsis and Radix astragali, which have been commonly used in China and other East Asian countries for thousands of years to improve immune function against chronic diseases in an integrative and holistic way. The present study was designed to explore the bioactive components responsible for the immuno-enhancement effects of SFI, using relevance analysis between chemical fingerprints and biological effects in vivo. According to a four-factor, nine-level uniform design, SFI samples were prepared with different proportions of the four portions separated from SFI via high speed counter current chromatography (HSCCC). SFI samples were assessed with high performance liquid chromatography (HPLC) for 23 identified components. For the immunosuppressed murine experiments, biological effects in vivo were evaluated on spleen index (E1), peripheral white blood cell counts (E2), bone marrow cell counts (E3), splenic lymphocyte proliferation (E4), splenic natural killer cell activity (E5), peritoneal macrophage phagocytosis (E6) and the amount of interleukin-2 (E7). Based on the hypothesis that biological effects in vivo vary with differences in components, multivariate relevance analyses, including gray relational analysis (GRA), multi-linear regression analysis (MLRA) and principal component analysis (PCA), were performed to evaluate the contribution of each identified component. The results indicated that the bioactive components of SFI for immuno-enhancement activity were calycosin-7-O-β-d-glucopyranoside (P9), isomucronulatol-7,2'-di-O-glucoside (P11), biochanin-7-glucoside (P12), 9,10-dimethoxypterocarpan-3-O-xylosylglucoside (P15) and astragaloside IV (P20), which might have positive effects on spleen index (E1), splenic lymphocyte proliferation (E4), splenic natural killer cell activity (E5), peritoneal macrophage phagocytosis (E6) and the amount of interleukin-2 (E7), while 5-hydroxymethyl-furaldehyde (P5) and lobetyolin (P13) might have negative effects on E1, E4, E5, E6 and E7. Finally, a bioactive HPLC fingerprint of SFI based on its immuno-enhancing components was established for quality control of SFI. In summary, this study provides a perspective for exploring the bioactive components of a traditional Chinese herbal formula through a series of HPLC and animal experiments, which should be helpful for improving quality control and inspiring further clinical studies of traditional Chinese medicines. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Maneshi, Mona; Vahdat, Shahabeddin; Gotman, Jean; Grova, Christophe
2016-01-01
Independent component analysis (ICA) has been widely used to study functional magnetic resonance imaging (fMRI) connectivity. However, the application of ICA in multi-group designs is not straightforward. We have recently developed a new method named “shared and specific independent component analysis” (SSICA) to perform between-group comparisons in the ICA framework. SSICA is sensitive to extract those components which represent a significant difference in functional connectivity between groups or conditions, i.e., components that could be considered “specific” for a group or condition. Here, we investigated the performance of SSICA on realistic simulations, and task fMRI data and compared the results with one of the state-of-the-art group ICA approaches to infer between-group differences. We examined SSICA robustness with respect to the number of allowable extracted specific components and between-group orthogonality assumptions. Furthermore, we proposed a modified formulation of the back-reconstruction method to generate group-level t-statistics maps based on SSICA results. We also evaluated the consistency and specificity of the extracted specific components by SSICA. The results on realistic simulated and real fMRI data showed that SSICA outperforms the regular group ICA approach in terms of reconstruction and classification performance. We demonstrated that SSICA is a powerful data-driven approach to detect patterns of differences in functional connectivity across groups/conditions, particularly in model-free designs such as resting-state fMRI. Our findings in task fMRI show that SSICA confirms results of the general linear model (GLM) analysis and when combined with clustering analysis, it complements GLM findings by providing additional information regarding the reliability and specificity of networks. PMID:27729843
ERIC Educational Resources Information Center
Ackermann, Margot Elise; Morrow, Jennifer Ann
2008-01-01
The present study describes the development and initial validation of the Coping with the College Environment Scale (CWCES). Participants included 433 college students who took an online survey. Principal Components Analysis (PCA) revealed six coping strategies: planning and self-management, seeking support from institutional resources, escaping…
NASA Astrophysics Data System (ADS)
Kistenev, Yu. V.; Shapovalov, A. V.; Borisov, A. V.; Vrazhnov, D. A.; Nikolaev, V. V.; Nikiforova, O. Yu.
2015-11-01
Comparison results are presented for different mother wavelets used to de-noise model and experimental data represented by profiles of absorption spectra of exhaled air. The impact of wavelet de-noising on the quality of classification by principal component analysis is also discussed.
Evaluation of skin melanoma in spectral range 450-950 nm using principal component analysis
NASA Astrophysics Data System (ADS)
Jakovels, D.; Lihacova, I.; Kuzmina, I.; Spigulis, J.
2013-06-01
Diagnostic potential of principal component analysis (PCA) of multi-spectral imaging data in the wavelength range 450-950 nm for distant skin melanoma recognition is discussed. Processing of the measured clinical data by means of PCA resulted in clear separation between malignant melanomas and pigmented nevi.
ERIC Educational Resources Information Center
Wilson, Mark V.; Wilson, Erin
2017-01-01
In this work we describe an authentic performance project for Instrumental Analysis in which students designed, built, and tested spectrophotometers made from simple components. The project addressed basic course content such as instrument design principles, UV-vis spectroscopy, and spectroscopic instrument components as well as skills such as…
ERIC Educational Resources Information Center
Linting, Marielle; Meulman, Jacqueline J.; Groenen, Patrick J. F.; van der Kooij, Anita J.
2007-01-01
Principal components analysis (PCA) is used to explore the structure of data sets containing linearly related numeric variables. Alternatively, nonlinear PCA can handle possibly nonlinearly related numeric as well as nonnumeric variables. For linear PCA, the stability of its solution can be established under the assumption of multivariate…
Principal Component Analysis: Resources for an Essential Application of Linear Algebra
ERIC Educational Resources Information Center
Pankavich, Stephen; Swanson, Rebecca
2015-01-01
Principal Component Analysis (PCA) is a highly useful topic within an introductory Linear Algebra course, especially since it can be used to incorporate a number of applied projects. This method represents an essential application and extension of the Spectral Theorem and is commonly used within a variety of fields, including statistics,…
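As a small companion to the point above, the sketch below carries out PCA exactly as an application of the Spectral Theorem: the symmetric sample covariance is diagonalized by an orthonormal eigenbasis, and the centered data are re-expressed in that basis. The synthetic data are placeholders chosen only to make the example runnable.

```python
# A minimal sketch of PCA via the Spectral Theorem on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3)) @ np.array([[2.0, 0.3, 0.0],
                                              [0.3, 1.0, 0.0],
                                              [0.0, 0.0, 0.2]])
Xc = X - X.mean(axis=0)                      # center the variables

C = Xc.T @ Xc / (len(Xc) - 1)                # sample covariance (symmetric)
eigvals, eigvecs = np.linalg.eigh(C)         # Spectral Theorem: C = V diag(lambda) V^T
order = np.argsort(eigvals)[::-1]            # sort by explained variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = Xc @ eigvecs                        # principal component scores
explained = eigvals / eigvals.sum()
print(np.round(explained, 3))                # proportion of variance per component
```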
Energy efficient engine. Volume 1: Component development and integration program
NASA Technical Reports Server (NTRS)
1981-01-01
Technology for achieving lower installed fuel consumption and lower operating costs in future commercial turbofan engines are developed, evaluated, and demonstrated. The four program objectives are: (1) propulsion system analysis; (2) component analysis, design, and development; (3) core design, fabrication, and test; and (4) integrated core/low spoon design, fabrication, and test.
Learning Principal Component Analysis by Using Data from Air Quality Networks
ERIC Educational Resources Information Center
Perez-Arribas, Luis Vicente; Leon-González, María Eugenia; Rosales-Conrado, Noelia
2017-01-01
With the final objective of using computational and chemometrics tools in the chemistry studies, this paper shows the methodology and interpretation of the Principal Component Analysis (PCA) using pollution data from different cities. This paper describes how students can obtain data on air quality and process such data for additional information…
Missing data is a common problem in the application of statistical techniques. In principal component analysis (PCA), a technique for dimensionality reduction, incomplete data points are either discarded or imputed using interpolation methods. Such approaches are less valid when ...
NASA Technical Reports Server (NTRS)
Duong, T. A.
2004-01-01
In this paper, we present a new, simple, and optimized hardware architecture sequential learning technique for adaptive Principle Component Analysis (PCA) which will help optimize the hardware implementation in VLSI and to overcome the difficulties of the traditional gradient descent in learning convergence and hardware implementation.
Applications of Nonlinear Principal Components Analysis to Behavioral Data.
ERIC Educational Resources Information Center
Hicks, Marilyn Maginley
1981-01-01
An empirical investigation of the statistical procedure entitled nonlinear principal components analysis was conducted on a known equation and on measurement data in order to demonstrate the procedure and examine its potential usefulness. This method was suggested by R. Gnanadesikan and based on an early paper of Karl Pearson. (Author/AL)
10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.
Code of Federal Regulations, 2010 CFR
2010-01-01
... analysis of the structures, systems, and components of the reactor to be manufactured, with emphasis upon... assumed for this evaluation should be based upon a major accident, hypothesized for purposes of site... structures, systems, and components with the objective of assessing the risk to public health and safety...
40 CFR 90.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2011 CFR
2011-07-01
... the values recorded. The number of events that may occur between the pre- and post-checks is not.... (9) Neither the zero drift nor the span drift between the pre-analysis and post-analysis checks on... Gaseous Exhaust Test Procedures § 90.413 Exhaust sample procedure—gaseous components. (a) Automatic data...
ERIC Educational Resources Information Center
Hendrix, Dean
2010-01-01
This study analyzed 2005-2006 Web of Science bibliometric data from institutions belonging to the Association of Research Libraries (ARL) and corresponding ARL statistics to find any associations between indicators from the two data sets. Principal components analysis on 36 variables from 103 universities revealed obvious associations between…
Chemical information obtained from Auger depth profiles by means of advanced factor analysis (MLCFA)
NASA Astrophysics Data System (ADS)
De Volder, P.; Hoogewijs, R.; De Gryse, R.; Fiermans, L.; Vennik, J.
1993-01-01
The advanced multivariate statistical technique "maximum likelihood common factor analysis (MLCFA)" is shown to be superior to "principal component analysis (PCA)" for decomposing overlapping peaks into their individual component spectra of which neither the number of components nor the peak shape of the component spectra is known. An examination of the maximum resolving power of both techniques, MLCFA and PCA, by means of artificially created series of multicomponent spectra confirms this finding unambiguously. Substantial progress in the use of AES as a chemical-analysis technique is accomplished through the implementation of MLCFA. Chemical information from Auger depth profiles is extracted by investigating the variation of the line shape of the Auger signal as a function of the changing chemical state of the element. In particular, MLCFA combined with Auger depth profiling has been applied to problems related to steelcord-rubber tyre adhesion. MLCFA allows one to elucidate the precise nature of the interfacial layer of reaction products between natural rubber vulcanized on a thin brass layer. This study reveals many interesting chemical aspects of the oxi-sulfidation of brass undetectable with classical AES.
Discriminant analysis of resting-state functional connectivity patterns on the Grassmann manifold
NASA Astrophysics Data System (ADS)
Fan, Yong; Liu, Yong; Jiang, Tianzi; Liu, Zhening; Hao, Yihui; Liu, Haihong
2010-03-01
The functional networks extracted from fMRI images using independent component analysis have been demonstrated to be informative for distinguishing brain states of cognitive functions and neurological diseases. In this paper, we propose a novel algorithm for discriminant analysis of functional networks encoded by spatial independent components. The functional networks of each individual are used as bases for a linear subspace, referred to as a functional connectivity pattern, which facilitates a comprehensive characterization of the temporal signals of fMRI data. The functional connectivity patterns of different individuals are analyzed on the Grassmann manifold by adopting a principal angle based subspace distance. In conjunction with a support vector machine classifier, a forward component selection technique is proposed to select independent components for constructing the most discriminative functional connectivity pattern. The discriminant analysis method has been applied to an fMRI based schizophrenia study with 31 schizophrenia patients and 31 healthy individuals. The experimental results demonstrate that the proposed method not only achieves a promising classification performance for distinguishing schizophrenia patients from healthy controls, but also identifies discriminative functional networks that are informative for schizophrenia diagnosis.
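The principal angle based subspace distance mentioned above can be computed directly from orthonormal bases of two subspaces; the sketch below does this for random placeholder bases standing in for two subjects' functional connectivity patterns. The dimensions, the random bases, and the choice of the angle-vector norm as the Grassmann distance are illustrative assumptions.

```python
# A minimal sketch of a principal-angle-based distance between two subspaces.
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(0)

def random_basis(n_voxels, n_components):
    # Orthonormal basis spanning a random subspace (placeholder for ICA maps)
    q, _ = np.linalg.qr(rng.standard_normal((n_voxels, n_components)))
    return q

subject_a = random_basis(500, 5)
subject_b = random_basis(500, 5)

theta = subspace_angles(subject_a, subject_b)        # principal angles in radians
grassmann_distance = np.linalg.norm(theta)           # one common choice of distance
print(f"principal angles (deg): {np.degrees(theta).round(1)}")
print(f"Grassmann distance: {grassmann_distance:.3f}")
```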
Simplified Phased-Mission System Analysis for Systems with Independent Component Repairs
NASA Technical Reports Server (NTRS)
Somani, Arun K.
1996-01-01
Accurate reliability analysis of a system requires accounting for all major variations in the system's operation. Most reliability analyses assume that the system configuration, success criteria, and component behavior remain the same throughout a mission. However, multiple phases are natural. We present a new computationally efficient technique for the analysis of phased-mission systems where the operational states of a system can be described by combinations of component states (such as fault trees or assertions). Moreover, individual components may be repaired, if failed, as part of system operation, but repairs are independent of the system state. For repairable systems, Markov analysis techniques are commonly used, but they suffer from state-space explosion, which limits the size of system that can be analyzed and is computationally expensive. We avoid the state-space explosion. Phase algebra is used to account for the effects of variable configurations, repairs, and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. We demonstrate our technique by means of several examples and present numerical results to show the effects of phases and repairs on system reliability/availability.
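To see why per-phase reliabilities cannot simply be multiplied, the Monte Carlo sketch below follows two components with exponential lifetimes through a two-phase mission whose success criterion changes between phases (1-of-2, then 2-of-2). The failure rates, phase durations, and the absence of repair are simplifying assumptions, not the paper's phase-algebra method.

```python
# A minimal Monte Carlo sketch showing that component states carry over between phases.
import numpy as np

rng = np.random.default_rng(0)
lam = np.array([1e-3, 2e-3])          # failure rates per hour (assumed)
t1, t2 = 100.0, 200.0                 # phase durations in hours (assumed)
n = 200_000

lifetimes = rng.exponential(1.0 / lam, size=(n, 2))
phase1_ok = (lifetimes > t1).any(axis=1)          # phase 1: 1-of-2 success criterion
phase2_ok = (lifetimes > t1 + t2).all(axis=1)     # phase 2: series (2-of-2) criterion
mission_reliability = np.mean(phase1_ok & phase2_ok)

# Naive product of independently computed per-phase reliabilities (not valid in general)
r1 = 1.0 - (1.0 - np.exp(-lam[0] * t1)) * (1.0 - np.exp(-lam[1] * t1))
r2 = np.exp(-(lam[0] + lam[1]) * t2)
print(f"Monte Carlo mission reliability:      {mission_reliability:.4f}")
print(f"Naive product of phase reliabilities: {r1 * r2:.4f}")
```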
Lü, Gui-Cai; Zhao, Wei-Hong; Wang, Jiang-Tao
2011-01-01
Identification techniques for 10 species of red tide algae often found in the coastal areas of China were developed by combining the three-dimensional fluorescence spectra of fluorescent dissolved organic matter (FDOM) from cultured red tide algae with principal component analysis. Based on the results of the principal component analysis, the first principal component loading spectrum of the three-dimensional fluorescence spectrum was chosen as the identification characteristic spectrum for red tide algae, and the phytoplankton fluorescence characteristic spectrum band was established. The 10 algae species were then tested using Bayesian discriminant analysis, with a correct identification rate of more than 92% at the species level for Pyrrophyta, and more than 75% at the genus level for Bacillariophyta, within which the correct identification rates exceeded 90% for Phaeodactylum and Chaetoceros. The results showed that the identification techniques for the 10 species of red tide algae, based on the three-dimensional fluorescence spectra of FDOM from cultured red tide algae and principal component analysis, work well.
[New method of mixed gas infrared spectrum analysis based on SVM].
Bai, Peng; Xie, Wen-Jun; Liu, Jun-Hua
2007-07-01
A new method of infrared spectrum analysis based on support vector machines (SVM) was proposed for mixed gases. The kernel function in SVM was used to map the seriously overlapping absorption spectra into a high-dimensional space; after this transformation, the high-dimensional data could be processed in the original space, so a regression calibration model was established and then applied to analyze the concentration of each component gas. It was also shown that the SVM regression calibration model could be used for component recognition of the gas mixture. The method was applied to the analysis of different data samples. Factors that affect the model, such as the scan interval, wavelength range, kernel function, and penalty coefficient C, are discussed. Experimental results show that the maximal mean absolute error of the component concentrations is 0.132%, and the component recognition accuracy is higher than 94%. The problems of overlapping absorption spectra, using the same method for qualitative and quantitative analysis, and a limited number of training samples were addressed. The method could be used in other mixed-gas infrared spectrum analyses, and has promising theoretical and application value.
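The regression calibration idea can be sketched with a kernel support vector regressor trained on simulated overlapping absorption spectra of a two-gas mixture; the band shapes, noise level, and SVR hyperparameters below are assumptions for illustration, not the paper's spectra or settings.

```python
# A minimal sketch of SVM regression calibration on synthetic overlapping spectra.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
wavenumber = np.linspace(0, 1, 200)
band_a = np.exp(-((wavenumber - 0.45) ** 2) / 0.005)   # overlapping absorption bands
band_b = np.exp(-((wavenumber - 0.55) ** 2) / 0.005)

n = 150
conc_a = rng.uniform(0.0, 1.0, n)
conc_b = rng.uniform(0.0, 1.0, n)
spectra = np.outer(conc_a, band_a) + np.outer(conc_b, band_b)
spectra += 0.01 * rng.standard_normal(spectra.shape)   # measurement noise

# Kernel SVR maps the overlapping spectra into a high-dimensional feature space
model = SVR(kernel="rbf", C=10.0, epsilon=0.01)        # penalty coefficient C as in the abstract
model.fit(spectra[:100], conc_a[:100])
pred = model.predict(spectra[100:])
print(f"mean absolute error for component A: {np.mean(np.abs(pred - conc_a[100:])):.4f}")
```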
González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio
2015-03-01
A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
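A minimal sketch, not the published method: it combines PCA (to estimate the number of components), FastICA (to unmix the measured spectra), and a correlation score against a hypothetical reference library, which loosely plays the role of the reliability factor mentioned above.

```python
# Minimal sketch under stated assumptions: PCA for a crude component-count estimate,
# FastICA for unmixing, and correlation against a hypothetical spectral library.
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(2)
library = {"pigment_A": rng.random(600), "pigment_B": rng.random(600)}   # hypothetical references
mixing = rng.random((10, 2))
X = mixing @ np.vstack(list(library.values()))      # 10 mixture spectra, 600 Raman shifts

n = np.sum(PCA().fit(X).explained_variance_ratio_ > 0.01)   # crude estimate of component count
sources = FastICA(n_components=int(n), random_state=0).fit_transform(X.T).T

for s in sources:
    best = max(library, key=lambda k: abs(np.corrcoef(s, library[k])[0, 1]))
    score = abs(np.corrcoef(s, library[best])[0, 1])         # correlation as a confidence score
    print(best, round(float(score), 3))
```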
Guo, Zhiqiang; Wang, Huaiqing; Yang, Jie; Miller, David J
2015-01-01
In this paper, we propose and implement a hybrid model combining two-directional two-dimensional principal component analysis ((2D)2PCA) and a Radial Basis Function Neural Network (RBFNN) to forecast stock market behavior. First, 36 stock market technical variables are selected as the input features, and a sliding window is used to obtain the input data of the model. Next, (2D)2PCA is utilized to reduce the dimension of the data and extract its intrinsic features. Finally, an RBFNN accepts the data processed by (2D)2PCA to forecast the next day's stock price or movement. The proposed model is used on the Shanghai stock market index, and the experiments show that the model achieves a good level of fitness. The proposed model is then compared with one that uses the traditional dimension reduction method principal component analysis (PCA) and independent component analysis (ICA). The empirical results show that the proposed model outperforms the PCA-based model, as well as alternative models based on ICA and on the multilayer perceptron.
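An illustrative numpy sketch of the (2D)2PCA step on toy data (assumed shapes and component counts, not the paper's dataset); the RBF network forecasting stage is omitted.

```python
# Illustrative sketch of two-directional 2DPCA ((2D)2PCA): one projection matrix is
# learned from the right-hand (feature) covariance, the other from the left-hand
# (time) covariance, and each sliding-window matrix is compressed from both sides.
import numpy as np

rng = np.random.default_rng(3)
windows = rng.normal(size=(500, 20, 36))      # 500 samples, 20-day window, 36 indicators
centered = windows - windows.mean(axis=0)

G_right = np.einsum("nij,nik->jk", centered, centered) / len(windows)   # 36 x 36
G_left = np.einsum("nij,nkj->ik", centered, centered) / len(windows)    # 20 x 20

def top_eigvecs(G, k):
    """Eigenvectors of a symmetric matrix for the k largest eigenvalues."""
    _, vecs = np.linalg.eigh(G)               # eigenvalues in ascending order
    return vecs[:, ::-1][:, :k]

X = top_eigvecs(G_right, 5)                   # right projection: 36 indicators -> 5
Z = top_eigvecs(G_left, 4)                    # left projection: 20 days -> 4
features = np.einsum("ik,nij,jl->nkl", Z, centered, X)   # Z^T A_n X for every window
print(features.shape)                         # (500, 4, 5) compressed inputs for a forecaster
```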
Gruen, D.M.; Young, C.E.; Pellin, M.J.
1989-12-26
A charged particle spectrometer is described for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode. 12 figs.
NASA Technical Reports Server (NTRS)
Knuth, Kevin H.; Shah, Ankoor S.; Truccolo, Wilson; Ding, Ming-Zhou; Bressler, Steven L.; Schroeder, Charles E.
2003-01-01
Electric potentials and magnetic fields generated by ensembles of synchronously active neurons in response to external stimuli provide information essential to understanding the processes underlying cognitive and sensorimotor activity. Interpreting recordings of these potentials and fields is difficult as each detector records signals simultaneously generated by various regions throughout the brain. We introduce the differentially variable component analysis (dVCA) algorithm, which relies on trial-to-trial variability in response amplitude and latency to identify multiple components. Using simulations we evaluate the importance of response variability to component identification, the robustness of dVCA to noise, and its ability to characterize single-trial data. Finally, we evaluate the technique using visually evoked field potentials recorded at incremental depths across the layers of cortical area V1, in an awake, behaving macaque monkey.
Handheld CZT radiation detector
Murray, William S.; Butterfield, Kenneth B.; Baird, William
2004-08-24
A handheld CZT radiation detector having a CZT gamma-ray sensor, a multichannel analyzer, a fuzzy-logic component, and a display component is disclosed. The CZT gamma-ray sensor may be a coplanar grid CZT gamma-ray sensor, which provides high-quality gamma-ray analysis at a wide range of operating temperatures. The multichannel analyzer categorizes pulses produced by the CZT gamma-ray sensor into channels (discrete energy levels), resulting in pulse height data. The fuzzy-logic component analyzes the pulse height data and produces a ranked listing of radioisotopes. The fuzzy-logic component is flexible and well-suited to in-field analysis of radioisotopes. The display component may be a personal data assistant, which provides a user-friendly method of interacting with the detector. In addition, the radiation detector may be equipped with a neutron sensor to provide an enhanced mechanism of sensing radioactive materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fritsky, K.J.; Miller, D.L.; Cernansky, N.P.
1994-09-01
A methodology was introduced for modeling the devolatilization characteristics of refuse-derived fuel (RDF) in terms of temperature-dependent weight loss. The basic premise of the methodology is that RDF is modeled as a combination of select municipal solid waste (MSW) components. Kinetic parameters are derived for each component from thermogravimetric analyzer (TGA) data measured at a specific set of conditions. These experimentally derived parameters, along with user-derived parameters, are input to model equations for the purpose of calculating thermograms for the components. The component thermograms are summed to create a composite thermogram that is an estimate of the devolatilization for the as-modeled RDF. The methodology has several attractive features as a thermal analysis tool for waste fuels. 7 refs., 10 figs., 3 tabs.
Screening of polar components of petroleum products by electrospray ionization mass spectrometry
Rostad, Colleen E.
2005-01-01
The polar components of fuels may enable differentiation between fuel types or commercial fuel sources. Screening for these components in the hydrocarbon product is difficult due to their very low concentrations in such a complex matrix. Various commercial fuels from several sources were analyzed by flow injection analysis/electrospray ionization/mass spectrometry without extensive sample preparation, separation, or chromatography. This technique enabled screening for unique polar components at very low concentrations in commercial hydrocarbon products. This analysis was then applied to hydrocarbon samples collected from the subsurface with a different extent of biodegradation or weathering. Although the alkane and isoprenoid portion had begun to biodegrade or weather, the polar components had changed little over time. Because these polar compounds are unique in different fuels, this screening technique can provide source information on hydrocarbons released into the environment.
A Component Analysis of Positive Behaviour Support Plans
ERIC Educational Resources Information Center
McClean, Brian; Grey, Ian
2012-01-01
Background: Positive behaviour support (PBS) emphasises multi-component interventions by natural intervention agents to help people overcome challenging behaviours. This paper investigates which components are most effective and which factors might mediate effectiveness. Method: Sixty-one staff working with individuals with intellectual disability…
Bouhlel, Jihéne; Jouan-Rimbaud Bouveresse, Delphine; Abouelkaram, Said; Baéza, Elisabeth; Jondreville, Catherine; Travel, Angélique; Ratel, Jérémy; Engel, Erwan; Rutledge, Douglas N
2018-02-01
The aim of this work is to compare a novel exploratory chemometrics method, Common Components Analysis (CCA), with Principal Components Analysis (PCA) and Independent Components Analysis (ICA). CCA consists of adapting the multi-block statistical method known as Common Components and Specific Weights Analysis (CCSWA or ComDim) by applying it to a single data matrix, with one variable per block. As an application, the three methods were applied to SPME-GC-MS volatolomic signatures of livers in an attempt to reveal volatile organic compound (VOC) markers of chicken exposure to different types of micropollutants. An application of CCA to the initial SPME-GC-MS data revealed a drift in the sample Scores along CC2, as a function of injection order, probably resulting from time-related evolution in the instrument. This drift was eliminated by orthogonalization of the data set with respect to CC2, and the resulting data are used as the orthogonalized data input into each of the three methods. Since the first step in CCA is to norm-scale all the variables, preliminary data scaling has no effect on the results, so that CCA was applied only to the orthogonalized SPME-GC-MS data, while PCA and ICA were applied to the "orthogonalized", "orthogonalized and Pareto-scaled", and "orthogonalized and autoscaled" data. The comparison showed that PCA results were highly dependent on the scaling of variables, contrary to ICA where the data scaling did not have a strong influence. Nevertheless, for both PCA and ICA the clearest separations of exposed groups were obtained after autoscaling of variables. The main part of this work was to compare the CCA results using the orthogonalized data with those obtained with PCA and ICA applied to orthogonalized and autoscaled variables. The clearest separations of exposed chicken groups were obtained by CCA. CCA Loadings also clearly identified the variables contributing most to the Common Components giving separations. The PCA Loadings did not highlight the most influencing variables for each separation, whereas the ICA Loadings highlighted the same variables as did CCA. This study shows the potential of CCA for the extraction of pertinent information from a data matrix, using a procedure based on an original optimisation criterion, to produce results that are complementary, and in some cases may be superior, to those of PCA and ICA. Copyright © 2017 Elsevier B.V. All rights reserved.
Principal component analysis and the locus of the Fréchet mean in the space of phylogenetic trees.
Nye, Tom M W; Tang, Xiaoxian; Weyenberg, Grady; Yoshida, Ruriko
2017-12-01
Evolutionary relationships are represented by phylogenetic trees, and a phylogenetic analysis of gene sequences typically produces a collection of these trees, one for each gene in the analysis. Analysis of samples of trees is difficult due to the multi-dimensionality of the space of possible trees. In Euclidean spaces, principal component analysis is a popular method of reducing high-dimensional data to a low-dimensional representation that preserves much of the sample's structure. However, the space of all phylogenetic trees on a fixed set of species does not form a Euclidean vector space, and methods adapted to tree space are needed. Previous work introduced the notion of a principal geodesic in this space, analogous to the first principal component. Here we propose a geometric object for tree space similar to the [Formula: see text]th principal component in Euclidean space: the locus of the weighted Fréchet mean of [Formula: see text] vertex trees when the weights vary over the [Formula: see text]-simplex. We establish some basic properties of these objects, in particular showing that they have dimension [Formula: see text], and propose algorithms for projection onto these surfaces and for finding the principal locus associated with a sample of trees. Simulation studies demonstrate that these algorithms perform well, and analyses of two datasets, containing Apicomplexa and African coelacanth genomes respectively, reveal important structure from the second principal components.
Dynamic analysis for shuttle design verification
NASA Technical Reports Server (NTRS)
Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.
1972-01-01
Two approaches that are used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure in order to use computer programs for dynamic structural analysis. The second method utilizes modal-coupling techniques of experimental verification made by vibrating only spacecraft components and by deducing modes and frequencies of the complete vehicle from results obtained in the component tests.
Spain, Seth M; Miner, Andrew G; Kroonenberg, Pieter M; Drasgow, Fritz
2010-08-06
Questions about the dynamic processes that drive behavior at work have been the focus of increasing attention in recent years. Models describing behavior at work and research on momentary behavior indicate that substantial variation exists within individuals. This article examines the rationale behind this body of work and explores a method of analyzing momentary work behavior using experience sampling methods. The article also examines a previously unused set of methods for analyzing data produced by experience sampling. These methods are known collectively as multiway component analysis. Two archetypal techniques of multimode factor analysis, the Parallel factor analysis and the Tucker3 models, are used to analyze data from Miner, Glomb, and Hulin's (2010) experience sampling study of work behavior. The efficacy of these techniques for analyzing experience sampling data is discussed as are the substantive multimode component models obtained.
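For readers unfamiliar with these models, a minimal sketch using the tensorly package (an assumption; the original analyses were not carried out this way) on a hypothetical persons x items x occasions array:

```python
# Minimal sketch of the two multiway models named above, on simulated
# experience-sampling data laid out as persons x behavior items x occasions.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac, tucker

rng = np.random.default_rng(4)
data = tl.tensor(rng.normal(size=(50, 12, 30)))   # 50 persons, 12 items, 30 occasions

# PARAFAC: one set of R components shared by all three modes.
cp = parafac(data, rank=3)
person_loads, item_loads, occasion_loads = cp.factors

# Tucker3: possibly different numbers of components per mode, plus a core array
# linking them.
tk = tucker(data, rank=[3, 2, 2])
print(person_loads.shape, item_loads.shape, occasion_loads.shape, tk.core.shape)
```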
Neural Networks for Rapid Design and Analysis
NASA Technical Reports Server (NTRS)
Sparks, Dean W., Jr.; Maghami, Peiman G.
1998-01-01
Artificial neural networks have been employed for rapid and efficient dynamics and control analysis of flexible systems. Specifically, feedforward neural networks are designed to approximate nonlinear dynamic components over prescribed input ranges, and are used in simulations as a means to speed up the overall time response analysis process. To capture the recursive nature of dynamic components with artificial neural networks, recurrent networks, which use state feedback with the appropriate number of time delays, as inputs to the networks, are employed. Once properly trained, neural networks can give very good approximations to nonlinear dynamic components, and by their judicious use in simulations, allow the analyst the potential to speed up the analysis process considerably. To illustrate this potential speed up, an existing simulation model of a spacecraft reaction wheel system is executed, first conventionally, and then with an artificial neural network in place.
Inversion of gravity gradient tensor data: does it provide better resolution?
NASA Astrophysics Data System (ADS)
Paoletti, V.; Fedi, M.; Italiano, F.; Florio, G.; Ialongo, S.
2016-04-01
The gravity gradient tensor (GGT) has been increasingly used in practical applications, but the advantages and the disadvantages of the analysis of GGT components versus the analysis of the vertical component of the gravity field are still debated. We analyse the performance of joint inversion of GGT components versus separate inversion of the gravity field alone, or of one tensor component. We perform our analysis by inspection of the Picard Plot, a Singular Value Decomposition tool, and analyse both synthetic data and gradiometer measurements carried out at the Vredefort structure, South Africa. We show that the main factors controlling the reliability of the inversion are algebraic ambiguity (the difference between the number of unknowns and the number of available data points) and signal-to-noise ratio. Provided that algebraic ambiguity is kept low and the noise level is small enough so that a sufficient number of SVD components can be included in the regularized solution, we find that: (i) the choice of tensor components involved in the inversion is not crucial to the overall reliability of the reconstructions; (ii) GGT inversion can yield the same resolution as inversion with a denser distribution of gravity data points, but with the advantage of using fewer measurement stations.
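A minimal sketch of the SVD and Picard-plot inspection described above, with purely illustrative assumptions (a random, deliberately rank-deficient sensitivity matrix and an assumed noise floor):

```python
# Minimal sketch: the discrete Picard condition for a linear inverse problem d = G m.
# The regularized solution is reliable only for SVD components whose data
# coefficients |u_i^T d| sit above the noise floor before the singular values decay.
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical, ill-posed sensitivity matrix (data points x model cells).
G = rng.normal(size=(200, 50)) @ rng.normal(size=(50, 400)) / 50.0
m_true = rng.normal(size=400)
noise_level = 0.05
d = G @ m_true + noise_level * rng.normal(size=200)      # noisy gravity/gradient data

U, s, Vt = np.linalg.svd(G, full_matrices=False)
coeffs = np.abs(U.T @ d)        # |u_i^T d|: plotted against s_i, this is the Picard plot

# Keep components whose coefficients exceed the (assumed) noise floor and whose
# singular values are not numerically negligible, then build a truncated-SVD solution.
keep = (coeffs > 3 * noise_level) & (s > 1e-8 * s[0])
m_tsvd = Vt[keep].T @ ((U[:, keep].T @ d) / s[keep])

print(f"{keep.sum()} of {len(s)} SVD components retained")
```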
Widjaja, Effendi; Garland, Marc
2008-02-01
Raman microscopy was used in mapping mode to collect more than 1000 spectra in a 100 μm × 100 μm area from a commercial stamp. Band-target entropy minimization (BTEM) was then employed to unmix the mixture spectra in order to extract the pure component spectra of the samples. Three pure component spectral patterns with good signal-to-noise ratios were recovered, and their spatial distributions were determined. The three pure component spectral patterns were then identified as copper phthalocyanine blue, calcite-like material, and yellow organic dye material by comparison to known spectral libraries. The present investigation, consisting of (1) advanced curve resolution (blind-source separation) followed by (2) spectral database matching, readily suggests extensions to authenticity and counterfeit studies of other types of commercial objects. The presence or absence of specific observable components forms the basis for assessment. The present spectral analysis (BTEM) is applicable to highly overlapping spectral information. Since a priori information such as the number of components present and spectral libraries are not needed in BTEM, and since minor signals arising from trace components can be reconstructed, this analysis offers a robust approach to a wide variety of material problems involving authenticity and counterfeit issues.
NASA Astrophysics Data System (ADS)
Zhao, Ying; Song, Kaishan; Wen, Zhidan; Fang, Chong; Shang, Yingxin; Lv, Lili
2017-07-01
The spatial distributions of the fluorescence intensities Fmax for chromophoric dissolved organic matter (CDOM) components, the fluorescence indices (FI370 and FI310) and their correlations with water quality of 19 lakes in the Songhua River Basin (SHRB) across semiarid regions of Northeast China were examined with the data collected in September 2012 and 2015. The 19 lakes were divided into two groups according to EC (threshold value = 800 μS cm-1): fresh water (N = 13) and brackish water lakes (N = 6). The fluorescent characteristics of CDOM in the 19 lakes were investigated using excitation-emission matrix fluorescence spectroscopy (EEM) coupled with parallel factor (PARAFAC) and multivariate analysis. Two humic-like components (C1 and C3), one tryptophan-like component (C2), and one tyrosine-like component (C4) were identified by PARAFAC. The component C4 was not included in subsequent analyses due to the strong scatter in some colloidal water samples from brackish water lakes. The correlations between Fmax for the three EEM-PARAFAC extracted CDOM components C1-C3, the fluorescence indices (FI370 and FI310) and the water quality parameters (i.e., TN, TP, Chl-a, pH, EC, turbidity (Turb) and dissolved organic carbon (DOC)) were determined by redundancy analysis (RDA). The results of RDA analysis showed that spatial variation in land cover, pollution sources, and salinity/EC gradients in water quality affected Fmax for the fluorescent components C1-C3 and the fluorescence indices (FI370 and FI310). Further examination indicated that the CDOM fluorescent components and the fluorescence indices (FI370 and FI310) did not significantly differ (t-test, p > 0.05) in fresh water (N = 13) and brackish water lakes (N = 6). There was a difference in the distribution of the average Fmax for the CDOM fluorescent components between C1 to C3 from agricultural sources and urban wastewater sources in hypereutrophic brackish water lakes. The Fmax for humic-like components C1 and C3 spatially varied with land cover among the 19 lakes. Our results indicated that the spatial distributions of Fmax for CDOM fluorescent components and their correlations with water quality can be evaluated by EEM-PARAFAC and multivariate analysis among the 19 lakes across semiarid regions of Northeast China, which has potential implication for lakes with similar genesis.
Zhao, Ying; Song, Kaishan; Li, Sijia; Ma, Jianhang; Wen, Zhidan
2016-08-01
Chromophoric dissolved organic matter (CDOM) plays an important role in aquatic systems, but high concentrations of organic materials are considered pollutants. The fluorescent component characteristics of CDOM in urban waters sampled from Northern and Northeastern China were examined by excitation-emission matrix fluorescence and parallel factor analysis (EEM-PARAFAC) to investigate the source and compositional changes of CDOM on both space and pollution levels. One humic-like (C1), one tryptophan-like component (C2), and one tyrosine-like component (C3) were identified by PARAFAC. Mean fluorescence intensities of the three CDOM components varied spatially and by pollution level in cities of Northern and Northeastern China during July-August, 2013 and 2014. Principal components analysis (PCA) was conducted to identify the relative distribution of all water samples. Cluster analysis (CA) was also used to categorize the samples into groups of similar pollution levels within a study area. Strong positive linear relationships were revealed between the CDOM absorption coefficients a(254) (R² = 0.89, p < 0.01) and a(355) (R² = 0.94, p < 0.01) and the fluorescence intensity (Fmax) of the humic-like C1 component. A positive linear relationship (R² = 0.77) was also exhibited between dissolved organic carbon (DOC) and the Fmax for the humic-like C1 component, but a relatively weak correlation (R² = 0.56) was detected between DOC and the Fmax for the tryptophan-like component (C2). A strong positive correlation was observed between the Fmax for the tryptophan-like component (C2) and total nitrogen (TN) (R² = 0.78), but moderate correlations were observed with ammonium-N (NH4-N) (R² = 0.68) and chemical oxygen demand (CODMn) (R² = 0.52). Therefore, the fluorescence intensities of CDOM components can be applied to monitor water quality in real time, in contrast to traditional approaches. These results demonstrate that EEM-PARAFAC is useful to evaluate the dynamics of CDOM fluorescent components in urban waters from Northern and Northeastern China, and this method has potential applications for monitoring urban water quality in different regions with various hydrological conditions and pollution levels.
Sun, Hui; Wang, Huiyu; Zhang, Aihua; Yan, Guangli; Han, Ying; Li, Yuan; Wu, Xiuhong; Meng, Xiangcai; Wang, Xijun
2016-01-01
As herbal medicines hold an important position in health care systems worldwide, their assessment and quality control are a major bottleneck. Cortex Phellodendri chinensis (CPC) and Cortex Phellodendri amurensis (CPA) are widely used in China; however, identifying the species of CPA and CPC has become an urgent issue. In this study, a multivariate analysis approach was applied to the chemical discrimination of CPA and CPC. Principal component analysis showed that the two herbs could be separated clearly. Chemical markers such as berberine, palmatine, phellodendrine, magnoflorine, obacunone, and obaculactone were identified through orthogonal partial least squares discriminant analysis, and were tentatively identified by the accurate mass measurements of quadrupole time-of-flight mass spectrometry. A total of 29 components can be used as chemical markers for the discrimination of CPA and CPC. Of them, phellodendrine is significantly higher in CPC than in CPA, whereas obacunone and obaculactone are significantly higher in CPA than in CPC. The present study shows that multivariate analysis-based chemical profiling greatly contributes to the investigation of CPA and CPC, that the identified chemical markers should be used as a whole to discriminate the two herbal medicines, and that the results also provide chemical information for their quality assessment. Summary: a multivariate analysis approach was applied to the herbal medicines; the chemical markers were identified through the multivariate analysis approach; a total of 29 components can be used as chemical markers; a UPLC-Q/TOF-MS-based multivariate analysis method was developed for the herbal medicine samples. Abbreviations used: CPC: Cortex Phellodendri chinensis; CPA: Cortex Phellodendri amurensis; PCA: principal component analysis; OPLS-DA: orthogonal partial least squares discriminant analysis; BPI: base peak ion intensity.
Characterization of Strombolian events by using independent component analysis
NASA Astrophysics Data System (ADS)
Ciaramella, A.; de Lauro, E.; de Martino, S.; di Lieto, B.; Falanga, M.; Tagliaferri, R.
2004-10-01
We apply Independent Component Analysis (ICA) to seismic signals recorded at Stromboli volcano. First, we show how ICA works on synthetic signals generated by dynamical systems. We then show that Strombolian signals, both tremor and explosions, are similar in the time domain in the high-frequency band (>0.5 Hz). This seems to give some insight into the organ-pipe model for the generation of the source of these events. Moreover, we are able to recognize in the tremor signals a low-frequency component (<0.5 Hz), with a well-defined peak corresponding to 30 s.
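A minimal sketch of the ICA step on synthetic signals rather than Stromboli recordings (the sampling rate, mixing matrix, and scikit-learn's FastICA are assumptions, not the authors' implementation):

```python
# Minimal sketch: two independent sources (a low-frequency oscillation with a ~30 s
# period and a higher-frequency tremor-like signal) are mixed into three "station"
# recordings and then recovered with FastICA.
import numpy as np
from sklearn.decomposition import FastICA

fs = 10.0                                        # assumed sampling rate, Hz
t = np.arange(0, 600, 1.0 / fs)
low = np.sin(2 * np.pi * t / 30.0)               # ~30 s period component
high = np.sign(np.sin(2 * np.pi * 1.5 * t)) + 0.2 * np.random.default_rng(6).normal(size=t.size)

mixing = np.array([[1.0, 0.6], [0.4, 1.0], [0.8, 0.9]])
records = np.column_stack([low, high]) @ mixing.T    # three simulated recordings

sources = FastICA(n_components=2, random_state=0).fit_transform(records)
# The recovered column whose spectral power is concentrated below 0.5 Hz corresponds
# to the low-frequency component discussed in the abstract.
print(sources.shape)
```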
NASA Technical Reports Server (NTRS)
White, A. L.
1983-01-01
This paper examines the reliability of three architectures for six components. For each architecture, the probabilities of the failure states are given by algebraic formulas involving the component fault rate, the system recovery rate, and the operating time. The dominant failure modes are identified, and the change in reliability is considered with respect to changes in fault rate, recovery rate, and operating time. The major conclusions concern the influence of system architecture on failure modes and parameter requirements. Without this knowledge, a system designer may pick an inappropriate structure.
Computing Reliabilities Of Ceramic Components Subject To Fracture
NASA Technical Reports Server (NTRS)
Nemeth, N. N.; Gyekenyesi, J. P.; Manderscheid, J. M.
1992-01-01
CARES calculates fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. Program uses results from commercial structural-analysis program (MSC/NASTRAN or ANSYS) to evaluate reliability of component in presence of inherent surface- and/or volume-type flaws. Computes measure of reliability by use of finite-element mathematical model applicable to multiple materials, in the sense that the model is made a function of the statistical characterizations of many ceramic materials. Reliability analysis uses element stress, temperature, area, and volume outputs, obtained from two-dimensional shell and three-dimensional solid isoparametric or axisymmetric finite elements. Written in FORTRAN 77.
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Lark, R. F.; Sinclair, J. H.
1977-01-01
An integrated theory is developed for predicting the hydrothermomechanical (HDTM) response of fiber composite components. The integrated theory is based on a combined theoretical and experimental investigation. In addition to predicting the HDTM response of components, the theory is structured to assess the combined hydrothermal effects on the mechanical properties of unidirectional composites loaded along the material axis and off-axis, and those of angleplied laminates. The theory developed predicts values which are in good agreement with measured data at the micromechanics, macromechanics, laminate analysis and structural analysis levels.
Reliability analysis of laminated CMC components through shell subelement techniques
NASA Technical Reports Server (NTRS)
Starlinger, A.; Duffy, S. F.; Gyekenyesi, J. P.
1992-01-01
An updated version of the integrated design program C/CARES (composite ceramic analysis and reliability evaluation of structures) was developed for the reliability evaluation of CMC laminated shell components. The algorithm is now split into two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The new interface program from the finite-element code MARC also includes the option of using hybrid laminates and allows for variations in temperature fields throughout the component.
An ICT Adoption Framework for Education: A Case Study in Public Secondary School of Indonesia
NASA Astrophysics Data System (ADS)
Nurjanah, S.; Santoso, H. B.; Hasibuan, Z. A.
2017-01-01
This paper presents preliminary research findings on an ICT adoption framework for education. Although many studies have been conducted on ICT adoption frameworks in education in various countries, they lack analysis of the degree to which each component contributes to the success of the framework. In this paper, a set of components linked to ICT adoption in education is identified based on the literature and exploratory analysis. The components are Infrastructure, Application, User Skills, Utilization, Finance, and Policy. The components are used as a basis to develop a questionnaire to capture the current state of ICT adoption in schools. The questionnaire data are processed using Structural Equation Modeling (SEM). The results show that each component contributes differently to the ICT adoption framework. Finance has the strongest effect on Infrastructure readiness, whilst User Skills has the strongest effect on Utilization. The study concludes that the development of an ICT adoption framework should consider the contribution weights of the components, which can be used to guide the implementation of ICT adoption in education.
NASA Astrophysics Data System (ADS)
Raju, B. S.; Sekhar, U. Chandra; Drakshayani, D. N.
2017-08-01
The paper investigates optimization of the stereolithography process for SL5530 epoxy resin material to enhance part quality. The major performance characteristics selected to evaluate the process are tensile strength, flexural strength, impact strength, and density, and the corresponding process parameters are layer thickness, orientation, and hatch spacing. Because the process intrinsically involves tuning multiple parameters, grey relational analysis, which uses the grey relational grade as a performance index, is adopted to determine the optimal combination of process parameters. Moreover, principal component analysis is applied to evaluate the weighting values corresponding to the various performance characteristics so that their relative importance can be properly and objectively described. The results of confirmation experiments reveal that grey relational analysis coupled with principal component analysis can effectively acquire the optimal combination of process parameters. Hence, this confirms that the proposed approach can be a useful tool to improve the process parameters in the stereolithography process, which is very useful information for machine designers as well as RP machine users.
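A minimal numpy sketch of grey relational analysis with principal-component weighting, using hypothetical experimental results rather than the paper's data:

```python
# Minimal sketch: responses are normalized larger-the-better, grey relational
# coefficients are computed against the ideal sequence, and the weights used to
# combine them into a grade come from the squared loadings of the first principal
# component (an objective weighting, as in the approach described above).
import numpy as np

# Hypothetical L9-style results: rows = experiments, cols = tensile strength,
# flexural strength, impact strength, density.
y = np.array([
    [52.0, 88.0, 2.1, 1.18], [55.0, 90.0, 2.3, 1.19], [50.0, 85.0, 2.0, 1.17],
    [57.0, 93.0, 2.4, 1.20], [53.0, 89.0, 2.2, 1.18], [56.0, 92.0, 2.5, 1.21],
    [51.0, 86.0, 2.0, 1.17], [58.0, 94.0, 2.6, 1.21], [54.0, 91.0, 2.3, 1.19],
])

norm = (y - y.min(axis=0)) / (y.max(axis=0) - y.min(axis=0))     # larger-the-better
delta = 1.0 - norm                                               # deviation from the ideal
zeta = 0.5                                                       # distinguishing coefficient
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# PCA on the coefficients to obtain objective weights for each response.
vals, vecs = np.linalg.eigh(np.cov(grc, rowvar=False))
weights = vecs[:, -1] ** 2                                       # squared loadings of first PC
weights /= weights.sum()

grade = grc @ weights                                            # grey relational grade
print("best parameter combination: experiment", int(np.argmax(grade)) + 1)
```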
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.
Grimbergen, M C M; van Swol, C F P; Kendall, C; Verdaasdonk, R M; Stone, N; Bosch, J L H R
2010-01-01
The overall quality of Raman spectra in the near-infrared region, where biological samples are often studied, has benefited from various improvements to optical instrumentation over the past decade. However, obtaining ample spectral quality for analysis is still challenging due to device requirements and short integration times required for (in vivo) clinical applications of Raman spectroscopy. Multivariate analytical methods, such as principal component analysis (PCA) and linear discriminant analysis (LDA), are routinely applied to Raman spectral datasets to develop classification models. Data compression is necessary prior to discriminant analysis to prevent or decrease the degree of over-fitting. The logical threshold for the selection of principal components (PCs) to be used in discriminant analysis is likely to be at a point before the PCs begin to introduce equivalent signal and noise and, hence, include no additional value. Assessment of the signal-to-noise ratio (SNR) at a certain peak or over a specific spectral region will depend on the sample measured. Therefore, the mean SNR over the whole spectral region (SNR(msr)) is determined in the original spectrum as well as for spectra reconstructed from an increasing number of principal components. This paper introduces a method of assessing the influence of signal and noise from individual PC loads and indicates a method of selection of PCs for LDA. To evaluate this method, two data sets with different SNRs were used. The sets were obtained with the same Raman system and the same measurement parameters on bladder tissue collected during white light cystoscopy (set A) and fluorescence-guided cystoscopy (set B). This method shows that the mean SNR over the spectral range in the original Raman spectra of these two data sets is related to the signal and noise contribution of principal component loads. The difference in mean SNR over the spectral range can also be appreciated since fewer principal components can reliably be used in the low SNR data set (set B) compared to the high SNR data set (set A). Despite the fact that no definitive threshold could be found, this method may help to determine the cutoff for the number of principal components used in discriminant analysis. Future analysis of a selection of spectral databases using this technique will allow optimum thresholds to be selected for different applications and spectral data quality levels.
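A minimal sketch of the idea under stated assumptions: synthetic spectra, scikit-learn's PCA, and a simple residual-based estimate of the mean SNR over the spectral range (the paper's exact SNR definition may differ):

```python
# Minimal sketch: reconstruct spectra from an increasing number of principal
# components and track the mean signal-to-noise ratio over the spectral range,
# estimated here as the reconstructed signal level divided by the RMS residual.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
spectra = rng.normal(size=(150, 800)) + np.linspace(0, 5, 800)   # hypothetical Raman set

pca = PCA(n_components=20).fit(spectra)
scores = pca.transform(spectra)

for k in (1, 3, 5, 10, 20):
    recon = scores[:, :k] @ pca.components_[:k] + pca.mean_
    noise = spectra - recon
    snr_msr = np.mean(np.abs(recon) / (noise.std(axis=1, keepdims=True) + 1e-12))
    print(f"{k:2d} PCs  mean SNR over spectral range: {snr_msr:.1f}")
```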
Carbon-carbon primary structure for SSTO vehicles
NASA Astrophysics Data System (ADS)
Croop, Harold C.; Lowndes, Holland B.
1997-01-01
A hot structures development program is nearing completion to validate use of carbon-carbon composite structure for primary load carrying members in a single-stage-to-orbit, or SSTO, vehicle. A four phase program was pursued which involved design development and fabrication of a full-scale wing torque box demonstration component. The design development included vehicle and component selection, design criteria and approach, design data development, demonstration component design and analysis, test fixture design and analysis, demonstration component test planning, and high temperature test instrumentation development. The fabrication effort encompassed fabrication of structural elements for mechanical property verification as well as fabrication of the demonstration component itself and associated test fixturing. The demonstration component features 3D woven graphite preforms, integral spars, oxidation inhibited matrix, chemical vapor deposited (CVD) SiC oxidation protection coating, and ceramic matrix composite fasteners. The demonstration component has been delivered to the United States Air Force (USAF) for testing in the Wright Laboratory Structural Test Facility, WPAFB, OH. Multiple thermal-mechanical load cycles will be applied simulating two atmospheric cruise missions and one orbital mission. This paper discusses the overall approach to validation testing of the wing box component and presents some preliminary analytical test predictions.
Law, Emily F.; Beals-Erickson, Sarah E.; Fisher, Emma; Lang, Emily A.; Palermo, Tonya M.
2017-01-01
Internet-delivered treatment has the potential to expand access to evidence-based cognitive-behavioral therapy (CBT) for pediatric headache, and has demonstrated efficacy in small trials for some youth with headache. We used a mixed methods approach to identify effective components of CBT for this population. In Study 1, component profile analysis identified common interventions delivered in published RCTs of effective CBT protocols for pediatric headache delivered face-to-face or via the Internet. We identified a core set of three treatment components that were common across face-to-face and Internet protocols: 1) headache education, 2) relaxation training, and 3) cognitive interventions. Biofeedback was identified as an additional core treatment component delivered in face-to-face protocols only. In Study 2, we conducted qualitative interviews to describe the perspectives of youth with headache and their parents on successful components of an Internet CBT intervention. Eleven themes emerged from the qualitative data analysis, which broadly focused on patient experiences using the treatment components and suggestions for new treatment components. In the Discussion, these mixed methods findings are integrated to inform the adaptation of an Internet CBT protocol for youth with headache. PMID:29503787
Kazemi-Lomedasht, Fatemeh; Khalaj, Vahid; Bagheri, Kamran Pooshang; Behdani, Mahdi; Shahbazzadeh, Delavar
2017-01-01
The scorpion Hemiscorpius lepturus is one of the most venomous members of the Hemiscorpiidae family and is distributed in Iran, Iraq, and Yemen. The prevalence and severity of scorpionism are high, and health services are not able to control it. Scorpionism in Iran, especially in the southern regions (Khuzestan, Sistan and Baluchestan, Hormozgan, and Ilam), is one of the main health challenges. Because of the medical and health importance of scorpionism, various studies have focused on the identification of H. lepturus venom components. Nevertheless, until now, only a small fraction of H. lepturus venom components have been identified, and there is no complete information about the venom composition of H. lepturus. The current study reports a transcriptome analysis of the venom gland of the H. lepturus scorpion. Illumina next-generation sequencing identified the venom components of H. lepturus. As in other scorpion venoms, the venom of H. lepturus consists of mixtures of peptides, proteins, and enzymes such as phospholipases, metalloproteases, hyaluronidases, potassium channel toxins, calcium channel toxins, antimicrobial peptides (AMPs), venom proteins, venom toxins, allergens, La1-like peptides, proteases, and scorpine-like peptides. The identified components of H. lepturus venom were compared with the venom components of previously reported scorpions, and various identities and similarities between them were observed. Through transcriptome analysis of the H. lepturus venom gland, unique sequences coding for venom components were investigated. Moreover, our study confirmed transcript expression of the previously reported peptides Hemitoxin, Hemicalcin, and Hemilipin. The gene sequences of venom components were investigated employing transcriptome analysis of the venom gland of H. lepturus. In summary, the new bioactive molecules identified in this study provide a basis for venomics studies of scorpions of the Hemiscorpiidae family and promise the development of novel biotherapeutics. Copyright © 2016 Elsevier Ltd. All rights reserved.
Tidal analysis of surface currents in the Porsanger fjord in northern Norway
NASA Astrophysics Data System (ADS)
Stramska, Malgorzata; Jankowski, Andrzej; Cieszyńska, Agata
2016-04-01
In this presentation we describe surface currents in the Porsanger fjord (Porsangerfjorden), located in the European Arctic in the vicinity of the Barents Sea. Our analysis is based on data collected in the summer of 2014 using a High Frequency (HF) radar system. Our interest in this fjord comes from the fact that it is a region of high climatic sensitivity. One of our long-term goals is to develop an improved understanding of the ongoing changes and of the interactions between this fjord and large-scale atmospheric and oceanic conditions. To derive a better understanding of these changes, one must first improve knowledge of the physical processes that shape the environment of the fjord. The present study is the first step in this direction. Our main objective in this presentation is to evaluate the importance of tidal forcing. Tides in the Porsanger fjord are substantial, with a tidal range of about 3 meters. Tidal analysis attributes about 99% of the variance in the sea level time series recorded in Honningsvåg to tides. The most important tidal component based on sea level data is the M2 component (amplitude of ~90 cm). The S2 and N2 components (amplitudes of ~20 cm) also play a significant role in the semidiurnal sea level oscillations. The most important diurnal component is K1, with an amplitude of about 8 cm. Tidal analysis led us to the conclusion that the most important tidal component in the observed surface currents is also the M2 component, followed by the S2 component. Our results indicate that, in contrast to sea level, only about 10-20% of the variance in surface currents can be attributed to tidal currents. This means that about 80-90% of the variance can be attributed to wind-induced and geostrophic currents. This work was funded by the Norway Grants (NCBR contract No. 201985, project NORDFLUX). Partial support for MS comes from the Institute of Oceanology (IO PAN).
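A minimal sketch of the harmonic analysis step on a synthetic current record (the constituent periods are standard; the amplitudes, noise, and sampling are assumptions, not the NORDFLUX data):

```python
# Minimal sketch: classical harmonic tidal analysis by least squares, fitting
# sine/cosine pairs at the M2, S2, N2 and K1 frequencies and reporting the
# amplitude of each constituent and the share of variance explained by tides.
import numpy as np

periods_h = {"M2": 12.4206, "S2": 12.0000, "N2": 12.6583, "K1": 23.9345}   # hours

rng = np.random.default_rng(8)
t = np.arange(0, 30 * 24, 0.5)                          # 30 days, half-hourly, in hours
u = (0.20 * np.cos(2 * np.pi * t / periods_h["M2"] - 0.3)
     + 0.05 * np.cos(2 * np.pi * t / periods_h["S2"])
     + 0.15 * rng.normal(size=t.size))                  # synthetic east current (m/s)

cols = [np.ones_like(t)]
for p in periods_h.values():
    w = 2 * np.pi / p
    cols += [np.cos(w * t), np.sin(w * t)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, u, rcond=None)

fit = A @ coef
print("variance explained by tides: %.0f%%" % (100 * (1 - np.var(u - fit) / np.var(u))))
for i, name in enumerate(periods_h):
    a, b = coef[1 + 2 * i], coef[2 + 2 * i]
    print(name, "amplitude %.3f m/s" % np.hypot(a, b))
```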
Nam, Se Jin; Yoo, Jaeheung; Lee, Hye Sun; Kim, Eun-Kyung; Moon, Hee Jung; Yoon, Jung Hyun; Kwak, Jin Young
2016-04-01
To evaluate the diagnostic value of histogram analysis using grayscale sonograms for differentiation of malignant and benign thyroid nodules. From July 2013 through October 2013, 579 nodules in 563 patients who had undergone ultrasound-guided fine-needle aspiration were included. For the grayscale histogram analysis, pixel echogenicity values in regions of interest were measured as 0 to 255 (0, black; 255, white) with in-house software. Five parameters (mean, skewness, kurtosis, standard deviation, and entropy) were obtained for each thyroid nodule. With principal component analysis, an index was derived. Diagnostic performance rates for the 5 histogram parameters and the principal component analysis index were calculated. A total of 563 patients were included in the study (mean age ± SD, 50.3 ± 12.3 years; range, 15-79 years). Of the 579 nodules, 431 were benign, and 148 were malignant. Among the 5 parameters and the principal component analysis index, the standard deviation (75.546 ± 14.153 versus 62.761 ± 16.01; P < .001), kurtosis (3.898 ± 2.652 versus 6.251 ± 9.102; P < .001), entropy (0.16 ± 0.135 versus 0.239 ± 0.185; P < .001), and principal component analysis index (-0.386 ± 0.774 versus 0.134 ± 0.889; P < .001) were significantly different between the malignant and benign nodules. With the calculated cutoff values, the areas under the curve were 0.681 (95% confidence interval, 0.643-0.721) for standard deviation, 0.661 (0.620-0.703) for principal component analysis index, 0.651 (0.607-0.691) for kurtosis, 0.638 (0.596-0.681) for entropy, and 0.606 (0.563-0.647) for skewness. The subjective analysis of grayscale sonograms by radiologists alone showed an area under the curve of 0.861 (0.833-0.888). Grayscale histogram analysis was feasible for differentiating malignant and benign thyroid nodules but did not show better diagnostic performance than subjective analysis performed by radiologists. Further technical advances will be needed to objectify interpretations of thyroid grayscale sonograms. © 2016 by the American Institute of Ultrasound in Medicine.
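A minimal sketch of the feature extraction and PCA index under stated assumptions (random pixel values in place of sonogram ROIs; scipy and scikit-learn in place of the in-house software):

```python
# Minimal sketch: grayscale histogram features (mean, standard deviation, skewness,
# kurtosis, entropy of pixel values 0-255 inside an ROI) collapsed into a single
# index via the first principal component.
import numpy as np
from scipy.stats import skew, kurtosis, entropy
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def histogram_features(pixels):
    hist, _ = np.histogram(pixels, bins=256, range=(0, 256), density=True)
    return [pixels.mean(), pixels.std(), skew(pixels), kurtosis(pixels), entropy(hist + 1e-12)]

rng = np.random.default_rng(9)
# Hypothetical ROIs of varying size standing in for nodule regions.
rois = [rng.integers(0, 256, size=int(rng.integers(200, 2000))) for _ in range(100)]
features = np.array([histogram_features(r.astype(float)) for r in rois])

index = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(features)).ravel()
print("PCA index of first five ROIs:", np.round(index[:5], 3))
```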
ENVIRONMENTAL ANALYSIS OF GASOLINE BLENDING COMPONENTS THROUGH THEIR LIFE CYCLE
The purpose of this study is to assess the contribution of the three major gasoline blending components (reformate, alkylate, and cracked gasoline) to the potential environmental impacts (PEI). This study accounts for losses of the gasoline blending components due to...
The Layer-Based, Pragmatic Model of the Communication Process.
ERIC Educational Resources Information Center
Targowski, Andrew S.; Bowman, Joel P.
1988-01-01
Presents the Targowski/Bowman model of the communication process, which introduces a new paradigm that isolates the various components for individual measurement and analysis, places these components into a unified whole, and places communication and its business component into a larger cultural context. (MM)
Data analysis using a combination of independent component analysis and empirical mode decomposition
NASA Astrophysics Data System (ADS)
Lin, Shih-Lin; Tung, Pi-Cheng; Huang, Norden E.
2009-06-01
A combination of independent component analysis and empirical mode decomposition (ICA-EMD) is proposed in this paper to analyze low signal-to-noise ratio data. The advantages of the ICA-EMD combination are these: ICA needs few sensory clues to separate the original source from unwanted noise, and EMD can effectively separate the data into its constituent parts. The case studies reported here involve original sources contaminated by white Gaussian noise. The simulation results show that the ICA-EMD combination is an effective data analysis tool.
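A minimal sketch of the ICA-EMD combination under stated assumptions: scikit-learn's FastICA and the EMD class of the PyEMD package (distributed as EMD-signal on PyPI) stand in for the authors' implementation, and the mixed signals are synthetic:

```python
# Minimal sketch: FastICA separates noisy channel mixtures into independent
# components, then EMD splits each component into intrinsic mode functions so
# oscillatory modes can be kept and residual noise modes discarded.
import numpy as np
from sklearn.decomposition import FastICA
from PyEMD import EMD          # assumption: the EMD-signal package is installed

rng = np.random.default_rng(10)
t = np.linspace(0, 1, 2000)
s1 = np.sin(2 * np.pi * 7 * t)
s2 = np.sign(np.sin(2 * np.pi * 23 * t))
mixtures = np.column_stack([s1, s2]) @ rng.normal(size=(2, 3))   # three channels
mixtures += 0.5 * rng.normal(size=mixtures.shape)                # low SNR

# Step 1: ICA unmixing.
components = FastICA(n_components=2, random_state=0).fit_transform(mixtures)

# Step 2: EMD on each independent component.
for k in range(components.shape[1]):
    imfs = EMD().emd(components[:, k], t)
    print(f"component {k}: {imfs.shape[0]} intrinsic mode functions")
```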
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucia, M., E-mail: mlucia@pppl.gov; Kaita, R.; Majeski, R.
The Materials Analysis and Particle Probe (MAPP) is a compact in vacuo surface science diagnostic, designed to provide in situ surface characterization of plasma facing components in a tokamak environment. MAPP has been implemented for operation on the Lithium Tokamak Experiment at Princeton Plasma Physics Laboratory (PPPL), where all control and analysis systems are currently under development for full remote operation. Control systems include vacuum management, instrument power, and translational/rotational probe drive. Analysis systems include onboard Langmuir probes and all components required for x-ray photoelectron spectroscopy, low-energy ion scattering spectroscopy, direct recoil spectroscopy, and thermal desorption spectroscopy surface analysis techniques.
ERIC Educational Resources Information Center
Chou, Yeh-Tai; Wang, Wen-Chung
2010-01-01
Dimensionality is an important assumption in item response theory (IRT). Principal component analysis on standardized residuals has been used to check dimensionality, especially under the family of Rasch models. It has been suggested that an eigenvalue greater than 1.5 for the first eigenvalue signifies a violation of unidimensionality when there…
An Inquiry-Based Project Focused on the X-Ray Powder Diffraction Analysis of Common Household Solids
ERIC Educational Resources Information Center
Hulien, Molly L.; Lekse, Jonathan W.; Rosmus, Kimberly A.; Devlin, Kasey P.; Glenn, Jennifer R.; Wisneski, Stephen D.; Wildfong, Peter; Lake, Charles H.; MacNeil, Joseph H.; Aitken, Jennifer A.
2015-01-01
While X-ray powder diffraction (XRPD) is a fundamental analytical technique used by solid-state laboratories across a breadth of disciplines, it is still underrepresented in most undergraduate curricula. In this work, we incorporate XRPD analysis into an inquiry-based project that requires students to identify the crystalline component(s) of…
Aaron Weiskittel; Jereme Frank; David Walker; Phil Radtke; David Macfarlane; James Westfall
2015-01-01
Prediction of forest biomass and carbon is becoming an important issue in the United States. However, estimating forest biomass and carbon is difficult and relies on empirically-derived regression equations. Based on recent findings from a national gap analysis and comprehensive assessment of the USDA Forest Service Forest Inventory and Analysis (USFS-FIA) component...
2007-01-01
found in this commodity. This conclusion is further supported by a study of sucrose pyrolysis products that listed furfural and 2-hydroxy-3-methyl-2...study that investigated the aroma compounds from citrus honey, and only furfural was found to be a major component in both sample matrices [40]. Analysis
Psychometric Evaluation of a Triage Decision Making Inventory
2011-06-27
the correlation matrix and inter-item correlations were reviewed. The Bartlett's test of sphericity and the Kaiser-Meyer-Olkin (KMO) were examined to...nursing experience. Principal component factor analysis with Varimax rotation was conducted using SPSS version 16. The Kaiser-Meyer-Olkin Measure of...Component Analysis. Rotation Method: Varimax with Kaiser Normalization. a. Rotation converged in 7 iterations
Analysis of Component of Aggression in the Stories of Elementary School Aggressive Children
ERIC Educational Resources Information Center
Chamandar, Fateme; Jabbari, D. Susan
2017-01-01
The purpose of this study is the content analysis of children's stories based on the components of aggression. Participants are 66 elementary school students (16 girls and 50 boys) selected from fourth and fifth grades, using the Relational and Overt Aggression Questionnaire; completed by the teachers. Draw a Story Test (Silver, 2005) is…
An RFI Detection Algorithm for Microwave Radiometers Using Sparse Component Analysis
NASA Technical Reports Server (NTRS)
Mohammed-Tano, Priscilla N.; Korde-Patel, Asmita; Gholian, Armen; Piepmeier, Jeffrey R.; Schoenwald, Adam; Bradley, Damon
2017-01-01
Radio Frequency Interference (RFI) is a threat to passive microwave measurements and if undetected, can corrupt science retrievals. The sparse component analysis (SCA) for blind source separation has been investigated to detect RFI in microwave radiometer data. Various techniques using SCA have been simulated to determine detection performance with continuous wave (CW) RFI.
ERIC Educational Resources Information Center
Caruso, John C.; Witkiewitz, Katie
2002-01-01
As an alternative to equally weighted difference scores, examined an orthogonal reliable component analysis (RCA) solution and an oblique principal components analysis (PCA) solution for the standardization sample of the Kaufman Assessment Battery for Children (KABC; A. Kaufman and N. Kaufman, 1983). Discusses the practical implications of the…
ERIC Educational Resources Information Center
Brusco, Michael J.; Singh, Renu; Steinley, Douglas
2009-01-01
The selection of a subset of variables from a pool of candidates is an important problem in several areas of multivariate statistics. Within the context of principal component analysis (PCA), a number of authors have argued that subset selection is crucial for identifying those variables that are required for correct interpretation of the…
Nonlinear seismic analysis of a reactor structure impact between core components
NASA Technical Reports Server (NTRS)
Hill, R. G.
1975-01-01
The seismic analysis of the FFTF-PIOTA (Fast Flux Test Facility-Postirradiation Open Test Assembly), subjected to a horizontal DBE (Design Base Earthquake), is presented. The PIOTA is the first in a set of open test assemblies to be designed for the FFTF. Employing the direct method of transient analysis, the governing differential equations describing the motion of the system are set up directly and are implicitly integrated numerically in time. A simple lumped-mass beam model of the FFTF, which includes small clearances between core components, is used as a "driver" for a fine-mesh model of the PIOTA. The nonlinear forces due to the impact of the core components and their effect on the PIOTA are computed.
Principal components analysis of the photoresponse nonuniformity of a matrix detector.
Ferrero, Alejandro; Alda, Javier; Campos, Joaquín; López-Alonso, Jose Manuel; Pons, Alicia
2007-01-01
The principal component analysis is used to identify and quantify spatial distributions of relative photoresponse as a function of the exposure time for a visible CCD array. The analysis shows a simple way to define an invariant photoresponse nonuniformity and compare it with the definition of this invariant pattern as the one obtained for long exposure times. Experimental data of radiant exposure from levels of irradiance obtained in a stable and well-controlled environment are used.
Hakimzadeh, Neda; Parastar, Hadi; Fattahi, Mohammad
2014-01-24
In this study, multivariate curve resolution (MCR) and multivariate classification methods are proposed to develop a new chemometric strategy for comprehensive analysis of high-performance liquid chromatography-diode array absorbance detection (HPLC-DAD) fingerprints of sixty Salvia reuterana samples from five different geographical regions. Different chromatographic problems that occurred during HPLC-DAD analysis of S. reuterana samples, such as baseline/background contribution and noise, low signal-to-noise ratio (S/N), asymmetric peaks, elution time shifts, and peak overlap, are handled using the proposed strategy. In this way, chromatographic fingerprints of the sixty samples are properly segmented into ten common chromatographic regions using local rank analysis, and then the corresponding segments are column-wise augmented for subsequent MCR analysis. Extended multivariate curve resolution-alternating least squares (MCR-ALS) is used to obtain pure component profiles in each segment. In general, thirty-one chemical components were resolved using MCR-ALS in the sixty S. reuterana samples, and the lack of fit (LOF) values of the MCR-ALS models were below 10.0% in all cases. Pure spectral profiles are considered for identification of chemical components by comparing their resolved spectra with the standard ones, and twenty-four of the thirty-one components were identified. Additionally, pure elution profiles are used to obtain relative concentrations of chemical components in different samples for multivariate classification analysis by principal component analysis (PCA) and k-nearest neighbors (kNN). Inspection of the PCA score plot (with three PCs explaining 76.1% of the variance) showed that the S. reuterana samples belong to four clusters. The degree of class separation (DCS), which quantifies the distance separating clusters in relation to the scatter within each cluster, was calculated for the four clusters and was in the range of 1.6-5.8. These results were then confirmed by kNN. In addition, according to the PCA loading plot and the kNN dendrogram of the thirty-one variables, five chemical constituents, luteolin-7-O-glucoside, salvianolic acid D, rosmarinic acid, lithospermic acid, and trijuganone A, are identified as the most important variables (i.e., chemical markers) for cluster discrimination. Finally, the effect of different chemical markers on sample differentiation is investigated using the counter-propagation artificial neural network (CP-ANN) method. It is concluded that the proposed strategy can be successfully applied for comprehensive analysis of chromatographic fingerprints of complex natural samples. Copyright © 2013 Elsevier B.V. All rights reserved.
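A minimal sketch of the column-wise augmentation and MCR-ALS step under stated assumptions: the data are simulated, and the pymcr package (its McrAR class with non-negativity constraints) stands in for the MCR-ALS implementation used in the paper:

```python
# Minimal sketch: concentration blocks from several samples are stacked into a
# column-wise augmented matrix, which is resolved by alternating least squares
# into pure spectral and elution/concentration profiles.
import numpy as np
from pymcr.mcr import McrAR                      # assumption: the pymcr package
from pymcr.constraints import ConstraintNonneg

rng = np.random.default_rng(11)
n_samples, n_times, n_wl, n_comp = 6, 80, 40, 2
ST_true = np.abs(rng.normal(size=(n_comp, n_wl)))            # pure spectra
C_blocks = [np.abs(rng.normal(size=(n_times, n_comp))) for _ in range(n_samples)]
D_aug = np.vstack([C @ ST_true for C in C_blocks])           # column-wise augmented matrix
D_aug += 0.01 * rng.normal(size=D_aug.shape)

mcr = McrAR(c_constraints=[ConstraintNonneg()], st_constraints=[ConstraintNonneg()])
mcr.fit(D_aug, ST=ST_true + 0.05 * rng.normal(size=ST_true.shape))   # perturbed initial guess

resolved_spectra = mcr.ST_opt_       # compared against standards for identification
resolved_conc = mcr.C_opt_           # per-sample relative concentrations for PCA/kNN
print(resolved_spectra.shape, resolved_conc.shape)
```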
Lęski, Szymon; Kublik, Ewa; Swiejkowski, Daniel A; Wróbel, Andrzej; Wójcik, Daniel K
2010-12-01
Local field potentials have good temporal resolution but are blurred due to the slow spatial decay of the electric field. For simultaneous recordings on regular grids one can efficiently reconstruct the current source density (CSD) using the inverse Current Source Density method (iCSD). It is possible to decompose the resultant spatiotemporal information about the current dynamics into functional components using Independent Component Analysis (ICA). We show, on test data modeling recordings of evoked potentials on a grid of 4 × 5 × 7 points, that meaningful results are obtained with spatial ICA decomposition of the reconstructed CSD. The components obtained through decomposition of the CSD are better defined and allow easier physiological interpretation than the results of a similar analysis of the corresponding evoked potentials in the thalamus. We show that spatiotemporal ICA decompositions can perform better for certain types of sources, but this does not seem to be the case for the experimental data studied. Having found the appropriate approach to decomposing neural dynamics into functional components, we use the technique to study the somatosensory evoked potentials recorded on a grid spanning a large part of the forebrain. We discuss two example components associated with the first waves of activation of the somatosensory thalamus. We show that the proposed method provides new, more detailed information on the timing and spatial location of specific activity conveyed through various parts of the somatosensory thalamus in the rat.
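A minimal sketch of the decomposition step, assuming the CSD has already been reconstructed on the 4 × 5 × 7 grid, is shown below using scikit-learn's FastICA; the data are random placeholders and the number of components is arbitrary.

# Sketch: spatial ICA of a reconstructed CSD volume sampled on a 4 x 5 x 7 grid over time.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
csd = rng.standard_normal((4, 5, 7, 200))       # placeholder for the iCSD-reconstructed CSD
X = csd.reshape(-1, csd.shape[-1])              # (n_space, n_time)

ica = FastICA(n_components=5, random_state=0)
spatial_maps = ica.fit_transform(X)             # (n_space, n_components): independent spatial maps
time_courses = ica.mixing_                      # (n_time, n_components): associated time courses
maps_3d = spatial_maps.T.reshape(5, 4, 5, 7)    # back onto the recording grid for interpretation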
Model-free fMRI group analysis using FENICA.
Schöpf, V; Windischberger, C; Robinson, S; Kasess, C H; Fischmeister, F PhS; Lanzenberger, R; Albrecht, J; Kleemann, A M; Kopietz, R; Wiesmann, M; Moser, E
2011-03-01
Exploratory analysis of functional MRI data allows activation to be detected even if the time course differs from that which is expected. Independent Component Analysis (ICA) has emerged as a powerful approach, but current extensions to the analysis of group studies suffer from a number of drawbacks: they can be computationally demanding, results are dominated by technical and motion artefacts, and some methods require that time courses be the same for all subjects or that templates be defined to identify common components. We have developed a group ICA (gICA) method which is based on single-subject ICA decompositions and the assumption that the spatial distribution of signal changes in components which reflect activation is similar between subjects. This approach, which we have called Fully Exploratory Network Independent Component Analysis (FENICA), identifies group activation in two stages. ICA is performed on the single-subject level, then consistent components are identified via spatial correlation. Group activation maps are generated in a second-level GLM analysis. FENICA is applied to data from three studies employing a wide range of stimulus and presentation designs. These are an event-related motor task, a block-design cognition task and an event-related chemosensory experiment. In all cases, the group maps identified by FENICA as being the most consistent over subjects correspond to task activation. There is good agreement between FENICA results and regions identified in prior GLM-based studies. In the chemosensory task, FENICA and temporal-concatenation ICA identify additional regions that we show are related to the stimulus but exhibit a delayed response. FENICA is a fully exploratory method that allows activation to be identified without assumptions about temporal evolution, and isolates activation from other sources of signal fluctuation in fMRI. It has the advantage over other gICA methods that it is computationally undemanding, spotlights components relating to activation rather than artefacts, allows the use of familiar statistical thresholding through deployment of a higher-level GLM analysis, and can be applied to studies where the paradigm is different for all subjects. Copyright © 2010 Elsevier Inc. All rights reserved.
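The core FENICA idea (single-subject ICA followed by cross-subject spatial matching) can be sketched as below. This is a hedged illustration only: the second-level GLM is reduced to a one-sample t-test, the data are random placeholders, and all shapes and names are invented.

# Sketch: single-subject spatial ICA, then cross-subject spatial correlation to find
# the most consistent component, then a simple group statistic on the selected maps.
import numpy as np
from itertools import combinations
from scipy import stats
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
n_sub, n_time, n_vox, n_comp = 6, 120, 500, 10
data = [rng.standard_normal((n_time, n_vox)) for _ in range(n_sub)]   # placeholder fMRI data

# 1) single-subject spatial ICA: independent spatial maps per subject
maps = []
for X in data:
    ica = FastICA(n_components=n_comp, random_state=0)
    maps.append(ica.fit_transform(X.T).T)          # (n_comp, n_vox)

# 2) for each subject pair, score components by their best spatial correlation match
corr_sum = np.zeros((n_sub, n_comp))
for (i, j) in combinations(range(n_sub), 2):
    r = np.corrcoef(np.vstack([maps[i], maps[j]]))[:n_comp, n_comp:]
    corr_sum[i] += np.abs(r).max(axis=1)
    corr_sum[j] += np.abs(r).max(axis=0)
best = corr_sum.argmax(axis=1)                     # most consistent component per subject

# 3) simple group statistic on the selected maps (stand-in for the second-level GLM)
group = np.stack([maps[s][best[s]] for s in range(n_sub)])
t_map, p_map = stats.ttest_1samp(group, popmean=0.0, axis=0)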
NASA Astrophysics Data System (ADS)
Wang, Zhuozheng; Deller, J. R.; Fleet, Blair D.
2016-01-01
Acquired digital images are often corrupted by a lack of camera focus, faulty illumination, or missing data. An algorithm is presented for fusion of multiple corrupted images of a scene using the lifting wavelet transform. The method employs adaptive fusion arithmetic based on matrix completion and self-adaptive regional variance estimation. Characteristics of the wavelet coefficients are used to adaptively select fusion rules. Robust principal component analysis is applied to low-frequency image components, and regional variance estimation is applied to high-frequency components. Experiments reveal that the method is effective for multifocus, visible-light, and infrared image fusion. Compared with traditional algorithms, the new algorithm not only increases the amount of preserved information and clarity but also improves robustness.
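A much-simplified fusion sketch is given below, assuming PyWavelets is available. It uses a standard 2-D DWT rather than the lifting scheme, averages the approximation band and keeps the larger-magnitude detail coefficients, and omits the robust PCA, matrix completion and regional variance estimation of the reported algorithm; the rule names are illustrative.

# Simplified wavelet-domain fusion of two images: average the approximation band,
# keep the larger-magnitude coefficient in each detail band.
import numpy as np
import pywt

def fuse(img1, img2, wavelet="db2"):
    cA1, (cH1, cV1, cD1) = pywt.dwt2(img1, wavelet)
    cA2, (cH2, cV2, cD2) = pywt.dwt2(img2, wavelet)

    def pick(d1, d2):
        # larger magnitude used as a crude proxy for local activity/variance
        return np.where(np.abs(d1) >= np.abs(d2), d1, d2)

    fused = (0.5 * (cA1 + cA2), (pick(cH1, cH2), pick(cV1, cV2), pick(cD1, cD2)))
    return pywt.idwt2(fused, wavelet)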
ECOPASS - a multivariate model used as an index of growth performance of poplar clones
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ceulemans, R.; Impens, I.
The reported model (ECOlogical PASSport) was constructed by principal component analysis from a combination of biochemical, anatomical/morphological and ecophysiological gas-exchange parameters measured on 5 fast-growing poplar clones. Productivity data were obtained from 10 selected trees in 3 plantations in Belgium and given as m.a.i. (b.a.). The model is shown to reflect not only the genetic origin and the relative effects of the different parameters of the clones, but also their production potential. Multiple regression analysis of the 4 principal components showed a high cumulative correlation (96%) between productivity and the 3 components related to ecophysiological, biochemical and morphological parameters; the ecophysiological component alone correlated 85% with productivity.
Hierarchical Regularity in Multi-Basin Dynamics on Protein Landscapes
NASA Astrophysics Data System (ADS)
Matsunaga, Yasuhiro; Kostov, Konstatin S.; Komatsuzaki, Tamiki
2004-04-01
We analyze time series of potential energy fluctuations and principal components at several temperatures for two kinds of off-lattice 46-bead models that have two distinctive energy landscapes. The less-frustrated "funnel" energy landscape brings about stronger nonstationary behavior of the potential energy fluctuations at the folding temperature than the other, rather frustrated energy landscape at the collapse temperature. By combining principal component analysis with an embedding nonlinear time-series analysis, it is shown that the fast fluctuations with small amplitudes of 70-80% of the principal components cause the time series to become almost "random" in only 100 simulation steps. However, the stochastic feature of the principal components tends to be suppressed through a wide range of degrees of freedom at the transition temperature.
Maisuradze, Gia G; Leitner, David M
2007-05-15
Dihedral principal component analysis (dPCA) has recently been developed and shown to display complex features of the free energy landscape of a biomolecule that may be absent in the free energy landscape plotted in principal component space due to mixing of internal and overall rotational motion that can occur in principal component analysis (PCA) [Mu et al., Proteins: Struct Funct Bioinfo 2005;58:45-52]. Another difficulty in the implementation of PCA is sampling convergence, which we address here for both dPCA and PCA using a tetrapeptide as an example. We find that for both methods the sampling convergence can be reached over a similar time. Minima in the free energy landscape in the space of the two largest dihedral principal components often correspond to unique structures, though we also find some distinct minima to correspond to the same structure. 2007 Wiley-Liss, Inc.
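A minimal sketch of the dPCA transformation is shown below: each dihedral angle is mapped to its sine and cosine to remove periodicity before ordinary PCA is applied. The trajectory here is synthetic and the free-energy estimate is only schematic.

# Sketch of dihedral PCA: transform dihedrals to (cos, sin) pairs, then PCA.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n_frames, n_dihedrals = 10000, 6                   # e.g. a tetrapeptide trajectory (synthetic here)
phi = rng.uniform(-np.pi, np.pi, size=(n_frames, n_dihedrals))

X = np.empty((n_frames, 2 * n_dihedrals))
X[:, 0::2] = np.cos(phi)                           # periodicity-free representation
X[:, 1::2] = np.sin(phi)

pca = PCA(n_components=2)
proj = pca.fit_transform(X)                        # the two largest dihedral principal components

# A free energy surface can then be estimated as -log of the 2-D histogram of proj (in kT units).
H, xe, ye = np.histogram2d(proj[:, 0], proj[:, 1], bins=50)
F = -np.log(np.where(H > 0, H / H.sum(), np.nan))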
CARES/Life Software for Designing More Reliable Ceramic Parts
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.
1997-01-01
Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion, and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and SCG (fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
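The specimen-to-component reliability idea can be illustrated with a minimal Weibull sketch: fit a two-parameter Weibull distribution to rupture strengths and evaluate a failure probability at an applied stress. This is only a sketch of the underlying statistics on synthetic data; it omits CARES/Life's size scaling, multiaxial failure criteria and slow-crack-growth (fatigue) modelling.

# Sketch: two-parameter Weibull fit to specimen rupture strengths and a simple
# failure-probability estimate at a given applied stress (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
strengths = stats.weibull_min.rvs(c=10.0, scale=400.0, size=30, random_state=rng)  # MPa, synthetic

m, loc, sigma0 = stats.weibull_min.fit(strengths, floc=0.0)   # m: Weibull modulus, sigma0: scale
applied_stress = 250.0                                        # MPa, illustrative design stress
p_fail = stats.weibull_min.cdf(applied_stress, m, loc=0.0, scale=sigma0)
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa, "
      f"P_f at {applied_stress} MPa = {p_fail:.3e}")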
SCADA alarms processing for wind turbine component failure detection
NASA Astrophysics Data System (ADS)
Gonzalez, E.; Reder, M.; Melero, J. J.
2016-09-01
Wind turbine failures and downtime can often compromise the profitability of a wind farm due to their high impact on operation and maintenance (O&M) costs. Early detection of failures can facilitate the changeover from corrective maintenance towards a predictive approach. This paper presents a cost-effective methodology that combines various alarm analysis techniques, using data from the Supervisory Control and Data Acquisition (SCADA) system, in order to detect component failures. The approach categorises the alarms according to a reviewed taxonomy, turning overwhelming data into valuable information for assessing component status. Then, different alarm analysis techniques are applied for two purposes: evaluating the capability of the SCADA alarm system to detect failures, and investigating whether faults in some components are followed by failure occurrences in others. Various case studies are presented and discussed. The study highlights the relationships between faulty behaviour in different components, and between failures and adverse environmental conditions.
Choo, Yuen May; Ng, Mei Han; Ma, Ah Ngan; Chuah, Cheng Hock; Hashim, Mohd Ali
2005-04-01
The application of supercritical fluid chromatography (SFC) coupled with a UV variable-wavelength detector to isolate the minor components (carotenes, vitamin E, sterols, and squalene) in crude palm oil (CPO) and the residual oil from palm-pressed fiber is reported. SFC is a good technique for the isolation and analysis of these compounds from the sources mentioned. The carotenes, vitamin E, sterols, and squalene were isolated in less than 20 min. The individual vitamin E isomers present in palm oil were also isolated into their respective components: alpha-tocopherol, alpha-tocotrienol, gamma-tocopherol, gamma-tocotrienol, and delta-tocotrienol. Calibration of all the minor components of palm oil, as well as of the individual components of palm vitamin E, was carried out, and the results were found to be comparable to those obtained by other established analytical methods.
Interpreting the results of chemical stone analysis in the era of modern stone analysis techniques
Gilad, Ron; Williams, James C.; Usman, Kalba D.; Holland, Ronen; Golan, Shay; Ruth, Tor; Lifshitz, David
2017-01-01
Introduction and Objective Stone analysis should be performed in all first-time stone formers. The preferred analytical procedures are Fourier-transform infrared spectroscopy (FT-IR) or X-ray diffraction (XRD). However, due to limited resources, chemical analysis (CA) is still in use throughout the world. The aim of the study was to compare FT-IR and CA in well-matched stone specimens and characterize the pros and cons of CA. Methods In a prospective bi-center study, urinary stones were retrieved from 60 consecutive endoscopic procedures. In order to assure that identical stone samples were sent for analyses, the samples were analyzed initially by micro-computed tomography to assess the uniformity of each specimen before being submitted for FT-IR and CA. Results Overall, the results of CA did not match the FT-IR results in 56% of the cases. In 16% of the cases CA missed the major stone component and in 40% the minor stone component. 37 of the 60 specimens contained CaOx as the major component by FT-IR, and CA reported major CaOx in 47/60, resulting in high sensitivity but very poor specificity. CA was relatively accurate for UA and cystine. CA missed struvite and calcium phosphate as a major component in all cases. In mixed stones the sensitivity of CA for the minor component was poor, generally less than 50%. Conclusions Urinary stone analysis using CA provides only limited data that should be interpreted carefully. Urinary stone analysis using CA is likely to result in clinically significant errors in its assessment of stone composition. Although the monetary costs of CA are relatively modest, this method does not provide the level of analytical specificity required for proper management of patients with metabolic stones. PMID:26956131
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Le; Timbie, Peter T.; Bunn, Emory F.
In this paper, we present a new Bayesian semi-blind approach for foreground removal in observations of the 21 cm signal measured by interferometers. The technique, which we call H I Expectation-Maximization Independent Component Analysis (HIEMICA), is an extension of the Independent Component Analysis technique developed for two-dimensional (2D) cosmic microwave background maps to three-dimensional (3D) 21 cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21 cm signal, as well as the 2D angular power spectrum and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21 cm intensity mapping observations under idealized assumptions of instrumental effects. We also discuss the impact when the noise properties are not known completely. As a first step toward solving the 21 cm power spectrum analysis problem, we compare the semi-blind HIEMICA technique to the commonly used Principal Component Analysis. Under the same idealized circumstances, the proposed technique provides significantly improved recovery of the power spectrum. This technique can be applied in a straightforward manner to all 21 cm interferometric observations, including epoch of reionization measurements, and can be extended to single-dish observations as well.
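The PCA baseline that HIEMICA is compared against can be sketched as follows: remove the leading frequency-frequency eigenmodes, which are dominated by the spectrally smooth foregrounds. This is a generic PCA foreground-cleaning sketch, not the HIEMICA algorithm; the variable names and the number of removed modes are illustrative.

# Sketch of PCA foreground removal for 21 cm data: project out the leading
# frequency-frequency eigenmodes, which smooth foregrounds dominate.
import numpy as np

def pca_clean(cube, n_fg=4):
    # cube: (n_freq, n_pix) brightness temperature versus frequency and sky pixel
    X = cube - cube.mean(axis=1, keepdims=True)
    cov = X @ X.T / X.shape[1]                 # frequency-frequency covariance
    w, V = np.linalg.eigh(cov)                 # eigenvalues in ascending order
    fg_modes = V[:, -n_fg:]                    # largest modes capture the smooth foregrounds
    return X - fg_modes @ (fg_modes.T @ X)     # project them out of the data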
NASA Astrophysics Data System (ADS)
Lim, Hoong-Ta; Murukeshan, Vadakke Matham
2017-06-01
Hyperspectral imaging combines imaging and spectroscopy to provide detailed spectral information for each spatial point in the image. This gives a three-dimensional spatial-spatial-spectral datacube with hundreds of spectral images. Probe-based hyperspectral imaging systems have been developed so that they can be used in regions where conventional table-top platforms would find it difficult to access. A fiber bundle, which is made up of specially-arranged optical fibers, has recently been developed and integrated with a spectrograph-based hyperspectral imager. This forms a snapshot hyperspectral imaging probe, which is able to form a datacube using the information from each scan. Compared to the other configurations, which require sequential scanning to form a datacube, the snapshot configuration is preferred in real-time applications where motion artifacts and pixel misregistration can be minimized. Principal component analysis is a dimension-reducing technique that can be applied in hyperspectral imaging to convert the spectral information into uncorrelated variables known as principal components. A confidence ellipse can be used to define the region of each class in the principal component feature space and for classification. This paper demonstrates the use of the snapshot hyperspectral imaging probe to acquire data from samples of different colors. The spectral library of each sample was acquired and then analyzed using principal component analysis. A confidence ellipse was then applied to the principal components of each sample and used as the classification criterion. The results show that the applied analysis can be used to perform classification of the spectral data acquired using the snapshot hyperspectral imaging probe.
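A hedged sketch of the classification scheme described (PCA followed by per-class confidence ellipses) is given below; the ellipse is implemented as a chi-square threshold on the Mahalanobis distance in PC space, and the function names and confidence level are illustrative rather than those of the paper.

# Sketch: PCA of spectra followed by confidence-ellipse classification in PC space.
import numpy as np
from scipy.stats import chi2
from sklearn.decomposition import PCA

def fit_classes(spectra, labels, n_pc=2):
    pca = PCA(n_components=n_pc).fit(spectra)
    scores = pca.transform(spectra)
    classes = {}
    for c in np.unique(labels):
        s = scores[labels == c]                       # each class needs several samples
        classes[c] = (s.mean(axis=0), np.cov(s, rowvar=False))
    return pca, classes

def classify(pca, classes, spectrum, confidence=0.95):
    z = pca.transform(spectrum.reshape(1, -1))[0]
    thr = chi2.ppf(confidence, df=len(z))             # ellipse boundary in squared Mahalanobis units
    best, d_best = None, np.inf
    for c, (mu, cov) in classes.items():
        d2 = (z - mu) @ np.linalg.inv(cov) @ (z - mu)
        if d2 <= thr and d2 < d_best:
            best, d_best = c, d2
    return best                                       # None if outside every class ellipse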
Spatio-Chromatic Adaptation via Higher-Order Canonical Correlation Analysis of Natural Images
Gutmann, Michael U.; Laparra, Valero; Hyvärinen, Aapo; Malo, Jesús
2014-01-01
Independent component and canonical correlation analysis are two general-purpose statistical methods with wide applicability. In neuroscience, independent component analysis of chromatic natural images explains the spatio-chromatic structure of primary cortical receptive fields in terms of properties of the visual environment. Canonical correlation analysis explains similarly chromatic adaptation to different illuminations. But, as we show in this paper, neither of the two methods generalizes well to explain both spatio-chromatic processing and adaptation at the same time. We propose a statistical method which combines the desirable properties of independent component and canonical correlation analysis: It finds independent components in each data set which, across the two data sets, are related to each other via linear or higher-order correlations. The new method is as widely applicable as canonical correlation analysis, and also to more than two data sets. We call it higher-order canonical correlation analysis. When applied to chromatic natural images, we found that it provides a single (unified) statistical framework which accounts for both spatio-chromatic processing and adaptation. Filters with spatio-chromatic tuning properties as in the primary visual cortex emerged and corresponding-colors psychophysics was reproduced reasonably well. We used the new method to make a theory-driven testable prediction on how the neural response to colored patterns should change when the illumination changes. We predict shifts in the responses which are comparable to the shifts reported for chromatic contrast habituation. PMID:24533049
The rate of change in declining steroid hormones: a new parameter of healthy aging in men?
Walther, Andreas; Philipp, Michel; Lozza, Niclà; Ehlert, Ulrike
2016-09-20
Research on healthy aging in men has increasingly focused on age-related hormonal changes. Testosterone (T) decline is primarily investigated, while age-related changes in other sex steroids (dehydroepiandrosterone [DHEA], estradiol [E2], progesterone [P]) are mostly neglected. An integrated hormone parameter reflecting aging processes in men has yet to be identified. 271 self-reporting healthy men between 40 and 75 provided both psychometric data and saliva samples for hormone analysis. Correlation analysis between age and sex steroids revealed negative associations for the four sex steroids (T, DHEA, E2, and P). Principal component analysis including ten salivary analytes identified a principal component mainly unifying the variance of the four sex steroid hormones. Subsequent principal component analysis including the four sex steroids extracted the principal component of declining steroid hormones (DSH). Moderation analysis of the association between age and DSH revealed significant moderation effects for psychosocial factors such as depression, chronic stress and perceived general health. In conclusion, these results provide further evidence that sex steroids decline in aging men and that the integrated hormone parameter DSH and its rate of change can be used as biomarkers for healthy aging in men. Furthermore, the negative association of age and DSH is moderated by psychosocial factors.
Investigation of domain walls in PPLN by confocal raman microscopy and PCA analysis
NASA Astrophysics Data System (ADS)
Shur, Vladimir Ya.; Zelenovskiy, Pavel; Bourson, Patrice
2017-07-01
Confocal Raman microscopy (CRM) is a powerful tool for the investigation of ferroelectric domains. Mechanical stresses and electric fields existing in the vicinity of neutral and charged domain walls modify the frequency, intensity and width of spectral lines [1], thus allowing visualization of micro- and nanodomain structures both at the surface and in the bulk of the crystal [2,3]. Stresses and fields are naturally coupled in ferroelectrics due to the inverse piezoelectric effect and can hardly be separated in Raman spectra. PCA is a powerful statistical method for the analysis of large data matrices, providing a set of orthogonal variables called principal components (PCs). PCA is widely used for classification of experimental data, for example in crystallization experiments and for the detection of small amounts of components in solid mixtures [4,5]. In Raman spectroscopy, PCA has been applied to the analysis of phase transitions and provided critical pressures with good accuracy [6]. In the present work we applied, for the first time, the principal component analysis (PCA) method to Raman spectra measured in periodically poled lithium niobate (PPLN). We found that the principal components demonstrate different sensitivity to mechanical stresses and electric fields in the vicinity of the domain walls. This allowed us to separately visualize the spatial distributions of mechanical stresses and electric fields at the surface and in the bulk of PPLN.
Exploring Two Approaches for an End-to-End Scientific Analysis Workflow
NASA Astrophysics Data System (ADS)
Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba
2015-12-01
The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.
Wang, Jiawei; Liu, Ruimin; Wang, Haotian; Yu, Wenwen; Xu, Fei; Shen, Zhenyao
2015-12-01
In this study, positive matrix factorization (PMF) and principal components analysis (PCA) were combined to identify and apportion pollution-based sources of hazardous elements in the surface sediments of the Yangtze River estuary (YRE). Source identification analysis indicated that PC1, including Al, Fe, Mn, Cr, Ni, As, Cu, and Zn, can be defined as a sewage component; PC2, including Pb and Sb, can be considered as an atmospheric deposition component; and PC3, containing Cd and Hg, can be considered as an agricultural nonpoint component. To better identify the sources and quantitatively apportion the concentrations to their sources, eight sources were identified with PMF: agricultural/industrial sewage mixed (18.6 %), mining wastewater (15.9 %), agricultural fertilizer (14.5 %), atmospheric deposition (12.8 %), agricultural nonpoint (10.6 %), industrial wastewater (9.8 %), marine activity (9.0 %), and nickel plating industry (8.8 %). Overall, the hazardous element content appears to be connected to anthropogenic activity rather than to natural sources. The PCA results laid the foundation for the PMF analysis by providing a general classification of sources. PMF resolved more factors, with a higher explained variance, than PCA, and provided both the internal analysis and the quantitative analysis. The combination of the two methods can provide more reasonable and reliable results.
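A hedged sketch of the source-apportionment idea is given below using scikit-learn's non-negative matrix factorization as a simplified stand-in for PMF; true PMF additionally weights each residual by its measurement uncertainty, which this example does not, and the data matrix and number of sources are invented.

# Simplified source apportionment: non-negative factorization of an element
# concentration matrix (samples x elements), as an unweighted stand-in for PMF.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(5)
X = np.abs(rng.standard_normal((40, 12)))          # placeholder sediment data (40 sites, 12 elements)

model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)                         # source contributions per sample
F = model.components_                              # source profiles (element signatures)

contrib = G.sum(axis=0) / G.sum()                  # crude overall share attributed to each source
print(np.round(100 * contrib, 1))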
Computational model for the analysis of cartilage and cartilage tissue constructs
Smith, David W.; Gardiner, Bruce S.; Davidson, John B.; Grodzinsky, Alan J.
2013-01-01
We propose a new non-linear poroelastic model that is suited to the analysis of soft tissues. In this paper the model is tailored to the analysis of cartilage and the engineering design of cartilage constructs. The proposed continuum formulation of the governing equations enables the strain of the individual material components within the extracellular matrix (ECM) to be followed over time, as the individual material components are synthesized, assembled and incorporated within the ECM or lost through passive transport or degradation. The material component analysis developed here naturally captures the effect of time-dependent changes of ECM composition on the deformation and internal stress states of the ECM. For example, it is shown that increased synthesis of aggrecan by chondrocytes embedded within a decellularized cartilage matrix initially devoid of aggrecan results in osmotic expansion of the newly synthesized proteoglycan matrix and tension within the structural collagen network. Specifically, we predict that the collagen network experiences a tensile strain, with a maximum of ~2% at the fixed base of the cartilage. The analysis of an example problem demonstrates the temporal and spatial evolution of the stresses and strains in each component of a self-equilibrating composite tissue construct, and the role played by the flux of water through the tissue. PMID:23784936
Centrality Evolution of pt and yt Spectra from Au-Au Collisions at √sNN = 200 GeV
NASA Astrophysics Data System (ADS)
Trainor, Thomas A.
A two-component analysis of spectra to pt = 12 GeV/c for identified pions and protons from 200 GeV Au-Au collisions is presented. The method is similar to an analysis of the nch dependence of pt spectra from p-p collisions at 200 GeV, but applied to Au-Au centrality dependence. The soft-component reference is a Lévy distribution on transverse mass mt. The hard-component reference is a Gaussian on transverse rapidity yt with exponential (pt power-law) tail. Deviations of data from the reference are described by hard-component ratio rAA, which generalizes nuclear modification factor RAA. The analysis suggests that centrality evolution of pion and proton spectra is dominated by changes in parton fragmentation. The structure of rAA suggests that parton energy loss produces a negative boost Δyt of a large fraction (but not all) of the minimum-bias fragment distribution, and that lower-energy partons suffer relatively less energy loss, possibly due to color screening. The analysis also suggests that the anomalous p/π ratio may be due to differences in the parton energy-loss process experienced by the two hadron species. This analysis provides no evidence for radial flow.
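As a hedged illustration of what such a two-component spectrum fit can look like, the sketch below defines a Lévy/Tsallis-like soft term in transverse mass plus a Gaussian hard term in transverse rapidity and fits them with SciPy. The functional forms, parameters and data placeholders are simplified stand-ins, not the reference functions of the analysis.

# Illustrative two-component fit of a pion pt spectrum: a Levy/Tsallis-like soft term
# in transverse mass plus a Gaussian in transverse rapidity for the hard component.
import numpy as np
from scipy.optimize import curve_fit

m_pi = 0.1396                                         # GeV, pion mass

def two_component(pt, A, T, n, B, ybar, sigma):
    mt = np.sqrt(pt**2 + m_pi**2)
    soft = A * (1.0 + (mt - m_pi) / (n * T)) ** (-n)  # Levy-like soft reference
    yt = np.log((mt + pt) / m_pi)                     # transverse rapidity
    hard = B * np.exp(-0.5 * ((yt - ybar) / sigma) ** 2)
    return soft + hard

# pt_data, yield_data and yield_err would come from the measured spectrum, e.g.:
# popt, pcov = curve_fit(two_component, pt_data, yield_data, sigma=yield_err,
#                        p0=[100.0, 0.15, 12.0, 1.0, 2.7, 0.45])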
NASA Astrophysics Data System (ADS)
Beiden, Sergey V.; Wagner, Robert F.; Campbell, Gregory; Metz, Charles E.; Chan, Heang-Ping; Nishikawa, Robert M.; Schnall, Mitchell D.; Jiang, Yulei
2001-06-01
In recent years, the multiple-reader, multiple-case (MRMC) study paradigm has become widespread for receiver operating characteristic (ROC) assessment of systems for diagnostic imaging and computer-aided diagnosis. We review how MRMC data can be analyzed in terms of the multiple components of the variance (case, reader, interactions) observed in those studies. Such information is useful for the design of pivotal studies from results of a pilot study and also for studying the effects of reader training. Recently, several of the present authors have demonstrated methods to generalize the analysis of multiple variance components to the case where unaided readers of diagnostic images are compared with readers who receive the benefit of a computer assist (CAD). For this case it is necessary to model the possibility that several of the components of variance might be reduced when readers incorporate the computer assist, compared to the unaided reading condition. We review results of this kind of analysis on three previously published MRMC studies, two of which were applications of CAD to diagnostic mammography and one was an application of CAD to screening mammography. The results for the three cases are seen to differ, depending on the reader population sampled and the task of interest. Thus, it is not possible to generalize a particular analysis of variance components beyond the tasks and populations actually investigated.
Analysis of Turbulent Boundary-Layer over Rough Surfaces with Application to Projectile Aerodynamics
1988-12-01
[Only fragments of this scanned report are recoverable: a classification of prediction methods into two main approaches, beginning with correlation methodologies, and a section on application in component build-up methodologies, where the new correlation can be used in the component build-up of drag.]
Engineering Analysis of Thermal-Load Components in the Process of Heating of Pet Preforms
NASA Astrophysics Data System (ADS)
Sidorov, D. É.; Kolosov, A. E.; Kazak, I. A.; Pogorelyi, A. V.
2018-05-01
The influence of thermal-load components (convection, and the collimated and uncollimated components of infrared radiation) on the heating of PET preforms in the production of PET packaging has been assessed. It has been established that the collimated component of infrared radiation accounts for most (up to 70%) of the thermal energy delivered in the heating of a PET preform.
The Development and Validation of the Empathy Components Questionnaire (ECQ).
Batchelder, Laurie; Brosnan, Mark; Ashwin, Chris
2017-01-01
Key research suggests that empathy is a multidimensional construct comprising both cognitive and affective components. More recent theories and research suggest even further factors within these components of empathy, including the ability to empathize with others versus the drive towards empathizing with others. While numerous self-report measures have been developed to examine empathy, none of them currently index all of these wider components together. The aim of the present research was to develop and validate the Empathy Components Questionnaire (ECQ) to measure cognitive and affective components, as well as ability and drive components within each. Study one utilized items measuring cognitive and affective empathy taken from various established questionnaires to create an initial version of the ECQ. Principal component analysis (PCA) was used to examine the underlying components of empathy within the ECQ in a sample of 101 typical adults. Results revealed a five-component model consisting of cognitive ability, cognitive drive, affective ability, affective drive, and a fifth factor assessing affective reactivity. This five-component structure was then validated and confirmed using confirmatory factor analysis (CFA) in an independent sample of 211 typical adults. Results also showed that females scored higher than males overall on the ECQ, and on specific components, which is consistent with previous findings of a female advantage on self-reported empathy. Findings also showed certain components predicted scores on an independent measure of social behavior, which provided good convergent validity of the ECQ. Together, these findings validate the newly developed ECQ as a multidimensional measure of empathy more in line with current theories of empathy. The ECQ provides a useful new tool for quick and easy measurement of empathy and its components for research with both healthy and clinical populations.
Probabilistic structural analysis methods for improving Space Shuttle engine reliability
NASA Technical Reports Server (NTRS)
Boyce, L.
1989-01-01
Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration), provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity contributed significantly to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.
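To convey the flavour of propagating input variability to a response statistic, the hedged sketch below uses plain Monte Carlo sampling with an invented closed-form stand-in for the blade response; it is not the NESSUS fast-probability-integration algorithm, and all distributions, the response function and the threshold are illustrative.

# Sketch: Monte Carlo propagation of material/geometry variability to a response
# quantity, as a simple stand-in for fast probability integration (FPI).
import numpy as np

rng = np.random.default_rng(6)
n = 100_000
thickness = rng.normal(2.0e-3, 5.0e-5, n)      # m, hypothetical
E = rng.lognormal(np.log(200e9), 0.03, n)      # Pa, modulus of elasticity
nu = rng.normal(0.30, 0.01, n)                 # Poisson's ratio
rho = rng.normal(8200.0, 80.0, n)              # kg/m^3

def tip_displacement(t, E, nu, rho):
    # placeholder response surface; a real analysis would call a finite-element model
    return 1.0e-4 * (rho / 8200.0) * (200e9 / E) * (2.0e-3 / t) ** 3 * (1 + 0.1 * (nu - 0.3))

d = tip_displacement(thickness, E, nu, rho)
print(f"mean = {d.mean():.3e} m, std = {d.std():.3e} m, "
      f"P(d > 1.5e-4 m) = {(d > 1.5e-4).mean():.3e}")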
Imaging of polysaccharides in the tomato cell wall with Raman microspectroscopy
2014-01-01
Background The primary cell wall of fruits and vegetables is a structure mainly composed of polysaccharides (pectins, hemicelluloses, cellulose). The polysaccharides are assembled into a network and linked together. It is thought that the relative proportions of these components in the plant cell wall have an important influence on the mechanical properties of fruits and vegetables. Results In this study the Raman microspectroscopy technique was introduced for visualization of the distribution of polysaccharides in the fruit cell wall. The methodology of sample preparation, measurement with the Raman microscope, and multivariate image analysis is discussed. Single-band imaging (for preliminary analysis) and multivariate image analysis methods (principal component analysis and multivariate curve resolution) were used for the identification and localization of the components in the primary cell wall. Conclusions Raman microspectroscopy supported by multivariate image analysis methods is useful in distinguishing cellulose and pectins in the tomato cell wall. The study shows how localization of biopolymers is possible with minimally prepared samples. PMID:24917885